WorldWideScience

Sample records for reliably define sufficient

  1. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.

    2017-04-17

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  2. Design of fuel cell powered data centers for sufficient reliability and availability

    Science.gov (United States)

    Ritchie, Alexa J.; Brouwer, Jacob

    2018-04-01

    It is challenging to design a sufficiently reliable fuel cell electrical system for use in data centers, which require 99.9999% uptime. Such a system could lower emissions and increase data center efficiency, but the reliability and availability of such a system must be analyzed and understood. Currently, extensive backup equipment is used to ensure electricity availability. The proposed design alternative uses multiple fuel cell systems each supporting a small number of servers to eliminate backup power equipment provided the fuel cell design has sufficient reliability and availability. Potential system designs are explored for the entire data center and for individual fuel cells. Reliability block diagram analysis of the fuel cell systems was accomplished to understand the reliability of the systems without repair or redundant technologies. From this analysis, it was apparent that redundant components would be necessary. A program was written in MATLAB to show that the desired system reliability could be achieved by a combination of parallel components, regardless of the number of additional components needed. Having shown that the desired reliability was achievable through some combination of components, a dynamic programming analysis was undertaken to assess the ideal allocation of parallel components.
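
    The reliability arithmetic behind the parallel-redundancy result is compact enough to sketch. The Python snippet below is an illustrative re-implementation of that idea (not the authors' MATLAB program): it evaluates the reliability of a block of n independent, identical parallel components and searches for the smallest n that meets a target figure.

      def parallel_reliability(r_component: float, n: int) -> float:
          """The block works if at least one of its n independent components works."""
          return 1.0 - (1.0 - r_component) ** n

      def components_needed(r_component: float, r_target: float) -> int:
          """Smallest number of parallel components whose combined reliability
          reaches r_target (independent, identical components assumed)."""
          n = 1
          while parallel_reliability(r_component, n) < r_target:
              n += 1
          return n

      # Example with assumed figures: a 0.98-reliable unit and a 99.9999% target.
      print(components_needed(0.98, 0.999999))  # -> 4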

  3. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A.; Milligan, Michael; Brinkman, Greg; Bloom, Aaron; Clark, Kara; Denholm, Paul

    2016-12-01

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  4. Planning is not sufficient - Reliable computers need good requirements specifications

    International Nuclear Information System (INIS)

    Matras, J.R.

    1992-01-01

    Computer system reliability is the assurance that a computer system will perform its functions when required to do so. To ensure such reliability, it is important to plan the activities needed for computer system development. These development activities, in turn, require a Computer Quality Assurance Plan (CQAP) that provides the following: a Configuration Management Plan, a Verification and Validation (V and V) Plan, documentation requirements, a defined life cycle, review requirements, and organizational responsibilities. These items are necessary for system reliability; ultimately, however, they are not enough. Development of a reliable system is dependent on the requirements specification. This paper discusses how to use existing industry standards to develop a CQAP. In particular, the paper emphasizes the importance of the requirements specification and of methods for establishing reliability goals. The paper also describes how the revision of ANSI/IEEE-ANS-7-4.3.2, Application Criteria for Digital Computer Systems of Nuclear Power Generating Stations, has addressed these issues

  5. Definably compact groups definable in real closed fields. I

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We study definably compact definably connected groups definable in a sufficiently saturated real closed field $R$. We introduce the notion of group-generic point for $\bigvee$-definable groups and show the existence of group-generic points for definably compact groups definable in a sufficiently saturated o-minimal expansion of a real closed field. We use this notion along with some properties of generic sets to prove that for every definably compact definably connected group $G$ definable in...

  6. Reliable and Fault-Tolerant Software-Defined Network Operations Scheme for Remote 3D Printing

    Science.gov (United States)

    Kim, Dongkyun; Gil, Joon-Min

    2015-03-01

    The recent wide expansion of applicable three-dimensional (3D) printing and software-defined networking (SDN) technologies has led to a great deal of attention being focused on efficient remote control of manufacturing processes. SDN is a renowned paradigm for network softwarization, which has helped facilitate remote manufacturing in association with high network performance, since SDN is designed to control network paths and traffic flows, guaranteeing improved quality of services by obtaining network requests from end-applications on demand through the separated SDN controller or control plane. However, current SDN approaches are generally focused on the controls and automation of the networks, which indicates that there is a lack of management plane development designed for a reliable and fault-tolerant SDN environment. Therefore, in addition to the inherent advantage of SDN, this paper proposes a new software-defined network operations center (SD-NOC) architecture to strengthen the reliability and fault-tolerance of SDN in terms of network operations and management in particular. The cooperation and orchestration between SDN and SD-NOC are also introduced for the SDN failover processes based on four principal SDN breakdown scenarios derived from the failures of the controller, SDN nodes, and connected links. The abovementioned SDN troubles significantly reduce the network reachability to remote devices (e.g., 3D printers, super high-definition cameras, etc.) and the reliability of relevant control processes. Our performance consideration and analysis results show that the proposed scheme can shrink operations and management overheads of SDN, which leads to the enhancement of responsiveness and reliability of SDN for remote 3D printing and control processes.

  7. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps.

    Science.gov (United States)

    Varikuti, Deepthi P; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T; Eickhoff, Simon B

    2017-04-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that gray matter masking improved the reliability of connectivity estimates, whereas denoising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources.

  8. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    Science.gov (United States)

    Varikuti, Deepthi P.; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T.; Eickhoff, Simon B.

    2016-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that grey matter masking improved the reliability of connectivity estimates, whereas de-noising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources. PMID:27550015
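
    Test-retest reliability of a connectivity estimate is commonly summarized with an intraclass correlation coefficient (ICC). The sketch below shows one widely used variant, ICC(3,1) for consistency, computed from a subjects-by-sessions matrix; it is a generic illustration on synthetic data, not necessarily the exact estimator or data used in this study.

      import numpy as np

      def icc_3_1(x: np.ndarray) -> float:
          """ICC(3,1), consistency, for an (n_subjects x k_sessions) matrix
          (Shrout & Fleiss two-way mixed model, single measure)."""
          n, k = x.shape
          grand = x.mean()
          subj_means = x.mean(axis=1)
          sess_means = x.mean(axis=0)
          ss_subj = k * np.sum((subj_means - grand) ** 2)
          ss_sess = n * np.sum((sess_means - grand) ** 2)
          ss_err = np.sum((x - grand) ** 2) - ss_subj - ss_sess
          bms = ss_subj / (n - 1)                 # between-subjects mean square
          ems = ss_err / ((n - 1) * (k - 1))      # residual mean square
          return (bms - ems) / (bms + (k - 1) * ems)

      # Synthetic example: one connectivity edge measured in two sessions for 20 subjects.
      rng = np.random.default_rng(1)
      true_edge = rng.normal(0.4, 0.15, 20)                       # stable subject effect
      sessions = np.column_stack([true_edge + rng.normal(0, 0.05, 20),
                                  true_edge + rng.normal(0, 0.05, 20)])
      print(round(icc_3_1(sessions), 2))  # high ICC: noise is small vs. subject spread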

  9. Improving risk assessment by defining consistent and reliable system scenarios

    Directory of Open Access Journals (Sweden)

    B. Mazzorana

    2009-02-01

    During the entire procedure of risk assessment for hydrologic hazards, the selection of consistent and reliable scenarios, constructed in a strictly systematic way, is fundamental for the quality and reproducibility of the results. However, subjective assumptions on relevant impact variables such as sediment transport intensity on the system loading side and weak point response mechanisms repeatedly cause biases in the results, and consequently affect transparency and required quality standards. Furthermore, the system response of mitigation measures to extreme event loadings represents another key variable in hazard assessment, as well as the integral risk management including intervention planning. Formative Scenario Analysis, as a supplement to conventional risk assessment methods, is a technique to construct well-defined sets of assumptions to gain insight into a specific case and the potential system behaviour. By two case studies, carried out (1) to analyse sediment transport dynamics in a torrent section equipped with control measures, and (2) to identify hazards induced by woody debris transport at hydraulic weak points, the applicability of the Formative Scenario Analysis technique is presented. It is argued that during scenario planning in general and with respect to integral risk management in particular, Formative Scenario Analysis allows for the development of reliable and reproducible scenarios in order to design more specifically an application framework for the sustainable assessment of natural hazards impact. The overall aim is to optimise the hazard mapping and zoning procedure by methodologically integrating quantitative and qualitative knowledge.

  10. A novel ontology approach to support design for reliability considering environmental effects.

    Science.gov (United States)

    Sun, Bo; Li, Yu; Ye, Tianyuan; Ren, Yi

    2015-01-01

    Environmental effects are often not sufficiently considered in product design, and the reliability problems they cause are prominent. This paper proposes a method for applying an ontology approach in product design so that environmental effects knowledge can be reused during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology is designed to describe domain knowledge of environmental effects, and related concepts are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge for different environments. Finally, rubber seals used in a subhumid acid rain environment are taken as an example to illustrate the application of the ontological model to reliability design and analysis.

  11. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  12. Procalcitonin is not sufficiently reliable to be the sole marker of neonatal sepsis of nosocomial origin

    Directory of Open Access Journals (Sweden)

    Moro Serrano Manuel

    2006-05-01

    Background It has recently been suggested that serum procalcitonin (PCT) is of value in the diagnosis of neonatal sepsis, with varying results. The aim of this prospective multicenter study was to assess the usefulness of PCT as a marker of neonatal sepsis of nosocomial origin. Methods One hundred infants aged between 4 and 28 days of life admitted to the Neonatology Services of 13 acute-care teaching hospitals in Spain over 1 year with clinical suspicion of neonatal sepsis of nosocomial origin were included in the study. Serum PCT concentrations were determined by a specific immunoluminometric assay. The reliability of PCT for the diagnosis of nosocomial neonatal sepsis at the time of suspicion of infection and at 12–24 h and 36–48 h after the onset of symptoms was calculated by receiver operating characteristic (ROC) curves. Youden's index (sensitivity + specificity - 1) was used for determination of optimal cutoff values of the diagnostic tests in the different postnatal periods. Sensitivity, specificity, and the likelihood ratio of a positive and negative result with the 95% confidence interval (CI) were calculated. Results The diagnosis of nosocomial sepsis was confirmed in 61 neonates. Serum PCT concentrations were significantly higher at initial suspicion and at 12–24 h and 36–48 h after the onset of symptoms in neonates with confirmed sepsis than in neonates with clinically suspected but not confirmed sepsis. Optimal PCT thresholds according to ROC curves were 0.59 ng/mL at the time of suspicion of sepsis (sensitivity 81.4%, specificity 80.6%); 1.34 ng/mL at 12–24 h after the onset of symptoms (sensitivity 73.7%, specificity 80.6%); and 0.69 ng/mL at 36–48 h after the onset of symptoms (sensitivity 86.5%, specificity 72.7%). Conclusion Serum PCT concentrations showed a moderate diagnostic reliability for the detection of nosocomial neonatal sepsis from the time of suspicion of infection. PCT is not sufficiently reliable to be the sole marker of neonatal sepsis of nosocomial origin.
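
    The cutoff selection described here, maximizing Youden's index over an ROC curve, is easy to reproduce in code. The snippet below is a generic illustration on synthetic marker values; the function name, group sizes, and distributions are assumptions for the example, not the study's data.

      import numpy as np

      def youden_optimal_cutoff(values, labels):
          """Threshold maximizing Youden's J = sensitivity + specificity - 1.
          values: marker concentrations (e.g. serum PCT in ng/mL),
          labels: 1 = confirmed sepsis, 0 = suspected but not confirmed."""
          values, labels = np.asarray(values, float), np.asarray(labels, int)
          best = (np.nan, -1.0, 0.0, 0.0)
          for cut in np.unique(values):
              pred = values >= cut                      # test positive at or above cutoff
              sens = np.mean(pred[labels == 1])
              spec = np.mean(~pred[labels == 0])
              j = sens + spec - 1.0
              if j > best[1]:
                  best = (cut, j, sens, spec)
          return best  # (cutoff, J, sensitivity, specificity)

      # Synthetic example: 61 "confirmed" and 39 "unconfirmed" cases.
      rng = np.random.default_rng(0)
      pct = np.concatenate([rng.lognormal(0.5, 1.0, 61), rng.lognormal(-1.0, 1.0, 39)])
      labels = np.concatenate([np.ones(61, int), np.zeros(39, int)])
      print(youden_optimal_cutoff(pct, labels))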

  13. How do cognitively impaired elderly patients define "testament": reliability and validity of the testament definition scale.

    Science.gov (United States)

    Heinik, J; Werner, P; Lin, R

    1999-01-01

    The testament definition scale (TDS) is a specifically designed six-item scale aimed at measuring the respondent's capacity to define "testament." We assessed the reliability and validity of this new short scale in 31 community-dwelling cognitively impaired elderly patients. Interrater reliability for the six items ranged from .87 to .97. The interrater reliability for the total score was .77. Significant correlations were found between the TDS score and the Mini-Mental State Examination (MMSE) and the Cambridge Cognitive Examination scores (r = .71 and .72 respectively, p = .001). Criterion validity yielded significantly different means for subjects with MMSE scores of 24-30 and 0-23: mean 3.9 and 1.6 respectively (t(20) = 4.7, p = .001). Using a cutoff point of 0-2 vs. 3+, 79% of the subjects were correctly classified as severely cognitively impaired, with only 8.3% false positives, and a positive predictive value of 94%. Thus, TDS was found both reliable and valid. This scale, however, is not synonymous with testamentary capacity. The discussion deals with the methodological limitations of this study, and highlights the practical as well as the theoretical relevance of TDS. Future studies are warranted to elucidate the relationships between TDS and existing legal requirements of testamentary capacity.

  14. Increasing urban water self-sufficiency: New era, new challenges

    DEFF Research Database (Denmark)

    Rygaard, Martin; Binning, Philip John; Albrechtsen, Hans-Jørgen

    2011-01-01

    and 15 in-depth case studies, solutions used to increase water self-sufficiency in urban areas are analyzed. The main drivers for increased self-sufficiency were identified to be direct and indirect lack of water, constrained infrastructure, high quality water demands and commercial and institutional...... pressures. Case studies demonstrate increases in self-sufficiency ratios to as much as 80% with contributions from recycled water, seawater desalination and rainwater collection. The introduction of alternative water resources raises several challenges: energy requirements vary by more than a factor of ten...... amongst the alternative techniques, wastewater reclamation can lead to the appearance of trace contaminants in drinking water, and changes to the drinking water system can meet tough resistance from the public. Public water-supply managers aim to achieve a high level of reliability and stability. We...

  15. Economic efficiency or self-sufficiency: alternative strategies for oil consumers?

    International Nuclear Information System (INIS)

    Heal, D.W.

    1992-01-01

    The ideal energy source is low cost (efficient) and reliable (secure). The high price and perceived political unreliability of Middle East oil supplies prompted a nearly worldwide trend towards energy self-sufficiency. Gains in energy efficiency, which have been most marked in the OECD, are permanent and, prompted by environmental concern, probably progressive. But the opportunity that is still available to low cost oil suppliers to regain lost markets will only be realized if those supplies are demonstrably reliable. (author)

  16. Impact of Market Behavior, Fleet Composition, and Ancillary Services on Revenue Sufficiency

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany

    2016-04-26

    This presentation provides an overview of new and ongoing NREL research that aims to improve our understanding of reliability and revenue sufficiency challenges through modeling tools within a markets framework.

  17. Deuterium-tritium fuel self-sufficiency in fusion reactors

    International Nuclear Information System (INIS)

    Abdou, M.A.; Vold, E.L.; Gung, C.Y.; Youssef, M.Z.; Shin, K.

    1986-01-01

    Conditions necessary to achieve deuterium-tritium fuel self-sufficiency in fusion reactors are derived through extensive modeling and calculations of the required and achievable tritium breeding ratios as functions of the many reactor parameters and candidate design concepts. It is found that the excess margin in the breeding potential is not sufficient to cover all present uncertainties. Thus, the goal of attaining fuel self-sufficiency significantly restricts the allowable parameter space and design concepts. For example, the required breeding ratio can be reduced by (A) attaining high tritium fractional burnup, >5%, in the plasma, (B) achieving very high reliability, >99%, and very short times, <1 day, to fix failures in the tritium processing system, and (C) ensuring that nonradioactive decay losses from all subsystems are extremely low, e.g., <0.1% for the plasma exhaust processing system. The uncertainties due to nuclear data and calculational methods are found to be significant, but they are substantially smaller than those due to uncertainties in system definition
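
    A toy steady-state tritium balance, offered here only to illustrate why high fractional burnup and low processing losses relax the required breeding ratio (it is not the paper's detailed model and ignores decay, inventory doubling, and reserve requirements): if a fraction f_b of injected tritium burns and a fraction eps of the recycled exhaust stream is lost during processing, the breeding ratio must roughly satisfy TBR >= 1 + eps*(1/f_b - 1).

      def required_tbr(burnup_fraction: float, exhaust_loss_fraction: float) -> float:
          """Toy steady-state balance: breeding must replace the tritium burned plus
          the tritium lost while reprocessing the unburned exhaust."""
          exhaust_per_burnt = 1.0 / burnup_fraction - 1.0   # recycled tritium per burnt triton
          return 1.0 + exhaust_loss_fraction * exhaust_per_burnt

      # Figures in the range quoted above (>5% burnup, <0.1% processing losses) vs. worse ones.
      print(required_tbr(0.05, 0.001))   # ~1.02
      print(required_tbr(0.01, 0.005))   # ~1.50 -- low burnup plus lossy processing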

  18. A Simple Negative Interaction in the Positive Transcriptional Feedback of a Single Gene Is Sufficient to Produce Reliable Oscillations

    Science.gov (United States)

    Miró-Bueno, Jesús M.; Rodríguez-Patón, Alfonso

    2011-01-01

    Negative and positive transcriptional feedback loops are present in natural and synthetic genetic oscillators. A single gene with negative transcriptional feedback needs a time delay and sufficiently strong nonlinearity in the transmission of the feedback signal in order to produce biochemical rhythms. A single gene with only positive transcriptional feedback does not produce oscillations. Here, we demonstrate that this single-gene network in conjunction with a simple negative interaction can also easily produce rhythms. We examine a model comprised of two well-differentiated parts. The first is a positive feedback created by a protein that binds to the promoter of its own gene and activates the transcription. The second is a negative interaction in which a repressor molecule prevents this protein from binding to its promoter. A stochastic study shows that the system is robust to noise. A deterministic study identifies that the dynamics of the oscillator are mainly driven by two types of biomolecules: the protein, and the complex formed by the repressor and this protein. The main conclusion of this paper is that a simple and usual negative interaction, such as degradation, sequestration or inhibition, acting on the positive transcriptional feedback of a single gene is a sufficient condition to produce reliable oscillations. One gene is enough and the positive transcriptional feedback signal does not need to activate a second repressor gene. This means that at the genetic level an explicit negative feedback loop is not necessary. The model needs neither cooperative binding reactions nor the formation of protein multimers. Therefore, our findings could help to clarify the design principles of cellular clocks and constitute a new efficient tool for engineering synthetic genetic oscillators. PMID:22205920
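
    As an illustration of the architecture described (a protein P activating its own transcription, a repressor R sequestering P into an inactive complex C), a minimal deterministic ODE sketch follows. The equations, rate forms, and parameter values are assumptions made for this example, not the paper's model; whether sustained oscillations appear depends on the parameter regime, as the abstract's deterministic analysis indicates.

      from scipy.integrate import solve_ivp

      def rhs(t, y, beta=50.0, K=10.0, n=2, k_on=1.0, k_off=0.1,
              alpha_r=5.0, d_p=0.1, d_r=0.1, d_c=1.0, basal=0.5):
          """P: self-activating protein, R: repressor, C: sequestered P-R complex."""
          P, R, C = y
          production = basal + beta * P**n / (K**n + P**n)   # positive transcriptional feedback
          seq = k_on * P * R - k_off * C                     # the single negative interaction
          dP = production - d_p * P - seq
          dR = alpha_r - d_r * R - seq
          dC = seq - d_c * C
          return [dP, dR, dC]

      sol = solve_ivp(rhs, (0.0, 500.0), [1.0, 1.0, 0.0], max_step=0.5)
      # A wide spread of P over time suggests oscillation rather than a fixed point.
      print(sol.y[0].min(), sol.y[0].max())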

  19. Intrarater and interrater reliability for measurements in videofluoroscopy of swallowing

    International Nuclear Information System (INIS)

    Baijens, Laura; Barikroo, Ali; Pilz, Walmari

    2013-01-01

    Objective: Intrarater and interrater reliability is crucial to the quality of diagnostic and therapy-effect studies. This paper reports on a systematic review of studies on intrarater and interrater reliability for measurements in videofluoroscopy of swallowing. The aim of this review was to summarize and qualitatively analyze published studies on that topic. Materials and methods: Those published up to March 2013 were found through a comprehensive electronic database search using PubMed, Embase, and The Cochrane Library. Two reviewers independently assessed the studies using strict inclusion criteria. Results: Nineteen studies were included and then qualitatively analyzed. In several of these, methodological problems were found. Moreover, intrarater and interrater reliability varied with the measure applied. A meta-analysis was not carried out as studies were not of sufficient quality to warrant doing so. Conclusion: In order to achieve reliable measurements in videofluoroscopy of swallowing, it is recommended that raters use well-defined guidelines for the levels of ordinal visuoperceptual variables. Furthermore, in order to make the measurements reliable (intrarater and interrater) it is recommended that, following protocolled pre-experimental training, the raters should have maximum consensus about the definition of the measured variables

  20. Intrarater and interrater reliability for measurements in videofluoroscopy of swallowing

    Energy Technology Data Exchange (ETDEWEB)

    Baijens, Laura, E-mail: laura.baijens@mumc.nl [Department of Otorhinolaryngology, Head and Neck Surgery, Maastricht University Medical Center, Maastricht (Netherlands); Barikroo, Ali, E-mail: a.Barikroo@ufl.edu [Swallowing Research Laboratory, Department of Speech, Language and Hearing Sciences, College of Public Health and Health Professions, University of Florida, Gainesville, FL (United States); Pilz, Walmari, E-mail: walmari.pilz@mumc.nl [Department of Otorhinolaryngology, Head and Neck Surgery, Maastricht University Medical Center, Maastricht (Netherlands)

    2013-10-01

    Objective: Intrarater and interrater reliability is crucial to the quality of diagnostic and therapy-effect studies. This paper reports on a systematic review of studies on intrarater and interrater reliability for measurements in videofluoroscopy of swallowing. The aim of this review was to summarize and qualitatively analyze published studies on that topic. Materials and methods: Those published up to March 2013 were found through a comprehensive electronic database search using PubMed, Embase, and The Cochrane Library. Two reviewers independently assessed the studies using strict inclusion criteria. Results: Nineteen studies were included and then qualitatively analyzed. In several of these, methodological problems were found. Moreover, intrarater and interrater reliability varied with the measure applied. A meta-analysis was not carried out as studies were not of sufficient quality to warrant doing so. Conclusion: In order to achieve reliable measurements in videofluoroscopy of swallowing, it is recommended that raters use well-defined guidelines for the levels of ordinal visuoperceptual variables. Furthermore, in order to make the measurements reliable (intrarater and interrater) it is recommended that, following protocolled pre-experimental training, the raters should have maximum consensus about the definition of the measured variables.

  1. Long-Term Resource Adequacy, Long-Term Flexibility Requirements, and Revenue Sufficiency

    Energy Technology Data Exchange (ETDEWEB)

    Milligan, Michael [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bloom, Aaron P [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Townsend, Aaron [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ela, Erik [Electric Power Research Institute; Botterud, Audun [Argonne National Laboratory; Levin, Todd [Argonne National Laboratory

    2018-02-15

    Variable generation (VG) can reduce market prices over time and also the energy that other suppliers can sell in the market. The suppliers that are needed to provide capacity and flexibility to meet the long-term reliability requirements may, therefore, earn less revenue. This chapter discusses the topics of resource adequacy and revenue sufficiency - that is, determining and acquiring the quantity of capacity that will be needed at some future date and ensuring that those suppliers that offer the capacity receive sufficient revenue to recover their costs. The focus is on the investment time horizon and the installation of sufficient generation capability. First, the chapter discusses resource adequacy, including newer methods of determining adequacy metrics. The chapter then focuses on revenue sufficiency and how suppliers have sufficient opportunity to recover their total costs. The chapter closes with a description of the mechanisms traditionally adopted by electricity markets to mitigate the issues of resource adequacy and revenue sufficiency and discusses the most recent market design changes to address these issues.
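
    A standard resource adequacy calculation of the kind referred to here builds a capacity-outage probability table from unit ratings and forced outage rates and reads off a loss-of-load probability for a given load. The sketch below is a generic textbook-style illustration (two-state units, a single load level), not the specific methods discussed in the chapter.

      def lolp(unit_mw, unit_for, load_mw):
          """Loss-of-load probability from a capacity-outage probability table.
          unit_mw: unit capacities; unit_for: forced outage rates (two-state units)."""
          outage = {0.0: 1.0}                        # MW on forced outage -> probability
          for cap, q in zip(unit_mw, unit_for):
              nxt = {}
              for out, p in outage.items():
                  nxt[out] = nxt.get(out, 0.0) + p * (1.0 - q)        # unit available
                  nxt[out + cap] = nxt.get(out + cap, 0.0) + p * q    # unit on outage
              outage = nxt
          installed = sum(unit_mw)
          return sum(p for out, p in outage.items() if installed - out < load_mw)

      # Example: five 100-MW units with 5% forced outage rate serving a 350-MW load.
      print(lolp([100] * 5, [0.05] * 5, 350))   # probability that two or more units are out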

  2. User constraints for reliable user-defined smart home scenarios

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Nielsen, Michael Kvist; Pedersen, Thomas

    2016-01-01

    Defining control scenarios in a smart home is a difficult task for end users. In particular, one concern is that user-defined scenarios could lead to unsafe or undesired state of the system. To help them explore scenario specifications, we propose in this paper a system that enables specification...

  3. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. The paper discusses the numerous factors that potentially have a degrading effect on system reliability and the ways in which these factors, peculiar to highly reliable fault tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  4. Sufficient Condition for Estimation in Designing H∞ Filter-Based SLAM

    Directory of Open Access Journals (Sweden)

    Nur Aqilah Othman

    2015-01-01

    The extended Kalman filter (EKF) is often employed in determining the position of mobile robot and landmarks in simultaneous localization and mapping (SLAM). Nonetheless, there are some disadvantages of using EKF, namely, the requirement of Gaussian distribution for the state and noises, as well as the fact that it requires the smallest possible initial state covariance. This has led researchers to find alternative ways to mitigate the aforementioned shortcomings. Therefore, this study is conducted to propose an alternative technique by implementing H∞ filter in SLAM instead of EKF. In implementing H∞ filter in SLAM, the parameters of the filter, especially γ, need to be properly defined to prevent the finite escape time problem. Hence, this study proposes a sufficient condition for estimation purposes. Two distinct cases of initial state covariance are analysed considering an indoor environment to ensure the best solution for the SLAM problem exists along with considerations of process and measurement noises statistical behaviour. If the prescribed conditions are not satisfied, then the estimation would exhibit unbounded uncertainties and consequently result in erroneous inference about the robot and landmarks estimation. The simulation results have shown the reliability and consistency as suggested by the theoretical analysis and our previous findings.

  5. Sufficient conditions for uniqueness of the weak value

    International Nuclear Information System (INIS)

    Dressel, J; Jordan, A N

    2012-01-01

    We review and clarify the sufficient conditions for uniquely defining the generalized weak value as the weak limit of a conditioned average using the contextual values formalism introduced in Dressel, Agarwal and Jordan (2010 Phys. Rev. Lett. http://dx.doi.org/10.1103/PhysRevLett.104.240401). We also respond to criticism of our work by Parrott (arXiv:1105.4188v1) concerning a proposed counter-example to the uniqueness of the definition of the generalized weak value. The counter-example does not satisfy our prescription in the case of an underspecified measurement context. We show that when the contextual values formalism is properly applied to this example, a natural interpretation of the measurement emerges and the unique definition in the weak limit holds. We also prove a theorem regarding the uniqueness of the definition under our sufficient conditions for the general case. Finally, a second proposed counter-example by Parrott (arXiv:1105.4188v6) is shown not to satisfy the sufficiency conditions for the provided theorem. (paper)

  6. Power transformer reliability modelling

    NARCIS (Netherlands)

    Schijndel, van A.

    2010-01-01

    Problem description: Electrical power grids serve to transport and distribute electrical power with high reliability and availability at acceptable costs and risks. These grids play a crucial though preferably invisible role in supplying sufficient power in a convenient form. Today’s society has

  7. Reliability Assessment Of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2014-01-01

    Reduction of cost of energy for wind turbines are very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability but also not be too costly (and safe). This paper presents models...... for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades, substructure and foundation. But since the function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system also reliability aspects...... of these components are discussed and it is described how there reliability influences the reliability of the structural components. Two illustrative examples are presented considering uncertainty modeling, reliability assessment and calibration of partial safety factors for structural wind turbine components exposed...

  8. Developing Ultra Reliable Life Support for the Moon and Mars

    Science.gov (United States)

    Jones, Harry W.

    2009-01-01

    Recycling life support systems can achieve ultra reliability by using spares to replace failed components. The added mass for spares is approximately equal to the original system mass, provided the original system reliability is not very low. Acceptable reliability can be achieved for the space shuttle and space station by preventive maintenance and by replacing failed units. However, this maintenance and repair depends on a logistics supply chain that provides the needed spares. The Mars mission must take all the needed spares at launch. The Mars mission also must achieve ultra reliability, a very low failure rate per hour, since it requires years rather than weeks and cannot be cut short if a failure occurs. Also, the Mars mission has a much higher mass launch cost per kilogram than shuttle or station. Achieving ultra reliable space life support with acceptable mass will require a well-planned and extensive development effort. Analysis must define the reliability requirement and allocate it to subsystems and components. Technologies, components, and materials must be designed and selected for high reliability. Extensive testing is needed to ascertain very low failure rates. Systems design should segregate the failure causes in the smallest, most easily replaceable parts. The systems must be designed, produced, integrated, and tested without impairing system reliability. Maintenance and failed unit replacement should not introduce any additional probability of failure. The overall system must be tested sufficiently to identify any design errors. A program to develop ultra reliable space life support systems with acceptable mass must start soon if it is to produce timely results for the moon and Mars.
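
    One common way to size such a spares inventory is to provision, for each replaceable unit, enough spares that the probability of exhausting them over the mission is acceptably small, treating failures as a Poisson process. The sketch below illustrates that calculation with made-up unit names, failure rates, and masses; it is not a NASA sizing method.

      import math

      def spares_needed(failure_rate_per_hr, mission_hr, confidence):
          """Smallest spare count s with P(failures <= s) >= confidence (Poisson model)."""
          lam = failure_rate_per_hr * mission_hr
          s = 0
          term = math.exp(-lam)      # P(exactly 0 failures)
          cdf = term
          while cdf < confidence:
              s += 1
              term *= lam / s        # P(exactly s failures)
              cdf += term
          return s

      # Hypothetical units: (name, failure rate per hour, unit mass in kg)
      units = [("pump", 2e-5, 5.0), ("blower", 1e-5, 3.0), ("valve", 5e-6, 0.5)]
      mission_hr = 2.5 * 365 * 24            # roughly a 2.5-year Mars mission
      spares_mass = sum(m * spares_needed(r, mission_hr, 0.999) for _, r, m in units)
      print(round(spares_mass, 1), "kg of spares")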

  9. CANDU: Meeting the demand for energy self-sufficiency

    International Nuclear Information System (INIS)

    Lawson, D.S.

    1985-01-01

    The success of the CANDU program can be seen quickly by comparing typical electricity bills in various provinces of Canada. The provinces of Quebec and Manitoba benefit by extensive hydro-electric schemes, many of which were constructed years ago at low capital cost. In Ontario, economic growth has outstripped these low cost sources of hydro power, and hence the province has to rely on thermal sources of electricity generation. The success of the CANDU program is shown by the fact that it can contribute over a third of the electricity in Ontario while keeping the total electricity rate comparable with that of those provinces that can rely on low cost hydro sources. Energy self-sufficiency encompasses a spectrum of requirements. One consideration would be the reliable supply and control of fuel during the operating life of a power plant; a greater degree of self-sufficiency would be obtained by having an involvement in the building and engineering of the power plant prior to its operation

  10. Wireless Channel Modeling Perspectives for Ultra-Reliable Communications

    DEFF Research Database (Denmark)

    Eggers, Patrick Claus F.; Popovski, Petar

    2018-01-01

    Ultra-Reliable Communication (URC) is one of the distinctive features of the upcoming 5G wireless communication. The level of reliability, going down to packet error rates (PER) of $10^{-9}$, should be sufficiently convincing in order to remove cables in an industrial setting or provide remote co...

  11. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditional human-factors-driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  12. Reliability, Dimensionality, and Internal Consistency as Defined by Cronbach: Distinct Albeit Related Concepts

    Science.gov (United States)

    Davenport, Ernest C.; Davison, Mark L.; Liou, Pey-Yan; Love, Quintin U.

    2015-01-01

    This article uses definitions provided by Cronbach in his seminal paper for coefficient α to show the concepts of reliability, dimensionality, and internal consistency are distinct but interrelated. The article begins with a critique of the definition of reliability and then explores mathematical properties of Cronbach's α. Internal consistency…
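
    Coefficient α itself is a short computation from the item variances and the total-score variance. The snippet below is a generic illustration of the coefficient the article discusses, run on synthetic data (not the article's).

      import numpy as np

      def cronbach_alpha(items: np.ndarray) -> float:
          """items: (n_respondents x k_items) score matrix.
          alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

      # Synthetic example: five items driven by one common factor plus noise.
      rng = np.random.default_rng(2)
      factor = rng.normal(size=(200, 1))
      scores = factor + rng.normal(scale=0.8, size=(200, 5))
      print(round(cronbach_alpha(scores), 2))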

  13. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.

  14. Reliability concepts applied to cutting tool change time

    International Nuclear Information System (INIS)

    Patino Rodriguez, Carmen Elena; Francisco Martha de Souza, Gilberto

    2010-01-01

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
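
    The series-configuration assumption has a simple numerical consequence: the reliability of the part over one machining sequence is the product of the per-operation reliabilities, each evaluated at its own cutting time. The sketch below illustrates this with assumed Weibull tool-life parameters; it is not the paper's data or its tool-change algorithm.

      import math

      def op_reliability(cut_time_min, scale_min, shape):
          """Weibull reliability of one operation's cutting tool at its cutting time."""
          return math.exp(-((cut_time_min / scale_min) ** shape))

      # Hypothetical sequence: (cutting time, Weibull scale, Weibull shape) per operation.
      sequence = [(4.0, 60.0, 2.0),   # turning pass
                  (2.5, 45.0, 1.8),   # drilling
                  (3.0, 50.0, 2.2)]   # finishing turn

      r_ops = [op_reliability(t, eta, beta) for t, eta, beta in sequence]
      r_part = math.prod(r_ops)        # series configuration: every operation must succeed
      print([round(r, 4) for r in r_ops], round(r_part, 4))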

  15. Defining clusters in APT reconstructions of ODS steels.

    Science.gov (United States)

    Williams, Ceri A; Haley, Daniel; Marquis, Emmanuelle A; Smith, George D W; Moody, Michael P

    2013-09-01

    Oxide nanoclusters in a consolidated Fe-14Cr-2W-0.3Ti-0.3Y₂O₃ ODS steel and in the alloy powder after mechanical alloying (but before consolidation) are investigated by atom probe tomography (APT). The maximum separation method is a standard method to define and characterise clusters from within APT data, but this work shows that the extent of clustering between the two materials is sufficiently different that the nanoclusters in the mechanically alloyed powder and in the consolidated material cannot be compared directly using the same cluster selection parameters. As the cluster selection parameters influence the size and composition of the clusters significantly, a procedure to optimise the input parameters for the maximum separation method is proposed by sweeping the d(max) and N(min) parameter space. By applying this method of cluster parameter selection combined with a 'matrix correction' to account for trajectory aberrations, differences in the oxide nanoclusters can then be reliably quantified.
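
    The maximum separation method referred to here is essentially a friends-of-friends grouping of solute atoms: any two solute atoms closer than d(max) join the same cluster, and clusters with fewer than N(min) solute atoms are discarded. The sketch below is a minimal generic implementation on toy coordinates with arbitrary parameter values; it is not the authors' analysis code and omits the erosion/envelope steps and the matrix correction used in full APT workflows.

      import numpy as np
      from scipy.spatial import cKDTree

      def max_separation_clusters(solute_xyz, d_max, n_min):
          """Group solute atoms whose pairwise distance is below d_max (friends-of-friends);
          keep only clusters containing at least n_min solute atoms."""
          tree = cKDTree(solute_xyz)
          parent = list(range(len(solute_xyz)))
          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]   # path compression
                  i = parent[i]
              return i
          for i, j in tree.query_pairs(d_max):    # all pairs closer than d_max
              parent[find(i)] = find(j)
          groups = {}
          for i in range(len(solute_xyz)):
              groups.setdefault(find(i), []).append(i)
          return [idx for idx in groups.values() if len(idx) >= n_min]

      # Toy data: a dense clump of 30 solute atoms in a sparse random background.
      rng = np.random.default_rng(3)
      background = rng.uniform(0.0, 50.0, size=(300, 3))
      clump = rng.normal(loc=25.0, scale=0.5, size=(30, 3))
      clusters = max_separation_clusters(np.vstack([background, clump]), d_max=1.5, n_min=10)
      print(len(clusters), [len(c) for c in clusters])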

  16. Sufficiency Grounded as Sufficiently Free: A Reply to Shlomi Segall

    DEFF Research Database (Denmark)

    Nielsen, Lasse

    2016-01-01

    be grounded on (i) any personal value, nor (ii) any impersonal value. Consequently, sufficientarianism is groundless. This article contains a rejoinder to this critique. Its main claim is that the value of autonomy holds strong potential for grounding sufficiency. It argues, firstly, that autonomy carries...... both personal value for its recipient as well as impersonal value, and that both of these values are suitable for grounding sufficiency. It thus follows that we should reject both (i) and (ii). Secondly, although autonomy is presumably the strongest candidate for grounding sufficiency, the article...... provides some counterargument to Segall’s rejection of the other candidates—the impersonal value of virtue; the personal value for the allocator; and the personal value for others. If the arguments are sound, they show that we need not worry about sufficientarianism being groundless....

  17. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  18. Necessary and Sufficient Conditions for Pareto Optimality in Infinite Horizon Cooperative Differential Games

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2011-01-01

    In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for infinite horizon cooperative differential games. We consider games defined by non autonomous and discounted autonomous systems. The obtained results are used to analyze the regular

  19. System reliability analysis with natural language and expert's subjectivity

    International Nuclear Information System (INIS)

    Onisawa, T.

    1996-01-01

    This paper introduces natural language expressions and expert's subjectivity to system reliability analysis. To this end, this paper defines a subjective measure of reliability and presents the method of the system reliability analysis using the measure. The subjective measure of reliability corresponds to natural language expressions of reliability estimation, which is represented by a fuzzy set defined on [0,1]. The presented method deals with the dependence among subsystems and employs parametrized operations of subjective measures of reliability which can reflect expert's subjectivity towards the analyzed system. The analysis results are also expressed by linguistic terms. Finally this paper gives an example of the system reliability analysis by the presented method

  20. Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Wind power converter systems are essential subsystems in both off-shore and on-shore wind turbines. It is the main interface between generator and grid connection. This system is affected by numerous stresses where the main contributors might be defined as vibration and temperature loadings....... The temperature variations induce time-varying stresses and thereby fatigue loads. A probabilistic model is used to model fatigue failure for an electrical component in the power converter system. This model is based on a linear damage accumulation and physics of failure approaches, where a failure criterion...... is defined by the threshold model. The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings. Structural Reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation by means of structural...
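
    Linear damage accumulation, as used here, is Miner's rule: each loading bin contributes n_i/N_i to a damage sum, and the failure criterion is reached when the sum hits a threshold (classically 1). The snippet below is a generic illustration with an assumed power-law life curve and a made-up loading histogram; it is not the paper's solder-joint fatigue model.

      def cycles_to_failure(delta_t, a=1.0e7, b=-2.0):
          """Assumed power-law life curve: N = a * (temperature swing)^b."""
          return a * delta_t ** b

      def miner_damage(cycle_counts):
          """cycle_counts: list of (temperature swing, number of cycles) bins."""
          return sum(n / cycles_to_failure(dt) for dt, n in cycle_counts)

      # Hypothetical annual loading histogram (swing in K, cycles per year).
      annual = [(10.0, 5000), (30.0, 800), (60.0, 50)]
      d_year = miner_damage(annual)
      print(round(d_year, 4), "damage/year ->", round(1.0 / d_year, 1), "years to D = 1")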

  1. Reliability of application of inspection procedures

    Energy Technology Data Exchange (ETDEWEB)

    Murgatroyd, R A

    1988-12-31

    This document deals with the reliability of application of inspection procedures. It describes a method, based on fracture mechanics, to ensure that the inspection of defects is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the possibility of error. It appears that it is essential that inspection procedures should be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors should be optimised. (TEC). 3 refs.

  2. Reliability of application of inspection procedures

    International Nuclear Information System (INIS)

    Murgatroyd, R.A.

    1988-01-01

    This document deals with the reliability of application of inspection procedures. It describes a method, based on fracture mechanics, to ensure that the inspection of defects is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the possibility of error. It appears that it is essential that inspection procedures should be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors should be optimised. (TEC)

  3. NASA reliability preferred practices for design and test

    Science.gov (United States)

    1991-01-01

    Given here is a manual that was produced to communicate within the aerospace community design practices that have contributed to NASA mission success. The information represents the best technical advice that NASA has to offer on reliability design and test practices. Topics covered include reliability practices, including design criteria, test procedures, and analytical techniques that have been applied to previous space flight programs; and reliability guidelines, including techniques currently applied to space flight projects, where sufficient information exists to certify that the technique will contribute to mission success.

  4. Software Defined Multiband EVA Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to propose a reliable, lightweight, programmable, multi-band, multi-mode, miniaturized frequency-agile EVA software defined radio...

  5. Software Defined Multiband EVA Radio, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of Phase 2 is to build a reliable, lightweight, programmable, multi-mode, miniaturized EVA Software Defined Radio (SDR) that supports data telemetry,...

  6. A reliability-risk modelling of nuclear rad-waste facilities

    International Nuclear Information System (INIS)

    Lehmann, P.H.; El-Bassioni, A.A.

    1975-01-01

    Rad-waste disposal systems of nuclear power sites are designed and operated to collect, delay, contain, and concentrate radioactive wastes from reactor plant processes such that on-site and off-site exposures to radiation are well below permissible limits. To assist the designer in achieving minimum release/exposure goals, a computerized reliability-risk model has been developed to simulate the rad-waste system. The objectives of the model are to furnish a practical tool for quantifying the effects of changes in system configuration, operation, and equipment, and for the identification of weak segments in the system design. Primarily, the model comprises a marriage of system analysis, reliability analysis, and release-risk assessment. Provisions have been included in the model to permit the optimization of the system design subject to constraints on cost and rad-releases. The system analysis phase involves the preparation of a physical and functional description of the rad-waste facility accompanied by the formation of a system tree diagram. The reliability analysis phase embodies the formulation of appropriate reliability models and the collection of model parameters. Release-risk assessment constitutes the analytical basis whereupon further system and reliability analyses may be warranted. Release-risk represents the potential for release of radioactivity and is defined as the product of an element's unreliability at time, t, and the radioactivity available for release in time interval, Δt. A computer code (RARISK) has been written to simulate the tree diagram of the rad-waste system. Reliability and release-risk results have been generated for cases which examined the process flow paths of typical rad-waste systems, the effects of repair and standby, the variations of equipment failure and repair rates, and changes in system configurations. The essential feature of this model is that a complex system like the rad-waste facility can be easily decomposed into its

  7. Reliability assessment of Wind turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2015-01-01

    Wind turbines can be considered as structures that are in between civil engineering structures and machines since they consist of structural components and many electrical and machine components together with a control system. Further, a wind turbine is not a one-of-a-kind structure...... but manufactured in series production based on many component tests, some prototype tests and zero-series wind turbines. These characteristics influence the reliability assessment where focus in this paper is on the structural components. Levelized Cost Of Energy is very important for wind energy, especially when...... comparing to other energy sources. Therefore much focus is on cost reductions and improved reliability both for offshore and onshore wind turbines. The wind turbine components should be designed to have sufficient reliability level with respect to both extreme and fatigue loads but also not be too costly...

  8. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    Science.gov (United States)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need to undertake a reliable mission. These objectives have been structured to lead mission planning through construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy, as a basis for assurance implementation, is a proven approach; yet, it holds the opportunity to enable new directions, as NASA moves forward in tackling the challenges of space exploration.

  9. Implementation of Wireless Quality of Service with a Load Switching Method for Cellular Networks Using Software Defined Network to Improve Network Reliability in Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Yoga Bayu Aji Pranawa

    2017-03-01

    Full Text Available Wireless Quality of Service (QoS) is one of the dimensions of mobility, namely a method used to maintain the quality of a wireless network. QoS is required as a method to meet the system service criteria for users, namely confidentiality, integrity, and availability. Several aspects that are the main topics in QoS are the failure and recovery mechanism, variable bandwidth, computing distribution, discovery mechanism, variable latency, and performance feedback. The wireless networks discussed in this study focus on cellular networks, which tend to be unreliable in certain areas. Therefore, a mechanism is needed that can overcome the instability of these cellular networks. The mechanism implemented in this study applies load switching on the cellular network using several providers and Software Defined Network (SDN) technology. Based on the test results, it can be concluded that the system built in this study can implement wireless quality of service and improve network reliability by 65.29% and 83.87% for use without waiting time and with waiting time, respectively, on a dynamic network.

  10. Ultra Reliable Closed Loop Life Support for Long Space Missions

    Science.gov (United States)

    Jones, Harry W.; Ewert, Michael K.

    2010-01-01

    Spacecraft human life support systems can achieve ultra reliability by providing sufficient spares to replace all failed components. The additional mass of spares for ultra reliability is approximately equal to the original system mass, provided that the original system reliability is not too low. Acceptable reliability can be achieved for the Space Shuttle and Space Station by preventive maintenance and by replacing failed units. However, on-demand maintenance and repair requires a logistics supply chain in place to provide the needed spares. In contrast, a Mars or other long space mission must take along all the needed spares, since resupply is not possible. Long missions must achieve ultra reliability, a very low failure rate per hour, since they will take years rather than weeks and cannot be cut short if a failure occurs. Also, distant missions have a much higher mass launch cost per kilogram than near-Earth missions. Achieving ultra reliable spacecraft life support systems with acceptable mass will require a well-planned and extensive development effort. Analysis must determine the reliability requirement and allocate it to subsystems and components. Ultra reliability requires reducing the intrinsic failure causes, providing spares to replace failed components and having "graceful" failure modes. Technologies, components, and materials must be selected and designed for high reliability. Long duration testing is needed to confirm very low failure rates. Systems design should segregate the failure causes in the smallest, most easily replaceable parts. The system must be designed, developed, integrated, and tested with system reliability in mind. Maintenance and reparability of failed units must not add to the probability of failure. The overall system must be tested sufficiently to identify any design errors. A program to develop ultra reliable space life support systems with acceptable mass should start soon since it must be a long term effort.
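    The spares reasoning in this abstract can be illustrated with a short calculation; the sketch below is generic (it assumes a Poisson failure process), not NASA's actual analysis, and the failure rate, mission length, and target probability are hypothetical.

      # Hedged sketch: how many spares of one component keep the probability of
      # running out below a target over a long mission, assuming Poisson failures.
      # The failure rate, mission duration, and target probability are hypothetical.
      import math

      def poisson_cdf(k, mu):
          """P(N <= k) for N ~ Poisson(mu)."""
          return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

      def spares_needed(failure_rate_per_hour, mission_hours, target_prob):
          """Smallest spare count s with P(no more than s failures) >= target_prob."""
          mu = failure_rate_per_hour * mission_hours
          s = 0
          while poisson_cdf(s, mu) < target_prob:
              s += 1
          return s

      mission_hours = 2.5 * 365 * 24   # roughly a Mars mission duration, hypothetical
      print(spares_needed(1e-4, mission_hours, 0.999))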

  11. Probabilistic risk assessment course documentation. Volume 3. System reliability and analysis techniques, Session A - reliability

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs

  12. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is that integration between probabilistic and psychological approaches in human reliability should be attempted. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity, and secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context promote the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints of activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis.

  13. Cue Reliability Represented in the Shape of Tuning Curves in the Owl's Sound Localization System.

    Science.gov (United States)

    Cazettes, Fanny; Fischer, Brian J; Peña, Jose L

    2016-02-17

    Optimal use of sensory information requires that the brain estimates the reliability of sensory cues, but the neural correlate of cue reliability relevant for behavior is not well defined. Here, we addressed this issue by examining how the reliability of spatial cue influences neuronal responses and behavior in the owl's auditory system. We show that the firing rate and spatial selectivity changed with cue reliability due to the mechanisms generating the tuning to the sound localization cue. We found that the correlated variability among neurons strongly depended on the shape of the tuning curves. Finally, we demonstrated that the change in the neurons' selectivity was necessary and sufficient for a network of stochastic neurons to predict behavior when sensory cues were corrupted with noise. This study demonstrates that the shape of tuning curves can stand alone as a coding dimension of environmental statistics. In natural environments, sensory cues are often corrupted by noise and are therefore unreliable. To make the best decisions, the brain must estimate the degree to which a cue can be trusted. The behaviorally relevant neural correlates of cue reliability are debated. In this study, we used the barn owl's sound localization system to address this question. We demonstrated that the mechanisms that account for spatial selectivity also explained how neural responses changed with degraded signals. This allowed for the neurons' selectivity to capture cue reliability, influencing the population readout commanding the owl's sound-orienting behavior. Copyright © 2016 the authors 0270-6474/16/362101-10$15.00/0.

  14. Aspect Of Reliability In Airport Business Continuity Management

    Directory of Open Access Journals (Sweden)

    Kozłowski Michał

    2015-11-01

    Full Text Available The paper presents the issue of ensuring the continuity of operations at the airport. Requirements and objectives relating to business continuity management have been defined in accordance with the ISO 22301 international standard. A study of the reliability issues of airport operation was conducted. The reliability and operational readiness functions of the airport were defined. The concept of using the operational readiness function in the risk assessment for the continuity of airport operations is presented.

  15. Highly Reliable PON Optical Splitters for Optical Access Networks in Outside Environments

    Science.gov (United States)

    Watanabe, Hiroshi; Araki, Noriyuki; Fujimoto, Hisashi

    Broadband optical access services are spreading throughout the world, and the number of fiber to the home (FTTH) subscribers is increasing rapidly. Telecom operators are constructing passive optical networks (PONs) to provide optical access services. Externally installed optical splitters for PONs are very important passive devices in optical access networks, and they must provide satisfactory performance as outdoor plant over long periods. Therefore, we calculate the failure rate of optical access networks and assign a failure rate to the optical splitters in optical access networks. The maximum cumulative failure rate of 1 × 8 optical splitters was calculated as 0.025 for an optical access fiber length of 2.1km and a 20-year operating lifetime. We examined planar lightwave circuit (PLC) type optical splitters for use as outside plant in terms of their optical characteristics and environmental reliability. We confirmed that PLC type optical splitters have sufficient optical performance for a PON splitter and sufficient reliability as outside plant in accordance with ITU-T standard values. We estimated the lifetimes of three kinds of PLC type optical splitters by using accelerated aging tests. The estimated failure rate of these splitters installed in optical access networks was below the target value for the cumulative failure rate, and we confirmed that they have sufficient reliability to maintain the quality of the network service. We developed 1 × 8 optical splitter modules with plug and socket type optical connectors and optical fiber cords for optical aerial closures designed for use as outside plant. These technologies make it easy to install optical splitters in an aerial optical closure. The optical splitter modules have sufficient optical performance levels for PONs because the insertion loss at the commercially used wavelengths of 1.31 and 1.55µm is less than the criterion established by ITU-T Recommendation G.671 for optical splitters. We performed a

  16. Impact of Market Behavior, Fleet Composition, and Ancillary Services on Revenue Sufficiency

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gallo, Giulia [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Milligan, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bloom, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-06-01

    Revenue insufficiency, or the missing money problem, occurs when the revenues that generators earn from the market are not sufficient to cover both fixed and variable costs to remain in the market and/or justify investments in new capacity, which may be needed for reliability. The near-zero marginal cost of variable renewable generators further exacerbates these revenue challenges. Estimating the extent of the missing money problem in current electricity markets is an important, nontrivial task that requires representing both how the power system operates and how market participants behave. This paper explores the missing money problem using a production cost model that represented a simplified version of the Electric Reliability Council of Texas (ERCOT) energy-only market for the years 2012-2014. We evaluate how various market structures -- including market behavior, ancillary services, and changing fleet compositions -- affect net revenues in this ERCOT-like system. In most production cost modeling exercises, resources are assumed to offer their marginal capabilities at marginal costs. Although this assumption is reasonable for feasibility studies and long-term planning, it does not adequately consider the market behaviors that impact revenue sufficiency. In this work, we simulate a limited set of market participant strategic bidding behaviors by means of different sets of markups; these markups are applied to the true production costs of all gas generators, which are the most prominent generators in ERCOT. Results show that markups can help generators increase their net revenues overall, although net revenues may increase or decrease depending on the technology and the year under study. Results also confirm that conventional, variable-cost-based production cost simulations do not capture prices accurately, and this particular feature calls for proxies for strategic behaviors (e.g., markups) and more accurate representations of how electricity markets work. The

  17. Conceptual Software Reliability Prediction Models for Nuclear Power Plant Safety Systems

    International Nuclear Information System (INIS)

    Johnson, G.; Lawrence, D.; Yu, H.

    2000-01-01

    The objective of this project is to develop a method to predict the potential reliability of software to be used in a digital system instrumentation and control system. The reliability prediction is to make use of existing measures of software reliability such as those described in IEEE Std 982 and 982.2. This prediction must be of sufficient accuracy to provide a value for uncertainty that could be used in a nuclear power plant probabilistic risk assessment (PRA). For the purposes of the project, reliability was defined to be the probability that the digital system will successfully perform its intended safety function (for the distribution of conditions under which it is expected to respond) upon demand with no unintended functions that might affect system safety. The ultimate objective is to use the identified measures to develop a method for predicting the potential quantitative reliability of a digital system. The reliability prediction models proposed in this report are conceptual in nature. That is, possible prediction techniques are proposed and trial models are built, but in order to become a useful tool for predicting reliability, the models must be tested, modified according to the results, and validated. Using methods outlined by this project, models could be constructed to develop reliability estimates for elements of software systems. This would require careful review and refinement of the models, development of model parameters from actual experience data or expert elicitation, and careful validation. By combining these reliability estimates (generated from the validated models for the constituent parts) in structural software models, the reliability of the software system could then be predicted. Modeling digital system reliability will also require that methods be developed for combining reliability estimates for hardware and software. System structural models must also be developed in order to predict system reliability based upon the reliability

  18. Reliability Analysis of Wireless Sensor Networks Using Markovian Model

    Directory of Open Access Journals (Sweden)

    Jin Zhu

    2012-01-01

    Full Text Available This paper investigates reliability analysis of wireless sensor networks whose topology is switching among possible connections which are governed by a Markovian chain. We give the quantized relations between network topology, data acquisition rate, nodes' calculation ability, and network reliability. By applying Lyapunov method, sufficient conditions of network reliability are proposed for such topology switching networks with constant or varying data acquisition rate. With the conditions satisfied, the quantity of data transported over wireless network node will not exceed node capacity such that reliability is ensured. Our theoretical work helps to provide a deeper understanding of real-world wireless sensor networks, which may find its application in the fields of network design and topology control.

  19. Assessment of the human factor in the quantification of technical system reliability taking into consideration cognitive-causal aspects. Partial project 2. Modeling of the human behavior for reliability considerations. Final report

    International Nuclear Information System (INIS)

    Jennerich, Marco; Imbsweiler, Jonas; Straeter, Oliver; Arenius, Marcus

    2015-03-01

    cognitive load profiles can be used to assess the reliability of human behavior. The cross-domain database created also allows the evaluation of failure mechanisms and the weighting of the influencing factors. For the validation of this methodology, a cognitive load profile for a nuclear-specific reference scenario was defined: a steam generator tube rupture during the start-up procedure (reactor power <20%) without detection of the N16 signal and, as an additional disturbance, a damaged valve seat in the feed water supply of an intact steam generator. In this scenario, the identification of the defective steam generator is difficult for the control room personnel. It may lead to the isolation of the wrong, still functioning steam generator, with a sustained loss of coolant from the primary into the secondary circuit and possibly, later in the scenario, to insufficient core cooling. The defined scenarios were reviewed, and the validity of the results was discussed and validated in cooperation with instructors of KSG (Kraftwerks-Simulator-Gesellschaft mbH). The findings were quantified and provided to the project partner, the Institute of Process Engineering, Process Automation and Instrumentation at the University of Applied Sciences Zittau/Goerlitz, for implementation in their fuzzy set approach. Overall, an approach has been developed to support probabilistic safety analysis with event data. Owing to the cross-domain data in the event database, the developed methodology can also be used in other safety-critical industries and can make a substantial contribution to safety in these high-reliability industries.

  20. Dissipativity-Based Reliable Control for Fuzzy Markov Jump Systems With Actuator Faults.

    Science.gov (United States)

    Tao, Jie; Lu, Renquan; Shi, Peng; Su, Hongye; Wu, Zheng-Guang

    2017-09-01

    This paper is concerned with the problem of reliable dissipative control for Takagi-Sugeno fuzzy systems with Markov jumping parameters. Considering the influence of actuator faults, a sufficient condition is developed to ensure that the resultant closed-loop system is stochastically stable and strictly (Q,S,R)-dissipative, based on a relaxed approach in which mode-dependent and fuzzy-basis-dependent Lyapunov functions are employed. Then a reliable dissipative control for fuzzy Markov jump systems is designed, with a sufficient condition proposed for the existence of a controller with guaranteed stability and dissipativity. The effectiveness and potential of the obtained design method are verified by two simulation examples.

  1. Measuring the construct of executive control in schizophrenia: defining and validating translational animal paradigms for discovery research.

    Science.gov (United States)

    Gilmour, Gary; Arguello, Alexander; Bari, Andrea; Brown, Verity J; Carter, Cameron; Floresco, Stan B; Jentsch, David J; Tait, David S; Young, Jared W; Robbins, Trevor W

    2013-11-01

    Executive control is an aspect of cognitive function known to be impaired in schizophrenia. Previous meetings of the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) group have more precisely defined executive control in terms of two constructs: "rule generation and selection", and "dynamic adjustments of control". Next, human cognitive tasks that may effectively measure performance with regard to these constructs were identified to be developed into practical and reliable measures for use in treatment development. The aim of this round of CNTRICS meetings was to define animal paradigms that have sufficient promise to warrant further investigation for their utility in measuring these constructs. Accordingly, "reversal learning" and the "attentional set-shifting task" were nominated to assess the construct of rule generation and selection, and the "stop signal task" for the construct of dynamic adjustments of control. These tasks are described in more detail here, with a particular focus on their utility for drug discovery efforts. Presently, each assay has strengths and weaknesses with regard to this point and increased emphasis on improving practical aspects of testing, understanding predictive validity, and defining biomarkers of performance represent important objectives in attaining confidence in translational validity here. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  2. Is One Trial Sufficient to Obtain Excellent Pressure Pain Threshold Reliability in the Low Back of Asymptomatic Individuals? A Test-Retest Study.

    Science.gov (United States)

    Balaguier, Romain; Madeleine, Pascal; Vuillerme, Nicolas

    2016-01-01

    The assessment of pressure pain threshold (PPT) provides a quantitative value related to the mechanical sensitivity to pain of deep structures. Although excellent reliability of PPT has been reported in numerous anatomical locations, its absolute and relative reliability in the lower back region remains to be determined. Because of the high prevalence of low back pain in the general population and because low back pain is one of the leading causes of disability in industrialized countries, assessing pressure pain thresholds over the low back is of particular interest. The purposes of this study were (1) to evaluate the intra- and inter-session absolute and relative reliability of PPT within 14 locations covering the low back region of asymptomatic individuals and (2) to determine the number of trials required to ensure reliable PPT measurements. Fifteen asymptomatic subjects were included in this study. PPTs were assessed among 14 anatomical locations in the low back region over two sessions separated by a one-hour interval. For the two sessions, three PPT assessments were performed at each location. Reliability was assessed by computing intraclass correlation coefficients (ICC), standard error of measurement (SEM) and minimum detectable change (MDC) for all possible combinations between trials and sessions. Bland-Altman plots were also generated to assess potential bias in the dataset. Relative reliability for both intra- and inter-session was almost perfect, with ICCs ranging from 0.85 to 0.99. With respect to the intra-session comparisons, no statistical difference was reported for ICCs and SEM regardless of the comparisons conducted between trials. Conversely, for inter-session comparisons, ICC and SEM values were significantly larger when two consecutive PPT measurements were used for data analysis. No significant difference was observed for the comparison between two consecutive measurements and three measurements. Excellent relative and absolute reliabilities were reported for both intra
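    The reliability statistics named in this abstract can be reproduced in outline with a short script; the sketch below assumes the common ICC(3,1) formulation, SEM = SD * sqrt(1 - ICC), and MDC95 = 1.96 * sqrt(2) * SEM, and the pressure pain threshold values are made up rather than the study's data.

      # Hedged sketch of the test-retest statistics mentioned above: ICC(3,1),
      # SEM = SD * sqrt(1 - ICC), and MDC95 = 1.96 * sqrt(2) * SEM.  The pressure
      # pain threshold values (kPa) below are synthetic, for illustration only.
      import numpy as np

      data = np.array([  # rows = subjects, columns = repeated PPT trials
          [310., 320., 305.],
          [450., 440., 455.],
          [280., 290., 285.],
          [390., 400., 395.],
          [510., 500., 520.],
      ])

      def icc_3_1(x):
          """ICC(3,1) from a two-way ANOVA without replication (consistency form)."""
          n, k = x.shape
          grand = x.mean()
          ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
          ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
          ss_total = ((x - grand) ** 2).sum()
          ms_rows = ss_rows / (n - 1)
          ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

      icc = icc_3_1(data)
      sem = data.std(ddof=1) * np.sqrt(1.0 - icc)   # one common convention: pooled SD
      mdc = 1.96 * np.sqrt(2.0) * sem               # minimum detectable change (95%)
      print(f"ICC(3,1) = {icc:.3f}, SEM = {sem:.1f} kPa, MDC95 = {mdc:.1f} kPa")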

  3. Requirements on qualification, competence and sufficient number of personnel for NPP operation

    International Nuclear Information System (INIS)

    Simon, M.

    2004-01-01

    The safe operation of NPPs presupposes qualified personnel on site in sufficient numbers. While the acquisition and preservation of technical expertise and the qualification of shift personnel and other staff are well regulated by regulatory guidelines in Germany, there is a lack of such regulations (with the exception of shift personnel) for the minimum number of technical personnel required for the safe operation of an NPP. By order of the BMU, this study attempted to work out the requirements for the qualification, competence and number of personnel to be maintained at the plant, representing the minimum requirements for the safe operation of an NPP. The scope of the project was restricted to requirements for technical plant personnel. The aim was to work out requirements that would be as independent as possible of the existing organisation of a particular power plant. The study therefore does not assume a given organisational structure but is instead oriented on the work processes in an NPP, which are the basis for planning and performing routine work in the plant. For the study, a work process model of typical tasks in an NPP had to be developed. The tasks to be performed within the work processes so defined were then described (task profiles) on the basis of existing manuals for plant organisation. From these task profiles, tasks that should not be delegated to external personnel for specific reasons were selected and designated as vital competences. To keep these vital competences at the plant, the necessary number of technical plant personnel was assessed and/or calculated using the task profiles for responsible personnel and by evaluating thousands of work orders for maintenance personnel. On the basis of these data, a proposal was made for the minimum number of technical personnel necessary to operate an NPP unit safely. Besides this number, general criteria were developed which should be

  4. Determinants of 25(OH)D sufficiency in obese minority children: selecting outcome measures and analytic approaches.

    Science.gov (United States)

    Zhou, Ping; Schechter, Clyde; Cai, Ziyong; Markowitz, Morri

    2011-06-01

    To highlight complexities in defining vitamin D sufficiency in children. Serum 25-(OH) vitamin D [25(OH)D] levels from 140 healthy obese children aged 6 to 21 years living in the inner city were compared with multiple health outcome measures, including bone biomarkers and cardiovascular risk factors. Several statistical analytic approaches were used, including Pearson correlation, analysis of covariance (ANCOVA), and "hockey stick" regression modeling. Potential threshold levels for vitamin D sufficiency varied by outcome variable and analytic approach. Only systolic blood pressure (SBP) was significantly correlated with 25(OH)D (r = -0.261; P = .038). ANCOVA revealed that SBP and triglyceride levels differed significantly between the test groups defined by a 25(OH)D cutoff of 25 ng/mL, and ANCOVA also showed an effect only in children with severe vitamin D deficiency. "Hockey stick" regression analyses found evidence of a threshold level in SBP, with a 25(OH)D breakpoint of 27 ng/mL, along with a 25(OH)D breakpoint of 18 ng/mL for triglycerides, but no relationship between 25(OH)D and parathyroid hormone. Defining vitamin D sufficiency should take into account different vitamin D-related health outcome measures and analytic methodologies. Copyright © 2011 Mosby, Inc. All rights reserved.
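    A generic sketch of "hockey stick" (two-segment) regression of the kind named above: for each candidate breakpoint, fit a two-piece linear model and keep the breakpoint with the smallest residual error. The data below are synthetic, not the study's, and the fitting details are one plausible implementation rather than the authors' exact procedure.

      # Hedged sketch of "hockey stick" (two-segment) regression: scan candidate
      # breakpoints, fit a sloped-then-flat model at each, and keep the breakpoint
      # with the smallest residual sum of squares.  Synthetic data only.
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(5, 45, 140)                       # e.g. 25(OH)D in ng/mL (synthetic)
      y = np.where(x < 27, 120 - 1.0 * (27 - x), 120) + rng.normal(0, 4, x.size)

      def fit_hockey_stick(x, y, candidates):
          best = None
          for b in candidates:
              below = np.clip(b - x, 0, None)           # slope active only below breakpoint
              X = np.column_stack([np.ones_like(x), below])
              beta, *_ = np.linalg.lstsq(X, y, rcond=None)
              rss = float(((y - X @ beta) ** 2).sum())
              if best is None or rss < best[1]:
                  best = (b, rss, beta)
          return best

      bp, rss, beta = fit_hockey_stick(x, y, np.arange(10, 40, 0.5))
      print(f"estimated breakpoint ~ {bp:.1f} ng/mL (true synthetic value: 27)")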

  5. Test-retest reliability of cognitive EEG

    Science.gov (United States)

    McEvoy, L. K.; Smith, M. E.; Gevins, A.

    2000-01-01

    OBJECTIVE: Task-related EEG is sensitive to changes in cognitive state produced by increased task difficulty and by transient impairment. If task-related EEG has high test-retest reliability, it could be used as part of a clinical test to assess changes in cognitive function. The aim of this study was to determine the reliability of the EEG recorded during the performance of a working memory (WM) task and a psychomotor vigilance task (PVT). METHODS: EEG was recorded while subjects rested quietly and while they performed the tasks. Within session (test-retest interval of approximately 1 h) and between session (test-retest interval of approximately 7 days) reliability was calculated for four EEG components: frontal midline theta at Fz, posterior theta at Pz, and slow and fast alpha at Pz. RESULTS: Task-related EEG was highly reliable within and between sessions (r > 0.9 for all components in WM task, and r > 0.8 for all components in the PVT). Resting EEG also showed high reliability, although the magnitude of the correlation was somewhat smaller than that of the task-related EEG (r > 0.7 for all 4 components). CONCLUSIONS: These results suggest that under appropriate conditions, task-related EEG has sufficient retest reliability for use in assessing clinical changes in cognitive status.

  6. Reliability Modeling of Double Beam Bridge Crane

    Science.gov (United States)

    Han, Zhu; Tong, Yifei; Luan, Jiahui; Xiangdong, Li

    2018-05-01

    This paper briefly describes the structure of the double beam bridge crane and defines its basic parameters. According to the structure and system division of the double beam bridge crane, the reliability architecture of the double beam bridge crane system is proposed, and the reliability mathematical model is constructed.

  7. Quality assurance and reliability in the Japanese electronics industry

    Science.gov (United States)

    Pecht, Michael; Boulton, William R.

    1995-02-01

    Quality and reliability are two attributes required for all Japanese products, although the JTEC panel found these attributes to be secondary to customer cost requirements. While our Japanese hosts gave presentations on the challenges of technology, cost, and miniaturization, quality and reliability were infrequently the focus of our discussions. Quality and reliability were assumed to be sufficient to meet customer needs. Fujitsu's slogan, 'quality built-in, with cost and performance as prime consideration,' illustrates this point. Sony's definition of a next-generation product is 'one that is going to be half the size and half the price at the same performance of the existing one'. Quality and reliability are so integral to Japan's electronics industry that they need no new emphasis.

  8. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.

  9. Energy self-sufficient sensory ball screw drive; Energieautarker sensorischer Kugelgewindetrieb

    Energy Technology Data Exchange (ETDEWEB)

    Bertram, Oliver

    2012-07-01

    Nowadays the availability of machine tools plays a decisive role in the competition to increase productivity. The state of the art shows that, because of abrasive wear, ball screw drives are the components of feed drives most prone to failure. Furthermore, condition monitoring makes it possible to avoid unplanned machine failures and to increase the availability of the deployed production facility. The application of additional sensors allows the direct acquisition of wear-correlated measurements. To reduce the required integration effort and to increase robustness, reliability and clarity in an industrial environment, energy self-sufficient sensor systems can be applied. This thesis describes the development and investigation of an energy self-sufficient sensory ball screw drive with direct measurement of the wear-correlated pretension for condition monitoring applications. The prototype measures the pretension with force sensors based on strain gauges. The sensor system includes microcontroller-based electronics for signal processing as well as wireless data transmission using the ZigBee standard. A hybrid system ensures the energy supply of the sensor system: on the one hand, a stepper motor generator produces electrical energy from the motion of the ball screw drive; on the other hand, an energy buffer based on supercapacitors is recharged in the stationary position by wireless energy transmission. For verification a prototype system was built. The sensory and energetic characteristics of the energy self-sufficient sensor system were analyzed in measurements. Moreover, the functionality of the ball screw drive as well as the signal characteristics of the force sensors were examined for different pretensions. In addition, pretension losses due to wear were established in endurance trials, which means that timely maintenance can be planned.

  10. Reliability analysis - systematic approach based on limited data

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-11-01

    The initial approaches required for reliability analysis are outlined. These approaches highlight the system boundaries, examine the conditions under which the system is required to operate, and define the overall performance requirements. The discussion is illustrated by a simple example of an automatic protective system for a nuclear reactor. It is then shown how the initial approach leads to a method of defining the system, establishing performance parameters of interest and determining the general form of reliability models to be used. The overall system model and the availability of reliability data at the system level are next examined. An iterative process is then described whereby the reliability model and data requirements are systematically refined at progressively lower hierarchic levels of the system. At each stage, the approach is illustrated with examples from the protective system previously described. The main advantages of the approach put forward are the systematic process of analysis, the concentration of assessment effort in the critical areas and the maximum use of limited reliability data. (author)

  11. Joint interval reliability for Markov systems with an application in transmission line reliability

    International Nuclear Information System (INIS)

    Csenki, Attila

    2007-01-01

    We consider Markov reliability models whose finite state space is partitioned into the set of up states U and the set of down states D. Given a collection of k disjoint time intervals I_l = [t_l, t_l + x_l], l = 1, ..., k, the joint interval reliability is defined as the probability of the system being in U for all time instances in I_1 ∪ ... ∪ I_k. A closed-form expression is derived here for the joint interval reliability for this class of models. The result is applied to power transmission lines in a two-state fluctuating environment. We use the Linux versions of the free packages Maxima and Scilab in our implementation for symbolic and numerical work, respectively.
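    A hedged numerical sketch of the joint interval reliability defined here (not the paper's Maxima/Scilab implementation): within each interval the process must stay in the up states, governed by the sub-generator Q_UU, while between intervals it evolves freely under the full generator Q. The two-state failure/repair rates below are illustrative, and SciPy is assumed to be available.

      # Hedged sketch of the joint interval reliability for a Markov model: the chance
      # of being "up" throughout every interval I_l = [t_l, t_l + x_l].  Inside an
      # interval the process is confined to the up states (sub-generator Q_UU);
      # between intervals it evolves freely under the full generator Q.  The
      # transmission-line rates (per hour) are illustrative only.
      import numpy as np
      from scipy.linalg import expm

      lam, mu = 0.001, 0.1                      # hypothetical failure / repair rates
      Q = np.array([[-lam, lam],
                    [  mu, -mu]])               # state 0 = up, state 1 = down
      U = [0]                                   # indices of up states
      alpha = np.array([1.0, 0.0])              # start in the up state

      def joint_interval_reliability(Q, U, alpha, intervals):
          n = Q.shape[0]
          Quu = Q[np.ix_(U, U)]
          v = alpha.copy()
          t_prev = 0.0
          for (t, x) in intervals:              # intervals must be disjoint and ordered
              v = v @ expm(Q * (t - t_prev))    # free evolution up to the interval start
              vU = v[U] @ expm(Quu * x)         # stay within U for the whole interval
              v = np.zeros(n)
              v[U] = vU
              t_prev = t + x
          return v[U].sum()

      print(joint_interval_reliability(Q, U, alpha, [(24.0, 8.0), (72.0, 8.0)]))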

  12. Engineering Design Handbook: Development Guide for Reliability. Part Three. Reliability Prediction

    Science.gov (United States)

    1976-01-01

    to t is pa(t) = 1 - qa(t) (10-6). This is the reliability of being closed, defined for this interval. 2. The probability that a contact will be open ... Monte Carlo simulation. Few people can know all about all available programs. Specialists can assist in selecting a few from the available many

  13. The dynamic simulation model of soybean in Central Java to support food self sufficiency: A supply chain perspective

    Science.gov (United States)

    Oktyajati, Nancy; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

    Since food is one of the basic human needs for survival, food sufficiency is very important. The sufficiency of the soybean commodity in Central Java still depends on imported soybean. The supply of soybean is insufficient because there is a large gap between local soybean production and its demand. In 2016 the shortage of soybean supply was as much as 68.79%. Soybean is an important and strategic commodity after rice and corn. The increasing consumption of soybean is related to the increasing population, increasing incomes, and the change to a healthy lifestyle. The aims of this study are to build a dynamic model of soybean from a supply chain perspective, to define the proper price of local soybean to trigger an increase in local production, and to define alternative solutions to support food self-sufficiency. The study captures the real conditions in a dynamic model and then simulates a series of scenarios in a computer program to obtain the best results. Two scenarios are considered: the first with a government intervention policy and the second without. The best alternative can be used by the government as a consideration for policy. The results of the proposed scenarios show that soybean self-sufficiency can be achieved within the next 20 years by increasing the planting area by 4% and land productivity by 1% per year.
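    The growth arithmetic behind the reported scenario (planting area +4% per year and land productivity +1% per year) can be sketched as a simple projection; the baseline coverage, the assumption of constant demand, and the horizon below are hypothetical placeholders, not the study's calibrated system dynamics model.

      # Hedged sketch of the projection logic, not the study's calibrated model:
      # local production grows with planting area (+4%/yr) and land productivity
      # (+1%/yr); self-sufficiency is reached when production covers demand.
      # Baseline figures and the constant-demand assumption are placeholders.
      def years_to_self_sufficiency(production, demand, area_growth=0.04,
                                    yield_growth=0.01, demand_growth=0.0, horizon=50):
          for year in range(1, horizon + 1):
              production *= (1 + area_growth) * (1 + yield_growth)
              demand *= (1 + demand_growth)
              if production >= demand:
                  return year
          return None

      # A 68.79% supply shortage implies local production covers about 31.21% of demand.
      print(years_to_self_sufficiency(production=31.21, demand=100.0))

    With that baseline and constant demand, the toy projection closes the gap in roughly 24 years, the same order of magnitude as the roughly 20 years reported for the proposed scenarios.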

  14. Analogical reasoning for reliability analysis based on generic data

    Energy Technology Data Exchange (ETDEWEB)

    Kozin, Igor O

    1996-10-01

    The paper suggests using the systemic concept 'analogy' for the foundation of an approach to analyze system reliability on the basis of generic data, describing the method of structuring the set that defines analogical models, an approach of transition from the analogical model to a reliability model and a way of obtaining reliability intervals of analogous objects.

  15. Analogical reasoning for reliability analysis based on generic data

    International Nuclear Information System (INIS)

    Kozin, Igor O.

    1996-01-01

    The paper suggests using the systemic concept 'analogy' for the foundation of an approach to analyze system reliability on the basis of generic data, describing the method of structuring the set that defines analogical models, an approach of transition from the analogical model to a reliability model and a way of obtaining reliability intervals of analogous objects

  16. Power system reliability in supplying nuclear reactors

    International Nuclear Information System (INIS)

    Gad, M.M.M.

    2007-01-01

    This thesis presents a simple technique for deducing the minimal cut sets (MCS) from the defined minimal path sets (MPS) of a generic distribution system, and this technique has been used to evaluate the basic reliability indices of the electrical distribution network of Egypt's second research reactor (ETRR-2). Alternative system configurations are then studied to evaluate their impact on service reliability. The proposed MCS approach considers both sustained and temporary outages. Temporary outages constitute an important parameter in characterizing the system reliability indices for critical load points in a distribution system. The approach also considers the impact of power quality on the reliability indices.

  17. Reliability of the ECHOWS Tool for Assessment of Patient Interviewing Skills.

    Science.gov (United States)

    Boissonnault, Jill S; Evans, Kerrie; Tuttle, Neil; Hetzel, Scott J; Boissonnault, William G

    2016-04-01

    History taking is an important component of patient/client management. Assessment of student history-taking competency can be achieved via a standardized tool. The ECHOWS tool has been shown to be valid with modest intrarater reliability in a previous study but did not demonstrate sufficient power to definitively prove its stability. The purposes of this study were: (1) to assess the reliability of the ECHOWS tool for student assessment of patient interviewing skills and (2) to determine whether the tool discerns between novice and experienced skill levels. A reliability and construct validity assessment was conducted. Three faculty members from the United States and Australia scored videotaped histories from standardized patients taken by students and experienced clinicians from each of these countries. The tapes were scored twice, 3 to 6 weeks apart. Reliability was assessed using intraclass correlation coefficients (ICCs), and repeated-measures analysis of variance models assessed the ability of the tool to discern between novice and experienced skill levels. The ECHOWS tool showed excellent intrarater reliability (ICC [3,1]=.74-.89) and good interrater reliability (ICC [2,1]=.55) as a whole. The summary of performance (S) section showed poor interrater reliability (ICC [2,1]=.27). There was no statistical difference in performance on the tool between novice and experienced clinicians. A possible ceiling effect may occur when standardized patients are not coached to provide complex and obtuse responses to interviewer questions. Variation in familiarity with the ECHOWS tool and in use of the online training may have influenced scoring of the S section. The ECHOWS tool demonstrates excellent intrarater reliability and moderate interrater reliability. Sufficient training with the tool prior to student assessment is recommended. The S section must evolve in order to provide a more discerning measure of interviewing skills. © 2016 American Physical Therapy

  18. Mobile phone radiation health risk controversy: the reliability and sufficiency of science behind the safety standards.

    Science.gov (United States)

    Leszczynski, Dariusz; Xu, Zhengping

    2010-01-27

    There is ongoing discussion whether mobile phone radiation causes any health effects. The International Commission on Non-Ionizing Radiation Protection, the International Committee on Electromagnetic Safety and the World Health Organization are assuring that there is no proven health risk and that the present safety limits protect all mobile phone users. However, based on the available scientific evidence, the situation is not as clear. The majority of the evidence comes from in vitro laboratory studies and is of very limited use for determining health risk. Animal toxicology studies are inadequate because it is not possible to "overdose" microwave radiation, as is done with chemical agents, due to the simultaneous induction of heating side-effects. There is a lack of human volunteer studies that would, in an unbiased way, demonstrate whether the human body responds at all to mobile phone radiation. Finally, the epidemiological evidence is insufficient due to, among others, selection and misclassification bias and the low sensitivity of this approach in the detection of health risk within the population. This indicates that the presently available scientific evidence is insufficient to prove the reliability of the current safety standards. Therefore, we recommend using precaution when dealing with mobile phones and, whenever possible and feasible, limiting body exposure to this radiation. Continuation of the research on mobile phone radiation effects is needed in order to improve the basis and the reliability of the safety standards.

  19. Mobile phone radiation health risk controversy: the reliability and sufficiency of science behind the safety standards

    Directory of Open Access Journals (Sweden)

    Leszczynski Dariusz

    2010-01-01

    Full Text Available Abstract There is ongoing discussion whether mobile phone radiation causes any health effects. The International Commission on Non-Ionizing Radiation Protection, the International Committee on Electromagnetic Safety and the World Health Organization are assuring that there is no proven health risk and that the present safety limits protect all mobile phone users. However, based on the available scientific evidence, the situation is not as clear. The majority of the evidence comes from in vitro laboratory studies and is of very limited use for determining health risk. Animal toxicology studies are inadequate because it is not possible to "overdose" microwave radiation, as is done with chemical agents, due to the simultaneous induction of heating side-effects. There is a lack of human volunteer studies that would, in an unbiased way, demonstrate whether the human body responds at all to mobile phone radiation. Finally, the epidemiological evidence is insufficient due to, among others, selection and misclassification bias and the low sensitivity of this approach in the detection of health risk within the population. This indicates that the presently available scientific evidence is insufficient to prove the reliability of the current safety standards. Therefore, we recommend using precaution when dealing with mobile phones and, whenever possible and feasible, limiting body exposure to this radiation. Continuation of the research on mobile phone radiation effects is needed in order to improve the basis and the reliability of the safety standards.

  20. Development of a highly reliable CRT processor

    International Nuclear Information System (INIS)

    Shimizu, Tomoya; Saiki, Akira; Hirai, Kenji; Jota, Masayoshi; Fujii, Mikiya

    1996-01-01

    Although CRT processors have been employed by the main control board to reduce the operator's workload during monitoring, the control systems are still operated by hardware switches. For further advancement, direct controller operation through a display device is expected. A CRT processor providing direct controller operation must be as reliable as the hardware switches are. The authors are developing a new type of highly reliable CRT processor that enables direct controller operations. In this paper, we discuss the design principles behind a highly reliable CRT processor. The principles are defined by studies of software reliability and of the functional reliability of the monitoring and operation systems. The functional configuration of an advanced CRT processor is also addressed. (author)

  1. Time-dependent reliability sensitivity analysis of motion mechanisms

    International Nuclear Information System (INIS)

    Wei, Pengfei; Song, Jingwen; Lu, Zhenzhou; Yue, Zhufeng

    2016-01-01

    Reliability sensitivity analysis aims at identifying the source of structure/mechanism failure, and quantifying the effects of each random source or their distribution parameters on failure probability or reliability. In this paper, the time-dependent parametric reliability sensitivity (PRS) analysis as well as the global reliability sensitivity (GRS) analysis is introduced for the motion mechanisms. The PRS indices are defined as the partial derivatives of the time-dependent reliability w.r.t. the distribution parameters of each random input variable, and they quantify the effect of a small change of each distribution parameter on the time-dependent reliability. The GRS indices are defined for quantifying the individual, interaction and total contributions of the uncertainty in each random input variable to the time-dependent reliability. The envelope function method combined with the first order approximation of the motion error function is introduced for efficiently estimating the time-dependent PRS and GRS indices. Both the time-dependent PRS and GRS analysis techniques can be especially useful for reliability-based design. The significance of the proposed methods and the effectiveness of the envelope function method for estimating the time-dependent PRS and GRS indices are demonstrated with a four-bar mechanism and a car rack-and-pinion steering linkage. - Highlights: • Time-dependent parametric reliability sensitivity analysis is presented. • Time-dependent global reliability sensitivity analysis is presented for mechanisms. • The proposed method is especially useful for enhancing the kinematic reliability. • An envelope method is introduced for efficiently implementing the proposed methods. • The proposed method is demonstrated by two real planar mechanisms.
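    A generic sketch of the parametric reliability sensitivity idea (not the authors' envelope-function method): estimate the time-dependent failure probability by Monte Carlo, then take a central finite difference with respect to a distribution parameter while reusing the same random draws. The toy motion-error model, threshold, and distribution parameters are made up.

      # Hedged sketch of the parametric reliability sensitivity (PRS) idea, not the
      # authors' envelope-function method: estimate the time-dependent failure
      # probability by Monte Carlo, then take a central finite difference with
      # respect to a distribution parameter, reusing the same standard-normal draws
      # (common random numbers).  The toy motion-error model and numbers are made up.
      import numpy as np

      t = np.linspace(0.0, 2.0 * np.pi, 100)            # time grid over one motion cycle
      z = np.random.default_rng(1).standard_normal((20_000, 2))

      def failure_probability(mu1, mu2, sigma=(0.05, 0.05), threshold=1.15):
          x1 = mu1 + sigma[0] * z[:, 0]
          x2 = mu2 + sigma[1] * z[:, 1]
          error = np.abs(np.outer(x1, np.sin(3.0 * t)) + np.outer(x2, np.cos(t)))
          return float(np.mean(error.max(axis=1) > threshold))  # fails if error ever exceeds limit

      mu1, mu2, h = 1.0, 0.1, 1e-2
      pf = failure_probability(mu1, mu2)
      prs = (failure_probability(mu1 + h, mu2) - failure_probability(mu1 - h, mu2)) / (2 * h)
      print(f"Pf over the cycle ~ {pf:.4f}, dPf/d(mu1) ~ {prs:.3f}")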

  2. Joint interval reliability for Markov systems with an application in transmission line reliability

    Energy Technology Data Exchange (ETDEWEB)

    Csenki, Attila [School of Computing and Mathematics, University of Bradford, Bradford, West Yorkshire, BD7 1DP (United Kingdom)]. E-mail: a.csenki@bradford.ac.uk

    2007-06-15

    We consider Markov reliability models whose finite state space is partitioned into the set of up states U and the set of down states D. Given a collection of k disjoint time intervals I_l = [t_l, t_l + x_l], l = 1, ..., k, the joint interval reliability is defined as the probability of the system being in U for all time instances in I_1 ∪ ... ∪ I_k. A closed-form expression is derived here for the joint interval reliability for this class of models. The result is applied to power transmission lines in a two-state fluctuating environment. We use the Linux versions of the free packages Maxima and Scilab in our implementation for symbolic and numerical work, respectively.

  3. 77 FR 64920 - Revisions to Reliability Standard for Transmission Vegetation Management

    Science.gov (United States)

    2012-10-24

    ... reliability of the Bulk Electric System.'' NERC defines ``System Operating Limit'' as ``[t]he value (such as... values or gives reason to revisit the Reliability Standard. Accordingly, consistent with the activity...] Revisions to Reliability Standard for Transmission Vegetation Management AGENCY: Federal Energy Regulatory...

  4. Developing Reliable Life Support for Mars

    Science.gov (United States)

    Jones, Harry W.

    2017-01-01

    A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and

  5. Application of genetic algorithm for reliability allocation in nuclear power plants

    International Nuclear Information System (INIS)

    Yang, Joon-Eon; Hwang, Mee-Jung; Sung, Tae-Yong; Jin, Youngho

    1999-01-01

    Reliability allocation is an optimization process of minimizing the total plant costs subject to the overall plant safety goal constraints. Reliability allocation was applied to determine the reliability characteristics of reactor systems, subsystems, major components and plant procedures that are consistent with a set of top-level performance goals; the core melt frequency, acute fatalities and latent fatalities. Reliability allocation can be performed to improve the design, operation and safety of new and/or existing nuclear power plants. Reliability allocation is a kind of a difficult multi-objective optimization problem as well as a global optimization problem. The genetic algorithm, known as one of the most powerful tools for most optimization problems, is applied to the reliability allocation problem of a typical pressurized water reactor in this article. One of the main problems of reliability allocation is defining realistic objective functions. Hence, in order to optimize the reliability of the system, the cost for improving and/or degrading the reliability of the system should be included in the reliability allocation process. We used techniques derived from the value impact analysis to define the realistic objective function in this article
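    A minimal sketch of the allocation idea (not the value-impact formulation used in the article): a genetic algorithm searches for component reliabilities that minimize a cost function while a penalty enforces the overall system goal. The series structure, cost weights, goal, and GA settings below are all hypothetical.

      # Hedged sketch of reliability allocation via a simple genetic algorithm: pick
      # component reliabilities that minimize cost subject to a system reliability
      # goal (handled as a penalty).  The series system, cost model, and GA settings
      # are illustrative only, not the PWR allocation described in the abstract.
      import numpy as np

      rng = np.random.default_rng(2)
      n_comp, goal = 4, 0.995
      cost_coef = np.array([1.0, 2.0, 1.5, 3.0])       # hypothetical cost weights

      def cost(r):
          return float((cost_coef / (1.0 - r)).sum())  # cost grows as reliability -> 1

      def fitness(r):
          r_sys = float(np.prod(r))                    # series system reliability
          penalty = 1e4 * max(0.0, goal - r_sys)       # penalize violating the goal
          return cost(r) + penalty

      pop = rng.uniform(0.95, 0.9999, size=(60, n_comp))
      for _ in range(300):
          scores = np.array([fitness(ind) for ind in pop])
          parents = pop[np.argsort(scores)[:30]]                       # selection
          children = parents[rng.integers(0, 30, 30)].copy()
          mask = rng.random(children.shape) < 0.5                      # uniform crossover
          children[mask] = parents[rng.integers(0, 30, 30)][mask]
          children += rng.normal(0.0, 0.002, children.shape)           # mutation
          pop = np.clip(np.vstack([parents, children]), 0.9, 0.9999)

      best = pop[np.argmin([fitness(ind) for ind in pop])]
      print("allocated reliabilities:", np.round(best, 4),
            "system:", round(float(np.prod(best)), 5))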

  6. A possibilistic uncertainty model in classical reliability theory

    International Nuclear Information System (INIS)

    De Cooman, G.; Capelle, B.

    1994-01-01

    The authors argue that a possibilistic uncertainty model can be used to represent linguistic uncertainty about the states of a system and of its components. Furthermore, the basic properties of the application of this model to classical reliability theory are studied. The notion of the possibilistic reliability of a system or a component is defined. Based on the concept of a binary structure function, the important notion of a possibilistic function is introduced. It allows the possibilistic reliability of a system to be calculated in terms of the possibilistic reliabilities of its components.
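    A toy sketch of how such a possibilistic system reliability could be computed (one plausible reading via the extension principle, not necessarily the authors' exact construction); the three-component system and possibility values are invented.

      # Hedged sketch of possibilistic reliability via the extension principle:
      # the possibility that the system works is the maximum, over all component
      # state vectors that the binary structure function maps to "working", of the
      # minimum of the component state possibilities (non-interactive components
      # assumed).  The structure and possibility values are illustrative only.
      from itertools import product

      poss = [            # possibility of each component being up (1) or down (0)
          {1: 1.0, 0: 0.2},
          {1: 0.9, 0: 0.4},
          {1: 1.0, 0: 0.1},
      ]

      def structure(x):
          """Example system: component 1 in series with (2 parallel 3)."""
          a, b, c = x
          return a and (b or c)

      def possibilistic_reliability(poss, structure):
          best = 0.0
          for x in product((0, 1), repeat=len(poss)):
              if structure(x):
                  best = max(best, min(p[s] for p, s in zip(poss, x)))
          return best

      print(possibilistic_reliability(poss, structure))   # 0.9 for these numbers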

  7. Structural Reliability of Wind Turbine Blades

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov

    turbine blades. The main purpose is to draw a clear picture of how reliability-based design of wind turbines can be done in practice. The objectives of the thesis are to create methodologies for efficient reliability assessment of composite materials and composite wind turbine blades, and to map...... the uncertainties in the processes, materials and external conditions that have an effect on the health of a composite structure. The study considers all stages in a reliability analysis, from defining models of structural components to obtaining the reliability index and calibration of partial safety factors...... by developing new models and standards or carrying out tests. The following aspects are covered in detail: ⋅ The probabilistic aspects of ultimate strength of composite laminates are addressed. Laminated plates are considered as a general structural reliability system where each layer in a laminate is a separate...

  8. A set of key questions to assess the stress among bank employees and its reliability

    Directory of Open Access Journals (Sweden)

    Alice Mannocci

    2017-03-01

    Full Text Available The aims of the present study are, first, to develop a clear and helpful tool to assess the occupational distress level of bank employees in Italy and, second, to assess the reliability of the tool. Eight sentences were selected after a consensus meeting that involved different professional figures. Seventy questionnaires were collected. The overall Cronbach’s alpha was 0.596, indicating sufficient reliability. The elimination of one sentence (“I haven’t time to dedicate myself to my hobbies/activities/stuff”) increases the alpha value from 0.596 to 0.620, thus reaching a fully sufficient score. The claim “The pace of change on the work place exceeds my capacity for adaptation” produces the largest change in the level of reliability (inter-item correlation = 0.528).
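    The alpha figures quoted above can be reproduced in outline with the standard Cronbach's alpha formula and an "alpha if item deleted" check; the sketch below uses synthetic responses, not the bank-employee data.

      # Hedged sketch of the analysis described above: Cronbach's alpha for an
      # eight-item scale and "alpha if item deleted", which shows whether dropping
      # an item raises internal consistency.  The Likert-type responses are synthetic.
      import numpy as np

      rng = np.random.default_rng(3)
      latent = rng.normal(0.0, 1.0, 70)                       # 70 respondents
      items = np.clip(np.round(3 + latent[:, None] + rng.normal(0, 1.2, (70, 8))), 1, 5)

      def cronbach_alpha(x):
          k = x.shape[1]
          return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

      print(f"overall alpha = {cronbach_alpha(items):.3f}")
      for j in range(items.shape[1]):
          print(f"alpha without item {j + 1}: {cronbach_alpha(np.delete(items, j, axis=1)):.3f}")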

  9. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature and multimodal (speech and face systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  10. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period
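The report's adaptive simulation procedure and its detailed load and degradation models are not reproduced here; the sketch below only illustrates the general idea of time-dependent reliability under strength degradation, using made-up lognormal strength, linear degradation, and Gumbel annual-maximum load assumptions.

```python
# Minimal Monte Carlo sketch of time-dependent structural reliability with strength
# degradation (illustrative only; all distributions and parameters are hypothetical).
import numpy as np

rng = np.random.default_rng(1)
n_sim, years = 100_000, 40

r0 = rng.lognormal(mean=np.log(100.0), sigma=0.10, size=n_sim)   # initial strength
degradation_rate = rng.normal(0.5, 0.1, size=n_sim).clip(min=0)  # strength loss per year
loads = rng.gumbel(loc=60.0, scale=8.0, size=(n_sim, years))     # annual extreme load

failed = np.zeros(n_sim, dtype=bool)
cumulative_pf = []
for t in range(years):
    strength_t = r0 - degradation_rate * (t + 1)
    failed |= loads[:, t] > strength_t          # failure if load exceeds current strength
    cumulative_pf.append(failed.mean())

print("P(failure within 20 yr) ~", cumulative_pf[19])
print("P(failure within 40 yr) ~", cumulative_pf[39])
```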

  11. The validity and reliability of the Dutch Effort-Reward Imbalance Questionnaire

    NARCIS (Netherlands)

    Hanson, E. K.; Schaufeli, W.; Vrijkotte, T.; Plomp, N. H.; Godaert, G. L.

    2000-01-01

    The reliability and validity of the Effort-Reward Imbalance Questionnaire were tested in 775 blue- and white-collar workers in the Netherlands. Cronbach's alpha revealed sufficient internal consistency of all subscales except Need for Control. With exploratory probabilistic scaling (Mokken)

  12. Sufficient conditions for a period incrementing big bang bifurcation in one-dimensional maps

    International Nuclear Information System (INIS)

    Avrutin, V; Granados, A; Schanz, M

    2011-01-01

    Typically, big bang bifurcation occurs for one (or higher)-dimensional piecewise-defined discontinuous systems whenever two border collision bifurcation curves collide transversely in the parameter space. At that point, two (feasible) fixed points collide with one boundary in state space and become virtual, and, in the one-dimensional case, the map becomes continuous. Depending on the properties of the map near the codimension-two bifurcation point, there exist different scenarios regarding how the infinite number of periodic orbits are born, mainly the so-called period adding and period incrementing. In our work we prove that, in order to undergo a big bang bifurcation of the period incrementing type, it is sufficient for a piecewise-defined one-dimensional map that the colliding fixed points are attractive and with associated eigenvalues of different signs

  13. Sufficient conditions for a period incrementing big bang bifurcation in one-dimensional maps

    Science.gov (United States)

    Avrutin, V.; Granados, A.; Schanz, M.

    2011-09-01

    Typically, big bang bifurcation occurs for one (or higher)-dimensional piecewise-defined discontinuous systems whenever two border collision bifurcation curves collide transversely in the parameter space. At that point, two (feasible) fixed points collide with one boundary in state space and become virtual, and, in the one-dimensional case, the map becomes continuous. Depending on the properties of the map near the codimension-two bifurcation point, there exist different scenarios regarding how the infinite number of periodic orbits are born, mainly the so-called period adding and period incrementing. In our work we prove that, in order to undergo a big bang bifurcation of the period incrementing type, it is sufficient for a piecewise-defined one-dimensional map that the colliding fixed points are attractive and with associated eigenvalues of different signs.

  14. Possibilities and limitations of applying software reliability growth models to safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2007-01-01

    It is generally known that software reliability growth models such as the Jelinski-Moranda model and the Goel-Okumoto's Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using the software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of a piece of software's reliability to software failure data, as well as a lack of sufficient software failure data, is also identified as a possible limitation when applying the software reliability growth models to safety-critical software
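The sketch below fits the Goel-Okumoto NHPP model named above, with mean value function m(t) = a(1 − exp(−bt)), to a small set of made-up failure times by maximum likelihood; re-running it with a few failure times added or removed illustrates the sensitivity to sparse data noted in the abstract.

```python
# Illustrative maximum-likelihood fit of the Goel-Okumoto NHPP model to synthetic
# failure times (the data below are invented for demonstration only).
import numpy as np
from scipy.optimize import minimize

times = np.array([ 9., 21., 32., 36., 43., 45., 50., 58., 63., 70.,
                  71., 77., 78., 87., 91., 92., 95., 98., 104., 105.])
T = 110.0  # end of the observation window

def neg_log_likelihood(params):
    log_a, log_b = params                    # optimize on the log scale to keep a, b > 0
    a, b = np.exp(log_a), np.exp(log_b)
    log_intensity = np.log(a) + np.log(b) - b * times   # log lambda(t_i)
    expected_total = a * (1.0 - np.exp(-b * T))          # m(T)
    return -(log_intensity.sum() - expected_total)

result = minimize(neg_log_likelihood, x0=[np.log(30.0), np.log(0.01)], method="Nelder-Mead")
a_hat, b_hat = np.exp(result.x)
print(f"a (expected total faults) ~ {a_hat:.1f}, b (detection rate) ~ {b_hat:.4f}")
print("expected remaining faults ~", a_hat * np.exp(-b_hat * T))
```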

  15. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  16. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

    For many years, many researchers have focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability from the failure data collected during testing, assuming that the test environments represent the operational profile well. Experience shows that the operational reliability is higher than the test reliability. The user's interest, however, is in the operational reliability rather than the test reliability. With the assumption that the difference in reliability results from the change of environment, testing environment factors comprising the aging factor and the coverage factor are defined in this study to predict the ultimate operational reliability from the failure data. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results are close to the actual data

  17. Resolution of GSI B-56 - Emergency diesel generator reliability

    International Nuclear Information System (INIS)

    Serkiz, A.W.

    1989-01-01

    The need for an emergency diesel generator (EDG) reliability program has been established by 10 CFR Part 50, Section 50.63, Loss of All Alternating Current Power, which requires that licensees assess their station blackout coping and recovery capability. EDGs are the principal emergency ac power sources for avoiding a station blackout. Regulatory Guide 1.155, Station Blackout, identifies a need for (1) a nuclear unit EDG reliability level of at least 0.95, and (2) an EDG reliability program to monitor and maintain the required EDG reliability levels. NUMARC 87-00, Guidelines and Technical Bases for NUMARC Initiatives Addressing Station Blackout at Light Water Reactors, also provides guidance on such needs. The resolution of GSI B-56, Diesel Reliability, will be accomplished by issuing Regulatory Guide 1.9, Rev. 3, Selection, Design, Qualification, Testing, and Reliability of Diesel Generator Units Used as Onsite Electric Power Systems at Nuclear Plants. This revision will integrate into a single regulatory guide pertinent guidance previously addressed in R.G. 1.9, Rev. 2, R.G. 1.108, and Generic Letter 84-15. R.G. 1.9 has been expanded to define the principal elements of an EDG reliability program for monitoring and maintaining EDG reliability levels selected for SBO. In addition, alert levels and corrective actions have been defined to detect a deteriorating situation for all EDGs assigned to a particular nuclear unit, as well as an individual problem EDG
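For illustration only, the sketch below checks some hypothetical start/demand counts against the 0.95 reliability level cited above, using a binomial point estimate and a Clopper-Pearson lower bound; the demand windows, the 95% confidence level, and the simple alert rule are assumptions, not the monitoring criteria of R.G. 1.9 or R.G. 1.155.

```python
# Illustrative EDG start-reliability check (hypothetical demand counts and alert rule).
from scipy.stats import beta

def start_reliability(successes, demands, confidence=0.95):
    failures = demands - successes
    point = successes / demands
    # Clopper-Pearson one-sided lower confidence bound on the success probability.
    if failures > 0:
        lower = beta.ppf(1 - confidence, successes, failures + 1)
    else:
        lower = (1 - confidence) ** (1 / demands)   # zero-failure case
    return point, lower

for successes, demands in [(98, 100), (46, 50), (19, 20)]:
    point, lower = start_reliability(successes, demands)
    flag = "OK" if point >= 0.95 else "ALERT"
    print(f"{successes}/{demands} starts: estimate {point:.3f}, 95% lower bound {lower:.3f} -> {flag}")
```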

  18. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature of mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in load transmission path, system-component relationship, system functioning manner, as well as time-dependent system configuration. Firstly, the present paper defines the time-domain series system, for which the traditional series system reliability model is not adequate. Then, a system-specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material prior/posterior strength expression, time-dependent and system-specific load-strength interference analysis, as well as treatment of statistically dependent failure events. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system-specific reliability model and the traditional series system reliability model are illustrated by virtue of several numerical examples. - Highlights: • A new type of series system, i.e. the time-domain multi-configuration series system, is defined, which is of great significance to reliability modeling. • A multi-level statistical-analysis-based reliability modeling method is presented for gear transmission systems. • Several system-specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  19. LMFBR fuel analysis. Task C: Reliability aspects of LMFBRs. Final report, October 1, 1976--September 30, 1977

    International Nuclear Information System (INIS)

    Bastl, W.; Kastenberg, W.E.

    1977-10-01

    An analysis is presented for the availability of the electrical power supplies upon reactor shutdown. Successful power supply is defined in terms of the ability of the associated pumps (pump motors) to provide forced circulation and to deliver sufficient feedwater for proper cooldown of the core. Previous investigations of the reliability of the CRBR shutdown heat removal system concentrated on the mechanical systems and/or did not yet consider the diverse power supply. The shutdown heat removal system (SHRS) is discussed in the light of the availability of the electrical power systems, depending upon various types of initiating events. The unavailabilities of the essential power distribution and power supply buses are estimated, so that they can easily be used in connection with analyses of the entire SHRS. Further estimates include mechanical failure of the pumps

  20. Reliability and continuous regeneration model

    Directory of Open Access Journals (Sweden)

    Anna Pavlisková

    2006-06-01

    The failure-free function of an object is very important for service. This leads to interest in determining the object's reliability and failure intensity. The reliability of an element is defined by the theory of probability. The element durability T is a continuous random variate with the probability density f. The failure intensity λ(t) is a very important reliability characteristic of the element. Often it is an increasing function, which corresponds to the ageing of the element. We had at our disposal data on belt conveyor failures recorded over a period of 90 months. The given data set follows the normal distribution. Using mathematical analysis and mathematical statistics, we found the failure intensity function λ(t). The function λ(t) increases almost linearly.
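The near-linear growth of the failure intensity reported above can be reproduced for any normally distributed lifetime, since λ(t) = f(t)/(1 − F(t)); the mean and standard deviation used below are placeholders, not the conveyor data.

```python
# Sketch of the point made above: for a normally distributed lifetime, the failure
# intensity lambda(t) = f(t) / (1 - F(t)) is increasing and nearly linear in the
# right tail. The mean and standard deviation are made-up placeholders.
import numpy as np
from scipy.stats import norm

mean_life, sd_life = 45.0, 12.0          # months (hypothetical values)
t = np.linspace(30.0, 85.0, 12)
hazard = norm.pdf(t, mean_life, sd_life) / norm.sf(t, mean_life, sd_life)

for ti, hi in zip(t, hazard):
    print(f"t = {ti:5.1f} months  lambda(t) = {hi:.4f}")
```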

  1. Reliable software for unreliable hardware a cross layer perspective

    CERN Document Server

    Rehman, Semeen; Henkel, Jörg

    2016-01-01

    This book describes novel software concepts to increase reliability under user-defined constraints. The authors’ approach bridges, for the first time, the reliability gap between hardware and software. Readers will learn how to achieve increased soft error resilience on unreliable hardware, while exploiting the inherent error masking characteristics and error (stemming from soft errors, aging, and process variations) mitigations potential at different software layers. · Provides a comprehensive overview of reliability modeling and optimization techniques at different hardware and software levels; · Describes novel optimization techniques for software cross-layer reliability, targeting unreliable hardware.

  2. The CEGB approach to defining the commissioning tests for prime movers

    Energy Technology Data Exchange (ETDEWEB)

    Horne, B. E. [CEGB, Generation Development and Construction Division, Barnett Way, Barnwood, Gloucester GL4 7RS (United Kingdom)

    1986-02-15

    This paper describes the CEGB approach to demonstrating during commissioning the adequacy of the reliability of the large on-site essential electrical power sources installed in the CAGR power stations. In this approach the reliability requirements of the essential electrical supplies at the power stations are defined and then the reliability requirements of the particular gas turbine and diesel generator installation derived. The paper outlines the probabilistic methods used in arriving at the specific start and run test programmes which were subsequently carried out. The results achieved in these test programmes, demonstrating that the reliability requirements were satisfied, are presented in the paper. (author)
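One common probabilistic basis for sizing such a start-test programme is the success-run relation: zero failures in n demands demonstrate reliability R at confidence C when R^n ≤ 1 − C, i.e. n ≥ ln(1 − C)/ln(R). The targets and confidence level below are illustrative, not the CEGB figures.

```python
# Success-run sizing of a start-test programme (illustrative targets, not the paper's).
import math

def demands_required(reliability, confidence):
    """Number of consecutive failure-free demands needed to demonstrate the target."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

for R in (0.95, 0.98, 0.99):
    print(f"R = {R}: {demands_required(R, 0.95)} consecutive successful starts "
          f"demonstrate the target at 95% confidence")
```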

  3. The CEGB approach to defining the commissioning tests for prime movers

    International Nuclear Information System (INIS)

    Horne, B.E.

    1986-01-01

    This paper describes the CEGB approach to demonstrating during commissioning the adequacy of the reliability of the large on-site essential electrical power sources installed in the CAGR power stations. In this approach the reliability requirements of the essential electrical supplies at the power stations are defined and then the reliability requirements of the particular gas turbine and diesel generator installation derived. The paper outlines the probabilistic methods used in arriving at the specific start and run test programmes which were subsequently carried out. The results achieved in these test programmes, demonstrating that the reliability requirements were satisfied, are presented in the paper. (author)

  4. Reliability engineering for nuclear and other high technology systems

    International Nuclear Information System (INIS)

    Lakner, A.A.; Anderson, R.T.

    1985-01-01

    This book is written for the reliability instructor, program manager, system engineer, design engineer, reliability engineer, nuclear regulator, probability risk assessment (PRA) analyst, general manager and others who are involved in system hardware acquisition, design and operation and are concerned with plant safety and operational cost-effectiveness. It provides criteria, guidelines and comprehensive engineering data affecting reliability; it covers the key aspects of system reliability as it relates to conceptual planning, cost tradeoff decisions, specification, contractor selection, design, test and plant acceptance and operation. It treats reliability as an integrated methodology, explicitly describing life cycle management techniques as well as the basic elements of a total hardware development program, including: reliability parameters and design improvement attributes, reliability testing, reliability engineering and control. It describes how these elements can be defined during procurement, and implemented during design and development to yield reliable equipment. (author)

  5. High-Reliability Health Care: Getting There from Here

    Science.gov (United States)

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context: Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods: We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings: We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions: Hospitals can make substantial progress toward high reliability by undertaking several specific

  6. On Undefined and Meaningless in Lambda Definability

    OpenAIRE

    de Vries, Fer-Jan

    2016-01-01

    We distinguish between undefined terms as used in lambda definability of partial recursive functions and meaningless terms as used in infinite lambda calculus for the infinitary term models that generalise the Bohm model. While there are uncountably many known sets of meaningless terms, there are four known sets of undefined terms. Two of these four are sets of meaningless terms. In this paper we first present a set of sufficient conditions for a set of lambda terms to se...

  7. Sample size planning for composite reliability coefficients: accuracy in parameter estimation via narrow confidence intervals.

    Science.gov (United States)

    Terry, Leann; Kelley, Ken

    2012-11-01

    Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.
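A simulation sketch of the planning idea described above is given below: find the smallest N for which the interval for a composite reliability coefficient is expected to be narrower than a target width W. As a simplification, the spread of coefficient alpha's sampling distribution stands in for the expected confidence-interval width, and the population reliability, number of items, and target width are made-up inputs, not values from the article.

```python
# Simulation-based sample-size planning for a narrow reliability interval
# (illustrative stand-in for the accuracy-in-parameter-estimation approach).
import numpy as np

def simulate_alpha_width(n, k=8, rho=0.80, reps=2000, rng=None):
    """Width of the central 95% range of sample alpha under a tau-equivalent model."""
    rng = rng if rng is not None else np.random.default_rng(2)
    # Error variance chosen so the population alpha equals rho (loading fixed at 1).
    error_var = k * (1.0 - rho) / rho
    alphas = np.empty(reps)
    for r in range(reps):
        latent = rng.normal(size=(n, 1))
        items = latent + rng.normal(scale=np.sqrt(error_var), size=(n, k))
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        alphas[r] = k / (k - 1) * (1 - item_vars.sum() / total_var)
    lo, hi = np.percentile(alphas, [2.5, 97.5])
    return hi - lo

target_width = 0.10
for n in (50, 100, 200, 400):
    width = simulate_alpha_width(n)
    print(f"N = {n}: expected width ~ {width:.3f}", "<- sufficient" if width <= target_width else "")
```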

  8. Assessing reliability in energy supply systems

    International Nuclear Information System (INIS)

    McCarthy, Ryan W.; Ogden, Joan M.; Sperling, Daniel

    2007-01-01

    Reliability has always been a concern in the energy sector, but concerns are escalating as energy demand increases and the political stability of many energy supply regions becomes more questionable. But how does one define and measure reliability? We introduce a method to assess reliability in energy supply systems in terms of adequacy and security. It derives from reliability assessment frameworks developed for the electricity sector, which are extended to include qualitative considerations and to be applicable to new energy systems by incorporating decision-making processes based on expert opinion and multi-attribute utility theory. The method presented here is flexible and can be applied to any energy system. To illustrate its use, we apply the method to two hydrogen pathways: (1) centralized steam reforming of imported liquefied natural gas with pipeline distribution of hydrogen, and (2) on-site electrolysis of water using renewable electricity produced independently from the electricity grid

  9. Inclusion of fatigue effects in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, Candice D. [Vanderbilt University, Nashville, TN (United States); Mahadevan, Sankaran, E-mail: sankaran.mahadevan@vanderbilt.edu [Vanderbilt University, Nashville, TN (United States)

    2011-11-15

    The effect of fatigue on human performance has been observed to be an important factor in many industrial accidents. However, defining and measuring fatigue is not easily accomplished. This creates difficulties in including fatigue effects in probabilistic risk assessments (PRA) of complex engineering systems that seek to include human reliability analysis (HRA). Thus the objectives of this paper are to discuss (1) the importance of the effects of fatigue on performance, (2) the difficulties associated with defining and measuring fatigue, (3) the current status of inclusion of fatigue in HRA methods, and (4) the future directions and challenges for the inclusion of fatigue, specifically sleep deprivation, in HRA. - Highlights: • We highlight the need for fatigue and sleep deprivation effects on performance to be included in human reliability analysis (HRA) methods. Current methods do not explicitly include sleep deprivation effects. • We discuss the difficulties in defining and measuring fatigue. • We review sleep deprivation research, and discuss the limitations and future needs of the current HRA methods.

  10. Distributed controller clustering in software defined networks.

    Directory of Open Access Journals (Sweden)

    Ahmed Abdelaziz

    Software Defined Networking (SDN) is an emerging, promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of the software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we proposed a novel clustered distributed controller architecture in the real setting of SDNs. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The result shows that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to a distributed controller without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.

  11. Distributed controller clustering in software defined networks.

    Science.gov (United States)

    Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging, promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of the software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we proposed a novel clustered distributed controller architecture in the real setting of SDNs. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The result shows that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to a distributed controller without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.

  12. Assessing the Impact of Imperfect Diagnosis on Service Reliability

    DEFF Research Database (Denmark)

    Grønbæk, Lars Jesper; Schwefel, Hans-Peter; Kjærgaard, Jens Kristian

    2010-01-01

    , representative diagnosis performance metrics have been defined and their closed-form solutions obtained for the Markov model. These equations enable model parameterization from traces of implemented diagnosis components. The diagnosis model has been integrated in a reliability model assessing the impact...... of the diagnosis functions for the studied reliability problem. In a simulation study we finally analyze trade-off properties of diagnosis heuristics from literature, map them to the analytic Markov model, and investigate its suitability for service reliability optimization....

  13. Reliability analysis of an offshore structure

    DEFF Research Database (Denmark)

    Sorensen, J. D.; Faber, M. H.; Thoft-Christensen, P.

    1992-01-01

    A jacket type offshore structure from the North Sea is considered. The time variant reliability is estimated for failure defined as brittle fracture and crack through the tubular member walls. The stochastic modelling is described. The hot spot stress spectral moments as function of the stochasti...

  14. Importance of independent and dependent human error to system reliability and plant safety

    International Nuclear Information System (INIS)

    Dach, K.

    1988-08-01

    An uncertainty analysis of the quantification of the unavailability of the emergency core cooling system was performed. The reliability analysis of the low pressure injection system (LPIS) of the ECCS of the WWER-440 reactor was also performed. Results of the reliability analysis proved that LPIS reliability under normal conditions is sufficient and can be increased by two orders of magnitude. This increase in reliability can be achieved by means of simple changes, such as securing the opening of the quick-acting fittings at the LPIS discharge line. A method for the uncertainty analysis of systems with periodically inspected components was elaborated and verified by analysing a medium-size system. Refs, figs and tabs

  15. Sufficiency does energy consumption become a moral issue?

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Adrian (Socio-economic Inst. and Univ. Research Priority Programme in Ethics, Univ. of Zuerich, Zuerich (Switzerland))

    2009-07-01

    Reducing the externalities from energy use is crucial for sustainability. There are basically four ways to reduce externalities from energy use: increasing technical efficiency ('energy input per unit energy service'), increasing economic efficiency ('internalising external costs'), using 'clean' energy sources with few externalities, or sufficiency ('identifying 'optimal' energy service levels'). A combination of those strategies is most promising for sustainable energy systems. However, the debate on sustainable energy is dominated by efficiency and clean energy strategies, while sufficiency plays a minor role. Efficiency and clean energy face several problems, though. Thus, the current debate should be complemented with a critical discussion of sufficiency. In this paper, I develop a concept of sufficiency, which is adequate for liberal societies. I focus on ethical foundations for sufficiency, as the discussion of such is missing or cursory only in the existing literature. I first show that many examples of sufficiency can be understood as (economic) efficiency, but that the two concepts do not coincide. I then show that sufficiency based on moralization of actions can be understood as implementation of the boundary conditions for social justice that come with notions of liberal societies, in particular the duty not to harm other people. By this, to increase sufficiency becomes a duty beyond individual taste. I further illustrate this in the context of the adverse effects of climate change as externalities from energy use.

  16. Lessons learned in the implementation of Integrated Safety Management at DOE Order Compliance Sites vs Necessary and Sufficient Sites

    International Nuclear Information System (INIS)

    Hill, R.L.

    2000-01-01

    This paper summarizes the development and implementation of Integrated Safety Management (ISM) at an Order Compliance Site (Savannah River Site) and a Necessary and Sufficient Site (Nevada Test Site). A discussion of each core safety function of ISM is followed by an example from an Order Compliance Site and a Necessary and Sufficient Site. The Savannah River Site was the first DOE site to have a DOE Headquarters-validated and approved ISM System. The NTS is beginning the process of verification and validation. This paper defines successful strategies for integrating Environment, Safety, and Health management into work under various scenarios

  17. Social network analysis via multi-state reliability and conditional influence models

    International Nuclear Information System (INIS)

    Schneider, Kellie; Rainwater, Chase; Pohl, Ed; Hernandez, Ivan; Ramirez-Marquez, Jose Emmanuel

    2013-01-01

    This paper incorporates multi-state reliability measures into the assessment of a social network in which influence is treated as a multi-state commodity that flows through the network. The reliability of the network is defined as the probability that at least a certain level of influence reaches an intended target. We consider an individual's influence level as a function of the influence levels received from preceding actors in the network. We define several communication functions which describe the level of influence a particular actor will pass along to other actors within the network. Illustrative examples are presented, and the network reliability under the various communication influence levels is computed using exhaustive enumeration for a small example and Monte Carlo simulation for larger, more realistic sized examples.
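A Monte Carlo sketch of the reliability notion described above, the probability that at least a given influence level reaches a target actor, is shown below. The small network, the multi-state capacity distribution, and the communication function used here are illustrative choices, not the paper's examples.

```python
# Monte Carlo estimate of multi-state influence-network reliability (illustrative only).
import numpy as np

rng = np.random.default_rng(3)

# Predecessors of each actor in a small directed influence network; actor 4 is the target.
edges = {1: [0], 2: [0, 1], 3: [1, 2], 4: [2, 3]}
state_levels = [0, 1, 2, 3]                            # multi-state influence capacity
state_probs = [0.1, 0.2, 0.3, 0.4]

def simulate_once():
    capacity = rng.choice(state_levels, size=5, p=state_probs)
    influence = np.zeros(5, dtype=int)
    influence[0] = capacity[0]                         # the source emits at its own capacity
    for actor in (1, 2, 3, 4):
        received = max(influence[p] for p in edges[actor])
        # Communication function: pass on the strongest received level,
        # capped by the actor's own capacity.
        influence[actor] = min(received, capacity[actor])
    return influence[4]

demand = 2        # required influence level at the target
n_sim = 20_000
hits = sum(simulate_once() >= demand for _ in range(n_sim))
print("Estimated network reliability R(d=2) ~", hits / n_sim)
```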

  18. Reliability-based optimal structural design by the decoupling approach

    International Nuclear Information System (INIS)

    Royset, J.O.; Der Kiureghian, A.; Polak, E.

    2001-01-01

    A decoupling approach for solving optimal structural design problems involving reliability terms in the objective function, the constraint set or both is discussed and extended. The approach employs a reformulation of each problem, in which reliability terms are replaced by deterministic functions. The reformulated problems can be solved by existing semi-infinite optimization algorithms and computational reliability methods. It is shown that the reformulated problems produce solutions that are identical to those of the original problems when the limit-state functions defining the reliability problem are affine. For nonaffine limit-state functions, approximate solutions are obtained by solving series of reformulated problems. An important advantage of the approach is that the required reliability and optimization calculations are completely decoupled, thus allowing flexibility in the choice of the optimization algorithm and the reliability computation method

  19. Top-down and bottom-up definitions of human failure events in human reliability analysis

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2014-01-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down - defined as a subset of the PRA - whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up - derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  20. Reliable Decentralized Control of Fuzzy Discrete-Event Systems and a Test Algorithm.

    Science.gov (United States)

    Liu, Fuchun; Dziong, Zbigniew

    2013-02-01

    A framework for decentralized control of fuzzy discrete-event systems (FDESs) has been recently presented to guarantee the achievement of a given specification under the joint control of all local fuzzy supervisors. As a continuation, this paper addresses the reliable decentralized control of FDESs in face of possible failures of some local fuzzy supervisors. Roughly speaking, for an FDES equipped with n local fuzzy supervisors, a decentralized supervisor is called k-reliable (1 ≤ k ≤ n) provided that the control performance will not be degraded even when n - k local fuzzy supervisors fail. A necessary and sufficient condition for the existence of k-reliable decentralized supervisors of FDESs is proposed by introducing the notions of M̃uc-controllability and k-reliable coobservability of fuzzy language. In particular, a polynomial-time algorithm to test the k-reliable coobservability is developed by a constructive methodology, which indicates that the existence of k-reliable decentralized supervisors of FDESs can be checked with a polynomial complexity.

  1. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the existence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In such a context, this collection of works sets a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  2. High-reliability health care: getting there from here.

    Science.gov (United States)

    Chassin, Mark R; Loeb, Jerod M

    2013-09-01

    Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer "project fatigue" because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals' readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. Further research

  3. Analysis of Parking Reliability Guidance of Urban Parking Variable Message Sign System

    OpenAIRE

    Zhenyu Mei; Ye Tian; Dongping Li

    2012-01-01

    Operators of parking guidance and information systems (PGIS) often encounter difficulty in determining when and how to provide reliable car park availability information to drivers. Reliability has become a key factor to ensure the benefits of urban PGIS. The present paper is the first to define the guiding parking reliability of urban parking variable message signs (VMSs). By analyzing the parking choice under guiding and optional parking lots, a guiding parking reliability model was constru...

  4. DYNAMIC SUFFICIENCY OF THE MAGNETICALLY SUSPENDED TRAIN

    Directory of Open Access Journals (Sweden)

    V. A. Polyakov

    2013-11-01

    Purpose. The basic criterion in the consumer assessment of a magnetically suspended train is the quality of its mechanical motion. This motion is realized under unpredictable conditions and, to preserve its purposefulness, must adapt to them. Such adaptation is possible only within the limits of the system’s dynamic sufficiency. Sufficiency is understood as the presence in the system of resources that allow it to realize the demanded motions without violating the actual restrictions. The presence of such resources is therefore a necessary condition for preserving the required purposefulness of the train’s dynamics, and verification of this sufficiency is the major component of its dynamic study. Methodology. Methods of set theory are used in this work. The desirable and actual approachability spaces of the train are found. The train is considered dynamically sufficient in the zones where these spaces overlap. Findings. Within this treatment of the train’s dynamic sufficiency, verification of its presence, as well as of a reserve (or deficiency) of its preservation, can be carried out by searching for and then estimating such overlapping zones. Operatively (directly during motion) this can be realized on the train’s ODC with the use, for example, of the computer mathematics system Mathematica, which offers highly efficient information manipulation at a comparatively small resource cost. The efficiency of the created technique is illustrated with an example of a study of the vehicle’s acceleration. The calculation is carried out using the constructed computer model of the interaction between the vehicle’s independent traction electromagnetic subsystem and its mechanical subsystem. Originality. A technique for verifying the dynamic sufficiency of a high-speed magnetically suspended train is developed. The technique is highly efficient; it provides sufficient presentation and demands an expense of the

  5. In rDNS We Trust : Revisiting a Common Data-Source’s Reliability

    NARCIS (Netherlands)

    Fiebig, T.; Borgolte, Kevin; Hao, Shuang; Kruegel, Christopher; Vigna, Giovanny; Feldmann, Anja; Beverly, Robert; Smaragdakis, Georgios; Feldmann, Anja

    2018-01-01

    Reverse DNS (rDNS) is regularly used as a data source in Internet measurement research. However, existing work is polarized on its reliability, and new techniques to collect active IPv6 datasets have not yet been sufficiently evaluated. In this paper, we investigate active and passive data

  6. Self-sufficiency, free trade and safety.

    Science.gov (United States)

    Rautonen, Jukka

    2010-01-01

    The relationship between free trade, self-sufficiency and safety of blood and blood components has been a perennial discussion topic in the blood service community. Traditionally, national self-sufficiency has been perceived as the ultimate goal that would also maximize safety. However, very few countries are, or can be, truly self-sufficient when self-sufficiency is understood correctly to encompass the whole value chain from the blood donor to the finished product. This is most striking when plasma derived medicines are considered. Free trade of blood products, or competition, as such can have a negative or positive effect on blood safety. Further, free trade of equipment and reagents and several plasma medicines is actually necessary to meet the domestic demand for blood and blood derivatives in most countries. Opposing free trade due to dogmatic reasons is not in the best interest of any country and will be especially harmful for the developing world. Competition between blood services in the USA has been present for decades. The more than threefold differences in blood product prices between European blood services indicate that competition is long overdue in Europe, too. This competition should be welcomed but carefully and proactively regulated to avoid putting safe and secure blood supply at risk. Copyright 2009 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.

  7. Reliability calculation of cracked components using probabilistic fracture mechanics and a Markovian approach

    International Nuclear Information System (INIS)

    Schmidt, T.

    1988-01-01

    The numerical reliability calculation of cracked construction components under cyclical fatigue stress can be done with the help of models of probabilistic fracture mechanics. An alternative to the Monte Carlo simulation method is examined; the alternative method is based on the description of failure processes with the help of a Markov process. The Markov method is traced back directly to the stochastic parameters of a two-dimensional fracture mechanics model, the effects of inspections and repairs also being considered. The probability of failure and expected failure frequency can be determined as time functions with the transition and conditional probabilities of the original or derived Markov process. For concrete calculation, an approximative Markov chain is designed which, under certain conditions, is capable of giving a sufficient approximation of the original Markov process and the reliability characteristics determined by it. The application of the MARKOV program code developed into an algorithm reveals sufficient conformity with the Monte Carlo reference results. The starting point of the investigation was the 'Deutsche Risikostudie B (DWR)' ('German Risk Study B (DWR)'), specifically, the reliability of the main coolant line. (orig./HP) [de
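The sketch below illustrates the kind of Markov-chain approximation described above: crack size is discretized into states, growth is a chance of stepping to the next state each load block, and periodic inspections can detect and repair cracks before the absorbing failure state is reached. All transition and detection probabilities are hypothetical, not the study's fracture-mechanics parameters.

```python
# Markov-chain sketch: crack growth with periodic inspection and repair (hypothetical data).
import numpy as np

n_states = 6                 # states 0..4 = crack sizes, state 5 = failure (absorbing)
p_grow = 0.08                # per-block probability of growing one state
detect = np.array([0.0, 0.2, 0.5, 0.8, 0.95, 0.0])   # detection probability by crack state
inspection_interval = 10     # inspect every 10 load blocks

# One-step transition matrix for crack growth.
P = np.zeros((n_states, n_states))
for s in range(n_states - 1):
    P[s, s] = 1.0 - p_grow
    P[s, s + 1] = p_grow
P[-1, -1] = 1.0              # failure is absorbing

state = np.zeros(n_states)
state[0] = 1.0               # start with no significant crack
failure_prob = []
for block in range(1, 101):
    state = state @ P
    if block % inspection_interval == 0:
        repaired = (state[:-1] * detect[:-1]).sum()     # detected cracks are repaired
        state[:-1] = state[:-1] * (1.0 - detect[:-1])
        state[0] += repaired
    failure_prob.append(state[-1])

print("P(failure by block 50)  ~", failure_prob[49])
print("P(failure by block 100) ~", failure_prob[99])
```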

  8. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs

  9. An investigation into the minimum accelerometry wear time for reliable estimates of habitual physical activity and definition of a standard measurement day in pre-school children.

    Science.gov (United States)

    Hislop, Jane; Law, James; Rush, Robert; Grainger, Andrew; Bulley, Cathy; Reilly, John J; Mercer, Tom

    2014-11-01

    The purpose of this study was to determine the number of hours and days of accelerometry data necessary to provide a reliable estimate of habitual physical activity in pre-school children. The impact of a weekend day on reliability estimates was also determined and standard measurement days were defined for weekend and weekdays. Accelerometry data were collected from 112 children (60 males, 52 females, mean (SD) 3.7 (0.7) yr) over 7 d. The Spearman-Brown Prophecy formula (S-B prophecy formula) was used to predict the number of days and hours of data required to achieve an intraclass correlation coefficient (ICC) of 0.7. The impact of including a weekend day was evaluated by comparing the reliability coefficient (r) for any 4 d of data with data for 4 d including one weekend day. Our observations indicate that 3 d of accelerometry monitoring, regardless of whether it includes a weekend day, for at least 7 h·d⁻¹ offers sufficient reliability to characterise total physical activity and sedentary behaviour of pre-school children. These findings offer an approach that addresses the underlying tension in epidemiologic surveillance studies between the need to maintain acceptable measurement rigour and retention of a representatively meaningful sample size.
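The Spearman-Brown prophecy calculation used in the study is shown below with a hypothetical single-day reliability, since the abstract does not quote the study's single-day ICC.

```python
# Spearman-Brown prophecy: how many monitored days are needed to reach a target ICC.
# The single-day reliability r1 below is a hypothetical value.
import math

def reliability_for_days(single_day_icc, k):
    """Reliability of the average of k days under the Spearman-Brown prophecy formula."""
    return k * single_day_icc / (1 + (k - 1) * single_day_icc)

def days_needed(single_day_icc, target_icc=0.7):
    """Smallest number of monitored days giving at least the target reliability."""
    k = (target_icc * (1 - single_day_icc)) / (single_day_icc * (1 - target_icc))
    return math.ceil(k)

r1 = 0.45   # hypothetical reliability of a single day of monitoring
k = days_needed(r1)
print(f"days needed for ICC >= 0.7: {k} (achieved ICC ~ {reliability_for_days(r1, k):.2f})")
```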

  10. Reliability, availability, and quality assurance considerations for fusion components

    International Nuclear Information System (INIS)

    Buende, R.

    1995-01-01

    The complexity of magnetic confinement machines has been a matter of concern in developing fusion power plants as electricity generating stations because it might reduce plant availability. A comprehensive reliability and availability (R and A) programme to determine the availability of a next step fusion machine was performed during definition and conceptual design of the Next European Torus. In addition to giving an overview of the expected contributions to unavailability of the various components, this activity identified the basic approach to be taken to specify and to achieve necessary improvements. This paper, after giving some basic definitions, describes the essentials of the R and A programme, its results, and the guidelines derived for further work towards a sufficiently reliable fusion plant. These guidelines refer to improvement of the reliability database and the quality assurance to be performed at the design stage of a next step machine. (orig.)

  11. Reliability and validity of the de Morton Mobility Index in individuals with sub-acute stroke.

    Science.gov (United States)

    Braun, Tobias; Marks, Detlef; Thiel, Christian; Grüneberg, Christian

    2018-02-04

    To establish the validity and reliability of the de Morton Mobility Index (DEMMI) in patients with sub-acute stroke. This cross-sectional study was performed in a neurological rehabilitation hospital. We assessed unidimensionality, construct validity, internal consistency reliability, inter-rater reliability, minimal detectable change and possible floor and ceiling effects of the DEMMI in adult patients with sub-acute stroke. The study included a total sample of 121 patients with sub-acute stroke. We analysed validity (n = 109) and reliability (n = 51) in two sub-samples. Rasch analysis indicated unidimensionality with an overall fit to the model (chi-square = 12.37, p = 0.577). All hypotheses on construct validity were confirmed. Internal consistency reliability (Cronbach's alpha = 0.94) and inter-rater reliability (intraclass correlation coefficient = 0.95; 95% confidence interval: 0.92-0.97) were excellent. The minimal detectable change with 90% confidence was 13 points. No floor or ceiling effects were evident. These results indicate unidimensionality, sufficient internal consistency reliability, inter-rater reliability, and construct validity of the DEMMI in patients with a sub-acute stroke. Advantages of the DEMMI in clinical application are the short administration time, no need for special equipment and interval level data. The de Morton Mobility Index, therefore, may be a useful performance-based bedside test to measure mobility in individuals with a sub-acute stroke across the whole mobility spectrum. Implications for Rehabilitation The de Morton Mobility Index (DEMMI) is an unidimensional measurement instrument of mobility in individuals with sub-acute stroke. The DEMMI has excellent internal consistency and inter-rater reliability, and sufficient construct validity. The minimal detectable change of the DEMMI with 90% confidence in stroke rehabilitation is 13 points. The lack of any floor or ceiling effects on hospital admission indicates
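A minimal detectable change of the kind reported above is typically derived from the ICC as MDC90 = 1.645 × SEM × √2, with SEM = SD × √(1 − ICC). The abstract gives only the ICC (0.95) and the resulting MDC90 (13 points), so the baseline standard deviation below is a made-up placeholder.

```python
# Sketch of a standard MDC90 calculation from an ICC (the SD value is a placeholder).
import math

def mdc90(sd, icc):
    sem = sd * math.sqrt(1.0 - icc)          # standard error of measurement
    return 1.645 * sem * math.sqrt(2.0)      # 90% confidence, test-retest difference

print(f"MDC90 ~ {mdc90(sd=25.0, icc=0.95):.1f} DEMMI points")
```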

  12. Inclusion of fatigue effects in human reliability analysis

    International Nuclear Information System (INIS)

    Griffith, Candice D.; Mahadevan, Sankaran

    2011-01-01

    The effect of fatigue on human performance has been observed to be an important factor in many industrial accidents. However, defining and measuring fatigue is not easily accomplished. This creates difficulties in including fatigue effects in probabilistic risk assessments (PRA) of complex engineering systems that seek to include human reliability analysis (HRA). Thus the objectives of this paper are to discuss (1) the importance of the effects of fatigue on performance, (2) the difficulties associated with defining and measuring fatigue, (3) the current status of inclusion of fatigue in HRA methods, and (4) the future directions and challenges for the inclusion of fatigue, specifically sleep deprivation, in HRA. - Highlights: • We highlight the need for fatigue and sleep deprivation effects on performance to be included in human reliability analysis (HRA) methods. Current methods do not explicitly include sleep deprivation effects. • We discuss the difficulties in defining and measuring fatigue. • We review sleep deprivation research, and discuss the limitations and future needs of the current HRA methods.

  13. Reliability of the Radiographic Sagittal and Frontal Tibiotalar Alignment after Ankle Arthrodesis.

    Science.gov (United States)

    Willegger, Madeleine; Holinka, Johannes; Nemecek, Elena; Bock, Peter; Wanivenhaus, Axel Hugo; Windhager, Reinhard; Schuh, Reinhard

    2016-01-01

    Accurate measurement of the tibiotalar alignment is important in radiographic outcome assessment of ankle arthrodesis (AA). In studies, various radiological methods have been used to measure the tibiotalar alignment leading to facultative misinterpretation of results. However, to our knowledge, no previous study has investigated the reliability of tibiotalar alignment measurement in AA. We aimed to investigate the reliability of four different methods of measurement of the frontal and sagittal tibiotalar alignment after AA, and to further clarify the most reliable method for determining the longitudinal axis of the tibia. Thirty-eight weight bearing anterior to posterior and lateral ankle radiographs of thirty-seven patients who had undergone AA with a two screw fixation technique were selected. Three observers measured the frontal tibiotalar angle (FTTA) and the sagittal tibiotalar angle (STTA) using four different methods. The methods differed by the definition of the longitudinal tibial axis. Method A was defined by a line drawn along the lateral tibial border in anterior to posterior radiographs and along the posterior tibial border in lateral radiographs. Method B was defined by a line connecting two points in the middle of the proximal and the distal tibial shaft. Method C was drawn "freestyle"along the longitudinal axis of the tibia, and method D was defined by a line connecting the center of the tibial articular surface and a point in the middle of the proximal tibial shaft. Intra- and interobserver correlation coefficients (ICC) and repeated measurement ANOVA were calculated to assess measurement reliability and accuracy. All four methods showed excellent inter- and intraobserver reliability for the FTTA and the STTA. When the longitudinal tibial axis is defined by connecting two points in the middle of the proximal and the distal tibial shaft, the highest interobserver reliability for the FTTA (ICC: 0.980; CI 95%: 0.966-0.989) and for the STTA (ICC: 0

  14. Reliability of the Radiographic Sagittal and Frontal Tibiotalar Alignment after Ankle Arthrodesis.

    Directory of Open Access Journals (Sweden)

    Madeleine Willegger

    Full Text Available Accurate measurement of the tibiotalar alignment is important in radiographic outcome assessment of ankle arthrodesis (AA). In studies, various radiological methods have been used to measure the tibiotalar alignment leading to facultative misinterpretation of results. However, to our knowledge, no previous study has investigated the reliability of tibiotalar alignment measurement in AA. We aimed to investigate the reliability of four different methods of measurement of the frontal and sagittal tibiotalar alignment after AA, and to further clarify the most reliable method for determining the longitudinal axis of the tibia. Thirty-eight weight bearing anterior to posterior and lateral ankle radiographs of thirty-seven patients who had undergone AA with a two screw fixation technique were selected. Three observers measured the frontal tibiotalar angle (FTTA) and the sagittal tibiotalar angle (STTA) using four different methods. The methods differed by the definition of the longitudinal tibial axis. Method A was defined by a line drawn along the lateral tibial border in anterior to posterior radiographs and along the posterior tibial border in lateral radiographs. Method B was defined by a line connecting two points in the middle of the proximal and the distal tibial shaft. Method C was drawn "freestyle" along the longitudinal axis of the tibia, and method D was defined by a line connecting the center of the tibial articular surface and a point in the middle of the proximal tibial shaft. Intra- and interobserver correlation coefficients (ICC) and repeated measurement ANOVA were calculated to assess measurement reliability and accuracy. All four methods showed excellent inter- and intraobserver reliability for the FTTA and the STTA. When the longitudinal tibial axis is defined by connecting two points in the middle of the proximal and the distal tibial shaft, the highest interobserver reliability for the FTTA (ICC: 0.980; CI 95%: 0.966-0.989) and for the

  15. Analysis of operating reliability of WWER-1000 unit

    International Nuclear Information System (INIS)

    Bortlik, J.

    1985-01-01

    The nuclear power unit was divided into 33 technological units. Input data for reliability analysis were surveys of operating results obtained from the IAEA information system and certain indexes of the reliability of technological equipment determined using the Bayes formula. The missing reliability data for technological equipment were used from the basic variant. The fault tree of the WWER-1000 unit was determined for the peak event defined as the impossibility of reaching 100%, 75% and 50% of rated power. The period was observed of the nuclear power plant operation with reduced output owing to defect and the respective time needed for a repair of the equipment. The calculation of the availability of the WWER-1000 unit was made for different variant situations. Certain indexes of the operating reliability of the WWER-1000 unit which are the result of a detailed reliability analysis are tabulated for selected variants. (E.S.)
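
    The record mentions reliability indexes of the technological equipment obtained using the Bayes formula. A minimal sketch of that kind of update, assuming a conjugate gamma prior on a component failure rate with invented prior parameters and plant evidence (none of these numbers come from the record), is:

```python
# Hypothetical sketch of a Bayes-formula update of a component failure
# rate, the kind of reliability index estimation the record alludes to.
# A gamma(alpha, beta) prior is conjugate to Poisson failure counts, so
# the posterior is gamma(alpha + failures, beta + exposure). All numbers
# below are invented for illustration.

def update_failure_rate(prior_alpha, prior_beta, failures, exposure_hours):
    """Return posterior gamma parameters and the posterior mean rate."""
    post_alpha = prior_alpha + failures
    post_beta = prior_beta + exposure_hours
    return post_alpha, post_beta, post_alpha / post_beta

# Assumed generic prior (e.g. from industry-wide data) updated with
# assumed plant-specific evidence.
alpha, beta, mean_rate = update_failure_rate(
    prior_alpha=0.5, prior_beta=1.0e5,
    failures=2, exposure_hours=4.0e4)
print(f"posterior mean failure rate ~ {mean_rate:.2e} per hour")
```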

  16. Ultra-Reliable Communication in 5G Wireless Systems

    DEFF Research Database (Denmark)

    Popovski, Petar

    2014-01-01

    Wireless 5G systems will not only be “4G, but faster”. One of the novel features discussed in relation to 5G is Ultra-Reliable Communication (URC), an operation mode not present in today’s wireless systems. URC refers to the provision of a certain level of communication service almost 100% of the time. [...] Example URC applications include reliable cloud connectivity, critical connections for industrial automation and reliable wireless coordination among vehicles. This paper puts forward a systematic view on URC in 5G wireless systems. It starts by analyzing the fundamental mechanisms that constitute [...]-term URC (URC-S). The second dimension is represented by the type of reliability impairment that can affect the communication reliability in a given scenario. The main objective of this paper is to create the context for defining and solving the new engineering problems posed by URC in 5G.

  17. Provision of reliable core cooling in vessel-type boiling reactors

    International Nuclear Information System (INIS)

    Alferov, N.S.; Balunov, B.F.; Davydov, S.A.

    1987-01-01

    Methods for providing reliable core cooling in vessel-type boiling reactors with natural circulation for heat supply are analysed. The solution of this problem is reduced to satisfaction of two conditions such as: water confinement over the reactor core necessary in case of an accident and confinement of sufficient coolant flow rate through the bottom cross section of fuel assemblies for some time. The reliable fuel element cooling under conditions of a maximum credible accident (brittle failure of a reactor vessel) is shown to be provided practically in any accident, using the safety vessel in combination with the application of means of standard operation and minimal composition and capacity of ECCS

  18. Defining the "Correct Form": Using Biomechanics to Develop Reliable and Valid Assessment Instruments

    Science.gov (United States)

    Satern, Miriam N.

    2011-01-01

    Physical educators should be able to define the "correct form" they expect to see each student performing in their classes. Moreover, they should be able to go beyond assessing students' skill levels by measuring the outcomes (products) of movements (i.e., how far they throw the ball or how many successful attempts are completed) or counting the…

  19. Defining enthesitis in spondyloarthritis by ultrasound

    DEFF Research Database (Denmark)

    Terslev, Lene; Naredo, E; Iagnocco, A

    2014-01-01

    Objective: To standardize ultrasound (US) in enthesitis. Methods: An initial Delphi exercise was undertaken to define US-detected enthesitis and its core components. These definitions were subsequently tested on static images taken from Spondyloarthritis (SpA) patients in order to evaluate [...] elementary component. On static images the intra-observer reliability showed a high degree of variability for the detection of elementary lesions, with kappa coefficients ranging from 0.14 to 1. The inter-observer kappa value was variable, with the lowest kappa for enthesophytes (0.24) and the best for Doppler [...] activity at the enthesis (0.63). Conclusion: This is the first consensus-based definition of US enthesitis and its elementary components and the first step performed to ensure a higher degree of homogeneity and comparability of results between studies and in daily clinical work. Defining Enthesitis [...]

  20. Reliability of high mobility SiGe channel MOSFETs for future CMOS applications

    CERN Document Server

    Franco, Jacopo; Groeseneken, Guido

    2014-01-01

    Due to the ever increasing electric fields in scaled CMOS devices, reliability is becoming a showstopper for further scaled technology nodes. Although several groups have already demonstrated functional Si channel devices with aggressively scaled Equivalent Oxide Thickness (EOT) down to 5Å, a 10 year reliable device operation cannot be guaranteed anymore due to severe Negative Bias Temperature Instability. This book focuses on the reliability of the novel (Si)Ge channel quantum well pMOSFET technology. This technology is being considered for possible implementation in next CMOS technology nodes, thanks to its benefit in terms of carrier mobility and device threshold voltage tuning. We observe that it also opens a degree of freedom for device reliability optimization. By properly tuning the device gate stack, sufficiently reliable ultra-thin EOT devices with a 10 years lifetime at operating conditions are demonstrated. The extensive experimental datasets collected on a variety of processed 300mm wafers and pr...

  1. Reliability Analysis of Elasto-Plastic Structures

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1984-01-01

    [...] Failure of this type of system is defined either as formation of a mechanism or as failure of a prescribed number of elements. In the first case failure is independent of the order in which the elements fail, but this is not so under the second definition. The reliability analysis consists of two parts [...] are described, and the two definitions of failure can be used with the first formulation, but only the failure definition based on formation of a mechanism with the second formulation. The second part of the reliability analysis is an estimate of the failure probability for the structure on the basis [...]

  2. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
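
    The modularization idea described above (per-component reliability models combined into a total system figure) can be illustrated with a minimal sketch; this is not RML itself, and the module names and reliability values are invented for the example:

```python
from math import prod

# Minimal illustration (not RML itself) of the modularization idea the
# abstract describes: each component gets its own reliability figure and
# modules are then combined into a system-level value. The module names
# and reliabilities are invented.

def series(*rs):
    """System works only if every module works."""
    return prod(rs)

def parallel(*rs):
    """System works if at least one module works."""
    return 1.0 - prod(1.0 - r for r in rs)

sensor = 0.995
channel = parallel(0.990, 0.990)   # two redundant processing channels
actuator = 0.998
print(f"system reliability ~ {series(sensor, channel, actuator):.5f}")
```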

  3. Stability analysis of switched linear systems defined by graphs

    OpenAIRE

    Athanasopoulos, Nikolaos; Lazar, Mircea

    2015-01-01

    We present necessary and sufficient conditions for global exponential stability of switched discrete-time linear systems under arbitrary switching that is constrained within a set of admissible transitions. The class of systems studied includes the family of systems under arbitrary switching, periodic systems, and systems with minimum and maximum dwell-time specifications. To reach the result, we describe the set of rules that define the admissible transitions with a weighted directed graph [...]

  4. Reliability of impedance cardiography in measuring central haemodynamics

    DEFF Research Database (Denmark)

    Mehlsen, J; Bonde, J; Stadeager, C

    1991-01-01

    The purpose of the study described here was to investigate the reliability of impedance cardiography (IC) in measuring cardiac output (CO) and central blood volume. Absolute values and changes in these variables obtained by impedance cardiography and by isotope- or thermodilution techniques were [...] suitable for repeated measurements in studies on the haemodynamic effects of physiological or pharmacological intervention. Impedance cardiography is sufficiently reliable for comparison of absolute values of CO between different groups of patients. We cannot recommend impedance cardiography [...] healthy subjects and in 25 unmedicated patients with ischaemic heart disease. We obtained significant correlations between absolute values (y = 0.68x + 1.48) and changes (y = 1.00x + 0.0003) in CO measured by IC and isotope- or thermodilution. IC significantly overestimated absolute values of CO (P less...

  5. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    Science.gov (United States)

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  6. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. When the software system is large and the number of faults detected during the testing phase becomes large, the change in the number of faults detected and removed at each debugging becomes small compared with the initial fault content at the beginning of the testing phase. In such a situation, the software fault detection process can be modelled as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of the stochastic differential equation, performs comparatively better than existing NHPP-based models.
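
    A hedged sketch of the general idea, an Itô-type SDE for the cumulative number of detected faults simulated by Euler-Maruyama with a logistic detection rate, is shown below; the drift/diffusion form and every parameter value are assumptions for illustration, not the paper's estimated generalized Erlang model:

```python
import numpy as np

# Euler-Maruyama simulation of an Ito-type SDE of the generic form
#   dN = b(t) * (a - N) dt + sigma * (a - N) dW,
# with a logistic detection-rate function b(t). This is only a sketch of
# the general idea; the drift/diffusion form and every parameter value
# are assumptions, not the paper's estimated generalized Erlang model.

a, b, c, sigma = 100.0, 0.08, 10.0, 0.03   # assumed fault content, rates, noise
dt, T = 0.1, 100.0
steps = int(T / dt)

rng = np.random.default_rng(0)
N = np.zeros(steps + 1)                    # cumulative detected faults
for k in range(steps):
    t = k * dt
    bt = b / (1.0 + c * np.exp(-b * t))    # logistic detection rate
    dW = rng.normal(0.0, np.sqrt(dt))      # Brownian increment
    N[k + 1] = N[k] + bt * (a - N[k]) * dt + sigma * (a - N[k]) * dW

print(f"residual faults at T on this sample path ~ {a - N[-1]:.1f}")
```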

  7. Reliability and maintainability data acquisition in equipment development tests

    International Nuclear Information System (INIS)

    Haire, M.J.; Gift, E.H.

    1983-10-01

    The need for collection of reliability, maintainability, and availability data adds a new dimension to the data acquisition requirements of equipment development tests. This report describes the reliability and maintainability data that are considered necessary to ensure that sufficient and high quality data exist for a comprehensive, quantitative evaluation of equipment and system availability. These necessary data are presented as a set of data collection forms. Three data acquisition forms are discussed: an inventory and technical data form, which is filed by the design engineer when the design is finished or the equipment is received; an event report form, which is completed by the senior test operator at each shutdown; and a maintainability report, which is a collaborative effort between senior operators and lead engineers and is completed on restart. In addition, elements of a reliability, maintainability evaluation program are described. Emphasis is placed on the role of data, its storage, and use in such a program

  8. Designing the optimal bit: balancing energetic cost, speed and reliability.

    Science.gov (United States)

    Deshpande, Abhishek; Gopalkrishnan, Manoj; Ouldridge, Thomas E; Jones, Nick S

    2017-08-01

    We consider the challenge of operating a reliable bit that can be rapidly erased. We find that both erasing and reliability times are non-monotonic in the underlying friction, leading to a trade-off between erasing speed and bit reliability. Fast erasure is possible at the expense of low reliability at moderate friction, and high reliability comes at the expense of slow erasure in the underdamped and overdamped limits. Within a given class of bit parameters and control strategies, we define 'optimal' designs of bits that meet the desired reliability and erasing time requirements with the lowest operational work cost. We find that optimal designs always saturate the bound on the erasing time requirement, but can exceed the required reliability time if critically damped. The non-trivial geometry of the reliability and erasing time scales allows us to exclude large regions of parameter space as suboptimal. We find that optimal designs are either critically damped or close to critical damping under the erasing procedure.

  9. Selenide isotope generator for the Galileo mission. Reliability program plan

    International Nuclear Information System (INIS)

    1978-10-01

    The reliability program plan for the Selenide Isotope Generator (SIG) program is presented. It delineates the specific tasks that will be accomplished by Teledyne Energy Systems and its suppliers during design, development, fabrication and test of deliverable Radioisotopic Thermoelectric Generators (RTG), Electrical Heated Thermoelectric Generators (ETG) and associated Ground Support Equipment (GSE). The Plan is formulated in general accordance with procedures specified in DOE Reliability Engineering Program Requirements Publication No. SNS-2, dated June 17, 1974. The Reliability Program Plan presented herein defines the total reliability effort without further reference to Government Specifications. The reliability tasks to be accomplished are delineated herein and become the basis for contract compliance to the extent specified in the SIG contract Statement of Work

  10. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick-section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments and then, independently, we assess the likely variability in team-to-team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick-section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)
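
    One schematic way to combine the two ingredients discussed above (intrinsic capability and team-to-team variability) into an overall detection reliability is sketched below; both sets of figures are invented, and the paper's actual combination method is not reproduced:

```python
# Schematic combination of the two ingredients discussed above: an
# intrinsic capability (probability the technique can detect a given
# defect) and team-to-team variability (probability a team realises that
# capability when following the procedure). Both sets of figures are
# invented; the paper's actual combination method is not reproduced here.

p_capability = 0.95                        # assumed intrinsic capability
team_factors = [0.98, 0.90, 0.85, 0.95]    # assumed per-team compliance

per_team = [p_capability * f for f in team_factors]
print("per-team detection probabilities:",
      ", ".join(f"{p:.2f}" for p in per_team))
print(f"average inspection reliability ~ {sum(per_team) / len(per_team):.2f}")
```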

  11. Reliability centred maintenance of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Kovacs, Zoltan; Novakova, Helena; Hlavac, Pavol; Janicek, Frantisek

    2011-01-01

    A method for the optimization of preventive maintenance of nuclear power plant equipment, i.e. reliability centred maintenance, is described. The method enables procedures and procedure schedules to be defined such that the maintenance cost is minimized without compromising operational safety or reliability. Also, combinations of facilities which remain available and ensure reliable operation of the reactor unit during the maintenance of other pieces of equipment are identified. The condition-based maintenance concept is used in this process, thereby preventing unnecessary operator interventions into the equipment, which are often associated with human errors. Where probabilistic safety assessment is available, the most important structures, systems and components with the highest maintenance priority can be identified. (orig.)

  12. Scheduling of Crude Oil Operations in Refinery without Sufficient Charging Tanks Using Petri Nets

    Directory of Open Access Journals (Sweden)

    Yan An

    2017-05-01

    Full Text Available A short-term schedule for crude oil operations in a refinery should define and sequence the activities in detail. Each activity involves both discrete-event and continuous variables. The combinatorial nature of the scheduling problem makes it difficult to solve. For such a scheduling problem, charging tanks are a type of critical resources. If the number of charging tanks is not sufficient, the scheduling problem is further complicated. This work conducts a study on the scheduling problem of crude oil operations without sufficient charging tanks. In this case, to make a refinery able to operate, a charging tank has to be in simultaneous charging and feeding to a distiller for some time, called simultaneously-charging-and-feeding (SCF) mode, leading to disturbance to the oil distillation in distillers. A hybrid Petri net model is developed to describe the behavior of the system. Then, a scheduling method is proposed to find a schedule such that the SCF mode is minimally used. It is computationally efficient. An industrial case study is given to demonstrate the obtained results.
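
    A toy discrete Petri net (places, transitions, marking, firing rule) illustrating the modelling formalism is sketched below; the paper's model is a hybrid net with continuous flows, which this sketch does not attempt to reproduce, and the place and transition names are invented:

```python
# Toy discrete Petri net (places, transitions, marking, firing rule) to
# illustrate the formalism. The paper's model is a *hybrid* net with
# continuous flows, which this sketch does not attempt to reproduce; the
# place and transition names are invented.

marking = {"tank_empty": 1, "tank_charged": 0, "distiller_idle": 1}

transitions = {
    "charge_tank": {"pre": {"tank_empty": 1},
                    "post": {"tank_charged": 1}},
    "feed_distiller": {"pre": {"tank_charged": 1, "distiller_idle": 1},
                       "post": {"tank_empty": 1, "distiller_idle": 1}},
}

def enabled(name):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in transitions[name]["pre"].items())

def fire(name):
    """Consume input tokens and produce output tokens."""
    assert enabled(name), f"{name} is not enabled"
    for p, n in transitions[name]["pre"].items():
        marking[p] -= n
    for p, n in transitions[name]["post"].items():
        marking[p] = marking.get(p, 0) + n

fire("charge_tank")
fire("feed_distiller")
print(marking)
```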

  13. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  14. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Science.gov (United States)

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction. PMID:29495599

  15. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Directory of Open Access Journals (Sweden)

    Agustín Zaballos

    2018-02-01

    Full Text Available Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  16. A guide to reliability data collection, validation and storage

    International Nuclear Information System (INIS)

    Stevens, B.

    1986-01-01

    The EuReDatA Working Group produced a basic document that addressed many of the problems associated with the design of a suitable data collection scheme to achieve pre-defined objectives. The book that resulted from this work describes the need for reliability data, data sources and collection procedures, component description and classification, form design, data management, updating and checking procedures, the estimation of failure rates, availability and utilisation factors, and uncertainties in reliability parameters. (DG)
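
    The kind of estimators such a data collection scheme feeds (failure rate, MTBF, MTTR, availability) can be written down directly; the event counts and hours below are invented for illustration:

```python
# Illustrative estimators of the kind such a data collection scheme
# feeds: point failure rate, MTBF, MTTR and availability derived from
# event records. The counts and hours below are invented.

failures = 4
operating_hours = 17_520.0     # two years of operation
repair_hours = 36.0            # total downtime spent on repairs

failure_rate = failures / operating_hours      # failures per hour
mtbf = operating_hours / failures              # mean time between failures
mttr = repair_hours / failures                 # mean time to repair
availability = mtbf / (mtbf + mttr)

print(f"lambda ~ {failure_rate:.2e}/h, MTBF ~ {mtbf:.0f} h, "
      f"MTTR ~ {mttr:.0f} h, A ~ {availability:.4f}")
```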

  17. Development of web-based reliability data base platform

    International Nuclear Information System (INIS)

    Hwang, Seok Won; Lee, Chang Ju; Sung, Key Yong

    2004-01-01

    Probabilistic safety assessment (PSA) is a systematic technique which estimates the degree of risk impact to the public due to an accident scenario. Estimating the occurrence frequencies and consequences of potential scenarios requires a thorough analysis of the accident details and all fundamental parameters. The robustness of PSA in revealing weaknesses in a design and its operation allows a better informed and balanced decision to be reached. The fundamental parameters for PSA, such as the component failure rates, should be estimated from evidence collected steadily throughout the operational period. However, since the data from any single plant are not sufficient to provide an adequate PSA result, in practice the operating data of the whole fleet are commonly used to estimate the reliability parameters for the same type of components. The reliability data for any component type fall into two categories: the generic data, based on the operating experience of all plants, and the plant-specific data, based on the operation of the specific plant of interest. Generic data are especially essential for new or recently built nuclear power plants (NPPs). Generally, a reliability data base may be categorized into component reliability, initiating event frequencies, human performance, and so on. Among these, component reliability is a key element because it has the most abundant population. Component reliability data are therefore essential to the quantification of accident sequences because they become inputs to the various basic events that constitute the fault tree

  18. Behavioral reliability program for the nuclear industry. Technical report

    International Nuclear Information System (INIS)

    Buchanan, J.C.; Davis, S.O.; Dunnette, M.D.; Meyer, P.; Sharac, J.

    1981-07-01

    The subject of the study was the development of standards for a behavioral observation program which could be used by the NRC-licensed nuclear industry to detect indications of emotional instability in its employees who have access to protected and vital areas. Emphasis was placed on those observable characteristics which could be assessed by supervisors or peers in a work environment. The behavioral reliability program, as defined in this report, encompasses the concept and basic components of the program, the definition of the behavioral reliability program, the definition of the behavioral reliability criterion, and a set of instructions for the creation and implementation of the program by an individual facility

  19. Reliability In A White Rabbit Network

    CERN Document Server

    Lipiński, M; Wlostowski, T; Prados, C

    2011-01-01

    White Rabbit (WR) is a time-deterministic, low-latency Ethernet-based network which enables transparent, sub-ns accuracy timing distribution. It is being developed to replace the General Machine Timing (GMT) system currently used at CERN and will become the foundation for the control system of the Facility for Antiproton and Ion Research (FAIR) at GSI. High reliability is an important issue in WR’s design, since unavailability of the accelerator’s control system will directly translate into expensive downtime of the machine. A typical WR network is required to lose not more than a single message per year. Due to WR’s complexity, the translation of this real-world requirement into a reliability requirement constitutes an interesting issue on its own – a WR network is considered functional only if it provides all its services to all its clients at any time. This paper defines reliability in WR and describes how it was addressed by dividing it into sub-domains: deterministic packet delivery, data resilience...
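
    A back-of-envelope translation of the "no more than a single lost message per year" requirement into a per-message loss probability, for an assumed control-message rate (the actual WR/GMT rate is not given in the record), looks like this:

```python
# Back-of-envelope translation of "no more than a single lost message
# per year" into a per-message loss probability. The message rate is an
# assumption; the actual WR/GMT control-message rate is not given here.

messages_per_second = 1_000                       # assumed rate
seconds_per_year = 365.25 * 24 * 3600
messages_per_year = messages_per_second * seconds_per_year
max_loss_probability = 1.0 / messages_per_year

print(f"required per-message loss probability < {max_loss_probability:.1e}")
```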

  20. Reliability-based assessment of polyethylene pipe creep lifetime

    International Nuclear Information System (INIS)

    Khelif, Rabia; Chateauneuf, Alaa; Chaoui, Kamel

    2007-01-01

    Lifetime management of underground pipelines is mandatory for safe hydrocarbon transmission and distribution systems. The use of high-density polyethylene tubes subjected to internal pressure, external loading and environmental variations requires a reliability study in order to define the service limits and the optimal operating conditions. In service, the time-dependent phenomena, especially creep, take place during the pipe lifetime, leading to significant strength reduction. In this work, the reliability-based assessment of pipe lifetime models is carried out, in order to propose a probabilistic methodology for lifetime model selection and to determine the pipe safety levels as well as the most important parameters for pipeline reliability. This study is enhanced by parametric analysis on pipe configuration, gas pressure and operating temperature

  1. Reliability-based assessment of polyethylene pipe creep lifetime

    Energy Technology Data Exchange (ETDEWEB)

    Khelif, Rabia [LaMI-UBP and IFMA, Campus de Clermont-Fd, Les Cezeaux, BP 265, 63175 Aubiere Cedex (France); LR3MI, Departement de Genie Mecanique, Universite Badji Mokhtar, BP 12, Annaba 23000 (Algeria)], E-mail: rabia.khelif@ifma.fr; Chateauneuf, Alaa [LGC-University Blaise Pascal, Campus des Cezeaux, BP 206, 63174 Aubiere Cedex (France)], E-mail: alaa.chateauneuf@polytech.univ-bpclermont.fr; Chaoui, Kamel [LR3MI, Departement de Genie Mecanique, Universite Badji Mokhtar, BP 12, Annaba 23000 (Algeria)], E-mail: chaoui@univ-annaba.org

    2007-12-15

    Lifetime management of underground pipelines is mandatory for safe hydrocarbon transmission and distribution systems. The use of high-density polyethylene tubes subjected to internal pressure, external loading and environmental variations requires a reliability study in order to define the service limits and the optimal operating conditions. In service, the time-dependent phenomena, especially creep, take place during the pipe lifetime, leading to significant strength reduction. In this work, the reliability-based assessment of pipe lifetime models is carried out, in order to propose a probabilistic methodology for lifetime model selection and to determine the pipe safety levels as well as the most important parameters for pipeline reliability. This study is enhanced by parametric analysis on pipe configuration, gas pressure and operating temperature.
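
    A minimal Monte Carlo sketch of the kind of limit-state reliability estimate used in such assessments is given below, with an assumed creep-reduced strength distribution and hoop-stress loading; all distributions, dimensions and numbers are illustrative and are not taken from the paper:

```python
import numpy as np

# Minimal Monte Carlo sketch of a limit-state reliability estimate,
# g = creep-reduced strength - hoop stress, of the general kind used in
# such lifetime assessments. All distributions, dimensions and numbers
# are assumed for illustration and are not taken from the paper.

rng = np.random.default_rng(1)
n = 200_000
strength = rng.lognormal(mean=np.log(2.8), sigma=0.08, size=n)   # MPa, assumed
pressure = rng.normal(0.4, 0.03, size=n)                         # MPa, assumed
diameter, thickness = 110.0, 10.0                                # mm, assumed
stress = pressure * (diameter - thickness) / (2.0 * thickness)   # thin-wall hoop stress

pf = np.mean(strength <= stress)            # fraction of sampled failures
print(f"estimated probability of failure ~ {pf:.1e}")
```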

  2. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    [...] Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS [...] that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can [...] competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17), and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  3. Prediction of software operational reliability using testing environment factor

    International Nuclear Information System (INIS)

    Jung, Hoan Sung

    1995-02-01

    Software reliability is especially important to customers these days. The need to quantify the software reliability of safety-critical systems has received special attention, and reliability is rated as one of software's most important attributes. Since software is an intellectual product of human activity, and since it is logically complex, failures are inevitable. No standard models have been established to prove the correctness and to estimate the reliability of software systems by analysis and/or testing. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability from the failure data collected during testing, assuming that the test environments represent the operational profile well. The user's interest, however, is in the operational reliability rather than the test reliability. Experience shows that operational reliability is higher than test reliability. With the assumption that this difference in reliability results from the change of environment, a testing environment factor comprising an aging factor and a coverage factor is defined in this work to predict the ultimate operational reliability from the failure data. This is done by incorporating test environments applied beyond the operational profile into the testing environment factor. Test reliability can also be estimated with this approach without any model change. The application results are close to the actual data. The approach used in this thesis is expected to be applicable to ultra-high-reliability software systems used in nuclear power plants, airplanes, and other safety-critical applications

  4. Reliability of self-rated tinnitus distress and association with psychological symptom patterns.

    Science.gov (United States)

    Hiller, W; Goebel, G; Rief, W

    1994-05-01

    Psychological complaints were investigated in two samples of 60 and 138 in-patients suffering from chronic tinnitus. We administered the Tinnitus Questionnaire (TQ), a 52-item self-rating scale which differentiates between dimensions of emotional and cognitive distress, intrusiveness, auditory perceptual difficulties, sleep disturbances and somatic complaints. The test-retest reliability was .94 for the TQ global score and between .86 and .93 for subscales. Three independent analyses were conducted to estimate the split-half reliability (internal consistency), which was only slightly lower than the test-retest values for scales with a relatively small number of items. Reliability was also sufficient at the level of single items. Low correlations between the TQ and the Hopkins Symptom Checklist (SCL-90-R) indicate a distinct quality of tinnitus-related and general psychological disturbances.

  5. Quality control methods in accelerometer data processing: defining minimum wear time.

    Directory of Open Access Journals (Sweden)

    Carly Rich

    Full Text Available BACKGROUND: When using accelerometers to measure physical activity, researchers need to determine whether subjects have worn their device for a sufficient period to be included in analyses. We propose a minimum wear criterion using population-based accelerometer data, and explore the influence of gender and the purposeful inclusion of children with weekend data on reliability. METHODS: Accelerometer data obtained during the age seven sweep of the UK Millennium Cohort Study were analysed. Children were asked to wear an ActiGraph GT1M accelerometer for seven days. Reliability coefficients (r) of mean daily counts/minute were calculated using the Spearman-Brown formula based on the intraclass correlation coefficient. An r of 1.0 indicates that all the variation is between- rather than within-children and that measurement is 100% reliable. An r of 0.8 is often regarded as acceptable reliability. Analyses were repeated on data from children who met different minimum daily wear times (one to 10 hours) and wear days (one to seven days). Analyses were conducted for all children, separately for boys and girls, and separately for children with and without weekend data. RESULTS: At least one hour of wear time data was obtained from 7,704 singletons. Reliability increased as the minimum number of days and the daily wear time increased. A high reliability (r = 0.86) and sample size (n = 6,528) was achieved when children with ≥ two days lasting ≥10 hours/day were included in analyses. Reliability coefficients were similar for both genders. Purposeful sampling of children with weekend data resulted in comparable reliabilities to those calculated independent of weekend wear. CONCLUSION: Quality control procedures should be undertaken before analysing accelerometer data in large-scale studies. Using data from children with ≥ two days lasting ≥10 hours/day should provide reliable estimates of physical activity. It's unnecessary to include only children
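
    The Spearman-Brown prophecy formula named above gives the reliability of a k-day average from a single-day ICC; a small sketch with an assumed single-day ICC (not a value reported by the study) is:

```python
# Spearman-Brown prophecy formula as named above: reliability of the
# mean over k wear days, given the single-day intraclass correlation r1.
# The r1 value is an assumption for illustration, not a study result.

def spearman_brown(r1, k):
    return k * r1 / (1.0 + (k - 1) * r1)

r1 = 0.75                                  # assumed single-day ICC
for k in (1, 2, 4, 7):
    print(f"{k} wear day(s): reliability ~ {spearman_brown(r1, k):.2f}")
```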

  6. Research of radioecological processes by methods of the theory of reliability

    International Nuclear Information System (INIS)

    Kutlakhmedov, Yu.A.; Salivon, A.G.; Pchelovskaya, S.A.; Rodina, V.V.; Bevza, A.G.; Matveeva, I.V.

    2012-01-01

    The theory and models of ecosystem radiocapacity, combined with the theory and models of reliability, have made it possible to adequately describe the laws of migration and distribution of radionuclides for different types of ecosystems of reservoirs and land. The theory and models of radiocapacity allow the critical elements of an ecosystem, where temporary or final depositing of radionuclides is to be expected, to be strictly defined. The approach based on biogenic tracers allows, within the framework of the theory and models of radiocapacity and reliability, the processes of radionuclide migration to be estimated simultaneously, the dose loads on the ecosystem biota to be defined, and the fundamental parameters of the redistribution rates of radionuclides and other pollutants in different types of ecosystems to be established.

  7. The reliability of the Adelaide in-shoe foot model.

    Science.gov (United States)

    Bishop, Chris; Hillier, Susan; Thewlis, Dominic

    2017-07-01

    Understanding the biomechanics of the foot is essential for many areas of research and clinical practice such as orthotic interventions and footwear development. Despite the widespread attention paid to the biomechanics of the foot during gait, what largely remains unknown is how the foot moves inside the shoe. This study investigated the reliability of the Adelaide In-Shoe Foot Model, which was designed to quantify in-shoe foot kinematics and kinetics during walking. Intra-rater reliability was assessed in 30 participants over five walking trials whilst wearing shoes during two data collection sessions, separated by one week. Sufficient reliability for use was interpreted as a coefficient of multiple correlation and intra-class correlation coefficient of >0.61. Inter-rater reliability was investigated separately in a second sample of 10 adults by two researchers with experience in applying markers for the purpose of motion analysis. The results indicated good consistency in waveform estimation for most kinematic and kinetic data, as well as good inter- and intra-rater reliability. The exceptions are the peak medial ground reaction force, the minimum abduction angle and the peak abduction/adduction external hindfoot joint moments, which resulted in less than acceptable repeatability. Based on our results, the Adelaide in-shoe foot model can be used with confidence for 24 commonly measured biomechanical variables during shod walking. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Dementia caregiver burden: reliability of the Brazilian version of the Zarit caregiver burden interview

    Directory of Open Access Journals (Sweden)

    Taub Anita

    2004-01-01

    Full Text Available The objective of this article is to examine the reliability of the Brazilian version of the Zarit Caregiver Burden Interview (ZBI). The instrument is a 22-item scale assessing the extent to which caregivers view their responsibilities as having an adverse impact on their social life, health, emotional well-being, and finances. We assessed 50 primary informal caregivers of demented patients coming from 3 different health care centers, using the test-retest method. Analysis of the results showed an intraclass reliability coefficient of 0.88, while Cronbach's coefficient alpha was 0.77 for the test and 0.80 for the retest items. The Brazilian version of ZBI shows sufficient reliability, comparable to the original version.
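
    Cronbach's coefficient alpha reported above can be computed directly from an item-response matrix; the small response matrix below is invented for illustration and is not ZBI data:

```python
import numpy as np

# Cronbach's coefficient alpha for a k-item scale, computed from an
# item-response matrix (rows = respondents, columns = items). The small
# matrix below is invented for illustration and is not ZBI data.

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

demo = [[3, 4, 2, 4],
        [2, 2, 1, 3],
        [4, 4, 3, 4],
        [1, 2, 1, 2],
        [3, 3, 2, 3]]
print(f"alpha ~ {cronbach_alpha(demo):.2f}")
```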

  9. Pre-Proposal Assessment of Reliability for Spacecraft Docking with Limited Information

    Science.gov (United States)

    Brall, Aron

    2013-01-01

    This paper addresses the problem of estimating the reliability of a critical system function, as well as its impact on the system reliability, when limited information is available. The approach addresses the basic function reliability, and then the impact of multiple attempts to accomplish the function. The dependence of subsequent attempts on prior failure to accomplish the function is also addressed. The autonomous docking of two spacecraft was the specific example that generated the inquiry. The resulting impact on total reliability generated substantial interest in presenting the results, because overall performance proved relatively insensitive to the basic function reliability and to moderate degradation, given sufficient attempts to accomplish the required goal. The application of the methodology allows proper emphasis to be placed on the characteristics that can be estimated with some knowledge, and insulates the integrity of the design from those characteristics that cannot be properly estimated with any rational value of uncertainty. The nature of NASA's missions contains a great deal of uncertainty due to the pursuit of new science or operations. This approach can be applied to any function where multiple attempts at success, with or without degradation, are allowed.
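
    The "multiple attempts with degradation" reasoning can be made concrete with a short sketch; the first-attempt probability and degradation factor below are invented, not the paper's estimates:

```python
# Hedged sketch of the "multiple attempts with degradation" reasoning:
# overall probability of eventually docking when each retry after a
# failure succeeds with a degraded probability. The first-attempt
# probability and degradation factor are invented, not the paper's.

def mission_success(p_first, degradation, attempts):
    p_fail_all = 1.0
    p = p_first
    for _ in range(attempts):
        p_fail_all *= (1.0 - p)
        p *= degradation        # each retry assumed slightly less likely to work
    return 1.0 - p_fail_all

for n in (1, 2, 3, 5):
    print(f"{n} attempt(s): P(dock) ~ {mission_success(0.90, 0.95, n):.4f}")
```

    Even with these made-up numbers, the output illustrates the point in the abstract: once a few retries are allowed, the overall figure becomes far less sensitive to the single-attempt reliability.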

  10. Damage Model for Reliability Assessment of Solder Joints in Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    [...] environmental factors. Reliability assessment for such products is conventionally performed by classical reliability techniques based on test data. Usually, conventional reliability approaches are time- and resource-consuming activities. Thus, in this paper we choose a physics-of-failure approach to define [...] damage model by Miner's rule. Our attention is focused on crack propagation in solder joints of electrical components due to temperature loadings. Based on the proposed method, it is described how to find the damage level for a given temperature loading profile. The proposed method is discussed
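
    Miner's rule, referred to above, accumulates damage linearly as D = Σ n_i/N_i with failure declared at D ≥ 1; a small sketch with invented cycle counts follows:

```python
# Linear damage accumulation (Miner's rule) of the kind the abstract
# refers to: each temperature-cycle bin contributes n_i / N_i and the
# solder joint is declared failed once the sum reaches 1. The cycle
# counts and cycles-to-failure values are invented.

cycle_bins = [
    # (observed cycles n_i, cycles-to-failure N_i at that load level)
    (2.0e4, 5.0e5),
    (5.0e3, 8.0e4),
    (1.0e3, 2.0e4),
]

damage = sum(n / N for n, N in cycle_bins)
print(f"accumulated damage D ~ {damage:.3f} "
      f"({'failed' if damage >= 1.0 else 'intact'})")
```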

  11. Cost-effective solutions to maintaining smart grid reliability

    Science.gov (United States)

    Qin, Qiu

    As aging power systems increasingly work closer to their capacity and thermal limits, maintaining sufficient reliability has been of great concern to government agencies, utility companies and users. This dissertation focuses on improving the reliability of transmission and distribution systems. Based on wide-area measurements, multiple model algorithms are developed to diagnose transmission-line three-phase short-to-ground faults in the presence of protection misoperations. The multiple model algorithms utilize the electric network dynamics to provide prompt and reliable diagnosis outcomes. The computational complexity of the diagnosis algorithm is reduced by using a two-step heuristic. The multiple model algorithm is incorporated into a hybrid simulation framework, which consists of both continuous state simulation and discrete event simulation, to study the operation of transmission systems. With hybrid simulation, a line switching strategy for enhancing the tolerance to protection misoperations is studied based on the concept of a security index, which involves the faulted mode probability and stability coverage. Local measurements are used to track the generator state, and faulted mode probabilities are calculated in the multiple model algorithms. FACTS devices are considered as controllers for the transmission system. The placement of FACTS devices into power systems is investigated with a criterion of maintaining a prescribed level of control reconfigurability. Control reconfigurability measures the small-signal combined controllability and observability of a power system with an additional requirement on fault tolerance. For the distribution systems, a hierarchical framework, including a high-level recloser allocation scheme and a low-level recloser placement scheme, is presented. The impacts of recloser placement on the reliability indices are analyzed. Evaluation of reliability indices in the placement process is carried out via discrete event

  12. Information geometry and sufficient statistics

    Czech Academy of Sciences Publication Activity Database

    Ay, N.; Jost, J.; Le, Hong-Van; Schwachhöfer, L.

    2015-01-01

    Roč. 162, 1-2 (2015), s. 327-364 ISSN 0178-8051 Institutional support: RVO:67985840 Keywords : Fisher quadratic form * Amari-Chentsov tensor * sufficient statistic Subject RIV: BA - General Mathematics Impact factor: 2.204, year: 2015 http://link.springer.com/article/10.1007/s00440-014-0574-8

  13. Multi-objective reliability redundancy allocation in an interval environment using particle swarm optimization

    International Nuclear Information System (INIS)

    Zhang, Enze; Chen, Qingwei

    2016-01-01

    Most of the existing works addressing reliability redundancy allocation problems are based on the assumption of fixed reliabilities of components. In real-life situations, however, the reliabilities of individual components may be imprecise, most often given as intervals, under different operating or environmental conditions. This paper deals with reliability redundancy allocation problems modeled in an interval environment. An interval multi-objective optimization problem is formulated from the original crisp one, where system reliability and cost are simultaneously considered. To render the multi-objective particle swarm optimization (MOPSO) algorithm capable of dealing with interval multi-objective optimization problems, a dominance relation for interval-valued functions is defined with the help of our newly proposed order relations of interval-valued numbers. Then, the crowding distance is extended to the multi-objective interval-valued case. Finally, the effectiveness of the proposed approach has been demonstrated through two numerical examples and a case study of supervisory control and data acquisition (SCADA) system in water resource management. - Highlights: • We model the reliability redundancy allocation problem in an interval environment. • We apply the particle swarm optimization directly on the interval values. • A dominance relation for interval-valued multi-objective functions is defined. • The crowding distance metric is extended to handle imprecise objective functions.
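
    One simple way to order interval-valued objectives (compare midpoints, then radii) is sketched below purely as an illustration; it is not necessarily the order relation proposed in the paper, and the interval unreliabilities of the two designs are invented:

```python
# One simple way to order interval-valued objectives for minimisation:
# compare interval midpoints first and radii second, so the tighter
# interval wins when the centres tie. This is an illustration only and
# is not necessarily the order relation proposed in the paper; the
# interval unreliabilities of the two designs are invented.

def interval_leq(a, b):
    mid_a, mid_b = (a[0] + a[1]) / 2, (b[0] + b[1]) / 2
    rad_a, rad_b = (a[1] - a[0]) / 2, (b[1] - b[0]) / 2
    return (mid_a, rad_a) <= (mid_b, rad_b)

design_1 = (0.010, 0.020)   # interval system unreliability of design 1
design_2 = (0.012, 0.016)   # interval system unreliability of design 2
print("design 1 preferred" if interval_leq(design_1, design_2)
      else "design 2 preferred")
```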

  14. Enough is as good as a feast - sufficiency as policy

    Energy Technology Data Exchange (ETDEWEB)

    Darby, Sarah [Lower Carbon Futures, Environmental Change Inst., Oxford Univ. Centre for the Environment (United Kingdom)

    2007-07-01

    The concept of sufficiency has a long history, related as it is to the timeless issues of how best to distribute and use resources. Where energy is concerned, absolute reductions in demand are increasingly seen as necessary in response to climate change and energy security concerns. There is an acknowledgement that, collectively if not individually, humans have gone beyond safe limits in their use of fuels. The relatively wealthy and industrialised nations urgently need to move beyond a primary focus on efficiency to the more contentious issues surrounding demand reduction and sufficiency. The paper considers definitions of energy sufficiency, looks at a recent attempt to model future energy use in terms of efficiency and sufficiency, and discusses quantitative and qualitative aspects of sufficiency and how they might become institutionalised. There are many arguments in favour of sufficiency but they often founder in the face of political requirements for market growth and the employment generated by it. Some options for 'sufficiency policy' are selected, including a focus on energy in relation to livelihoods, energy implications of our use of time and making energy use more transparent.

  15. A critique of reliability prediction techniques for avionics applications

    Directory of Open Access Journals (Sweden)

    Guru Prasad PANDIAN

    2018-01-01

    Full Text Available Avionics (aeronautics and aerospace) industries must rely on components and systems of demonstrated high reliability. For this, handbook-based methods have traditionally been used to design for reliability, develop test plans, and define maintenance requirements and sustainment logistics. However, these methods have been criticized as flawed and leading to inaccurate and misleading results. In its recent report on enhancing defense system reliability, the U.S. National Academy of Sciences has discredited these methods, judging the Military Handbook (MIL-HDBK-217) and its progeny as invalid and inaccurate. This paper discusses the issues that arise with the use of handbook-based methods in commercial and military avionics applications. Alternative approaches to reliability design (and its demonstration) are also discussed, including similarity analysis, testing, physics-of-failure, and data analytics for prognostics and systems health management.

  16. Reliability benefits of dispersed wind resource development

    International Nuclear Information System (INIS)

    Milligan, M.; Artig, R.

    1998-05-01

    Generating capacity that is available during the utility peak period is worth more than off-peak capacity. Wind power from a single location might not be available during enough of the peak period to provide sufficient value. However, if the wind power plant is developed over geographically dispersed locations, the timing and availability of wind power from these multiple sources could provide a better match with the utility's peak load than a single site. There are other issues that arise when considering dispersed wind plant development. Single-site development can result in economies of scale and might reduce the costs of obtaining multiple permits and multiple interconnections. However, dispersed development can result in cost efficiencies if interconnection can be accomplished at lower voltages or at locations closer to load centers. Several wind plants are in various stages of planning or development in the US. Although some of these are small-scale demonstration projects, significant wind capacity has been developed in Minnesota, with additional developments planned in Wyoming, Iowa and Texas. As these and other projects are planned and developed, there is a need to analyze the effect of geographically dispersed sites on the reliability of the overall wind plant. This paper uses a production-cost/reliability model to analyze the reliability of several wind sites in the state of Minnesota. The analysis finds that the use of a model with traditional reliability measures does not produce consistent, robust results. An approach based on fuzzy set theory is applied in this paper, with improved results. Using such a model, the authors find that system reliability can be optimized with a mix of dispersed wind sites

  17. When is there sufficient information from the Site Investigations?

    International Nuclear Information System (INIS)

    Andersson, Johan; Munier, Raymond; Stroem, Anders; Soederbaeck, Bjoern; Almen, Karl-Erik; Olsson, Lars

    2004-04-01

    SKB has started site investigations for a deep repository for spent nuclear fuel at two different sites in Sweden. The investigations should provide necessary information for a licence application aimed at starting underground exploration. The investigations and analyses of them are supposed to provide the broad knowledge base that is required to achieve the overall goals of the site investigation phase. The knowledge will be utilized to evaluate the suitability of investigated sites for the deep repository and must be comprehensive enough to: Show whether the selected site satisfies requirements on safety and technical aspects. Serve as a basis for adaptation of the deep repository to the characteristics of the site with an acceptable impact on society and the environment. Permit comparisons with other investigated sites. Furthermore, the investigations are discontinued when the reliability of the site description has reached such a level that the body of data for safety assessment and design is sufficient, or until the body of data shows that the rock does not satisfy the requirements. These objectives are valid, but do not provide sufficient and concrete guidance. For this reason SKB has conducted this project which should acquire concrete guidance on how to judge when the surface based Site Investigation Phase does not need to continue. After a general assessment of the problem, the following specific objectives of the current work were identified: Demonstrate concretely how the assessed uncertainties in a Site Description based on a specific level of investigations, together with expected feedback from Safety Assessment and Engineering, can be used to decide whether the site investigations are sufficient - or need to continue. This demonstration will be based on a practical application of relevant aspects of decision analysis tools. Highlight and make concrete the type of feedback to be expected from Safety Assessment and Engineering and show how this feedback

  18. When is there sufficient information from the Site Investigations?

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Munier, Raymond; Stroem, Anders; Soederbaeck, Bjoern [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Almen, Karl-Erik [KEA Geo-konsult (Sweden); Olsson, Lars [Geostatistik AB, Tumba (Sweden)

    2004-04-01

    SKB has started site investigations for a deep repository for spent nuclear fuel at two different sites in Sweden. The investigations should provide necessary information for a licence application aimed at starting underground exploration. The investigations and analyses of them are supposed to provide the broad knowledge base that is required to achieve the overall goals of the site investigation phase. The knowledge will be utilized to evaluate the suitability of investigated sites for the deep repository and must be comprehensive enough to: Show whether the selected site satisfies requirements on safety and technical aspects. Serve as a basis for adaptation of the deep repository to the characteristics of the site with an acceptable impact on society and the environment. Permit comparisons with other investigated sites. Furthermore, the investigations are discontinued when the reliability of the site description has reached such a level that the body of data for safety assessment and design is sufficient, or until the body of data shows that the rock does not satisfy the requirements. These objectives are valid, but do not provide sufficient and concrete guidance. For this reason SKB has conducted this project which should acquire concrete guidance on how to judge when the surface based Site Investigation Phase does not need to continue. After a general assessment of the problem, the following specific objectives of the current work were identified: Demonstrate concretely how the assessed uncertainties in a Site Description based on a specific level of investigations, together with expected feedback from Safety Assessment and Engineering, can be used to decide whether the site investigations are sufficient - or need to continue. This demonstration will be based on a practical application of relevant aspects of decision analysis tools. Highlight and make concrete the type of feedback to be expected from Safety Assessment and Engineering and show how this feedback

  19. Portfolio assessment during medical internships: How to obtain a reliable and feasible assessment procedure?

    Science.gov (United States)

    Michels, Nele R M; Driessen, Erik W; Muijtjens, Arno M M; Van Gaal, Luc F; Bossaert, Leo L; De Winter, Benedicte Y

    2009-12-01

    A portfolio is used to mentor and assess students' clinical performance at the workplace. However, students and raters often perceive the portfolio as a time-consuming instrument. In this study, we investigated whether assessment during medical internship by a portfolio can combine reliability and feasibility. The domain-oriented reliability of 61 double-rated portfolios was measured, using a generalisability analysis with portfolio tasks and raters as sources of variation in measuring the performance of a student. We obtained a reliability (Phi coefficient) of 0.87 with this internship portfolio containing 15 double-rated tasks. The generalisability analysis showed that an acceptable level of reliability (Phi = 0.80) was maintained when the number of portfolio tasks was decreased to 13 or 9 using one and two raters, respectively. Our study shows that a portfolio can be a reliable method for the assessment of workplace learning. The possibility of reducing the number of tasks or raters while maintaining a sufficient level of reliability suggests an increase in the feasibility of portfolio use for both students and raters.
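
    As a rough illustration of the decision-study arithmetic behind such Phi values, the sketch below applies the standard index-of-dependability formula for a fully crossed person x task x rater design; the variance components are hypothetical placeholders, not the study's estimates.

```python
# Minimal D-study sketch for a fully crossed person x task x rater design.
# Only the Phi formula itself is standard; the variance components below
# are illustrative placeholders, not values estimated in the study.

def phi_coefficient(var, n_tasks, n_raters):
    """Index of dependability (Phi) for absolute decisions."""
    absolute_error = (
        var["t"] / n_tasks
        + var["r"] / n_raters
        + var["pt"] / n_tasks
        + var["pr"] / n_raters
        + var["tr"] / (n_tasks * n_raters)
        + var["ptr_e"] / (n_tasks * n_raters)
    )
    return var["p"] / (var["p"] + absolute_error)

# Hypothetical variance components (person, task, rater, interactions, residual).
components = {"p": 0.30, "t": 0.05, "r": 0.02, "pt": 0.20, "pr": 0.05,
              "tr": 0.01, "ptr_e": 0.25}

for n_tasks, n_raters in [(15, 2), (13, 1), (9, 2)]:
    print(n_tasks, n_raters, round(phi_coefficient(components, n_tasks, n_raters), 2))
```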

  20. Reliability Analysis for Adhesive Bonded Composite Stepped Lap Joints Loaded in Fatigue

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Sørensen, John Dalsgaard; Lund, Erik

    2012-01-01

    This paper describes a probabilistic approach to calculate the reliability of adhesive bonded composite stepped lap joints loaded in fatigue using three-dimensional finite element analysis (FEA). A method for progressive damage modelling is used to assess fatigue damage accumulation and residual strength. Asymptotic sampling is used to estimate the reliability with support points generated by randomized Sobol sequences. The predicted reliability level is compared with the implicitly required target reliability level defined by the wind turbine standard IEC 61400-1, where partial safety factors are introduced together with characteristic values. Finally, an approach for the assessment of the reliability of adhesive bonded composite stepped lap joints loaded in fatigue is presented. The introduced methodology can be applied in the same way to calculate the reliability level of wind turbine blade components.

  1. Optimal reliability design for over-actuated systems based on the MIT rule: Application to an octocopter helicopter testbed

    International Nuclear Information System (INIS)

    Chamseddine, Abbas; Theilliol, Didier; Sadeghzadeh, Iman; Zhang, Youmin; Weber, Philippe

    2014-01-01

    This paper addresses the problem of optimal reliability in over-actuated systems. Overloading an actuator decreases its overall lifetime and reduces its average performance over a long time. Therefore, performance and reliability are two conflicting requirements. While appropriate reliability is related to average loads, good performance is related to fast response and sufficient loads generated by actuators. Actuator redundancy allows us to address both performance and reliability at the same time by properly allocating desired loads among redundant actuators. The main contribution of this paper is the on-line optimization of the overall plant reliability according to performance objective using an MIT (Massachusetts Institute of Technology) rule-based method. The effectiveness of the proposed method is illustrated through an experimental application to an octocopter helicopter testbed
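
    The MIT rule referred to above is a gradient-based adaptation law. The sketch below shows it in its classic textbook form (adaptation of a single feedforward gain), which is only a stand-in for the paper's actuator-allocation scheme; the plant, gains and reference signal are all assumed values.

```python
import numpy as np

# Classic scalar MIT-rule illustration (adaptation of a feedforward gain),
# not the paper's allocation method: plant y = kp*u, reference model
# ym = km*r, controller u = theta*r, and the MIT rule
# d(theta)/dt = -gamma * e * ym with e = y - ym.
kp, km, gamma, dt = 2.0, 1.0, 0.5, 0.01
theta = 0.0
r = 1.0                               # constant reference input
for _ in range(5000):
    u = theta * r
    y = kp * u                        # static plant for simplicity
    ym = km * r                       # reference model output
    e = y - ym
    theta += dt * (-gamma * e * ym)   # MIT-rule gradient update

print(round(theta, 3))                # converges toward km/kp = 0.5
```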

  2. Exploring Societal Preferences for Energy Sufficiency Measures in Switzerland

    International Nuclear Information System (INIS)

    Moser, Corinne; Rösch, Andreas; Stauffacher, Michael

    2015-01-01

    Many countries are facing a challenging transition toward more sustainable energy systems, which produce more renewables and consume less energy. The latter goal can only be achieved through a combination of efficiency measures and changes in people’s lifestyles and routine behaviors (i.e., sufficiency). While research has shown that acceptance of technical efficiency is relatively high, there is a lack of research on societal preferences for sufficiency measures. However, this is an important prerequisite for designing successful interventions to change behavior. This paper analyses societal preferences for different energy-related behaviors in Switzerland. We use an online choice-based conjoint analysis (N = 150) to examine preferences for behaviors with high technical potentials for energy demand reduction in the following domains: mobility, heating, and food. Each domain comprises different attributes across three levels of sufficiency. Respondents were confronted with trade-off situations evoked through different fictional lifestyles that comprised different combinations of attribute levels. Through a series of trade-off decisions, participants were asked to choose their preferred lifestyle. The results revealed that a vegetarian diet was considered the most critical issue that respondents were unwilling to trade off, followed by distance to workplace and means of transportation. The highest willingness to trade off was found for adjustments in room temperature, holiday travel behaviors, and living space. Participants’ preferences for the most energy-sufficient lifestyles were rather low. However, the study showed that there were lifestyles with substantive energy-saving potentials that were well accepted among respondents. Our study results suggest that the success of energy-sufficiency interventions might depend strongly on the targeted behavior. We speculate that they may face strong resistance (e.g., vegetarian diet). Thus, it seems promising to

  3. Exploring Societal Preferences for Energy Sufficiency Measures in Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Moser, Corinne, E-mail: corinne.moser@zhaw.ch [Institute of Sustainable Development, School of Engineering, Zurich University of Applied Sciences, Winterthur (Switzerland); Natural and Social Science Interface, Institute for Environmental Decisions, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland); Rösch, Andreas [Natural and Social Science Interface, Institute for Environmental Decisions, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland); Stauffacher, Michael [Natural and Social Science Interface, Institute for Environmental Decisions, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland); Transdisciplinarity Laboratory, Department of Environmental Systems Science, ETH Zürich, Zürich (Switzerland)

    2015-09-16

    Many countries are facing a challenging transition toward more sustainable energy systems, which produce more renewables and consume less energy. The latter goal can only be achieved through a combination of efficiency measures and changes in people’s lifestyles and routine behaviors (i.e., sufficiency). While research has shown that acceptance of technical efficiency is relatively high, there is a lack of research on societal preferences for sufficiency measures. However, this is an important prerequisite for designing successful interventions to change behavior. This paper analyses societal preferences for different energy-related behaviors in Switzerland. We use an online choice-based conjoint analysis (N = 150) to examine preferences for behaviors with high technical potentials for energy demand reduction in the following domains: mobility, heating, and food. Each domain comprises different attributes across three levels of sufficiency. Respondents were confronted with trade-off situations evoked through different fictional lifestyles that comprised different combinations of attribute levels. Through a series of trade-off decisions, participants were asked to choose their preferred lifestyle. The results revealed that a vegetarian diet was considered the most critical issue that respondents were unwilling to trade off, followed by distance to workplace and means of transportation. The highest willingness to trade off was found for adjustments in room temperature, holiday travel behaviors, and living space. Participants’ preferences for the most energy-sufficient lifestyles were rather low. However, the study showed that there were lifestyles with substantive energy-saving potentials that were well accepted among respondents. Our study results suggest that the success of energy-sufficiency interventions might depend strongly on the targeted behavior. We speculate that they may face strong resistance (e.g., vegetarian diet). Thus, it seems promising to

  4. Reliability program plan for the Kilowatt Isotope Power System (KIPS) technology verification phase

    International Nuclear Information System (INIS)

    1978-01-01

    This document is an integral part of the Kilowatt Isotope Power System (KIPS) Program Plan. It defines the KIPS Reliability Program Plan for the Technology Verification Phase and delineates the reliability assurance tasks that are to be accomplished by Sundstrand and its suppliers during the design, fabrication and testing of the KIPS.

  5. Validity and Reliability of the 8-Item Work Limitations Questionnaire.

    Science.gov (United States)

    Walker, Timothy J; Tullar, Jessica M; Diamond, Pamela M; Kohl, Harold W; Amick, Benjamin C

    2017-12-01

    Purpose To evaluate factorial validity, scale reliability, test-retest reliability, convergent validity, and discriminant validity of the 8-item Work Limitations Questionnaire (WLQ) among employees from a public university system. Methods A secondary analysis using de-identified data from employees who completed an annual Health Assessment between the years 2009-2015 tested research aims. Confirmatory factor analysis (CFA) (n = 10,165) tested the latent structure of the 8-item WLQ. Scale reliability was determined using a CFA-based approach while test-retest reliability was determined using the intraclass correlation coefficient. Convergent/discriminant validity was tested by evaluating relations between the 8-item WLQ with health/performance variables for convergent validity (health-related work performance, number of chronic conditions, and general health) and demographic variables for discriminant validity (gender and institution type). Results A 1-factor model with three correlated residuals demonstrated excellent model fit (CFI = 0.99, TLI = 0.99, RMSEA = 0.03, and SRMR = 0.01). The scale reliability was acceptable (0.69, 95% CI 0.68-0.70) and the test-retest reliability was very good (ICC = 0.78). Low-to-moderate associations were observed between the 8-item WLQ and the health/performance variables while weak associations were observed between the demographic variables. Conclusions The 8-item WLQ demonstrated sufficient reliability and validity among employees from a public university system. Results suggest the 8-item WLQ is a usable alternative for studies when the more comprehensive 25-item WLQ is not available.

  6. Efficiency evaluation of an electronic equipment: availability,reliability and maintenance

    International Nuclear Information System (INIS)

    Guyot, C.

    1966-01-01

    This concept of efficiency, often called "system effectiveness", is presented and analyzed in terms of reliability and maintenance. It allows the availability factor of a piece of electronic equipment to be defined. An evaluation procedure is proposed. (A.L.B.)

  7. The role of high cycle fatigue (HCF) onset in Francis runner reliability

    International Nuclear Information System (INIS)

    Gagnon, M; Tahan, S A; Bocher, P; Thibault, D

    2012-01-01

    High Cycle Fatigue (HCF) plays an important role in Francis runner reliability. This paper presents a model in which reliability is defined as the probability of not exceeding a threshold above which HCF contributes to crack propagation. In the context of combined Low Cycle Fatigue (LCF) and HCF loading, the Kitagawa diagram is used as the limit state threshold for reliability. The reliability problem is solved using First-Order Reliability Methods (FORM). A study case is proposed using in situ measured strains and operational data. All the parameters of the reliability problem are based either on observed data or on typical design specifications. From the results obtained, we observed that the uncertainty around the defect size and the HCF stress range play an important role in reliability. At the same time, we observed that expected values for the LCF stress range and the number of LCF cycles have a significant influence on life assessment, but the uncertainty around these values could be neglected in the reliability assessment.
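
    For readers unfamiliar with FORM, the sketch below shows its simplest special case: a linear limit state g = R − S with independent normal variables, for which the reliability index and failure probability follow in closed form. The statistics are hypothetical; the paper's Kitagawa-diagram limit state is more involved.

```python
from math import sqrt
from statistics import NormalDist

# Hedged illustration of a first-order reliability computation for the linear
# limit state g = R - S with independent normal R (threshold) and S (load
# effect); this is only the simplest special case of FORM.
mu_R, sigma_R = 100.0, 10.0   # hypothetical threshold statistics
mu_S, sigma_S = 60.0, 15.0    # hypothetical HCF stress-range statistics

beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # reliability index
pf = NormalDist().cdf(-beta)                           # probability of exceedance
print(round(beta, 2), f"{pf:.2e}")
```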

  8. Hybrid Structural Reliability Analysis under Multisource Uncertainties Based on Universal Grey Numbers

    Directory of Open Access Journals (Sweden)

    Xingfa Yang

    2018-01-01

    Full Text Available Nondeterministic parameters of certain distribution are employed to model structural uncertainties, which are usually assumed as stochastic factors. However, model parameters may not be precisely represented due to some factors in engineering practices, such as lack of sufficient data, data with fuzziness, and unknown-but-bounded conditions. To this end, interval and fuzzy parameters are implemented and an efficient approach to structural reliability analysis with random-interval-fuzzy hybrid parameters is proposed in this study. Fuzzy parameters are first converted to equivalent random ones based on the equal entropy principle. 3σ criterion is then employed to transform the equivalent random and the original random parameters to interval variables. In doing this, the hybrid reliability problem is transformed into the one only with interval variables, in other words, nonprobabilistic reliability analysis problem. Nevertheless, the problem of interval extension existed in interval arithmetic, especially for the nonlinear systems. Therefore, universal grey mathematics, which can tackle the issue of interval extension, is employed to solve the nonprobabilistic reliability analysis problem. The results show that the proposed method can obtain more conservative results of the hybrid structural reliability.

  9. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book starts with the question of what reliability is, covering the origin of reliability problems and the definition and use of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions in reliability, MTBF assumptions, probability distribution processes, downtime, maintainability and availability, breakdown and preventive maintenance, reliability design, reliability design for prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
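
    As a reminder of the most basic quantities listed above, a short sketch for the constant-failure-rate (exponential) case; the numerical values are arbitrary.

```python
from math import exp

# Textbook relationships for a constant failure rate (exponential) model:
# reliability function, MTBF, and steady-state availability.
lam = 1e-4                           # failure rate per hour
t = 1000.0                           # mission time in hours

reliability = exp(-lam * t)          # R(t) = exp(-lambda * t)
mtbf = 1.0 / lam                     # mean time between failures
mttr = 8.0                           # mean time to repair, hours
availability = mtbf / (mtbf + mttr)  # steady-state availability

print(round(reliability, 4), mtbf, round(availability, 5))
```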

  10. The Comprehensive Snack Parenting Questionnaire (CSPQ: Development and Test-Retest Reliability

    Directory of Open Access Journals (Sweden)

    Dorus W. M. Gevers

    2018-04-01

    Full Text Available The narrow focus of existing food parenting instruments led us to develop a food parenting practices instrument measuring the full range of food practices constructs with a focus on snacking behavior. We present the development of the questionnaire and our research on the test-retest reliability. The developed Comprehensive Snack Parenting Questionnaire (CSPQ covers 21 constructs. Test-retest reliability was assessed by calculating intra class correlation coefficients and percentage agreement after two administrations of the CSPQ among a sample of 66 Dutch parents. Test-retest reliability analysis revealed acceptable intra class correlation coefficients (≥0.41 or agreement scores (≥0.60 for all items. These results, together with earlier work, suggest sufficient psychometric characteristics. The comprehensive, but brief CSPQ opens up chances for highly essential but unstudied research questions to understand and predict children’s snack intake. Example applications include studying the interactional nature of food parenting practices or interactions of food parenting with general parenting or child characteristics.
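
    A minimal sketch of test-retest statistics of the kind reported above, using the one-way random-effects ICC(1,1) formula and exact percentage agreement on made-up item scores; the study may well use a different ICC variant.

```python
import numpy as np

# One-way random-effects ICC(1,1) and percentage agreement for a test-retest
# design. `x` holds hypothetical item scores from two administrations
# (rows = parents, columns = time 1 and time 2).
x = np.array([[3, 3], [4, 5], [2, 2], [5, 5], [1, 2], [4, 4], [3, 2], [5, 5]], float)
n, k = x.shape

grand = x.mean()
subject_means = x.mean(axis=1)
msb = k * ((subject_means - grand) ** 2).sum() / (n - 1)          # between-subjects MS
msw = ((x - subject_means[:, None]) ** 2).sum() / (n * (k - 1))   # within-subject MS
icc_1_1 = (msb - msw) / (msb + (k - 1) * msw)

agreement = (x[:, 0] == x[:, 1]).mean()   # exact agreement between administrations
print(round(icc_1_1, 2), round(agreement, 2))
```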

  11. Sufficient and Necessary Condition to Decide Compatibility for a Class of Interorganizational Workflow Nets

    Directory of Open Access Journals (Sweden)

    Guanjun Liu

    2015-01-01

    Full Text Available Interorganizational Workflow nets (IWF-nets can well model many concurrent systems such as web service composition, in which multiple processes interact via sending/receiving messages. Compatibility of IWF-nets is a crucial criterion for the correctness of these systems. It guarantees that a system has no deadlock, livelock, or dead tasks. In our previous work we proved that the compatibility problem is PSPACE-complete for safe IWF-nets. This paper defines a subclass of IWF-nets that can model many cases about interactions. Necessary and sufficient condition is presented to decide their compatibility, and it depends on the net structures only. Finally, an algorithm is developed based on the condition.

  12. Reliability of multi-model and structurally different single-model ensembles

    Energy Technology Data Exchange (ETDEWEB)

    Yokohata, Tokuta [National Institute for Environmental Studies, Center for Global Environmental Research, Tsukuba, Ibaraki (Japan); Annan, James D.; Hargreaves, Julia C. [Japan Agency for Marine-Earth Science and Technology, Research Institute for Global Change, Yokohama, Kanagawa (Japan); Collins, Matthew [University of Exeter, College of Engineering, Mathematics and Physical Sciences, Exeter (United Kingdom); Jackson, Charles S.; Tobis, Michael [The University of Texas at Austin, Institute of Geophysics, 10100 Burnet Rd., ROC-196, Mail Code R2200, Austin, TX (United States); Webb, Mark J. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-08-15

    The performance of several state-of-the-art climate model ensembles, including two multi-model ensembles (MMEs) and four structurally different (perturbed parameter) single model ensembles (SMEs), is investigated for the first time using the rank histogram approach. In this method, the reliability of a model ensemble is evaluated from the point of view of whether the observations can be regarded as being sampled from the ensemble. Our analysis reveals that, in the MMEs, the climate variables we investigated are broadly reliable on the global scale, with a tendency towards over-dispersion. On the other hand, in the SMEs, the reliability differs depending on the ensemble and variable field considered. In general, the mean state and historical trend of surface air temperature, and the mean state of precipitation, are reliable in the SMEs. However, variables such as sea level pressure or top-of-atmosphere clear-sky shortwave radiation do not cover a sufficiently wide range in some of them. It is not possible to assess whether this is a fundamental feature of SMEs generated with a particular model, or a consequence of the algorithm used to select and perturb the values of the parameters. As under-dispersion is a potentially more serious issue when using ensembles to make projections, we recommend the application of rank histograms to assess reliability when designing and running perturbed physics SMEs. (orig.)
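
    A compact sketch of the rank-histogram diagnostic on synthetic data: if observations are statistically indistinguishable from the ensemble members, the histogram of observation ranks is approximately flat.

```python
import numpy as np

# Rank-histogram (Talagrand) diagnostic: for each case, rank the observation
# within the ensemble; a flat histogram indicates a reliable ensemble,
# U-shaped suggests under-dispersion, dome-shaped over-dispersion.
# Synthetic data stand in for model output and observations.
rng = np.random.default_rng(0)
n_cases, n_members = 500, 20
ensemble = rng.normal(0.0, 1.0, size=(n_cases, n_members))
obs = rng.normal(0.0, 1.0, size=n_cases)        # drawn from the same distribution

ranks = (ensemble < obs[:, None]).sum(axis=1)   # rank of obs among members (0..n_members)
hist = np.bincount(ranks, minlength=n_members + 1)
print(hist)                                     # roughly flat for a reliable ensemble
```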

  13. Reliability-based design code calibration for concrete containment structures

    International Nuclear Information System (INIS)

    Han, B.K.; Cho, H.N.; Chang, S.P.

    1991-01-01

    In this study, a load combination criteria for design and a probability-based reliability analysis were proposed on the basis of a FEM-based random vibration analysis. The limit state model defined for the study is a serviceability limit state of the crack failure that causes the emission of radioactive materials, and the results are compared with the case of strength limit state. More accurate reliability analyses under various dynamic loads such as earthquake loads were made possible by incorporating the FEM and random vibration theory, which is different from the conventional reliability analysis method. The uncertainties in loads and resistance available in Korea and the references were adapted to the situation of Korea, and especially in case of earthquake, the design earthquake was assessed based on the available data for the probabilistic description of earthquake ground acceleration in the Korea peninsula. The SAP V-2 is used for a three-dimensional finite element analysis of concrete containment structure, and the reliability analysis is carried out by modifying HRAS reliability analysis program for this study. (orig./GL)

  14. Virtual water and water self-sufficiency in agricultural and livestock products in Brazil.

    Science.gov (United States)

    da Silva, Vicente de Paulo R; de Oliveira, Sonaly D; Braga, Célia C; Brito, José Ivaldo B; de Sousa, Francisco de Assis S; de Holanda, Romildo M; Campos, João Hugo B C; de Souza, Enio P; Braga, Armando César R; Rodrigues Almeida, Rafaela S; de Araújo, Lincoln E

    2016-12-15

    Virtual water trade is often considered a solution for restricted water availability in many regions of the world. Brazil is the world leader in the production and export of various agricultural and livestock products. The country is either a strong net importer or a strong net exporter of these products. The objective of this study is to determine the volume of virtual water contained in agricultural and livestock products imported/exported by Brazil from 1997 to 2012, and to define the water self-sufficiency index of agricultural and livestock products in Brazil. The indexes of water scarcity (WSI), water dependency (WDI) and water self-sufficiency (WSSI) were calculated for each Brazilian state. These indexes and the virtual water balance were calculated following the methodology developed by Chapagain and Hoekstra (2008) and Hoekstra and Hung (2005). The total water exports and imports embedded in agricultural and livestock products were 5.28 × 10¹⁰ and 1.22 × 10¹⁰ Gm³ yr⁻¹, respectively, which results in a positive virtual water balance of 4.05 × 10¹⁰ Gm³ yr⁻¹. Brazil is either a strong net importer or a strong net exporter of agricultural and livestock products among the Mercosur countries. Brazil has a positive virtual water balance of 1.85 × 10¹⁰ Gm³ yr⁻¹. The indexes used in this study reveal that Brazil is self-sufficient in food production, except for a few products such as wheat and rice. Horticultural products (tomato, onion, potato, cassava and garlic) make up a unique product group with negative virtual water balance in Brazil. Copyright © 2016 Elsevier Ltd. All rights reserved.
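
    A toy version of the bookkeeping is sketched below. The net virtual water balance is simply exports minus imports; the dependency and self-sufficiency index formulas shown are a common convention and are assumptions here, since the study follows the exact definitions of Chapagain and Hoekstra. The internal water-footprint figure is invented.

```python
# Toy bookkeeping sketch for a single region; index formulas are assumptions,
# not necessarily the study's exact definitions.
exports = 5.28e10   # Gm3/yr embedded in exported products (from the abstract)
imports = 1.22e10   # Gm3/yr embedded in imported products

net_balance = exports - imports            # positive => net virtual water exporter
internal_use = 9.0e10                      # hypothetical internal water footprint
wdi = 100.0 * imports / (internal_use + imports)   # water dependency index (%)
wssi = 100.0 - wdi                                  # water self-sufficiency index (%)

print(f"{net_balance:.2e}", round(wdi, 1), round(wssi, 1))
```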

  15. Study on evaluation of construction reliability for engineering project based on fuzzy language operator

    Science.gov (United States)

    Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping

    2018-03-01

    System reliability theory has been a research hotspot of management science and systems engineering in recent years, and construction reliability is useful for the quantitative evaluation of the project management level. Based on reliability theory and the target system of engineering project management, a definition of construction reliability is given. Using fuzzy mathematics and a language operator, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through the operation of the language operator, which provides the basis of the corresponding method and parameters for the evaluation of construction reliability. This method is shown to be scientific and reasonable for construction conditions and a useful attempt at theory and method research on engineering project system reliability.
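
    One simple way to realize "seven fuzzy subsets with membership functions" over the value space [0, 1] is with evenly spaced triangular functions, as sketched below; the paper derives its intervals from a language operator, so this is only a schematic stand-in.

```python
import numpy as np

# Dividing the construction-reliability value space [0, 1] into seven
# overlapping fuzzy subsets with triangular membership functions.
# The breakpoints are illustrative, not the paper's derived intervals.
centers = np.linspace(0.0, 1.0, 7)          # e.g. "very low" ... "very high"
half_width = centers[1] - centers[0]

def membership(x, c):
    """Triangular membership centred at c with support [c - hw, c + hw]."""
    return max(0.0, 1.0 - abs(x - c) / half_width)

x = 0.72
degrees = [round(membership(x, c), 2) for c in centers]
print(degrees)   # nonzero only for the one or two nearest fuzzy subsets
```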

  16. The use of reliability analysis techniques applied to nuclear power station emergency core cooling systems

    International Nuclear Information System (INIS)

    Danielsen, A.; Snaith, E.R.

    1975-01-01

    A reliability investigation carried out by the Safety and Reliability Services of the UKAEA, and the SSEB, of the essential system/reactor coolant system for a large nuclear power station is described. In AGR type reactors, after all reactor shutdown conditions, it is necessary to restore forced gas circulation and sufficient boiler feed to maintain the heat removal capacity of the boilers. The coolant requirements are provided by several independent mechanical systems of primary coolant fans, feedwater pumps, and valves integrated with electrical power sources, switchgear, and automatic control equipment. Reliability is treated as one aspect of system performance and quantified in terms of failure to meet a specific objective. Based on the reliability performance of the constituent components the optimum system configuration is determined together with the preferred plant operating procedures and maintenance requirements. (author)

  17. Ecological function as a target for ecosystem-based management: Defining when change matters in decision making

    Science.gov (United States)

    Ecosystem-based management (EBM) accounts for both direct and indirect drivers of ecological change for decision making. Just as with direct management of a resource, EBM requires a definition of management thresholds that define when change in function is sufficient to merit ma...

  18. Design of Accelerated Reliability Test for CNC Motorized Spindle Based on Vibration Signal

    Directory of Open Access Journals (Sweden)

    Chen Chao

    2016-01-01

    Full Text Available Motorized spindle is the key functional component of CNC machining centers which is a mechatronics system with long life and high reliability. The reliability test cycle of motorized spindle is too long and infeasible. This paper proposes a new accelerated test for reliability evaluation of motorized spindle. By field reliability test, authors collect and calculate the load data including rotational speed, cutting force and torque. Load spectrum distribution law is analyzed. And authors design a test platform to apply the load spectrum. A new method to define the fuzzy acceleration factor based on the vibration signal is proposed. Then the whole test plan of accelerated reliability test is done.

  19. Are chiropractic tests for the lumbo-pelvic spine reliable and valid? A systematic critical literature review

    DEFF Research Database (Denmark)

    Hestbaek, L; Leboeuf-Yde, C

    2000-01-01

    OBJECTIVE: To systematically review the peer-reviewed literature about the reliability and validity of chiropractic tests used to determine the need for spinal manipulative therapy of the lumbo-pelvic spine, taking into account the quality of the studies. DATA SOURCES: The CHIROLARS database......-pelvic spine were included. DATA EXTRACTION: Data quality were assessed independently by the two reviewers, with a quality score based on predefined methodologic criteria. Results of the studies were then evaluated in relation to quality. DATA SYNTHESIS: None of the tests studied had been sufficiently...... evaluated in relation to reliability and validity. Only tests for palpation for pain had consistently acceptable results. Motion palpation of the lumbar spine might be valid but showed poor reliability, whereas motion palpation of the sacroiliac joints seemed to be slightly reliable but was not shown...

  20. Development of the design and reliability analysis of a seabed repository system

    International Nuclear Information System (INIS)

    1987-06-01

    This study examines the seabed repository scheme proposed in 1979 for the long term disposal of heat generating radio-active waste and develops it to a standard sufficient to compare its reliability with the drilled emplacement and penetrator schemes. The reinforced concrete repositories contain 324 waste canisters and weigh 982 tonnes fully loaded in water. The repositories are transported up to 6000 km to the disposal area by a special purpose ship and lowered 5.5 km to the seabed on six braided nylon ropes by traction winches. Reliability of the seabed repository system, measured in terms of accidents per year involving loss of one or more canisters, was comparable with the other systems. (author)

  1. Approach to developing reliable space reactor power systems

    International Nuclear Information System (INIS)

    Mondt, J.F.; Shinbrot, C.H.

    1991-01-01

    The Space Reactor Power System Project is in the engineering development phase of a three-phase program. During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described in this paper along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top down systems approach which includes a point design based on a detailed technical specification of a 100 kW power system

  2. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  3. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability to fit the software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. A numerical experiment investigates the fitting ability of the SRGMs with normal distribution through 16 types of failure time data collected in real software projects.
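
    The paper's estimation uses an EM algorithm; as a simpler stand-in, the sketch below fits the normal-type mean value function m(t) = a·Φ((t − μ)/σ) to synthetic cumulative failure counts by least squares.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Least-squares fit of a normal-type SRGM mean value function to synthetic
# cumulative failure counts (the paper itself uses EM-based maximum likelihood).
def mean_value(t, a, mu, sigma):
    return a * norm.cdf((t - mu) / sigma)

t = np.arange(1, 17, dtype=float)                    # testing weeks
cumulative_failures = np.array([1, 2, 4, 7, 11, 16, 22, 28, 33, 38,
                                41, 44, 46, 47, 48, 48], dtype=float)

(a, mu, sigma), _ = curve_fit(mean_value, t, cumulative_failures,
                              p0=[50.0, 8.0, 3.0])
print(round(a, 1), round(mu, 2), round(sigma, 2))    # estimated total faults, centre, spread
```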

  4. Acquisition of reliable vacuum hardware for large accelerator systems

    International Nuclear Information System (INIS)

    Welch, K.M.

    1996-01-01

    Credible and effective communications prove to be the major challenge in the acquisition of reliable vacuum hardware. Technical competence is necessary but not sufficient. We must effectively communicate with management, sponsoring agencies, project organizations, service groups, staff and with vendors. Most of Deming's 14 quality assurance tenets relate to creating an enlightened environment of good communications. All projects progress along six distinct, closely coupled, dynamic phases; all six phases are in a state of perpetual change. These phases and their elements are discussed, with emphasis given to the acquisition phase and its related vocabulary. (author)

  5. Some Reliability Considerations of UGV for Remote-response in Nuclear Emergency Situation

    International Nuclear Information System (INIS)

    Eom, Heungseop; Cho, Jaiwan; Jeong, Kyungmin

    2013-01-01

    In the Fukushima disaster, a number of different UGVs, such as Packbots, Warriors, Quince, and Survey Runner, were used for monitoring, collecting data, inspection, and cleaning up. In utilizing UGVs in a nuclear emergency situation, one of the serious problems is the reliability of the UGVs, which is not yet sufficient for the required mission completion. In this paper we survey failures and reliability of field UGVs and draw some important reliability considerations for UGVs used for remote response in a nuclear emergency situation. We think that the findings in this study will be helpful for developers and researchers of UGVs for nuclear emergency situations. We studied failures and reliability of UGVs used in search and rescue, military, and nuclear applications through a literature survey. The results showed that state-of-the-art field UGVs cannot be expected to complete an entire mission without failures, which points to the need to improve their reliability. Though part of the failure data from the surveyed studies was not detailed enough to derive a reliability matrix, some meaningful insights were found through the analysis. Based on these insights, we draw some important considerations for the reliability improvement of UGVs for an NPP emergency situation, and these reliability considerations are classified according to the life cycle of a UGV for developers and researchers. Finally, no failures related to radiation environments were reported in the surveyed literature, but the need for radiation-tolerant control boards and sensors in an NPP emergency situation is easily anticipated. Therefore, studies on radiation-tolerant design and the use of radiation-tolerant components should also be considered for high reliability of UGVs in NPP applications.

  6. Reliability-based condition assessment of steel containment and liners

    International Nuclear Information System (INIS)

    Ellingwood, B.; Bhattacharya, B.; Zheng, R.

    1996-11-01

    Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs
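
    To make the Markov idea concrete, the sketch below propagates damage-state probabilities through an assumed yearly transition matrix; the states and probabilities are illustrative, not values from the report.

```python
import numpy as np

# Discrete-time Markov model for time-dependent damage condition.
# Each row of T gives the probabilities of staying in or degrading from a state.
states = ["intact", "minor corrosion", "major corrosion", "failed"]
T = np.array([[0.95, 0.05, 0.00, 0.00],
              [0.00, 0.90, 0.10, 0.00],
              [0.00, 0.00, 0.85, 0.15],
              [0.00, 0.00, 0.00, 1.00]])

p = np.array([1.0, 0.0, 0.0, 0.0])   # start fully intact
for year in range(40):
    p = p @ T                        # propagate state probabilities one year

print(dict(zip(states, np.round(p, 3))))
```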

  7. Optimal combination of illusory and luminance-defined 3-D surfaces: A role for ambiguity.

    Science.gov (United States)

    Hartle, Brittney; Wilcox, Laurie M; Murray, Richard F

    2018-04-01

    The shape of the illusory surface in stereoscopic Kanizsa figures is determined by the interpolation of depth from the luminance edges of adjacent inducing elements. Despite ambiguity in the position of illusory boundaries, observers reliably perceive a coherent three-dimensional (3-D) surface. However, this ambiguity may contribute additional uncertainty to the depth percept beyond what is expected from measurement noise alone. We evaluated the intrinsic ambiguity of illusory boundaries by using a cue-combination paradigm to measure the reliability of depth percepts elicited by stereoscopic illusory surfaces. We assessed the accuracy and precision of depth percepts using 3-D Kanizsa figures relative to luminance-defined surfaces. The location of the surface peak was defined by illusory boundaries, luminance-defined edges, or both. Accuracy and precision were assessed using a depth-discrimination paradigm. A maximum likelihood linear cue combination model was used to evaluate the relative contribution of illusory and luminance-defined signals to the perceived depth of the combined surface. Our analysis showed that the standard deviation of depth estimates was consistent with an optimal cue combination model, but the points of subjective equality indicated that observers consistently underweighted the contribution of illusory boundaries. This systematic underweighting may reflect a combination rule that attributes additional intrinsic ambiguity to the location of the illusory boundary. Although previous studies show that illusory and luminance-defined contours share many perceptual similarities, our model suggests that ambiguity plays a larger role in the perceptual representation of illusory contours than of luminance-defined contours.
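
    The optimal (maximum-likelihood) combination rule referenced here weights each cue by its inverse variance; a minimal sketch with placeholder numbers is shown below. The paper's finding is that observers underweight the illusory cue relative to such optimal weights.

```python
# Minimal maximum-likelihood (inverse-variance) cue-combination sketch for two
# depth cues; the numbers are placeholders.
def combine(mu1, var1, mu2, var2):
    w1 = var2 / (var1 + var2)               # optimal weight on cue 1
    w2 = 1.0 - w1
    mu_c = w1 * mu1 + w2 * mu2              # combined estimate
    var_c = (var1 * var2) / (var1 + var2)   # combined (reduced) variance
    return mu_c, var_c, w1

# Hypothetical depth estimates (arbitrary units) from illusory vs luminance edges.
print(combine(mu1=10.0, var1=4.0, mu2=12.0, var2=1.0))
```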

  8. Reliability of CCGA and PBGA assemblies

    Science.gov (United States)

    Ghaffarian, Reza

    2005-01-01

    Area Array Packages (AAPs) with 1.27 mm pitch have been the packages of choice for commercial applications; they are now starting to be implemented in military and aerospace applications. The thermal cycling characteristics of plastic BGA (PBGA) and CSP assemblies, because of their wide usage in commercial applications, have been extensively reported in the literature. Thermal cycling represents the on-off environmental condition for most electronic products and is therefore a key factor that defines reliability.

  9. PROVIDING RELIABILITY OF HUMAN RESOURCES IN PRODUCTION MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Anna MAZUR

    2014-07-01

    Full Text Available People are the most valuable asset of an organization, and the results of a company mostly depend on them. The human factor can also be a weak link in the company and a source of high risk for many of its processes. The reliability of the human factor in the manufacturing process depends on many factors. The authors include aspects of human error, safety culture, knowledge, communication skills, teamwork and the leadership role in the developed model of the reliability of human resources in the management of the production process. Based on a case study and the results of research and observation, the authors present risk areas defined in a specific manufacturing process and the results of an evaluation of the reliability of human resources in that process.

  10. Reliability analysis for dynamic configurations of systems with three failure modes

    International Nuclear Information System (INIS)

    Pham, Hoang

    1999-01-01

    This paper presents analytical models for computing the reliability of dynamic system configurations, such as majority voting and k-out-of-n, assuming that units and systems are subject to three types of failures: stuck-at-0, stuck-at-1, and stuck-at-x. Formulas are defined for determining the optimal design policies that maximize the reliability of dynamic k-out-of-n configurations subject to the three types of failures. Comparisons of the reliability modeling functions are also obtained. The optimum system size and threshold value k that minimize the expected cost of dynamic k-out-of-n configurations are also determined.
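
    As a baseline for the models described above, the sketch below computes the classic single-failure-mode k-out-of-n:G reliability for identical, independent units; the paper's three-failure-mode extension is deliberately not attempted here.

```python
from math import comb

# Baseline k-out-of-n:G reliability for identical, independent units with a
# single failure mode.
def k_out_of_n_reliability(k, n, p):
    """Probability that at least k of n units work, given unit reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(round(k_out_of_n_reliability(k=2, n=3, p=0.9), 4))   # classic 2-out-of-3 majority
```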

  11. Snow reliability in ski resorts considering artificial snowmaking

    Science.gov (United States)

    Hofstätter, M.; Formayer, H.; Haas, P.

    2009-04-01

    Snow reliability is the key factor to make skiing on slopes possible and to ensure added value in winter tourism. In this context snow reliability is defined by the duration of a snowpack on the ski runs of at least 50 mm snow water equivalent (SWE) within the main season (Dec-Mar). Furthermore, the snowpack should form every winter and be present early enough in the season. In our work we investigate the snow reliability of six Austrian ski resorts. Because nearly all Austrian resorts rely on artificial snowmaking, it is very important to consider man-made snow in the snowpack accumulation and ablation in addition to natural snow. For each study region observed weather data including temperature, precipitation and snow height are used. In addition we differentiate up to three elevations at each site (valley, intermediate, mountain top), being aware of the typical local winter inversion height. Time periods suitable for artificial snow production, for several temperature thresholds (-6, -4 or -1 degrees Celsius), are calculated on an hourly basis. Depending on the actual snowpack height, man-made snow can be added in the model with different defined capacities, considering different technologies or the usage of additives. To simulate natural snowpack accumulation and ablation we use a simple snow model based on daily precipitation and temperature. This snow model is optimized at each site separately through certain parameterization factors. Based on the local observations and the monthly climate change signals from the climate model REMO-UBA, we generate long-term time series of temperature and precipitation using the weather generator LARS. Thereby we are able to simulate the snow reliability not only under current but also under future climate conditions. Our results show significant changes in snow reliability, such as an increase in the number of days with insufficient snow heights, especially at mid and low altitudes under natural snow conditions. Artificial snowmaking can partly

  12. Optimally Fortifying Logic Reliability through Criticality Ranking

    Directory of Open Access Journals (Sweden)

    Yu Bai

    2015-02-01

    Full Text Available With CMOS technology aggressively scaling towards the 22-nm node, modern FPGA devices face tremendous aging-induced reliability challenges due to bias temperature instability (BTI and hot carrier injection (HCI. This paper presents a novel anti-aging technique at the logic level that is both scalable and applicable for VLSI digital circuits implemented with FPGA devices. The key idea is to prolong the lifetime of FPGA-mapped designs by strategically elevating the VDD values of some LUTs based on their modular criticality values. Although the idea of scaling VDD in order to improve either energy efficiency or circuit reliability has been explored extensively, our study distinguishes itself by approaching this challenge through an analytical procedure, therefore being able to maximize the overall reliability of the target FPGA design by rigorously modeling the BTI-induced device reliability and optimally solving the VDD assignment problem. Specifically, we first develop a systematic framework to analytically model the reliability of an FPGA LUT (look-up table, which consists of both RAM memory bits and associated switching circuit. We also, for the first time, establish the relationship between signal transition density and a LUT’s reliability in an analytical way. This key observation further motivates us to define the modular criticality as the product of signal transition density and the logic observability of each LUT. Finally, we analytically prove, for the first time, that the optimal way to improve the overall reliability of a whole FPGA device is to fortify individual LUTs according to their modular criticality. To the best of our knowledge, this work is the first to draw such a conclusion.
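
    The criticality-driven fortification idea can be sketched in a few lines: rank LUTs by modular criticality (transition density times observability) and elevate VDD for the top fraction. The LUT data and the 20% budget are invented; the paper solves the assignment problem analytically rather than by a fixed cutoff.

```python
# Criticality-ranking sketch: modular criticality = transition density x
# logic observability; fortify (elevate VDD for) the most critical LUTs.
luts = {                      # name: (transition_density, observability)
    "lut0": (0.42, 0.90),
    "lut1": (0.10, 0.30),
    "lut2": (0.75, 0.55),
    "lut3": (0.05, 0.95),
    "lut4": (0.60, 0.20),
}

criticality = {name: d * o for name, (d, o) in luts.items()}
ranked = sorted(criticality, key=criticality.get, reverse=True)

budget = max(1, int(0.2 * len(ranked)))        # fortify top 20% of LUTs
elevated_vdd = set(ranked[:budget])
print(ranked, elevated_vdd)
```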

  13. Software defined network inference with evolutionary optimal observation matrices

    OpenAIRE

    Malboubi, M; Gong, Y; Yang, Z; Wang, X; Chuah, CN; Sharma, P

    2017-01-01

    © 2017 Elsevier B.V. A key requirement for network management is the accurate and reliable monitoring of relevant network characteristics. In today's large-scale networks, this is a challenging task due to the scarcity of network measurement resources and the hard constraints that this imposes. This paper proposes a new framework, called SNIPER, which leverages the flexibility provided by Software-Defined Networking (SDN) to design the optimal observation or measurement matrix that can lead t...

  14. Reliability analysis of reactor protection systems

    International Nuclear Information System (INIS)

    Alsan, S.

    1976-07-01

    A theoretical mathematical study of reliability is presented and the concepts subsequently defined are applied to the study of nuclear reactor safety systems. The theory is applied to investigations of the operational reliability of the Siloe reactor from the point of view of rod drop. A statistical study conducted between 1964 and 1971 demonstrated that most rod drop incidents arose from circumstances associated with experimental equipment (new set-ups). The reliability of the most suitable safety system for some recently developed experimental equipment is discussed. Calculations indicate that if all experimental equipment were equipped with these new systems, only 1.75 rod drop accidents would be expected to occur per year on average. It is suggested that all experimental equipment should be equipped with these new safety systems and tested every 21 days. The reliability of the new safety system currently being studied for the Siloe reactor was also investigated. The following results were obtained: definite failures must be detected immediately as a result of the disturbances produced; the repair time must not exceed a few hours; the equipment must be tested every week. Under such conditions, the rate of accidental rod drops is about 0.013 on average per year. The level of nondefinite failures is less than 10⁻⁶ per hour and the level of nonprotection is 1 hour per year. (author)

  15. Determining the optimum length of a bridge opening with a specified reliability level of water runoff

    Directory of Open Access Journals (Sweden)

    Evdokimov Sergey

    2017-01-01

    Full Text Available Current trends in construction are aimed at ensuring the reliability and safety of engineering facilities. According to the latest government regulations for construction, a scientific approach to engineering research, design, construction and operation of construction projects is a key priority. The reliability of a road depends on a great number of factors and on how they are statistically combined (in series and in parallel). A part of a road containing man-made structures such as a bridge or a pipe is considered as a system with a series connection of elements, and the overall reliability is the product of the reliabilities of these elements. The parameters of engineering structures defined by analytical relationships are highly variable because of the inaccuracy of the defining factors; each physical parameter is statistically variable, which is evaluated by the coefficient of variation of its values. This causes fluctuations in the parameters of engineering structures, and studying them may lead to changes in general and particular design rules in order to increase reliability. The paper gives the grounds for these changes using the example of a bridge, allowing its optimum length to be calculated with a specified reliability level of water runoff under the bridge.
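
    A minimal sketch of the series-system arithmetic mentioned above: with the overall reliability equal to the product of the element reliabilities, a target for the whole road section implies a minimum reliability for the bridge opening. All numbers are illustrative.

```python
from math import prod

# Series-system view of a road section: overall reliability is the product of
# element reliabilities, so the required bridge (water-runoff) reliability can
# be backed out from a target for the whole section.
other_elements = [0.999, 0.998, 0.997]      # embankment, pavement, pipe, ...
target_overall = 0.990

r_required_bridge = target_overall / prod(other_elements)
print(round(r_required_bridge, 4))          # bridge opening must meet at least this level
```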

  16. Development of a Reliability Program approach to assuring operational nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques used in other high technology industries is being formulated for potential application in the nuclear power industry. Research findings are discussed. The reliability methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed with several reliability concepts (e.g., quantitative reliability goals, reliability centered maintenance) appearing to be directly transferable. Other tasks in the RP development effort involved the benchmarking and evaluation of the existing nuclear regulations and practices relevant to safety/reliability integration. A review of current risk-dominant issues was also conducted using results from existing probabilistic risk assessment studies. The ongoing RP development tasks have concentrated on defining a RP for the operating phase of a nuclear plant's lifecycle. The RP approach incorporates safety systems risk/reliability analysis and performance monitoring activities with dedicated tasks that integrate these activities with operating, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the RP

  17. Reliability and validity of the German version of the Utrecht Questionnaire for Outcome Assessment in Aesthetic Rhinoplasty (D-OAR).

    Science.gov (United States)

    Spiekermann, Christoph; Rudack, Claudia; Stenner, Markus

    2017-11-01

    The outcome of aesthetic rhinoplasty is determined by the patient's subjective satisfaction with the nasal appearance which is difficult to assess. The Utrecht Questionnaire for Outcome Assessment in Aesthetic Rhinoplasty (OAR) is a brief and reliable instrument to assess the influence of the subjective nasal appearance on quality of life in patients undergoing aesthetic rhinoplasty. Preoperative application of this questionnaire reveals important aspects and possible disturbances of the body image which could be negative predictors concerning the result. On the other hand, it represents an appropriate tool to assess the postoperative outcome. The aim of this study was to determine the validity, reliability and responsiveness of the adapted German version of the OAR (D-OAR). The adaption of the OAR to German language was performed by a forward and backward translation process. Patients undergoing rhinoplasty were asked to complete the D-OAR preoperatively, 1, 3 and 12 months after procedure and healthy volunteers without any nasal complaints served as controls to test validity, reliability and responsiveness. An excellent internal consistency, a good test-retest reliability and good inter-item and item-total correlations demonstrated a good reliability of the D-OAR. The convincing validity of the adapted version was proven by an excellent discriminant and a sufficient content validity. Significant differences between pre- and postoperative D-OAR scores revealed a good responsiveness of the instrument. Hence, with a sufficient validity, reliability and sensitivity to changes, the D-OAR is a short and helpful instrument to assess the subjective perception of the nasal appearance in German patients.

  18. Analysis of Parking Reliability Guidance of Urban Parking Variable Message Sign System

    Directory of Open Access Journals (Sweden)

    Zhenyu Mei

    2012-01-01

    Full Text Available Operators of parking guidance and information systems (PGIS often encounter difficulty in determining when and how to provide reliable car park availability information to drivers. Reliability has become a key factor to ensure the benefits of urban PGIS. The present paper is the first to define the guiding parking reliability of urban parking variable message signs (VMSs. By analyzing the parking choice under guiding and optional parking lots, a guiding parking reliability model was constructed. A mathematical program was formulated to determine the guiding parking reliability of VMS. The procedures were applied to a numerical example, and the factors that affect guiding reliability were analyzed. The quantitative changes of the parking berths and the display conditions of VMS were found to be the most important factors influencing guiding reliability. The parking guiding VMS achieved the best benefit when the parking supply was close to or was less than the demand. The combination of a guiding parking reliability model and parking choice behavior offers potential for PGIS operators to reduce traffic congestion in central city areas.

  19. The reliability of radiochemical and chemical trace analyses in environmental materials

    International Nuclear Information System (INIS)

    Heinonen, Jorma.

    1977-12-01

    After theoretically exploring the factors which influence the quality of analytical data, as well as the means by which a sufficient quality can be assured and controlled, schemes of different kinds have been developed and applied in order to demonstrate analytical quality assurance and control in practice. Methods have been developed for the determination of cesium, bromine and arsenic by neutron activation analysis at the natural "background" concentration level in environmental materials. The calibration of the methods is described. The methods were also applied in practical routine analysis, the results of which are briefly reviewed. In the case of Cs, the precision of a comprehensive calibration was found to vary between 5.2-9.2% as a relative standard deviation, which agrees well with the calculated statistical random error of 5.7-8.7%. In the case of Br, the method showed a reasonable precision, about 11% on average, and accuracy. In employing the method to analyze dried samples containing Br from 3 to 12 ppm, a continuous control of precision was performed. The analysis of As demonstrates the many problems and difficulties associated with environmental analysis. In developing the final method, four former intercomparison materials of the IAEA were utilized in the calibration. The tests performed revealed a systematic error. In this case a scheme was developed for the continuous control of both precision and accuracy. The results of radiochemical analyses in environmental materials show a reliability somewhat better than that found in the determination of stable trace elements. According to a rough classification, about 15% of the results of radiochemical analysis show excellent reliability, some 60% show a reliability adequate for certain purposes, and the remainder are good for nothing. The reasons for the often insufficient reliability of results are both organizational and technical. With reasonable effort and

  20. Improving the Perception of Self-Sufficiency towards Creative Drama

    Science.gov (United States)

    Pekdogan, Serpil; Korkmaz, Halil Ibrahim

    2016-01-01

    The purpose of this study is to investigate the effects of a Creative Drama Based Perception of Self-Sufficiency Skills Training Program on the perception of self-sufficiency of 2nd-year bachelor's degree students attending a preschool teacher training program. This is a quasi-experimental study. A total of 50 students were equally divided into…

  1. Construct Validity and Reliability of the Questionnaire on the Quality of Physician-Patient Interaction in Adults With Hypertension.

    Science.gov (United States)

    Hickman, Ronald L; Clochesy, John M; Hetland, Breanna; Alaamri, Marym

    2017-04-01

    There are limited reliable and valid measures of the patient-provider interaction among adults with hypertension. Therefore, the purpose of this report is to describe the construct validity and reliability of the Questionnaire on the Quality of Physician-Patient Interaction (QQPPI) in community-dwelling adults with hypertension. A convenience sample of 109 participants with hypertension was recruited and administered the QQPPI at baseline and 8 weeks later. Exploratory factor analysis established that a 12-item, 2-factor structure of the QQPPI was valid in this sample. The modified QQPPI proved to have sufficient internal consistency and test-retest reliability. The modified QQPPI is a valid and reliable measure of the provider-patient interaction, a construct posited to impact self-management, in adults with hypertension.

  2. Role of exponential type random invexities for asymptotically sufficient efficiency conditions in semi-infinite multi-objective fractional programming.

    Science.gov (United States)

    Verma, Ram U; Seol, Youngsoo

    2016-01-01

    First, a new notion of the random exponential Hanson-Antczak type [Formula: see text]-V-invexity is introduced, which generalizes most of the existing notions in the literature; second, a random function [Formula: see text] of the second order is defined; and finally, a class of asymptotically sufficient efficiency conditions in semi-infinite multi-objective fractional programming is established. Furthermore, several sets of asymptotic sufficiency results, in which various generalized exponential type [Formula: see text]-V-invexity assumptions are imposed on certain vector functions whose components are the individual problem functions as well as some combinations of them, are examined and proved. To the best of our knowledge, all the established results on the semi-infinite aspects of multi-objective fractional programming are new; this is a new and emerging field of interdisciplinary research. We also observe that the investigated results can be modified and applied to several special classes of nonlinear programming problems.

  3. Critical Assessment of the Foundations of Power Transmission and Distribution Reliability Metrics and Standards.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Wu, Yue Grace; Bruss, C Bayan

    2016-01-01

    The U.S. federal government regulates the reliability of bulk power systems, while the reliability of power distribution systems is regulated at a state level. In this article, we review the history of regulating electric service reliability and study the existing reliability metrics, indices, and standards for power transmission and distribution networks. We assess the foundations of the reliability standards and metrics, discuss how they are applied to outages caused by large exogenous disturbances such as natural disasters, and investigate whether the standards adequately internalize the impacts of these events. Our reflections shed light on how existing standards conceptualize reliability, question the basis for treating large-scale hazard-induced outages differently from normal daily outages, and discuss whether this conceptualization maps well onto customer expectations. We show that the risk indices for transmission systems used in regulating power system reliability do not adequately capture the risks that transmission systems are prone to, particularly when it comes to low-probability high-impact events. We also point out several shortcomings associated with the way in which regulators require utilities to calculate and report distribution system reliability indices. We offer several recommendations for improving the conceptualization of reliability metrics and standards. We conclude that while the approaches taken in reliability standards have made considerable advances in enhancing the reliability of power systems and may be logical from a utility perspective during normal operation, existing standards do not provide a sufficient incentive structure for the utilities to adequately ensure high levels of reliability for end-users, particularly during large-scale events. © 2015 Society for Risk Analysis.

  4. A taxonomy for human reliability analysis

    International Nuclear Information System (INIS)

    Beattie, J.D.; Iwasa-Madge, K.M.

    1984-01-01

    A human interaction taxonomy (classification scheme) was developed to facilitate human reliability analysis in a probabilistic safety evaluation of a nuclear power plant being performed at Ontario Hydro. A human interaction occurs, by definition, when operators or maintainers manipulate, or respond to indications from, a plant component or system. The taxonomy aids the fault tree analyst by acting as a heuristic device. It helps define the range and type of human errors to be identified in the construction of fault trees, while keeping the identification by different analysts consistent. It decreases the workload associated with preliminary quantification of the large number of identified interactions by including a category called 'simple interactions'. Fault tree analysts quantify these according to a procedure developed by a team of human reliability specialists. The interactions which do not fit into this category are called 'complex' and are quantified by the human reliability team. The taxonomy is currently being used in fault tree construction in a probabilistic safety evaluation. As far as can be determined at this early stage, the potential benefits of consistency and completeness in identifying human interactions, and of streamlining the initial quantification, are being realized.

  5. Reassessing Rogers' necessary and sufficient conditions of change.

    Science.gov (United States)

    Watson, Jeanne C

    2007-09-01

    This article reviews the impact of Carl Rogers' postulate about the necessary and sufficient conditions of therapeutic change on the field of psychotherapy. It is proposed that his article (see record 2007-14630-002) made an impact in two ways; first, by acting as a spur to researchers to identify the active ingredients of therapeutic change; and, second, by providing guidelines for therapeutic practice. The role of the necessary and sufficient conditions in process-experiential therapy, an emotion-focused therapy for individuals, and their limitations in terms of research and practice are discussed. It is proposed that although the conditions are necessary and important in promoting clients' affect regulation, they do not take sufficient account of other moderating variables that affect clients' response to treatment and may need to be balanced with more structured interventions. Notwithstanding, Rogers highlighted a way of interacting with clients that is generally acknowledged as essential to effective psychotherapy practice. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  6. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large-scale structures. Structural loading variations over the lifetime of the plant are considered to be more difficult to analyse than those for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions which enter this problem are considered. The rare-event situation is briefly mentioned, together with aspects of proof testing and of normal and upset loading conditions. (orig.)

  7. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book covers reliability engineering: the definition and importance of reliability; the development of reliability engineering; failure rates and failure probability density functions and their types; the constant failure rate (CFR) and the exponential distribution; the increasing failure rate (IFR), the normal distribution and the Weibull distribution; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distributions; reliability sampling tests; system reliability; design for reliability; and failure analysis by FTA.

  8. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
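
    A minimal sketch of the kind of a priori power calculation described above, assuming a two-sided, two-sample comparison approximated by the normal model; the cortical-thickness mean and SD used here are placeholder values, not the study's data, and this is not the authors' published tool (Python):

    import math
    from scipy.stats import norm

    def n_per_group(mean, sd, rel_diff=0.10, alpha=0.05, power=0.80):
        """Approximate subjects per group to detect a rel_diff difference in means."""
        delta = rel_diff * mean          # absolute difference to detect
        d = delta / sd                   # standardized effect size
        z_a = norm.ppf(1 - alpha / 2)    # two-sided critical value
        z_b = norm.ppf(power)            # quantile for the desired power
        return math.ceil(2 * ((z_a + z_b) / d) ** 2)

    # Hypothetical cortical thickness statistics: mean 2.5 mm, between-subject SD 0.3 mm
    print(n_per_group(mean=2.5, sd=0.3))   # subjects needed in each group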

  9. Reliability of surface electromyography timing parameters in gait in cervical spondylotic myelopathy.

    LENUS (Irish Health Repository)

    Malone, Ailish

    2012-02-01

    The aims of this study were to validate a computerised method to detect muscle activity from surface electromyography (SEMG) signals in gait in patients with cervical spondylotic myelopathy (CSM), and to evaluate the test-retest reliability of the activation times designated by this method. SEMG signals were recorded from rectus femoris (RF), biceps femoris (BF), tibialis anterior (TA), and medial gastrocnemius (MG) during gait in 12 participants with CSM on two separate test days. Four computerised activity detection methods, based on the Teager-Kaiser Energy Operator (TKEO), were applied to a subset of signals and compared to visual interpretation of muscle activation. The most accurate method was then applied to all signals for evaluation of test-retest reliability. A detection method based on a combined slope and amplitude threshold showed the highest agreement (87.5%) with visual interpretation. With respect to reliability, the standard error of measurement (SEM) of the timing of RF, TA and MG between test days was 5.5% of stride duration or less, while the SEM of BF was 9.4%. The timing parameters of RF, TA and MG designated by this method were considered sufficiently reliable for use in clinical practice; however, the reliability of BF was questionable.
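
    As an illustration of the energy-operator idea underlying these detectors, the following sketch applies the Teager-Kaiser Energy Operator to an EMG trace and flags activity with a simple baseline-derived amplitude threshold. It is not the study's validated combined slope-and-amplitude detector; the sampling rate, baseline window and threshold multiplier are illustrative assumptions (Python):

    import numpy as np

    def tkeo(x):
        """Discrete Teager-Kaiser Energy Operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
        x = np.asarray(x, dtype=float)
        psi = np.empty_like(x)
        psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
        psi[0], psi[-1] = psi[1], psi[-2]        # pad the end samples
        return psi

    def detect_activity(emg, fs, k=3.0, baseline_s=0.25):
        """Flag samples whose TKEO energy exceeds mean + k*SD of an assumed resting baseline."""
        e = tkeo(emg)
        base = e[: int(baseline_s * fs)]         # assume the first 250 ms is rest
        threshold = base.mean() + k * base.std()
        return e > threshold                     # boolean activation mask

    # Usage with a simulated burst:
    fs = 1000
    t = np.arange(0, 1, 1 / fs)
    emg = 0.01 * np.random.randn(t.size)
    emg[400:700] += 0.2 * np.sin(2 * np.pi * 80 * t[400:700])   # simulated muscle burst
    active = detect_activity(emg, fs)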

  10. Defining mental disorder. Exploring the 'natural function' approach.

    Science.gov (United States)

    Varga, Somogy

    2011-01-21

    Due to several socio-political factors, to many psychiatrists only a strictly objective definition of mental disorder, free of value components, seems really acceptable. In this paper, I will explore a variant of such an objectivist approach to defining mental disorder, natural function objectivism. Proponents of this approach make recourse to the notion of natural function in order to reach a value-free definition of mental disorder. The exploration of Christopher Boorse's 'biostatistical' account of natural function (1) will be followed by an investigation of the 'hybrid naturalism' approach to natural functions by Jerome Wakefield (2). In the third part, I will explore two proposals that call into question the whole attempt to define mental disorder (3). I will conclude that while 'natural function objectivism' accounts fail to provide the backdrop for a reliable definition of mental disorder, there is no compelling reason to conclude that a definition cannot be achieved.

  11. Homogenization of variational inequalities and equations defined by pseudomonotone operators

    International Nuclear Information System (INIS)

    Sandrakov, G V

    2008-01-01

    Results on the convergence of sequences of solutions of non-linear equations and variational inequalities for obstacle problems are proved. The variational inequalities and equations are defined by a non-linear, pseudomonotone operator of the second order with periodic, rapidly oscillating coefficients and by sequences of functions characterizing the obstacles and the boundary conditions. Two-scale and macroscale (homogenized) limiting problems for such variational inequalities and equations are obtained. Results on the relationship between solutions of these limiting problems are established and sufficient conditions for the uniqueness of solutions are presented. Bibliography: 25 titles

  12. The Interrelation among Faithful Representation (Reliability), Corruption and IFRS Adoption: An Empirical Investigation

    Directory of Open Access Journals (Sweden)

    Alexios Kythreotis

    2015-08-01

    Purpose – The degree of corruption indicates, among other things, the non-implementation of laws, weak enforcement of legal sanctions and the existence of non-transparent economic transactions. Therefore, the expected change in reliability (faithful representation) resulting from the adoption of IAS/IFRS does not depend solely on the adoption of IAS/IFRS but is also influenced by the degree of corruption in each country. The purpose of this paper is to examine whether the above statement is true. Design/methodology/approach – The data were taken from the DataStream database and the sample consists of listed companies of fifteen European countries that adopted IAS/IFRS mandatorily. The time horizon is 10 years, from 2000 until 2009. The period between 2000 and 2004 is defined as the period before the adoption, while the period between 2005 and 2009 is defined as the period after the adoption. The reliability/faithful representation of financial statements - as defined by the Conceptual Framework - is detected through regression analysis. Findings – The findings suggest that the adoption of IAS/IFRS alone is not enough: the level of reliability of financial statements in each country does not depend solely on the adoption of IAS/IFRS but is also influenced by the degree of corruption in that country. Research limitations/implications – The models used for the measurement of reliability have short-term accruals as an independent variable. Given that, the models fail to take into consideration accounting treatments that concern non-current assets/liabilities. Originality/value – The findings for countries with a high degree of corruption indicate a statistically significant reduction in reliability after the adoption of IAS/IFRS. These findings constitute a useful tool for the IASB and the European Commission as well as for the users of financial statements.

  13. Interrater and Intrarater Reliability of the Balance Computerized Adaptive Test in Patients With Stroke.

    Science.gov (United States)

    Chiang, Hsin-Yu; Lu, Wen-Shian; Yu, Wan-Hui; Hsueh, I-Ping; Hsieh, Ching-Lin

    2018-04-11

    To examine the interrater and intrarater reliability of the Balance Computerized Adaptive Test (Balance CAT) in patients with chronic stroke having a wide range of balance functions. Repeated-assessments design (1 week apart). Seven teaching hospitals. A pooled sample (N=102) including 2 independent groups of outpatients (n=50 for the interrater reliability study; n=52 for the intrarater reliability study) with chronic stroke. Not applicable. Balance CAT. For the interrater reliability study, the values of the intraclass correlation coefficient, minimal detectable change (MDC), and percentage of MDC (MDC%) for the Balance CAT were .84, 1.90, and 31.0%, respectively. For the intrarater reliability study, the values of the intraclass correlation coefficient, MDC, and MDC% ranged from .89 to .91, from 1.14 to 1.26, and from 17.1% to 18.6%, respectively. The Balance CAT showed sufficient intrarater reliability in patients with chronic stroke having balance functions ranging from sitting with support to independent walking. Although the Balance CAT may have good interrater reliability, we found substantial random measurement error between different raters. Accordingly, if the Balance CAT is used as an outcome measure in clinical or research settings, the same raters should be used across time points to ensure reliable assessments. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
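
    For readers unfamiliar with how an MDC and MDC% are typically obtained, the sketch below shows the common formula based on the standard error of measurement; the standard deviation and mean used here are invented placeholders, not the Balance CAT data (Python):

    import math

    def mdc(sd, icc, confidence_z=1.96):
        """MDC = z * SEM * sqrt(2), with SEM = SD * sqrt(1 - ICC)."""
        sem = sd * math.sqrt(1 - icc)
        return confidence_z * sem * math.sqrt(2)

    sd_baseline = 3.0        # hypothetical SD of the scores
    icc = 0.84               # interrater ICC reported in the abstract
    mean_score = 20.0        # hypothetical mean score
    mdc_value = mdc(sd_baseline, icc)
    mdc_percent = 100 * mdc_value / mean_score
    print(f"MDC = {mdc_value:.2f}, MDC% = {mdc_percent:.1f}%")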

  14. Reliability and integrity management program for PBMR helium pressure boundary components - HTR2008-58036

    International Nuclear Information System (INIS)

    Fleming, K. N.; Gamble, R.; Gosselin, S.; Fletcher, J.; Broom, N.

    2008-01-01

    The purpose of this paper is to present the results of a study to establish strategies for the reliability and integrity management (RIM) of passive metallic components for the PBMR. The RIM strategies investigated include design elements, leak detection and testing approaches, and non-destructive examinations. Specific combinations of strategies are determined to be necessary and sufficient to achieve target reliability goals for passive components. This study recommends a basis for the RIM program for the PBMR Demonstration Power Plant (DPP) and provides guidance for the development by the American Society of Mechanical Engineers (ASME) of RIM requirements for Modular High Temperature Gas-Cooled Reactors (MHRs). (authors)

  15. How Many Sleep Diary Entries Are Needed to Reliably Estimate Adolescent Sleep?

    Science.gov (United States)

    Arora, Teresa; Gradisar, Michael; Taheri, Shahrad; Carskadon, Mary A.

    2017-01-01

    Study Objectives: To investigate (1) how many nights of sleep diary entries are required for reliable estimates of five sleep-related outcomes (bedtime, wake time, sleep onset latency [SOL], sleep duration, and wake after sleep onset [WASO]) and (2) the test–retest reliability of sleep diary estimates of school night sleep across 12 weeks. Methods: Data were drawn from four adolescent samples (Australia [n = 385], Qatar [n = 245], United Kingdom [n = 770], and United States [n = 366]), who provided 1766 eligible sleep diary weeks for reliability analyses. We performed reliability analyses for each cohort using complete data (7 days), one to five school nights, and one to two weekend nights. We also performed test–retest reliability analyses on 12-week sleep diary data available from a subgroup of 55 US adolescents. Results: Intraclass correlation coefficients for bedtime, SOL, and sleep duration indicated good-to-excellent reliability from five weekday nights of sleep diary entries across all adolescent cohorts. Four school nights were sufficient for wake times in the Australian and UK samples, but not the US or Qatari samples. Only Australian adolescents showed good reliability for two weekend nights of bedtime reports; estimates of SOL were adequate for UK adolescents based on two weekend nights. WASO was not reliably estimated using 1 week of sleep diaries. We observed excellent test–retest reliability across 12 weeks of sleep diary data in a subsample of US adolescents. Conclusion: We recommend that at least five weekday nights of sleep diary entries be made when studying adolescent bedtimes, SOL, and sleep duration. Adolescent sleep patterns were stable across 12 consecutive school weeks. PMID:28199718
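
    The following sketch shows one common way such intraclass correlations are computed: a two-way random-effects, single-measure ICC(2,1) from a subjects-by-nights matrix, applied here to invented sleep-duration values rather than the study's data (Python):

    import numpy as np

    def icc_2_1(data):
        """ICC(2,1) from an n-subjects x k-nights (or raters) array."""
        data = np.asarray(data, dtype=float)
        n, k = data.shape
        grand = data.mean()
        ms_r = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
        ms_c = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between nights
        sse = ((data - data.mean(axis=1, keepdims=True)
                     - data.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_e = sse / ((n - 1) * (k - 1))                                # residual mean square
        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

    # Hypothetical sleep durations (hours) for 5 adolescents over 5 school nights:
    x = np.array([[7.5, 7.2, 7.8, 7.4, 7.6],
                  [6.1, 6.4, 6.0, 6.3, 6.2],
                  [8.0, 7.9, 8.2, 8.1, 7.8],
                  [6.8, 7.0, 6.9, 6.7, 7.1],
                  [7.2, 7.5, 7.3, 7.1, 7.4]])
    print(round(icc_2_1(x), 3))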

  16. Sufficient conditions for Lagrange, Mayer, and Bolza optimization problems

    Directory of Open Access Journals (Sweden)

    Boltyanski V.

    2001-01-01

    The Maximum Principle [2,13] is a well-known necessary condition for optimality. This condition, generally, is not sufficient. In [3], the author proved that if there exists a regular synthesis of trajectories, the Maximum Principle is also a sufficient condition for time-optimality. In this article, we generalize this result to Lagrange, Mayer, and Bolza optimization problems.

  17. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
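
    As one concrete example of the kind of reliability estimate discussed above, the sketch below computes Cronbach's alpha from a respondents-by-items matrix; it is a generic Python illustration with invented scores, not the open-source tooling the authors describe:

    import numpy as np

    def cronbach_alpha(scores):
        """scores: n_respondents x k_items array of item scores."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Five respondents answering four items on a 1-5 scale (hypothetical):
    x = np.array([[4, 5, 4, 4],
                  [2, 3, 2, 3],
                  [5, 5, 4, 5],
                  [3, 3, 3, 2],
                  [4, 4, 5, 4]])
    print(round(cronbach_alpha(x), 2))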

  18. Defining clogging potential for permeable concrete.

    Science.gov (United States)

    Kia, Alalea; Wong, Hong S; Cheeseman, Christopher R

    2018-08-15

    Permeable concrete is used to reduce urban flooding as it allows water to flow through normally impermeable infrastructure. It is prone to clogging by particulate matter, and predicting the long-term performance of permeable concrete is challenging as there is currently no reliable means of characterising clogging potential. This paper reports on the performance of a range of laboratory-prepared and commercial permeable concretes, close-packed glass spheres and aggregate particles of varying size, exposed to different clogging methods to understand this phenomenon. New methods were developed to study clogging and define clogging potential. The tests involved applying flowing water containing sand and/or clay in cycles, and measuring the change in permeability. Substantial permeability reductions were observed in all samples, particularly when exposed to sand and clay simultaneously. Three methods were used to define clogging potential, based on measuring the initial permeability decay, the half-life cycle and the number of cycles to full clogging. We show for the first time strong linear correlations between these parameters for a wide range of samples, indicating their use for service-life prediction. Copyright © 2018 Elsevier Ltd. All rights reserved.
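
    As a rough illustration of how such clogging-potential parameters could be extracted from a permeability-versus-cycle series, the sketch below fits an exponential decay and reports an initial decay rate and a half-life; the decay model and all measurements are assumptions for illustration, not the authors' published procedure (Python):

    import numpy as np
    from scipy.optimize import curve_fit

    def decay(n, k0, lam):
        # permeability after n clogging cycles, assumed to decay exponentially
        return k0 * np.exp(-lam * n)

    cycles = np.arange(10)
    k_meas = np.array([12.0, 9.1, 7.0, 5.2, 4.1, 3.0, 2.3, 1.7, 1.3, 1.0])  # mm/s, invented

    (k0, lam), _ = curve_fit(decay, cycles, k_meas, p0=(k_meas[0], 0.2))
    half_life = np.log(2) / lam        # cycles needed to halve the permeability
    initial_decay = lam * k0           # initial permeability decay rate (mm/s per cycle)
    print(f"half-life ≈ {half_life:.1f} cycles, initial decay ≈ {initial_decay:.2f} mm/s per cycle")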

  19. Validity and Reliability of the Turkish Chronic Pain Acceptance Questionnaire

    Science.gov (United States)

    Akmaz, Hazel Ekin; Uyar, Meltem; Kuzeyli Yıldırım, Yasemin; Akın Korhan, Esra

    2018-05-29

    Pain acceptance is the process of giving up the struggle with pain and learning to live a worthwhile life despite it. In assessing patients with chronic pain in Turkey, making a diagnosis and tracking the effectiveness of treatment is done with scales that have been translated into Turkish. However, there is as yet no valid and reliable scale in Turkish to assess the acceptance of pain. To validate a Turkish version of the Chronic Pain Acceptance Questionnaire developed by McCracken and colleagues. Methodological and cross sectional study. A simple randomized sampling method was used in selecting the study sample. The sample was composed of 201 patients, more than 10 times the number of items examined for validity and reliability in the study, which totaled 20. A patient identification form, the Chronic Pain Acceptance Questionnaire, and the Brief Pain Inventory were used to collect data. Data were collected by face-to-face interviews. In the validity testing, the content validity index was used to evaluate linguistic equivalence, content validity, construct validity, and expert views. In reliability testing of the scale, Cronbach’s α coefficient was calculated, and item analysis and split-test reliability methods were used. Principal component analysis and varimax rotation were used in factor analysis and to examine factor structure for construct concept validity. The item analysis established that the scale, all items, and item-total correlations were satisfactory. The mean total score of the scale was 21.78. The internal consistency coefficient was 0.94, and the correlation between the two halves of the scale was 0.89. The Chronic Pain Acceptance Questionnaire, which is intended to be used in Turkey upon confirmation of its validity and reliability, is an evaluation instrument with sufficient validity and reliability, and it can be reliably used to examine patients’ acceptance of chronic pain.

  20. Evaluation of Network Reliability for Computer Networks with Multiple Sources

    Directory of Open Access Journals (Sweden)

    Yi-Kuei Lin

    2012-01-01

    Evaluating the reliability of a network with multiple sources to multiple sinks is a critical issue from the perspective of quality management. Due to the unrealistic definition of paths in the network models of previous literature, existing models are not appropriate for real-world computer networks such as the Taiwan Advanced Research and Education Network (TWAREN). This paper proposes a modified stochastic-flow network model to evaluate the network reliability of a practical computer network with multiple sources where data is transmitted through several light paths (LPs). Network reliability is defined as the probability of delivering a specified amount of data from the sources to the sink. It is taken as a performance index to measure the service level of TWAREN. This paper studies the network reliability of the international portion of TWAREN from two sources (Taipei and Hsinchu) to one sink (New York) that goes through submarine and land-surface cables between Taiwan and the United States.

  1. Transforming the energy system: Why municipalities strive for energy self-sufficiency

    International Nuclear Information System (INIS)

    Engelken, Maximilian; Römer, Benedikt; Drescher, Marcus; Welpe, Isabell

    2016-01-01

    Despite evidence that a rising number of municipalities in Germany are striving for energy self-sufficiency, there is little understanding of the driving factors behind this development. We investigate economic, ecological, social and energy system related factors that drive municipalities to strive for energy self-sufficiency with a focus on electricity supply. The empirical data for this study is based on insights generated through expert interviews (N =19) with mayors, energy experts and scientists as well as a quantitative study among mayors and energy officers (N =109) of German municipalities. Results show that environmental awareness, tax revenues and greater independence from private utilities are positively related to the mayors’ attitude towards the realization of energy self-sufficiency. Furthermore, citizens, the political environment, the mayor's political power, and his/her financial resources are relevant factors for a municipality striving for energy self-sufficiency. Policymakers need to decide whether or not to support mayors in this development. For suitable policy interventions, the results suggest the importance of an integrated approach that considers a combination of identified factors. Finally, we propose a morphological box to structure different aspects of energy self-sufficiency and categorize the present study. - Highlights: • Municipalities striving for energy self-sufficiency can play a key role in the transition of the energy system. • Tax revenues and environmental awareness main drivers behind mayors’ attitude towards energy self-sufficiency. • Citizens and the political environment main influencers of mayors striving for energy self-sufficiency. • 19 expert interviews analyzed for the framework of the study based on the theory of planned behavior (TPB). • 109 mayors and energy officers participated in the quantitative main survey.

  2. Sustainable, Reliable Mission-Systems Architecture

    Science.gov (United States)

    O'Neil, Graham; Orr, James K.; Watson, Steve

    2007-01-01

    A mission-systems architecture, based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology, is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and from Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.

  3. Reliability of Estimation Pile Load Capacity Methods

    Directory of Open Access Journals (Sweden)

    Yudhi Lastiasih

    2014-04-01

    For none of the numerous existing methods for predicting pile capacity is it known how accurate the predictions are when compared with the actual ultimate capacity of piles tested to failure. The authors of the present paper have conducted such an analysis, based on 130 data sets of field loading tests. Out of these 130 data sets, only 44 could be analysed, of which 15 were conducted until the piles actually reached failure. The pile prediction methods used were: Brinch Hansen's method (1963), Chin's method (1970), Decourt's extrapolation method (1999), Mazurkiewicz's method (1972), Van der Veen's method (1953), and the quadratic hyperbolic method proposed by Lastiasih et al. (2012). All of the above methods were found to be sufficiently reliable when applied to data from pile loading tests that were loaded to failure. However, when applied to data from pile loading tests that did not reach failure, the methods that yield lower values of the correction factor N are recommended. Finally, the empirical method of Reese and O'Neill (1988) was found to be reliable enough to be used to estimate the Qult of a pile foundation based on soil data only.
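
    As an illustration of one of the extrapolation methods named above, the sketch below applies Chin's (1970) hyperbolic construction to an invented load-settlement record: settlement/load is regressed on settlement, and the inverse of the slope approximates the ultimate capacity. The data are not from the paper's 130 load tests (Python):

    import numpy as np

    settlement = np.array([2.0, 4.0, 6.5, 9.0, 12.0, 16.0])        # mm (invented)
    load = np.array([400.0, 620.0, 760.0, 850.0, 910.0, 960.0])    # kN (invented)

    # Linear fit of s/Q versus s:  s/Q = m*s + c  ->  Q_ult ≈ 1/m
    y = settlement / load
    m, c = np.polyfit(settlement, y, 1)
    q_ult_chin = 1.0 / m
    print(f"Chin extrapolated ultimate capacity ≈ {q_ult_chin:.0f} kN")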

  4. A Study on the Reliability of Sasang Constitutional Body Trunk Measurement

    Directory of Open Access Journals (Sweden)

    Eunsu Jang

    2012-01-01

    Objective. Body trunk measurement plays an important diagnostic role not only in conventional medicine but also in Sasang constitutional medicine (SCM). The Sasang constitutional body trunk measurement (SCBTM) consists of the 5 widths and the 8 circumferences, which are standard locations currently employed in the SCM society. This study examines to what extent comprehensive training can improve the reliability of the SCBTM. Methods. We recruited 10 male subjects and 5 male observers with no experience of anthropometric measurement. We conducted measurements twice, before and after a comprehensive training. The relative technical error of measurement (%TEM) was produced to assess intra- and inter-observer reliability. Results. Post-training intra-observer %TEMs of the SCBTM were 0.27% to 1.85%, reduced from 0.27% to 6.26% pre-training. Post-training inter-observer %TEMs were 0.56% to 1.66%, reduced from 1.00% to 9.60% pre-training. Post-training total %TEMs, which represent the overall reliability, were 0.68% to 2.18%, reduced from a maximum value of 10.18%. Conclusion. A comprehensive training makes the SCBTM more reliable, hence giving a sufficiently confident diagnostic tool. It is strongly recommended to give a comprehensive training in advance of taking the SCBTM.

  5. A study on the reliability of sasang constitutional body trunk measurement.

    Science.gov (United States)

    Jang, Eunsu; Kim, Jong Yeol; Lee, Haejung; Kim, Honggie; Baek, Younghwa; Lee, Siwoo

    2012-01-01

    Objective. Body trunk measurement plays an important diagnostic role not only in conventional medicine but also in Sasang constitutional medicine (SCM). The Sasang constitutional body trunk measurement (SCBTM) consists of the 5 widths and the 8 circumferences, which are standard locations currently employed in the SCM society. This study examines to what extent comprehensive training can improve the reliability of the SCBTM. Methods. We recruited 10 male subjects and 5 male observers with no experience of anthropometric measurement. We conducted measurements twice, before and after a comprehensive training. The relative technical error of measurement (%TEM) was produced to assess intra- and inter-observer reliability. Results. Post-training intra-observer %TEMs of the SCBTM were 0.27% to 1.85%, reduced from 0.27% to 6.26% pre-training. Post-training inter-observer %TEMs were 0.56% to 1.66%, reduced from 1.00% to 9.60% pre-training. Post-training total %TEMs, which represent the overall reliability, were 0.68% to 2.18%, reduced from a maximum value of 10.18%. Conclusion. A comprehensive training makes the SCBTM more reliable, hence giving a sufficiently confident diagnostic tool. It is strongly recommended to give a comprehensive training in advance of taking the SCBTM.
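
    For readers unfamiliar with the %TEM statistic used in both of the preceding records, the sketch below computes the intra-observer technical error of measurement and its relative form for one site; the duplicate measurements are invented, not the study's data (Python):

    import numpy as np

    def tem(first, second):
        """Intra-observer TEM = sqrt(sum(d^2) / (2*n)) for paired repeated measurements."""
        d = np.asarray(first, dtype=float) - np.asarray(second, dtype=float)
        return np.sqrt((d ** 2).sum() / (2 * d.size))

    def percent_tem(first, second):
        """%TEM = 100 * TEM / grand mean of all measurements."""
        vals = np.concatenate([first, second]).astype(float)
        return 100 * tem(first, second) / vals.mean()

    # Hypothetical chest-circumference measurements (cm) of 10 subjects, taken twice:
    m1 = np.array([92.1, 88.4, 95.0, 90.2, 87.8, 99.3, 93.5, 85.9, 91.0, 96.4])
    m2 = np.array([92.4, 88.1, 95.3, 90.0, 88.2, 99.0, 93.9, 86.2, 90.7, 96.1])
    print(f"TEM = {tem(m1, m2):.2f} cm, %TEM = {percent_tem(m1, m2):.2f}%")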

  6. [The concept of nutritional self-sufficiency and the demographic equilibrium of Rwanda].

    Science.gov (United States)

    Habimana Nyirasafari, G

    1987-12-01

    Achieving food self-sufficiency is the basic strategy of Rwanda's 4th 5-year plan covering 1987-91. The population growth rate has increased from 3% in 1970 to 3.7% in 1983, with the population doubling between 1964 and 1985. Food production grew by about 4%/year between 1966-83, creating a slight increase in per capita food availability, but the 2171 calories available per capita is dangerously close to the theoretical minimum requirement of 2100 per day. The theoretical protein requirement is almost covered, but there is a serious shortage of oils. The increase in production since 1966 has been due almost exclusively to the extension of cultivated land. But the land supply is limited, and future production increases will need to be based on increased yields per unit cultivated. The National Office of Population has developed a simulation model that analyzes the parallel evolution of population and production so as to identify demographic and development policies that will assure food self-sufficiency and an improvement in living conditions. The population subsystem subjects the population divided by age and sex to the effects of fertility, migration, and mortality. Births are the result of 36 different fertility rates applied to the population of women aged 14-49 years. The agricultural subsystem is tied to the population subsystem by comparison of the volume of population to that of production, by estimation of the proportion of the population living exclusively by subsistence agriculture, by calculation of the potential emigration resulting from overpopulation of the countryside, and by estimation of the links between nutritional level, mortality, and duration of breastfeeding. 5 annexes contain subsystems showing effects of demographic growth on education, employment, and health. The model has various limitations including those of the reliability of its data, but it is sufficiently precise for its main function of clarifying the choices facing policymakers. 6

  7. Assessing intraindividual variability in sustained attention: reliability, relation to speed and accuracy, and practice effects

    Directory of Open Access Journals (Sweden)

    HAGEN C. FLEHMIG

    2007-06-01

    We investigated the psychometric properties of competing measures of sustained attention. 179 subjects were assessed twice within seven days with a test designed to measure sustained attention, or concentration. In addition to the traditional performance indices of speed (MRT) and accuracy (E%), we evaluated two intraindividual response time (RT) variability measures: the standard deviation (SDRT) and the coefficient of variation (CVRT). For the overall test, both indices were reliable. SDRT showed good to acceptable retest reliability for all subtests. For CVRT, retest reliability coefficients ranged from very good to not satisfactory. While the reversed-word recognition test proved highly reliable, the mental calculation test and the arrows test were not sufficiently reliable. CVRT was only slightly correlated with MRT, whereas SDRT was highly correlated with it. In contrast to substantial practice gains for MRT, SDRT and E%, only CVRT proved to be stable. In conclusion, CVRT appears to be a potential index for assessing performance variability: it is reliable for the overall test, only moderately correlated with speed, and virtually unaffected by practice. However, before applying CVRT in practical assessment settings, additional research is required to elucidate the impact of task-specific factors on the reliability of this performance measure.
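
    A minimal sketch of the three response-time indices compared above, computed for a single simulated subject; the RT distribution is an assumption for illustration, not the study's data (Python):

    import numpy as np

    rng = np.random.default_rng(0)
    rts = rng.gamma(shape=9.0, scale=50.0, size=200)   # simulated response times in ms

    mrt = rts.mean()                 # speed
    sdrt = rts.std(ddof=1)           # intraindividual variability
    cvrt = sdrt / mrt                # variability normalised by speed
    print(f"MRT = {mrt:.0f} ms, SDRT = {sdrt:.0f} ms, CVRT = {cvrt:.2f}")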

  8. Denmark. Self-sufficiency and reserves management

    International Nuclear Information System (INIS)

    Erceville, H. d'.

    1997-01-01

    Since 1991, Denmark has been self-sufficient in petroleum and natural gas and a net exporter of both. Like all countries bordering the North Sea, it enjoys many advantages. Nevertheless, Denmark both exports and imports about a third of its hydrocarbons; this policy is a way of managing its reserves for the future. (J.S.)

  9. Assessing high reliability via Bayesian approach and accelerated tests

    International Nuclear Information System (INIS)

    Erto, Pasquale; Giorgio, Massimiliano

    2002-01-01

    Sometimes the assessment of very high reliability levels is difficult for the following main reasons: - the high reliability level of each item makes it impossible to obtain, in a reasonably short time, a sufficient number of failures; - the high cost of the high reliability items to submit to life tests makes it unfeasible to collect enough data for 'classical' statistical analyses. In the above context, this paper presents a Bayesian solution to the problem of estimation of the parameters of the Weibull-inverse power law model, on the basis of a limited number (say six) of life tests, carried out at different stress levels, all higher than the normal one. The over-stressed (i.e. accelerated) tests allow the use of experimental data obtained in a reasonably short time. The Bayesian approach enables one to reduce the required number of failures adding to the failure information the available a priori engineers' knowledge. This engineers' involvement conforms to the most advanced management policy that aims at involving everyone's commitment in order to obtain total quality. A Monte Carlo study of the non-asymptotic properties of the proposed estimators and a comparison with the properties of maximum likelihood estimators closes the work
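
    The sketch below illustrates the general idea of combining accelerated life-test data with prior engineering judgement under a Weibull/inverse-power-law model via a maximum a posteriori fit; it is not the authors' estimator, and the data, priors and stress levels are all invented for illustration (Python):

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import lognorm

    # Failure times (hours) observed at three elevated stress levels (invented data):
    stress = np.array([2.0] * 5 + [3.0] * 5 + [4.0] * 5)
    times = np.array([900, 1100, 1300, 1500, 1700,
                      300, 380, 450, 520, 600,
                      120, 150, 180, 210, 260], dtype=float)

    def neg_log_post(theta):
        log_beta, log_c, log_n = theta
        beta, c, n = np.exp(log_beta), np.exp(log_c), np.exp(log_n)
        eta = c * stress ** (-n)                    # inverse power law for the Weibull scale
        loglik = np.sum(np.log(beta) - beta * np.log(eta)
                        + (beta - 1) * np.log(times) - (times / eta) ** beta)
        # Vague lognormal priors standing in for prior engineering knowledge (assumed):
        logprior = (lognorm.logpdf(beta, s=0.5, scale=2.0)
                    + lognorm.logpdf(c, s=1.0, scale=5000.0)
                    + lognorm.logpdf(n, s=0.5, scale=2.0))
        return -(loglik + logprior)

    res = minimize(neg_log_post, x0=np.log([2.0, 5000.0, 2.0]), method="Nelder-Mead")
    beta_hat, c_hat, n_hat = np.exp(res.x)
    eta_use = c_hat * 1.0 ** (-n_hat)               # Weibull scale at an assumed use stress of 1
    print(f"beta ≈ {beta_hat:.2f}, eta at use stress ≈ {eta_use:.0f} h")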

  10. Defining mental disorder. Exploring the 'natural function' approach

    Directory of Open Access Journals (Sweden)

    Varga Somogy

    2011-01-01

    Abstract. Due to several socio-political factors, to many psychiatrists only a strictly objective definition of mental disorder, free of value components, seems really acceptable. In this paper, I will explore a variant of such an objectivist approach to defining mental disorder, natural function objectivism. Proponents of this approach make recourse to the notion of natural function in order to reach a value-free definition of mental disorder. The exploration of Christopher Boorse's 'biostatistical' account of natural function (1) will be followed by an investigation of the 'hybrid naturalism' approach to natural functions by Jerome Wakefield (2). In the third part, I will explore two proposals that call into question the whole attempt to define mental disorder (3). I will conclude that while 'natural function objectivism' accounts fail to provide the backdrop for a reliable definition of mental disorder, there is no compelling reason to conclude that a definition cannot be achieved.

  11. Sufficiency of the Nuclear Fuel

    International Nuclear Information System (INIS)

    Pevec, D.; Knapp, V.; Matijevic, M.

    2008-01-01

    Estimation of nuclear fuel sufficiency is required for rational decision making on long-term energy strategy. In the past, an argument often invoked against nuclear energy was that uranium resources are inadequate. At present, when climate change associated with CO2 emissions is a major concern, a strong new argument for nuclear energy is that it can produce large amounts of energy without CO2 emissions. Increased interest in nuclear energy is evident, and a new look at uranium resources is relevant. We examined three different scenarios of nuclear capacity growth. The first scenario assumes low growth of 0.4 percent per year in nuclear capacity. The second scenario assumes moderate growth of 1.5 percent per year, preserving the present share of nuclear energy in total energy production. We estimated the time to drain conventional uranium resources with a once-through fuel cycle for both scenarios, obtaining 154 years for the first scenario and 96 years for the second. These results are, as expected, in agreement with usual evaluations. However, if nuclear energy is to make a major impact on CO2 emissions it should contribute much more to total energy production than its present level of 6 percent. We therefore defined a third scenario, which would increase the nuclear share of total energy production from 6 percent in 2020 to 30 percent by 2060, while total world energy production grows by 1.5 percent per year. We also looked into the uranium requirement for this scenario, determining the time window for the introduction of uranium or thorium reprocessing and for better use of uranium than is the case in the once-through fuel cycle. The once-through cycle would be sustainable in this scenario up to about 2060, providing most of the expected but undiscovered conventional uranium resources were turned
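
    The draining-time arithmetic behind such scenarios can be illustrated with a simple geometric-series calculation; the resource figure below is a placeholder expressed in years of current consumption, not the paper's resource estimate (Python):

    import math

    def exhaustion_time(resource, annual_use, growth):
        """Years until a fixed resource is drained when consumption grows at rate g per year.
        Solves resource = annual_use * ((1+g)**T - 1) / g for T."""
        if growth == 0:
            return resource / annual_use
        return math.log(1 + growth * resource / annual_use) / math.log(1 + growth)

    R = 200.0   # placeholder: resource equal to roughly 200 years of current annual use
    print(round(exhaustion_time(R, 1.0, 0.004)))   # ~0.4 %/yr growth (first scenario)
    print(round(exhaustion_time(R, 1.0, 0.015)))   # ~1.5 %/yr growth (second scenario)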

  12. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

    This paper discusses a new technique developed for generating well-defined RF large-voltage-swing signals for on-wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  13. Theoretical study of fuel element reliability in the BRIG-300 fast reactor

    International Nuclear Information System (INIS)

    Kulikov, I.S.; Nesterenko, V.B.; Tverkovkin, B.E.

    1983-01-01

    The theoretical results of studies of the reliability of cermet, symmetrically heated fuel elements under the conditions of the BRIG-300 fast gas-cooled reactor are presented. The investigations have been conducted at the Nuclear Power Engineering Institute of the Byelorussian Academy of Sciences. Two variants of the fuel element are considered: a fuel element with a gas gap between fuel and can, and a fuel element with tight contact between the cermet fuel and the can. Estimates are obtained of can resistance, swelling of the fuel rods and cans, strains and stresses in the cans, and the change of the gap and of its thermal conductivity during reactor operation. The results of the analysis show that the cermet fuel has sufficient reliability under the operational conditions of a reactor with a dissociating gas coolant in a steady-state regime.

  14. 33 CFR 115.30 - Sufficiency of State authority for bridges.

    Science.gov (United States)

    2010-07-01

    Section 115.30 (Navigation and Navigable Waters; Coast Guard, Department of Homeland Security; Bridges; Bridge Locations and Clearances; Administrative Procedures), § 115.30 Sufficiency of State authority for bridges: An opinion of the attorney general of the State as to the sufficiency of State...

  15. Enough is as good as a feast - sufficiency as policy. Volume 1

    International Nuclear Information System (INIS)

    Darby, Sarah

    2007-01-01

    The concept of sufficiency has a long history, related as it is to the timeless issues of how best to distribute and use resources. Where energy is concerned, absolute reductions in demand are increasingly seen as necessary in response to climate change and energy security concerns. There is an acknowledgement that, collectively if not individually, humans have gone beyond safe limits in their use of fuels. The relatively wealthy and industrialised nations urgently need to move beyond a primary focus on efficiency to the more contentious issues surrounding demand reduction and sufficiency. The paper considers definitions of energy sufficiency, looks at a recent attempt to model future energy use in terms of efficiency and sufficiency, and discusses quantitative and qualitative aspects of sufficiency and how they might become institutionalised. There are many arguments in favour of sufficiency but they often founder in the face of political requirements for market growth and the employment generated by it. Some options for 'sufficiency policy' are selected, including a focus on energy in relation to livelihoods, energy implications of our use of time and making energy use more transparent

  16. Enough is as good as a feast - sufficiency as policy. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Darby, Sarah [Lower Carbon Futures, Environmental Change Inst., Oxford Univ. Centre for the Environment (United Kingdom)

    2007-07-01

    The concept of sufficiency has a long history, related as it is to the timeless issues of how best to distribute and use resources. Where energy is concerned, absolute reductions in demand are increasingly seen as necessary in response to climate change and energy security concerns. There is an acknowledgement that, collectively if not individually, humans have gone beyond safe limits in their use of fuels. The relatively wealthy and industrialised nations urgently need to move beyond a primary focus on efficiency to the more contentious issues surrounding demand reduction and sufficiency. The paper considers definitions of energy sufficiency, looks at a recent attempt to model future energy use in terms of efficiency and sufficiency, and discusses quantitative and qualitative aspects of sufficiency and how they might become institutionalised. There are many arguments in favour of sufficiency but they often founder in the face of political requirements for market growth and the employment generated by it. Some options for 'sufficiency policy' are selected, including a focus on energy in relation to livelihoods, energy implications of our use of time and making energy use more transparent.

  17. Gamma processes and peaks-over-threshold distributions for time-dependent reliability

    International Nuclear Information System (INIS)

    Noortwijk, J.M. van; Weide, J.A.M. van der; Kallen, M.J.; Pandey, M.D.

    2007-01-01

    In the evaluation of structural reliability, a failure is defined as the event in which stress exceeds a resistance that is liable to deterioration. This paper presents a method to combine the two stochastic processes of deteriorating resistance and fluctuating load for computing the time-dependent reliability of a structural component. The deterioration process is modelled as a gamma process, which is a stochastic process with independent non-negative increments having a gamma distribution with identical scale parameter. The stochastic process of loads is generated by a Poisson process. The variability of the random loads is modelled by a peaks-over-threshold distribution (such as the generalised Pareto distribution). These stochastic processes of deterioration and load are combined to evaluate the time-dependent reliability
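
    As an illustration of how the two stochastic processes described above can be combined, the sketch below runs a crude Monte Carlo simulation: resistance degrades as a stationary gamma process, significant loads arrive as a Poisson process, and load peaks over a threshold follow a generalized Pareto distribution. It is a simulation-based stand-in for the paper's analytical treatment, and every parameter value is an illustrative assumption (Python):

    import numpy as np
    from scipy.stats import gamma, genpareto

    rng = np.random.default_rng(1)

    def survives(horizon=50.0, r0=100.0, a=0.8, b=0.5, rate=0.2,
                 threshold=40.0, shape=0.1, scale=10.0):
        """Simulate one history; return True if the resistance is never exceeded."""
        t, resistance = 0.0, r0
        while True:
            dt = rng.exponential(1.0 / rate)       # time to the next significant load
            t += dt
            if t > horizon:
                return True
            # Gamma-process deterioration accumulated over dt (shape a*dt, scale 1/b):
            resistance -= gamma.rvs(a * dt, scale=1.0 / b, random_state=rng)
            load = threshold + genpareto.rvs(shape, scale=scale, random_state=rng)
            if load > resistance:
                return False                       # failure: stress exceeds resistance

    n = 5000
    reliability = sum(survives() for _ in range(n)) / n
    print(f"Estimated 50-year reliability ≈ {reliability:.3f}")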

  18. Reliability analysis for Atucha II reactor protection system signals

    International Nuclear Information System (INIS)

    Roca, Jose Luis

    1996-01-01

    Atucha II is a 745 MW Argentine nuclear power reactor constructed by ENACE SA (Nuclear Argentine Company for Electrical Power Generation) and SIEMENS AG KWU, Erlangen, Germany. A preliminary modular logic analysis of the RPS (Reactor Protection System) signals was performed by means of the well-known Swedish professional risk and reliability software Risk-Spectrum, taking as a basis a reference signal coded JR17ER003, which commands the two moderator loop valves. From the reliability and behaviour knowledge for this reference signal follows an estimation of the reliability of the other 97 RPS signals. Because of the preliminary character of this analysis, importance measures are not computed at this stage. Reliability is predicted by the statistic named unavailability. The scope of this analysis is restricted from the measurement elements to the RPS buffer outputs. In the present context only one redundancy is analyzed, so in the instrumentation and control area there are no CCF (common cause failures) present for signals. Finally, those unavailability values could be introduced into the failure domain of the subsequent complete Atucha II reliability analysis, which includes all mechanical and electromechanical features. An estimation of the spurious frequency of RPS signals, defined as faulty by no trip, is also performed

  19. Reliability analysis for Atucha II reactor protection system signals

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2000-01-01

    Atucha II is a 745 MW Argentine nuclear power reactor constructed by Nuclear Argentine Company for Electric Power Generation S.A. (ENACE S.A.) and SIEMENS AG KWU, Erlangen, Germany. A preliminary modular logic analysis of the RPS (Reactor Protection System) signals was performed by means of the well-known Swedish professional risk and reliability software Risk-Spectrum, taking as a basis a reference signal coded JR17ER003, which commands the two moderator loop valves. From the reliability and behaviour knowledge for this reference signal follows an estimation of the reliability of the other 97 RPS signals. Because of the preliminary character of this analysis, importance measures are not computed at this stage. Reliability is predicted by the statistic named unavailability. The scope of this analysis is restricted from the measurement elements to the RPS buffer outputs. In the present context only one redundancy is analyzed, so in the instrumentation and control area there are no CCF (common cause failures) present for signals. Finally, those unavailability values could be introduced into the failure domain of the subsequent complete Atucha II reliability analysis, which includes all mechanical and electromechanical features. An estimation of the spurious frequency of RPS signals, defined as faulty by no trip, is also performed. (author)

  20. Topic B. Disposal objectives: are they fair and properly defined

    International Nuclear Information System (INIS)

    McComble, C.

    1994-01-01

    In this work the author was asked to make some connections between the ethical issues that are presently being discussed and the objectives and principles which have been espoused in the nuclear waste disposal area. He tries to group these under the following set of questions: Are the objectives and principles which we espouse properly defined? Are they sufficiently complete, or have we missed any out? Should we make any additional suggestions? Are they fair when we measure them against these ethical principles? Are they too ambitious? Are we going too far in one direction? (O.L.)

  1. System Reliability Evaluation of Data Transmission in Commercial Banks with Multiple Branches

    Directory of Open Access Journals (Sweden)

    Yi-Kuei Lin

    2014-01-01

    The main purpose of this paper is to assess the system reliability of electronic transaction data transmissions made by commercial banks in terms of a stochastic flow network. System reliability is defined as the probability of demand satisfaction and can be used to measure quality of service. In this paper, we study the system reliability of data transmission from the headquarters of a commercial bank to its multiple branches. The network structure of the bank and the probability of successful data transmission are obtained through the collection of real data. The system reliability, calculated using the minimal path method and the recursive sum of disjoint products algorithm, provides banking managers with a view of the current state of the entire system. Besides serving as a measure of quality of service, the system reliability can also guide improvement of the system through sensitivity analysis.
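
    As a simplified illustration of reliability evaluation from minimal paths (a binary-state, two-terminal simplification, not the paper's multi-source stochastic-flow model with demand levels), the sketch below applies inclusion-exclusion over invented minimal paths and arc reliabilities (Python):

    from itertools import combinations

    arc_rel = {"a": 0.99, "b": 0.98, "c": 0.97, "d": 0.99, "e": 0.95}   # hypothetical arc reliabilities
    minimal_paths = [{"a", "b"}, {"a", "c", "e"}, {"d", "e"}]           # arcs used by each minimal path

    def union_probability(paths, rel):
        """P(at least one minimal path has all arcs working), via inclusion-exclusion."""
        total = 0.0
        for r in range(1, len(paths) + 1):
            for combo in combinations(paths, r):
                arcs = set().union(*combo)          # arcs required by this union of paths
                p = 1.0
                for arc in arcs:
                    p *= rel[arc]                   # independent arcs
                total += (-1) ** (r + 1) * p
        return total

    print(f"System reliability ≈ {union_probability(minimal_paths, arc_rel):.4f}")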

  2. Integrated approach to economical, reliable, safe nuclear power production

    International Nuclear Information System (INIS)

    1982-06-01

    An Integrated Approach to Economical, Reliable, Safe Nuclear Power Production is the latest evolution of a concept which originated with the Defense-in-Depth philosophy of the nuclear industry. As Defense-in-Depth provided a framework for viewing physical barriers and equipment redundancy, the Integrated Approach gives a framework for viewing nuclear power production in terms of functions and institutions. In the Integrated Approach, four plant Goals are defined (Normal Operation, Core and Plant Protection, Containment Integrity and Emergency Preparedness) with the attendant Functional and Institutional Classifications that support them. The Integrated Approach provides a systematic perspective that combines the economic objective of reliable power production with the safety objective of consistent, controlled plant operation

  3. The level of energy sufficiency - why all the controversy?

    International Nuclear Information System (INIS)

    Maillard, D.

    2000-01-01

    It has become fashionable in certain circles to attempt to demolish the notion of energy sufficiency, a concept now seen as archaic and unsuitable. To back up their claims, proponents of this standpoint take great pleasure in attacking the corresponding indicator - the rate of energy sufficiency, calculated as the ratio of national primary energy production to total consumption of primary energy (in the same units and without climatic corrections). Confirming its precarious, conventional and debatable nature seems in their eyes to be the best means of ensuring that the word, the concept and the measuring method of energy sufficiency are all consigned to the dustbin of economic history. After having examined, with perhaps a certain irony, some of the usual criticisms, I intend to proceed with a re-examination of questions which in my eyes appear to be essential. (author)

  4. On the use of NDT Data for Reliability-Based Assessment of Existing Timber Structures

    DEFF Research Database (Denmark)

    Sousa, Hélder S.; Sørensen, John Dalsgaard; Kirkegaard, Poul Henning

    2013-01-01

    The objective of this paper is to address the possibilities of using non-destructive testing (NDT) data for updating information and obtaining adequate characterization of the reliability level of existing timber structures and, also, for assessing the evolution in time of performance...... of these structures when exposed to deterioration. By improving the knowledge upon the mechanical properties of timber, better and more substantiated decisions after a reliability safety assessment are aimed at. Bayesian methods are used to update the mechanical properties of timber and reliability assessment......, and information of NDT is also used to calibrate these models. The proposed approach is used for reliability assessment of different structural timber systems. Reliability of the structural system is assessed regarding the failure consequences of individual elements defined as key elements which were determined...
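    The Bayesian updating step described above can be sketched with a conjugate normal-normal model: a prior on the mean bending strength of a timber member is combined with strength estimates derived from NDT readings of assumed known noise. The prior parameters, noise level and data below are hypothetical; the cited work uses fuller probabilistic models of the NDT-strength relationship.

```python
# Minimal sketch: conjugate normal-normal Bayesian update of a timber
# strength parameter using NDT-derived estimates. All numbers are
# hypothetical; the cited work uses richer probabilistic models.
import math

def update_normal_mean(prior_mu, prior_sd, obs, obs_sd):
    """Posterior of the mean given observations with known noise std."""
    n = len(obs)
    prior_prec = 1.0 / prior_sd**2
    data_prec = n / obs_sd**2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mu = post_var * (prior_prec * prior_mu + data_prec * (sum(obs) / n))
    return post_mu, math.sqrt(post_var)

# Prior belief about mean bending strength (MPa) and NDT-derived estimates.
mu, sd = update_normal_mean(prior_mu=24.0, prior_sd=4.0,
                            obs=[21.5, 23.0, 22.1], obs_sd=3.0)
print(f"posterior mean = {mu:.2f} MPa, posterior std = {sd:.2f} MPa")
```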

  5. Reliability analysis for manual radiographic measures of rotatory subluxation or lateral listhesis in adult scoliosis.

    Science.gov (United States)

    Freedman, Brett A; Horton, William C; Rhee, John M; Edwards, Charles C; Kuklo, Timothy R

    2009-03-15

    Retrospective observational study. To define the inter- and intraobserver reliability of 3 measures of rotatory subluxation (RS) in adult scoliosis (AS). RS is a hallmark of AS. To accurately track this measure, one must know its reliability. Reliability testing has not been performed. PA 36" films of 29 AS patients were collected from one surgeon's practice. Three observers on 2 separate occasions measured all levels with ≥3 mm RS (60 levels, 360 measurements) on the convexity of the involved segment using 3 different techniques: midbody (MB), endplate (EP), and centroid (C). These data were then analyzed to determine the intraclass correlation coefficient (ICC) for inter- and intraobserver reliability. The thoracolumbar/lumbar curve (average 58 degrees) was the major curve for the majority (62%) of patients. RS at L3/4 was most common (35%). The overall inter- and intraobserver reliability was good-excellent for all methods, but the centroid method consistently had the highest ICC. ICC correlated with observer experience. Moderate-severe arthritic change (present in 55%) and poor image quality (52%) decreased ICC, but it still remained good-excellent for each measure. The reproducibility coefficient for each measure was 4 mm for MB and 2.8 mm for C and EP. MB, EP, and C are reliable techniques to measure RS even in elderly arthritic spines, but the methods inherently produce different values for a given level. The centroid method is most reliable and least influenced by experience. The EP method is easy to perform and very reliable. Spine surgeons should pick their preferred method and apply it consistently. Changes >3 mm suggest RS progression. RS may be a useful measure in addition to Cobb angle in AS. Having defined measurement reliability, the role of RS progression in surgical indications and patient outcomes can be evaluated.
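    Several records in this listing report intraclass correlation coefficients, so a compact sketch of the two-way random-effects, single-measure ICC(2,1) may be useful; the ratings matrix below is purely illustrative and unrelated to the radiographic data of the study.

```python
# Minimal sketch: ICC(2,1) (two-way random effects, absolute agreement,
# single rater) computed from an n-subjects x k-raters matrix.
# The example ratings are invented for illustration only.
import numpy as np

def icc_2_1(x):
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
    sse = ((x - x.mean(axis=1, keepdims=True)
              - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

ratings = [[9, 2, 5, 8],
           [6, 1, 3, 2],
           [8, 4, 6, 8],
           [7, 1, 2, 6],
           [10, 5, 6, 9],
           [6, 2, 4, 7]]
print(f"ICC(2,1) = {icc_2_1(ratings):.3f}")
```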

  6. Intersession reliability of fMRI activation for heat pain and motor tasks.

    Science.gov (United States)

    Quiton, Raimi L; Keaser, Michael L; Zhuo, Jiachen; Gullapalli, Rao P; Greenspan, Joel D

    2014-01-01

    As the practice of conducting longitudinal fMRI studies to assess mechanisms of pain-reducing interventions becomes more common, there is a great need to assess the test-retest reliability of the pain-related BOLD fMRI signal across repeated sessions. This study quantitatively evaluated the reliability of heat pain-related BOLD fMRI brain responses in healthy volunteers across 3 sessions conducted on separate days using two measures: (1) intraclass correlation coefficients (ICC) calculated based on signal amplitude and (2) spatial overlap. The ICC analysis of pain-related BOLD fMRI responses showed fair-to-moderate intersession reliability in brain areas regarded as part of the cortical pain network. Areas with the highest intersession reliability based on the ICC analysis included the anterior midcingulate cortex, anterior insula, and second somatosensory cortex. Areas with the lowest intersession reliability based on the ICC analysis also showed low spatial reliability; these regions included pregenual anterior cingulate cortex, primary somatosensory cortex, and posterior insula. Thus, this study found regional differences in pain-related BOLD fMRI response reliability, which may provide useful information to guide longitudinal pain studies. A simple motor task (finger-thumb opposition) was performed by the same subjects in the same sessions as the painful heat stimuli were delivered. Intersession reliability of fMRI activation in cortical motor areas was comparable to previously published findings for both spatial overlap and ICC measures, providing support for the validity of the analytical approach used to assess intersession reliability of pain-related fMRI activation. A secondary finding of this study is that the use of standard ICC alone as a measure of reliability may not be sufficient, as the underlying variance structure of an fMRI dataset can result in inappropriately high ICC values; a method to eliminate these false positive results was used in this study.

  8. Reliability programs for nuclear power plants. Regulatory standard S-98 revision 1

    International Nuclear Information System (INIS)

    2005-07-01

    The purpose of this regulatory standard is to help assure, in accordance with the purpose of the Nuclear Safety and Control Act (NSCA), that a licensee who constructs or operates a nuclear power plant (NPP) develops and implements a reliability program that assures that the systems important to safety at the plant can and will meet their defined design and performance specifications at acceptable levels of reliability throughout the lifetime of the facility. This regulatory standard describes the requirements of a reliability program for a nuclear power plant. The licensee shall implement the requirements described in this regulatory standard when a condition of a licence or other legally enforceable instrument so requires. (author)

  9. PPARalpha siRNA-treated expression profiles uncover the causal sufficiency network for compound-induced liver hypertrophy.

    Directory of Open Access Journals (Sweden)

    Xudong Dai

    2007-03-01

    Full Text Available Uncovering pathways underlying drug-induced toxicity is a fundamental objective in the field of toxicogenomics. Developing mechanism-based toxicity biomarkers requires the identification of such novel pathways and the order of their sufficiency in causing a phenotypic response. Genome-wide RNA interference (RNAi) phenotypic screening has emerged as an effective tool in unveiling the genes essential for specific cellular functions and biological activities. However, eliciting the relative contribution of and sufficiency relationships among the genes identified remains challenging. In the rodent, the most widely used animal model in preclinical studies, it is unrealistic to exhaustively examine all potential interactions by RNAi screening. Application of existing computational approaches to infer regulatory networks with biological outcomes in the rodent is limited by the requirements for a large number of targeted permutations. Therefore, we developed a two-step relay method that requires only one targeted perturbation for genome-wide de novo pathway discovery. Using expression profiles in response to small interfering RNAs (siRNAs) against the gene for peroxisome proliferator-activated receptor alpha (Ppara), our method unveiled the potential causal sufficiency order network for liver hypertrophy in the rodent. The validity of the inferred 16 causal transcripts or 15 known genes for PPARalpha-induced liver hypertrophy is supported by their ability to predict non-PPARalpha-induced liver hypertrophy with 84% sensitivity and 76% specificity. Simulation shows that the probability of achieving such predictive accuracy without the inferred causal relationship is exceedingly small (p < 0.005). Five of the most sufficient causal genes have been previously disrupted in mouse models; the resulting phenotypic changes in the liver support the inferred causal roles in liver hypertrophy. Our results demonstrate the feasibility of defining pathways mediating drug

  10. Developing safety performance functions incorporating reliability-based risk measures.

    Science.gov (United States)

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
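    As a minimal illustration of the probability of non-compliance for the sight-distance limit state described above, assume the available (supply) and stopping (demand) sight distances are independent normal variables; in that linear Gaussian case the FORM result reduces to a closed form. The means and standard deviations below are invented.

```python
# Minimal sketch: probability of non-compliance P(nc) for the limit state
# g = available sight distance - stopping sight distance, assuming both
# are independent normal random variables (values are illustrative).
from math import sqrt
from scipy.stats import norm

def p_noncompliance(mu_supply, sd_supply, mu_demand, sd_demand):
    mu_g = mu_supply - mu_demand
    sd_g = sqrt(sd_supply**2 + sd_demand**2)
    beta = mu_g / sd_g            # reliability (safety) index
    return norm.cdf(-beta)        # P(g < 0)

print(f"P(nc) = {p_noncompliance(180.0, 15.0, 160.0, 20.0):.4f}")
```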

  11. Reliability of Computerized Neurocognitive Tests for Concussion Assessment: A Meta-Analysis.

    Science.gov (United States)

    Farnsworth, James L; Dargo, Lucas; Ragan, Brian G; Kang, Minsoo

    2017-09-01

      Although widely used, computerized neurocognitive tests (CNTs) have been criticized because of low reliability and poor sensitivity. A systematic review was published summarizing the reliability of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores; however, this was limited to a single CNT. Expansion of the previous review to include additional CNTs and a meta-analysis is needed. Therefore, our purpose was to analyze reliability data for CNTs using meta-analysis and examine moderating factors that may influence reliability.   A systematic literature search (key terms: reliability, computerized neurocognitive test, concussion) of electronic databases (MEDLINE, PubMed, Google Scholar, and SPORTDiscus) was conducted to identify relevant studies.   Studies were included if they met all of the following criteria: used a test-retest design, involved at least 1 CNT, provided sufficient statistical data to allow for effect-size calculation, and were published in English.   Two independent reviewers investigated each article to assess inclusion criteria. Eighteen studies involving 2674 participants were retained. Intraclass correlation coefficients were extracted to calculate effect sizes and determine overall reliability. The Fisher Z transformation adjusted for sampling error associated with averaging correlations. Moderator analyses were conducted to evaluate the effects of the length of the test-retest interval, intraclass correlation coefficient model selection, participant demographics, and study design on reliability. Heterogeneity was evaluated using the Cochran Q statistic.   The proportion of acceptable outcomes was greatest for the Axon Sports CogState Test (75%) and lowest for the ImPACT (25%). Moderator analyses indicated that the type of intraclass correlation coefficient model used significantly influenced effect-size estimates, accounting for 17% of the variation in reliability.   The Axon Sports CogState Test, which
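    The pooling step mentioned above, averaging reliability coefficients after a Fisher Z transformation, can be sketched as follows; the coefficients and sample sizes are hypothetical, and the actual meta-analysis additionally models moderators and heterogeneity.

```python
# Minimal sketch: pooling reliability coefficients via the Fisher Z
# transformation, weighting each study by n - 3 (approximate inverse
# variance of z). Coefficients and sample sizes below are hypothetical.
import math

def pooled_reliability(coeffs, ns):
    zs = [math.atanh(r) for r in coeffs]          # Fisher Z transform
    ws = [n - 3 for n in ns]                      # approximate inverse variances
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_bar)                       # back-transform to r scale

print(f"pooled r = {pooled_reliability([0.57, 0.75, 0.66], [60, 45, 120]):.3f}")
```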

  12. Reliability of a consensus-based ultrasound score for tenosynovitis in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Naredo, Esperanza; D'Agostino, Maria Antonietta; Wakefield, Richard J

    2013-01-01

    OBJECTIVE: To produce consensus-based scoring systems for ultrasound (US) tenosynovitis and to assess the intraobserver and interobserver reliability of these scoring systems in rheumatoid arthritis (RA). METHODS: We undertook a Delphi process on US-defined tenosynovitis and US scoring system...... recruited. Ten rheumatologists expert in MSUS blindly, independently and consecutively scored for tenosynovitis in B-mode and PD mode three wrist extensor compartments, two finger flexor tendons and two ankle tendons of each patient in two rounds in a blinded fashion. Intraobserver reliability was assessed...... Doppler signal within the synovial sheath. The intraobserver reliability for tenosynovitis scoring on B-mode and PD mode was good (κ value 0.72 for B-mode; κ value 0.78 for PD mode). Interobserver reliability assessment showed good κ values for PD tenosynovitis scoring (first round, 0.64; second round, 0...

  13. Definably compact groups definable in real closed fields.II

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We continue the analysis of definably compact groups definable in a real closed field $\mathcal{R}$. In [3], we proved that for every definably compact definably connected semialgebraic group $G$ over $\mathcal{R}$ there are a connected $R$-algebraic group $H$, a definable injective map $\phi$ from a generic definable neighborhood of the identity of $G$ into the group $H\left(R\right)$ of $R$-points of $H$ such that $\phi$ acts as a group homomorphism inside its domain. The above result and o...

  14. Mission reliability of semi-Markov systems under generalized operational time requirements

    International Nuclear Information System (INIS)

    Wu, Xiaoyue; Hillston, Jane

    2015-01-01

    Mission reliability of a system depends on specific criteria for mission success. To evaluate the mission reliability of some mission systems that do not need to work normally for the whole mission time, two types of mission reliability for such systems are studied. The first type corresponds to the mission requirement that the system must remain operational continuously for a minimum time within the given mission time interval, while the second corresponds to the mission requirement that the total operational time of the system within the mission time window must be greater than a given value. Based on Markov renewal properties, matrix integral equations are derived for semi-Markov systems. Numerical algorithms and a simulation procedure are provided for both types of mission reliability. Two examples are used for illustration purposes. One is a one-unit repairable Markov system, and the other is a cold standby semi-Markov system consisting of two components. By the proposed approaches, the mission reliability of systems with time redundancy can be more precisely estimated to avoid possible unnecessary redundancy of system resources. - Highlights: • Two types of mission reliability under generalized requirements are defined. • Equations for both types of reliability are derived for semi-Markov systems. • Numerical methods are given for solving both types of reliability. • Simulation procedure is given for estimating both types of reliability. • Verification of the numerical methods is given by the results of simulation
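    For the one-unit repairable Markov example, both mission-reliability definitions can be approximated by simulation: sample alternating exponential up and down periods over the mission window, then check (a) whether some continuous operational stretch reaches the required minimum length and (b) whether the total operational time reaches the required value. The rates, mission time and thresholds below are illustrative, and the paper's matrix integral equations give the exact values.

```python
# Minimal sketch: Monte Carlo estimation of the two mission-reliability
# definitions for a one-unit repairable (exponential up/down) system.
# Failure/repair rates, mission time and thresholds are illustrative.
import random

def simulate_mission(lam, mu, t_mission):
    """Return (longest continuous uptime, total uptime) over one mission."""
    t, up = 0.0, True
    longest = total = current = 0.0
    while t < t_mission:
        dur = random.expovariate(lam if up else mu)
        dur = min(dur, t_mission - t)
        if up:
            current += dur
            total += dur
            longest = max(longest, current)
        else:
            current = 0.0
        t += dur
        up = not up
    return longest, total

def mission_reliabilities(lam, mu, t_mission, tau_min, t_required, trials=50_000):
    ok_continuous = ok_total = 0
    for _ in range(trials):
        longest, total = simulate_mission(lam, mu, t_mission)
        ok_continuous += longest >= tau_min      # type 1: continuous window
        ok_total += total >= t_required          # type 2: cumulative uptime
    return ok_continuous / trials, ok_total / trials

print(mission_reliabilities(lam=0.01, mu=0.5, t_mission=100, tau_min=40, t_required=90))
```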

  15. Method to ensure the reliability of power semiconductors depending on the application; Verfahren zur anwendungsspezifischen Sicherstellung der Zuverlaessigkeit von Leistungshalbleiter-Bauelementen

    Energy Technology Data Exchange (ETDEWEB)

    Grieger, Folkhart; Lindemann, Andreas [Magdeburg Univ. (Germany). Inst. fuer Elektrische Energiesysteme

    2011-07-01

    Load dependent conduction and switching losses during operation heat up power semiconductor devices. In this way they age; lifetime can be limited, e.g., by bond wire lift-off or solder fatigue. Components thus need to be dimensioned in a way that they can be expected to reach sufficient reliability during the system lifetime. Electromobility and new applications in electric transmission and distribution are demanding in this respect because of high reliability requirements and long operation times. (orig.)

  16. Increasing the reliability of the fluid/crystallized difference score from the Kaufman Adolescent and Adult Intelligence Test with reliable component analysis.

    Science.gov (United States)

    Caruso, J C

    2001-06-01

    The unreliability of difference scores is a well documented phenomenon in the social sciences and has led researchers and practitioners to interpret differences cautiously, if at all. In the case of the Kaufman Adolescent and Adult Intelligence Test (KAIT), the unreliability of the difference between the Fluid IQ and the Crystallized IQ is due to the high correlation between the two scales. The consequences of the lack of precision with which differences are identified are wide confidence intervals and significance tests with low power (i.e., large differences are required to be declared statistically significant). Reliable component analysis (RCA) was performed on the subtests of the KAIT in order to address these problems. RCA is a new data reduction technique that results in uncorrelated component scores with maximum proportions of reliable variance. Results indicate that the scores defined by RCA have discriminant and convergent validity (with respect to the equally weighted scores) and that differences between the scores, derived from a single testing session, were more reliable than differences derived from equal weighting for each age group (11-14 years, 15-34 years, 35-85+ years). This reliability advantage results in narrower confidence intervals around difference scores and smaller differences required for statistical significance.
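    The unreliability problem described above follows from the standard psychometric expression for the reliability of an equally weighted difference between two standardized scores (recalled here as a hedged reminder, assuming equal variances):

```latex
r_{DD} \;=\; \frac{\tfrac{1}{2}\,(r_{xx}+r_{yy}) - r_{xy}}{1 - r_{xy}}
```

    so a high correlation $r_{xy}$ between the Fluid and Crystallized scales drags the difference-score reliability down even when both scales are individually reliable, which is exactly what RCA counteracts by constructing uncorrelated components.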

  17. Error Bounds: Necessary and Sufficient Conditions

    Czech Academy of Sciences Publication Activity Database

    Outrata, Jiří; Kruger, A.Y.; Fabian, Marián; Henrion, R.

    2010-01-01

    Roč. 18, č. 2 (2010), s. 121-149 ISSN 1877-0533 R&D Projects: GA AV ČR IAA100750802 Institutional research plan: CEZ:AV0Z10750506; CEZ:AV0Z10190503 Keywords : Error bounds * Calmness * Subdifferential * Slope Subject RIV: BA - General Mathematics Impact factor: 0.333, year: 2010 http://library.utia.cas.cz/separaty/2010/MTR/outrata-error bounds necessary and sufficient conditions.pdf

  18. The Army Ethic-Inchoate but Sufficient

    Science.gov (United States)

    2015-06-12

    ...are constraints imposed by this thesis. Delimitations include the scope, jus ad bellum, cultural relativism, descriptive ethics, and implementation... politicians. Third, this thesis will not look in depth at cultural relativism and how changes in laws and society's philosophical and ethical... THE ARMY ETHIC - INCHOATE BUT SUFFICIENT. A thesis presented to the Faculty of the U.S. Army Command and General Staff College

  19. Reliability of peripheral arterial tonometry in patients with heart failure, diabetic nephropathy and arterial hypertension.

    Science.gov (United States)

    Weisrock, Fabian; Fritschka, Max; Beckmann, Sebastian; Litmeier, Simon; Wagner, Josephine; Tahirovic, Elvis; Radenovic, Sara; Zelenak, Christine; Hashemi, Djawid; Busjahn, Andreas; Krahn, Thomas; Pieske, Burkert; Dinh, Wilfried; Düngen, Hans-Dirk

    2017-08-01

    Endothelial dysfunction plays a major role in cardiovascular diseases and pulse amplitude tonometry (PAT) offers a non-invasive way to assess endothelial dysfunction. However, data about the reliability of PAT in cardiovascular patient populations are scarce. Thus, we evaluated the test-retest reliability of PAT using the natural log-transformed reactive hyperaemia index (LnRHI). Our cohort consisted of 91 patients (mean age: 65±9.7 years, 32% female), who were divided into four groups: those with heart failure with preserved ejection fraction (HFpEF) (n=25), heart failure with reduced ejection fraction (HFrEF) (n=22), diabetic nephropathy (n=21), and arterial hypertension (n=23). All subjects underwent two separate PAT measurements at a median interval of 7 days (range 4-14 days). LnRHI derived by PAT showed good reliability in subjects with diabetic nephropathy (intra-class correlation (ICC) = 0.863) and satisfactory reliability in patients with both HFpEF (ICC = 0.557) and HFrEF (ICC = 0.576). However, in subjects with arterial hypertension, reliability was poor (ICC = 0.125). We demonstrated that PAT is a reliable technique to assess endothelial dysfunction in adults with diabetic nephropathy, HFpEF or HFrEF. However, in subjects with arterial hypertension, we did not find sufficient reliability, which can possibly be attributed to variations in heart rate and the respective time of the assessments. Clinical Trial Registration Identifier: NCT02299960.

  20. The Relationship between Organizational Support Perceptions and Self-Sufficiencies of Logistics Sector Employees

    Directory of Open Access Journals (Sweden)

    Sefer Gumus

    2016-01-01

    Full Text Available This study was performed in order to examine the relationship between the organizational support perceptions and self-sufficiency levels of logistics sector employees and to determine whether employees' organizational support perceptions and self-sufficiency levels differ according to certain descriptive characteristics. For this purpose, a questionnaire form consisting of the perceived organizational support scale, the general self-sufficiency scale and a personal information form was applied to 124 employees of 3 separate logistics firms operating in Istanbul. The data obtained from the questionnaire were analyzed using the SPSS 17.0 statistical software package. In the assessment of the data, the descriptive characteristics of employees were determined by frequency and percentage statistics, and the self-sufficiency and perceived organizational support levels by mean and standard deviation statistics. The t test, Tukey test and one-way ANOVA were utilized in determining how employees' self-sufficiency and perceived organizational support levels differ according to descriptive characteristics, and correlation analysis was utilized in determining the relationship between the self-sufficiency and perceived organizational support levels of employees. In conclusion, a statistically significant relationship was found between the organizational support and self-sufficiency levels perceived by logistics sector employees. Accordingly, when employees' perceived organizational support levels increase, their self-sufficiency levels also increase, and when perceived organizational support levels decrease, self-sufficiency levels also decrease.

  1. Environmental education curriculum evaluation questionnaire: A reliability and validity study

    Science.gov (United States)

    Minner, Daphne Diane

    The intention of this research project was to bridge the gap between social science research and application to the environmental domain through the development of a theoretically derived instrument designed to give educators a template by which to evaluate environmental education curricula. The theoretical base for instrument development was provided by several developmental theories such as Piaget's theory of cognitive development, Developmental Systems Theory, Life-span Perspective, as well as curriculum research within the area of environmental education. This theoretical base fueled the generation of a list of components which were then translated into a questionnaire with specific questions relevant to the environmental education domain. The specific research question for this project is: Can a valid assessment instrument based largely on human development and education theory be developed that reliably discriminates high, moderate, and low quality in environmental education curricula? The types of analyses conducted to answer this question were interrater reliability (percent agreement, Cohen's Kappa coefficient, Pearson's Product-Moment correlation coefficient), test-retest reliability (percent agreement, correlation), and criterion-related validity (correlation). Face validity and content validity were also assessed through thorough reviews. Overall results indicate that 29% of the questions on the questionnaire demonstrated a high level of interrater reliability and 43% of the questions demonstrated a moderate level of interrater reliability. Seventy-one percent of the questions demonstrated a high test-retest reliability and 5% a moderate level. Fifty-five percent of the questions on the questionnaire were reliable (high or moderate) both across time and raters. Only eight questions (8%) did not show either interrater or test-retest reliability. The global overall rating of high, medium, or low quality was reliable across both coders and time, indicating
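    Because the interrater analysis above rests on percent agreement and Cohen's Kappa, a minimal Kappa sketch for two raters assigning categorical quality codes is given below; the ratings are invented.

```python
# Minimal sketch: Cohen's kappa for two raters assigning categorical codes.
# The example ratings are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    cats = set(ca) | set(cb)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in cats)        # chance agreement
    return (po - pe) / (1 - pe)

a = ["high", "high", "low", "moderate", "low", "high", "moderate", "low"]
b = ["high", "moderate", "low", "moderate", "low", "high", "high", "low"]
print(f"kappa = {cohens_kappa(a, b):.3f}")
```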

  2. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process, a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Nevertheless, applications of the model to the inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the model application to a software reliability analysis.
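    The MERF/EARF formulation itself is not reproduced here; as a stand-in, the NHPP view of the failure process can be illustrated with the widely used Goel-Okumoto mean value function. The parameters are hypothetical and this is explicitly not the authors' model.

```python
# Minimal sketch of NHPP-based software reliability using the common
# Goel-Okumoto mean value function as an illustrative stand-in for MERF/EARF.
# Parameters a (expected total faults) and b (detection rate) are hypothetical.
import math

def mean_value(t, a, b):
    """Expected cumulative number of failures by time t."""
    return a * (1.0 - math.exp(-b * t))

def reliability(x, t, a, b):
    """P(no failure in (t, t + x]) for an NHPP: exp(-(m(t+x) - m(t)))."""
    return math.exp(-(mean_value(t + x, a, b) - mean_value(t, a, b)))

print(f"R(10 | t=100) = {reliability(x=10, t=100, a=120, b=0.02):.3f}")
```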

  3. A study on the real-time reliability of on-board equipment of train control system

    Science.gov (United States)

    Zhang, Yong; Li, Shiwei

    2018-05-01

    Real-time reliability evaluation is conducive to establishing a condition-based maintenance system for the purpose of guaranteeing continuous train operation. According to the inherent characteristics of the on-board equipment, the connotation of reliability evaluation of on-board equipment is defined and an evaluation index of real-time reliability is provided in this paper. From the perspective of methodology and practical application, the real-time reliability of the on-board equipment is discussed in detail, and a method of evaluating the real-time reliability of on-board equipment at the component level based on a Hidden Markov Model (HMM) is proposed. In this method the performance degradation data are used directly to realize an accurate perception of the hidden state transition process of the on-board equipment, which achieves a better description of the real-time reliability of the equipment.
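    The HMM-based idea can be sketched as follows: a forward recursion converts a sequence of discretized performance-degradation observations into a filtered distribution over hidden health states, and real-time reliability is read off as the probability of not being in the failed state. The transition matrix, emission matrix and observation sequence below are invented.

```python
# Minimal sketch: HMM forward filtering over hidden health states
# (0 = healthy, 1 = degraded, 2 = failed); real-time reliability is taken
# as 1 - P(failed | observations). All matrices/observations are invented.
import numpy as np

A = np.array([[0.95, 0.04, 0.01],     # state transition probabilities
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
B = np.array([[0.80, 0.15, 0.05],     # emission probs: P(obs symbol | state)
              [0.20, 0.60, 0.20],
              [0.05, 0.25, 0.70]])
pi = np.array([0.98, 0.02, 0.00])     # initial state distribution

def realtime_reliability(observations):
    alpha = pi * B[:, observations[0]]
    alpha /= alpha.sum()
    for obs in observations[1:]:
        alpha = (alpha @ A) * B[:, obs]
        alpha /= alpha.sum()
    return 1.0 - alpha[2]             # probability of not being failed

print(f"R(t) = {realtime_reliability([0, 0, 1, 1, 2]):.3f}")
```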

  4. Uncertainty propagation and sensitivity analysis in system reliability assessment via unscented transformation

    International Nuclear Information System (INIS)

    Rocco Sanseverino, Claudio M.; Ramirez-Marquez, José Emmanuel

    2014-01-01

    The reliability of a system, whatever its intended function, can be significantly affected by the uncertainty in the reliability estimates of the components that define the system. This paper implements the Unscented Transformation to quantify the effects of the uncertainty of component reliability through two approaches. The first approach is based on the concept of uncertainty propagation, which is the assessment of the effect that the variability of the component reliabilities produces on the variance of the system reliability. This assessment based on UT has been previously considered in the literature but only for systems represented through series/parallel configurations. In this paper the assessment is extended to systems whose reliability cannot be represented through analytical expressions and require, for example, Monte Carlo simulation. The second approach consists of the evaluation of the importance of components, i.e., the evaluation of the components that most contribute to the variance of the system reliability. An extension of the UT is proposed to evaluate the so called “main effects” of each component, as well as to assess higher order component interactions. Several examples with excellent results illustrate the proposed approach. - Highlights: • Simulation based approach for computing reliability estimates. • Computation of reliability variance via 2n+1 points. • Immediate computation of component importance. • Application to network systems
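    A compact sketch of the first approach, propagating component-reliability uncertainty through a system function with 2n+1 sigma points, is given below for a simple series-parallel system; the component means, variances and UT tuning constants are illustrative, and the paper's examples also cover system functions evaluated by simulation.

```python
# Minimal sketch: Unscented Transformation (2n+1 sigma points) propagating
# component-reliability uncertainty through a series-parallel system
# function. Component means/variances and UT constants are illustrative.
import numpy as np

def system_reliability(r):
    """Series-parallel example: component 1 in series with (2 parallel 3)."""
    r1, r2, r3 = r
    return r1 * (1.0 - (1.0 - r2) * (1.0 - r3))

def unscented_moments(f, mean, cov, alpha=1.0, beta=2.0, kappa=0.0):
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    sqrt_mat = np.linalg.cholesky((n + lam) * cov)   # matrix square root
    sigma = [mean] + [mean + sqrt_mat[:, i] for i in range(n)] \
                   + [mean - sqrt_mat[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    y = np.array([f(s) for s in sigma])
    mean_y = float(wm @ y)
    var_y = float(wc @ (y - mean_y) ** 2)
    return mean_y, var_y

mu = np.array([0.95, 0.90, 0.85])                    # component reliability means
cov = np.diag([0.0004, 0.0016, 0.0016])              # component reliability variances
print(unscented_moments(system_reliability, mu, cov))
```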

  5. The INEL Human Reliability Program: The first two years of experience

    International Nuclear Information System (INIS)

    Minner, D.E.

    1986-01-01

    This paper provides a review of the design, implementation, and operation of the INEL Human Reliability Program from January 1984 through June of 1986. Human Reliability Programs are defined in terms of the 'insider threat' to the security of nuclear facilities. The design of HRPs is discussed, with special attention given to the particular challenge of the disgruntled employee. Each component of an HRP is reviewed, noting pitfalls and opportunities with each: drug testing of applicants and incumbents, psychological evaluation by management, and security clearance procedures and administration, including the use of an Employee Review Board to recommend action prior to the final management decision.

  6. Reliable, Economic, Efficient CO2 Heat Pump Water Heater for North America

    Energy Technology Data Exchange (ETDEWEB)

    Radcliff, Thomas D; Sienel, Tobias; Huff, Hans-Joachim; Thompson, Adrian; Sadegh, Payman; Olsommer, Benoit; Park, Young

    2006-12-31

    Adoption of heat pump water heating technology for commercial hot water could save up to 0.4 quads of energy and 5 million metric tons of CO2 production annually in North America, but industry perception is that this technology does not offer adequate performance or reliability and comes at too high a cost. Development and demonstration of a CO2 heat pump water heater is proposed to reduce these barriers to adoption. Three major themes are addressed: market analysis to understand barriers to adoption, use of advanced reliability models to design optimum qualification test plans, and field testing of two phases of water heater prototypes. Market experts claim that beyond good performance, market adoption requires 'drop and forget' system reliability and a six-month payback of first costs. Performance, reliability and cost targets are determined and reliability models are developed to evaluate the minimum testing required to meet reliability targets. Three phase 1 prototypes are designed and installed in the field. Based on results from these trials a product specification is developed and a second phase of five field trial units is built and installed. These eight units accumulate 11 unit-years of service including 15,650 hours and 25,242 cycles of compressor operation. Performance targets can be met. An availability of 60% is achieved and the capability to achieve >90% is demonstrated, but overall reliability is below target, with an average of 3.6 failures/unit-year on the phase 2 demonstration. Most reliability issues are shown to be common to new HVAC products, giving high confidence in mature product reliability, but the need for further work to minimize leaks and ensure reliability of the electronic expansion valve is clear. First cost is projected to be above target, leading to an expectation of an 8-24 month payback when substituted for an electric water heater. Despite not meeting all targets, arguments are made that an industry leader could

  7. [Systematic umbilical cord blood analysis at birth: feasibility and reliability in a French labour ward].

    Science.gov (United States)

    Ernst, D; Clerc, J; Decullier, E; Gavanier, G; Dupuis, O

    2012-10-01

    At birth, evaluation of neonatal well-being is crucial. It is therefore important to perform umbilical cord blood gas analysis and then to analyze the samples. We wanted to establish the feasibility and reliability of systematic umbilical cord blood sampling in a French labour ward. Systematic umbilical cord blood gas analysis was studied retrospectively in 1000 consecutive deliveries. We first established the feasibility of the samples. Feasibility was defined as the ratio of complete cord acid-base data to the number of deliveries of live newborns. Afterwards, we established the reliability of the remaining cord samples. Reliability was the ratio of samples that fulfilled the quality criteria defined by Westgate et al. and revised by Kro et al. to the number of complete samples from live newborns. Finally, we looked for factors that would influence these results. The feasibility of systematic umbilical cord blood sampling reached 91.6%, and the reliability reached 80.7%. Regarding the delivery mode, 38.6% of emergency caesareans (95% CI [30.8-46.3]; P…); analyses were significantly less often validated during emergency caesareans. Realization of systematic cord blood gas analysis was followed by 8.4% of incomplete samples, and by 19.3% that were uninterpretable. Training sessions should be organized to improve feasibility and reliability, especially during emergency caesareans. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  9. Sufficient conditions for optimality for a mathematical model of drug treatment with pharmacodynamics

    Directory of Open Access Journals (Sweden)

    Maciej Leszczyński

    2017-01-01

    Full Text Available We consider an optimal control problem for a general mathematical model of drug treatment with a single agent. The control represents the concentration of the agent and its effect (pharmacodynamics) is modelled by a Hill function (i.e., Michaelis-Menten type kinetics). The aim is to minimize a cost functional consisting of a weighted average related to the state of the system (both at the end and during a fixed therapy horizon) and to the total amount of drugs given. The latter is an indirect measure for the side effects of treatment. It is shown that optimal controls are continuous functions of time that change between full or no dose segments with connecting pieces that take values in the interior of the control set. Sufficient conditions for the strong local optimality of an extremal controlled trajectory in terms of the existence of a solution to a piecewise defined Riccati differential equation are given.
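    The pharmacodynamic element mentioned above can be sketched as a Hill (Michaelis-Menten type) concentration-effect curve; Emax, EC50 and the Hill coefficient below are placeholders rather than values from the paper.

```python
# Minimal sketch: Hill-function pharmacodynamics E(c). With hill_n = 1 this
# reduces to Michaelis-Menten kinetics. Parameter values are placeholders.
def hill_effect(concentration, e_max=1.0, ec50=0.5, hill_n=1.0):
    """Fractional drug effect at a given concentration."""
    c_n = concentration ** hill_n
    return e_max * c_n / (ec50 ** hill_n + c_n)

for c in (0.0, 0.25, 0.5, 1.0, 2.0):
    print(f"E({c:.2f}) = {hill_effect(c):.3f}")
```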

  10. Validity and Reliability of the Turkish Chronic Pain Acceptance Questionnaire

    Directory of Open Access Journals (Sweden)

    Hazel Ekin Akmaz

    2018-05-01

    Full Text Available Background: Pain acceptance is the process of giving up the struggle with pain and learning to live a worthwhile life despite it. In assessing patients with chronic pain in Turkey, making a diagnosis and tracking the effectiveness of treatment is done with scales that have been translated into Turkish. However, there is as yet no valid and reliable scale in Turkish to assess the acceptance of pain. Aims: To validate a Turkish version of the Chronic Pain Acceptance Questionnaire developed by McCracken and colleagues. Study Design: Methodological and cross sectional study. Methods: A simple randomized sampling method was used in selecting the study sample. The sample was composed of 201 patients, more than 10 times the number of items examined for validity and reliability in the study, which totaled 20. A patient identification form, the Chronic Pain Acceptance Questionnaire, and the Brief Pain Inventory were used to collect data. Data were collected by face-to-face interviews. In the validity testing, the content validity index was used to evaluate linguistic equivalence, content validity, construct validity, and expert views. In reliability testing of the scale, Cronbach’s α coefficient was calculated, and item analysis and split-test reliability methods were used. Principal component analysis and varimax rotation were used in factor analysis and to examine factor structure for construct concept validity. Results: The item analysis established that the scale, all items, and item-total correlations were satisfactory. The mean total score of the scale was 21.78. The internal consistency coefficient was 0.94, and the correlation between the two halves of the scale was 0.89. Conclusion: The Chronic Pain Acceptance Questionnaire, which is intended to be used in Turkey upon confirmation of its validity and reliability, is an evaluation instrument with sufficient validity and reliability, and it can be reliably used to examine patients’ acceptance
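    Since the reliability analysis above rests on Cronbach's α and split-half coefficients, a minimal Cronbach's α sketch over a respondents-by-items score matrix is shown below; the data are invented and far smaller than the study's 20-item, 201-patient sample.

```python
# Minimal sketch: Cronbach's alpha for a respondents x items score matrix.
# The example data are invented and far smaller than the cited study.
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                                  # number of items
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

scores = [[3, 4, 3, 5],
          [2, 2, 3, 2],
          [4, 5, 5, 4],
          [1, 2, 1, 2],
          [5, 4, 5, 5]]
print(f"alpha = {cronbach_alpha(scores):.3f}")
```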

  11. On new cautious structural reliability models in the framework of imprecise probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev; Kozine, Igor

    2010-01-01

    New imprecise structural reliability models are described in this paper. They are developed based on imprecise Bayesian inference and are imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models. The models are applied to computing cautious structural reliability measures when the number of events of interest or observations is very small. The main feature of the models is that prior ignorance is not modelled by a fixed single prior distribution, but by a class of priors which is defined by upper and lower probabilities that can converge as statistical data......

  12. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    Science.gov (United States)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrodes and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
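    As a hedged illustration of the 'acceleration function' component of such a model, the Prokopowicz-Vaskas style voltage-temperature relation commonly used in BaTiO3 MLCC life testing is sketched below; the exponent, activation energy and stress levels are placeholders, and this is not claimed to be the exact function of the cited model.

```python
# Minimal sketch: Prokopowicz-Vaskas style acceleration factor commonly used
# for BaTiO3 MLCC life testing: t1/t2 = (V2/V1)^n * exp(Ea/k * (1/T1 - 1/T2)).
# The exponent n, activation energy Ea and stress levels are placeholders.
import math

BOLTZMANN_EV = 8.617e-5  # eV/K

def acceleration_factor(v_use, t_use_k, v_stress, t_stress_k, n=3.0, ea_ev=1.3):
    """Ratio of time-to-failure at use conditions to that at stress conditions."""
    voltage_term = (v_stress / v_use) ** n
    thermal_term = math.exp(ea_ev / BOLTZMANN_EV * (1.0 / t_use_k - 1.0 / t_stress_k))
    return voltage_term * thermal_term

print(f"AF = {acceleration_factor(v_use=6.3, t_use_k=358, v_stress=18.9, t_stress_k=398):.1f}")
```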

  13. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimations of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom, otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.

  14. Reliability of pulse waveform separation analysis: effects of posture and fasting.

    Science.gov (United States)

    Stoner, Lee; Credeur, Daniel; Fryer, Simon; Faulkner, James; Lambrick, Danielle; Gibbs, Bethany Barone

    2017-03-01

    Oscillometric pulse wave analysis devices enable, with relative simplicity and objectivity, the measurement of central hemodynamic parameters. The important parameters are central blood pressures and indices of arterial wave reflection, including wave separation analysis (backward pressure component Pb and reflection magnitude). This study sought to determine whether the measurement precision (between-day reliability) of Pb and reflection magnitude (1) exceeds the criterion for acceptable reliability and (2) is affected by posture (supine, seated) and fasting state. Twenty healthy adults (50% female, 27.9 years, 24.2 kg/m²) were tested on six different mornings: 3 days fasted, 3 days nonfasted condition. On each occasion, participants were tested in supine and seated postures. Oscillometric pressure waveforms were recorded on the left upper arm. The criterion intra-class correlation coefficient value of 0.75 was exceeded for Pb (0.76) and reflection magnitude (0.77) when participants were assessed under the combined supine-fasted condition. The intra-class correlation coefficient was lowest for Pb in the seated-nonfasted condition (0.57), and lowest for reflection magnitude in the seated-fasted condition (0.56). For Pb, the smallest detectable change that must be exceeded in order for a significant change to occur in an individual was 2.5 mmHg, and for reflection magnitude, the smallest detectable change was 8.5%. Assessments of Pb and reflection magnitude (1) exceed the criterion for acceptable reliability and (2) are most reliable when participants are fasted and in a supine position. The demonstrated reliability suggests sufficient precision to detect clinically meaningful changes in reflection magnitude and Pb.
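    The smallest detectable change values quoted above follow from the ICC in a standard way; a hedged sketch of the usual computation (SEM from the between-subject SD and the ICC, then SDC for a test-retest design) is shown below with invented numbers.

```python
# Minimal sketch: standard error of measurement (SEM) and smallest detectable
# change (SDC) from an ICC, using SEM = SD*sqrt(1 - ICC) and
# SDC = 1.96*sqrt(2)*SEM. The SD and ICC values are invented.
import math

def sem_sdc(between_subject_sd, icc):
    sem = between_subject_sd * math.sqrt(1.0 - icc)
    sdc = 1.96 * math.sqrt(2.0) * sem
    return sem, sdc

sem, sdc = sem_sdc(between_subject_sd=1.8, icc=0.76)   # e.g. a Pb-like quantity in mmHg
print(f"SEM = {sem:.2f} mmHg, SDC = {sdc:.2f} mmHg")
```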

  15. Reliability of the Q Force; a mobile instrument for measuring isometric quadriceps muscle strength

    OpenAIRE

    Schans, van der, C.P.; Zijlstra, W.; Regterschot, G.R.H.; Krijnen, W.P.; Douma, K.W.; Slager, G.E.C.

    2016-01-01

    BACKGROUND: The ability to generate muscle strength is a pre-requisite for all human movement. Decreased quadriceps muscle strength is frequently observed in older adults and is associated with a decreased performance and activity limitations. To quantify the quadriceps muscle strength and to monitor changes over time, instruments and procedures with a sufficient reliability are needed. The Q Force is an innovative mobile muscle strength measurement instrument suitable to measure in various d...

  16. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability block diagrams (RBD). Keywords: compressor system, reliability, reliability block diagram, RBD. ... the same structure has been kept with the three subsystems: air flow, oil flow and ...

  17. The impact of reliability centered maintenance on plant prospects for license renewal

    International Nuclear Information System (INIS)

    Elliott, J.O.; Nakahara, Y.

    1991-01-01

    Much attention has been directed in recent years to means of extending nuclear power plant life. As many plants enter their third decade of service, the questions loom large of how to keep aging plants reliable, as well as how to assure reliability and safety to an extent sufficient to warrant license renewal at the end of the current licensing term. Concurrently the nuclear industry has seen a growing interest in reducing the cost and complexity of maintenance activities while at the same time improving plant reliability and availability. Attainment of these seemingly contradictory aims is being aided by the introduction of a maintenance philosophy developed originally by the airline industry and subsequently applied with great success both in that industry and the U.S. military services. Reliability Centered Maintenance (RCM), in its basic form, may be described as a consideration of reliability and maintenance problems from a systems level approach, allowing a focus on preservation of system functions as the aim of a maintenance program optimized for both safety and economics. It is this systematic view of plant maintenance, with the emphasis on preservation of overall functions rather than individual parts and components which sets RCM apart from past nuclear plant maintenance philosophies. It is also the factor which makes application of RCM an ideal first step in development of strategies for life extension and license renewal, both for aging plants, and for plants just beginning their first license term

  18. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers, and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  19. Determining the theoretical reliability function of thermal power system using simple and complex Weibull distribution

    Directory of Open Access Journals (Sweden)

    Kalaba Dragan V.

    2014-01-01

    Full Text Available The main subject of this paper is the presentation of a probabilistic technique for thermal power system reliability assessment. Exploitation research on the reliability of the fossil fuel power plant system has defined the function, or the probabilistic law, according to which the random variable behaves (occurrence of a complete unplanned standstill). Based on these data, and by applying reliability theory to this particular system, using simple and complex Weibull distributions, a hypothesis has been confirmed that the distribution of the observed random variable fully describes the behaviour of such a system in terms of reliability. Establishing comprehensive insight into probabilistic power system reliability assessment techniques could serve as an input for further research and development in the area of power system planning and operation.
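    For reference, the reliability functions referred to above have simple closed forms; the sketch below shows the two-parameter (simple) Weibull and a two-component Weibull mixture as one possible reading of 'complex Weibull'. Shape and scale parameters and the mixture weight are placeholders.

```python
# Minimal sketch: two-parameter Weibull reliability R(t) = exp(-(t/eta)^beta)
# and a two-component Weibull mixture as one reading of "complex Weibull".
# Shape/scale parameters and mixture weights are placeholders.
import math

def weibull_reliability(t, beta, eta):
    return math.exp(-((t / eta) ** beta))

def mixture_reliability(t, params, weights):
    """params: iterable of (beta, eta); weights sum to 1."""
    return sum(w * weibull_reliability(t, b, e)
               for w, (b, e) in zip(weights, params))

t = 1000.0  # operating hours
print(f"simple  R(t) = {weibull_reliability(t, beta=1.4, eta=4000.0):.3f}")
print(f"mixture R(t) = {mixture_reliability(t, [(0.8, 900.0), (2.5, 6000.0)], [0.3, 0.7]):.3f}")
```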

  20. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  1. Reliability assurance programme guidebook for advanced light water reactors

    International Nuclear Information System (INIS)

    2001-12-01

    To facilitate the implementation of reliability assurance programmes (RAP) within future advanced reactor programmes and to ensure that the next generation of commercial nuclear reactors achieves the very high levels of safety, reliability and economy which are expected of them, in 1996 the International Atomic Energy Agency (IAEA) established a task to develop a guidebook for reliability assurance programmes. The draft RAP guidebook was prepared by an expert consultant and was reviewed and modified at an Advisory Group meeting (7-10 April 1997) and at a consultants meeting (7-10 October 1997). The programme for the RAP guidebook was reported to and guided by the Technical Working Group on Advanced Technologies for Light Water Reactors (TWG-LWR). This guidebook demonstrates how the designers and operators of future commercial nuclear plants can exploit the risk, reliability and availability engineering methods and techniques developed over the past two decades to augment existing design and operational nuclear plant decision-making capabilities. This guidebook is intended to provide the necessary understanding, insights and examples of RAP management systems and processes from which a future user can derive his own plant-specific reliability assurance programmes. The RAP guidebook is intended to augment, not replace, specific reliability assurance requirements defined by the utility requirements documents and by individual nuclear steam supply system (NSSS) designers. This guidebook draws from utility experience gained during implementation of reliability and availability improvement and risk-based management programmes to provide both written and diagrammatic 'how to' guidance which can be followed to assure conformance with the specific requirements outlined by utility requirements documents and in the development of a practical and effective plant-specific RAP in any IAEA Member State.

  2. Definition and Reliability Assessment of Elementary Ultrasonographic Findings in Calcium Pyrophosphate Deposition Disease

    DEFF Research Database (Denmark)

    Filippou, Georgios; Scirè, Carlo A; Damjanov, Nemanja

    2017-01-01

    OBJECTIVE: To define the ultrasonographic characteristics of calcium pyrophosphate crystal (CPP) deposits in joints and periarticular tissues and to evaluate the intra- and interobserver reliability of expert ultrasonographers in the assessment of CPP deposition disease (CPPD) according to the ne...

  3. A comparison of manual anthropometric measurements with Kinect-based scanned measurements in terms of precision and reliability.

    Science.gov (United States)

    Bragança, Sara; Arezes, Pedro; Carvalho, Miguel; Ashdown, Susan P; Castellucci, Ignacio; Leão, Celina

    2018-01-01

    Collecting anthropometric data for real-life applications demands a high degree of precision and reliability. It is important to test new equipment that will be used for data collection. OBJECTIVE: Compare two anthropometric data gathering techniques - manual methods and a Kinect-based 3D body scanner - to understand which of them gives more precise and reliable results. The data were collected using a measuring tape and a Kinect-based 3D body scanner. They were evaluated in terms of precision by considering the regular and relative Technical Error of Measurement and in terms of reliability by using the Intraclass Correlation Coefficient, Reliability Coefficient, Standard Error of Measurement and Coefficient of Variation. The results obtained showed that both methods presented better results for reliability than for precision. Both methods showed relatively good results for these two variables; however, manual methods had better results for some body measurements. Despite being considered sufficiently precise and reliable for certain applications (e.g. apparel industry), the 3D scanner tested showed, for almost every anthropometric measurement, a different result than the manual technique. Many companies design their products based on data obtained from 3D scanners, hence, understanding the precision and reliability of the equipment used is essential to obtain feasible results.
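    For the precision statistics mentioned above, the usual anthropometric definitions are recalled here as a hedged reminder, assuming two repeated measurements per subject, where $d_i$ is the difference between the two measurements of subject $i$ and $\bar{x}$ is the overall mean:

```latex
\mathrm{TEM} = \sqrt{\frac{\sum_{i=1}^{n} d_i^{2}}{2n}},
\qquad
\mathrm{relative\ TEM} = \frac{\mathrm{TEM}}{\bar{x}} \times 100\%
```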

  4. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation (e.g., for the exponential distribution), the reliability of systems, availability, preventive maintenance (replacement policies, minimal repair policy, shock models, spares, group maintenance and periodic inspection), analysis of common cause failures, and an analysis model of the repair effect.

  5. A multi-state reliability evaluation model for P2P networks

    International Nuclear Information System (INIS)

    Fan Hehong; Sun Xiaohan

    2010-01-01

    The appearance of new service types and the convergence tendency of communication networks have endowed networks with more and more P2P (peer-to-peer) properties. These networks can be more robust and tolerant of a range of non-perfect operational states due to the non-deterministic server-client distributions. Thus a reliability model taking into account the multi-state and non-deterministic server-client distribution properties is needed for appropriate evaluation of the networks. In this paper, two new performance measures are defined to quantify the overall and local states of the networks. A new time-evolving state-transition Monte Carlo (TEST-MC) simulation model is presented for the reliability analysis of P2P networks in multiple states. The results show that the model is not only valid for estimating the traditional binary-state network reliability parameters, but also adequate for acquiring the parameters in a series of non-perfect operational states, with good efficiency, especially for highly reliable networks. Furthermore, the model is versatile for reliability and maintainability analyses in that both the links and the nodes can be failure-prone with arbitrary life distributions, and various maintainability schemes can be applied.

  6. New Sufficient LMI Conditions for Static Output Stabilization

    DEFF Research Database (Denmark)

    Adegas, Fabiano Daher

    2014-01-01

    This paper presents new linear matrix inequality conditions to the static output feedback stabilization problem. Although the conditions are only sufficient, numerical experiments show excellent success rates in finding a stabilizing controller....

  7. Comprehensive low-cost reliability centered maintenance. Final report

    International Nuclear Information System (INIS)

    Rotton, S.J.; Dozier, I.J.; Thow, R.

    1995-09-01

    Reliability Centered Maintenance (RCM) is a maintenance optimization approach that all electric utilities can apply to power plant systems. The Electric Power Research Institute and PECO Energy Company jointly sponsored this Comprehensive Low-Cost Reliability Centered Maintenance project to demonstrate that the standard RCM methodology could be streamlined to reduce the cost of analysis while maintaining a high quality product. EPRI's previous investigation of streamlined RCM methods being pioneered in the nuclear industry indicated that PECO Energy could expect to optimize its maintenance program at reduced cost by carefully controlling the scope without sacrificing documentation or technical quality. Using the insights obtained from these previous studies, three methods were defined in this project and were demonstrated in a large scale application to 60 systems at both the Limerick Generating Station and the Peach Bottom Atomic Power Station

  8. Inter- and intra-observer reliability of masking in plantar pressure measurement analysis.

    Science.gov (United States)

    Deschamps, K; Birch, I; Mc Innes, J; Desloovere, K; Matricali, G A

    2009-10-01

    Plantar pressure measurement is an important tool in gait analysis. Manual placement of small masks (masking) is increasingly used to calculate plantar pressure characteristics. Little is known concerning the reliability of manual masking. The aim of this study was to determine the reliability of masking on 2D plantar pressure footprints, in a population with forefoot deformity (i.e. hallux valgus). Using a random repeated-measure design, four observers identified the third metatarsal head on a peak-pressure barefoot footprint, using a small mask. Subsequently, the location of all five metatarsal heads was identified, using the same size of masks and the same protocol. The 2D positional variation of the masks and the peak pressure (PP) and pressure time integral (PTI) values of each mask were calculated. For single-masking the lowest inter-observer reliability was found for the distal-proximal direction, causing a clear, adverse impact on the reliability of the pressure characteristics (PP and PTI). In the medial-lateral direction the inter-observer reliability could be scored as high. Intra-observer reliability was better and could be scored as high or good for both directions, with a correlated improved reliability of the pressure characteristics. Reliability of multi-masking showed a similar pattern, but overall values tended to be lower. Therefore, small sized masking in order to define pressure characteristics in the forefoot should be done with care.

  9. Innovation and reliability of atomic standards for PTTI applications

    Science.gov (United States)

    Kern, R.

    1981-01-01

    Innovation and reliability in hyperfine frequency standards and clock systems are discussed. Hyperfine standards are defined as those precision frequency sources and clocks which use a hyperfine atomic transition for frequency control and which have realized significant commercial production and acceptance (cesium, hydrogen, and rubidium atoms). References to other systems such as thallium and ammonia are excluded since these atomic standards have not been commercially exploited in this country.

  10. The Necessary and Sufficient Closure Process Completion Report for Purex Facility Surveillance and Maintenance

    International Nuclear Information System (INIS)

    Gerald, J.W.

    1997-10-01

    This document completes the U.S. Department of Energy Closure Process for Necessary and Sufficient Sets of Standards for the Plutonium Uranium Extraction (PUREX) facility located at the Hanford Site in Washington State. This documentation is provided to support the Work Smart Standards set identified for the long-term surveillance and maintenance of PUREX. This report is organized into two volumes. Volume 1 contains the following sections: Section 1 provides an introduction for the document; Section 2 provides a basis for initiating the N&S process; Section 3 defines the work and hazards to be addressed; Section 4 identifies the N&S set of standards and requirements; Section 5 provides the justification for adequacy of the Work Smart Standards; Section 6 shows the criteria and qualifications of the teams; Section 7 describes the stakeholder participation and concerns; Section 8 provides a list of references used within the document

  11. Implementing necessary and sufficient standards for radioactive waste management at LLNL

    International Nuclear Information System (INIS)

    Sims, J.M.; Ladran, A.; Hoyt, D.

    1995-01-01

    Lawrence Livermore National Laboratory (LLNL) and the U.S. Department of Energy, Oakland Field Office (DOE/OAK), are participating in a pilot program to evaluate the process to develop necessary and sufficient sets of standards for contractor activities. This concept of contractor and DOE jointly and locally deciding on what constitutes the set of standards that are necessary and sufficient to perform work safely and in compliance with federal, state, and local regulations, grew out of DOE's Department Standards Committee (Criteria for the Department's Standards Program, August 1994, DOE/EH/-0416). We have chosen radioactive waste management activities as the pilot program at LLNL. This pilot includes low-level radioactive waste, transuranic (TRU) waste, and the radioactive component of low-level and TRU mixed wastes. Guidance for the development and implementation of the necessary and sufficient set of standards is provided in "The Department of Energy Closure Process for Necessary and Sufficient Sets of Standards," March 27, 1995 (draft)

  12. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  13. A study of operational and testing reliability in software reliability analysis

    International Nuclear Information System (INIS)

    Yang, B.; Xie, M.

    2000-01-01

    Software reliability is an important aspect of any complex equipment today. Software reliability is usually estimated based on reliability models such as nonhomogeneous Poisson process (NHPP) models. Software systems improve during the testing phase, while they normally do not change in the operational phase. Depending on whether the reliability is to be predicted for the testing phase or the operational phase, different measures should be used. In this paper, two different reliability concepts, namely the operational reliability and the testing reliability, are clarified and studied in detail. These concepts have been mixed up or even misused in some existing literature. Using different reliability concepts will lead to different reliability values being obtained, and this will further lead to different reliability-based decisions being made. The difference between the estimated reliabilities is studied and the effect on the optimal release time is investigated

  14. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  15. Structural reliability analysis and seismic risk assessment

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability in the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of stationary Gaussian process with a zero mean and a Kanai-Tajimi Spectrum. All possible seismic hazard at a site represented by a hazard curve is also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and ground earthquake acceleration are presented and a fragility curve for PRA studies is also constructed
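
    The last step described here, turning a fragility curve and a site hazard curve into a limit-state probability, can be sketched numerically. The lognormal fragility parameters and the power-law hazard curve below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.stats import lognorm

# Illustrative lognormal fragility: median capacity 0.6 g, logarithmic std 0.4
median_a, beta = 0.6, 0.4
fragility = lognorm(s=beta, scale=median_a).cdf

# Illustrative hazard curve: annual frequency of exceeding a given PGA (power law)
def hazard(a):
    return 1e-3 * (a / 0.1) ** -2.5

a = np.linspace(0.05, 2.0, 400)           # PGA grid (g)
da = a[1] - a[0]
dlam_da = np.gradient(hazard(a), a)        # slope of the hazard curve
annual_p_ls = np.sum(fragility(a) * np.abs(dlam_da)) * da
print(f"annual limit-state frequency ~ {annual_p_ls:.2e} per year")
```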

  16. Insightful practice: a reliable measure for medical revalidation

    Science.gov (United States)

    Guthrie, Bruce; Sullivan, Frank M; Mercer, Stewart W; Russell, Andrew; Bruce, David A

    2012-01-01

    Background Medical revalidation decisions need to be reliable if they are to reassure on the quality and safety of professional practice. This study tested an innovative method in which general practitioners (GPs) were assessed on their reflection and response to a set of externally specified feedback. Setting and participants 60 GPs and 12 GP appraisers in the Tayside region of Scotland, UK. Methods A feedback dataset was specified as (1) GP-specific data collected by GPs themselves (patient and colleague opinion; open book self-evaluated knowledge test; complaints) and (2) Externally collected practice-level data provided to GPs (clinical quality and prescribing safety). GPs' perceptions of whether the feedback covered UK General Medical Council specified attributes of a ‘good doctor’ were examined using a mapping exercise. GPs' professionalism was examined in terms of appraiser assessment of GPs' level of insightful practice, defined as: engagement with, insight into and appropriate action on feedback data. The reliability of assessment of insightful practice and subsequent recommendations on GPs' revalidation by face-to-face and anonymous assessors were investigated using Generalisability G-theory. Main outcome measures Coverage of General Medical Council attributes by specified feedback and reliability of assessor recommendations on doctors' suitability for revalidation. Results Face-to-face assessment proved unreliable. Anonymous global assessment by three appraisers of insightful practice was highly reliable (G=0.85), as were revalidation decisions using four anonymous assessors (G=0.83). Conclusions Unlike face-to-face appraisal, anonymous assessment of insightful practice offers a valid and reliable method to decide GP revalidation. Further validity studies are needed. PMID:22653078

  17. Quality of life in oncological patients with oropharyngeal dysphagia: validity and reliability of the Dutch version of the MD Anderson Dysphagia Inventory and the Deglutition Handicap Index.

    Science.gov (United States)

    Speyer, Renée; Heijnen, Bas J; Baijens, Laura W; Vrijenhoef, Femke H; Otters, Elsemieke F; Roodenburg, Nel; Bogaardt, Hans C

    2011-12-01

    Quality of life is an important outcome measurement in objectifying the current health status or therapy effects in patients with oropharyngeal dysphagia. In this study, the validity and reliability of the Dutch version of the Deglutition Handicap Index (DHI) and the MD Anderson Dysphagia Inventory (MDADI) have been determined for oncological patients with oropharyngeal dysphagia. At Maastricht University Medical Center, 76 consecutive patients were selected and asked to fill in three questionnaires on quality of life related to oropharyngeal dysphagia (the SWAL-QOL, the MDADI, and the DHI) as well as a simple one-item visual analog Dysphagia Severity Scale. None of the quality-of-life questionnaires showed any floor or ceiling effect. The test-retest reliability of the MDADI and the Dysphagia Severity Scale proved to be good. The test-retest reliability of the DHI could not be determined because of insufficient data, but the intraclass correlation coefficients were rather high. The internal consistency proved to be good. However, confirmatory factor analysis could not distinguish the underlying constructs as defined by the subscales per questionnaire. When assessing criterion validity, both the MDADI and the DHI showed satisfactory associations with the SWAL-QOL (reference or gold standard) after having removed the less relevant subscales of the SWAL-QOL. In conclusion, when assessing the validity and reliability of the Dutch version of the DHI or the MDADI, not all psychometric properties have been adequately met. In general, because of difficulties in the interpretation of study results when using questionnaires lacking sufficient psychometric quality, it is recommended that researchers strive to use questionnaires with the most optimal psychometric properties.

  18. Reliability analysis and initial requirements for FC systems and stacks

    Science.gov (United States)

    Åström, K.; Fontell, E.; Virtanen, S.

    In the year 2000 Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of the SOFC technology. In Wärtsilä, methods have been implemented for analysing the system in respect to reliability and safety as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method for assessing reliability and critical failure predictability requirements for fuel cell stacks in a system consisting of several stacks has been developed. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on the system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical failure predictability and operating strategy on the system reliability and availability. An example configuration, consisting of 5 × 5 stacks (series of 5 sets of 5 parallel stacks) is analysed in respect to stack reliability requirements as a function of predictability of critical failures and Weibull shape factor of failure rate distributions.
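
    The 5 x 5 example configuration lends itself to a quick back-of-the-envelope check. The sketch below treats each stack as binary (working or failed) with an independent Weibull life distribution, which is a simplification of the paper's three-state stack model; the shape and scale parameters and the operating times are illustrative.

```python
import numpy as np

def weibull_reliability(t, shape, scale):
    """Survival probability of a single stack under a Weibull life distribution."""
    return np.exp(-(t / scale) ** shape)

def series_of_parallel(r_stack, n_parallel=5, n_series=5):
    """Five series sets of five parallel stacks, assuming independent stacks and
    that one surviving stack keeps its parallel set functional (illustrative)."""
    r_set = 1.0 - (1.0 - r_stack) ** n_parallel   # set survives if any stack survives
    return r_set ** n_series                      # all sets must survive (series)

hours = np.array([10_000.0, 20_000.0, 40_000.0])  # illustrative operating times
r_stack = weibull_reliability(hours, shape=1.8, scale=30_000.0)
for t, r in zip(hours, series_of_parallel(r_stack)):
    print(f"t = {t:>8.0f} h  system reliability ~ {r:.4f}")
```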

  19. Sufficient conditions for positivity of non-Markovian master equations with Hermitian generators

    International Nuclear Information System (INIS)

    Wilkie, Joshua; Wong Yinmei

    2009-01-01

    We use basic physical motivations to develop sufficient conditions for positive semidefiniteness of the reduced density matrix for generalized non-Markovian integrodifferential Lindblad-Kossakowski master equations with Hermitian generators. We show that it is sufficient for the memory function to be the Fourier transform of a real positive symmetric frequency density function with certain properties. These requirements are physically motivated, and are more general and more easily checked than previously stated sufficient conditions. We also explore the decoherence dynamics numerically for some simple models using the Hadamard representation of the propagator. We show that the sufficient conditions are not necessary conditions. We also show that models exist in which the long time limit is in part determined by non-Markovian effects
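
    Written out, the sufficient condition summarized in this abstract takes roughly the following form (a sketch in generic notation, not the authors' exact equations): the memory kernel must be expressible as the Fourier transform of a real, non-negative, symmetric frequency density,

```latex
k(t - t') = \int_{-\infty}^{\infty} \rho(\omega)\, e^{-i\omega (t - t')}\, \mathrm{d}\omega ,
\qquad \rho(\omega) \in \mathbb{R}, \quad \rho(\omega) \ge 0, \quad \rho(-\omega) = \rho(\omega),
```

    which makes the kernel real and of positive type; the additional technical properties the authors impose on the density are not reproduced here.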

  20. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety...... and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic...... approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very...

  1. Big data analytics for the Future Circular Collider reliability and availability studies

    Science.gov (United States)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  2. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2015-01-01

    Dynamic reliability measures reliability of an engineered system considering time-variant operation condition and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. The DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of dynamic reliability estimation while using the MCS approach based on developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once the user defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process, thus substantially improving the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy
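
    The core loop the abstract describes, fitting a Gaussian-process surrogate to the system response and then running Monte Carlo on the surrogate, can be illustrated in a few lines. The limit-state function, the training design and the use of the predictive standard deviation below are simplified placeholders, not the DLAS algorithm itself.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def limit_state(x):
    """Illustrative response function: failure when g(x) < 0."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

# Small initial design (random here; DLAS would enrich it adaptively in two loops)
X_train = rng.uniform(-3.0, 3.0, size=(30, 2))
y_train = limit_state(X_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=1.0),
                              normalize_y=True).fit(X_train, y_train)

# Monte Carlo on the surrogate: standard-normal inputs, failure probability estimate
X_mc = rng.standard_normal((100_000, 2))
g_hat, g_std = gp.predict(X_mc, return_std=True)
print(f"estimated failure probability ~ {np.mean(g_hat < 0.0):.4f}")
# g_std could drive an enrichment criterion (add training points where the surrogate
# is uncertain near g = 0), which is the spirit of the two adaptive sampling loops.
```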

  3. Model of load balancing using reliable algorithm with multi-agent system

    Science.gov (United States)

    Afriansyah, M. F.; Somantri, M.; Riyadi, M. A.

    2017-04-01

    Massive technology development is linear with the growth of internet users, which increases network traffic activity. It also increases the load of the system. The usage of a reliable algorithm and mobile agents in distributed load balancing is a viable solution to handle the load issue on a large-scale system. A mobile agent works to collect resource information and can migrate according to a given task. We propose a reliable load balancing algorithm using least time first byte (LFB) combined with information from the mobile agent. In the system overview, the methodology consisted of defining the identification system, specification requirements, network topology and design of the system infrastructure. The simulation of the system used 1800 requests over 10 s from the user to the server, and the data were taken for analysis. The software simulation was based on Apache JMeter, observing the response time and reliability of each server and then comparing it with the existing method. Results of the performed simulation show that the LFB method with a mobile agent can perform load balancing efficiently across all backend servers without bottlenecks, with a low risk of server overload, and reliably.
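
    The least-time-first-byte idea can be illustrated with a small dispatcher that sends each request to the backend with the lowest recently observed time-to-first-byte; the server names, the smoothing constant and the simulated timings below are hypothetical and stand in for the resource information a mobile agent would report.

```python
import random
from collections import defaultdict

class LFBBalancer:
    """Toy least-time-first-byte dispatcher: pick the backend with the lowest
    smoothed time-to-first-byte (TTFB) estimate."""

    def __init__(self, backends):
        self.backends = list(backends)
        self.ttfb = defaultdict(float)            # smoothed TTFB per backend (seconds)

    def choose(self):
        return min(self.backends, key=lambda b: self.ttfb[b])

    def record(self, backend, observed_ttfb, alpha=0.3):
        # Exponentially weighted moving average keeps the estimate responsive
        self.ttfb[backend] = alpha * observed_ttfb + (1 - alpha) * self.ttfb[backend]

balancer = LFBBalancer(["srv-a", "srv-b", "srv-c"])
for _ in range(1800):                              # mirrors the 1800-request simulation
    target = balancer.choose()
    simulated_ttfb = random.uniform(0.005, 0.050)  # stand-in for a measured TTFB
    balancer.record(target, simulated_ttfb)
print(dict(balancer.ttfb))
```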

  4. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm as a non-trivial extension of the Cloud is considered and the reliability of the networks of smart devices are discussed. Combining the reliability...... requirements of grid and cloud paradigms with the reliability requirements of networks of sensor and actuators it follows that designing a reliable Fog computing platform is feasible....

  5. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  6. MRI of the small bowel: can sufficient bowel distension be achieved with small volumes of oral contrast?

    International Nuclear Information System (INIS)

    Kinner, Sonja; Kuehle, Christiane A.; Ladd, Susanne C.; Barkhausen, Joerg; Herbig, Sebastian; Haag, Sebastian; Lauenstein, Thomas C.

    2008-01-01

    Sufficient luminal distension is mandatory for small bowel imaging. However, patients often are unable to ingest volumes of currently applied oral contrast compounds. The aim of this study was to evaluate if administration of low doses of an oral contrast agent with high-osmolarity leads to sufficient and diagnostic bowel distension. Six healthy volunteers ingested at different occasions 150, 300 and 450 ml of a commercially available oral contrast agent (Banana Smoothie Readi-Cat, E-Z-EM; 194 mOsmol/l). Two-dimensional TrueFISP data sets were acquired in 5-min intervals up to 45 min after contrast ingestion. Small bowel distension was quantified using a visual five-grade ranking (5 = very good distension, 1 = collapsed bowel). Results were statistically compared using a Wilcoxon rank test. Ingestion of 450 ml and 300 ml resulted in a significantly better distension than 150 ml. The all-over average distension value for 450 ml amounted to 3.4 (300 ml: 3.0, 150 ml: 2.3) and diagnostic bowel distension could be found throughout the small intestine. Even 45 min after ingestion of 450 ml the jejunum and ileum could be reliably analyzed. Small bowel imaging with low doses of contrast leads to diagnostic distension values in healthy subjects when a high-osmolarity substance is applied. These findings may help to further refine small bowel MRI techniques, but need to be confirmed in patients with small bowel disorders. (orig.)

  7. Alternative approach to automated management of load flow in engineering networks considering functional reliability

    Directory of Open Access Journals (Sweden)

    Ирина Александровна Гавриленко

    2016-02-01

    Full Text Available This article proposes an approach to the automated management of load flow in engineering networks that takes functional reliability into account. Improvement of the concept of operational and strategic management of load flow in engineering networks is considered. The problem addressed by the thesis research is stated, namely the development of an information technology for the exact calculation of the functional reliability of the network, or of the risk of short delivery of the purpose-oriented product to consumers

  8. Reliability of dc power supplies in nuclear power plant application

    International Nuclear Information System (INIS)

    Eisenhut, D.G.

    1978-01-01

    In June 1977 the reliability of dc power supplies at nuclear power facilities was questioned. It was postulated that a sudden gross failure of the redundant dc power supplies might occur during normal plant operation, and that this could lead to insufficient shutdown cooling of the reactor core. It was further suggested that this potential for insufficient cooling is great enough to warrant consideration of prompt remedies. The work described herein was part of the NRC staff's efforts aimed towards putting the performance of dc power supplies in proper perspective and was mainly directed towards the particular concern raised at that time. While the staff did not attempt to perform a systematic study of overall dc power supply reliability including all possible failure modes for such supplies, the work summarized herein describes how a probabilistic approach was used to supplement our more usual deterministic approach to reactor safety. Our evaluation concluded that the likelihood of dc power supply failures leading to insufficient shutdown cooling of the reactor core is sufficiently small as to not require any immediate action

  9. Multi-state reliability for pump group in system based on UGF and semi-Markov process

    International Nuclear Information System (INIS)

    Shang Yanlong; Cai Qi; Zhao Xinwen; Chen Ling

    2012-01-01

    In this paper, the multi-state reliability value of a pump group in a nuclear power system is obtained by a combination of the universal generating function (UGF) method and a semi-Markov process. A UGF arithmetic model of multi-state system reliability is studied, and the performance-state probability expression of a multi-state component is derived using semi-Markov theory. A quantitative model is defined to express the performance rate of the system and components. Availability results obtained by the multi-state and binary-state analysis methods are compared under the condition of whether the performance rate can satisfy the demanded value, and the mean value of the system's instantaneous output performance is also obtained. It is shown that this combination method is effective and feasible, that it can quantify the effect of partial failures on the system reliability, and that the multi-state system reliability obtained by this method reveals the conservatism of the reliability value obtained by the binary-state reliability analysis method. (authors)
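
    The universal generating function step can be sketched directly: each pump is a polynomial in z whose exponents are performance rates and whose coefficients are state probabilities, and a parallel group is composed by summing performances and multiplying probabilities. The three-state pump below and the demand level are illustrative, not the paper's data.

```python
from itertools import product

def compose_parallel(*ugfs):
    """Compose component UGFs for a parallel group: performances add,
    probabilities multiply. Each UGF is a dict {performance_rate: probability}."""
    result = {}
    for combo in product(*(u.items() for u in ugfs)):
        perf = sum(g for g, _ in combo)
        prob = 1.0
        for _, p in combo:
            prob *= p
        result[perf] = result.get(perf, 0.0) + prob
    return result

def availability(ugf, demand):
    """Probability that the total performance meets or exceeds the demand."""
    return sum(p for g, p in ugf.items() if g >= demand)

# Illustrative pump: full flow with prob 0.95, degraded (50%) 0.04, failed 0.01
pump = {1.0: 0.95, 0.5: 0.04, 0.0: 0.01}
group = compose_parallel(pump, pump, pump)        # three identical pumps in parallel
print(f"P(flow >= 2.0 units) = {availability(group, 2.0):.4f}")
```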

  10. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    spectroscopic redshift measurements. This newly-defined method is very promising for next-generation large spectroscopic surveys from the ground and in space, such as Euclid and WFIRST. A table of the reclassified VVDS redshifts and reliability is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/611/A53

  11. Analyzing the reliability of shuffle-exchange networks using reliability block diagrams

    International Nuclear Information System (INIS)

    Bistouni, Fathollah; Jahanshahi, Mohsen

    2014-01-01

    Supercomputers and multi-processor systems are comprised of thousands of processors that need to communicate in an efficient way. One reasonable solution would be the utilization of multistage interconnection networks (MINs), where the challenge is to analyze the reliability of such networks. One of the methods to increase the reliability and fault-tolerance of MINs is the use of various switching stages. Therefore, recently, the reliability of one of the most common MINs, namely the shuffle-exchange network (SEN), has been evaluated through investigation of the impact of increasing the number of switching stages. It was concluded that the reliability of SEN with one additional stage (SEN+) is better than that of SEN or SEN with two additional stages (SEN+2), and that the reliability of SEN is better than that of SEN+2. Here we re-evaluate the reliability of these networks, where the results of the terminal, broadcast, and network reliability analysis demonstrate that SEN+ and SEN+2 consistently outperform SEN and are very alike in terms of reliability. - Highlights: • The impact of increasing the number of stages on reliability of MINs is investigated. • The RBD method, as an accurate method, is used for the reliability analysis of MINs. • Complex series–parallel RBDs are used to determine the reliability of the MINs. • All measures of reliability (i.e. terminal, broadcast, and network reliability) are analyzed. • All reliability equations are calculated for different sizes N×N
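
    Comparisons like this are typically carried out with a series/parallel reliability block diagram evaluator; a compact recursive sketch follows. The block structures and switching-element reliabilities are illustrative and are not the SEN, SEN+ or SEN+2 expressions derived in the paper.

```python
def rbd(block, r):
    """Evaluate a nested series/parallel reliability block diagram.

    block is 'x' (a basic switching element with reliability r),
    ('series', [...]) or ('parallel', [...]). Independence is assumed."""
    if block == "x":
        return r
    kind, children = block
    values = [rbd(child, r) for child in children]
    if kind == "series":
        out = 1.0
        for v in values:
            out *= v
        return out
    if kind == "parallel":
        out = 1.0
        for v in values:
            out *= 1.0 - v
        return 1.0 - out
    raise ValueError(f"unknown block type: {kind}")

# Illustrative terminal-path structures: one 3-stage path vs. two redundant paths
single_path = ("series", ["x", "x", "x"])
redundant_paths = ("parallel", [single_path, single_path])
for r in (0.90, 0.99):
    print(f"r = {r}: single path {rbd(single_path, r):.4f}, "
          f"redundant {rbd(redundant_paths, r):.4f}")
```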

  12. The birth satisfaction scale: Turkish adaptation, validation and reliability study

    Science.gov (United States)

    Cetin, Fatma Cosar; Sezer, Ayse; Merih, Yeliz Dogan

    2015-01-01

    OBJECTIVE: The objective of this study is to investigate the validity and the reliability of the Birth Satisfaction Scale (BSS) and to adapt it into the Turkish language. This scale is used for measuring maternal satisfaction with birth in order to evaluate women’s birth perceptions. METHODS: The study included 150 women who attended the inpatient postpartum clinic. The participants filled in an information form and the BSS questionnaire forms. The properties of the scale were tested by conducting reliability and validation analyses. RESULTS: The BSS comprises 30 Likert-type questions. It was developed by Hollins Martin and Fleming. Total scale scores ranged between 30–150 points. Higher scores on the scale indicate greater birth satisfaction. Three overarching themes were identified in the scale: service provision (home assessment, birth environment, support, relationships with health care professionals); personal attributes (ability to cope during labour, feeling in control, childbirth preparation, relationship with baby); and stress experienced during labour (distress, obstetric injuries, receiving sufficient medical care, obstetric intervention, pain, prolonged labour and baby’s health). Cronbach’s alpha coefficient was 0.62. CONCLUSION: According to the present study, the BSS comprises 30 Likert-type questions and evaluates women’s birth perceptions. The Turkish version of the BSS has been proven to be a valid and reliable scale. PMID:28058355
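
    The reported Cronbach's alpha of 0.62 comes from the standard ratio of item variances to total-score variance; a minimal sketch on a hypothetical item-response matrix (not the study's data) is shown below.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert responses (rows = respondents, columns = items)
responses = np.array([
    [4, 5, 3, 4],
    [3, 4, 4, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```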

  13. Reliability of Bluetooth Technology for Travel Time Estimation

    DEFF Research Database (Denmark)

    Araghi, Bahar Namaki; Olesen, Jonas Hammershøj; Krishnan, Rajesh

    2015-01-01

    . However, their corresponding impacts on accuracy and reliability of estimated travel time have not been evaluated. In this study, a controlled field experiment is conducted to collect both Bluetooth and GPS data for 1000 trips to be used as the basis for evaluation. Data obtained by GPS logger is used...... to calculate actual travel time, referred to as ground truth, and to geo-code the Bluetooth detection events. In this setting, reliability is defined as the percentage of devices captured per trip during the experiment. It is found that, on average, Bluetooth-enabled devices will be detected 80% of the time......-range antennae detect Bluetooth-enabled devices in a closer location to the sensor, thus providing a more accurate travel time estimate. However, the smaller the size of the detection zone, the lower the penetration rate, which could itself influence the accuracy of estimates. Therefore, there has to be a trade...

  14. The reliability of repeated TMS measures in older adults and in patients with subacute and chronic stroke

    Directory of Open Access Journals (Sweden)

    Heidi M. Schambra

    2015-09-01

    Full Text Available The reliability of transcranial magnetic stimulation (TMS) measures in healthy older adults and stroke patients has been insufficiently characterized. We determined whether common TMS measures could reliably evaluate change in individuals and in groups using the smallest detectable change (SDC), or could tell subjects apart using the intraclass correlation coefficient (ICC). We used a single-rater test-retest design in older healthy, subacute stroke, and chronic stroke subjects. At twice daily sessions on two consecutive days, we recorded resting motor threshold, test stimulus intensity, recruitment curves, short-interval intracortical inhibition and facilitation, and long-interval intracortical inhibition. Using variances estimated from a random effects model, we calculated the SDC and ICC for each TMS measure. For all TMS measures in all groups, SDCs for single subjects were large; only with modest group sizes did the SDCs become low. Thus, while these TMS measures cannot be reliably used as a biomarker to detect individual change, they can reliably detect change exceeding measurement noise in moderate-sized groups. For several of the TMS measures, ICCs were universally high, suggesting that they can reliably discriminate between subjects. Though most TMS measures have sufficient reliability in particular contexts, work establishing their validity, responsiveness, and clinical relevance is still needed.
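
    The two quantities used in this study are linked by commonly used formulas: SEM = SD * sqrt(1 - ICC), and the smallest detectable change for an individual is SDC = 1.96 * sqrt(2) * SEM (the sqrt(2) reflecting two measurement occasions), shrinking by sqrt(n) for a group of n subjects. A small sketch with hypothetical numbers:

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the between-subject SD and the ICC."""
    return sd * math.sqrt(1.0 - icc)

def sdc(sd, icc, n=1):
    """Smallest detectable change (95%) for an individual (n=1) or a group of n."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc) / math.sqrt(n)

# Hypothetical TMS measure: between-subject SD of 12 (arbitrary units), ICC = 0.85
print(f"SDC, individual : {sdc(12.0, 0.85):.1f}")
print(f"SDC, group n=20 : {sdc(12.0, 0.85, n=20):.1f}")
```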

  15. Sufficient vitamin K status combined with sufficient vitamin D status is associated with better lower extremity function: a prospective analysis of two knee osteoarthritis cohorts.

    Science.gov (United States)

    Shea, M Kyla; Loeser, Richard F; McAlindon, Timothy E; Houston, Denise K; Kritchevsky, Stephen B; Booth, Sarah L

    2017-10-17

    Vitamins K and D are important for the function of vitamin K-dependent proteins in joint tissues. It is unclear if these nutrients are mutually important to functional outcomes related to knee osteoarthritis (OA). We evaluated the association of vitamin K and D sufficiency with lower-extremity function in the Health, Aging Body Composition Knee OA Sub-study (Health ABC) and conducted a replication analysis in an independent cohort, the Osteoarthritis Initiative (OAI). In Health ABC (60% female, 75±3 years) baseline nutrient status was measured using circulating vitamin K and 25(OH)D. Lower-extremity function was assessed using the short physical performance battery (SPPB) and usual 20-meter gait speed. In the OAI (58% female, 61±9 years), baseline nutrient intake was estimated by food frequency questionnaire. Lower-extremity function was assessed using usual 20-meter gait speed and chair stand completion time. Multivariate mixed models were used to evaluate the association of vitamin K and D status and intake with lower-extremity function over 4-5 years. Health ABC participants with sufficient plasma vitamin K (≥1.0 nmol/L) and serum 25(OH)D (≥50 nmol/L) generally had better SPPB scores and faster usual gait speed over follow-up (p≤0.002). In the OAI, sufficient vitamin K and vitamin D intake combined was associated with overall faster usual gait speed and chair stand completion time over follow-up (p≤0.029). Sufficient vitamin K status combined with sufficient vitamin D status was associated with better lower-extremity function in two knee OA cohorts. These findings merit confirmation in vitamin K and D co-supplementation trials. This article is protected by copyright. All rights reserved.

  16. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability - what it is, why it is needed, and how it is achieved and measured - the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  17. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, reliability and failure rate (overview, reliability characteristics, chance failures, failure rates which change over time, failure modes, replacement), reliability in engineering design, reliability testing under failure-rate assumptions, plotting of reliability data, prediction of system reliability, conservation of systems, and failure topics such as failure relays and the analysis of system safety.

  18. EVALUATION OF FOOD SELF-SUFFICIENCY OF THE REPUBLIC OF TATARSTAN DISTRICTS

    Directory of Open Access Journals (Sweden)

    R. E. Mansurov

    2017-01-01

    Full Text Available The article presents the author's method for estimating the level of food self-sufficiency for the main types of food products in the regions of the Republic of Tatarstan. The proposed method is based on the use of analytical methods and mathematical comparative analysis to compose a final rating. The proposed method can be used in the system of regional management of the agro-industrial complex at the federal and local level. Relevance. The relevance of this work arises, on the one hand, from a hardening of foreign policy that may negatively impact national food security and, on the other hand, from the crisis of the domestic agricultural sector. All this requires the development of new approaches to regional agribusiness management. Goal. To develop a methodology to assess the level of food self-sufficiency, and to rate the level of self-sufficiency in the main types of foodstuff in the regions of the Republic of Tatarstan. Materials and Methods. Statistical data on the results of the agro-industrial complex of the Republic of Tatarstan for 2016 were used for the study. Analytical methods, including mathematical analysis and comparison, were used. Results. Based on the analysis of the present situation for ensuring food security in Russia, it was shown that it is now necessary to develop effective indicators identifying the level of self-sufficiency in basic food in the regions. It was also revealed that there are no such indicators in the system of regional agrarian and industrial complex management at present. As a result of analysing existing approaches, the author's method of rating the level of self-sufficiency of regions was offered. This method was applied to the example of the Republic of Tatarstan. Conclusions. The proposed method of rating estimation of self-sufficiency for basic foodstuffs can be used in the regional agro-industrial complex management system at the federal and local level. It can be used to rank areas in terms of their self-sufficiency for basic foodstuffs. This

  19. Measuring disability: a systematic review of the validity and reliability of the Global Activity Limitations Indicator (GALI).

    Science.gov (United States)

    Van Oyen, Herman; Bogaert, Petronille; Yokota, Renata T C; Berger, Nicolas

    2018-01-01

    GALI, the Global Activity Limitation Indicator, is a global survey instrument measuring participation restriction. GALI is the measure underlying the European indicator Healthy Life Years (HLY). GALI has substantial policy use within the EU and its Member States. The objective of the current paper is to bring together what is known from published manuscripts on the validity and the reliability of GALI. Following the PRISMA guidelines, two search strategies (PUBMED, Google Scholar) were combined to identify manuscripts published in English with publication date 2000 or beyond. Articles were classified as reliability studies or concurrent or predictive validity studies, in national or international populations. Four cross-sectional studies (of which 2 were international) studied how GALI relates to other health measures (concurrent validity). A dose-response effect by GALI severity level on the association with the other health status measures was observed in the national studies. The 2 international studies (SHARE, EHIS) concluded that the odds of reporting participation restriction were higher in subjects with self-reported or observed functional limitations. In SHARE, the size of the odds ratios (ORs) in the different countries was homogeneous, while in EHIS the size of the ORs varied more strongly. For the predictive validity, subjects were followed over time (4 studies, of which one was international). GALI proved, in both national and international data, to be a consistent predictor of future health outcomes, both in terms of mortality and health care expenditure. As predictors of mortality, the two distinct health concepts, self-rated health and GALI, acted independently of, and complementary to, each other. The one reliability study identified reported a sufficient reliability of GALI. GALI, as an inclusive one-question instrument, fits all the conceptual characteristics specified for a global measure of participation restriction. In none of the studies, included in the review, there was

  20. Recent Reliability Reporting Practices in "Psychological Assessment": Recognizing the People behind the Data

    Science.gov (United States)

    Green, Carlton E.; Chen, Cynthia E.; Helms, Janet E.; Henze, Kevin T.

    2011-01-01

    Helms, Henze, Sass, and Mifsud (2006) defined good practices for internal consistency reporting, interpretation, and analysis consistent with an alpha-as-data perspective. Their viewpoint (a) expands on previous arguments that reliability coefficients are group-level summary statistics of samples' responses rather than stable properties of scales…

  1. Computational intelligence methods for the efficient reliability analysis of complex flood defence structures

    NARCIS (Netherlands)

    Kingston, Greer B.; Rajabali Nejad, Mohammadreza; Gouldby, Ben P.; van Gelder, Pieter H.A.J.M.

    2011-01-01

    With the continual rise of sea levels and deterioration of flood defence structures over time, it is no longer appropriate to define a design level of flood protection, but rather, it is necessary to estimate the reliability of flood defences under varying and uncertain conditions. For complex

  2. Global Sufficient Optimality Conditions for a Special Cubic Minimization Problem

    Directory of Open Access Journals (Sweden)

    Xiaomei Zhang

    2012-01-01

    Full Text Available We present some sufficient global optimality conditions for a special cubic minimization problem with box constraints or binary constraints by extending the global subdifferential approach proposed by V. Jeyakumar et al. (2006). The present conditions generalize the results developed in the work of V. Jeyakumar et al. where a quadratic minimization problem with box constraints or binary constraints was considered. In addition, a special diagonal matrix is constructed, which is used to provide a convenient method for justifying the proposed sufficient conditions. Then, the reformulation of the sufficient conditions follows. It is worth noting that this reformulation is also applicable to the quadratic minimization problem with box or binary constraints considered in the works of V. Jeyakumar et al. (2006) and Y. Wang et al. (2010). Finally, some examples demonstrate that our optimality conditions can effectively be used for identifying global minimizers of certain nonconvex cubic minimization problems.

  3. Reliability Improved Cooperative Communication over Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhuangbin Chen

    2017-10-01

    Full Text Available With the development of smart devices and connection technologies, Wireless Sensor Networks (WSNs) are becoming increasingly intelligent. New or special functions can be obtained by receiving new versions of program codes to upgrade their software systems, forming the so-called smart Internet of Things (IoT). Due to the lossy property of wireless channels, data collection in WSNs still suffers from long delays, high energy consumption, and many retransmissions. Thanks to wireless software-defined networks (WSDNs), software in sensors can now be updated to help them transmit data cooperatively, thereby achieving more reliable communication. In this paper, a Reliability Improved Cooperative Communication (RICC) data collection scheme is proposed to improve the reliability of random-network-coding-based cooperative communications in multi-hop relay WSNs without reducing the network lifetime. In WSNs, sensors in different positions can have different numbers of packets to handle, resulting in unbalanced energy consumption across the network. In particular, nodes in non-hotspot areas have up to 90% of their original energy remaining when the network dies. To efficiently use the residual energy, in RICC, high data transmission power is adopted in non-hotspot areas to achieve higher reliability at the cost of large energy consumption, and relatively low transmission power is adopted in hotspot areas to maintain a long network lifetime. Therefore, high reliability and a long network lifetime can be obtained simultaneously. The simulation results show that compared with other schemes, RICC can reduce the end-to-end Message Fail delivering Ratio (MFR) by 59.4%–62.8% under the same lifetime with more balanced energy utilization.

  4. Do sufficient vitamin D levels at the end of summer in children and adolescents provide an assurance of vitamin D sufficiency at the end of winter? A cohort study.

    Science.gov (United States)

    Shakeri, Habibesadat; Pournaghi, Seyed-Javad; Hashemi, Javad; Mohammad-Zadeh, Mohammad; Akaberi, Arash

    2017-10-26

    The changes in serum 25-hydroxyvitamin D (25(OH)D) in adolescents from summer to winter and optimal serum vitamin D levels in the summer to ensure adequate vitamin D levels at the end of winter are currently unknown. This study was conducted to address this knowledge gap. The study was conducted as a cohort study. Sixty-eight participants aged 7-18 years and who had sufficient vitamin D levels at the end of the summer in 2011 were selected using stratified random sampling. Subsequently, the participants' vitamin D levels were measured at the end of the winter in 2012. A receiver operating characteristic (ROC) curve was used to determine optimal cutoff points for vitamin D at the end of the summer to predict sufficient vitamin D levels at the end of the winter. The results indicated that 89.7% of all the participants had a decrease in vitamin D levels from summer to winter: 14.7% of them were vitamin D-deficient, 36.8% had insufficient vitamin D concentrations and only 48.5% where able to maintain sufficient vitamin D. The optimal cutoff point to provide assurance of sufficient serum vitamin D at the end of the winter was 40 ng/mL at the end of the summer. Sex, age and vitamin D levels at the end of the summer were significant predictors of non-sufficient vitamin D at the end of the winter. In this age group, a dramatic reduction in vitamin D was observed over the follow-up period. Sufficient vitamin D at the end of the summer did not guarantee vitamin D sufficiency at the end of the winter. We found 40 ng/mL as an optimal cutoff point.
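
    The ROC-based choice of an end-of-summer cutoff can be illustrated with the Youden index, which selects the threshold maximizing sensitivity + specificity - 1. The simulated 25(OH)D values and winter outcomes below are synthetic and are not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)

# Synthetic summer 25(OH)D levels (ng/mL) and whether winter status stayed sufficient
summer_25ohd = np.concatenate([rng.normal(45, 8, 60), rng.normal(32, 8, 60)])
winter_sufficient = np.concatenate([np.ones(60, dtype=int), np.zeros(60, dtype=int)])

fpr, tpr, thresholds = roc_curve(winter_sufficient, summer_25ohd)
youden_j = tpr - fpr
best = int(np.argmax(youden_j))
print(f"optimal summer cutoff ~ {thresholds[best]:.1f} ng/mL "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```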

  5. Approach to defining de minimis, intermediate, and other classes of radioactive waste

    International Nuclear Information System (INIS)

    Cohen, J.J.; Smith, C.F.

    1986-01-01

    This study has developed a framework within which the complete spectrum of radioactive wastes can be defined. An approach has been developed that reflects both concerns in the framework of a radioactive waste classification system. In this approach, the class of any radioactive waste stream is dependent on its degree of radioactivity and its persistence. To be consistent with conventional systems, four waste classes are defined. In increasing order of concern due to radioactivity and/or duration, these are: 1. De Minimis Wastes: This waste has such a low content of radioactive material that it can be considered essentially nonradioactive and managed according to its nonradiological characteristics. 2. Low-Level Waste (LLW): Maximum concentrations for wastes considered to be in this class are prescribed in 10CFR61 as wastes that can be disposed of by shallow land burial methods. 3. Intermediate Level Waste (ILW): This category defines a class of waste whose content exceeds class C (10CFR61) levels, yet does not pose a sufficient hazard to justify management as a high-level waste (i.e., permanent isolation by deep geologic disposal). 4. High-Level Waste: HLW poses the most serious management problem and requires the most restrictive disposal methods. It is defined in NWPA as waste derived from the reprocessing of nuclear fuel and/or as highly radioactive wastes that require permanent isolation

  6. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
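
    The variance-reduction idea mentioned here, importance sampling, can be sketched for a toy component reliability problem: sample from a density shifted toward the failure region and reweight each sample by the likelihood ratio. The limit-state function and the shift are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
N = 50_000

def g(u):
    """Illustrative limit state in standard-normal space: failure when g(u) < 0."""
    return 4.0 - u                      # rare event: P(u > 4) is about 3e-5

# Crude Monte Carlo
u = rng.standard_normal(N)
pf_crude = np.mean(g(u) < 0)

# Importance sampling: draw from a normal shifted toward the failure region
shift = 4.0
v = rng.normal(loc=shift, scale=1.0, size=N)
weights = norm.pdf(v) / norm.pdf(v, loc=shift, scale=1.0)   # likelihood ratio
pf_is = np.mean((g(v) < 0) * weights)

print(f"crude MC           : {pf_crude:.2e}")
print(f"importance sampling: {pf_is:.2e}  (exact ~ {norm.sf(4.0):.2e})")
```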

  7. Generating Reliable and Affective Choreography through Engineering

    DEFF Research Database (Denmark)

    Jochum, Elizabeth

    How do we define graceful motion? Is grace exclusively the province of living, sentient beings, or is it possible to automate graceful motion? The GRACE project (Generating Reliable and Affective Choreography through Engineering) uses these two questions to investigate what makes movement graceful....... The principal objective is to measure the role of kinesics on human-robot interactions through the development of an automated performance. If it is possible to create an automated program using autonomous, artificial agents that emulate aspects of human gracefulness, then we can apply this understanding more...... widely to contemporary robotics research and human-robot interaction (HRI)....

  8. Problems Related to Use of Some Terms in System Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nadezda Hanusova

    2004-01-01

    Full Text Available The paper deals with problems of using dependability terms, defined in the current standard STN IEC 50 (191): International electrotechnical dictionary, chap. 191: Dependability and quality of service (1993), in technical systems dependability analysis. The goal of the paper is to find a relation between the terms introduced in the mentioned standard and used in technical systems dependability analysis, and the rules and practices used in system analysis within systems theory. A description of the part of the system life cycle related to reliability is used as a starting point. That part of the system life cycle is described by a state diagram, and the relevant reliability terms are assigned.

  9. Impact of multilateral congestion management on the reliability of power transactions

    International Nuclear Information System (INIS)

    Rodrigues, A.B.; Da Silva, M.G.

    2003-01-01

    The restructuring of the electricity industry has caused an increase in the number of transactions in the energy market. These transactions are defined by market forces without considering operational constraints of the transmission system. Consequently, there are transactions that cause congestion in the transmission network. The objective of this paper is to assess the impact of multilateral congestion management on the reliability of power transactions. This assessment is based on reliability indices such as expected power curtailments, curtailment probability, expected cost of congestion management and probability distributions of the total power curtailment. Test results with the IEEE RTS-1996 demonstrate that the multilateral management results in smaller curtailments and congestion costs than traditional bilateral management. (author)

  10. POLITICAL ECONOMIC ANALYSIS OF RICE SELF-SUFFICIENCY IN INDONESIA

    Directory of Open Access Journals (Sweden)

    Sri Nuryanti

    2018-01-01

    Full Text Available Rice self-sufficiency is an important programme in Indonesia. The programme has four major targets, i.e. increasing production, stabilizing prices and reserve stocks, and minimizing imports. For that purpose, the government gave a mandate to a parastatal, the National Logistic Agency (Bulog), to implement the rice policies. Some studies found that the involvement of such a parastatal could lead to government failure in budget allocation. The study aimed to estimate the social cost of the rice self-sufficiency programme based on the implementation of rice policy instruments by Bulog. The study used national annual data for the 2002–2014 period. The method used was the political preference function model, estimating economic rent and dead-weight loss using the price elasticities of rice demand and supply. The results showed that, in terms of percentage of the food security budget, the average economic rent reached IDR 6.37 trillion per annum (18.54%), while the average dead-weight loss amounted to IDR 0.90 trillion per annum (2.34%). This proved that the rice self-sufficiency programme, together with the involvement of Bulog, was economically inefficient. The government should provide better agricultural infrastructure, review government procurement prices, and stop the rice import policy to remedy the market failure.

  11. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    Science.gov (United States)

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    Abstract We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors, who aimed to assess the reliability of soft tissue model based implant surgical guides, reported that accuracy was evaluated using software.1 I found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, positive likelihood ratio, negative likelihood ratio, and the odds ratio (true results/false results, preferably more than 50) are among the measures used to evaluate the validity (accuracy) of a single test against a gold standard.2-4 It is not clear to which of the above-mentioned validity estimates the reported twenty-two accurate sites (46.81%) relate. Reliability (repeatability or reproducibility) is often assessed with statistical tests such as Pearson's r, least squares, and the paired t-test, all of which are common mistakes in reliability analysis.5 Briefly, the intraclass correlation coefficient (ICC) should be used for quantitative variables, and weighted kappa for qualitative variables, with caution because kappa has its own limitations. Regarding reliability or agreement, it should be noted that for computing the kappa value only concordant cells are considered, whereas discordant cells should also be taken into account in order to reach a correct estimate of agreement (weighted kappa).2-4 As a take-home message, for reliability and validity analysis, appropriate tests should be
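    The letter's recommendation of weighted kappa for qualitative agreement can be illustrated with a short sketch. The two raters' ordinal scores below are fabricated, and scikit-learn's cohen_kappa_score is used only as a convenient off-the-shelf implementation, not as the statistic the cited authors computed.

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical ordinal accuracy grades (0-3) given by two raters to the
    # same 12 surgical-guide sites; the data are invented for illustration.
    rater_a = np.array([0, 1, 2, 3, 2, 1, 0, 2, 3, 1, 2, 0])
    rater_b = np.array([0, 1, 2, 2, 2, 1, 1, 2, 3, 1, 3, 0])

    # Unweighted kappa only credits exact agreement (concordant cells);
    # quadratic weights also give partial credit to near-misses, which is the
    # "weighted kappa" the letter argues for with ordinal categories.
    kappa_plain = cohen_kappa_score(rater_a, rater_b)
    kappa_weighted = cohen_kappa_score(rater_a, rater_b, weights="quadratic")

    print(f"unweighted kappa:         {kappa_plain:.3f}")
    print(f"quadratic-weighted kappa: {kappa_weighted:.3f}")
    ```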

  12. Role of sufficient statistics in stochastic thermodynamics and its implication to sensory adaptation

    Science.gov (United States)

    Matsumoto, Takumi; Sagawa, Takahiro

    2018-04-01

    A sufficient statistic is a significant concept in statistics, meaning a random variable that contains all the information required for an inference task. We investigate the roles of sufficient statistics and related quantities in stochastic thermodynamics. Specifically, we prove that for general continuous-time bipartite networks, the existence of a sufficient statistic implies that an informational quantity called the sensory capacity attains its maximum. Since maximal sensory capacity imposes the constraint that the energetic efficiency cannot exceed one-half, our result implies that the existence of a sufficient statistic is inevitably accompanied by energetic dissipation. We also show that, in a particular parameter region of linear Langevin systems, there exists an optimal noise intensity at which the sensory capacity, the information-thermodynamic efficiency, and the total entropy production are optimized at the same time. We apply our general result to a model of sensory adaptation of E. coli and find that the sensory capacity is nearly maximal with experimentally realistic parameters.

  13. Pricing and crude oil self-sufficiency. [Canada

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-01

    How Canada should go about achieving crude oil self-sufficiency and who should develop Canada's petroleum resources are discussed. The degree of urgency and the level of commitment required by government, industry, and consumers are evaluated. What the price should be of Canadian crude oil and who should establish this price are also discussed. The economic aspects of investment, return, and taxation are also included. (DC)

  14. Reliable Portfolio Selection Problem in Fuzzy Environment: An mλ Measure Based Approach

    Directory of Open Access Journals (Sweden)

    Yuan Feng

    2017-04-01

    Full Text Available This paper investigates a fuzzy portfolio selection problem with guaranteed reliability, in which fuzzy variables are used to capture the uncertain returns of different securities. To handle the fuzziness in a mathematical way, a new expected value operator and variance of fuzzy variables are defined based on the mλ measure, a linear combination of the possibility measure and the necessity measure that balances pessimism and optimism in the decision-making process. To formulate the reliable portfolio selection problem, we adopt the expected total return and the standard variance of the total return to evaluate the reliability of the investment strategies, producing three risk-guaranteed reliable portfolio selection models. To solve the proposed models, an effective genetic algorithm is designed to generate an approximate optimal solution to the considered problem. Finally, numerical examples are given to show the performance of the proposed models and algorithm.
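    The record defines expectation through an mλ measure that linearly blends possibility and necessity. As a rough numerical sketch, not the authors' formulation, the code below evaluates such a measure for a triangular fuzzy return and integrates it to obtain an expected value; the triangular parameters and λ are invented.

    ```python
    import numpy as np

    def possibility_ge(r, a, b, c):
        """Pos{xi >= r} for a triangular fuzzy number (a, b, c) with peak b."""
        if r <= b:
            return 1.0
        if r <= c:
            return (c - r) / (c - b)
        return 0.0

    def necessity_ge(r, a, b, c):
        """Nec{xi >= r} = 1 - Pos{xi < r} for the same triangular fuzzy number."""
        if r <= a:
            return 1.0
        if r <= b:
            return (b - r) / (b - a)
        return 0.0

    def m_lambda_ge(r, a, b, c, lam):
        # Linear blend of optimism (possibility) and pessimism (necessity).
        return lam * possibility_ge(r, a, b, c) + (1.0 - lam) * necessity_ge(r, a, b, c)

    def expected_value(a, b, c, lam, grid=200_000):
        # For a non-negative fuzzy return, E[xi] = integral_0^inf m_lambda{xi >= r} dr.
        rs = np.linspace(0.0, c, grid)
        vals = np.array([m_lambda_ge(r, a, b, c, lam) for r in rs])
        dr = rs[1] - rs[0]
        return float(np.sum(vals) * dr)   # simple Riemann sum, accurate enough here

    # Hypothetical triangular return (in %) and a balanced lambda.
    a, b, c, lam = 1.0, 3.0, 6.0, 0.5
    print(f"m_lambda expected return: {expected_value(a, b, c, lam):.3f} %")
    # With lam = 0.5 this reduces to the credibility-based value (a + 2b + c)/4 = 3.25.
    ```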

  15. Verification, validation, and reliability of predictions

    International Nuclear Information System (INIS)

    Pigford, T.H.; Chambre, P.L.

    1987-04-01

    The objective of predicting long-term performance should be to make reliable determinations of whether the prediction falls within the criteria for acceptable performance. Establishing reliable predictions of long-term performance of a waste repository requires emphasis on valid theories to predict performance. The validation process must establish the validity of the theory, the parameters used in applying the theory, the arithmetic of calculations, and the interpretation of results; but validation of such performance predictions is not possible unless there are clear criteria for acceptable performance. Validation programs should emphasize identification of the substantive issues of prediction that need to be resolved. Examples relevant to waste package performance are predicting the life of waste containers and the time distribution of container failures, establishing the criteria for defining container failure, validating theories for time-dependent waste dissolution that depend on details of the repository environment, and determining the extent of congruent dissolution of radionuclides in the UO 2 matrix of spent fuel. Prediction and validation should go hand in hand and should be done and reviewed frequently, as essential tools for the programs to design and develop repositories. 29 refs

  16. Reliability and validity of the workplace social distance scale.

    Science.gov (United States)

    Yoshii, Hatsumi; Mandai, Nozomu; Saito, Hidemitsu; Akazawa, Kouhei

    2014-10-29

    Self-stigma, defined as a negative attitude toward oneself combined with the consciousness of being a target of prejudice, is a critical problem for psychiatric patients. Self-stigma studies among psychiatric patients have indicated that high stigma is predictive of detrimental effects such as delayed treatment and decreased social participation, so levels of self-stigma should be statistically evaluated. In this study, we developed the Workplace Social Distance Scale (WSDS), rephrasing the eight items of the Japanese version of the Social Distance Scale (SDSJ) to apply to the work setting in Japan. We examined the reliability and validity of the WSDS among 83 psychiatric patients. Factor analysis extracted three factors from the scale items: "work relations," "shallow relationships," and "employment." These factors are similar to the assessment factors of the SDSJ. Cronbach's alpha coefficient for the WSDS was 0.753. The split-half reliability of the WSDS was 0.801, indicating significant correlations. In addition, the WSDS was significantly correlated with the SDSJ. These findings suggest that the WSDS provides an approximation of self-stigma in the workplace among psychiatric patients. Our study assessed the reliability and validity of the WSDS for measuring self-stigma in Japan. Future studies should investigate the reliability and validity of the scale in other countries.
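    The record reports Cronbach's alpha and split-half reliability for the eight-item scale. The sketch below computes Cronbach's alpha from an item-score matrix with plain NumPy; the response matrix is fabricated and only illustrates the calculation, not the WSDS data.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

    # Fabricated responses of 6 participants to an 8-item Likert-type scale (1-5).
    scores = np.array([
        [4, 4, 5, 3, 4, 4, 5, 4],
        [2, 3, 2, 2, 3, 2, 2, 3],
        [5, 5, 4, 5, 5, 4, 5, 5],
        [3, 3, 3, 2, 3, 3, 3, 2],
        [1, 2, 1, 2, 1, 2, 1, 1],
        [4, 3, 4, 4, 4, 3, 4, 4],
    ])
    print(f"Cronbach's alpha: {cronbach_alpha(scores):.3f}")
    ```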

  17. Interactive reliability assessment using an integrated reliability data bank

    International Nuclear Information System (INIS)

    Allan, R.N.; Whitehead, A.M.

    1986-01-01

    The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc Random Access Files is described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)

  18. [Vitamin-antioxidant sufficiency of winter sports athletes].

    Science.gov (United States)

    Beketova, N A; Kosheleva, O V; Pereverzeva, O G; Vrzhesinskaia, O A; Kodentsova, V M; Solntseva, T N; Khanfer'ian, R A

    2013-01-01

    The sufficiency of 169 athletes (six disciplines: bullet shooting, biathlon, bobsleigh, skeleton, freestyle skiing, snowboarding) with vitamins A, E, C, B2, and beta-carotene has been investigated in April-September 2013. All athletes (102 juniors, mean age--18.5 +/- 0.3 years, and 67 adult high-performance athletes, mean age--26.8 +/- 0.7 years) were sufficiently supplied with vitamin A (70.7 +/- 1.7 mcg/dl). Mean blood serum retinol level was 15% higher the upper limit of the norm (80 mcg/dl) in biathletes while median reached 90.9 mcg/dl. Blood serum level of tocopherols (1.22 +/- 0.03 mg/dl), ascorbic acid (1.06 +/- 0.03 mg/dl), riboflavin (7.1 +/- 0.4 ng/ml), and beta-carotene (25.1 +/- 1.7 mcg/dl) was in within normal range, but the incidence of insufficiency of vitamins E, C, B2, and carotenoid among athletes varied in the range of 0-25, 0-17, 15-67 and 42-75%, respectively. 95% of adults and 80% of younger athletes were sufficiently provided with vitamin E. Vitamin E level in blood serum of juniors involved in skeleton and biathlon was lower by 51 and 72% (p antioxidants (beta-carotene and vitamins E and C). In other sports, the relative quantity of athletes sufficiently supplied with these essential nutrients did not exceed 56%. The quota of supplied with all antioxidants among bullet shooters (31.1%) and bobsledders (23.5%) was significantly (p antioxidant (mainly beta-carotene) was most often recorded among persons engaged in bullet shooting (67%). The simultaneous lack of all three antioxidants was found only in freestylers and bobsledders (about 5%). Decreased level of antioxidants in blood serum in 40% of athletes was combined with vitamin B2 deficiency. The data obtained suggest the necessity to optimize diet vitamin content of all athletes, taking into account the age and gender differences. Contrary to prevailing stereotypes the optimization must involve not only an increase in the consumption of vitamins (vitamins E, B group) and carotenoids, but

  19. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  20. Reliability estimates for selected sensors in fusion applications

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1996-09-01

    This report presents the results of a study to define several types of sensors in use and to give the qualitative reliability (failure modes) and quantitative reliability (average failure rates) for these types of process sensors. Temperature, pressure, flow, and level sensors are discussed for water coolant and for cryogenic coolants. The failure rates that have been found are useful for risk assessment and safety analysis. Repair times and calibration intervals are also given when found in the literature. All of these values can also be useful to plant operators and maintenance personnel. Designers may be able to make use of these data when planning systems. The final chapter of this report discusses failure rates for several types of personnel safety sensors, including ionizing radiation monitors, toxic and combustible gas detectors, humidity sensors, and magnetic field sensors. These data could be useful to industrial hygienists and other safety professionals when designing or auditing for personnel safety.
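    Failure rates and repair times of the kind tabulated in such studies are typically combined into availability and simple series/parallel system estimates. The sketch below shows that standard arithmetic under a constant (exponential) failure-rate assumption; the numbers are placeholders, not values from the report.

    ```python
    import math

    # Illustrative placeholders, not data from the report.
    failure_rate = 1.0e-5      # failures per hour for one hypothetical pressure sensor
    mttr_hours = 8.0           # mean time to repair
    mission_hours = 8760.0     # one year of operation

    mttf = 1.0 / failure_rate
    availability = mttf / (mttf + mttr_hours)

    # Reliability of a single sensor over the mission, R(t) = exp(-lambda * t).
    r_single = math.exp(-failure_rate * mission_hours)

    # Two redundant sensors in parallel (1-out-of-2 succeeds), assuming independence.
    r_parallel = 1.0 - (1.0 - r_single) ** 2

    # Two items in series (both must work), e.g. a sensor plus its transmitter.
    r_series = r_single ** 2

    print(f"steady-state availability:           {availability:.6f}")
    print(f"one-year reliability, single sensor: {r_single:.4f}")
    print(f"one-year reliability, 1-out-of-2:    {r_parallel:.4f}")
    print(f"one-year reliability, series pair:   {r_series:.4f}")
    ```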

  1. 76 FR 23222 - Electric Reliability Organization Interpretation of Transmission Operations Reliability

    Science.gov (United States)

    2011-04-26

    ....3d 1342 (DC Cir. 2009). \\5\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693... Reliability Standards for the Bulk-Power System. Action: FERC-725A. OMB Control No.: 1902-0244. Respondents...] Electric Reliability Organization Interpretation of Transmission Operations Reliability AGENCY: Federal...

  2. The concepts of leak before break and absolute reliability of NPP equipment and piping

    International Nuclear Information System (INIS)

    Getman, A.F.; Komarov, O.V.; Sokov, L.M.

    1997-01-01

    This paper describes the absolute reliability (AR) concept for ensuring safe operation of nuclear plant equipment and piping. The AR of a pipeline or component is defined as the level of reliability when the probability of an instantaneous double-ended break is near zero. AR analysis has been applied to Russian RBMK and VVER type reactors. It is proposed that analyses required for application of the leak before break concept should be included in AR implementation. The basic principles, methods, and approaches that provide the basis for implementing the AR concept are described

  3. The concepts of leak before break and absolute reliability of NPP equipment and piping

    Energy Technology Data Exchange (ETDEWEB)

    Getman, A.F.; Komarov, O.V.; Sokov, L.M. [and others

    1997-04-01

    This paper describes the absolute reliability (AR) concept for ensuring safe operation of nuclear plant equipment and piping. The AR of a pipeline or component is defined as the level of reliability when the probability of an instantaneous double-ended break is near zero. AR analysis has been applied to Russian RBMK and VVER type reactors. It is proposed that analyses required for application of the leak before break concept should be included in AR implementation. The basic principles, methods, and approaches that provide the basis for implementing the AR concept are described.

  4. Validity and reliability of the novel thyroid-specific quality of life questionnaire, ThyPRO

    DEFF Research Database (Denmark)

    Watt, Torquil; Hegedüs, Laszlo; Groenvold, Mogens

    2010-01-01

    Background Appropriate scale validity and internal consistency reliability have recently been documented for the new thyroid-specific quality of life (QoL) patient-reported outcome (PRO) measure for benign thyroid disorders, the ThyPRO. However, before clinical use, clinical validity and test-retest reliability should be evaluated. Aim To investigate clinical ('known-groups') validity and test-retest reliability of the Danish version of the ThyPRO. Methods For each of the 13 ThyPRO scales, we defined groups expected to have high versus low scores ('known-groups'). The clinical validity (known-groups validity) was evaluated by whether the ThyPRO scales could detect expected differences in a cross-sectional study of 907 thyroid patients. Test-retest reliability was evaluated by intra-class correlations of two responses to the ThyPRO 2 weeks apart in a subsample of 87 stable patients. Results On all 13...

  5. Reliability Constrained Priority Load Shedding for Aerospace Power System Automation

    Science.gov (United States)

    Momoh, James A.; Zhu, Jizhong; Kaddah, Sahar S.; Dolce, James L. (Technical Monitor)

    2000-01-01

    The need for improving load shedding on board the space station is one of the goals of aerospace power system automation. To accelerate the optimum load-shedding functions, several constraints must be involved. These constraints include the congestion margin determined by weighted probability contingency, a component/system reliability index, and generation rescheduling. The impacts of different faults and the indices for computing reliability were defined before optimization. The optimum load schedule is determined based on the priority, value, and location of loads. An optimization strategy capable of handling discrete decision making, such as Everett's method, is proposed. We extended Everett's method to handle the expected congestion margin and the reliability index as constraints. To make it effective for the real-time load dispatch process, a rule-based scheme is incorporated in the optimization method. It assists in selecting which feeder loads to shed, together with the location, value, and priority of each load, and a cost-benefit analysis of the load profile is included in the scheme. The scheme is tested on a benchmark NASA system consisting of generators, loads, and a network.
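    As a toy illustration of an Everett-style discrete optimization for load shedding, and not the scheme implemented for the NASA benchmark, the sketch below keeps the highest-value feeder loads under a capacity limit by sweeping a Lagrange multiplier. All load names, priorities and power values are invented.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Load:
        name: str
        power_kw: float   # power drawn if the load is kept on
        value: float      # priority-weighted value of keeping the load

    def everett_selection(loads, capacity_kw, lam_max=1000.0, iters=60):
        """Everett's generalized-multiplier idea: for a multiplier lam, keep load i
        iff value_i - lam * power_i >= 0, and bisect on lam until the kept set
        fits the available capacity."""
        lo, hi = 0.0, lam_max
        best = []
        for _ in range(iters):
            lam = 0.5 * (lo + hi)
            kept = [l for l in loads if l.value - lam * l.power_kw >= 0.0]
            if sum(l.power_kw for l in kept) <= capacity_kw:
                best = kept          # feasible: try a smaller multiplier (shed less)
                hi = lam
            else:
                lo = lam             # infeasible: shed more aggressively
        return best

    # Hypothetical feeder loads on a spacecraft bus.
    loads = [
        Load("life_support", 3.0, 100.0),
        Load("comms",        1.5,  40.0),
        Load("experiment_A", 2.0,  15.0),
        Load("experiment_B", 2.5,  10.0),
        Load("galley",       1.0,   8.0),
    ]
    kept = everett_selection(loads, capacity_kw=6.0)
    print("kept:", [l.name for l in kept])
    print("shed:", [l.name for l in loads if l not in kept])
    ```

    A production scheme would layer the rule base described in the record (load location, congestion margin, reliability index) on top of this kind of multiplier sweep.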

  6. 76 FR 23171 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Science.gov (United States)

    2011-04-26

    ... Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ] 31,242, order on reh'g...-Power System reliability may request an interpretation of a Reliability Standard.\\7\\ The ERO's standards... information in its reliability assessments. The Reliability Coordinator must monitor Bulk Electric System...

  7. The voice of the customer: consumers define the ideal battery charger.

    Science.gov (United States)

    Lane, J P; Usiak, D J; Stone, V I; Scherer, M J

    1997-01-01

    The Rehabilitation Engineering Research Center on Technology Evaluation and Transfer is exploring how the users of assistive technology devices define the ideal device. This work is called the Consumer Ideal Product program. The results show what device characteristics are most and least important, indicating where to place the priority on product features and functions from the consumer's perspective. The "voice of the customer" can be used (1) to define the ideal characteristics of a product, (2) to make trade-offs in product design and function improvements based on their relative importance to the consumer, (3) to compare the characteristics of existing products against the characteristics of the ideal product, or (4) to generate a product checklist for consumers to use when making a purchase decision. This paper presents the results of consumers' defining the ideal battery charger. Four focus groups generated the survey's content, then 100 experienced users rated 159 characteristics organized under 11 general evaluation criteria. The consumers placed the highest importance on characteristics from the general evaluation criteria of product reliability, effectiveness, and physical security/safety. The findings should help manufacturers and vendors improve their products and services and help professionals and consumers make informed choices.

  8. Reliability of structures of industrial installations. Theory and applications of probabilistic mechanics

    International Nuclear Information System (INIS)

    Procaccia, H.; Morilhat, P.; Carle, R.; Menjon, G.

    1996-01-01

    The management of the service life of mechanical components implies an evaluation of their risk of failure during use. To evaluate this risk the following methods are used: classical frequency statistics applied to experience feedback data concerning failures observed during operation of active parts (pumps, valves, exchangers, circuit breakers, etc.); the Bayesian approach in the case of scarce statistical data, when experts are needed to compensate for the lack of information; and the structural reliability approach when no data are available and a theoretical model of degradation must be used, in particular for passive structures (pressure vessels, pipes, tanks, etc.). The aim of this book is to describe the principles and applications of this third approach to industrial installations. Chapter 1 recalls the historical aspects of the probabilistic approach to the reliability of structures and the existing codes. Chapter 2 presents the level 1 deterministic method applied so far to the design of passive structures. The Cornell reliability index, already used in civil engineering codes, is defined in chapter 3. The Hasofer and Lind reliability index, a generalization of the Cornell index, is defined in chapter 4. Chapter 5 concerns the application of probabilistic approaches to optimization studies, with the introduction of the economic variables linked to the risk and to the possible actions to limit this risk (in-service inspection, maintenance, repair, etc.). Chapters 6 and 7 describe the Monte Carlo simulation and approximation methods for failure probability calculations, and recall the basis of fracture mechanics and the models of loading and degradation of industrial installations. Some applications are given in chapter 9, including the quantification of safety margins for a cracked pipe and the optimization of the in-service inspection policy of a steam generator. Chapter 10 raises the problem of the coupling between mechanical and reliability
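    For the Cornell index introduced in chapter 3, a minimal numerical illustration, with invented resistance and load statistics rather than figures from the book, is:

    ```python
    from math import sqrt
    from scipy.stats import norm

    # Hypothetical resistance R and load effect S, both treated as independent
    # normal variables (the setting in which the Cornell index is exact).
    mu_R, sigma_R = 320.0, 25.0   # e.g. MPa
    mu_S, sigma_S = 220.0, 30.0   # e.g. MPa

    # Cornell reliability index for the safety margin M = R - S.
    beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)

    # Corresponding failure probability P(M < 0) = Phi(-beta).
    p_failure = norm.cdf(-beta)

    print(f"Cornell reliability index beta = {beta:.2f}")
    print(f"failure probability Pf = {p_failure:.2e}")
    ```

    The Hasofer-Lind index of chapter 4 generalizes this by measuring the distance from the origin to the limit-state surface in standardized variable space, which removes the dependence on how the limit-state function is written.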

  9. Method for assessing the reliability of molecular diagnostics based on multiplexed SERS-coded nanoparticles.

    Directory of Open Access Journals (Sweden)

    Steven Y Leigh

    Full Text Available Surface-enhanced Raman scattering (SERS) nanoparticles have been engineered to generate unique fingerprint spectra and are potentially useful as bright contrast agents for molecular diagnostics. One promising strategy for biomedical diagnostics and imaging is to functionalize various particle types ("flavors"), each emitting a unique spectral signature, to target a large multiplexed panel of molecular biomarkers. While SERS particles emit narrow spectral features that allow them to be easily separable under ideal conditions, the presence of competing noise sources and background signals such as detector noise, laser background, and autofluorescence confounds the reliability of demultiplexing algorithms. Results obtained during time-constrained in vivo imaging experiments may not be reproducible or accurate. Therefore, our goal is to provide experimentalists with a metric that may be monitored to enforce a desired bound on accuracy within a user-defined confidence level. We have defined a spectral reliability index (SRI), based on the output of a direct classical least-squares (DCLS) demultiplexing routine, which provides a measure of the reliability of the computed nanoparticle concentrations and ratios. We present simulations and experiments to demonstrate the feasibility of this strategy, which can potentially be utilized for a range of instruments and biomedical applications involving multiplexed SERS nanoparticles.
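    Direct classical least-squares demultiplexing of the kind the record describes amounts to solving a linear system whose columns are the reference spectra of each nanoparticle flavor. The sketch below uses synthetic Gaussian-peak spectra rather than real SERS data and recovers the mixture weights with a non-negative least-squares fit.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)
    n_channels = 300
    shift = np.linspace(400, 1800, n_channels)   # Raman-shift axis, arbitrary units

    def peak(center, width):
        return np.exp(-0.5 * ((shift - center) / width) ** 2)

    # Synthetic reference spectra for three nanoparticle "flavors".
    references = np.column_stack([
        peak(620, 12) + 0.6 * peak(1080, 15),
        peak(740, 10) + 0.8 * peak(1350, 20),
        peak(950, 14) + 0.5 * peak(1590, 18),
    ])

    # A measured spectrum: hidden mixture weights plus background and noise.
    true_weights = np.array([0.7, 0.2, 0.5])
    measured = references @ true_weights + 0.02 * rng.standard_normal(n_channels) + 0.05

    # DCLS-style unmixing; a constant column absorbs the flat background term.
    design = np.column_stack([references, np.ones(n_channels)])
    weights, residual_norm = nnls(design, measured)

    print("recovered flavor weights:", np.round(weights[:3], 3))
    print("residual norm:", round(float(residual_norm), 3))
    ```

    A reliability index along the lines of the record's SRI would then be built from quantities such as this residual norm and the conditioning of the reference matrix, so that poorly separable measurements can be flagged during acquisition.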

  10. Energy self-sufficiency in Northampton, Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    The study is not an engineering analysis but begins the process of exploring the potential for conservation and local renewable-resource development in a specific community, Northampton, Massachusetts, with the social, institutional, and environmental factors in that community taken into account. Section I is an extensive executive summary of the full study, and Section II is a detailed examination of the potential for increased local energy self-sufficiency in Northampton, including current and future demand estimates, the possible role of conservation and renewable resources, and a discussion of the economic and social implications of alternative energy systems. (MOW)

  11. Reliability in sealing of canister for spent nuclear fuel

    International Nuclear Information System (INIS)

    Ronneteg, Ulf; Cederqvist, Lars; Ryden, Haakan; Oeberg, Tomas; Mueller, Christina

    2006-06-01

    The reliability of the system for sealing the canister and inspecting the weld that has been developed for the Encapsulation plant was investigated. In the investigation the occurrence of discontinuities that can be formed in the welds was determined both qualitatively and quantitatively. The probability that these discontinuities can be detected by nondestructive testing (NDT) was also studied. The friction stir welding (FSW) process was verified in several steps. The variables in the welding process that determine weld quality were identified during the development work. In order to establish the limits within which they can be allowed to vary, a screening experiment was performed where the different process settings were tested according to a given design. In the next step the optimal process setting was determined by means of a response surface experiment, whereby the sensitivity of the process to different variable changes was studied. Based on the optimal process setting, the process window was defined, i.e. the limits within which the welding variables must lie in order for the process to produce the desired result. Finally, the process was evaluated during a demonstration series of 20 sealing welds which were carried out under production-like conditions. Conditions for the formation of discontinuities in welding were investigated. The investigations show that the occurrence of discontinuities is dependent on the welding variables. Discontinuities that can arise were classified and described with respect to characteristics, occurrence, cause and preventive measures. To ensure that testing of the welds has been done with sufficient reliability, the probability of detection (POD) of discontinuities by NDT and the accuracy of size determination by NDT were determined. In the evaluation of the demonstration series, which comprised 20 welds, a statistical method based on the generalized extreme value distribution was fitted to the size estimate of the indications

  12. Reliability in sealing of canister for spent nuclear fuel

    Energy Technology Data Exchange (ETDEWEB)

    Ronneteg, Ulf [Bodycote Materials Testing AB, Nykoeping (Sweden); Cederqvist, Lars; Ryden, Haakan [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Oeberg, Tomas [Tomas Oeberg Konsult AB, Karlskrona (Sweden); Mueller, Christina [Federal Inst. for Materials Research and Testing, Berlin (Germany)

    2006-06-15

    The reliability of the system for sealing the canister and inspecting the weld that has been developed for the Encapsulation plant was investigated. In the investigation the occurrence of discontinuities that can be formed in the welds was determined both qualitatively and quantitatively. The probability that these discontinuities can be detected by nondestructive testing (NDT) was also studied. The friction stir welding (FSW) process was verified in several steps. The variables in the welding process that determine weld quality were identified during the development work. In order to establish the limits within which they can be allowed to vary, a screening experiment was performed where the different process settings were tested according to a given design. In the next step the optimal process setting was determined by means of a response surface experiment, whereby the sensitivity of the process to different variable changes was studied. Based on the optimal process setting, the process window was defined, i.e. the limits within which the welding variables must lie in order for the process to produce the desired result. Finally, the process was evaluated during a demonstration series of 20 sealing welds which were carried out under production-like conditions. Conditions for the formation of discontinuities in welding were investigated. The investigations show that the occurrence of discontinuities is dependent on the welding variables. Discontinuities that can arise were classified and described with respect to characteristics, occurrence, cause and preventive measures. To ensure that testing of the welds has been done with sufficient reliability, the probability of detection (POD) of discontinuities by NDT and the accuracy of size determination by NDT were determined. In the evaluation of the demonstration series, which comprised 20 welds, a statistical method based on the generalized extreme value distribution was fitted to the size estimate of the indications

  13. On the Simulation-Based Reliability of Complex Emergency Logistics Networks in Post-Accident Rescues.

    Science.gov (United States)

    Wang, Wei; Huang, Li; Liang, Xuedong

    2018-01-06

    This paper investigates the reliability of complex emergency logistics networks, as reliability is crucial to reducing environmental and public health losses in post-accident emergency rescues. Such networks' statistical characteristics are analyzed first. After the connected reliability and evaluation indices for complex emergency logistics networks are effectively defined, simulation analyses of network reliability are conducted under two different attack modes using a particular emergency logistics network as an example. The simulation analyses obtain the varying trends in emergency supply times and the ratio of effective nodes and validates the effects of network characteristics and different types of attacks on network reliability. The results demonstrate that this emergency logistics network is both a small-world and a scale-free network. When facing random attacks, the emergency logistics network steadily changes, whereas it is very fragile when facing selective attacks. Therefore, special attention should be paid to the protection of supply nodes and nodes with high connectivity. The simulation method provides a new tool for studying emergency logistics networks and a reference for similar studies.
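    The random-versus-selective attack comparison described here can be reproduced in miniature with networkx on a synthetic scale-free graph; the graph size and removal fractions below are arbitrary stand-ins for the paper's emergency logistics network.

    ```python
    import random
    import networkx as nx

    def largest_component_fraction(graph):
        if graph.number_of_nodes() == 0:
            return 0.0
        giant = max(nx.connected_components(graph), key=len)
        return len(giant) / graph.number_of_nodes()

    def attack(graph, fraction, selective):
        g = graph.copy()
        n_remove = int(fraction * g.number_of_nodes())
        if selective:
            # Selective attack: remove the most connected (hub) nodes first.
            by_degree = sorted(g.degree, key=lambda kv: kv[1], reverse=True)
            nodes = [node for node, _ in by_degree[:n_remove]]
        else:
            # Random attack: remove uniformly chosen nodes.
            nodes = random.sample(list(g.nodes), n_remove)
        g.remove_nodes_from(nodes)
        return largest_component_fraction(g)

    random.seed(0)
    network = nx.barabasi_albert_graph(n=500, m=2, seed=0)   # scale-free stand-in

    for frac in (0.05, 0.10, 0.20):
        print(f"remove {frac:.0%}: "
              f"random -> {attack(network, frac, selective=False):.2f}, "
              f"selective -> {attack(network, frac, selective=True):.2f} "
              f"(fraction of surviving nodes in the giant component)")
    ```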

  14. On the Simulation-Based Reliability of Complex Emergency Logistics Networks in Post-Accident Rescues

    Science.gov (United States)

    Wang, Wei; Huang, Li; Liang, Xuedong

    2018-01-01

    This paper investigates the reliability of complex emergency logistics networks, as reliability is crucial to reducing environmental and public health losses in post-accident emergency rescues. Such networks' statistical characteristics are analyzed first. After the connected reliability and evaluation indices for complex emergency logistics networks are defined, simulation analyses of network reliability are conducted under two different attack modes, using a particular emergency logistics network as an example. The simulation analyses obtain the varying trends in emergency supply times and in the ratio of effective nodes, and validate the effects of network characteristics and different types of attacks on network reliability. The results demonstrate that this emergency logistics network is both a small-world and a scale-free network. When facing random attacks, the emergency logistics network degrades steadily, whereas it is very fragile when facing selective attacks. Therefore, special attention should be paid to the protection of supply nodes and nodes with high connectivity. The simulation method provides a new tool for studying emergency logistics networks and a reference for similar studies. PMID:29316614

  15. Energy Self-Sufficient Island

    International Nuclear Information System (INIS)

    Bratic, S.; Krajacic, G.; Duic, N.; Cotar, A.; Jardas, D.

    2011-01-01

    In order to analyze an energy self-sufficient island, the example of a smaller island, connected to the power system of a bigger island by an undersea cable, was taken. A 10/0.4 kV substation is situated on the island and for the moment it provides enough electricity via the medium voltage line. It is assumed that the island is situated in the northern part of the Adriatic Sea. The most important problem on the island is the population decline that has been going on for a significant number of years; therefore, the standard of living needs to be improved and economic development needs to be encouraged immediately. Local authorities aim to stimulate sustainable development on the island through different projects, to breathe new life into the island, open new jobs and attract new people to live there. Because of the planned development and increase of the population, the energy projects planned as support to sustainable development, and the later achievement of energy self-sufficiency, are described in this paper. The application of the Rewisland methodology is therefore described, taking into account three possible scenarios of energy development. Each scenario is calculated up to the year 2030, assuming 100% use of renewable energy sources in 2030. Scenario PTV, PP, EE: this scenario includes installation of solar photovoltaic modules and solar thermal collectors on building roofs, as well as implementation of energy efficiency measures on the island (replacement of street light bulbs with LED lighting, replacement of old windows and doors on the houses, and installation of thermal insulation). Scenario PV island: this scenario, similarly to the previous one, includes installation of solar photovoltaic modules and solar thermal collectors on residential buildings, as well as a 2 MW photovoltaic power plant and a 'Green Hotel', a building that satisfies all of its energy needs completely from renewable energy sources

  16. Surviving the Lead Reliability Engineer Role in High Unit Value Projects

    Science.gov (United States)

    Perez, Reinaldo J.

    2011-01-01

    A project with a very high unit value within a company is defined as a project where (a) the project constitutes a one-of-a-kind (or two-of-a-kind) national-asset type of project, (b) the cost is very large, and (c) a mission failure would be a very public event that would hurt the company's image. The Lead Reliability Engineer in a high-visibility project is by default involved in all phases of the project, from conceptual design to manufacture and testing. This paper explores a series of lessons learned over a period of ten years of practical industrial experience by a Lead Reliability Engineer, and expands on the concepts outlined by these lessons via examples. The lessons learned are applicable to all industries.

  17. Intellectual Freedom and Economic Sufficiency as Educational Entitlements.

    Science.gov (United States)

    Morse, Jane Fowler

    2001-01-01

    Using the theories of John Stuart Mill and Karl Marx, this article supports the educational entitlements of intellectual freedom and economic sufficiency. Explores these issues in reference to their implications for teaching, the teaching profession and its training. Concludes that ideas cannot be controlled by the interests of the dominant class.…

  18. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  19. Hippocampal MRI volumetry at 3 Tesla: reliability and practical guidance.

    Science.gov (United States)

    Jeukens, Cécile R L P N; Vlooswijk, Mariëlle C G; Majoie, H J Marian; de Krom, Marc C T F M; Aldenkamp, Albert P; Hofman, Paul A M; Jansen, Jacobus F A; Backes, Walter H

    2009-09-01

    Although volumetry of the hippocampus is considered to be an established technique, protocols reported in literature are not described in great detail. This article provides a complete and detailed protocol for hippocampal volumetry applicable to T1-weighted magnetic resonance (MR) images acquired at 3 Tesla, which has become the standard for structural brain research. The protocol encompasses T1-weighted image acquisition at 3 Tesla, anatomic guidelines for manual hippocampus delineation, requirements of delineation software, reliability measures, and criteria to assess and ensure sufficient reliability. Moreover, the validity of the correction for total intracranial volume size was critically assessed. The protocol was applied by 2 readers to the MR images of 36 patients with cryptogenic localization-related epilepsy, 4 patients with unilateral hippocampal sclerosis, and 20 healthy control subjects. The uncorrected hippocampal volumes were 2923 +/- 500 mm3 (mean +/- SD) (left) and 3120 +/- 416 mm3 (right) for the patient group and 3185 +/- 411 mm3 (left) and 3302 +/- 411 mm3 (right) for the healthy control group. The volume of the 4 pathologic hippocampi of the patients with unilateral hippocampal sclerosis was 2980 +/- 422 mm3. The inter-reader reliability values were determined: intraclass-correlation-coefficient (ICC) = 0.87 (left) and 0.86 (right), percentage volume difference (VD) = 7.0 +/- 4.7% (left) and 6.0 +/- 3.8% (right), and overlap ratio (OR) = 0.82 +/- 0.04 (left) and 0.82 +/- 0.03 (right). The positive Pearson correlation between hippocampal volume and total intracranial volume was found to be low: r = 0.48 (P = 0.03, left) and r = 0.62 (P = 0.004, right) and did not significantly reduce the volumetric variances, showing the limited benefit of the brain size correction. A protocol was described to determine hippocampal volumes based on 3 Tesla MR images with high inter-reader reliability. Although the reliability of hippocampal volumetry at 3 Tesla

  20. Value Relevance of the Fair Value Hierarchy of IFRS 7 in Europe - How reliable are mark-to-model Fair Values ?

    OpenAIRE

    Bosch, Patrick

    2012-01-01

    According to IFRS 7, banks have to disclose the inputs used in measuring the fair value of financial instruments. For this purpose the standard defines a three-level measurement hierarchy. The reliability of fair values is expected to decrease with decreasing hierarchy level due to the lower quality of the input factors. Using a value relevance research setting, I find that investors perceive the reliability of level 3 fair values as significantly lower than the reliability of level 1 fair va...

  1. 78 FR 41339 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Science.gov (United States)

    2013-07-10

    ...] Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards AGENCY: Federal... Reliability Standards identified by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. FOR FURTHER INFORMATION CONTACT: Kevin Ryan (Legal Information...

  2. 76 FR 58101 - Electric Reliability Organization Interpretation of Transmission Operations Reliability Standard

    Science.gov (United States)

    2011-09-20

    ....C. Cir. 2009). \\4\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693, FERC... for maintaining real and reactive power balance. \\14\\ Electric Reliability Organization Interpretation...; Order No. 753] Electric Reliability Organization Interpretation of Transmission Operations Reliability...

  3. Acquisition of reliable vacuum hardware for large accelerator systems

    International Nuclear Information System (INIS)

    Welch, K.M.

    1995-01-01

    Credible and effective communications prove to be the major challenge in the acquisition of reliable vacuum hardware. Technical competence is necessary but not sufficient. The authors must effectively communicate with management, sponsoring agencies, project organizations, service groups, staff, and vendors. Most of Deming's 14 quality assurance tenets relate to creating an enlightened environment of good communications. All projects progress along six distinct, closely coupled, dynamic phases. All six phases are in a state of perpetual change. These phases and their elements are discussed, with emphasis given to the acquisition phase and its related vocabulary. Large projects require great clarity and rigor, as poor communications can be costly. For rigor to be cost effective, it cannot be pedantic. Clarity thrives best in a low-risk, team environment.

  4. Redox self-sufficient whole cell biotransformation for amination of alcohols.

    Science.gov (United States)

    Klatte, Stephanie; Wendisch, Volker F

    2014-10-15

    Whole cell biotransformation is an upcoming tool to replace common chemical routes for functionalization and modification of desired molecules. In the approach presented here the production of various non-natural (di)amines was realized using the designed whole cell biocatalyst Escherichia coli W3110/pTrc99A-ald-adh-ta with plasmid-borne overexpression of genes for an l-alanine dehydrogenase, an alcohol dehydrogenase and a transaminase. Cascading alcohol oxidation with l-alanine dependent transamination and l-alanine dehydrogenase allowed for redox self-sufficient conversion of alcohols to the corresponding amines. The supplementation of the corresponding (di)alcohol precursors as well as amino group donor l-alanine and ammonium chloride were sufficient for amination and redox cofactor recycling in a resting buffer system. The addition of the transaminase cofactor pyridoxal-phosphate and the alcohol dehydrogenase cofactor NAD(+) was not necessary to obtain complete conversion. Secondary and cyclic alcohols, for example, 2-hexanol and cyclohexanol were not aminated. However, efficient redox self-sufficient amination of aliphatic and aromatic (di)alcohols in vivo was achieved with 1-hexanol, 1,10-decanediol and benzylalcohol being aminated best. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. 76 FR 42534 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits; System...

    Science.gov (United States)

    2011-07-19

    ... Reliability Operating Limits; System Restoration Reliability Standards AGENCY: Federal Energy Regulatory... data necessary to analyze and monitor Interconnection Reliability Operating Limits (IROL) within its... Interconnection Reliability Operating Limits, Order No. 748, 134 FERC ] 61,213 (2011). \\2\\ The term ``Wide-Area...

  6. Seismic reliability assessment methodology for CANDU concrete containment structures

    International Nuclear Information System (INIS)

    Stephens, M.J.; Nessim, M.A.; Hong, H.P.

    1995-05-01

    A study was undertaken to develop a reliability-based methodology for the assessment of existing CANDU concrete containment structures with respect to seismic loading. The focus of the study was on defining appropriate specified values and partial safety factors for earthquake loading and resistance parameters. Key issues addressed in the work were the identification of an approach to select design earthquake spectra that satisfy consistent safety levels, and the use of structure-specific data in the evaluation of structural resistance. (author). 23 refs., 9 tabs., 15 figs

  7. Vaccine procurement and self-sufficiency in developing countries.

    Science.gov (United States)

    Woodle, D

    2000-06-01

    This paper discusses the movement toward self-sufficiency in vaccine supply in developing countries (and countries in transition to new economic and political systems) and explains special supply concerns about vaccine as a product class. It traces some history of donor support and programmes aimed at self-financing, then continues with a discussion about self-sufficiency in terms of institutional capacity building. A number of deficiencies commonly found in vaccine procurement and supply in low- and middle-income countries are characterized, and institutional strengthening with procurement technical assistance is described. The paper also provides information about a vaccine procurement manual being developed by the United States Agency for International Development (USAID) and the World Health Organization (WHO) for use in this environment. Two brief case studies are included to illustrate the spectrum of existing capabilities and different approaches to technical assistance aimed at developing or improving vaccine procurement capability. In conclusion, the paper discusses the special nature of vaccine and issues surrounding potential integration and decentralization of vaccine supply systems as part of health sector reform.

  8. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    Science.gov (United States)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
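    Stated compactly, in notation paraphrased from this abstract with $\sigma_Y(X)$ denoting a minimal sufficient statistic of $X$ about $Y$, the information-trimming result and its process corollary read:

    ```latex
    \begin{align}
      I[X;Y] &= I\!\left[\sigma_Y(X);\, \sigma_X(Y)\right]
        && \text{(both variables trimmed to minimal sufficient statistics)} \\
      \mathbf{E} &= I[\overleftarrow{X};\overrightarrow{X}]
                  = I[\mathcal{S}^{+};\mathcal{S}^{-}]
        && \text{(excess entropy as the mutual information between forward- and reverse-time causal states)}
    \end{align}
    ```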

  9. Software-defined optical network for metro-scale geographically distributed data centers.

    Science.gov (United States)

    Samadi, Payman; Wen, Ke; Xu, Junjie; Bergman, Keren

    2016-05-30

    The emergence of cloud computing and big data has rapidly increased the deployment of small and mid-sized data centers. Enterprises and cloud providers require an agile network among these data centers to empower application reliability and flexible scalability. We present a software-defined inter data center network to enable on-demand scale out of data centers on a metro-scale optical network. The architecture consists of a combined space/wavelength switching platform and a Software-Defined Networking (SDN) control plane equipped with a wavelength and routing assignment module. It enables establishing transparent and bandwidth-selective connections from L2/L3 switches, on-demand. The architecture is evaluated in a testbed consisting of 3 data centers, 5-25 km apart. We successfully demonstrated end-to-end bulk data transfer and Virtual Machine (VM) migrations across data centers with less than 100 ms connection setup time and close to full link capacity utilization.

  10. 76 FR 66055 - North American Electric Reliability Corporation; Order Approving Interpretation of Reliability...

    Science.gov (United States)

    2011-10-25

    ...\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ] 31,242... materially affected'' by Bulk-Power System reliability may request an interpretation of a Reliability... Electric Reliability Corporation; Order Approving Interpretation of Reliability Standard; Before...

  11. ACCURACY AND RELIABILITY AS CRITERIA OF INFORMATIVENESS IN THE NEWS STORY

    Directory of Open Access Journals (Sweden)

    Melnikova Ekaterina Aleksandrovna

    2014-12-01

    Full Text Available The article clarifies the meaning of the terms accuracy and reliability as applied to the news story and offers a researcher's approach to obtaining objective data that help to verify the linguistic means by which accuracy and reliability are present in the informative structure of the text. The accuracy of the news story is defined as a high degree of relevance in reflecting an event through the language representation of its constituents; reliability is viewed as the originality of the news story, demonstrated by introducing citations and sources of information considered trustworthy into the text. Basing the research on an event nominative density identification method, the author composed nominative charts of 115 news story texts collected from the web sites of the BBC and CNN media corporations; distinguished qualitative and quantitative markers of accuracy and reliability in the news story text; and confirmed that the accuracy of the news story is achieved through terminological clarity in nominating event constituents, thematic binding between words, and the presence of onyms that help identify characteristics of the referent event in depth. The reliability of the text is found in eyewitness accounts, quotations, and references to sources considered trustworthy. A careful examination of the associations between accuracy, reliability, and informing strategies in digital news networks allowed the author to identify two variants of information delivery that differ in their communicative and pragmatic functions: developing (which informs about major and minor details of an event) and truncated (which gives some details, thus raising interest in the event and urging the reader to open the full story).

  12. Safety, reliability, and validity of a physiologic definition of bronchopulmonary dysplasia.

    Science.gov (United States)

    Walsh, Michele C; Wilson-Costello, Deanna; Zadell, Arlene; Newman, Nancy; Fanaroff, Avroy

    2003-09-01

    Bronchopulmonary dysplasia (BPD) is the focus of many intervention trials, yet the outcome measure, when based solely on oxygen administration, may be confounded by differing criteria for oxygen administration between physicians. Thus, we wished to define BPD by standardized oxygen saturation monitoring at 36 weeks corrected age and compare this physiologic definition with the standard clinical definition of BPD based solely on oxygen administration. A total of 199 consecutive very low birthweight infants (VLBW, 501 to 1500 g birthweight) were assessed prospectively at 36+/-1 weeks corrected age. Neonates on positive pressure support or receiving >30% supplemental oxygen were assigned the outcome BPD. Those receiving <30% supplemental oxygen underwent a timed, monitored reduction to room air and were assigned the outcome "no BPD" (saturation > or =88% for 60 minutes) or "BPD" (saturation <88%). Inter-rater reliability, test-retest reliability, and validity of the physiologic definition vs the clinical definition were assessed. A total of 199 VLBW infants were assessed, of whom 45 (36%) were diagnosed with BPD by the clinical definition of oxygen use at 36 weeks corrected age. The physiologic definition identified 15 infants treated with oxygen who successfully passed the saturation monitoring test in room air. The physiologic definition diagnosed BPD in 30 (24%) of the cohort. All infants were safely studied. The test was highly reliable (inter-rater reliability, kappa=1.0; test-retest reliability, kappa=0.83) and highly correlated with discharge home on oxygen, length of hospital stay, and hospital readmissions in the first year of life. The physiologic definition of BPD is safe, feasible, reliable, and valid and improves the precision of the diagnosis of BPD. This may be of benefit in future multicenter clinical trials.

  13. Defining Issues Test-2: reliability of the Brazilian version and considerations concerning its use in studies on morality

    Directory of Open Access Journals (Sweden)

    Alessandra de Morais Shimizu

    2004-01-01

    Full Text Available This study aims to evaluate the reliability of the Brazilian translation and adaptation of the Defining Issues Test (DIT-2), as well as to offer some considerations concerning the use of this instrument and of the DIT-1 in research on morality. The DIT-1 and DIT-2 were administered to 621 Brazilian youngsters, proportionally distributed according to the city of origin (Floriano/PI, Erechim/RS and Marília/SP), the type of school (public or private) and the school year attended (8th year of elementary school and 3rd year of secondary school). Regarding reliability, it was noticed that although the values achieved were close to the one obtained in the translation and adaptation of the DIT-1, they are considerably lower than those reported for the original American versions. When the test scores were examined, distinct tendencies were observed within the investigated sample, marked by the controlled variables. Considerations are made regarding the validity and interpretation of these tests.

  14. Inverse Reliability Task: Artificial Neural Networks and Reliability-Based Optimization Approaches

    OpenAIRE

    Lehký, David; Slowik, Ondřej; Novák, Drahomír

    2014-01-01

    Part 7: Genetic Algorithms; International audience; The paper presents two alternative approaches to solving the inverse reliability task – determining the design parameters that achieve desired target reliabilities. The first approach is based on the utilization of artificial neural networks and small-sample simulation by Latin hypercube sampling. The second approach treats the inverse reliability task as a reliability-based optimization task using the double-loop method, again with small-sample simulation. Efficie...
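    For readers who want to see the sampling idea in the record above, the following minimal Python sketch (not the authors' code; the sample size and the lognormal parameters are assumed purely for illustration) draws a small-sample Latin hypercube design and maps it to physical design parameters through inverse CDFs:

```python
import numpy as np
from scipy import stats

def latin_hypercube(n_samples, n_dims, seed=None):
    """Draw a Latin hypercube sample on the unit hypercube [0, 1)^n_dims."""
    rng = np.random.default_rng(seed)
    # One stratum per sample in each dimension, jittered inside the stratum,
    # then permuted so strata are paired at random across dimensions.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])
    return u

# Example: 20 samples of two lognormally distributed design parameters (assumed values).
u = latin_hypercube(20, 2, seed=42)
x = stats.lognorm.ppf(u, s=0.2, scale=1.0)   # inverse-CDF transform
print(x.shape)                                # (20, 2)
```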

  15. FOOD SELF-SUFFICIENCY OF THE EUROPEAN UNION COUNTRIES – ENERGETIC APPROACH

    Directory of Open Access Journals (Sweden)

    Arkadiusz Sadowski

    2016-06-01

    Full Text Available The paper covers the issues of a basic social need, namely alimentation. The aim of the research is to evaluate energetic food self-sufficiency and its changes in the European Union countries. The research has been conducted using the author's methodology, based on the amount of energy produced and consumed in 1990-2009. The analyses proved that within the considered period the European Union became a net importer of the energy contained in agricultural products. An excess of produced energy was mainly observed in the countries of the European lowland. Moreover, in most of the countries a decrease in the analyzed factor was observed when compared with the 1990-1999 period. On the other hand, in the new member states an increase in energetic food self-sufficiency was observed. The conclusion has been drawn that, while general food self-sufficiency is mainly determined by environmental factors, its dynamics is primarily influenced by factors connected with agricultural policy.

  16. 76 FR 23801 - North American Electric Reliability Corporation; Order Approving Reliability Standard

    Science.gov (United States)

    2011-04-28

    ... have an operating plan and facilities for backup functionality to ensure Bulk-Power System reliability... entity's primary control center on the reliability of the Bulk-Power System. \\1\\ Mandatory Reliability... potential impact of a violation of the Requirement on the reliability of the Bulk-Power System. The...

  17. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need not only to capture the complex dynamic behavior of the system components, but they must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings, including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis.
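    As a toy illustration of the discrete-time idea (this is not the authors' formalism, and the failure rates and interval width are assumed), the sketch below discretizes time into intervals and propagates the failure probabilities of two redundant components through a deterministic AND node, the simplest possible BN-style gate:

```python
import math

# Toy discrete-time reliability sketch (illustrative only): two redundant
# components A and B with exponential lifetimes, time discretized into
# intervals of width dt. The "system failed" node is an AND of its parents,
# mimicking a minimal Bayesian-network structure with a deterministic gate.
lam_a, lam_b = 1e-3, 2e-3    # assumed failure rates [1/h]
dt, n_intervals = 100.0, 50  # 50 intervals of 100 h each

for k in range(1, n_intervals + 1):
    t = k * dt
    p_a = 1.0 - math.exp(-lam_a * t)   # P(A failed by t)
    p_b = 1.0 - math.exp(-lam_b * t)   # P(B failed by t)
    p_sys = p_a * p_b                  # AND gate, components assumed independent
    if k % 10 == 0:
        print(f"t={t:6.0f} h  P(system failed)={p_sys:.4f}")
```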

  18. Modeling of seismic hazards for dynamic reliability analysis

    International Nuclear Information System (INIS)

    Mizutani, M.; Fukushima, S.; Akao, Y.; Katukura, H.

    1993-01-01

    This paper investigates appropriate indices of seismic hazard curves (SHCs) for seismic reliability analysis. In most seismic reliability analyses of structures, the seismic hazards are defined in the form of SHCs of peak ground accelerations (PGAs). Usually PGAs play a significant role in characterizing ground motions. However, PGA is not always a suitable index of seismic motions. When random vibration theory developed in the frequency domain is employed to obtain statistics of responses, it is more convenient for the implementation of dynamic reliability analysis (DRA) to utilize an index which can be determined in the frequency domain. In this paper, we summarize relationships among the indices which characterize ground motions. The relationships between the indices and the magnitude M are summarized as well. In this consideration, duration time plays an important role in relating two distinct classes, i.e. the energy class and the power class. Fourier and energy spectra belong to the energy class, while power and response spectra and PGAs belong to the power class. These relationships are also investigated using ground motion records. Through these investigations, we have shown the efficiency of employing the total energy as an index of SHCs, which can be determined in the time and frequency domains and has less variance than the other indices. In addition, we have proposed a procedure of DRA based on total energy. (author)

  19. Simulation techniques for determining reliability and availability of technical systems

    International Nuclear Information System (INIS)

    Lindauer, E.

    1975-01-01

    The system is described in the form of a fault tree, with components representing partial functions of the system and connections which reproduce the logical structure of the system. Both have the states 'intact' or 'failed'; they are defined here as in the IRS programme FESIVAR. To simulate component states according to the given probabilities, pseudo-random numbers are applied: numbers whose sequence is determined by the generating algorithm but which, for the given purpose, behave sufficiently like truly random numbers. This method of simulation is compared with deterministic methods. (HP/LH) [de]
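    A minimal sketch of this kind of simulation, assuming a small made-up fault tree and component failure probabilities (it is not the FESIVAR programme), could look as follows:

```python
import random

# Illustrative sketch: estimate the unreliability of a small fault tree
# TOP = C1 AND (C2 OR C3) at mission time t by sampling component states
# with pseudo-random numbers.
random.seed(1)                               # reproducible pseudo-random sequence
q = {"C1": 0.02, "C2": 0.05, "C3": 0.05}     # assumed failure probabilities at mission time

def top_failed(state):
    return state["C1"] and (state["C2"] or state["C3"])

n, failures = 100_000, 0
for _ in range(n):
    state = {c: random.random() < p for c, p in q.items()}   # sample intact/failed
    failures += top_failed(state)
print("Estimated TOP-event probability:", failures / n)      # analytic value ~0.00195
```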

  20. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  1. Equipment design for reliability testing of protection system

    International Nuclear Information System (INIS)

    Situmorang, Johnny; Tjahjono, H.; Santosa, A. Z.; Tjahjani, S.DT.; Ismu, P.H; Haryanto, D.; Mulyanto, D.; Kusmono, S

    1999-01-01

    Equipment for reliability testing of protection system cable has been designed as a furnace with a 4 kW electric heater, which needs about 10 minutes to reach the design maximum temperature of 300 °C. The furnace, 800 mm in diameter and 2000 mm in length, is insulated with rockwool and coated with aluminium. At the design maximum temperature the surface temperature is 78 °C. The specimen assembly is arranged horizontally in the furnace. The failure criteria will be defined based on the behaviour of the load circuit in each line of the cable specimens.

  2. Deficient motion-defined and texture-defined figure-ground segregation in amblyopic children.

    Science.gov (United States)

    Wang, Jane; Ho, Cindy S; Giaschi, Deborah E

    2007-01-01

    Motion-defined form deficits in the fellow eye and the amblyopic eye of children with amblyopia implicate possible direction-selective motion processing or static figure-ground segregation deficits. Deficient motion-defined form perception in the fellow eye of amblyopic children may not be fully accounted for by a general motion processing deficit. This study investigates the contribution of figure-ground segregation deficits to the motion-defined form perception deficits in amblyopia. Performances of 6 amblyopic children (5 anisometropic, 1 anisostrabismic) and 32 control children with normal vision were assessed on motion-defined form, texture-defined form, and global motion tasks. Performance on motion-defined and texture-defined form tasks was significantly worse in amblyopic children than in control children. Performance on global motion tasks was not significantly different between the 2 groups. Faulty figure-ground segregation mechanisms are likely responsible for the observed motion-defined form perception deficits in amblyopia.

  3. The performance shaping factors influence analysis on the human reliability for NPP operation

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Florescu, G.

    2008-01-01

    The Human Reliability Analysis (HRA) is an important step in Probabilistic Safety Assessment (PSA) studies and offers an opportunity for concrete improvement of the man-machine-organization interfaces, reliability and safety. The goal of this analysis is to obtain sufficient detail in order to understand and document all important factors that affect human performance. The purpose of this paper is to estimate human error probabilities in view of the negative or positive effect of the performance shaping factors (PSFs) on the mitigation of initiating events that could occur in a nuclear power plant (NPP). Using the THERP and SPAR-H methods, a model of the influence of PSFs on human reliability is developed. The model is applied to the most important activities necessary to mitigate a 'one steam generator tube failure' event at the Cernavoda NPP. The results are joint human error probability (JHEP) values estimated for the following situations: without regard to PSF influence; with PSFs in specific conditions; with PSFs that could have only a positive influence; and with PSFs that could have only a negative influence. In addition, PSFs with a negative influence were identified and, using the DOE method, the activities necessary to reverse that negative influence were assigned. (authors)
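    For illustration only, a simplified SPAR-H-style calculation is sketched below; the nominal human error probability and the PSF multipliers are assumed placeholder values, not those used in the Cernavoda study:

```python
# Simplified SPAR-H-style sketch (illustrative values): a nominal human error
# probability (NHEP) is scaled by the product of the performance shaping
# factor multipliers, with the standard adjustment that keeps the result
# below 1 when the composite multiplier is large.
def spar_h_hep(nhep, psf_multipliers):
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    if composite > 1.0:
        return (nhep * composite) / (nhep * (composite - 1.0) + 1.0)
    return nhep * composite

# Diagnosis task with assumed multipliers: high stress (x2), nominal
# procedures (x1), poor ergonomics (x10) -> negative influence dominates.
print(spar_h_hep(0.01, [2, 1, 10]))   # ~0.168
```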

  4. Can New Zealand achieve self-sufficiency in its nursing workforce?

    Science.gov (United States)

    North, Nicola

    2011-01-01

    This paper reviews impacts on the nursing workforce of health policy and reforms of the past two decades and suggests reasons for both current difficulties in retaining nurses in the workforce and measures to achieve short-term improvements. Difficulties in retaining nurses in the New Zealand workforce have contributed to nursing shortages, leading to a dependence on overseas recruitment. In a context of global shortages and having to compete in a global nursing labour market, an alternative to dependence on overseas nurses is self-sufficiency. Discursive paper. Analysis of nursing workforce data highlighted threats to self-sufficiency, including age structure, high rates of emigration of New Zealand nurses with reliance on overseas nurses and an annual output of nurses that is insufficient to replace both expected retiring nurses and emigrating nurses. A review of recent policy and other documents indicates that two decades of health reform and lack of a strategic focus on nursing have contributed to shortages. Recent strategic approaches to the nursing workforce have included workforce stocktakes, integrated health workforce development and nursing workforce projections, with a single authority now responsible for planning, education, training and development for all health professions and sectors. Current health and nursing workforce development strategies offer wide-ranging and ambitious approaches. An alternative approach is advocated: based on workforce data analysis, pressing threats to self-sufficiency and measures available are identified to achieve, in the short term, the maximum impact on retaining nurses. A human resources in health approach is recommended that focuses on employment conditions and professional nursing as well as recruitment and retention strategies. Nursing is identified as 'crucial' to meeting demands for health care. A shortage of nurses threatens delivery of health services and supports the case for self-sufficiency in the nursing workforce.

  5. Evaluation of reproducibility and reliability of 3D soft tissue analysis using 3D stereophotogrammetry.

    NARCIS (Netherlands)

    Plooij, J.M.; Swennen, G.R.J.; Rangel, F.A.; Maal, T.J.J.; Schutyser, F.A.C.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Berge, S.J.

    2009-01-01

    In 3D photographs the bony structures are neither available nor palpable; therefore, bone-related landmarks, such as the soft tissue gonion, need to be redefined. The purpose of this study was to determine the reproducibility and reliability of 49 soft tissue landmarks, including newly defined landmarks.

  6. How, When, and Where? Assessing Renewable Energy Self-Sufficiency at the Neighborhood Level.

    Science.gov (United States)

    Grosspietsch, David; Thömmes, Philippe; Girod, Bastien; Hoffmann, Volker H

    2018-02-20

    Self-sufficient decentralized systems challenge the centralized energy paradigm. Although scholars have assessed specific locations and technological aspects, it remains unclear how, when, and where energy self-sufficiency could become competitive. To address this gap, we develop a techno-economic model for energy self-sufficient neighborhoods that integrates solar photovoltaics (PV), conversion, and storage technologies. We assess the cost of 100% self-sufficiency for both electricity and heat, comparing different technical configurations for a stylized neighborhood in Switzerland and juxtaposing these findings with projections on market and technology development. We then broaden the scope and vary the neighborhood's composition (residential share) and geographic position (along different latitudes). Regarding how to design self-sufficient neighborhoods, we find two promising technical configurations. The "PV-battery-hydrogen" configuration is projected to outperform a fossil-fueled and grid-connected reference configuration when energy prices increase by 2.5% annually and cost reductions in hydrogen-related technologies by a factor of 2 are achieved. The "PV-battery" configuration would allow achieving parity with the reference configuration sooner, at 21% cost reduction. Additionally, more cost-efficient deployment is found in neighborhoods where the end-use is small commercial or mixed and in regions where seasonal fluctuations are low and thus allow for reducing storage requirements.

  7. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology (ICT) context. In particular, in the first section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  8. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  9. Reliability techniques and Coupled BEM/FEM for interaction pile-soil

    Directory of Open Access Journals (Sweden)

    Ahmed SAHLI

    2017-06-01

    Full Text Available This paper deals with the development of a computational code for the modelling and verification of safety in relation to limit states of piles found in foundations of current structures. To this end, it makes use of reliability techniques for the probabilistic analysis of piles modelled with the finite element method (FEM) coupled to the boundary element method (BEM). The soil is modelled with the BEM employing Mindlin's fundamental solutions, suitable for the consideration of a three-dimensional infinite half-space. The piles are modelled as bar elements with the FEM, each of which is represented in the BEM as a loading line. The bar finite element employed has four nodes and fourteen nodal parameters, three displacements for each node plus two rotations for the top node. Slipping of the piles relative to the soil mass is modelled using adhesion models that define the evolution of shaft stresses during load transfer to the soil. The reliability analysis is based on three methods: the first-order second-moment (FOSM) method, the first-order reliability method (FORM) and the Monte Carlo method.
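    Independently of the BEM/FEM model above, the FOSM part of such an analysis reduces, for a linear limit state with independent normal variables, to a reliability index and a failure probability. A minimal sketch with assumed pile capacity and load statistics (illustrative values only):

```python
from math import erf, sqrt

# Minimal FOSM sketch: for a linear limit state g = R - S with independent
# normal resistance R and load effect S,
#   beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2),  Pf = Phi(-beta).
def fosm(mu_r, sd_r, mu_s, sd_s):
    beta = (mu_r - mu_s) / sqrt(sd_r**2 + sd_s**2)
    pf = 0.5 * (1.0 + erf(-beta / sqrt(2.0)))   # standard normal CDF at -beta
    return beta, pf

# Assumed pile capacity and load statistics in kN, for illustration only.
beta, pf = fosm(mu_r=1500.0, sd_r=200.0, mu_s=900.0, sd_s=150.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")   # beta = 2.40, Pf ~ 8.2e-03
```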

  10. Launch and Assembly Reliability Analysis for Human Space Exploration Missions

    Science.gov (United States)

    Cates, Grant; Gelito, Justin; Stromgren, Chel; Cirillo, William; Goodliff, Kandyce

    2012-01-01

    NASA's future human space exploration strategy includes single and multi-launch missions to various destinations including cis-lunar space, near Earth objects such as asteroids, and ultimately Mars. Each campaign is being defined by Design Reference Missions (DRMs). Many of these missions are complex, requiring multiple launches and assembly of vehicles in orbit. Certain missions also have constrained departure windows to the destination. These factors raise concerns regarding the reliability of launching and assembling all required elements in time to support planned departure. This paper describes an integrated methodology for analyzing launch and assembly reliability in any single DRM or set of DRMs starting with flight hardware manufacturing and ending with final departure to the destination. A discrete event simulation is built for each DRM that includes the pertinent risk factors including, but not limited to: manufacturing completion; ground transportation; ground processing; launch countdown; ascent; rendezvous and docking, assembly, and orbital operations leading up to trans-destination-injection. Each reliability factor can be selectively activated or deactivated so that the most critical risk factors can be identified. This enables NASA to prioritize mitigation actions so as to improve mission success.
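    The following toy Monte Carlo sketch (not NASA's discrete event simulation; every probability and duration is an assumed placeholder) illustrates the underlying idea of estimating whether a multi-launch campaign can be completed within a constrained departure window:

```python
import random

# Illustrative sketch: a campaign needs three launches, each with an assumed
# per-attempt success probability and a random processing flow; the campaign
# succeeds if all elements are launched before the departure window closes.
random.seed(0)
P_LAUNCH_OK = 0.95        # assumed per-launch reliability
WINDOW_DAYS = 180         # assumed departure window
MEAN_FLOW_DAYS = 45       # assumed mean ground-processing flow per launch

def one_campaign():
    day = 0.0
    for _ in range(3):
        day += random.expovariate(1.0 / MEAN_FLOW_DAYS)   # processing time
        if random.random() > P_LAUNCH_OK:                 # launch failure
            day += 90.0                                    # assumed recovery delay
            if random.random() > P_LAUNCH_OK:              # one re-flight allowed
                return False
    return day <= WINDOW_DAYS

n = 50_000
print("P(campaign meets window):", sum(one_campaign() for _ in range(n)) / n)
```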

  11. Towards a sufficiency-driven business model : Experiences and opportunities

    NARCIS (Netherlands)

    Bocken, N.M.P.; Short, SW

    2016-01-01

    Business model innovation is an important lever for change to tackle pressing sustainability issues. In this paper, ‘sufficiency’ is proposed as a driver of business model innovation for sustainability. Sufficiency-driven business models seek to moderate overall resource consumption by curbing

  12. Flat-plate solar array project. Volume 6: Engineering sciences and reliability

    Science.gov (United States)

    Ross, R. G., Jr.; Smokler, M. I.

    1986-01-01

    The Flat-Plate Solar Array (FSA) Project activities directed at developing the engineering technology base required to achieve modules that meet the functional, safety, and reliability requirements of large-scale terrestrial photovoltaic systems applications are reported. These activities included: (1) development of functional, safety, and reliability requirements for such applications; (2) development of the engineering analytical approaches, test techniques, and design solutions required to meet the requirements; (3) synthesis and procurement of candidate designs for test and evaluation; and (4) performance of extensive testing, evaluation, and failure analysis to define design shortfalls and, thus, areas requiring additional research and development. A summary of the approach and technical outcome of these activities is provided along with a complete bibliography of the published documentation covering the detailed accomplishments and technologies developed.

  13. DNA sequence of 15 base pairs is sufficient to mediate both glucocorticoid and progesterone induction of gene expression

    International Nuclear Information System (INIS)

    Straehle, U.; Klock, G.; Schuetz, G.

    1987-01-01

    To define the recognition sequence of the glucocorticoid receptor and its relationship with that of the progesterone receptor, oligonucleotides derived from the glucocorticoid response element of the tyrosine aminotransferase gene were tested upstream of a heterologous promoter for their capacity to mediate effects of these two steroids. The authors show that a 15-base-pair sequence with partial symmetry is sufficient to confer glucocorticoid inducibility on the promoter of the herpes simplex virus thymidine kinase gene. The same 15-base-pair sequence mediates induction by progesterone. Point mutations in the recognition sequence affect inducibility by glucocorticoids and progesterone similarly. Together with the strong conservation of the sequence of the DNA-binding domain of the two receptors, these data suggest that both proteins recognize a sequence that is similar, if not the same.

  14. Entrepreneurship by any other name: self-sufficiency versus innovation.

    Science.gov (United States)

    Parker Harris, Sarah; Caldwell, Kate; Renko, Maija

    2014-01-01

    Entrepreneurship has been promoted as an innovative strategy to address the employment of people with disabilities. Research has predominantly focused on the self-sufficiency aspect without fully integrating entrepreneurship literature in the areas of theory, systems change, and demonstration projects. Subsequently there are gaps in services, policies, and research in this field that, in turn, have limited our understanding of the support needs and barriers or facilitators of entrepreneurs with disabilities. A thorough analysis of the literature in these areas led to the development of two core concepts that need to be addressed in integrating entrepreneurship into disability employment research and policy: clarity in operational definitions and better disability statistics and outcome measures. This article interrogates existing research and policy efforts in this regard to argue for a necessary shift in the field from focusing on entrepreneurship as self-sufficiency to understanding entrepreneurship as innovation.

  15. 76 FR 16240 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits

    Science.gov (United States)

    2011-03-23

    ... identified by the Commission. \\5\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693... reliability of the interconnection by ensuring that the bulk electric system is assessed during the operations... responsibility for SOLs. Further, Bulk-Power System reliability practices assign responsibilities for analyzing...

  16. Validity and Reliability of the Iranian Version of eHealth Literacy Scale

    Directory of Open Access Journals (Sweden)

    Soheila Bazm

    2016-06-01

    Full Text Available Abstract: Introduction: The eHEALS is an 8-item measure of eHealth literacy developed to measure consumers' combined knowledge, comfort, and perceived skills at finding, evaluating, and applying electronic health information to health problems. The current study aims to measure the validity and reliability of the Iranian version of the eHEALS questionnaire in a population context. Materials & Methods: A cross-sectional study was conducted with 525 young people chosen randomly in Yazd, Iran. We determined the content validity, construct validity and predictive validity of the translated questionnaire. Principal components factor analysis was used to determine the theoretical fit of the measures with the data. The internal consistency of the translated questionnaire was evaluated using the Cronbach alpha coefficient. The results were analyzed in SPSS v16. Results: The principal component analysis (PCA) produced a single-factor solution (70.48% of variance) with factor loadings ranging from 0.723 to 0.862. The internal consistency of the scale was sufficient (alpha=0.88, P<0.001) and the test-retest coefficients for the items were reliable (r=0.96, P<0.001). Discussion: The results of the study showed that the items in the translated questionnaire were equivalent to the original scale. The Iranian version of the eHEALS questionnaire showed both good reliability and validity for the screening of eHealth literacy of Iranian people.
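    The internal-consistency statistic reported above, Cronbach's alpha, can be computed directly from an items-by-respondents matrix. A minimal sketch with made-up data (not the study's data) for an 8-item scale:

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

# Toy data for an 8-item, 5-point scale such as eHEALS (values are made up).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(30, 1))
data = np.clip(base + rng.integers(-1, 2, size=(30, 8)), 1, 5)
print(round(cronbach_alpha(data), 2))
```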

  17. Software reliability evaluation of digital plant protection system development process using V and V

    International Nuclear Information System (INIS)

    Lee, Na Young; Hwang, Il Soon; Seong, Seung Hwan; Oh, Seung Rok

    2001-01-01

    In the nuclear power industry, digital technology has recently been introduced for the Instrumentation and Control (I and C) of reactor systems. For its application to safety-critical systems such as the Reactor Protection System (RPS), a reliability assessment is indispensable. Unlike traditional reliability models, software reliability is hard to evaluate and should be evaluated throughout the development lifecycle. In the development process of the Digital Plant Protection System (DPPS), the concept of verification and validation (V and V) was introduced to assure the quality of the product. Also, testing should be performed to assure reliability. The verification procedure with model checking is relatively well defined; however, testing is labor intensive and not well organized. In this paper, we developed a methodological process that combines verification with validation test case generation. For this, we used PVS for the table specification and for the theorem proving. As a result, we could not only save time in designing test cases but also obtain a more effective and complete verification-related test case set. In addition, we could extract some meaningful factors useful for the reliability evaluation both from the V and V and from the combined verification tests.

  18. Measuring older adults' sedentary time: reliability, validity, and responsiveness.

    Science.gov (United States)

    Gardiner, Paul A; Clark, Bronwyn K; Healy, Genevieve N; Eakin, Elizabeth G; Winkler, Elisabeth A H; Owen, Neville

    2011-11-01

    With evidence that prolonged sitting has deleterious health consequences, decreasing sedentary time is a potentially important preventive health target. High-quality measures, particularly for use with older adults, who are the most sedentary population group, are needed to evaluate the effect of sedentary behavior interventions. We examined the reliability, validity, and responsiveness to change of a self-report sedentary behavior questionnaire that assessed time spent in behaviors common among older adults: watching television, computer use, reading, socializing, transport and hobbies, and a summary measure (total sedentary time). In the context of a sedentary behavior intervention, nonworking older adults (n = 48, age = 73 ± 8 yr (mean ± SD)) completed the questionnaire on three occasions during a 2-wk period (7 d between administrations) and wore an accelerometer (ActiGraph model GT1M) for two periods of 6 d. Test-retest reliability (for the individual items and the summary measure) and validity (self-reported total sedentary time compared with accelerometer-derived sedentary time) were assessed during the 1-wk preintervention period, using Spearman (ρ) correlations and 95% confidence intervals (CI). Responsiveness to change after the intervention was assessed using the responsiveness statistic (RS). Test-retest reliability was excellent for television viewing time (ρ (95% CI) = 0.78 (0.63-0.89)), computer use (ρ (95% CI) = 0.90 (0.83-0.94)), and reading (ρ (95% CI) = 0.77 (0.62-0.86)); acceptable for hobbies (ρ (95% CI) = 0.61 (0.39-0.76)); and poor for socializing and transport (ρ < 0.45). Total sedentary time had acceptable test-retest reliability (ρ (95% CI) = 0.52 (0.27-0.70)) and validity (ρ (95% CI) = 0.30 (0.02-0.54)). Self-report total sedentary time was similarly responsive to change (RS = 0.47) as accelerometer-derived sedentary time (RS = 0.39). The summary measure of total sedentary time has good repeatability and modest validity and is responsive to change.

  19. Revised scoring and improved reliability for the Communication Patterns Questionnaire.

    Science.gov (United States)

    Crenshaw, Alexander O; Christensen, Andrew; Baucom, Donald H; Epstein, Norman B; Baucom, Brian R W

    2017-07-01

    The Communication Patterns Questionnaire (CPQ; Christensen, 1987) is a widely used self-report measure of couple communication behavior and is well validated for assessing the demand/withdraw interaction pattern, which is a robust predictor of poor relationship and individual outcomes (Schrodt, Witt, & Shimkowski, 2014). However, no studies have examined the CPQ's factor structure using analytic techniques sufficient by modern standards, nor have any studies replicated the factor structure using additional samples. Further, the current scoring system uses fewer than half of the total items for its 4 subscales, despite the existence of unused items that have content conceptually consistent with those subscales. These characteristics of the CPQ have likely contributed to findings that subscale scores are often troubled by suboptimal psychometric properties such as low internal reliability (e.g., Christensen, Eldridge, Catta-Preta, Lim, & Santagata, 2006). The present study uses exploratory and confirmatory factor analyses on 4 samples to reexamine the factor structure of the CPQ to improve scale score reliability and to determine if including more items in the subscales is warranted. Results indicate that a 3-factor solution (constructive communication and 2 demand/withdraw scales) provides the best fit for the data. That factor structure was confirmed in the replication samples. Compared with the original scales, the revised scales include additional items that expand the conceptual range of the constructs, substantially improve reliability of scale scores, and demonstrate stronger associations with relationship satisfaction and sensitivity to change in therapy. Implications for research and treatment are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Design of etch holes to compensate spring width loss for reliable resonant frequencies

    International Nuclear Information System (INIS)

    Jang, Yun-Ho; Kim, Jong-Wan; Kim, Yong-Kweon; Kim, Jung-Mu

    2012-01-01

    A pattern width loss during the fabrication of lateral silicon resonators degrades resonant frequency reliability, since such a width loss causes significant deviation of the spring stiffness. Here we present a design guide for etch holes to obtain reliable resonant frequencies by controlling etch hole geometries. The new function of an etch hole is to generate a comparable amount of width loss between springs and etch holes and, in turn, to minimize the effect of the spring width loss on resonant frequency shift and deviation. An analytic expression reveals that a compensation factor (CF), defined as the circumference (C_u) of a unit etch hole divided by its silicon area (A_u), is a key parameter for reliable frequencies. Protrusive etch holes were proposed and compared with square etch holes to demonstrate the frequency reliability according to CF values and etch hole shapes. The normalized resonant frequency shift and deviation of the protrusive etch hole (−13.0% ± 6.9%) were significantly improved compared to those of a square etch hole with a small CF value (−42.8% ± 14.8%). The proposed design guide based on the CF value and protrusive shapes can be used to achieve reliable resonant frequencies for high performance silicon resonators. (technical note)

  1. Design for reliability: NASA reliability preferred practices for design and test

    Science.gov (United States)

    Lalli, Vincent R.

    1994-01-01

    This tutorial summarizes reliability experience from both NASA and industry and reflects engineering practices that support current and future civil space programs. These practices were collected from various NASA field centers and were reviewed by a committee of senior technical representatives from the participating centers (members are listed at the end). The material for this tutorial was taken from the publication issued by the NASA Reliability and Maintainability Steering Committee (NASA Reliability Preferred Practices for Design and Test. NASA TM-4322, 1991). Reliability must be an integral part of the systems engineering process. Although both disciplines must be weighed equally with other technical and programmatic demands, the application of sound reliability principles will be the key to the effectiveness and affordability of America's space program. Our space programs have shown that reliability efforts must focus on the design characteristics that affect the frequency of failure. Herein, we emphasize that these identified design characteristics must be controlled by applying conservative engineering principles.

  2. Validity and reliability of the Mastication Observation and Evaluation (MOE) instrument.

    Science.gov (United States)

    Remijn, Lianne; Speyer, Renée; Groen, Brenda E; van Limbeek, Jacques; Nijhuis-van der Sanden, Maria W G

    2014-07-01

    The Mastication Observation and Evaluation (MOE) instrument was developed to allow objective assessment of a child's mastication process. It contains 14 items and was developed over three Delphi rounds. The present study concerns the further development of the MOE using the COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) methodology and investigated the instrument's internal consistency, inter-observer reliability, construct validity and floor and ceiling effects. Consumption of three bites of bread and biscuit was evaluated using the MOE. Data from 59 healthy children (6-48 months) and from 38 children (bread) and 37 children (biscuit) with cerebral palsy (24-72 months) were used. Four items were excluded before analysis due to zero variance. Principal Components Analysis showed one factor with 8 items. Internal consistency was >0.70 (Cronbach's alpha) for both food consistencies and for both groups of children. Inter-observer reliability varied from 0.51 to 0.98 (weighted Gwet's agreement coefficient). The total MOE scores for both groups showed normal distribution for the population. There were no floor or ceiling effects. The revised MOE now contains 8 items that (a) have a consistent concept for mastication and can be scored on a 4-point scale with sufficient reliability and (b) are sensitive to stages of chewing development in young children. The removed items are retained as part of a criterion-referenced list within the MOE. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. 3-D high-frequency endovaginal ultrasound of female urethral complex and assessment of inter-observer reliability

    International Nuclear Information System (INIS)

    Wieczorek, A.P.; Wozniak, M.M.; Stankiewicz, A.; Santoro, G.A.; Bogusiewicz, M.; Rechberger, T.

    2012-01-01

    Objectives: Assessment of the urethral complex and defining its morphological characteristics with 3-dimensional endovaginal ultrasonography using a high-frequency rotational 360° transducer, and defining the inter-observer reliability of the performed measurements. Materials and methods: Twenty-four asymptomatic, nulliparous females (aged 18–55, mean 32 years) underwent high-frequency (12 MHz) endovaginal ultrasound with rotational 360° and automated 3D data acquisition (type 2050, B-K Medical, Herlev, Denmark). Measurements of the urethral thickness, width and length, bladder neck–symphysis distance, intramural part of the urethra, as well as rhabdosphincter thickness, width and length were taken by three investigators. Descriptive statistics for continuous data were performed. The results are given as mean values with standard deviation. The relationships among different variables were assessed with ANOVA for repeated-measures factors, as well as the t-test for dependent samples. Intraclass correlation (ICC) was calculated for each parameter. Intra- and inter-observer reliability was assessed. Statistical significance was assigned to a P value of <0.05. The measurements showed excellent reliability for the urethral complex (ICC > 0.8) and good reliability for rhabdosphincter measurements (ICC > 0.6) between all three investigators. Conclusions: Advanced EVUS provides detailed information on the anatomy and morphology of the female urethral complex. Our results show that the 360° rotational transducer with automated 3D acquisition, currently routinely used for proctological scanning, is suitable for the reliable assessment of the urethral complex and can be applied in routine diagnostics of pelvic floor disturbances in females.

  4. Reliability and validity in measurement of true humeral retroversion by a three-dimensional cylinder fitting method.

    Science.gov (United States)

    Saka, Masayuki; Yamauchi, Hiroki; Hoshi, Kenji; Yoshioka, Toru; Hamada, Hidetoshi; Gamada, Kazuyoshi

    2015-05-01

    Humeral retroversion is defined as the orientation of the humeral head relative to the distal humerus. Because none of the previous methods used to measure humeral retroversion strictly follows this definition, values obtained by these techniques vary and may be biased by morphologic variations of the humerus. The purpose of this study was 2-fold: to validate a method to define the axis of the distal humerus with a virtual cylinder and to establish the reliability of 3-dimensional (3D) measurement of humeral retroversion by this cylinder fitting method. Humeral retroversion in 14 baseball players (28 humeri) was measured by the 3D cylinder fitting method. The root mean square error was calculated to compare values obtained by a single tester and by 2 different testers using the embedded coordinate system. To establish the reliability, the intraclass correlation coefficient (ICC) and precision (standard error of measurement [SEM]) were calculated. The root mean square errors for the humeral coordinate system were small. The reliability and precision of the 3D measurement of retroversion yielded an intratester ICC of 0.99 (SEM, 1.0°) and an intertester ICC of 0.96 (SEM, 2.8°). The error in measurements obtained by the distal humerus cylinder fitting method was small enough not to affect retroversion measurement. The 3D measurement of retroversion by this method provides excellent intratester and intertester reliability. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  5. Prediction of safety critical software operational reliability from test reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1999-01-01

    It has been a critical issue to predict safety-critical software reliability in the nuclear engineering area. For many years, many researchers have focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability with the failure data collected during testing, assuming that the test environment well represents the operational profile. The user's interest, however, is in the operational reliability rather than the test reliability. Experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment from testing to operation, testing environment factors comprising an aging factor and a coverage factor are developed in this paper and used to predict the ultimate operational reliability from the failure data of the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig

  6. Precision of lumbar intervertebral measurements: does a computer-assisted technique improve reliability?

    Science.gov (United States)

    Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K

    2011-04-01

    Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or >10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than for the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87% to 97% for QMA versus 60% to 73% for the digitized manual technique.

  7. Online Learning in Higher Education: Necessary and Sufficient Conditions

    Science.gov (United States)

    Lim, Cher Ping

    2005-01-01

    The spectacular development of information and communication technologies through the Internet has provided opportunities for students to explore the virtual world of information. In this article, the author discusses the necessary and sufficient conditions for successful online learning in educational institutions. The necessary conditions…

  8. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
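    The report's fault-tree approach derives system reliability from component reliability. A minimal sketch of the series/parallel algebra involved, with assumed component reliabilities rather than field data (the converter structure shown is hypothetical):

```python
# Illustrative series/parallel reliability sketch: a series block works only if
# all members work; a redundant (parallel) block fails only if all members fail.
def series(*rels):
    out = 1.0
    for r in rels:
        out *= r
    return out

def parallel(*rels):
    out = 1.0
    for r in rels:
        out *= (1.0 - r)
    return 1.0 - out

# Hypothetical converter: redundant switching legs in series with a DC-link
# capacitor bank and a single controller board (all values assumed).
r_system = series(parallel(0.95, 0.95), 0.99, 0.98)
print(round(r_system, 4))   # ~0.9678
```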

  9. Reliability of the Structured Clinical Interview for DSM-5 Sleep Disorders Module.

    Science.gov (United States)

    Taylor, Daniel J; Wilkerson, Allison K; Pruiksma, Kristi E; Williams, Jacob M; Ruggero, Camilo J; Hale, Willie; Mintz, Jim; Organek, Katherine Marczyk; Nicholson, Karin L; Litz, Brett T; Young-McCaughan, Stacey; Dondanville, Katherine A; Borah, Elisa V; Brundige, Antoinette; Peterson, Alan L

    2018-03-15

    To develop and demonstrate interrater reliability for a Structured Clinical Interview for Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) Sleep Disorders (SCISD). The SCISD was designed to be a brief, reliable, and valid interview assessment of adult sleep disorders as defined by the DSM-5. A sample of 106 postdeployment active-duty military members seeking cognitive behavioral therapy for insomnia in a randomized clinical trial were assessed with the SCISD prior to treatment to determine eligibility. Audio recordings of these interviews were double-scored for interrater reliability. The interview is 8 pages long, includes 20 to 51 questions, and takes 10 to 20 minutes to administer. Of the nine major disorders included in the SCISD, six had prevalence rates high enough (ie, n ≥ 5) to include in analyses. Cohen kappa coefficient (κ) was used to assess interrater reliability for insomnia, hypersomnolence, obstructive sleep apnea hypopnea (OSAH), circadian rhythm sleep-wake, nightmare, and restless legs syndrome disorders. There was excellent interrater reliability for insomnia (1.0) and restless legs syndrome (0.83); very good reliability for nightmare disorder (0.78) and OSAH (0.73); and good reliability for hypersomnolence (0.50) and circadian rhythm sleep-wake disorders (0.50). The SCISD is a brief, structured clinical interview that is easy for clinicians to learn and use. The SCISD showed moderate to excellent interrater reliability for six of the major sleep disorders in the DSM-5 among active duty military seeking cognitive behavioral therapy for insomnia in a randomized clinical trial. Replication and extension studies are needed. Registry: ClinicalTrials.gov; Title: Comparing Internet and In-Person Brief Cognitive Behavioral Therapy of Insomnia; Identifier: NCT01549899; URL: https://clinicaltrials.gov/ct2/show/NCT01549899. © 2018 American Academy of Sleep Medicine.
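    The interrater agreement statistic used above, Cohen's kappa, corrects observed agreement for chance agreement. A minimal sketch with made-up ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters scoring the same cases."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[c] * cb[c] for c in categories) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Toy example: two raters scoring presence/absence of a diagnosis in 10 cases.
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
b = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
print(round(cohens_kappa(a, b), 2))   # 0.8
```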

  10. "Dermatitis" defined.

    Science.gov (United States)

    Smith, Suzanne M; Nedorost, Susan T

    2010-01-01

    The term "dermatitis" can be defined narrowly or broadly, clinically or histologically. A common and costly condition, dermatitis is underresourced compared to other chronic skin conditions. The lack of a collectively understood definition of dermatitis and its subcategories could be the primary barrier. To investigate how dermatologists define the term "dermatitis" and determine if a consensus on the definition of this term and other related terms exists. A seven-question survey of dermatologists nationwide was conducted. Of respondents (n  =  122), half consider dermatitis to be any inflammation of the skin. Nearly half (47.5%) use the term interchangeably with "eczema." Virtually all (> 96%) endorse the subcategory "atopic" under the terms "dermatitis" and "eczema," but the subcategories "contact," "drug hypersensitivity," and "occupational" are more highly endorsed under the term "dermatitis" than under the term "eczema." Over half (55.7%) personally consider "dermatitis" to have a broad meaning, and even more (62.3%) believe that dermatologists as a whole define the term broadly. There is a lack of consensus among experts in defining dermatitis, eczema, and their related subcategories.

  11. Inter-rater reliability in the classification of supraspinatus tendon tears using 3D ultrasound – a question of experience?

    Directory of Open Access Journals (Sweden)

    Giorgio Tamborrini

    2016-09-01

    Full Text Available Background: Three-dimensional (3D) ultrasound of the shoulder is characterized by an accuracy comparable to two-dimensional (2D) ultrasound. No studies investigating 2D versus 3D inter-rater reliability in the detection of supraspinatus tendon tears, taking into account the level of experience of the raters, have been carried out so far. Objectives: The aim of this study was to determine the inter-rater reliability in the analysis of 3D ultrasound image sets of the supraspinatus tendon between sonographers with different levels of experience. Patients and methods: Non-interventional, prospective, observational pilot study of 2309 images of 127 adult patients suffering from unilateral shoulder pain. 3D ultrasound image sets were scored by three raters independently. The intra- and inter-rater reliabilities were calculated. Results: There was excellent intra-rater reliability of rater A in the overall classification of supraspinatus tendon tears (2D vs 3D κ = 0.892, pairwise reliability 93.81%; 3D scoring round 1 vs 3D scoring round 2 κ = 0.875, pairwise reliability 92.857%). The inter-rater reliability was only moderate compared to rater B on 3D (κ = 0.497, pairwise reliability 70.95%) and fair compared to rater C (κ = 0.238, pairwise reliability 42.38%). Conclusions: The reliability of 3D ultrasound of the supraspinatus tendon depends on the level of experience of the sonographer. Experience in 2D ultrasound does not seem to be sufficient for the analysis of 3D ultrasound imaging sets. Therefore, for a 3D ultrasound analysis new diagnostic criteria have to be established and taught even to experienced 2D sonographers to improve reproducibility.

  12. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    Energy Technology Data Exchange (ETDEWEB)

    Craig G. Rieger

    2014-08-01

    "Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  13. Reliability importance analysis of Markovian systems at steady state using perturbation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phuc Do Van [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France); Barros, Anne [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France)], E-mail: anne.barros@utt.fr; Berenguer, Christophe [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France)

    2008-11-15

    Sensitivity analysis has been primarily defined for static systems, i.e. systems described by combinatorial reliability models (fault or event trees). Several structural and probabilistic measures have been proposed to assess the components importance. For dynamic systems including inter-component and functional dependencies (cold spare, shared load, shared resources, etc.), and described by Markov models or, more generally, by discrete events dynamic systems models, the problem of sensitivity analysis remains widely open. In this paper, the perturbation method is used to estimate an importance factor, called multi-directional sensitivity measure, in the framework of Markovian systems. Some numerical examples are introduced to show why this method offers a promising tool for steady-state sensitivity analysis of Markov processes in reliability studies.

  14. Reliability importance analysis of Markovian systems at steady state using perturbation analysis

    International Nuclear Information System (INIS)

    Phuc Do Van; Barros, Anne; Berenguer, Christophe

    2008-01-01

    Sensitivity analysis has been primarily defined for static systems, i.e. systems described by combinatorial reliability models (fault or event trees). Several structural and probabilistic measures have been proposed to assess the components importance. For dynamic systems including inter-component and functional dependencies (cold spare, shared load, shared resources, etc.), and described by Markov models or, more generally, by discrete events dynamic systems models, the problem of sensitivity analysis remains widely open. In this paper, the perturbation method is used to estimate an importance factor, called multi-directional sensitivity measure, in the framework of Markovian systems. Some numerical examples are introduced to show why this method offers a promising tool for steady-state sensitivity analysis of Markov processes in reliability studies
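    As a toy illustration of steady-state sensitivity for a Markov model (not the authors' perturbation estimator; the failure and repair rates are assumed), the sketch below solves for the stationary distribution of a two-state repairable component and differentiates its availability with respect to the failure rate by finite differences:

```python
import numpy as np

def steady_state(lam, mu):
    """Stationary distribution of a two-state CTMC (0 = up, 1 = down)."""
    Q = np.array([[-lam, lam],
                  [  mu, -mu]])
    A = np.vstack([Q.T, np.ones(2)])    # append normalization sum(pi) = 1
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

lam, mu, eps = 1e-3, 1e-1, 1e-6         # assumed failure/repair rates [1/h]
a0 = steady_state(lam, mu)[0]           # steady-state availability
da = (steady_state(lam + eps, mu)[0] - a0) / eps   # finite-difference sensitivity
print(f"availability = {a0:.5f}, d(availability)/d(lambda) = {da:.2f}")
```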

  15. Evaluation of the reliability of transport networks based on the stochastic flow of moving objects

    International Nuclear Information System (INIS)

    Wu Weiwei; Ning, Angelika; Ning Xuanxi

    2008-01-01

    In transport networks, human beings are moving objects whose moving direction is stochastic in emergency situations. Based on this idea, a new model, the stochastic moving network (SMN), is proposed. It is different from binary-state networks and stochastic-flow networks. The flow of an SMN has multiple saturated states, which correspond to different flow values in each arc. In this paper, we try to evaluate the system reliability, defined as the probability that the saturated flow of the network is not less than a given demand d. Based on this new model, we obtain the flow probability distribution of every arc by simulation. An algorithm based on the blocking cutset of the SMN is proposed to evaluate the network reliability. An example is used to show how to calculate the corresponding reliabilities for different given demands of the SMN. Simulation experiments of different sizes were made and the precision of the system reliability estimate was calculated. The precision of the simulation results is also discussed.
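    A minimal Monte Carlo sketch of the evaluation idea (not the authors' blocking-cutset algorithm; the network topology, capacity states and probabilities are all assumed) for a tiny two-terminal network made of two arc-disjoint paths:

```python
import random

# Each arc's capacity is random because moving objects choose directions
# stochastically. Reliability = P(max flow from source to sink >= demand d).
random.seed(3)
STATES = [0, 1, 2, 3]                 # possible arc capacities (assumed)
WEIGHTS = [0.05, 0.15, 0.30, 0.50]    # assumed state probabilities

def sample_arc():
    return random.choices(STATES, WEIGHTS)[0]

def max_flow():
    path1 = min(sample_arc(), sample_arc())   # two arcs in series on path 1
    path2 = min(sample_arc(), sample_arc())   # two arcs in series on path 2
    return path1 + path2                      # disjoint paths add up

d, n = 4, 100_000
hits = sum(max_flow() >= d for _ in range(n))
print(f"Estimated reliability for demand d={d}: {hits / n:.3f}")
```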

  16. Guidelines for reliability analysis of digital systems in PSA context. Phase 1 status report

    International Nuclear Information System (INIS)

    Authen, S.; Larsson, J.; Bjoerkman, K.; Holmberg, J.-E.

    2010-12-01

    Digital protection and control systems are appearing as upgrades in older nuclear power plants (NPPs) and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital system upgrades on NPPs, quantitative reliability models are needed for digital systems. Due to the many unique attributes of these systems, challenges exist in systems analysis, modeling and data collection. Currently there is no consensus on reliability analysis approaches. Traditional methods have clear limitations, but more dynamic approaches are still at the trial stage and can be difficult to apply in full-scale probabilistic safety assessments (PSA). The number of PSAs worldwide that include reliability models of digital I and C systems is small. A comparison of Nordic experiences and a literature review of the main international references have been performed in this pre-study project. The study shows a wide range of approaches, and also indicates that no state of the art currently exists. The study shows areas where the different PSAs agree and gives the basis for development of a common taxonomy for reliability analysis of digital systems. It is still an open matter whether software reliability needs to be explicitly modelled in the PSA. The most important issue concerning software reliability is a proper description of the impact that software-based systems have on the dependence between the safety functions and the structure of accident sequences. In general the conventional fault tree approach seems to be sufficient for modelling reactor-protection-system type functions. The following focus areas have been identified for further activities: 1. Common taxonomy of hardware and software failure modes of digital components for common use 2. Guidelines regarding level of detail in system analysis and screening of components, failure modes and dependencies 3. Approach for modelling of CCF between components (including software). (Author)

  17. Guidelines for reliability analysis of digital systems in PSA context. Phase 1 status report

    Energy Technology Data Exchange (ETDEWEB)

    Authen, S.; Larsson, J. (Risk Pilot AB, Stockholm (Sweden)); Bjoerkman, K.; Holmberg, J.-E. (VTT, Helsingfors (Finland))

    2010-12-15

    Digital protection and control systems are appearing as upgrades in older nuclear power plants (NPPs) and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital system upgrades on NPPs, quantitative reliability models are needed for digital systems. Due to the many unique attributes of these systems, challenges exist in systems analysis, modeling and data collection. Currently there is no consensus on reliability analysis approaches. Traditional methods have clear limitations, while more dynamic approaches are still at the trial stage and can be difficult to apply in full-scale probabilistic safety assessments (PSA). The number of PSAs worldwide that include reliability models of digital I and C systems is small. A comparison of Nordic experiences and a literature review of the main international references have been performed in this pre-study project. The study shows a wide range of approaches, and also indicates that no state of the art currently exists. The study shows areas where the different PSAs agree and gives the basis for development of a common taxonomy for reliability analysis of digital systems. It is still an open matter whether software reliability needs to be explicitly modelled in the PSA. The most important issue concerning software reliability is a proper description of the impact that software-based systems have on the dependence between the safety functions and the structure of accident sequences. In general, the conventional fault tree approach seems to be sufficient for modelling reactor-protection-system-type functions. The following focus areas have been identified for further activities: 1. Common taxonomy of hardware and software failure modes of digital components for common use 2. Guidelines regarding level of detail in system analysis and screening of components, failure modes and dependencies 3. Approach for modelling of CCF between components (including software). (Author)

  18. Resiliency as a component importance measure in network reliability

    International Nuclear Information System (INIS)

    Whitson, John C.; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This paper seeks to define the concept of resiliency as a component importance measure related to network reliability. Resiliency can be defined as a composite of: (1) the ability of a network to provide service despite external failures and (2) the time to restore service in the presence of such failures. Although resiliency has been extensively studied in different research areas, this paper studies the specific aspects of quantifiable network resiliency when the network is experiencing potential catastrophic failures from external events and/or influences, and when it is not known a priori which specific components within the network will fail. A formal definition for Category I resiliency is proposed and a step-by-step approach based on Monte Carlo simulation to calculate it is defined. To illustrate the approach, two-terminal networks with varying degrees of redundancy have been considered. The results obtained for the test networks show that this new quantifiable concept of resiliency provides insight into the performance and topology of the network. Future work could include methods for safeguarding critical network components and optimizing the use of redundancy as a technique to improve network resiliency.
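
    A hedged sketch of the Monte Carlo step for a small two-terminal network: components fail independently under an external event, and the fraction of trials in which the source still reaches the terminal estimates the service probability. The network topology and survival probabilities are illustrative assumptions, not values from the paper.

        import random
        from collections import deque

        # Undirected two-terminal test network; survival probabilities of the
        # interior components under the external event are assumed values.
        EDGES = [("s", "a"), ("s", "b"), ("a", "c"), ("b", "c"), ("c", "t"), ("b", "t")]
        SURVIVAL = {"a": 0.90, "b": 0.85, "c": 0.80}

        def provides_service(up_nodes):
            """Breadth-first search restricted to surviving nodes."""
            up = set(up_nodes) | {"s", "t"}          # terminals assumed reliable
            seen, queue = {"s"}, deque(["s"])
            while queue:
                n = queue.popleft()
                if n == "t":
                    return True
                for u, v in EDGES:
                    for a, b in ((u, v), (v, u)):
                        if a == n and b in up and b not in seen:
                            seen.add(b)
                            queue.append(b)
            return False

        def service_probability(n_trials=20000, seed=7):
            random.seed(seed)
            ok = 0
            for _ in range(n_trials):
                up = [n for n, p in SURVIVAL.items() if random.random() < p]
                ok += provides_service(up)
            return ok / n_trials

        print(f"P(service despite external failures) ~ {service_probability():.3f}")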

  19. Reliability, precision, and measurement in the context of data from ability tests, surveys, and assessments

    International Nuclear Information System (INIS)

    Fisher, W P Jr; Elbaum, B; Coulter, A

    2010-01-01

    Reliability coefficients indicate the proportion of total variance attributable to differences among measures separated along a quantitative continuum by a testing, survey, or assessment instrument. Reliability is usually considered to be influenced by both the internal consistency of a data set and the number of items, though textbooks and research papers rarely evaluate the extent to which these factors independently affect the data in question. Probabilistic formulations of the requirements for unidimensional measurement separate consistency from error by modelling individual response processes instead of group-level variation. The utility of this separation is illustrated via analyses of small sets of simulated data, and of subsets of data from a 78-item survey of over 2,500 parents of children with disabilities. Measurement reliability ultimately concerns the structural invariance specified in models requiring sufficient statistics, parameter separation, unidimensionality, and other qualities that historically have made quantification simple, practical, and convenient for end users. The paper concludes with suggestions for a research program aimed at focusing measurement research more on the calibration and wide dissemination of tools applicable to individuals, and less on the statistical study of inter-variable relations in large data sets.

  20. Manufacturing of reliable actively cooled fusion components - a challenge for non-destructive inspections

    International Nuclear Information System (INIS)

    Reheis, N.; Zabernig, A.; Ploechl, L.

    1994-01-01

    Actively cooled in-vessel components like divertors or limiters require high quality and reliability to ensure safe operation during long-term use. Such components are subjected to very severe thermal and mechanical cyclic loads and high power densities. Key requirements for the materials in question are, e.g., a high melting point, high thermal conductivity and a low atomic mass number. Since no single material can simultaneously meet all of these requirements, the selection of materials to be combined in composite components, as well as of manufacturing and non-destructive inspection (NDI) methods, is a particularly challenging task. Armour materials like graphite, intended to face the plasma and help to maintain its desired properties, are bonded to metallic substrates like copper, molybdenum or stainless steel providing cooling and mechanical support. Several techniques such as brazing and active metal casting have been developed and successfully applied for joining materials with different thermophysical properties, pursuing the objective of sufficient heat dissipation from the hot, plasma-facing surface to the coolant. NDI methods are an integral part of the manufacturing schedule of these components, starting in the design phase and ending in the final inspection. They apply to all divertor types (monobloc and flat-tile concepts). Particular focus is placed on the feasibility of detecting small flaws and defects in complex interfaces and on the limits of these techniques. Special test pieces with defined defects acting as standards were inspected. Accompanying metallographic investigations were carried out to compare actual defects with the results recorded during NDI.

  1. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    International Nuclear Information System (INIS)

    Dong, Feifei; Liu, Yong; Su, Han; Zou, Rui; Guo, Huaicheng

    2015-01-01

    Water quality management and load reduction are subject to inherent uncertainties in watershed systems and competing decision objectives. Therefore, optimal decision-making modeling in watershed load reduction suffers from the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions are feasible under uncertainties is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, demonstrating schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insight into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to
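
    The Pareto screening step behind such an approach can be illustrated with a short non-dominated filtering sketch over candidate load-reduction schemes; the candidate costs and reliability levels below are illustrative assumptions and do not come from the Lake Dianchi case.

        # Candidate schemes as (cost, reliability of meeting the load-reduction target);
        # cost is minimised, reliability is maximised.
        candidates = [
            (120.0, 0.62), (150.0, 0.71), (155.0, 0.69), (180.0, 0.80),
            (205.0, 0.86), (210.0, 0.84), (260.0, 0.90), (300.0, 0.91),
        ]

        def dominates(a, b):
            """a dominates b if it is no worse in both objectives and strictly better in one."""
            return a[0] <= b[0] and a[1] >= b[1] and (a[0] < b[0] or a[1] > b[1])

        pareto = [c for c in candidates
                  if not any(dominates(other, c) for other in candidates if other != c)]

        for cost, reliability in sorted(pareto):
            print(f"cost {cost:6.1f}   reliability {reliability:.2f}")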

  2. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Feifei [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Liu, Yong, E-mail: yongliu@pku.edu.cn [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Institute of Water Sciences, Peking University, Beijing 100871 (China); Su, Han [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Zou, Rui [Tetra Tech, Inc., 10306 Eaton Place, Ste 340, Fairfax, VA 22030 (United States); Yunnan Key Laboratory of Pollution Process and Management of Plateau Lake-Watershed, Kunming 650034 (China); Guo, Huaicheng [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China)

    2015-05-15

    Water quality management and load reduction are subject to inherent uncertainties in watershed systems and competing decision objectives. Therefore, optimal decision-making modeling in watershed load reduction suffers from the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions are feasible under uncertainties is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, demonstrating schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insight into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to

  3. 76 FR 73608 - Reliability Technical Conference, North American Electric Reliability Corporation, Public Service...

    Science.gov (United States)

    2011-11-29

    ... or municipal authority play in forming your bulk power system reliability plans? b. Do you support..., North American Electric Reliability Corporation (NERC) Nick Akins, CEO of American Electric Power (AEP..., EL11-62-000] Reliability Technical Conference, North American Electric Reliability Corporation, Public...

  4. Assessing sufficient capability: A new approach to economic evaluation.

    Science.gov (United States)

    Mitchell, Paul Mark; Roberts, Tracy E; Barton, Pelham M; Coast, Joanna

    2015-08-01

    Amartya Sen's capability approach has been discussed widely in the health economics discipline. Although measures have been developed to assess capability in economic evaluation, there has been much less attention paid to the decision rules that might be applied alongside. Here, new methods, drawing on the multidimensional poverty and health economics literature, are developed for conducting economic evaluation within the capability approach and focusing on an objective of achieving "sufficient capability". This objective more closely reflects the concern with equity that pervades the capability approach and the method has the advantage of retaining the longitudinal aspect of estimating outcome that is associated with quality-adjusted life years (QALYs), whilst also drawing on notions of shortfall associated with assessments of poverty. Economic evaluation from this perspective is illustrated in an osteoarthritis patient group undergoing joint replacement, with capability wellbeing assessed using ICECAP-O. Recommendations for taking the sufficient capability approach forward are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Progress in evaluation and improvement in nondestructive examination reliability for inservice inspection of Light Water Reactors (LWRs) and characterize fabrication flaws in reactor pressure vessels

    International Nuclear Information System (INIS)

    Doctor, S.R.; Bowey, R.E.; Good, M.S.; Friley, J.R.; Kurtz, R.J.; Simonen, F.A.; Taylor, T.T.; Heasler, P.G.; Andersen, E.S.; Diaz, A.A.; Greenwood, M.S.; Hockey, R.L.; Schuster, G.J.; Spanner, J.C.; Vo, T.V.

    1991-10-01

    This paper is a review of the work conducted under two programs. One (NDE Reliability Program) is a multi-year program addressing the reliability of nondestructive evaluation (NDE) for the inservice inspection (ISI) of light water reactor components. This program examines the reliability of current NDE, the effectiveness of evolving technologies, and provides assessments and recommendations to ensure that the NDE is applied at the right time, in the right place with sufficient effectiveness that defects of importance to structural integrity will be reliably detected and accurately characterized. The second program (Characterizing Fabrication Flaws in Reactor Pressure Vessels) is assembling a data base to quantify the distribution of fabrication flaws that exist in US nuclear reactor pressure vessels with respect to density, size, type, and location. These programs will be discussed as two separate sections in this report. 4 refs., 7 figs

  6. Endoscopic clipping for gastrointestinal tumors. A method to define the target volume more precisely

    International Nuclear Information System (INIS)

    Riepl, M.; Klautke, G.; Fehr, R.; Fietkau, R.; Pietsch, A.

    2000-01-01

    Background: In many cases it is not possible to define exactly the extension of carcinomas of the gastrointestinal tract with the help of computed tomography scans made for 3-D radiation treatment planning. Consequently, the planning of external beam radiotherapy is made more difficult for the gross tumor volume as well as, in some cases, for the clinical target volume. Patients and Methods: Eleven patients with macroscopic tumors (rectal cancer n = 5, cardiac cancer n = 6) were included. Just before 3-D planning, the oral and aboral borders of the tumor were marked endoscopically with hemoclips. Subsequently, CT scans for radiotherapy planning were made and the clinical target volume was defined. Five to 6 weeks thereafter, new CT scans were done to define the gross tumor volume for boost planning. Two investigators independently assessed the influence of the hemoclips on the different planning volumes, and whether the number of clips was sufficient to define the gross tumor volume. Results: In all patients, the implantation of the clips was done without complications. The start of radiotherapy was not delayed. With the help of the clips it was possible to define exactly the position and the extension of the primary tumor. The clinical target volume was modified according to the position of the clips in 5/11 patients; the gross tumor volume was modified in 7/11 patients. The use of the clips made the documentation and verification of the treatment portals with the simulator easier. Moreover, the clips helped the surgeon to define the primary tumor region following marked regression after neoadjuvant therapy in 3 patients. Conclusions: Endoscopic clipping of gastrointestinal tumors helps to define the tumor volumes more precisely in radiation therapy. The clips are easily recognized on the portal films and thus contribute to quality control. (orig.) [de]

  7. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Since tie breakers are installed in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis with the fault tree methodology was also performed. The results showed that the setup of tie breakers was rational and necessary, and that, compared with the fault tree methodology, the GO methodology made the modeling much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)

  8. Interrater reliability of schizoaffective disorder compared with schizophrenia, bipolar disorder, and unipolar depression - A systematic review and meta-analysis.

    Science.gov (United States)

    Santelmann, Hanno; Franklin, Jeremy; Bußhoff, Jana; Baethge, Christopher

    2016-10-01

    Schizoaffective disorder is a common diagnosis in clinical practice but its nosological status has been subject to debate ever since it was conceptualized. Although it is key that diagnostic reliability is sufficient, schizoaffective disorder has been reported to have low interrater reliability. Evidence based on systematic review and meta-analysis methods, however, is lacking. Using a highly sensitive literature search in Medline, Embase, and PsycInfo we identified studies measuring the interrater reliability of schizoaffective disorder in comparison to schizophrenia, bipolar disorder, and unipolar depression. Out of 4126 records screened we included 25 studies reporting on 7912 patients diagnosed by different raters. The interrater reliability of schizoaffective disorder was moderate (meta-analytic estimate of Cohen's kappa 0.57 [95% CI: 0.41-0.73]), and substantially lower than that of its main differential diagnoses (difference in kappa between 0.22 and 0.19). Although there was considerable heterogeneity, analyses revealed that the interrater reliability of schizoaffective disorder was consistently lower in the overwhelming majority of studies. The results remained robust in subgroup and sensitivity analyses (e.g., by diagnostic manual used) as well as in meta-regressions (e.g., by publication year) and analyses of publication bias. Clinically, the results highlight the particular importance of diagnostic re-evaluation in patients diagnosed with schizoaffective disorder. They also quantify a widely held clinical impression of lower interrater reliability and agree with an earlier meta-analysis reporting low test-retest reliability. Copyright © 2016. Published by Elsevier B.V.

  9. Reliability of intra-oral quantitative sensory testing (QST) in patients with atypical odontalgia and healthy controls - a multicentre study.

    Science.gov (United States)

    Baad-Hansen, L; Pigg, M; Yang, G; List, T; Svensson, P; Drangsholt, M

    2015-02-01

    The reliability of a comprehensive intra-oral quantitative sensory testing (QST) protocol has not been examined systematically in patients with chronic oro-facial pain. The aim of the present multicentre study was to examine the test-retest and interexaminer reliability of intra-oral QST measures in terms of absolute values and z-scores as well as within-session coefficients of variation (CV) in patients with atypical odontalgia (AO) and healthy pain-free controls. Forty-five patients with AO and 68 healthy controls were subjected to bilateral intra-oral gingival QST and unilateral extratrigeminal QST (thenar) on three occasions (twice on 1 day by two different examiners and once approximately 1 week later by one of the examiners). Intra-class correlation coefficients and kappa values for interexaminer and test-retest reliability were computed. Most of the standardised intra-oral QST measures showed fair to excellent interexaminer (9-12 of 13 measures) and test-retest (7-11 of 13 measures) reliability. Furthermore, no robust differences in reliability measures or within-session variability (CV) were detected between patients with AO and the healthy reference group. These reliability results in chronic oro-facial pain patients support earlier suggestions, based on data from healthy subjects, that intra-oral QST is sufficiently reliable for use as part of a comprehensive evaluation of patients with somatosensory disturbances or neuropathic pain in the trigeminal region. © 2014 John Wiley & Sons Ltd.

  10. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.

  11. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  12. Reliability enhancement through optimal burn-in

    Science.gov (United States)

    Kuo, W.

    1984-06-01

    A numerical reliability and cost model is defined for production-line burn-in tests of electronic components. The necessity of burn-in is governed by upper and lower bounds: burn-in is mandatory for operation-critical or non-repairable components; no burn-in is needed when failure effects are insignificant or easily repairable. The model considers electronic systems as a series of components connected by a single black box. The infant mortality rate is described with a Weibull distribution. Performance reaches a steady state after burn-in, and the cost of burn-in is a linear function for each component. A minimum total cost is calculated over the costs and durations of burn-in, shop repair, and field repair, with attention given to possible losses in future sales from inadequate burn-in testing.
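
    A hedged sketch in the spirit of this model (parameter values and the cost structure are assumptions, not the paper's): infant mortality is described by a Weibull distribution with shape parameter below one, burn-in cost grows linearly with burn-in time, and the expected total cost per unit trades shop repairs caught during burn-in against field repairs afterwards.

        import numpy as np

        # Illustrative parameters (assumptions, not from the paper).
        beta, eta = 0.5, 2.0e4       # Weibull shape (< 1: infant mortality) and scale, hours
        c_burn    = 0.10             # burn-in cost per unit-hour
        c_shop    = 15.0             # cost of a failure caught during burn-in (shop repair)
        c_field   = 400.0            # cost of a failure in the field
        t_field   = 8760.0           # field exposure considered, hours

        def F(t):
            """Weibull cumulative probability of failure by time t."""
            return 1.0 - np.exp(-(t / eta) ** beta)

        def expected_cost(t_burn):
            survivors = 1.0 - F(t_burn)
            p_field = (F(t_burn + t_field) - F(t_burn)) / survivors
            return c_burn * t_burn + c_shop * F(t_burn) + c_field * p_field

        t_grid = np.linspace(0.0, 500.0, 2001)
        costs = np.array([expected_cost(t) for t in t_grid])
        print(f"no burn-in: expected cost {expected_cost(0.0):.1f} per unit")
        print(f"optimal burn-in ~ {t_grid[costs.argmin()]:.0f} h, "
              f"expected cost {costs.min():.1f} per unit")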

  13. Human reliability program: Components and effects

    International Nuclear Information System (INIS)

    Baley-Downes, S.

    1986-01-01

    The term ''Human Reliability Program'' (HRP) is defined as a series of selective controls which are implemented and integrated to identify the ''insider threat'' from current and prospective employees who are dishonest, disloyal or unreliable. The HRP, although not a prediction of human behaviour, is an excellent tool for decision making; it should complement security and improve employee quality. The HRP consists of several component applications such as management evaluation; appropriate background investigative requirements; occupational health examination and laboratory testing; drug/alcohol screening; psychological testing and interviews; polygraph examination; job-related aberrant behaviour recognition; ongoing education and training; document control; drug/alcohol rehabilitation; periodic HRP audit; and implementation of an onsite central clearing house. The components and effects of the HRP are discussed in further detail in this paper.

  14. Diakoptical reliability analysis of transistorized systems

    International Nuclear Information System (INIS)

    Kontoleon, J.M.; Lynn, J.W.; Green, A.E.

    1975-01-01

    Limitations on both high-speed core availability and the computation time required for assessing the reliability of large and complex electronic systems, such as those used for the protection of nuclear reactors, are very serious restrictions which continuously confront the reliability analyst. Diakoptic methods simplify the solution of the electrical-network problem by subdividing a given network into a number of independent subnetworks and then interconnecting the solutions of these smaller parts by a systematic process involving transformations based on connection-matrix elements associated with the interconnecting links. However, the interconnection process is very complicated and may be used only if the original system has been cut in such a manner that a relation can be established between the constraints appearing at both sides of the cut. Also, in dealing with transistorized systems, one of the difficulties encountered is that of adequately modelling their performance under various operating conditions, since their parameters are strongly affected by the imposed voltage and current levels. In this paper a new interconnection approach is presented which may be of use in the reliability analysis of large transistorized systems. It is based on the partial optimization of the subdivisions of the torn network as well as on the optimization of the torn paths. The solution of the subdivisions is based on the principles of algebraic topology, with an algebraic structure relating the physical variables in a topological structure which defines the interconnection of the discrete elements. Transistors, and other nonlinear devices, are modelled using their actual characteristics under normal and abnormal operating conditions. Use of so-called k factors is made to facilitate accounting for electrical stresses. The approach is demonstrated by way of an example. (author)

  15. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques have been developed in response to the needs of the various engineering disciplines; nevertheless, many would argue that considerable work on reliability was done before the word itself came to be used in its current sense. The military, space and nuclear industries were the first to become involved in this field, yet this quiet revolution in improving the reliability of products has not remained confined to those environments but has spread to industry as a whole. Mass production, characteristic of modern industry, led four decades ago to a fall in the reliability of its products, on the one hand because of the scale of production itself and, on the other, because of newly introduced and not yet stabilized industrial techniques. Industry had to change to meet these two new requirements, creating products of medium complexity and ensuring a level of reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. From this standpoint, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, offering a unifying scientific basis for the entire subject. It consists of eight chapters plus extensive statistical tables and an annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience, as a textbook for courses ranging from academic to industrial. (author)

  16. IEEE standard requirements for reliability analysis in the design and operation of safety systems for nuclear power generating stations

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The purpose of this standard is to provide uniform, minimum acceptable requirements for the performance of reliability analyses for safety-related systems found in nuclear-power generating stations, but not to define the need for an analysis. The need for reliability analysis has been identified in other standards which expand the requirements of regulations (e.g., IEEE Std 379-1972 (ANSI N41.2-1972), ''Guide for the Application of the Single-Failure Criterion to Nuclear Power Generating Station Protection System,'' which describes the application of the single-failure criterion). IEEE Std 352-1975, ''Guide for General Principles of Reliability Analysis of Nuclear Power Generating Station Protection Systems,'' provides guidance in the application and use of reliability techniques referred to in this standard

  17. Software-defined network abstractions and configuration interfaces for building programmable quantum networks

    Energy Technology Data Exchange (ETDEWEB)

    Dasari, Venkat [U.S. Army Research Laboratory, Aberdeen Proving Ground, MD; Sadlier, Ronald J [ORNL; Geerhart, Mr. Billy [U.S. Army Research Laboratory, Aberdeen Proving Ground, MD; Snow, Nikolai [U.S. Army Research Laboratory, Aberdeen Proving Ground, MD; Williams, Brian P [ORNL; Humble, Travis S [ORNL

    2017-01-01

    Well-defined and stable quantum networks are essential to realize functional quantum applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. We develop new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.

  18. From reliability to maintenance of emergency generator sets in nuclear plants

    International Nuclear Information System (INIS)

    Reyraud, Y.

    1986-01-01

    The particular operating conditions of emergency generator sets in a nuclear power plant lead to a strategy of supervision and maintenance very different from that recommended for production generator sets. Mechanical and thermal stresses are affected by the size of the set and the choice of rotational speed with respect to the required power and the response-time requirements for the safety of the nuclear reactor. Reliability studies are helpful in defining the strategy of surveillance tests. The importance of the number of starts with respect to the running time requires the introduction of the notion of equivalent hours for the definition of maintenance periods. The safety of the equipment and the upholding of reliability at a value close to the optimum impose rigorous choices and strict conditions of supervision and maintenance. [fr]

  19. Intra-tester Reliability and Construct Validity of a Hip Abductor Eccentric Strength Test.

    Science.gov (United States)

    Brindle, Richard A; Ebaugh, D David; Milner, Clare E

    2017-11-15

    Side-lying hip abductor strength tests are commonly used to evaluate muscle strength. In a 'break' test the tester applies sufficient force to lower the limb to the table while the patient resists. The peak force is postulated to occur while the leg is lowering, thus representing the participant's eccentric muscle strength. However, it is unclear whether peak force occurs before or after the leg begins to lower. To determine intra-rater reliability and construct validity of a hip abductor eccentric strength test. Intra-rater reliability and construct validity study. Twenty healthy adults (26 ±6 years; 1.66 ±0.06 m; 62.2 ±8.0 kg) made two visits to the laboratory at least one week apart. During the hip abductor eccentric strength test, a hand-held dynamometer recorded peak force and time to peak force and limb position was recorded via a motion capture system. Intra-rater reliability was determined using intra-class correlation (ICC), standard error of measurement (SEM), and minimal detectable difference (MDD). Construct validity was assessed by determining if peak force occurred after the start of the lowering phase using a one-sample t-test. The hip abductor eccentric strength test had substantial intra-rater reliability (ICC(3,3) = 0.88; 95% confidence interval: 0.65-0.95), SEM of 0.9 %BWh, and a MDD of 2.5 %BWh. Construct validity was established as peak force occurred 2.1 s (±0.6 s; range 0.7 s to 3.7 s) after the start of the lowering phase of the test (p ≤ 0.001). The hip abductor eccentric strength test is a valid and reliable measure of eccentric muscle strength. This test may be used clinically to assess changes in eccentric muscle strength over time.
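
    For readers unfamiliar with the ICC statistics reported above, the sketch below computes two-way mixed, consistency ICCs from a ratings matrix using the standard Shrout–Fleiss mean-square formulas; the participant scores are hypothetical and the implementation is illustrative rather than the authors' analysis.

        import numpy as np

        def icc3(X):
            """Two-way mixed, consistency ICCs (Shrout & Fleiss ICC(3,1) and ICC(3,k)).
            X: n_subjects x k_raters array of scores."""
            X = np.asarray(X, dtype=float)
            n, k = X.shape
            grand = X.mean()
            ss_rows = k * ((X.mean(axis=1) - grand) ** 2).sum()
            ss_cols = n * ((X.mean(axis=0) - grand) ** 2).sum()
            ss_err = ((X - grand) ** 2).sum() - ss_rows - ss_cols
            bms = ss_rows / (n - 1)                       # between-subjects mean square
            ems = ss_err / ((n - 1) * (k - 1))            # residual mean square
            icc_single = (bms - ems) / (bms + (k - 1) * ems)
            icc_average = (bms - ems) / bms
            return icc_single, icc_average

        # Hypothetical strength scores (%BWh) for 6 participants measured on 3 occasions.
        scores = [
            [10.2, 10.8, 10.5],
            [12.1, 11.9, 12.4],
            [ 9.4,  9.9,  9.6],
            [14.0, 13.5, 13.8],
            [11.2, 11.6, 11.0],
            [13.1, 13.4, 13.6],
        ]
        single, average = icc3(scores)
        print(f"ICC(3,1) = {single:.2f}, ICC(3,k) = {average:.2f}")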

  20. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    International Nuclear Information System (INIS)

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method. - Highlights: • An indirect probability model (IPM) is proposed for mechanism reliability evaluation. • The dynamic equation, degradation function and Kaplan–Meier estimator are used. • Then the simplified form of indirect probability model is proposed. • The experimental results agree well with the predicted results.
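
    The Kaplan–Meier step used inside the IPM can be sketched on its own with hypothetical, partly censored failure data from a mechanism test; the cycle counts and censoring pattern below are assumptions for illustration only.

        import numpy as np

        def kaplan_meier(times, observed):
            """Kaplan-Meier survival estimate.
            times: event or censoring times; observed: 1 = failure observed, 0 = censored."""
            order = np.argsort(times)
            times, observed = np.asarray(times)[order], np.asarray(observed)[order]
            n_at_risk = len(times)
            survival, points = 1.0, []
            for t in np.unique(times):
                at_this_time = times == t
                failures = int(observed[at_this_time].sum())
                if failures:
                    survival *= 1.0 - failures / n_at_risk
                    points.append((float(t), survival))
                n_at_risk -= int(at_this_time.sum())
            return points

        # Hypothetical cycles to loss of output accuracy for a crank-slider test rig;
        # 0 marks runs stopped before the accuracy threshold was crossed (censored).
        cycles   = [1.2e5, 1.5e5, 1.5e5, 2.1e5, 2.4e5, 2.4e5, 3.0e5, 3.3e5]
        observed = [1,     1,     0,     1,     0,     1,     1,     0    ]

        for t, s in kaplan_meier(cycles, observed):
            print(f"after {t:.1e} cycles: estimated reliability {s:.3f}")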

  1. A reliable, compact and low-cost Michelson wavemeter for laser wavelength measurement

    International Nuclear Information System (INIS)

    Fox, P.J.; Scholten, R.E.; Walkiewicz, M.R.; Drullinger, R.E.

    1998-01-01

    We describe the construction and operation of a simple, compact and cost-effective Michelson wavemeter with picometer accuracy. The low cost of the device means that it can form the basis of an undergraduate laboratory experiment, yet it is sufficiently reliable and accurate that it has become an important tool in our research laboratory, where it is regularly used to tune lasers to atomic transitions. The usefulness and accuracy of the wavemeter are demonstrated by tuning two separate extended-cavity diode lasers to achieve two-step excitation of the Rb 5²D state, observed by detecting 420 nm blue fluorescence from the 5²D → 6²P → 5²S decay path. (authors)

  2. GATA4 Is Sufficient to Establish Jejunal Versus Ileal Identity in the Small Intestine

    Directory of Open Access Journals (Sweden)

    Cayla A. Thompson

    2017-05-01

    Full Text Available Background & Aims: Patterning of the small intestinal epithelium along its cephalocaudal axis establishes three functionally distinct regions: duodenum, jejunum, and ileum. Efficient nutrient assimilation and growth depend on the proper spatial patterning of specialized digestive and absorptive functions performed by duodenal, jejunal, and ileal enterocytes. When enterocyte function is disrupted by disease or injury, intestinal failure can occur. One approach to alleviate intestinal failure would be to restore lost enterocyte functions. The molecular mechanisms determining regionally defined enterocyte functions, however, are poorly delineated. We previously showed that GATA binding protein 4 (GATA4 is essential to define jejunal enterocytes. The goal of this study was to test the hypothesis that GATA4 is sufficient to confer jejunal identity within the intestinal epithelium. Methods: To test this hypothesis, we generated a novel Gata4 conditional knock-in mouse line and expressed GATA4 in the ileum, where it is absent. Results: We found that GATA4-expressing ileum lost ileal identity. The global gene expression profile of GATA4-expressing ileal epithelium aligned more closely with jejunum and duodenum rather than ileum. Focusing on jejunal vs ileal identity, we defined sets of jejunal and ileal genes likely to be regulated directly by GATA4 to suppress ileal identity and promote jejunal identity. Furthermore, our study implicates GATA4 as a transcriptional repressor of fibroblast growth factor 15 (Fgf15, which encodes an enterokine that has been implicated in an increasing number of human diseases. Conclusions: Overall, this study refines our understanding of an important GATA4-dependent molecular mechanism to pattern the intestinal epithelium along its cephalocaudal axis by elaborating on GATA4’s function as a crucial dominant molecular determinant of jejunal enterocyte identity. Microarray data from this study have been deposited into

  3. Reliability of a Computerized Neurocognitive Test in Baseline Concussion Testing of High School Athletes.

    Science.gov (United States)

    MacDonald, James; Duerson, Drew

    2015-07-01

    of asymptomatic individuals before the start of a sporting season. This study adds to the evidence that suggests in this population such testing may lack sufficient reliability to support clinical decision making.

  4. Assessing the contribution of microgrids to the reliability of distribution networks

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Paulo Moises [Escola Superior Tecnologia Viseu, Instituto Politecnico Viseu, Campus Politecnico Repeses, 3504-510 Viseu (Portugal); Matos, Manuel A. [INESC Porto, Faculdade de Engenharia da Universidade do Porto, Porto (Portugal)

    2009-02-15

    The emergence of microgeneration has recently led to the concept of the microgrid, a network of LV consumers and producers able to export electric energy in some circumstances and also to work in an isolated way in emergency situations. Research on the organization of microgrids, control devices, functionalities and other technical aspects is presently being carried out, in order to establish a consistent technical framework to support the concept. The successful development of the microgrid concept implies the definition of a suitable regulation for its integration in distribution systems. In order to define such a regulation, the identification of the costs and benefits that microgrids may bring is a crucial task. Indeed, this is the basis for a discussion about the way global costs could be divided among the different agents that benefit from the development of microgrids. Among other aspects, the effect of microgrids on the reliability of the distribution network has been pointed out as an important advantage, due to the ability of isolated operation in emergency situations. This paper identifies the situations where the existence of a microgrid may reduce the interruption rate and duration and thus improve the reliability indices of the distribution network. The relevant expressions necessary to quantify the reliability are presented. An illustrative example is included, where the overall influence of the microgrid on reliability is discussed. (author)
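
    A minimal sketch of how islanding capability could enter load-point reliability indices, under assumed failure rates, repair times, switching time and islanding success probability (none of these values come from the paper):

        # All rates are failures/year, durations are hours.
        lam_upstream, r_upstream = 0.8, 4.0    # faults on the upstream MV network
        lam_local,    r_local    = 0.2, 2.0    # faults inside the LV microgrid area
        t_switch                 = 0.25        # time to transfer to islanded operation
        p_island                 = 0.9         # probability the island forms successfully

        def load_point_indices(islanding):
            """Return (interruptions/year, unavailability in h/year) for one LV load point."""
            frequency = lam_upstream + lam_local
            if islanding:
                # Successful islanding turns an upstream repair into a short switching event.
                dur_upstream = p_island * t_switch + (1 - p_island) * r_upstream
            else:
                dur_upstream = r_upstream
            unavailability = lam_upstream * dur_upstream + lam_local * r_local
            return frequency, unavailability

        for label, flag in (("without islanding", False), ("with islanding", True)):
            f, u = load_point_indices(flag)
            print(f"{label}: {f:.2f} interruptions/year, {u:.2f} h/year interrupted")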

  5. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    Science.gov (United States)

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, including both cause and effect events, are represented in the framework as nodes using a Bayesian network analysis approach, which transforms the risk analysis results from failure mode and effects analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes that most strongly influence system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a mitigation plan can be determined accordingly to reduce their influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
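
    As a toy illustration of the quantitative step (not the Shengmai plant model), the sketch below enumerates a three-node cause network exactly, computes the overall quality-failure risk, and ranks nodes by the risk obtained when each node is forced to fail. The network structure and probabilities are assumptions.

        from itertools import product

        # Root-cause failure probabilities (per batch) -- illustrative only.
        P_ROOT = {"raw_material": 0.02, "sterilization": 0.01, "filling": 0.03}

        def quality_failure(states):
            # Contamination escapes only if both the sterilization step and the
            # filling-line control fail; a bad raw-material batch fails directly.
            contamination = states["sterilization"] and states["filling"]
            return states["raw_material"] or contamination

        def system_risk(forced=None):
            """Exact failure probability by enumeration; 'forced' pins nodes to states."""
            forced = forced or {}
            free = [n for n in P_ROOT if n not in forced]
            risk = 0.0
            for values in product([False, True], repeat=len(free)):
                states, prob = dict(forced), 1.0
                for name, value in zip(free, values):
                    states[name] = value
                    prob *= P_ROOT[name] if value else 1.0 - P_ROOT[name]
                if quality_failure(states):
                    risk += prob
            return risk

        baseline = system_risk()
        print(f"baseline quality-failure risk: {baseline:.4f}")
        for node in P_ROOT:
            # Risk-achievement-style importance: system risk with the node forced to fail.
            print(f"{node:13s} risk if failed: {system_risk({node: True}):.4f}")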

  6. Estimating the Optimal Capacity for Reservoir Dam based on Reliability Level for Meeting Demands

    Directory of Open Access Journals (Sweden)

    Mehrdad Taghian

    2017-02-01

    Full Text Available Introduction: One of the practical and classic problems in water resource studies is the estimation of the optimal reservoir capacity to satisfy demands. However, fully supplying demands over the whole period requires a very large dam capable of meeting demands during severe drought conditions. That means a major part of the reservoir capacity and costs is only usable for a short period of the reservoir lifetime, which would be unjustified in an economic analysis. Thus, in the proposed method and model, demand is fully met only for a percentage of the time in the statistical period, in accordance with a reliability constraint. In the usual methods, although this concept apparently seems simple, binary variables for meeting or not meeting demands must be added to the linear programming model structure. Thus, with many binary variables, solving the problem becomes time consuming and difficult. Another way to solve the problem is the application of the yield model. This model involves some simpler assumptions, but it is difficult to consider the details of the water resource system with it. The application of evolutionary algorithms to problems with many constraints is also very complicated. Therefore, this study pursues another solution. Materials and Methods: In this study, to develop and improve the usual methods, instead of mixed-integer linear programming (MILP) and the above methods, a simulation model including network-flow linear programming is used, coupled with interface code in Matlab that computes the reliability from the output file of the simulation model. The Acre Reservoir Simulation Program (ARSP) has been utilized as the simulation model. A major advantage of the ARSP is its inherent flexibility in defining the operating policies through a penalty structure specified by the user. The ARSP utilizes network flow optimization techniques to handle a subset of general linear programming (LP) problems for individual time intervals
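
    A hedged sketch of the simulation-plus-reliability-accounting idea, written in Python rather than the ARSP/Matlab tool chain used in the study: a monthly reservoir mass balance is simulated for candidate capacities and the smallest capacity meeting a time-based reliability target is reported. The inflow series, demand and target are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        months = 40 * 12
        # Hypothetical monthly inflows (million m3) with seasonality plus noise.
        inflow = np.clip(10 + 6 * np.sin(2 * np.pi * np.arange(months) / 12)
                         + rng.normal(0, 3, months), 0, None)
        demand = 9.5                      # monthly demand, million m3

        def time_reliability(capacity):
            """Fraction of months in which demand is fully met for a given capacity."""
            storage, met = capacity / 2.0, 0
            for q in inflow:
                water = storage + q
                release = min(demand, water)
                met += release >= demand
                storage = min(capacity, water - release)    # spill above capacity
            return met / months

        target = 0.90
        for capacity in np.arange(5.0, 120.0, 5.0):
            reliability = time_reliability(capacity)
            if reliability >= target:
                print(f"~{capacity:.0f} million m3 meets the {target:.0%} reliability "
                      f"target (achieved {reliability:.2f})")
                break
        else:
            print("no capacity up to 120 million m3 meets the target")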

  7. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; reliability and failure-rate functions; life distributions and reliability assumptions; the reliability of non-repairable systems; the reliability of repairable systems; reliability sampling tests; failure analysis, such as analysis by FMEA and FTA, with case studies; accelerated life testing, including basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies concerning replacement and inspection.

  8. Reliability in automotive and mechanical engineering determination of component and system reliability

    CERN Document Server

    Bertsche, Bernd

    2008-01-01

    In the present climate of global competition in every branch of engineering and manufacturing, extensive customer surveys have shown that, above every other attribute, reliability stands as the most desired feature in a finished product. In this relentless fight for survival, any organisation that neglects the pursuit of excellence in reliability will do so at a serious cost. Reliability in Automotive and Mechanical Engineering draws together a wide spectrum of diverse and relevant applications and analyses on reliability engineering, distilled into an attractive and well-documented volume. Practising engineers are challenged with the formidable task of simultaneously improving reliability and reducing the costs and downtime due to maintenance. The volume brings together eleven chapters to highlight the importance of the interrelated reliability and maintenance disciplines. They represent the development trends and progress resulting in making this book ess...

  9. Note: An online testing method for lifetime projection of high power light-emitting diode under accelerated reliability test.

    Science.gov (United States)

    Chen, Qi; Chen, Quan; Luo, Xiaobing

    2014-09-01

    In recent years, due to the fast development of high-power light-emitting diodes (LEDs), their lifetime prediction and assessment have become a crucial issue. Although in situ measurement has been widely used for reliability testing in the laser diode community, it has not been applied commonly in the LED community. In this paper, an online testing method for LED lifetime projection under accelerated reliability testing was proposed and a prototype was built. Optical parametric data were collected. The systematic error and the measuring uncertainty were calculated to be within 0.2% and within 2%, respectively. With this online testing method, experimental data can be acquired continuously and a sufficient amount of data can be gathered. Thus, the projection fitting accuracy can be improved (r² = 0.954) and the testing duration can be shortened.
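
    A minimal sketch of the projection step such an online rig enables (not the authors' procedure): continuously logged lumen-maintenance data are fitted with an exponential decay on a log scale and extrapolated to the L70 lifetime. The sampling interval, decay rate and noise level are assumptions.

        import numpy as np

        # Hypothetical normalised luminous-flux readings logged every 50 h by the rig.
        hours = np.arange(0.0, 1001.0, 50.0)
        rng = np.random.default_rng(3)
        flux = np.exp(-2.0e-4 * hours) * (1 + rng.normal(0, 0.002, hours.size))

        # Fit ln(flux) = ln(B) - alpha * t (TM-21-style exponential lumen decay).
        slope, intercept = np.polyfit(hours, np.log(flux), 1)
        alpha, B = -slope, np.exp(intercept)

        # Project the L70 lifetime: time for the flux to fall to 70 % of its initial value.
        l70 = np.log(B / 0.70) / alpha
        residuals = np.log(flux) - (intercept + slope * hours)
        r2 = 1.0 - residuals.var() / np.log(flux).var()
        print(f"alpha = {alpha:.2e} per hour, projected L70 ~ {l70:.0f} h, r^2 = {r2:.3f}")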

  10. Refueling availability for alternative fuel vehicle markets: Sufficient urban station coverage

    International Nuclear Information System (INIS)

    Melaina, Marc; Bremson, Joel

    2008-01-01

    Alternative fuel vehicles can play an important role in addressing the challenges of climate change, energy security, urban air pollution and the continued growth in demand for transportation services. The successful commercialization of alternative fuels for vehicles is contingent upon a number of factors, including vehicle cost and performance. Among fuel infrastructure issues, adequate refueling availability is one of the most fundamental to successful commercialization. A commonly cited source reports 164,300 refueling stations in operation nationwide. However, from the perspective of refueling availability, this nationwide count tends to overstate the number of stations required to support the widespread deployment of alternative fuel vehicles. In terms of spatial distribution, the existing gasoline station networks in many urban areas are more than sufficient. We characterize a sufficient level of urban coverage based upon a subset of cities served by relatively low-density station networks, and estimate that some 51,000 urban stations would be required to provide this sufficient level of coverage to all major urban areas, 33 percent less than our estimate of total urban stations. This improved characterization will be useful for engineering, economic and policy analyses. (author)

  11. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Figures: inverters connected in a chain; frequency versus square root of ...] ... developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and ... or FIT of the device. In other words, an accurate estimate of the device lifetime was found and thus the reliability that can be conveniently ...

  12. Defining the actinic keratosis field: a literature review and discussion.

    Science.gov (United States)

    Figueras Nart, I; Cerio, R; Dirschka, T; Dréno, B; Lear, J T; Pellacani, G; Peris, K; Ruiz de Casas, A

    2018-04-01

    Despite the chronic and increasingly prevalent nature of actinic keratosis (AK) and existing evidence supporting assessment of the entire cancerization field during clinical management, a standardized definition of the AK field to aid in the understanding and characterization of the disease is lacking. The objective of this review was to present and appraise the available evidence describing the AK cancerization field, with the aim of determining a precise definition of the AK field in terms of its molecular (including genetic and immunological), histological and clinical characteristics. Eight European dermatologists collaborated to conduct a review and expert appraisal of articles detailing the characteristics of the AK field. Articles published in English before August 2016 were identified using PubMed and independently selected for further assessment according to predefined preliminary inclusion and exclusion criteria. In addition, a retrospective audit of patients with AK was performed to define the AK field in clinical terms. A total of 32 review articles and 47 original research articles provided evidence of sun-induced molecular (including genetic and immunological) and histological skin changes in the sun-exposed area affected by AK. However, the available literature was deemed insufficient to inform a clinical definition of the AK field. During the retrospective audit, visible signs of sun damage in 40 patients with AK were assessed. Telangiectasia, atrophy and pigmentation disorders emerged as 'reliable or very reliable' indicators of AK field based on expert opinion, whereas 'sand paper' was deemed a 'moderately reliable' indicator. This literature review has revealed a significant gap of evidence to inform a clinical definition of the AK field. Therefore, the authors instead propose a clinical definition of field cancerization based on the identification of visible signs of sun damage that are reliable indicators of field cancerization based on expert

  13. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach.

  14. Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.

    Science.gov (United States)

    Olson, Andrew P J; Graber, Mark L; Singh, Hardeep

    2018-01-29

    Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested but none widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.

  15. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  16. Node-pair reliability of network systems with small distances between adjacent nodes

    International Nuclear Information System (INIS)

    Malinowski, Jacek

    2007-01-01

    A new method for computing the node-pair reliability of network systems modeled by random graphs with nodes arranged in sequence is presented. It is based on a recursive algorithm using the 'sliding window' technique, the window being composed of several consecutive nodes. In a single step, the connectivity probabilities for all nodes included in the window are found. Subsequently, the window is moved one node forward. This process is repeated until, in the last step, the window reaches the terminal node. The connectivity probabilities found at that point are used to compute the node-pair reliability of the network system considered. The algorithm is designed especially for graphs with small distances between adjacent nodes, where the distance between two nodes is defined as the absolute value of the difference between the nodes' numbers. The maximal distance between any two adjacent nodes is denoted by Γ(G), where G symbolizes a random graph. If Γ(G)=2 then the method can be applied for directed as well as undirected graphs whose nodes and edges are subject to failure. This is important in view of the fact that many algorithms computing network reliability are designed for graphs with failure-prone edges and reliable nodes. If Γ(G)=3 then the method's applicability is limited to undirected graphs with reliable nodes. The main asset of the presented algorithms is their low numerical complexity: O(n), where n denotes the number of nodes.
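
    The sliding-window algorithm itself is not reproduced in this record, but the quantity it computes can be illustrated directly. The following Python sketch evaluates node-pair reliability by brute-force enumeration of node and edge states on a small graph with failure-prone nodes and edges; the graph, the failure probabilities and the terminal pair are illustrative assumptions, and the exponential-cost enumeration serves only as a reference against which a recursive O(n) method could be checked on small cases.

    import itertools
    from collections import deque

    def connected(up_nodes, up_edges, s, t):
        """Breadth-first search over working nodes and edges: can s reach t?"""
        adj = {v: [] for v in up_nodes}
        for u, v in up_edges:
            adj[u].append(v)
            adj[v].append(u)
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            if u == t:
                return True
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        return False

    def node_pair_reliability(nodes, edges, p_node, p_edge, s, t):
        """Probability that s and t are connected when every node and edge
        fails independently (exponential-cost enumeration, small graphs only)."""
        total = 0.0
        for node_state in itertools.product([0, 1], repeat=len(nodes)):
            up_nodes = {v for v, ok in zip(nodes, node_state) if ok}
            if s not in up_nodes or t not in up_nodes:
                continue                      # a failed terminal contributes nothing
            p_nodes = 1.0
            for v, ok in zip(nodes, node_state):
                p_nodes *= p_node[v] if ok else 1 - p_node[v]
            for edge_state in itertools.product([0, 1], repeat=len(edges)):
                p = p_nodes
                up_edges = []
                for e, ok in zip(edges, edge_state):
                    p *= p_edge[e] if ok else 1 - p_edge[e]
                    if ok and e[0] in up_nodes and e[1] in up_nodes:
                        up_edges.append(e)
                if connected(up_nodes, up_edges, s, t):
                    total += p
        return total

    # Chain 0-1-2-3 plus a shortcut 1-3, so the maximal distance between
    # adjacent nodes is 2 (the case the abstract denotes Gamma(G) = 2).
    nodes = [0, 1, 2, 3]
    edges = [(0, 1), (1, 2), (2, 3), (1, 3)]
    p_node = {v: 0.99 for v in nodes}
    p_edge = {e: 0.95 for e in edges}
    print(node_pair_reliability(nodes, edges, p_node, p_edge, 0, 3))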

  17. Reliability estimation of safety-critical software-based systems using Bayesian networks

    International Nuclear Information System (INIS)

    Helminen, A.

    2001-06-01

    Due to the nature of software faults and the way they cause system failures new methods are needed for the safety and reliability evaluation of software-based safety-critical automation systems in nuclear power plants. In the research project 'Programmable automation system safety integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002), various safety assessment methods and tools for software based systems are developed and evaluated. The project is financed together by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT). In this report the applicability of Bayesian networks to the reliability estimation of software-based systems is studied. The applicability is evaluated by building Bayesian network models for the systems of interest and performing simulations for these models. In the simulations hypothetical evidence is used for defining the parameter relations and for determining the ability to compensate disparate evidence in the models. Based on the experiences from modelling and simulations we are able to conclude that Bayesian networks provide a good method for the reliability estimation of software-based systems. (orig.)
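
    The report's Bayesian network models are not included in this record; the Python sketch below shows the basic mechanics on a deliberately tiny network, with a quality node influencing both a test-outcome node and an on-demand failure node. All node names and probability tables are illustrative assumptions, and inference is done by direct enumeration rather than with a dedicated Bayesian-network library.

    # Minimal hand-rolled Bayesian network: quality Q -> test outcome T, Q -> demand failure F.
    P_Q = {"high": 0.7, "low": 0.3}                 # prior on software quality (assumed)
    P_T_given_Q = {                                  # P(test suite passes | Q) (assumed)
        "high": {"pass": 0.95, "fail": 0.05},
        "low":  {"pass": 0.60, "fail": 0.40},
    }
    P_F_given_Q = {                                  # P(failure on demand | Q) (assumed)
        "high": {"fail": 1e-4, "ok": 1 - 1e-4},
        "low":  {"fail": 1e-2, "ok": 1 - 1e-2},
    }

    def posterior_quality(test_result):
        """P(Q | T = test_result) by direct enumeration (Bayes' rule)."""
        joint = {q: P_Q[q] * P_T_given_Q[q][test_result] for q in P_Q}
        norm = sum(joint.values())
        return {q: joint[q] / norm for q in joint}

    def failure_probability(test_result):
        """Predictive P(F = fail | T = test_result), marginalising over Q."""
        post = posterior_quality(test_result)
        return sum(post[q] * P_F_given_Q[q]["fail"] for q in post)

    print("P(Q | tests passed):", posterior_quality("pass"))
    print("P(failure on demand | tests passed): %.2e" % failure_probability("pass"))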

  18. Economic Valuation of Sufficient and Guaranteed Irrigation Water Supply for Paddy Farms of Guilan Province

    Directory of Open Access Journals (Sweden)

    Mohammad Kavoosi Kalashami

    2014-08-01

    Cultivation of rice, a strategic crop, depends heavily on a sufficient and guaranteed supply of irrigation water, and water shortage stresses have irreparable effects on yield and quality. The decrease in the inflow of the Sefidrud river in Guilan province, the main source of irrigation water for the 171 thousand hectares of rice cropping area in this province, has challenged sufficient and guaranteed irrigation water supply in many regions. The present study therefore estimates the value that paddy farmers place on a sufficient and guaranteed irrigation water supply; such economic valuation supports demand-side water resource management policies. The required data were collected through a survey of 224 paddy farms in rural regions facing irrigation water shortages. Using an open-ended valuation approach and estimation of a Tobit model via maximum likelihood and the two-stage Heckman approach, paddy farmers' willingness to pay for a sufficient and guaranteed irrigation water supply was elicited. Results revealed that farmers in the investigated regions are willing to pay 26.49 percent more than the present cost of providing irrigation water in order to have a sufficient and guaranteed supply.
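
    The survey data are not available here, but the estimation step the abstract names, a Tobit model fitted by maximum likelihood to zero-censored willingness-to-pay responses, can be sketched. The Python snippet below generates synthetic censored WTP data and maximises the Tobit log-likelihood with SciPy; the covariate, the parameter values and the starting point are assumptions for illustration, and the Heckman two-stage correction is not shown.

    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the survey: WTP driven by farm size, censored at zero.
    n = 224
    farm_size = rng.uniform(0.5, 3.0, n)
    latent = -5.0 + 12.0 * farm_size + rng.normal(0.0, 8.0, n)
    wtp = np.maximum(latent, 0.0)                 # observed response, left-censored at 0
    X = np.column_stack([np.ones(n), farm_size])  # intercept + covariate

    def neg_loglik(theta):
        """Negative Tobit log-likelihood with left-censoring at zero."""
        beta, log_sigma = theta[:-1], theta[-1]
        sigma = np.exp(log_sigma)
        xb = X @ beta
        ll_pos = stats.norm.logpdf(wtp, loc=xb, scale=sigma)   # uncensored observations
        ll_zero = stats.norm.logcdf(-xb / sigma)               # censored observations
        return -np.sum(np.where(wtp > 0, ll_pos, ll_zero))

    x0 = np.array([0.0, 1.0, np.log(wtp[wtp > 0].std())])
    res = optimize.minimize(neg_loglik, x0, method="BFGS")
    beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
    print("Tobit coefficients:", beta_hat, "sigma:", sigma_hat)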

  19. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    Covers exponential distributions, the Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, and Bayesian analysis. An introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. Includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future...

  20. Sufficient education attainment for a decent standard of living in modern Australia

    Directory of Open Access Journals (Sweden)

    Emily Joy Callander

    2012-06-01

    Educational attainment affects an individual's capacity to engage in the labour force, their living standards and hence their poverty status. As such, education should be included in measures of poverty. However, it is not known what level of education is sufficient for a decent standard of living. Using the 2003 Survey of Disability, Ageing and Carers, different levels of education attainment were tested for their association with labour force participation and income. On this basis, it was concluded that Year 12 or higher is a sufficient level of education attainment for 15 to 64 year olds, and Year 10 or higher for people over the age of 65 years. This is in line with current government policies to improve Year 12 completion rates. Knowing what a 'sufficient level of education attainment' is allows education to be included in multidimensional measures of poverty that view education as a key dimension of disadvantage.

  1. Electronic device for endosurgical skills training (EDEST): study of reliability.

    Science.gov (United States)

    Pagador, J B; Uson, J; Sánchez, M A; Moyano, J L; Moreno, J; Bustos, P; Mateos, J; Sánchez-Margallo, F M

    2011-05-01

    Minimally Invasive Surgery procedures are commonly used in many surgical practices, but surgeons need specific training models and devices due to its difficulty and complexity. In this paper, an innovative electronic device for endosurgical skills training (EDEST) is presented. A study on reliability for this device was performed. Different electronic components were used to compose this new training device. The EDEST was focused on two basic laparoscopic tasks: triangulation and coordination manoeuvres. A configuration and statistical software was developed to complement the functionality of the device. A calibration method was used to assure the proper work of the device. A total of 35 subjects (8 experts and 27 novices) were used to check the reliability of the system using the MTBF analysis. Configuration values for triangulation and coordination exercises were calculated as 0.5 s limit threshold and 800-11,000 lux range of light intensity, respectively. Zero errors in 1,050 executions (0%) for triangulation and 21 errors in 5,670 executions (0.37%) for coordination were obtained. A MTBF of 2.97 h was obtained. The results show that the reliability of the EDEST device is acceptable when used under previously defined light conditions. These results along with previous work could demonstrate that the EDEST device can help surgeons during first training stages.

  2. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications

    Energy Technology Data Exchange (ETDEWEB)

    Emery, John M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Coffin, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robbins, Brian A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carroll, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Field, Richard V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jeremy Yoo, Yung Suk [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kacher, Josh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.
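
    As a minimal illustration of the forward-propagation step described above, the following Python sketch estimates a failure probability by Monte Carlo sampling of a low-fidelity limit state (strength minus applied stress). The distributions and their parameters are assumptions chosen only to resemble an aluminium-alloy setting; they are not values from the report, and no multiscale refinement is represented.

    import numpy as np

    rng = np.random.default_rng(42)

    n_samples = 200_000
    # Microstructure-sensitive strength and applied stress (assumed distributions, MPa).
    yield_strength = rng.normal(276.0, 12.0, n_samples)
    applied_stress = rng.lognormal(mean=np.log(200.0), sigma=0.10, size=n_samples)

    g = yield_strength - applied_stress        # limit state: failure when g < 0
    p_fail = np.mean(g < 0.0)
    std_err = np.sqrt(p_fail * (1.0 - p_fail) / n_samples)   # Monte Carlo standard error
    print(f"estimated P_f = {p_fail:.3e} +/- {std_err:.1e}")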

  3. History of Reliability and Quality Assurance at Kennedy Space Center

    Science.gov (United States)

    Childers, Frank M.

    2004-01-01

    This Kennedy Historical Document (KHD) provides a unique historical perspective of the organizational and functional responsibilities for the manned and un-manned programs at Kennedy Space Center, Florida. As systems become more complex and hazardous, the attention to detailed planning and execution continues to be a challenge. The need for a robust reliability and quality assurance program will always be a necessity to ensure mission success. As new space missions are defined and technology allows for continued access to space, these programs cannot be compromised. The organizational structure that has provided the reliability and quality assurance functions for both the manned and unmanned programs has seen many changes since the first group came to Florida in the 1950's. The roles of government and contractor personnel have changed with each program and organizational alignment has changed based on that responsibility. The organizational alignment of the personnel performing these functions must ensure independent assessment of the processes.

  4. Mechanical Properties for Reliability Analysis of Structures in Glassy Carbon

    CERN Document Server

    Garion, Cédric

    2014-01-01

    Despite its good physical properties, the glassy carbon material is not widely used, especially for structural applications. Nevertheless, its transparency to particles and temperature resistance are interesting properties for applications to vacuum chambers and components in high energy physics. For example, it has been proposed for a fast shutter valve in a particle accelerator [1] [2]. The mechanical properties have to be carefully determined to assess the reliability of structures in such a material. In this paper, mechanical tests have been carried out to determine the elastic parameters, the strength and toughness of commercial grades. A statistical approach, based on the Weibull distribution, is used to characterize the material both in tension and compression. The results are compared to the literature and the difference of properties for these two loading cases is shown. Based on a Finite Element analysis, a statistical approach is applied to define the reliability of a structural component in glassy carbon.
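
    A hedged Python sketch of the statistical treatment mentioned in the abstract: fitting a two-parameter Weibull distribution to brittle strength data and converting it into a failure probability at a given stress. The strength sample is synthetic and the stress level arbitrary; neither reproduces the paper's measurements on glassy carbon.

    from scipy import stats

    # Synthetic stand-in strength data (MPa), drawn from a Weibull distribution.
    strengths = stats.weibull_min.rvs(c=8.0, scale=260.0, size=30, random_state=42)

    # Fit the Weibull modulus m and characteristic strength, with location fixed at 0.
    m, loc, s0 = stats.weibull_min.fit(strengths, floc=0.0)
    print(f"Weibull modulus m = {m:.1f}, characteristic strength = {s0:.0f} MPa")

    # Probability of failure of a specimen loaded to a given stress level.
    stress = 200.0
    p_fail = stats.weibull_min.cdf(stress, c=m, loc=0.0, scale=s0)
    print(f"P(failure at {stress:.0f} MPa) = {p_fail:.3f}")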

  5. Flash memories economic principles of performance, cost and reliability optimization

    CERN Document Server

    Richter, Detlev

    2014-01-01

    The subject of this book is to introduce a model-based quantitative performance indicator methodology applicable for performance, cost and reliability optimization of non-volatile memories. The complex example of flash memories is used to introduce and apply the methodology. It has been developed by the author based on an industrial 2-bit to 4-bit per cell flash development project. For the first time, design and cost aspects of 3D integration of flash memory are treated in this book. Cell, array, performance and reliability effects of flash memories are introduced and analyzed. Key performance parameters are derived to handle the flash complexity. A performance and array memory model is developed and a set of performance indicators characterizing architecture, cost and durability is defined.   Flash memories are selected to apply the Performance Indicator Methodology to quantify design and technology innovation. A graphical representation based on trend lines is introduced to support a requirement based pr...

  6. Quantification of the reliability of personnel actions from the evaluation of actual German operational experience. Final report

    International Nuclear Information System (INIS)

    Preischl, W.; Fassmann, W.

    2013-07-01

    The results of PSA studies and their uncertainty bounds are considerably impacted by the assessment of human reliability. However, the amount of available generic data is not sufficient to adequately evaluate all human actions considered in a modern PSA study. Further, the data are not sufficiently validated, and both the data and the proposed uncertainty bounds rely on expert judgement. This research project, as well as the preceding project /GRS 10/, validated data recommended by the German PSA Guidelines and enlarged the amount of available data. The findings may contribute to an update of the German PSA Guidelines. In a first step of the project, information about reportable events in German nuclear power plants with observed human errors (event reports, expert statements, technical documents, interviews and plant walkdowns with subject matter experts from the plants) was analysed. The investigation resulted in 67 samples describing personnel activities, performance conditions, the number of observed errors and the number of action performances. In a second step, a new methodology was developed and applied in a pilot plant. The objective was to identify undoubtedly error-free safety-relevant actions, their performance conditions and frequency, as well as to demonstrate that probabilistic data can be derived from that operational experience (OE). The application in the pilot plant resulted in 18 'error-free' samples characterizing human reliability. All available samples were evaluated using the method of Bayes, a commonly accepted methodology applied to derive probabilistic data from samples taken from operational experience. A thorough analysis of the obtained results shows that both data sources (OE reportable events, OE with undoubtedly error-free action performance) provide data of comparable quality and validity. At the end of the research project the following products are available: - Methods to select samples
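
    The Bayesian step described above can be illustrated with a conjugate Beta-binomial update of a per-demand human error probability. In the Python sketch below, the prior, the error count and the number of demands are assumed for illustration and are not the study's data; the report's actual evaluation combines many such samples.

    from scipy import stats

    # Jeffreys prior Beta(0.5, 0.5) for a per-demand error probability (assumed choice).
    a0, b0 = 0.5, 0.5

    errors, demands = 2, 1050        # observed errors / observed performances (assumed)
    a_post, b_post = a0 + errors, b0 + demands - errors

    posterior = stats.beta(a_post, b_post)
    print("posterior mean HEP  :", posterior.mean())
    print("90% credible interval:", posterior.ppf([0.05, 0.95]))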

  7. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  8. On the Impact of Precoding Errors on Ultra-Reliable Communications

    DEFF Research Database (Denmark)

    Gerardino, Guillermo Andrés Pocovi; Pedersen, Klaus I.; Alvarez, Beatriz Soret

    2016-01-01

    Motivated by the stringent reliability required by some of the future cellular use cases, we study the impact of precoding errors on the SINR outage performance for various spatial diversity techniques. The performance evaluation is carried out via system-level simulations, including the effects of multi-user and multicell interference, and following the 3GPP-defined simulation assumptions for a traditional macro case. It is shown that, except for feedback error probabilities larger than 1%, closed-loop microscopic diversity schemes are generally preferred over open-loop techniques as a way...

  9. Ideal energy self-sufficient bioclimatic house

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, C.

    1990-04-01

    This paper points out some of the interesting architectural features of a conceptual house being designed to be self-sufficient relative to the use of conventional energy sources. Brief notes are given on the following special design characteristics: the house's orientation and form - essentially a V - shaped two storey design with an orientation such as to maximize the surface area exposed to winter insolation; its special low emissivity glazing equipped with nightfall insulating screens; the adoption of maximized insulation, in which case cost benefits were assessed based on amortization over the entire life span of the house; hybrid space heating and ventilation systems involving the integration of pumps and ventilators for air circulation, and the use of a varied mix of active and passive solar heating and cooling systems.

  10. Study of evaluation techniques of software safety and reliability in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Youn, Cheong; Baek, Y. W.; Kim, H. C.; Park, N. J.; Shin, C. Y. [Chungnam National Univ., Taejon (Korea, Republic of)

    1999-04-15

    Software system development process and software quality assurance activities are examined in this study. In particular, software safety and reliability requirements in nuclear power plants are investigated. For this purpose, methodologies and tools which can be applied to the software analysis, design, implementation, testing, and maintenance steps are evaluated. Necessary tasks for each step are investigated, and the duty, input, and detailed activity for each task are defined to establish a development process for high-quality software systems. This means applying basic concepts of software engineering and principles of system development. This study establishes a guideline that can assure software safety and reliability requirements in digitalized nuclear plant systems and can be used by software development organizations as a guidebook for a software development process that assures software quality.

  11. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
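
    One of the model families named above, the non-homogeneous Poisson process (NHPP), can be sketched compactly. The Python snippet below uses the Goel-Okumoto mean value function and the standard NHPP reliability expression; the parameter values and the testing horizon are illustrative assumptions rather than fitted OSS data.

    import numpy as np

    # Goel-Okumoto NHPP: expected cumulative failures m(t) = a * (1 - exp(-b t)).
    a, b = 120.0, 0.05      # total expected faults, detection rate per day (assumed)

    def mean_failures(t):
        return a * (1.0 - np.exp(-b * t))

    def reliability(x, t):
        """P(no failure in (t, t + x]) for an NHPP: exp(-(m(t+x) - m(t)))."""
        return np.exp(-(mean_failures(t + x) - mean_failures(t)))

    t = 90.0                 # days of testing already performed (assumed)
    print("expected faults found so far:", round(mean_failures(t), 1))
    print("reliability over the next 10 days:", round(reliability(10.0, t), 3))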

  12. Fuzzy Goal Programming Approach in Selective Maintenance Reliability Model

    Directory of Open Access Journals (Sweden)

    Neha Gupta

    2013-12-01

    In the present paper, we have considered the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and have discussed two different models. In the first model the reliabilities of the subsystems are considered as different objectives. In the second model the cost and time spent on repairing the components are considered as two different objectives. These two models are formulated as multi-objective Nonlinear Programming Problems (MONLPP), and a fuzzy goal programming method is used to work out the compromise allocation in a multi-objective selective maintenance reliability model, in which we define the membership functions of each objective function, transform them into equivalent linear membership functions by a first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.
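
    To make the max-min flavour of the fuzzy goal programming step concrete, the Python sketch below solves a small, already-linearised version with SciPy's linprog: an auxiliary variable lambda is maximised subject to lambda not exceeding either linear membership function and to resource constraints. All coefficients are invented for illustration, repair quantities are treated as continuous, and the sketch does not reproduce the paper's parallel-series reliability model.

    import numpy as np
    from scipy.optimize import linprog

    # Variables: x1, x2 = repair effort in subsystems 1 and 2, lam = minimum membership.
    # Linearised memberships (assumed): mu1 = x1 / 6, mu2 = x2 / 5.
    c = np.array([0.0, 0.0, -1.0])             # maximise lam  ==  minimise -lam

    A_ub = np.array([
        [-1 / 6, 0.0, 1.0],                    # lam <= mu1(x1)
        [0.0, -1 / 5, 1.0],                    # lam <= mu2(x2)
        [5.0, 3.0, 0.0],                       # repair-cost budget
        [2.0, 4.0, 0.0],                       # repair-time budget
    ])
    b_ub = np.array([0.0, 0.0, 30.0, 20.0])
    bounds = [(0, None), (0, None), (0, 1)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    x1, x2, lam = res.x
    print(f"x1 = {x1:.2f}, x2 = {x2:.2f}, achieved minimum membership = {lam:.3f}")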

  13. Reliability of the Q Force; a mobile instrument for measuring isometric quadriceps muscle strength.

    Science.gov (United States)

    Douma, K W; Regterschot, G R H; Krijnen, W P; Slager, G E C; van der Schans, C P; Zijlstra, W

    2016-01-01

    The ability to generate muscle strength is a prerequisite for all human movement. Decreased quadriceps muscle strength is frequently observed in older adults and is associated with decreased performance and activity limitations. To quantify quadriceps muscle strength and to monitor changes over time, instruments and procedures with sufficient reliability are needed. The Q Force is an innovative mobile muscle strength measurement instrument suitable for measurement at various degrees of extension. Measurements between 110 and 130° extension present the highest values and the most significant increase after training. The objective of this study is to determine the test-retest reliability of muscle strength measurements by the Q Force in older adults at 110° extension. Forty-one healthy older adults (13 males and 28 females) were included in the study. Mean (SD) age was 81.9 (4.89) years. Isometric muscle strength of the quadriceps muscle was assessed with the Q Force at 110° of knee extension. Participants were measured at two sessions with a three- to eight-day interval between sessions. To determine relative reliability, the intraclass correlation coefficient (ICC) was calculated. To determine absolute reliability, Bland and Altman Limits of Agreement (LOA) were calculated and t-tests were performed. Relative reliability of the Q Force is good to excellent, as all ICC coefficients are higher than 0.75. Generally, a large 95% LOA, reflecting only moderate absolute reliability, is found, as exemplified by the peak torque of the left leg (-18.6 N to 33.8 N) and of the right leg (-9.2 N to 26.4 N) … between 15.7 and 23.6 N, representing 25.2% to 39.9% of the size of the mean. Small systematic differences in mean were found between measurement sessions 1 and 2. The present study shows that the Q Force has excellent relative test-retest reliability, but limited absolute test-retest reliability. Since the Q Force is relatively cheap and mobile it is suitable for
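
    The absolute-reliability statistic used in the study, Bland and Altman 95% limits of agreement between two measurement sessions, is simple to compute; the Python sketch below shows the calculation on made-up torque values that stand in for the study's data.

    import numpy as np

    # Made-up peak torque values (N) from two test sessions for the same subjects.
    session1 = np.array([112.0, 95.5, 130.2, 88.7, 141.0, 104.3])
    session2 = np.array([118.4, 93.0, 127.8, 95.1, 138.2, 110.0])

    diff = session2 - session1
    bias = diff.mean()                       # systematic difference between sessions
    sd = diff.std(ddof=1)
    loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

    print(f"bias = {bias:.1f} N, 95% limits of agreement: [{loa_low:.1f}, {loa_high:.1f}] N")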

  14. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy; maintenance of software for long-term operating behavior. (HP) [de]

  15. Is fasting glucose sufficient to define diabetes? Epidemiological data from 20 European studies

    NARCIS (Netherlands)

    DECODE-study group, [Unknown

    1999-01-01

    Aims/hypothesis. The World Health Organization Consultation recommended new diagnostic criteria for diabetes mellitus including: lowering of the diagnostic fasting plasma glucose to 7.0 mmol/l and introduction of a new category: impaired fasting glycaemia. The diagnostic 2-h glucose concentrations

  16. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  17. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at the Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Initiatives of Excellent Performance in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to 3 major areas of improvement - the need for improvement of results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program includes: - tools to prevent human error; - managerial observation and coaching; - human factor analysis; - quick information about events involving the human factor; - a human reliability timeline and performance indicators; - basic, periodic and extraordinary training in human factor reliability. (authors)

  18. A Reliability and Validity Study of the Defining Issues Test: The Relationship of Age, Education, Gender and Parental Education with Moral Development

    Science.gov (United States)

    Cesur, Sevim; Topcu, Mustafa Sami

    2010-01-01

    The aim of the study is twofold: First and main aim was to develop a valid and reliable Turkish version of the DIT which is one of the most important instruments in the psychology and education research; second is to explore the relationships between moral development and age, gender, education, and parental education. The study group consists of…

  19. Validity and reliability of Thai version of the Foot and Ankle Outcome Score in patients with arthritis of the foot and ankle.

    Science.gov (United States)

    Angthong, Chayanin

    2016-12-01

    Although the Foot and Ankle Outcome Score (FAOS) is commonly used in several languages for a variety of foot disorders, it has not been validated specifically for foot and ankle arthritic conditions. The aims of the present study were to translate the original English FAOS into Thai and to evaluate the validity and reliability of the Thai version of the FAOS for foot and ankle arthritic conditions. The original FAOS was translated into Thai using forward-backward translation. The Thai FAOS and the validated Thai Short Form-36 (SF-36®) questionnaires were distributed to 44 Thai patients suffering from arthritis of the foot and ankle to complete. For validation, Thai FAOS scores were correlated with SF-36 scores. Test-retest reliability and internal consistency were also analyzed in this study. The Thai FAOS score demonstrated sufficient correlation with the SF-36 total score in the Pain (Pearson's correlation coefficient (r)=0.45, p=0.002), Symptoms (r=0.45, p=0.002), Activities of Daily Living (ADL) (r=0.47, p=0.001), and Quality of Life (QOL) (r=0.38, p=0.011) subscales. The Sports and Recreational Activities (Sports & Rec) subscale did not correlate significantly with the SF-36® (r=0.20, p=0.20). Cronbach's alpha, a measure of internal consistency, for the five subscales was as follows: Pain, 0.94 (p … validity for the evaluation of foot and ankle arthritis. Although reliability was satisfactory for the major subscale ADL, it was not sufficient for the minor subscales. Our findings suggest that it can be used as a disease-specific instrument to evaluate foot and ankle arthritis and can complement other reliable outcome surveys. Copyright © 2015 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.
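
    The two statistics reported in this abstract, Pearson's correlation between scales and Cronbach's alpha within a subscale, can be computed in a few lines. The Python sketch below uses synthetic stand-in scores (44 respondents, a 9-item subscale), so the numbers it prints are not the study's results.

    import numpy as np

    rng = np.random.default_rng(7)

    # Stand-in subscale score and overall score for 44 respondents.
    faos_pain = rng.integers(40, 100, size=44).astype(float)
    sf36_total = 0.6 * faos_pain + rng.normal(0.0, 10.0, size=44)

    r = np.corrcoef(faos_pain, sf36_total)[0, 1]          # Pearson correlation

    def cronbach_alpha(items):
        """items: (n_subjects, n_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    # Correlated items (a common factor plus noise) so alpha is meaningfully high.
    latent = rng.normal(0.0, 1.0, size=(44, 1))
    pain_items = latent + rng.normal(0.0, 0.8, size=(44, 9))

    print(f"Pearson r = {r:.2f}, Cronbach's alpha = {cronbach_alpha(pain_items):.2f}")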

  20. Determination of reliability criteria for standby diesel generators at a nuclear power station

    International Nuclear Information System (INIS)

    Evans, M.G.K.

    1987-01-01

    The requirement for standby diesel generators at nuclear power stations is developed and a probabilistic approach used to define the reliability parameters. The present criteria used when ordering a diesel generator are compared with the testing required by the regulatory body and the most likely requirement following an accident. The impact of this on the diesels at a particular station and the root cause of failures are discussed. (orig.)
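
    A minimal Python sketch of the kind of probabilistic sizing calculation implied by the abstract: the probability that at least k of n standby diesel generators start on demand, given a per-demand failure-to-start probability. The success criterion (2-of-4) and the failure probability are assumed values for illustration, not figures from the paper.

    from math import comb

    def p_at_least_k(n, k, p_fail):
        """P(at least k of n independent diesels start), binomial model."""
        p_start = 1.0 - p_fail
        return sum(comb(n, i) * p_start**i * p_fail**(n - i) for i in range(k, n + 1))

    # Example: 4 diesels, success criterion 2-of-4, failure-to-start probability 2e-2 per demand.
    print(f"P(>= 2 of 4 start) = {p_at_least_k(4, 2, 0.02):.6f}")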