WorldWideScience

Sample records for reliability principles methodology

  1. Methodology for allocating reliability and risk

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1986-05-01

This report describes a methodology for reliability and risk allocation in nuclear power plants. The work investigates the technical feasibility of allocating reliability and risk, which are expressed in a set of global safety criteria and which may not necessarily be rigid, to various reactor systems, subsystems, components, operations, and structures in a consistent manner. The report also provides general discussions on the problem of reliability and risk allocation. The problem is formulated as a multiattribute decision analysis paradigm. The work mainly addresses the first two steps of a typical decision analysis, i.e., (1) identifying alternatives, and (2) generating information on outcomes of the alternatives, by performing a multiobjective optimization on a PRA model and reliability cost functions. The multiobjective optimization serves as the guiding principle for reliability and risk allocation. The concept of "noninferiority" is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The final step of decision analysis, i.e., assessment of the decision maker's preferences, could then be performed more easily on the noninferior solution set. Some results of applying the methodology to a nontrivial risk model are provided, and several outstanding issues such as generic allocation, preference assessment, and uncertainty are discussed. 29 refs., 44 figs., 39 tabs.
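The noninferiority concept used here is the standard Pareto notion: an allocation is noninferior if no alternative is at least as good in every objective and strictly better in one. A minimal sketch with a hypothetical `noninferior` helper over (risk, cost) pairs, both objectives minimized (the data points are invented for illustration):

```python
def noninferior(points):
    """Return the noninferior (Pareto) subset of (risk, cost) pairs: a point
    is noninferior if no other point is at least as good in both objectives
    and strictly better in one (both objectives minimized)."""
    front = []
    for p in points:
        dominated = any(q != p and q[0] <= p[0] and q[1] <= p[1]
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Five hypothetical (risk, cost) allocations; three are noninferior.
front = noninferior([(1, 5), (2, 3), (3, 4), (4, 2), (5, 6)])
```

Preference assessment, the final decision-analysis step, would then operate only on the returned front rather than on every alternative.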

  2. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  3. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown.
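The two fundamental concepts named above are linked by Pf = Φ(−β). For the simplest case of independent normal resistance R and load effect S with limit state g = R − S, this can be sketched as follows (the numeric values are illustrative, not from the paper):

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """Cornell reliability index beta and failure probability Pf = Phi(-beta)
    for independent normal resistance R and load effect S, limit state
    g = R - S."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)
    # standard normal CDF expressed via the error function
    pf = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
    return beta, pf

beta, pf = reliability_index(mu_r=10.0, sigma_r=1.0, mu_s=6.0, sigma_s=1.0)
```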

  4. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.; El-Bassioni, A.

    1985-01-01

This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of applying the methodology to a nontrivial risk model are provided, and several outstanding issues such as generic allocation and preference assessment are discussed.

  5. Principle of maximum entropy for reliability analysis in the design of machine components

    Science.gov (United States)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
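The PME step described here can be illustrated in miniature: on a discrete support with a single mean constraint, the maximum-entropy distribution takes the exponential-family form p_i ∝ exp(λ·x_i), and the multiplier λ can be found by bisection. A sketch under those simplifying assumptions (the paper itself works with continuous PDFs and the first four moments):

```python
import math

def maxent_pmf(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on a discrete support subject to a mean constraint.

    PME yields p_i proportional to exp(lam * x_i); the Lagrange multiplier
    lam is found by bisection, since the resulting mean is monotone
    increasing in lam. Support and constraint here are illustrative."""
    def weights(lam):
        m = max(lam * x for x in support)      # shift to avoid overflow
        return [math.exp(lam * x - m) for x in support]

    def mean_for(lam):
        w = weights(lam)
        return sum(x * wi for x, wi in zip(support, w)) / sum(w)

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    w = weights(0.5 * (lo + hi))
    z = sum(w)
    return [wi / z for wi in w]

# With no information beyond the support, PME with the central mean
# recovers the uniform distribution ("fewest human bias factors").
p_uniform = maxent_pmf([1, 2, 3, 4, 5, 6], 3.5)
p_skewed = maxent_pmf([1, 2, 3, 4, 5, 6], 4.5)
```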

  6. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period
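The adaptive Monte Carlo procedure can be caricatured with a plain (non-adaptive) simulation: strength starts at a random initial value and degrades linearly, while extreme load events arrive as a Poisson process. Every distribution, rate, and parameter below is an illustrative assumption, not taken from the report:

```python
import random

def time_dependent_failure_prob(n_sims=20000, service_years=40.0,
                                load_rate=0.5, seed=1):
    """Plain Monte Carlo sketch of time-dependent structural reliability.
    The structure fails if any extreme load exceeds the degraded strength
    at the time of the load event."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sims):
        strength0 = rng.gauss(100.0, 10.0)    # random initial strength
        degr_rate = abs(rng.gauss(0.5, 0.1))  # strength loss per year
        t = 0.0
        failed = False
        while True:
            t += rng.expovariate(load_rate)   # years to next extreme load
            if t > service_years:
                break
            load = rng.gauss(50.0, 15.0)
            if load > strength0 - degr_rate * t:
                failed = True
                break
        failures += failed
    return failures / n_sims

pf_40yr = time_dependent_failure_prob()
```

As the abstract notes, the result is driven by the load characteristics and the degradation model; lengthening the service period raises the failure probability.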

  7. CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology

    Science.gov (United States)

    Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayaii, Sammy

    2006-01-01

    This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.

  8. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks can not only easily model the causal relationship between organizational factors and human reliability but also, in a given context, quantitatively measure human operational reliability and identify the most likely root causes, or a prioritization of the root causes, of human error. (authors)
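The two inference directions mentioned (causal and diagnostic) can be shown on a toy three-node network evaluated by brute-force enumeration; the structure and all probabilities below are invented for illustration:

```python
# Toy Bayesian network: organization quality O -> training adequacy T ->
# human error E. All conditional probabilities are assumptions.
def joint(o, t, e):
    p_o = {'good': 0.7, 'poor': 0.3}[o]
    p_t_given_o = {('good', 'adequate'): 0.9, ('good', 'inadequate'): 0.1,
                   ('poor', 'adequate'): 0.4, ('poor', 'inadequate'): 0.6}[(o, t)]
    p_e_given_t = {('adequate', 'error'): 0.01, ('adequate', 'ok'): 0.99,
                   ('inadequate', 'error'): 0.10, ('inadequate', 'ok'): 0.90}[(t, e)]
    return p_o * p_t_given_o * p_e_given_t

# Causal inference: marginal probability of a human error.
p_error = sum(joint(o, t, 'error')
              for o in ('good', 'poor') for t in ('adequate', 'inadequate'))

# Diagnostic inference: probability the organization is poor, given an error.
p_poor_given_error = sum(joint('poor', t, 'error')
                         for t in ('adequate', 'inadequate')) / p_error
```

Observing an error roughly doubles the belief that the organizational root cause is poor, which is the kind of root-cause prioritization the abstract refers to.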

  9. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Science.gov (United States)

    2010-10-01

Excerpt from 49 CFR, Part 238, Appendix E—General Principles of Reliability-Based Maintenance Programs (49 Transportation 4, 2010-10-01): "... the design level of safety and reliability of the equipment; (2) To restore safety and reliability to ..."

  10. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing the model prediction and the experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model.
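The simplest case (time-independent, no statistical uncertainty) reduces to comparing the likelihood of the observation under the "model valid" hypothesis against a diffuse alternative. A hedged sketch, in which the normal likelihood forms and the two spread parameters are assumptions, not the paper's formulation:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_factor(prediction, observation, meas_sigma, diffuse_sigma):
    """Likelihood of the observation assuming the model prediction is
    correct (H0: measurement noise only) versus a diffuse alternative (H1).
    A Bayes factor above 1 favours accepting the model."""
    return (normal_pdf(observation, prediction, meas_sigma) /
            normal_pdf(observation, prediction, diffuse_sigma))

bf_close = bayes_factor(10.0, 10.1, meas_sigma=0.5, diffuse_sigma=5.0)
bf_far = bayes_factor(10.0, 12.0, meas_sigma=0.5, diffuse_sigma=5.0)
```

An observation near the prediction produces a Bayes factor well above 1 (accept); a distant one drives it below 1 (reject).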

  11. Contextual factors, methodological principles and teacher cognition

    Directory of Open Access Journals (Sweden)

    Rupert Walsh

    2014-01-01

Teachers in various contexts worldwide are sometimes unfairly criticized for not putting teaching methods developed for the well-resourced classrooms of Western countries into practice. Factors such as the teachers’ “misconceptualizations” of “imported” methods, including Communicative Language Teaching (CLT), are often blamed, though the challenges imposed by “contextual demands,” such as large class sizes, are sometimes recognised. Meanwhile, there is sometimes an assumption that in the West there is a happy congruence between policy supportive of CLT or Task-Based Language Teaching, teacher education and supervision, and curriculum design on the one hand, and teachers’ cognitions and practices on the other. Our case study of three EFL teachers at a UK adult education college is motivated by a wish to question this assumption. Findings from observational and interview data suggest the practices of two teachers were largely consistent with their methodological principles, relating to stronger and weaker forms of CLT respectively, as well as with more general educational principles, such as a concern for learners; the supportive environment seemed to help. The third teacher appeared to put “difficult” contextual factors, for example tests, ahead of methodological principles without, however, obviously benefiting. Implications highlight the important role of teacher cognition research in challenging cultural assumptions.

  12. Reliability assessment of passive containment isolation system using APSRA methodology

    International Nuclear Information System (INIS)

    Nayak, A.K.; Jain, Vikas; Gartia, M.R.; Srivastava, A.; Prasad, Hari; Anthony, A.; Gaikwad, A.J.; Bhatia, S.; Sinha, R.K.

    2008-01-01

In this paper, a methodology known as APSRA (Assessment of Passive System ReliAbility) has been employed for the evaluation of the reliability of passive systems. The methodology has been applied to the passive containment isolation system (PCIS) of the Indian advanced heavy water reactor (AHWR). In the APSRA methodology, the passive system reliability evaluation is based on the failure probability of the system to carry out the desired function. The methodology first determines the operational characteristics of the system and the failure conditions by assigning a predetermined failure criterion. The failure surface is predicted using a best estimate code considering deviations of the operating parameters from their nominal states, which affect the PCIS performance. APSRA proposes to compare the code predictions with test data to generate the uncertainties in the failure parameter prediction, which are later considered in the code for accurate prediction of the failure surface of the system. Once the failure surface of the system is predicted, the cause of failure is examined through root diagnosis; failure occurs mainly due to failure of mechanical components. The failure probability of these components is evaluated through a classical PSA treatment using generic data. The reliability of the PCIS is evaluated from the probability of availability of the components required for the success of the passive containment isolation system.

  13. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random cyclic applied strain history will be introduced under any of the loading modes, even a deterministic loading history. A non-conservative evaluation might be given in practice without considering the scatter. A methodology for strain-based fatigue reliability analysis, which takes the scatter into account, is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random fatigue applied strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently extrapolated to the case of a deterministic CSS relation, as existing methods do. The non-conservative evaluation of the deterministic CSS relation and the availability of the present methodology are indicated by an analysis of the material test results.
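The interference idea (random applied strain versus random strain capacity from a scattered Coffin-Manson curve) can be sketched by simulation; all coefficient values below are illustrative assumptions, not the 1Cr18Ni9Ti data:

```python
import random

def fatigue_reliability(n_cycles, n_sims=20000, seed=2):
    """Interference sketch: probability that the scattered strain capacity
    at a given life exceeds the random applied strain amplitude.
    Capacity follows a Coffin-Manson form
        eps_a = (sigma_f'/E) * (2N)**b + eps_f' * (2N)**c
    with randomly scattered coefficients."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_sims):
        sf_over_E = rng.gauss(0.0060, 0.0006)  # sigma_f'/E, scattered
        ef = rng.gauss(0.30, 0.03)             # eps_f', scattered
        b, c = -0.09, -0.56                    # exponents kept deterministic
        capacity = sf_over_E * (2 * n_cycles) ** b + ef * (2 * n_cycles) ** c
        applied = rng.gauss(0.002, 0.0002)     # applied strain amplitude
        survived += capacity > applied
    return survived / n_sims
```

Reliability falls with target life, since the capacity curve decays while the applied strain distribution stays fixed.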

  14. Reliability evaluation of thermophysical properties from first-principles calculations.

    Science.gov (United States)

    Palumbo, Mauro; Fries, Suzana G; Dal Corso, Andrea; Kürmann, Fritz; Hickel, Tilmann; Neugebauer, Jürg

    2014-08-20

    Thermophysical properties, such as heat capacity, bulk modulus and thermal expansion, are of great importance for many technological applications and are traditionally determined experimentally. With the rapid development of computational methods, however, first-principles computed temperature-dependent data are nowadays accessible. We evaluate various computational realizations of such data in comparison to the experimental scatter. The work is focussed on the impact of different first-principles codes (QUANTUM ESPRESSO and VASP), pseudopotentials (ultrasoft and projector augmented wave) as well as phonon determination methods (linear response and direct force constant method) on these properties. Based on the analysis of data for two pure elements, Cr and Ni, consequences for the reliability of temperature-dependent first-principles data in computational thermodynamics are discussed.

  15. Methodology for reliability, economic and environmental assessment of wave energy

    International Nuclear Information System (INIS)

    Thorpe, T.W.; Muirhead, S.

    1994-01-01

    As part of the Preliminary Actions in Wave Energy R and D for DG XII's Joule programme, methodologies were developed to facilitate assessment of the reliability, economics and environmental impact of wave energy. This paper outlines these methodologies, their limitations and areas requiring further R and D. (author)

  16. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants. The GO methodology is one of these methods. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In the GO methodology, a signal is either on-to-off or off-to-on; GO therefore finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. To overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given.

  17. Flash memories economic principles of performance, cost and reliability optimization

    CERN Document Server

    Richter, Detlev

    2014-01-01

    The subject of this book is to introduce a model-based quantitative performance indicator methodology applicable for performance, cost and reliability optimization of non-volatile memories. The complex example of flash memories is used to introduce and apply the methodology. It has been developed by the author based on an industrial 2-bit to 4-bit per cell flash development project. For the first time, design and cost aspects of 3D integration of flash memory are treated in this book. Cell, array, performance and reliability effects of flash memories are introduced and analyzed. Key performance parameters are derived to handle the flash complexity. A performance and array memory model is developed and a set of performance indicators characterizing architecture, cost and durability is defined.   Flash memories are selected to apply the Performance Indicator Methodology to quantify design and technology innovation. A graphical representation based on trend lines is introduced to support a requirement based pr...

  18. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing reliability and performability. This paper illustrates the usage of the intuitionistic fuzzy degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed for complex software systems reliability optimization under various constraints.

  19. Reliability assessment of passive isolation condenser system of AHWR using APSRA methodology

    International Nuclear Information System (INIS)

    Nayak, A.K.; Jain, Vikas; Gartia, M.R.; Prasad, Hari; Anthony, A.; Bhatia, S.K.; Sinha, R.K.

    2009-01-01

In this paper, a methodology known as APSRA (Assessment of Passive System ReliAbility) is used for the evaluation of the reliability of the passive isolation condenser system of the Indian Advanced Heavy Water Reactor (AHWR). As per the APSRA methodology, the passive system reliability evaluation is based on the failure probability of the system to perform the design basis function. The methodology first determines the operational characteristics of the system and the failure conditions based on a predetermined failure criterion. The parameters that could degrade the system performance are identified and considered for analysis. Different modes of failure and their causes are identified. The failure surface is predicted using a best estimate code considering deviations of the operating parameters from their nominal states, which affect the isolation condenser system performance. Once the failure surface of the system is predicted, the causes of failure are examined through root diagnosis; these occur mainly due to failure of mechanical components. The reliability of the system is evaluated through a classical PSA treatment based on the failure probability of the components, using generic data.

  20. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

The GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis with the fault tree methodology was also performed. The results showed that the setup of tie breakers was rational and necessary, and that, compared with the fault tree methodology, GO modeling was much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)

  1. An overall methodology for reliability prediction of mechatronic systems design with industrial application

    International Nuclear Information System (INIS)

    Habchi, Georges; Barthod, Christine

    2016-01-01

We propose in this paper an overall ten-step methodology dedicated to the analysis and quantification of reliability during the design phase of a mechatronic system, considered as a complex system. The ten steps of the methodology are detailed according to the downward side of the V-development cycle usually used for the design of complex systems. Two main phases of analysis are complementary and cover the ten steps: qualitative analysis and quantitative analysis. The qualitative phase analyzes the functional and dysfunctional behavior of the system and then determines its different failure modes and degradation states, based on external and internal functional analysis, organic and physical implementation, and dependencies between components, with consideration of customer specifications and the mission profile. The quantitative phase is used to calculate the reliability of the system and its components, based on the qualitative behavior patterns, and considering data gathering and processing and reliability targets. A systemic approach is used to calculate the reliability of the system, taking into account the different technologies of a mechatronic system (mechanics, electronics, electrical, etc.), dependencies and interactions between components, and external influencing factors. To validate the methodology, the ten steps are applied to an industrial system, the smart actuator of Pack'Aero Company. - Highlights: • A ten-step methodology for reliability prediction of mechatronic systems design. • Qualitative and quantitative analysis for reliability evaluation using PN and RBD. • A dependency matrix proposal, based on the collateral and functional interactions. • Models consider mission profile, deterioration, interactions and influencing factors. • Application and validation of the methodology on the “Smart Actuator” of PACK’AERO.
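The RBD side of the quantitative phase rests on elementary series/parallel reliability algebra, which can be sketched as follows; the block structure and all numbers are illustrative assumptions, not Pack'Aero data:

```python
def series(*rs):
    """Reliability of components in series: all must function."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """Reliability of redundant components in parallel: at least one works."""
    q = 1.0
    for r in rs:
        q *= 1.0 - r
    return 1.0 - q

# Illustrative actuator-style block diagram: a mechanical stage in series
# with a duplicated controller board and the power electronics.
r_system = series(0.999,                 # mechanical transmission
                  parallel(0.95, 0.95),  # redundant controller boards
                  0.990)                 # power electronics
```

Dependencies and interactions between components, which the paper captures with a dependency matrix and Petri nets, are exactly what this independent-blocks algebra ignores.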

  2. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    International Nuclear Information System (INIS)

    Knezevic, J.; Odoom, E.R.

    2001-01-01

A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising the fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow efficient simultaneous generation of minimal cut and path sets.
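For an OR gate, the Lambda-Tau expressions are λ = λ₁ + λ₂ and τ = (λ₁τ₁ + λ₂τ₂)/(λ₁ + λ₂); with triangular fuzzy failure rates and repair times these can be evaluated on alpha-cut intervals. A sketch by corner enumeration, which is valid here because the expressions are monotone in each variable on the cut (all fuzzy numbers are invented for illustration):

```python
from itertools import product

def alpha_cut(tfn, a):
    """Interval of a triangular fuzzy number (low, mode, high) at level a."""
    low, mode, high = tfn
    return (low + a * (mode - low), high - a * (high - mode))

def fuzzy_or_gate(lam1, tau1, lam2, tau2, a):
    """OR-gate Lambda-Tau expressions on alpha-cut intervals:
        lambda_or = lam1 + lam2
        tau_or    = (lam1*tau1 + lam2*tau2) / (lam1 + lam2)
    Extrema over the cut are taken over the interval corners."""
    corners = product(alpha_cut(lam1, a), alpha_cut(tau1, a),
                      alpha_cut(lam2, a), alpha_cut(tau2, a))
    lam_vals, tau_vals = [], []
    for x1, t1, x2, t2 in corners:
        lam_vals.append(x1 + x2)
        tau_vals.append((x1 * t1 + x2 * t2) / (x1 + x2))
    return ((min(lam_vals), max(lam_vals)), (min(tau_vals), max(tau_vals)))

# At membership level 1 the intervals collapse to the modal (crisp) values.
lam_or, tau_or = fuzzy_or_gate((1e-3, 2e-3, 3e-3), (1.0, 2.0, 3.0),
                               (2e-3, 3e-3, 4e-3), (2.0, 3.0, 4.0), a=1.0)
```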

  3. IEEE guide for general principles of reliability analysis of nuclear power generating station protection systems

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

Presented is the Institute of Electrical and Electronics Engineers, Inc. (IEEE) guide for general principles of reliability analysis of nuclear power generating station protection systems. The document has been prepared to provide the basic principles needed to conduct a reliability analysis of protection systems. Included is information on qualitative and quantitative analysis, guides for failure data acquisition and use, and a guide for the establishment of intervals.

  4. System principles, mathematical models and methods to ensure high reliability of safety systems

    Science.gov (United States)

    Zaslavskyi, V.

    2017-04-01

Modern safety and security systems are composed of a large number of various components designed for detection, localization, tracking, collecting, and processing of information from the systems of monitoring, telemetry, control, etc. They are required to be highly reliable with a view to correctly performing data aggregation, processing and analysis for subsequent decision making support. During the design and construction phases of the manufacturing of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure high reliability of signal detection, noise isolation, and erroneous command reduction. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Various types of components perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of successful task performance and eliminates common cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems of highly reliable safety and security systems design. The mathematical models are formalized in a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used for solving problems of optimal redundancy on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.

  5. Integrating rock mechanics issues with repository design through design process principles and methodology

    International Nuclear Information System (INIS)

    Bieniawski, Z.T.

    1996-01-01

A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we are still lacking engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  6. Reliability Modeling of Electromechanical System with Meta-Action Chain Methodology

    Directory of Open Access Journals (Sweden)

    Genbao Zhang

    2018-01-01

To establish a more flexible and accurate reliability model, reliability modeling and a solving algorithm based on the meta-action chain concept are used in this paper. Instead of estimating the reliability of the whole system only in the standard operating mode, this paper adopts the structure chain and the operating action chain for system reliability modeling. The failure information and structure information for each component are integrated into the model to overcome the fixed assumptions applied in traditional modeling. In industrial applications, there may be different operating modes for a multicomponent system. The meta-action chain methodology can estimate the system reliability under different operating modes by modeling the components with varieties of failure sensitivities. This approach has been verified by computing several electromechanical system cases. The results indicate that the process can improve system reliability estimation. It is an effective tool for solving the reliability estimation problem in systems under various operating modes.

  7. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, and how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks in different industries are then described. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the following chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters ask whether reliability data banks are 'friend, foe or a waste of time?' and look at future developments. (UK)

  8. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitudes. The basic objective of this paper is to find a procedure that eliminates the errors a PC makes when calculations close to its accuracy limit are executed. A highly reliable system is represented by a directed acyclic graph composed of terminal nodes (i.e. highly reliable input elements), internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is implemented in MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may itself be a source of inaccuracy. That is why another algorithm, for exact summation of such numbers, is designed in the paper. The summation procedure uses a special number system whose base is 2^32. The computational efficiency of the new computing methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to demonstrate its merits.
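    The floating-point pitfall motivating the paper's exact-summation algorithm can be reproduced in a few lines of Python. This is illustrative only; the paper's own procedure uses a base-2^32 number system implemented in MATLAB, whereas here the standard library's `math.fsum` stands in as the exact summation:

```python
import math

# One large value followed by a million tiny ones, as arises when
# combining near-zero unavailabilities of highly reliable elements:
# each tiny term vanishes when added directly to 1.0.
values = [1.0] + [1e-16] * 1_000_000

naive = sum(values)        # left-to-right float addition loses every tiny term
exact = math.fsum(values)  # correctly rounded summation of the whole list

print(naive)  # 1.0
print(exact)  # ~1.0000000001
```

The naive left-to-right sum silently discards the accumulated 1e-10 contribution, which is exactly the kind of error that matters when the quantity being computed is a small unavailability.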

  9. A reach of the principle of entry and the principle of reliability in the real estate cadastre in our court practice

    OpenAIRE

    Cvetić Radenka M.

    2015-01-01

    Through the review of the principle of entry and the principle of reliability in the Real Estate Cadastre and their reach in our court practice, this article indicates the indispensability of compliance with these principles for the sake of legal certainty. A formidable and a complex role of the court when applying law in order to rightfully resolve an individual case has been underlined. Having regard to the accountability of the courts for the efficacy of the legal system, without any inten...

  10. Reliability evaluation methodologies for ensuring container integrity of stored transuranic (TRU) waste

    International Nuclear Information System (INIS)

    Smith, K.L.

    1995-06-01

    This report provides methodologies for making defensible estimates of expected transuranic waste storage container lifetimes at the Radioactive Waste Management Complex. These methodologies can be used to estimate transuranic waste container reliability (for integrity and degradation) and as an analytical tool to optimize waste container integrity. Container packaging and storage configurations, which directly affect waste container integrity, are also addressed. The methodologies presented provide a means for demonstrating compliance with Resource Conservation and Recovery Act waste storage requirements.

  11. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

    Full Text Available The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model, the Harmonized Commodity Description and Coding System (HS). The proposed three-stage model of index analysis and estimation of terms of trade is applied to Ukraine's commodity trade for the period 2011-2012.

  12. A methodology and success/failure criteria for determining emergency diesel generator reliability

    International Nuclear Information System (INIS)

    Wyckoff, H.L.

    1986-01-01

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected, and those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on the extremely conservative methodology and success/failure criteria specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)
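    The kind of statistically defensible per-demand reliability value the abstract calls for can be sketched as a point estimate plus a one-sided lower confidence bound on start reliability. This is a generic Wilson-score sketch with invented demand counts, not the actual EPRI or Reg. Guide 1.108 procedure, whose success/failure criteria differ:

```python
import math

def start_reliability(successes: int, demands: int, z: float = 1.645):
    """Point estimate and one-sided 95% lower Wilson bound for the
    per-demand start reliability of an EDG. Illustrative only."""
    p = successes / demands
    denom = 1.0 + z**2 / demands
    centre = p + z**2 / (2 * demands)
    margin = z * math.sqrt(p * (1 - p) / demands + z**2 / (4 * demands**2))
    return p, (centre - margin) / denom

# Hypothetical record: 98 successful starts in 100 demands.
p_hat, lower = start_reliability(successes=98, demands=100)
print(f"point estimate {p_hat:.3f}, 95% lower bound {lower:.3f}")
```

The lower bound is the quantity of regulatory interest: a plant can claim only the reliability it can demonstrate at the stated confidence level, not the raw success fraction.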

  13. A methodology and success/failure criteria for determining emergency diesel generator reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wyckoff, H. L. [Electric Power Research Institute, Palo Alto, California (United States)

    1986-02-15

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected, and those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on the extremely conservative methodology and success/failure criteria specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)

  14. Methodological principles outline discipline "Organization studies-tourism activity" using information technologies.

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-08-01

    Full Text Available The article presents the basic methodological principles of teaching the disciplines of tourism and local history using information technologies. Fifteen literature sources and the experience of leading experts in the field of sports and health tourism and orienteering were analyzed. The principles identified for these academic disciplines include: a shift in emphasis from sports tourism to cognitive and health tourism; the development of spiritual qualities; the acquisition of life skills in nature; the discovery and development of pedagogical and psychological abilities and character traits through the study of one's native land; the development of cognitive-research abilities, physical abilities and motor skills; and the application of modern information technologies.

  15. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system, and fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
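    The simplest form of failure rate allocation, before any fault-tree or dualistic-contrast refinement of the kind the paper proposes, distributes a series-system failure rate target among components in proportion to weights (e.g., relative complexity). A minimal sketch, with hypothetical subsystem names and weights:

```python
def allocate_failure_rates(lambda_system: float, weights: dict) -> dict:
    """Proportionally allocate a series-system failure-rate target among
    components (ARINC-style weighting). Purely illustrative; the paper's
    model couples fault tree analysis with the dualistic contrast."""
    total = sum(weights.values())
    return {name: lambda_system * w / total for name, w in weights.items()}

# Hypothetical emergency-diesel-generator subsystems and complexity weights.
alloc = allocate_failure_rates(
    1e-3, {"fuel": 3, "starting": 4, "cooling": 2, "control": 1}
)
print(alloc)
```

By construction the allocated rates sum back to the system target, which is the defining constraint of any series-system allocation scheme.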

  16. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to reliability analysis of the offsite power supply system of Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability of a system with shared signals, and dynamic calculation formulas for the state probability of a unit with two states, are derived. A method to solve the fault event sets of the system is also presented, and all fault event sets of the offsite power supply system and their failure probabilities are obtained. The recovery reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The results show that the GO methodology is simple and useful for both steady-state and dynamic reliability analysis of repairable systems.

  17. Transmission embedded cost allocation methodology with consideration of system reliability

    International Nuclear Information System (INIS)

    Hur, D.; Park, J.-K.; Yoo, C.-I.; Kim, B.H.

    2004-01-01

    In a vertically integrated utility industry, the cost of reliability, as a separate service, has not received much rigorous analysis. However, as a cornerstone of restructuring the industry, the transmission service pricing must change to be consistent with, and supportive of, competitive wholesale electricity markets. This paper focuses on the equitable allocation of transmission network embedded costs including the transmission reliability cost based on the contributions of each generator to branch flows under normal conditions as well as the line outage impact factor under a variety of load levels. A numerical example on a six-bus system is given to illustrate the applications of the proposed methodology. (author)

  18. Basic Principles of Electrical Network Reliability Optimization in Liberalised Electricity Market

    Science.gov (United States)

    Oleinikova, I.; Krishans, Z.; Mutule, A.

    2008-01-01

    The authors propose to select long-term solutions to the reliability problems of electrical networks at the development planning stage. The guidelines, or basic principles, of such optimization are: 1) its dynamic nature; 2) development sustainability; 3) integrated solution of the problems of network development and electricity supply reliability; 4) consideration of information uncertainty; 5) concurrent consideration of network and generation development problems; 6) application of specialized information technologies; and 7) definition of requirements for independent electricity producers. The article reviews the major aspects, functions and tasks of the liberalized electricity market, with emphasis placed on the optimization of electrical network development as a significant component of sustainable power system management.

  19. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and a system for printing the results of GO-FLOW analysis in the form of figures or tables. These functions are explained by analyzing sample systems, such as a PWR AFWS and a BWR ECCS. The appendices describe the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in them. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis with a wide range of applications, and with the development of the total GO-FLOW system it has become a powerful tool in a living PSA. (author) 54 refs
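    The "success-oriented" character of GO-FLOW, in contrast to failure-oriented fault trees, can be conveyed with the two most basic success-probability combinations. This is only a sketch of the flavor of the approach with made-up component values; the actual GO-FLOW methodology works with a richer set of signal-propagating operators and time points:

```python
def series(*ps: float) -> float:
    """Success probability of a series arrangement: all must succeed."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel(*ps: float) -> float:
    """Success probability of redundancy: at least one must succeed."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Toy two-train cooling system: each train is a pump in series with a valve.
pump, valve = 0.98, 0.99
train = series(pump, valve)
system = parallel(train, train)
print(f"system success probability: {system:.6f}")
```

Propagating success probabilities forward through the system layout, rather than enumerating failure combinations, is what makes this style of analysis natural for systems with complex operational sequences.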

  20. Seismic reliability assessment methodology for CANDU concrete containment structures

    International Nuclear Information System (INIS)

    Stephens, M.J.; Nessim, M.A.; Hong, H.P.

    1995-05-01

    A study was undertaken to develop a reliability-based methodology for the assessment of existing CANDU concrete containment structures with respect to seismic loading. The focus of the study was on defining appropriate specified values and partial safety factors for earthquake loading and resistance parameters. Key issues addressed in the work were the identification of an approach to select design earthquake spectra that satisfy consistent safety levels, and the use of structure-specific data in the evaluation of structural resistance. (author). 23 refs., 9 tabs., 15 figs

  1. An integrated methodology for the dynamic performance and reliability evaluation of fault-tolerant systems

    International Nuclear Information System (INIS)

    Dominguez-Garcia, Alejandro D.; Kassakian, John G.; Schindall, Joel E.; Zinchuk, Jeffrey J.

    2008-01-01

    We propose an integrated methodology for the reliability and dynamic performance analysis of fault-tolerant systems. This methodology uses a behavioral model of the system dynamics, similar to the ones used by control engineers to design the control system, but also incorporates artifacts to model the failure behavior of each component. These artifacts include component failure modes (and associated failure rates) and how those failure modes affect the dynamic behavior of the component. The methodology bases the system evaluation on the analysis of the dynamics of the different configurations the system can reach after component failures occur. For each of the possible system configurations, a performance evaluation of its dynamic behavior is carried out to check whether its properties, e.g., accuracy, overshoot, or settling time, which are called performance metrics, meet system requirements. Markov chains are used to model the stochastic process associated with the different configurations that a system can adopt when failures occur. This methodology not only enables an integrated framework for evaluating dynamic performance and reliability of fault-tolerant systems, but also enables a method for guiding the system design process, and further optimization. To illustrate the methodology, we present a case-study of a lateral-directional flight control system for a fighter aircraft
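    The paper's use of Markov chains over post-failure configurations can be sketched with the smallest non-trivial case: a system that degrades from its nominal configuration to a degraded one and then to failure, with constant transition rates. The three-state chain and its rates are hypothetical, standing in for the configuration graph a real fault-tolerant system would generate:

```python
import math

def config_probabilities(l1: float, l2: float, t: float):
    """State probabilities at time t of a 3-state Markov chain:
    nominal --(rate l1)--> degraded --(rate l2)--> failed.
    Closed-form solution of the Chapman-Kolmogorov equations."""
    p_nom = math.exp(-l1 * t)
    p_deg = l1 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
    p_fail = 1.0 - p_nom - p_deg
    return p_nom, p_deg, p_fail

# Hypothetical rates (per hour) and a 1000-hour mission time.
p_nom, p_deg, p_fail = config_probabilities(l1=1e-4, l2=1e-3, t=1000.0)
print(f"nominal {p_nom:.4f}, degraded {p_deg:.4f}, failed {p_fail:.4f}")
```

In the integrated methodology, each non-failed state would additionally carry a dynamic-performance verdict (accuracy, overshoot, settling time), so that "degraded but still meeting requirements" and "degraded and out of spec" are distinguished before the reliability numbers are rolled up.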

  2. Robust design principles for reducing variation in functional performance

    DEFF Research Database (Denmark)

    Christensen, Martin Ebro; Howard, Thomas J.

    2016-01-01

    This paper identifies, describes and classifies a comprehensive collection of variation reduction principles (VRP) that can be used to increase the robustness of a product and reduce its variation in functional performance. Performance variation has a negative effect on the reliability and perceived quality of a product, and efforts should be made to minimise it. The design principles are identified by a systematic decomposition of the Taguchi Transfer Function in combination with the use of existing literature and the authors' experience. The paper presents 15 principles and describes their advantages and disadvantages along with example cases. Subsequently, the principles are classified based on their applicability in the various development and production stages. The VRP are to be added to existing robust design methodologies, helping the designer to think beyond robust design tools and methods.

  3. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of a pilot software, incorporating typical features of critical software for nuclear power plants safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  4. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    Digital instrumentation and control (I and C) systems are increasingly being applied to current nuclear power plants (NPPs) due to their advantages: zero drift, advanced data calculation capacity, and design flexibility. Accordingly, safety issues concerning the software, which is the main part of a digital I and C system, have been raised. As with hardware components, a software failure in an NPP could lead to a large disaster; therefore, failure rate testing and reliability assessment of the software should be properly performed before it is adopted in NPPs. However, the reliability assessment of software is quite different from that of hardware, owing to the difference in nature between the two. One of the most important differences is that software failures arise from design faults ('error crystals'), whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has traditionally focused on the optimal release time from an economic standpoint. However, the safety goals and public acceptance of NPPs are so distinctive from other industries that software in NPPs depends on a quantitative reliability value rather than on economics. The safety goal of NPPs is exceptionally high compared to other industries, so conventional software reliability assessment methodologies already used in other industries cannot be adjusted to the safety goal of NPPs. Thus, a new reliability assessment methodology for the software of digital I and C systems in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to obtain their pros and cons, and then to assess the usefulness of each method for NPP software.
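    The conventional, release-time-oriented methodologies the review contrasts with NPP needs are typified by software reliability growth models. A minimal sketch of one well-known example, the Goel-Okumoto NHPP model, with hypothetical parameter values (fitting them to real failure data is the actual work such methodologies perform):

```python
import math

def goel_okumoto_mean_failures(a: float, b: float, t: float) -> float:
    """Goel-Okumoto mean value function m(t) = a * (1 - exp(-b * t)):
    expected cumulative failures detected by test time t, where a is the
    total expected fault content and b the per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

# Hypothetical fitted parameters for a software package under test.
a, b = 120.0, 0.02
found = goel_okumoto_mean_failures(a, b, t=100.0)
remaining = a - found
print(f"expected faults found: {found:.1f}, expected remaining: {remaining:.1f}")
```

Release-time optimization trades the cost of further testing against the expected remaining faults; the review's point is that this economic framing does not by itself deliver the quantitative reliability value NPP safety goals require.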

  5. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea

    2014-01-01

    Digital instrumentation and control (I and C) systems are increasingly being applied to current nuclear power plants (NPPs) due to their advantages: zero drift, advanced data calculation capacity, and design flexibility. Accordingly, safety issues concerning the software, which is the main part of a digital I and C system, have been raised. As with hardware components, a software failure in an NPP could lead to a large disaster; therefore, failure rate testing and reliability assessment of the software should be properly performed before it is adopted in NPPs. However, the reliability assessment of software is quite different from that of hardware, owing to the difference in nature between the two. One of the most important differences is that software failures arise from design faults ('error crystals'), whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has traditionally focused on the optimal release time from an economic standpoint. However, the safety goals and public acceptance of NPPs are so distinctive from other industries that software in NPPs depends on a quantitative reliability value rather than on economics. The safety goal of NPPs is exceptionally high compared to other industries, so conventional software reliability assessment methodologies already used in other industries cannot be adjusted to the safety goal of NPPs. Thus, a new reliability assessment methodology for the software of digital I and C systems in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to obtain their pros and cons, and then to assess the usefulness of each method for NPP software.

  6. Methodological and Methodical Principles of the Empirical Study of Spiritual Development of a Personality

    Directory of Open Access Journals (Sweden)

    Olga Klymyshyn

    2017-06-01

    Full Text Available The article reveals the essence of the methodological principles of the spiritual development of a personality. The results of a theoretical analysis of the psychological content of spirituality are taken into consideration from the positions of a system-structural approach to the study of personality, the age patterns of mental development, the sacramental nature of the human person, and the mechanisms of human spiritual development. An interpretation of spirituality and the spiritual development of a personality is given. The initial principles of the organization of the empirical research of the spiritual development of a personality (ontogenetic, sociocultural, self-determination, system) are presented. Parameters for estimating a personality's spiritual development are described: a general index of the development of spiritual potential; indexes of the development of the ethical, aesthetic, cognitive and existential components of spirituality; and an index of the religiousness of a personality. The methodological support of the psychological diagnostic research is defined.

  7. A reliability assessment methodology for the VHTR passive safety system

    International Nuclear Information System (INIS)

    Lee, Hyungsuk; Jae, Moosung

    2014-01-01

    The passive safety system of a VHTR (Very High Temperature Reactor), which has recently attracted worldwide attention, is currently being considered in the design of safety improvements for the next generation of nuclear power plants in Korea. The functionality of a passive system relies not on an external electrical support system but on the intelligent use of natural phenomena. Its function involves an ultimate heat sink for a passive secondary auxiliary cooling system, especially during a station blackout such as occurred in the Fukushima Daiichi reactor accidents. However, it is not easy to quantitatively evaluate the reliability of passive safety for the purpose of risk analysis alongside existing active system failures, since the classical reliability assessment methods cannot be applied. Therefore, we present a new methodology to quantify this reliability based on reliability physics models. The evaluation framework is then applied to the conceptually designed VHTR in Korea. The Response Surface Method (RSM) is also utilized for evaluating the uncertainty of the maximum temperature of the nuclear fuel. The proposed method could contribute to evaluating accident sequence frequencies and to designing new innovative nuclear systems, such as the reactor cavity cooling system (RCCS) of the VHTR to be designed and constructed in Korea.

  8. 4. Principles of Art from Antiquity to Contemporary Pedagogy in the Context of Methodology of Art Education

    Directory of Open Access Journals (Sweden)

    Olimpiada Arbuz-Spatari

    2016-03-01

    Full Text Available The methodology of art education is a system of educational documents (principles, rules, methods, procedures and forms) derived from the teleology, content, and artistic/cultural/scientific communication of education and from the reception of the subjects communicating. It is oriented to the student as an educated subject and creator, and is governed by the laws of education and communication and by artistic principles.

  9. Basic principles of STT-MRAM cell operation in memory arrays

    International Nuclear Information System (INIS)

    Khvalkovskiy, A V; Apalkov, D; Watts, S; Chepulskii, R; Beach, R S; Ong, A; Tang, X; Driskill-Smith, A; Lottis, D; Chen, E; Nikitin, V; Krounbi, M; Butler, W H; Visscher, P B

    2013-01-01

    For reliable operation, individual cells of an STT-MRAM memory array must meet specific requirements on their performance. In this work we review some of these requirements and discuss the fundamental physical principles of STT-MRAM operation, covering the range from device level to chip array performance, and methodology for its development. (paper)

  10. A study on a reliability assessment methodology for the VHTR safety systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok

    2012-02-01

    The passive safety system of a 300 MWt VHTR (Very High Temperature Reactor), which has recently attracted worldwide attention, is actively being considered in the design of safety improvements for next-generation nuclear power plants. The functionality of a passive system relies not on an external electrical support system but on the intelligent use of natural phenomena, such as convection, conduction, radiation, and gravity. It is not easy to quantitatively evaluate the reliability of passive safety for risk analysis alongside existing active system failures, since the classical reliability assessment methods are not applicable. Therefore, a new reliability methodology needs to be developed, and in this study it is applied to evaluate the reliability of the conceptually designed VHTR. A preliminary evaluation and conceptualization are performed using the load-and-capacity concept from reliability physics models. The response surface method (RSM) is also utilized for evaluating the maximum temperature of the nuclear fuel, with the significant variables and their correlations considered using the GAMMA+ code. The proposed method could contribute to the design of the new passive systems of the VHTR.

  11. Inter comparison of REPAS and APSRA methodologies for passive system reliability analysis

    International Nuclear Information System (INIS)

    Solanki, R.B.; Krishnamurthy, P.R.; Singh, Suneet; Varde, P.V.; Verma, A.K.

    2014-01-01

    The increasing use of passive systems in innovative nuclear reactors puts demands on the reliability assessment of these systems. Passive systems operate on driving forces such as natural circulation, gravity, and internal stored energy, which are considerably weaker than those of active components. Hence, phenomenological failures (of virtual components) are as important as equipment failures (of real components) in the evaluation of passive system reliability. The contribution of the mechanical components to passive system reliability can be evaluated in a classical way using available component reliability databases and well-known methods. On the other hand, different methods are required to evaluate the reliability of processes such as thermal hydraulics, owing to the lack of adequate failure data. Research on the reliability assessment of passive systems and their integration into PSA is ongoing worldwide, but consensus has not been reached. Two of the most widely used methods are Reliability Evaluation of Passive Systems (REPAS) and Assessment of Passive System Reliability (APSRA). Both methods characterize the uncertainties in the design and process parameters governing the function of the passive system, but they differ in how they quantify passive system reliability. Inter-comparison among the available methods provides useful insights into their strengths and weaknesses. This paper highlights the results of a thermal hydraulic analysis of a typical passive isolation condenser system carried out using the RELAP mod 3.2 computer code, applying the REPAS and APSRA methodologies. The failure surface is established for the passive system under consideration, and the system reliability has been evaluated using both methods. Challenges involved in passive system reliability assessment are identified that require further attention in order to overcome the shortcomings of these methods.
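    The REPAS-style characterization of design and process parameter uncertainty can be sketched as a Monte Carlo propagation: sample the uncertain parameters, run the thermal-hydraulic model, and count functional failures. Everything below is invented for illustration (the criterion, distributions, and numbers); a real analysis would run a code such as RELAP at each sample point rather than a one-line surrogate:

```python
import math
import random

random.seed(0)

def decay_heat_removed(power_kw: float, flow_resistance: float) -> bool:
    """Toy performance criterion: natural-circulation flow (a crude
    momentum-balance surrogate) must reach the flow needed to remove
    decay heat. Names and numbers are hypothetical."""
    flow = math.sqrt(power_kw / flow_resistance)
    return flow >= 8.0

# Propagate parameter uncertainty by Monte Carlo sampling.
trials, failures = 100_000, 0
for _ in range(trials):
    power = random.gauss(100.0, 10.0)             # decay power, kW (assumed)
    resistance = random.lognormvariate(0.0, 0.3)  # loop resistance (assumed)
    if not decay_heat_removed(power, resistance):
        failures += 1

print(f"estimated functional-failure probability: {failures / trials:.3f}")
```

The set of sampled points where the criterion fails traces out exactly the kind of failure surface in parameter space that the paper establishes for the isolation condenser.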

  12. A reach of the principle of entry and the principle of reliability in the real estate cadastre in our court practice

    Directory of Open Access Journals (Sweden)

    Cvetić Radenka M.

    2015-01-01

    Full Text Available Through the review of the principle of entry and the principle of reliability in the Real Estate Cadastre and their reach in our court practice, this article indicates the indispensability of compliance with these principles for the sake of legal certainty. A formidable and a complex role of the court when applying law in order to rightfully resolve an individual case has been underlined. Having regard to the accountability of the courts for the efficacy of the legal system, without any intention to disavow the court practice, some deficiencies have been pointed out, with the aim to help. An abstract manner of legal norms necessarily requires a creative role of courts in cases which cannot be easily qualified. For that reason certain deviations ought to be made followed by reasoning which unambiguously leads to the conclusion that only a specific decision which the court rendered is possible and just.

  13. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics, probability theory, psychology and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future

  14. Improved FTA methodology and application to subsea pipeline reliability design.

    Science.gov (United States)

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
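For the quantitative side of tree-based analysis that this record contrasts with FET, a minimal top-event calculation under the usual independence assumption can be sketched as follows. The tree structure and basic-event probabilities are invented for illustration and are not taken from the FET paper.

```python
from math import prod

def top_event_probability(node):
    """Evaluate a fault tree given as nested tuples.

    A node is either a float (basic-event probability) or a tuple
    ('AND', child, ...) / ('OR', child, ...). Basic events are assumed
    independent, so an AND gate multiplies probabilities and an OR gate
    uses 1 - prod(1 - p).
    """
    if isinstance(node, (int, float)):
        return float(node)
    gate, *children = node
    probs = [top_event_probability(c) for c in children]
    if gate == 'AND':
        return prod(probs)
    if gate == 'OR':
        return 1.0 - prod(1.0 - p for p in probs)
    raise ValueError(f"unknown gate: {gate}")

# Hypothetical tree: top event occurs if the pipe ruptures OR both
# isolation valves fail on demand.
tree = ('OR', 0.001, ('AND', 0.01, 0.02))
```

This is exactly the quantitative step whose bookkeeping FET aims to simplify: the simpler the tree construction, the fewer terms such a calculation has to track.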

  15. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether the reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply the Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration, minimizing the asymptotic variance of the decision variable in reliability demonstration under constraints on sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on the reliability demonstration of an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify the reliability of such products with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs aimed at more accurate reliability estimation in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
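A toy version of the Gamma-process demonstration logic described above: simulate monotone degradation paths with X(t) ~ Gamma(shape = alpha*t, scale = beta), then check what fraction of units keeps its accumulated degradation below the allowable level at the end of the test. The shape, scale, duration and threshold values are made-up illustration parameters, not those of the paper's alloy or bearing case studies.

```python
import random

def demonstrated_fraction(alpha=0.5, beta=2.0, t_end=100.0,
                          allowable=150.0, n_units=2000, seed=1):
    """Fraction of simulated units whose cumulative Gamma-process
    degradation at t_end stays below the allowable level.

    By the stationary independent-increment property of the Gamma
    process, X(t_end) ~ Gamma(shape=alpha * t_end, scale=beta), so the
    endpoint can be sampled directly instead of path by path.
    """
    rng = random.Random(seed)
    ok = sum(rng.gammavariate(alpha * t_end, beta) < allowable
             for _ in range(n_units))
    return ok / n_units
```

With these numbers the mean degradation at t_end is alpha*t_end*beta = 100 against an allowable level of 150, so nearly all simulated units pass, mirroring how a demonstrated-reliability criterion would be checked against the converted degradation threshold.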

  16. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for the study of strategies is proved under modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of a company’s capital is grounded. The economic nature of the capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners’ prosperity and to achieve a social effect.

  17. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective

  18. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  19. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.
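As background for the evidence-theoretic machinery that these three records describe, Dempster's rule of combination for two basic probability assignments can be written compactly: conflicting mass is discarded and the rest renormalized. The frame and mass values below are a generic textbook-style example, not numbers from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets.

    Mass assigned to pairs with an empty intersection is the conflict K;
    it is discarded and the remaining mass renormalized by 1 - K, as in
    Dempster's rule of combination.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical frame {A, B}: two sources of evidence about a dependence
# level, with some mass left on the whole frame (ignorance).
A, B = frozenset({'A'}), frozenset({'B'})
AB = A | B
m1 = {A: 0.6, AB: 0.4}
m2 = {A: 0.5, B: 0.3, AB: 0.2}
m = dempster_combine(m1, m2)
```

In the paper's setting the masses come from expert judgments structured by the analytic hierarchy process; here they are arbitrary, but the combination step is the same.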

  20. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    International Nuclear Information System (INIS)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J.

    2000-09-01

    Digital I and C systems are widely used in the non-safety systems of NPPs, and their applications are expanding to safety-critical systems. The regulatory body is shifting its policy to a risk-based one and may require Probabilistic Safety Assessment of digital I and C systems. However, there is as yet no established reliability prediction methodology for digital I and C systems covering both software and hardware. This survey report reviews many reliability prediction methods for electronic systems from the hardware point of view. Each method has both strong and weak points. The report presents the state of the art of prediction methods and focuses in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also surveys software tools that support reliability prediction

  1. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    Energy Technology Data Exchange (ETDEWEB)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J

    2000-09-01

    Digital I and C systems are widely used in the non-safety systems of NPPs, and their applications are expanding to safety-critical systems. The regulatory body is shifting its policy to a risk-based one and may require Probabilistic Safety Assessment of digital I and C systems. However, there is as yet no established reliability prediction methodology for digital I and C systems covering both software and hardware. This survey report reviews many reliability prediction methods for electronic systems from the hardware point of view. Each method has both strong and weak points. The report presents the state of the art of prediction methods and focuses in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also surveys software tools that support reliability prediction.
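At their simplest, handbook methods of the Bellcore and MIL-HDBK-217F families that these two records survey are parts-count models: the system failure rate is the sum of generic part failure rates scaled by quality (and environment) factors, and MTBF is its reciprocal. The rates and factors below are placeholders for illustration only; real values must come from the handbook tables.

```python
def parts_count_failure_rate(parts):
    """Parts-count prediction: lambda_system = sum(n * lambda_g * pi_Q).

    `parts` is a list of (quantity, generic failure rate in failures per
    1e6 hours, quality factor) tuples. Returns (lambda_system in
    failures per 1e6 hours, MTBF in hours).
    """
    lam = sum(n * lam_g * pi_q for n, lam_g, pi_q in parts)
    mtbf_hours = 1e6 / lam
    return lam, mtbf_hours

# Hypothetical board: 100 resistors, 20 capacitors, 5 ICs, with
# invented generic rates and quality factors.
board = [(100, 0.002, 1.0), (20, 0.01, 1.5), (5, 0.05, 2.0)]
```

The method's weak point, as the report notes of handbook approaches generally, is that the prediction is only as good as the generic rates and factors fed into this sum.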

  2. Reliability assessment of Passive Containment Cooling System of an Advanced Reactor using APSRA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Mukesh, E-mail: mukeshd@barc.gov.in [Reactor Engineering Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Chakravarty, Aranyak [School of Nuclear Studies and Application, Jadavpur University, Kolkata 700032 (India); Nayak, A.K. [Reactor Engineering Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Prasad, Hari; Gopika, V. [Reactor Safety Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)

    2014-10-15

    Highlights: • The paper deals with the reliability assessment of Passive Containment Cooling System of Advanced Heavy Water Reactor. • Assessment of Passive System ReliAbility (APSRA) methodology is used for reliability assessment. • Performance assessment of the PCCS is initially performed during a postulated design basis LOCA. • The parameters affecting the system performance are then identified and considered for further analysis. • The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. - Abstract: Passive Systems are increasingly playing a prominent role in the advanced nuclear reactor systems and are being utilised in normal operations as well as safety systems of the reactors following an accident. The Passive Containment Cooling System (PCCS) is one of the several passive safety features in an Advanced Reactor (AHWR). In this paper, the APSRA methodology has been employed for reliability evaluation of the PCCS of AHWR. Performance assessment of the PCCS is initially performed during a postulated design basis LOCA using the best-estimate code RELAP5/Mod 3.2. The parameters affecting the system performance are then identified and considered for further analysis. Based on some pre-determined failure criterion, the failure surface for the system is predicted using the best-estimate code taking into account the deviations of the identified parameters from their nominal states as well as the model uncertainties inherent to the best estimate code. Root diagnosis is then carried out to determine the various failure causes, which occurs mainly due to malfunctioning of mechanical components. The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. The reliability of the PCCS is then evaluated from the probability of availability of these components.

  3. Reliability assessment of Passive Containment Cooling System of an Advanced Reactor using APSRA methodology

    International Nuclear Information System (INIS)

    Kumar, Mukesh; Chakravarty, Aranyak; Nayak, A.K.; Prasad, Hari; Gopika, V.

    2014-01-01

    Highlights: • The paper deals with the reliability assessment of Passive Containment Cooling System of Advanced Heavy Water Reactor. • Assessment of Passive System ReliAbility (APSRA) methodology is used for reliability assessment. • Performance assessment of the PCCS is initially performed during a postulated design basis LOCA. • The parameters affecting the system performance are then identified and considered for further analysis. • The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. - Abstract: Passive Systems are increasingly playing a prominent role in the advanced nuclear reactor systems and are being utilised in normal operations as well as safety systems of the reactors following an accident. The Passive Containment Cooling System (PCCS) is one of the several passive safety features in an Advanced Reactor (AHWR). In this paper, the APSRA methodology has been employed for reliability evaluation of the PCCS of AHWR. Performance assessment of the PCCS is initially performed during a postulated design basis LOCA using the best-estimate code RELAP5/Mod 3.2. The parameters affecting the system performance are then identified and considered for further analysis. Based on some pre-determined failure criterion, the failure surface for the system is predicted using the best-estimate code taking into account the deviations of the identified parameters from their nominal states as well as the model uncertainties inherent to the best estimate code. Root diagnosis is then carried out to determine the various failure causes, which occurs mainly due to malfunctioning of mechanical components. The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. The reliability of the PCCS is then evaluated from the probability of availability of these components

  4. On the complex analysis of the reliability, safety, and economic efficiency of atomic electric power stations

    International Nuclear Information System (INIS)

    Emel'yanov, I.Ya.; Klemin, A.I.; Polyakov, E.F.

    1977-01-01

    The problem is posed of effectively increasing the engineering performance of nuclear electric power stations (APS). The principal components of the engineering performance of modern large APS are considered: economic efficiency, radiation safety, reliability, and their interrelationship. A nomenclature is proposed for the quantitative indices which most completely characterize the enumerated properties and are convenient for the analysis of the engineering performance. The urgent problem of developing a methodology for the complex analysis and optimization of the principal performance components is considered; this methodology is designed to increase the efficiency of the work on high-performance competitive APS. The principle of complex optimization of the reliability, safety, and economic-efficiency indices is formulated; specific recommendations are made for the practical realization of this principle. The structure of the complex quantitative analysis of the enumerated performance components is given. The urgency and promise of the complex approach to solving the problem of APS optimization are demonstrated, i.e., the solution of the problem of creating optimally reliable, fairly safe, and maximally economically efficient stations

  5. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for the study of strategies is proved under modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of a company’s capital is grounded. The economic nature of the capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners’ prosperity and to achieve a social effect.

  6. Methodology for risk assessment and reliability applied for pipeline engineering design and industrial valves operation

    Energy Technology Data Exchange (ETDEWEB)

    Silveira, Dierci [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil). Escola de Engenharia Industrial e Metalurgia. Lab. de Sistemas de Producao e Petroleo e Gas], e-mail: dsilveira@metal.eeimvr.uff.br; Batista, Fabiano [CICERO, Rio das Ostras, RJ (Brazil)

    2009-07-01

    Two kinds of situations may be distinguished when estimating operating reliability in the maneuvering of industrial valves and the probability of undesired events in pipelines and industrial plants: situations in which the risk is identified in repetitive cycles of operations, and situations in which there is a permanent hazard due to project configurations introduced by decisions during the engineering design definition stage. Estimating reliability based on the influence of design options requires the choice of a numerical index, which may include a composite of human operating parameters based on biomechanics and ergonomics data. We first consider the design conditions under which plant or pipeline operator reliability concepts can be applied when operating industrial valves, and then describe in detail the ergonomics and biomechanics risks that lend themselves to engineering design database development and human reliability modeling and assessment. This engineering design database development and reliability modeling is based on a group of engineering design and biomechanics parameters likely to lead to over-exertion forces and working postures, which are themselves associated with the functioning of a particular plant or pipeline. This ergonomics- and biomechanics-based approach for the most common industrial valve positions in the plant layout is developed through a methodology to assess physical effort and operator reach, combining various elementary operation situations. These procedures can be combined with genetic algorithm modeling and the four elements of man-machine systems: the individual, the task, the machinery and the environment. The proposed methodology should be viewed not as competing with traditional reliability and risk assessment but rather as complementary, since it provides parameters related to physical effort values for valve operation and for workspace design and usability. (author)

  7. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in the human tasks that occur under normal operating conditions and in those performed after an abnormal event. Additionally, the analysis of various accidents in history found that the human component has been a contributing cause. Because of the need to understand the forms and probability of human error, the 1960s saw the beginning of the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed that include additional performance shaping factors, and the interactions between them, in their models. Thus, by the mid-1990s, what are considered the second generation methodologies emerged. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required an independent evaluation of the two related human failure events. The derivation of the new human error probabilities thus involves quantification of the nominal scenario and of the cases of significant deviations considered for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences extracted the more specific factors with the highest contribution to the human error probabilities. (Author)

  8. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

    Full Text Available This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS), based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of a proper MAI methodology is deemed necessary because it simplifies the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot by a major ERPS development company. Important benefits were the easy and effective guidance through all installation and analysis stages, a faster ERPS installation, and control and cost reduction for the installation in terms of time, manpower, technological equipment and other resources.

  9. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents; hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified as first and second generation by their differing viewpoints on problem-solving. Accident analysis can be performed using three techniques: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, as applied to the assessment of marine accidents. The MOP model can effectively describe the relationships among the other factors affecting accidents, whereas the HEART methodology is focused only on human factors.

  10. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    Science.gov (United States)

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    Abstract: We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors aimed to assess the reliability of soft tissue model based implant surgical guides and reported that the accuracy was evaluated using software.1 I found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, the positive likelihood ratio (sensitivity/(1 - specificity)) and the negative likelihood ratio ((1 - sensitivity)/specificity), as well as the odds ratio (true results/false results, preferably more than 50), are among the measures used to evaluate the validity (accuracy) of a single test compared to a gold standard.2-4 It is not clear to which of the above-mentioned estimates for validity analysis the reported twenty-two accurate sites (46.81%) relate. Reliability (repeatability or reproducibility) is assessed by different statistical tests; the Pearson r, least squares and the paired t-test are all among the common mistakes in reliability analysis.5 Briefly, for quantitative variables the Intra Class Correlation Coefficient (ICC) should be used, and for qualitative variables weighted kappa, with caution because kappa has its own limitations too. Regarding reliability or agreement, it is good to know that in computing the kappa value only the concordant cells are considered, whereas the discordant cells should also be taken into account in order to reach a correct estimate of agreement (weighted kappa).2-4 As a take-home message, for reliability and validity analysis, appropriate tests should be
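To make the agreement statistics in this letter concrete, unweighted Cohen's kappa from a square rating table can be sketched as below; weighted kappa, which the letter recommends for ordinal data, additionally applies a weight matrix to the discordant cells. The counts are invented, not taken from the Maney et al. study.

```python
def cohens_kappa(table):
    """Unweighted Cohen's kappa for a square agreement table.

    table[i][j] = number of items that rater 1 put in category i and
    rater 2 put in category j. Kappa compares the observed agreement on
    the diagonal with the agreement expected by chance from the
    marginals.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    p_observed = sum(table[i][i] for i in range(k)) / n
    row = [sum(table[i]) / n for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_expected = sum(row[i] * col[i] for i in range(k))
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical counts: two raters classify implant sites as
# accurate/inaccurate, agreeing on 75 of 100 sites.
table = [[40, 10], [15, 35]]
```

This illustrates the letter's point: the diagonal (concordant) cells alone give the raw 75% agreement, but kappa corrects it for the chance agreement implied by the marginals.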

  11. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.

  12. Developing "Personality" Taxonomies: Metatheoretical and Methodological Rationales Underlying Selection Approaches, Methods of Data Generation and Reduction Principles.

    Science.gov (United States)

    Uher, Jana

    2015-12-01

    Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".

  13. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g., valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program.
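The fault-tree treatment of component failures that this record describes reduces to combining basic-event probabilities through AND/OR gates. A minimal sketch follows; the valve lineup and probabilities are hypothetical illustrations, not values from the Sandia study, and independence of basic events is assumed:

```python
# Minimal fault-tree evaluation for a passive system's component-failure
# contribution. Probabilities are illustrative placeholders; basic events
# are assumed independent.

def or_gate(*probs):
    """Top event occurs if ANY input event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs):
    """Top event occurs only if ALL input events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical passive-injection line: two check valves in parallel
# (both must stick closed to block flow), in series with one isolation
# valve that must open on demand.
p_check_valve_stuck = 1e-3
p_isolation_fails = 5e-4

p_blocked_flow = or_gate(
    and_gate(p_check_valve_stuck, p_check_valve_stuck),  # both check valves fail
    p_isolation_fails,                                   # isolation valve fails
)
print(f"P(system fails to inject) = {p_blocked_flow:.2e}")
```

The same gate functions nest to arbitrary depth, which is how a full fault tree for a real system would be evaluated.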

  14. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed; they are classified into the first and second generation according to their differing viewpoints on problem-solving. The accident analysis can be performed using three techniques of analysis; sequen...

  15. Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology

    Science.gov (United States)

    2015-11-19

Page 1 GAO-16-78R Bulk Fuel Pricing. 441 G St. N.W., Washington, DC 20548. November 19, 2015. The Honorable Ashton Carter, The Secretary of Defense. Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology. Dear Secretary Carter: Each fiscal year, the Office of the Under Secretary of Defense (Comptroller), in coordination with the Defense Logistics Agency, sets a standard price per barrel

  16. Methodology for sodium fire vulnerability assessment of sodium cooled fast reactor based on the Monte-Carlo principle

    Energy Technology Data Exchange (ETDEWEB)

Song, Wei [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China)]; Wu, Yuanyu [ITER Organization, Route de Vinon-sur-Verdon, 13115 Saint-Paul-lès-Durance (France)]; Hu, Wenjun [China Institute of Atomic Energy, P. O. Box 275(34), Beijing (China)]; Zuo, Jiaxu, E-mail: zuojiaxu@chinansc.cn [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China)]

    2015-11-15

Highlights: • The Monte-Carlo principle, coupled with a fire dynamics code, is adopted to perform sodium fire vulnerability assessment. • The method can be used to calculate the failure probability of sodium fire scenarios. • A calculation example and results are given to illustrate the feasibility of the methodology. • Some critical parameters and experience are shared. - Abstract: Sodium fire is a typical and distinctive hazard in sodium cooled fast reactors, and is significant for nuclear safety. This paper introduces a method of sodium fire vulnerability assessment based on the Monte-Carlo principle, which can be used to calculate the probability of each failure mode in sodium fire scenarios. A sodium fire vulnerability assessment of the primary cold trap room of the China Experimental Fast Reactor was then performed to illustrate the feasibility of the methodology. The calculation shows that the conditional failure probability of the key cable is 23.6% in the sodium fire scenario caused by continuous sodium leakage after failure of the isolation device, while the wall temperature, the room pressure, and the aerosol discharge mass all remain below their safety limits.
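The Monte-Carlo scheme the record describes (sample uncertain scenario parameters, run a consequence model, and count the trials in which a failure criterion is exceeded) can be illustrated with a toy model. The leak-rate distribution, damage threshold, and response function below are invented for illustration, not taken from the CEFR analysis:

```python
import random

# Toy Monte-Carlo failure-probability estimate in the spirit of the
# methodology: sample uncertain inputs, run a (here, trivial) consequence
# model, and count trials where the response exceeds its limit.
# All numbers are illustrative assumptions.

random.seed(42)

def cable_temperature(leak_rate_kg_s, burn_fraction):
    """Stand-in consequence model: peak cable temperature in deg C."""
    return 60.0 + 900.0 * leak_rate_kg_s * burn_fraction

TEMP_LIMIT_C = 200.0   # assumed cable damage threshold
N_TRIALS = 100_000

failures = 0
for _ in range(N_TRIALS):
    leak_rate = random.uniform(0.05, 0.5)      # kg/s, assumed range
    burn_fraction = random.betavariate(2, 5)   # fraction of leaked sodium burning
    if cable_temperature(leak_rate, burn_fraction) > TEMP_LIMIT_C:
        failures += 1

print(f"Estimated conditional failure probability: {failures / N_TRIALS:.3f}")
```

A real analysis would replace `cable_temperature` with a fire dynamics code run, which is exactly the coupling the abstract's first highlight refers to.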

  17. Methodology for sodium fire vulnerability assessment of sodium cooled fast reactor based on the Monte-Carlo principle

    International Nuclear Information System (INIS)

    Song, Wei; Wu, Yuanyu; Hu, Wenjun; Zuo, Jiaxu

    2015-01-01

Highlights: • The Monte-Carlo principle, coupled with a fire dynamics code, is adopted to perform sodium fire vulnerability assessment. • The method can be used to calculate the failure probability of sodium fire scenarios. • A calculation example and results are given to illustrate the feasibility of the methodology. • Some critical parameters and experience are shared. - Abstract: Sodium fire is a typical and distinctive hazard in sodium cooled fast reactors, and is significant for nuclear safety. This paper introduces a method of sodium fire vulnerability assessment based on the Monte-Carlo principle, which can be used to calculate the probability of each failure mode in sodium fire scenarios. A sodium fire vulnerability assessment of the primary cold trap room of the China Experimental Fast Reactor was then performed to illustrate the feasibility of the methodology. The calculation shows that the conditional failure probability of the key cable is 23.6% in the sodium fire scenario caused by continuous sodium leakage after failure of the isolation device, while the wall temperature, the room pressure, and the aerosol discharge mass all remain below their safety limits.

  18. Functional components for a design strategy: Hot cell shielding in the high reliability safeguards methodology

    Energy Technology Data Exchange (ETDEWEB)

    Borrelli, R.A., E-mail: rborrelli@uidaho.edu

    2016-08-15

    The high reliability safeguards (HRS) methodology has been established for the safeguardability of advanced nuclear energy systems (NESs). HRS is being developed in order to integrate safety, security, and safeguards concerns, while also optimizing these with operational goals for facilities that handle special nuclear material (SNM). Currently, a commercial pyroprocessing facility is used as an example system. One of the goals in the HRS methodology is to apply intrinsic features of the system to a design strategy. This current study investigates the thickness of the hot cell walls that could adequately shield processed materials. This is an important design consideration that carries implications regarding the formation of material balance areas, the location of key measurement points, and material flow in the facility.
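The hot cell wall-thickness question raised above is, at bottom, an exponential-attenuation calculation. A hedged sketch of solving I(x) = I0·e^(-mu·x) for the thickness that meets a dose-rate target; single-energy gammas, buildup ignored, and every number below is an illustrative assumption rather than a value from the HRS study:

```python
import math

# Solve I(x) = I0 * exp(-mu * x) for the wall thickness x that brings the
# uncollided gamma dose rate under a target. The source strength, target,
# and attenuation coefficient are illustrative assumptions only; a real
# shielding analysis also needs buildup factors and the full spectrum.

mu_concrete = 0.15          # 1/cm, assumed linear attenuation coefficient
dose_unshielded = 5.0e4     # mrem/h at the wall, assumed
dose_target = 0.25          # mrem/h design target, assumed

# x = ln(I0 / I_target) / mu
thickness_cm = math.log(dose_unshielded / dose_target) / mu_concrete
print(f"Required concrete thickness ≈ {thickness_cm:.1f} cm")
```

The thickness scales only logarithmically with the attenuation ratio, which is why design margins on wall thickness are comparatively cheap.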

  19. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Dreyhaupt, Jens

    2017-05-01

Full Text Available An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called “cluster randomization”). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.

  20. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects.

    Science.gov (United States)

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called "cluster randomization"). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.
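The sample-size inflation this abstract mentions is usually captured by the design effect DE = 1 + (m - 1)·ICC, where m is the cluster size and ICC the intraclass correlation. A small sketch with illustrative values (the class size, ICC, and individually randomized sample size below are invented for the example):

```python
import math

def design_effect(cluster_size, icc):
    """Variance inflation from randomizing clusters instead of individuals."""
    return 1.0 + (cluster_size - 1) * icc

def clustered_sample_size(n_individual, cluster_size, icc):
    """Total N needed under cluster randomization, rounded up to whole clusters."""
    n_inflated = n_individual * design_effect(cluster_size, icc)
    n_clusters = math.ceil(n_inflated / cluster_size)
    return n_clusters * cluster_size, n_clusters

# Illustration: a teaching-methods trial needing 128 students under
# individual randomization, taught in classes of 25 with an assumed
# intraclass correlation of 0.05.
total_n, clusters = clustered_sample_size(128, 25, 0.05)
print(f"Design effect: {design_effect(25, 0.05):.2f}")
print(f"Need {clusters} classes ({total_n} students) instead of 128 individuals")
```

Even a modest ICC more than doubles the required sample here, which is the "(significantly) larger sample sizes" point made in the abstract.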

  1. Design for reliability: NASA reliability preferred practices for design and test

    Science.gov (United States)

    Lalli, Vincent R.

    1994-01-01

    This tutorial summarizes reliability experience from both NASA and industry and reflects engineering practices that support current and future civil space programs. These practices were collected from various NASA field centers and were reviewed by a committee of senior technical representatives from the participating centers (members are listed at the end). The material for this tutorial was taken from the publication issued by the NASA Reliability and Maintainability Steering Committee (NASA Reliability Preferred Practices for Design and Test. NASA TM-4322, 1991). Reliability must be an integral part of the systems engineering process. Although both disciplines must be weighed equally with other technical and programmatic demands, the application of sound reliability principles will be the key to the effectiveness and affordability of America's space program. Our space programs have shown that reliability efforts must focus on the design characteristics that affect the frequency of failure. Herein, we emphasize that these identified design characteristics must be controlled by applying conservative engineering principles.

  2. Reliability Centered Maintenance (RCM) Methodology and Application to the Shutdown Cooling System for APR-1400 Reactors

    Energy Technology Data Exchange (ETDEWEB)

Faragalla, Mohamed M.; Emmanuel, Efenji; Alhammadi, Ibrahim; Awwal, Arigi M.; Lee, Yong Kwan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)]

    2016-10-15

Shutdown Cooling System (SCS) is a safety-related system that is used in conjunction with the Main Steam and Main or Auxiliary Feedwater Systems to reduce the temperature of the Reactor Coolant System (RCS) in post-shutdown periods from the hot shutdown operating temperature to the refueling temperature. In this paper the RCM methodology is applied to the SCS. The RCM analysis is performed based on a Failure Modes, Effects and Criticality Analysis (FMECA) at the component, system, and plant levels. Logic Tree Analysis (LTA) is used to determine the optimum maintenance tasks. The main objectives of RCM are safety, preservation of system function, cost-effective maintenance of the plant components, and increased reliability and availability. The RCM methodology is useful for improving equipment reliability by strengthening the management of equipment condition, and it leads to a significant decrease in the number of periodic maintenance tasks, an extended maintenance cycle, longer useful life of equipment, and lower overall maintenance cost. It also addresses the safety of the system by assigning a criticality index to the various components and then selecting maintenance activities based on the risk of failure involved. RCM therefore yields a maintenance plan designed for maximum safety in an economical manner, making the system more reliable. For the SCP, increasing the number of condition-monitoring tasks will improve its availability; it is recommended to reduce the number of periodic maintenance activities.
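The criticality ranking and logic-tree task selection described above are often implemented as a risk-priority-number (RPN) scheme. A toy sketch follows; the components, scores, and logic-tree thresholds are hypothetical illustrations, not values from the APR-1400 analysis:

```python
# Toy FMECA-style criticality ranking in the spirit of RCM: score each
# failure mode, rank by risk priority number (RPN = severity x occurrence
# x non-detectability), and pick a task type with a simple logic-tree rule.
# Components and scores are hypothetical.

failure_modes = [
    # (component failure mode, severity 1-10, occurrence 1-10, non-detectability 1-10)
    ("SC pump seal leak",       7, 5, 4),
    ("Heat exchanger fouling",  5, 6, 3),
    ("Isolation valve stuck",   9, 2, 6),
    ("Flow transmitter drift",  3, 4, 2),
]

def rpn(severity, occurrence, detect):
    return severity * occurrence * detect

def task(r):
    """Crude stand-in for a Logic Tree Analysis decision."""
    if r >= 100:
        return "condition monitoring"
    if r >= 40:
        return "periodic maintenance"
    return "run to failure"

for name, s, o, d in sorted(failure_modes, key=lambda m: -rpn(*m[1:])):
    r = rpn(s, o, d)
    print(f"{name:25s} RPN={r:3d} -> {task(r)}")
```

A real LTA asks a chain of yes/no questions (hidden failure? safety consequence? applicable condition task?) rather than thresholding a single number, but the ranking-then-selection flow is the same.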

  3. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large-scale structures. Structural loading variants over the lifetime of the plant are considered to be more difficult to analyse than those for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions which enter this problem are considered. The rare-event situation is briefly mentioned, together with aspects of proof testing and normal and upset loading conditions. (orig.)

  4. Preparation of methodology for reliability analysis of selected digital segments of the instrumentation and control systems of NPPs. Pt. 1

    International Nuclear Information System (INIS)

    Hustak, S.; Patrik, M.; Babic, P.

    2000-12-01

The report is structured as follows: (i) Introduction; (ii) Important notions relating to the safety and dependability of software systems for nuclear power plants (selected notions from IAEA Technical Report No. 397; safety aspects of software application; reliability/dependability aspects of digital systems); (iii) Peculiarities of digital systems and ways to a dependable performance of the required function (failures in the system and principles of defence against them; ensuring resistance of digital systems against failures at various hardware and software levels); (iv) The issue of analytical procedures to assess the safety and reliability of safety-related digital systems (safety and reliability assessment at an early stage of the project; general framework of reliability analysis of complex systems; choice of an appropriate quantitative measure of software reliability); (v) Selected qualitative and quantitative information about the reliability of digital systems (the use of relations between the incidence of various types of faults); and (vi) Conclusions and recommendations. (P.A.)

  5. Health economic evaluation: important principles and methodology.

    Science.gov (United States)

    Rudmik, Luke; Drummond, Michael

    2013-06-01

    To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
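The discounting and QALY machinery mentioned in this abstract reduces to a few formulas: weight each year of life by a health-state utility, discount future costs and QALYs to present value, and divide the incremental cost by the incremental QALYs. A sketch with invented costs and utilities (the 3% rate is a commonly used convention; nothing else here comes from the article):

```python
# Discounted cost-utility comparison of two hypothetical interventions.
# QALYs weight each year of life by a health-state utility (0..1); future
# costs and QALYs are discounted at a fixed annual rate. Numbers invented.

DISCOUNT_RATE = 0.03  # commonly used 3% annual rate

def present_value(stream, rate=DISCOUNT_RATE):
    """Discount a per-year stream back to year 0."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream))

# Five-year horizon: cost per year and utility per year for each option.
costs_new      = [12000, 2000, 2000, 2000, 2000]
utility_new    = [0.85] * 5
costs_standard = [5000, 1500, 1500, 1500, 1500]
utility_std    = [0.70] * 5

delta_cost = present_value(costs_new) - present_value(costs_standard)
delta_qaly = present_value(utility_new) - present_value(utility_std)
icer = delta_cost / delta_qaly
print(f"Incremental cost: ${delta_cost:,.0f}, incremental QALYs: {delta_qaly:.3f}")
print(f"ICER ≈ ${icer:,.0f} per QALY gained")
```

Uncertainty analysis, decision trees, and Markov models, as the abstract notes, build on exactly this core calculation.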

  6. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. This paper describes the numerous factors that potentially have a degrading effect on system reliability, and the ways in which those factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  7. Development of a methodology for conducting an integrated HRA/PRA --. Task 1, An assessment of human reliability influences during LP&S conditions PWRs

    Energy Technology Data Exchange (ETDEWEB)

Luckas, W.J.; Barriere, M.T.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States)]; Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States)]; Cooper, S.E. [Science Applications International Corp., McLean, VA (United States)]

    1993-06-01

During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  8. METHODOLOGICAL PRINCIPLES OF FORMING REPERTOIRE OF STUDENTS’ FOLK INSTRUMENTAL ORCHESTRA

    Directory of Open Access Journals (Sweden)

    Mykola Pshenychnykh

    2016-11-01

Full Text Available One of the main aspects of forming future music teachers' professional competence, connected with mastering professional musical and performing skills in the course “Orchestra Class” and realized in the activity of students' performing groups, is revealed. The problem of creative personality development is relevant today, as creative future music art teachers orient themselves freely, and guide their pupils, in today's cultural environment, music and media space, and have a strong musical taste and aesthetic guidelines. The music genre groups that traditionally make up the repertoire of student folk-instrument orchestras are characterized in the article: arrangements of folk tunes; works of Ukrainian and world classics, orchestrated for folk groups with each orchestra's performing possibilities taken into account; and works by contemporary authors written specifically for the orchestra of folk instruments. The main methodological principles of selecting the repertoire for the student orchestra of folk instruments are disclosed, including: the technical, artistic and performing capabilities of the student group; inclusion of works of different genres in the repertoire; correspondence of orchestra scores to the instrumental composition of the student orchestra, and their correction if necessary; selection of works whose performance arouses the interest of the student audience; use of the experience of the leading professional ensembles of folk instruments; and constant updating of the orchestra's repertoire. In the conclusion the author emphasizes that taking these methodological tips into account helps solve the main tasks of the course “Orchestra Class”: students' acquaintance with the history of foundation, composition, ways of musicianship and technique of playing the instruments of the folk instrument orchestra, and acquaintance with specific orchestral music; development of all

  9. State of the art of probabilistic safety analysis (PSA) in the FRG, and principles of a PSA-guideline

    International Nuclear Information System (INIS)

    Balfanz, H.P.

    1987-01-01

    Contents of the articles: Survey of PSA performed during licensing procedures of an NPP; German Nuclear Standards' requirements on the reliability of safety systems; PSA-guideline for NPP: Principles and suggestions; Motivation and tasks of PSA; Aspects of the methodology of safety analyses; Structure of event tree and fault tree analyses; Extent of safety analyses; Performance and limits of PSA. (orig./HSCH)

  10. Field programmable gate array reliability analysis using the dynamic flow graph methodology

    Energy Technology Data Exchange (ETDEWEB)

McNelles, Phillip; Lu, Lixuan [Faculty of Energy Systems and Nuclear Science, University of Ontario Institute of Technology (UOIT), Ontario (Canada)]

    2016-10-15

    Field programmable gate array (FPGA)-based systems are thought to be a practical option to replace certain obsolete instrumentation and control systems in nuclear power plants. An FPGA is a type of integrated circuit, which is programmed after being manufactured. FPGAs have some advantages over other electronic technologies, such as analog circuits, microprocessors, and Programmable Logic Controllers (PLCs), for nuclear instrumentation and control, and safety system applications. However, safety-related issues for FPGA-based systems remain to be verified. Owing to this, modeling FPGA-based systems for safety assessment has now become an important point of research. One potential methodology is the dynamic flowgraph methodology (DFM). It has been used for modeling software/hardware interactions in modern control systems. In this paper, FPGA logic was analyzed using DFM. Four aspects of FPGAs are investigated: the 'IEEE 1164 standard', registers (D flip-flops), configurable logic blocks, and an FPGA-based signal compensator. The ModelSim simulations confirmed that DFM was able to accurately model those four FPGA properties, proving that DFM has the potential to be used in the modeling of FPGA-based systems. Furthermore, advantages of DFM over traditional reliability analysis methods and FPGA simulators are presented, along with a discussion of potential issues with using DFM for FPGA-based system modeling.
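The registered (D flip-flop) behavior examined in the study above is what makes FPGA logic time-dependent, and it is this clocked dependency that a DFM model must track across discrete time steps. A toy sketch of the one-cycle delay a register introduces (the input sequences are invented, and this is a plain simulation, not a DFM transition table):

```python
# Toy discrete-time simulation of a D flip-flop feeding a 2-input AND,
# the kind of clocked dependency a DFM model expresses as transition
# tables across time steps. Input sequences are illustrative.

def simulate(d_inputs, enable_inputs):
    """Each clock tick: the output sees the *stored* Q, then Q latches D."""
    q = 0  # assumed reset state
    outputs = []
    for d, en in zip(d_inputs, enable_inputs):
        outputs.append(q & en)  # combinational logic uses the registered value
        q = d                   # register updates on the clock edge
    return outputs

d      = [1, 1, 0, 1, 0]
enable = [1, 1, 1, 0, 1]
print(simulate(d, enable))  # output lags D by one cycle
```

In a DFM model each variable would be discretized into states and the update rule above would become a row-by-row transition table, which is what lets the methodology trace failures backward through time steps.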

  11. Application of REPAS Methodology to Assess the Reliability of Passive Safety Systems

    Directory of Open Access Journals (Sweden)

    Franco Pierro

    2009-01-01

Full Text Available The paper presents the Reliability Evaluation of Passive Safety System (REPAS) methodology developed by the University of Pisa. The general objective of REPAS is to characterize in an analytical way the performance of a passive system, in order to increase confidence in its operation, and to compare the performances of active versus passive systems and of different passive systems. REPAS can be used in the design of passive safety systems to assess their goodness and to optimize their costs. It may also provide numerical values that can be used in more complex safety assessment studies, and it can be seen as a support to Probabilistic Safety Analysis studies. With regard to this, some examples of the application of the methodology are reported in the paper. A best-estimate thermal-hydraulic code, RELAP5, has been used to support the analyses and to model the selected systems. Probability distributions have been assigned to the uncertain input parameters through engineering judgment. The Monte Carlo method has been used to propagate uncertainties, and Wilks' formula has been taken into account to select the sample size. Failure criteria are defined in terms of non-fulfillment of the defined design targets.
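The Wilks' formula sample-size selection mentioned in this record has a closed form: for a one-sided tolerance bound covering a fraction γ of the population with confidence β, the smallest sample size n satisfies 1 - γ^n ≥ β. A short sketch:

```python
import math

def wilks_sample_size(coverage, confidence):
    """Smallest n such that the maximum of n samples bounds the 'coverage'
    quantile with at least 'confidence' (first-order, one-sided Wilks)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

# Classic 95/95 criterion used in best-estimate-plus-uncertainty analyses:
print(wilks_sample_size(0.95, 0.95))  # -> 59
```

This is why 59 code runs is the canonical count for a one-sided 95/95 statement; tightening coverage to 99% at the same confidence pushes the count to 299.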

  12. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    Energy Technology Data Exchange (ETDEWEB)

Chung, Kiwhan [Los Alamos National Laboratory]; Beddingfield, David H. [Los Alamos National Laboratory]; Geist, William H. [Los Alamos National Laboratory]; Lee, Sang-Yoon [unaffiliated]

    2012-07-03

A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX, and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. The conclusions of this presentation are: (1) a fuel depletion methodology was established and its safeguards application demonstrated; (2) the fuel is proliferation resistant at high discharge burnup (≈80 GWD/MtHM): unfavorable isotopics, the high number of pebbles needed, and pebbles that are harder to reprocess; (3) spent fuel should remain under safeguards comparable to that of LWR fuel; and (4) diversion scenarios were not considered here, but such analyses can be performed.

  13. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  14. Reliability models for Space Station power system

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kim, Y.; Wagner, H.

    1987-01-01

This paper presents a methodology for the reliability evaluation of the Space Station power system. The two options considered are the photovoltaic system and the solar dynamic system. Reliability models for both options are described, along with the methodology for calculating the reliability indices.
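Reliability indices for a redundant power system, whether photovoltaic or solar dynamic, are typically built from k-out-of-n combinations of per-string availabilities. A minimal sketch with an invented string availability (the redundancy configurations shown are illustrative, not the Space Station design):

```python
from math import comb

# Availability of a power system with n redundant strings, of which k are
# required ("k-out-of-n"), from per-string availability. Values invented;
# strings are assumed independent and identical.

def k_out_of_n_availability(n, k, a):
    """P(at least k of n independent strings are up)."""
    return sum(comb(n, i) * a**i * (1 - a)**(n - i) for i in range(k, n + 1))

a_string = 0.95  # assumed availability of one power string
for n, k in [(4, 4), (4, 3), (5, 3)]:
    print(f"{k}-out-of-{n}: availability = {k_out_of_n_availability(n, k, a_string):.4f}")
```

Adding one spare string (4-out-of-4 versus 3-out-of-4) already moves the index from roughly 0.81 to about 0.99, which is why redundancy allocation dominates such evaluations.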

  15. A first-principles generic methodology for representing the knowledge base of a process diagnostic expert system

    International Nuclear Information System (INIS)

    Reifman, J.; Briggs, L.L.; Wei, T.Y.C.

    1990-01-01

    In this paper we present a methodology for identifying faulty component candidates of process malfunctions through basic physical principles of conservation, functional classification of components and information from the process schematics. The basic principles of macroscopic balance of mass, momentum and energy in thermal hydraulic control volumes are applied in a novel approach to incorporate deep knowledge into the knowledge base. Additional deep knowledge is incorporated through the functional classification of process components according to their influence in disturbing the macroscopic balance equations. Information from the process schematics is applied to identify the faulty component candidates after the type of imbalance in the control volumes is matched against the functional classification of the components. Except for the information from the process schematics, this approach is completely general and independent of the process under consideration. The use of basic first-principles, which are physically correct, and the process-independent architecture of the diagnosis procedure allow for the verification and validation of the system. A prototype process diagnosis expert system is developed and a test problem is presented to identify faulty component candidates in the presence of a single failure in a hypothetical balance of plant of a liquid metal nuclear reactor plant
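The diagnostic idea in this record (evaluate conservation balances over control volumes, then map any imbalance to the components functionally able to disturb it) can be sketched in a few lines. The control volume, tolerance, and component classification below are hypothetical placeholders, not the paper's knowledge base:

```python
# First-principles diagnosis sketch: evaluate a steady-state mass balance
# residual over a control volume and, if it is violated, nominate the
# components functionally classified as able to disturb that balance.
# All data are invented for illustration.

TOLERANCE = 0.5  # kg/s, assumed measurement-uncertainty band

# Functional classification: which components can disturb which balance.
components = {
    "mass": ["feed pump", "drain valve", "pipe (leak)"],
    "energy": ["heater", "heat exchanger"],
}

def diagnose(m_in, m_out):
    """Return faulty component candidates for a mass-balance violation."""
    residual = m_in - m_out
    if abs(residual) <= TOLERANCE:
        return []
    return components["mass"]

# Measured flows (kg/s) across the control volume boundary:
candidates = diagnose(m_in=42.0, m_out=39.8)
print(f"Faulty component candidates: {candidates}")
```

The full methodology repeats this check for momentum and energy over every control volume in the process schematic, intersecting the candidate lists to narrow the diagnosis.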

  16. Robust Reliability or reliable robustness? - Integrated consideration of robustness and reliability aspects

    DEFF Research Database (Denmark)

    Kemmler, S.; Eifler, Tobias; Bertsche, B.

    2015-01-01

    products are and vice versa. For a comprehensive understanding and to use existing synergies between both domains, this paper discusses the basic principles of Reliability- and Robust Design theory. The development of a comprehensive model will enable an integrated consideration of both domains...

  17. Seeking high reliability in primary care: Leadership, tools, and organization.

    Science.gov (United States)

    Weaver, Robert R

    2015-01-01

    Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls, which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an

  18. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
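    The fast probability integration machinery itself is not reproduced here, but the series-system idea underlying such a multidisciplinary analysis can be shown with a plain Monte Carlo sketch. The three limit states and all distributions below are invented stand-ins for structural, heat transfer, and fluid flow failure modes:

```python
import random

# Hedged sketch (not the NESSUS method itself): plain Monte Carlo estimate
# of series-system failure probability with one invented limit state per
# discipline.

def system_fails(rng):
    load = rng.gauss(100.0, 15.0)   # structural load, invented distribution
    t_in = rng.gauss(300.0, 10.0)   # inlet temperature
    flow = rng.gauss(5.0, 0.5)      # coolant flow rate
    structural = load > 140.0              # stress limit exceeded
    thermal = t_in - 8.0 * flow > 280.0    # outlet temperature too high
    fluid = flow < 3.5                     # insufficient flow
    return structural or thermal or fluid  # series system: any mode fails it

def mc_failure_probability(n=100_000, seed=42):
    rng = random.Random(seed)
    return sum(system_fails(rng) for _ in range(n)) / n

pf = mc_failure_probability()
print(f"estimated P(system failure) ~ {pf:.4f}")
```

    Fast probability integration replaces this brute-force sampling with analytical approximations around the most probable failure point, which is what makes the approach tractable for expensive disciplinary models.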

  19. The DYLAM approach to systems safety and reliability assessment

    International Nuclear Information System (INIS)

    Amendola, A.

    1988-01-01

    A survey of the principal features and applications of DYLAM (Dynamic Logical Analytical Methodology) is presented, whose basic principles can be summarized as follows: after a particular modelling of the component states, computerized heuristic procedures generate stochastic configurations of the system, while the resulting physical processes are simultaneously simulated, both to account for the possible interactions between the physics and the states and to search for dangerous system configurations and their related probabilities. The association of probabilistic techniques for describing the states with physical equations for describing the process results in a very powerful tool for safety and reliability assessment of systems potentially subject to dangerous accidental transients. A comprehensive picture of DYLAM's capability for manifold applications can be obtained from a review of the case studies analyzed (LMFBR core accident, systems reliability assessment, accident simulation, man-machine interaction analysis, chemical reactor safety, etc.)
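    The coupling of stochastic state transitions with simultaneous physical simulation can be illustrated with a toy model; the single-tank "plant", transition rate, and danger threshold below are invented and stand in for the hypothesized DYLAM study cases:

```python
import random

# Toy illustration of the DYLAM-style coupling (all numbers invented):
# component states evolve stochastically at each time step while the
# physics is integrated under the current configuration, and dangerous
# configurations are detected on the fly. The "plant" is a single tank
# whose level rises once its relief valve fails closed.

def simulate(seed, t_end=100.0, dt=0.1, fail_rate=0.001):
    rng = random.Random(seed)
    level, valve_ok = 1.0, True
    for step in range(int(t_end / dt)):
        if valve_ok and rng.random() < fail_rate * dt:
            valve_ok = False                    # stochastic state transition
        inflow = 0.05
        outflow = 0.05 if (valve_ok and level > 1.0) else 0.0
        level += (inflow - outflow) * dt        # physics under current state
        if level > 1.5:
            return step * dt                    # time of dangerous configuration
    return None                                 # no overflow within t_end

times = [simulate(s) for s in range(1000)]
p_danger = sum(t is not None for t in times) / 1000
print(f"P(overflow within 100 time units) ~ {p_danger:.3f}")
```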

  20. Study of evaluation techniques of software safety and reliability in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Youn, Cheong; Baek, Y. W.; Kim, H. C.; Park, N. J.; Shin, C. Y. [Chungnam National Univ., Taejon (Korea, Republic of)

    1999-04-15

    Software system development process and software quality assurance activities are examined in this study. In particular, software safety and reliability requirements in nuclear power plants are investigated. For this purpose, methodologies and tools which can be applied to the software analysis, design, implementation, testing, and maintenance steps are evaluated. The necessary tasks for each step are investigated, and the duty, input, and detailed activity for each task are defined to establish a development process for high quality software systems. This means applying the basic concepts of software engineering and the principles of system development. This study establishes a guideline that can assure the software safety and reliability requirements of digitalized nuclear plant systems and that can be used as a guidebook on the software development process to assure software quality in many software development organizations.

  1. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts that are experienced in these techniques, the analyses can be focused in certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs
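    The quantification step such an analysis rests on can be illustrated with minimal cut sets. The cut sets and probabilities below are invented for illustration, not taken from the NUREG-1150 plant models:

```python
from math import prod

# Illustrative core-damage-frequency quantification from minimal cut sets
# of basic events assumed independent (all numbers invented).

cut_sets = [
    {"diesel_A": 1e-2, "diesel_B": 1e-2},              # both diesels fail
    {"aux_feed_pump": 3e-3, "operator_action": 1e-2},  # hardware + HRA failure
    {"dc_bus": 5e-5},                                  # single-event cut set
]

def rare_event(cut_sets):
    """Rare-event approximation: sum of cut-set probabilities."""
    return sum(prod(cs.values()) for cs in cut_sets)

def min_cut_upper_bound(cut_sets):
    """Min-cut upper bound: 1 - prod(1 - P(cut set))."""
    return 1.0 - prod(1.0 - prod(cs.values()) for cs in cut_sets)

print(rare_event(cut_sets))           # ≈ 1.8e-4 (1e-4 + 3e-5 + 5e-5)
print(min_cut_upper_bound(cut_sets))  # slightly below the rare-event sum
```

    Focusing the modeling on the dominant systems and failure modes, as the report describes, amounts to keeping only the cut sets that drive this sum.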

  2. Development of a highly reliable CRT processor

    International Nuclear Information System (INIS)

    Shimizu, Tomoya; Saiki, Akira; Hirai, Kenji; Jota, Masayoshi; Fujii, Mikiya

    1996-01-01

    Although CRT processors have been employed by the main control board to reduce the operator's workload during monitoring, the control systems are still operated by hardware switches. For further advancement, direct controller operation through a display device is expected. A CRT processor providing direct controller operation must be as reliable as the hardware switches are. The authors are developing a new type of highly reliable CRT processor that enables direct controller operations. In this paper, we discuss the design principles behind a highly reliable CRT processor. The principles are defined by studies of software reliability and of the functional reliability of the monitoring and operation systems. The functional configuration of an advanced CRT processor is also addressed. (author)

  3. A novel application for the cavalieri principle: a stereological and methodological study.

    Science.gov (United States)

    Altunkaynak, Berrin Zuhal; Altunkaynak, Eyup; Unal, Deniz; Unal, Bunyamin

    2009-08-01

    The Cavalieri principle was applied to consecutive pathology sections that were photographed at the same magnification and used to estimate tissue volumes via superimposing a point counting grid on these images. The goal of this study was to perform the Cavalieri method quickly and practically. In this study, 10 adult female Sprague Dawley rats were used. Brain tissue was removed and sampled both systematically and randomly. Brain volumes were estimated using two different methods. First, all brain slices were scanned with an HP ScanJet 3400C scanner, and their images were shown on a PC monitor. Brain volume was then calculated based on these images. Second, all brain slices were photographed at 10× magnification with a microscope camera, and brain volumes were estimated based on these micrographs. There was no statistically significant difference between the volume measurements of the two techniques (P>0.05; Paired Samples t Test). This study demonstrates that personal computer scanning of serial tissue sections allows for easy and reliable volume determination based on the Cavalieri method.
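    Both variants of the method rest on the same point-counting estimator, V ≈ t · (a/p) · ΣPi. A minimal sketch with invented counts:

```python
# Cavalieri estimator: V = t * (a/p) * sum(P_i), with t the distance
# between section planes, a/p the grid area associated with one point, and
# P_i the points hitting the structure on section i. Numbers are invented.

def cavalieri_volume(point_counts, section_spacing_mm, area_per_point_mm2):
    return section_spacing_mm * area_per_point_mm2 * sum(point_counts)

counts = [12, 18, 25, 27, 22, 14, 6]   # point counts over systematic sections
v = cavalieri_volume(counts, section_spacing_mm=2.0, area_per_point_mm2=4.0)
print(f"estimated volume: {v} mm^3")   # 2.0 * 4.0 * 124 = 992.0 mm^3
```

    In practice the grid density is chosen so that a few hundred points in total hit the object, which is usually enough for an acceptably low coefficient of error.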

  4. Public views on principles for health care priority setting: findings of a European cross-country study using Q methodology.

    Science.gov (United States)

    van Exel, Job; Baker, Rachel; Mason, Helen; Donaldson, Cam; Brouwer, Werner

    2015-02-01

    Resources available to the health care sector are finite and typically insufficient to fulfil all the demands for health care in the population. Decisions must be made about which treatments to provide. Relatively little is known about the views of the general public regarding the principles that should guide such decisions. We present the findings of a Q methodology study designed to elicit the shared views of the general public across ten countries regarding the appropriate principles for prioritising health care resources. In 2010, 294 respondents rank-ordered a set of cards, and the results were subjected to by-person factor analysis to identify common patterns in sorting. Five distinct viewpoints were identified: (I) "Egalitarianism, entitlement and equality of access"; (II) "Severity and the magnitude of health gains"; (III) "Fair innings, young people and maximising health benefits"; (IV) "The intrinsic value of life and healthy living"; (V) "Quality of life is more important than simply staying alive". Given the plurality of views on the principles for health care priority setting, no single equity principle can be used to underpin health care priority setting. Hence, the process of decision making becomes more important, in which, arguably, these multiple perspectives in society should somehow be reflected.

  5. Multi-Disciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  6. The analysis of pricing principles at domestic industrial enterprises

    OpenAIRE

    I.M. Rjabchenko; V.V. Bozhkova

    2013-01-01

    Theoretical and methodological aspects of marketing pricing formation are investigated in the article. The aim of this research is the systematization of marketing pricing principles and the formation of corresponding offers concerning the perfection of domestic industrial enterprises' pricing policies. The results of the analysis. The authors note that pricing principles are an important element of pricing methodology which form basic posit...

  7. Reliability Assessment and Reliability-Based Inspection and Maintenance of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Ramírez, José G. Rangel; Sørensen, John Dalsgaard

    2009-01-01

    Probabilistic methodologies represent an important tool to identify the suitable strategy to inspect and deal with the deterioration in structures such as offshore wind turbines (OWT). Reliability based methods such as Risk Based Inspection (RBI) planning may represent a proper methodology to opt...

  8. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.
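    The uncertainty-propagation loop at the core of such an assessment can be sketched with a toy algebraic surrogate standing in for the RELAP5-3D plant model. All parameters, ranges, and the temperature limit below are invented:

```python
import random

# Hedged sketch of an RMPS-style loop: sample uncertain inputs, run the
# "plant model" (here a toy surrogate, not RELAP5-3D), and count functional
# failures of the passive cavity cooling during station blackout.

def peak_temp_k(decay_power_mw, emissivity, air_temp_k):
    """Toy surrogate for peak temperature during the transient."""
    return air_temp_k + 120.0 * decay_power_mw / (emissivity * 10.0)

def rmps_failure_probability(n=50_000, limit_k=750.0, seed=7):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        power = rng.uniform(18.0, 26.0)   # decay power uncertainty
        eps = rng.uniform(0.6, 0.9)       # surface emissivity uncertainty
        t_air = rng.gauss(300.0, 15.0)    # ambient air temperature
        failures += peak_temp_k(power, eps, t_air) > limit_k
    return failures / n

print(f"functional failure probability ~ {rmps_failure_probability():.3f}")
```

    The point the abstract makes is visible even in this toy: the passive system "fails" not through component breakage but through ordinary deviations in boundary conditions pushing the physics past a functional limit.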

  9. Reliability and Probabilistic Risk Assessment - How They Play Together

    Science.gov (United States)

    Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang

    2015-01-01

    PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicles risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA conducted a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will
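    The event-tree part of the PRA toolkit described above can be sketched briefly; the initiating event, pivotal events, and probabilities below are invented, not drawn from any of the NASA studies:

```python
# Sketch of an event-tree quantification of the kind used in launch-vehicle
# PRA: an initiating event followed by pivotal events; each end-state
# probability is the product along its path. All numbers are invented.

def event_tree(p_init, pivots):
    """Enumerate end states; pivots is a list of (name, p_failure) pairs."""
    states = [([], p_init)]
    for name, p_fail in pivots:
        states = [branch
                  for path, p in states
                  for branch in ((path + [(name, "ok")], p * (1 - p_fail)),
                                 (path + [(name, "fail")], p * p_fail))]
    return states

states = event_tree(1e-3, [("engine_shutdown", 0.01), ("abort_system", 0.05)])
p_loc = sum(p for path, p in states if all(s == "fail" for _, s in path))
print(f"P(LOC per flight) = {p_loc:.2e}")  # 1e-3 * 0.01 * 0.05 = 5.00e-07
```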

  10. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  11. Modern electronic maintenance principles

    CERN Document Server

    Garland, DJ

    2013-01-01

    Modern Electronic Maintenance Principles reviews the principles of maintaining modern, complex electronic equipment, with emphasis on preventive and corrective maintenance. Unfamiliar subjects such as the half-split method of fault location, functional diagrams, and fault finding guides are explained. This book consists of 12 chapters and begins by stressing the need for maintenance principles and discussing the problem of complexity as well as the requirements for a maintenance technician. The next chapter deals with the connection between reliability and maintenance and defines the terms fai

  12. Design for ASIC reliability for low-temperature applications

    Science.gov (United States)

    Chen, Yuan; Mojaradi, Mohammad; Westergard, Lynett; Billman, Curtis; Cozy, Scott; Burke, Gary; Kolawa, Elizabeth

    2005-01-01

    In this paper, we present a methodology to design for reliability for low temperature applications without requiring process improvement. The developed hot carrier aging lifetime projection model takes into account both the transistor substrate current profile and the temperature profile to determine the minimum transistor size needed to meet reliability requirements. The methodology is applicable to automotive, military, and space applications, where the temperature ranges can vary widely. A case study utilizing this methodology is given to design reliability into a custom application-specific integrated circuit (ASIC) for a Mars exploration mission.
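    The sizing calculation can be illustrated with a common power-law hot-carrier lifetime model. The model form and every constant below are assumptions for illustration, not the paper's actual projection model:

```python
# Illustrative HCI design-margin calculation (assumed model, not the
# paper's): lifetime is taken as tau = c * (W / I_sub)^n, so widening the
# transistor lowers the substrate-current density and extends lifetime.

def hci_lifetime_years(width_um, i_sub_ua, c=5.0, n=3.0):
    return c * (width_um / i_sub_ua) ** n

def min_width_um(i_sub_ua, tau_req_years, c=5.0, n=3.0):
    """Smallest W with hci_lifetime_years(W, i_sub_ua) >= tau_req_years."""
    return i_sub_ua * (tau_req_years / c) ** (1.0 / n)

# Hypothetical worst-case substrate current over the mission temperature
# profile (hot-carrier stress is typically worst at low temperature).
i_sub_worst = 2.0  # uA
w = min_width_um(i_sub_worst, tau_req_years=15.0)
print(f"minimum transistor width: {w:.2f} um")  # 2 * (15/5)^(1/3) ≈ 2.88 um
```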

  13. Developing principles of growth

    DEFF Research Database (Denmark)

    Neergaard, Helle; Fleck, Emma

    of the principles of growth among women-owned firms. Using an in-depth case study methodology, data was collected from women-owned firms in Denmark and Ireland, as these countries are similar in contextual terms, e.g. population and business composition, dominated by micro, small and medium-sized enterprises....... Extending on principles put forward in effectuation theory, we propose that women grow their firms according to five principles which enable women’s enterprises to survive in the face of crises such as the current financial world crisis....

  14. Application case study of AP1000 automatic depressurization system (ADS) for reliability evaluation by GO-FLOW methodology

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, Muhammad, E-mail: hashimsajid@yahoo.com; Hidekazu, Yoshikawa, E-mail: yosikawa@kib.biglobe.ne.jp; Takeshi, Matsuoka, E-mail: mats@cc.utsunomiya-u.ac.jp; Ming, Yang, E-mail: myang.heu@gmail.com

    2014-10-15

    Highlights: • Discussion of why AP1000 is equipped with the ADS system, in comparison with a conventional PWR. • Clarification of full and partial depressurization of the reactor coolant system by the ADS system. • Application case study of the four-stage ADS system for reliability evaluation in a LBLOCA. • The GO-FLOW tool is capable of evaluating the dynamic reliability of passive safety systems. • The calculated ADS reliability result significantly increased the dynamic reliability of PXS. - Abstract: The AP1000 nuclear power plant (NPP) utilizes passive means for its safety systems to ensure safety in events of transients or severe accidents. One of the unique safety systems of AP1000 compared with a conventional PWR is the “four-stage Automatic Depressurization System (ADS)”; the ADS originally works as an active safety system. In the present study, the authors first discuss the reasons why the four-stage ADS system is added in the AP1000 plant, compared with a conventional PWR, in the aspect of reliability, and then explain the full and partial depressurization of the RCS by the four-stage ADS in events of transients and loss of coolant accidents (LOCAs). Lastly, an application case study of the four-stage ADS system of AP1000 has been conducted for the reliability evaluation of the ADS system under postulated conditions of full RCS depressurization during a large break loss of coolant accident (LBLOCA) in one of the RCS cold legs. In this case study, the reliability evaluation is made by the GO-FLOW methodology to determine the influence of the ADS system on the dynamic reliability of the passive core cooling system (PXS) of AP1000, i.e., what will happen if the ADS system fails or successfully actuates. GO-FLOW is a success-oriented reliability analysis tool capable of evaluating system reliability/unavailability as an alternative to Fault Tree Analysis (FTA) and Event Tree Analysis (ETA) tools. Under these specific conditions of LBLOCA, the GO-FLOW calculated reliability results indicated

  15. THE GENERAL METHODOLOGICAL PRINCIPLES OF COMBINED OPTIONAL ONLINE ENGLISH LANGUAGE TRAINING OF PRIMARY SCHOOL STUDENTS

    Directory of Open Access Journals (Sweden)

    E. I. Zadorozhnaya

    2016-01-01

    Full Text Available The aim of the publication is to demonstrate the implementation of general methodological principles of optional elementary school online foreign language learning, using the example of a virtual course for students of the second and third grades. Methods. The methods involve pedagogical modeling and projecting; the experience of foreign and Russian methodologists, teachers and researchers is analysed, generalized and adjusted to modern realities. Results and scientific novelty. On the basis of the requirements of the state educational standard and pupils' interest in computer games, the author's technique of combined facultative educational activities, integrated with training in English at elementary school, is developed. Online training in the form of games (additional to the major classroom activities) gives pupils a choice of tasks that interest them and lets them study the material at a comfortable, individual pace; tasks can be performed at home, away from the stressful situations specific to school examinations, allowing pupils to master personal, metasubject and subject competences most effectively. In the general context of quality improvement of general education, the modernization of the educational process assumes not only justification of its new content, but also restructuring of its scientific and methodical support, which has to meet the essential needs of teachers and pupils and facilitate access to necessary specific information. The lack of a methodical base for the creation of electronic distance resources for the foreign-language education of younger school students motivated the author to create an original methodical concept of online training taking into account the age of the pupils. The complex of general methodical principles is thoroughly considered; based on these principles, the proposed modular technique of the organization of an online class is created and implemented. Interactive blocks are

  16. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  17. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
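    The redundancy-versus-repairability comparison at the heart of the analysis can be sketched with binomial survival probabilities. The team sizes, module counts, and reliabilities below are invented, not values from the study:

```python
from math import comb

# Sketch of the spare-robot vs spare-module comparison (numbers invented):
# each robot is a series system of identical modules; a spare module lets a
# robot tolerate one module failure.

def at_least_k_of_n(n, k, r):
    """P(at least k of n independent items with reliability r survive)."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

r_mod = 0.95                 # single-module mission reliability
r_robot = r_mod ** 4         # robot = 4 modules in series

# Mission needs 2 working robots.
redundant = at_least_k_of_n(3, 2, r_robot)       # 3 robots, no spares
repairable = at_least_k_of_n(4, 3, r_mod) ** 2   # 2 robots, 1 spare module each

print(redundant, repairable)  # spare modules beat the spare robot here
```

    With these invented numbers the repairable configuration wins, consistent with the abstract's observation that cheaper components plus spares can reduce mission cost while maintaining reliability.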

  18. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually manually constructed by analysts, a review by other analysts is widely used for verifying constructed fault tree models. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, the modeling of a system for a system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the development of the RGGG (Reliability Graph with General Gates) methodology. The difference between the two is that the RBDGG methodology focuses on block failures while the RGGG methodology focuses on connection line failures. However, it is also known that an RGGG model can be converted to an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated
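    The cross-checking idea can be illustrated on a trivial system: quantify it once as a fault tree (failure logic) and once as a reliability block diagram (success logic) and confirm the results agree. The system and unavailabilities below are invented:

```python
# Toy verification check: two redundant pumps in series with a valve,
# quantified both ways. Component unavailabilities are invented and events
# are assumed independent.

q_pump, q_valve = 0.1, 0.01

# Fault tree: TOP = (pump_A AND pump_B) OR valve.
q_and = q_pump * q_pump
q_top = q_and + q_valve - q_and * q_valve

# RBD: two parallel pump blocks in series with the valve block.
a_rbd = (1.0 - q_pump ** 2) * (1.0 - q_valve)

print(q_top, 1.0 - a_rbd)  # both ≈ 0.0199
```

    A disagreement between the two quantifications would flag a modeling error in one of them, which is the essence of using an RBDGG model to verify a manually constructed fault tree.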

  19. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
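    The conservative-scheme idea can be sketched for the hindered-settling part of the model alone. Note the hedges: this sketch uses a local Lax-Friedrichs numerical flux for robustness rather than the paper's Godunov-type flux, omits the feed source, compression, and dispersion terms, and all parameters are invented:

```python
import math

# Minimal explicit conservative update for dC/dt + dF(C)/dz = 0
# (z increasing downward) with a Vesilind-type batch settling flux.

V0, K = 6.0, 0.5   # invented flux parameters

def flux(c):
    return V0 * c * math.exp(-K * c)   # hindered-settling flux F(C)

def step(c, dz, dt):
    """One conservative update; zero flux at the closed column ends."""
    n = len(c)
    f = [flux(ci) for ci in c]
    fi = [0.0] * (n + 1)               # interface (numerical) fluxes
    for i in range(1, n):
        # local Lax-Friedrichs flux; V0 bounds |F'(C)| for C >= 0
        fi[i] = 0.5 * (f[i - 1] + f[i]) - 0.5 * V0 * (c[i] - c[i - 1])
    return [c[i] - dt / dz * (fi[i + 1] - fi[i]) for i in range(n)]

c = [3.0] * 20                          # uniform initial concentration
for _ in range(100):
    c = step(c, dz=0.05, dt=0.002)      # CFL number = 6 * 0.002 / 0.05 = 0.24
print(round(sum(ci * 0.05 for ci in c), 6))  # total mass conserved -> 3.0
```

    Because the interface fluxes telescope, total mass is conserved by construction; this is the property that makes such schemes reliable for predicting sludge inventory in the SST.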

  20. Standards in reliability and safety engineering

    International Nuclear Information System (INIS)

    O'Connor, Patrick

    1998-01-01

    This article explains how the highest 'world class' levels of reliability and safety are achieved: by adherence to the basic principles of excellence in design, production, support and maintenance, by continuous improvement, and by understanding that excellence and improvement lead to reduced costs. These principles are contrasted with the methods that have been developed and standardised, particularly military standards for reliability, ISO9000, and safety case regulations. The article concludes that the formal, standardised approaches are misleading and counterproductive, and recommends that they be replaced by a philosophy based on the realities of human performance.

  1. A technical survey on issues of the quantitative evaluation of software reliability

    International Nuclear Information System (INIS)

    Park, J. K; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Park, J. H.; Kang, H. G.; Lee, K. Y.; Park, J. K.

    2000-04-01

    To develop a methodology for evaluating the reliability of the software included in digital instrumentation and control (I and C) systems, many of the methodologies/techniques proposed in the software reliability engineering field are analyzed to identify their strong and weak points. According to the analysis results, no methodology/technique exists that can be directly applied to the evaluation of software reliability. Thus, additional research to combine the most appropriate techniques from the existing ones is needed to evaluate software reliability. (author)

  2. A Review: Passive System Reliability Analysis – Accomplishments and Unresolved Issues

    Energy Technology Data Exchange (ETDEWEB)

    Nayak, Arun Kumar, E-mail: arunths@barc.gov.in [Reactor Engineering Division, Reactor Design and Development Group, Bhabha Atomic Research Centre, Mumbai (India); Chandrakar, Amit [Homi Bhabha National Institute, Mumbai (India); Vinod, Gopika [Reactor Safety Division, Reactor Design and Development Group, Bhabha Atomic Research Centre, Mumbai (India)

    2014-10-10

    Reliability assessment of passive safety systems is an important issue, since the safety of advanced nuclear reactors relies on several passive features. In this context, a few methodologies such as reliability evaluation of passive safety system (REPAS), reliability methods for passive safety functions (RMPS), and analysis of passive systems reliability (APSRA) have been developed in the past. These methodologies have been used to assess the reliability of various passive safety systems. While these methodologies have certain features in common, they differ in their treatment of certain issues, for example, model uncertainties and the deviation of geometric and process parameters from their nominal values. This paper presents the state of the art on passive system reliability assessment methodologies, the accomplishments, and the remaining issues. In this review, three critical issues pertaining to passive system performance and reliability have been identified. The first issue is the applicability of best estimate codes and model uncertainty. Best-estimate-code-based phenomenological simulations of natural convection passive systems can carry a significant amount of uncertainty, and these uncertainties must be incorporated in an appropriate manner in the performance and reliability analysis of such systems. The second issue is the treatment of the dynamic failure characteristics of components of passive systems. The REPAS, RMPS, and APSRA methodologies do not consider dynamic failures of components or processes, which may have a strong influence on the failure of passive systems. The influence of the dynamic failure characteristics of components on system failure probability is presented with the help of a dynamic reliability methodology based on Monte Carlo simulation. The analysis of a benchmark hold-up tank problem shows the error in failure probability estimation when the dynamism of components is not considered.
It is thus suggested that dynamic reliability methodologies must be

  3. Risk assessment and reliability for low level radioactive waste disposal

    International Nuclear Information System (INIS)

    Gregory, P.O.; Jones, G.A.

    1986-01-01

    The reliability of critical design features at low-level radioactive waste disposal facilities is a major concern in the licensing of these structures. To date, no systematic methodology has been adopted to evaluate the geotechnical reliability of the Uranium Mill Tailings Remedial Action (UMTRA) disposal facilities currently being designed and/or constructed. This paper discusses and critiques the deterministic methods currently used to evaluate UMTRA reliability. Because deterministic methods may not be applicable in some cases, owing to the unusually long design life of UMTRA facilities, it is proposed that a probabilistic risk assessment-based methodology be used as a secondary method to aid in the evaluation of the geotechnical reliability of critical items. Similar methodologies have proven successful in evaluating the reliability of a variety of conventional earth structures. In this paper, an ''acceptable'' level of risk for UMTRA facilities is developed, an evaluation method is presented, and two example applications of the proposed methodology are provided for a generic UMTRA disposal facility. The proposed technique is shown to be a simple method which might be used to aid in reliability evaluations on a selective basis. Finally, other possible applications and the limitations of the proposed methodology are discussed.

  4. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Ha, J. H.; Kim, M. K.; Chung, B. S.; Oh, H. C.; Seo, M. R.

    2007-01-01

    Analog I and C systems have been replaced by digital I and C systems because the digital systems offer many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drift, have higher data handling and storage capabilities, and provide improved performance through greater accuracy and computational capability. In addition, analog replacement parts are becoming more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into nuclear power plants, because digital systems are more complex than analog systems and their operation and failure modes are different. In particular, software, which can be the core of functionality in digital systems, does not wear out physically like hardware, and its failure modes are not yet clearly defined. Thus, research to develop methodologies for software reliability assessment is still proceeding in safety-critical areas such as nuclear systems, aerospace, and medical devices. Among these, a software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified, and requirements for its application to digital I and C systems are considered in this study.

  5. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.
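
    For a linear limit state g = R - S with independent normal variables, the FORM reliability index referred to in this abstract has a closed form, which the following sketch illustrates. The load and resistance statistics are invented for the example, and the fractile conventions for characteristic values (5% for resistance, 98% for load) are common but not universal code choices.

```python
from math import erfc, sqrt

def norm_cdf(x):
    # standard normal CDF via the complementary error function
    return 0.5 * erfc(-x / sqrt(2.0))

# illustrative statistics for resistance R and load effect S (independent normals)
mu_R, sig_R = 10.0, 1.0
mu_S, sig_S = 5.0, 1.0

# reliability index and failure probability for the limit state g = R - S
beta = (mu_R - mu_S) / sqrt(sig_R**2 + sig_S**2)
pf = norm_cdf(-beta)

# characteristic values: 5% fractile of R, 98% fractile of S
R_k = mu_R - 1.645 * sig_R
S_k = mu_S + 2.054 * sig_S
margin = R_k / S_k   # the margin that calibrated partial factors must cover
```

Code calibration then amounts to choosing partial factors so that designs satisfying the characteristic-value check achieve a target beta (for example 3.8 for ultimate limit states in some recommendations).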

  6. Methodological basis for formation of uninterruptible education content for future specialists of atomic-nuclear complex

    International Nuclear Information System (INIS)

    Burtebayev, N.; Burtebayeva, J.T.; Basharuly, R.; Altynsarin, Y.

    2009-01-01

    Full text: For a scientifically reliable determination of the content of an uninterruptible education system, the following levels of the theoretical-methodological approach are, as a rule, used in combination: 1) a science-wide methodological level based on the dialectical laws of the theory of knowledge; 2) a science-wide methodological level based on the principles and provisions of systems analysis; 3) a particular-science methodological level based on the laws and principles of a specific science [1]. Such a holistic approach covering all levels of science methodology is required to determine the content of uninterruptible education for future specialists of the nuclear profile. Indeed, considering the problem of education content from the viewpoint of the first, science-wide methodological level, we follow primarily the requirements of the dialectical 'law of the unity of the common, the special and the single': first, the universal values in science, culture and technology forming the united invariant of education content of the world education space are positioned as the 'common' component of uninterruptible education content; second, the theoretical-practical achievements gained in the countries of a region (for example, the Eurasian space) are positioned as the 'special' component of the content for training specialists of the nuclear profile; third, the content elements determined in accordance with the socio-economic order of the specific society, reflecting the national interests of the specific country (for example, the Republic of Kazakhstan), are positioned as the 'single' component of the education content for future specialists of the atomic-nuclear complex. The inseparable unity of the above-mentioned components of the education content, determined in accordance with the laws, principles and provisions of all three levels of the science-methodological approach, assures the high-level competence and functional mobility of the nuclear profile specialist

  7. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters presents a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, a performance degradation model based on the mechanism of contamination failure and the material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with that in normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model to the experimental data. Finally, we obtained the reliability estimate for the SPTRs using the Weibull distribution. The proposed methodology makes it possible to estimate the reliability of SPTRs designed for more than 10 years of operation in less than one year of testing.
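
    The acceleration step can be sketched as a least-squares Arrhenius fit of pseudo-lifetimes obtained at the three stress temperatures, followed by extrapolation to a use temperature. The lifetime numbers below are invented for illustration, not the paper's measured data.

```python
import numpy as np

K_B = 8.617e-5   # Boltzmann constant, eV/K

# hypothetical pseudo-lifetimes (hours) estimated from a degradation model
# at each elevated ambient temperature -- illustrative values only
T_C  = np.array([40.0, 50.0, 60.0])
life = np.array([9000.0, 5200.0, 3100.0])

# Arrhenius model: life = A * exp(Ea / (kB * T))  =>  ln(life) is linear in 1/T
inv_T = 1.0 / (T_C + 273.15)
slope, intercept = np.polyfit(inv_T, np.log(life), 1)
Ea = slope * K_B                      # apparent activation energy, eV

def life_at(T_use_C):
    """Extrapolated lifetime (hours) at a use temperature."""
    return float(np.exp(intercept + slope / (T_use_C + 273.15)))

life_use = life_at(20.0)              # extrapolation to a 20 C use condition
```

The acceleration factor between each stress level and the use condition is what lets roughly a year of elevated-temperature testing stand in for a multi-year mission, exactly the trade the abstract describes.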

  8. Evaluation of methodologies for remunerating wind power's reliability in Colombia

    International Nuclear Information System (INIS)

    Botero B, Sergio; Isaza C, Felipe; Valencia, Adriana

    2010-01-01

    Colombia strives to have enough firm capacity available to meet unexpected power shortages and peak demand; this is clear from mechanisms currently in place that provide monetary incentives (on the order of US$ 14/MW h) to power producers that can guarantee electricity provision during scarcity periods. Yet wind power in Colombia cannot currently guarantee firm power, because an accepted methodology to calculate its potential firm capacity does not exist. In this paper we argue that developing such a methodology would provide an incentive for potential investors to enter into this low-carbon technology. This paper analyzes three methodologies currently used in energy markets around the world to calculate firm wind energy capacity: PJM, NYISO, and Spain. These methodologies were initially selected for their ability to accommodate Colombian energy regulations. The objective of this work is to determine which of these methodologies makes most sense from an investor's perspective, to ultimately shed light on developing a methodology to be used in Colombia. To this end, the authors developed an approach consisting of a wind model built with Monte Carlo simulation, based on known wind-behaviour statistics of a region of Colombia with adequate wind potential. The simulation produces random generation data, representing the resource's inherent variability and standing in for the historical data required to evaluate the mentioned methodologies, thus providing the technology's theoretical generation data. The document concludes that the evaluated methodologies are easy to implement and do not require historical data (important for Colombia, where there is almost no historical wind power data). It is also found that the Spanish methodology provides a higher capacity value (and therefore a higher return to investors).
The financial assessment results show that it is crucial that these types of incentives exist to make viable
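
    A minimal version of such a Monte Carlo wind model can be sketched as follows. The Weibull wind-speed parameters and the simplified power curve are assumptions for illustration, not the statistics of any Colombian site, and the 95%-exceedance "firm" metric is only one of many possible capacity-credit definitions.

```python
import numpy as np

rng = np.random.default_rng(42)

def power_curve(v, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Simplified per-unit turbine power curve (cubic ramp between cut-in and rated)."""
    p = np.zeros_like(v)
    ramp = (v >= v_in) & (v < v_rated)
    p[ramp] = ((v[ramp] - v_in) / (v_rated - v_in)) ** 3
    p[(v >= v_rated) & (v < v_out)] = 1.0   # rated output until cut-out
    return p

# synthetic hourly wind speeds: Weibull shape 2.0, scale 9 m/s (illustrative site)
hours = 10 * 8760                          # ten synthetic years
v = 9.0 * rng.weibull(2.0, hours)
gen = power_curve(v)                       # per-unit generation series

capacity_factor = gen.mean()
firm_95 = np.quantile(gen, 0.05)           # per-unit output exceeded in 95% of hours
```

With these parameters the strict 95%-exceedance output is zero, since roughly a tenth of all hours fall below cut-in. That is precisely why capacity-value methodologies such as those evaluated in the paper credit wind on average output over critical periods rather than on a hard exceedance level.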

  9. An analytical framework for reliability growth of one-shot systems

    International Nuclear Information System (INIS)

    Hall, J. Brian; Mosleh, Ali

    2008-01-01

    In this paper, we introduce a new reliability growth methodology for one-shot systems that is applicable to the case where all corrective actions are implemented at the end of the current test phase. The methodology consists of four model equations for assessing: expected reliability, the expected number of failure modes observed in testing, the expected probability of discovering new failure modes, and the expected portion of system unreliability associated with repeat failure modes. These model equations provide an analytical framework with which reliability practitioners can estimate reliability improvement, address goodness-of-fit concerns, quantify programmatic risk, and assess the reliability maturity of one-shot systems. A numerical example is given to illustrate the value and utility of the presented approach. This methodology is useful to program managers and reliability practitioners interested in applying these techniques in their reliability growth programs.

  10. Functional principles of registry-based service discovery

    NARCIS (Netherlands)

    Sundramoorthy, V.; Tan, C.; Hartel, P.H.; Hartog, den J.I.; Scholten, J.

    2005-01-01

    As Service Discovery Protocols (SDP) are becoming increasingly important for ubiquitous computing, they must behave according to predefined principles. We present the functional Principles of Service Discovery for robust, registry-based service discovery. A methodology to guarantee adherence to

  11. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) to determine environmental levels of 239Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically induced precipitation phase, followed by anion exchange chromatography, and employs a chemical tracer, 236Pu. We have been able to establish an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies, samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)
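
    The role of the 236Pu chemical tracer is the yield (recovery) correction it enables, which is simple arithmetic. All numbers below are invented for the example, not measured values from the study.

```python
# recovery correction enabled by the 236Pu tracer (illustrative numbers)
tracer_added     = 5.00   # mBq of 236Pu spiked into the sample
tracer_recovered = 3.20   # mBq of 236Pu measured after the chemistry

recovery = tracer_recovered / tracer_added          # chemical yield = 0.64

raw_239   = 1.10                                    # uncorrected 239Pu result, uBq
corrected = raw_239 / recovery                      # yield-corrected activity, uBq

volume_l = 2.0                                      # the ~2 L sample size recommended
concentration = corrected / volume_l                # uBq/l, to be compared with the
                                                    # 2.8 uBq/l detection limit quoted
```

Because the tracer passes through the same precipitation and anion-exchange steps as the analyte, losses cancel in the ratio, which is what makes the corrected result independent of run-to-run yield variation.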

  12. Validity and reliability of using photography for measuring knee range of motion: a methodological study

    Directory of Open Access Journals (Sweden)

    Adie Sam

    2011-04-01

    Background: The clinimetric properties of knee goniometry are essential to appreciate in light of its extensive use in the orthopaedic and rehabilitative communities. Intra-observer reliability is thought to be satisfactory, but the validity and inter-rater reliability of knee goniometry often demonstrate unacceptable levels of variation. This study tests the validity and reliability of measuring knee range of motion using goniometry and photographic records. Methods: Design: methodology study assessing the validity and reliability of one method ('Marker Method'), which uses a skin marker over the greater trochanter, and another method ('Line of Femur Method'), which requires estimation of the line of the femur. Setting: radiology and orthopaedic departments of two teaching hospitals. Participants: 31 volunteers (13 arthritic and 18 healthy subjects). Knee range of motion was measured radiographically and photographically using a goniometer. Three assessors were assessed for reliability and validity. Main outcomes: agreement between methods and within raters was assessed using concordance correlation coefficients (CCCs). Agreement between raters was assessed using intra-class correlation coefficients (ICCs). 95% limits of agreement for the mean difference for all paired comparisons were computed. Results: Validity (referenced to radiographs): each method for all 3 raters yielded very high CCCs for flexion (0.975 to 0.988), and moderate to substantial CCCs for extension angles (0.478 to 0.678). The mean differences and 95% limits of agreement were narrower for flexion than for extension. Intra-rater reliability: for flexion and extension, very high CCCs were attained for all 3 raters for both methods, with slightly greater CCCs seen for flexion (CCCs varied from 0.981 to 0.998). Inter-rater reliability: for both methods, very high ICCs (min to max: 0.891 to 0.995) were obtained for flexion and extension. Slightly higher coefficients were obtained

  13. Future of structural reliability methodology in nuclear power plant technology

    Energy Technology Data Exchange (ETDEWEB)

    Schueeller, G I [Technische Univ. Muenchen (Germany, F.R.); Kafka, P [Gesellschaft fuer Reaktorsicherheit m.b.H. (GRS), Garching (Germany, F.R.)

    1978-10-01

    This paper presents the authors' personal view as to which areas of structural reliability in nuclear power plant design most urgently need to be advanced. Aspects of simulation modeling, design rules, codification and specification of reliability, systems analysis, probabilistic structural dynamics, rare events and, particularly, the interaction of systems and structural reliability are discussed. As an example, some considerations of the interaction effects between the protective systems and the pressure vessel are stated. The paper concludes with recommendations for further research.

  14. Development of RBDGG Solver and Its Application to System Reliability Analysis

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    For the purpose of making system reliability analysis easier and more intuitive, the RBDGG (Reliability Block Diagram with General Gates) methodology was introduced as an extension of the conventional reliability block diagram. The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system, and therefore the modeling of a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the RGGG (Reliability Graph with General Gates) methodology, which is an extension of the conventional reliability graph. The newly proposed methodology is now implemented in a software tool, RBDGG Solver. RBDGG Solver was developed as a WIN32 console application. It receives information on the failure modes and failure probabilities of each component in the system, along with the connection structure and connection logics among the components. Based on the received information, RBDGG Solver automatically generates a system reliability analysis model for the system and then provides the analysis results. In this paper, the application of RBDGG Solver to the reliability analysis of an example system, and verification of the calculation results, are provided for the purpose of demonstrating how RBDGG Solver is used for system reliability analysis.
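
    For simple structures, the block-diagram evaluation that such a solver automates reduces to series/parallel probability algebra, which a few lines can sketch. Independent blocks are assumed, and the component reliabilities and the pump-train example are illustrative, not taken from the paper.

```python
from functools import reduce

def series(*rel):
    # series blocks: the system works only if every block works
    return reduce(lambda a, b: a * b, rel, 1.0)

def parallel(*rel):
    # parallel (redundant) blocks: the system fails only if all blocks fail
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), rel, 1.0)

# illustrative train: inlet valve -> two redundant pumps -> outlet valve
r_valve, r_pump = 0.99, 0.95
r_system = series(r_valve, parallel(r_pump, r_pump), r_valve)
```

General gates such as k-out-of-n voting go beyond this simple algebra, which is where an automated solver that generates the analysis model from the connection structure earns its keep.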

  15. THE BASIC PRINCIPLES OF RESEARCH IN NEUROEDUCATION STUDIES

    Directory of Open Access Journals (Sweden)

    Ali Nouri

    2016-06-01

    The present paper assembles contributions from the areas of education, psychology, cognitive science, and, of course, neuroeducation itself to introduce the basic principles of research in the field of neuroeducation studies. This is particularly important as a way of showing researchers what neuroeducation, as a specific domain, does that no other field can do as well or can do at all. Based on the literature reviewed, neuroeducational research can be understood as an interdisciplinary endeavor to develop an insightful understanding and holistic picture of problems related to learning and education. Epistemologically, it is thus based on an integrated methodological-pluralism paradigm. This requires researchers to understand multiple methods and methodologies and to employ them as they formulate their own research projects. Researchers have a critical role to play in providing systematic evidence and conclusions that are scientifically valid and reliable, and educationally relevant and usable. One significant implication of this argument is the need to strengthen the quality of the research component in graduate programs in the field and to train interested researchers in the identification and formulation of relevant research questions.

  16. Analysis of NPP protection structure reliability under impact of a falling aircraft

    International Nuclear Information System (INIS)

    Shul'man, G.S.

    1996-01-01

    A methodology for evaluating the reliability of NPP protection structures under the impact of a falling aircraft is considered. The methodology is based on a probabilistic analysis of all potential events. The problem is solved in three stages: determination of the loads on structural units, calculation of the local reliability of the protection structures under the assigned loads, and estimation of the structure's reliability. The proposed methodology may be applied at the NPP design stage and for determining the reliability of existing structures.

  17. Use of curium neutron flux from head-end pyroprocessing subsystems for the High Reliability Safeguards methodology

    Energy Technology Data Exchange (ETDEWEB)

    Borrelli, R.A., E-mail: r.angelo.borrelli@gmail.com

    2014-10-01

    The deployment of nuclear energy systems (NESs) is expanding around the world. Nations are investing in NESs as a means to establish energy independence, grow national economies, and address climate change. Transitioning to the advanced nuclear fuel cycle can meet growing energy demands and ensure resource sustainability. However, nuclear facilities in all phases of the advanced fuel cycle must be ‘safeguardable,’ where safety, safeguards, and security are integrated into a practical design strategy. To this end, the High Reliability Safeguards (HRS) approach is a continually developing safeguardability methodology that applies intrinsic design features and employs a risk-informed approach for systems assessment that is safeguards-motivated. Currently, a commercial pyroprocessing facility is used as the example system. This paper presents a modeling study that investigates the neutron flux associated with processed materials. The intent of these studies is to determine if the neutron flux will affect facility design, and subsequently, safeguardability. The results presented in this paper are for the head-end subsystems in a pyroprocessing facility. The collective results from these studies will then be used to further develop the HRS methodology.

  18. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Report front matter; recoverable figure captions: inverters connected in a chain; typical graph showing frequency versus square root of ...] The report describes developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and ... or FIT of the device. In other words, an accurate estimate of the device lifetime was found, and thus the reliability, that can be conveniently

  19. Estimation of Bridge Reliability Distributions

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    In this paper it is shown how the so-called reliability distributions can be estimated using crude Monte Carlo simulation. The main purpose is to demonstrate the methodology. Therefore, very exact data concerning reliability and deterioration are not needed. However, it is intended in the paper to ...
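
    Crude Monte Carlo estimation of a deteriorating structure's failure probability can be sketched as below. The load, resistance, and deterioration numbers are invented for the demonstration, in the same spirit as the paper's remark that very exact data are not needed, and the linear strength-loss model is an assumption for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000                        # number of Monte Carlo samples

R0 = rng.normal(12.0, 1.5, n)      # initial resistance (illustrative units)
S  = rng.normal(7.0, 2.0, n)       # annual extreme load effect

def pf_at(year):
    """Crude MC failure probability, assuming linear strength loss of 0.4%/yr."""
    R_t = R0 * (1.0 - 0.004 * year)
    return float(np.mean(R_t < S))

# a "reliability distribution" over the service life: pf rises as the bridge ages
profile = [pf_at(t) for t in (0, 25, 50)]
```

At year 0 this setup has reliability index beta = 2, so the estimate should sit near the exact value Phi(-2) of about 0.023, with Monte Carlo noise shrinking as 1/sqrt(n).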

  20. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  2. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies, such as where each may be used to best effect. Due to their popularity, power-flow-based MW-mile and short-run marginal cost pricing methodologies are covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies to pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
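
    The MW-mile idea can be sketched in a few lines: each line's annual cost is shared among transactions in proportion to the MW flow each one imposes on it. In practice the flows come from a DC power-flow study; here both the line costs and the flows are invented for illustration.

```python
# annual embedded cost of each transmission line (illustrative)
line_cost = {"L1": 120_000.0, "L2": 80_000.0}

# MW each transaction imposes on each line (would come from a power-flow model)
flows = {
    "T1": {"L1": 30.0, "L2": 10.0},
    "T2": {"L1": 10.0, "L2": 30.0},
}

# allocate each line's cost pro rata to the transactions using it
charges = {t: 0.0 for t in flows}
for line, cost in line_cost.items():
    total_mw = sum(use[line] for use in flows.values())
    for t, use in flows.items():
        charges[t] += cost * use[line] / total_mw
```

By construction the charges sum to the total embedded cost (here T1 pays 110,000 and T2 pays 90,000), which is the revenue-reconciliation property that makes MW-mile pricing attractive; marginal-cost pricing, by contrast, generally does not recover embedded costs without an adder.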

  3. Reliability-based condition assessment of steel containment and liners

    International Nuclear Information System (INIS)

    Ellingwood, B.; Bhattacharya, B.; Zheng, R.

    1996-11-01

    Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs
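
    The Markov model mentioned at the end of this abstract can be sketched as a discrete-time chain over damage states. The annual transition probabilities below are invented for illustration, not calibrated corrosion data.

```python
import numpy as np

# damage states: 0 intact, 1 minor corrosion, 2 severe corrosion, 3 failed (absorbing)
# annual transition matrix; each row sums to 1, rates are illustrative assumptions
P = np.array([
    [0.95, 0.05, 0.00, 0.00],
    [0.00, 0.90, 0.09, 0.01],
    [0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0, 0.0])   # a new component starts intact
for year in range(40):                    # propagate 40 years of service
    state = state @ P

p_failed_40 = state[3]                    # probability of having failed by year 40
```

Inspection results from nondestructive evaluation can be folded in by resetting the state vector to the posterior damage-state probabilities at each inspection, which is how such a chain supports condition assessment rather than only prior prediction.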

  4. Methodological principles for optimising functional MRI experiments

    International Nuclear Information System (INIS)

    Wuestenberg, T.; Giesel, F.L.; Strasburger, H.

    2005-01-01

Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, optimising certain experimental parameters makes it possible to obtain reliable results. In this article, approaches for optimising the experimental design, imaging parameters and analysis strategies will be discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.) [de]

  5. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems which were met during the project. Drawing on these problems and on the results of the methodology evaluation, the needed future development of the methodology is outlined.

  6. Integrating evidence-based principles into the undergraduate ...

    African Journals Online (AJOL)

    Background. The research methodology module was reviewed as part of the overall revision of the undergraduate physiotherapy curriculum of Stellenbosch University. This created an ideal platform from which to assess how to align the principles of evidence-based practice (EBP) with research methodology. Fostering the ...

  7. Fundamental Principles of Alarm Design

    DEFF Research Database (Denmark)

    Us, Tolga; Jensen, Niels; Lind, Morten

    2011-01-01

    Traditionally alarms are designed on the basis of empirical guidelines rather than on a sound scientific framework rooted in a theoretical foundation for process and control system design. This paper proposes scientific principles and a methodology for design of alarms based on a functional...... be applied to any engineering system which can be modeled by MFM. The methodology provides a set of alarms which can facilitate event interpretation and operator support for abnormal situation management. The proposed design methodology provides the information content of the alarms, but does not deal...

  8. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further, we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
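The observation that fault recovery order changes the data a reliability model sees can be illustrated with a minimal competing-risks sketch (this is our illustration, not the paper's debugging-graph methodology); the per-fault failure rates are invented.

```python
# Minimal sketch: expected interfailure times when the same five faults, with
# assumed distinct failure rates, are removed in two different orders.

def expected_interfailure_times(rates, removal_order):
    """Expected time between failures when faults are removed in the given order.

    Remaining faults fail as competing exponentials, so the expected time
    to the next failure is 1 / (sum of remaining rates).
    """
    remaining = list(rates)
    times = []
    for idx in removal_order:
        times.append(1.0 / sum(remaining))
        remaining[idx] = 0.0        # fault found and fixed
    return times

rates = [5.0, 2.0, 1.0, 0.5, 0.2]          # invented per-fault failure rates
big_first = expected_interfailure_times(rates, [0, 1, 2, 3, 4])
big_last  = expected_interfailure_times(rates, [4, 3, 2, 1, 0])

print("big faults first:", [round(t, 3) for t in big_first])
print("big faults last: ", [round(t, 3) for t in big_last])
```

The same program and the same faults produce very different apparent reliability-growth curves under the two orders, which is exactly the variability the study attributes to debugging paths.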

  9. Empirical evaluation and justification of methodologies in psychological science.

    Science.gov (United States)

    Proctor, R W; Capaldi, E J

    2001-11-01

    The purpose of this article is to describe a relatively new movement in the history and philosophy of science, naturalism, a form of pragmatism emphasizing that methodological principles are empirical statements. Thus, methodological principles must be evaluated and justified on the same basis as other empirical statements. On this view, methodological statements may be less secure than the specific scientific theories to which they give rise. The authors examined the feasibility of a naturalistic approach to methodology using logical and historical analysis and by contrasting theories that predict new facts versus theories that explain already known facts. They provide examples of how differences over methodological issues in psychology and in science generally may be resolved using a naturalistic, or empirical, approach.

  10. METHODOLOGICAL PROBLEMS OF E-LEARNING DIDACTICS

    Directory of Open Access Journals (Sweden)

    Sergey F. Sergeev

    2015-01-01

    Full Text Available The article is devoted to the discussion of the methodological problems of e-learning and to didactic issues in the use of advanced networking and Internet technologies to create training systems and simulators based on the methodological principles of non-classical and post-non-classical psychology and pedagogy.

  11. Sequential decision reliability concept and failure rate assessment

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1990-11-01

    Conventionally, a reliability concept is considered for each basic unit as well as for their integration in a complicated large-scale system such as a nuclear power plant (NPP). Basically, as the plant's operational status is determined by the information obtained from various sensors, the plant's reliability and risk assessment are closely related to the reliability of the sensory information and hence of the sensor components. However, considering the relevant information-processing systems, e.g. fault detection processors, there exists a further question about the reliability of such systems, specifically the reliability of the systems' decision-based outcomes by means of which further actions are performed. To this end, a general sequential decision reliability concept and a failure rate assessment methodology are introduced. The implications of the methodology are investigated and the importance of the decision reliability concept in system operation is demonstrated by means of real-time sensory signals from the Borssele NPP in the Netherlands. (author). 21 refs.; 8 figs

  12. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment across the two adjacent unit and system levels is the most important part of the multi-level reliability synthesis of complex systems. Introducing information theory into system reliability assessment, and using the additive property of information quantity together with the principle of information-quantity equivalence, an entropy method of data information conversion is presented for systems consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information-quantity equivalence. General models for the entropy-method synthesis assessment of approximate lower limits on system reliability are established according to the fundamental principles of unit reliability assessment. The applications of the entropy method are discussed by way of practical examples. Compared with the traditional methods, the entropy method is found to be valid and practicable and the assessment results are very satisfactory.
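The paper's entropy conversion formulae are not reproduced here. As a point of comparison, one of the "traditional methods" for identical exponential units — a classical chi-square lower confidence limit on system reliability from pooled unit test data — might be sketched as follows, with the chi-square quantile approximated by the Wilson-Hilferty formula and all data values invented.

```python
# Classical (non-entropy) sketch: lower confidence limit on R(t) from pooled
# exponential unit test data. All numbers are invented for illustration.

from math import exp, sqrt
from statistics import NormalDist

def chi2_quantile(p, k):
    """Wilson-Hilferty approximation to the chi-square p-quantile with k dof."""
    z = NormalDist().inv_cdf(p)
    return k * (1 - 2 / (9 * k) + z * sqrt(2 / (9 * k))) ** 3

def reliability_lower_limit(failures, total_time, mission_time, conf=0.90):
    """Time-terminated test: lambda_upper = chi2(conf, 2r+2) / (2T),
    then R_lower(t) = exp(-lambda_upper * t)."""
    lam_upper = chi2_quantile(conf, 2 * failures + 2) / (2 * total_time)
    return exp(-lam_upper * mission_time)

# Pooled data from identical exponential units: 2 failures in 10 000 unit-hours
r_lower = reliability_lower_limit(failures=2, total_time=10_000.0,
                                  mission_time=100.0)
print(f"90% lower limit on R(100 h) = {r_lower:.4f}")
```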

  13. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and highlight the lessons learned from applying the two techniques, while providing high-level results. This includes the global benefits and deficiencies of the methods and practical problems encountered during the implementation of each technique.
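A minimal sketch of the boundary-condition-driven failure idea described above: sample the uncertain boundary conditions, evaluate a physical capacity model, and count functional failures. The toy natural-circulation model and every parameter value below are invented for illustration; they are not Argonne's models or data.

```python
# Illustrative Monte Carlo estimate of passive-system functional failure:
# failure occurs when sampled boundary conditions push the natural-circulation
# heat removal capacity below the decay heat load. All values are invented.

import random

random.seed(42)

def heat_removal_capacity(delta_t, loss_coeff):
    """Toy capacity model: buoyancy-driven flow grows with the hot-cold
    temperature difference and shrinks with hydraulic losses."""
    return 12.0 * (delta_t ** 0.5) / loss_coeff   # MW, invented scaling

N = 100_000
failures = 0
for _ in range(N):
    delta_t = random.gauss(40.0, 6.0)        # K, uncertain boundary condition
    loss_coeff = random.gauss(1.0, 0.15)     # -, uncertain flow resistance
    demand = random.gauss(60.0, 5.0)         # MW, decay heat load
    if delta_t <= 0 or loss_coeff <= 0:
        failures += 1                        # unphysical draw: no circulation
        continue
    if heat_removal_capacity(delta_t, loss_coeff) < demand:
        failures += 1

print(f"functional failure probability ~ {failures / N:.4f}")
```

Note that no component "breaks" anywhere in this model: every failure is functional, arising purely from unfavorable boundary-condition draws, which is the characteristic the abstract emphasizes.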

  14. Principle and methodology of nuclear power plant site selection. Application to radiocobalt cycle in the Rhone river

    International Nuclear Information System (INIS)

    Georges, J.

    1987-01-01

    In a first bibliographic part, after some generalities on radioactivity and nuclear power, general principles of radiation protection and national and international regulations are presented. The methodology of the radioecological study involved in site selection is developed. In a second more experimental part, the processing of radiocobalt gamma radioactivity measurement in water, fishes, plants and Rhone river sediments demonstrates the influence of age and geographical situation of the nuclear power stations located along the river. A laboratory experiment of cobalt 60 transfer from chironomes larvae to carp is carried out. Comparison with the results of other laboratory experiments makes it possible to propose an experimental model of cobalt transfer within a fresh water ecosystem; radioactivity levels calculated for various compartments seem to be consistent with the Rhone river levels [fr

  15. Design methodologies for reliability of SSL LED boards

    NARCIS (Netherlands)

    Jakovenko, J.; Formánek, J.; Perpiñà, X.; Jorda, X.; Vellvehi, M.; Werkhoven, R.J.; Husák, M.; Kunen, J.M.G.; Bancken, P.; Bolt, P.J.; Gasse, A.

    2013-01-01

    This work presents a comparison of various LED board technologies from the thermal, mechanical and reliability points of view, provided by accurate 3-D modelling. LED boards are proposed as a possible technology replacement for the FR4 LED boards used in 400 lumen retrofit SSL lamps. Presented design

  16. System reliability of corroding pipelines

    International Nuclear Information System (INIS)

    Zhou Wenxing

    2010-01-01

    A methodology is presented in this paper to evaluate the time-dependent system reliability of a pipeline segment that contains multiple active corrosion defects and is subjected to stochastic internal pressure loading. The pipeline segment is modeled as a series system with three distinctive failure modes due to corrosion, namely small leak, large leak and rupture. The internal pressure is characterized as a simple discrete stochastic process that consists of a sequence of independent and identically distributed random variables each acting over a period of one year. The magnitude of a given sequence follows the annual maximum pressure distribution. The methodology is illustrated through a hypothetical example. Furthermore, the impact of the spatial variability of the pressure loading and pipe resistances associated with different defects on the system reliability is investigated. The analysis results suggest that the spatial variability of pipe properties has a negligible impact on the system reliability. On the other hand, the spatial variability of the internal pressure, initial defect sizes and defect growth rates can have a significant impact on the system reliability.
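The series-system formulation above can be sketched with a hedged Monte Carlo toy model: a segment fails in a given year if any defect grows deep enough to leak, or if the sampled annual maximum pressure exceeds a degraded burst pressure. The growth rates, pressure distribution and linear burst model below are invented and are not the paper's limit-state functions.

```python
# Toy series-system simulation of a pipeline segment with multiple growing
# corrosion defects under stochastic annual max pressure (invented values).

import random

random.seed(7)

WALL = 10.0            # mm, wall thickness
P_BURST_INTACT = 15.0  # MPa, intact burst pressure (invented)
YEARS = 25
N = 20_000

def simulate_segment(n_defects=3):
    """Return the failure year of the segment, or None if it survives."""
    depths = [random.uniform(1.0, 3.0) for _ in range(n_defects)]    # mm
    growth = [random.uniform(0.05, 0.30) for _ in range(n_defects)]  # mm/yr
    for year in range(1, YEARS + 1):
        p_max = random.gauss(8.0, 1.0)         # MPa, annual max pressure
        for i in range(n_defects):
            d = depths[i] + growth[i] * year
            if d >= 0.8 * WALL:
                return year                    # leak: depth limit reached
            if P_BURST_INTACT * (1 - d / WALL) < p_max:
                return year                    # burst under annual max pressure
    return None

fails = sum(simulate_segment() is not None for _ in range(N))
print(f"P(failure within {YEARS} yr) ~ {fails / N:.3f}")
```

Making the initial depths or growth rates common across defects (fully correlated) versus independent, as here, is exactly the spatial-variability question the paper investigates.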

  17. Probabilistic risk assessment course documentation. Volume 3. System reliability and analysis techniques, Session A - reliability

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs
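The fault-tree-based system quantification the course covers can be sketched with minimal cut sets and the rare-event approximation; the basic events and their unavailabilities below are invented for illustration.

```python
# Rare-event approximation for fault-tree quantification: system
# unavailability is roughly the sum over minimal cut sets of the product
# of basic-event unavailabilities. Event values are invented.

def system_unavailability(cut_sets, q):
    """Q_sys ~ sum over cut sets of prod(q_i for i in cut set)."""
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for event in cs:
            prod *= q[event]
        total += prod
    return total

# Basic-event unavailabilities (per demand), invented values
q = {"pump_A": 3e-3, "pump_B": 3e-3, "valve": 1e-4, "power": 5e-5}

# Minimal cut sets: both redundant pumps fail, or the common valve fails,
# or support power fails
cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"power"}]

print(f"Q_sys ~ {system_unavailability(cut_sets, q):.2e}")
```

Here the single-event cut sets dominate, which is the usual lesson: redundancy helps little when common single failures remain.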

  18. The Maximum Entropy Principle and the Modern Portfolio Theory

    Directory of Open Access Journals (Sweden)

    Ailton Cassetari

    2003-12-01

    Full Text Available In this work, a capital allocation methodology based on the Principle of Maximum Entropy was developed, with Shannon's entropy used as the measure; its connections to the Modern Portfolio Theory are also discussed. In particular, the methodology is tested through a systematic comparison with: (1) the mean-variance (Markowitz) approach and (2) the mean-VaR approach (capital allocation based on the Value at Risk concept). In principle, such comparisons show the plausibility and effectiveness of the developed method.
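A minimal maximum-entropy allocation sketch (our illustration under stated assumptions, not the paper's procedure): maximizing the Shannon entropy of the weights subject to a budget constraint and a target expected return gives Gibbs-form weights w_i ∝ exp(λ·r_i), with λ found by bisection. The asset returns are invented.

```python
# Maximum-entropy portfolio weights with a target expected return.
# Solution has the Gibbs form w_i ∝ exp(lam * r_i); lam set by bisection.

from math import exp

def gibbs_weights(returns, lam):
    ws = [exp(lam * r) for r in returns]
    s = sum(ws)
    return [w / s for w in ws]

def max_entropy_weights(returns, target, lo=-200.0, hi=200.0, iters=200):
    """Bisection on lam so the weighted mean return hits the target."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        mean = sum(w * r for w, r in zip(gibbs_weights(returns, mid), returns))
        if mean < target:
            lo = mid
        else:
            hi = mid
    return gibbs_weights(returns, (lo + hi) / 2)

returns = [0.04, 0.07, 0.10, 0.15]       # expected asset returns (invented)
w = max_entropy_weights(returns, target=0.10)
print("weights:", [round(x, 3) for x in w])
print("portfolio return:",
      round(sum(wi * ri for wi, ri in zip(w, returns)), 4))
```

With no return constraint the maximum-entropy allocation is simply uniform; the constraint tilts the weights toward higher-return assets while keeping them as "spread out" as possible.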

  19. Influence Of Inspection Intervals On Mechanical System Reliability

    International Nuclear Information System (INIS)

    Zilberman, B.

    1998-01-01

    In this paper a methodology for the reliability analysis of mechanical systems with latent failures is described. Reliability analysis of such systems must include appropriate use of check intervals for latent failure detection. In the methodology, based on the system logic, the analyst decides at the outset whether a system can fail actively or latently and propagates this approach through all system levels. All inspections are assumed to be perfect (all failures are detected and repaired, and no new failures are introduced as a result of the maintenance). Additional assumptions are that the mission time is much smaller than the check intervals and that all components have constant failure rates. Analytical expressions for the reliability calculations are provided, based on fault tree and Markov modeling techniques (for two- and three-redundant systems with inspection intervals). The proposed methodology yields more accurate results than are obtained by ignoring check intervals or by using half the check interval time. The conventional analysis, which assumes that the system is as good as new at the beginning of each mission, gives an optimistic prediction of system reliability. Some examples of reliability calculations for mechanical systems with latent failures and of establishing optimum check intervals are provided
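The effect of check intervals on latent failures can be seen in a one-component analytic sketch: with constant failure rate lam and perfect inspection (with renewal) every tau hours, the interval-averaged unavailability is 1 - (1 - exp(-lam*tau))/(lam*tau), which the common lam*tau/2 approximation matches only when lam*tau is small. The numerical values below are invented.

```python
# Mean unavailability of a latent failure between perfect periodic inspections.

from math import exp

def mean_unavailability(lam, tau):
    """Average probability the latent failure is present between inspections."""
    return 1.0 - (1.0 - exp(-lam * tau)) / (lam * tau)

lam = 1e-4   # failures per hour (invented)
for tau in (168.0, 720.0, 4380.0):   # weekly, monthly, semi-annual checks
    q = mean_unavailability(lam, tau)
    print(f"tau = {tau:6.0f} h: mean unavailability = {q:.4f} "
          f"(approx lam*tau/2 = {lam * tau / 2:.4f})")
```

For the semi-annual interval the exact value falls visibly below lam*tau/2, illustrating why the paper argues against simple half-interval shortcuts for longer check intervals.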

  20. Constructing Ethical Principles for Synthetic Biology

    DEFF Research Database (Denmark)

    Dige, Morten

    2010-01-01

    The ethical discussion over synbio naturally raises metaquestions or questions of methodology: Which ethical principles and values could or should function as orientation or guidelines in discussing these issues?...

  1. Effect of communication on the reliability of nuclear power plant control room operations - pre study

    International Nuclear Information System (INIS)

    Kettunen, Jari; Pyy, Pekka

    1999-01-01

    The objective of the study presented in this paper is to investigate communication practices and their impact on human reliability and plant safety in a nuclear power plant environment. The study aims at developing a general systems approach towards the issue. The ultimate goal of the study is to contribute to the development of probabilistic safety assessment methodologies in the area of communications and crew co-operation. This paper outlines the results of the pre-study. The study is based on the use and further development of different modelling techniques and the application of generic systems engineering as well as crew resource management (CRM) principles. The results so far include a concise literature review on communication and crew performance, a presentation of some potential theoretical concepts and approaches for studying communication in relation to organisational reliability, causal failure sequences and human failure mechanisms, and an introduction of a General Communications Model (GCM) that is presented as a promising approach for studying the reliability and adequacy of communication transactions. Finally, some observations and recommendations concerning the next phases of the study are made (author) (ml)

  2. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating a sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. Comparison with Monte Carlo simulation demonstrates that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.

  3. Progress in Methodologies for the Assessment of Passive Safety System Reliability in Advanced Reactors. Results from the Coordinated Research Project on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors

    International Nuclear Information System (INIS)

    2014-09-01

    Strong reliance on inherent and passive design features has become a hallmark of many advanced reactor designs, including several evolutionary designs and nearly all advanced small and medium sized reactor (SMR) designs. Advanced nuclear reactor designs incorporate several passive systems in addition to active ones — not only to enhance the operational safety of the reactors but also to eliminate the possibility of serious accidents. Accordingly, the assessment of the reliability of passive safety systems is a crucial issue to be resolved before their extensive use in future nuclear power plants. Several physical parameters affect the performance of a passive safety system, and their values at the time of operation are unknown a priori. The functions of passive systems are based on basic physical laws and thermodynamic principles, and they may not experience the same kind of failures as active systems. Hence, consistent efforts are required to qualify the reliability of passive systems. To support the development of advanced nuclear reactor designs with passive systems, investigations into their reliability using various methodologies are being conducted in several Member States with advanced reactor development programmes. These efforts include reliability methods for passive systems by the French Atomic Energy and Alternative Energies Commission, reliability evaluation of passive safety system by the University of Pisa, Italy, and assessment of passive system reliability by the Bhabha Atomic Research Centre, India. These different approaches seem to demonstrate a consensus on some aspects. However, the developers of the approaches have been unable to agree on the definition of reliability in a passive system. Based on these developments and in order to foster collaboration, the IAEA initiated the Coordinated Research Project (CRP) on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors in 2008. The

  4. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  5. The principles of radiation protection

    International Nuclear Information System (INIS)

    2004-01-01

    The aim of radiation protection is to avoid or reduce the risks linked to ionizing radiation. To reduce these risks, radiation protection relies on three main principles: justification, optimization and limitation of radiation doses. To apply these principles, radiation protection has regulatory and technical means adapted to three different categories of people: the public, patients and workers. The nuclear safety authority elaborates the regulation and monitors the reliable application of the radiation protection system. (N.C.)

  6. A Simulation Model for Machine Efficiency Improvement Using Reliability Centered Maintenance: Case Study of Semiconductor Factory

    Directory of Open Access Journals (Sweden)

    Srisawat Supsomboon

    2014-01-01

    Full Text Available The purpose of this study was to increase product quality by focusing on machine efficiency improvement. The principle of reliability centered maintenance (RCM) was applied to increase machine reliability. The objective was to create a preventive maintenance plan under the reliability centered maintenance method and to reduce defects. The study target was set to reduce the Lead PPM for a test machine by simulating the proposed preventive maintenance plan. A simulation optimization approach based on evolutionary algorithms was employed in the preventive maintenance technique selection process to select the PM interval that gave the best total cost and Lead PPM values. The research methodology includes procedures such as prioritizing the critical components in the test machine, analyzing the damage and risk level by using Failure Mode and Effects Analysis (FMEA), calculating the suitable replacement period through reliability estimation, and optimizing the preventive maintenance plan. The results of the study show that the Lead PPM of the test machine can be reduced. The cost of preventive maintenance, cost of good product, and cost of lost product were all decreased.
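The PM-interval selection step can be sketched with the classical age-replacement cost-rate model (a simple grid search standing in for the study's evolutionary simulation-optimization): replace preventively at age T at cost Cp, or on failure at cost Cf > Cp, and pick the T minimizing expected cost per unit time. The Weibull parameters and costs below are invented.

```python
# Age-replacement PM interval: minimize expected cost rate
# C(T) = [Cp*R(T) + Cf*F(T)] / integral_0^T R(t) dt. Values are invented.

from math import exp

BETA, ETA = 2.5, 1000.0     # Weibull shape/scale (hours), wear-out regime
CP, CF = 1.0, 10.0          # preventive vs failure replacement cost

def reliability(t):
    return exp(-((t / ETA) ** BETA))

def cost_rate(T, steps=2000):
    """Expected cost per hour for age-replacement interval T."""
    dt = T / steps
    # trapezoidal integral of R(t) on [0, T] = expected cycle length
    mean_cycle = dt * (0.5 * (reliability(0) + reliability(T))
                       + sum(reliability(i * dt) for i in range(1, steps)))
    expected_cost = CP * reliability(T) + CF * (1 - reliability(T))
    return expected_cost / mean_cycle

best_T = min((T for T in range(50, 2001, 10)), key=cost_rate)
print(f"best PM interval ~ {best_T} h, cost rate = {cost_rate(best_T):.5f}")
```

A finite optimum exists only because the Weibull shape parameter exceeds 1 (wear-out); for a constant failure rate, preventive replacement would never pay off.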

  7. Notes on human factors problems in process plant reliability and safety prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.; Taylor, J.R.

    1976-09-01

    The basis for plant operator reliability evaluation is described. Principles for plant design, necessary to permit reliability evaluation, are outlined. Five approaches to the plant operator reliability problem are described. Case stories, illustrating operator reliability problems, are given. (author)

  8. Reliability assessment based on subjective inferences

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    Reliability information that comes from subjective analysis is often an incomplete prior. This information can generally be assumed to exist in the form of either a stated prior mean of R (reliability) or a stated prior credibility interval on R. An efficient approach is developed to determine a complete beta prior distribution from the subjective information according to the principle of maximum entropy, and the reliability of a survival/failure product is assessed via Bayes' theorem. Numerical examples are presented to illustrate the methods
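A minimal sketch of the idea under stated assumptions: among Beta(a, b) priors whose mean equals the stated prior mean of R, pick the one of maximum differential entropy (here by grid search, with the digamma function approximated by differencing lgamma), then update with survival/failure counts via Bayes' theorem. The prior mean, test counts and the grid/approximation choices are ours, not the paper's.

```python
# Max-entropy beta prior matched to a stated prior mean, plus Bayes update.

from math import lgamma

def digamma(x, h=1e-5):
    """Central-difference approximation to the digamma function."""
    return (lgamma(x + h) - lgamma(x - h)) / (2 * h)

def beta_entropy(a, b):
    """Differential entropy of a Beta(a, b) distribution."""
    ln_beta = lgamma(a) + lgamma(b) - lgamma(a + b)
    return (ln_beta - (a - 1) * digamma(a) - (b - 1) * digamma(b)
            + (a + b - 2) * digamma(a + b))

def max_entropy_beta(prior_mean):
    """Among Beta priors with the given mean, maximize entropy over a+b."""
    best_s = max((s / 1000.0 for s in range(10, 20_000)),
                 key=lambda s: beta_entropy(prior_mean * s,
                                            (1 - prior_mean) * s))
    return prior_mean * best_s, (1 - prior_mean) * best_s

a, b = max_entropy_beta(prior_mean=0.9)   # stated prior mean of R (invented)
successes, failures = 18, 2               # invented survival/failure test data
post_a, post_b = a + successes, b + failures
print(f"max-entropy prior: Beta({a:.3f}, {b:.3f})")
print(f"posterior mean R = {post_a / (post_a + post_b):.3f}")
```

As a sanity check, a stated prior mean of 0.5 recovers the uniform Beta(1, 1), the maximum-entropy distribution on [0, 1] with that mean.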

  9. The methodology of population surveys of headache prevalence, burden and cost: Principles and recommendations from the Global Campaign against Headache

    Science.gov (United States)

    2014-01-01

    The global burden of headache is very large, but knowledge of it is far from complete and still needs to be gathered. Published population-based studies have used variable methodology, which has influenced findings and made comparisons difficult. Among the initiatives of the Global Campaign against Headache to improve and standardize methods in use for cross-sectional studies, the most important is the production of consensus-based methodological guidelines. This report describes the development of detailed principles and recommendations. For this purpose we brought together an expert consensus group to include experience and competence in headache epidemiology and/or epidemiology in general and drawn from all six WHO world regions. The recommendations presented are for anyone, of whatever background, with interests in designing, performing, understanding or assessing studies that measure or describe the burden of headache in populations. While aimed principally at researchers whose main interests are in the field of headache, they should also be useful, at least in part, to those who are expert in public health or epidemiology and wish to extend their interest into the field of headache disorders. Most of all, these recommendations seek to encourage collaborations between specialists in headache disorders and epidemiologists. The focus is on migraine, tension-type headache and medication-overuse headache, but they are not intended to be exclusive to these. The burdens arising from secondary headaches are, in the majority of cases, more correctly attributed to the underlying disorders. Nevertheless, the principles outlined here are relevant for epidemiological studies on secondary headaches, provided that adequate definitions can be not only given but also applied in questionnaires or other survey instruments. PMID:24467862

  10. Aerospace reliability applied to biomedicine.

    Science.gov (United States)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  11. Principles of Bioremediation Assessment

    Science.gov (United States)

    Madsen, E. L.

    2001-12-01

    Although microorganisms have successfully and spontaneously maintained the biosphere since its inception, industrialized societies now produce undesirable chemical compounds at rates that outpace naturally occurring microbial detoxification processes. This presentation provides an overview of both the complexities of contaminated sites and methodological limitations in environmental microbiology that impede the documentation of biodegradation processes in the field. An essential step toward attaining reliable bioremediation technologies is the development of criteria which prove that microorganisms in contaminated field sites are truly active in metabolizing contaminants of interest. These criteria, which rely upon genetic, biochemical, physiological, and ecological principles and apply to both in situ and ex situ bioremediation strategies include: (i) internal conservative tracers; (ii) added conservative tracers; (iii) added radioactive tracers; (iv) added isotopic tracers; (v) stable isotopic fractionation patterns; (vi) detection of intermediary metabolites; (vii) replicated field plots; (viii) microbial metabolic adaptation; (ix) molecular biological indicators; (x) gradients of coreactants and/or products; (xi) in situ rates of respiration; (xii) mass balances of contaminants, coreactants, and products; and (xiii) computer modeling that incorporates transport and reactive stoichiometries of electron donors and acceptors. The ideal goal is achieving a quantitative understanding of the geochemistry, hydrogeology, and physiology of complex real-world systems.

  12. The reliability of the Glasgow Coma Scale: a systematic review.

    Science.gov (United States)

    Reith, Florence C M; Van den Brande, Ruben; Synnot, Anneliese; Gruen, Russell; Maas, Andrew I R

    2016-01-01

    The Glasgow Coma Scale (GCS) provides a structured method for assessment of the level of consciousness. Its derived sum score is applied in research and adopted in intensive care unit scoring systems. Controversy exists on the reliability of the GCS. The aim of this systematic review was to summarize evidence on the reliability of the GCS. A literature search was undertaken in MEDLINE, EMBASE and CINAHL. Observational studies that assessed the reliability of the GCS, expressed by a statistical measure, were included. Methodological quality was evaluated with the consensus-based standards for the selection of health measurement instruments checklist and its influence on results considered. Reliability estimates were synthesized narratively. We identified 52 relevant studies that showed significant heterogeneity in the type of reliability estimates used, patients studied, setting and characteristics of observers. Methodological quality was good (n = 7), fair (n = 18) or poor (n = 27). In good quality studies, kappa values were ≥0.6 in 85%, and all intraclass correlation coefficients indicated excellent reliability. Poor quality studies showed lower reliability estimates. Reliability for the GCS components was higher than for the sum score. Factors that may influence reliability include education and training, the level of consciousness and type of stimuli used. Only 13% of studies were of good quality and inconsistency in reported reliability estimates was found. Although the reliability was adequate in good quality studies, further improvement is desirable. From a methodological perspective, the quality of reliability studies needs to be improved. From a clinical perspective, a renewed focus on training/education and standardization of assessment is required.
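The kappa statistics summarized above measure agreement corrected for chance. A minimal (unweighted) Cohen's kappa computation for two raters is shown below; the scores are invented, not data from the review.

```python
# Cohen's kappa: agreement between two raters beyond chance.
# kappa = (p_observed - p_expected) / (1 - p_expected). Toy data.

def cohens_kappa(r1, r2):
    n = len(r1)
    cats = set(r1) | set(r2)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Two raters' GCS sum scores for ten patients (invented)
rater1 = [15, 14, 8, 8, 3, 12, 15, 6, 9, 13]
rater2 = [15, 14, 8, 7, 3, 12, 14, 6, 9, 13]

print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")
```

For ordinal scores such as the GCS sum, a weighted kappa that penalizes near-misses less than gross disagreements is often preferred; the unweighted form above is the simplest member of that family.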

  13. EDUCATION IN SEARCH OF THE ADEQUACY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Y. V. Larin

    2014-01-01

    The paper discusses an acute methodological problem: elicitation of the fundamental principle of modern education. In the course of a retrospective analysis, the author attempts to trace the essence and comparative historical specificity of the principle in question, and to find out whether the currently declared principle actually corresponds with society's demands and the requirements of the time. Consequently, the author singles out three successive historical types of education, each based on its own ideological and methodological assumptions. The first (17th – mid-19th century), based on the ontological system «Man and Nature», regards man as a natural creature and proclaims the fundamental educational principle of adequacy to nature. The second type, formed by the end of the 19th century and based on the ontological system «Man and Society», takes man as a social creature and puts forward the fundamental educational principle of adequacy to society. Finally, the multi-dimensional ontological system «Man-Nature-Culture-Society», developed in the mid-20th century, defines man as a bio-socio-cultural creature and forms the basis for a new fundamental educational principle of adequacy to culture. The paper maintains that the principle of adequacy to nature corresponds with the classical period of education history; orientation on social adequacy represents its non-classical stage; and the principle of cultural adequacy signifies the post-non-classical phase. In conclusion, the author argues that resumption of the initial educational principle of adequacy to nature would amount to moving backward.

  15. Recommendations for certification or measurement of reliability for reliable digital archival repositories with emphasis on access

    Directory of Open Access Journals (Sweden)

    Paula Regina Ventura Amorim Gonçalez

    2017-04-01

    Introduction: Considering the guidelines of ISO 16363:2012 (Space data and information transfer systems -- Audit and certification of trustworthy digital repositories) and the text of CONARQ Resolution 39 on certification of Reliable Digital Archival Repositories (RDC-Arq), this study verifies which technical recommendations should serve as the basis for a digital archival repository to be considered reliable. Objective: To identify requirements for the creation of Reliable Digital Archival Repositories, with emphasis on access to information, based on ISO 16363:2012 and CONARQ Resolution 39. Methodology: The study is an exploratory, descriptive and documentary theoretical investigation, since it is based on ISO 16363:2012 and CONARQ Resolution 39. With respect to the problem approach, the study is qualitative and quantitative, since the data were collected, tabulated, and analyzed through interpretation of their contents. Results: A checklist of recommendations for measuring and/or certifying the reliability of an RDC-Arq is presented, focused on the identification of requirements with emphasis on access to information. Conclusions: The right to information, as well as access to reliable information, is a premise for digital archival repositories; the set of recommendations is therefore directed to archivists who work with digital repositories and wish to verify the requirements needed to evaluate a repository's reliability, and it can also guide information professionals in gathering requirements for repository reliability certification.

  16. Power Industry Reliability Coordination in Asia in a Market Environment

    OpenAIRE

    Hammons, Thomas J.; Voropai, Nikolai I.

    2010-01-01

    This paper addresses the problems of power supply reliability in a market environment. The specific features of economic interrelations between the power supply organization and consumers in terms of reliability assurance are examined and the principles of providing power supply reliability are formulated. The economic mechanisms of coordinating the interests of power supply organization and consumers to provide power supply reliability are discussed. Reliability of restructuring China's powe...

  17. Seismic stops vs. snubbers, a reliable alternative

    International Nuclear Information System (INIS)

    Cloud, R.L.; Anderson, P.H.; Leung, J.S.M.

    1988-01-01

    The Seismic Stops methodology has been developed as a reliable alternative for the seismic support of nuclear power plant piping. The concept is based on using rigid passive supports with large clearances; these gaps permit unrestrained thermal expansion while limiting excessive seismic displacements. This type of restraint has performed successfully in fossil-fueled power plants. A simplified production analysis tool has been developed which evaluates the nonlinear piping response, including the effect of the gapped supports. The methodology utilizes the response spectrum approach and has been incorporated into a piping analysis computer program, RLCA-GAP. Full-scale shake table tests of piping specimens were performed to provide test correlation with the developed methodology, and analyses using RLCA-GAP were in good agreement with the test results. A sample piping system was evaluated using the Seismic Stops methodology to replace the existing snubbers with passive gapped supports; to provide further correlation data, the sample system was also evaluated using nonlinear time history analysis. The correlation comparisons showed RLCA-GAP to be a viable methodology and a reliable alternative for snubber optimization and elimination. (orig.)

  18. Qualitative methodology in a psychoanalytic single case study

    DEFF Research Database (Denmark)

    Grünbaum, Liselotte

    features and breaks in psychotherapy investigated. One aim of the study was to contribute to the development of a transparent and systematic methodology for the psychoanalytic case study by application of rigorous qualitative research methodology. To this end, inductive-deductive principles in line...

  19. Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology

    Science.gov (United States)

    Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.

    2016-05-01

    Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components dimensioning the lifetime of the system, so cooler reliability is one of its most important parameters, and it has to increase to answer market needs. To do this, data identifying the weakest elements determining cooler reliability have to be collected; yet data collected in the field are hardly usable due to a lack of information. A method for identifying reliability improvements therefore has to be set up that can be used even without field returns. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expert analyses of RM2 failures occurring in accelerated ageing. Failure modes were then identified and corrective actions carried out. Besides this, the functions of the cooler were ranked with regard to their potential to increase its efficiency, and specific changes were introduced on the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on these two axes - weak spots for cooler reliability, and efficiency - permitted us to drastically increase the MTTF of the RM2 cooler. The large improvements in RM2 reliability are proven by both field returns and reliability monitoring; these figures are discussed in the paper.

  20. Proposed Reliability/Cost Model

    Science.gov (United States)

    Delionback, L. M.

    1982-01-01

    New technique estimates cost of improvement in reliability for complex system. Model format/approach is dependent upon use of subsystem cost-estimating relationships (CER's) in devising cost-effective policy. Proposed methodology should have application in broad range of engineering management decisions.

  1. Overview of system reliability analyses for PSA

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2012-01-01

    Overall explanations are given for many matters relating to system reliability analysis. Systems engineering, operations research, industrial engineering and quality control are briefly explained. Many system reliability analysis methods, including advanced methods, are introduced. Discussions are given for FMEA, reliability block diagrams, Markov models, Petri nets, Bayesian networks, goal tree success tree, dynamic flowgraph methodology, the cell-to-cell mapping technique, the GO-FLOW methodology and others. (author)
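
    As a minimal, hypothetical illustration of one of the methods listed above: a reliability block diagram reduces to products of series and parallel block reliabilities. This sketch is not from the paper; the component values are invented.

```python
def series(*rs):
    """Series blocks: the system works only if every block works."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Redundant (parallel) blocks: the system fails only if all blocks fail."""
    q = 1.0
    for x in rs:
        q *= 1.0 - x
    return 1.0 - q

# hypothetical example: a pump (0.95) in series with two redundant valves (0.90 each)
r_system = series(0.95, parallel(0.90, 0.90))
print(round(r_system, 4))  # 0.95 * (1 - 0.10 * 0.10) = 0.9405
```

More complex diagrams are evaluated by repeatedly collapsing series and parallel groups in this way.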

  2. Reliability evaluation of deregulated electric power systems for planning applications

    International Nuclear Information System (INIS)

    Ehsani, A.; Ranjbar, A.M.; Jafari, A.; Fotuhi-Firuzabad, M.

    2008-01-01

    In a deregulated electric power utility industry in which a competitive electricity market can influence system reliability, market risks cannot be ignored. This paper (1) proposes an analytical probabilistic model for reliability evaluation of competitive electricity markets and (2) develops a methodology for incorporating the market reliability problem into HLII reliability studies. A Markov state space diagram is employed to evaluate the market reliability. Since the market is a continuously operated system, the concept of absorbing states is applied to it in order to evaluate the reliability. The market states are identified by using market performance indices, and the transition rates are calculated by using historical data. The key point in the proposed method is the concept that the reliability level of a restructured electric power system can be calculated using the availability of the composite power system (HLII) and the reliability of the electricity market. Two case studies are carried out on the Roy Billinton Test System (RBTS) to illustrate interesting features of the proposed methodology.
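
    As a minimal illustration of the Markov state-space idea (a two-state sketch, not the paper's multi-state market model): for a single entity with an assumed failure rate and repair rate, the long-run availability is the repair rate divided by the sum of the two rates.

```python
def steady_state_availability(failure_rate, repair_rate):
    """Long-run probability of the 'up' state in a two-state Markov process."""
    return repair_rate / (failure_rate + repair_rate)

# hypothetical rates: one failure per ~100 h, 2 h mean time to repair
availability = steady_state_availability(0.01, 0.5)
print(round(availability, 4))  # 0.5 / 0.51 ≈ 0.9804
```

In the paper's setting the same balance equations are solved over many market states, with transition rates estimated from historical data.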

  3. A Reliability Assessment Method for the VHTR Safety Systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok; Jae, Moo Sung; Kim, Yong Wan

    2011-01-01

    The passive safety systems of the very high temperature reactor, which has attracted worldwide attention since the last century, are introduced to improve the safety of next-generation nuclear power plant designs. A passive system's functionality does not rely on an external source of energy, but on an intelligent use of natural phenomena, such as gravity, conduction and radiation, which are always present. Because of these features, passive safety is difficult to evaluate with the existing risk analysis methodology, which considers only active system failures, so a new reliability methodology has to be considered. In this study, a preliminary evaluation and conceptualization are attempted by applying the concept of load and capacity from the reliability physics model, designing a new passive system analysis methodology, and trial-applying it to a paper plant.
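
    The load-and-capacity (stress-strength) concept mentioned above can be sketched as follows; the normal distributions and all parameters are illustrative assumptions, not the study's model.

```python
import random

def failure_probability(load_mean, load_sd, cap_mean, cap_sd, n=100_000, seed=1):
    """Monte Carlo estimate of P(load > capacity) under assumed normal models."""
    rng = random.Random(seed)
    failures = sum(
        rng.gauss(load_mean, load_sd) > rng.gauss(cap_mean, cap_sd)
        for _ in range(n)
    )
    return failures / n

# hypothetical load (mean 80, sd 10) versus capacity (mean 120, sd 15)
print(failure_probability(80, 10, 120, 15))  # roughly 0.013
```

Failure is declared whenever a sampled load exceeds a sampled capacity, which is the reliability-physics analogue of a hardware fault in an active system.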

  4. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
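
    A minimal sketch of the margin-versus-uncertainty comparison described above, assuming the design margin is normally distributed (the paper's exact statistical treatment may differ, and the numbers are hypothetical):

```python
import math

def margin_reliability(nominal_margin, margin_sd):
    """P(actual margin > 0) for a normally distributed design margin."""
    z = nominal_margin / margin_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# hypothetical MOV: 30% nominal thrust margin, 12% one-sigma uncertainty
print(round(margin_reliability(0.30, 0.12), 4))  # Phi(2.5) ≈ 0.9938
```

The larger the nominal margin relative to its uncertainty, the higher the estimated reliability, which is what drives the verification-test spacing.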

  6. MEMS reliability: coming of age

    Science.gov (United States)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  7. Reliability assessment of nuclear structural systems

    International Nuclear Information System (INIS)

    Reich, M.; Hwang, H.

    1983-01-01

    Reliability assessment of nuclear structural systems has been receiving more emphasis over the last few years. This paper deals with the recent progress made by the Structural Analysis Division of Brookhaven National Laboratory (BNL), in the development of a probability-based reliability analysis methodology for safety evaluation of reactor containments and other seismic category I structures. An important feature of this methodology is the incorporation of finite element analysis and random vibration theory. By utilizing this method, it is possible to evaluate the safety of nuclear structures under various static and dynamic loads in terms of limit state probability. Progress in other related areas, such as the establishment of probabilistic characteristics for various loads and structural resistance, are also described. Results of an application of the methodology to a realistic reinforced concrete containment subjected to dead and live loads, accidental internal pressures and earthquake ground accelerations are presented

  8. Reliability of thermal-hydraulic passive safety systems

    International Nuclear Information System (INIS)

    D'Auria, F.; Araneo, D.; Pierro, F.; Galassi, G.

    2014-01-01

    The scholar is informed of reliability concepts applied to passive systems adopted for nuclear reactors. For classical components and systems, the failure concept is associated with malfunction or breakage of hardware; in the case of passive systems, failure is associated with phenomena. A method for studying the reliability of passive systems is discussed and applied. The paper describes the REPAS (Reliability Evaluation of Passive Safety System) methodology developed by the University of Pisa (UNIPI) and presents results from its application. The general objective of the REPAS methodology is to characterize the performance of a passive system in order to increase confidence in its operation, and to compare the performances of active and passive systems as well as of different passive systems.

  9. General principles of radiotherapy

    International Nuclear Information System (INIS)

    Easson, E.C.

    1985-01-01

    The daily practice of any established branch of medicine should be based on some acceptable principles. This chapter is concerned with the general principles on which the radiotherapy of the Manchester school is based. Though many radiotherapists in other centres would doubtless accept these principles, there are sufficiently wide differences in practice throughout the world to suggest that some therapists adhere to a fundamentally different philosophy. The authors believe it is important, especially for those beginning their formal training in radiotherapy, to subscribe to an internally consistent school of thought, employing methods of treatment for each type of lesion in each anatomical site that are based on accepted principles and subjected to continuous rigorous scrutiny to test their effectiveness. Not only must each therapeutic technique be evaluated, but the underlying principles too must be questioned if and when this seems indicated. It is a feature of this hospital that similar lesions are all treated by the same technique, so long as statistical evidence justifies such a policy. All members of the staff adhere to the accepted policy until or unless reliable reasons are adduced to change this policy

  10. Agile foundations principles, practices and frameworks

    CERN Document Server

    Measey, Peter; Gray, Alex; Levy, Richard; Oliver, Les; Roberts, Barbara; Short, Michael; Wilmshurst, Darren; Wolf, Lazaro

    2015-01-01

    Agile practices transform the way organisations carry out business and respond to change. But to realise success, an Agile mindset needs to be adopted throughout an organisation. This book gives a comprehensive introduction to Agile principles and methodologies.

  11. Parts and Components Reliability Assessment: A Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia

    2009-01-01

    System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as reliability prediction, Failure Modes and Effects Analysis (FMEA) and Fault Tree Analysis (FTA), to assess risks and perform design tradeoffs, and therefore to ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by United States (U.S.) military or commercial standards and handbooks, many of which are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development and hard failure data are not yet available, or when manufacturers are not contractually obliged by their customers to publish reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success efficiently, at low cost, and on a tight schedule.
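
    A hedged sketch of a standards-based parts-count prediction of the kind described above; the part list and failure rates below are invented for illustration and are not taken from any handbook.

```python
import math

# assumed part failure rates, in failures per million hours (FPMH)
part_rates_fpmh = {
    "microcontroller": 0.8,
    "capacitor": 0.05,
    "connector": 0.2,
    "relay": 1.5,
}

# series (parts-count) model: the assembly fails if any part fails, so rates add
lambda_total = sum(part_rates_fpmh.values()) / 1e6  # failures per hour
mttf_hours = 1.0 / lambda_total
r_mission = math.exp(-lambda_total * 1000.0)        # reliability over a 1000 h mission

print(round(mttf_hours), round(r_mission, 4))  # 392157 0.9975
```

Handbook-based predictions of this form are what stand in for hard failure data during early design, as the abstract notes.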

  12. Core principles of evolutionary medicine

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Abstract Background and objectives Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results For each of fourteen core principles, at least 80% of the panelists agreed or strongly agreed that it was an important core principle for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. Conclusions and implications This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660

  13. A reliability-based preventive maintenance methodology for the projection spot welding machine

    Directory of Open Access Journals (Sweden)

    Fayzimatov Ulugbek

    2018-06-01

    Effective operation of a projection spot welding (PSW) machine is closely related to the effectiveness of its maintenance. Timely maintenance can prevent failures and improve the reliability and maintainability of the machine; establishing the maintenance frequency for the welding machine is therefore one of the most important tasks for plant engineers. In this regard, reliability analysis of the welding machine can be used to establish preventive maintenance intervals (PMI) and to identify the critical parts of the system. In this reliability and maintainability study, an analysis of the PSW machine was carried out. The failure and repair data for the analysis were obtained from an automobile manufacturing company located in Uzbekistan. The machine was divided into three main sub-systems: electrical, pneumatic and hydraulic. Different distribution functions were tested for each sub-system and their parameters tabulated. Based on the estimated parameters of the analyzed distributions, the PMI for the PSW machine's sub-systems at different reliability levels was calculated. Finally, preventive measures for enhancing the reliability of the PSW machine sub-systems are suggested.
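
    To illustrate how a PMI follows from fitted distribution parameters, assume (hypothetically) that a sub-system's time to failure is Weibull with shape beta and scale eta; the interval that keeps reliability at level R is t(R) = eta * (-ln R)^(1/beta). The sub-system and its parameters below are invented for illustration.

```python
import math

def preventive_interval(beta, eta, reliability):
    """Operating time at which a Weibull(beta, eta) item's reliability falls to the given level."""
    return eta * (-math.log(reliability)) ** (1.0 / beta)

# hypothetical pneumatic sub-system: beta = 1.8 (wear-out), eta = 900 h
for r in (0.90, 0.80):
    print(f"R = {r:.2f}: maintain every {preventive_interval(1.8, 900.0, r):.0f} h")
```

Demanding a higher reliability level shortens the interval, which is the trade-off the tabulated PMIs express.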

  14. Integration of human reliability analysis into the probabilistic risk assessment process: Phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1984-10-01

    A research program was initiated to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  15. Use of PRA methodology for enhancing operational safety and reliability

    International Nuclear Information System (INIS)

    Chu, B.; Rumble, E.; Najafi, B.; Putney, B.; Young, J.

    1985-01-01

    This paper describes a broad scope, on-going R and D study, sponsored by the Electric Power Research Institute (EPRI) to utilize key features of the state-of-the-art plant information management and system analysis techniques to develop and demonstrate a practical engineering tool for assisting plant engineering and operational staff to perform their activities more effectively. The study is foreseen to consist of two major activities: to develop a user-friendly, integrated software system; and to demonstrate the applications of this software on-site. This integrated software, Reliability Analysis Program with In-Plant Data (RAPID), will consist of three types of interrelated elements: an Executive Controller which will provide engineering and operations staff users with interface and control of the other two software elements, a Data Base Manager which can acquire, store, select, and transfer data, and Applications Modules which will perform the specific reliability-oriented functions. A broad range of these functions has been envisaged. The immediate emphasis will be focused on four application modules: a Plant Status Module, a Technical Specification Optimization Module, a Reliability Assessment Module, and a Utility Module for acquiring plant data

  16. A methodology based in particle swarm optimization algorithm for preventive maintenance focused in reliability and cost

    International Nuclear Information System (INIS)

    Luz, Andre Ferreira da

    2009-01-01

    In this work, a Particle Swarm Optimization (PSO) algorithm is developed for preventive maintenance optimization. The proposed methodology, which allows flexible intervals between maintenance interventions instead of the usual fixed periods, allows a better adaptation of scheduling in order to deal with the failure rates of components under aging. Because of this flexibility, however, the planning of preventive maintenance becomes a difficult task. Motivated by the fact that PSO has proved to be very competitive compared to other optimization tools, this work investigates the use of PSO as an alternative optimization tool. Considering that PSO works in a real and continuous space, it is a challenge to use it for discrete optimization, in which scheduling may comprise a variable number of maintenance interventions; the PSO model developed in this work overcomes this difficulty. The proposed PSO searches for the best maintenance policy and considers several aspects, such as the probability of needing repair (corrective maintenance), the cost of such repairs, typical outage times, the costs of preventive maintenance, the impact of maintenance on the reliability of the system as a whole, and the probability of imperfect maintenance. To evaluate the proposed methodology, we investigate an electro-mechanical system consisting of three pumps and four valves: the High Pressure Injection System (HPIS) of a PWR. Results show that PSO is quite efficient in finding the optimum preventive maintenance policies for the HPIS. (author)
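
    A minimal PSO sketch (illustrative only, not the author's implementation): particles search a continuous range for a maintenance interval x that minimizes a toy cost-plus-risk objective. The objective function, parameter values and bounds are all assumptions.

```python
import random

def cost(x):
    # hypothetical objective: frequent maintenance is expensive (100/x),
    # long intervals raise expected failure cost (0.5 * x); minimum at sqrt(200)
    return 100.0 / x + 0.5 * x

rng = random.Random(0)
n_particles, w, c1, c2 = 20, 0.7, 1.5, 1.5
pos = [rng.uniform(1.0, 50.0) for _ in range(n_particles)]
vel = [0.0] * n_particles
pbest = pos[:]                 # each particle's best-seen position
gbest = min(pos, key=cost)     # best position seen by the whole swarm

for _ in range(200):
    for i in range(n_particles):
        # inertia + pull toward personal best + pull toward global best
        vel[i] = (w * vel[i]
                  + c1 * rng.random() * (pbest[i] - pos[i])
                  + c2 * rng.random() * (gbest - pos[i]))
        pos[i] = min(50.0, max(1.0, pos[i] + vel[i]))
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i]
        if cost(pos[i]) < cost(gbest):
            gbest = pos[i]

print(round(gbest, 2))  # converges near sqrt(200) ≈ 14.14
```

In the paper's setting each particle would instead encode a full maintenance schedule, with the objective combining repair costs, outage times and system unreliability.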

  17. Study on seismic reliability for foundation grounds and surrounding slopes of nuclear power plants. Proposal of evaluation methodology and integration of seismic reliability evaluation system

    International Nuclear Information System (INIS)

    Ohtori, Yasuki; Kanatani, Mamoru

    2006-01-01

    This paper proposes a methodology for evaluating the annual probability of failure of soil structures subjected to earthquakes and integrates an analysis system for the seismic reliability of soil structures. The method is based on margin analysis, which evaluates the ground motion level at which the structure is damaged. First, a ground motion index that is strongly correlated with damage to, or response of, the specific structure is selected, and the ultimate strength in terms of the selected ground motion index is evaluated. Next, the variation of soil properties is taken into account in the evaluation of the seismic stability of structures: the variation of the safety factor (SF) is evaluated and then converted into a variation of the specific ground motion index. Finally, the fragility curve is developed and the annual probability of failure is evaluated in combination with the seismic hazard curve. The system facilitates the assessment of seismic reliability: a generator of random numbers, a dynamic analysis program and a stability analysis program are incorporated into one package, so that once a structural model, the distribution of soil properties, input ground motions and so forth are defined, a list of safety factors for each sliding line is obtained. Monte Carlo Simulation (MCS), Latin Hypercube Sampling (LHS), the point estimation method (PEM) and the first order second moment (FOSM) method implemented in this system are also introduced. As numerical examples, a ground foundation and a surrounding slope are assessed using the proposed method and the integrated system. (author)
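
    The final step described above, combining a fragility curve with a seismic hazard curve, can be sketched as a numerical convolution; the lognormal fragility, power-law hazard and every parameter below are assumed for illustration only.

```python
import math

def fragility(a, median=0.6, beta=0.4):
    """Assumed lognormal fragility: P(failure | ground-motion level a, in g)."""
    return 0.5 * (1.0 + math.erf(math.log(a / median) / (beta * math.sqrt(2.0))))

def hazard(a, k0=1e-3, k=2.5):
    """Assumed power-law hazard: annual frequency of exceeding level a."""
    return k0 * a ** (-k)

# annual probability of failure: P_f = -integral of fragility(a) * dH(a)
pf, n, a_lo, a_hi = 0.0, 4000, 0.05, 3.0
da = (a_hi - a_lo) / n
for i in range(n):
    a = a_lo + (i + 0.5) * da
    d_hazard = hazard(a + 0.5 * da) - hazard(a - 0.5 * da)  # negative increment
    pf += fragility(a) * (-d_hazard)
print(f"annual probability of failure ~ {pf:.1e}")
```

Each slice weights the conditional failure probability at a ground-motion level by the annual frequency of motions in that slice, which is exactly how the fragility and hazard curves are combined in the proposed system.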

  18. Improving accuracy of electrochemical capacitance and solvation energetics in first-principles calculations

    Science.gov (United States)

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.

    2018-04-01

    Reliable first-principles calculations of electrochemical processes require accurate prediction of the interfacial capacitance, a challenge for current computationally efficient continuum solvation methodologies. We develop a model for the double layer of a metallic electrode that reproduces the features of the experimental capacitance of Ag(100) in a non-adsorbing, aqueous electrolyte, including a broad hump in the capacitance near the potential of zero charge and a dip in the capacitance under conditions of low ionic strength. Using this model, we identify the necessary characteristics of a solvation model suitable for first-principles electrochemistry of metal surfaces in non-adsorbing, aqueous electrolytes: dielectric and ionic nonlinearity, and a dielectric-only region at the interface. The dielectric nonlinearity, caused by the saturation of dipole rotational response in water, creates the capacitance hump, while ionic nonlinearity, caused by the compactness of the diffuse layer, generates the capacitance dip seen at low ionic strength. We show that none of the previously developed solvation models simultaneously meet all these criteria. We design the nonlinear electrochemical soft-sphere solvation model which both captures the capacitance features observed experimentally and serves as a general-purpose continuum solvation model.

  19. The Principle-Based Method of Practical Ethics.

    Science.gov (United States)

    Spielthenner, Georg

    2017-09-01

    This paper is about the methodology of doing practical ethics. There is a variety of methods employed in ethics. One of them is the principle-based approach, which has an established place in ethical reasoning. In everyday life, we often judge the rightness and wrongness of actions by their conformity to principles, and the appeal to principles plays a significant role in practical ethics, too. In this paper, I try to provide a better understanding of the nature of principle-based reasoning. To accomplish this, I show in the first section that these principles can be applied to cases in a meaningful and sufficiently precise way. The second section discusses how relevant the application of principles is to the resolution of ethical issues. This depends on their nature. I argue that the principles under consideration in this paper should be interpreted as presumptive principles, and I conclude that although they cannot be expected to bear the weight of definitively resolving ethical problems, these principles can nevertheless play a considerable role in ethical research.

  20. FACING ISO 9001. PRINCIPLE OF PDCA, METHODOLOGY 8D

    Directory of Open Access Journals (Sweden)

    S. V. Yurchenko

    2014-01-01

    Full Text Available Management-system tools are presented by means of which it is possible to manage information flows both within the organization and in the consumer-supplier chain. Control chart models help to build and demonstrate a productive management system while applying methodology 8D.

  1. On the Reliability of Implicit and Explicit Memory Measures.

    Science.gov (United States)

    Buchner, Axel; Wippich, Werner

    2000-01-01

    Studied the reliability of implicit and explicit memory tests in experiments involving these tests. Results with 168, 84, 120, and 128 undergraduates show that methodological artifacts may cause implicit memory tests to have lower reliability than explicit memory tests, but that implicit tests need not necessarily be less reliable. (SLD)

  2. Reliability of thermal interface materials: A review

    International Nuclear Information System (INIS)

    Due, Jens; Robinson, Anthony J.

    2013-01-01

    Thermal interface materials (TIMs) are used extensively to improve thermal conduction across two mating parts. They are particularly crucial in electronics thermal management since excessive junction-to-ambient thermal resistances can cause elevated temperatures which can negatively influence device performance and reliability. Of particular interest to electronic package designers is the thermal resistance of the TIM layer at the end of its design life. Estimates of this allow the package to be designed to perform adequately over its entire useful life. To this end, TIM reliability studies have been performed using accelerated stress tests. This paper reviews the body of work which has been performed on TIM reliability. It focuses on the various test methodologies, with commentary on the results which have been obtained for the different TIM materials. Based on the information available in the open literature, a test procedure is proposed for TIM selection based on beginning- and end-of-life performance. - Highlights: ► This paper reviews the body of work which has been performed on TIM reliability. ► Test methodologies for reliability testing are outlined. ► Reliability results for the different TIM materials are discussed. ► A test procedure is proposed for TIM selection based on beginning-of-life and end-of-life performance.

  3. Composite reliability evaluation for transmission network planning

    Directory of Open Access Journals (Sweden)

    Jiashen Teh

    2018-01-01

    Full Text Available As the penetration of wind power into the power system increases, the ability to assess the reliability impact of such interaction becomes more important. Composite reliability evaluations involving wind energy provide ample opportunities for assessing the benefits of different wind farm connection points. A connection to a weak area of the transmission network will require network reinforcement to absorb the additional wind energy. Traditionally, reinforcements are performed by constructing new transmission corridors. However, a state-of-the-art technology such as the dynamic thermal rating (DTR) system provides a new reinforcement strategy, and this requires a new reliability assessment method. This paper demonstrates a methodology for assessing the cost and the reliability of network reinforcement strategies that considers DTR systems when large-scale wind farms are connected to the existing power network. Sequential Monte Carlo simulations were performed, and all DTRs and wind speeds were simulated using the auto-regressive moving average (ARMA) model. Various reinforcement strategies were assessed from their cost and reliability aspects, with practical industrial standards used as guidelines when assessing costs. Owing to this, the proposed methodology is able to determine the optimal reinforcement strategies when both cost and reliability requirements are considered.
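    As a rough illustration of the simulation machinery, an ARMA time series of the kind used above for wind speed (and DTR) sampling can be generated with the standard library alone. The ARMA(1,1) coefficients, the 8 m/s mean wind speed and the scaling are assumed figures for the sketch, not values from the paper.

```python
import random

def simulate_arma(n, phi=0.8, theta=0.3, sigma=1.0, seed=42):
    """Simulate an ARMA(1,1) series:  y_t = phi*y_{t-1} + e_t + theta*e_{t-1},
       with e_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    y, y_prev, e_prev = [], 0.0, 0.0
    for _ in range(n):
        e = rng.gauss(0.0, sigma)
        y_t = phi * y_prev + e + theta * e_prev
        y.append(y_t)
        y_prev, e_prev = y_t, e
    return y

# Hypothetical hourly wind speed for one year: mean 8 m/s plus an
# autocorrelated ARMA fluctuation, clipped at zero (speeds cannot be negative).
arma = simulate_arma(8760)
wind = [max(0.0, 8.0 + 1.5 * z) for z in arma]
```

    In a sequential Monte Carlo study, series like this would drive both the line-rating model and the wind farm output model, hour by hour.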

  4. A methodology to aid in the design of naval steels: Linking first principles calculations to mesoscale modeling

    International Nuclear Information System (INIS)

    Spanos, G.; Geltmacher, A.B.; Lewis, A.C.; Bingert, J.F.; Mehl, M.; Papaconstantopoulos, D.; Mishin, Y.; Gupta, A.; Matic, P.

    2007-01-01

    This paper provides a brief overview of a multidisciplinary effort at the Naval Research Laboratory aimed at developing a computationally-based methodology to assist in the design of advanced Naval steels. This program uses multiple computational techniques ranging from the atomistic length scale to continuum response. First-principles electronic structure calculations using density functional theory were employed, semi-empirical angular dependent potentials were developed based on the embedded atom method, and these potentials were used as input into Monte-Carlo and molecular dynamics simulations. Experimental techniques have also been applied to a super-austenitic stainless steel (AL6XN) to provide experimental input, guidance, verification, and enhancements to the models. These experimental methods include optical microscopy, scanning electron microscopy, transmission electron microscopy, electron backscatter diffraction, and serial sectioning in conjunction with computer-based three-dimensional reconstruction and quantitative analyses. The experimental results are also used as critical input into mesoscale finite element models of materials response

  5. Development of reliable pavement models.

    Science.gov (United States)

    2011-05-01

    The current report proposes a framework for estimating the reliability of a given pavement structure as analyzed by : the Mechanistic-Empirical Pavement Design Guide (MEPDG). The methodology proposes using a previously fit : response surface, in plac...

  6. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges of the theory and methodology of accounting are realized through the formation and implementation of new concepts, the purpose of which is to meet the needs of users in standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for sustainable development accounting is offered in the article. The set of methods and principles of sustainable development accounting, covering both standard and non-standard provisions, has been systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objectives, subject, object, methods, functions and key aspects.

  7. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to "translate" such abstract epistemological principles into more operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical generalization. Theoretically, the argumentation in the article is based on practice theory. The main part of the article describes three different examples of ways of generalizing on the basis of the same qualitative data material. There is a particular focus on describing the methodological strategies.

  8. Seismic reliability assessment methodology for CANDU concrete containment structures - phase II

    International Nuclear Information System (INIS)

    Hong, H.P.

    1996-07-01

    This study was undertaken to verify a set of load factors for reliability-based seismic evaluation of CANDU containment structures in Eastern Canada. Here, new site-specific results of a probabilistic seismic hazard assessment (response spectral velocity) were applied. It was found that the previously recommended load factors are relatively insensitive to the new seismic hazard information and are adequate for a reliability-based seismic evaluation process. (author). 4 refs., 5 tabs., 9 figs

  9. Reliability assessment for safety critical systems by statistical random testing

    International Nuclear Information System (INIS)

    Mills, S.E.

    1995-11-01

    In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then apply this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs
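    A core result of statistical random testing is the zero-failure confidence bound: if N randomly selected demands all succeed, an upper confidence bound on the per-demand failure probability follows directly from the binomial model. This is only a minimal sketch of that one step; the report's methodology covers far more, including what happens when its underlying assumptions are violated.

```python
def failure_prob_upper_bound(n_tests, n_failures=0, confidence=0.95):
    """Upper confidence bound on the per-demand failure probability p after a
       zero-failure random test campaign. From requiring
           (1 - p)^n >= 1 - confidence
       we get p <= 1 - (1 - confidence)**(1/n)."""
    if n_failures != 0:
        raise ValueError("this simple bound applies only to zero-failure campaigns")
    return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)

# Roughly 3000 successful random demands support p < 1e-3 at 95% confidence.
bound = failure_prob_upper_bound(3000)
```

    The bound shrinks only as 1/N, which is why demonstrating very small failure probabilities by testing alone requires very large test campaigns.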

  10. Reliability assessment for safety critical systems by statistical random testing

    Energy Technology Data Exchange (ETDEWEB)

    Mills, S E [Carleton Univ., Ottawa, ON (Canada). Statistical Consulting Centre

    1995-11-01

    In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then apply this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs.

  11. Application of PRINCE2 Project Management Methodology

    Directory of Open Access Journals (Sweden)

    Vaníčková Radka

    2017-09-01

    Full Text Available The methodology describes the principle of setting up a project under PRINCE2 project management. The main aim of the paper is to implement the PRINCE2 methodology for use in an enterprise in the service industry. A partial aim is to choose a supplier for the project from among new travel guides. The result of the project activity is a sight-seeing tour/service that is more attractive to customers in the tourism industry and a possible source of new job opportunities. The added value of the article is its description of applying the principles, processes and topics of PRINCE2 project management so that they might be used in the field.

  12. Methodology for assessing the impacts of distributed generation interconnection

    Directory of Open Access Journals (Sweden)

    Luis E. Luna

    2011-06-01

    Full Text Available This paper proposes a methodology for identifying and assessing the impact of distributed generation interconnection on distribution systems using Monte Carlo techniques. The methodology consists of two analysis schemes: a technical analysis, which evaluates the reliability conditions of the distribution system, and an economic analysis, which evaluates the financial impacts on the electric utility and its customers according to the system reliability level. The proposed methodology was applied to an IEEE test distribution system, considering different operation schemes for the distributed generation interconnection. Each of these schemes provided significant reliability improvements and important economic benefits for the electric utility. However, such schemes resulted in negative profitability levels for certain customers; therefore, regulatory measures and bilateral contracts that would solve this kind of problem were proposed.
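    The technical-analysis step can be caricatured with a crude sequential Monte Carlo of feeder outages, comparing expected energy not supplied (EENS) with and without a distributed generator able to island part of the load. Every number here — failure rate, repair time, switching delay, islanding success probability, load — is invented for illustration.

```python
import random

def simulate_eens(years, fail_rate, repair_hours, load_kw,
                  dg_backup=False, dg_success=0.9, seed=7):
    """Crude sequential Monte Carlo of feeder outages. With DG backup, a
       successful islanding (probability dg_success, an assumed figure) cuts
       the customer interruption to a short switching delay."""
    rng = random.Random(seed)
    switching_hours = 0.5
    total_kwh = 0.0
    for _ in range(years):
        # Draw the number of failures this year from exponential inter-arrival times.
        n_failures, t = 0, rng.expovariate(fail_rate)
        while t < 1.0:
            n_failures += 1
            t += rng.expovariate(fail_rate)
        for _ in range(n_failures):
            outage = rng.expovariate(1.0 / repair_hours)   # repair duration, hours
            if dg_backup and rng.random() < dg_success:
                outage = switching_hours                   # island the load instead
            total_kwh += load_kw * outage
    return total_kwh / years            # expected energy not supplied per year

base = simulate_eens(20000, fail_rate=2.0, repair_hours=4.0, load_kw=500.0)
with_dg = simulate_eens(20000, fail_rate=2.0, repair_hours=4.0, load_kw=500.0,
                        dg_backup=True)
```

    With these assumed figures the analytic expectation is 4000 kWh/yr without backup and 850 kWh/yr with it; the economic analysis would then monetize that EENS reduction against interconnection costs.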

  13. Reliable lateral and vertical manipulations of a single Cu adatom on a Cu(111) surface with multi-atom apex tip: semiempirical and first-principles simulations

    International Nuclear Information System (INIS)

    Xie Yiqun; Liu Qingwei; Zhang Peng; Wang Songyou; Li Yufen; Gan Fuxi; Zhuang Jun; Zhang Wenqing; Zhuang Min

    2008-01-01

    We study the reliability of the lateral manipulation of a single Cu adatom on a Cu(111) surface with single-atom, dimer and trimer apex tips using both semiempirical and first-principles simulations. The dependence of the manipulation reliability on tip height is investigated. For the single-atom apex tip the manipulation reliability increases monotonically with decreasing tip height. For the dimer and trimer apex tips the manipulation reliability is greatly improved compared to that for the single-atom apex tip over a certain tip-height range. Two kinds of mechanism are found responsible for this improvement. One is the so-called enhanced interaction mechanism in which the lateral tip-adatom interaction in the manipulation direction is improved. The other is the suspended atom mechanism in which the relative lateral trapping ability of the tip is improved due to the strong vertical attraction of the tip on the adatom. Both mechanisms occur in the manipulations with the trimer apex tip, while in those with the dimer apex tip only the former is effective. Moreover, we present a method to realize reversible vertical manipulation of a single atom on a Cu(111) surface with the trimer apex tip, based on its strong vertical and lateral attraction on the adatom

  14. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability. These models are restricted to particular types of methodologies and to a restricted number of parameters, although a number of techniques and methodologies may be used for reliability prediction. There is a need to focus on parameter selection when estimating reliability: the reliability of a system may increase or decrease depending on the parameters chosen, so the factors that most heavily affect system reliability must be identified. Nowadays, reusability is widely used across research areas. Reusability is the basis of Component-Based Systems (CBS); cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small- as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to problems in medicine: clinical medicine uses fuzzy logic and neural-network methodology significantly, while basic medical science uses neural-network genetic algorithms most frequently and preferably, and medical scientists show unavoidable interest in applying the various soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with savings of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents working of soft computing

  15. A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    International Nuclear Information System (INIS)

    George-Williams, Hindolo; Patelli, Edoardo

    2016-01-01

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system specific or limited to simple systems since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology is presented. Here, each component is defined as a Semi-Markov stochastic process and via discrete-event simulation, the operation of the system is mimicked. The principles of flow conservation are invoked to determine flow across the system for every performance level change of its components using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability. - Highlights: • A discrete event simulation model based on load flow principles. • Model does not require system path or cut sets. • Applicable to binary and multi-state systems of any topology. • Supports multiple output systems with competing demand. • Model is intuitive and generally applicable.
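    For the special case of a series chain of parallel component groups, the flow-conservation step described above reduces to a bottleneck computation; the paper's method handles arbitrary topologies with an interior-point solver, so this sketch only illustrates the principle on an assumed configuration.

```python
def system_capacity(stages):
    """Flow conservation for a series arrangement of parallel groups: each stage
       passes the sum of its components' current performance levels, and the
       system delivers the minimum across stages (the bottleneck)."""
    return min(sum(group) for group in stages)

# Hypothetical 3-stage multi-state system; each number is a component's
# current performance level, which a discrete-event simulation would update
# whenever that component changes state.
stages = [
    [50.0, 50.0],        # two pumps in parallel
    [80.0],              # a single heat exchanger
    [40.0, 30.0, 30.0],  # three valves in parallel
]
cap = system_capacity(stages)   # bottleneck: min(100, 80, 100) = 80
```

    In the event-driven scheme, re-evaluating this flow after every component transition replaces the enumeration of system states and cut-sets entirely.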

  16. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology for reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using unimodal reliability information to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  17. Methodological foundations of target market enterprise orientation

    OpenAIRE

    N.V. Karpenko

    2012-01-01

    In the article the author determines the importance of maintaining a target market orientation, whose content is based on marketing principles and envisages the interrelationship of the market segmentation and positioning processes. The proposed methodological principles for implementing segmentation are the result of the author's own research, and the positioning process is examined through a five-level system that contains three stages and two variants of organizational behavior.

  18. PETA: Methodology of Information Systems Security Penetration Testing

    Directory of Open Access Journals (Sweden)

    Tomáš Klíma

    2016-12-01

    Full Text Available Current methodologies of information systems penetration testing focus mainly on a high-level and technical description of the testing process. Unfortunately, there is no methodology focused primarily on the management of these tests. This often results in tests that are badly planned and managed, and in vulnerabilities that are unsystematically remediated. The goal of this article is to present a new methodology called PETA, which is focused mainly on the management of penetration tests. Development of this methodology was based on a comparative analysis of current methodologies. The new methodology incorporates current best practices of IT governance and project management represented by COBIT and PRINCE2 principles. The presented methodology has been quantitatively evaluated.

  19. Some approaches to system reliability improvement in engineering design

    International Nuclear Information System (INIS)

    Shen, Kecheng.

    1990-01-01

    In this thesis some approaches to system reliability improvement in engineering design are studied. In particular, the thesis aims at developing alternative methodologies for ranking of component importance which are more related to the design practice and which are more useful in system synthesis than the existing ones. It also aims at developing component reliability models by means of stress-strength interference which will enable both component reliability prediction and design for reliability. A new methodology for ranking of component importance is first developed based on the notion of the increase of the expected system yield. This methodology allows for incorporation of different improvement actions at the component level such as parallel redundancy, standby redundancy, burn-in, minimal repair and perfect replacement. For each of these improvement actions, the increase of system reliability is studied and used as the component importance measure. A possible connection between the commonly known models of component lifetimes and the stress-strength interference models is suggested. Under some general conditions the relationship between component failure rate and the stress and strength distribution characteristics is studied. A heuristic approach for obtaining bounds on failure probability through stress-strength interference is also presented. A case study and a worked example are presented, which illustrate and verify the developed importance measures and their applications in the analytical as well as synthetical work of engineering design. (author)
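    A classical baseline against which importance rankings like those developed in the thesis are usually compared is the Birnbaum measure: the increase in system reliability obtained by making a component perfectly reliable versus failed. The parallel-series structure and component reliabilities below are assumed for illustration.

```python
def series_parallel_reliability(p):
    """Example system: components 0 and 1 in parallel, in series with component 2.
       R = (1 - (1-p0)(1-p1)) * p2"""
    p0, p1, p2 = p
    return (1.0 - (1.0 - p0) * (1.0 - p1)) * p2

def birnbaum_importance(system, p, i):
    """Birnbaum importance:  I_B(i) = R(p | p_i = 1) - R(p | p_i = 0)."""
    hi = list(p); hi[i] = 1.0
    lo = list(p); lo[i] = 0.0
    return system(hi) - system(lo)

p = [0.9, 0.8, 0.95]
ranks = [birnbaum_importance(series_parallel_reliability, p, i) for i in range(3)]
# The series component (index 2) dominates: improving it helps the most.
```

    Synthesis-oriented measures like those in the thesis extend this idea by attaching a specific improvement action (redundancy, burn-in, replacement) to each component rather than the idealized p_i = 1 substitution.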

  20. Life management and operational experience feedback - tools to enhance safety and reliability of the NPP

    International Nuclear Information System (INIS)

    Mach, P.

    1997-01-01

    Preparation of the Temelin power plant's centralized equipment database has started. Principles of reliability centered maintenance are being studied, and use of these activities will be made in the Plant Ageing Management Programme. The aims of the Programme are as follows: selection of important components subject to ageing, data collection, determination of dominant stressors, development, selection and validation of ageing evaluation methods, setup of experience feedback, determination of responsibilities, methodologies and strategy, elaboration of programme procedures and documentation, and maintenance of programme flexibility. Pilot studies of component ageing are under way: for the reactor pressure vessel, steam generator, pressurizer, piping, ECCS and cables. The organizational structure of the Operational Experience Feedback system is described, as are the responsibilities of staff and sources of information. (M.D.)

  1. Methodology for the application of the I.C.R.P. optimization principle. The case of radioactive effluent control systems in the nuclear fuel cycle

    International Nuclear Information System (INIS)

    Lochard, Jacques; Maccia, Carlo; Pages, Pierre.

    1980-10-01

    This report aims at giving a detailed methodology to help improve the decision-making process in the radiation protection field, according to the optimization principle of the ICRP. A model was elaborated in a sufficiently general way to be applicable to public as well as occupational radiation protection. The main steps of the model are: 1) the assessment of collective doses and residual health effects associated with a given radiation protection level, 2) the determination of protection costs, 3) the decision analysis: cost-effectiveness and cost-benefit analysis. The model is implemented by means of a conversational computer program. The methodology is exemplified with the problem of choosing waste treatment systems for the PWRs in France. The public impact of radioactive releases is evaluated for the population within 100 km of the site. The main results are presented for two existing sites of the French nuclear program [fr
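    The cost-benefit step of ICRP optimization can be sketched as choosing the protection option that minimizes the total detriment X + αS, where X is the protection cost, S the residual collective dose, and α the monetary value assigned to a unit of collective dose. The option costs, doses and α values below are invented for illustration only.

```python
def optimal_option(options, alpha):
    """ICRP cost-benefit optimization: minimize protection cost plus the
       monetized residual collective dose, X + alpha * S."""
    return min(options, key=lambda o: o["cost"] + alpha * o["dose"])

# Hypothetical waste-treatment options: protection cost (k$) vs
# residual collective dose (person-Sv).
options = [
    {"name": "A", "cost": 100.0, "dose": 12.0},
    {"name": "B", "cost": 250.0, "dose": 5.0},
    {"name": "C", "cost": 600.0, "dose": 4.0},
]
best = optimal_option(options, alpha=20.0)   # alpha in k$ per person-Sv (assumed)
```

    Note how the decision flips with α: at 20 k$/person-Sv the cheap option wins, while at 30 k$/person-Sv the extra shielding of option B pays for itself — exactly the sensitivity a cost-benefit analysis is meant to expose.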

  2. Summary of the preparation of methodology for digital system reliability analysis for PSA purposes

    International Nuclear Information System (INIS)

    Hustak, S.; Babic, P.

    2001-12-01

    The report is structured as follows: Specific features of and requirements for the digital part of NPP Instrumentation and Control (I and C) systems (Computer-controlled digital technologies and systems of the NPP I and C system; Specific types of digital technology failures and preventive provisions; Reliability requirements for the digital parts of I and C systems; Safety requirements for the digital parts of I and C systems; Defence-in-depth). Qualitative analyses of NPP I and C system reliability and safety (Introductory system analysis; Qualitative requirements for and proof of NPP I and C system reliability and safety). Quantitative reliability analyses of the digital parts of I and C systems (Selection of a suitable quantitative measure of digital system reliability; Selected qualitative and quantitative findings regarding digital system reliability; Use of relations among the occurrences of the various types of failure). Mathematical section in support of the calculation of the various types of indices (Boolean reliability models, Markovian reliability models). Example of digital system analysis (Description of a selected protective function and the relevant digital part of the I and C system; Functional chain examined, its components and fault tree). (P.A.)
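    The Markovian reliability models listed in the mathematical section can be illustrated with the simplest case: a two-state (up/down) component with constant failure and repair rates, for which both the steady-state and the transient availability have closed forms. The rates used in the comment are assumed, not taken from the report.

```python
import math

def steady_state_availability(failure_rate, repair_rate):
    """Two-state Markov model (up/down):  A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

def availability_at(t, failure_rate, repair_rate):
    """Transient availability starting from the up state:
       A(t) = mu/(l+m) + (l/(l+m)) * exp(-(l+m) * t)."""
    l, m = failure_rate, repair_rate
    return m / (l + m) + (l / (l + m)) * math.exp(-(l + m) * t)

# e.g. lambda = 0.01/h, mu = 0.5/h  =>  A ~= 0.98 at steady state.
```

    Digital I&C channels with several failure modes (detected, undetected, spurious) extend this to larger state spaces, but the solution technique — solving the Markov balance equations — is the same.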

  3. PRINCIPLES OF CONTENT FORMATION EDUCATIONAL ELECTRONIC RESOURCE

    Directory of Open Access Journals (Sweden)

    О Ю Заславская

    2017-12-01

    Full Text Available The article considers modern possibilities of information and communication technologies for the design of electronic educational resources. The conceptual basis of the open educational multimedia system rests on the modular architecture of the electronic educational resource. The content of an electronic training module can be implemented in several types of module: obtaining information, practical exercises, and control. The regularities of the teaching process in modern pedagogical theory, both general and specific, are considered, and the principles for forming the content of instruction at different levels are defined on the basis of these regularities. On the basis of this analysis, the principles of forming the electronic educational resource are determined, taking into account the general and didactic patterns of teaching. As principles for forming the educational material of the information-delivery modules, the article considers: the principle of methodological orientation, the principle of general scientific orientation, the principle of systemic nature, the principle of fundamentalization, the principle of accounting for intersubject communications, and the principle of minimization. The principles for forming the practical-exercise modules include: the principle of systematic and dosed consistency, the principle of rational use of study time, and the principle of accessibility. The principles for forming the control module can be: the principle of the operationalization of goals and the principle of unified identification diagnosis.

  4. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are part of process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are considered. It is concluded that first generation HRA methods generally have very simplistic operator models, referring either to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  5. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
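
    The series-system assumption described above can be illustrated with a toy calculation. This is a hedged sketch only: SRFYDO's actual Bayesian machinery, covariates, and data fusion are not reproduced here, and the Weibull components and their parameters are invented for illustration.

```python
import math

def weibull_reliability(t, eta, beta):
    """Survival probability of one component at age t under a Weibull model."""
    return math.exp(-((t / eta) ** beta))

def series_system_reliability(t, components):
    """A series system works only if every component works, so (assuming
    independence) the system reliability is the product of the component
    reliabilities at the same age."""
    r = 1.0
    for eta, beta in components:
        r *= weibull_reliability(t, eta, beta)
    return r

# Three hypothetical components (characteristic life eta, shape beta):
components = [(1000.0, 1.2), (1500.0, 1.0), (800.0, 2.0)]
r_sys = series_system_reliability(500.0, components)
```

    Note that the series product is never larger than the weakest component's reliability, which is why component-level estimates dominate system-level predictions in such models.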

  6. Environmental Zoning: Some methodological implications

    NARCIS (Netherlands)

    Ike, Paul; Voogd, Henk

    1991-01-01

    The purpose of this article is to discuss some methodological problems of environmental zoning. The principle of environmental zoning will be elaborated. In addition an overview is given of a number of approaches that have been followed in practice to arrive at an integral judgement. Finally some

  7. Reliability Assessment Method of Reactor Protection System Software by Using V and Vbased Bayesian Nets

    International Nuclear Information System (INIS)

    Eom, H. S.; Park, G. Y.; Kang, H. G.; Son, H. S.

    2010-07-01

    We developed a methodology that can be practically used in the quantitative reliability assessment of safety-critical software for a protection system of nuclear power plants. The basis of the proposed methodology is the V and V process already used in the nuclear industry, which means that it is not tied to specific software development environments or to parameters that are necessary for the reliability calculation. The modular and formal sub-BNs in the proposed methodology are a useful tool for constituting the whole BN model for reliability assessment of a target software. The proposed V and V based BN model estimates the defects in the software according to the V and V results and then calculates the reliability of the software. A case study was carried out to validate the proposed methodology. The target software is the RPS SW which was developed by the KNICS project.

  8. Reliability of nuclear power plants and equipment

    International Nuclear Information System (INIS)

    1985-01-01

    The standard sets out the general principles, a list of reliability indexes, and demands on their selection. Reliability indexes of nuclear power plants include the simple indexes of fail-safe operation, life and maintainability, and of storage capability. All terms and notions are explained, and the methods of evaluating the indexes are briefly listed: statistical and calculation-experimental. The dates when the standard comes into force in the individual CMEA countries are given. (M.D.)

  9. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  10. Scaled CMOS Technology Reliability Users Guide

    Science.gov (United States)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
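
    The two soft-error populations described above differ only in their Weibull shape parameter. A minimal sketch of that distinction follows; the characteristic lives (eta) and evaluation times are invented, and only the shape values reflect the text's beta=1 versus increasing-failure-rate description.

```python
import math

def weibull_hazard(t, eta, beta):
    """Instantaneous failure rate h(t) of a Weibull(eta, beta) population."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# beta = 1: constant failure rate (the randomly distributed weak-bit population).
h_weak = [weibull_hazard(t, 1.0e4, 1.0) for t in (1.0, 10.0, 100.0)]

# beta > 1: increasing failure rate (the main breakdown population).
h_main = [weibull_hazard(t, 1.0e4, 2.5) for t in (1.0, 10.0, 100.0)]

# FIT normalization: failures per 1e9 device-hours, here at t = 100 h.
fit_weak = weibull_hazard(100.0, 1.0e4, 1.0) * 1e9
```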

  11. Reliability evaluation of a natural circulation system

    International Nuclear Information System (INIS)

    Jafari, Jalil; D'Auria, Francesco; Kazeminejad, Hossein; Davilu, Hadi

    2003-01-01

    This paper discusses a reliability study performed with reference to a passive thermohydraulic natural circulation (NC) system, named TTL-1. A methodology based on probabilistic techniques has been applied with the main purpose of optimizing the system design. The obtained results have been adopted to estimate the thermal-hydraulic reliability (TH-R) of the same system. A total of 29 relevant parameters (including nominal values and plausible ranges of variation) affecting the design and the NC performance of the TTL-1 loop are identified, and a probability of occurrence is assigned to each value based on expert judgment. Following procedures established for the uncertainty evaluation of thermal-hydraulic system code results, 137 system configurations have been selected and each configuration has been analyzed via the Relap5 best-estimate code. The reference system configuration and the failure criteria derived from the 'mission' of the passive system are adopted for the evaluation of the system TH-R. Four different definitions of a less-than-unity 'reliability value' (where unity represents the maximum achievable reliability) are proposed for the performance of the selected passive system, which is normally considered fully reliable, i.e. reliability value equal to one, in typical Probabilistic Safety Assessment (PSA) applications in nuclear reactor safety. The two 'point' TH-R values for the considered NC system were found equal to 0.70 and 0.85, i.e. values comparable with the reliability of a pump installed in an 'equivalent' forced circulation (active) system having the same 'mission'. The design optimization study was completed by a regression analysis addressing the output of the 137 calculations: heat losses, undetected leakage, loop length, riser diameter, and equivalent diameter of the test section have been found to be the most important parameters leading to the optimal system design and affecting the TH-R.
As added values for this work, the comparison has
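
    One way to picture the 'point' TH-R evaluation described above is as the probability mass of the code runs that meet the mission criteria. The following is a hypothetical sketch, not the paper's actual procedure; the run probabilities and pass/fail outcomes are invented.

```python
def th_reliability(configs):
    """configs: (probability_of_occurrence, mission_success) pairs for the
    sampled system configurations. Returns the probability mass of the
    configurations that satisfy the mission (do not meet the failure criteria)."""
    total = sum(p for p, _ in configs)
    passed = sum(p for p, ok in configs if ok)
    return passed / total

# Toy stand-in for the 137 best-estimate code runs (all values invented):
runs = [(0.40, True), (0.35, True), (0.15, False), (0.10, False)]
thr = th_reliability(runs)
```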

  12. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations which were involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CER's and sometimes possible CTR's, in devising a suitable cost-effective policy.

  13. Culture of safety. Indicators of culture of safety. Stage of culture of safety. Optimization of radiating protection. Principle of precaution. Principle ALARA. Procedure ALARA

    International Nuclear Information System (INIS)

    Mursa, E.

    2006-01-01

    Object of research: the theory and practice of optimization of radiation protection according to the recommendations of international organizations, realization of the ALARA principle, and maintenance of safety culture (SC) at nuclear power plants. The purpose of the work is to consider the general aspects of realizing the ALARA principle, the conceptual bases of safety culture as a management principle, and the practice of their introduction at nuclear power plants. The work has the character of an expert report, in which the following issues are presented. The recommendation materials of the IAEA and other international organizations have been assembled, systematized and analyzed. The definitions, characteristics and universal features of SC are described in detail, together with indicators as a problem of parameters and quantitative SC measurement. The ALARA principles are described: the precaution principle, non-acceptance of zero risk, the choice of the ALARA principle, and the model of acceptable radiation risk. The methodology for estimating the level of safety culture and the practical realization of the ALARA principle in a particular organization is shown on a practical example. A general assessment of SC at the national level in the Republic of Moldova has been made. Taking into consideration that safety culture policies are currently introduced only in relation to NPPs, this paper attempts to apply the safety culture methodology to radiological objects (the Oncological Institute of the Republic of Moldova and Special Objects No. 5101 and 5102 for long-term storage of radioactive waste). (authors)

  14. D5.3 Reading reliability report

    DEFF Research Database (Denmark)

    Cetin, Bilge Kartal; Galiotto, Carlo; Cetin, Kamil

    2010-01-01

    This deliverable presents a detailed description of the main causes of reading reliability degradation. Two main groups of impairments are recognized: those at the physical layer (e.g., fading, multipath, electromagnetic interference, shadowing due to obstacles, tag orientation misalignment, tag bending, metallic environments, etc.) and those at the medium access control sub-layer (e.g., collisions due to tag-to-tag, reader-to-reader and multiple readers-to-tag interference). The review presented in this deliverable covers previous reliability reports and existing definitions of RFID reading reliability. Performance metrics and methodologies for assessing reading reliability are further discussed. This document also presents a review of state-of-the-art RFID reading reliability improvement schemes. The solutions are classified into physical- (PHY), medium access control- (MAC), upper-, and cross-layer schemes.

  15. Reliability and risk treatment centered maintenance

    International Nuclear Information System (INIS)

    Pexa, Martin; Hladik, Tomas; Ales, Zdenek; Legat, Vaclav; Muller, Miroslav; Valasek, Petr; Havlu, Vit

    2014-01-01

    We propose a new methodology for application of well-known tools - RCM, RBI and SIFpro - with the aim to treat risks by means of suitable maintenance. The basis of the new methodology is the complex application of all three methods at the same time and not separately as is typical today. The proposed methodology suggests having just one managing team for reliability and risk treatment centred maintenance (RRTCM), employing the existing RCM, RBI, and SIFpro tools concurrently. This approach allows for a significant reduction of the duration of engineering activities. In the proposed methodology these activities are staged into five phases and structured to eliminate all duplication resulting from separate application of the three tools. The newly proposed methodology saves 45% to 50% of the engineering workload and brings significant financial savings.

  16. Reliability and risk treatment centered maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Pexa, Martin; Hladik, Tomas; Ales, Zdenek; Legat, Vaclav; Muller, Miroslav; Valasek, Petr [Czech University of Life Sciences Prague, Kamycka (Czech Republic); Havlu, Vit [Unipetrol A. S, Prague (Czech Republic)

    2014-10-15

    We propose a new methodology for application of well-known tools - RCM, RBI and SIFpro - with the aim to treat risks by means of suitable maintenance. The basis of the new methodology is the complex application of all three methods at the same time and not separately as is typical today. The proposed methodology suggests having just one managing team for reliability and risk treatment centred maintenance (RRTCM), employing the existing RCM, RBI, and SIFpro tools concurrently. This approach allows for a significant reduction of the duration of engineering activities. In the proposed methodology these activities are staged into five phases and structured to eliminate all duplication resulting from separate application of the three tools. The newly proposed methodology saves 45% to 50% of the engineering workload and brings significant financial savings.

  17. Reliability Assessment of Concrete Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Middleton, C. R.

    This paper is partly based on research performed for the Highways Agency, London, UK under the project DPU/9/44 "Revision of Bridge Assessment Rules Based on Whole Life Performance: concrete bridges". It contains the details of a methodology which can be used to generate Whole Life (WL) reliability profiles. These WL reliability profiles may be used to establish revised rules for concrete bridges. This paper is to some extent based on Thoft-Christensen et al. [1996] and Thoft-Christensen [1996].

  18. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  19. Reliability Assessment of 2400 MWth Gas-Cooled Fast Reactor Natural Circulation Decay Heat Removal in Pressurized Situations

    Directory of Open Access Journals (Sweden)

    C. Bassi

    2008-01-01

    Full Text Available As the 2400 MWth gas-cooled fast reactor concept makes use of passive safety features in combination with active safety systems, assessing the reliability and performance of natural circulation decay heat removal (NCDHR) within the ongoing probabilistic safety assessment in support of the reactor design, named “probabilistic engineering assessment” (PEA), constitutes a challenge. Within the 5th Framework Program for Research and Development (FPRD) of the European Community, a methodology has been developed to evaluate the reliability of passive systems characterized by a moving fluid and whose operation is based on physical principles, such as natural circulation. This reliability method for passive systems (RMPS) is based on the propagation of uncertainties in thermal-hydraulic (T-H) calculations. The aim of this exercise is to determine the performance reliability of the DHR system operating in a “passive” mode, taking into account the uncertainties of the parameters retained for the thermal-hydraulic calculations performed with the CATHARE 2 code. According to the preliminary PEA results, which exhibit the weight of pressurized scenarios (i.e., with intact primary circuit boundary) in the core damage frequency (CDF), the RMPS exercise first focuses on the NCDHR performance at these T-H conditions.

  20. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
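
    Step (i) above, combining the reliability measures of multiple sources, is often implemented as a noisy-OR over independent sources. The paper does not state its exact combination rule, so the following is an illustrative assumption, not the authors' method.

```python
def combine_sources(reliabilities):
    """Noisy-OR combination: the probability that at least one of several
    independent sources is truthful, given each source's truth reliability."""
    p_all_wrong = 1.0
    for r in reliabilities:
        p_all_wrong *= (1.0 - r)
    return 1.0 - p_all_wrong

conf_two = combine_sources([0.6, 0.7])        # 1 - 0.4 * 0.3 = 0.88
conf_three = combine_sources([0.6, 0.7, 0.5])
```

    Under this rule, adding a source can only raise (never lower) the combined confidence, which matches the intuition that corroborating documents reinforce each other.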

  1. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. The major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.

  2. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO Methodology, how it can be used, and benefits of using it in the analysis of complex systems

  3. An Embedded System for Safe, Secure and Reliable Execution of High Consequence Software

    Energy Technology Data Exchange (ETDEWEB)

    MCCOY,JAMES A.

    2000-08-29

    As more complex and functionally diverse requirements are placed on high consequence embedded applications, ensuring safe and secure operation requires an execution environment that is ultra reliable from a system viewpoint. In many cases the safety and security of the system depends upon the reliable cooperation between the hardware and the software to meet real-time system throughput requirements. The selection of a microprocessor and its associated development environment for an embedded application has more far-reaching effects on the development and production of the system than any other element in the design. The effects of this choice ripple through the remainder of the hardware design and profoundly affect the entire software development process. While state-of-the-art software engineering principles indicate that an object oriented (OO) methodology provides a superior development environment, traditional programming languages available for microprocessors targeted for deeply embedded applications do not directly support OO techniques. Furthermore, the microprocessors themselves do not typically support nor do they enforce an OO environment. This paper describes a system level approach for the design of a microprocessor intended for use in deeply embedded high consequence applications that both supports and enforces an OO execution environment.

  4. Integration of human reliability analysis into the probabilistic risk assessment process: phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1985-01-01

    The US Nuclear Regulatory Commission and Pacific Northwest Laboratory initiated a research program in 1984 to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second.

  5. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the activities on methodology development are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then, examples of methodology application by the regulatory side and the industry side are described. (author)

  6. Cooperative learning as a methodology for inclusive education development

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz Martínez

    2017-06-01

    Full Text Available This paper presents the methodology of cooperative learning as a strategy to develop the principles of inclusive education. It has a very practical orientation, with the intention of providing tools for teachers who want to implement this methodology in the classroom. Starting with a theoretical review, we then describe a case in which this methodology has been applied for 5 years, including specific activities and ways of working with students, and finally draw conclusions on the implementation of the methodology.

  7. Operationalising the Lean principles in maternity service design using 3P methodology.

    Science.gov (United States)

    Smith, Iain

    2016-01-01

    The last half century has seen significant changes to Maternity services in England. Though rates of maternal and infant mortality have fallen to very low levels, this has been achieved largely through hospital admission. It has been argued that maternity services may have become over-medicalised and service users have expressed a preference for more personalised care. NHS England's national strategy sets out a vision for a modern maternity service that continues to deliver safe care whilst also adopting the principles of personalisation. Therefore, there is a need to develop maternity services that balance safety with personal choice. To address this challenge, a maternity unit in North East England considered improving their service through refurbishment or building new facilities. Using a design process known as the production preparation process (or 3P), the Lean principles of understanding user value, mapping value-streams, creating flow, developing pull processes and continuous improvement were applied to the design of a new maternity department. Multiple stakeholders were engaged in the design through participation in a time-out (3P) workshop in which an innovative pathway and facility for maternity services were co-designed. The team created a hybrid model that they described as "wrap around care" in which the Lean concept of pull was applied to create a service and facility design in which expectant mothers were put at the centre of care with clinicians, skills, equipment and supplies drawn towards them in line with acuity changes as needed. Applying the Lean principles using the 3P method helped stakeholders to create an innovative design in line with the aspirations and objectives of the National Maternity Review. The case provides a practical example of stakeholders applying the Lean principles to maternity services and demonstrates the potential applicability of the Lean 3P approach to design healthcare services in line with policy requirements.

  8. Confirmatory factor analysis of teaching and learning guiding principles instrument among teacher educators in higher education institutions

    Science.gov (United States)

    Masuwai, Azwani; Tajudin, Nor'ain Mohd; Saad, Noor Shah

    2017-05-01

    The purpose of this study is to develop and establish the validity and reliability of an instrument to generate teaching and learning guiding principles, the Teaching and Learning Guiding Principles Instrument (TLGPI). Participants consisted of 171 Malaysian teacher educators. The instrument is essential for generating teaching and learning guiding principles at the higher education level in Malaysia. Confirmatory factor analysis validated all 19 items of the TLGPI, with all items indicating high reliability and internal consistency, and confirmed that a single-factor model underlies the generation of teaching and learning guiding principles.

  9. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high … A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis are presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical components (IGBTs). Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed.

  10. [Principles and methodology for ecological rehabilitation and security pattern design in key project construction].

    Science.gov (United States)

    Chen, Li-Ding; Lu, Yi-He; Tian, Hui-Ying; Shi, Qian

    2007-03-01

    Global ecological security becomes increasingly important with the intensification of human activities. The function of ecological security is influenced by human activities, and in return, the efficiency of human activities is affected by the patterns of regional ecological security. Since the 1990s, China has initiated the construction of key projects such as the "Yangtze Three Gorges Dam", "Qinghai-Tibet Railway", "West-to-East Gas Pipeline", "West-to-East Electricity Transmission" and "South-to-North Water Transfer". The interaction between these projects and regional ecological security has particularly attracted the attention of the Chinese government. Developing an ecological rehabilitation system and designing a regional ecological security pattern is not only important for regional environmental protection, but also of significance for the smooth implementation of the various projects. This paper made a systematic analysis of the types and characteristics of key project construction and their effects on the environment, and on this basis, brought forward the basic principles and methodology for ecological rehabilitation and security pattern design in such construction. It was considered that the following issues should be addressed in the implementation of a key project: 1) analysis and evaluation of the current regional ecological environment, 2) evaluation of anthropogenic disturbances and their ecological risk, 3) regional ecological rehabilitation and security pattern design, 4) scenario analysis of the environmental benefits of the regional ecological security pattern, 5) re-optimization of the regional ecological system framework, and 6) establishment of a regional ecosystem management plan.

  11. Reliability of structures by using probability and fatigue theories

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Kim, Dong Hyeok; Park, Yeon Chang

    2008-01-01

    Methodologies to calculate the failure probability and to estimate the reliability of fatigue-loaded structures are developed. The applicability of the methodologies is evaluated with the help of the fatigue crack growth models suggested by Paris and Walker. Probability theories such as the FORM (first order reliability method), the SORM (second order reliability method) and the MCS (Monte Carlo simulation) are utilized. It is found that the failure probability decreases with the increase of the design fatigue life and the applied minimum stress, and with the decrease of the initial edge crack size, the applied maximum stress and the slope of the Paris equation. Furthermore, according to the sensitivity analysis of random variables, the slope of the Paris equation affects the failure probability dominantly among the random variables in the Paris and the Walker models.
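
    The MCS approach mentioned above can be sketched for the Paris model: propagate random inputs through the closed-form Paris integration and count the fraction of samples whose crack exceeds a critical size within a given number of applied cycles. The constants C and m, the critical crack size, and the random-variable distributions below are illustrative assumptions, not values from the paper.

```python
import math
import random

def paris_crack_size(a0, dsigma, n_cycles, C=1e-12, m=3.0):
    """Crack size after n_cycles from closed-form integration of the Paris law
    da/dN = C * (dsigma * sqrt(pi * a))**m, with stress range dsigma in MPa.
    Returns math.inf when the integration diverges (unstable growth)."""
    p = 1.0 - m / 2.0                                  # negative for m > 2
    rhs = a0**p + p * C * dsigma**m * math.pi**(m / 2.0) * n_cycles
    return math.inf if rhs <= 0.0 else rhs**(1.0 / p)

def failure_probability(n_cycles, a_crit=0.02, samples=20000, seed=1):
    """Crude MCS estimate: fraction of sampled (a0, dsigma) pairs whose crack
    reaches a_crit within n_cycles applied load cycles."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(samples):
        a0 = rng.lognormvariate(math.log(1e-3), 0.3)   # initial crack size [m]
        dsigma = rng.gauss(100.0, 10.0)                # stress range [MPa]
        if paris_crack_size(a0, dsigma, n_cycles) >= a_crit:
            failures += 1
    return failures / samples

pf_few = failure_probability(2e6)      # Pf after 2e6 applied cycles
pf_many = failure_probability(8e6)     # more applied cycles -> larger Pf
```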

  12. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and the uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate the response surface function without cross terms, to identify the highest order of each random variable, and to determine the significant variables in connection with the point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored in solving RBDO problems. Additionally, a sampling-based reliability sensitivity analysis method is employed to further reduce the computational effort when design variables are distributional parameters of input random variables. The proposed methodology is applied to two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first-order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.
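A minimal sketch of the sampling idea: Gauss-Hermite quadrature points serve as supporting points, a Hermite-polynomial surrogate is fitted through them, and the same points give point-estimate moments. The response function g here is hypothetical, standing in for an expensive limit-state evaluation:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Probabilists' Gauss-Hermite nodes/weights for a standard normal variable;
# these are the "supporting points" used instead of random samples
nodes, weights = hermegauss(5)
weights = weights / weights.sum()      # normalize to probability weights

def g(u):
    # hypothetical response (a quadratic, so the surrogate can be exact)
    return 3.0 - u - 0.2 * u**2

# Point-estimate moment: E[g(U)] for U ~ N(0,1) from the 5 supporting points
mean_g = float(weights @ g(nodes))     # exact here: 3 - 0 - 0.2*1 = 2.8

# Fit a Hermite-polynomial response surface through the supporting points
V = np.stack([hermeval(nodes, np.eye(5)[k]) for k in range(5)], axis=1)
coef = np.linalg.solve(V, g(nodes))

u = np.linspace(-3.0, 3.0, 7)
err = float(np.max(np.abs(hermeval(u, coef) - g(u))))
print(mean_g, err)    # the surrogate reproduces this quadratic g to round-off
```

In the multivariate case the paper's method repeats this per variable and then adds cross terms only between significant variable pairs; the one-dimensional fit above is the building block.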

  13. Structural Reliability of Wind Turbine Blades

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov

    turbine blades. The main purpose is to draw a clear picture of how reliability-based design of wind turbines can be done in practice. The objectives of the thesis are to create methodologies for efficient reliability assessment of composite materials and composite wind turbine blades, and to map...... the uncertainties in the processes, materials and external conditions that have an effect on the health of a composite structure. The study considers all stages in a reliability analysis, from defining models of structural components to obtaining the reliability index and calibration of partial safety factors...... by developing new models and standards or carrying out tests. The following aspects are covered in detail: ⋅ The probabilistic aspects of ultimate strength of composite laminates are addressed. Laminated plates are considered as a general structural reliability system where each layer in a laminate is a separate...

  14. A G-function-based reliability-based design methodology applied to a cam roller system

    International Nuclear Information System (INIS)

    Wang, W.; Sui, P.; Wu, Y.T.

    1996-01-01

    Conventional reliability-based design optimization methods treat the reliability function as an ordinary function and apply existing mathematical programming techniques to solve the design problem. As a result, the conventional approach requires nested loops with respect to the g-function, and is very time consuming. A new reliability-based design method is proposed in this paper that deals with the g-function directly instead of the reliability function. This approach has the potential of significantly reducing the number of calls for g-function calculations, since it requires only one full reliability analysis per design iteration. A cam roller system in a typical high-pressure fuel injection diesel engine is designed using both the proposed and the conventional approach. The proposed method is much more efficient for this application.
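The efficiency argument is about the number of g-function evaluations. A minimal FORM sketch that works on the g-function directly (the Hasofer-Lind/Rackwitz-Fiessler iteration, with a hypothetical linear g in standard normal space) makes the call count explicit:

```python
import numpy as np
from math import erfc, sqrt

calls = 0
def g(u):
    """Hypothetical limit-state (g-)function in standard normal space."""
    global calls
    calls += 1
    return 3.0 - u[0] - 0.5 * u[1]

def grad_g(u, h=1e-6):
    # forward finite-difference gradient (costs 1 + dim extra g-calls)
    g0 = g(u)
    return np.array([(g(u + h * e) - g0) / h for e in np.eye(2)])

# Hasofer-Lind/Rackwitz-Fiessler iteration: drives u toward the design point
u = np.zeros(2)
for _ in range(20):
    gu, grad = g(u), grad_g(u)
    u_new = grad * (grad @ u - gu) / (grad @ grad)
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = float(np.linalg.norm(u))        # reliability index = distance to origin
pf = 0.5 * erfc(beta / sqrt(2.0))      # Pf = Phi(-beta)
print(f"beta = {beta:.4f}, Pf = {pf:.3e}, g-calls = {calls}")
```

For this linear g the iteration converges after one update, and the call counter shows how few g-evaluations a single reliability analysis needs; nesting such an analysis inside every optimizer step is what the proposed method avoids.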

  15. Justifying Design Decisions with Theory-based Design Principles

    OpenAIRE

    Schermann, Michael; Gehlert, Andreas; Pohl, Klaus; Krcmar, Helmut

    2014-01-01

    Although the role of theories in design research is recognized, we show that little attention has been paid to how to use theories when designing new artifacts. We introduce design principles as a new methodological approach to address this problem. Design principles extend the notion of design rationales that document how a design decision emerged. We extend the concept of design rationales by using theoretical hypotheses to support or object to design decisions. Using the example of developing...

  16. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique

    Directory of Open Access Journals (Sweden)

    Diana Guzys

    2015-05-01

    Full Text Available In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  17. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.

    Science.gov (United States)

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.
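As a small illustration of the structured aggregation Delphi relies on, the sketch below scores hypothetical panel ratings with one common consensus heuristic (interquartile range plus percent agreement); both the ratings and the thresholds are illustrative, not prescribed by the Delphi literature:

```python
import numpy as np

# Hypothetical round-two ratings (1-5 Likert) from a 10-member Delphi panel
ratings = {
    "item A": [4, 5, 4, 4, 5, 4, 4, 5, 4, 4],
    "item B": [1, 5, 2, 4, 3, 5, 1, 2, 4, 3],
}

# One common (but not universal) heuristic: consensus when IQR <= 1
# and at least 70% of panellists rate the item 4 or 5
verdict = {}
for item, r in ratings.items():
    r = np.array(r)
    iqr = np.percentile(r, 75) - np.percentile(r, 25)
    agree = float(np.mean(r >= 4))
    verdict[item] = bool(iqr <= 1) and (agree >= 0.7)
    print(f"{item}: median={np.median(r):.1f} IQR={iqr:.2f} agree={agree:.0%} "
          f"-> {'consensus' if verdict[item] else 'no consensus'}")
```

Items that fail the heuristic would be fed back, with anonymized summary statistics, into the next survey round.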

  18. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, D.; Brunett, A.; Passerini, S.; Grelle, A.; Bucknor, M.

    2017-06-26

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  19. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  20. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance, not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. INPRO (the International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodology are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and is strong in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and a realistic threat scenario assessment are required.

  1. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance, not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. INPRO (the International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodology are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and is strong in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and a realistic threat scenario assessment are required.

  2. Metrological Reliability of Medical Devices

    Science.gov (United States)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies in the 20th century triggered demands for the metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of biomeasurement results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with an analysis of their contribution to guaranteeing that innovative health technologies comply with the main ethical pillars of Bioethics.

  3. Principled Missing Data Treatments.

    Science.gov (United States)

    Lang, Kyle M; Little, Todd D

    2018-04-01

    We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.
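The contrast between an ad hoc and a principled treatment can be simulated directly. The sketch below (all numbers synthetic) makes y missing more often at high x, a MAR mechanism, so listwise deletion biases the mean of y while a simple regression-based multiple imputation recovers it:

```python
import numpy as np

rng = np.random.default_rng(1)

# MAR simulation: y goes missing more often when x is large
n = 20_000
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)        # true E[y] = 2.0
miss = rng.random(n) < np.where(x > 0, 0.7, 0.1)
obs = ~miss

# Listwise deletion: biased, because missingness depends on x
mean_listwise = float(y[obs].mean())

# Simple multiple imputation: regress y on x among observed cases,
# impute with added residual noise, pool the mean across m imputations
X = np.column_stack([np.ones(obs.sum()), x[obs]])
b, *_ = np.linalg.lstsq(X, y[obs], rcond=None)
resid_sd = float((y[obs] - X @ b).std())

m, means = 20, []
for _ in range(m):
    y_imp = y.copy()
    y_imp[miss] = b[0] + b[1] * x[miss] + rng.normal(0, resid_sd, miss.sum())
    means.append(y_imp.mean())
mean_mi = float(np.mean(means))

print(f"listwise {mean_listwise:.3f}  vs  MI {mean_mi:.3f}  (truth 2.0)")
```

Production analyses would use a full MI or FIML implementation with auxiliary variables and proper pooling of standard errors (Rubin's rules); the toy version only exposes why deletion methods mislead.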

  4. Human reliability in complex systems: an overview

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1976-07-01

    A detailed analysis is presented of the main conceptual background underlying the areas of human reliability and human error. The concept of error is examined and generalized to that of human reliability, and some of the practical and methodological difficulties of reconciling the different standpoints of the human factors specialist and the engineer discussed. Following a survey of general reviews available on human reliability, quantitative techniques for prediction of human reliability are considered. An in-depth critical analysis of the various quantitative methods is then presented, together with the data bank requirements for human reliability prediction. Reliability considerations in process control and nuclear plant, and also areas of design, maintenance, testing and emergency situations are discussed. The effects of stress on human reliability are analysed and methods of minimizing these effects discussed. Finally, a summary is presented and proposals for further research are set out. (author)

  5. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research

    Science.gov (United States)

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives scanty attention only. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System’s underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242

  6. Methodology for time-dependent reliability analysis of accident sequences and complex reactor systems

    International Nuclear Information System (INIS)

    Paula, H.M.

    1984-01-01

    The work presented here is of direct use in probabilistic risk assessment (PRA) and is of value to utilities as well as the Nuclear Regulatory Commission (NRC). Specifically, this report presents a methodology and a computer program to calculate the expected number of occurrences for each accident sequence in an event tree. The methodology evaluates the time-dependent (instantaneous) and the average behavior of the accident sequence. The methodology accounts for standby safety system and component failures that occur (a) before they are demanded, (b) upon demand, and (c) during the mission (system operation). With respect to failures that occur during the mission, this methodology is unique in the sense that it models components that can be repaired during the mission. The expected number of system failures during the mission provides an upper bound for the probability of a system failure to run - the mission unreliability. The basic event modeling includes components that are continuously monitored, periodically tested, and those that are not tested or are otherwise nonrepairable. The computer program ASA allows practical applications of the method developed. This work represents a required extension of the presently available methodology and allows a more realistic PRA of nuclear power plants.
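The time-dependent behavior described above can be sketched for the simplest case, a single periodically tested standby component whose failures are revealed only at tests. The rate and interval below are illustrative, not taken from the report:

```python
import numpy as np

lam = 1e-4        # standby failure rate per hour (illustrative)
T = 720.0         # test interval, hours

# Instantaneous unavailability measured from the last test:
# q(t) = 1 - exp(-lam * t), sawtoothing back to ~0 at each test
t = np.linspace(0.0, T, 10_001)
q = 1.0 - np.exp(-lam * t)

# Time-averaged unavailability over one test cycle (trapezoidal rule),
# compared with the familiar first-order approximation lam*T/2
q_avg = float(np.sum((q[:-1] + q[1:]) / 2.0 * np.diff(t)) / T)
print(f"q_avg = {q_avg:.4e}, lam*T/2 = {lam * T / 2:.4e}")
```

In an accident-sequence model, values like q_avg enter the demand-failure side, while the in-mission failure and repair behavior is handled by the time-dependent machinery the report develops.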

  7. Safety and reliability assessment

    International Nuclear Information System (INIS)

    1979-01-01

    This report contains the papers delivered at the course on safety and reliability assessment held at the CSIR Conference Centre, Scientia, Pretoria. The following topics were discussed: safety standards; licensing; biological effects of radiation; what is a PWR; safety principles in the design of a nuclear reactor; radio-release analysis; quality assurance; the staffing, organisation and training for a nuclear power plant project; event trees, fault trees and probability; Automatic Protective Systems; sources of failure-rate data; interpretation of failure data; synthesis and reliability; quantification of human error in man-machine systems; dispersion of noxious substances through the atmosphere; criticality aspects of enrichment and recovery plants; and risk and hazard analysis. Extensive examples are given as well as case studies

  8. Advances in ranking and selection, multiple comparisons, and reliability methodology and applications

    CERN Document Server

    Balakrishnan, N; Nagaraja, HN

    2007-01-01

    S. Panchapakesan has made significant contributions to ranking and selection and has published in many other areas of statistics, including order statistics, reliability theory, stochastic inequalities, and inference. Written in his honor, the twenty invited articles in this volume reflect recent advances in these areas and form a tribute to Panchapakesan's influence and impact on these areas. Thematically organized, the chapters cover a broad range of topics from: Inference; Ranking and Selection; Multiple Comparisons and Tests; Agreement Assessment; Reliability; and Biostatistics. Featuring

  9. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    Progress report on LSB steganography. In computer science, steganography is the science of hiding information. The report discusses the paper by J. Fridrich, M. Goljan and R. Du, "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special...
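As background to the report's topic, a minimal LSB embed/extract round trip on a synthetic grayscale image (plain sequential embedding, no encryption or spreading) can be sketched as follows:

```python
import numpy as np

def embed_lsb(img, message: bytes):
    """Hide message bytes in the least-significant bits of pixel values."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = img.flatten()
    assert bits.size <= flat.size, "cover image too small"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(img.shape)

def extract_lsb(img, n_bytes: int):
    """Read the first n_bytes hidden by embed_lsb."""
    bits = img.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # grayscale cover
secret = b"hidden"
stego = embed_lsb(cover.copy(), secret)

print(extract_lsb(stego, len(secret)))   # b'hidden'
print(int(np.max(np.abs(stego.astype(int) - cover.astype(int)))))   # at most 1
```

No pixel changes by more than one gray level, which is why LSB embedding is visually invisible; the Fridrich-Goljan-Du paper cited above shows it is nevertheless statistically detectable.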

  10. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of magnitude for two...... different, but by and large equally justifiable probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence...... is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...
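The bridge between a target reliability level in such a code and a formal failure probability is the standard-normal mapping Pf = Phi(-beta). A small sketch; the beta values are illustrative examples of typical code targets, not values from the paper:

```python
from math import erfc, sqrt

def pf_from_beta(beta: float) -> float:
    """Formal failure probability Pf = Phi(-beta) for reliability index beta."""
    return 0.5 * erfc(beta / sqrt(2.0))

# Illustrative target reliability indices of the size codes typically quote
for beta in (3.1, 3.8, 4.3):
    print(f"beta = {beta}: Pf = {pf_from_beta(beta):.2e}")
```

Note how one unit of beta shifts Pf by roughly an order of magnitude, which is why seemingly similar code formats can imply formal failure costs differing by several orders of magnitude.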

  11. Non-utility generation and demand management reliability of customer delivery systems

    International Nuclear Information System (INIS)

    Hamoud, G.A.; Wang, L.

    1995-01-01

    A probabilistic methodology for evaluating the impact of non-utility generation (NUG) and demand management programs (DMPs) on the supply reliability of customer delivery systems was presented. The proposed method was based on the criterion that supply reliability to the customers on the delivery system should not be affected by the integration of either NUG or DMPs. The method considered the station load profile, the load forecast, and uncertainty in the size and availability of the NUG. Impacts on system reliability were expressed in terms of possible delays of the in-service date for new facilities or in terms of an increase in the system load carrying capability. Examples illustrating the proposed methodology were provided. 10 refs., 8 tabs., 2 figs.
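The criterion can be probed with a toy Monte Carlo model: a delivery system with firm capacity plus one two-state NUG unit, checked against a synthetic hourly load profile. All capacities, availabilities and loads below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical delivery system: firm capacity plus one 20-MW NUG unit
firm_cap = 100.0                 # MW
nug_cap, nug_avail = 20.0, 0.85  # two-state NUG availability model

# Synthetic hourly load profile for one year, 80-105 MW
hours = 8760
load = 80.0 + 25.0 * rng.random(hours)

# Monte Carlo over many simulated years of independent NUG outages
n_years = 2_000
shortfall_hours = 0
for _ in range(n_years):
    up = rng.random(hours) < nug_avail       # is the NUG available this hour?
    cap = firm_cap + nug_cap * up
    shortfall_hours += int(np.sum(load > cap))

lole = shortfall_hours / n_years             # expected shortfall hours per year
print(f"LOLE = {lole:.1f} h/yr")
```

Raising firm_cap or nug_avail until the LOLE matches the pre-integration value is the kind of calculation behind the "no degradation of supply reliability" criterion, and the capacity margin so gained maps to the load carrying capability increase the abstract mentions.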

  12. Principles of methodology guiding the study of religion as found in the work of Friedrich Heiler

    Directory of Open Access Journals (Sweden)

    Tatiana Samarina

    2013-02-01

    Full Text Available The issue of methodology in the study of religious phenomena appears problematic to those Russian experts who deal with the scientific study of religious phenomena. Western European researchers, on the other hand, have already made much positive progress in this direction. This article attempts to define the principles which guide the scholar in the study of religion as found in the work of the renowned German scholar Friedrich Heiler. The author's starting point is Heiler's fundamental concept which permeates all his work and is most clearly defined in his last monograph, «Erscheinungsformen und Wesen der Religion». Heiler gives several pointers to students of religion: attention to detail, a firm grounding in the core matter of the object to be studied, and at the same time a comprehensive and panoramic view of religion and religious phenomena as a whole. The author concludes that Heiler's protracted study of both Christianity and the Eastern religions led him to regard the phenomenological method as the most effective. The author presupposes a negative reception of this method by Russian students of religious phenomena due to the fact that the latter are too unfamiliar with Heiler's work and conclusions.

  13. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and methodologic and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index [BMI (in kg/m²): 20-35]. Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and interindividual CVs were 20% and 25%, respectively. Increasing sample size, replication of reference and test foods, and length of blood sampling, as well as AUC calculation method, did not improve the CVs. Among the biological factors assessed, insulin index and glycated hemoglobin values explained 15% and 16% of the variability in mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that it is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value
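The GI calculation behind these numbers is the incremental area under the glucose curve by the trapezoidal method, with the test food's iAUC expressed as a percentage of the glucose reference. The glucose curves below are made up for illustration, not data from the study:

```python
import numpy as np

def incremental_auc(t, glucose):
    """iAUC by the trapezoidal method, ignoring area below the fasting
    baseline (the calculation commonly recommended for GI determinations)."""
    g = np.maximum(np.asarray(glucose, float) - glucose[0], 0.0)
    return float(np.sum((g[1:] + g[:-1]) / 2.0 * np.diff(t)))

# Hypothetical serum glucose curves (mmol/L) at 0-120 min
t = np.array([0, 15, 30, 45, 60, 90, 120], float)
ref  = [5.0, 7.5, 8.5, 7.8, 6.9, 5.8, 5.1]   # 50 g glucose reference
test = [5.0, 6.6, 7.4, 7.1, 6.4, 5.6, 5.0]   # 50 g white-bread portion

gi = 100.0 * incremental_auc(t, test) / incremental_auc(t, ref)
print(f"GI = {gi:.0f}")   # GI ≈ 70 for these made-up curves
```

Because the GI is a ratio of two noisy areas, day-to-day variation in either curve propagates directly into the GI value, which is the mechanism behind the 20-25% CVs the study reports.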

  14. Organizing the Methodology Work at Higher School

    Directory of Open Access Journals (Sweden)

    O. A. Plaksina

    2012-01-01

    Full Text Available The paper considers the methodology components of organizing higher school training. The authors' research and analysis of existing methodology systems reveal that their advantages and disadvantages are related to the type of the system-creating element of the methodology system's organizational structure. An optimal scheme for such a system has been developed in the context of Vocational School Reorganization, implying the specification and expansion of the set of basic design principles of any control system. Following the suggested organizational approach provides grounds for teachers' self-development and professional growth. The methodology of the approach allows using the given structure in any higher educational institution, providing the system's transition from simple functioning to a sustainable development mode.

  15. Embarrassment as a key to understanding cultural differences. Basic principles of cultural analysis

    DEFF Research Database (Denmark)

    Bouchet, Dominique

    1995-01-01

    I introduce here the principles I use in my investigation of intercultural marketing and management. I explain how I discovered them, and show how they spring from a theoretical understanding of the dynamic of cultural differences. One of the basic methodological principles for my analysis...

  16. IS THERE A NEED FOR THE POST-NON-CLASSICAL METHODOLOGY IN PEDAGOGY?

    Directory of Open Access Journals (Sweden)

    Vladislav L. Benin

    2014-01-01

    Full Text Available The publication continues the discussion started by Yu. V. Larina in "Education in Search of the Congruity Principle" concerning the modern methodology of pedagogical science, and identifies the criteria of the given principle along with the limitations of post-non-classical approaches to the humanities. Methods: The methodology involves the analysis of existing viewpoints, formalization of the characteristics of post-non-classical science, and reflection on the pedagogical principle of cultural conformity. Results: The research outcomes demonstrate that the gradual undermining of fundamental science results in the erosion of its methodological background. In the case of interdisciplinary subjects, a researcher is forced to integrate different methods and techniques, which provokes further disruption of the methodology. Scientific novelty: The author classifies and extrapolates to the humanities the main characteristics of post-non-classical science, and concludes that the quality of researchers' training is gradually declining due to the lack of methodological clarity and to aggressive forms of science vulgarization leading to the spontaneous development of a clipping methodology. Practical significance: Implementation of the research findings can activate both the theoretical and the methodological aspects of teacher training and self-education.

  17. Reliability of application of inspection procedures

    Energy Technology Data Exchange (ETDEWEB)

    Murgatroyd, R A

    1988-12-31

This document deals with the reliability of application of inspection procedures. A method is described for ensuring that fracture-mechanics-based defect inspection is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the possibility of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC). 3 refs.

  18. Reliability of application of inspection procedures

    International Nuclear Information System (INIS)

    Murgatroyd, R.A.

    1988-01-01

This document deals with the reliability of application of inspection procedures. A method is described for ensuring that fracture-mechanics-based defect inspection is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the possibility of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC)

  19. Principles of development of the industry of technogenic waste processing

    Directory of Open Access Journals (Sweden)

    Maria A. Bayeva

    2014-01-01

Objective: to identify and substantiate the principles of development of the industry of technogenic waste processing. Methods: systemic analysis, synthesis, and the method of analogy. Results: based on the analysis of Russian and foreign experience in the field of waste management and environmental protection, the basic principles for developing technogenic waste processing activities are formulated: the principle of legal regulation, the principle of efficient technologies, the principle of ecological safety, and the principle of economic support. The importance of each principle is substantiated by describing the situation in this area and identifying the main problems and ways of solving them. Scientific novelty: the fundamental principles of development of the industrial waste processing industry are revealed, and measures of state support are proposed. Practical value: the presented theoretical conclusions and proposals are aimed primarily at the theoretical and methodological substantiation of, and practical solutions to, modern problems in the development of the industry of technogenic waste processing.

  20. Philosophy of democracy and Principles of Democracy

    Directory of Open Access Journals (Sweden)

    Jarmila Chovancová

    2016-07-01

As the title suggests, the article deals with the problems of democracy, its philosophy and its dominant principles. The author reflects on interpretations of democracy in societies with differing understandings of it. Democracy represents a form of government, a way of political life in which these principles are put into practice. Democracy and its separate principles are expressed in the ultimate legal rules of democratic countries. The principle of participation, as a democratic principle, rests on the fact that citizens have the right to participate in state administration either directly or via their elected representatives. This principle also ensures that citizens participating in state administration enjoy equal basic rights and liberties, and guarantees that no person can be excluded from participation in state administration or from access to elected or other posts. Methodology: the article uses the method of analysis, examining the dominant problems of democracy and its principles in democratic countries; the method of comparison, understanding democracy from a historical perspective; and, finally, the method of synthesis, explaining how democracy is understood today.

  1. Engineering reliability in design phase: An application to AP-600 reactor passive safety system

    International Nuclear Information System (INIS)

    Majumdr, D.; Siahpush, A.S.; Hills, S.W.

    1992-01-01

A computerized reliability enhancement methodology is described that can be used in the engineering design phase to help the designer achieve a desired reliability of the system. It can take into account the limitation imposed by a constraint such as budget, space, or weight. If the desired reliability of the system is known, it can determine the minimum reliabilities of the components, or how many redundant components are needed to achieve the desired reliability. This methodology is applied to examine the Automatic Depressurization System (ADS) of the new passively safe AP-600 reactor. The safety goal of a nuclear reactor dictates a certain reliability level of its components. It is found that a series-parallel valve configuration, instead of the parallel-series configuration of the four valves in one stage, would improve the reliability of the ADS. Other valve characteristics and arrangements are explored to examine different reliability options for the system.
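The series-parallel versus parallel-series comparison for four identical valves can be sketched with elementary reliability algebra; the per-valve reliability used below is an illustrative assumption, not a value from the paper:

```python
# Reliability of four identical valves (per-valve reliability r) arranged two ways.
# "Parallel-series": two series branches of two valves, connected in parallel.
# "Series-parallel": two parallel pairs of valves, connected in series.
# r = 0.95 is illustrative only.

def parallel_series(r):
    branch = r * r                 # both valves of a series branch must work
    return 1 - (1 - branch) ** 2   # either branch working suffices

def series_parallel(r):
    pair = 1 - (1 - r) ** 2        # a parallel pair works if either valve works
    return pair * pair             # both pairs must work

r = 0.95
print(f"parallel-series: {parallel_series(r):.6f}")  # ≈ 0.990494
print(f"series-parallel: {series_parallel(r):.6f}")  # ≈ 0.995006
assert series_parallel(r) > parallel_series(r)
```

For any 0 < r < 1 the series-of-parallel-pairs arrangement is the more reliable of the two, which matches the finding reported for the ADS stage.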

  2. The origin of life and its methodological challenge.

    Science.gov (United States)

    Wächtershäuser, G

    1997-08-21

    The problem of the origin of life is discussed from a methodological point of view as an encounter between the teleological thinking of the historian and the mechanistic thinking of the chemist; and as the Kantian task of replacing teleology by mechanism. It is shown how the Popperian situational logic of historic understanding and the Popperian principle of explanatory power of scientific theories, when jointly applied to biochemistry, lead to a methodology of biochemical retrodiction, whereby common precursor functions are constructed for disparate successor functions. This methodology is exemplified by central tenets of the theory of the chemo-autotrophic origin of life: the proposal of a surface metabolism with a two-dimensional order; the basic polarity of life with negatively charged constituents on positively charged mineral surfaces; the surface-metabolic origin of phosphorylated sugar metabolism and nucleic acids; the origin of membrane lipids and of chemi-osmosis on pyrite surfaces; and the principles of the origin of the genetic machinery. The theory presents the early evolution of life as a process that begins with chemical necessity and winds up in genetic chance.

  3. Aviation Fuel System Reliability and Fail-Safety Analysis. Promising Alternative Ways for Improving the Fuel System Reliability

    Directory of Open Access Journals (Sweden)

    I. S. Shumilov

    2017-01-01

The paper deals with design requirements for an aviation fuel system (AFS): basic design requirements, reliability, and design precautions to avoid AFS failure. It compares the reliability and fail-safety of the AFS and the aircraft hydraulic system (AHS), considers promising alternative ways to raise the reliability of fuel systems, and elaborates recommendations to improve the reliability of pipeline system components and of pipeline systems in general, based on the selection of design solutions. It is extremely advisable to design the AFS and AHS in accordance with Aviation Regulations АП25 and the Accident Prevention Guidelines of ICAO (International Civil Aviation Organization), which will reduce the risk of emergency situations and in some cases even avoid heavy disasters. AFS and AHS designs should be based on uniform principles to ensure the highest reliability and safety. Currently, however, this principle is not sufficiently observed, and the AFS loses in reliability and fail-safety as compared with the AHS. For the examined failures (single failures and their combinations), the guidelines for ensuring AFS efficiency should be the same as those adopted in Regulations АП25 for the AHS. This would significantly increase the reliability and fail-safety of fuel systems, and of aircraft flights in general, despite a slight increase in AFS mass. The proposed improvements through redundancy of fuel system components will greatly raise the reliability of the fuel system of a passenger aircraft, which will, without serious consequences for the flight, withstand up to 2 failures; its reliability and fail-safety design will be similar to those of the AHS, although the above improvement measures will lead to a slightly increased total mass of the fuel system. It is advisable to set a second pump on the engine in parallel with the first one, to run in case the first one fails for some reason. The second pump, like the first pump, can be driven from the
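The claim that component redundancy lets a system withstand up to two failures can be sketched with a k-out-of-n reliability model; the component count and per-component reliability below are illustrative assumptions, not values from the paper:

```python
from math import comb

def k_out_of_n_reliability(n, k, r):
    """Probability that at least k of n identical, independent
    components (each with reliability r) are working."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

# Illustrative: 4 pumps of which any 2 suffice (tolerates up to 2 failures),
# compared with a single pump. Per-pump reliability r = 0.9 is assumed.
r = 0.9
single = r
redundant = k_out_of_n_reliability(4, 2, r)
print(f"single pump:       {single:.4f}")
print(f"2-of-4 redundancy: {redundant:.4f}")  # ≈ 0.9963
assert redundant > single
```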

  4. Population health management guiding principles to stimulate collaboration and improve pharmaceutical care.

    Science.gov (United States)

    Steenkamer, Betty; Baan, Caroline; Putters, Kim; van Oers, Hans; Drewes, Hanneke

    2018-04-09

    Purpose A range of strategies to improve pharmaceutical care has been implemented by population health management (PHM) initiatives. However, which strategies generate the desired outcomes is largely unknown. The purpose of this paper is to identify guiding principles underlying collaborative strategies to improve pharmaceutical care and the contextual factors and mechanisms through which these principles operate. Design/methodology/approach The evaluation was informed by a realist methodology examining the links between PHM strategies, their outcomes and the contexts and mechanisms by which these strategies operate. Guiding principles were identified by grouping context-specific strategies with specific outcomes. Findings In total, ten guiding principles were identified: create agreement and commitment based on a long-term vision; foster cooperation and representation at the board level; use layered governance structures; create awareness at all levels; enable interpersonal links at all levels; create learning environments; organize shared responsibility; adjust financial strategies to market contexts; organize mutual gains; and align regional agreements with national policies and regulations. Contextual factors such as shared savings influenced the effectiveness of the guiding principles. Mechanisms by which these guiding principles operate were, for instance, fostering trust and creating a shared sense of the problem. Practical implications The guiding principles highlight how collaboration can be stimulated to improve pharmaceutical care while taking into account local constraints and possibilities. The interdependency of these principles necessitates effectuating them together in order to realize the best possible improvements and outcomes. Originality/value This is the first study using a realist approach to understand the guiding principles underlying collaboration to improve pharmaceutical care.

  5. An analysis of the human reliability on Three Mile Island II accident considering THERP and ATHEANA methodologies

    International Nuclear Information System (INIS)

    Fonseca, Renato Alves da; Alvim, Antonio Carlos Marques

    2005-01-01

Research on the analysis of human reliability becomes more important every day, as does the study of human factors and their contribution to incidents and accidents, mainly in complex or high-technology plants. The analysis developed here uses the methodologies THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Event Analysis), as well as the tables and cases presented in the THERP Handbook, to develop a qualitative and quantitative study of a nuclear accident that actually occurred. The chosen accident was that of Three Mile Island (TMI). The accident analysis has revealed a series of incorrect actions that resulted in the permanent loss of the reactor and the shutdown of Unit 2. This study also aims at enhancing the understanding of the THERP and ATHEANA methods and at practical applications. In addition, it makes it possible to understand the influence of plant operational status on human failures, and the influence of human failures on the equipment of a system, in this case a nuclear power plant. (author)
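For context, THERP quantifies conditional human error probabilities (HEPs) with the standard dependence model of the THERP Handbook (NUREG/CR-1278), which adjusts a task's nominal HEP given failure of the preceding task; the nominal HEP below is an illustrative assumption, not a TMI-specific value:

```python
# THERP dependence model: conditional HEP of task N given failure of task N-1,
# for the five standard dependence levels. The nominal HEP p is illustrative.

def conditional_hep(nominal_hep, dependence):
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[dependence](nominal_hep)

p = 0.003  # illustrative nominal HEP
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s}: {conditional_hep(p, level):.4f}")
```

Even a small nominal HEP rises sharply under high or complete dependence, which is why sequences of coupled operator actions, as at TMI, dominate the quantification.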

  6. Reliability engineering for nuclear and other high technology systems

    International Nuclear Information System (INIS)

    Lakner, A.A.; Anderson, R.T.

    1985-01-01

    This book is written for the reliability instructor, program manager, system engineer, design engineer, reliability engineer, nuclear regulator, probability risk assessment (PRA) analyst, general manager and others who are involved in system hardware acquisition, design and operation and are concerned with plant safety and operational cost-effectiveness. It provides criteria, guidelines and comprehensive engineering data affecting reliability; it covers the key aspects of system reliability as it relates to conceptual planning, cost tradeoff decisions, specification, contractor selection, design, test and plant acceptance and operation. It treats reliability as an integrated methodology, explicitly describing life cycle management techniques as well as the basic elements of a total hardware development program, including: reliability parameters and design improvement attributes, reliability testing, reliability engineering and control. It describes how these elements can be defined during procurement, and implemented during design and development to yield reliable equipment. (author)

  7. Prime implicants in dynamic reliability analysis

    International Nuclear Information System (INIS)

    Tyrväinen, Tero

    2016-01-01

    This paper develops an improved definition of a prime implicant for the needs of dynamic reliability analysis. Reliability analyses often aim to identify minimal cut sets or prime implicants, which are minimal conditions that cause an undesired top event, such as a system's failure. Dynamic reliability analysis methods take the time-dependent behaviour of a system into account. This means that the state of a component can change in the analysed time frame and prime implicants can include the failure of a component at different time points. There can also be dynamic constraints on a component's behaviour. For example, a component can be non-repairable in the given time frame. If a non-repairable component needs to be failed at a certain time point to cause the top event, we consider that the condition that it is failed at the latest possible time point is minimal, and the condition in which it fails earlier non-minimal. The traditional definition of a prime implicant does not account for this type of time-related minimality. In this paper, a new definition is introduced and illustrated using a dynamic flowgraph methodology model. - Highlights: • A new definition of a prime implicant is developed for dynamic reliability analysis. • The new definition takes time-related minimality into account. • The new definition is needed in dynamic flowgraph methodology. • Results can be represented by a smaller number of prime implicants.
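The time-related minimality rule described above can be sketched as a reduction over candidate implicants; the set-of-literals representation and the data are illustrative assumptions, not the paper's formalism:

```python
# Sketch of time-related minimality: for a non-repairable component,
# "fails at time t" implies "is failed at every later time", so among
# implicants that differ only in that component's failure time, only the
# one with the latest failure time is minimal. Data are illustrative.

def reduce_by_latest_failure(implicants, nonrepairable):
    """implicants: set of frozensets of (component, fail_time) literals."""
    minimal = set(implicants)
    for a in implicants:
        for b in implicants:
            if a is b:
                continue
            diff_a, diff_b = a - b, b - a
            # b is non-minimal if it matches a except that a non-repairable
            # component fails earlier in b than in a.
            if len(diff_a) == 1 and len(diff_b) == 1:
                (ca, ta), = diff_a
                (cb, tb), = diff_b
                if ca == cb and ca in nonrepairable and tb < ta:
                    minimal.discard(b)
    return minimal

implicants = {
    frozenset({("valve", 2), ("pump", 1)}),
    frozenset({("valve", 1), ("pump", 1)}),  # valve fails earlier: non-minimal
}
result = reduce_by_latest_failure(implicants, nonrepairable={"valve"})
print(result)  # only the implicant with ("valve", 2) remains
```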

  8. Study on the Management for the Nuclear Power Plant Maintenance and Equipment Reliability

    International Nuclear Information System (INIS)

    Yoon, Kyeongseop; Lee, Sangheon; Kim, Myungjin; Lee, Unjang

    2015-01-01

In our country, many studies on the regulatory policy of plant maintenance have been performed since 1998, but the relevant regulatory requirements have not yet been established. This background calls for a study of regulation policy and maintenance planning to improve the safety, reliability and efficiency of NPPs. To address these problems, in this study we derive a management methodology for the improvement of NPP maintenance and equipment reliability, which is essential to secure the safety and efficiency of commercial NPPs. To analyse maintenance and equipment reliability management methodologies at overseas NPPs, we studied the maintenance and equipment reliability practices of the USA, Canada and Europe (France, England, Germany). We also studied the status and application of Korean NPP maintenance management technical development. By comparing the technical trends in maintenance management between overseas and Korean NPPs, we derived an effective maintenance methodology needed for Korean NPPs, as follows: - Regulation form: specific provision of regulatory requirements and adoption of a form that clarifies the application standard - Maintenance management methodology and maintenance management program. The results of this study could be applied to establishing regulation policy, laws and guidelines for NPP maintenance, operation and supervision, to establishing a system for maintenance management, and to education material on maintenance for NPP employees

  9. Guide for generic application of Reliability Centered Maintenance (RCM) recommendations

    International Nuclear Information System (INIS)

    Schwan, C.A.; Toomey, G.E.; Morgan, T.A.; Darling, S.S.

    1991-02-01

    Previously completed reliability centered maintenance (RCM) studies form the basis for developing or refining a preventive maintenance program. This report describes a generic methodology that will help utilities optimize nuclear plant maintenance programs using RCM techniques. This guide addresses the following areas: history of the generic methodology development process, and use of the generic methodology for conducting system-to-system and component-to-component evaluations. 2 refs., 2 figs., 5 tabs

  10. First-principles investigations of solid solution strengthening in Al alloys

    OpenAIRE

    Ma, Duancheng

    2012-01-01

Any material property can, in principle, be reproduced or predicted by performing first-principles calculations. Nowadays, however, we are dealing with complex alloy compositions and processes. These complexities cannot be fully described by first principles because of limited computational power. The primary objective of this study is to investigate an important engineering problem, solid solution strengthening, in a simplified manner. The simplified scheme should allow fast and reliable...

  11. Designing MOOC: a shared view on didactical principles

    NARCIS (Netherlands)

    Stoyanov, Slavi; De Vries, Fred

    2018-01-01

The innovative impact of the paper can be highlighted by the following statements: 1. Applying the Group Concept Mapping, a non-traditional and powerful research methodology for objectively identifying the shared vision of a group of experts on MOOC didactical principles. 2. Defining MOOC didactical

  12. Development of a methodology for conducting an integrated HRA/PRA --

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. (Brookhaven National Lab., Upton, NY (United States)); Wreathall, J. (Wreathall (John) and Co., Dublin, OH (United States)); Cooper, S.E. (Science Applications International Corp., McLean, VA (United States))

    1993-01-01

During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  13. Methodological issues concerning the application of reliable laser particle sizing in soils

    Science.gov (United States)

    de Mascellis, R.; Impagliazzo, A.; Basile, A.; Minieri, L.; Orefice, N.; Terribile, F.

    2009-04-01

During the past decade, the evolution of technologies has enabled laser diffraction (LD) to become a widespread means of particle size distribution (PSD) measurement, replacing sedimentation and sieve analysis in many scientific fields, mainly due to its advantages of versatility, fast measurement and high reproducibility. Despite these developments, the soil science community has been quite reluctant to replace the good old sedimentation techniques (ST), possibly because of (i) the large complexity of the soil matrix, inducing different types of artefacts (aggregates, deflocculating dynamics, etc.), (ii) the difficulty of relating LD results to results obtained through sedimentation techniques, and (iii) the limited size range of most LD equipment. More recently, LD granulometry has slowly been gaining appreciation in soil science, also because of some innovations, including an enlarged dynamic size range (0.01-2000 μm) and the ability to implement more powerful algorithms (e.g. Mie theory). Furthermore, LD PSD can be successfully used in the application of physically based pedo-transfer functions (e.g., the Arya and Paris model) for investigations of soil hydraulic properties, due to the direct determination of PSD in terms of volume percentage rather than mass percentage, thus eliminating the need to adopt the rough approximation of a single value for soil particle density in the prediction process. Most of the recent LD work performed in soil science deals with the comparison with sedimentation techniques and shows a general overestimation of the silt fraction accompanied by a general underestimation of the clay fraction; these well known results must be related to the different physical principles behind the two techniques. Despite these efforts, it is indeed surprising that little if any work is devoted to more basic methodological issues related to the high sensitivity of LD to the quantity and the quality of the soil samples. Our work aims to

  14. The reliability of physical examination tests for the diagnosis of anterior cruciate ligament rupture--A systematic review.

    Science.gov (United States)

    Lange, Toni; Freiberg, Alice; Dröge, Patrik; Lützner, Jörg; Schmitt, Jochen; Kopkow, Christian

    2015-06-01

Systematic literature review. Despite their frequent application in routine care, a systematic review on the reliability of clinical examination tests to evaluate the integrity of the ACL is missing. To summarize and evaluate intra- and interrater reliability research on physical examination tests used for the diagnosis of ACL tears. A comprehensive systematic literature search was conducted in MEDLINE, EMBASE and AMED until May 30th 2013. Studies were included if they assessed the intra- and/or interrater reliability of physical examination tests for the integrity of the ACL. Methodological quality was evaluated with the Quality Appraisal of Reliability Studies (QAREL) tool by two independent reviewers. The search yielded 110 hits, of which seven articles finally met the inclusion criteria. These studies examined the reliability of four physical examination tests. Intrarater reliability was assessed in three studies and ranged from fair to almost perfect (Cohen's κ = 0.22-1.00). Interrater reliability was assessed in all included studies and ranged from slight to almost perfect (Cohen's κ = 0.02-0.81). The Lachman test is the physical test with the highest intrarater reliability (Cohen's κ = 1.00); the Lachman test performed in prone position is the test with the highest interrater reliability (Cohen's κ = 0.81). Included studies were partly of low methodological quality. A meta-analysis could not be performed due to the heterogeneity in study populations, reliability measures and methodological quality of included studies. Systematic investigations on the reliability of physical examination tests to assess the integrity of the ACL are scarce and of varying methodological quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
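The Cohen's kappa statistic reported throughout this review can be computed directly from two raters' verdicts; the ratings below are illustrative, not data from the included studies:

```python
# Cohen's kappa: chance-corrected agreement between two raters,
# kappa = (p_observed - p_expected) / (1 - p_expected).
# The example ratings (e.g., Lachman test positive/negative) are illustrative.

def cohens_kappa(rater1, rater2):
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_expected = sum(
        (rater1.count(c) / n) * (rater2.count(c) / n) for c in categories
    )
    return (p_observed - p_expected) / (1 - p_expected)

r1 = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg"]
r2 = ["pos", "pos", "neg", "pos", "pos", "neg", "pos", "neg"]
print(f"kappa = {cohens_kappa(r1, r2):.3f}")  # kappa = 0.750
```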

  15. Principles for statistical inference on big spatio-temporal data from climate models

    KAUST Repository

    Castruccio, Stefano; Genton, Marc G.

    2018-01-01

    The vast increase in size of modern spatio-temporal datasets has prompted statisticians working in environmental applications to develop new and efficient methodologies that are still able to achieve inference for nontrivial models within an affordable time. Climate model outputs push the limits of inference for Gaussian processes, as their size can easily be larger than 10 billion data points. Drawing from our experience in a set of previous work, we provide three principles for the statistical analysis of such large datasets that leverage recent methodological and computational advances. These principles emphasize the need of embedding distributed and parallel computing in the inferential process.

  16. Principles for statistical inference on big spatio-temporal data from climate models

    KAUST Repository

    Castruccio, Stefano

    2018-02-24

    The vast increase in size of modern spatio-temporal datasets has prompted statisticians working in environmental applications to develop new and efficient methodologies that are still able to achieve inference for nontrivial models within an affordable time. Climate model outputs push the limits of inference for Gaussian processes, as their size can easily be larger than 10 billion data points. Drawing from our experience in a set of previous work, we provide three principles for the statistical analysis of such large datasets that leverage recent methodological and computational advances. These principles emphasize the need of embedding distributed and parallel computing in the inferential process.

  17. Power system reliability memento; Memento de la surete du systeme electrique

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

The reliability memento of the French power system (national power transmission grid) is an educational document whose purpose is to point out the role of each party as regards power system operating reliability. This memento was first published in 1999; extensive changes have taken place since then. The new 2002 edition shows that system operating reliability is as important a subject as ever: 1 - foreword; 2 - system reliability: the basics; 3 - equipment measures taken in order to guarantee the reliability of the system; 4 - organisational and human measures taken to guarantee the reliability of the system; appendix 1 - system operation: basic concepts; appendix 2 - guiding principles governing the reliability of the power system; appendix 3 - international associations of transmission system operators; appendix 4 - description of major incidents.

  18. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

Purpose: This article mainly describes the reliability of cutting tools in high speed turning using a normal distribution model. Design/methodology/approach: A series of experimental tests was carried out to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, the tool life distribution and the reliability function of the cutting tools are derived, and, further, the reliability of cutting tools at any time for h...
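A normal tool-life model of the kind described yields the reliability function R(t) = 1 - Φ((t - μ)/σ), the probability that tool life exceeds cutting time t; the mean life and standard deviation below are illustrative assumptions, not the paper's data:

```python
# Tool reliability under a normal tool-life model. The mean life and
# standard deviation are illustrative, not experimental values.
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def tool_reliability(t, mean_life, std_life):
    """P(tool life > t) = 1 - Phi((t - mu) / sigma)."""
    return 1 - normal_cdf((t - mean_life) / std_life)

mean_life, std_life = 30.0, 5.0   # minutes, assumed
for t in (20, 30, 40):
    print(f"R({t} min) = {tool_reliability(t, mean_life, std_life):.4f}")
```

At t = μ the reliability is exactly 0.5, and it falls off symmetrically around the mean life, which is the characteristic shape of a normal tool-life model.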

  19. RAVONSICS-challenging for assuring software reliability of nuclear I and C system

    International Nuclear Information System (INIS)

    Hai Zeng; Ming Yang; Yoshikawa, Hidekazu

    2015-01-01

As the "central nervous system" of a plant, highly reliable Instrumentation and Control (I and C) systems, which provide the right functions and function correctly, are always desirable not only for the end users of NPPs but also for the suppliers of I and C systems. The digitalization of nuclear I and C systems in recent years has brought many new features. On one side, digital technology provides more functionality and should be more reliable and robust; on the other side, it brings new challenges for nuclear I and C systems, especially for the software running on the hardware components. The software provides flexible functionality for nuclear I and C systems, but its complexity also makes its reliability and safety difficult to evaluate. The reliability of software, which is an indispensable part of an I and C system, has an essential impact on the reliability of the whole system, and people definitely want to know what the reliability of this intangible part is. The methods used for evaluating the reliability of systems and hardware hardly work for software, because of the inherent difference in failure mechanisms between software and hardware: failure in software is systematically induced by design error, whereas failure in hardware is randomly induced by material and production. To continue the effort on this hot topic and to try to achieve consensus on a potential methodology for software reliability evaluation, a cooperative research project called RAVONSICS (Reliability and Verification and Validation of Nuclear Safety I and C Software) is being carried out by 7 Chinese partners, including a university, a research institute, a utility, a vendor, and the safety regulatory body. The objective of RAVONSICS is to bring forward a methodology for software reliability evaluation, together with software verification techniques. RAVONSICS works cooperatively with its European sister project

  20. Assessing Changes in the Reliability of the U.S. Electric Power System

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Peter H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stanford Univ., CA (United States). Dept. of Physics; LaCommare, Kristina H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Eto, Joseph H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sweeney, James L. [Stanford Univ., CA (United States)

    2015-08-01

    Over the past 15 years, the most well-publicized efforts to assess trends in U.S. electric power system reliability have focused only on a subset of all power interruption events (see, for example, Amin 2008 and Campbell 2012)—namely, only the very largest events, which trigger immediate emergency reporting to federal agencies and industry regulators. Anecdotally, these events are thought by many to represent no more than 10% of the power interruptions experienced annually by electricity consumers. Moreover, a review of these emergency reports has identified shortcomings in relying on these data as reliable sources for assessing trends, even with the reliability events they report (Fisher et al. 2012). Recent work has begun to address these limitations by examining trends in reliability data collected annually by electricity distribution companies (Eto et al. 2012). In principle, all power interruptions experienced by electricity customers, regardless of size, are recorded by the distribution utility. Moreover, distribution utilities have a long history of recording this information, often in response to mandates from state public utility commissions (Eto et al. 2006). Thus, studies that rely on reliability data collected by distribution utilities can, in principle, provide a more complete basis upon which to assess trends or changes in reliability over time.

  1. Maintenance management of railway infrastructures based on reliability analysis

    International Nuclear Information System (INIS)

    Macchi, Marco; Garetti, Marco; Centrone, Domenico; Fumagalli, Luca; Piero Pavirani, Gian

    2012-01-01

    Railway infrastructure maintenance plays a crucial role in rail transport. It aims at guaranteeing the safety of operations and the availability of railway tracks and related equipment for traffic regulation. Moreover, it is a major cost of rail transport operations. Thus, increased competition in the traffic market calls for maintenance improvement, aiming at the reduction of maintenance expenditures while preserving the safety of operations. This issue is addressed by the methodology presented in the paper. The first step of the methodology consists of a family-based approach to equipment reliability analysis; its purpose is the identification of families of railway items which can be given the same reliability targets. The second step builds the reliability model of the railway system in order to identify the most critical items, given a required service level for the transportation system. The two methods have been implemented and tested in practical case studies, in the context of Rete Ferroviaria Italiana, the Italian public limited company for railway transportation.

  2. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

    This paper discusses a new technique developed for generating well-defined RF large-voltage-swing signals for on-wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  3. Tailoring a Human Reliability Analysis to Your Industry Needs

    Science.gov (United States)

    DeMott, D. L.

    2016-01-01

    Industries at risk of accidents caused by human error that result in catastrophic consequences include: airline mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training and procedures, 4) type and availability of data, 5) how the industry views risk and reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determining the primary risk factors contributing to potential human error, a more detailed analysis method may be employed.

  4. Predicting Cost/Reliability/Maintainability of Advanced General Aviation Avionics Equipment

    Science.gov (United States)

    Davis, M. R.; Kamins, M.; Mooz, W. E.

    1978-01-01

    A methodology is provided for assisting NASA in estimating the cost, reliability, and maintenance (CRM) requirements for general aviation avionics equipment operating in the 1980's. Practical problems of predicting these factors are examined. The usefulness and shortcomings of different approaches for modeling cost and reliability estimates are discussed, together with special problems caused by the lack of historical data on the cost of maintaining general aviation avionics. Suggestions are offered on how NASA might proceed in assessing CRM implications in the absence of reliable generalized predictive models.

  5. Basic safety principles for nuclear power plants

    International Nuclear Information System (INIS)

    1988-01-01

    Nuclear power plant safety requires a continuing quest for excellence. All individuals concerned should constantly be alert to opportunities to reduce risks to the lowest practicable level. The quest, however, is most likely to be fruitful if it is based on an understanding of the underlying objectives and principles of nuclear safety, and the way in which its aspects are interrelated. This report is an attempt to provide a logical framework for such an understanding. The proposed objectives and principles of nuclear safety are interconnected and must be taken as a whole; they do not constitute a menu from which selection can be made. The report takes account of current issues and developments. It includes the concept of safety objectives and the use of probabilistic safety assessment. Reliability targets for safety systems are discussed. The concept of a 'safety culture' is crucial. Attention has been paid to the need for planning for accident management. The report contains objectives and principles. The objectives state what is to be achieved; the principles state how to achieve it. In each case, the basic principle is stated as briefly as possible. The accompanying discussion comments on the reasons for the principle and its importance, as well as exceptions, the extent of coverage and any necessary clarification. The discussion is as important as the principle it augments. 4 figs

  6. Reliability analysis of maintenance operations for railway tracks

    International Nuclear Information System (INIS)

    Rhayma, N.; Bressolette, Ph.; Breul, P.; Fogli, M.; Saussine, G.

    2013-01-01

    Railway engineering is confronted with problems due to degradation of the railway network, which requires important and costly maintenance work. However, because of the lack of knowledge of the geometrical and mechanical parameters of the track, it is difficult to optimize maintenance management. In this context, this paper presents a new methodology to analyze the behavior of railway tracks. It combines new diagnostic devices, which make it possible to obtain a large amount of data and thus to compute statistics on the geometric and mechanical parameters, with a non-intrusive stochastic approach which can be coupled with any mechanical model. Numerical results show the possibilities of this methodology for the reliability analysis of different maintenance operations. In the future this approach will give railway managers important information for optimizing maintenance operations using a reliability analysis
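
    A non-intrusive stochastic approach of the kind described wraps plain Monte Carlo sampling around any deterministic mechanical model. The sketch below is illustrative only: the toy settlement model, the lognormal stiffness distribution, and the settlement limit are all invented, not taken from the paper:

```python
import math
import random

def settlement_mm(stiffness_mn_per_m: float, load_kn: float = 170.0) -> float:
    """Toy deterministic track model: settlement inversely proportional to stiffness."""
    return load_kn / stiffness_mn_per_m

def failure_probability(n_samples: int = 20_000, limit_mm: float = 2.5) -> float:
    """Non-intrusive Monte Carlo: sample the uncertain stiffness parameter
    (lognormal, as if fitted to diagnostic-device statistics) and count
    exceedances of the settlement limit."""
    random.seed(42)  # reproducible sketch
    failures = 0
    for _ in range(n_samples):
        stiffness = math.exp(random.gauss(math.log(80.0), 0.25))  # MN/m
        if settlement_mm(stiffness) > limit_mm:
            failures += 1
    return failures / n_samples

print(failure_probability())  # estimated P(settlement exceeds the limit)
```

    Because the sampling loop only calls the mechanical model as a black box, the same scheme couples with any track model, which is the point of the "non-intrusive" approach.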

  7. Islam and the four principles of medical ethics.

    Science.gov (United States)

    Mustafa, Yassar

    2014-07-01

    The principles underpinning Islam's ethical framework applied to routine clinical scenarios remain insufficiently understood by many clinicians, thereby unfortunately permitting the delivery of culturally insensitive healthcare. This paper summarises the foundations of Islamic ethical theory, elucidating the principles and methodology employed by the Muslim jurist in deriving rulings in the field of medical ethics. The four-principles approach, as espoused by Beauchamp and Childress, is also interpreted through the prism of Islamic ethical theory. Each of the four principles (beneficence, nonmaleficence, justice and autonomy) is investigated in turn, looking in particular at the extent to which each is rooted in the Islamic paradigm. This will provide an important insight into Islamic medical ethics, enabling the clinician to have a better informed discussion with the Muslim patient. It will also allow for a higher degree of concordance in consultations and consequently optimise culturally sensitive healthcare delivery.

  8. Thermodynamic assessment of the Sn–Sr system supported by first-principles calculations

    International Nuclear Information System (INIS)

    Zhao, Jingrui; Du, Yong; Zhang, Lijun; Wang, Aijun; Zhou, Liangcai; Zhao, Dongdong; Liang, Jianlie

    2012-01-01

    Highlights: ► All the literature data on the Sn–Sr system are critically reviewed. ► First-principles calculations of the enthalpy of formation are carried out for each compound. ► Thermodynamic parameters for the Sn–Sr system are obtained by the CALPHAD method. ► A hybrid approach of CALPHAD and first-principles calculations is recommended. - Abstract: A hybrid approach of CALPHAD and first-principles calculations was employed to perform a thermodynamic modeling of the Sn–Sr system. The experimental phase diagram and thermodynamic data available in the literature were critically reviewed. The enthalpies of formation for the 6 stoichiometric compounds (i.e. Sr2Sn, Sr5Sn3, SrSn, Sr3Sn5, SrSn3 and SrSn4) at 0 K were computed by means of first-principles calculations. These data were used in place of experimental values in the optimization module PARROT in the subsequent CALPHAD assessment to provide thermodynamic parameters with sound physical meaning. A set of self-consistent thermodynamic parameters was finally obtained by considering reliable literature data and the first-principles computed results. Comprehensive comparisons between the calculated and measured quantities indicate that all the reliable experimental information can be satisfactorily accounted for by the present thermodynamic description.
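
    Once first-principles total energies are in hand, the enthalpy of formation fed into a CALPHAD assessment reduces to a per-atom energy difference against the pure reference phases. The energies below are placeholders for illustration, not the values computed in the paper:

```python
def formation_enthalpy_per_atom(e_compound: float, n_sr: int, n_sn: int,
                                e_sr: float, e_sn: float) -> float:
    """dH_f = [E(Sr_x Sn_y) - x*E(Sr) - y*E(Sn)] / (x + y), all energies in eV,
    with E(Sr) and E(Sn) the per-atom energies of the pure reference phases."""
    return (e_compound - n_sr * e_sr - n_sn * e_sn) / (n_sr + n_sn)

# Placeholder DFT total energies in eV (NOT the paper's values): SrSn formula unit.
dh = formation_enthalpy_per_atom(e_compound=-10.0, n_sr=1, n_sn=1,
                                 e_sr=-1.6, e_sn=-3.8)
print(dh)  # eV/atom; negative means formation is energetically favourable
```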

  9. Evaluation of validity and reliability of a methodology for measuring human postural attitude and its relation to temporomandibular joint disorders

    Science.gov (United States)

    Fernández, Ramón Fuentes; Carter, Pablo; Muñoz, Sergio; Silva, Héctor; Venegas, Gonzalo Hernán Oporto; Cantin, Mario; Ottone, Nicolás Ernesto

    2016-01-01

    INTRODUCTION Temporomandibular joint disorders (TMJDs) are caused by several factors such as anatomical, neuromuscular and psychological alterations. A relationship has been established between TMJDs and postural alterations, a type of anatomical alteration. An anterior position of the head requires hyperactivity of the posterior neck region and shoulder muscles to prevent the head from falling forward. This compensatory muscular function may cause fatigue, discomfort and trigger point activation. To our knowledge, a method for assessing human postural attitude in more than one plane has not been reported. Thus, the aim of this study was to design a methodology to measure the external human postural attitude in frontal and sagittal planes, with proper validity and reliability analyses. METHODS The variable postures of 78 subjects (36 men, 42 women; age 18–24 years) were evaluated. The postural attitudes of the subjects were measured in the frontal and sagittal planes, using an acromiopelvimeter, grid panel and Fox plane. RESULTS The method we designed for measuring postural attitudes had adequate reliability and validity, both qualitatively and quantitatively, based on Cohen’s Kappa coefficient (> 0.87) and Pearson’s correlation coefficient (r = 0.824, > 80%). CONCLUSION This method exhibits adequate metrical properties and can therefore be used in further research on the association of human body posture with skeletal types and TMJDs. PMID:26768173
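
    The reliability statistic reported above, Cohen's kappa, corrects raw agreement between two examiners for agreement expected by chance. A minimal computation, with invented postural classifications rather than the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    over the same set of categorically rated subjects."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Invented postural classifications from two hypothetical examiners.
a = ["normal", "normal", "forward", "normal", "forward", "normal"]
b = ["normal", "normal", "forward", "forward", "forward", "normal"]
print(round(cohens_kappa(a, b), 3))
```

    Values above roughly 0.8, such as the > 0.87 the study reports, are conventionally read as almost perfect agreement.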

  10. The development on the methodology of the initiating event frequencies for liquid metal reactor KALIMER

    International Nuclear Information System (INIS)

    Jeong, K. S.; Yang, Z. A.; Ah, Y. B.; Jang, W. P.; Jeong, H. Y.; Ha, K. S.; Han, D. H.

    2002-01-01

    In this paper, the PSA methodologies of PRISM, light water reactors, and pressurized heavy water reactors are analyzed, and a methodology for the initiating events of KALIMER is suggested. In addition, a reliability assessment of the assumptions behind pipe corrosion frequency is set up. The reliability assessment of the passive safety system, one of the main safety systems of KALIMER, is discussed and analyzed

  11. NEW CO-EVOLUTION STRATEGIES OF THIRD MILLENNIUM; METHODOLOGICAL ASPECT

    Directory of Open Access Journals (Sweden)

    E. K. Bulygo

    2006-01-01

    The paper is devoted to an application of the co-evolution methodology to the social space. Principles of instability and non-linearity that are typical for contemporary natural science are used as a theoretical background of a new social methodology. The authors try to prove that the co-evolution strategy has a long pre-history in ancient oriental philosophy and manifests itself in forms of modern culture

  12. Implementation of corporate governance principles in Romania

    Directory of Open Access Journals (Sweden)

    Ramona Iulia Țarțavulea (Dieaconescu)

    2014-12-01

    The paper aims to conduct a study regarding the manner in which corporate governance principles are applied in Romania, in both the public and the private sector. In the first part of the paper, the corporate governance principles are presented as they are defined in Romania, in comparison with the main international sources of interest in the domain (OECD corporate governance principles, EU legal framework). The corporate governance (CG) principles refer to issues regarding board composition and transparency of scope, objectives and policies; they define the relations between directors and managers, shareholders and stakeholders. The research methodology is based on both fundamental research and an empirical study on the implementation of corporate governance principles in companies from Romania. The main instrument of research is a corporate governance index, calculated based on a framework proposed by the author. The corporate governance principles are transposed into criteria that compose the framework for the CG index. The results of the study consist of scores for each CG principle and the calculation of the CG index for seven companies selected from the public and private sectors in Romania. The results are analyzed and discussed in order to formulate general and particular recommendations. The main conclusion of this study is that a legal framework in the area of corporate governance regulation is needed in Romania. I consider that the main CG principles should be enforced by developing a mandatory legal framework.

  13. Improving patient safety: patient-focused, high-reliability team training.

    Science.gov (United States)

    McKeon, Leslie M; Cunningham, Patricia D; Oswaks, Jill S Detty

    2009-01-01

    Healthcare systems are recognizing "human factor" flaws that result in adverse outcomes. Nurses work around system failures, although increasing healthcare complexity makes this harder to do without risk of error. Aviation and military organizations achieve ultrasafe outcomes through high-reliability practice. We describe how reliability principles were used to teach nurses to improve patient safety at the front line of care. Outcomes include safety-oriented, teamwork communication competency; reflections on safety culture and clinical leadership are discussed.

  14. Reliability assessment using Bayesian networks. Case study on quantitative reliability estimation of a software-based motor protection relay

    International Nuclear Information System (INIS)

    Helminen, A.; Pulkkinen, U.

    2003-06-01

    In this report a quantitative reliability assessment of the motor protection relay SPAM 150 C has been carried out. The assessment focuses on the methodological analysis of quantitative reliability assessment, using the software-based motor protection relay as a case study. The assessment method is based on Bayesian networks and tries to take full advantage of the previous work done in a project called Programmable Automation System Safety Integrity assessment (PASSI). From the results and experience achieved during the work it is justified to claim that the assessment method presented here enables a flexible use of qualitative and quantitative elements of reliability-related evidence in a single reliability assessment. At the same time the assessment method provides a coherent way of reasoning about one's beliefs and inferences concerning the reliability of the system. Full advantage of the assessment method is taken when using it as a way to cultivate the information related to the reliability of software-based systems. The method can also be used as a communication instrument in the licensing process of software-based systems. (orig.)
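
    The report's full Bayesian network is not reproduced in the abstract, but its core quantitative step, updating a prior belief about failure probability with observed evidence, can be sketched with a single conjugate Beta-Binomial node. The prior and the operating history below are hypothetical:

```python
def beta_update(alpha: float, beta: float, failures: int, demands: int):
    """Conjugate Bayesian update for a probability-of-failure-on-demand node:
    Beta(alpha, beta) prior + Binomial evidence -> Beta posterior."""
    return alpha + failures, beta + (demands - failures)

def beta_mean(alpha: float, beta: float) -> float:
    """Posterior point estimate: mean of the Beta distribution."""
    return alpha / (alpha + beta)

# Hypothetical prior encoding qualitative evidence: Beta(1, 99), mean pfd = 0.01.
alpha0, beta0 = 1.0, 99.0
# Hypothetical operating history: 0 failures in 100 protection demands.
alpha1, beta1 = beta_update(alpha0, beta0, failures=0, demands=100)
print(beta_mean(alpha0, beta0), beta_mean(alpha1, beta1))  # 0.01 -> 0.005
```

    The appeal for software-based systems is exactly what the abstract claims: qualitative evidence (process quality, V&V results) shapes the prior, while quantitative evidence (demands, failures) drives the update, inside one model.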

  15. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) within a probabilistic safety assessment (PSA) includes identifying human actions from the safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  16. Study of methodology diversification in diagnostics

    International Nuclear Information System (INIS)

    Suda, Kazunori; Yonekawa, Tsuyoshi; Yoshikawa, Shinji; Hasegawa, Makoto

    1999-03-01

    There are several research activities aimed at enhancing the safety and reliability of nuclear power plant operation and maintenance. We are developing a concept of an autonomous operation system in which the role of operators is replaced by artificial intelligence. The purpose of the study described in this report is to develop an operator support system for abnormal plant situations. Conventionally, diagnostic modules based on individual methodologies such as expert systems have been developed and verified. In this report, methodology diversification is considered in order to integrate diagnostic modules whose performance has been confirmed using information processing techniques. Technical issues to be considered in diagnostic methodology diversification are: 1) reliability of input data, 2) diversification of knowledge models, algorithms and reasoning schemes, and 3) mutual complementarity and robustness. The diagnostic modules utilizing the different approaches defined along with the strategy of diversification were evaluated using a fast breeder plant simulator. As a result, we confirmed that no single diagnostic module can meet the accuracy criteria for the entire set of anomaly events. In contrast, we confirmed that every abnormality could be precisely diagnosed by a mutual combination of modules. In other words, the legitimacy of the approach selection driven by the strategy of diversification was shown, and methodology diversification attained clear efficiency for abnormality diagnosis. It has also been confirmed that the diversified diagnostic system implemented in this study is able to maintain its accuracy even when the encountered scale of abnormality differs from the reference cases embedded in the knowledge base. (author)

  17. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to attempt an integration between probabilistic and psychological approaches to human reliability. This is achieved first by adopting methods that adequately reflect the essential features of the process control activity, and secondly by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context support the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints on activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis

  18. Intelligent instrumentation principles and applications

    CERN Document Server

    Bhuyan, Manabendra

    2011-01-01

    With the advent of microprocessors and digital-processing technologies as catalyst, classical sensors capable of simple signal conditioning operations have evolved rapidly to take on higher and more specialized functions including validation, compensation, and classification. This new category of sensor expands the scope of incorporating intelligence into instrumentation systems, yet with such rapid changes, there has developed no universal standard for design, definition, or requirement with which to unify intelligent instrumentation. Explaining the underlying design methodologies of intelligent instrumentation, Intelligent Instrumentation: Principles and Applications provides a comprehensive and authoritative resource on the scientific foundations from which to coordinate and advance the field. Employing a textbook-like language, this book translates methodologies to more than 80 numerical examples, and provides applications in 14 case studies for a complete and working understanding of the material. Beginn...

  19. Disclosure and transparency in the public sector: an analysis of convergence of the principles of governance

    Directory of Open Access Journals (Sweden)

    Luzia Zorzal

    2015-09-01

    Introduction: Studies on disclosure by private institutions are common, but the same does not occur with public institutions, where disclosure of management information is still very limited. Purpose: The article is part of doctoral research in Information Science in progress and investigates the principles of disclosure and transparency in the light of good governance practices applied to the public sector, in order to reduce information asymmetry; it presents part of this research. Methodology: The methodological procedures performed were a literature search and content analysis to identify the principles and standards of good governance practices recommended for public administration, aiming to systematize these recommendations as instruments of governance and to verify the convergence of the principles of disclosure and transparency. Results: Partial results show convergence of the disclosure and transparency principles. Conclusions: The results indicate that public institutions should be concerned with performing the practices of good governance as a way to mitigate informational asymmetry.

  20. Transformer design principles with applications to core-form power transformers

    CERN Document Server

    Del Vecchio, Robert M; Feeney, Mary-Ellen F

    2001-01-01

    Transformer Design Principles presents the theory of transformer operation and the methods and techniques of designing them. It emphasizes the physical principles and mathematical tools for simulating transformer behavior, including modern computer techniques. The scope of the book includes types of construction, circuit analysis, mechanical aspects of design, high voltage insulation requirements, and cooling design. The authors also address test procedures and reliability methods to assure successful design and discuss the economic analysis of designs. Summarizing material currently scattered

  1. Reliability Standards of Complex Engineering Systems

    Science.gov (United States)

    Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.

    2017-11-01

    Production and manufacturing play an important role in modern society. Industrial production is nowadays characterized by increased and complex communications between its parts. The problem of preventing accidents in a large industrial enterprise is therefore especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance. Potential damage caused by an accident at such an enterprise may lead to substantial material losses and, in some cases, can even cause a loss of human lives. That is why the reliability of industrial enterprise functioning is immensely important. In terms of their reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions: operable and non-operable. A complex object exists in more than two conditions. The main characteristic here is the stability of its operation. This paper develops a reliability indicator combining set-theory methodology and a state-space method. Both are widely used to analyze dynamically developing probability processes. The research also introduces a set of reliability indicators for complex technical systems.
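
    For the simplest repairable object, the state-space method mentioned reduces to a two-state (up/down) Markov model whose steady-state availability is mu / (lambda + mu). The sketch below adds a series roll-up for a multi-component system; all failure and repair rates are illustrative:

```python
def steady_state_availability(failure_rate: float, repair_rate: float) -> float:
    """Two-state Markov model (up/down): long-run fraction of time operable,
    A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

def series_system_availability(components):
    """Simple complex-system roll-up: series logic, independent components,
    each given as a (failure_rate, repair_rate) pair."""
    a = 1.0
    for lam, mu in components:
        a *= steady_state_availability(lam, mu)
    return a

# Illustrative rates per hour for three units in series.
units = [(0.001, 0.1), (0.002, 0.05), (0.0005, 0.2)]
print(round(series_system_availability(units), 4))
```

    Objects with more than two conditions, the "complex" case the abstract distinguishes, require solving the steady-state equations of a larger Markov chain rather than this closed form.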

  2. Development of a methodology for conducting an integrated HRA/PRA --

    International Nuclear Information System (INIS)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S.; Wreathall, J.; Cooper, S.E.

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S-related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S-related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR)

  3. CERN Technical training 2008 - Learning for the LHC: Special Workshop demonstrating reliability with accelerated testing

    CERN Multimedia

    2008-01-01

    Larry Edson’s workshop will show examples of quantitative reliability predictions based upon accelerated testing and demonstrates that reliability testing during the prototyping phase will help ascertain product shortcomings. When these weak points are addressed and the redesigned product is re-tested, the reliability of that product will become much higher. These methodologies successfully used in industry might be exceedingly useful also for component development in particle physics where reliability is of utmost importance. This training will provide participants with the skills necessary to demonstrate reliability requirements using accelerated testing methods. The workshop will focus on accelerated test design that employs increased stress levels. This approach has the advantage of reducing test time, sample size and test facility resources. The methodologies taught are applicable to all types of stresses, spanning the electro...
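
    For temperature stress, the increased-stress-level approach the workshop teaches is commonly quantified with the Arrhenius acceleration factor. The activation energy and temperatures below are typical illustrative values, not figures from the workshop:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(ea_ev: float, t_use_c: float, t_stress_c: float) -> float:
    """Acceleration factor between use temperature and elevated stress
    temperature for a failure mechanism with activation energy ea_ev."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Illustrative: Ea = 0.7 eV mechanism, 55 C use versus 125 C stress.
af = arrhenius_af(0.7, 55.0, 125.0)
print(round(af, 1))          # each stress hour "ages" the part af use-hours
print(round(1000 / af, 1))   # stress hours needed to demonstrate 1000 use-hours
```

    This compression of test time is precisely why accelerated testing reduces sample size and test facility resources, as the announcement notes.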

  6. Methodology for evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.E.

    1992-01-01

    Effort in this project during the past year has focused on the development, refinement, and distribution of computer software that allows current Receiver Operating Characteristic (ROC) methodology to be used conveniently and reliably by investigators in a variety of evaluation tasks in diagnostic medicine, and on the development of new ROC methodology that broadens the spectrum of evaluation tasks and/or experimental settings to which the fundamental approach can be applied. Progress has been limited by the amount of financial support made available to the project.
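
    The central ROC summary quantity, the area under the curve, can be estimated directly from rating data via the equivalent Mann-Whitney statistic. The sketch below is a minimal empirical illustration, independent of the project's own software.

```python
def roc_auc(pos_scores, neg_scores):
    """Empirical AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case scores higher than a negative one (ties 1/2)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

auc = roc_auc([0.9, 0.6], [0.7, 0.3])  # 3 of the 4 pos/neg pairs are ranked correctly
```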

  7. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  8. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    International Nuclear Information System (INIS)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-01

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  9. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

    It is necessary to assure the reliability of software in order to digitalize the RPS (Reactor Protection System). Since an RPS failure can cause fatal damage in accident cases, the system is classified as Safety Class 1E. We therefore propose an effective testing methodology to assure the reliability of the embedded software in the DPPS (Digital Plant Protection System). To test the embedded software in the DPPS effectively, our methodology consists of two steps. The first is a re-engineering step that extracts classes from the structural source program; the second is a level-of-testing step composed of unit testing, integration testing and system testing. At each testing step, we test the embedded software with selected test cases after a test item identification step. Using this testing methodology, we can test the embedded software effectively while reducing cost and time.

  10. Methodology for studying social advertising: A sociological aspect

    Directory of Open Access Journals (Sweden)

    S B Kalmykov

    2014-12-01

    Full Text Available The article describes the author’s dynamic processual methodology for the sociological study of social advertising, which combines the multiversion paradigmatic approach, legitimization procedures, the methodological principles of interconnection, multilevel analysis and the principles of sociological data formalization developed by P. Lazarsfeld. The author explains the multi-stage strategy of the methodology and the research procedures that provide new sociological knowledge about the processes of social advertising. The first stage involves analysis of social advertising as a number of institutional, communicative, socio-cultural and socio-technological processes. The second stage consists of the development of the substantive aspects of social advertising dynamics and its dependence on the features of different socio-demographic groups. The third stage includes a comparative analysis of the theoretical and empirical aspects of social advertising and a subsequent assessment of its fundamental and applied capabilities. The author identifies two types of research practices, the first of which consists of three levels of complexity: the first level is to design the categories and concepts of social advertising; the second requires a higher level of generalization; the third supposes justification of universal categorization and the conceptualization of social advertising for different social areas, as well as a comparative analysis of the theory of social advertising impact developed by O.O. Savel’eva with the research results, for the aims of promoting the sociology of advertising. The article concludes by demonstrating the universality of the proposed methodology for different spheres of social reality.

  11. Reliability Evaluation for Optimizing Electricity Supply in a Developing Country

    Directory of Open Access Journals (Sweden)

    Mark Ndubuka NWOHU

    2007-09-01

    Full Text Available The reliability standards for electricity supply in a developing country, like Nigeria, have to be determined based on past engineering principles and practice. Because of the high demand for electrical power due to rapid development, industrialization and rural electrification, the economic, social and political climate in which the electric power supply industry now operates should be critically examined to ensure that the production of electrical power is augmented and remains uninterrupted. This paper presents an economic framework that can be used to optimize electric power system reliability. Finally, cost models are investigated to take into account the economic analysis of system reliability, which can be periodically updated to improve the overall reliability of the electric power system.

  12. Correcting Fallacies in Validity, Reliability, and Classification

    Science.gov (United States)

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  13. Reliability in perceptual analysis of voice quality.

    Science.gov (United States)

    Bele, Irene Velsvik

    2005-12-01

    This study focuses on speaking voice quality in male teachers (n = 35) and male actors (n = 36), who represent untrained and trained voice users, because we wanted to investigate normal and supranormal voices. Both substantive and methodological aspects were considered. The study includes a method for perceptual voice evaluation, and a basic issue was rater reliability. A listening group of 10 listeners (7 experienced speech-language therapists and 3 speech-language therapy students) evaluated the voices on 15 vocal characteristics using VA scales. Two sets of voice signals were investigated: text reading (2 loudness levels) and a sustained vowel (3 levels). The results indicated high interrater reliability for most perceptual characteristics. Both types of voice signals were evaluated reliably, although the reliability for connected speech was somewhat higher than for vowels, especially at the normal loudness level. Experienced listeners tended to be more consistent in their ratings than the student raters. Some vocal characteristics achieved acceptable reliability even with a smaller panel of listeners. The perceptual characteristics, grouped into 4 factors, reflected perceptual dimensions.

  14. Survey of Transmission Cost Allocation Methodologies for Regional Transmission Organizations

    Energy Technology Data Exchange (ETDEWEB)

    Fink, S.; Porter, K.; Mudd, C.; Rogers, J.

    2011-02-01

    The report presents transmission cost allocation methodologies for reliability transmission projects, generation interconnection, and economic transmission projects for all Regional Transmission Organizations.

  15. Applying Lean principles and Kaizen rapid improvement events in public health practice.

    Science.gov (United States)

    Smith, Gene; Poteat-Godwin, Annah; Harrison, Lisa Macon; Randolph, Greg D

    2012-01-01

    This case study describes a local home health and hospice agency's effort to implement Lean principles and Kaizen methodology as a rapid improvement approach to quality improvement. The agency created a cross-functional team, followed Lean Kaizen methodology, and made significant improvements in scheduling time for home health nurses that resulted in reduced operational costs, improved working conditions, and multiple organizational efficiencies.

  16. Is DevOps another Project Management Methodology?

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2017-01-01

    Full Text Available In this paper, the authors aim to present the concept of DevOps (Development & Operations), considering its degree of novelty in the area of project management. Firstly, the authors bring theoretical arguments to support the idea that DevOps is an early-stage methodology, built on Agile principles but coming with its own contributions to project management for software development and implementation; we therefore believe that after a short time, DevOps will replace its predecessors. Secondly, we experienced this methodology by developing a small project in an academic environment with three teams of master students, using VersionOne software. The conclusions emphasize the relevance and the expected future effects of the DevOps methodology in the project management domain.

  17. Profile of Research Methodology and Statistics Training of ...

    African Journals Online (AJOL)

    Background: Medical practitioners need to have knowledge of statistics and research principles, especially with the increasing emphasis on evidence-based medicine. The aim of this study was to determine the profile of research methodology and statistics training of undergraduate medical students at South African ...

  18. METHODOLOGICAL ASPECTS OF THE INTERNAL CONTROL SYSTEM FORMATION

    Directory of Open Access Journals (Sweden)

    Larisa I. Egorova

    2014-01-01

    Full Text Available The methodological aspects of the internal control system formation are stated in the article. The great attention is focused on the problems of financial statements misrepresentation. The basic principles and structure of the internal control system are discussed in this article.

  19. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Full Text Available Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of methodologies for sample preparation and of reliable, fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biological samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of the FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.

  20. Solid state nuclear track detection principles, methods and applications

    CERN Document Server

    Durrani, S A; ter Haar, D

    1987-01-01

    Solid State Nuclear Track Detection: Principles, Methods and Applications is the second book written by the authors, after Nuclear Tracks in Solids: Principles and Applications. The book is meant as an introduction to the subject of solid state nuclear track detection. The text covers the interactions of charged particles with matter; the nature of the charged-particle track; the methodology and geometry of track etching; thermal fading of latent damage trails on tracks; the use of dielectric track recorders in particle identification; radiation dosimetry; and solid state nuclear track detecti...

  1. Monte Carlo importance sampling optimization for system reliability applications

    International Nuclear Information System (INIS)

    Campioni, Luca; Vestrucci, Paolo

    2004-01-01

    This paper focuses on the reliability analysis of multicomponent systems by the importance sampling technique and, in particular, tackles the optimization aspect. A methodology based on the minimization of the variance at the component level is proposed for the class of systems consisting of independent components. The claim is that, by means of such a methodology, the optimal biasing can be achieved without resorting to the typical trial-and-error approach.
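
    The basic mechanics of importance sampling for a small failure probability can be sketched on a single-component toy problem. This is a generic illustration under an assumed Gaussian model, not the paper's component-level variance-minimization scheme.

```python
import math
import random

def failure_prob_is(threshold=4.0, shift=4.0, n=200_000, seed=1):
    """Estimate P(X > threshold) for X ~ N(0,1) by sampling from the biased
    density N(shift,1) and reweighting failed samples by the likelihood
    ratio phi(x) / phi(x - shift)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > threshold:
            # ratio of standard-normal to shifted-normal densities
            total += math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)
    return total / n

p_hat = failure_prob_is()
```

    Centering the sampling density on the failure threshold makes roughly half of the samples land in the failure region, so the rare event is estimated accurately with far fewer samples than crude Monte Carlo would need (the true value here is about 3.17e-5).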

  2. Human reliability: an evaluation of its understanding and prediction

    International Nuclear Information System (INIS)

    Joksimovich, V.

    1987-01-01

    This paper presents a viewpoint on the state of the art in human reliability. The bases for this viewpoint are, by and large, research projects conducted by NUS for the Electric Power Research Institute (EPRI), primarily with the objective of further enhancing the credibility of PRA methodology. The presentation is divided into the following key sections: Background and Overview, and Methodology and Data Base, with emphasis on the simulator data base.

  3. Housing Accessibility Methodology Targeting Older People

    DEFF Research Database (Denmark)

    Helle, Tina

    accessibility problems before the planning of housing intervention strategies. It is also critical that housing standards addressing accessibility intended to accommodate people with functional limitations are valid in the sense that their definitions truly support accessibility. However, there is a paucity...... of valid and reliable assessment instruments targeting housing accessibility, and in-depth analysis of factors potentially impacting on reliability in complex assessment situations is remarkably absent. Moreover, the knowledge base informing the housing standards appears to be vague. We may therefore...... reasonably question the validity of the housing standards addressing accessibility. This thesis addresses housing accessibility methodology in general and the reliability of assessment and the validity of standards targeting older people with functional limitations and a dependence on mobility devices...

  4. Using continuous time stochastic modelling and nonparametric statistics to improve the quality of first principles models

    DEFF Research Database (Denmark)

    A methodology is presented that combines modelling based on first principles and data based modelling into a modelling cycle that facilitates fast decision-making based on statistical methods. A strong feature of this methodology is that given a first principles model along with process data......, the corresponding modelling cycle model of the given system for a given purpose. A computer-aided tool, which integrates the elements of the modelling cycle, is also presented, and an example is given of modelling a fed-batch bioreactor....

  5. RIO: a program to determine reliability importance and allocate optimal reliability goals

    International Nuclear Information System (INIS)

    Poloski, J.P.

    1978-09-01

    The designer of a nuclear plant must know the plant's associated risk limitations so that he can design the plant accordingly. To design a safety system, he must understand its importance and how it relates to the overall plant risk. The computer program RIO can help the designer understand a system's contribution to the plant's overall risk. The methodology developed and presented here was sponsored by the Nuclear Research Applications Division of the Department of Energy for use in the Gas Cooled Fast Breeder Reactor (GCFR) Program. The principal motivation behind its development was the need to translate nuclear plant safety goals into reliability goals for the systems that make up the plant. The method described herein makes use of the GCFR Accident Initiation and Progression Analyses (AIPA) event trees and other models in order to determine these reliability goals.
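
    The translation of a plant-level goal into system-level reliability goals can be illustrated with the simple equal-apportionment rule. This is a textbook sketch under a series-system assumption, not the AIPA-based method of the report.

```python
def equal_apportionment(r_system, n_subsystems):
    """Translate a system-level reliability goal into identical subsystem goals
    for a series configuration: R_i = R_sys ** (1/n), so the product of the
    n subsystem reliabilities recovers the system goal."""
    return r_system ** (1.0 / n_subsystems)

# Assumed example: a 0.99 plant-level goal spread over 5 series subsystems.
r_i = equal_apportionment(0.99, 5)
```

    More refined allocation schemes weight each subsystem by its criticality or achievable reliability, which is the kind of importance information a tool like RIO provides.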

  6. The Ocean Colour Climate Change Initiative: I. A Methodology for Assessing Atmospheric Correction Processors Based on In-Situ Measurements

    Science.gov (United States)

    Muller, Dagmar; Krasemann, Hajo; Brewin, Robert J. W.; Deschamps, Pierre-Yves; Doerffer, Roland; Fomferra, Norman; Franz, Bryan A.; Grant, Mike G.; Groom, Steve B.; Melin, Frederic

    2015-01-01

    The Ocean Colour Climate Change Initiative intends to provide a long-term time series of ocean colour data and investigate the detectable climate impact. A reliable and stable atmospheric correction procedure is the basis for ocean colour products of the necessary high quality. In order to guarantee an objective selection from a set of four atmospheric correction processors, the common validation strategy of comparing in-situ and satellite-derived water-leaving reflectance spectra is extended by a ranking system. In principle, statistical parameters such as root mean square error, bias, etc., and measures of goodness of fit are transformed into relative scores, which evaluate the relative quality of the algorithms under study. The sensitivity of these scores to the selected database has been assessed by a bootstrapping exercise, which allows identification of the uncertainty in the scoring results. Although the presented methodology is intended to be used in an algorithm selection process, this paper focuses on the scope of the methodology rather than the properties of the individual processors.

  7. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and methodology guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on a hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated, end-use-level metrics -- is presented to help shape the understanding and assessment of energy efficiency. In each sector of activity (industry, commercial, residential, agriculture and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.

  8. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
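
    The double-loop treatment of epistemic and aleatory uncertainty that the paper's single-loop auxiliary-variable strategy improves upon can be sketched with nested Monte Carlo on a toy load-capacity problem. The distributions and parameter values below are assumptions for illustration only.

```python
import random

def failure_probs_with_epistemic(n_outer=200, n_inner=2000, seed=2):
    """Nested (double-loop) Monte Carlo: the outer loop samples an uncertain
    distribution parameter (epistemic uncertainty about the load mean), the
    inner loop samples the load itself (aleatory uncertainty). Returns the
    family of conditional failure probabilities P(load > capacity | mu)."""
    rng = random.Random(seed)
    capacity = 3.0
    probs = []
    for _ in range(n_outer):
        mu = rng.gauss(0.0, 0.2)  # epistemic: the load mean is not known exactly
        fails = sum(1 for _ in range(n_inner) if rng.gauss(mu, 1.0) > capacity)
        probs.append(fails / n_inner)
    return probs

probs = failure_probs_with_epistemic()
mean_p = sum(probs) / len(probs)
```

    The output is a distribution over the failure probability rather than a single number; the spread of `probs` reflects the epistemic uncertainty. The cost of the nested loops is exactly what a single-loop formulation seeks to avoid.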

  9. Efficient approach for reliability-based optimization based on weighted importance sampling approach

    International Nuclear Information System (INIS)

    Yuan, Xiukai; Lu, Zhenzhou

    2014-01-01

    An efficient methodology is presented to perform reliability-based optimization (RBO). It is based on an efficient weighted approach for constructing an approximation of the failure probability as an explicit function of the design variables, referred to as the ‘failure probability function (FPF)’. The FPF is expressed as a weighted sum of sample values obtained in the simulation-based reliability analysis. The required computational effort for decoupling in each iteration is just a single reliability analysis. After the approximation of the FPF is established, the target RBO problem can be decoupled into a deterministic one. Meanwhile, the proposed weighted approach is combined with a decoupling approach and a sequential approximate optimization framework. Engineering examples are given to demonstrate the efficiency and accuracy of the presented methodology.
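
    The idea of expressing the failure probability as a weighted sum over one set of reliability-analysis samples can be sketched for a toy Gaussian case: one simulation run at a nominal design is reused, via density-ratio weights, to estimate the failure probability at other designs. This is an illustrative assumption, not the paper's engineering examples.

```python
import math
import random

def fpf_from_one_run(designs, d0=0.0, threshold=3.0, n=100_000, seed=3):
    """Approximate the failure probability function p_f(d) = P(X > threshold),
    X ~ N(d, 1), for several designs d, reusing ONE simulation run at design d0
    and weighting each failed sample by the density ratio f(x; d) / f(x; d0)."""
    rng = random.Random(seed)
    samples = [rng.gauss(d0, 1.0) for _ in range(n)]
    fpf = {}
    for d in designs:
        total = 0.0
        for x in samples:
            if x > threshold:
                total += math.exp(-0.5 * (x - d) ** 2 + 0.5 * (x - d0) ** 2)
        fpf[d] = total / n
    return fpf

fpf = fpf_from_one_run([0.0, 0.5])
```

    With the FPF available as an explicit function of the design variable, an optimizer can query it repeatedly without rerunning the reliability analysis, which is the decoupling the abstract describes.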

  10. Applying reliability centered maintenance analysis principles to inservice testing

    International Nuclear Information System (INIS)

    Flude, J.W.

    1994-01-01

    Federal regulations require nuclear power plants to use inservice test (IST) programs to ensure the operability of safety-related equipment. IST programs are based on American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code requirements. Many of these plants also use Reliability Centered Maintenance (RCM) to optimize system maintenance. ASME Code requirements are hard to change: the process for requesting authority to use an alternate strategy is long and expensive, and the difficulty of obtaining this authority makes the use of RCM methods on safety-related systems not cost effective. An ASME research task force on Risk Based Inservice Testing is investigating changing the Code. The change would allow plants to apply RCM methods to the problem of maintenance strategy selection for safety-related systems. The research task force is working closely with the Codes and Standards sections to develop a process related to the RCM process. Someday plants will be able to use this process to develop more efficient and safer maintenance strategies.

  11. Time-advance algorithms based on Hamilton's principle

    International Nuclear Information System (INIS)

    Lewis, H.R.; Kostelec, P.J.

    1993-01-01

    Time-advance algorithms based on Hamilton's variational principle are being developed for application to problems in plasma physics and other areas. Hamilton's principle was applied previously to derive a system of ordinary differential equations in time whose solution provides an approximation to the evolution of a plasma described by the Vlasov-Maxwell equations. However, the variational principle was not used to obtain an algorithm for solving the ordinary differential equations numerically. The present research addresses the numerical solution of systems of ordinary differential equations via Hamilton's principle. The basic idea is first to choose a class of functions for approximating the solution of the ordinary differential equations over a specific time interval, and then to determine the parameters in the approximating function by applying Hamilton's principle exactly within the class of approximating functions. For example, if an approximate solution is desired between time t and time t + Δt, the class of approximating functions could be polynomials in time up to some degree. The issue of how to choose time-advance algorithms is very important for achieving efficient, physically meaningful computer simulations: the objective is to reliably simulate those characteristics of an evolving system that are scientifically most relevant. Preliminary numerical results are presented, including comparisons with other computational methods.
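
    A well-known time-advance scheme that can be derived from a discrete variational principle is the leapfrog (Störmer-Verlet) integrator. The sketch below, on a harmonic oscillator, illustrates the bounded energy error typical of such variational integrators; it is a generic example, not the algorithm of this work.

```python
def leapfrog_oscillator(steps=10_000, dt=0.05):
    """Integrate q'' = -q with the leapfrog (kick-drift-kick) scheme, which is
    derivable from a discrete form of Hamilton's principle. Variational
    integrators keep the energy error bounded rather than drifting."""
    q, p = 1.0, 0.0
    energies = []
    for _ in range(steps):
        p -= 0.5 * dt * q   # half kick (force = -q)
        q += dt * p         # drift
        p -= 0.5 * dt * q   # half kick
        energies.append(0.5 * (p * p + q * q))
    return energies

energies = leapfrog_oscillator()
drift = max(energies) - min(energies)
```

    After 10,000 steps the energy still oscillates tightly around its initial value of 0.5, whereas a non-variational scheme of the same order (e.g. forward Euler) would show steady energy growth.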

  12. Using the Weibull distribution reliability, modeling and inference

    CERN Document Server

    McCool, John I

    2012-01-01

    Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
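
    The basic Weibull reliability quantities the book builds on can be sketched in a few lines. The formulas are standard; the parameter values are assumed examples.

```python
import math

def weibull_reliability(t, beta, eta):
    """Weibull reliability (survivor) function R(t) = exp(-(t/eta)**beta),
    where beta is the shape parameter and eta the characteristic life."""
    return math.exp(-((t / eta) ** beta))

def weibull_mttf(beta, eta):
    """Mean time to failure: eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)

# At t = eta, R = exp(-1) ~ 0.368 regardless of beta; with beta = 1 the
# distribution reduces to the exponential, so MTTF equals eta.
r_at_eta = weibull_reliability(1000.0, 2.0, 1000.0)
mttf_exp = weibull_mttf(1.0, 500.0)
```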

  13. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    Science.gov (United States)

    Munoz, Gisela; Toon, T.; Toon, J.; Conner, A.; Adams, T.; Miranda, D.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the subsystems of the NASA Ground Systems Development and Operations (GSDO) program. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. The new allocations are based on updated designs and the maintainability characteristics of the components. It was found that trade-offs between reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of the reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
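
    The reliability-maintainability trade-off is often captured by the inherent-availability relation A = MTBF / (MTBF + MTTR). The sketch below uses assumed example numbers, not GSDO data, to show why an allocation can trade one attribute against the other.

```python
def inherent_availability(mtbf, mttr):
    """Steady-state (inherent) availability: the long-run fraction of time
    the system is operable, given mean time between failures (MTBF) and
    mean time to repair (MTTR) in the same units."""
    return mtbf / (mtbf + mttr)

# Two hypothetical allocations with the same availability target:
a1 = inherent_availability(990.0, 10.0)  # higher reliability, slower repair
a2 = inherent_availability(495.0, 5.0)   # half the MTBF, twice as maintainable
```

    Both allocations yield 99% availability, so a subsystem that cannot meet a tightened reliability goal may instead be allocated a shorter repair time, which is the kind of trade-off the reallocation exercise exploits.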

  14. The likelihood principle and its proof – a never-ending story…

    DEFF Research Database (Denmark)

    Jørgensen, Thomas Martini

    2015-01-01

    An ongoing controversy in philosophy of statistics is the so-called “likelihood principle” essentially stating that all evidence which is obtained from an experiment about an unknown quantity θ is contained in the likelihood function of θ. Common classical statistical methodology, such as the use...... of significance tests, and confidence intervals, depends on the experimental procedure and unrealized events and thus violates the likelihood principle. The likelihood principle was identified by that name and proved in a famous paper by Allan Birnbaum in 1962. However, ever since both the principle itself...... as well as the proof has been highly debated. This presentation will illustrate the debate of both the principle and its proof, from 1962 and up to today. An often-used experiment to illustrate the controversy between classical interpretation and evidential confirmation based on the likelihood principle...

  15. Reliability and validity of emergency department triage systems

    NARCIS (Netherlands)

    van der Wulp, I.

    2010-01-01

    Reliability and validity of triage systems is important because this can affect patient safety. In this thesis, these aspects of two emergency department (ED) triage systems were studied as well as methodological aspects in these types of studies. The consistency, reproducibility, and criterion

  16. Quantitative dynamic reliability evaluation of AP1000 passive safety systems by using FMEA and GO-FLOW methodology

    International Nuclear Information System (INIS)

    Hashim Muhammad; Yoshikawa, Hidekazu; Matsuoka, Takeshi; Yang Ming

    2014-01-01

    The passive safety systems utilized in advanced pressurized water reactor (PWR) designs such as the AP1000 should be more reliable than the active safety systems of conventional PWRs, because there are fewer opportunities for hardware failures and human errors (less human intervention). The objectives of the present study are to evaluate the dynamic reliability of the AP1000 plant in order to check the effectiveness of its passive safety systems by comparing reliability-related issues with those of active safety systems in the event of major accidents. How should the dynamic reliability of passive safety systems be properly evaluated, and how do the reliability results for the AP1000 passive safety systems compare with the active safety systems of a conventional PWR? For this purpose, single-loop models of the AP1000 passive core cooling system (PXS) and passive containment cooling system (PCCS) are assumed separately for quantitative reliability evaluation. The transient behaviors of these passive safety systems are analyzed under a large-break loss-of-coolant accident in the cold leg. The analysis uses the qualitative method of failure mode and effect analysis (FMEA) to identify potential failure modes, and the success-oriented reliability analysis tool GO-FLOW for quantitative reliability evaluation. The GO-FLOW analysis has been conducted separately for the PXS and PCCS systems under the same accident. The results show that the reliability of the AP1000 passive safety systems (PXS and PCCS) is increased by the redundancy and diversity of the passive safety subsystems and components, and that the four-stage automatic depressurization system is the key subsystem for successful actuation of the PXS and PCCS. The PCCS of the AP1000 is more reliable than the containment spray system of a conventional PWR, and the GO-FLOW method can be utilized for reliability evaluation of passive safety systems. (author)
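
    The reliability gain from redundancy that the GO-FLOW results point to can be sketched with the elementary parallel-redundancy formula. This is a generic illustration with assumed component reliabilities, not the AP1000 model.

```python
def parallel_reliability(r, n):
    """Reliability of n redundant components in parallel, each with
    reliability r: the system fails only if all n components fail."""
    return 1.0 - (1.0 - r) ** n

single = parallel_reliability(0.9, 1)  # one train
dual = parallel_reliability(0.9, 2)    # two redundant trains
```

    Adding one redundant train cuts the failure probability from 0.1 to 0.01; diversity (redundant trains of different design) additionally protects against common-cause failures, which this simple independence-based formula does not capture.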

  17. Equipment Reliability Program in NPP Krsko

    International Nuclear Information System (INIS)

    Skaler, F.; Djetelic, N.

    2006-01-01

Operation that is safe, reliable, effective and acceptable to the public is the common message in the mission statements of commercial nuclear power plants (NPPs). To fulfill these goals, the nuclear industry, among other areas, has to focus on: 1) Human Performance (HU) and 2) Equipment Reliability (EQ). The performance objective of HU is as follows: the behaviors of all personnel result in safe and reliable station operation. While unwanted human behaviors in operations mostly result directly in an event, behavior flaws in the areas of maintenance or engineering usually cause decreased equipment reliability. Unsatisfactory human performance has led even the best-designed power plants into significant operating events, well-known examples of which can be found in the nuclear industry. Equipment reliability is today recognized as the key to success. While human performance at most NPPs has been improving since the start of WANO / INPO / IAEA evaluations, the open energy market has forced nuclear plants to reduce production costs and operate more reliably and effectively. The balance between these two (opposing) goals has made equipment reliability even more important for safe, reliable and efficient production. In a well-developed safety culture and human performance environment, insisting on on-line operation while ignoring safety principles can nowadays cost more than the associated electricity losses. In the last decade the leading USA nuclear companies have put a lot of effort into improving equipment reliability at their stations, primarily based on the INPO Equipment Reliability Program AP-913. The Equipment Reliability Program is the key program not only for safe and reliable operation, but also for Life Cycle Management and Aging Management on the way to nuclear power plant life extension. The purpose of the Equipment Reliability process is to identify, organize, integrate and coordinate equipment reliability activities (preventive and predictive maintenance, maintenance

  18. Reliability allocation in nuclear power plants

    International Nuclear Information System (INIS)

    Bari, R.A.; Cho, N.Z.; Papazoglou, I.A.

    1985-01-01

The technical feasibility of allocating reliability and risk to reactor systems, subsystems, components, operations, and structures is investigated. A methodology is discussed which identifies top-level risk indices as objective functions and plant-specific performance variables as decision variables. These are related by a risk model which includes cost as a top-level risk index. A multiobjective optimization procedure is used to find non-inferior solutions in terms of the objective functions and the decision variables. The approach is illustrated for a boiling water reactor plant. The use of the methodology for both operating reactors and advanced designs is briefly discussed. 16 refs., 1 fig.
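The "noninferiority" screening at the heart of the multiobjective allocation described above can be sketched as follows (the (risk, cost) pairs are hypothetical, not from the BWR study):

```python
def noninferior(points):
    """Return the noninferior (Pareto-optimal) subset of (risk, cost)
    pairs, where lower is better in both attributes: a point survives
    unless some other point is at least as good in both."""
    frontier = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            frontier.append(p)
    return frontier

# Hypothetical allocation alternatives: (core-damage risk, added cost).
alts = [(1e-5, 10.0), (5e-6, 30.0), (1e-5, 40.0), (2e-6, 90.0)]
front = noninferior(alts)  # (1e-5, 40.0) is dominated and drops out
```

The decision maker's preference assessment, the final step of the decision analysis, then only needs to consider the surviving frontier.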

  19. Reliability technology principles and practice of failure prevention in electronic systems

    CERN Document Server

    Pascoe, Norman

    2011-01-01

A unique book that describes the practical processes necessary to achieve failure-free equipment performance, for quality and reliability engineers, design, manufacturing process and environmental test engineers. This book studies the essential requirements for successful product life cycle management. It identifies key contributors to failure in product life cycle management, and particular emphasis is placed upon the importance of thorough Manufacturing Process Capability reviews for both in-house and outsourced manufacturing strategies. The readers' attention is also drawn to the ma

  20. Cosmological principles. II. Physical principles

    International Nuclear Information System (INIS)

    Harrison, E.R.

    1974-01-01

    The discussion of cosmological principle covers the uniformity principle of the laws of physics, the gravitation and cognizability principles, and the Dirac creation, chaos, and bootstrap principles. (U.S.)

  1. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    Science.gov (United States)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

Standalone photovoltaic (SAPV) systems are seen as a promising method of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple, reliable, and exhibits good performance over the system's lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at different array sizes (areas), performance curves are obtained for the optimal design of the SAPV system with a high degree of reliability in terms of autonomy at a specified value of loss of load probability (LOLP). Based on the array-to-load ratio (ALR) and levelized energy cost (LEC) through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.
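A first-order version of the sizing calculation such an SAPV design procedure performs might look like this (generic textbook formulas with assumed efficiency, performance ratio and autonomy values, not the paper's empirical formulae):

```python
def array_area(daily_load_wh, insolation_kwh_m2,
               efficiency=0.15, performance_ratio=0.75):
    """PV array area (m^2) needed to meet an average daily load (Wh)
    from the daily insolation (kWh/m^2/day); first-order sizing."""
    return daily_load_wh / (insolation_kwh_m2 * 1000.0
                            * efficiency * performance_ratio)

def battery_capacity_wh(daily_load_wh, autonomy_days,
                        depth_of_discharge=0.8):
    """Storage needed to ride through `autonomy_days` without sun."""
    return daily_load_wh * autonomy_days / depth_of_discharge

area = array_area(2400.0, 5.0)            # ~4.27 m^2
storage = battery_capacity_wh(2400.0, 3)  # 9000 Wh
```

Sweeping `autonomy_days` and array area against a target LOLP, and costing each candidate, is essentially what the performance curves in the paper capture.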

  2. Structural reliability in context of statistical uncertainties and modelling discrepancies

    International Nuclear Information System (INIS)

    Pendola, Maurice

    2000-01-01

Structural reliability methods have improved considerably in recent years and have shown their ability to deal with uncertainties during the design stage or to optimize the functioning and maintenance of industrial installations. They are based on a mechanical model of the structural behavior according to the considered failure modes and on a probabilistic representation of the input parameters of this model. In practice, only limited statistical information is available to build the probabilistic representation, and different levels of sophistication of the mechanical model may be introduced. Thus, besides the physical randomness, other uncertainties occur in such analyses. The aim of this work is threefold: 1. first, to propose a methodology able to characterize the statistical uncertainties due to the limited number of data, in order to take them into account in the reliability analyses; the obtained reliability index measures the confidence in the structure given the statistical information available. 2. Then, to show a methodology leading to reliability results evaluated for a particular mechanical model but using a less sophisticated one; the objective is to decrease the computational effort required by the reference model. 3. Finally, to propose partial safety factors that evolve as a function of the number of statistical data available and of the sophistication level of the mechanical model that is used. The concepts are illustrated in the case of a welded pipe and in the case of a natural draught cooling tower. The results show the interest of the methodologies in an industrial context. [fr

  3. NanoBiosensing Principles, Development and Application

    CERN Document Server

    Ju, Huangxian; Wang, Joseph

    2011-01-01

This book will cover the full scope of nanobiosensing, which combines the newest research results in the cross-disciplines of chemistry, biology, and materials science with biosensing and bioanalysis to develop novel detection principles, sensing mechanisms, and device engineering methods. It not only covers the important types of nanomaterials for biosensing applications, including carbon nanotubes, carbon nanofibers, quantum dots, fullerenes, fluorescent and biological molecules, etc., but also illustrates a wide range of sensing principles, including electrochemical detection, fluorescence, chemiluminescence, antibody-antigen interactions, and magnetic detection. The book details novel developments in the methodology and devices of biosensing and bioanalysis combined with nanoscience and nanotechnology, as well as their applications in biomedicine and environmental monitoring. Furthermore, the reported works on the application and biofunction of nanoparticles have attracted extensive attention and interest, ...

  4. Foundational principles of classical Ayurveda research

    Directory of Open Access Journals (Sweden)

    Somik Raha

    2013-01-01

Full Text Available Double-blind randomized controlled trials (RCTs) are viewed as the golden standard of drug research in Western medicine. However, RCTs are far from "golden" in many respects. They are impractical for many therapies, such as for surgeries and complex lifestyle changes. They encourage a one-size-fits-all approach to medical treatment that fails to address the huge diversity among individual patients in terms of their physical and emotional symptoms, social and cultural upbringing, and other factors. Perhaps more importantly, they do not help doctors make the best medical decisions required to produce optimal patient outcomes. To guide a search for an alternate model of medical research, three principles based on Ayurveda, an ancient and powerful system of health care that has stood the test of time, are presented. These principles, arrived at after mining Ayurvedic epistemology, are: inductive learning, whole systems thinking, and individually optimized therapy. In honor of the ancient sages or "Rishis," whose voice is used to deliver Ayurvedic knowledge in the ancient texts of Ayurveda, these are referred to as the "Rishi principles." Individually optimized therapy is interpreted using the lens of decision analysis. Common research methodologies are examined for embodiment of these principles.

  5. [Methodological aspects of a study of medical service satisfaction in patients with borderline mental disorders].

    Science.gov (United States)

    Malygin, Ya V; Tsygankov, B D

The authors discussed the methodology of studying medical service satisfaction and its factors: the moment of assessment, the methodology of data collection, the format of data, benchmarking, the principles for including questions in a questionnaire, and the organization and frequency of conducting studies.

  6. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (theoretical and methodological foundations of vocational teacher education

    Directory of Open Access Journals (Sweden)

    Evgeny M. Dorozhkin

    2014-01-01

methodology, taking into consideration the target orientation, principles and approaches to the organization of scientific and educational activities and the methods of their implementation. The formation of a qualification structure for teachers' vocational training and the provision of advanced principles of education are considered the most important conditions for the development of vocational teacher education. Scientific novelty. The research presents a project for the further development of vocational teacher education in post-industrial society. Pedagogical innovations that transform research findings into educational practice are considered the main tool of the integration methodology. Practical significance. The research findings highlight the proposed reforms for the further development of the teacher training system of vocational institutes, which are in need of drastic restructuring. In the final part of the article the authors recommend some specific issues that can be discussed at a methodological workshop.

  7. Reliability analysis of HVDC grid combined with power flow simulations

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yongtao; Langeland, Tore; Solvik, Johan [DNV AS, Hoevik (Norway); Stewart, Emma [DNV KEMA, Camino Ramon, CA (United States)

    2012-07-01

Based on a DC grid power flow solver and the proposed GEIR, we carried out a reliability analysis for an HVDC grid test system proposed by CIGRE working group B4-58, with failure statistics collected from a literature survey. The proposed methodology is used to evaluate the impact of converter configuration on the overall reliability performance of the HVDC grid, comparing the symmetrical monopole configuration with the bipole with metallic return wire configuration. The results quantify the improvement in reliability obtained with the latter alternative. (orig.)
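The monopole-versus-bipole comparison can be illustrated with a two-state availability model (the MTBF/MTTR figures are illustrative, not the CIGRE B4-58 failure statistics):

```python
def availability(mtbf_h, mttr_h):
    """Steady-state availability from mean time between failures
    and mean time to repair (hours)."""
    return mtbf_h / (mtbf_h + mttr_h)

# Illustrative per-pole figures, not CIGRE B4-58 statistics.
a_pole = availability(8760.0, 240.0)

# Symmetrical monopole: one converter pole carries all power, so any
# pole outage interrupts the full transfer.
monopole_full = a_pole

# Bipole with metallic return: full power requires both poles, but a
# single pole can still deliver half power through the return path.
bipole_full = a_pole ** 2
bipole_half_or_more = 1.0 - (1.0 - a_pole) ** 2
```

The bipole is slightly less likely to deliver full power but far less likely to lose the link entirely, which is the kind of trade-off the paper quantifies with power flow simulations.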

  8. In-plant reliability data base for nuclear power plant components: data collection and methodology report

    International Nuclear Information System (INIS)

    Drago, J.P.; Borkowski, R.J.; Pike, D.H.; Goldberg, F.F.

    1982-07-01

The development of a component reliability data base for use in nuclear power plant probabilistic risk assessments and reliability studies is presented in this report. The sources of the data are the in-plant maintenance work request records from a sample of nuclear power plants. This data base is called the In-Plant Reliability Data (IPRD) system. Features of the IPRD system are compared with other data sources such as the Licensee Event Report system, the Nuclear Plant Reliability Data system, and IEEE Standard 500. Generic descriptions of nuclear power plant systems formulated for IPRD are given

  9. Methodological pluralism and structure of sociological theory

    Directory of Open Access Journals (Sweden)

    N. L. Polyakova

    2015-01-01

Full Text Available In this paper, historical-sociological analysis is used to show the differences between theoretical and empirical sociology. Several basic traditions exist in theoretical sociology. The investigation of their competing theoretical and methodological principles, carried out in the paper, identifies some fundamental features of sociological theory as a whole.

  10. Reliability of Computer Analysis of Electrocardiograms (ECG) of ...

    African Journals Online (AJOL)

    Background: Computer programmes have been introduced to electrocardiography (ECG) with most physicians in Africa depending on computer interpretation of ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-Lead ECG in the Black race. Methodology: Using the SCHILLER ...

  11. Development of analysis methodology on turbulent thermal stripping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

For developing the analysis methodology, the important governing factors of the thermal stripping phenomenon are identified as geometric configuration and flow characteristics such as velocity. For these factors, the performance of the turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS applications is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the full Reynolds stress (FRS) model is identified as the best among the turbulence models considered, and LES is found to be able to provide time-dependent turbulence values. Further improvements in the near-wall region and the temperature variance equation are required for FRS, and implementation of new sub-grid scale models is required for LES. Through these improvements, a new, reliable analysis methodology for thermal stripping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  12. Reliability Issues and Solutions in Flexible Electronics Under Mechanical Fatigue

    Science.gov (United States)

    Yi, Seol-Min; Choi, In-Suk; Kim, Byoung-Joon; Joo, Young-Chang

    2018-03-01

Flexible devices are of significant interest due to their potential to expand the application of smart devices into various fields, such as energy harvesting, biological applications and consumer electronics. Due to the mechanically dynamic operation of flexible electronics, their mechanical reliability must be thoroughly investigated to understand their failure mechanisms and lifetimes. Reliability issues caused by bending fatigue, one of the typical operational limitations of flexible electronics, have been studied using various test methodologies; however, the electromechanical evaluations that are essential to assess the reliability of electronic devices for flexible applications had not been investigated because no testing method had been established. By employing the in situ bending fatigue test, we have studied the failure mechanism for various conditions and parameters, such as bending strain, fatigue area, film thickness, and lateral dimensions. Moreover, various methods for improving bending reliability have been developed based on the failure mechanism. Nanostructures such as holes, pores, wires and composites of nanoparticles and nanotubes have been suggested for better reliability. Flexible devices were also investigated to find potential failures initiated by complex structures under bending fatigue strain. In this review, recent advances in test methodology, mechanism studies, and practical applications are introduced. Additionally, perspectives including the future advance toward stretchable electronics are discussed based on current research achievements.

  13. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  14. Methodology of theory of stage-by-stage long-term preparation of sportsmen in single combats

    Directory of Open Access Journals (Sweden)

    Arziutov G.

    2010-04-01

Full Text Available Research results on the methodology of the theory of stage-by-stage preparation of sportsmen in single combats are presented. The structuredness of the theory lies in the possibility of simple verification of its substantive provisions, principles and laws. The development of the methodology enables the creation of a trainer's map for the stages of long-term preparation. Laws, regularities, principles and rules must be collected in the map. The map enables trainers in reserve sport to use its content during all stages of a sportsman's preparation.

  15. Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring

    Science.gov (United States)

    Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri

    2017-01-01

    Analysis of principles knowledge representation in information systems led to the necessity of improving the structuring knowledge. It is caused by the development of software component and new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…

  16. Development of reliability-based load and resistance factor design methods for piping

    International Nuclear Information System (INIS)

    Ayyub, Bilal M.; Hill, Ralph S. III; Balkey, Kenneth R.

    2003-01-01

Current American Society of Mechanical Engineers (ASME) nuclear codes and standards rely primarily on deterministic and mechanistic approaches to design. The American Institute of Steel Construction and the American Concrete Institute, among other organizations, have incorporated probabilistic methodologies into their design codes. ASME nuclear codes and standards could benefit from developing a probabilistic, reliability-based design methodology. This paper provides a plan to develop the technical basis for reliability-based load and resistance factor design (LRFD) of ASME Section III, Class 2/3 piping for primary loading, i.e., pressure, deadweight and seismic. The plan provides a proof of concept that LRFD can be used in the design of piping and could achieve consistent reliability levels. The results from future projects in this area could also form the basis for code cases and for additional research on piping secondary loads. (author)
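The basic LRFD acceptance check that such a methodology would calibrate can be sketched as follows (the resistance factor phi and load factors gamma_i are placeholders for illustration, not values proposed for ASME Section III piping):

```python
def lrfd_ok(nominal_resistance, phi, loads_and_factors):
    """Load and resistance factor design check:
    phi * R_n >= sum(gamma_i * Q_i)."""
    factored_demand = sum(gamma * q for gamma, q in loads_and_factors)
    return phi * nominal_resistance >= factored_demand

# Hypothetical factored loads: (gamma, Q) for pressure, deadweight,
# and seismic demand, with an assumed resistance factor phi = 0.9.
ok = lrfd_ok(100.0, 0.9, [(1.2, 30.0), (1.4, 20.0), (1.0, 25.0)])
```

The calibration work described in the paper amounts to choosing phi and the gamma_i so that this inequality yields a consistent target reliability across components, rather than the variable margins implicit in deterministic design.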

  17. A Reliable Methodology for Determining Seed Viability by Using Hyperspectral Data from Two Sides of Wheat Seeds.

    Science.gov (United States)

    Zhang, Tingting; Wei, Wensong; Zhao, Bin; Wang, Ranran; Li, Mingliu; Yang, Liming; Wang, Jianhua; Sun, Qun

    2018-03-08

This study investigated the possibility of using visible and near-infrared (VIS/NIR) hyperspectral imaging techniques to discriminate viable and non-viable wheat seeds. Both sides of individual seeds were subjected to hyperspectral imaging (400-1000 nm) to acquire reflectance spectral data. Four spectral datasets, including the ventral-groove side, reverse side, mean (the mean of the two sides' spectra of each seed), and mixture (both sides' spectra of each seed) datasets, were used to construct the models. Classification models, partial least squares discriminant analysis (PLS-DA) and support vector machines (SVM), coupled with several pre-processing methods and the successive projections algorithm (SPA), were built for the identification of viable and non-viable seeds. Our results showed that the standard normal variate (SNV)-SPA-PLS-DA model had high classification accuracy for whole seeds (>85.2%) and for viable seeds (>89.5%) in the prediction set based on the mixed spectral dataset, using only 16 wavebands. After screening with this model, the final germination of the seed lot could be higher than 89.5%. Here, we develop a reliable methodology for predicting the viability of wheat seeds, showing that VIS/NIR hyperspectral imaging is an accurate technique for the classification of viable and non-viable wheat seeds in a non-destructive manner.
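The SNV pre-processing step named above is simple to state; a minimal sketch of the standard SNV definition, with made-up reflectance values:

```python
def snv(spectrum):
    """Standard normal variate: center each spectrum on its own mean
    and scale by its own standard deviation, removing multiplicative
    scatter differences between individual seeds."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    var = sum((x - mean) ** 2 for x in spectrum) / (n - 1)
    sd = var ** 0.5
    return [(x - mean) / sd for x in spectrum]

# Illustrative reflectance values for four wavebands of one seed.
corrected = snv([0.42, 0.45, 0.50, 0.48])
```

After SNV, every spectrum has zero mean and unit variance, so the classifier sees shape differences rather than overall brightness differences.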

  18. Influence of organizational factors on performance reliability

    International Nuclear Information System (INIS)

    Haber, S.B.; O'Brien, J.N.; Metlay, D.S.; Crouch, D.A.

    1991-12-01

This is the first volume of a two-volume report. Volume 2 will be published at a later date. This report presents the results of a research project conducted by Brookhaven National Laboratory for the United States Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. The purpose of the project was to develop a general methodology to be used in the assessment of the organizational factors which affect performance reliability (safety) in a nuclear power plant. The research described in this report includes the development of the Nuclear Organization and Management Analysis Concept (GNOMIC). This concept characterizes the organizational factors that impact safety performance in a nuclear power plant and identifies some methods for systematically measuring and analyzing the influence of these factors on safety performance. This report is divided into two parts: Part 1 presents an overview of the development of the methodology, while Part 2 provides more details and a technical analysis of the methodological development. Specifically, the results of two demonstration studies, the feasibility of the methodology, and the specific applications for which the methodology was developed are presented

  19. INPRO Methodology for Sustainability Assessment of Nuclear Energy Systems: Environmental Impact of Stressors. INPRO Manual

    International Nuclear Information System (INIS)

    2016-01-01

This publication provides guidance on assessing the sustainability of a nuclear energy system (NES) in the area of environmental impact of stressors. The INPRO methodology is a comprehensive tool for the assessment of the sustainability of an NES. Basic principles, user requirements and criteria have been defined in the different areas of the INPRO methodology. These include economics, infrastructure, waste management, proliferation resistance, environmental impact of stressors, environmental impact from depletion of resources, and safety of nuclear reactors and fuel cycle facilities. The ultimate goal of the application of the INPRO methodology is to check whether the assessed NES fulfils all the criteria, and hence the user requirements and basic principles, and therefore presents a system for a Member State that is sustainable in the long term

  20. Estimating the Parameters of Software Reliability Growth Models Using the Grey Wolf Optimization Algorithm

    OpenAIRE

    Alaa F. Sheta; Amal Abdel-Raouf

    2016-01-01

In this age of technology, building quality software is essential to competing in the business market. One of the major attributes required of any quality business software product is reliability. Estimating software reliability early in the software development life cycle saves time and money, as it prevents spending larger sums fixing a defective software product after deployment. The Software Reliability Growth Model (SRGM) can be used to predict the number of...
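A minimal sketch of the approach: fitting the Goel-Okumoto SRGM mean value function with a bare-bones grey wolf optimizer (this is a generic GWO reimplementation on synthetic data, not the authors' code or datasets):

```python
import math
import random

def go_mean(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative failures
    by time t, with total expected failures a and detection rate b."""
    return a * (1.0 - math.exp(-b * t))

def sse(params, data):
    """Sum of squared errors between the model and observed counts."""
    a, b = params
    return sum((go_mean(t, a, b) - m) ** 2 for t, m in data)

def grey_wolf(objective, bounds, wolves=12, iters=200, seed=1):
    """Minimal grey wolf optimizer: the pack moves toward the three
    current best wolves (alpha, beta, delta); `a` decays from 2 to 0."""
    rng = random.Random(seed)
    dim = len(bounds)
    pack = [[rng.uniform(lo, hi) for lo, hi in bounds]
            for _ in range(wolves)]
    for it in range(iters):
        pack.sort(key=objective)
        alpha, beta, delta = pack[0], pack[1], pack[2]
        a = 2.0 * (1.0 - it / iters)
        for i in range(3, wolves):
            new = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = a * (2.0 * rng.random() - 1.0)
                    C = 2.0 * rng.random()
                    x += leader[d] - A * abs(C * leader[d] - pack[i][d])
                lo, hi = bounds[d]
                new.append(min(max(x / 3.0, lo), hi))
            pack[i] = new
    return min(pack, key=objective)

# Synthetic failure data generated from a = 120, b = 0.05.
data = [(t, go_mean(t, 120.0, 0.05)) for t in range(0, 60, 5)]
best = grey_wolf(lambda p: sse(p, data), [(1.0, 500.0), (0.001, 1.0)])
```

The same skeleton works for other SRGMs by swapping the mean value function; the optimizer only sees the SSE objective.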

  1. RELIABILITY MODELING BASED ON INCOMPLETE DATA: OIL PUMP APPLICATION

    Directory of Open Access Journals (Sweden)

    Ahmed HAFAIFA

    2014-07-01

Full Text Available Reliability analysis for industrial maintenance is now increasingly demanded by industry worldwide. Indeed, modern manufacturing facilities are equipped with data acquisition and monitoring systems that generate large volumes of data, which can be used to inform future decisions affecting the health and state of the exploited equipment. However, in most practical cases the data used in reliability modelling are incomplete or unreliable. In this context, this work examines and treats incomplete, incorrect or aberrant data in the reliability modeling of an oil pump. The objective of this paper is to propose a suitable methodology for replacing the incomplete data using a regression method.
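The regression-based replacement of incomplete records can be sketched with ordinary least squares (hypothetical pump records; the paper's actual regression method may differ):

```python
def fit_line(xs, ys):
    """Ordinary least squares fit y = a + b*x on complete records."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def impute(records):
    """Replace missing (None) readings with regression estimates from
    the complete records; `records` is a list of (hours, reading)."""
    complete = [(h, r) for h, r in records if r is not None]
    a, b = fit_line([h for h, _ in complete], [r for _, r in complete])
    return [(h, r if r is not None else a + b * h) for h, r in records]

# Hypothetical maintenance records with one missing reading.
data = [(100, 2.0), (200, 4.0), (300, None), (400, 8.0)]
filled = impute(data)  # (300, None) becomes approximately (300, 6.0)
```

Once the gaps are filled, the completed series can feed a standard reliability model (e.g. a Weibull fit) without discarding records.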

  2. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  3. Similarity principles for equipment qualification by experience

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1988-07-01

A methodology is developed for the seismic qualification of nuclear plant equipment by applying similarity principles to existing experience data. Experience data are available from previous qualifications by analysis or testing, or from actual earthquake events. Similarity principles are defined in terms of excitation, equipment physical characteristics, and equipment response. Physical similarity is further defined in terms of a critical transfer function for response at a location on a primary structure, whose response can be assumed to be directly related to the ultimate fragility of the item under elevated levels of excitation. Procedures are developed for combining experience data into composite specifications for the qualification of equipment that can be shown to be physically similar to the reference equipment. Other procedures are developed for extending qualifications beyond the original specifications under certain conditions. Examples of the application of the procedures, and verifications of them, are given for certain cases that can be approximated by a simple two-degree-of-freedom primary/secondary system. Other examples are based on actual test data available from previous qualifications. Relationships of these developments to other previously published methods are discussed. The developments are intended to elaborate on the rather broad revised guidelines developed by the IEEE 344 Standards Committee for equipment qualification in new nuclear plants. However, the results also contribute to filling a gap that exists between the IEEE 344 methodology and that previously developed by the Seismic Qualification Utilities Group. The relationship of the results to safety margin methodology is also discussed. (author)

  4. Adherence to Principles of Medical Ethics Among Physicians in Mazandaran Province, Iran.

    Science.gov (United States)

    Ghaderi, Ahmad; Malek, Farhad; Mohammadi, Mohammad; Rostami Maskopaii, Somayeh; Hamta, Amir; Madani, Seyyed Abdollah

    2018-01-01

Considering that medical ethics is an applied subject providing systematic solutions to help physicians with moral issues, this research aimed to evaluate adherence to the principles of medical ethics among physicians, on the basis of the attitudes of physicians of Mazandaran province. This cross-sectional study was conducted in Mazandaran province, Iran, during 2015. A researcher-made questionnaire was used for data collection. The questionnaire was first completed by 40 physicians, and its reliability was confirmed by a Cronbach's alpha coefficient of 0.818. Its validity was confirmed by medical ethics experts. Therefore, the questionnaire was reliable and valid. Analytical and descriptive analyses were performed. According to our findings, there is a significant correlation between some variables of medical ethics principles. The results show that adherence to the indicators of beneficence, non-maleficence and justice has been fairly good; however, physicians' ethical behaviors pertaining to the principle of autonomy have not been acceptable. There was no significant difference in adherence to the principles of autonomy, beneficence, non-maleficence and justice on the basis of sex, residency, education or occupation. According to the present study, more training is required to improve physicians' adherence to the principles of medical ethics. © 2018 The Author(s). This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  5. Design for reliability of solid state lighting systems

    NARCIS (Netherlands)

    Perpiñà, X.; Werkhoven, R.J.; Jakovenko, J.; Formánek, J.; Vellvehi, M.; Jordà, X.; Kunen, J.M.G.; Bancken, P.; Bolt, P.J.

    2012-01-01

    This work presents a methodology to design an SSL system for reliability. An LED lamp is thermally characterised and its model thermally simulated, indicating that the LED board (FR4 board with thermal vias, copper tracks and LED package) is the thermally most stressed part. Therefore, a

  6. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Antônio Dâmaso

    2017-11-01

    Full Text Available Power consumption is a primary interest in Wireless Sensor Networks (WSNs, and a large number of strategies have been proposed to evaluate it. However, those approaches usually neither consider reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack also considering their reliabilities. To solve this problem, we introduce a fully automatic solution to design power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way.

  7. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    Science.gov (United States)

    Dâmaso, Antônio; Maciel, Paulo

    2017-01-01

    Power consumption is a primary interest in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually neither consider reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack also considering their reliabilities. To solve this problem, we introduce a fully automatic solution to design power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way. PMID:29113078

  8. Glances at renewable and sustainable energy principles, approaches and methodologies for an ambiguous benchmark

    CERN Document Server

    Jenssen, Till

    2013-01-01

    Offering a thorough review of the principles of sustainability assessment, this book explores multi-criteria decision analysis, ecological footprint analysis and normative-functional concepts via case studies in developed, emerging and developing countries.

  9. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10⁶) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate to each galaxy redshift measurement a quality, or reliability, estimate. Aim. In this work, we introduce a new approach to automate the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for
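
    One way to picture the feature-extraction step is to reduce each redshift posterior PDF to a few descriptors (mean, dispersion, entropy, peak strength) that a clustering algorithm can then partition. The sketch below uses a toy discrete posterior and invented feature choices, not the paper's actual feature set:

```python
# Hedged sketch: descriptive features of a redshift posterior PDF of the kind
# that could feed an unsupervised clustering step. Toy discrete posterior.
import math

def pdf_features(z_grid, pdf):
    total = sum(pdf)
    p = [v / total for v in pdf]                    # normalise to a probability vector
    mean = sum(z * pi for z, pi in zip(z_grid, p))
    var = sum((z - mean) ** 2 * pi for z, pi in zip(z_grid, p))
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0)
    peak = max(p)                                   # strength of the dominant mode
    return {"mean_z": mean, "sigma_z": math.sqrt(var),
            "entropy": entropy, "peak": peak}

z = [0.50, 0.51, 0.52, 0.53, 0.54]
posterior = [0.05, 0.15, 0.60, 0.15, 0.05]          # strongly unimodal PDF
feats = pdf_features(z, posterior)
```

    A narrow, strongly unimodal posterior (low entropy, high peak) would land in a "reliable" cluster; a broad or multimodal one would not.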

  10. Production and Reliability Oriented SOFC Cell and Stack Design

    DEFF Research Database (Denmark)

    Hauth, Martin; Lawlor, Vincent; Cartellieri, Peter

    2017-01-01

    The paper presents an innovative development methodology for a production- and reliability-oriented SOFC cell and stack design aiming at improving the stack's robustness, manufacturability, efficiency and cost. Multi-physics models allowed a probabilistic approach to consider statistical variations...... in production, material and operating parameters for the optimization phase. A methodology for 3D description of the spatial distribution of material properties based on random field models was developed and validated by experiments. Homogenized material models on multiple levels of the SOFC stack were...... and output parameters and to perform a sensitivity analysis were developed and implemented. The capabilities of the methodology are illustrated on two practical cases....

  11. THERMODYNAMIC MODELING AND FIRST-PRINCIPLES CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Turchi, P; Abrikosov, I; Burton, B; Fries, S; Grimvall, G; Kaufman, L; Korzhavyi, P; Manga, R; Ohno, M; Pisch, A; Scott, A; Zhang, W

    2005-12-15

    The increased application of quantum mechanical-based methodologies to the study of alloy stability has required a re-assessment of the field. The focus is mainly on inorganic materials in the solid state. In a first part, after a brief overview of the so-called ab initio methods with their approximations, constraints, and limitations, recommendations are made for a good usage of first-principles codes with a set of qualifiers. Examples are given to illustrate the power and the limitations of ab initio codes. However, despite the "success" of these methodologies, thermodynamics of complex multi-component alloys, as used in engineering applications, requires a more versatile approach presently afforded within CALPHAD. Hence, in a second part, the links that presently exist between ab initio methodologies, experiments, and the CALPHAD approach are examined with illustrations. Finally, the issues of dynamical instability and of the role of lattice vibrations that still constitute the subject of ample discussions within the CALPHAD community are revisited in the light of the current knowledge with a set of recommendations.

  12. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  13. Strategy for continuous improvement in IC manufacturability, yield, and reliability

    Science.gov (United States)

    Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary

    1993-01-01

    Continual improvements in yield, reliability and manufacturability are the measure of a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability, failed bit map analysis, analytical tools, inline monitoring, cross functional teams and a defect engineering group. The strategy requires the fastest possible detection, identification and implementation of corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. The payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.

  14. Methodology for local verification of flow regimes in fuel assemblies charts

    International Nuclear Information System (INIS)

    Igor, Sharaevsky; Elena, Sharaevskaya; Domashev, E.D.; Alexander, Arkhypov; Vladimir, Kolochko

    2003-01-01

    The best-estimate thermal hydraulic codes describe two-phase flows in nuclear energy facilities adequately only if a proper system of closure relations is available. Such a system can be obtained from reliable information on the structural forms of two-phase flows, their boundaries, and reliable regime charts. In the paper, a methodology for automatic recognition of the boundaries of the main types of two-phase flows in rod fuel assemblies is presented. The methodology is based on the distribution of thermal hydraulic parameters in an experimental fuel assembly. The measurements were carried out using ASD signals of acoustic noise. In the paper, data on recognition of two-phase flow regime boundaries, especially the lower boundary of bubble flow, are summarized for an experimental fuel assembly. The methodology of flow regime charts applied to recognition of the upper boundary of the boiling crisis regime was verified. Satisfactory agreement with experimental results has been shown. (author)

  15. Approach to assurance of reliability of linear accelerator operation observations

    International Nuclear Information System (INIS)

    Bakov, S.M.; Borovikov, A.A.; Kavkun, S.L.

    1994-01-01

    A systems approach to assuring the reliability of observations of linear accelerator operation is proposed. The basic principles of the method are the use of dependences between facility parameters and a decrease in the number of data acquisition channels in the system without replacement of a failed channel by a reserve one. The signal commutation unit, whose introduction into the data acquisition system substantially increases the reliability of the measurement system on account of an active reserve, is considered in detail. 8 refs., 6 figs
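
    The reliability gain from the active reserve provided by a signal commutation unit can be illustrated with a hot-spare pair. A minimal sketch with an invented channel reliability, assuming for simplicity a perfect switch:

```python
# Hedged sketch: reliability of one measurement channel versus a channel
# backed by a hot spare behind a commutation switch. Figures are invented.

def single_channel(r):
    return r

def hot_spare_pair(r, switch_r=1.0):
    """Parallel pair behind a commutation switch (perfect switch by default)."""
    return switch_r * (1.0 - (1.0 - r) ** 2)

r = 0.95
r_plain = single_channel(r)      # one channel, no reserve
r_reserved = hot_spare_pair(r)   # pair with active reserve
```

    With a 0.95 channel, the pair reaches 0.9975; a realistically imperfect switch (switch_r < 1) eats into that margin, which is why the commutation unit itself matters.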

  16. Principles and indicators of green living families in Thai context

    Directory of Open Access Journals (Sweden)

    Tamkarn Yuranun

    2016-01-01

    Full Text Available Green Living has been practiced in everyday life and is accepted worldwide. However, there are no concrete academic principles for Green Living, and the understanding of Green Living is rather abstract. This study focuses on the academic principles and indicators of Green Living Families in Thailand; the results will be used for further research. This qualitative study aims at proposing the principles and indicators of Green Living Families in the Thai context. The research methodologies include the analysis and synthesis of various documents both from Thailand and foreign countries, and interviews and observation of five Green Living Families. The results show that the principles consist of (1) production for one’s own consumption within the family, (2) economical use of resources, and (3) sharing with others and the society. The essential indicators comprise (1) Knowledge, (2) Practice, and (3) Attitude regarding Green Living.

  17. Bioscience methodologies in physical chemistry an engineering and molecular approach

    CERN Document Server

    D'Amore, Alberto

    2013-01-01

    The field of bioscience methodologies in physical chemistry stands at the intersection of the power and generality of classical and quantum physics with the minute molecular complexity of chemistry and biology. This book provides an application of physical principles in explaining and rationalizing chemical and biological phenomena. It does not stick to the classical topics that are conventionally considered as part of physical chemistry; instead it presents principles deciphered from a modern point of view, which is the strength of this book.

  18. Accelerated reliability demonstration under competing failure modes

    International Nuclear Information System (INIS)

    Luo, Wei; Zhang, Chun-hua; Chen, Xun; Tan, Yuan-yuan

    2015-01-01

    The conventional reliability demonstration tests are difficult to apply to products with competing failure modes due to the complexity of the lifetime models. This paper develops a testing methodology based on the reliability target allocation for reliability demonstration under competing failure modes at accelerated conditions. The specified reliability at mission time and the risk caused by sampling of the reliability target for products are allocated for each failure mode. The risk caused by degradation measurement fitting of the target for a product involving performance degradation is equally allocated to each degradation failure mode. According to the allocated targets, the accelerated life reliability demonstration test (ALRDT) plans for the failure modes are designed. The accelerated degradation reliability demonstration test plans and the associated ALRDT plans for the degradation failure modes are also designed. Next, the test plan and the decision rules for the products are designed. Additionally, the effects of the discreteness of sample size and accepted number of failures for failure modes on the actual risks caused by sampling for the products are investigated. - Highlights: • Accelerated reliability demonstration under competing failure modes is studied. • The method is based on the reliability target allocation involving the risks. • The test plan for the products is based on the plans for all the failure modes. • Both failure mode and degradation failure modes are considered. • The error of actual risks caused by sampling for the products is small enough
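
    The allocation idea above can be sketched with two standard formulas: equal allocation of a series-system reliability target across k independent failure modes, and the zero-failure (success-run) sample size for demonstrating an allocated target at a given confidence. The numbers are invented; the paper's plans additionally handle degradation modes and the sampling risks explicitly:

```python
# Hedged sketch of reliability target allocation plus a zero-failure
# demonstration plan per failure mode. Targets and confidence are invented.
import math

def allocate_equal(r_system, k):
    """Equal allocation over k independent series failure modes: R_i = R_sys**(1/k)."""
    return r_system ** (1.0 / k)

def zero_failure_sample_size(r_target, confidence):
    """Smallest n with r_target**n <= 1 - confidence (success-run theorem)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(r_target))

r_mode = allocate_equal(0.90, 3)             # per-mode target for R_sys = 0.90
n = zero_failure_sample_size(r_mode, 0.80)   # units tested failure-free, 80% confidence
```

    The discreteness noted in the abstract shows up here: n must be rounded up, so the achieved risk per mode is slightly below the allocated one, and the product-level risk drifts from its nominal value.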

  19. Reliability Models Applied to a System of Power Converters in Particle Accelerators

    OpenAIRE

    Siemaszko, D; Speiser, M; Pittet, S

    2012-01-01

    Several reliability models are studied when applied to a power system containing a large number of power converters. A methodology is proposed and illustrated in the case study of a novel linear particle accelerator designed for reaching high energies. The proposed methods result in the prediction of both reliability and availability of the considered system for optimisation purposes.
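
    A minimal sketch of the kind of prediction involved: steady-state availability of one converter from MTBF/MTTR, and a k-out-of-n model for a converter population where the accelerator tolerates a few units being down. All figures are invented, not taken from the study:

```python
# Hedged sketch: availability of a redundant power-converter population.
# MTBF, MTTR and the 18-out-of-20 requirement are invented placeholders.
import math

def availability(mtbf_h, mttr_h):
    """Steady-state availability from mean time between failures and to repair."""
    return mtbf_h / (mtbf_h + mttr_h)

def k_of_n(p, k, n):
    """Probability that at least k of n identical, independent units are up."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

a_unit = availability(50_000, 24)       # one converter
a_system = k_of_n(a_unit, 18, 20)       # system works if >= 18 of 20 converters are up
```

    The redundancy margin makes the system availability markedly better than that of a single converter, which is the kind of optimisation trade-off the paper's models support.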

  20. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

    Full Text Available The article considers the methodological foundations of system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on the axiomatics of the cultural-historical psychology of L.S. Vygotsky and transspective analysis as a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in the constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has proved that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept “dynamics of the paradigm of science” is introduced; it allows transitions to be acknowledged from the ordinary-binary logic characteristic of classical science to a binary-ternary logic, adequate to non-classical science, and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in SAP construction. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of the generative effect of selective interaction. The concept of “complementary interaction”, applied in natural as well as humanitarian sciences, is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe, embracing all kinds of systems including living ones. Different levels of matter organization, representing semantic structures of various complexity, use one and the same principle of meaning making through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for the nature and stages of emergence of the multidimensional life space of an individual, which comes as a foundation for generation of such features of

  1. Assessment of the Reliability of Concrete Slab Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Jensen, F. M.; Middleton, C. R.

    This paper is based on research performed for the Highways Agency, London, UK under the project DPU/9/44 "Revision of Bridge Assessment Rules Based on Whole Life Performance: Concrete Bridges". It contains details of a methodology which can be used to generate Whole Life (WL) reliability profiles....

  2. Assessing Financial Education Methods: Principles vs. Rules-of-Thumb Approaches

    Science.gov (United States)

    Skimmyhorn, William L.; Davies, Evan R.; Mun, David; Mitchell, Brian

    2016-01-01

    Despite thousands of programs and tremendous public and private interest in improving financial decision-making, little is known about how best to teach financial education. Using an experimental approach, the authors estimated the effects of two different education methodologies (principles-based and rules-of-thumb) on the knowledge,…

  3. Reliability and precision of pellet-group counts for estimating landscape-level deer density

    Science.gov (United States)

    David S. deCalesta

    2013-01-01

    This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...

  4. Integrated resource planning-concepts and principles

    Energy Technology Data Exchange (ETDEWEB)

    Atkinson, S.

    1994-12-31

    The concepts and principles of integrated resource planning (IRP) are outlined. The following topics are discussed: utility opportunities and methodologies, application considerations, ambitious energy-efficient programs, the future of IRP, three methods to study resource alternatives, the load adjustment method, simultaneous optimization, static analysis, utility profile data, load forecasts and shapes, load data, conversion, variable costs, external analysis, internal analysis, DSM objectives, supply-side prescreening, DSM screening analysis, DSM evaluation, the IRP process, risk analysis, collaborative planning process, and load shape objectives.

  5. Reliability modeling of digital RPS with consideration of undetected software faults

    Energy Technology Data Exchange (ETDEWEB)

    Khalaquzzaman, M.; Lee, Seung Jun; Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Man Cheol [Chung Ang Univ., Seoul (Korea, Republic of)

    2013-10-15

    This paper provides an overview of different software reliability methodologies and proposes a technique for estimating the reliability of an RPS with consideration of undetected software faults. Reliability analysis of safety-critical software has remained challenging: despite the huge effort spent on developing a large number of software reliability models, no consensus has yet been reached on an appropriate modeling methodology. However, it is realized that the combined application of a BBN-based SDLC fault prediction method and random black-box testing of the software would provide a better ground for reliability estimation of safety-critical software. Digitalization of the reactor protection systems of nuclear power plants began several decades ago, and full digitalization has now been adopted in the new generation of NPPs around the world because digital I and C systems have many better technical features, such as easier configurability and maintainability, than analog I and C systems. Digital I and C systems are also drift-free, and incorporation of new features is much easier. Rules and regulations for safe operation of NPPs are established and practiced by the operators as well as the regulators of NPPs to ensure safety. The failure mechanisms of hardware and analog systems are well understood, and the risk analysis methods for these components and systems are well established. However, digitalization of I and C systems in NPPs introduces difficulty and uncertainty into the reliability analysis of the digital systems/components because software failure mechanisms are still unclear.
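
    The random black-box testing ingredient can be sketched as a Beta-Binomial update of the per-demand failure probability. Here the prior is a plain uniform Beta(1,1); in the proposed technique the prior would instead come from the BBN-based SDLC fault prediction:

```python
# Hedged sketch: Bayesian (Beta-Binomial) estimate of per-demand software
# failure probability from random black-box test outcomes. The uniform
# Beta(1,1) prior is a placeholder for a BBN-derived prior; test counts
# are invented.

def posterior_mean_failure_prob(tests, failures, a=1.0, b=1.0):
    """Mean of the Beta(a + failures, b + tests - failures) posterior."""
    return (a + failures) / (a + b + tests)

p_hat = posterior_mean_failure_prob(tests=5000, failures=0)
```

    Even with zero observed failures the estimate does not collapse to zero, which reflects the residual risk of undetected faults that the paper emphasizes.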

  6. Methodological principles to study formation and development of floristic law in Ukraine

    Directory of Open Access Journals (Sweden)

    А. К. Соколова

    2014-06-01

    Full Text Available The paper investigates the problems associated with determining the methods to study the establishment of floristic law in Ukraine. It investigates the types of methods and establishes their interrelation and functional value. In addition, it analyzes the system of methodological reasons for the development of ecological and floristic law and gives additional ones.

  7. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis in modern conditions is proved, and the notion “situational analysis” is defined. We have concluded that situational analysis is a continuous system study whose purpose is to identify signs of dangerous situations, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated targeted actions used to eliminate the adverse effects of the exposure of the system to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions in the process of situational analysis is proved. The basic methodological elements of situational analysis are grounded. The substantiation of these principal methodological elements will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic and in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by a particular period.

  8. Reliability-based assessment of polyethylene pipe creep lifetime

    International Nuclear Information System (INIS)

    Khelif, Rabia; Chateauneuf, Alaa; Chaoui, Kamel

    2007-01-01

    Lifetime management of underground pipelines is mandatory for safe hydrocarbon transmission and distribution systems. The use of high-density polyethylene tubes subjected to internal pressure, external loading and environmental variations requires a reliability study in order to define the service limits and the optimal operating conditions. In service, the time-dependent phenomena, especially creep, take place during the pipe lifetime, leading to significant strength reduction. In this work, the reliability-based assessment of pipe lifetime models is carried out, in order to propose a probabilistic methodology for lifetime model selection and to determine the pipe safety levels as well as the most important parameters for pipeline reliability. This study is enhanced by parametric analysis on pipe configuration, gas pressure and operating temperature
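
    The reliability-based assessment described above can be pictured as a Monte Carlo evaluation of a limit state g = creep strength − hoop stress. The distributions and the linear strength-decay model below are invented placeholders, not the paper's calibrated lifetime models:

```python
# Hedged sketch: Monte Carlo failure probability of a pressurised PE pipe
# under creep-driven strength loss. All distributions are invented.
import random

def hoop_stress(pressure_MPa, diameter_mm, thickness_mm):
    """Thin-wall hoop stress: p * (D - t) / (2 t)."""
    return pressure_MPa * (diameter_mm - thickness_mm) / (2.0 * thickness_mm)

def failure_probability(years, n=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        strength0 = rng.gauss(22.0, 1.5)      # MPa, initial creep strength (invented)
        decay = rng.gauss(0.008, 0.002)       # fractional strength loss per year (invented)
        strength = strength0 * max(0.0, 1.0 - decay * years)
        stress = hoop_stress(rng.gauss(0.4, 0.02), 110.0, 10.0)
        if strength <= stress:                # limit state g = strength - stress <= 0
            failures += 1
    return failures / n

pf_50y = failure_probability(50)
```

    Running the same model at increasing service times traces exactly the kind of lifetime profile the paper uses to select among candidate lifetime models.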

  9. Reliability-based assessment of polyethylene pipe creep lifetime

    Energy Technology Data Exchange (ETDEWEB)

    Khelif, Rabia [LaMI-UBP and IFMA, Campus de Clermont-Fd, Les Cezeaux, BP 265, 63175 Aubiere Cedex (France); LR3MI, Departement de Genie Mecanique, Universite Badji Mokhtar, BP 12, Annaba 23000 (Algeria)], E-mail: rabia.khelif@ifma.fr; Chateauneuf, Alaa [LGC-University Blaise Pascal, Campus des Cezeaux, BP 206, 63174 Aubiere Cedex (France)], E-mail: alaa.chateauneuf@polytech.univ-bpclermont.fr; Chaoui, Kamel [LR3MI, Departement de Genie Mecanique, Universite Badji Mokhtar, BP 12, Annaba 23000 (Algeria)], E-mail: chaoui@univ-annaba.org

    2007-12-15

    Lifetime management of underground pipelines is mandatory for safe hydrocarbon transmission and distribution systems. The use of high-density polyethylene tubes subjected to internal pressure, external loading and environmental variations requires a reliability study in order to define the service limits and the optimal operating conditions. In service, the time-dependent phenomena, especially creep, take place during the pipe lifetime, leading to significant strength reduction. In this work, the reliability-based assessment of pipe lifetime models is carried out, in order to propose a probabilistic methodology for lifetime model selection and to determine the pipe safety levels as well as the most important parameters for pipeline reliability. This study is enhanced by parametric analysis on pipe configuration, gas pressure and operating temperature.

  10. A Single Conversation with a Wise Man Is Better than Ten Years of Study: A Model for Testing Methodologies for Pedagogy or Andragogy

    Science.gov (United States)

    Taylor, Bryan; Kroth, Michael

    2009-01-01

    This article creates the Teaching Methodology Instrument (TMI) to help determine the level of adult learning principles being used by a particular teaching methodology in a classroom. The instrument incorporates the principles and assumptions set forth by Malcolm Knowles of what makes a good adult learning environment. The Socratic method as used…

  11. A Mechanistic Reliability Assessment of RVACS and Metal Fuel Inherent Reactivity Feedbacks

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano; Grelle, Austin

    2017-09-24

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two-year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the FMEAs, the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies to perform the analysis of uncertainty propagation and the determination of the likelihood of violating FDC limits. Additionally, important lessons learned are also reviewed, such as optimal sampling methodologies for the discovery of low likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.
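
    The RMPS-style step "propagate uncertain parameters, then score against a limit" can be sketched with a toy response surface standing in for a mechanistic RVACS model; the coefficients, parameter distributions and the temperature limit below are all invented. Near-zero estimates from plain sampling are exactly why the paper discusses optimal sampling methodologies for low-likelihood failure events:

```python
# Hedged sketch: Monte Carlo propagation of uncertain parameters through a
# toy temperature response surface, scored against an invented limit.
import random

def peak_temp_C(decay_power_MW, emissivity, air_inlet_C):
    # invented linear response surface standing in for a mechanistic model
    return 420.0 + 55.0 * decay_power_MW - 180.0 * emissivity + 0.8 * air_inlet_C

def violation_probability(limit_C=650.0, n=50_000, seed=7):
    rng = random.Random(seed)
    hits = sum(
        peak_temp_C(rng.gauss(2.0, 0.15),     # decay power (invented distribution)
                    rng.uniform(0.55, 0.85),  # surface emissivity (invented)
                    rng.gauss(30.0, 8.0))     # air inlet temperature (invented)
        > limit_C
        for _ in range(n)
    )
    return hits / n

p_violation = violation_probability()
```

    When the violation probability is this far below the sampling resolution, importance sampling or similar rare-event techniques are needed to quantify it rather than merely bound it.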

  12. MBA theory and application of business and management principles

    CERN Document Server

    Davim, J

    2016-01-01

    This book focuses on the relevant subjects in the curriculum of an MBA program. Covering many different fields within business, this book is ideal for readers who want to prepare for a Master of Business Administration degree. It provides discussions and exchanges of information on principles, strategies, models, techniques, methodologies and applications in the business area.

  13. Survey of Bayesian belief nets for quantitative reliability assessment of safety critical software used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Sung, T.Y.; Jeong, H.S.; Park, J.H.; Kang, H.G.; Lee, K

    2001-03-01

    As part of the Probabilistic Safety Assessment research on safety-grade digital systems used in nuclear power plants, measures and methodologies applicable to quantitative reliability assessment of safety-critical software were surveyed. Among the techniques proposed in the literature, we selected those which are widely used and investigated their limitations in quantitative software reliability assessment. One promising methodology from the survey is Bayesian Belief Nets (BBN), which provide a formalism that can combine various disparate pieces of evidence relevant to reliability into a final decision under uncertainty. Thus we analyzed BBN and its application cases in the digital systems assessment area and finally studied the possibility of its application to the quantitative reliability assessment of safety-critical software.
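
    A toy version of the BBN idea: two disparate pieces of evidence (development process quality and a test verdict) combined through conditional probability tables into a posterior belief that the software meets its reliability goal. All probabilities are invented for illustration, and the two evidences are assumed conditionally independent:

```python
# Hedged sketch: a two-evidence belief update of the kind a BBN formalises.
# Prior and conditional probability tables are invented.

P_reliable = 0.7  # prior belief that the software is highly reliable

# P(evidence | hidden reliability state)
P_good_process_given = {True: 0.9, False: 0.4}
P_tests_pass_given   = {True: 0.95, False: 0.6}

def posterior(good_process, tests_pass):
    """Bayes update assuming the evidences are conditionally independent."""
    def likelihood(state):
        p1 = P_good_process_given[state] if good_process else 1 - P_good_process_given[state]
        p2 = P_tests_pass_given[state] if tests_pass else 1 - P_tests_pass_given[state]
        return p1 * p2
    num = P_reliable * likelihood(True)
    return num / (num + (1 - P_reliable) * likelihood(False))

belief = posterior(good_process=True, tests_pass=True)
```

    A full BBN generalises this to many nodes with arbitrary dependence structure, which is what makes it attractive for fusing disparate reliability evidence.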

  14. Survey of Bayesian belief nets for quantitative reliability assessment of safety critical software used in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H. S.; Sung, T. Y.; Jeong, H. S.; Park, J. H.; Kang, H. G.; Lee, K.

    2001-03-01

    As part of the Probabilistic Safety Assessment research on safety-grade digital systems used in nuclear power plants, measures and methodologies applicable to quantitative reliability assessment of safety-critical software were surveyed. Among the techniques proposed in the literature, we selected those which are widely used and investigated their limitations in quantitative software reliability assessment. One promising methodology from the survey is Bayesian Belief Nets (BBN), which provide a formalism that can combine various disparate pieces of evidence relevant to reliability into a final decision under uncertainty. Thus we analyzed BBN and its application cases in the digital systems assessment area and finally studied the possibility of its application to the quantitative reliability assessment of safety-critical software.

  15. Pump performance and reliability follow-up by the French Safety Authorities

    International Nuclear Information System (INIS)

    Clausner, J.P.; De La Ronciere, X.; Scott de Martinville, E.; Courbiere, P.

    1990-12-01

    This paper will present, through actual examples, the methodology applied by the French Safety Authorities to evaluate the performance and reliability of safety-related pumps, and the lessons drawn from this evaluation.

  16. A methodology for characterization and categorization of solutions for micro handling

    DEFF Research Database (Denmark)

    Gegeckaite, Asta; Hansen, Hans Nørgaard

    2005-01-01

    This paper presents a methodology whereby solutions for micro handling are characterized and classified. The purpose of defining such a methodology is to identify different possible integrated solutions with respect to a specific micro handling scenario in a development phase. The typical accuracy is in the range of 0.1-10 micrometers. Considering the entire micro handling scenario is imperative if operational solutions are to be designed. The methodology takes into consideration component design (dimension, geometry, material, weight etc.), type of handling operation (characteristics, tolerances, speed, lot sizes etc.) and handling/gripping principles (contact, non-contact etc.). The methodology is applied to a case study in order to demonstrate the feasibility of the method.

  17. Factors Influencing the Reliability of the Glasgow Coma Scale: A Systematic Review.

    Science.gov (United States)

    Reith, Florence Cm; Synnot, Anneliese; van den Brande, Ruben; Gruen, Russell L; Maas, Andrew Ir

    2017-06-01

    The Glasgow Coma Scale (GCS) characterizes patients with diminished consciousness. In a recent systematic review, we found overall adequate reliability across different clinical settings, but reliability estimates varied considerably between studies, and methodological quality of studies was overall poor. Identifying and understanding factors that can affect its reliability is important, in order to promote high standards for clinical use of the GCS. The aim of this systematic review was to identify factors that influence reliability and to provide an evidence base for promoting consistent and reliable application of the GCS. A comprehensive literature search was undertaken in MEDLINE, EMBASE, and CINAHL from 1974 to July 2016. Studies assessing the reliability of the GCS in adults or describing any factor that influences reliability were included. Two reviewers independently screened citations, selected full texts, and undertook data extraction and critical appraisal. Methodological quality of studies was evaluated with the consensus-based standards for the selection of health measurement instruments checklist. Data were synthesized narratively and presented in tables. Forty-one studies were included for analysis. Factors identified that may influence reliability are education and training, the level of consciousness, and type of stimuli used. Conflicting results were found for experience of the observer, the pathology causing the reduced consciousness, and intubation/sedation. No clear influence was found for the professional background of observers. Reliability of the GCS is influenced by multiple factors and as such is context dependent. This review points to the potential for improvement from training and education and standardization of assessment methods, for which recommendations are presented. Copyright © 2017 by the Congress of Neurological Surgeons.
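
Inter-rater reliability of categorical scores such as GCS components is commonly summarized with chance-corrected agreement statistics. A minimal sketch of Cohen's kappa for two observers (a standard statistic, not a method specific to this review):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores (e.g. GCS motor scores)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters match
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: expected matches given each rater's marginal frequencies
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(count_a[c] * count_b[c] for c in categories) / n ** 2
    return (observed - expected) / (1 - expected)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance.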

  18. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality
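
The comparison step described above can be sketched with a generic risk-scoring pass over each method's procedural steps. The step scores below (severity, occurrence, detectability, 1 = best, 5 = worst) are invented placeholders, not the paper's HACCP worksheet; the ranking logic is an adjacent FMEA-style tool, not the paper's exact procedure.

```python
# Hypothetical per-step risk scores for the three DNA methods.
steps = {
    "radioactive dot-blot": [(4, 3, 3), (5, 2, 4)],
    "qPCR":                 [(3, 2, 2), (2, 2, 3)],
    "threshold analysis":   [(3, 3, 3), (3, 2, 4)],
}

def worst_rpn(step_scores):
    """Largest severity*occurrence*detectability product over a method's steps."""
    return max(s * o * d for s, o, d in step_scores)

def lowest_risk_method(steps):
    """Method whose worst step carries the smallest risk priority number."""
    return min(steps, key=lambda m: worst_rpn(steps[m]))
```

With these illustrative scores the ranking reproduces the paper's qualitative conclusion that qPCR carries the lowest performance risk.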

  19. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF VOCATIONAL TEACHER EDUCATION

    Directory of Open Access Journals (Sweden)

    E. M. Dorozhkin

    2014-01-01

    analysis of methodology, taking into consideration the target orientation, principles and approaches to the organization and implementation of its methods of scientific and educational activities. The formation of a qualification structure for teachers' vocational training and the provision of advanced principles of education are considered to be the most important conditions for the development of vocational teacher education. Scientific novelty. The research presents a project for the further development of vocational teacher education in post-industrial society. Pedagogical innovations transforming research findings into educational practice are considered to be the main tool for integrating methodological means. Practical significance. The research findings highlight the proposed reforms for the further development of the teacher training system in vocational institutes, which is in need of drastic restructuring. In the final part of the article the authors recommend some specific issues that can be discussed at the methodological workshop.

  20. Principles of parametric estimation in modeling language competition.

    Science.gov (United States)

    Zhang, Menghan; Gong, Tao

    2013-06-11

    It is generally difficult to define reasonable parameters and interpret their values in mathematical models of social phenomena. Rather than directly fitting abstract parameters against empirical data, we should define some concrete parameters to denote the sociocultural factors relevant for particular phenomena, and compute the values of these parameters based upon the corresponding empirical data. Taking the example of modeling studies of language competition, we propose a language diffusion principle and two language inheritance principles to compute two critical parameters, namely the impacts and inheritance rates of competing languages, in our language competition model derived from the Lotka-Volterra competition model in evolutionary biology. These principles assign explicit sociolinguistic meanings to those parameters and calculate their values from the relevant data of population censuses and language surveys. Using four examples of language competition, we illustrate that our language competition model with thus-estimated parameter values can reliably replicate and predict the dynamics of language competition, and it is especially useful in cases lacking direct competition data.
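
A minimal sketch of the kind of two-language Lotka-Volterra competition dynamics the abstract refers to, integrated with explicit Euler steps. This is a standard textbook form of the model with invented parameter values, not the authors' exact formulation or their census-estimated parameters.

```python
def simulate(x0, y0, rx, ry, ax, ay, k, dt=0.01, steps=10000):
    """Euler integration of dx/dt = rx*x*(1 - (x + ax*y)/k) and the
    symmetric equation for y.  rx, ry: growth rates; ax, ay: competitive
    impacts of the rival language; k: carrying capacity."""
    x, y = x0, y0
    for _ in range(steps):
        dx = rx * x * (1 - (x + ax * y) / k)
        dy = ry * y * (1 - (y + ay * x) / k)
        x, y = x + dt * dx, y + dt * dy
    return x, y
```

With an impact asymmetry (ax > 1 > ay) one language excludes the other, which is the qualitative dynamics the model is used to replicate.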

  1. Multi-Level Simulated Fault Injection for Data Dependent Reliability Analysis of RTL Circuit Descriptions

    Directory of Open Access Journals (Sweden)

    NIMARA, S.

    2016-02-01

    Full Text Available This paper proposes data-dependent reliability evaluation methodology for digital systems described at Register Transfer Level (RTL. It uses a hybrid hierarchical approach, combining the accuracy provided by Gate Level (GL Simulated Fault Injection (SFI and the low simulation overhead required by RTL fault injection. The methodology comprises the following steps: the correct simulation of the RTL system, according to a set of input vectors, hierarchical decomposition of the system into basic RTL blocks, logic synthesis of basic RTL blocks, data-dependent SFI for the GL netlists, and RTL SFI. The proposed methodology has been validated in terms of accuracy on a medium sized circuit – the parallel comparator used in Check Node Unit (CNU of the Low-Density Parity-Check (LDPC decoders. The methodology has been applied for the reliability analysis of a 128-bit Advanced Encryption Standard (AES crypto-core, for which the GL simulation was prohibitive in terms of required computational resources.
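
The data-dependent gate-level fault-injection step can be illustrated on a toy netlist: simulate the circuit fault-free and with an injected stuck-at fault over a set of input vectors, and count how often the fault is observable at the output. The 2-bit equality comparator below is an invented example, not the CNU comparator or AES core from the paper.

```python
# Toy gate-level netlist: (gate_type, output_net, input_nets).
NETLIST = [
    ("XNOR", "e0", ("a0", "b0")),
    ("XNOR", "e1", ("a1", "b1")),
    ("AND",  "eq", ("e0", "e1")),
]

GATES = {"XNOR": lambda x, y: int(x == y), "AND": lambda x, y: x & y}

def evaluate(netlist, inputs, stuck=None):
    """Simulate the netlist; `stuck` optionally forces one net to 0 or 1."""
    nets = dict(inputs)
    for gate, out, (i1, i2) in netlist:
        nets[out] = GATES[gate](nets[i1], nets[i2])
        if stuck and out == stuck[0]:
            nets[out] = stuck[1]          # inject the stuck-at fault
    return nets["eq"]

def failure_rate(netlist, vectors, stuck):
    """Fraction of input vectors on which the fault propagates to the output,
    i.e. the data-dependent observability of the fault."""
    errors = sum(evaluate(netlist, v) != evaluate(netlist, v, stuck) for v in vectors)
    return errors / len(vectors)
```

Because observability depends on the applied vectors, the same fault yields different failure rates for different workloads, which is the data dependence the methodology targets.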

  2. Evaluation of speech errors in Putonghua speakers with cleft palate: a critical review of methodology issues.

    Science.gov (United States)

    Jiang, Chenghui; Whitehill, Tara L

    2014-04-01

    Speech errors associated with cleft palate are well established for English and several other Indo-European languages. Few articles describing the speech of Putonghua (standard Mandarin Chinese) speakers with cleft palate have been published in English language journals. Although methodological guidelines have been published for the perceptual speech evaluation of individuals with cleft palate, there has been no critical review of methodological issues in studies of Putonghua speakers with cleft palate. A literature search was conducted to identify relevant studies published over the past 30 years in Chinese language journals. Only studies incorporating perceptual analysis of speech were included. Thirty-seven articles which met inclusion criteria were analyzed and coded on a number of methodological variables. Reliability was established by having all variables recoded for all studies. This critical review identified many methodological issues. These design flaws make it difficult to draw reliable conclusions about characteristic speech errors in this group of speakers. Specific recommendations are made to improve the reliability and validity of future studies, as well to facilitate cross-center comparisons.

  3. Principles of Forming the State Budget of Ukraine: Process and System Approach

    Directory of Open Access Journals (Sweden)

    Zakhozhay Kostyantyn V.

    2017-09-01

    Full Text Available The aim of the article is to consider the theoretical and methodological instruments of the State Budget of Ukraine and, in view of this, to provide a more extended characterization of the principles of the budget system, taking into account the role of the country’s main financial plan at five classical levels of economy. As a result of the research, the necessity of supplementing the legislatively approved principles of the State Budget of Ukraine with the newly introduced principles of economic security and social protection of the population was determined. In order to improve the theoretical and methodological instruments of the State Budget of Ukraine and the visibility of its impact on socio-economic processes under current conditions of the society development, as well as to determine its role in the socio-economic space, it is suggested to consider the role of the main financial plan at mega-, macro-, meso-, micro- and nano-levels. Further practical application of the introduced principles of forming the State Budget of Ukraine on the basis of the process and system approach will enable development of many sectors of the national economy; increase the flow of investment; promote political stability; reduce inflation and unemployment; increase production and exports; reduce the budget deficit and public debt; and affect the increase in the financial potential and gold reserves of the state.

  4. Reliability of accumulators systems for Angra-I: a reavaluation

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Fleming, P.V.; Frutuoso e Melo, P.F.F.; Tayt-Sohn, L.C.

    1983-01-01

    A re-evaluation of the reliability analysis of the accumulator systems of Angra-1, based on a study done in 1979/80, is presented. The methodology used is the same (WASH-1400). In addition, a computer program was used to obtain the minimal cut sets. (author) [pt

  5. An information system supporting design for reliability and maintenance

    International Nuclear Information System (INIS)

    Rit, J.F.; Beraud, M.T.

    1997-01-01

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on the results and assumptions of each other about the reliability and operations of the plant. Therefore a support information system must be carefully designed. Concurrently with development of the methodology, a research oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.)

  6. An information system supporting design for reliability and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rit, J.F.; Beraud, M.T

    1997-12-31

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on the results and assumptions of each other about the reliability and operations of the plant. Therefore a support information system must be carefully designed. Concurrently with development of the methodology, a research oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.) 10 refs.

  7. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models
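
The second-moment theories discussed above reduce a capacity-versus-load problem to the first two moments of each variable. A textbook first-order second-moment sketch (assuming independent normal capacity R and load S; this is the generic method, not the paper's specific seismic model):

```python
import math

def second_moment_pf(mu_r, sigma_r, mu_s, sigma_s):
    """Reliability index and failure probability for margin M = R - S,
    with R and S independent and normal."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)  # reliability index
    pf = 0.5 * (1 + math.erf(-beta / math.sqrt(2)))                # Phi(-beta)
    return beta, pf
```

Note that beta uses only means and variances, which is exactly why, as the paper argues, such theories cannot represent the statistical (inductive) uncertainty sitting behind those moment estimates.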

  8. A Review on VSC-HVDC Reliability Modeling and Evaluation Techniques

    Science.gov (United States)

    Shen, L.; Tang, Q.; Li, T.; Wang, Y.; Song, F.

    2017-05-01

    With the fast development of power electronics, voltage-source converter (VSC) HVDC technology presents cost-effective ways for bulk power transmission. An increasing number of VSC-HVDC projects have been installed worldwide. Their reliability affects the profitability of the system and therefore has a major impact on potential investors. In this paper, an overview of the recent advances in the area of reliability evaluation for VSC-HVDC systems is provided. Taking the latest multi-level converter topology into account, the VSC-HVDC system is categorized into several sub-systems, and the reliability data for the key components is discussed based on sources with academic and industrial backgrounds. The development of reliability evaluation methodologies is reviewed and the issues surrounding the different computation approaches are briefly analysed. A general VSC-HVDC reliability evaluation procedure is illustrated in this paper.
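
The sub-system decomposition described above lends itself to a simple availability roll-up: model the link as a series chain of sub-systems, each with a failure rate and mean repair time. The sub-system list and numbers below are made-up placeholders, not the reliability data discussed in the review.

```python
HOURS_PER_YEAR = 8760.0

# (failures per year, mean repair time in hours) - illustrative values only.
subsystems = {
    "converter":   (0.5, 24.0),
    "transformer": (0.1, 168.0),
    "dc cable":    (0.05, 336.0),
    "controls":    (1.0, 8.0),
}

def availability(failures_per_year, repair_hours):
    """Steady-state availability from mean up-time and mean repair time."""
    mtbf = HOURS_PER_YEAR / failures_per_year
    return mtbf / (mtbf + repair_hours)

def system_availability(subsystems):
    """Series system: the link is up only if every sub-system is up."""
    a = 1.0
    for rate, mttr in subsystems.values():
        a *= availability(rate, mttr)
    return a
```

The series product makes the system strictly less available than its weakest sub-system, which is why component-level reliability data matters to investors.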

  9. Tracing organizing principles: Learning from the history of systems biology

    DEFF Research Database (Denmark)

    Green, Sara; Wolkenhauer, Olaf

    2014-01-01

    With the emergence of systems biology, the identification of organizing principles is being highlighted as a key research aim. Researchers attempt to “reverse engineer” the functional organization of biological systems using methodologies from mathematics, engineering and computer science while taking advantage of data produced by new experimental techniques. While systems biology is a relatively new approach, the quest for general principles of biological organization dates back to systems theoretic approaches in early and mid-twentieth century. The aim of this paper is to draw on this historical background in order to increase the understanding of the motivation behind the search for general principles and to clarify different epistemic aims within systems biology. We pinpoint key aspects of earlier approaches that also underlie the current practice. These are i) the focus on relational...

  10. Hadoop Cluster Deployment: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Ronaldo Celso Messias Correia

    2018-05-01

    Full Text Available For a long time, data has been treated as a general problem because it just represents fractions of an event without any relevant purpose. However, the last decade has been just about information and how to get it. Seeking meaning in data and trying to solve scalability problems, many frameworks have been developed to improve data storage and its analysis. As a framework, Hadoop was presented as a powerful tool to deal with large amounts of data. However, it still causes doubts about how to deal with its deployment and if there is any reliable method to compare the performance of distinct Hadoop clusters. This paper presents a methodology based on benchmark analysis to guide Hadoop cluster deployment. The experiments employed the Apache Hadoop distribution and the Hadoop distributions of Cloudera, Hortonworks, and MapR, analyzing the architectures locally and in the cloud, using centralized and geographically distributed servers. The results show the methodology can be applied dynamically for a reliable comparison among different architectures. Additionally, the study suggests that the knowledge acquired can be used to improve the data analysis process by understanding the Hadoop architecture.
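
The comparison step of such a benchmark-driven methodology can be sketched as normalizing each cluster's timing for the same job against the fastest run. The cluster names and timings below are invented, not the paper's measurements.

```python
# Hypothetical wall-clock seconds for one identical benchmark job per cluster.
results = {
    "apache-local":      412.0,
    "cloudera-cloud":    388.0,
    "hortonworks-cloud": 401.0,
    "mapr-distributed":  455.0,
}

def rank_clusters(results):
    """Fastest cluster first, each with its speed relative to the fastest run."""
    fastest = min(results.values())
    ranked = sorted(results.items(), key=lambda kv: kv[1])
    return [(name, fastest / secs) for name, secs in ranked]
```

Relative scores make runs on differently sized or located clusters directly comparable, which is the point of benchmarking before deployment.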

  11. Principle of accrual and compliance of income and expenses in accounting system

    Directory of Open Access Journals (Sweden)

    V.V. Travin

    2017-12-01

    Full Text Available The introduction of international accounting and financial reporting standards requires the deeper implementation of accounting principles, in particular the principle of accrual and compliance of income and expenses. The current research has shown that its implementation helps to streamline the accounting process by reducing the need to verify the chosen methodology for compliance with other regulatory documents that regulate the peculiarities of accounting in various areas. The categories of «a system», «income» and «expenses» in the accounting system are investigated and their system characteristics are shown. An approach to realizing the principle of accrual and compliance of income and expenses in the accounting system is described and proposed. It involves the possibility of capitalizing costs in the value of stocks, in non-current assets, in the form of receivables, or in the form of expenses of future periods. The capitalization of costs in value occurs when the costs are not considered as such, and are considered only as an increase in the asset, as an integral part of these or other values. The study takes into account the peculiarities of its influence on the methodology of accounting for financial results of the enterprise.

  12. Research on Connection and Function Reliability of the Oil&Gas Pipeline System

    Directory of Open Access Journals (Sweden)

    Xu Bo

    2017-01-01

    Full Text Available Pipeline transportation is the optimal way for energy delivery in terms of safety, efficiency and environmental protection. Because of the complexity of the pipeline's external system, including geological hazards and social and cultural influences, it is a great challenge to operate the pipeline safely and reliably. Therefore, pipeline reliability becomes an important issue. Based on classical reliability theory, an analysis of the pipeline system is carried out, the reliability model of the pipeline system is built, and the calculation is addressed thereafter. Further, the connection and function reliability model is applied to a practical active pipeline system; with the use of the proposed methodology, the connection reliability and function reliability are obtained. This paper is the first to consider connection and function reliability separately, makes a significant contribution to establishing the mathematical reliability model of the pipeline system, and hence provides fundamental groundwork for future pipeline reliability research.
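
The connection-reliability part of such a model reduces, in the classical theory the abstract invokes, to series and parallel block combinations. A minimal sketch with invented component reliabilities (redundant pumps feeding pipeline segments in series; not the paper's actual network):

```python
def series(*rs):
    """All blocks must work: reliability is the product."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """Redundant blocks: the group fails only if every block fails."""
    q = 1.0
    for r in rs:
        q *= (1 - r)
    return 1 - q

# Two redundant pumps feeding three pipeline segments in series:
r_pumps = parallel(0.95, 0.95)
r_line = series(r_pumps, 0.99, 0.99, 0.99)
```

Function reliability (delivering the required flow) would add state-dependent capacity on top of this purely topological connection model, which is why the paper treats the two separately.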

  13. Ethnography: principles, practice and potential.

    Science.gov (United States)

    Draper, Jan

    2015-05-06

    Ethnography is a methodology that is gaining popularity in nursing and healthcare research. It is concerned with studying people in their cultural context and how their behaviour, either as individuals or as part of a group, is influenced by this cultural context. Ethnography is a form of social research and has much in common with other forms of qualitative enquiry. While classical ethnography was characteristically concerned with describing 'other' cultures, contemporary ethnography has focused on settings nearer to home. This article outlines some of the underlying principles and practice of ethnography, and its potential for nursing and healthcare practice.

  14. Reliability analysis and utilization of PEMs in space application

    Science.gov (United States)

    Jiang, Xiujie; Wang, Zhihua; Sun, Huixian; Chen, Xiaomin; Zhao, Tianlin; Yu, Guanghua; Zhou, Changyi

    2009-11-01

    More and more plastic encapsulated microcircuits (PEMs) are used in space missions to achieve high performance. Since PEMs are designed for use in terrestrial operating conditions, the successful usage of PEMs in the harsh space environment is closely related to reliability issues, which should be considered first. However, there is no ready-made methodology for PEMs in space applications. This paper discusses the reliability of the usage of PEMs in space. This reliability analysis can be divided into five categories: radiation test, radiation hardness, screening test, reliability calculation and reliability assessment. One case study is also presented to illuminate the details of the process, in which a PEM part is used in the Double-Star Project, a joint space program between the European Space Agency (ESA) and China. The influence of environmental constraints, including radiation, humidity, temperature and mechanics, on the PEM part has been considered. Both Double-Star Project satellites are still running well in space.
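
The reliability-calculation category above often rests on an exponential-life model with an Arrhenius temperature acceleration factor, a standard derating approach for screened parts. The activation energy and failure rate below are placeholder assumptions, not values from the Double-Star programme.

```python
import math

BOLTZMANN_EV = 8.617e-5   # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_k, t_stress_k):
    """Arrhenius acceleration of the failure mechanism at stress vs use temperature."""
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

def reliability(failure_rate_per_hour, hours):
    """Exponential-life survival probability R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate_per_hour * hours)
```

A stress test at 398 K then stands in for much longer operation at 298 K, scaling the demonstrated failure rate by the acceleration factor.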

  15. Topics in expert system design methodologies and tools

    CERN Document Server

    Tasso, C

    1989-01-01

    Expert Systems are so far the most promising achievement of artificial intelligence research. Decision making, planning, design, control, supervision and diagnosis are areas where they are showing great potential. However, the establishment of expert system technology and its actual industrial impact are still limited by the lack of a sound, general and reliable design and construction methodology.This book has a dual purpose: to offer concrete guidelines and tools to the designers of expert systems, and to promote basic and applied research on methodologies and tools. It is a coordinated coll

  16. The reliability of commonly used electrophysiology measures.

    Science.gov (United States)

    Brown, K E; Lohse, K R; Mayer, I M S; Strigaro, G; Desikan, M; Casula, E P; Meunier, S; Popa, T; Lamy, J-C; Odish, O; Leavitt, B R; Durr, A; Roos, R A C; Tabrizi, S J; Rothwell, J C; Boyd, L A; Orth, M

    Electrophysiological measures can help understand brain function both in healthy individuals and in the context of a disease. Given the amount of information that can be extracted from these measures and their frequent use, it is essential to know more about their inherent reliability. To understand the reliability of electrophysiology measures in healthy individuals. We hypothesized that measures of threshold and latency would be the most reliable and least susceptible to methodological differences between study sites. Somatosensory evoked potentials from 112 control participants; long-latency reflexes, transcranial magnetic stimulation with resting and active motor thresholds, motor evoked potential latencies, input/output curves, and short-latency sensory afferent inhibition and facilitation from 84 controls were collected at 3 visits over 24 months at 4 Track-On HD study sites. Reliability was assessed using intra-class correlation coefficients for absolute agreement, and the effects of reliability on statistical power are demonstrated for different sample sizes and study designs. Measures quantifying latencies, thresholds, and evoked responses at high stimulator intensities had the highest reliability, and required the smallest sample sizes to adequately power a study. Very few between-site differences were detected. Reliability and susceptibility to between-site differences should be evaluated for electrophysiological measures before including them in study designs. Levels of reliability vary substantially across electrophysiological measures, though there are few between-site differences. To address this, reliability should be used in conjunction with theoretical calculations to inform sample size and ensure studies are adequately powered to detect true change in measures of interest. Copyright © 2017 Elsevier Inc. All rights reserved.
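
The "intra-class correlation coefficients for absolute agreement" used above can be computed from a two-way ANOVA decomposition of a subjects-by-raters (or subjects-by-visits) score matrix. A dependency-free sketch of ICC(2,1), the single-measure, absolute-agreement form:

```python
def icc_agreement(scores):
    """ICC(2,1), absolute agreement, two-way random effects.
    `scores` is a list of rows: one row of k repeated measurements per subject."""
    n, k = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n * k)
    row_means = [sum(r) / k for r in scores]
    col_means = [sum(r[j] for r in scores) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)      # between subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)      # between measurements
    sse = sum((scores[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                                   # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Because the absolute-agreement form penalizes systematic shifts between visits, a constant offset lowers the ICC even when subjects keep their rank order, which matters for the longitudinal design described above.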

  17. Quantification of Hand Motor Symptoms in Parkinson's Disease: A Proof-of-Principle Study Using Inertial and Force Sensors.

    Science.gov (United States)

    van den Noort, Josien C; Verhagen, Rens; van Dijk, Kees J; Veltink, Peter H; Vos, Michelle C P M; de Bie, Rob M A; Bour, Lo J; Heida, Ciska T

    2017-10-01

    This proof-of-principle study describes the methodology and explores and demonstrates the applicability of a system, consisting of miniature inertial sensors on the hand and a separate force sensor, to objectively quantify hand motor symptoms in patients with Parkinson's disease (PD) in a clinical setting (off- and on-medication conditions). Four PD patients were measured in the off- and on-dopaminergic medication conditions. Finger tapping, rapid hand opening/closing, hand pro/supination, tremor during rest, mental task and kinetic task, and wrist rigidity movements were measured with the system (called the PowerGlove). To demonstrate applicability, various outcome parameters of measured hand motor symptoms of the patients in the off- vs. on-medication condition are presented. The methodology described and the results presented show the applicability of the PowerGlove in a clinical research setting to objectively quantify hand bradykinesia, tremor and rigidity in PD patients using a single system. The PowerGlove measured a difference between the off- and on-medication conditions in all tasks in the presented patients, with most of its outcome parameters. Further study of the validity and reliability of the outcome parameters is required in a larger cohort of patients, to arrive at an optimal set of parameters that can assist in clinical evaluation and decision-making.
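
One bradykinesia-style outcome parameter can be sketched as estimating tapping frequency from a periodic joint-angle trace by counting upward zero crossings of the mean-removed signal. This is an illustrative reduction, not the PowerGlove's actual processing chain.

```python
def tap_frequency(angle_deg, fs_hz):
    """Estimated tapping frequency (Hz) from a roughly periodic angle signal."""
    mean = sum(angle_deg) / len(angle_deg)
    x = [a - mean for a in angle_deg]
    # Count upward crossings of the signal's mean level.
    crossings = sum(1 for i in range(1, len(x)) if x[i - 1] < 0 <= x[i])
    duration = len(angle_deg) / fs_hz
    return crossings / duration
```

Slowing of this frequency between the off- and on-medication conditions is the kind of difference such a parameter is meant to capture.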

  18. Psychometric Principles in Measurement for Geoscience Education Research: A Climate Change Example

    Science.gov (United States)

    Libarkin, J. C.; Gold, A. U.; Harris, S. E.; McNeal, K.; Bowles, R.

    2015-12-01

    Understanding learning in geoscience classrooms requires that we use valid and reliable instruments aligned with intended learning outcomes. Nearly one hundred instruments assessing conceptual understanding in undergraduate science and engineering classrooms (often called concept inventories) have been published and are actively being used to investigate learning. The techniques used to develop these instruments vary widely, often with little attention to psychometric principles of measurement. This paper will discuss the importance of using psychometric principles to design, evaluate, and revise research instruments, with particular attention to the validity and reliability steps that must be undertaken to ensure that research instruments are providing meaningful measurement. An example from a climate change inventory developed by the authors will be used to exemplify the importance of validity and reliability, including the value of item response theory for instrument development. A 24-item instrument was developed based on published items, conceptions research, and instructor experience. Rasch analysis of over 1000 responses provided evidence for the removal of 5 items for misfit and one item for potential bias as measured via differential item functioning. The resulting 18-item instrument can be considered a valid and reliable measure based on pre- and post-implementation metrics. Consideration of the relationship between respondent demographics and concept inventory scores provides unique insight into the relationship between gender, religiosity, values and climate change understanding.
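
The Rasch analysis mentioned above models each dichotomous item with a one-parameter logistic item response function. A minimal sketch (the logit difficulty estimate here is a naive proportion-based stand-in, not the joint estimation used in operational calibration):

```python
import math

def p_correct(theta, b):
    """Rasch (1PL) item response function: P(correct | ability theta, difficulty b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_difficulty(p_obs):
    """Logit transform of the observed proportion correct: harder items
    (lower p_obs) map to larger positive difficulties."""
    return -math.log(p_obs / (1.0 - p_obs))
```

Misfitting items, like the five removed from the 24-item climate inventory, are those whose observed response pattern departs from this predicted curve.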

  19. Revised INPRO Methodology in the Area of Proliferation Resistance

    International Nuclear Information System (INIS)

    Park, J.H.; Lee, Y.D.; Yang, M.S.; Kim, J.K.; Haas, E.; Depisch, F.

    2008-01-01

    The official INPRO User Manual in the area of proliferation resistance is being prepared for the evaluation of innovative nuclear energy systems. Proliferation resistance is one of the goals to be satisfied by future nuclear energy systems in INPRO. The features of the currently updated and released INPRO methodology were introduced, covering basic principles, user requirements and indicators. The criteria for an acceptance limit were specified. The DUPIC fuel cycle was evaluated based on the updated INPRO methodology to test the applicability of the INPRO User Manual. However, the INPRO methodology has some difficulty in quantifying the multiplicity and robustness, as well as the total cost, of improving proliferation resistance. Moreover, the integration method for the evaluation results still needs to be improved.
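
The indicator-versus-acceptance-limit screening described above can be sketched as a simple pass/fail check per indicator. The indicator names, values, and limits below are hypothetical illustrations, not the actual INPRO indicators or limits.

```python
# Hypothetical indicator -> (acceptance limit, direction) table.
acceptance_limits = {
    "attractiveness of material": (0.3, "max"),   # must not exceed the limit
    "detectability of diversion": (0.9, "min"),   # must reach at least the limit
}

def evaluate(indicators, limits):
    """Return (overall_pass, per-indicator verdicts): the system is acceptable
    only if every indicator satisfies its limit."""
    verdicts = {}
    for name, (limit, kind) in limits.items():
        value = indicators[name]
        verdicts[name] = value <= limit if kind == "max" else value >= limit
    return all(verdicts.values()), verdicts
```

This captures only the screening step; as the record notes, integrating the per-indicator results into one judgement is the part of the methodology still under development.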

  20. Reliability evaluation of nuclear power plants

    International Nuclear Information System (INIS)

    Rondiris, I.L.

    1978-10-01

    The research described in this thesis is concerned with the reliability/safety analysis of complex systems, such as nuclear power stations, basically using the event tree methodology. The thesis introduces and assesses a computational technique which applies the methodology to complex systems by simulating their topology and operational logic. The technique develops the system event tree and relates each branch of this tree to its qualitative and quantitative impact on specified system outcomes following an abnormal operating condition. The thesis then deduces the critical failure modes of complex systems. This is achieved by a new technique for deducing the minimal cut or tie sets of various system outcomes. The technique is, furthermore, expanded to identify potential common mode failures and cut or tie sets containing dependent failures of some components. After dealing with the qualitative part of a reliability study, the thesis introduces two methods for calculating the probability of a component being either in the failure or in the partial failure state. The first method deals with revealed faults and makes use of the concept of Markov processes. The second one deals with unrevealed faults and can be used to calculate the relevant probability of a component, taking into account its inspection and replacement process. (author)
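    The minimal cut set idea mentioned here can be illustrated on a toy fault tree. The sketch below (all gate and event names are invented) expands an AND/OR tree into cut sets and then discards non-minimal ones:

```python
def cut_sets(gate):
    """Expand a fault tree into cut sets. A node is either a basic
    event (a string) or a gate: ('AND', [children]) / ('OR', [children])."""
    if isinstance(gate, str):
        return [frozenset([gate])]
    op, children = gate
    groups = [cut_sets(child) for child in children]
    if op == 'OR':
        # any child's cut set fails the gate on its own
        return [cs for group in groups for cs in group]
    # 'AND': every combination of one cut set per child, unioned
    combos = [frozenset()]
    for group in groups:
        combos = [acc | cs for acc in combos for cs in group]
    return combos

def minimal(sets):
    """Keep only minimal cut sets (none is a proper superset of another)."""
    return [s for s in sets if not any(other < s for other in sets)]
```

    For a top event `('OR', [('AND', ['pump_A', 'pump_B']), 'valve_C'])`, the minimal cut sets are {pump_A, pump_B} and {valve_C}: either the redundant pump pair fails together or the single valve fails alone.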

  1. TOREX-4: a torsatron proof of principle experiment

    International Nuclear Information System (INIS)

    Politzer, P.A.; Lidsky, L.M.; Montgomery, D.B.

    1979-03-01

    TOREX-4 is a torsatron Proof of Principle experiment designed to simultaneously achieve nτ ≈ 5 x 10^13, n ≈ 5 x 10^14 /cm^3, and T ≥ 1 keV. TOREX-4 is capable of operating without externally driven currents; sufficient neutral beam power to reach betas of 2 to 5% is provided. The unique 4(+2) constant pitch angle winding configuration allows the reliable design of large systems with far greater experimental flexibility than can be achieved in conventional stellarators of comparable size. This will allow investigation of the basic physics questions of the torsatron configuration over a wide range of plasma properties and field configurations without sacrifice of the Proof of Principle goals.

  2. Transmission cost allocation based on power flow tracing considering reliability benefit

    International Nuclear Information System (INIS)

    Leepreechanon, N.; Singharerg, S.; Padungwech, W.; Nakawiro, W.; Eua-Arporn, B.; David, A.K.

    2007-01-01

    Power transmission networks must be able to accommodate the continuously growing demand for reliable and economical electricity. This paper presented a method to allocate transmission use and reliability cost to both generators and end-consumers. Although transmission cost allocation methods change depending on the local context of the electric power industry, there is a common principle that transmission line capacity should be properly allocated to accommodate actual power delivery with an adequate reliability margin. The method proposed in this paper allocates transmission embedded cost to both generators and loads in an equitable manner, incorporating probability indices to allocate the transmission reliability margin among users on both the supply and demand sides. The application of the proposed method was illustrated using Bialek's tracing method on a multiple-circuit, six-bus transmission system. Probabilistic indices known as the transmission internal reliability margin (TIRM) and transmission external reliability margin (TERM), decomposed from the transmission reliability margin (TRM), were introduced, making it possible to recover the true cost of using the overall transmission facilities. 6 refs., 11 tabs., 5 figs.
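    Bialek's tracing method rests on the proportional-sharing principle: the power flowing on a line, and hence its cost, is attributed to users in proportion to their traced contributions. A deliberately simplified sketch of that final allocation step (the tracing computation itself and the TIRM/TERM decomposition are not reproduced; user names and numbers are invented):

```python
def allocate_line_cost(line_cost, traced_flows):
    """Split one transmission line's embedded cost among users in
    proportion to the MW each is traced to contribute on that line
    (the proportional-sharing principle behind Bialek's method)."""
    total_mw = sum(traced_flows.values())
    return {user: line_cost * mw / total_mw
            for user, mw in traced_flows.items()}
```

    If generator G1 is traced to supply 60 MW and G2 40 MW of a line's flow, G1 carries 60% of that line's embedded cost, and the shares always sum back to the full line cost.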

  3. SGHWR fuel performance, safety and reliability

    International Nuclear Information System (INIS)

    Pickman, D.O.; Inglis, G.H.

    1977-05-01

    The design principles involved in fuel pins and elements need to take account of the sometimes conflicting requirements of safety and reliability. The principal factors involved in this optimisation are discussed, and it is shown from fuel irradiation experience in the Winfrith SGHWR that the necessary bias towards safety has not resulted in a reliability level lower than that shown by other successful water reactor designs. Reliability has important economic implications. By a detailed evaluation of SGHWR fuel defects it is shown that very few defects can be shown to be related to design, rating, or burn-up. This demonstrates that economic aspects have not over-ridden necessary criteria that must be met to achieve the desirable reliability level. It is possible that large scale experience on SGHWR fuel may eventually demonstrate that the balance is too much in favour of reliability, and consideration may be given to whether design changes favouring economy could be achieved without compromising safety. The safety criteria applied to SGHWR fuel are designed to avoid any possibility of a temperature runaway in any credible accident situation. The philosophy and supporting experimental work programme are outlined, and the fuel design features which particularly contribute to maximising safety margins are described. Reference is made to the new 60-pin fuel element to be used in the commercial SGHWRs and to its comparison in design and performance aspects with the 36-pin element that has been used to date in the Winfrith SGHWR. (author)

  4. European methodology for qualification of NDT as developed by ENIQ

    International Nuclear Information System (INIS)

    Champigny, F.; Sandberg, U.; Engl, G.; Crutzen, S.; Lemaitre, P.

    1997-01-01

    The European Network for Inspection Qualification (ENIQ) groups the major part of the nuclear power plant operators in the European Union (and Switzerland). The main objective of ENIQ is to co-ordinate and manage at European level expertise and resources for the qualification of NDE inspection systems, primarily for nuclear components. In the framework of ENIQ the European methodology for qualification of NDT has been developed. In this paper the main principles of the European methodology are given besides the main activities and organisation of ENIQ. (orig.)

  5. Bayesian approach for the reliability assessment of corroded interdependent pipe networks

    International Nuclear Information System (INIS)

    Ait Mokhtar, El Hassene; Chateauneuf, Alaa; Laggoune, Radouane

    2016-01-01

    Pipelines under corrosion are subject to various environmental conditions, and consequently it becomes difficult to build realistic corrosion models. In the present work, a Bayesian methodology is proposed to allow for updating the corrosion model parameters according to the evolution of environmental conditions. For reliability assessment of dependent structures, Bayesian networks are used to provide a useful qualitative and quantitative description of the information in the system. The qualitative contribution lies in the modelling of a complex system, composed of dependent pipelines, as a Bayesian network. The quantitative one lies in the evaluation of the dependencies between pipelines by the use of a new method for the generation of conditional probability tables. The effectiveness of Bayesian updating is illustrated through an application where the new reliability of degraded (corroded) pipe networks is assessed. - Highlights: • A methodology for Bayesian network modeling of pipe networks is proposed. • A Bayesian approach based on the Metropolis-Hastings algorithm is conducted for corrosion model updating. • The reliability of a corroded pipe network is assessed by considering the interdependencies between the pipelines.
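    As a rough illustration of the Metropolis-Hastings updating mentioned in the highlights, the sketch below infers a single linear corrosion-rate parameter from wall-loss measurements. The model (depth = rate × time + Gaussian noise), the Exponential(1) prior, and all numbers are invented for illustration and are far simpler than the networked model of the paper:

```python
import math
import random

def metropolis_corrosion(depths, times, n_iter=20000, sigma_obs=0.2, seed=1):
    """Random-walk Metropolis-Hastings sampling of the posterior of a
    linear corrosion-rate parameter v, where measured pit depth follows
    depth = v * time + Gaussian noise. Prior on v: Exponential(1)."""
    random.seed(seed)

    def log_post(v):
        if v <= 0.0:
            return float('-inf')  # prior support is v > 0
        log_lik = sum(-0.5 * ((d - v * t) / sigma_obs) ** 2
                      for d, t in zip(depths, times))
        return log_lik - v  # Exponential(1) log-prior, up to a constant

    v = 0.1
    lp = log_post(v)
    samples = []
    for _ in range(n_iter):
        prop = v + random.gauss(0.0, 0.05)  # random-walk proposal
        lp_prop = log_post(prop)
        # accept with probability min(1, posterior ratio)
        if random.random() < math.exp(min(0.0, lp_prop - lp)):
            v, lp = prop, lp_prop
        samples.append(v)
    return samples[n_iter // 2:]  # discard first half as burn-in
```

    Feeding in measurements consistent with a true rate of 0.5 mm/yr, the retained samples concentrate near that value, and the updated posterior can then drive a fresh reliability calculation.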

  6. Systems principles of planning the net cost of oil and gas extraction

    Energy Technology Data Exchange (ETDEWEB)

    Ryazanova, N I

    1979-01-01

    The automated system of calculation ASPC "oil extraction" was developed in order to improve the existing planning system of the oil-extracting sector. The most complete expression of the systems construction of the plan is found in the section "net cost and profit." The unity of the production process imposes definite requirements on the construction of the plan for the net cost of oil and gas extraction as a model of this unified process. According to these requirements, the plan for net cost must be developed on the basis of the interrelationship of the indicators of the net-cost plan within the section and with the indicators of other sections of the plan; methodological unity and continuity of the methods of planning net cost by elements of outlays, articles of calculation and technical-economic factors; methodological continuity of regimes and stages of planning; and methodological continuity of the control levels. The listed requirements are principles for systems planning of the net cost of oil and gas extraction. These principles guarantee improvement in the planning of the net cost of oil and gas extraction in accordance with the requirements of national economic planning.

  7. Optimization of reliability centered predictive maintenance scheme for inertial navigation system

    International Nuclear Information System (INIS)

    Jiang, Xiuhong; Duan, Fuhai; Tian, Heng; Wei, Xuedong

    2015-01-01

    The goal of this study is to propose a reliability centered predictive maintenance scheme for a complex-structure Inertial Navigation System (INS) with several redundant components. GO Methodology is applied to build the INS reliability analysis model—the GO chart. Components' Remaining Useful Life (RUL) and system reliability are updated dynamically based on the combination of the components' lifetime distribution functions, stress samples, and the system GO chart. Considering the redundant design in the INS, maintenance time is based not only on the components' RUL, but also (and mainly) on the timing of when system reliability fails to meet the set threshold. The definition of component maintenance priority balances three factors: component importance to the system, risk degree, and detection difficulty. The Maintenance Priority Number (MPN) is introduced, which may provide quantitative maintenance priority results for all components. A maintenance unit time cost model is built based on the components' MPN, the components' RUL predictive model and maintenance intervals for the optimization of the maintenance scope. The proposed scheme can serve as a reference for INS maintenance. Finally, three numerical examples prove the proposed predictive maintenance scheme is feasible and effective. - Highlights: • A dynamic PdM with a rolling horizon is proposed for an INS with redundant components. • GO Methodology is applied to build the system reliability analysis model. • The concept of the MPN is proposed to quantify the maintenance sequence of components. • An optimization model is built to select the optimal group of maintenance components. • The optimization goal is minimizing the cost of maintaining system reliability.
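    The MPN concept can be sketched as a simple product of the three scored factors, in the spirit of an FMEA risk priority number. The scoring scale, field names, and component names below are assumptions of this illustration, not the paper's exact formulation:

```python
def maintenance_priority(components):
    """Rank components by a Maintenance Priority Number (MPN), sketched
    here as importance * risk * detection difficulty, each scored on a
    1-10 scale; higher MPN means earlier maintenance attention."""
    def mpn(c):
        return c['importance'] * c['risk'] * c['detection']
    return [(c['name'], mpn(c))
            for c in sorted(components, key=mpn, reverse=True)]
```

    A component that is highly important, high-risk, and hard to inspect rises to the top of the maintenance queue, which is exactly the quantitative ordering the MPN is meant to provide.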

  8. The Pursuit of Chronically Reliable Neural Interfaces: A Materials Perspective.

    Science.gov (United States)

    Guo, Liang

    2016-01-01

    Brain-computer interfaces represent one of the most astonishing technologies in our era. However, the grand challenge of chronic instability and limited throughput of the electrode-tissue interface has significantly hindered the further development and ultimate deployment of such exciting technologies. A multidisciplinary research workforce has been called upon to respond to this engineering need. In this paper, I briefly review this multidisciplinary pursuit of chronically reliable neural interfaces from a materials perspective by analyzing the problem, abstracting the engineering principles, and summarizing the corresponding engineering strategies. I further draw my future perspectives by extending the proposed engineering principles.

  9. Time-dependent reliability analysis of flood defences

    International Nuclear Information System (INIS)

    Buijs, F.A.; Hall, J.W.; Sayers, P.B.; Gelder, P.H.A.J.M. van

    2009-01-01

    This paper describes the underlying theory and a practical process for establishing time-dependent reliability models for components in a realistic and complex flood defence system. Though time-dependent reliability models have been applied frequently in, for example, the offshore, structural safety and nuclear industries, application in the safety-critical field of flood defence has to date been limited. The modelling methodology involves identifying relevant variables and processes, characterisation of those processes in appropriate mathematical terms, numerical implementation, parameter estimation and prediction. A combination of stochastic, hierarchical and parametric processes is employed. The approach is demonstrated for selected deterioration mechanisms in the context of a flood defence system. The paper demonstrates that this structured methodology enables the definition of credible statistical models for the time-dependence of flood defences in data-scarce situations. In the application of those models, one of the main findings is that the time variability in the deterioration process tends to be governed by the time-dependence of one or a small number of critical attributes. It is demonstrated how the need for further data collection depends upon the relevance of the time-dependence in the performance of the flood defence system.
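    A common way to build such time-dependent models in data-scarce situations is to simulate a stochastic deterioration process and record when it first crosses a failure threshold. A minimal Monte Carlo sketch follows (the gamma-increment model and all parameter values are illustrative, not taken from the paper):

```python
import random

def failure_prob_over_time(horizon_years, threshold, shape_per_yr=1.5,
                           scale=0.8, n_sims=5000, seed=7):
    """Monte Carlo estimate of the cumulative failure probability over
    time for a component whose deterioration grows by independent
    gamma-distributed yearly increments; failure occurs when the
    accumulated deterioration reaches `threshold`."""
    random.seed(seed)
    failed_by_year = [0] * horizon_years
    for _ in range(n_sims):
        level = 0.0
        for year in range(horizon_years):
            level += random.gammavariate(shape_per_yr, scale)
            if level >= threshold:
                # failed in this year and stays failed afterwards
                for later in range(year, horizon_years):
                    failed_by_year[later] += 1
                break
    return [count / n_sims for count in failed_by_year]
```

    The returned curve is non-decreasing by construction, mirroring how a time-dependent reliability model turns a deterioration mechanism into a failure probability that grows over the planning horizon.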

  10. Gamma prior distribution selection for Bayesian analysis of failure rate and reliability

    International Nuclear Information System (INIS)

    Waler, R.A.; Johnson, M.M.; Waterman, M.S.; Martz, H.F. Jr.

    1977-01-01

    It is assumed that the phenomenon under study is such that the time-to-failure may be modeled by an exponential distribution with failure-rate parameter, lambda. For Bayesian analyses of the assumed model, the family of gamma distributions provides conjugate prior models for lambda. Thus, an experimenter needs to select a particular gamma model to conduct a Bayesian reliability analysis. The purpose of this paper is to present a methodology which can be used to translate engineering information, experience, and judgment into a choice of a gamma prior distribution. The proposed methodology assumes that the practicing engineer can provide percentile data relating to either the failure rate or the reliability of the phenomenon being investigated. For example, the methodology will select the gamma prior distribution which conveys an engineer's belief that the failure rate, lambda, simultaneously satisfies the probability statements P(lambda < 1.0 x 10^-3) = 0.50 and P(lambda < 1.0 x 10^-5) = 0.05. That is, two percentiles provided by an engineer are used to determine a gamma prior model which agrees with the specified percentiles. For those engineers who prefer to specify reliability percentiles rather than the failure-rate percentiles illustrated above, one can use the induced negative-log gamma prior distribution which satisfies the probability statements P(R(t_0) < 0.99) = 0.50 and P(R(t_0) < 0.99999) = 0.95 for some operating time t_0. Also, the paper includes graphs for selected percentiles which assist an engineer in applying the methodology.
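    The two-percentile elicitation described here pins down the gamma prior uniquely, because the ratio of any two gamma percentiles depends only on the shape parameter. A sketch using SciPy (assuming its availability; the paper itself provides graphs rather than code, and the bracket below is an assumption that covers practically useful shapes):

```python
from scipy.optimize import brentq
from scipy.stats import gamma

def gamma_prior_from_percentiles(x50, x05):
    """Return (shape, scale) of the gamma prior whose 50th percentile
    equals x50 and whose 5th percentile equals x05 (x05 < x50).
    Because gamma is a scale family, the percentile ratio depends only
    on the shape, which can be solved for first."""
    target_ratio = x50 / x05

    def ratio_gap(a):
        return gamma.ppf(0.50, a) / gamma.ppf(0.05, a) - target_ratio

    shape = brentq(ratio_gap, 0.01, 50.0)   # bracket assumed wide enough
    scale = x50 / gamma.ppf(0.50, shape)    # rescale to hit the median
    return shape, scale
```

    For the paper's example, `gamma_prior_from_percentiles(1.0e-3, 1.0e-5)` returns the unique gamma prior satisfying P(lambda < 1.0 x 10^-3) = 0.50 and P(lambda < 1.0 x 10^-5) = 0.05.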

  11. A study on methodologies for assessing safety critical network's risk impact on Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Lee, H. J.; Park, S. K.; Seo, S. J.

    2006-08-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for Nuclear Power Plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. Major outputs of the first-year study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.
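    For small networks, one baseline algorithm for quantifying communication-network reliability is exact enumeration over every link up/down state. A sketch follows (exponential in the number of links, so only a starting point next to the more efficient algorithms such a survey would cover; node names are invented):

```python
from itertools import product

def network_reliability(links, source, target):
    """Exact two-terminal reliability by enumerating all 2**n up/down
    states of the links. links: list of (node_u, node_v, p_up) tuples.
    Exponential cost, so only practical for small networks."""
    def connected(up_edges):
        reach, stack = {source}, [source]
        while stack:
            node = stack.pop()
            for u, v in up_edges:
                for a, b in ((u, v), (v, u)):
                    if a == node and b not in reach:
                        reach.add(b)
                        stack.append(b)
        return target in reach

    total = 0.0
    for state in product([True, False], repeat=len(links)):
        prob = 1.0
        up_edges = []
        for is_up, (u, v, p) in zip(state, links):
            prob *= p if is_up else 1.0 - p
            if is_up:
                up_edges.append((u, v))
        if connected(up_edges):
            total += prob
    return total
```

    Two parallel links of availability 0.9 give 1 - (1 - 0.9)^2 = 0.99, while the same links in series give 0.9^2 = 0.81, reproducing the textbook redundancy results.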

  12. Reliability Analysis for Adhesive Bonded Composite Stepped Lap Joints Loaded in Fatigue

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Sørensen, John Dalsgaard; Lund, Erik

    2012-01-01

    This paper describes a probabilistic approach to calculate the reliability of adhesive bonded composite stepped lap joints loaded in fatigue using three-dimensional finite element analysis (FEA). A method for progressive damage modelling is used to assess fatigue damage accumulation and residual … Asymptotic sampling is used to estimate the reliability with support points generated by randomized Sobol sequences. The predicted reliability level is compared with the implicitly required target reliability level defined by the wind turbine standard IEC 61400-1, where partial safety factors are introduced together with characteristic values. Finally, an approach for the assessment of the reliability of adhesive bonded composite stepped lap joints loaded in fatigue is presented. The introduced methodology can be applied in the same way to calculate the reliability level of wind turbine blade components …
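    Randomized Sobol sequences of the kind mentioned above can be generated with SciPy's quasi-Monte Carlo module. The sketch below uses scrambled Sobol points mapped to independent standard normals to estimate a failure probability directly; it is plain quasi-Monte Carlo under an invented limit state, not the full asymptotic-sampling extrapolation of the paper:

```python
from scipy.stats import norm, qmc

def failure_probability(g, dim, m=14, seed=0):
    """Estimate P(g(Z) < 0) for independent standard normal inputs Z
    using scrambled (randomized) Sobol points as support points."""
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    points = sampler.random_base2(m=m)   # 2**m points in the unit hypercube
    z = norm.ppf(points)                 # map to standard normal space
    failures = sum(1 for row in z if g(row) < 0.0)
    return failures / len(z)
```

    With the toy limit state g(z) = 4 - (z1 + z2), the exact answer is 1 - Phi(4 / sqrt(2)), and the scrambled-Sobol estimate lands close to it with a modest number of points.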

  13. A Valid and Reliable Tool to Assess Nursing Students' Clinical Performance

    OpenAIRE

    Mehrnoosh Pazargadi; Tahereh Ashktorab; Sharareh Khosravi; Hamid Alavi majd

    2013-01-01

    Background: The necessity of a valid and reliable assessment tool is one of the most repeated issues in nursing students' clinical evaluation. But it is believed that present tools are not mostly valid and can not assess students' performance properly. Objectives: This study was conducted to design a valid and reliable assessment tool for evaluating nursing students' performance in clinical education. Methods: In this methodological study considering nursing students' performance definition; th...

  14. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations.

  15. 'Emerging technologies for the changing global market' - Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt

    1993-01-01

    This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
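    A QFD-style prioritization matrix of the kind described reduces, at its core, to weighted scoring of candidates against criteria. A minimal sketch (the weights, criteria, and candidate names below are invented for illustration, not drawn from the NASA study):

```python
def prioritize(candidates, weights):
    """QFD-style semiquantitative prioritization: score each replacement
    candidate against weighted criteria and rank by weighted total."""
    totals = {name: sum(weights[criterion] * score
                        for criterion, score in scores.items())
              for name, scores in candidates.items()}
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)
```

    Weighting environmental, cost, safety, and reliability implications and summing the weighted scores yields a single ranked list of viable replacement candidates, which is the decision output the matrix is meant to produce.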

  16. Bayesian inference and updating of reliability data

    International Nuclear Information System (INIS)

    Sabri, Z.A.; Cullingford, M.C.; David, H.T.; Husseiny, A.A.

    1980-01-01

    A Bayes methodology for inference of reliability values using available but scarce current data is discussed. The method can be used to update failure rates as more information becomes available from field experience, assuming that the performance of a given component (or system) exhibits a nonhomogeneous Poisson process. Bayes' theorem is used to summarize the historical evidence and current component data in the form of a posterior distribution suitable for prediction and for smoothing or interpolation. An example is given. It may be appropriate to apply the methodology developed here to human error data, in which case the exponential model might be used to describe the learning behavior of the operator or maintenance crew personnel
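    For the exponential failure model described, the gamma family is conjugate, so in the homogeneous special case the Bayes update reduces to two additions (the paper's nonhomogeneous treatment is richer; the shape/rate parameterization and the numbers in the usage note are assumptions of this illustration):

```python
def update_failure_rate(prior_shape, prior_rate, n_failures, exposure):
    """Conjugate Bayes update for a homogeneous Poisson failure process
    with a gamma(shape, rate) prior on the failure rate lambda:
    posterior shape = shape + observed failures,
    posterior rate  = rate + observed exposure time."""
    post_shape = prior_shape + n_failures
    post_rate = prior_rate + exposure
    return post_shape, post_rate, post_shape / post_rate  # last: posterior mean
```

    Starting from a gamma(2, 1000 h) prior and observing 3 failures in 5000 hours of field experience gives a gamma(5, 6000 h) posterior, smoothly pulling the rate estimate toward the new evidence as more data arrive.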

  17. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principles and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: - variability: uncertainty due to heterogeneity, - lack of knowledge: uncertainty due to ignorance. It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he generalised, treating the variability uncertainty by probability theory and the lack-of-knowledge uncertainty by fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increased as the problem became more and more complex in terms of the number of parameters or time steps, and that it was necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
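    The distinction drawn here can be made concrete by propagating the two kinds of uncertainty differently: sample the variability, but only bound the lack-of-knowledge parameter over its interval, reporting a band rather than one falsely precise number. A toy sketch (the model, interval sweep, and all numbers are invented; a full fuzzy treatment would use membership functions rather than a crisp interval):

```python
import random

def propagate(model, draw_variable, interval, n=20000, seed=3):
    """Propagate variability (a sampled distribution) and lack of
    knowledge (an interval, swept over its end points, assuming the
    output is monotone in that parameter) separately, returning a band
    of output means instead of a single value."""
    random.seed(seed)
    means = []
    for k in interval:  # end points of the ignorance interval
        samples = [model(draw_variable(), k) for _ in range(n)]
        means.append(sum(samples) / n)
    return min(means), max(means)
```

    For a model k * X^2 with X standard normal (variability) and k only known to lie in [1, 2] (ignorance), the honest answer is the band of means from about 1 to about 2, not a single averaged number.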

  18. Lean principles adoption in environmental management system (EMS - ISO 14001

    Directory of Open Access Journals (Sweden)

    Perumal Puvanasvaran

    2012-12-01

    Full Text Available Purpose: The purpose of this study is to examine the characteristics of the lean principles in ISO 14001 and to propose a linkage between the lean principles and ISO 14001.
    Design/methodology/approach: To achieve the objective of the study, a literature survey and a quantitative research method using questionnaire surveys are used.
    Findings and Originality/value: The findings of this study confirm that ISO 14001 certified companies adopt lean production practices. The study also proves that lean principles have a positive and significant relationship with the ISO 14001 EMS and that a linkage can be made between lean principles and ISO 14001 to achieve Continual Improvement.
    Research limitations/implications: The small size of the sample of participating companies is the main limitation of this study, and this research mainly focuses on the manufacturing environment and services industry.
    Practical implications: This research shows that all ISO 14001 companies adopt at least one lean production practice, and the main finding is that lean principles have a positive and highly significant relationship with ISO 14001 requirements. This is because the integration of lean principles into ISO 14001 provides practical methods for the ISO 14001 EMS to achieve continual improvement.
    Originality/value: This research is amongst the first to study the combination of lean principles with ISO 14001. Based on the current situation, there is no integration of these two management systems.

  19. An Internal Audit Perspective on Differences between European Corporate Governance Codes and OECD Principles

    Directory of Open Access Journals (Sweden)

    Raluca Ivan

    2015-12-01

    Full Text Available The main purpose of this research is to carry out an analysis, from an internal audit perspective, of European Corporate Governance Codes with regard to the Organization for Economic Cooperation and Development (OECD) Principles of Corporate Governance. The research methodology used a classification of countries by legal regime, trying to obtain a global view of the differences between the European corporate governance codes and the provisions of the OECD Principles from internal audit's perspective. The findings suggest that the specificities of the internal audit function, when studying the differences between European Corporate Governance Codes and OECD Principles, lead to different treatment.

  20. A structural approach to constructing perspective efficient and reliable human-computer interfaces

    International Nuclear Information System (INIS)

    Balint, L.

    1989-01-01

    The principles of human-computer interface (HCI) realizations are investigated with the aim of getting closer to a general framework and thus to a more or less solid background for constructing perspective efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out, especially with respect to human error occurrence possibilities. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is pointed out to be a crucial problem in theoretical and practical investigations. Suggestions are presented concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of human-error-caused system failures (error-tolerance, error-recovery and error-correction), as well as cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions). The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well defined architecture in the HCI components' internal structure. HCI efficiency and reliability are dealt with by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental means of achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach to constructing ideal HCIs is suggested, emphasizing the open questions and the need for future work related to the proposals as well. (author). 14 refs, 6 figs

  1. The scope of the LeChatelier Principle

    Science.gov (United States)

    Lady, George M.; Quirk, James P.

    2007-07-01

    LeChatelier [Comptes Rendus 99 (1884) 786; Ann. Mines 13 (2) (1888) 157] showed that a physical system's “adjustment” to a disturbance to its equilibrium tended to be smaller as constraints were added to the adjustment process. Samuelson [Foundations of Economic Analysis, Harvard University Press, Cambridge, 1947] applied this result to economics in the context of the comparative statics of the actions of individual agents characterized as the solutions to optimization problems; and later (1960), extended the application of the Principle to a stable, multi-market equilibrium and the case of all commodities gross substitutes [e.g., L. Metzler, Stability of multiple markets: the hicks conditions. Econometrica 13 (1945) 277-292]. Refinements and alternative routes of derivation have appeared in the literature since then, e.g., Silberberg [The LeChatelier Principle as a corollary to a generalized envelope theorem, J. Econ. Theory 3 (1971) 146-155; A revision of comparative statics methodology in economics, or, how to do comparative statics on the back of an envelope, J. Econ. Theory 7 (1974) 159-172], Milgrom and Roberts [The LeChatelier Principle, Am. Econ. Rev. 86 (1996) 173-179], W. Suen, E. Silberberg, P. Tseng [The LeChatelier Principle: the long and the short of it, Econ. Theory 16 (2000) 471-476], and Chavas [A global analysis of constrained behavior: the LeChatelier Principle ‘in the large’, South. Econ. J. 72 (3) (2006) 627-644]. In this paper, we expand the scope of the Principle in various ways keyed to Samuelson's proposed means of testing comparative statics results (optimization, stability, and qualitative analysis). In the optimization framework, we show that the converse LeChatelier Principle also can be found in constrained optimization problems and for not initially “conjugate” sensitivities. We then show how the Principle and its converse can be found through the qualitative analysis of any linear system. In these terms, the Principle and

  2. MERMOS: an EDF project to update the PHRA methodology (Probabilistic Human Reliability Assessment)

    International Nuclear Information System (INIS)

    Le Bot, Pierre; Desmares, E.; Bieder, C.; Cara, F.; Bonnet, J.L.

    1998-01-01

    To account for successive evolutions of nuclear power plant emergency operation, EDF has several times had to review its PHRA methodologies. It was particularly the case when event-based procedures were left behind to the benefit of state-based procedures. A more recent update was necessary to obtain information on the safety of the new N4 unit type. The extent of changes in operation for this unit type (especially the computerization of both the control room and the procedures) required a deep rethinking of existing PHRA methods. It also seemed necessary to base the design of methods - more explicitly than in the past - on concepts evolved in the human sciences. These are the main ambitions of the project named MERMOS that started in 1996. The design effort for a new PHRA method is carried out by a multidisciplinary team involving reliability engineers, psychologists and ergonomists. An independent expert is in charge of project review. The method, considered as the analysis tool dedicated to PHRA analysts, is one of the two outcomes of the project. The other one is the formalization of the design approach for the method, aimed at a good appropriation of the method by the analysts. EDF's specificity in the field of PHRA, and more generally PSA, is that the method is not used by the designers but by analysts. Keeping track of the approach is also meant to guarantee its transposition to other EDF unit types such as the 900 or 1300 MW PWR. The PHRA method is based upon a model of emergency operation called the 'SAD model'. The effort of formalizing the design approach led to its clarification and justification. The model describes and explains both the functioning and dysfunctioning of emergency operation in PSA scenarios. It combines a systemic approach and what is called distributed cognition in cognitive science. Collective aspects are considered an important feature in explaining the phenomena under study in operational dysfunctioning. The PHRA method is to be operational early next year (1998).

  3. The Methodology of Psychological Research of Ecological Consciousness

    Directory of Open Access Journals (Sweden)

    Irina A. Shmeleva

    2009-01-01

    Full Text Available The paper examines the methodological principles of the psychological study of ecological consciousness, one of the urgent interdisciplinary problems of the 20th and 21st centuries, driven by the aggravation of global ecological problems and the need to realize the ideas of “sustainable development”. Ecological consciousness is considered a multilayered, dynamic, reflexive element of human consciousness, incorporating multivariate, holistic aspects of the interaction of the human being, as Homo sapiens and a representative of Humanity, with the environment and the Planet. A more active role for Russian psychology in this process is argued for on the basis of existing conceptual approaches that compose the methodological basis for ecological consciousness research. Among these approaches are: the principles of holistic study of the human being by B. Ananyev, the methodology of systemic psychological description by V. Gansen and G. Sukhodolsky, the idea of reflexivity of consciousness by S. Rubinstein, the humanitarian-ecological imperative of the development of consciousness by V. Zinchenko, the theory of relations by V. Myasishev, the treatment of ecological consciousness as a relation to nature by S. Deryabo and V. Yasvin, and the theories of consciousness by V. Petrenko, V. Allakhverdov, and other Russian psychologists. The value component of ecological consciousness is distinguished as the most significant. The applicability of S. Schwartz's values theory to the study of ecological values is discussed, along with the prognostic potential of the universalism value.

  4. Implantable biomedical microsystems design principles and applications

    CERN Document Server

    Bhunia, Swarup; Sawan, Mohamad

    2015-01-01

    Research and innovation in areas such as circuits, microsystems, packaging, biocompatibility, miniaturization, power supplies, remote control, reliability, and lifespan are leading to a rapid increase in the range of devices and corresponding applications in the field of wearable and implantable biomedical microsystems, which are used for monitoring, diagnosing, and controlling the health conditions of the human body. This book provides comprehensive coverage of the fundamental design principles and validation for implantable microsystems, as well as several major application areas. Each co

  5. Clinical reliability and validity of elbow functional assessment in rheumatoid arthritis.

    NARCIS (Netherlands)

    Boer, Y.A. de; Ende, C.H.M. van den; Eygendaal, D.; Jolie, I.M.M.; Hazes, J.M.W.; Rozing, P.M.

    1999-01-01

    OBJECTIVES: (1) To investigate the measurement characteristics of the Hospital for Special Surgery (HSS) and Mayo Clinic elbow assessment instruments, utilizing methodological criteria including feasibility, reliability, validity, and discriminative ability; and (2) to develop an efficient and

  6. An Internal Audit Perspective on Differences between European Corporate Governance Codes and OECD Principles

    OpenAIRE

    Raluca Ivan

    2015-01-01

    The main purpose of this research is to provide an internal-audit perspective on European Corporate Governance Codes with regard to the Organization for Economic Cooperation and Development (OECD) Principles of Corporate Governance. The research methodology used a classification of countries by legal regime, seeking a global view of the differences between the European corporate governance codes and the provisions of the OECD Principles from internal audit's perspective. T...

  7. Novel methodology for pharmaceutical expenditure forecast

    OpenAIRE

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    Background and objective: The value appreciation of new drugs across countries is today undergoing a disruption that makes the historical data used for forecasting pharmaceutical expenditure poorly reliable, and forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology for pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical e...

  8. Reliability prediction system based on the failure rate model for electronic components

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Lee, Hwa Ki

    2008-01-01

    Although many methodologies for predicting the reliability of electronic components have been developed, their predictions can be subjective under particular sets of circumstances, so it is not easy to quantify reliability. Among the reliability prediction methods are statistical analysis based methods, similarity analysis based on an external failure rate database, and methods based on physics-of-failure models. In this study, we developed a system by which the reliability of electronic components can be predicted, building on the statistical analysis method as the most readily applicable approach. The failure rate models applied are MIL-HDBK-217F Notice 2, PRISM, and Telcordia (Bellcore), and these were compared with a general purpose system in order to validate the effectiveness of the developed system. Because it can predict the reliability of electronic components from the design stage, the system we have developed is expected to contribute to enhancing the reliability of electronic components.
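    The parts-count style of statistical prediction that such a system automates can be sketched in a few lines: sum the quantity-weighted component failure rates and apply the constant-failure-rate exponential model. The component names and rates below are made-up placeholders, not MIL-HDBK-217F values.

    ```python
    import math

    # Illustrative parts-count reliability prediction. Failure rates are
    # invented placeholders (failures per 10^6 hours), not handbook data.
    parts = {
        # name: (failure rate per 1e6 h, quantity)
        "resistor":     (0.002, 40),
        "capacitor":    (0.010, 12),
        "microcircuit": (0.150, 3),
    }

    # Series assumption: the system failure rate is the sum of the parts'.
    lambda_system = sum(rate * qty for rate, qty in parts.values())

    def reliability(t_hours: float) -> float:
        """R(t) = exp(-lambda * t) for a constant-failure-rate system."""
        return math.exp(-lambda_system * t_hours / 1e6)

    print(f"system failure rate: {lambda_system:.3f} / 1e6 h")
    print(f"R(10,000 h) = {reliability(10_000):.4f}")
    ```

    With real handbook data the same structure applies; only the rate table changes.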

  9. Thermal performance envelopes for MHTGRs - Reliability by design

    International Nuclear Information System (INIS)

    Etzel, K.T.; Howard, W.W.; Zgliczynski, J.

    1992-01-01

    Thermal performance envelopes are used to specify steady-state design requirements for the systems of the modular high-temperature gas-cooled reactor (MHTGR) to maximize plant performance reliability with optimized design. The thermal performance envelopes are constructed around the expected operating point to account for uncertainties in actual as-built plant parameters and plant operation. The components are then designed to perform successfully at all points within the envelope. As a result, plant reliability is maximized by accounting for component thermal performance variation in the design. The design is optimized by providing a means to determine required margins in a disciplined and visible fashion. This is accomplished by coordinating these requirements with the various system and component designers in the early stages of the design, applying the principles of total quality management. The design is challenged by the more complex requirements associated with a range of operating conditions, but in return, a high probability of delivering reliable performance throughout the plant life is ensured.

  10. IEEE guide for the analysis of human reliability

    International Nuclear Information System (INIS)

    Dougherty, E.M. Jr.

    1987-01-01

    The Institute of Electrical and Electronics Engineers (IEEE) working group 7.4 of the Human Factors and Control Facilities Subcommittee of the Nuclear Power Engineering Committee (NPEC) has released its fifth draft of a Guide for General Principles of Human Action Reliability Analysis for Nuclear Power Generating Stations for approval by NPEC. A guide is the least mandating in the IEEE hierarchy of standards. Its purpose is to enhance the performance of a human reliability analysis (HRA) as part of a probabilistic risk assessment (PRA), to assure reproducible results, and to standardize documentation. The guide does not recommend or even discuss specific techniques, which are evolving too rapidly today. Considerable maturation in the analysis of human reliability in a PRA context has taken place in recent years. The IEEE guide on this subject is an initial step toward bringing HRA out of the research and development arena and into the toolbox of standard engineering practices.

  11. Use of reliability engineering in development and manufacturing of metal parts

    International Nuclear Information System (INIS)

    Khan, A.; Iqbal, M.A.; Asif, M.

    2005-01-01

    Reliability engineering predicts failure modes and weak links before a system is built, rather than studying failures after the fact. Reliability engineering analysis supports manufacturing economy, assembly accuracy, and qualification by testing, leading to the production of metal parts in the aerospace industry. This methodology also minimizes performance constraints in any requirement for the application of metal components in aerospace systems. Reliability engineering predicts the life of parts under loading conditions, whether dynamic or static, and reliability predictions can help engineers make decisions about component design, materials selection, and qualification under applied stress levels. Two methods of reliability prediction, i.e., Part Stress Analysis and Parts Count, have been used in this study. In this paper we discuss how these two methods can be used to measure the reliability of a system during development phases, including measuring the effect of environmental and operational variables. Equations are used to measure the reliability of each type of component, and their integration yields the system-level reliability analysis. (author)
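    In contrast to the parts-count method, the part-stress approach scales each part's base failure rate by multiplicative stress factors (temperature, environment, quality, etc.). The factor values and base rates below are illustrative assumptions, not taken from this paper or any handbook.

    ```python
    import math

    # Part-stress style prediction (sketch). All numeric values here are
    # invented for illustration; real analyses take them from handbook
    # tables as a function of measured operating stresses.
    def part_failure_rate(lambda_base, pi_temp, pi_env, pi_quality):
        """lambda_p = lambda_b * pi_T * pi_E * pi_Q (failures / 1e6 h)."""
        return lambda_base * pi_temp * pi_env * pi_quality

    part_rates = [
        part_failure_rate(0.10, pi_temp=1.8, pi_env=4.0, pi_quality=1.0),
        part_failure_rate(0.05, pi_temp=1.2, pi_env=4.0, pi_quality=2.0),
    ]

    # Series system: failure rates add, so reliabilities multiply.
    lambda_sys = sum(part_rates)
    R_mission = math.exp(-lambda_sys * 5_000 / 1e6)  # 5,000 h mission
    print(f"lambda_sys = {lambda_sys:.2f} / 1e6 h, R = {R_mission:.4f}")
    ```

    The derating decisions the abstract mentions show up here as choices that lower the pi factors.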

  12. Inferring principles for sustainable development of business through analogies from ecological systems

    Directory of Open Access Journals (Sweden)

    K. Sriram

    2013-03-01

    Full Text Available The literature in the field of sustainable development (SD of businesses is piecemeal and diverse. This paper identifies and integrates principles that businesses could use for transformation towards SD. This is done through analogical reasoning from the source context of ecological systems to the target contexts of business socio-economic systems and machine/technology systems. The methodologies of systems thinking and morphological analysis supplement the analogical reasoning. Based on this, twelve principles for sustainable development of business are inferred for business managers and policy makers.

  13. Interpretive reliability of two common MMPI-2 profiles

    Directory of Open Access Journals (Sweden)

    Mark A. Deskovitz

    2016-12-01

    Full Text Available Users of multi-scale tests like the MMPI-2 tend not to interpret scales one at a time in a way that would correspond to standard scale-level reliability information. Instead, clinicians integrate inferences from a multitude of scales simultaneously, producing a descriptive narrative that is thought to characterize the examinee. This study was an attempt to measure the reliability of such integrated interpretations using a q-sort research methodology. Participants were 20 MMPI-2 users who responded to E-mail solicitations on professional listservs and in personal emails. Each participant interpreted one of two common MMPI-2 profiles using a q-set of 100 statements designed for MMPI-2 interpretation. To measure the “interpretive reliability” of the MMPI-2 profile interpretations, q-sort descriptions were intercorrelated. Mean pairwise interpretive reliability was .39, lower than expected, and there was no significant difference in reliability between profiles. There was also not a significant difference between within-profile and cross-profile correlations. Q-set item analysis was conducted to determine which individual statements had the most impact on interpretive reliability. Although sampling in this study was limited, implications for the field reliability of MMPI-2 interpretation are sobering.
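    The intercorrelation step described above can be sketched as follows: treat each clinician's q-sort as a vector of item placements, compute the Pearson correlation for every pair of raters, and average. The toy data are invented; the study's actual q-set had 100 statements.

    ```python
    import statistics

    def pearson(x, y):
        """Pearson correlation of two equal-length score vectors."""
        mx, my = statistics.fmean(x), statistics.fmean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x)
               * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den

    def mean_pairwise_r(sorts):
        """Mean Pearson r over all rater pairs -- one simple index of the
        'interpretive reliability' of a set of q-sort descriptions."""
        rs = [pearson(sorts[i], sorts[j])
              for i in range(len(sorts)) for j in range(i + 1, len(sorts))]
        return statistics.fmean(rs)

    # Toy data: three clinicians' placements of five q-set items.
    sorts = [[5, 3, 1, 4, 2], [4, 3, 2, 5, 1], [5, 2, 1, 4, 3]]
    print(round(mean_pairwise_r(sorts), 2))
    ```

    A mean pairwise r of .39, as reported, would mean any two interpreters' narratives share only modest variance.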

  14. Vending machine assessment methodology. A systematic review.

    Science.gov (United States)

    Matthews, Melissa A; Horacek, Tanya M

    2015-07-01

    The nutritional quality of food and beverage products sold in vending machines has been implicated as a contributing factor to the development of an obesogenic food environment. How comprehensive, reliable, and valid are the current assessment tools for vending machines to support or refute these claims? A systematic review was conducted to summarize, compare, and evaluate the current methodologies and available tools for vending machine assessment. A total of 24 relevant research studies published between 1981 and 2013 met inclusion criteria for this review. The methodological variables reviewed in this study include assessment tool type, study location, machine accessibility, product availability, healthfulness criteria, portion size, price, product promotion, and quality of scientific practice. There were wide variations in the depth of the assessment methodologies and product healthfulness criteria utilized among the reviewed studies. Of the reviewed studies, 39% evaluated machine accessibility, 91% evaluated product availability, 96% established healthfulness criteria, 70% evaluated portion size, 48% evaluated price, 52% evaluated product promotion, and 22% evaluated the quality of scientific practice. Of all reviewed articles, 87% reached conclusions that provided insight into the healthfulness of vended products and/or vending environment. Product healthfulness criteria and complexity for snack and beverage products was also found to be variable between the reviewed studies. These findings make it difficult to compare results between studies. A universal, valid, and reliable vending machine assessment tool that is comprehensive yet user-friendly is recommended. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. A comparative reliability analysis of free-piston Stirling machines

    Science.gov (United States)

    Schreiber, Jeffrey G.

    2001-02-01

    A free-piston Stirling power convertor is being developed for use in an advanced radioisotope power system to provide electric power for NASA deep space missions. These missions are typically long lived, lasting for up to 14 years. The Department of Energy (DOE) is responsible for providing the radioisotope power system for the NASA missions, and has managed the development of the free-piston power convertor for this application. The NASA Glenn Research Center has been involved in the development of Stirling power conversion technology for over 25 years and is currently providing support to DOE. Due to the nature of the potential missions, long life and high reliability are important features for the power system. Substantial resources have been spent on the development of long life Stirling cryocoolers for space applications. As a very general statement, free-piston Stirling power convertors have many features in common with free-piston Stirling cryocoolers, however there are also significant differences. For example, designs exist for both power convertors and cryocoolers that use the flexure bearing support system to provide noncontacting operation of the close-clearance moving parts. This technology and the operating experience derived from one application may be readily applied to the other application. This similarity does not pertain in the case of outgassing and contamination. In the cryocooler, the contaminants normally condense in the critical heat exchangers and foul the performance. In the Stirling power convertor just the opposite is true as contaminants condense on non-critical surfaces. A methodology was recently published that provides a relative comparison of reliability, and is applicable to systems. The methodology has been applied to compare the reliability of a Stirling cryocooler relative to that of a free-piston Stirling power convertor. The reliability analysis indicates that the power convertor should be able to have superior reliability

  16. Application of exemption principles to low-level waste disposal and recycle of wastes from nuclear facilities

    International Nuclear Information System (INIS)

    Kennedy, W.E. Jr.; Hemming, C.R.; O'Donnell, F.R.; Linsley, G.S.

    1988-04-01

    The International Atomic Energy Agency (IAEA) and other international groups are considering exempting from regulatory control certain radiation sources and practices, initially under the general heading of de minimis. A significant fraction of the wastes from industry, research, medicine, and the nuclear fuel cycle are contaminated to such low levels that the associated risks to health are trivial. IAEA work has been conducted by Advisory Groups to establish principles for exemption, and to apply the principles to various areas of waste management. In the second area, the main objectives have been to illustrate a methodology for developing practical radiological criteria through the application of the IAEA preliminary exemption principles, to establish generic criteria, and to determine the practicability of the preliminary exemption principles. The method used relies on a modeling assessment of the potential radiation exposure pathways and scenarios for individuals and population groups following the unrestricted release of materials. This paper describes the IAEA's assessment methodology and presents the generic results expressed in terms of the limiting activity concentration in municipal waste and in low-activity materials for recycle and reuse. 2 refs., 2 tabs

  17. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN was developed and tested to verify this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology detects transients accurately, identifies trends reliably, and does not misinterpret a steady-state signal as a transient one.
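    The classification idea can be sketched with an assumed design: estimate the least-squares slope of a signal window, fuzzify it with triangular membership functions, and pick the strongest of the three trend categories. PROTREN's actual features and membership shapes are not given in this abstract, so the functions and thresholds below are purely illustrative.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b,
        falling to zero at c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def classify_trend(samples, dt=1.0, band=0.5):
        """Fuzzify the least-squares slope of a signal window into
        decreasing / steady / increasing categories (sketch only)."""
        n = len(samples)
        t = [i * dt for i in range(n)]
        mt, ms = sum(t) / n, sum(samples) / n
        slope = (sum((ti - mt) * (si - ms) for ti, si in zip(t, samples))
                 / sum((ti - mt) ** 2 for ti in t))
        memberships = {
            "decreasing": tri(slope, -10 * band, -band, 0.0),
            "steady":     tri(slope, -band, 0.0, band),
            "increasing": tri(slope, 0.0, band, 10 * band),
        }
        return max(memberships, key=memberships.get), memberships

    label, mu = classify_trend([100.0, 100.9, 102.1, 103.0, 104.1])
    print(label)
    ```

    The slope estimate gives some noise tolerance; a real implementation would tune `band` to each signal's noise level.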

  18. Twelve Principles for Green Energy Storage in Grid Applications.

    Science.gov (United States)

    Arbabzadeh, Maryam; Johnson, Jeremiah X; Keoleian, Gregory A; Rasmussen, Paul G; Thompson, Levi T

    2016-01-19

    The introduction of energy storage technologies to the grid could enable greater integration of renewables, improve system resilience and reliability, and offer cost effective alternatives to transmission and distribution upgrades. The integration of energy storage systems into the electrical grid can lead to different environmental outcomes based on the grid application, the existing generation mix, and the demand. Given this complexity, a framework is needed to systematically inform design and technology selection about the environmental impacts that emerge when considering energy storage options to improve sustainability performance of the grid. To achieve this, 12 fundamental principles specific to the design and grid application of energy storage systems are developed to inform policy makers, designers, and operators. The principles are grouped into three categories: (1) system integration for grid applications, (2) the maintenance and operation of energy storage, and (3) the design of energy storage systems. We illustrate the application of each principle through examples published in the academic literature, illustrative calculations, and a case study with an off-grid application of vanadium redox flow batteries (VRFBs). In addition, trade-offs that can emerge between principles are highlighted.

  19. Reliability of the Bulb Dynamometer for Assessing Grip Strength

    Directory of Open Access Journals (Sweden)

    Colleen Maher

    2018-04-01

    Full Text Available Background: Hand function is an overall indicator of health and is often measured using grip strength. Handheld dynamometry is the most common method of measuring grip strength. The purpose of this study was to determine the inter-rater and test-retest reliability, the reliability of one trial versus three trials, and preliminary norms for a young adult population using the Baseline® Pneumatic Squeeze Bulb Dynamometer (30 psi). Methods: This study used a one-group methodological design. One hundred and three healthy adults (30 males and 73 females) were recruited. Six measurements were collected for each hand per participant. The data were analyzed using Intraclass Correlation Coefficients (ICC, two-way effects model (2,2)) and paired-samples t-tests. Results: The ICC for inter-rater reliability ranged from 0.955 to 0.977. Conclusion: The results of this study suggest that the bulb dynamometer is a reliable tool to measure grip strength and should be further explored for reliable and valid use in diverse populations and as an alternative to the Jamar dynamometer.
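    The ICC(2,2) statistic used here is the Shrout–Fleiss two-way random-effects coefficient for the average of k raters. A minimal sketch of its computation from the two-way ANOVA decomposition, on invented toy readings (not the study's data):

    ```python
    import numpy as np

    def icc_2k(ratings):
        """ICC(2,k): two-way random-effects, average of k raters
        (Shrout & Fleiss). ratings: n_subjects x k_raters array."""
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        grand = x.mean()
        # Mean squares for rows (subjects), columns (raters), and error.
        ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
        sse = ((x - x.mean(axis=1, keepdims=True)
                  - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_err = sse / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)

    # Toy grip-strength readings (psi) for five participants, two raters.
    data = [[18, 17], [24, 25], [30, 29], [21, 22], [27, 27]]
    print(round(icc_2k(data), 3))
    ```

    When raters agree perfectly the coefficient is 1; values near the study's 0.955–0.977 indicate very strong inter-rater agreement.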

  20. Integration of infrared thermography into various maintenance methodologies

    Science.gov (United States)

    Morgan, William T.

    1993-04-01

    Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid progress in technical advancement has placed additional strain on maintenance organizations to change progressively. Accompanying the needs for advanced training and documentation is the demand for various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiency. Popular thought is that infrared thermography is a Predictive maintenance tool. While this is true, it is also true that it can be effectively integrated into each of the maintenance methodologies to achieve desired results. The six maintenance strategies will be defined, and infrared applications integrated into each will be presented in tabular form.

  1. A Regulatory Perspective on the Performance and Reliability of Nuclear Passive Safety Systems

    International Nuclear Information System (INIS)

    Quan, Pham Trung; Lee, Sukho

    2016-01-01

    Passive safety systems have been proven to enhance the safety of NPPs. When an accident such as a station blackout occurs, these systems can perform the following functions: decay heat removal, passive safety injection, containment cooling, and the retention of radioactive materials. Following the IAEA definitions, the use of passive safety systems reduces reliance on active components for proper actuation and does not require operator intervention in accident conditions; instead, deviations in the boundary conditions of critical process or geometric parameters activate and operate the system to perform accident prevention and mitigation functions. The main difficulties in evaluating functional failure of passive systems arise from (a) a lack of plant operational experience; (b) the scarcity of adequate experimental data from integral test facilities or separate-effect tests for understanding the performance characteristics of these systems, not only in normal operation but also during accidents and transients; (c) the lack of accepted definitions of failure modes for these systems; and (d) the difficulty of modeling certain physical behavior of these systems. Reliability assessment of passive safety systems (PSS) thus remains an important issue. Several reliability methodologies, such as REPAS, RMPS, and ASPRA, have been applied to these assessments. However, some issues remain unresolved owing to limited understanding of the treatment of dynamic failure characteristics of PSS components, the treatment of dynamic variation of independent process parameters such as ambient temperature, and the functional failure criteria of the PSS. Dynamic reliability methodologies should be integrated into PSS reliability analysis to obtain a true estimate of system failure probability. The methodology should estimate the physical variation of the parameters and the frequency of the accident sequences when dynamic effects are considered.

  2. The study of evaluation methodology of the aging and degradation researches

    International Nuclear Information System (INIS)

    Cho, C. J.; Park, Z. H.; Jeong, I. S.

    2001-01-01

    To judge the usefulness of aging-related research, such as PLIM (Plant Lifetime Management) and aging-related degradation in PSR (Periodic Safety Review), the evaluation methodologies for such R&D proposed to date are reviewed. The infometric methodology is considered the optimum method for evaluating nuclear-related research. Finally, to increase the objectivity and reliability of the infometric methodology for aging and degradation research, indexes of safety, technology, and economics are introduced. This study shows that the infometric methodology has an advantage over other methodologies for practical engineering evaluation of nuclear-related research, but further work requires the effective construction of a database and a survey of the various statistics in technical reports and papers.

  3. A RELIABILITY TEST USED FOR THE DEVELOPMENT OF A LOYALTY SCALE

    Directory of Open Access Journals (Sweden)

    Florin-Alexandru LUCA

    2017-06-01

    Full Text Available The development of a loyalty model involves the construction of a proper research instrument. For the loyalty model of the clients for financial services, the pre-testing of the research questionnaire represents a significant stage. This article presents the methodology used in this stage for testing the reliability of a loyalty scale. Firstly, this implies choosing the appropriate scales for each variable included in the suggested research model. Secondly, the internal consistency for each of these scales is measured as an indicator of their reliability. The reliability analysis described represents an essential stage in building a measurement instrument for a loyalty model.
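    The internal-consistency measurement described above is conventionally done with Cronbach's alpha: the ratio of shared to total variance across the items of a scale. A minimal sketch on invented Likert responses (the article does not publish its items or data):

    ```python
    import statistics

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for internal consistency.
        item_scores: list of items, each a list of respondent scores."""
        k = len(item_scores)
        item_vars = [statistics.variance(item) for item in item_scores]
        totals = [sum(vals) for vals in zip(*item_scores)]  # per respondent
        return k / (k - 1) * (1 - sum(item_vars) / statistics.variance(totals))

    # Toy 5-point Likert responses for a 3-item loyalty scale (invented).
    items = [
        [4, 5, 3, 4, 2, 5],
        [4, 4, 3, 5, 2, 4],
        [5, 5, 2, 4, 3, 4],
    ]
    print(round(cronbach_alpha(items), 2))
    ```

    A common rule of thumb accepts a scale when alpha exceeds about 0.7; items that depress alpha are candidates for rewording or removal during pre-testing.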

  4. HTGR plant availability and reliability evaluations. Volume I. Summary of evaluations

    International Nuclear Information System (INIS)

    Cadwallader, G.J.; Hannaman, G.W.; Jacobsen, F.K.; Stokely, R.J.

    1976-12-01

    The report (1) describes a reliability assessment methodology for systematically locating and correcting areas which may contribute to unavailability of new and uniquely designed components and systems, (2) illustrates the methodology by applying it to such components in a high-temperature gas-cooled reactor [Public Service Company of Colorado's Fort St. Vrain 330-MW(e) HTGR], and (3) compares the results of the assessment with actual experience. The methodology can be applied to any component or system; however, it is particularly valuable for assessments of components or systems which provide essential functions, or the failure or mishandling of which could result in relatively large economic losses

  5. HTGR plant availability and reliability evaluations. Volume I. Summary of evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Cadwallader, G.J.; Hannaman, G.W.; Jacobsen, F.K.; Stokely, R.J.

    1976-12-01

    The report (1) describes a reliability assessment methodology for systematically locating and correcting areas which may contribute to unavailability of new and uniquely designed components and systems, (2) illustrates the methodology by applying it to such components in a high-temperature gas-cooled reactor (Public Service Company of Colorado's Fort St. Vrain 330-MW(e) HTGR), and (3) compares the results of the assessment with actual experience. The methodology can be applied to any component or system; however, it is particularly valuable for assessments of components or systems which provide essential functions, or the failure or mishandling of which could result in relatively large economic losses.

  6. Human reliability analysis of performing tasks in plants based on fuzzy integral

    International Nuclear Information System (INIS)

    Washio, Takashi; Kitamura, Yutaka; Takahashi, Hideaki

    1991-01-01

    Effective improvement of working conditions in nuclear power plants could enhance operational safety. Human reliability analysis (HRA) gives a methodological basis for such improvement, based on the evaluation of human reliability under various working conditions. This study investigates some difficulties of human reliability analysis using conventional linear models and recent fuzzy integral models, and provides solutions to those difficulties. The following practical features of the proposed methods are confirmed in comparison with conventional methods: (1) applicability to various types of tasks; (2) capability of evaluating complicated dependencies among working-condition factors; (3) a priori human reliability evaluation based on a systematic task analysis of human action processes; (4) a scheme for converting indices representing human reliability into probabilities. (author)
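    The advantage of a fuzzy integral over a linear (weighted-sum) model is that it can express interaction between working-condition factors through a non-additive fuzzy measure. A minimal sketch of the discrete Choquet integral, with invented factor names and measure values (the paper's actual measure is not given in this abstract):

    ```python
    def choquet(values, mu):
        """Discrete Choquet integral of factor scores w.r.t. a fuzzy measure.
        values: {factor: score}; mu: {frozenset of factors: weight} with
        mu[frozenset()] == 0 and mu[frozenset(all factors)] == 1."""
        items = sorted(values.items(), key=lambda kv: kv[1])  # ascending
        total, prev = 0.0, 0.0
        for i, (_, score) in enumerate(items):
            # Coalition of all factors whose score is >= the current layer.
            coalition = frozenset(name for name, _ in items[i:])
            total += (score - prev) * mu[coalition]
            prev = score
        return total

    # Invented fuzzy measure over two working-condition factors; the
    # superadditive weight on the pair (1.0 > 0.4 + 0.5) expresses that
    # poor workload and poor interface reinforce each other.
    mu = {
        frozenset(): 0.0,
        frozenset({"workload"}): 0.4,
        frozenset({"interface"}): 0.5,
        frozenset({"workload", "interface"}): 1.0,
    }
    score = choquet({"workload": 0.6, "interface": 0.9}, mu)
    print(round(score, 2))
    ```

    With an additive measure the Choquet integral reduces to an ordinary weighted mean, which is exactly the linear model the study compares against.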

  7. The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale

    Science.gov (United States)

    Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine

    2013-01-01

    Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…

  8. Constructing the principles: Method and metaphysics in the progress of theoretical physics

    Science.gov (United States)

    Glass, Lawrence C.

    This thesis presents a new framework for the philosophy of physics focused on methodological differences found in the practice of modern theoretical physics. The starting point for this investigation is the longstanding debate over scientific realism. Some philosophers have argued that it is the aim of science to produce an accurate description of the world including explanations for observable phenomena. These scientific realists hold that our best confirmed theories are approximately true and that the entities they propose actually populate the world, whether or not they have been observed. Others have argued that science achieves only frameworks for the prediction and manipulation of observable phenomena. These anti-realists argue that truth is a misleading concept when applied to empirical knowledge. Instead, focus should be on the empirical adequacy of scientific theories. This thesis argues that the fundamental distinction at issue, a division between true scientific theories and ones which are empirically adequate, is best explored in terms of methodological differences. In analogy with the realism debate, there are at least two methodological strategies. Rather than focusing on scientific theories as wholes, this thesis takes as units of analysis physical principles which are systematic empirical generalizations. The first possible strategy, the conservative, takes the assumption that the empirical adequacy of a theory in one domain serves as good evidence for such adequacy in other domains. This then motivates the application of the principle to new domains. The second strategy, the innovative, assumes that empirical adequacy in one domain does not justify the expectation of adequacy in other domains. New principles are offered as explanations in the new domain. The final part of the thesis is the application of this framework to two examples. In the first, Lorentz's use of the aether is reconstructed in terms of the conservative strategy with respect to

  9. Suppression of panel flutter of near-space aircraft based on non-probabilistic reliability theory

    Directory of Open Access Journals (Sweden)

    Ye-Wei Zhang

    2016-03-01

Full Text Available Active vibration control of composite panels with uncertain parameters in hypersonic flow is studied using non-probabilistic reliability theory. Using piezoelectric patches as active control actuators, the dynamic equations of the panel are established by the finite element method and Hamilton's principle, and a control model of the panel with uncertain parameters is obtained. Based on H∞ robust control theory and non-probabilistic reliability theory, a non-probabilistic reliability performance function is given, and the relationships between the robust controller, the H∞ performance index, and reliability are established. Numerical results show that the control method, under the influence of reliability, the H∞ performance index, and approaching velocity, effectively suppresses panel vibration over the whole interval of uncertain parameters.
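The reliability measure described above can be illustrated with a minimal sketch of the interval (non-probabilistic) reliability index, often defined as the ratio of the midpoint to the radius of the performance-function interval; the function name and the flutter-margin numbers below are assumptions for illustration, not values from the paper.

```python
# Non-probabilistic (interval) reliability index: eta = g_c / g_r, where
# g_c and g_r are the midpoint and radius of the interval of the
# performance function g over the uncertain-parameter box.
# All names and numbers here are illustrative, not from the study.

def interval_reliability_index(g_lower, g_upper):
    """eta > 1 means g > 0 over the whole uncertainty interval (reliable)."""
    g_c = 0.5 * (g_upper + g_lower)   # interval midpoint
    g_r = 0.5 * (g_upper - g_lower)   # interval radius
    if g_r == 0:
        return float("inf") if g_c > 0 else float("-inf")
    return g_c / g_r

# Example: a flutter margin g = V_critical - V_approach evaluated over the
# uncertain parameters yields an interval, say [20.0, 80.0] m/s.
eta = interval_reliability_index(20.0, 80.0)
print(eta)  # 50/30 ≈ 1.67 > 1, so the panel is reliable over the interval
```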

  10. Modeling reliability of power systems substations by using stochastic automata networks

    International Nuclear Information System (INIS)

    Šnipas, Mindaugas; Radziukynas, Virginijus; Valakevičius, Eimutis

    2017-01-01

In this paper, the stochastic automata network (SAN) formalism is applied to model the reliability of power system substations. The proposed strategy reduces the size of the state space of the Markov chain model and simplifies system specification. Two case studies of standard substation configurations are considered in detail, and SAN models with different assumptions were created. The SAN approach is compared with exact reliability calculation using a minimal path set method. Modeling results showed that total independence of automata can be assumed for relatively small power system substations with reliable equipment; in this case, implementing the Markov chain model with the SAN method is a relatively easy task. - Highlights: • We present a methodology for applying the stochastic automata network formalism to create Markov chain models of power systems. • The stochastic automata network approach is combined with minimal path sets and structural functions. • Two models of substation configurations with different model assumptions are presented to illustrate the proposed methodology. • Modeling results of systems with independent automata and functional transition rates are similar. • The conditions under which total independence of automata can be assumed are addressed.
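When total independence of automata can be assumed, as the abstract notes for small substations, the SAN generator is the Kronecker sum of the per-component generators and the stationary distribution factorizes. A minimal sketch (not the authors' code; the failure and repair rates are invented) verifies this on two identical repairable components:

```python
# Sketch: SAN of two independent repairable components as a Kronecker sum.
# Rates lam (failure) and mu (repair) are illustrative, not from the paper.
import numpy as np

def generator(lam, mu):
    # two-state automaton: state 0 = up, state 1 = down
    return np.array([[-lam, lam],
                     [mu, -mu]])

def kron_sum(A, B):
    # generator of the joint chain of two independent automata
    return np.kron(A, np.eye(B.shape[0])) + np.kron(np.eye(A.shape[0]), B)

lam, mu = 0.01, 0.5
Q = kron_sum(generator(lam, mu), generator(lam, mu))  # 4-state joint chain

# stationary distribution: pi Q = 0 with sum(pi) = 1
A = np.vstack([Q.T, np.ones(Q.shape[0])])
b = np.zeros(Q.shape[0] + 1); b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

avail_one = mu / (lam + mu)   # single-component steady-state availability
print(pi[0], avail_one ** 2)  # both-up probability equals the product
```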

  11. Reliability and Availability Evaluation of Wireless Sensor Networks for Industrial Applications

    Science.gov (United States)

    Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco

    2012-01-01

    Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damages. Among the different type of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach enabling the evaluation of permanent faults, prevents system designers to optimize decisions that minimize these occurrences. In this work we propose a methodology based on an automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks, when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize its reliability and availability requirements. PMID:22368497
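The fault-tree evaluation the authors describe can be sketched, under strong simplifying assumptions (independent, exponentially distributed permanent failures; a toy topology with invented device names and rates), by inclusion-exclusion over the tree's minimal cut sets:

```python
# Illustrative sketch only: device names, rates, and cut sets are invented.
import itertools
import math

def rel(lmbda, t):
    # device reliability at time t under exponential permanent failures
    return math.exp(-lmbda * t)

# Toy WSN: the system fails if the sink fails, or both redundant routers fail.
failure_rates = {"sink": 1e-6, "router1": 1e-5, "router2": 1e-5}  # per hour
cut_sets = [{"sink"}, {"router1", "router2"}]

def system_reliability(t):
    # exact system unreliability via inclusion-exclusion over cut-set events
    q = {d: 1.0 - rel(l, t) for d, l in failure_rates.items()}
    unrel = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in itertools.combinations(cut_sets, k):
            union = set().union(*combo)
            term = math.prod(q[d] for d in union)
            unrel += term if k % 2 == 1 else -term
    return 1.0 - unrel

print(system_reliability(8760.0))  # mission time of one year, in hours
```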

  12. Methodology for maintenance analysis based on hydroelectric power stations reliability; Metodologia para realizar analisis de mantenimiento basado en confiabilidad en centrales hidroelectricas

    Energy Technology Data Exchange (ETDEWEB)

    Rea Soto, Rogelio; Calixto Rodriguez, Roberto; Sandoval Valenzuela, Salvador; Velasco Flores, Rocio; Garcia Lizarraga, Maria del Carmen [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2012-07-01

A methodology to carry out Reliability Centered Maintenance (RCM) studies for hydroelectric power plants is presented. The methodology is an implementation/extension of the guidelines proposed by the Engineering Society for Advanced Mobility Land, Sea and Space in the SAE-JA1012 standard. To answer the first five questions set out in that standard, the use of standard ISO 14224 is strongly recommended; this approach standardizes failure mechanisms and aligns RCM studies with the process of collecting failure and maintenance data. The use of risk matrices to rank the importance of each failure based on a risk criterion is also proposed. [Translated from Spanish] A methodology for performing Reliability Centered Maintenance (RCM) studies applied to the hydroelectric industry is presented. The methodology is an implementation/extension, by the authors of this work, of the guidelines proposed by the Engineering Society for Advanced Mobility Land, Sea and Space in the SAE-JA1012 standard. To answer the first five questions of the standard, it is proposed to build on the component failure modes and mechanisms documented in the failure-data collection guide of the ISO-14224 standard. This approach standardizes the description of equipment failure mechanisms, both in the RCM study and in the failure- and maintenance-data collection process, which feeds back into the continuous-improvement cycle of RCM processes. The use of risk matrices to rank the importance of failure mechanisms according to risk level is also proposed.
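The proposed risk-matrix ranking can be sketched as follows; the categories, thresholds, and hydro failure modes are invented for illustration and are not taken from the methodology itself.

```python
# Illustrative risk matrix: likelihood x severity -> risk level.
# Categories, scores, and failure modes are invented examples.
SEVERITY = {"minor": 1, "moderate": 2, "major": 3, "catastrophic": 4}
LIKELIHOOD = {"rare": 1, "occasional": 2, "frequent": 3}

def risk_level(likelihood, severity):
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score >= 9:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

failure_modes = [
    ("turbine bearing seizure", "rare", "catastrophic"),
    ("governor drift", "occasional", "moderate"),
    ("seal leakage", "frequent", "minor"),
]

# rank failure modes by descending risk score
ranked = sorted(failure_modes,
                key=lambda fm: LIKELIHOOD[fm[1]] * SEVERITY[fm[2]],
                reverse=True)
for name, lik, sev in ranked:
    print(name, risk_level(lik, sev))
```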

  13. Reliability analysis of numerical simulation in near field behavior

    International Nuclear Information System (INIS)

    Kobayashi, Akira; Yamamoto, Kiyohito; Chijimatsu, Masakazu; Fujita, Tomoo

    2008-01-01

The effects of uncertainties in the boundary conditions, the elastic modulus, and Poisson's ratio on the mechanical behavior in the near field of a high-level radioactive waste repository were examined. The method used to examine the error propagation was the first-order second-moment method. The reliability of the maximum principal stress and the maximum shear stress at the crown of the tunnel, and of the minimum principal stress at the spring line, was examined over one million years. For the elastic model, the reliability of the maximum shear stress gradually decreased while that of the maximum principal stress increased; that of the minimum principal stress remained relatively low over one million years. This tendency was similar to that obtained from the damage model. (author)
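The first-order second-moment method used here propagates the means and standard deviations of uncertain inputs through a performance function via its first-order Taylor expansion. A minimal sketch with an invented linear limit state (not the repository model):

```python
# FOSM sketch: first-order mean and standard deviation of g(X) for
# independent inputs X_i. The limit state and numbers are placeholders.
import math

def fosm(g, means, stds, h=1e-6):
    """Return (mu_g, sigma_g) from a first-order Taylor expansion of g."""
    mu_g = g(*means)
    var_g = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        x_hi = list(means); x_hi[i] = m + h
        x_lo = list(means); x_lo[i] = m - h
        dg = (g(*x_hi) - g(*x_lo)) / (2 * h)  # central-difference gradient
        var_g += (dg * s) ** 2
    return mu_g, math.sqrt(var_g)

# Toy limit state g = strength - stress; reliability index beta = mu_g/sigma_g
mu_g, sigma_g = fosm(lambda r, s: r - s, means=[300.0, 200.0], stds=[30.0, 20.0])
beta = mu_g / sigma_g
print(beta)  # 100 / sqrt(30^2 + 20^2) ≈ 2.77
```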

  14. Validity and reliability of the Persian version of mobile phone addiction scale

    OpenAIRE

    Mazaheri, Maryam Amidi; Karbasi, Mojtaba

    2014-01-01

Background: With regard to the large number of mobile users, especially among college students in Iran, addiction to mobile phones is attracting increasing concern. There is an urgent need for a reliable and valid instrument to measure this phenomenon. This study examines the validity and reliability of the Persian version of the mobile phone addiction scale (MPAIS) in college students. Materials and Methods: This methodological study was done at Isfahan University of Medical Sciences. One thousand one hundr...

  15. Quantitative reliability assessment for safety critical system software

    International Nuclear Information System (INIS)

    Chung, Dae Won; Kwon, Soon Man

    2005-01-01

An essential issue in the replacement of old analogue I and C with computer-based digital systems in nuclear power plants is quantitative software reliability assessment. Software reliability models have been successfully applied to many industrial applications, but have the unfortunate drawback of requiring data from which one can formulate a model. Software developed for safety-critical applications is frequently unable to produce such data for at least two reasons: first, the software is frequently one-of-a-kind, and second, it rarely fails. Safety-critical software is normally expected to pass every unit test, producing precious little failure data. The basic premise of the rare-events approach is that well-tested software does not fail under normal routines and input signals, which means that failures must be triggered by unusual input data and computer states. The failure data found under reasonable testing cases and testing times for these conditions should be considered for the quantitative reliability assessment. We present a quantitative reliability assessment methodology for safety-critical software in rare failure cases in this paper
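One standard way to quantify reliability for well-tested software that never fails in testing, shown here only as an illustrative sketch and not necessarily the authors' exact method, is a zero-failure upper confidence bound on an assumed-constant failure rate:

```python
# Zero-failure bound: if software accumulates T hours of failure-free
# testing under representative inputs, a one-sided upper confidence bound
# on a constant failure rate lambda is -ln(alpha) / T. Illustrative only.
import math

def lambda_upper_bound(test_hours, confidence=0.95):
    alpha = 1.0 - confidence
    return -math.log(alpha) / test_hours

T = 10_000.0  # hours of failure-free testing (invented figure)
print(lambda_upper_bound(T))  # ≈ 3.0e-4 failures/hour at 95% confidence
```

More test time tightens the bound linearly, which is why such arguments demand very long failure-free test campaigns for safety-critical targets.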

  16. Improving patient care in cardiac surgery using Toyota production system based methodology.

    Science.gov (United States)

    Culig, Michael H; Kunkle, Richard F; Frndak, Diane C; Grunden, Naida; Maher, Thomas D; Magovern, George J

    2011-02-01

A new cardiac surgery program was developed in a community hospital setting using the operational excellence (OE) method, which is based on the principles of the Toyota production system. The initial results of the first 409 heart operations, performed over the 28 months between March 1, 2008, and June 30, 2010, are presented. Operational excellence methodology was taught to the cardiac surgery team; coaching started 2 months before the opening of the program and continued for 24 months. Of the 409 cases presented, 253 were isolated coronary artery bypass graft operations. One operative death occurred. According to the database maintained by The Society of Thoracic Surgeons, the risk-adjusted operative mortality rate was 61% lower than the regional rate. Likewise, the risk-adjusted rate of major complications was 57% lower than The Society of Thoracic Surgeons regional rate. Daily problem solving to determine cause was attempted on 923 distinct perioperative problems by all team members. Using the cost of complications as described by Speir and coworkers, avoiding predicted complications resulted in a savings of at least $884,900 compared with the regional average. Through the systematic use of a real-time, highly formatted problem-solving methodology, processes of care improved daily. Using carefully disciplined teamwork, reliable implementation of evidence-based protocols was realized by empowering the front line to make improvements. Low rates of complications were observed, and a cost savings of $3,497 per case of isolated coronary artery bypass graft was realized. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  17. Reliability-Based Optimal Design for Very Large Floating Structure

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shu-hua(张淑华); FUJIKUBO Masahiko

    2003-01-01

Costs and losses induced by possible future extreme environmental conditions, together with the difficulty of repairing post-yielding damage, strongly suggest the need for proper consideration in design rather than just life-loss prevention. This can be addressed through the development of a design methodology that balances the initial cost of a very large floating structure (VLFS) against the expected potential losses resulting from future extreme wave-induced structural damage. Here, the development of a methodology for determining an optimal, cost-effective design is presented and applied to a VLFS located in Tokyo Bay. Optimal design criteria are determined based on the total expected life-cycle cost and the acceptable damage probability and curvature of the structure, and a set of structural sizes is obtained. The methodology and applications require expressions for the initial cost and the expected life-cycle damage cost as functions of the design variables. This study includes the methodology, the total life-cycle cost function, structural damage modeling, and reliability analysis.
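The cost-balancing criterion described above, minimizing initial cost plus expected damage cost, can be sketched with an invented one-variable cost model; the functional forms and numbers are illustrative, not from the VLFS study:

```python
# Reliability-based design optimization sketch: choose the design variable
# (here a plate "thickness") minimizing initial cost plus expected damage
# cost. The cost model is invented for illustration.
import math

def expected_total_cost(thickness):
    initial = 50.0 * thickness             # initial cost grows with size
    p_fail = math.exp(-0.8 * thickness)    # damage probability falls with size
    damage_cost = 5000.0                   # consequence cost if damage occurs
    return initial + p_fail * damage_cost

# crude 1-D grid search over candidate thicknesses
candidates = [t / 10 for t in range(10, 101)]
best = min(candidates, key=expected_total_cost)
print(best, expected_total_cost(best))
```

The optimum sits where the marginal initial cost equals the marginal reduction in expected damage cost, which is exactly the balance the abstract describes.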

  18. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering a wide range of problems. In this paper, only the computer aided flowsheet design related features are presented.
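The group contribution approach at the core of the methodology estimates a molecular property as a sum of contributions from the functional groups making up the molecule. A minimal sketch for normal boiling point, using contribution values commonly quoted for the Joback method (treat both the values and the function names as illustrative):

```python
# Group contribution sketch: property = constant + sum of group contributions.
# Tb contributions (in kelvin) as commonly quoted for the Joback method;
# treat as illustrative, not authoritative.
GROUP_TB = {"CH3": 23.58, "CH2": 22.88, "OH": 92.88}

def boiling_point(groups):
    """Estimate normal boiling point (K) from group counts."""
    return 198.2 + sum(GROUP_TB[g] * n for g, n in groups.items())

# ethanol decomposes into one CH3, one CH2, and one OH group
print(boiling_point({"CH3": 1, "CH2": 1, "OH": 1}))
```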

  19. Can we reliably benchmark health technology assessment organizations?

    Science.gov (United States)

    Drummond, Michael; Neumann, Peter; Jönsson, Bengt; Luce, Bryan; Schwartz, J Sanford; Siebert, Uwe; Sullivan, Sean D

    2012-04-01

    In recent years, there has been growth in the use of health technology assessment (HTA) for making decisions about the reimbursement, coverage, or guidance on the use of health technologies. Given this greater emphasis on the use of HTA, it is important to develop standards of good practice and to benchmark the various HTA organizations against these standards. This study discusses the conceptual and methodological challenges associated with benchmarking HTA organizations and proposes a series of audit questions based on a previously published set of principles of good practice. It is concluded that a benchmarking exercise would be feasible and useful, although the question of who should do the benchmarking requires further discussion. Key issues for further research are the alternative methods for weighting the various principles and for generating an overall score, or summary statement of adherence to the principles. Any weighting system, if developed, would need to be explored in different jurisdictions to assess the extent to which the relative importance of the principles is perceived to vary. Finally, the development and precise wording of the audit questions requires further study, with a view to making the questions as unambiguous as possible, and the reproducibility of the assessments as high as possible.
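The weighting-and-scoring question the study raises can be made concrete with a minimal sketch of an overall adherence score as a weighted mean of per-principle audit scores; the principles, scores, and weights below are invented examples, not the published set:

```python
# Weighted adherence score sketch for benchmarking an HTA organization.
# Principle names, audit scores (0-1), and weights are invented.
principles = {
    "transparency": 0.9,
    "stakeholder involvement": 0.7,
    "timeliness": 0.5,
}
weights = {
    "transparency": 0.5,
    "stakeholder involvement": 0.3,
    "timeliness": 0.2,
}

# overall score = weighted mean of per-principle scores
overall = sum(principles[p] * weights[p] for p in principles) / sum(weights.values())
print(round(overall, 2))
```

As the abstract notes, the weights themselves are the contentious part: a different jurisdiction's weighting would yield a different overall score from the same audit answers.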

  20. Reliability of physical functioning tests in patients with low back pain: a systematic review.

    Science.gov (United States)

    Denteneer, Lenie; Van Daele, Ulrike; Truijen, Steven; De Hertogh, Willem; Meirte, Jill; Stassijns, Gaetane

    2018-01-01

The aim of this study was to provide a comprehensive overview of physical functioning tests in patients with low back pain (LBP) and to investigate their reliability. A systematic computerized search was completed in four databases on June 24, 2017: PubMed, Web of Science, Embase, and MEDLINE. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed during all stages of this review. Clinical studies that investigate the reliability of physical functioning tests in patients with LBP were eligible. The methodological quality of the included studies was assessed with the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN) checklist. To reach final conclusions on the reliability of the identified clinical tests, the current review assessed three factors: outcome assessment, methodological quality, and consistency of description. A total of 20 studies were found eligible and 38 clinical tests were identified. Good overall test-retest reliability was concluded for the extensor endurance test (intraclass correlation coefficient [ICC]=0.93-0.97), the flexor endurance test (ICC=0.90-0.97), the 5-minute walking test (ICC=0.89-0.99), the 50-ft walking test (ICC=0.76-0.96), the shuttle walk test (ICC=0.92-0.99), the sit-to-stand test (ICC=0.91-0.99), and the loaded forward reach test (ICC=0.74-0.98). For inter-rater reliability, only one test, the Biering-Sörensen test (ICC=0.88-0.99), could be concluded to have overall good inter-rater reliability. None of the identified clinical tests could be concluded to have good intra-rater reliability. Further investigation should focus on better overall study methodology and the use of identical protocols for the description of clinical tests. The assessment of reliability is only a first step in the recommendation process for the use of clinical tests. In future research, the identified clinical tests in the
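The test-retest figures above are intraclass correlation coefficients. A one-way random-effects ICC(1,1), one common variant, can be computed as in this sketch; the patient data are invented for illustration:

```python
# One-way random-effects ICC(1,1) for test-retest data.
# scores: rows = subjects, columns = repeated trials. Data are invented.
import numpy as np

def icc_1_1(scores):
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    # between-subjects and within-subjects mean squares (one-way ANOVA)
    msb = k * ((row_means - grand) ** 2).sum() / (n - 1)
    msw = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# five patients, two repetitions of an endurance test (seconds, invented)
data = [[110, 112], [85, 88], [140, 135], [95, 99], [120, 118]]
print(round(icc_1_1(data), 3))
```

Values near 1 indicate that between-subject differences dominate trial-to-trial noise, which is what the "good reliability" conclusions above correspond to.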