WorldWideScience

Sample records for reliability principles methodology

  1. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts such as failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown.
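The failure probability and reliability index mentioned above can be illustrated with a minimal sketch, assuming independent, normally distributed resistance R and load effect S (the numbers are illustrative, not from the paper):

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """Cornell reliability index for independent normal resistance R
    and load effect S; the limit state is g = R - S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

def failure_probability(beta):
    """P_f = Phi(-beta), computed via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

beta = reliability_index(mu_r=10.0, sigma_r=1.0, mu_s=6.0, sigma_s=1.0)
pf = failure_probability(beta)
```

For beta around 2.8 this gives a failure probability on the order of 10^-3; design codes typically target beta values of roughly 3 to 4 for ultimate limit states.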

  2. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  3. Methodology for allocating reliability and risk

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1986-05-01

This report describes a methodology for reliability and risk allocation in nuclear power plants. The work investigates the technical feasibility of allocating reliability and risk, which are expressed in a set of global safety criteria and which may not necessarily be rigid, to various reactor systems, subsystems, components, operations, and structures in a consistent manner. The report also provides general discussions on the problem of reliability and risk allocation. The problem is formulated as a multiattribute decision analysis paradigm. The work mainly addresses the first two steps of a typical decision analysis, i.e., (1) identifying alternatives, and (2) generating information on outcomes of the alternatives, by performing a multiobjective optimization on a PRA model and reliability cost functions. The multiobjective optimization serves as the guiding principle for reliability and risk allocation. The concept of "noninferiority" is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The final step of decision analysis, i.e., assessment of the decision maker's preferences, could then be performed more easily on the noninferior solution set. Some results of applying the methodology to a nontrivial risk model are provided, and several outstanding issues such as generic allocation, preference assessment, and uncertainty are discussed. 29 refs., 44 figs., 39 tabs
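The noninferior (Pareto-optimal) solution set that the report centres on can be sketched with a simple dominance filter over hypothetical (risk, cost) alternatives, both to be minimized:

```python
def noninferior(points):
    """Keep the noninferior (Pareto-optimal) points when both coordinates,
    e.g. (risk, cost), are to be minimized."""
    return [p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)]

# hypothetical allocation alternatives: (risk measure, cost)
alts = [(0.10, 1.0), (0.05, 2.0), (0.08, 1.5), (0.12, 3.0)]
front = noninferior(alts)
```

Here (0.12, 3.0) is dominated by (0.10, 1.0) and drops out; the decision maker's preferences then need only be assessed over the remaining three alternatives.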

  4. Contextual factors, methodological principles and teacher cognition

    Directory of Open Access Journals (Sweden)

    Rupert Walsh

    2014-01-01

Teachers in various contexts worldwide are sometimes unfairly criticized for not putting teaching methods developed for the well-resourced classrooms of Western countries into practice. Factors such as the teachers’ “misconceptualizations” of “imported” methods, including Communicative Language Teaching (CLT), are often blamed, though the challenges imposed by “contextual demands,” such as large class sizes, are sometimes recognised. Meanwhile, there is sometimes an assumption that in the West there is a happy congruence between policy supportive of CLT or Task-Based Language Teaching, teacher education and supervision, and curriculum design on the one hand, and teachers’ cognitions and practices on the other. Our case study of three EFL teachers at a UK adult education college is motivated by a wish to question this assumption. Findings from observational and interview data suggest the practices of two teachers were largely consistent with their methodological principles, relating to stronger and weaker forms of CLT respectively, as well as with more general educational principles, such as a concern for learners; the supportive environment seemed to help. The third teacher appeared to put “difficult” contextual factors, for example tests, ahead of methodological principles without, however, obviously benefiting. Implications highlight the important role of teacher cognition research in challenging cultural assumptions.

  5. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period
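A crude, non-adaptive Monte Carlo version of the time-dependent reliability evaluation described above might look like the following; the linear degradation law, Poisson load process, and all parameter values are illustrative assumptions, not the report's models:

```python
import random

def lifetime_failure_prob(n_sim=20000, years=40.0, load_rate=0.5,
                          degrade=0.01, seed=1):
    """Estimate the probability that a structure with linearly degrading
    strength fails under Poisson-arriving extreme loads (illustrative)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sim):
        r0 = rng.gauss(10.0, 1.0)               # random initial strength
        t = rng.expovariate(load_rate)           # time of first load event
        while t < years:
            strength = r0 * (1.0 - degrade * t)  # fraction of strength lost per year
            if rng.gauss(5.0, 1.5) > strength:   # load exceeds degraded strength
                failures += 1
                break
            t += rng.expovariate(load_rate)      # next load event
    return failures / n_sim

pf = lifetime_failure_prob()
```

The report's adaptive simulation refines sampling around the failure region; this brute-force sketch only illustrates the time-dependent structure of the problem, e.g. that a faster degradation rate raises the lifetime failure probability.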

  6. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
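The Bayes-factor comparison at the heart of the method can be sketched as a ratio of likelihoods of the experimental observation under two hypotheses. The normal likelihood and the vague alternative used here are simple illustrative modelling choices, not the paper's exact formulation:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_factor(observed, predicted, sigma_test, sigma_vague):
    """B = p(data | model valid) / p(data | vague alternative).
    The alternative is evaluated at its own mode, i.e. a maximum-likelihood
    alternative -- one simple choice among several possible ones."""
    return (normal_pdf(observed, predicted, sigma_test)
            / normal_pdf(observed, observed, sigma_vague))

b = bayes_factor(observed=0.012, predicted=0.010, sigma_test=0.002, sigma_vague=0.02)
```

B > 1 favours accepting the computational model; a threshold such as B > 3 ("substantial" on Jeffreys' scale) can encode the acceptance criterion for decision-makers.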

  7. CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology

    Science.gov (United States)

Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayali, Sammy

    2006-01-01

    This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.

  8. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology, and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of the individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain: primarily, to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are, and always will be, a forecast for the future

  9. Health economic evaluation: important principles and methodology.

    Science.gov (United States)

    Rudmik, Luke; Drummond, Michael

    2013-06-01

    To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
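Two of the listed ingredients, discounting and the incremental cost-effectiveness ratio, reduce to small calculations. The discount rate and the cost/QALY streams below are hypothetical:

```python
def present_value(stream, rate=0.035):
    """Discount a yearly stream of costs or QALYs; year 0 is undiscounted."""
    return sum(x / (1.0 + rate) ** t for t, x in enumerate(stream))

def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of intervention A versus B:
    extra cost per extra QALY gained."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# hypothetical: 5 years of annual costs, 5 years lived at utility 0.8
costs = present_value([1000.0] * 5)
qalys = present_value([0.8] * 5)
```

Discounting at 3.5% shrinks the 5-year cost total from 5000 to roughly 4673; the same factor applies to the QALY stream, which is why both are conventionally discounted at the same rate.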

  10. Methodological principles for optimising functional MRI experiments

    International Nuclear Information System (INIS)

    Wuestenberg, T.; Giesel, F.L.; Strasburger, H.

    2005-01-01

Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, optimisation of certain experimental parameters allows reliable results to be obtained. In this article, approaches for optimising the experimental design, imaging parameters and analytic strategies are discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.)

  11. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between the organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks not only easily models the causal relationships between organizational factors and human reliability, but also, in a given context, allows human operational reliability to be measured quantitatively and the most likely root causes of human error to be identified and prioritized. (authors)
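The causal and diagnostic inference the authors perform with Bayesian networks can be illustrated on the smallest possible network, one organizational-factor node feeding one human-error node; the probabilities are invented for illustration:

```python
# Minimal two-node Bayesian network: organizational condition -> human error.
p_org_weak = 0.2                         # prior P(organization deficient)
p_err = {True: 0.05, False: 0.005}       # P(human error | organization deficient?)

def marginal_error():
    """Causal (predictive) inference: overall P(human error)."""
    return p_org_weak * p_err[True] + (1.0 - p_org_weak) * p_err[False]

def posterior_org_given_error():
    """Diagnostic inference by Bayes' rule: P(deficient | error observed)."""
    return p_org_weak * p_err[True] / marginal_error()
```

With these numbers, observing an error raises the probability of an organizational deficiency from the prior 0.2 to about 0.71, which is exactly the kind of root-cause ranking the abstract describes, just at toy scale.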

  12. Methodology for reliability, economic and environmental assessment of wave energy

    International Nuclear Information System (INIS)

    Thorpe, T.W.; Muirhead, S.

    1994-01-01

    As part of the Preliminary Actions in Wave Energy R and D for DG XII's Joule programme, methodologies were developed to facilitate assessment of the reliability, economics and environmental impact of wave energy. This paper outlines these methodologies, their limitations and areas requiring further R and D. (author)

  13. Flash memories economic principles of performance, cost and reliability optimization

    CERN Document Server

    Richter, Detlev

    2014-01-01

    The subject of this book is to introduce a model-based quantitative performance indicator methodology applicable for performance, cost and reliability optimization of non-volatile memories. The complex example of flash memories is used to introduce and apply the methodology. It has been developed by the author based on an industrial 2-bit to 4-bit per cell flash development project. For the first time, design and cost aspects of 3D integration of flash memory are treated in this book. Cell, array, performance and reliability effects of flash memories are introduced and analyzed. Key performance parameters are derived to handle the flash complexity. A performance and array memory model is developed and a set of performance indicators characterizing architecture, cost and durability is defined.   Flash memories are selected to apply the Performance Indicator Methodology to quantify design and technology innovation. A graphical representation based on trend lines is introduced to support a requirement based pr...

  14. Reliability evaluation of thermophysical properties from first-principles calculations.

    Science.gov (United States)

Palumbo, Mauro; Fries, Suzana G; Dal Corso, Andrea; Körmann, Fritz; Hickel, Tilmann; Neugebauer, Jörg

    2014-08-20

    Thermophysical properties, such as heat capacity, bulk modulus and thermal expansion, are of great importance for many technological applications and are traditionally determined experimentally. With the rapid development of computational methods, however, first-principles computed temperature-dependent data are nowadays accessible. We evaluate various computational realizations of such data in comparison to the experimental scatter. The work is focussed on the impact of different first-principles codes (QUANTUM ESPRESSO and VASP), pseudopotentials (ultrasoft and projector augmented wave) as well as phonon determination methods (linear response and direct force constant method) on these properties. Based on the analysis of data for two pure elements, Cr and Ni, consequences for the reliability of temperature-dependent first-principles data in computational thermodynamics are discussed.

  15. Principle of maximum entropy for reliability analysis in the design of machine components

    Science.gov (United States)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
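A concrete instance of the PME the paper relies on: when only the mean and variance of a parameter are known, the maximum-entropy PDF is the normal distribution. The check below compares differential entropies against a uniform PDF with the same variance; these are standard textbook expressions, not the paper's code:

```python
import math

def entropy_normal(sigma):
    """Differential entropy of N(mu, sigma^2): the PME solution when only
    the mean and variance are constrained."""
    return 0.5 * math.log(2.0 * math.pi * math.e * sigma**2)

def entropy_uniform(sigma):
    """Differential entropy of the uniform PDF with the same variance
    (its width is sigma * sqrt(12))."""
    return math.log(sigma * math.sqrt(12.0))

h_pme = entropy_normal(1.0)
h_uni = entropy_uniform(1.0)
```

Any other distribution satisfying the same moment constraints has lower entropy, which is why the PME estimate is said to carry the fewest human bias factors.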

  16. Reliability assessment of passive containment isolation system using APSRA methodology

    International Nuclear Information System (INIS)

    Nayak, A.K.; Jain, Vikas; Gartia, M.R.; Srivastava, A.; Prasad, Hari; Anthony, A.; Gaikwad, A.J.; Bhatia, S.; Sinha, R.K.

    2008-01-01

In this paper, a methodology known as APSRA (Assessment of Passive System ReliAbility) has been employed for evaluation of the reliability of passive systems. The methodology has been applied to the passive containment isolation system (PCIS) of the Indian advanced heavy water reactor (AHWR). In the APSRA methodology, the passive system reliability evaluation is based on the failure probability of the system to carry out the desired function. The methodology first determines the operational characteristics of the system and the failure conditions by assigning a predetermined failure criterion. The failure surface is predicted using a best estimate code considering deviations of the operating parameters from their nominal states, which affect the PCIS performance. APSRA proposes to compare the code predictions with test data to generate the uncertainties in the failure parameter prediction, which are later considered in the code for accurate prediction of the failure surface of the system. Once the failure surface of the system is predicted, the cause of failure is examined through root diagnosis; failure occurs mainly due to failure of mechanical components. The failure probability of these components is evaluated through a classical PSA treatment using generic data. The reliability of the PCIS is evaluated from the probability of availability of the components for the success of the passive containment isolation system

  17. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

The paper studies the methodological principles and guidelines for the statistical evaluation of terms of trade under the United Nations classification model, the Harmonized Commodity Description and Coding System (HS). The proposed three-stage model of index analysis and estimation of terms of trade is applied to Ukraine's commodity trade for the period 2011-2012.

  18. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.; El-Bassioni, A.

    1985-01-01

    This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided and several outstanding issues such as generic allocation and preference assessment are discussed

  19. Transmission embedded cost allocation methodology with consideration of system reliability

    International Nuclear Information System (INIS)

    Hur, D.; Park, J.-K.; Yoo, C.-I.; Kim, B.H.

    2004-01-01

    In a vertically integrated utility industry, the cost of reliability, as a separate service, has not received much rigorous analysis. However, as a cornerstone of restructuring the industry, the transmission service pricing must change to be consistent with, and supportive of, competitive wholesale electricity markets. This paper focuses on the equitable allocation of transmission network embedded costs including the transmission reliability cost based on the contributions of each generator to branch flows under normal conditions as well as the line outage impact factor under a variety of load levels. A numerical example on a six-bus system is given to illustrate the applications of the proposed methodology. (author)

  20. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Science.gov (United States)

    2010-10-01

Appendix E to Part 238—General Principles of Reliability-Based Maintenance Programs (49 CFR Part 238, Transportation, 2010-10-01). The excerpted program objectives include: ... the design level of safety and reliability of the equipment; (2) To restore safety and reliability to...

  1. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

A significant scatter in the cyclic stress-strain (CSS) responses is observed for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. This scatter implies that a random applied cyclic strain history will arise under any loading mode, even a deterministic loading history. An evaluation that does not consider the scatter might therefore be non-conservative. A methodology for strain-based fatigue reliability analysis that takes the scatter into account is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently reduced to the case of a deterministic CSS relation, as in existing methods. The non-conservatism of the deterministic CSS relation and the applicability of the present methodology are demonstrated by an analysis of the material test results
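For the deterministic (mean-curve) special case, the Coffin-Manson/Basquin strain-life relation mentioned above can be inverted numerically for life; the material constants below are generic illustrative values, not those of 1Cr18Ni9Ti:

```python
def strain_amplitude(n_rev, E=200e3, sf=900.0, b=-0.09, ef=0.26, c=-0.55):
    """Total strain amplitude vs. reversals 2N: Basquin (elastic) term plus
    Coffin-Manson (plastic) term.  Constants are generic, stresses in MPa."""
    return (sf / E) * n_rev**b + ef * n_rev**c

def life_for_strain(eps, lo=1.0, hi=1e8):
    """Invert the strain-life curve for reversals by log-space bisection
    (the curve decreases monotonically with life)."""
    for _ in range(200):
        mid = (lo * hi) ** 0.5
        if strain_amplitude(mid) > eps:
            lo = mid
        else:
            hi = mid
    return lo

n_rev = life_for_strain(0.01)
```

In the probabilistic version described in the abstract, the curve coefficients themselves become random variables, so the computed life carries a distribution rather than a single value.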

  2. A reliability assessment methodology for the VHTR passive safety system

    International Nuclear Information System (INIS)

    Lee, Hyungsuk; Jae, Moosung

    2014-01-01

The passive safety system of a VHTR (Very High Temperature Reactor), which has recently attracted worldwide attention, is currently being considered in the design of safety improvements for the next generation of nuclear power plants in Korea. The functionality of the passive system relies not on an external electrical support system but on the intelligent use of natural phenomena. Its function involves an ultimate heat sink for a passive secondary auxiliary cooling system, especially during a station blackout such as the one in the Fukushima Daiichi reactor accidents. However, it is not easy to quantitatively evaluate the reliability of a passive safety system for risk analysis alongside active system failures, since classical reliability assessment methods cannot be applied. Therefore, we present a new methodology to quantify the reliability based on reliability physics models. This evaluation framework is then applied to the conceptually designed VHTR in Korea. The Response Surface Method (RSM) is also utilized for evaluating the uncertainty of the maximum temperature of the nuclear fuel. The proposed method could contribute to evaluating accident sequence frequencies and to designing new innovative nuclear systems, such as the reactor cavity cooling system (RCCS) in the VHTR to be designed and constructed in Korea.

  3. Improved FTA methodology and application to subsea pipeline reliability design.

    Science.gov (United States)

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
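Whichever tree is used to identify them (FET or traditional FTA), minimal cut sets are quantified the same way. A small inclusion-exclusion sketch with invented pipeline basic events:

```python
from itertools import combinations

def top_event_prob(cut_sets, p):
    """Exact top-event probability from minimal cut sets via
    inclusion-exclusion, assuming independent basic events."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            events = set().union(*combo)   # union of the k cut sets
            term = 1.0
            for e in events:
                term *= p[e]
            total += (-1.0) ** (k + 1) * term
    return total

# invented basic events for a pipeline top event "loss of containment"
p = {"corrosion": 0.01, "impact": 0.005, "weld_defect": 0.02}
cuts = [{"corrosion", "weld_defect"}, {"impact"}]
q_top = top_event_prob(cuts, p)
```

For realistic trees with many cut sets, the rare-event approximation (the k = 1 sum alone) is usually used instead, since the inclusion-exclusion expansion grows exponentially.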

  4. Seismic reliability assessment methodology for CANDU concrete containment structures

    International Nuclear Information System (INIS)

    Stephens, M.J.; Nessim, M.A.; Hong, H.P.

    1995-05-01

    A study was undertaken to develop a reliability-based methodology for the assessment of existing CANDU concrete containment structures with respect to seismic loading. The focus of the study was on defining appropriate specified values and partial safety factors for earthquake loading and resistance parameters. Key issues addressed in the work were the identification of an approach to select design earthquake spectra that satisfy consistent safety levels, and the use of structure-specific data in the evaluation of structural resistance. (author). 23 refs., 9 tabs., 15 figs

  5. METHODOLOGICAL PRINCIPLES OF FORMING REPERTOIRE OF STUDENTS’ FOLK INSTRUMENTAL ORCHESTRA

    Directory of Open Access Journals (Sweden)

    Mykola Pshenychnykh

    2016-11-01

One of the main aspects of forming future music teachers’ professional competence, connected with mastering professional musical and performing skills in the course “Orchestra Class” and realized in the activity of students’ performing groups, is revealed. The problem of creative personality development is relevant today, as creative future music art teachers orient themselves freely, and guide their pupils, in the contemporary cultural environment, music and media space, and have strong musical taste and aesthetic guidelines. The music genre groups that traditionally make up the repertoire of student folk orchestras are characterized in the article: arrangements of folk tunes; works of Ukrainian and world classics, orchestrated for folk groups with each orchestra’s performing possibilities taken into account; and works by contemporary authors written specifically for the orchestra of folk instruments. The main methodological principles of selecting the repertoire for the student orchestra of folk instruments are disclosed, including: the technical, artistic and performing capabilities of the student group; involvement of works of different genres in the repertoire; correspondence of orchestra scores to the instrumental composition of the student orchestra, and their correction if necessary; selection of works whose performance arouses the interest of the student audience; use of the experience of leading professional ensembles of folk instruments; and constant updating of the orchestra’s repertoire. In the conclusion the author emphasizes that taking these methodological tips into account helps solve the main tasks of the course “Orchestra Class”: students’ acquaintance with the history of foundation, composition, ways of musicianship and technique of playing the instruments of the folk instrument orchestra, acquaintance with specific orchestral music; development of all

  6. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in human tasks, both under normal operating conditions and after an abnormal event. Additionally, analysis of various accidents in history has found the human component to be a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions between them, in the models, so that by the mid-1990s what are considered second-generation methodologies had emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, quantifies additional deviations from the nominal scenario considered in the accident sequences of the probabilistic safety analysis, and evaluates dependencies between actions for this event. That is, the generic human failure event first required independent evaluation of the two related human failure events. Gathering the new human error probabilities thus involves quantifying the nominal scenario and the significant deviations considered for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the more specific factors with the highest contribution to the human error probabilities. (Author)

  7. IEEE guide for general principles of reliability analysis of nuclear power generating station protection systems

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

Presented is the Institute of Electrical and Electronics Engineers, Inc. (IEEE) guide for general principles of reliability analysis of nuclear power generating station protection systems. The document has been prepared to provide the basic principles needed to conduct a reliability analysis of protection systems. Included are information on qualitative and quantitative analysis, guides for failure data acquisition and use, and a guide for the establishment of intervals

  8. Future of structural reliability methodology in nuclear power plant technology

    Energy Technology Data Exchange (ETDEWEB)

    Schueeller, G I [Technische Univ. Muenchen (Germany, F.R.); Kafka, P [Gesellschaft fuer Reaktorsicherheit m.b.H. (GRS), Garching (Germany, F.R.)

    1978-10-01

    This paper presents the authors' personal view as to which areas of structural reliability in nuclear power plant design most urgently need to be advanced. Aspects of simulation modeling, design rules, codification and specification of reliability, system analysis, probabilistic structural dynamics, rare events and particularly the interaction of systems and structural reliability are discussed. As an example, some considerations of the interaction effects between the protective systems and the pressure vessel are stated. The paper concludes with recommendations for further research.

  9. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    The GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Based on the fact that tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis between the GO methodology and the fault tree methodology was also performed. The results showed that the setup of tie breakers was rational and necessary, and that, compared with the fault tree methodology, the GO modeling was much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)
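Once the minimal cut sets of a power supply system are known (whether obtained from a GO chart or a fault tree), the average unavailability follows from the rare-event approximation. The sketch below is a generic illustration, not the GO model of the facility; the component names and unavailabilities are hypothetical.

```python
# Rare-event approximation: system unavailability is the sum, over each
# minimal cut set, of the product of its component unavailabilities.

def cutset_unavailability(cutsets, q):
    """q maps component name -> average unavailability."""
    total = 0.0
    for cs in cutsets:
        prod = 1.0
        for comp in cs:
            prod *= q[comp]
        total += prod
    return total  # valid when each cut-set term is << 1

# Hypothetical example: two incoming lines with a tie breaker.
q = {"line_A": 1e-3, "line_B": 1e-3, "tie_breaker": 5e-4}
cutsets = [("line_A", "line_B"),          # both lines lost
           ("line_A", "tie_breaker")]     # line A lost and tie fails to close
print(cutset_unavailability(cutsets, q))  # ~1.5e-06
```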

  10. Applying reliability centered maintenance analysis principles to inservice testing

    International Nuclear Information System (INIS)

    Flude, J.W.

    1994-01-01

    Federal regulations require nuclear power plants to use inservice test (IST) programs to ensure the operability of safety-related equipment. IST programs are based on American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code requirements. Many of these plants also use Reliability Centered Maintenance (RCM) to optimize system maintenance. ASME Code requirements are hard to change; the process for requesting authority to use an alternate strategy is long and expensive. The difficulty of obtaining this authority makes the use of RCM methods on safety-related systems not cost effective. An ASME research task force on Risk-Based Inservice Testing is investigating changing the Code. The change would allow plants to apply RCM methods to the problem of maintenance strategy selection for safety-related systems. The research task force is working closely with the Codes and Standards sections to develop a process related to the RCM process. One day, plants will be able to use this process to develop more efficient and safer maintenance strategies

  11. FACING ISO 9001 . PRINCIPLE OF PDCA, METHODOLOGY 8D

    Directory of Open Access Journals (Sweden)

    S. V. Yurchenko

    2014-01-01

    Full Text Available Management system tools are presented by means of which any information flows can be managed, both within the organization and in the consumer-supplier chain. Models of control charts help to build and demonstrate a productive management system, applying at the same time the 8D methodology.

  12. Evaluation of methodologies for remunerating wind power's reliability in Colombia

    International Nuclear Information System (INIS)

    Botero B, Sergio; Isaza C, Felipe; Valencia, Adriana

    2010-01-01

    Colombia strives to have enough firm capacity available to meet unexpected power shortages and peak demand; this is clear from mechanisms currently in place that provide monetary incentives (on the order of US$ 14/MW h) to power producers that can guarantee electricity provision during scarcity periods. Yet wind power in Colombia cannot currently guarantee firm power, because an accepted methodology to calculate its potential firm capacity does not exist. In this paper we argue that developing such a methodology would provide an incentive for potential investors to enter into this low-carbon technology. This paper analyzes three methodologies currently used in energy markets around the world to calculate firm wind energy capacity: PJM, NYISO, and Spain. These methodologies were initially selected for their ability to accommodate Colombian energy regulations. The objective of this work is to determine which of these methodologies makes the most sense from an investor's perspective, to ultimately shed light on developing a methodology to be used in Colombia. To this end, the authors developed a methodology consisting of the elaboration of a wind model using Monte-Carlo simulation, based on known wind behaviour statistics of a region with adequate wind potential in Colombia. The simulation produces random generation data, representing the resource's inherent variability and simulating the historical data required to evaluate the mentioned methodologies, thus yielding the technology's theoretical generation data. The document concludes that the evaluated methodologies are easy to implement and that they do not require historical data (important for Colombia, where there is almost no historical wind power data). It is also found that the Spanish methodology provides a higher Capacity Value (and therefore a higher return to investors).
The financial assessment results show that it is crucial that these types of incentives exist to make viable
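The Monte-Carlo wind model described above can be sketched as follows, under assumed parameters (not the paper's calibrated values for the Colombian site): hourly wind speeds drawn from a Weibull distribution are mapped through a generic turbine power curve, and a capacity factor is estimated from the simulated output.

```python
# Monte-Carlo sketch of simulated wind generation data. The Weibull
# parameters, power curve, and rated power are all illustrative.
import math
import random

random.seed(42)

RATED_KW = 2000.0

def power_curve(v, v_in=3.5, v_rated=12.0, v_out=25.0):
    """Generic turbine power curve with a linear ramp region (illustrative)."""
    if v < v_in or v >= v_out:
        return 0.0
    if v >= v_rated:
        return RATED_KW
    return RATED_KW * (v - v_in) / (v_rated - v_in)

def sample_speed(k=2.0, c=8.0):
    """Weibull wind speed via inverse transform: v = c * (-ln U)^(1/k)."""
    u = 1.0 - random.random()            # u in (0, 1]
    return c * (-math.log(u)) ** (1.0 / k)

hours = 10_000                           # simulated hourly observations
output = [power_curve(sample_speed()) for _ in range(hours)]
capacity_factor = sum(output) / (hours * RATED_KW)
print(f"simulated capacity factor: {capacity_factor:.2f}")
```

A capacity-value methodology such as PJM's or Spain's would then be evaluated against this synthetic series in place of the missing historical record.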

  13. Design methodologies for reliability of SSL LED boards

    NARCIS (Netherlands)

    Jakovenko, J.; Formánek, J.; Perpiñà, X.; Jorda, X.; Vellvehi, M.; Werkhoven, R.J.; Husák, M.; Kunen, J.M.G.; Bancken, P.; Bolt, P.J.; Gasse, A.

    2013-01-01

    This work presents a comparison of various LED board technologies from the thermal, mechanical and reliability points of view, provided by accurate 3-D modelling. LED boards are proposed as a possible technology replacement for the FR4 LED boards used in 400-lumen retrofit SSL lamps. Presented design

  14. Use of PRA methodology for enhancing operational safety and reliability

    International Nuclear Information System (INIS)

    Chu, B.; Rumble, E.; Najafi, B.; Putney, B.; Young, J.

    1985-01-01

    This paper describes a broad-scope, ongoing R and D study, sponsored by the Electric Power Research Institute (EPRI), to utilize key features of state-of-the-art plant information management and system analysis techniques to develop and demonstrate a practical engineering tool for helping plant engineering and operational staff perform their activities more effectively. The study is foreseen to consist of two major activities: developing a user-friendly, integrated software system, and demonstrating the applications of this software on-site. This integrated software, the Reliability Analysis Program with In-Plant Data (RAPID), will consist of three types of interrelated elements: an Executive Controller, which will provide engineering and operations staff users with an interface to and control of the other two software elements; a Data Base Manager, which can acquire, store, select, and transfer data; and Applications Modules, which will perform the specific reliability-oriented functions. A broad range of these functions has been envisaged. The immediate emphasis will be focused on four application modules: a Plant Status Module, a Technical Specification Optimization Module, a Reliability Assessment Module, and a Utility Module for acquiring plant data

  15. System principles, mathematical models and methods to ensure high reliability of safety systems

    Science.gov (United States)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of components designed for the detection, localization, tracking, collection, and processing of information from systems of monitoring, telemetry, control, etc. They are required to be highly reliable in order to correctly perform data aggregation, processing and analysis for subsequent decision-making support. In the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Different component types perform identical functions; however, they are implemented using diverse principles and approaches, and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of performing the tasks and eliminates common-cause failure. We consider the type-variety principle as an engineering principle of system analysis, together with mathematical models based on this principle and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used for solving problems of optimal redundancy on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
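A toy instance of the discrete optimization described above might look like the following: choose how many units of each *diverse* component type to install so that reliability is maximized under a cost budget. All failure probabilities and costs are invented for illustration, and brute force suffices at this size; the paper's models address much larger dimensions.

```python
# Type-variety redundancy allocation, brute-force sketch: mixing diverse
# component types in parallel both raises reliability and guards against
# a common-cause failure of any single design.
from itertools import product

types = {            # type -> (failure probability per unit, unit cost)
    "sensor_A": (0.05, 3),
    "sensor_B": (0.08, 2),
    "sensor_C": (0.02, 5),
}
budget = 12

best = (0.0, None)
for counts in product(range(4), repeat=len(types)):
    cost = sum(n * c for n, (_, c) in zip(counts, types.values()))
    if cost > budget or sum(counts) == 0:
        continue
    # The detection function fails only if every installed unit fails.
    p_fail = 1.0
    for n, (q, _) in zip(counts, types.values()):
        p_fail *= q ** n
    reliability = 1.0 - p_fail
    if reliability > best[0]:
        best = (reliability, dict(zip(types, counts)))

print(best)
```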

  16. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy...... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints....

  17. Reliability evaluation methodologies for ensuring container integrity of stored transuranic (TRU) waste

    International Nuclear Information System (INIS)

    Smith, K.L.

    1995-06-01

    This report presents methodologies for producing defensible estimates of expected transuranic waste storage container lifetimes at the Radioactive Waste Management Complex. These methodologies can be used to estimate transuranic waste container reliability (for integrity and degradation) and as an analytical tool to optimize waste container integrity. Container packaging and storage configurations, which directly affect waste container integrity, are also addressed. The methodologies presented provide a means for demonstrating Resource Conservation and Recovery Act waste storage requirements

  18. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    International Nuclear Information System (INIS)

    Knezevic, J.; Odoom, E.R.

    2001-01-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising the fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of classical (crisp) set theory, because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow efficient simultaneous generation of minimal cut and path sets
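As a minimal illustration of the fuzzy side of the Lambda-Tau method, failure rates can be represented as triangular fuzzy numbers so that expert imprecision propagates through the gate formulas. Only the OR-gate failure-rate rule (the sum of the input rates, which is exact component-wise for triangular numbers) is shown; the component values are hypothetical.

```python
# Failure rates as triangular fuzzy numbers (lower, modal, upper) instead
# of crisp values, so imprecise expert estimates carry through the model.

def tfn_add(x, y):
    """Add two triangular fuzzy numbers component-wise (exact for addition)."""
    return tuple(xi + yi for xi, yi in zip(x, y))

# Two repairable components feeding an OR gate (failures per hour),
# each with a +/- 20% spread around the modal estimate:
lam_pump  = (0.8e-4, 1.0e-4, 1.2e-4)
lam_valve = (1.6e-4, 2.0e-4, 2.4e-4)

lam_or = tfn_add(lam_pump, lam_valve)
print(lam_or)   # approximately (2.4e-4, 3.0e-4, 3.6e-4)
```

The AND-gate rules of Lambda-Tau involve fuzzy multiplication and division, which are usually handled approximately or via alpha-cut interval arithmetic.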

  19. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, a method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply the Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration, minimizing the asymptotic variance of the decision variable under the constraints of sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs for more accurate reliability estimation in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
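The Gamma-process degradation model at the core of the method can be sketched as follows: increments over disjoint intervals are independent Gamma variates, so the sample path is strictly monotonic like wear or crack growth, and reliability at time t is P(X(t) < failure threshold). The shape rate, scale, and threshold below are illustrative, not the paper's fitted parameters.

```python
# Gamma-process degradation: X(t) ~ Gamma(shape_rate * t, scale), so the
# accumulated degradation can only grow. Reliability is estimated here by
# Monte Carlo as the fraction of sampled paths still below the threshold.
import random

random.seed(1)

def sample_path_end(t, shape_rate=0.5, scale=0.2):
    """Total degradation at time t (mean = shape_rate * scale * t)."""
    return random.gammavariate(shape_rate * t, scale)

def reliability(t, threshold=2.0, n=20_000):
    ok = sum(1 for _ in range(n) if sample_path_end(t) < threshold)
    return ok / n

for t in (5, 10, 20, 40):
    print(f"t = {t:>2}: R(t) ~ {reliability(t):.3f}")
```

The demonstration step then amounts to checking whether the degradation actually accumulated during the (accelerated) test stays below the allowable level implied by the required reliability.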

  20. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas for the stable reliability target of a system with shared signals, and dynamic calculation formulas for the state probability of a unit with two states, are derived. A method to solve the fault event sets of the system is also presented, and all the fault event sets of the outer power supply system and their failure probabilities are obtained. The restoration reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The result shows that the GO methodology is very simple and useful in the stable and dynamic reliability analysis of repairable systems

  1. Integrating rock mechanics issues with repository design through design process principles and methodology

    International Nuclear Information System (INIS)

    Bieniawski, Z.T.

    1996-01-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but must also have knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but are still lacking the engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance

  2. An overall methodology for reliability prediction of mechatronic systems design with industrial application

    International Nuclear Information System (INIS)

    Habchi, Georges; Barthod, Christine

    2016-01-01

    We propose in this paper an overall ten-step methodology dedicated to the analysis and quantification of reliability during the design phase of a mechatronic system, considered as a complex system. The ten steps of the methodology are detailed according to the downward side of the V-development cycle usually used for the design of complex systems. Two complementary phases of analysis cover the ten steps: qualitative analysis and quantitative analysis. The qualitative phase analyzes the functional and dysfunctional behavior of the system and then determines its different failure modes and degradation states, based on external and internal functional analysis, organic and physical implementation, and dependencies between components, with consideration of customer specifications and the mission profile. The quantitative phase is used to calculate the reliability of the system and its components, based on the qualitative behavior patterns, and considering data gathering and processing and reliability targets. A systemic approach is used to calculate the reliability of the system, taking into account the different technologies of a mechatronic system (mechanics, electronics, electrical, etc.), dependencies and interactions between components, and external influencing factors. To validate the methodology, the ten steps are applied to an industrial system, the smart actuator of Pack'Aero Company. - Highlights: • A ten-step methodology for reliability prediction of mechatronic systems design. • Qualitative and quantitative analysis for reliability evaluation using PN and RBD. • A dependency matrix proposal, based on the collateral and functional interactions. • Models consider mission profile, deterioration, interactions and influencing factors. • Application and validation of the methodology on the “Smart Actuator” of PACK’AERO.

  3. A reach of the principle of entry and the principle of reliability in the real estate cadastre in our court practice

    OpenAIRE

    Cvetić Radenka M.

    2015-01-01

    Through a review of the principle of entry and the principle of reliability in the Real Estate Cadastre and their reach in our court practice, this article points out that compliance with these principles is indispensable for the sake of legal certainty. The formidable and complex role of the court when applying the law in order to rightfully resolve an individual case is underlined. Having regard to the accountability of the courts for the efficacy of the legal system, without any inten...

  4. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of pilot software incorporating typical features of critical software for nuclear power plant safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  5. Methodological principles outline discipline "Organization studies-tourism activity" using information technologies.

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-08-01

    Full Text Available The basic methodological principles of teaching the disciplines of tourism and local history using information technologies are outlined. The literature (15 sources) and the experience of leading experts in the field of sports and health tourism and orienteering were analyzed. The principles identified for the academic disciplines of tourism and local history are: a shift in emphasis from sports tourism to cognitive and health tourism; the development of spiritual qualities; the acquisition of life skills in nature; the discovery and development of pedagogical and psychological abilities and character traits through the study of the native land; and the development of cognitive-research abilities, physical abilities, and motor skills, with the application of modern information technology.

  6. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants; the GO methodology is one of them. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In the GO methodology, a signal is on-to-off or off-to-on; therefore GO finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. To overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given

  7. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. The fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
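For contrast with the fault-tree-plus-dualistic-contrast model, a much simpler and widely used allocation step is the AGREE-style proportional weighting sketched below; the subsystem names and weights for the emergency diesel generator example are hypothetical.

```python
# Proportional failure-rate allocation: the system target is split among
# subsystems in proportion to weights reflecting complexity or importance.

def allocate(target_rate, weights):
    total = sum(weights.values())
    return {name: target_rate * w / total for name, w in weights.items()}

# Emergency diesel generator subsystems (weights are illustrative):
weights = {"fuel_supply": 3, "starting_air": 2, "cooling": 2, "control": 1}
alloc = allocate(1e-3, weights)   # target: 1e-3 failures/hour for the EDG
for name, lam in alloc.items():
    print(f"{name:13s} {lam:.2e}")
```

The allocated rates sum back to the system target by construction; the optimization methods in the paper refine this kind of split against multiple objectives.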

  8. Methodological and Methodical Principles of the Empirical Study of Spiritual Development of a Personality

    Directory of Open Access Journals (Sweden)

    Olga Klymyshyn

    2017-06-01

    Full Text Available The article reveals the essence of the methodological principles of the spiritual development of a personality. The results of a theoretical analysis of the psychological content of spirituality are taken into consideration, from the positions of a system-structural approach to the study of personality, age patterns of mental development, the sacramental nature of the human person, and the mechanisms of human spiritual development. An interpretation of spirituality and the spiritual development of a personality is given. The initial principles of the organization of the empirical research of the spiritual development of a personality (ontogenetic, sociocultural, self-determination, system) are presented. Parameters for the estimation of a personality's spiritual development are described: a general index of the development of spiritual potential; indexes of the development of the ethical, aesthetical, cognitive, and existential components of spirituality; and an index of the religiousness of a personality. The methodological support of the psychological diagnostic research is defined.

  9. Inter comparison of REPAS and APSRA methodologies for passive system reliability analysis

    International Nuclear Information System (INIS)

    Solanki, R.B.; Krishnamurthy, P.R.; Singh, Suneet; Varde, P.V.; Verma, A.K.

    2014-01-01

    The increasing use of passive systems in innovative nuclear reactors creates a demand for reliability assessment of these systems. Passive systems operate on driving forces such as natural circulation, gravity, and internal stored energy, which are moderately weaker than those of active components. Hence, phenomenological failures (virtual components) are as important as equipment failures (real components) in the evaluation of passive system reliability. The contribution of the mechanical components to passive system reliability can be evaluated in a classical way, using the available component reliability databases and well-known methods. On the other hand, different methods are required to evaluate the reliability of processes like thermal hydraulics, due to the lack of adequate failure data. Research on the reliability assessment of passive systems and their integration into PSA is ongoing worldwide; however, consensus has not been reached. Two of the most widely used methods are Reliability Evaluation of Passive Systems (REPAS) and Assessment of Passive System Reliability (APSRA). Both methods characterize the uncertainties involved in the design and process parameters governing the function of the passive system; however, they differ in the quantification of passive system reliability. Intercomparison among the different available methods provides useful insights into their strengths and weaknesses. This paper highlights the results of the thermal hydraulic analysis of a typical passive isolation condenser system carried out using the RELAP mod 3.2 computer code, applying the REPAS and APSRA methodologies. The failure surface is established for the passive system under consideration, and the system reliability has also been evaluated using these methods.
Challenges involved in passive system reliabilities are identified, which require further attention in order to overcome the shortcomings of these
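Conceptually, what both REPAS and APSRA must quantify can be illustrated with a toy Monte-Carlo model: uncertain process and design parameters are sampled, and a functional failure is counted whenever the sampled heat load exceeds the sampled natural-circulation removal capacity. The distributions below are invented for illustration and bear no relation to the isolation condenser analysis itself.

```python
# Toy functional-failure model for a passive heat-removal system: sample
# the uncertain parameters, evaluate the performance criterion, and count
# the fraction of trials on the failure side of the failure surface.
import random

random.seed(7)

def one_trial():
    heat_load = random.gauss(100.0, 10.0)   # kW, process uncertainty
    capacity  = random.gauss(130.0, 15.0)   # kW, driving-force uncertainty
    return heat_load > capacity             # True = functional failure

n = 100_000
failures = sum(one_trial() for _ in range(n))
print(f"estimated passive-system failure probability: {failures / n:.4f}")
```

In the real methodologies the "capacity" side is not a closed-form distribution but the output of best-estimate thermal hydraulic code runs, which is precisely what makes the quantification expensive and contested.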

  10. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    Digital instrumentation and control (I and C) systems are increasingly being applied to current nuclear power plants (NPPs) due to their advantages: zero drift, advanced data calculation capacity, and design flexibility. Accordingly, safety issues concerning the software, which is the main part of a digital I and C system, have been raised. As with hardware components, a software failure in an NPP could lead to a large disaster; therefore, failure rate testing and reliability assessment of the software should be properly performed before it is adopted in NPPs. However, the reliability assessment of software is quite different from that of hardware, owing to the difference in nature between software and hardware. One of the most significant differences is that software failures arise from design faults, as 'error crystals', whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has focused on the optimal release time considering the economy. However, the safety goal and public acceptance of NPPs are so distinctive from other industries that the software in NPPs depends on a quantitative reliability value rather than on economy. The safety goal of NPPs compared to other industries is exceptionally high, so conventional software reliability assessment methodologies already used in other industries cannot be adjusted to the safety goal of NPPs. Thus, a new reliability assessment methodology for the software of digital I and C systems in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to identify their pros and cons, and then to assess the usefulness of each method for the software of NPPs.

  11. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea

    2014-01-01

    Digital instrumentation and control (I and C) systems are increasingly being applied to current nuclear power plants (NPPs) due to their advantages: zero drift, advanced data calculation capacity, and design flexibility. Accordingly, safety issues concerning the software, which is the main part of a digital I and C system, have been raised. As with hardware components, a software failure in an NPP could lead to a large disaster; therefore, failure rate testing and reliability assessment of the software should be properly performed before it is adopted in NPPs. However, the reliability assessment of software is quite different from that of hardware, owing to the difference in nature between software and hardware. One of the most significant differences is that software failures arise from design faults, as 'error crystals', whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has focused on the optimal release time considering the economy. However, the safety goal and public acceptance of NPPs are so distinctive from other industries that the software in NPPs depends on a quantitative reliability value rather than on economy. The safety goal of NPPs compared to other industries is exceptionally high, so conventional software reliability assessment methodologies already used in other industries cannot be adjusted to the safety goal of NPPs. Thus, a new reliability assessment methodology for the software of digital I and C systems in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to identify their pros and cons, and then to assess the usefulness of each method for the software of NPPs
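The "optimal release time" notion that the review contrasts with NPP safety goals can be illustrated with a Goel-Okumoto NHPP software reliability growth model (one common choice, not one endorsed by the paper); the parameter values are illustrative.

```python
# Goel-Okumoto model: expected cumulative faults found by time t is
# m(t) = a * (1 - exp(-b t)), so the failure intensity is a*b*exp(-b t).
# Commercial practice stops testing once the intensity drops below a
# target; NPP software instead needs a demonstrated reliability value.
import math

a, b = 120.0, 0.05          # total faults, per-day detection rate (assumed)

def intensity(t):
    return a * b * math.exp(-b * t)

def release_time(target_intensity):
    """Solve a*b*exp(-b t) = target for t."""
    return math.log(a * b / target_intensity) / b

t_rel = release_time(0.1)   # stop when < 0.1 expected failures per day
print(f"release after {t_rel:.1f} days of testing")
```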

  12. Basic Principles of Electrical Network Reliability Optimization in Liberalised Electricity Market

    Science.gov (United States)

    Oleinikova, I.; Krishans, Z.; Mutule, A.

    2008-01-01

    The authors propose to select long-term solutions to the reliability problems of electrical networks at the stage of development planning. The guidelines, or basic principles, of such optimization are: 1) its dynamic nature; 2) development sustainability; 3) integrated solution of the problems of network development and electricity supply reliability; 4) consideration of information uncertainty; 5) concurrent consideration of network and generation development problems; 6) application of specialized information technologies; 7) definition of requirements for independent electricity producers. In the article, the major aspects of the liberalised electricity market, its functions and tasks are reviewed, with emphasis placed on the optimization of electrical network development as a significant component of sustainable management of power systems.

  13. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    International Nuclear Information System (INIS)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J.

    2000-09-01

    Digital I and C systems are widely used in the non-safety systems of NPPs and are expanding into safety-critical applications. The regulatory body is shifting its policy toward risk-informed regulation and may require probabilistic safety assessment of digital I and C systems. However, there is not yet an established reliability prediction methodology for digital I and C systems that covers both software and hardware. This survey report reviews numerous hardware reliability prediction methods for electronic systems; each method has its strengths and weaknesses. The report presents the state of the art in prediction methods and focuses in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to assist analysts. The report also surveys the state of the art in software tools that support reliability prediction.
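As a flavor of the parts-count style of hardware prediction surveyed above (of which MIL-HDBK-217F is the best-known example), the sketch below sums generic part failure rates scaled by quality factors. The bill of materials and every rate and factor are invented for illustration, not taken from the handbook.

```python
# Hypothetical parts-count reliability prediction in the spirit of
# MIL-HDBK-217F: the system failure rate is the sum of generic part
# failure rates scaled by quality factors (all numbers illustrative).

def parts_count_failure_rate(parts):
    """parts: list of (n, lambda_g, pi_q) tuples.
    n: quantity, lambda_g: generic failure rate (failures per 1e6 h),
    pi_q: quality factor. Returns the system failure rate per 1e6 h."""
    return sum(n * lam * pi_q for n, lam, pi_q in parts)

# Illustrative digital I&C board bill of materials
board = [
    (24, 0.0025, 1.0),   # ceramic capacitors
    (10, 0.0120, 2.0),   # microcircuits, commercial quality
    (2,  0.0500, 1.5),   # connectors
]
lam = parts_count_failure_rate(board)   # failures per 1e6 hours
mtbf_hours = 1e6 / lam                  # mean time between failures
print(round(lam, 4), round(mtbf_hours))
```

The part-stress variant of the same handbook refines each term with environmental and temperature factors; the additive structure stays the same.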

  14. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    Energy Technology Data Exchange (ETDEWEB)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J.

    2000-09-01

    Digital I and C systems are widely used in the non-safety systems of NPPs and are expanding into safety-critical applications. The regulatory body is shifting its policy toward risk-informed regulation and may require probabilistic safety assessment of digital I and C systems. However, there is not yet an established reliability prediction methodology for digital I and C systems that covers both software and hardware. This survey report reviews numerous hardware reliability prediction methods for electronic systems; each method has its strengths and weaknesses. The report presents the state of the art in prediction methods and focuses in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to assist analysts. The report also surveys the state of the art in software tools that support reliability prediction.

  15. Reliability assessment of passive isolation condenser system of AHWR using APSRA methodology

    International Nuclear Information System (INIS)

    Nayak, A.K.; Jain, Vikas; Gartia, M.R.; Prasad, Hari; Anthony, A.; Bhatia, S.K.; Sinha, R.K.

    2009-01-01

    In this paper, a methodology known as APSRA (Assessment of Passive System ReliAbility) is used to evaluate the reliability of the passive isolation condenser system of the Indian Advanced Heavy Water Reactor (AHWR). In the APSRA methodology, passive system reliability evaluation is based on the probability of the system failing to perform its design basis function. The methodology first determines the operational characteristics of the system and the failure conditions based on a predetermined failure criterion. The parameters that could degrade system performance are identified and considered in the analysis, along with the different failure modes and their causes. The failure surface is predicted using a best-estimate code, considering deviations of the operating parameters from their nominal states that affect isolation condenser performance. Once the failure surface of the system is predicted, the causes of failure are examined through root diagnosis; they arise mainly from failure of mechanical components. The reliability of the system is then evaluated through a classical PSA treatment based on component failure probabilities from generic data.
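The failure-surface idea in APSRA-style assessments can be sketched with a toy Monte Carlo estimate: sample deviations of the operating parameters and count how often a pre-defined failure criterion is violated. The two parameters, their distributions, and the failure criterion below are all invented stand-ins for the best-estimate code runs used in the actual methodology.

```python
# Illustrative Monte Carlo estimate of a passive system's failure
# probability: sample parameter deviations and test them against a
# toy "failure surface" (a stand-in for a best-estimate code).
import random

random.seed(1)

def fails(pool_temp_c, noncondensable_frac):
    # Toy criterion: condensation heat removal is assumed lost when
    # the pool is too hot AND too much non-condensable gas is present.
    return pool_temp_c > 95.0 and noncondensable_frac > 0.05

N = 100_000
failures = 0
for _ in range(N):
    t = random.gauss(80.0, 8.0)     # pool temperature, deg C
    x = random.uniform(0.0, 0.10)   # non-condensable gas fraction
    failures += fails(t, x)

p_fail = failures / N
print(0.0 < p_fail < 1.0)
```

A real application replaces `fails` with the failure surface traced out by best-estimate thermal-hydraulic calculations, and the sampled deviations with the root-diagnosed parameter ranges.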

  16. Reliability Modeling of Electromechanical System with Meta-Action Chain Methodology

    Directory of Open Access Journals (Sweden)

    Genbao Zhang

    2018-01-01

    Full Text Available To establish a more flexible and accurate reliability model, this study uses reliability modeling and a solving algorithm based on the meta-action chain concept. Instead of estimating the reliability of the whole system only in the standard operating mode, it adopts the structure chain and the operating action chain for system reliability modeling. The failure information and structure information for each component are integrated into the model to overcome the fixed assumptions of traditional modeling. In industrial applications, a multicomponent system may have several operating modes. The meta-action chain methodology can estimate system reliability under different operating modes by modeling components with a variety of failure sensitivities. The approach is verified on several electromechanical system cases; the results indicate that it improves system reliability estimation and is an effective tool for reliability estimation of systems under various operating modes.
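A minimal sketch of the mode-dependence idea, loosely inspired by the abstract above: each component's failure rate is scaled by a mode-specific sensitivity factor, and the chain fails if any link fails (series logic). The components, rates, and sensitivity factors are invented for illustration.

```python
# Mode-dependent series-system reliability sketch: exponential
# lifetimes, with each component's rate scaled per operating mode.
import math

base_rates = {"motor": 2e-5, "gearbox": 1e-5, "clutch": 3e-5}  # per hour
mode_sensitivity = {
    "standard":  {"motor": 1.0, "gearbox": 1.0, "clutch": 1.0},
    "high_load": {"motor": 2.5, "gearbox": 1.8, "clutch": 3.0},
}

def chain_reliability(mode, hours):
    """Series system: R(t) = exp(-sum_i(k_i * lambda_i) * t)."""
    total = sum(base_rates[c] * mode_sensitivity[mode][c]
                for c in base_rates)
    return math.exp(-total * hours)

r_std = chain_reliability("standard", 1000.0)
r_hl = chain_reliability("high_load", 1000.0)
print(r_hl < r_std)   # the heavier mode degrades reliability
```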

  17. An integrated methodology for the dynamic performance and reliability evaluation of fault-tolerant systems

    International Nuclear Information System (INIS)

    Dominguez-Garcia, Alejandro D.; Kassakian, John G.; Schindall, Joel E.; Zinchuk, Jeffrey J.

    2008-01-01

    We propose an integrated methodology for the reliability and dynamic performance analysis of fault-tolerant systems. This methodology uses a behavioral model of the system dynamics, similar to the ones used by control engineers to design the control system, but also incorporates artifacts to model the failure behavior of each component. These artifacts include component failure modes (and associated failure rates) and how those failure modes affect the dynamic behavior of the component. The methodology bases the system evaluation on the analysis of the dynamics of the different configurations the system can reach after component failures occur. For each of the possible system configurations, a performance evaluation of its dynamic behavior is carried out to check whether its properties, e.g., accuracy, overshoot, or settling time, which are called performance metrics, meet system requirements. Markov chains are used to model the stochastic process associated with the different configurations that a system can adopt when failures occur. This methodology not only enables an integrated framework for evaluating dynamic performance and reliability of fault-tolerant systems, but also enables a method for guiding the system design process, and further optimization. To illustrate the methodology, we present a case-study of a lateral-directional flight control system for a fighter aircraft
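The Markov-chain part of the methodology above can be illustrated with a toy three-state model: nominal, degraded (a redundant channel lost but performance requirements still met), and failed. The transition rates are invented; a real analysis would take them from component failure data and check each configuration's dynamic performance separately.

```python
# Toy CTMC over system configurations: 0 = nominal, 1 = degraded,
# 2 = failed, with constant transition rates (illustrative values).
import math

lam1 = 1e-4   # nominal -> degraded, per hour
lam2 = 5e-4   # degraded -> failed, per hour

def state_probs(t):
    """Closed-form solution of the simple two-transition chain."""
    p0 = math.exp(-lam1 * t)
    p1 = lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    p2 = 1.0 - p0 - p1
    return p0, p1, p2

p0, p1, p2 = state_probs(1000.0)
print(abs(p0 + p1 + p2 - 1.0) < 1e-12, p2 > 0.0)
```

In the integrated methodology, each state would additionally carry a verdict from the dynamic-performance check (overshoot, settling time, accuracy) on that configuration.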

  18. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

    Full Text Available This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI of Enterprise Resource Planning Systems (ERPS based on the principles of goal directed project management (GDPM. The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of the proper MAI methodology is deemed necessary because it will simplify the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance for all installation and analysis stages, the faster installation for the ERPS and the control and cost reduction for the installation, in terms of time, manpower, technological equipment and other resources.

  19. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors in accidents. There are two generations of Human Reliability Assessment (HRA) that have been developed. Those methodologies are classified by the differences of viewpoints of problem-solving, as the first generation and second generation. The accident analysis can be determined using three techniques of analysis; sequen...

  20. Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology

    Science.gov (United States)

    2015-11-19

    GAO-16-78R. 441 G St. N.W., Washington, DC 20548. November 19, 2015. The Honorable Ashton Carter, Secretary of Defense. Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology. Dear Secretary Carter: Each fiscal year, the Office of the Under Secretary of Defense (Comptroller), in coordination with the Defense Logistics Agency, sets a standard price per barrel

  1. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors in accidents. There are two generations of Human Reliability Assessment (HRA that have been developed. Those methodologies are classified by the differences of viewpoints of problem-solving, as the first generation and second generation. The accident analysis can be determined using three techniques of analysis; sequential techniques, epidemiological techniques and systemic techniques, where the marine accidents are included in the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART methodology and the 4M Overturned Pyramid (MOP model, which are applied to assess marine accidents. Furthermore, the MOP model can effectively describe the relationships of other factors which affect the accidents; whereas, the HEART methodology is only focused on human factors.
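The HEART side of the comparison above rests on a simple quantification: a generic-task nominal human error probability is scaled by each applicable error-producing condition (EPC), weighted by the assessor's proportion of affect (APOA). The sketch below uses the standard HEART scaling formula, but every number in the example is illustrative rather than taken from the paper.

```python
# Core HEART calculation:
#   HEP = nominal * product over EPCs of ((max_effect - 1) * APOA + 1)

def heart_hep(nominal_hep, epcs):
    """epcs: list of (max_effect, apoa) pairs; returns the assessed HEP."""
    hep = nominal_hep
    for max_effect, apoa in epcs:
        hep *= (max_effect - 1.0) * apoa + 1.0
    return min(hep, 1.0)   # a probability cannot exceed 1

# Example: routine task (nominal HEP 0.003) under time shortage
# (max effect x11, judged 40% applicable) and distraction (x5, 20%).
hep = heart_hep(0.003, [(11.0, 0.4), (5.0, 0.2)])
print(round(hep, 5))
```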

  2. Methodology for risk assessment and reliability applied for pipeline engineering design and industrial valves operation

    Energy Technology Data Exchange (ETDEWEB)

    Silveira, Dierci [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil). Escola de Engenharia Industrial e Metalurgia. Lab. de Sistemas de Producao e Petroleo e Gas], e-mail: dsilveira@metal.eeimvr.uff.br; Batista, Fabiano [CICERO, Rio das Ostras, RJ (Brazil)

    2009-07-01

    Two kinds of situations may be distinguished when estimating operating reliability for maneuvering industrial valves and the probability of undesired events in pipelines and industrial plants: situations in which the risk is identified in repetitive cycles of operations, and situations in which there is a permanent hazard due to project configurations introduced by decisions during the engineering design definition stage. Estimating reliability based on the influence of design options requires the choice of a numerical index, which may include a composite of human operating parameters based on biomechanics and ergonomics data. We first consider the design conditions under which plant or pipeline operator reliability concepts can be applied when operating industrial valves, and then describe in detail the ergonomics and biomechanics risks that lend themselves to engineering design database development and human reliability modeling and assessment. This database development and reliability modeling is based on a group of engineering design and biomechanics parameters likely to lead to over-exertion forces and awkward working postures, which are themselves associated with the functioning of a particular plant or pipeline. This ergonomics- and biomechanics-based approach to common industrial valve positioning in the plant layout is developed through a methodology to assess physical effort and operator reach, combining various elementary operating situations. These procedures can be combined with genetic algorithm modeling and the four elements of man-machine systems: the individual, the task, the machinery, and the environment. The proposed methodology should be viewed not as competing with traditional reliability and risk assessment but rather as complementary, since it provides parameters related to physical effort for valve operation and for workspace design and usability. (author)

  3. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of PSA, an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased-mission problem, functions for common cause failure analysis, for uncertainty analysis, and for common cause failure analysis with uncertainty, and the output of GO-FLOW analysis results in figure or table form. These functions are illustrated by analyzing sample systems such as a PWR AFWS and a BWR ECCS. The appendices describe the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in them. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis with a wide range of applications, and with the development of the total GO-FLOW system it has become a powerful tool for a living PSA. (author) 54 refs

  4. A study on a reliability assessment methodology for the VHTR safety systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok

    2012-02-01

    The passive safety system of a 300 MWt VHTR (Very High Temperature Reactor), which has attracted worldwide attention recently, is actively considered for improving the safety of next-generation nuclear power plant designs. Passive system functionality relies not on an external electrical support system but on the intelligent use of natural phenomena such as convection, conduction, radiation, and gravity. It is not easy to quantitatively evaluate the reliability of a passive safety system for risk analysis, since the classical reliability assessment methods developed for active system failures are not applicable. Therefore, a new reliability methodology needs to be developed and applied to evaluate the reliability of the conceptually designed VHTR in this study. A preliminary evaluation and conceptualization are performed using the load-capacity concept from the reliability physics model. The response surface method (RSM) is also utilized to evaluate the maximum temperature of the nuclear fuel. The significant variables and their correlations are treated using the GAMMA+ code. The proposed method could contribute to the design of new passive systems for the VHTR.
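The response-surface idea can be sketched in one dimension: fit a cheap surrogate for the peak fuel temperature from a few expensive code runs, then query the surrogate instead of the system code. The three "runs" below are fabricated (x, y) points, not GAMMA+ results; a real RSM study fits a multivariate polynomial over all significant variables.

```python
# One-variable response-surface surrogate: the exact quadratic through
# three sample points (Lagrange form), standing in for a fitted model.

def quad_through(p0, p1, p2):
    """Exact quadratic through three (x, y) points."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    def f(x):
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))
    return f

# Pretend code runs: (decay-heat fraction, peak fuel temperature in C)
surrogate = quad_through((0.5, 1150.0), (1.0, 1310.0), (1.5, 1540.0))
print(surrogate(1.2))   # cheap prediction between code runs
```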

  5. Reliability assessment of Passive Containment Cooling System of an Advanced Reactor using APSRA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Mukesh, E-mail: mukeshd@barc.gov.in [Reactor Engineering Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Chakravarty, Aranyak [School of Nuclear Studies and Application, Jadavpur University, Kolkata 700032 (India); Nayak, A.K. [Reactor Engineering Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Prasad, Hari; Gopika, V. [Reactor Safety Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)

    2014-10-15

    Highlights: • The paper deals with the reliability assessment of Passive Containment Cooling System of Advanced Heavy Water Reactor. • Assessment of Passive System ReliAbility (APSRA) methodology is used for reliability assessment. • Performance assessment of the PCCS is initially performed during a postulated design basis LOCA. • The parameters affecting the system performance are then identified and considered for further analysis. • The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. - Abstract: Passive Systems are increasingly playing a prominent role in the advanced nuclear reactor systems and are being utilised in normal operations as well as safety systems of the reactors following an accident. The Passive Containment Cooling System (PCCS) is one of the several passive safety features in an Advanced Reactor (AHWR). In this paper, the APSRA methodology has been employed for reliability evaluation of the PCCS of AHWR. Performance assessment of the PCCS is initially performed during a postulated design basis LOCA using the best-estimate code RELAP5/Mod 3.2. The parameters affecting the system performance are then identified and considered for further analysis. Based on some pre-determined failure criterion, the failure surface for the system is predicted using the best-estimate code taking into account the deviations of the identified parameters from their nominal states as well as the model uncertainties inherent to the best estimate code. Root diagnosis is then carried out to determine the various failure causes, which occurs mainly due to malfunctioning of mechanical components. The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. The reliability of the PCCS is then evaluated from the probability of availability of these components.

  6. Reliability assessment of Passive Containment Cooling System of an Advanced Reactor using APSRA methodology

    International Nuclear Information System (INIS)

    Kumar, Mukesh; Chakravarty, Aranyak; Nayak, A.K.; Prasad, Hari; Gopika, V.

    2014-01-01

    Highlights: • The paper deals with the reliability assessment of Passive Containment Cooling System of Advanced Heavy Water Reactor. • Assessment of Passive System ReliAbility (APSRA) methodology is used for reliability assessment. • Performance assessment of the PCCS is initially performed during a postulated design basis LOCA. • The parameters affecting the system performance are then identified and considered for further analysis. • The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. - Abstract: Passive Systems are increasingly playing a prominent role in the advanced nuclear reactor systems and are being utilised in normal operations as well as safety systems of the reactors following an accident. The Passive Containment Cooling System (PCCS) is one of the several passive safety features in an Advanced Reactor (AHWR). In this paper, the APSRA methodology has been employed for reliability evaluation of the PCCS of AHWR. Performance assessment of the PCCS is initially performed during a postulated design basis LOCA using the best-estimate code RELAP5/Mod 3.2. The parameters affecting the system performance are then identified and considered for further analysis. Based on some pre-determined failure criterion, the failure surface for the system is predicted using the best-estimate code taking into account the deviations of the identified parameters from their nominal states as well as the model uncertainties inherent to the best estimate code. Root diagnosis is then carried out to determine the various failure causes, which occurs mainly due to malfunctioning of mechanical components. The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. The reliability of the PCCS is then evaluated from the probability of availability of these components

  7. 4. Principles of Art from Antiquity to Contemporary Pedagogy in the Context of Methodology of Art Education

    Directory of Open Access Journals (Sweden)

    Olimpiada Arbuz-Spatari

    2016-03-01

    Full Text Available The methodology of art education is a system of educational documents - principles, rules, methods, procedures, and forms - that shapes determinative-reflective thinking in terms of teleology, content, and artistic/cultural/scientific communication, together with the reception of the communicated subject, and is oriented toward the learner / student-creator in accordance with the laws of education, communication, and artistic principles.

  8. A methodology and success/failure criteria for determining emergency diesel generator reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wyckoff, H. L. [Electric Power Research Institute, Palo Alto, California (United States)

    1986-02-15

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and on success/failure criteria specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984, and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)

  9. A methodology and success/failure criteria for determining emergency diesel generator reliability

    International Nuclear Information System (INIS)

    Wyckoff, H.L.

    1986-01-01

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and on success/failure criteria specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984, and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)
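The basic statistic behind a start/run success-failure survey of this kind can be sketched by treating each demanded start as a Bernoulli trial. The counts below are invented for illustration, and the normal-approximation bound is only one of several interval choices an actual survey might use.

```python
# Demand-based EDG start reliability: point estimate plus an
# approximate 95% lower confidence bound (illustrative counts).
import math

starts, failures = 1240, 17          # hypothetical fleet-wide demands
p_hat = 1.0 - failures / starts      # start reliability point estimate
se = math.sqrt(p_hat * (1.0 - p_hat) / starts)
lower_95 = p_hat - 1.96 * se         # normal-approximation lower bound

print(round(p_hat, 4), lower_95 < p_hat)
```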

  10. Reliability Centered Maintenance (RCM) Methodology and Application to the Shutdown Cooling System for APR-1400 Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Faragalla, Mohamed M.; Emmanuel, Efenji; Alhammadi, Ibrahim; Awwal, Arigi M.; Lee, Yong Kwan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2016-10-15

    The Shutdown Cooling System (SCS) is a safety-related system that is used in conjunction with the Main Steam and Main or Auxiliary Feedwater Systems to reduce the temperature of the Reactor Coolant System (RCS) in post-shutdown periods from the hot shutdown operating temperature to the refueling temperature. In this paper, RCM methodology is applied to the SCS. The RCM analysis is performed based on a Failure Modes, Effects and Criticality Analysis (FMECA) at the component, system, and plant levels. Logic Tree Analysis (LTA) is used to determine the optimum maintenance tasks. The main objectives of RCM are safety, preservation of system function, cost-effective maintenance of plant components, and increased reliability and availability. The RCM methodology is useful for improving equipment reliability by strengthening the management of equipment condition, and it leads to a significant decrease in the amount of periodic maintenance, an extended maintenance cycle, a longer useful life of equipment, and a decrease in overall maintenance cost. It also focuses on the safety of the system by assigning a criticality index to the various components and then selecting maintenance activities based on the risk of failure involved. Therefore, RCM yields a maintenance plan designed for maximum safety in an economical manner, making the system more reliable. For the SCP, increasing the number of condition-monitoring tasks will improve its availability, and it is recommended to reduce the number of periodic maintenance activities.
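The criticality-index-plus-logic-tree step can be sketched as follows: score each failure mode with a risk priority number and let a tiny decision tree pick a maintenance strategy. The scales, thresholds, and failure modes are invented for this sketch, not taken from the APR-1400 study.

```python
# RCM-style screening sketch: criticality scoring (FMECA-like RPN)
# followed by a minimal logic-tree task selection.

def rpn(severity, occurrence, detectability):
    """Risk priority number; each input on an illustrative 1-10 scale."""
    return severity * occurrence * detectability

def select_task(severity, occurrence, detectability, condition_monitorable):
    score = rpn(severity, occurrence, detectability)
    if score < 50:                    # invented screening threshold
        return "run-to-failure"
    if condition_monitorable:
        return "condition monitoring"
    return "periodic maintenance"

# Hypothetical pump failure modes
print(select_task(9, 4, 5, True))    # critical and monitorable
print(select_task(2, 3, 4, False))   # low risk
```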

  11. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program
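The component-failure side described above reduces to ordinary fault-tree arithmetic once the success criteria are fixed: AND gates multiply independent failure probabilities, OR gates combine them via the complement of joint success. The gate structure and probabilities below are illustrative, not from the Sandia study.

```python
# Minimal fault-tree arithmetic for independent basic events.

def and_gate(*probs):
    """All inputs must fail."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """Any input failing fails the gate: 1 - product of successes."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Two redundant check valves must both stick shut (AND); either that
# event or a line rupture (rare) fails the passive function (OR).
p_valves = and_gate(1e-3, 1e-3)
p_system = or_gate(p_valves, 1e-6)
print(p_system > p_valves)
```

The process-uncertainty side (natural-circulation success criteria) is what the expert-judgment and thermal-hydraulic steps add on top of this arithmetic.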

  12. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for study strategies is proved in modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information are justified in the system of financial management, allowing to improve its theoretical foundations. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of company’s capital is grounded. The economic nature of capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and investment resource in the economic process in order to obtain profit, to ensure the growth of owners’ prosperity and to achieve social effect.

  13. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for study strategies is proved in modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information are justified in the system of financial management, allowing to improve its theoretical foundations. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of company’s capital is grounded. The economic nature of capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and investment resource in the economic process in order to obtain profit, to ensure the growth of owners’ prosperity and to achieve social effect.

  14. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    Science.gov (United States)

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors aimed to assess the reliability of soft tissue model based implant surgical guides and reported that accuracy was evaluated using software.1 We found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, the likelihood ratio positive (sensitivity/(1 - specificity)) and the likelihood ratio negative ((1 - sensitivity)/specificity), as well as the diagnostic odds ratio, are among the measures used to evaluate the validity (accuracy) of a test compared with a gold standard.2-4 It is not clear to which of the above estimates the reported twenty-two accurate sites (46.81%) correspond. Reliability (repeatability or reproducibility) is often assessed with statistical tests such as Pearson's r, least squares, and the paired t-test, all of which are common mistakes in reliability analysis.5 Briefly, for quantitative variables the intraclass correlation coefficient (ICC) should be used, and for qualitative variables weighted kappa, applied with caution because kappa has its own limitations too. Regarding reliability or agreement, it is good to know that simple percent agreement considers only the concordant cells, whereas the discordant cells should also be taken into account to reach a correct estimate of agreement (weighted kappa).2-4 As a take-home message, for reliability and validity analysis, appropriate tests should be
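The validity and agreement statistics listed in the letter above can be computed from a 2x2 table of test results against a gold standard. The counts below are invented purely to show the formulas.

```python
# Diagnostic validity measures and Cohen's kappa from an invented
# 2x2 table (test vs. gold standard).
tp, fp, fn, tn = 40, 5, 10, 45

sens = tp / (tp + fn)            # sensitivity (true positive rate)
spec = tn / (tn + fp)            # specificity (true negative rate)
ppv  = tp / (tp + fp)            # positive predictive value
npv  = tn / (tn + fn)            # negative predictive value
lr_pos = sens / (1 - spec)       # likelihood ratio positive
lr_neg = (1 - sens) / spec       # likelihood ratio negative

# Cohen's kappa: observed agreement corrected for chance agreement,
# computed from ALL four cells of the table.
n = tp + fp + fn + tn
po = (tp + tn) / n
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (po - pe) / (1 - pe)

print(round(sens, 2), round(spec, 2), round(kappa, 2))
```

Weighted kappa, which the letter recommends for ordinal categories, additionally weights the discordant cells by how far apart the disagreeing categories are.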

  15. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective

  16. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  17. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  18. Functional components for a design strategy: Hot cell shielding in the high reliability safeguards methodology

    Energy Technology Data Exchange (ETDEWEB)

    Borrelli, R.A., E-mail: rborrelli@uidaho.edu

    2016-08-15

    The high reliability safeguards (HRS) methodology has been established for the safeguardability of advanced nuclear energy systems (NESs). HRS is being developed in order to integrate safety, security, and safeguards concerns, while also optimizing these with operational goals for facilities that handle special nuclear material (SNM). Currently, a commercial pyroprocessing facility is used as an example system. One of the goals in the HRS methodology is to apply intrinsic features of the system to a design strategy. This current study investigates the thickness of the hot cell walls that could adequately shield processed materials. This is an important design consideration that carries implications regarding the formation of material balance areas, the location of key measurement points, and material flow in the facility.

  19. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Kiwhan [Los Alamos National Laboratory; Beddingfield, David H. [Los Alamos National Laboratory; Geist, William H. [Los Alamos National Laboratory; Lee, Sang-Yoon [unaffiliated

    2012-07-03

    A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. Conclusions of this presentation are: (1) a fuel depletion methodology was established and its safeguards application demonstrated; (2) the fuel is proliferation resistant at high discharge burnup (~80 GWD/MtHM): unfavorable isotopics, a high number of pebbles needed, and pebbles that are harder to reprocess; (3) spent fuel should remain under safeguards comparable to that of LWR fuel; and (4) diversion scenarios were not considered, but such an analysis can be performed.

  20. A reach of the principle of entry and the principle of reliability in the real estate cadastre in our court practice

    Directory of Open Access Journals (Sweden)

    Cvetić Radenka M.

    2015-01-01

    Full Text Available Through the review of the principle of entry and the principle of reliability in the Real Estate Cadastre and their reach in our court practice, this article indicates the indispensability of compliance with these principles for the sake of legal certainty. A formidable and a complex role of the court when applying law in order to rightfully resolve an individual case has been underlined. Having regard to the accountability of the courts for the efficacy of the legal system, without any intention to disavow the court practice, some deficiencies have been pointed out, with the aim to help. An abstract manner of legal norms necessarily requires a creative role of courts in cases which cannot be easily qualified. For that reason certain deviations ought to be made followed by reasoning which unambiguously leads to the conclusion that only a specific decision which the court rendered is possible and just.

  1. THE GENERAL METHODOLOGICAL PRINCIPLES OF COMBINED OPTIONAL ONLINE ENGLISH LANGUAGE TRAINING OF PRIMARY SCHOOL STUDENTS

    Directory of Open Access Journals (Sweden)

    E. I. Zadorozhnaya

    2016-01-01

    Full Text Available The aim of the publication is to demonstrate the implementation of general methodological principles of optional online foreign-language learning in elementary school, using as an example a virtual course for students of the second and third grades. Methods. The methods involve pedagogical modeling and projecting; the experience of foreign and Russian methodologists, teachers and researchers is analysed, generalized and adjusted to modern realities. Results and scientific novelty. On the basis of the requirements of the state educational standard and pupils' interest in computer games, the author's technique of combined optional educational activities, integrated with English language training at elementary school, is developed. Online training in the form of games (additional to the major classroom activities) offers a choice of tasks interesting to children and study of the material at an optimally comfortable, individual pace; tasks can be performed at home, excluding the stressful situations specific to school examination, which allows pupils to master personal, metasubject and subject competences most effectively. In the general context of quality improvement of general education, the modernization of the educational process assumes not only justification of its new content, but also restructuring of scientific and methodical support, which has to meet the essential needs of teachers and pupils and facilitate access to necessary specific information. The lack of a methodical base for the creation of electronic distance resources for foreign-language education of younger school students motivated the author to create an original methodical concept of online training taking into account the age of pupils. The complex of general methodical principles is thoroughly considered; based on these principles, the proposed modular technique of organizing an online class is created and implemented. 
Interactive blocks are

  2. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Dreyhaupt, Jens

    2017-05-01

    Full Text Available An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called “cluster randomization”). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.

  3. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects.

    Science.gov (United States)

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called "cluster randomization"). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.
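
    The sample-size inflation both abstracts describe follows from the standard design effect, DEFF = 1 + (m − 1)·ICC, for clusters of size m with intra-cluster correlation ICC. The class size and ICC below are illustrative assumptions, not values from the article:

    ```python
    import math

    def design_effect(m, icc):
        """Inflation factor for cluster randomization vs. individual randomization."""
        return 1 + (m - 1) * icc

    def cluster_sample_size(n_individual, m, icc):
        """Total n required when whole groups of size m are randomized."""
        return math.ceil(n_individual * design_effect(m, icc))

    # Suppose 128 students would suffice under individual randomization.
    # With classes of 25 and a modest ICC of 0.05, the required total grows:
    n = cluster_sample_size(128, m=25, icc=0.05)
    print(n)  # -> 282 (design effect 2.2)
    ```

    Even a small ICC more than doubles the required sample here, which is why cluster-randomized educational studies need dedicated sample-size methods.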

  4. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and highlight the lessons learned while applying the two techniques while providing high-level results. This includes the global benefits and deficiencies of the methods and practical problems encountered during the implementation of each technique.

  5. Field programmable gate array reliability analysis using the dynamic flow graph methodology

    Energy Technology Data Exchange (ETDEWEB)

    McNelles, Phillip; Lu, Lixuan [Faculty of Energy Systems and Nuclear Science, University of Ontario Institute of Technology (UOIT), Ontario (Canada)

    2016-10-15

    Field programmable gate array (FPGA)-based systems are thought to be a practical option to replace certain obsolete instrumentation and control systems in nuclear power plants. An FPGA is a type of integrated circuit, which is programmed after being manufactured. FPGAs have some advantages over other electronic technologies, such as analog circuits, microprocessors, and Programmable Logic Controllers (PLCs), for nuclear instrumentation and control, and safety system applications. However, safety-related issues for FPGA-based systems remain to be verified. Owing to this, modeling FPGA-based systems for safety assessment has now become an important point of research. One potential methodology is the dynamic flowgraph methodology (DFM). It has been used for modeling software/hardware interactions in modern control systems. In this paper, FPGA logic was analyzed using DFM. Four aspects of FPGAs are investigated: the 'IEEE 1164 standard', registers (D flip-flops), configurable logic blocks, and an FPGA-based signal compensator. The ModelSim simulations confirmed that DFM was able to accurately model those four FPGA properties, proving that DFM has the potential to be used in the modeling of FPGA-based systems. Furthermore, advantages of DFM over traditional reliability analysis methods and FPGA simulators are presented, along with a discussion of potential issues with using DFM for FPGA-based system modeling.

  6. Application of REPAS Methodology to Assess the Reliability of Passive Safety Systems

    Directory of Open Access Journals (Sweden)

    Franco Pierro

    2009-01-01

    Full Text Available The paper presents the Reliability Evaluation of Passive Safety System (REPAS) methodology developed by the University of Pisa. The general objective of REPAS is to characterize in an analytical way the performance of a passive system, in order to increase confidence in its operation and to compare the performances of active and passive systems and of different passive systems. REPAS can be used in the design of passive safety systems to assess their performance and to optimize their costs. It may also provide numerical values that can be used in more complex safety assessment studies, and it can be seen as a support to Probabilistic Safety Analysis studies. With regard to this, some examples of the application of the methodology are reported in the paper. A best-estimate thermal-hydraulic code, RELAP5, has been used to support the analyses and to model the selected systems. Probability distributions have been assigned to the uncertain input parameters through engineering judgment. The Monte Carlo method has been used to propagate uncertainties, and Wilks' formula has been used to select the sample size. Failure criteria are defined in terms of nonfulfillment of the defined design targets.

  7. A methodology based in particle swarm optimization algorithm for preventive maintenance focused in reliability and cost

    International Nuclear Information System (INIS)

    Luz, Andre Ferreira da

    2009-01-01

    In this work, a Particle Swarm Optimization (PSO) algorithm is developed for preventive maintenance optimization. The proposed methodology allows the use of flexible intervals between maintenance interventions, instead of the fixed periods usually considered, which permits a better adaptation of scheduling to the failure rates of components under aging. However, because of this flexibility, planning preventive maintenance becomes a more difficult task. Motivated by the fact that PSO has proved to be very competitive compared with other optimization tools, this work investigates the use of PSO as an alternative optimization tool. Considering that PSO works in a real and continuous space, it is a challenge to use it for discrete optimization, in which the schedule may comprise a variable number of maintenance interventions. The PSO model developed in this work overcomes this difficulty. The proposed PSO searches for the best maintenance policy and considers several aspects, such as: the probability of needing repair (corrective maintenance), the cost of such repairs, typical outage times, the costs of preventive maintenance, the impact of maintenance on the reliability of the system as a whole, and the probability of imperfect maintenance. To evaluate the proposed methodology, we investigate an electro-mechanical system consisting of three pumps and four valves, the High Pressure Injection System (HPIS) of a PWR. Results show that PSO is quite efficient in finding optimum preventive maintenance policies for the HPIS. (author)
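
    As a minimal sketch of the kind of search described above (not the author's actual model), the following runs a plain PSO over continuous maintenance intervals for seven components, with an invented cost function that trades frequent preventive interventions against aging-driven repair cost:

    ```python
    import random

    def cost(x):
        # Hypothetical stand-in for the reliability/cost model: preventive cost
        # falls with longer intervals x_i (days), expected repair cost rises.
        return sum(200.0 / xi + 0.05 * xi ** 1.5 for xi in x)

    def pso(dim, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
        """Minimal particle swarm optimizer over a box [lo, hi]^dim."""
        rng = random.Random(seed)
        pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [cost(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    # clamp positions to the feasible interval range
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                v = cost(pos[i])
                if v < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], v
                    if v < gbest_val:
                        gbest, gbest_val = pos[i][:], v
        return gbest, gbest_val

    # 3 pumps + 4 valves -> 7 maintenance intervals between 10 and 365 days.
    best, val = pso(dim=7, lo=10.0, hi=365.0)
    ```

    With this separable toy cost the per-component optimum is around 23 days; the swarm typically lands close to it, illustrating why PSO suits such scheduling problems despite offering no optimality guarantee.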

  8. Validity and reliability of using photography for measuring knee range of motion: a methodological study

    Directory of Open Access Journals (Sweden)

    Adie Sam

    2011-04-01

    Full Text Available Abstract Background The clinimetric properties of knee goniometry are essential to appreciate in light of its extensive use in the orthopaedic and rehabilitative communities. Intra-observer reliability is thought to be satisfactory, but the validity and inter-rater reliability of knee goniometry often demonstrate unacceptable levels of variation. This study tests the validity and reliability of measuring knee range of motion using goniometry and photographic records. Methods Design: Methodological study assessing the validity and reliability of one method (the 'Marker Method'), which uses a skin marker over the greater trochanter, and another method (the 'Line of Femur Method'), which requires estimation of the line of the femur. Setting: Radiology and orthopaedic departments of two teaching hospitals. Participants: 31 volunteers (13 arthritic and 18 healthy subjects). Knee range of motion was measured radiographically and photographically using a goniometer. Three assessors were assessed for reliability and validity. Main outcomes: Agreement between methods and within raters was assessed using concordance correlation coefficients (CCCs). Agreement between raters was assessed using intra-class correlation coefficients (ICCs). 95% limits of agreement for the mean difference for all paired comparisons were computed. Results Validity (referenced to radiographs): Each method for all 3 raters yielded very high CCCs for flexion (0.975 to 0.988), and moderate to substantial CCCs for extension angles (0.478 to 0.678). The mean differences and 95% limits of agreement were narrower for flexion than they were for extension. Intra-rater reliability: For flexion and extension, very high CCCs were attained for all 3 raters for both methods, with slightly greater CCCs seen for flexion (CCCs varied from 0.981 to 0.998). Inter-rater reliability: For both methods, very high ICCs (min to max: 0.891 to 0.995) were obtained for flexion and extension. Slightly higher coefficients were obtained
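
    The study's agreement statistic, Lin's concordance correlation coefficient (CCC), can be sketched directly from its definition. Unlike Pearson's r, the CCC penalises systematic offsets between methods, not just scatter; the angle readings below are invented for illustration:

    ```python
    # Lin's CCC: rho_c = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2).

    def ccc(x, y):
        """Concordance correlation coefficient between two paired measurements."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sx = sum((v - mx) ** 2 for v in x) / n
        sy = sum((v - my) ** 2 for v in y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
        return 2 * sxy / (sx + sy + (mx - my) ** 2)

    radiograph = [130, 125, 118, 140, 135, 122]   # reference flexion angles (deg)
    photo      = [128, 126, 117, 138, 136, 121]   # photographic method
    photo_bias = [v + 10 for v in photo]          # same scatter, 10 deg offset

    print(round(ccc(radiograph, photo), 3))       # close agreement, ≈ 0.98
    print(round(ccc(radiograph, photo_bias), 3))  # same correlation, much lower CCC
    ```

    The biased series keeps the same Pearson correlation but its CCC drops sharply, which is exactly why agreement studies such as this one prefer CCC over r.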

  9. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  10. A novel application for the cavalieri principle: a stereological and methodological study.

    Science.gov (United States)

    Altunkaynak, Berrin Zuhal; Altunkaynak, Eyup; Unal, Deniz; Unal, Bunyamin

    2009-08-01

    The Cavalieri principle was applied to consecutive pathology sections that were photographed at the same magnification, and tissue volumes were estimated by superimposing a point-counting grid on these images. The goal of this study was to perform the Cavalieri method quickly and practically. In this study, 10 adult female Sprague Dawley rats were used. Brain tissue was removed and sampled both systematically and randomly. Brain volumes were estimated using two different methods. First, all brain slices were scanned with an HP ScanJet 3400C scanner, and their images were shown on a PC monitor. Brain volume was then calculated based on these images. Second, all brain slices were photographed at 10× magnification with a microscope camera, and brain volumes were estimated based on these micrographs. There was no statistically significant difference between the volume measurements of the two techniques (P > 0.05; paired-samples t-test). This study demonstrates that personal-computer scanning of serial tissue sections allows for easy and reliable volume determination based on the Cavalieri method.
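
    The Cavalieri estimate itself is a one-line formula: volume ≈ (points hitting the tissue) × (area represented by each grid point) × (distance between sampled section planes). The counts and grid geometry below are invented for illustration, not data from the study:

    ```python
    def cavalieri_volume(points_per_section, area_per_point_mm2, section_spacing_mm):
        """Volume estimate in mm^3 from point-counting grid tallies on serial sections."""
        total_points = sum(points_per_section)
        return total_points * area_per_point_mm2 * section_spacing_mm

    # e.g. 8 systematically sampled sections through a brain, a 2 x 2 mm grid
    # (4 mm^2 per point), sampled planes 0.5 mm apart.
    counts = [12, 25, 38, 47, 44, 36, 22, 9]
    v = cavalieri_volume(counts, area_per_point_mm2=4.0, section_spacing_mm=0.5)
    print(v)  # -> 466.0 (mm^3)
    ```

    Because only point tallies are needed, the method works equally well on scanner images or micrographs, which is what the two techniques compared above exploit.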

  11. [Principles and methodology for ecological rehabilitation and security pattern design in key project construction].

    Science.gov (United States)

    Chen, Li-Ding; Lu, Yi-He; Tian, Hui-Ying; Shi, Qian

    2007-03-01

    Global ecological security becomes increasingly important with the intensive human activities. The function of ecological security is influenced by human activities, and in return, the efficiency of human activities will also be affected by the patterns of regional ecological security. Since the 1990s, China has initiated the construction of key projects "Yangtze Three Gorges Dam", "Qinghai-Tibet Railway", "West-to-East Gas Pipeline", "West-to-East Electricity Transmission" and "South-to-North Water Transfer" , etc. The interaction between these projects and regional ecological security has particularly attracted the attention of Chinese government. It is not only important for the regional environmental protection, but also of significance for the smoothly implementation of various projects aimed to develop an ecological rehabilitation system and to design a regional ecological security pattern. This paper made a systematic analysis on the types and characteristics of key project construction and their effects on the environment, and on the basis of this, brought forward the basic principles and methodology for ecological rehabilitation and security pattern design in this construction. It was considered that the following issues should be addressed in the implementation of a key project: 1) analysis and evaluation of current regional ecological environment, 2) evaluation of anthropogenic disturbances and their ecological risk, 3) regional ecological rehabilitation and security pattern design, 4) scenario analysis of environmental benefits of regional ecological security pattern, 5) re-optimization of regional ecological system framework, and 6) establishment of regional ecosystem management plan.

  12. Principles of methodology guiding the study of religion as found in the work of Friedrich Heiler

    Directory of Open Access Journals (Sweden)

    Tatiana Samarina

    2013-02-01

    Full Text Available The issue of methodology in the study of religious phenomena appears problematic to those Russian experts who deal with the scientific study of religious phenomena. On the other hand, western European researchers have already made much positive progress in this direction. This article attempts to define the principles which guide the scholar in the study of religion as found in the work of the renowned German scholar Friedrich Heiler. The author's starting point is Heiler's fundamental concept which permeates all his work and is most clearly defined in his last monograph: «Erscheinungsformen und Wesen der Religion». Heiler gives several pointers to students of religion: attention to detail, a firm grounding in the core matter of the object to be studied, and at the same time a comprehensive and panoramic view of religion and religious phenomena as a whole. The author concludes that Heiler's protracted study of both Christianity and the Eastern religions led him to regard the phenomenological method as the most effective. The author presupposes a negative reception of this method by Russian students of religious phenomena due to the fact that the latter are too unfamiliar with Heiler's work and conclusions.

  13. Preparation of methodology for reliability analysis of selected digital segments of the instrumentation and control systems of NPPs. Pt. 1

    International Nuclear Information System (INIS)

    Hustak, S.; Patrik, M.; Babic, P.

    2000-12-01

    The report is structured as follows: (i) Introduction; (ii) Important notions relating to the safety and dependability of software systems for nuclear power plants (selected notions from IAEA Technical Report No. 397; safety aspects of software application; reliability/dependability aspects of digital systems); (iii) Peculiarities of digital systems and ways to a dependable performance of the required function (failures in the system and principles of defence against them; ensuring resistance of digital systems against failures at various hardware and software levels); (iv) The issue of analytical procedures to assess the safety and reliability of safety-related digital systems (safety and reliability assessment at an early stage of the project; general framework of reliability analysis of complex systems; choice of an appropriate quantitative measure of software reliability); (v) Selected qualitative and quantitative information about the reliability of digital systems; the use of relations between the incidence of various types of faults); and (vi) Conclusions and recommendations. (P.A.)

  14. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  15. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and methodologic and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index [BMI (in kg/m²): 20-35]. Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and interindividual CVs were 20% and 25%, respectively. Increasing sample size, replication of reference and test foods, and length of blood sampling, as well as AUC calculation method, did not improve the CVs. Among the biological factors assessed, insulin index and glycated hemoglobin values explained 15% and 16% of the variability in mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that it is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value
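
    The GI computation the study standardizes can be sketched as follows: incremental area under the glucose curve (iAUC) by the trapezoidal rule, counting only the area above the fasting baseline, then GI = 100 × iAUC(test) / iAUC(reference). The glucose readings below are invented example data, not values from the study:

    ```python
    def iauc(times, glucose, baseline=None):
        """Incremental AUC: trapezoids over the part of the curve above baseline."""
        base = glucose[0] if baseline is None else baseline
        area = 0.0
        for (t0, g0), (t1, g1) in zip(zip(times, glucose),
                                      zip(times[1:], glucose[1:])):
            # clip each endpoint at the baseline so dips below it add no area
            h0, h1 = max(g0 - base, 0.0), max(g1 - base, 0.0)
            area += 0.5 * (h0 + h1) * (t1 - t0)
        return area

    times = [0, 15, 30, 45, 60, 90, 120]       # minutes after ingestion
    ref   = [90, 130, 160, 150, 130, 105, 92]  # 50 g glucose (reference), mg/dL
    test  = [90, 115, 135, 130, 118, 100, 94]  # 50 g carbohydrate from white bread

    gi = 100.0 * iauc(times, test) / iauc(times, ref)
    print(round(gi, 1))  # -> 67.2
    ```

    Variants of the AUC calculation (e.g. counting area below baseline, or different cut-off times) change the resulting GI, which is one of the methodological sources of variability the study examines.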

  16. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
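
As a much simplified illustration of the convective part of such a model, the sketch below integrates batch hindered settling in a closed column with an explicit Lax-Friedrichs scheme. This is not the consistent Godunov-type method of the paper: the Vesilind flux parameters are assumed, and compression, dispersion and the feed source term are all omitted.

```python
import math

def batch_settling(n_cells=50, depth=1.0, c0=3.0, t_end=500.0):
    """Explicit Lax-Friedrichs scheme for batch hindered settling,
    dC/dt + d f(C)/dz = 0 (z positive downward), with Vesilind flux
    f(C) = v0*C*exp(-rv*C) and zero-flux boundaries (closed column)."""
    v0, rv = 1e-3, 0.4            # Vesilind parameters (assumed): m/s, m3/kg
    dz = depth / n_cells
    c = [c0] * n_cells            # uniform initial concentration, kg/m3
    dt = 0.5 * dz / v0            # CFL-limited step, since |f'(C)| <= v0
    flux = lambda u: v0 * u * math.exp(-rv * u)
    t = 0.0
    while t < t_end:
        # Lax-Friedrichs numerical flux at interior faces; zero at the walls
        F = [0.0]
        for i in range(n_cells - 1):
            F.append(0.5 * (flux(c[i]) + flux(c[i + 1]))
                     - 0.5 * (dz / dt) * (c[i + 1] - c[i]))
        F.append(0.0)
        c = [c[i] - dt / dz * (F[i + 1] - F[i]) for i in range(n_cells)]
        t += dt
    return c, dz
```

Because the boundary fluxes vanish, the scheme conserves the total sludge mass exactly, and solids accumulate in the bottom cells as expected.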

  17. Assessing and updating the reliability of concrete bridges subjected to spatial deterioration - principles and software implementation

    DEFF Research Database (Denmark)

    Schneider, Ronald; Fischer, Johannes; Bügler, Maximilian

    2015-01-01

    to implement the method presented here. The software prototype is applied to a typical highway bridge and the influence of inspection information on the system deterioration state and the structural reliability is quantified taking into account the spatial correlation of the corrosion process. This work...

  18. Glances at renewable and sustainable energy principles, approaches and methodologies for an ambiguous benchmark

    CERN Document Server

    Jenssen, Till

    2013-01-01

    Offering a thorough review of the principles of sustainability assessment, this book explores multi-criteria decision analysis, ecological footprint analysis and normative-functional concepts via case studies in developed, emerging and developing countries.

  19. Methodological issues concerning the application of reliable laser particle sizing in soils

    Science.gov (United States)

    de Mascellis, R.; Impagliazzo, A.; Basile, A.; Minieri, L.; Orefice, N.; Terribile, F.

    2009-04-01

    During the past decade, the evolution of technologies has enabled laser diffraction (LD) to become a widespread means of measuring particle size distribution (PSD), replacing sedimentation and sieve analysis in many scientific fields, mainly due to its advantages of versatility, fast measurement and high reproducibility. Despite such developments, the soil science community has been quite reluctant to replace the good old sedimentation techniques (ST), possibly because of (i) the large complexity of the soil matrix inducing different types of artefacts (aggregates, deflocculating dynamics, etc.), (ii) the difficulties in relating LD results with results obtained through sedimentation techniques and (iii) the limited size range of most LD equipment. More recently, LD granulometry has slowly been gaining appreciation in soil science, also because of innovations including an enlarged dynamic size range (0.01-2000 μm) and the ability to implement more powerful algorithms (e.g. Mie theory). Furthermore, LD PSD can be successfully used in the application of physically based pedo-transfer functions (i.e., the Arya and Paris model) for investigations of soil hydraulic properties, due to the direct determination of PSD in terms of volume percentage rather than mass percentage, thus eliminating the need to adopt the rough approximation of a single value for soil particle density in the prediction process. Most of the recent LD work performed in soil science deals with the comparison with sedimentation techniques and shows a general overestimation of the silt fraction accompanied by a general underestimation of the clay fraction; these well-known results must be related to the different physical principles behind the two techniques. Despite these efforts, it is indeed surprising that little if any work is devoted to more basic methodological issues related to the high sensitivity of LD to the quantity and the quality of the soil samples. Our work aims to

  20. MERMOS: an EDF project to update the PHRA methodology (Probabilistic Human Reliability Assessment)

    International Nuclear Information System (INIS)

    Le Bot, Pierre; Desmares, E.; Bieder, C.; Cara, F.; Bonnet, J.L.

    1998-01-01

    To account for the successive evolution of nuclear power plant emergency operation, EDF has had to revise its PHRA methodologies several times. It was particularly the case when event-based procedures were abandoned in favour of state-based procedures. A more recent update was necessary to obtain information on the safety of the new N4 unit type. The extent of the changes in operation for this unit type (especially the computerization of both the control room and the procedures) required a deep rethinking of existing PHRA methods. It also seemed necessary, more explicitly than in the past, to base the design of methods on concepts developed in the human sciences. These are the main ambitions of the project named MERMOS that started in 1996. The design effort for a new PHRA method is carried out by a multidisciplinary team involving reliability engineers, psychologists and ergonomists. An independent expert is in charge of the project review. The method, considered as the analysis tool dedicated to PHRA analysts, is one of the two outcomes of the project. The other one is the formalization of the design approach for the method, aimed at a good appropriation of the method by the analysts. EDF's specificity in the field of PHRA, and more generally PSA, is that the method is not used by the designers but by analysts. Keeping track of the approach is also meant to guarantee its transposition to other EDF unit types such as the 900 or 1300 MW PWR. The PHRA method is based upon a model of emergency operation called the 'SAD model'. The effort of formalizing the design approach led to clarifying and justifying it. The model describes and explains both the functioning and dysfunction of emergency operation in PSA scenarios. It combines a systemic approach with what is called distributed cognition in the cognitive sciences. Collective aspects are considered an important feature in explaining the phenomena under study in operational dysfunction. The PHRA method is to be operational early next year (1998

  1. Reliability technology principles and practice of failure prevention in electronic systems

    CERN Document Server

    Pascoe, Norman

    2011-01-01

    A unique book that describes the practical processes necessary to achieve failure free equipment performance, for quality and reliability engineers, design, manufacturing process and environmental test engineers. This book studies the essential requirements for successful product life cycle management. It identifies key contributors to failure in product life cycle management and particular emphasis is placed upon the importance of thorough Manufacturing Process Capability reviews for both in-house and outsourced manufacturing strategies. The readers' attention is also drawn to the ma

  2. Summary of the preparation of methodology for digital system reliability analysis for PSA purposes

    International Nuclear Information System (INIS)

    Hustak, S.; Babic, P.

    2001-12-01

    The report is structured as follows: Specific features of and requirements for the digital part of NPP Instrumentation and Control (I and C) systems (Computer-controlled digital technologies and systems of the NPP I and C system; Specific types of digital technology failures and preventive provisions; Reliability requirements for the digital parts of I and C systems; Safety requirements for the digital parts of I and C systems; Defence-in-depth). Qualitative analyses of NPP I and C system reliability and safety (Introductory system analysis; Qualitative requirements for and proof of NPP I and C system reliability and safety). Quantitative reliability analyses of the digital parts of I and C systems (Selection of a suitable quantitative measure of digital system reliability; Selected qualitative and quantitative findings regarding digital system reliability; Use of relations among the occurrences of the various types of failure). Mathematical section in support of the calculation of the various types of indices (Boolean reliability models, Markovian reliability models). Example of digital system analysis (Description of a selected protective function and the relevant digital part of the I and C system; Functional chain examined, its components and fault tree). (P.A.)
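
The Markovian reliability models mentioned in this record can be illustrated with the simplest case: a single repairable component alternating between an operating and a failed state. The rates below are illustrative, not values from the report.

```python
import math

def availability(t, lam, mu):
    """Point availability A(t) of a repairable component modelled as a
    two-state Markov process (operating <-> failed) with constant
    failure rate lam and repair rate mu, initially operating.
    Solving the Markov equations gives the closed form:
        A(t) = mu/(lam + mu) + lam/(lam + mu) * exp(-(lam + mu) * t)
    """
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

# Example with assumed rates: lam = 1e-4 /h, mu = 1e-2 /h.
# A(0) = 1 and A(t) decays toward the steady-state value mu/(lam + mu).
```

The steady-state availability mu/(lam + mu) is the quantity typically fed into higher-level (e.g. Boolean/fault-tree) models as a basic-event probability.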

  3. Application case study of AP1000 automatic depressurization system (ADS) for reliability evaluation by GO-FLOW methodology

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, Muhammad, E-mail: hashimsajid@yahoo.com; Hidekazu, Yoshikawa, E-mail: yosikawa@kib.biglobe.ne.jp; Takeshi, Matsuoka, E-mail: mats@cc.utsunomiya-u.ac.jp; Ming, Yang, E-mail: myang.heu@gmail.com

    2014-10-15

    Highlights: • Discussion of why AP1000 is equipped with an ADS system in comparison with conventional PWRs. • Clarification of full and partial depressurization of the reactor coolant system by the ADS system. • Application case study of the four-stage ADS system for reliability evaluation in LBLOCA. • The GO-FLOW tool is capable of evaluating the dynamic reliability of passive safety systems. • The calculated ADS reliability results significantly increased the dynamic reliability of PXS. - Abstract: The AP1000 nuclear power plant (NPP) utilizes passive means for its safety systems to ensure safety in events of transients or severe accidents. One of the unique safety systems of AP1000 compared with the conventional PWR is the “four-stage Automatic Depressurization System (ADS)”; the ADS originally works as an active safety system. In the present study, the authors first discuss why the four-stage ADS system is added in the AP1000 plant compared with the conventional PWR, from the standpoint of reliability. They then explain the full and partial depressurization of the RCS by the four-stage ADS in events of transients and loss of coolant accidents (LOCAs). Lastly, an application case study of the four-stage ADS system of AP1000 is conducted to evaluate the reliability of the ADS system under postulated conditions of full RCS depressurization during a large break loss of coolant accident (LBLOCA) in one of the RCS cold legs. In this case study, the reliability evaluation is made by the GO-FLOW methodology to determine the influence of the ADS system on the dynamic reliability of the passive core cooling system (PXS) of AP1000, i.e. what will happen if the ADS system fails or actuates successfully. GO-FLOW is a success-oriented reliability analysis tool capable of evaluating system reliability/unavailability as an alternative to Fault Tree Analysis (FTA) and Event Tree Analysis (ETA) tools. Under these specific conditions of LBLOCA, the GO-FLOW calculated reliability results indicated

  4. A G-function-based reliability-based design methodology applied to a cam roller system

    International Nuclear Information System (INIS)

    Wang, W.; Sui, P.; Wu, Y.T.

    1996-01-01

    Conventional reliability-based design optimization methods treat the reliability function as an ordinary function and apply existing mathematical programming techniques to solve the design problem. As a result, the conventional approach requires nested loops with respect to the g-function and is very time consuming. A new reliability-based design method is proposed in this paper that deals with the g-function directly instead of the reliability function. This approach has the potential of significantly reducing the number of g-function evaluations, since it requires only one full reliability analysis per design iteration. A cam roller system in a typical high-pressure fuel injection diesel engine is designed using both the proposed and the conventional approach. The proposed method is much more efficient for this application

  5. A Methodology for Building Faculty Support for the United Nations Principles for Responsible Management Education

    Science.gov (United States)

    Maloni, Michael J.; Smith, Shane D.; Napshin, Stuart

    2012-01-01

    Evidence from extant literature indicates that faculty support is a critical driver for implementing the United Nations Principles for Responsible Management Education (PRME), particularly for schools pursuing an advanced, cross-disciplinary level of sustainability integration. However, there is limited existing research offering insight into how…

  6. Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology

    Science.gov (United States)

    Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.

    2016-05-01

    Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components dimensioning the lifetime of the system. Cooler reliability is thus one of its most important parameters, and it has to increase to meet market needs. To achieve this, data identifying the weakest element determining cooler reliability have to be collected. Yet, field data are hardly usable due to a lack of information. A method for identifying reliability improvements therefore has to be set up that can be used even without field returns. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expert analyses of RM2 failures occurring in accelerated ageing. Failure modes were then identified and corrective actions carried out. Besides this, the functions of the cooler were organized hierarchically with regard to their potential to increase its efficiency. Specific changes have been introduced on the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on these two axes, weak spots for cooler reliability and efficiency, allowed us to drastically increase the MTTF of the RM2 cooler. Huge improvements in RM2 reliability are proven by both field returns and reliability monitoring. These figures are discussed in the paper.

  7. In-plant reliability data base for nuclear power plant components: data collection and methodology report

    International Nuclear Information System (INIS)

    Drago, J.P.; Borkowski, R.J.; Pike, D.H.; Goldberg, F.F.

    1982-07-01

    The development of a component reliability data base for use in nuclear power plant probabilistic risk assessments and reliability studies is presented in this report. The sources of the data are the in-plant maintenance work request records from a sample of nuclear power plants. This data base is called the In-Plant Reliability Data (IPRD) system. Features of the IPRD system are compared with other data sources such as the Licensee Event Report system, the Nuclear Plant Reliability Data system, and IEEE Standard 500. Generic descriptions of nuclear power plant systems formulated for IPRD are given

  8. Public views on principles for health care priority setting: findings of a European cross-country study using Q methodology.

    Science.gov (United States)

    van Exel, Job; Baker, Rachel; Mason, Helen; Donaldson, Cam; Brouwer, Werner

    2015-02-01

    Resources available to the health care sector are finite and typically insufficient to fulfil all the demands for health care in the population. Decisions must be made about which treatments to provide. Relatively little is known about the views of the general public regarding the principles that should guide such decisions. We present the findings of a Q methodology study designed to elicit the shared views in the general public across ten countries regarding the appropriate principles for prioritising health care resources. In 2010, 294 respondents rank ordered a set of cards and the results of these were subject to by-person factor analysis to identify common patterns in sorting. Five distinct viewpoints were identified, (I) "Egalitarianism, entitlement and equality of access"; (II) "Severity and the magnitude of health gains"; (III) "Fair innings, young people and maximising health benefits"; (IV) "The intrinsic value of life and healthy living"; (V) "Quality of life is more important than simply staying alive". Given the plurality of views on the principles for health care priority setting, no single equity principle can be used to underpin health care priority setting. Hence, the process of decision making becomes more important, in which, arguably, these multiple perspectives in society should be somehow reflected. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, D.; Brunett, A.; Passerini, S.; Grelle, A.; Bucknor, M.

    2017-06-26

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  10. Advances in ranking and selection, multiple comparisons, and reliability methodology and applications

    CERN Document Server

    Balakrishnan, N; Nagaraja, HN

    2007-01-01

    S. Panchapakesan has made significant contributions to ranking and selection and has published in many other areas of statistics, including order statistics, reliability theory, stochastic inequalities, and inference. Written in his honor, the twenty invited articles in this volume reflect recent advances in these areas and form a tribute to Panchapakesan's influence and impact on these areas. Thematically organized, the chapters cover a broad range of topics from: Inference; Ranking and Selection; Multiple Comparisons and Tests; Agreement Assessment; Reliability; and Biostatistics. Featuring

  11. Seismic reliability assessment methodology for CANDU concrete containment structures-phase II

    International Nuclear Information System (INIS)

    Hong, H.P.

    1996-07-01

    This study was undertaken to verify a set of load factors for reliability-based seismic evaluation of CANDU containment structures in Eastern Canada. Here, the new, site-specific, results of probabilistic seismic hazard assessment (response spectral velocity) were applied. It was found that the previously recommended load factors are relatively insensitive to the new seismic hazard information, and are adequate for a reliability-based seismic evaluation process. (author). 4 refs., 5 tabs., 9 figs

  12. Methodological principles to study formation and development of floristic law in Ukraine

    Directory of Open Access Journals (Sweden)

    А. К. Соколова

    2014-06-01

    The paper investigates the problems associated with determining the methods for studying the establishment of floristic law in Ukraine. It examines the types of methods and establishes their interrelation and functional value. In addition, it analyzes the system of methodological grounds for the development of ecological and floristic law and provides additional grounds.

  13. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on Reliability, what is it, why it is needed, how it is achieved and measured, the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter, FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants are covered next. CREDO, the Central Reliability Data Organization is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability Data analysis system. Reliability data banks at Electricite de France and IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks, - friend foe or a waste of time'? and future developments. (UK)

  14. Operationalising the Lean principles in maternity service design using 3P methodology.

    Science.gov (United States)

    Smith, Iain

    2016-01-01

    The last half century has seen significant changes to Maternity services in England. Though rates of maternal and infant mortality have fallen to very low levels, this has been achieved largely through hospital admission. It has been argued that maternity services may have become over-medicalised and service users have expressed a preference for more personalised care. NHS England's national strategy sets out a vision for a modern maternity service that continues to deliver safe care whilst also adopting the principles of personalisation. Therefore, there is a need to develop maternity services that balance safety with personal choice. To address this challenge, a maternity unit in North East England considered improving their service through refurbishment or building new facilities. Using a design process known as the production preparation process (or 3P), the Lean principles of understanding user value, mapping value-streams, creating flow, developing pull processes and continuous improvement were applied to the design of a new maternity department. Multiple stakeholders were engaged in the design through participation in a time-out (3P) workshop in which an innovative pathway and facility for maternity services were co-designed. The team created a hybrid model that they described as "wrap around care" in which the Lean concept of pull was applied to create a service and facility design in which expectant mothers were put at the centre of care with clinicians, skills, equipment and supplies drawn towards them in line with acuity changes as needed. Applying the Lean principles using the 3P method helped stakeholders to create an innovative design in line with the aspirations and objectives of the National Maternity Review. The case provides a practical example of stakeholders applying the Lean principles to maternity services and demonstrates the potential applicability of the Lean 3P approach to design healthcare services in line with policy requirements.

  15. Developing "Personality" Taxonomies: Metatheoretical and Methodological Rationales Underlying Selection Approaches, Methods of Data Generation and Reduction Principles.

    Science.gov (United States)

    Uher, Jana

    2015-12-01

    Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".

  16. A reliability-based preventive maintenance methodology for the projection spot welding machine

    Directory of Open Access Journals (Sweden)

    Fayzimatov Ulugbek

    2018-06-01

    Effective operation of a projection spot welding (PSW) machine is closely related to the effectiveness of its maintenance. Timely maintenance can prevent failures and improve the reliability and maintainability of the machine. Therefore, establishing the maintenance frequency for the welding machine is one of the most important tasks for plant engineers. In this regard, reliability analysis of the welding machine can be used to establish preventive maintenance intervals (PMI) and to identify the critical parts of the system. In this reliability and maintainability study, an analysis of the PSW machine was carried out. The failure and repair data for the analysis were obtained from an automobile manufacturing company located in Uzbekistan. The machine was divided into three main sub-systems: electrical, pneumatic and hydraulic. Different distribution functions were tested for each sub-system and their parameters tabulated. Based on the estimated parameters of the analyzed distributions, PMIs for the PSW machine's sub-systems were calculated at different reliability levels. Finally, preventive measures for enhancing the reliability of the PSW machine's sub-systems are suggested.
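
One common way to turn a fitted failure distribution into a PMI, as described in this record, is to invert the reliability function at a target level. A sketch assuming a two-parameter Weibull fit; the shape and scale values in the usage note are hypothetical, not the study's estimates.

```python
import math

def weibull_reliability(t, beta, eta):
    """Reliability R(t) = exp(-(t/eta)**beta) for a two-parameter
    Weibull model with shape beta and scale eta."""
    return math.exp(-((t / eta) ** beta))

def preventive_maintenance_interval(r_target, beta, eta):
    """Time at which reliability falls to r_target, i.e. the PMI:
    solve r_target = exp(-(t/eta)**beta) for t."""
    return eta * (-math.log(r_target)) ** (1.0 / beta)
```

For instance, with an assumed shape beta = 2 and scale eta = 1000 h, maintaining 90% reliability requires intervention roughly every 325 h.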

  17. Methodology for time-dependent reliability analysis of accident sequences and complex reactor systems

    International Nuclear Information System (INIS)

    Paula, H.M.

    1984-01-01

    The work presented here is of direct use in probabilistic risk assessment (PRA) and is of value to utilities as well as the Nuclear Regulatory Commission (NRC). Specifically, this report presents a methodology and a computer program to calculate the expected number of occurrences for each accident sequence in an event tree. The methodology evaluates the time-dependent (instantaneous) and the average behavior of the accident sequence. The methodology accounts for standby safety system and component failures that occur (a) before they are demanded, (b) upon demand, and (c) during the mission (system operation). With respect to failures that occur during the mission, this methodology is unique in the sense that it models components that can be repaired during the mission. The expected number of system failures during the mission provides an upper bound for the probability of a system failure to run - the mission unreliability. The basic event modeling includes components that are continuously monitored, periodically tested, and those that are not tested or are otherwise nonrepairable. The computer program ASA allows practical applications of the method developed. This work represents a required extension of the presently available methodology and allows a more realistic PRA of nuclear power plants
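
For the periodically tested standby components discussed in this record, the time-averaged unavailability over a test interval has a simple closed form; a minimal sketch (the rate and interval in the test are illustrative, not from the report):

```python
import math

def standby_unavailability(lam, test_interval):
    """Average unavailability over one test interval T of a standby
    component with constant standby failure rate lam, assuming the
    component is as-good-as-new after each test:
        q(t)  = 1 - exp(-lam*t)
        q_avg = (1/T) * integral_0^T q(t) dt
              = 1 - (1 - exp(-lam*T)) / (lam*T)
    which reduces to the familiar lam*T/2 approximation for lam*T << 1.
    """
    x = lam * test_interval
    return 1.0 - (1.0 - math.exp(-x)) / x
```

Values like this feed the "failure before demand" contribution of a standby safety system into the accident-sequence quantification.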

  18. More Than Just a Discursive Practice? Conceptual Principles and Methodological Aspects of Dispositif Analysis

    Directory of Open Access Journals (Sweden)

    Andrea D. Bührmann

    2007-05-01

    This article gives an introduction to the conceptual and practical field of dispositif analysis, a field that is of great importance but as yet underdeveloped. To provide this introduction, we first explain the terms discourse and dispositif. Then we examine the conceptual instruments and methodological procedures of dispositif analysis. In this way, we define the relations between discourse and (a) non-discursive practices, (b) subjectification, (c) everyday orders of knowledge and (d) institutional practices like societal changes as central issues of dispositif analysis. Furthermore, we point out the methodological possibilities and limitations of dispositif analysis. We demonstrate these possibilities and limitations with some practical examples. In general, this article aims to provide an extension of the perspectives of discourse theory and research by stressing the relations between normative orders of knowledge, their effects on interactions and the individual self-reflections connected with them. URN: urn:nbn:de:0114-fqs0702281

  19. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  20. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    International Nuclear Information System (INIS)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-01

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  1. Methodology for sodium fire vulnerability assessment of sodium cooled fast reactor based on the Monte-Carlo principle

    International Nuclear Information System (INIS)

    Song, Wei; Wu, Yuanyu; Hu, Wenjun; Zuo, Jiaxu

    2015-01-01

    Highlights: • Monte-Carlo principle coupling with fire dynamic code is adopted to perform sodium fire vulnerability assessment. • The method can be used to calculate the failure probability of sodium fire scenarios. • A calculation example and results are given to illustrate the feasibility of the methodology. • Some critical parameters and experience are shared. - Abstract: Sodium fire is a typical and distinctive hazard in sodium cooled fast reactors, which is significant for nuclear safety. In this paper, a method of sodium fire vulnerability assessment based on the Monte-Carlo principle was introduced, which could be used to calculate the probabilities of every failure mode in sodium fire scenarios. After that, the sodium fire scenario vulnerability assessment of primary cold trap room of China Experimental Fast Reactor was performed to illustrate the feasibility of the methodology. The calculation result of the example shows that the conditional failure probability of key cable is 23.6% in the sodium fire scenario which is caused by continuous sodium leakage because of the isolation device failure, but the wall temperature, the room pressure and the aerosol discharge mass are all lower than the safety limits.
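The record above couples Monte-Carlo sampling with a fire dynamics code. A minimal sketch of that sampling loop is shown below; the surrogate `cable_temperature` correlation, the leak-rate and room-volume distributions, and the 400 °C cable limit are all invented assumptions standing in for the real fire code and plant data.

```python
import random

def cable_temperature(leak_rate_kg_s, room_volume_m3):
    """Surrogate consequence model standing in for a fire dynamics code.
    Returns a hypothetical peak cable temperature (deg C)."""
    # Purely illustrative correlation: more sodium and less room volume -> hotter.
    return 120.0 + 900.0 * leak_rate_kg_s / (1.0 + 0.001 * room_volume_m3)

def failure_probability(n_trials=100_000, temp_limit_c=400.0, seed=42):
    """Estimate P(cable temperature exceeds its limit) by Monte-Carlo sampling
    of the uncertain scenario parameters."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        leak_rate = rng.lognormvariate(mu=-1.0, sigma=0.5)   # kg/s, assumed
        room_volume = rng.uniform(800.0, 1200.0)             # m3, assumed
        if cable_temperature(leak_rate, room_volume) > temp_limit_c:
            failures += 1
    return failures / n_trials
```

Each trial plays the role of one sodium fire scenario realization; the real assessment would replace the one-line surrogate with a full fire dynamics calculation per sample.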

  2. Methodology for sodium fire vulnerability assessment of sodium cooled fast reactor based on the Monte-Carlo principle

    Energy Technology Data Exchange (ETDEWEB)

    Song, Wei [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China); Wu, Yuanyu [ITER Organization, Route de Vinon-sur-Verdon, 13115 Saint-Paul-lès-Durance (France); Hu, Wenjun [China Institute of Atomic Energy, P. O. Box 275(34), Beijing (China); Zuo, Jiaxu, E-mail: zuojiaxu@chinansc.cn [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China)

    2015-11-15

    Highlights: • Monte-Carlo principle coupling with fire dynamic code is adopted to perform sodium fire vulnerability assessment. • The method can be used to calculate the failure probability of sodium fire scenarios. • A calculation example and results are given to illustrate the feasibility of the methodology. • Some critical parameters and experience are shared. - Abstract: Sodium fire is a typical and distinctive hazard in sodium cooled fast reactors, which is significant for nuclear safety. In this paper, a method of sodium fire vulnerability assessment based on the Monte-Carlo principle was introduced, which could be used to calculate the probabilities of every failure mode in sodium fire scenarios. After that, the sodium fire scenario vulnerability assessment of primary cold trap room of China Experimental Fast Reactor was performed to illustrate the feasibility of the methodology. The calculation result of the example shows that the conditional failure probability of key cable is 23.6% in the sodium fire scenario which is caused by continuous sodium leakage because of the isolation device failure, but the wall temperature, the room pressure and the aerosol discharge mass are all lower than the safety limits.

  3. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

This paper discusses a new technique developed for generating well-defined RF large-voltage-swing signals for on-wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  4. The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale

    Science.gov (United States)

    Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine

    2013-01-01

Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…

  5. A first-principles generic methodology for representing the knowledge base of a process diagnostic expert system

    International Nuclear Information System (INIS)

    Reifman, J.; Briggs, L.L.; Wei, T.Y.C.

    1990-01-01

In this paper we present a methodology for identifying faulty component candidates of process malfunctions through basic physical principles of conservation, functional classification of components and information from the process schematics. The basic principles of macroscopic balance of mass, momentum and energy in thermal-hydraulic control volumes are applied in a novel approach to incorporate deep knowledge into the knowledge base. Additional deep knowledge is incorporated through the functional classification of process components according to their influence in disturbing the macroscopic balance equations. Information from the process schematics is applied to identify the faulty component candidates after the type of imbalance in the control volumes is matched against the functional classification of the components. Except for the information from the process schematics, this approach is completely general and independent of the process under consideration. The use of basic first principles, which are physically correct, and the process-independent architecture of the diagnosis procedure allow for the verification and validation of the system. A prototype process diagnosis expert system is developed and a test problem is presented to identify faulty component candidates in the presence of a single failure in a hypothetical balance of plant of a liquid metal nuclear reactor plant.
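The balance-equation reasoning described above can be sketched as follows: compute the macroscopic mass-balance residual of each control volume and, where an imbalance is found, match it against the functional classification of components. The control volumes, component names, flow values and tolerance below are hypothetical, not taken from the paper.

```python
# Hypothetical plant description: control volumes with measured in/out mass
# flows (kg/s), and components classified by which balance they can disturb.
control_volumes = {
    "feedwater_train": {"flow_in": 50.0, "flow_out": 46.5},
    "steam_generator": {"flow_in": 46.5, "flow_out": 46.4},
}
component_classification = {
    "feedwater_pump":  {"volume": "feedwater_train", "disturbs": "mass"},
    "drain_valve_V12": {"volume": "feedwater_train", "disturbs": "mass"},
    "sg_heater":       {"volume": "steam_generator", "disturbs": "energy"},
}

def faulty_candidates(volumes, classification, tolerance=0.5):
    """Return components whose functional class matches an imbalanced volume."""
    candidates = []
    for name, v in volumes.items():
        residual = v["flow_in"] - v["flow_out"]   # macroscopic mass balance
        if abs(residual) > tolerance:             # imbalance detected
            candidates += [c for c, info in classification.items()
                           if info["volume"] == name and info["disturbs"] == "mass"]
    return candidates
```

A fuller sketch would carry momentum and energy residuals as well, mirroring the three macroscopic balances the paper applies.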

  6. [Methodologic developmental principles of standardized surveys within the scope of social gerontologic studies].

    Science.gov (United States)

    Bansemir, G

    1987-01-01

The conception and evaluation of standardized oral or written questioning as quantifying instruments of research are oriented by the basic premises of the Marxist-Leninist theory of knowledge and of general scientific logic. In the present contribution the socio-gerontological research process is outlined in extracts. By referring to the intrinsic connection between some of its essential components (problem, formation of hypotheses, obtaining of indicators/measurement, preliminary examination, evaluation), as well as to typical errors and (fictitious) examples from research practice, this contribution contrasts the natural, apparently uncomplicated course of structured questioning with its qualitative methodological foundations and demands.

  7. An evaluation of the reliability and usefulness of external-initiator PRA [probabilistic risk analysis] methodologies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.

    1990-01-01

    The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally ''mature,'' and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab

  8. An evaluation of the reliability and usefulness of external-initiator PRA (probabilistic risk analysis) methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Budnitz, R.J.; Lambert, H.E. (Future Resources Associates, Inc., Berkeley, CA (USA))

    1990-01-01

The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally ''mature,'' and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab.

  9. A reliability as an independent variable (RAIV) methodology for optimizing test planning for liquid rocket engines

    Science.gov (United States)

    Strunz, Richard; Herrmann, Jeffrey W.

    2011-12-01

The hot-fire test strategy for liquid rocket engines has always been a concern of space industry and agencies alike because no recognized standard exists. Previous hot-fire test plans focused on the verification of performance requirements but did not explicitly include reliability as a dimensioning variable. The stakeholders are, however, concerned about a hot-fire test strategy that balances reliability, schedule, and affordability. A multiple-criteria test planning model is presented that provides a framework to optimize the hot-fire test strategy with respect to stakeholder concerns. The Staged Combustion Rocket Engine Demonstrator, a program of the European Space Agency, is used as an example to provide a quantitative answer to the claim that a reduced-thrust-scale demonstrator is cost-beneficial for a subsequent flight engine development. Scalability aspects of major subsystems are considered in the prior-information definition inside the Bayesian framework. The model is also applied to assess the impact of an increase in the demonstrated reliability level on schedule and affordability.

  10. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters presents a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on the mechanism of contamination failure and the material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimate of the SPTRs using the Weibull distribution. The proposed methodology enables the reliability of SPTRs designed for more than 10 years of operation to be estimated in less than one year of testing.
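The Arrhenius extrapolation and Weibull reliability steps described above can be sketched as follows; the activation energy, temperatures and Weibull parameters in the example are invented for illustration and are not the paper's fitted values.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_k, t_stress_k):
    """Arrhenius acceleration factor between use and stress temperatures:
    AF = exp((Ea/k) * (1/T_use - 1/T_stress))."""
    return math.exp((ea_ev / K_B_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

def use_condition_lifetime(stress_lifetime_h, ea_ev, t_use_k, t_stress_k):
    """Project a lifetime observed at elevated temperature to use conditions."""
    return stress_lifetime_h * acceleration_factor(ea_ev, t_use_k, t_stress_k)

def weibull_reliability(t_h, eta_h, beta):
    """Weibull reliability R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t_h / eta_h) ** beta))
```

With an assumed 0.5 eV activation energy, raising the ambient temperature from 20 °C to 60 °C accelerates the contamination-driven degradation roughly tenfold, which is the mechanism that compresses a multi-year demonstration into under a year of testing.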

  11. Study on seismic reliability for foundation grounds and surrounding slopes of nuclear power plants. Proposal of evaluation methodology and integration of seismic reliability evaluation system

    International Nuclear Information System (INIS)

    Ohtori, Yasuki; Kanatani, Mamoru

    2006-01-01

This paper proposes an evaluation methodology for the annual probability of failure of soil structures subjected to earthquakes and integrates an analysis system for the seismic reliability of soil structures. The method is based on margin analysis, which evaluates the ground motion level at which a structure is damaged. First, a ground motion index that is strongly correlated with damage to, or response of, the specific structure is selected. The ultimate strength in terms of the selected ground motion index is then evaluated. Next, the variation of soil properties is taken into account in the evaluation of the seismic stability of structures. The variation of the safety factor (SF) is evaluated and then converted into the variation of the specific ground motion index. Finally, the fragility curve is developed and the annual probability of failure is evaluated in combination with the seismic hazard curve. The system facilitates the assessment of seismic reliability: a generator of random numbers, a dynamic analysis program and a stability analysis program are incorporated into one package. Once a structural model, the distribution of the soil properties, the input ground motions and so forth are defined, a list of safety factors for each sliding line is obtained. Monte Carlo simulation (MCS), Latin hypercube sampling (LHS), the point estimation method (PEM) and the first-order second-moment method (FOSM) implemented in this system are also introduced. As numerical examples, a foundation ground and a surrounding slope are assessed using the proposed method and the integrated system. (author)
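The final step, combining a fragility curve with a seismic hazard curve into an annual probability of failure, can be sketched numerically. The lognormal fragility and power-law hazard curve below are common modeling assumptions, not the paper's actual curves.

```python
import math

def lognormal_fragility(a, median, beta):
    """P(failure | ground motion a): lognormal CDF with median and
    log-standard-deviation beta."""
    return 0.5 * (1.0 + math.erf(math.log(a / median) / (beta * math.sqrt(2.0))))

def annual_exceedance(a, k0=1e-4, k=2.0):
    """Hypothetical power-law hazard curve H(a) = k0 * a**(-k), a in g."""
    return k0 * a ** (-k)

def annual_failure_probability(median=0.6, beta=0.4, a_min=0.05, a_max=3.0, n=2000):
    """P_f = integral of P(fail | a) * |dH/da| da, evaluated by summing the
    hazard mass shed in each ground-motion bin times the mid-bin fragility."""
    da = (a_max - a_min) / n
    total = 0.0
    for i in range(n):
        a0, a1 = a_min + i * da, a_min + (i + 1) * da
        hazard_mass = annual_exceedance(a0) - annual_exceedance(a1)  # |dH/da|*da
        total += lognormal_fragility(0.5 * (a0 + a1), median, beta) * hazard_mass
    return total
```

The result is the annual frequency at which the site sees a ground motion strong enough to fail the structure; it is necessarily smaller than the hazard at the lowest ground motion considered.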

  12. Living with uncertainty: from the precautionary principle to the methodology of ongoing normative assessment

    International Nuclear Information System (INIS)

    Dupuy, J.P.; Grinbaum, A.

    2005-01-01

The analysis of our epistemic situation regarding singular events, such as abrupt climate change, shows essential limitations in the traditional modes of dealing with uncertainty. Typical cognitive barriers lead to paralysis of action. What is needed is to take the reality of the future seriously. We argue for the application of the methodology of ongoing normative assessment. We show that it is, paradoxically, a matter of forming a project on the basis of a fixed future which one does not want, and this in a coordinated way at the level of social institutions. Ongoing assessment may be viewed as a prescription to live with uncertainty, in a particular sense of the term, in order for a future catastrophe not to occur. The assessment is necessarily normative in that it must include the anticipation of a retrospective ethical judgment on present choices (the notion of moral luck). (authors)

  13. [The system theory of aging: methodological principles, basic tenets and applications].

    Science.gov (United States)

    Krut'ko, V N; Dontsov, V I; Zakhar'iashcheva, O V

    2009-01-01

The paper deals with the system theory of aging constructed on the basis of present-day scientific methodology, the system approach. The fundamental cause of aging is the discrete existence of individual life forms, i.e. living organisms which, from the thermodynamic point of view, are not completely open systems. The primary aging process (build-up of chaos and system disintegration of the aging organism) obeys the second law of thermodynamics, the law of entropy increase in individual, partly open systems. In living organisms the law is exhibited as the synergy of four main aging mechanisms: systemic "pollution" of the organism; loss of non-regenerative elements; accumulation of damages and deformations, generating variability on all levels; and negative changes in regulation processes with consequent degradation of the organism's systemic character. These are the general aging mechanisms; the regulatory mechanisms, however, may be equally important for organism aging and for the search for ways to prolong active life.

  14. First principles calculations using density matrix divide-and-conquer within the SIESTA methodology

    International Nuclear Information System (INIS)

    Cankurtaran, B O; Gale, J D; Ford, M J

    2008-01-01

    The density matrix divide-and-conquer technique for the solution of Kohn-Sham density functional theory has been implemented within the framework of the SIESTA methodology. Implementation details are provided where the focus is on the scaling of the computation time and memory use, in both serial and parallel versions. We demonstrate the linear-scaling capabilities of the technique by providing ground state calculations of moderately large insulating, semiconducting and (near-) metallic systems. This linear-scaling technique has made it feasible to calculate the ground state properties of quantum systems consisting of tens of thousands of atoms with relatively modest computing resources. A comparison with the existing order-N functional minimization (Kim-Mauri-Galli) method is made between the insulating and semiconducting systems

  15. The improving of methodological principles of enterprise competitiveness management under the crisis

    Directory of Open Access Journals (Sweden)

    Marina Dyadyuk

    2016-12-01

The purpose of this research is to improve methodological bases and to form practical tools for enterprise competitiveness management under crisis conditions. The article examines the specific features of the competitive environment of enterprises in Ukraine under the global and national crisis. From this it is concluded that, to obtain and maintain competitive advantages in the current period of global instability, any enterprise must have a greater degree of flexibility than in periods of stability or economic growth. Flexibility and adaptability of the economic system are the main prerequisites for obtaining and developing enterprise competitive advantages and a core component of competitiveness. On the basis of a systematic approach, and taking into account the views of other scientists, we identify and characterize the methodological components of the adaptive management process. The results obtained form the basis for a conceptual model of an integrated system of enterprise adaptive management in a dynamic and uncertain environment. We propose to implement this kind of control on three levels: strategic (preventive management), functional (tactical management) and operational (symptomatic management), which together will ensure effective adaptation at the macroeconomic, meso and micro levels of management. The main purpose of the proposed integrated management system is to ensure the stability and integrity of enterprise activity under variability and uncertainty of the environment. Implementing such a management system provides the enterprise with certain competitive advantages and will allow Ukrainian enterprises not only to maintain their competitive position under unfavorable external conditions but also to maintain and improve their competitiveness.

  16. Methodologic principles of regulation of patient irradiation during X-ray investigations

    International Nuclear Information System (INIS)

    Vorob'ev, E.I.; Knizhnikov, V.A.; Stavitskij, R.V.; Barkhudarov, R.M.; Lyass, F.M.; Lebedev, L.A.

    1986-01-01

Attention is paid to the regulation of radiation doses in biomedical radiography on the basis of the cost-benefit concept, i.e. using the optimization principle. Conceptual bases and main dose limits for roentgenological investigations are suggested. The radiation doses and diagnostic value of roentgenological investigations differ between patients and depend on the type and nature of the disease. Three categories of patients subjected to roentgenological examinations are distinguished: A_d, persons having an oncologic disease or suspected of one; B_d, persons examined to establish a diagnosis and to select a therapeutic method in connection with non-oncologic diseases; C_d, persons subjected to procedures with both preventive and investigative purposes. For persons in category A_d the radiation dose to any organ should not exceed 0.5 Sv (0.3 Sv to the eye lens); for category B_d, 1/3 of the maximum permissible dose to personnel envisaged in the radiation protection guides (RPG-76); for category C_d, 0.5 Sv/year. Control levels of dose equivalents limiting the exposure of different organs in routine roentgenological procedures, obtained by dosimetric investigations using a tissue-equivalent phantom of a reference man, are presented.

  17. Natural Circulation in Water Cooled Nuclear Power Plants Phenomena, models, and methodology for system reliability assessments

    Energy Technology Data Exchange (ETDEWEB)

    Jose Reyes

    2005-02-14

In recent years it has been recognized that the application of passive safety systems (i.e., those whose operation takes advantage of natural forces such as convection and gravity), can contribute to simplification and potentially to improved economics of new nuclear power plant designs. In 1991 the IAEA Conference on ''The Safety of Nuclear Power: Strategy for the Future'' noted that for new plants ''the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate''.

  18. Natural Circulation in Water Cooled Nuclear Power Plants Phenomena, models, and methodology for system reliability assessments

    International Nuclear Information System (INIS)

    Jose Reyes

    2005-01-01

In recent years it has been recognized that the application of passive safety systems (i.e., those whose operation takes advantage of natural forces such as convection and gravity), can contribute to simplification and potentially to improved economics of new nuclear power plant designs. In 1991 the IAEA Conference on ''The Safety of Nuclear Power: Strategy for the Future'' noted that for new plants ''the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate''.

  19. Circular instead of hierarchical: methodological principles for the evaluation of complex interventions

    Directory of Open Access Journals (Sweden)

    Fønnebø Vinjar

    2006-06-01

Abstract Background The reasoning behind evaluating medical interventions is that a hierarchy of methods exists which successively produces improved, and therefore more rigorous, evidence upon which to make clinical decisions. At the foundation of this hierarchy are case studies, retrospective and prospective case series, followed by cohort studies with historical and concomitant non-randomized controls. Open-label randomized controlled studies (RCTs), and finally blinded, placebo-controlled RCTs, which offer the most internal validity, are considered the most reliable evidence. Rigorous RCTs remove bias. Evidence from RCTs forms the basis of meta-analyses and systematic reviews. This hierarchy, founded on a pharmacological model of therapy, is generalized to other interventions which may be complex and non-pharmacological (healing, acupuncture and surgery). Discussion The hierarchical model is valid for limited questions of efficacy, for instance for regulatory purposes and newly devised products and pharmacological preparations. It is inadequate for the evaluation of complex interventions such as physiotherapy, surgery and complementary and alternative medicine (CAM). This has to do with the essential tension between internal validity (rigor and the removal of bias) and external validity (generalizability). Summary Instead of an evidence hierarchy, we propose a Circular Model. This implies a multiplicity of methods, using different designs, counterbalancing their individual strengths and weaknesses to arrive at pragmatic but equally rigorous evidence which would provide significant assistance in clinical and health-systems innovation. Such evidence would better inform national health care technology assessment agencies and promote evidence-based health reform.

  20. Methodological principles for optimising functional MRI experiments; Methodische Grundlagen der Optimierung funktioneller MR-Experimente

    Energy Technology Data Exchange (ETDEWEB)

    Wuestenberg, T. [Georg-August-Universitaet Goettingen, Abteilung fuer Medizinische Psychologie (Germany); Georg-August-Universitaet, Abteilung fuer Medizinische Psychologie, Goettingen (Germany); Giesel, F.L. [Deutsches Kebsforschungszentrum (DKFZ) Heidelberg, Abteilung fuer Radiologische Diagnostik (Germany); Strasburger, H. [Georg-August-Universitaet Goettingen, Abteilung fuer Medizinische Psychologie (Germany)

    2005-02-01

Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, the optimisation of certain experimental parameters allows reliable results to be obtained. In this article, approaches for optimising the experimental design, imaging parameters and analytic strategies are discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.) [German abstract, translated: Functional magnetic resonance imaging (fMRI) of the central nervous system is one of the most widely used methods for localising neuronal activity in the brain. Although the sensitivity of fMRI is comparatively low, selecting suitable experimental parameters can increase the sensitivity of this imaging technique and ensure the reliability of the results. This article therefore discusses approaches for optimising the paradigm design, the MR imaging and the data analysis, providing clinical researchers and interested physicians with guidelines for conducting effective fMRI experiments. (orig.)]

  1. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data is incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses in different local communities. This method enables the researcher to quantify the knowledge on plant use that has been preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.
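The abstract does not spell the Matrix Method out; one plausible reading, sketched here under that assumption, is a participant-by-taxon matrix in which each plant is prompted explicitly (which is how such a survey avoids false negatives) and relative importance is the fraction of participants reporting a use. All participant labels, species and scores below are hypothetical.

```python
# Hypothetical use-report matrix: rows are participants, columns are plant
# taxa, entries are 1 if the participant reported a use for that plant when
# prompted with it (prompting per taxon reduces false negatives).
participants = ["p1", "p2", "p3", "p4"]
plants = ["Sutherlandia frutescens", "Artemisia afra", "Agathosma betulina"]
matrix = [
    [1, 1, 0],
    [1, 1, 1],
    [1, 0, 0],
    [1, 1, 0],
]

def relative_importance(matrix):
    """Fraction of participants reporting each plant: a simple, objective
    index of how widely knowledge of the plant is preserved in a community."""
    n = len(matrix)
    return [sum(row[j] for row in matrix) / n for j in range(len(matrix[0]))]
```

Computing the same index in two communities then gives a direct, quantitative comparison of how well knowledge of each plant has been preserved in each.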

  2. A Methodology and Toolkit for Deploying Reliable Security Policies in Critical Infrastructures

    Directory of Open Access Journals (Sweden)

    Faouzi Jaïdi

    2018-01-01

    Full Text Available Substantial advances in Information and Communication Technologies (ICT bring out novel concepts, solutions, trends, and challenges to integrate intelligent and autonomous systems in critical infrastructures. A new generation of ICT environments (such as smart cities, Internet of Things, edge-fog-social-cloud computing, and big data analytics is emerging; it has different applications to critical domains (such as transportation, communication, finance, commerce, and healthcare and different interconnections via multiple layers of public and private networks, forming a grid of critical cyberphysical infrastructures. Protecting sensitive and private data and services in critical infrastructures is, at the same time, a main objective and a great challenge for deploying secure systems. It essentially requires setting up trusted security policies. Unfortunately, security solutions should remain compliant and regularly updated to follow and track the evolution of security threats. To address this issue, we propose an advanced methodology for deploying and monitoring the compliance of trusted access control policies. Our proposal extends the traditional life cycle of access control policies with pertinent activities. It integrates formal and semiformal techniques allowing the specification, the verification, the implementation, the reverse-engineering, the validation, the risk assessment, and the optimization of access control policies. To automate and facilitate the practice of our methodology, we introduce our system SVIRVRO that allows managing the extended life cycle of access control policies. We refer to an illustrative example to highlight the relevance of our contributions.
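The compliance-monitoring idea above, comparing a specified access-control policy against the rule set reverse-engineered from the deployed system, can be sketched as a set comparison. The rule format and example rules are hypothetical and do not reflect SVIRVRO's actual interface.

```python
def compliance_report(specified, deployed):
    """Compare a specified access-control policy with the rule set
    reverse-engineered from the deployed system. Rules are modeled as
    (subject, action, resource) triples."""
    return {
        "missing": sorted(specified - deployed),   # specified but not enforced
        "hidden":  sorted(deployed - specified),   # enforced but never specified
        "compliant": specified == deployed,
    }

# Hypothetical example: one rule was never deployed, one crept in unspecified.
specified = {("nurse", "read", "record"), ("doctor", "write", "record")}
deployed  = {("nurse", "read", "record"), ("admin", "write", "record")}
```

In a real deployment the "hidden" rules are the risk-assessment input: each one is a privilege the system grants that the trusted policy never authorized.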

  3. Principle and methodology of nuclear power plant site selection. Application to radiocobalt cycle in the Rhone river

    International Nuclear Information System (INIS)

    Georges, J.

    1987-01-01

In a first, bibliographic part, after some generalities on radioactivity and nuclear power, the general principles of radiation protection and the national and international regulations are presented, and the methodology of the radioecological study involved in site selection is developed. In a second, more experimental part, the processing of radiocobalt gamma-radioactivity measurements in water, fishes, plants and Rhone river sediments demonstrates the influence of the age and geographical situation of the nuclear power stations located along the river. A laboratory experiment on cobalt-60 transfer from chironome larvae to carp is carried out. Comparison with the results of other laboratory experiments makes it possible to propose an experimental model of cobalt transfer within a freshwater ecosystem; the radioactivity levels calculated for the various compartments seem consistent with the Rhone river levels [fr]

  4. A methodology to aid in the design of naval steels: Linking first principles calculations to mesoscale modeling

    International Nuclear Information System (INIS)

    Spanos, G.; Geltmacher, A.B.; Lewis, A.C.; Bingert, J.F.; Mehl, M.; Papaconstantopoulos, D.; Mishin, Y.; Gupta, A.; Matic, P.

    2007-01-01

    This paper provides a brief overview of a multidisciplinary effort at the Naval Research Laboratory aimed at developing a computationally-based methodology to assist in the design of advanced Naval steels. This program uses multiple computational techniques ranging from the atomistic length scale to continuum response. First-principles electronic structure calculations using density functional theory were employed, semi-empirical angular dependent potentials were developed based on the embedded atom method, and these potentials were used as input into Monte-Carlo and molecular dynamics simulations. Experimental techniques have also been applied to a super-austenitic stainless steel (AL6XN) to provide experimental input, guidance, verification, and enhancements to the models. These experimental methods include optical microscopy, scanning electron microscopy, transmission electron microscopy, electron backscatter diffraction, and serial sectioning in conjunction with computer-based three-dimensional reconstruction and quantitative analyses. The experimental results are also used as critical input into mesoscale finite element models of materials response

  5. The methodology of population surveys of headache prevalence, burden and cost: Principles and recommendations from the Global Campaign against Headache

    Science.gov (United States)

    2014-01-01

    The global burden of headache is very large, but knowledge of it is far from complete and still needs to be gathered. Published population-based studies have used variable methodology, which has influenced findings and made comparisons difficult. Among the initiatives of the Global Campaign against Headache to improve and standardize the methods in use for cross-sectional studies, the most important is the production of consensus-based methodological guidelines. This report describes the development of detailed principles and recommendations. For this purpose we brought together an expert consensus group, drawn from all six WHO world regions, combining experience and competence in headache epidemiology and/or epidemiology in general. The recommendations presented are for anyone, of whatever background, with an interest in designing, performing, understanding or assessing studies that measure or describe the burden of headache in populations. While aimed principally at researchers whose main interests are in the field of headache, they should also be useful, at least in part, to those who are expert in public health or epidemiology and wish to extend their interest into the field of headache disorders. Most of all, these recommendations seek to encourage collaborations between specialists in headache disorders and epidemiologists. The focus is on migraine, tension-type headache and medication-overuse headache, but the recommendations are not intended to be exclusive to these. The burdens arising from secondary headaches are, in the majority of cases, more correctly attributed to the underlying disorders. Nevertheless, the principles outlined here are relevant for epidemiological studies on secondary headaches, provided that adequate definitions can not only be given but also applied in questionnaires or other survey instruments. PMID:24467862

  6. Life Cycle Assessment for desalination: a review on methodology feasibility and reliability.

    Science.gov (United States)

    Zhou, Jin; Chang, Victor W-C; Fane, Anthony G

    2014-09-15

    As concerns over the natural resource depletion and environmental degradation caused by desalination increase, research on the environmental sustainability of desalination is growing in importance. Life Cycle Assessment (LCA) is an ISO-standardized method and is widely applied to evaluate the environmental performance of desalination. This study reviews more than 30 desalination LCA studies published since the 2000s and identifies two major issues in need of improvement. The first is feasibility, covering three elements that support the application of LCA to desalination: accounting methods, supporting databases, and life cycle impact assessment approaches. The second is reliability, addressing three essential aspects that drive uncertainty in results: the incompleteness of the system boundary, the unrepresentativeness of the database, and the omission of uncertainty analysis. This work can serve as a preliminary LCA reference for desalination specialists and will also strengthen LCA as an effective method to evaluate the environmental footprint of desalination alternatives. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. An advanced human reliability analysis methodology: analysis of cognitive errors focused on

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

    Conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticized for their deficiency in analysing the cognitive errors which occur during the operator's decision-making process. In order to overcome this limitation of the conventional methods, an advanced HRA method, the so-called second-generation HRA method, including both qualitative analysis and quantitative assessment of cognitive errors, is being developed based on state-of-the-art theory from cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance influencing factors. The application of the proposed method to two emergency operation tasks is presented

  8. A note on the application of probabilistic structural reliability methodology to nuclear power plants

    International Nuclear Information System (INIS)

    Maurer, H.A.

    1978-01-01

    The interest shown in the general prospects of primary energy in European countries prompted a description of the actual European situation. Explanations of the need to install nuclear power plants in most countries of the European Communities are given. Activities of the Commission of the European Communities to initiate a progressive harmonization of already existing European criteria, codes and complementary requirements, in order to improve the structural reliability of components and systems of nuclear power plants, are summarized. Finally, the applicability of probabilistic safety analysis to facilitate decision-making on safety, by defining acceptable target and limit values coupled with a subjective estimate, as applied in the safety analyses performed in most European countries, is demonstrated. (Auth.)

  9. Reliability of case definitions for public health surveillance assessed by Round-Robin test methodology

    Directory of Open Access Journals (Sweden)

    Claus Hermann

    2006-05-01

    Full Text Available Abstract Background Case definitions have been recognized to be important elements of public health surveillance systems. They are meant to assure comparability and consistency of surveillance data and have a crucial impact on the sensitivity and the positive predictive value of a surveillance system. The reliability of case definitions has rarely been investigated systematically. Methods We conducted a Round-Robin test by asking all 425 local health departments (LHD) and the 16 state health departments (SHD) in Germany to classify a selection of 68 case examples using case definitions. By multivariate analysis we investigated factors linked to classification agreement with a gold standard, which was defined by an expert panel. Results A total of 7870 classifications were done by 396 LHD (93%) and all SHD. Reporting sensitivity was 90.0%, positive predictive value 76.6%. Polio case examples had the lowest reporting precision, salmonellosis case examples the highest (OR = 0.008; CI: 0.005–0.013). Case definitions with a check-list format of clinical criteria resulted in higher reporting precision than case definitions with a narrative description (OR = 3.08; CI: 2.47–3.83). Reporting precision was higher among SHD compared to LHD (OR = 1.52; CI: 1.14–2.02). Conclusion Our findings led to a systematic revision of the German case definitions and built the basis for general recommendations on the creation of case definitions. These include, among others, that testable yes/no criteria in a check-list format are likely to improve reliability, and that software used for data transmission should be designed in strict accordance with the case definitions. The findings of this study are largely applicable to case definitions in many other countries or international networks, as they shared the same structural and editorial characteristics as the case definitions evaluated in this study before their revision.
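The reporting sensitivity and positive predictive value cited in this record follow from cross-tabulating participants' classifications against the gold standard. A minimal sketch of the arithmetic (the tallies below are hypothetical, chosen only to reproduce the reported 90.0% and 76.6%; they are not the study's raw counts):

```python
def sensitivity_ppv(tp, fp, fn):
    """Score classifications against a gold standard.

    tp: cases correctly classified as reportable (true positives)
    fp: cases wrongly classified as reportable (false positives)
    fn: true cases that were missed (false negatives)
    """
    sensitivity = tp / (tp + fn)  # correctly reported cases / all true cases
    ppv = tp / (tp + fp)          # correctly reported cases / all reported cases
    return sensitivity, ppv

# Hypothetical tallies, for illustration only
sens, ppv = sensitivity_ppv(tp=900, fp=275, fn=100)
print(f"sensitivity={sens:.1%}, PPV={ppv:.1%}")  # sensitivity=90.0%, PPV=76.6%
```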

  10. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Ha, J. H.; Kim, M. K.; Chung, B. S.; Oh, H. C.; Seo, M. R.

    2007-01-01

    Analog I and C systems have been replaced by digital I and C systems because the digital systems have many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drifts, have higher data handling and storage capabilities, and provide improved performance through their accuracy and computational capabilities. In addition, analog replacement parts become more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into nuclear power plants, because digital systems are more complex than analog systems and their operation and failure modes are different. In particular, software, which can be the core of functionality in digital systems, does not wear out physically like hardware, and its failure modes are not yet defined clearly. Thus, research to develop methodologies for software reliability assessment is still proceeding in safety-critical areas such as nuclear systems, aerospace and medical devices. Among them, a software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified, and requirements for its application to the digital I and C systems are considered in this study

  11. Natural circulation in water cooled nuclear power plants: Phenomena, models, and methodology for system reliability assessments

    International Nuclear Information System (INIS)

    2005-11-01

    In recent years it has been recognized that the application of passive safety systems (i.e. those whose operation takes advantage of natural forces such as convection and gravity), can contribute to simplification and potentially to improved economics of new nuclear power plant designs. Further, the IAEA Conference on The Safety of Nuclear Power: Strategy for the Future which was convened in 1991 noted that for new plants 'the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate'. Considering the weak driving forces of passive systems based on natural circulation, careful design and analysis methods must be employed to assure that the systems perform their intended functions. To support the development of advanced water cooled reactor designs with passive systems, investigations of natural circulation are an ongoing activity in several IAEA Member States. Some new designs also utilize natural circulation as a means to remove core power during normal operation. In response to the motivating factors discussed above, and to foster international collaboration on the enabling technology of passive systems that utilize natural circulation, an IAEA Coordinated Research Project (CRP) on Natural Circulation Phenomena, Modelling and Reliability of Passive Systems that Utilize Natural Circulation was started in early 2004. Building on the shared expertise within the CRP, this publication presents extensive information on natural circulation phenomena, models, predictive tools and experiments that currently support design and analyses of natural circulation systems and highlights areas where additional research is needed. Therefore, this publication serves both to provide a description of the present state of knowledge on natural circulation in water cooled nuclear power plants and to guide the planning and conduct of the CRP in

  12. Evaluation of validity and reliability of a methodology for measuring human postural attitude and its relation to temporomandibular joint disorders

    Science.gov (United States)

    Fernández, Ramón Fuentes; Carter, Pablo; Muñoz, Sergio; Silva, Héctor; Venegas, Gonzalo Hernán Oporto; Cantin, Mario; Ottone, Nicolás Ernesto

    2016-01-01

    INTRODUCTION Temporomandibular joint disorders (TMJDs) are caused by several factors such as anatomical, neuromuscular and psychological alterations. A relationship has been established between TMJDs and postural alterations, a type of anatomical alteration. An anterior position of the head requires hyperactivity of the posterior neck region and shoulder muscles to prevent the head from falling forward. This compensatory muscular function may cause fatigue, discomfort and trigger point activation. To our knowledge, a method for assessing human postural attitude in more than one plane has not been reported. Thus, the aim of this study was to design a methodology to measure the external human postural attitude in the frontal and sagittal planes, with proper validity and reliability analyses. METHODS The postural variables of 78 subjects (36 men, 42 women; ages 18–24 years) were evaluated. The postural attitudes of the subjects were measured in the frontal and sagittal planes, using an acromiopelvimeter, grid panel and Fox plane. RESULTS The method we designed for measuring postural attitudes had adequate reliability and validity, both qualitatively and quantitatively, based on Cohen’s Kappa coefficient (> 0.87) and Pearson’s correlation coefficient (r = 0.824, > 80%). CONCLUSION This method exhibits adequate metrical properties and can therefore be used in further research on the association of human body posture with skeletal types and TMJDs. PMID:26768173
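Cohen's kappa, used above to establish qualitative reliability, corrects the raw agreement between two raters for the agreement expected by chance alone. A minimal sketch of the computation (generic, not tied to the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items (nominal labels)."""
    n = len(rater_a)
    # Observed agreement: fraction of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (observed - chance) / (1 - chance)

# Toy example: two raters, six items, one disagreement
print(cohens_kappa([1, 1, 1, 0, 0, 0], [1, 1, 1, 0, 0, 1]))
```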

  13. Use of curium neutron flux from head-end pyroprocessing subsystems for the High Reliability Safeguards methodology

    Energy Technology Data Exchange (ETDEWEB)

    Borrelli, R.A., E-mail: r.angelo.borrelli@gmail.com

    2014-10-01

    The deployment of nuclear energy systems (NESs) is expanding around the world. Nations are investing in NESs as a means to establish energy independence, grow national economies, and address climate change. Transitioning to the advanced nuclear fuel cycle can meet growing energy demands and ensure resource sustainability. However, nuclear facilities in all phases of the advanced fuel cycle must be ‘safeguardable,’ where safety, safeguards, and security are integrated into a practical design strategy. To this end, the High Reliability Safeguards (HRS) approach is a continually developing safeguardability methodology that applies intrinsic design features and employs a risk-informed approach for systems assessment that is safeguards-motivated. Currently, a commercial pyroprocessing facility is used as the example system. This paper presents a modeling study that investigates the neutron flux associated with processed materials. The intent of these studies is to determine if the neutron flux will affect facility design, and subsequently, safeguardability. The results presented in this paper are for the head-end subsystems in a pyroprocessing facility. The collective results from these studies will then be used to further develop the HRS methodology.

  14. Principles of radiotracer methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kisieleski, W.E.

    1975-06-01

    Information is presented on the properties of radioisotopes and radiations, methods and equipment for radiation detection, safety in handling radioactive materials, the biological effects of radiation, and the design of radiotracer studies. (LCL)

  15. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.

  16. An analysis of the human reliability on Three Mile Island II accident considering THERP and ATHEANA methodologies

    International Nuclear Information System (INIS)

    Fonseca, Renato Alves da; Alvim, Antonio Carlos Marques

    2005-01-01

    Research on human reliability analysis becomes more important every day, as does the study of human factors and of their contributions to incidents and accidents, mainly in complex or high-technology plants. The analysis developed here uses the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Error Analysis) methodologies, as well as the tables and cases presented in the THERP Handbook, to develop a qualitative and quantitative study of a nuclear accident. The chosen accident is that of Three Mile Island (TMI). The accident analysis has revealed a series of incorrect actions that resulted in the permanent loss of the reactor and the shutdown of Unit 2. This study also aims at enhancing the understanding of the THERP and ATHEANA methods and at their practical application. In addition, it makes it possible to understand the influence of plant operational status on human failures, and the influence of human failures on the equipment of a system, in this case a nuclear power plant. (author)

  17. A Reliable Methodology for Determining Seed Viability by Using Hyperspectral Data from Two Sides of Wheat Seeds.

    Science.gov (United States)

    Zhang, Tingting; Wei, Wensong; Zhao, Bin; Wang, Ranran; Li, Mingliu; Yang, Liming; Wang, Jianhua; Sun, Qun

    2018-03-08

    This study investigated the possibility of using visible and near-infrared (VIS/NIR) hyperspectral imaging techniques to discriminate viable and non-viable wheat seeds. Both sides of individual seeds were subjected to hyperspectral imaging (400-1000 nm) to acquire reflectance spectral data. Four spectral datasets, including the ventral groove side, reverse side, mean (the mean of two sides' spectra of every seed), and mixture datasets (two sides' spectra of every seed), were used to construct the models. Classification models, partial least squares discriminant analysis (PLS-DA), and support vector machines (SVM), coupled with some pre-processing methods and successive projections algorithm (SPA), were built for the identification of viable and non-viable seeds. Our results showed that the standard normal variate (SNV)-SPA-PLS-DA model had high classification accuracy for whole seeds (>85.2%) and for viable seeds (>89.5%), and that the prediction set was based on a mixed spectral dataset by only using 16 wavebands. After screening with this model, the final germination of the seed lot could be higher than 89.5%. Here, we develop a reliable methodology for predicting the viability of wheat seeds, showing that the VIS/NIR hyperspectral imaging is an accurate technique for the classification of viable and non-viable wheat seeds in a non-destructive manner.
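The standard normal variate (SNV) pre-processing named in the best-performing model centres and scales each spectrum individually, removing scatter and baseline effects before classification. A minimal generic sketch of the transform (not the authors' implementation):

```python
def snv(spectrum):
    """Standard normal variate: per-spectrum centring and scaling.

    Each spectrum is transformed to zero mean and unit (sample) variance,
    independently of all other spectra.
    """
    n = len(spectrum)
    mean = sum(spectrum) / n
    std = (sum((x - mean) ** 2 for x in spectrum) / (n - 1)) ** 0.5
    return [(x - mean) / std for x in spectrum]

# Toy 3-band "spectrum"; real VIS/NIR spectra would have hundreds of bands
print(snv([0.42, 0.44, 0.46]))  # maps to mean 0, unit variance
```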

  18. reliability reliability

    African Journals Online (AJOL)

    eobe

    RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory, Technical ...

  19. Principles of performance and reliability modeling and evaluation essays in honor of Kishor Trivedi on his 70th birthday

    CERN Document Server

    Puliafito, Antonio

    2016-01-01

    This book presents the latest key research into the performance and reliability aspects of dependable fault-tolerant systems and features commentary on the fields studied by Prof. Kishor S. Trivedi during his distinguished career. Analyzing system evaluation as a fundamental tenet in the design of modern systems, this book uses performance and dependability as common measures and covers novel ideas, methods, algorithms, techniques, and tools for the in-depth study of the performance and reliability aspects of dependable fault-tolerant systems. It identifies the current challenges that designers and practitioners must face in order to ensure the reliability, availability, and performance of systems, with special focus on their dynamic behaviors and dependencies, and provides system researchers, performance analysts, and practitioners with the tools to address these challenges in their work. With contributions from Prof. Trivedi's former PhD students and collaborators, many of whom are internationally recognize...

  20. Development of a methodology for conducting an integrated HRA/PRA --. Task 1, An assessment of human reliability influences during LP&S conditions PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., McLean, VA (United States)

    1993-06-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  1. Quantitative dynamic reliability evaluation of AP1000 passive safety systems by using FMEA and GO-FLOW methodology

    International Nuclear Information System (INIS)

    Hashim Muhammad; Yoshikawa, Hidekazu; Matsuoka, Takeshi; Yang Ming

    2014-01-01

    The passive safety systems utilized in advanced pressurized water reactor (PWR) designs such as the AP1000 should be more reliable than the active safety systems of conventional PWRs, owing to fewer possible opportunities for hardware failures and human errors (less human intervention). The objectives of the present study are to evaluate the dynamic reliability of the AP1000 plant in order to check the effectiveness of its passive safety systems, by comparing the reliability-related issues with those of active safety systems in the event of large accidents. How should the dynamic reliability of passive safety systems be properly evaluated? And how do the reliability results for the AP1000 passive safety systems compare with those for the active safety systems of a conventional PWR? For this purpose, single-loop models of the AP1000 passive core cooling system (PXS) and passive containment cooling system (PCCS) are assumed separately for quantitative reliability evaluation. The transient behaviors of these passive safety systems are taken under a large-break loss-of-coolant accident in the cold leg. The analysis is made by utilizing the qualitative method of failure mode and effect analysis, in order to identify the potential failure modes, and the success-oriented reliability analysis tool called GO-FLOW for quantitative reliability evaluation. The GO-FLOW analysis has been conducted separately for the PXS and PCCS systems under the same accident. The analysis results show that the reliability of the AP1000 passive safety systems (PXS and PCCS) is increased by the redundancy and diversity of the passive safety subsystems and components, and that the four-stage automatic depressurization system is the key subsystem for successful actuation of the PXS and PCCS. The PCCS of the AP1000 is found to be more reliable than the containment spray system of a conventional PWR. The GO-FLOW method can thus be utilized for reliability evaluation of passive safety systems. (author)
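The reliability gain that the GO-FLOW results attribute to redundancy and diversity can be illustrated with elementary series/parallel success logic (the probabilities below are illustrative only; GO-FLOW itself propagates success probabilities through a full operational sequence rather than a static block diagram):

```python
from math import prod

def series(success_probs):
    """All subsystems must succeed for system success (series logic)."""
    return prod(success_probs)

def parallel(success_probs):
    """At least one redundant train must succeed (parallel logic)."""
    return 1 - prod(1 - p for p in success_probs)

# Hypothetical train reliabilities: redundancy lifts two 0.9 trains to 0.99
two_trains = parallel([0.9, 0.9])
# In series with, e.g., a depressurization/actuation stage of reliability 0.999
system = series([two_trains, 0.999])
print(two_trains, system)
```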

  2. Reliable lateral and vertical manipulations of a single Cu adatom on a Cu(111) surface with multi-atom apex tip: semiempirical and first-principles simulations

    International Nuclear Information System (INIS)

    Xie Yiqun; Liu Qingwei; Zhang Peng; Wang Songyou; Li Yufen; Gan Fuxi; Zhuang Jun; Zhang Wenqing; Zhuang Min

    2008-01-01

    We study the reliability of the lateral manipulation of a single Cu adatom on a Cu(111) surface with single-atom, dimer and trimer apex tips using both semiempirical and first-principles simulations. The dependence of the manipulation reliability on tip height is investigated. For the single-atom apex tip the manipulation reliability increases monotonically with decreasing tip height. For the dimer and trimer apex tips the manipulation reliability is greatly improved compared to that for the single-atom apex tip over a certain tip-height range. Two kinds of mechanism are found responsible for this improvement. One is the so-called enhanced interaction mechanism in which the lateral tip-adatom interaction in the manipulation direction is improved. The other is the suspended atom mechanism in which the relative lateral trapping ability of the tip is improved due to the strong vertical attraction of the tip on the adatom. Both mechanisms occur in the manipulations with the trimer apex tip, while in those with the dimer apex tip only the former is effective. Moreover, we present a method to realize reversible vertical manipulation of a single atom on a Cu(111) surface with the trimer apex tip, based on its strong vertical and lateral attraction on the adatom

  3. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliability of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, whose boundary is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic
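The two-phase structure described here (subsystem reliabilities produced by the low-level phase feeding a fault tree in the high-level phase) can be sketched with basic AND/OR gate arithmetic over independent failure probabilities. All numbers below are hypothetical, not from the DSS application:

```python
def and_gate(failure_probs):
    """Top event occurs only if ALL inputs fail (redundant inputs)."""
    p = 1.0
    for q in failure_probs:
        p *= q
    return p

def or_gate(failure_probs):
    """Top event occurs if ANY input fails (independent inputs assumed)."""
    p = 1.0
    for q in failure_probs:
        p *= 1 - q
    return 1 - p

# Low-level phase output: hypothetical subsystem failure probabilities
q_software = 1e-4
q_channel_a = q_channel_b = 1e-3

# High-level phase: system fails if the software fails OR both channels fail
q_system = or_gate([q_software, and_gate([q_channel_a, q_channel_b])])
print(1 - q_system)  # estimated system reliability
```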

  4. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliability of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, whose boundary is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic. (author)

  5. Review of IAEA recommendations on the principles and methodologies for limiting releases of radioactive effluents to the environment

    International Nuclear Information System (INIS)

    Ahmed, J.U.

    1988-01-01

    The limitation of radioactive releases is governed by the basic principles of radiation protection as presented in ICRP Publication No. 26 and IAEA Safety Series No. 9. Under its current programme on release limitation the IAEA issued Safety Series No. 77 on principles for release limitation and Safety Series No. 67 on protection against transboundary radiation exposures. A Safety Guide on global upper bounds is now nearly ready for publication, and to guide the application of Safety Series No. 77, four documents are in various stages of completion

  6. Introducing the MINDER research project: Methodologies for Improvement of Non-residential buildings' Daily Energy Efficiency Reliability

    OpenAIRE

    Berker, Thomas; Gansmo, Helen Jøsok; Junghans, Antje

    2014-01-01

    In the Norwegian building sector, we are currently witnessing the transition from a realization gap - the gap between the availability of solutions and their implementation - to a reliability gap: the gap between the building's potential performance as it is commissioned to its users and its actual performance in daily use. When new solutions do not live up to their promises, not only the performance of the individual building is at stake. The reliability gap can easily grow into a credibility g...

  7. Methodology for the application of the I.C.R.P. optimization principle. The case of radioactive effluent control systems in the nuclear fuel cycle

    International Nuclear Information System (INIS)

    Lochard, Jacques; Maccia, Carlo; Pages, Pierre.

    1980-10-01

    This report aims at giving a detailed methodology to help improve the decision-making process in the radiation protection field, according to the optimization principle of the ICRP. A model was elaborated in a sufficiently general way to be applicable to public as well as occupational radiation protection. The main steps of the model are: 1) the assessment of the collective doses and residual health effects associated with a given radiation protection level, 2) the determination of protection costs, 3) the decision analysis: cost-effectiveness and cost-benefit analysis. The model is implemented by means of a conversational computer program. This methodology is exemplified with the problem of the choice of waste treatment systems for the PWRs in France. The public impact of radioactive releases is evaluated for the population within 100 km around the site. The main results are presented for two existing sites of the French nuclear program [fr
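The cost-benefit step of the ICRP optimization principle can be sketched as minimizing X + αS, the protection cost plus the monetized collective dose, over the candidate effluent-treatment options. All option names, costs and doses below are hypothetical illustrations, not the report's data:

```python
ALPHA = 1000.0  # hypothetical monetary value assigned to one person-sievert

def total_cost(protection_cost, collective_dose):
    """ICRP cost-benefit objective: X + alpha * S."""
    return protection_cost + ALPHA * collective_dose

# Hypothetical waste-treatment options:
# name -> (protection cost, residual collective dose in person-Sv)
options = {
    "baseline filter": (10_000.0, 50.0),
    "added demineralizer": (40_000.0, 12.0),
    "full recycle system": (90_000.0, 8.0),
}

# The optimized option minimizes total societal cost, not dose alone
best = min(options, key=lambda name: total_cost(*options[name]))
print(best)  # "added demineralizer": extra dose reduction beyond it costs more than it saves
```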

  8. Thermal Protection for Mars Sample Return Earth Entry Vehicle: A Grand Challenge for Design Methodology and Reliability Verification

    Science.gov (United States)

    Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.

    2017-01-01

    Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron in diameter escaping into the Earth environment be lower than one in a million for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to one in a thousand, and the demonstrated reliability for previous human Earth return systems was closer to one in a hundred. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit by MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design which allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to withstand ground impact without breaking apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing the reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.
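The reliability gap quoted above is easy to make concrete; a sketch using only the figures stated in the abstract (one in a million required, roughly one in a thousand for the Orion TPS allocation):

```python
import math

# Orders-of-magnitude gap between the required and reference failure
# probabilities quoted above. Values follow the abstract's figures.

def orders_of_magnitude_gap(target_prob, reference_prob):
    """How many powers of ten separate the target from the reference."""
    return math.log10(reference_prob / target_prob)

target = 1e-6       # < 1 in a million for the entire entry system
orion_tps = 1e-3    # Orion TPS allocation, roughly 1 in a thousand

gap = orders_of_magnitude_gap(target, orion_tps)  # 3.0 orders of magnitude
```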

  9. Progress in Methodologies for the Assessment of Passive Safety System Reliability in Advanced Reactors. Results from the Coordinated Research Project on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors

    International Nuclear Information System (INIS)

    2014-09-01

    Strong reliance on inherent and passive design features has become a hallmark of many advanced reactor designs, including several evolutionary designs and nearly all advanced small and medium sized reactor (SMR) designs. Advanced nuclear reactor designs incorporate several passive systems in addition to active ones — not only to enhance the operational safety of the reactors but also to eliminate the possibility of serious accidents. Accordingly, the assessment of the reliability of passive safety systems is a crucial issue to be resolved before their extensive use in future nuclear power plants. Several physical parameters affect the performance of a passive safety system, and their values at the time of operation are unknown a priori. The functions of passive systems are based on basic physical laws and thermodynamic principles, and they may not experience the same kind of failures as active systems. Hence, consistent efforts are required to qualify the reliability of passive systems. To support the development of advanced nuclear reactor designs with passive systems, investigations into their reliability using various methodologies are being conducted in several Member States with advanced reactor development programmes. These efforts include reliability methods for passive systems by the French Atomic Energy and Alternative Energies Commission, reliability evaluation of passive safety system by the University of Pisa, Italy, and assessment of passive system reliability by the Bhabha Atomic Research Centre, India. These different approaches seem to demonstrate a consensus on some aspects. However, the developers of the approaches have been unable to agree on the definition of reliability in a passive system. Based on these developments and in order to foster collaboration, the IAEA initiated the Coordinated Research Project (CRP) on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors in 2008. The

  10. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study

    OpenAIRE

    Douglas H. Marin dos Santos; Álvaro N. Atallah

    2015-01-01

    The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information to be made available on the ...

  11. Reliability Evaluation Methodologies of Fault Tolerant Techniques of Digital I and C Systems in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Bo Gyung; Kang, Hyun Gook; Seong, Poong Hyun; Lee, Seung Jun

    2011-01-01

    Since the reactor protection system was converted from analog to digital, the digital reactor protection system has had four redundant channels, each comprising several modules. Because the DPPS uses complex components, various fault-tolerant techniques are needed to improve its availability and reliability. Several studies have examined the effects of fault-tolerant techniques; however, those effects have not yet been properly considered in most fault tree models. The various fault-tolerant techniques used in digital systems in NPPs should be reflected in fault tree analysis to obtain lower system unavailability and a more reliable PSA. When fault-tolerant techniques are modeled in a fault tree, the modules each technique can detect, the fault coverage, the detection period and the fault recovery should be considered. Further work will concentrate on various aspects of fault tree modeling: we will identify other important factors and develop a theory for constructing the fault tree model.
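One way fault coverage can enter a fault-tree basic event is sketched below. This is an illustrative model, not the paper's: it assumes a fraction `coverage` of module faults is detected and recovered, so only undetected faults contribute to unavailability, and all numbers are invented.

```python
# Illustrative-only model of fault-tolerant coverage in a fault tree.
# Assumption: detected faults are recovered; only the undetected
# fraction (1 - coverage) of module unavailability persists.

def effective_unavailability(q_module, coverage):
    """Module unavailability after fault detection and recovery."""
    return (1.0 - coverage) * q_module

def channel_unavailability(module_qs):
    """Modules in series (OR gate): rare-event sum of unavailabilities."""
    return sum(module_qs)

def system_unavailability(q_channel, n_redundant):
    """n independent redundant channels (AND gate): product of failures."""
    return q_channel ** n_redundant

q_mod = effective_unavailability(1e-3, coverage=0.9)   # 1e-4 per module
q_ch = channel_unavailability([q_mod] * 3)             # 3e-4 per channel
q_sys = system_unavailability(q_ch, n_redundant=4)     # 4 redundant channels
```

Detection period and recovery time, which the abstract also names, would refine `effective_unavailability`; this sketch keeps only the coverage factor.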

  12. Reliability and Validity of Digital Imagery Methodology for Measuring Starting Portions and Plate Waste from School Salad Bars.

    Science.gov (United States)

    Bean, Melanie K; Raynor, Hollie A; Thornton, Laura M; Sova, Alexandra; Dunne Stewart, Mary; Mazzeo, Suzanne E

    2018-04-12

    Scientifically sound methods for investigating dietary consumption patterns from self-serve salad bars are needed to inform school policies and programs. To examine the reliability and validity of digital imagery for determining starting portions and plate waste of self-serve salad bar vegetables (which have variable starting portions) compared with manual weights. In a laboratory setting, 30 mock salads with 73 vegetables were made, and consumption was simulated. Each component (initial and removed portion) was weighed; photographs of weighed reference portions and pre- and post-consumption mock salads were taken. Seven trained independent raters visually assessed images to estimate starting portions to the nearest ¼ cup and percentage consumed in 20% increments. These values were converted to grams for comparison with weighed values. Intraclass correlations between weighed and digital imagery-assessed portions and plate waste were used to assess interrater reliability and validity. Pearson's correlations between weights and digital imagery assessments were also examined. Paired samples t tests were used to evaluate mean differences (in grams) between digital imagery-assessed portions and measured weights. Interrater reliabilities were excellent for starting portions and plate waste with digital imagery. For accuracy, intraclass correlations were moderate, with lower accuracy for determining starting portions of leafy greens compared with other vegetables. However, accuracy of digital imagery-assessed plate waste was excellent. Digital imagery assessments were not significantly different from measured weights for estimating overall vegetable starting portions or waste; however, digital imagery assessments slightly underestimated starting portions (by 3.5 g) and waste (by 2.1 g) of leafy greens. This investigation provides preliminary support for use of digital imagery in estimating starting portions and plate waste from school salad bars. Results might inform
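The validity check reported above (agreement between weighed and image-estimated grams) can be illustrated with a plain Pearson correlation; the study also used intraclass correlations, which this sketch does not reproduce, and the gram values here are invented.

```python
# Pearson correlation between manual weights and digital-imagery
# estimates, one of the agreement checks described above.
# The gram values are invented for illustration.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

weighed = [30.0, 55.0, 12.0, 80.0, 41.0]     # grams, manual weights
estimated = [28.0, 60.0, 10.0, 75.0, 45.0]   # grams, from images

r = pearson_r(weighed, estimated)  # close to 1 for good agreement
```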

  13. The validity and reliability of the type 2 diabetes and health promotion scale Turkish version: a methodological study.

    Science.gov (United States)

    Yildiz, Esra; Kavuran, Esin

    2018-03-01

    Health promotion is important for maintaining health and preventing complications in patients with type 2 diabetes. The aim of the present study was to examine the psychometrics of a recently developed tool that can be used to screen for a health-promoting lifestyle in patients with type 2 diabetes. Data were collected from outpatients attending diabetes clinics. The Type 2 Diabetes and Health Promotion Scale (T2DHPS) and a demographic questionnaire were administered to 295 participants. Forward-backward translation of the original English version was used to develop a Turkish version. Internal consistency of the scale was assessed by Cronbach's alpha. An exploratory factor analysis and a confirmatory factor analysis were used to assess the validity of the Type 2 Diabetes and Health Promotion Scale - Turkish version. Kaiser-Meyer-Olkin (KMO) and Bartlett's sphericity tests showed that the sample met the criteria required for factor analysis. The reliability coefficient for the total scale was 0.84, and alpha coefficients for the subscales ranged from 0.57 to 0.92. A six-factor solution was obtained that explained 59.3% of the total variance. The ratio of the chi-square statistic to degrees of freedom (χ²/df) was 3.30 (χ² = 1157.48, df = 350); the root mean square error of approximation (RMSEA) was 0.061; the goodness-of-fit index (GFI) was 0.91 and the comparative fit index (CFI) was 0.91. The Turkish version of the T2DHPS is a valid and reliable tool that can be used to assess patients' health-promoting lifestyle behaviours. Validity and reliability studies in different cultures and regions are recommended. © 2017 Nordic College of Caring Science.
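The internal-consistency figure reported above (Cronbach's alpha of 0.84 for the total scale) comes from a standard formula, sketched here with invented item scores:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).
# The item scores below are invented; only the formula is standard.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, all over the same respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

items = [
    [1, 2, 3, 4, 5],
    [2, 2, 3, 5, 5],
    [1, 3, 3, 4, 4],
]

alpha = cronbach_alpha(items)  # about 0.95 for these correlated items
```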

  14. Response surface methodology approach for structural reliability analysis: An outline of typical applications performed at CEC-JRC, Ispra

    International Nuclear Information System (INIS)

    Lucia, A.C.

    1982-01-01

    The paper presents the main results of the work carried out at JRC-Ispra on the specific problems posed by the application of the response surface methodology to the exploration of structural and nuclear reactor safety codes. Several relevant studies have been completed: assessment of structural behaviour in the case of seismic occurrences; determination of the probability of coherent blockage in LWR fuel elements due to a LOCA occurrence; analysis of ATWS consequences in PWR reactors by means of the ALMOD code; and analysis of the first wall of an experimental fusion reactor by means of the Bersafe code. (orig.)
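The core idea of the response surface methodology, replacing an expensive safety code with a cheap fitted surrogate that is then sampled, can be sketched in one dimension. Everything below is illustrative: the stand-in "code" is just a quadratic and the threshold is invented.

```python
import random

# 1-D illustration of the response-surface idea: fit a quadratic through
# a few runs of an "expensive" code, then estimate an exceedance
# probability by Monte Carlo sampling of the cheap surrogate.

def expensive_code(x):
    return 1.0 + 0.5 * x + 0.2 * x * x   # pretend each call takes hours

design_x = [-1.0, 0.0, 1.0]                       # design points
design_y = [expensive_code(x) for x in design_x]  # three code runs

def surrogate(x):
    """Quadratic through the design points (Lagrange interpolation)."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(design_x, design_y)):
        term = yi
        for j, xj in enumerate(design_x):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

random.seed(0)
threshold = 1.6
samples = [random.gauss(0.0, 1.0) for _ in range(10000)]
p_exceed = sum(surrogate(x) > threshold for x in samples) / len(samples)
```

In a real study the input would be multidimensional and the surrogate fitted by least squares over a designed experiment; the sampling step is the same.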

  15. First-principles model potentials for lattice-dynamical studies: general methodology and example of application to ferroic perovskite oxides.

    Science.gov (United States)

    Wojdeł, Jacek C; Hermet, Patrick; Ljungberg, Mathias P; Ghosez, Philippe; Íñiguez, Jorge

    2013-07-31

    We present a scheme to construct model potentials, with parameters computed from first principles, for large-scale lattice-dynamical simulations of materials. We mimic the traditional solid-state approach to the investigation of vibrational spectra, i.e., we start from a suitably chosen reference configuration of the compound and describe its energy as a function of arbitrary atomic distortions by means of a Taylor series. Such a form of the potential-energy surface is general, trivial to formulate for any material, and physically transparent. Further, such models involve clear-cut approximations, their precision can be improved in a systematic fashion, and their simplicity allows for convenient and practical strategies to compute/fit the potential parameters. We illustrate our scheme with two challenging cases in which the model potential is strongly anharmonic, namely, the ferroic perovskite oxides PbTiO3 and SrTiO3. Studying these compounds allows us to better describe the connection between the so-called effective-Hamiltonian method and ours (which may be seen as an extension of the former), and to show the physical insight and predictive power provided by our approach; e.g., we present new results regarding the factors controlling phase-transition temperatures, novel phase transitions under elastic constraints, an improved treatment of thermal expansion, etc.

  16. Methodological Approaches to Ensuring Innovative Development of Metallurgical Enterprises on the Basis of Principles of Economic Nationalism

    Directory of Open Access Journals (Sweden)

    Denysov Kostyantyn V.

    2017-01-01

    Full Text Available The economic, energy and environmental aspects of the activities of metallurgical enterprises are analyzed in the context of the need to ensure their sustainable development. The high energy intensity of the production process, the low efficiency and irrational structure of capital expenditures for environmental protection, and the dominance of material costs in the final cost of finished products at the expense of labor and social contributions are indicated. The low effectiveness of previous industrial policy measures for the innovative development of the metallurgical industry is demonstrated: these measures did not comply with WTO requirements and led to compensatory measures being taken against Ukrainian steel on world markets. The potential of economic nationalism as a system for ensuring the innovative development of the metallurgical industry is considered. The priorities of industrial policy for the development of metallurgical enterprises are determined on the basis of the principles of economic nationalism, taking into account global trends in the development of trade and economic relations and Ukraine's commitments to the WTO.

  17. Methodological Principles of Assessing the Volume of Investment Influx from Non-State Pension Funds into the Economy of Ukraine

    Directory of Open Access Journals (Sweden)

    Dmitro Leonov

    2004-11-01

    Full Text Available This article addresses the processes of forming investment resources from non-state pension funds under current conditions in Ukraine and the laws and regulations that define the principles of the formation of investment institutions. Based on factors that in the nearest future will affect the decision-making process by which different kinds of investors make payments to non-state pension funds, we develop a procedure for assessing the volume of investment influx from non-state pension funds into the economy and propose a procedure for long- and short-term prognosis of the volume of investment influx from non-state pension funds into the Ukrainian economy.

  18. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Douglas H. Marin dos Santos

    2015-06-01

    Full Text Available The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information to be made available on the platform ClinicalTrials (available at https://www.clinicaltrials.gov). We accessed ClinicalTrials.gov and evaluated the compliance of researchers and sponsors with the FDAAA. Our sample comprised 243 protocols of clinical trials of the biological monoclonal antibodies (mAb) adalimumab, bevacizumab, infliximab, rituximab, and trastuzumab. We demonstrate that the new legislation has positively affected transparency patterns in clinical research, through a significant increase in publication and online reporting rates after the enactment of the law. Poorly designed trials, however, remain a challenge to be overcome, due to a high prevalence of methodological flaws. These flaws affect the quality of the clinical information available, breaching ethical duties of sponsors and researchers, as well as the human right to health.

  19. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study.

    Science.gov (United States)

    Marin Dos Santos, Douglas H; Atallah, Álvaro N

    2015-01-01

    The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information to be made available on the platform ClinicalTrials (available at https://www.clinicaltrials.gov). We accessed ClinicalTrials.gov and evaluated the compliance of researchers and sponsors with the FDAAA. Our sample comprised 243 protocols of clinical trials of the biological monoclonal antibodies (mAb) adalimumab, bevacizumab, infliximab, rituximab, and trastuzumab. We demonstrate that the new legislation has positively affected transparency patterns in clinical research, through a significant increase in publication and online reporting rates after the enactment of the law. Poorly designed trials, however, remain a challenge to be overcome, due to a high prevalence of methodological flaws. These flaws affect the quality of the clinical information available, breaching ethical duties of sponsors and researchers, as well as the human right to health.

  20. Calculating emissions into the air. General methodological principles; Calcul des emissions dans l'air. Principes methodologiques generaux

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-05-01

    Knowing the quantities of certain substances discharged into the atmosphere is a necessary and fundamental stage in any environmental protection policy to tackle today's problems such as acid rain, the degradation of air quality, global warming and climate change, the depletion of the ozone layer, etc. This quantification, usually known as an 'emission inventory', is built on a set of specific rules which may vary from one inventory to another. This state of affairs presents the enormous disadvantage that the data available are not comparable. At the international level, an attempt at harmonization has been going on for some years between the various international bodies. This work is being pursued in parallel with the improvement of methodologies to estimate discharges from various types of source. To take account of changes in specifications and of improvements in our understanding of phenomena giving rise to atmospheric pollution, the results of inventories of emissions need to be regularly revised, even retrospectively, to maintain a consistent series. CITEPA, which acts as a National Reference Centre, has developed a system of inventories as part of the CORALIE programme with financial help from the French Ministry for Planning and the Environment. (author)
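At its core, an emission inventory of the kind described multiplies an activity statistic by a substance-specific emission factor and sums over source categories. A minimal sketch, with invented sectors, activities and factors:

```python
# Minimal emission-inventory arithmetic: emission = activity x factor,
# summed over source categories. All figures below are invented.

def emissions_kg(activity_tonnes, factor_kg_per_tonne):
    """Pollutant mass emitted by one source category."""
    return activity_tonnes * factor_kg_per_tonne

SOURCES = {  # sector: (fuel burnt, tonnes; SO2 factor, kg per tonne)
    "power plants": (120000, 8.0),
    "road traffic": (450000, 0.4),
    "residential": (60000, 2.5),
}

total_so2_kg = sum(emissions_kg(a, ef) for a, ef in SOURCES.values())
```

Revising an inventory retrospectively, as the abstract recommends, amounts to re-running this sum with updated factors over the whole time series.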

  2. A human reliability analysis of the Three Mile Island power plant accident considering the THERP and ATHEANA methodologies

    International Nuclear Information System (INIS)

    Fonseca, Renato Alves da

    2004-03-01

    The main purpose of this work is to study human reliability using the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Event Analysis) methods, together with tables and case studies presented in the THERP Handbook, in order to develop a qualitative and quantitative study of a nuclear power plant accident. This accident occurred at the TMI (Three Mile Island) Unit 2 power plant, a PWR-type plant, on March 28th, 1979. The accident analysis revealed a series of incorrect actions, which resulted in the shutdown of Unit 2 and the permanent loss of the reactor. This study also aims at enhancing the understanding of the THERP and ATHEANA methods and of their practical applications. In addition, it is possible to understand the influence of plant operational status on human failures and of these on the equipment of a system, in this case a nuclear power plant. (author)

  3. Study on the methodology for predicting and preventing errors to improve reliability of maintenance task in nuclear power plant

    International Nuclear Information System (INIS)

    Hanafusa, Hidemitsu; Iwaki, Toshio; Embrey, D.

    2000-01-01

    The objective of this study was to develop an effective methodology for predicting and preventing errors in nuclear power plant maintenance tasks. A method was established by which chief maintenance personnel can predict and reduce errors when reviewing the maintenance procedures, while referring to maintenance supporting systems and methods in other industries, including the aviation and chemical plant industries. The method involves the following seven steps: 1. Identification of maintenance tasks. 2. Specification of important tasks affecting safety. 3. Assessment of human errors occurring during important tasks. 4. Identification of Performance Degrading Factors. 5. Dividing important tasks into sub-tasks. 6. Extraction of errors using Predictive Human Error Analysis (PHEA). 7. Development of strategies for reducing errors and for recovering from errors. By way of a trial, this method was applied to the pump maintenance procedure in nuclear power plants. This method is believed to be capable of identifying the expected errors in important tasks and supporting the development of error reduction measures. By applying this method, the number of accidents resulting from human errors during maintenance can be reduced. Moreover, a computer-based maintenance support system was developed. (author)

  4. Measurement of Cue-Induced Craving in Human Methamphetamine-Dependent Subjects: New Methodological Hopes for Reliable Assessment of Treatment Efficacy

    Directory of Open Access Journals (Sweden)

    Zahra Alam Mehrjerdi

    2011-09-01

    Full Text Available Methamphetamine (MA) is a highly addictive psychostimulant drug with crucial impacts on individuals on various levels. Exposure to methamphetamine-associated cues in the laboratory can elicit measurable craving and autonomic reactivity in most individuals with methamphetamine dependence, and this cue reactivity can model how craving results in continued drug-seeking behaviors and relapse in real environments, but research on this notion is still limited. In this brief article, the authors review studies on cue-induced craving in human methamphetamine-dependent subjects in a laboratory-based approach. Craving for methamphetamine is elicited by a variety of methods in the laboratory, such as paraphernalia, verbal and visual cues, and imaginary scripts. In this article, we review the studies applying different cues as the main methods of craving incubation in laboratory settings. The briefly reviewed literature provides strong evidence that craving for methamphetamine in laboratory conditions is significantly evoked by different cues. Cue-induced craving has important treatment and clinical implications for psychotherapists and clinicians when we consider the role of induced craving in evoking an intense desire or urge to use methamphetamine after or during a period of a successful craving prevention program. Elicited craving for methamphetamine in laboratory conditions is significantly influenced by methamphetamine-associated cues and results in a rapid craving response toward methamphetamine use. This notion can be used as a core for laboratory-based assessment of treatment efficacy for methamphetamine-dependent patients. In addition, laboratory settings for studying craving can bridge the gap between less reliable preclinical animal-model studies and budget-demanding randomized clinical trials.

  5. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Full Text Available Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of methodologies regarding sample preparation and reliable fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biological samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of the FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.
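Quantification against a linear calibration curve, such as the 1.0-220.0 µg/mL range reported above, works as sketched below; the standards and band intensities are invented for illustration.

```python
# Least-squares line through calibration standards, then back-calculation
# of an unknown from its measured signal. Data values are invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

concs = [1.0, 10.0, 50.0, 100.0, 220.0]   # ug/mL standards
areas = [0.9, 10.2, 49.8, 100.5, 219.6]   # measured band intensities

slope, intercept = fit_line(concs, areas)

def quantify(area):
    """Concentration (ug/mL) back-calculated from a measured signal."""
    return (area - intercept) / slope
```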

  6. Methodological Principles of Polycultural Education

    Directory of Open Access Journals (Sweden)

    Andrienko Nadejda Konstantinovna

    2017-09-01

    Full Text Available This article elaborates the polycultural education approach. Three main approaches to understanding polycultural education (acculturational, dialogical and socio-psychological) are considered, along with the conceptions evolved within their frameworks. Attention is also paid to foreign and domestic (Russian) research in this area. At the end of the article the author defines the idea of “polycultural education”. The article deals with the following: the dialogue approach, which is based on the ideas of a dialogue of cultures, openness and cultural pluralism; the activity-oriented conception of polycultural education; the conception of multi-perspective education; the conception of “cultural differences”; and the conception of social education.

  7. Methodology for maintenance analysis based on hydroelectric power stations reliability; Metodologia para realizar analisis de mantenimiento basado en confiabilidad en centrales hidroelectricas

    Energy Technology Data Exchange (ETDEWEB)

    Rea Soto, Rogelio; Calixto Rodriguez, Roberto; Sandoval Valenzuela, Salvador; Velasco Flores, Rocio; Garcia Lizarraga, Maria del Carmen [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2012-07-01

    A methodology to carry out Reliability Centered Maintenance (RCM) studies for hydroelectric power plants is presented. The methodology is an implementation/extension of the guidelines proposed by the Engineering Society for Advanced Mobility Land, Sea and Space in the SAE-JA1012 standard. To answer the first five questions set out in that standard, the use of standard ISO 14224 is strongly recommended. This approach standardizes failure mechanisms and homogenizes RCM studies with the process of collecting failure and maintenance data. The use of risk matrices to rank the importance of each failure based on a risk criterion is also proposed. [Spanish abstract, translated:] A methodology for performing Reliability Centered Maintenance (RCM) studies applied to the hydroelectric industry is presented. The methodology is an implementation/extension, made by the authors of this work, of the guidelines proposed by the Engineering Society for Advanced Mobility Land, Sea and Space in the SAE-JA1012 standard. To answer the first five questions of the standard, it is proposed to start from the component failure modes and mechanisms documented in the failure-data collection guide of the ISO 14224 standard. This approach makes it possible to standardize the description of equipment failure mechanisms, both in the RCM study and in the failure and maintenance data collection process, which feeds back into the continuous improvement cycle of RCM processes. The use of risk matrices to rank the importance of failure mechanisms based on risk level is also proposed.
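The risk-matrix ranking step can be sketched as a frequency class times consequence class score; the classes, failure mechanisms and scores below are illustrative, not taken from SAE JA1012 or ISO 14224.

```python
# Illustrative risk-matrix ranking of failure mechanisms:
# score = frequency class x consequence class. All entries invented.

FREQ = {"rare": 1, "occasional": 2, "frequent": 3}
CONSEQ = {"minor": 1, "major": 2, "critical": 3}

def risk_score(freq_class, conseq_class):
    return FREQ[freq_class] * CONSEQ[conseq_class]

failures = {
    "bearing wear": ("frequent", "minor"),         # score 3
    "runner cavitation": ("occasional", "major"),  # score 4
    "penstock rupture": ("rare", "critical"),      # score 3
}

ranked = sorted(failures, key=lambda f: risk_score(*failures[f]), reverse=True)
```

Real risk matrices usually weight consequence classes non-linearly (a "critical" outcome outweighs any frequency), which a lookup table like `CONSEQ` can encode directly.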

  8. PRINCIPLES OF RE-ENGINEERING METHODOLOGY FOR TECHNOLOGICAL PROCESS IN PROCESSING OF RAW MATERIAL COMPONENTS WHILE PRODUCING CEMENT AND SILICATE PRODUCTS

    Directory of Open Access Journals (Sweden)

    I. A. Busel

    2014-01-01

    necessity to modernize technological equipment used for grinding raw material components with the purpose of improving efficiency and quality and saving power and resources. The possibility of using various grinding aids that permit an increase in grinding productivity is shown in the paper. The paper studies an automation concept for the control system used for the grinding process of mineral raw material. A conceptual model for the combination of various grinding-aid methods has been proposed. The paper presents methodological principles for simulation of the technological process used for processing mineral raw material while producing cement and silicate products. The parameters which are to be controlled and which are necessary for the development of computer simulations of the technological grinding process have been determined. The paper justifies the application of imitation simulation for the creation of computer models. A methodology for imitation simulation of the technological process has been studied. The paper confirms the possibility of using analytical and probability methods. Imitation simulations of a grinding mill operation have been developed on the basis of experimental data and probability functions. The possibility of controlling the technological process of raw material grinding has been demonstrated. By implementing the proposed complex of organizational and technical recommendations it is possible to increase grinding productivity by 30-50% and significantly reduce energy consumption for mineral raw material grinding during the production of cement and silicate products. The combined re-engineering methodology for the grinding process, including all the mentioned intensification methods, substantially increases the quality of final products and reduces their prime cost, which will favour their competitiveness and attractiveness for consumers.

  9. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large-scale structures. Structural loading variations over the lifetime of the plant are considered more difficult to analyse than those for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions that enter this problem are considered. The rare-event situation is briefly mentioned, together with aspects of proof testing and normal and upset loading conditions. (orig.)

  10. Assessing the Reliability of Merging Chickering & Gamson's Seven Principles for Good Practice with Merrill's Different Levels of Instructional Strategy (DLISt7)

    Science.gov (United States)

    Jabar, Syaril Izwann; Albion, Peter R.

    2016-01-01

    Based on Chickering and Gamson's (1987) Seven Principles for Good Practice, this research project attempted to revitalize the principles by merging them with Merrill's (2006) Different Levels of Instructional Strategy. The aim was to develop, validate, and standardize a measurement instrument (DLISt7) using a pretest-posttest Internet…

  11. Evaluation of a Propolis Water Extract Using a Reliable RP-HPLC Methodology and In Vitro and In Vivo Efficacy and Safety Characterisation

    Science.gov (United States)

    Rocha, Bruno Alves; Bueno, Paula Carolina Pires; Vaz, Mirela Mara de Oliveira Lima Leite; Nascimento, Andresa Piacezzi; Ferreira, Nathália Ursoli; Moreno, Gabriela de Padua; Rodrigues, Marina Rezende; Costa-Machado, Ana Rita de Mello; Barizon, Edna Aparecida; Campos, Jacqueline Costa Lima; de Oliveira, Pollyanna Francielli; Acésio, Nathália de Oliveira; Martins, Sabrina de Paula Lima; Tavares, Denise Crispim; Berretta, Andresa Aparecida

    2013-01-01

    Since the beginning of propolis research, several groups have studied its antibacterial, antifungal, and antiviral properties. However, most of these studies have employed only propolis ethanolic extract (PEE), leading to little knowledge about the biological activities of propolis water extract (PWE). Based on this, in a previous study, we demonstrated the anti-inflammatory and immunomodulatory activities of PWE. In order to better understand the equilibrium between effectiveness and toxicity, which is essential for a new medicine, the characteristics of PWE were analyzed. We developed and validated an RP-HPLC method to chemically characterize PWE and PEE, evaluated the in vitro antioxidant/antimicrobial activity of both extracts, and assessed the safety of PWE by determining its genotoxic potential using in vitro and in vivo mammalian micronucleus assays. We concluded that the proposed analytical methodology is reliable and that both extracts have a similar chemical composition. Both extracts presented antioxidant and antimicrobial effects, while PWE demonstrated higher antioxidant activity and was more efficacious against most of the microorganisms tested than PEE. Finally, PWE was shown to be safe in the micronucleus assays. PMID:23710228

  12. Evaluation of a Propolis Water Extract Using a Reliable RP-HPLC Methodology and In Vitro and In Vivo Efficacy and Safety Characterisation

    Directory of Open Access Journals (Sweden)

    Bruno Alves Rocha

    2013-01-01

    Full Text Available Since the beginning of propolis research, several groups have studied its antibacterial, antifungal, and antiviral properties. However, most of these studies have employed only propolis ethanolic extract (PEE), leading to little knowledge about the biological activities of propolis water extract (PWE). Based on this, in a previous study, we demonstrated the anti-inflammatory and immunomodulatory activities of PWE. In order to better understand the equilibrium between effectiveness and toxicity, which is essential for a new medicine, the characteristics of PWE were analyzed. We developed and validated an RP-HPLC method to chemically characterize PWE and PEE, evaluated the in vitro antioxidant/antimicrobial activity of both extracts, and assessed the safety of PWE by determining its genotoxic potential using in vitro and in vivo mammalian micronucleus assays. We concluded that the proposed analytical methodology is reliable and that both extracts have a similar chemical composition. Both extracts presented antioxidant and antimicrobial effects, while PWE demonstrated higher antioxidant activity and was more efficacious against most of the microorganisms tested than PEE. Finally, PWE was shown to be safe in the micronucleus assays.

  13. Development of core technology for KNGR system design; development of quantitative reliability evaluation methodologies of KNGR digital I and C components

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Choi, Jong Gyun; Kim, Ung Soo; Kim, Jong Hyun; Kim, Man Cheol; Lee, Seung Jun; Lee, Young Je; Ha, Jun Soo [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2002-03-01

    For digital systems to be applied in the nuclear industry, with its uniquely conservative approach to safety, reliability assessment of digital systems is a prerequisite. However, because digital systems exhibit failure modes different from those of existing analog systems, existing reliability assessment methods cannot be applied to them; a new reliability assessment method for digital systems must be developed. The goal of this study is the development of a board-level reliability assessment method for digital systems and a related software tool. To achieve this goal, we have conducted research on the development of a database of hardware components for digital I and C systems, the development of a reliability assessment model for board-level reliability prediction of digital systems, and the applicability of the approach to KNGR digital I and C systems. We developed a database for reliability assessment of digital hardware components, a reliability assessment method for digital systems that considers software and hardware together, and a software tool for the reliability assessment of digital systems, named RelPredic. We plan to apply the results of this study to the reliability assessment of digital systems in the KNGR digital I and C systems. 13 refs., 71 figs., 31 tabs. (Author)
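The board-level reliability prediction described in this record can be illustrated with a generic parts-count sketch: a board treated as a series system of components with constant failure rates. This is a common textbook approach, not the RelPredic tool itself, and the component names and failure rates below are purely hypothetical.

```python
# Parts-count reliability prediction: a board modeled as a series system
# of components, each with a constant (exponential) failure rate.
import math

# Hypothetical failure rates in failures per 10^6 hours (illustrative only).
component_failure_rates = {
    "cpu": 0.50,
    "ram": 0.20,
    "adc": 0.10,
    "power_regulator": 0.35,
}

# Series system: the board failure rate is the sum of the component rates.
board_lambda = sum(component_failure_rates.values())  # per 10^6 h

# Mean time between failures and mission reliability R(t) = exp(-lambda * t).
mtbf_hours = 1e6 / board_lambda
mission_hours = 8760  # one year
reliability_one_year = math.exp(-board_lambda * mission_hours / 1e6)

print(f"board lambda = {board_lambda:.2f} / 1e6 h")
print(f"MTBF = {mtbf_hours:.0f} h")
print(f"R(1 year) = {reliability_one_year:.4f}")
```

The series assumption (any component failure fails the board) is the simplest case; the record's method additionally accounts for software, which a parts count ignores.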

  14. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    This excerpt describes the development of an experimental reliability-estimating methodology that can illuminate the lifetime reliability of advanced devices and circuits, yielding an accurate estimate of device lifetime and thus the failure-in-time (FIT) rate of the device, so that reliability can be conveniently assessed. (Recovered figure captions: "inverters connected in a chain"; Figure 3, "typical graph showing frequency versus square root of ...".)

  15. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways: for example, an error may be made when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitude. The basic objective of this paper is to find a procedure that eliminates the errors made by a PC when calculations close to the error limit are executed. The highly reliable system is represented by a directed acyclic graph composed of terminal nodes (i.e. highly reliable input elements), internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is based on the merits of MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure, applied to a graph structure that considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires the summation of many very different non-negative numbers, which may be a source of inaccuracy; that is why another algorithm, for the exact summation of such numbers, is designed in the paper. The summation procedure benefits from a special number system with base 2^32. The computational efficiency of the new computing methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to emphasize the merits of the methodology.
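The numerical hazard this record addresses, summing many non-negative numbers of very different magnitude, is easy to reproduce. In the sketch below, Python's `math.fsum` stands in for the paper's exact-summation algorithm (the base-2^32 number system itself is not reproduced here):

```python
# Summing many numbers of very different magnitude: naive left-to-right
# floating-point summation silently loses the tiny terms; an error-free
# accumulator (math.fsum) does not.
import math

# One large term plus 100000 terms far below its rounding granularity.
values = [1.0] + [1e-16] * 100000

naive = sum(values)        # each 1e-16 is absorbed into 1.0 and lost
exact = math.fsum(values)  # error-free summation of the same list

print(naive)  # 1.0 (all small contributions lost)
print(exact)  # ≈ 1.00000000001 (i.e. 1.0 + 1e-11)
```

The naive sum is exactly 1.0 because each 1e-16 term is smaller than half the floating-point spacing around 1.0, so every addition rounds back to 1.0; the exact accumulator recovers the aggregate 1e-11 contribution.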

  16. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book starts with the question of what reliability is: the origin of reliability problems, the definition of reliability, and the uses of reliability. It then deals with probability and the calculation of reliability; the reliability function and failure rate; probability distributions used in reliability; estimation of MTBF; down time, maintainability and availability; breakdown maintenance and preventive maintenance; design for reliability; reliability prediction and statistics; reliability testing; and reliability data and the design and management of reliability.

  17. An Analysis of the Relationship of Teaching Methodology and the Students' Level of Cognition with Student Achievement in Principles of Marketing.

    Science.gov (United States)

    Hallgren, Kenneth Glenn

    A study investigated the relationship of students' cognitive level of development and teaching methodology with student achievement. The sample was composed of 79 students in two sections of the introductory marketing course at the University of Northern Colorado. The control group was taught by a lecture strategy, and the experimental group by a…

  18. The reliability of a segmentation methodology for assessing intramuscular adipose tissue and other soft-tissue compartments of lower leg MRI images.

    Science.gov (United States)

    Karampatos, Sarah; Papaioannou, Alexandra; Beattie, Karen A; Maly, Monica R; Chan, Adrian; Adachi, Jonathan D; Pritchard, Janet M

    2016-04-01

    To determine the reliability of a magnetic resonance (MR) image segmentation protocol for quantifying intramuscular adipose tissue (IntraMAT), subcutaneous adipose tissue, total muscle and intermuscular adipose tissue (InterMAT) of the lower leg, ten axial lower leg MRI slices were obtained from 21 postmenopausal women using a 1 Tesla peripheral MRI system. Images were analyzed using sliceOmatic™ software, and the average cross-sectional areas of the tissues were computed over the ten slices. Intra-rater and inter-rater reliability were determined and expressed as the standard error of measurement (SEM) (absolute reliability) and the intraclass correlation coefficient (ICC) (relative reliability). Intra-rater and inter-rater ICCs for IntraMAT were 0.991 (95% confidence interval [CI] 0.978-0.996); for the other soft-tissue compartments, the ICCs were all >0.90. The protocol is therefore reliable for assessing IntraMAT and the other soft-tissue compartments of the lower leg. A standard operating procedure manual is provided to assist users, and the SEM values can be used to estimate sample size and determine confidence in repeated measurements in future research.
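The two reliability statistics this record reports are linked: the absolute-reliability SEM can be derived from the relative-reliability ICC and the between-subject standard deviation via SEM = SD·sqrt(1 − ICC). A sketch with illustrative numbers (not the study's data):

```python
# Standard error of measurement from an ICC and the between-subject SD:
# SEM = SD * sqrt(1 - ICC).  The minimal detectable change at the 95% level
# follows as MDC95 = 1.96 * sqrt(2) * SEM.
import math

def sem(sd: float, icc: float) -> float:
    """Absolute reliability (measurement error) implied by SD and ICC."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd: float, icc: float) -> float:
    """Smallest change exceeding measurement error with 95% confidence."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# Illustrative values only (not from the study): SD = 2.5 cm^2, ICC = 0.991.
print(f"SEM   = {sem(2.5, 0.991):.3f} cm^2")
print(f"MDC95 = {mdc95(2.5, 0.991):.3f} cm^2")
```

This is how SEM values such as those the record mentions can be turned into a concrete threshold for judging whether a repeated measurement reflects real change.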

  19. Cosmological principles. II. Physical principles

    International Nuclear Information System (INIS)

    Harrison, E.R.

    1974-01-01

    The discussion of cosmological principles covers the uniformity principle of the laws of physics, the gravitation and cognizability principles, and the Dirac creation, chaos, and bootstrap principles. (U.S.)

  20. Basic principles of STT-MRAM cell operation in memory arrays

    International Nuclear Information System (INIS)

    Khvalkovskiy, A V; Apalkov, D; Watts, S; Chepulskii, R; Beach, R S; Ong, A; Tang, X; Driskill-Smith, A; Lottis, D; Chen, E; Nikitin, V; Krounbi, M; Butler, W H; Visscher, P B

    2013-01-01

    For reliable operation, individual cells of an STT-MRAM memory array must meet specific requirements on their performance. In this work we review some of these requirements and discuss the fundamental physical principles of STT-MRAM operation, covering the range from device level to chip array performance, and methodology for its development. (paper)

  1. Methodology of comprehensive evaluation of the effectiveness and reliability of production lines of preparation of sea water for the cultivation of aquatic organisms

    Directory of Open Access Journals (Sweden)

    S. D. Ugryumova

    2016-01-01

    Full Text Available The paper considers the factors affecting the efficiency and reliability of technical systems and sets out stages of development and modernization of production lines that correspond to specific stages of evaluating effectiveness and reliability. Several methods for determining indicators of the efficiency and reliability of equipment in technological lines of the fisheries sector are considered: forecasting methods, structural methods, physical methods, the logical-probability method (the method of I.A. Ryabinin) and the topological method. Their advantages and disadvantages are discussed, allowing the most suitable method to be selected for serially connected process lines preparing sea water for the cultivation of aquatic organisms. The modernized technological line for preparing sea water for the cultivation of aquatic organisms differs from the typical seawater line in hatcheries (Far East) in its large amount of instrumentation: salinity and temperature sensors; turbidity meters that continuously monitor turbidity in the range of 50÷100 EMF (30÷60 mg/l by kaolin); flow sensors signalling the volume level of the filtrate and the backfill layer; analyzers of the chemical composition of sea water; analyzers of suspended mechanical impurities; signalling sensors of acidity and oxygen content; and replaceable coarse and fine filters and auxiliary equipment. A program of comprehensive evaluation of effectiveness and reliability revealed that the modernization of the production line for preparing sea water for the cultivation of aquatic organisms improved its efficiency by an average of 1.71%, reduced the amount of manual labour by 15.1%, made the process controllable, provided the most rapid and efficient purification of sea water, and reduced the cost of replacement filter media.

  2. Brain GABA Detection in vivo with the J-editing 1H MRS Technique: A Comprehensive Methodological Evaluation of Sensitivity Enhancement, Macromolecule Contamination and Test-Retest Reliability

    Science.gov (United States)

    Shungu, Dikoma C.; Mao, Xiangling; Gonzales, Robyn; Soones, Tacara N.; Dyke, Jonathan P.; van der Veen, Jan Willem; Kegeles, Lawrence S.

    2016-01-01

    Abnormalities in brain γ-aminobutyric acid (GABA) have been implicated in various neuropsychiatric and neurological disorders. However, in vivo GABA detection by proton magnetic resonance spectroscopy (1H MRS) presents significant challenges arising from low brain concentration, overlap by much stronger resonances, and contamination by mobile macromolecule (MM) signals. This study addresses these impediments to reliable brain GABA detection with the J-editing difference technique on a 3T MR system in healthy human subjects by (a) assessing the sensitivity gains attainable with an 8-channel phased-array head coil, (b) determining the magnitude and anatomic variation of the contamination of GABA by MM, and (c) estimating the test-retest reliability of measuring GABA with this method. Sensitivity gains and test-retest reliability were examined in the dorsolateral prefrontal cortex (DLPFC), while MM levels were compared across three cortical regions: the DLPFC, the medial prefrontal cortex (MPFC) and the occipital cortex (OCC). A 3-fold higher GABA detection sensitivity was attained with the 8-channel head coil compared to the standard single-channel head coil in DLPFC. Despite significant anatomic variation in GABA+MM and MM across the three brain regions, the MM contribution to GABA+MM was relatively stable across the three voxels, ranging from 41% to 49%, a non-significant regional variation (p = 0.58). The test-retest reliability of GABA measurement, expressed either as ratios to voxel tissue water (W) or total creatine, was found to be very high for both the single-channel coil and the 8-channel phased-array coil. For the 8-channel coil, for example, Pearson's correlation coefficient of test vs. retest for GABA/W was 0.98 (R2 = 0.96, p = 0.0007), the percent coefficient of variation (CV) was 1.25%, and the intraclass correlation coefficient (ICC) was 0.98. Similar reliability was also found for the co-edited resonance of combined glutamate and glutamine (Glx) for both coils. PMID
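The percent coefficient of variation this record quotes for test-retest reliability can be computed per subject and averaged. A minimal sketch with made-up GABA/W-like values, not the study's measurements:

```python
# Percent coefficient of variation (CV) for test-retest pairs: per subject,
# CV = SD(test, retest) / mean(test, retest); report the average across
# subjects, expressed as a percentage.
import statistics

def percent_cv(pairs):
    """pairs: (test, retest) measurement tuples, one per subject."""
    cvs = []
    for test, retest in pairs:
        m = statistics.mean([test, retest])
        sd = statistics.stdev([test, retest])
        cvs.append(sd / m)
    return 100.0 * statistics.mean(cvs)

# Illustrative GABA/W-like test-retest values (hypothetical data).
pairs = [(1.00, 1.02), (0.95, 0.96), (1.10, 1.08)]
print(f"CV = {percent_cv(pairs):.2f}%")
```

A low average CV, like the 1.25% reported in the record, indicates that repeated scans of the same subject agree to within a small fraction of the measured value.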

  3. Principles of Bioremediation Assessment

    Science.gov (United States)

    Madsen, E. L.

    2001-12-01

    Although microorganisms have successfully and spontaneously maintained the biosphere since its inception, industrialized societies now produce undesirable chemical compounds at rates that outpace naturally occurring microbial detoxification processes. This presentation provides an overview of both the complexities of contaminated sites and methodological limitations in environmental microbiology that impede the documentation of biodegradation processes in the field. An essential step toward attaining reliable bioremediation technologies is the development of criteria which prove that microorganisms in contaminated field sites are truly active in metabolizing contaminants of interest. These criteria, which rely upon genetic, biochemical, physiological, and ecological principles and apply to both in situ and ex situ bioremediation strategies include: (i) internal conservative tracers; (ii) added conservative tracers; (iii) added radioactive tracers; (iv) added isotopic tracers; (v) stable isotopic fractionation patterns; (vi) detection of intermediary metabolites; (vii) replicated field plots; (viii) microbial metabolic adaptation; (ix) molecular biological indicators; (x) gradients of coreactants and/or products; (xi) in situ rates of respiration; (xii) mass balances of contaminants, coreactants, and products; and (xiii) computer modeling that incorporates transport and reactive stoichiometries of electron donors and acceptors. The ideal goal is achieving a quantitative understanding of the geochemistry, hydrogeology, and physiology of complex real-world systems.

  4. Measuring the Performance of Attention Networks with the Dalhousie Computerized Attention Battery (DalCAB): Methodology and Reliability in Healthy Adults.

    Science.gov (United States)

    Jones, Stephanie A H; Butler, Beverly C; Kintzel, Franziska; Johnson, Anne; Klein, Raymond M; Eskes, Gail A

    2016-01-01

    Attention is an important, multifaceted cognitive domain that has been linked to three distinct, yet interacting, networks: alerting, orienting, and executive control. The measurement of attention and deficits of attention within these networks is critical to the assessment of many neurological and psychiatric conditions in both research and clinical settings. The Dalhousie Computerized Attention Battery (DalCAB) was created to assess attentional functions related to the three attention networks using a range of tasks including: simple reaction time, go/no-go, choice reaction time, dual task, flanker, item and location working memory, and visual search. The current study provides preliminary normative data, test-retest reliability (intraclass correlations) and practice effects in DalCAB performance 24-h after baseline for healthy young adults (n = 96, 18-31 years). Performance on the DalCAB tasks demonstrated Good to Very Good test-retest reliability for mean reaction time, while accuracy and difference measures (e.g., switch costs, interference effects, and working memory load effects) were most reliable for tasks that require more extensive cognitive processing (e.g., choice reaction time, flanker, dual task, and conjunction search). Practice effects were common and pronounced at the 24-h interval. In addition, performance related to specific within-task parameters of the DalCAB sub-tests provides preliminary support for future formal assessment of the convergent validity of our interpretation of the DalCAB as a potential clinical and research assessment tool for measuring aspects of attention related to the alerting, orienting, and executive control networks.
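The intraclass correlations this record reports come from a standard two-way ANOVA decomposition. Below is a minimal pure-Python sketch of ICC(2,1), the two-way random-effects, absolute-agreement, single-measurement form in the Shrout–Fleiss convention, run on toy test-retest data rather than the DalCAB data:

```python
# ICC(2,1): two-way random effects, absolute agreement, single measurement,
# computed from the ANOVA mean squares (Shrout & Fleiss convention).

def icc_2_1(data):
    """data: one row per subject, one column per session/rater."""
    n, k = len(data), len(data[0])
    grand = sum(x for row in data for x in row) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Toy test-retest data: retest is the test score shifted by a constant offset,
# so absolute agreement is penalized even though the ordering is identical.
scores = [[10, 11], [12, 13], [14, 15]]
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")
```

Because ICC(2,1) measures absolute agreement, the constant practice-effect offset in the toy data lowers the coefficient below 1.0; a consistency-type ICC would not penalize it.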

  5. Measuring the performance of attention networks with the Dalhousie Computerized Attention Battery (DalCAB: Methodology and reliability in healthy adults

    Directory of Open Access Journals (Sweden)

    Stephanie Anne Holland Jones

    2016-06-01

    Full Text Available Attention is an important, multifaceted cognitive domain that has been linked to three distinct, yet interacting, networks: alerting, orienting, and executive control. The measurement of attention and deficits of attention within these networks is critical to the assessment of many neurological and psychiatric conditions in both research and clinical settings. The Dalhousie Computerized Attention Battery (DalCAB) was created to assess attentional functions related to the three attention networks using a range of tasks including: simple reaction time, go/no-go, choice reaction time, dual task, flanker, item and location working memory and visual search. The current study provides preliminary normative data, test-retest reliability (intraclass correlations) and practice effects in DalCAB performance 24 hours after baseline for healthy young adults (n = 96, 18-31 years). Performance on the DalCAB tasks demonstrated Good to Excellent test-retest reliability for mean reaction time, while accuracy and difference measures (e.g., switch costs, interference effects and working memory load effects) were most reliable for tasks that require more extensive cognitive processing (e.g., choice reaction time, flanker, dual task, and conjunction search). Practice effects were common and pronounced at the 24-hour interval. In addition, performance related to specific within-task parameters of the DalCAB sub-tests provides preliminary support for future formal assessment of the convergent validity of our interpretation of the DalCAB as a potential clinical and research assessment tool for measuring aspects of attention related to the alerting, orienting and executive control networks. Keywords: computerized assessment; attention; orienting; alerting; executive function

  6. Safety and reliability of pressure components with special emphasis on the contribution of component and large specimen testing to structural integrity assessment methodology. Vol. 1 and 2

    International Nuclear Information System (INIS)

    1987-01-01

    The 51 papers of the 13th MPA-seminar contribute to structural integrity assessment methodology, with special emphasis on component and large-specimen testing. Eight of the papers deal with fracture mechanics, 6 papers with dynamic loading, 13 papers with nondestructive testing, 2 papers with radiation embrittlement, 5 papers with pipe failure, 4 papers with components, 2 papers with thermal shock loading, 5 papers with high-temperature behaviour, 4 papers with the integrity of vessels and 3 papers with the integrity of welded joints. In particular, the fracture behaviour of steel materials is verified. All papers are separately indexed and analysed for the database. (DG) [de

  7. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  8. Principles and methodology for translation and cross-cultural adaptation of the Nordic Occupational Skin Questionnaire (NOSQ-2002) to Spanish and Catalan.

    Science.gov (United States)

    Sala-Sastre, Nohemi; Herdman, Mike; Navarro, Lidia; de la Prada, Miriam; Pujol, Ramón M; Serra, Consol; Alonso, Jordi; Flyvholm, Mari-Ann; Giménez-Arnau, Ana M

    2009-08-01

    Occupational skin diseases are among the most frequent work-related diseases in industrialized countries. The Nordic Occupational Skin Questionnaire (NOSQ-2002), developed in English, is a useful tool for screening of occupational skin diseases. The aims were to culturally adapt the NOSQ-2002 to Spanish and Catalan and to assess the clarity, comprehension, cultural relevance and appropriateness of the translated versions. The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) principles of good practice for the translation and cultural adaptation of patient-reported outcomes were followed. After translation into the target language, a first consensus version of the questionnaire was evaluated in multiple cognitive debriefing interviews. The expert panel introduced modifications in 39 (68%) and 27 (47%) items in the Spanish and Catalan versions, respectively (e.g. addition of examples and definitions, reformulation of instructions and use of a direct question format). This version was back translated and submitted to the original authors, who suggested a further seven and two modifications in the Spanish and Catalan versions, respectively. A second set of cognitive interviews was performed. A consensus version of both questionnaires was obtained after final modifications based on comments by the patients. The final versions of the Spanish and Catalan NOSQ-2002 questionnaires are now available at www.NRCWE.dk/NOSQ.

  9. Methodological principles for the evaluation of impact of the variability and the climatic change in the human health, a statistical focus

    International Nuclear Information System (INIS)

    Ortiz Bulto, Paulo Lazaro; Vladimir Guevara, Antonio; Ulloa, Jacqueline; Aparicio, Marilyn

    2001-01-01

    Signal detection of climate variability or change and the evaluation of its specific effects, requires an understanding of the variations in the observed data, which describe the natural climate variability and change signals. It is also necessary to understand the complex interactions that make up the climate system. In the present work, an unusual methodological approach is taken to evaluate the effects and impacts of climate variability and change on the behaviour of different diseases, on the basis of practical experience of its application in four countries of the Caribbean, Central and South America: Cuba, Panama, Bolivia and Paraguay. For the determination of the climate signal change multivariate analysis techniques (empirical orthogonal functions) were used, combined with robust methods of time series decomposition (decomposition by median). They allowed us to describe the changes observed in the seasonal patterns of climate and epidemiological diseases for the period 1991-1999, with respect to the period 1961-1990. These results were used to build an autoregressive model with non-constant variance, with a climate index based on the signals obtained from the decompositions, which enters the model as an exogenous variable in order to make projections of the diseases

  10. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
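The redundancy-versus-repairability tradeoff this record examines can be illustrated with a simple Poisson spares model, a sketch of the general idea rather than the paper's actual analytical model: a unit failing at a constant rate survives the mission as long as the number of failures does not exceed the spares carried.

```python
# Mission success probability when a unit fails at rate lam (failures/hour)
# and s spare replacements are carried: under a Poisson failure model, the
# mission succeeds if the failure count over the mission is at most s.
import math

def mission_success(lam: float, hours: float, spares: int) -> float:
    mu = lam * hours  # expected number of failures over the mission
    return sum(math.exp(-mu) * mu ** j / math.factorial(j)
               for j in range(spares + 1))

# A cheap, lower-reliability unit becomes mission-worthy once spares are added.
print(f"no spares:  {mission_success(1e-3, 1000, 0):.3f}")
print(f"two spares: {mission_success(1e-3, 1000, 2):.3f}")
```

With an expected one failure per mission, carrying two spares raises the success probability from about 0.37 to about 0.92, which mirrors the record's conclusion that cheaper, lower-reliability components plus spares can be cost-effective.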

  11. Bernoulli's Principle

    Science.gov (United States)

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  12. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  13. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering: the definition and importance of reliability; the development of reliability engineering; failure rate and the failure probability density function and their types; CFR and the exponential distribution; IFR, the normal distribution and the Weibull distribution; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.
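The distribution families such a book catalogues are tied together by the reliability function. A minimal sketch of the Weibull case, with illustrative parameter values, showing how the shape parameter distinguishes the CFR and IFR regimes mentioned above:

```python
# Weibull reliability: R(t) = exp(-(t/eta)^beta).  Shape beta < 1 models
# infant mortality (DFR), beta = 1 reduces to the CFR/exponential case,
# and beta > 1 models wear-out (IFR).  eta is the characteristic life.
import math

def weibull_reliability(t: float, eta: float, beta: float) -> float:
    return math.exp(-((t / eta) ** beta))

# At t = eta the survival probability is e^-1 (~36.8%) for every shape beta.
for beta in (0.5, 1.0, 3.0):
    print(f"beta={beta}: R(eta) = {weibull_reliability(2000, 2000, beta):.3f}")
```

Setting beta = 1 recovers R(t) = exp(-t/eta), i.e. the constant-failure-rate exponential model, which is why the Weibull family subsumes the CFR case discussed in the book.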

  14. Modern electronic maintenance principles

    CERN Document Server

    Garland, DJ

    2013-01-01

    Modern Electronic Maintenance Principles reviews the principles of maintaining modern, complex electronic equipment, with emphasis on preventive and corrective maintenance. Unfamiliar subjects such as the half-split method of fault location, functional diagrams, and fault finding guides are explained. This book consists of 12 chapters and begins by stressing the need for maintenance principles and discussing the problem of complexity as well as the requirements for a maintenance technician. The next chapter deals with the connection between reliability and maintenance and defines the terms fai

  15. Developing principles of growth

    DEFF Research Database (Denmark)

    Neergaard, Helle; Fleck, Emma

    of the principles of growth among women-owned firms. Using an in-depth case study methodology, data were collected from women-owned firms in Denmark and Ireland, as these countries are similar in contextual terms, e.g. population and business composition, dominated by micro, small and medium-sized enterprises. Extending the principles put forward in effectuation theory, we propose that women grow their firms according to five principles which enable women’s enterprises to survive in the face of crises such as the current financial world crisis.

  16. Uma metodologia bayesiana para estudos de confiabilidade na fase de projeto: aplicação em um produto eletrônico A bayesian methodology for reliability studies in the design phase: application to an electronic product

    Directory of Open Access Journals (Sweden)

    Ruth Myriam Ramírez Pongo

    1997-12-01

    This is particularly true when the product technology limits the acceleration factor, as with electronic products, for example. The methodology proposed in this paper combines test results, which are routinely performed during the product development cycle, with additional relevant information that is useful in the assessment of its reliability. In order to illustrate the methodology, it was applied to an electronic equipment, assessing its reliability during the design phase. The computations were performed considering component reliabilities, attribute test data, and also judgement of the product development team.
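
The combination of prior judgement with routine attribute (pass/fail) test data that the abstract describes can be illustrated with a conjugate Beta-binomial update; the prior and test counts below are hypothetical, not the paper's:

```python
# Conjugate Beta-binomial update: a Beta(a, b) prior on component reliability
# (encoding the development team's judgement) combined with attribute test data.
def beta_posterior(a: float, b: float, successes: int, failures: int):
    return a + successes, b + failures

def beta_mean(a: float, b: float) -> float:
    return a / (a + b)

a0, b0 = 9.0, 1.0                                   # prior mean 0.90
a1, b1 = beta_posterior(a0, b0, successes=48, failures=2)
print(beta_mean(a0, b0), "->", beta_mean(a1, b1))   # 0.9 -> 0.95
```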

  17. State of the art of probabilistic safety analysis (PSA) in the FRG, and principles of a PSA-guideline

    International Nuclear Information System (INIS)

    Balfanz, H.P.

    1987-01-01

    Contents of the articles: Survey of PSA performed during licensing procedures of an NPP; German Nuclear Standards' requirements on the reliability of safety systems; PSA-guideline for NPP: Principles and suggestions; Motivation and tasks of PSA; Aspects of the methodology of safety analyses; Structure of event tree and fault tree analyses; Extent of safety analyses; Performance and limits of PSA. (orig./HSCH)

  18. Integrating the Carbon and Water Footprints’ Costs in the Water Framework Directive 2000/60/EC Full Water Cost Recovery Concept: Basic Principles Towards Their Reliable Calculation and Socially Just Allocation

    Directory of Open Access Journals (Sweden)

    Anastasia Papadopoulou

    2012-01-01

    Full Text Available This paper presents the basic principles for the integration of the water and carbon footprints cost into the resource and environmental costs respectively, taking the suggestions set by the Water Framework Directive (WFD 2000/60/EC one step forward. WFD states that full water cost recovery (FWCR should be based on the estimation of the three sub-costs related: direct; environmental; and resource cost. It also strongly suggests the EU Member States develop and apply effective water pricing policies to achieve FWCR. These policies must be socially just to avoid any social injustice phenomena. This is a very delicate task to handle, especially within the fragile economic conditions that the EU is facing today. Water losses play a crucial role for the FWC estimation. Water losses should not be neglected since they are one of the major “water uses” in any water supply network. A methodology is suggested to reduce water losses and the related Non Revenue Water (NRW index. An Expert Decision Support System is proposed to assess the FWC incorporating the Water and Carbon Footprint costs.

  19. El análisis de criticidad, una metodología para mejorar la confiabilidad operacional // Criticality analysis , a methodology to improve the operational reliability.

    Directory of Open Access Journals (Sweden)

    R. Huerta Mendoza

    2000-10-01

    establish priorities and focus the effort that guarantees success, maximizing profitability. Keywords: reliability, criticality, safety, environment, risk, availability, improvement. ___________________________________________________________________ Abstract: Criticality analysis is a methodology that makes it possible to establish a hierarchy or set of priorities among processes, systems and equipment, creating a structure that facilitates effective and correct decision-making by directing effort and resources to the areas where it is most important or necessary to improve operational dependability, based on current reality. Improving the operational dependability of any installation, or of its systems and components, is associated with four fundamental aspects: human dependability, dependability of the process, dependability of the design and dependability of the maintenance. Regrettably, we rarely have limitless resources, economic or human, with which to improve all four aspects at the same time in every area of a company. The criteria used to carry out a criticality analysis are mainly associated with safety, environment, production, operating and maintenance costs, failure rate and repair time. These criteria are related through a mathematical equation that generates a score for each evaluated element. The resulting list, the product of team work, allows criteria to be levelled and harmonized in order to establish priorities, and focuses the effort that guarantees success, maximizing profitability. Key words: PDVSA, dependability, criticality, security, surroundings, risk, readiness, improvement, changes.
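
The scoring step can be sketched as follows. The article's exact equation is not reproduced here; a common form is score = failure frequency × consequence, with the consequence built from weighted safety, environment, production and cost impacts (all weights and ratings below are hypothetical):

```python
# Hypothetical criticality ranking: score = frequency x weighted consequence.
def criticality(freq, safety, environment, production, maint_cost):
    consequence = 5 * safety + 3 * environment + 2 * production + maint_cost
    return freq * consequence

equipment = {
    "pump_A":       criticality(freq=4, safety=2, environment=1, production=3, maint_cost=2),
    "valve_B":      criticality(freq=2, safety=0, environment=0, production=1, maint_cost=1),
    "compressor_C": criticality(freq=3, safety=3, environment=2, production=3, maint_cost=3),
}
ranking = sorted(equipment, key=equipment.get, reverse=True)
print(ranking)  # most critical first
```

The ordered list is then used, as the abstract says, to direct maintenance effort and resources to the highest-scoring items first.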

  20. Contextual factors, methodological principles and teacher cognition

    OpenAIRE

    Walsh, Rupert; Wyatt, Mark

    2014-01-01

    Teachers in various contexts worldwide are sometimes unfairly criticized for not putting teaching methods developed for the well-resourced classrooms of Western countries into practice. Factors such as the teachers’ “misconceptualizations” of “imported” methods, including Communicative Language Teaching (CLT), are often blamed, though the challenges imposed by “contextual demands,” such as large class sizes, are sometimes recognised. Meanwhile, there is sometimes an assumption that in the Wes...

  1. Robust design principles for reducing variation in functional performance

    DEFF Research Database (Denmark)

    Christensen, Martin Ebro; Howard, Thomas J.

    2016-01-01

    This paper identifies, describes and classifies a comprehensive collection of variation reduction principles (VRP) that can be used to increase the robustness of a product and reduce its variation in functional performance. Performance variation has a negative effect on the reliability and perceived quality of a product, and efforts should be made to minimise it. The design principles are identified by a systematic decomposition of the Taguchi Transfer Function in combination with the use of existing literature and the authors’ experience. The paper presents 15 principles and describes their advantages and disadvantages along with example cases. Subsequently, the principles are classified based on their applicability in the various development and production stages. The VRP are to be added to existing robust design methodologies, helping the designer to think beyond robust design tools and methods…
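
One way to see why reducing sensitivity is a variation reduction principle is first-order variance transmission through a transfer function y = f(x), where σ_y ≈ |df/dx| · σ_x at the design point (an illustrative sketch, not taken from the paper):

```python
# First-order variance transmission: sigma_y ≈ |df/dx| * sigma_x.
# Lowering the local sensitivity |df/dx| at the design point reduces the
# performance variation transmitted from the same input tolerance.
def transmitted_std(dfdx: float, std_x: float) -> float:
    return abs(dfdx) * std_x

# Same input tolerance, two candidate design points with different sensitivity:
print(transmitted_std(dfdx=4.0, std_x=0.05))  # 0.2
print(transmitted_std(dfdx=1.0, std_x=0.05))  # 0.05
```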

  2. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
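
The flavor of such a probabilistic computation can be shown with a crude Monte Carlo estimate of a failure probability P(g(X) < 0) for a toy limit state; the "heat exchanger" margin below (normal capacity minus normal demand) is an assumed example, not NESSUS itself:

```python
# Crude Monte Carlo estimate of P(g(X) < 0) for an assumed limit state
# g = capacity - demand, with both quantities normally distributed.
import random

def failure_probability(n: int = 100_000, seed: int = 42) -> float:
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        capacity = rng.gauss(10.0, 1.0)   # e.g. allowable temperature rise
        demand = rng.gauss(7.0, 1.0)      # e.g. computed temperature rise
        if capacity - demand < 0.0:       # limit state g = C - D
            failures += 1
    return failures / n

pf = failure_probability()
print(pf)  # analytic value is Phi(-3/sqrt(2)) ≈ 0.017
```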

  3. Reliability models for Space Station power system

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kim, Y.; Wagner, H.

    1987-01-01

    This paper presents a methodology for the reliability evaluation of the Space Station power system. The two options considered are the photovoltaic system and the solar dynamic system. Reliability models for both of these options are described, along with the methodology for calculating the reliability indices.

  4. Development of reliable pavement models.

    Science.gov (United States)

    2011-05-01

    The current report proposes a framework for estimating the reliability of a given pavement structure as analyzed by : the Mechanistic-Empirical Pavement Design Guide (MEPDG). The methodology proposes using a previously fit : response surface, in plac...

  5. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  6. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings; models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  7. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very…

  8. Estimation of Bridge Reliability Distributions

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    In this paper it is shown how the so-called reliability distributions can be estimated using crude Monte Carlo simulation. The main purpose is to demonstrate the methodology. Therefore, very exact data concerning reliability and deterioration are not needed. However, it is intended in the paper to …

  9. General principles of radiotherapy

    International Nuclear Information System (INIS)

    Easson, E.C.

    1985-01-01

    The daily practice of any established branch of medicine should be based on some acceptable principles. This chapter is concerned with the general principles on which the radiotherapy of the Manchester school is based. Though many radiotherapists in other centres would doubtless accept these principles, there are sufficiently wide differences in practice throughout the world to suggest that some therapists adhere to a fundamentally different philosophy. The authors believe it is important, especially for those beginning their formal training in radiotherapy, to subscribe to an internally consistent school of thought, employing methods of treatment for each type of lesion in each anatomical site that are based on accepted principles and subjected to continuous rigorous scrutiny to test their effectiveness. Not only must each therapeutic technique be evaluated, but the underlying principles too must be questioned if and when this seems indicated. It is a feature of this hospital that similar lesions are all treated by the same technique, so long as statistical evidence justifies such a policy. All members of the staff adhere to the accepted policy until or unless reliable reasons are adduced to change this policy

  10. Variational principles

    CERN Document Server

    Moiseiwitsch, B L

    2004-01-01

    This graduate-level text's primary objective is to demonstrate the expression of the equations of the various branches of mathematical physics in the succinct and elegant form of variational principles (and thereby illuminate their interrelationship). Its related intentions are to show how variational principles may be employed to determine the discrete eigenvalues for stationary state problems and to illustrate how to find the values of quantities (such as the phase shifts) that arise in the theory of scattering. Chapter-by-chapter treatment consists of analytical dynamics; optics, wave mecha

  11. The principles of radiation protection

    International Nuclear Information System (INIS)

    2004-01-01

    The aim of radiation protection is to avoid or to reduce the risks linked to ionizing radiation. In order to reduce these risks, radiation protection uses three great principles: justification, optimization and limitation of radiation doses. To apply these principles, radiation protection has regulatory and technical means adapted to three different categories of people: the public, patients and workers. The nuclear safety authority elaborates the regulation and monitors the proper application of the radiation protection system. (N.C.)

  12. Safety Principles

    Directory of Open Access Journals (Sweden)

    V. A. Grinenko

    2011-06-01

    Full Text Available The material in this article is arranged so that the reader can gain a complete picture of the concept of “safety”, its intrinsic characteristics and the possibilities for its formalization. Principles and possible safety strategies are considered. The article is intended for experts dealing with problems of safety.

  13. Maquet principle

    Energy Technology Data Exchange (ETDEWEB)

    Levine, R.B.; Stassi, J.; Karasick, D.

    1985-04-01

    Anterior displacement of the tibial tubercle is a well-accepted orthopedic procedure in the treatment of certain patellofemoral disorders. The radiologic appearance of surgical procedures utilizing the Maquet principle has not been described in the radiologic literature. Familiarity with the physiologic and biomechanical basis for the procedure and its postoperative appearance is necessary for appropriate roentgenographic evaluation and the radiographic recognition of complications.

  14. Cosmological principle

    International Nuclear Information System (INIS)

    Wesson, P.S.

    1979-01-01

    The Cosmological Principle states: the universe looks the same to all observers regardless of where they are located. To most astronomers today the Cosmological Principle means the universe looks the same to all observers because the density of the galaxies is the same in all places. A new Cosmological Principle is proposed, called the Dimensional Cosmological Principle. It uses the properties of matter in the universe: density (ρ), pressure (p), and mass (m) within some region of space of length (l). The laws of physics require incorporation of constants for gravity (G) and the speed of light (c). After combining the six parameters into dimensionless numbers, the best choices are: 8πGl²ρ/c², 8πGl²p/c⁴, and 2Gm/c²l (the Schwarzschild factor). The Dimensional Cosmological Principle came about because old ideas conflicted with the rapidly-growing body of observational evidence indicating that galaxies in the universe have a clumpy rather than uniform distribution.

  15. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in Information and Communication Technology context. In particular, in the first Section, the definition of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In ICT context, the failure rate for a given system can be

  16. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  17. Core principles of evolutionary medicine

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Abstract Background and objectives Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results Fourteen core principles led at least 80% of the panelists to agree or strongly agree that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. Conclusions and implications This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660

  18. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
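
A toy prediction with one of the well-known growth models such studies benchmark, the Goel-Okumoto NHPP, whose mean value function is m(t) = a(1 − e^(−bt)); the parameters below are assumed, not fitted to the paper's data:

```python
import math

def expected_faults(t: float, a: float, b: float) -> float:
    """Goel-Okumoto NHPP mean value function: m(t) = a * (1 - exp(-b * t))."""
    return a * (1.0 - math.exp(-b * t))

# Assumed parameters: a is the eventual fault count, b the detection rate.
a, b = 100.0, 0.05
found = expected_faults(30, a, b)
print(round(found, 1), "faults expected by t = 30;", round(a - found, 1), "remaining")
```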

  19. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
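
The importance-sampling idea suggested in the abstract can be sketched for a rare failure event: estimate p = P(Z > 4) for standard normal Z by sampling from the shifted density N(4, 1) and reweighting by the likelihood ratio (a toy example, not the report's program):

```python
# Importance sampling for a rare event: sample from the biased density
# N(threshold, 1) and reweight by w(x) = phi(x)/phi(x - threshold)
#                                      = exp(-threshold*x + threshold**2/2).
import math
import random

def importance_sampling(n: int, threshold: float, seed: int = 7) -> float:
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)          # biased sampling density
        if x > threshold:
            total += math.exp(-threshold * x + threshold ** 2 / 2.0)
    return total / n

p_is = importance_sampling(20_000, 4.0)
print(p_is)  # close to P(Z > 4) ≈ 3.17e-5
```

A crude Monte Carlo estimate of the same probability would need on the order of 10^7 samples just to see a handful of failures, which is the variance reduction the abstract refers to.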

  20. PRINCIPLES OF CONTENT FORMATION EDUCATIONAL ELECTRONIC RESOURCE

    Directory of Open Access Journals (Sweden)

    О Ю Заславская

    2017-12-01

    Full Text Available The article considers modern possibilities of information and communication technologies for the design of electronic educational resources. The conceptual basis of the open educational multimedia system rests on the modular architecture of the electronic educational resource. The content of the electronic training module can be implemented in several versions of the modules: obtaining information, practical exercises, and control. The regularities of the teaching process in modern pedagogical theory are considered, both general and specific, and the principles for the formation of the content of instruction at different levels are defined on the basis of the formulated regularities. On the basis of this analysis, the principles of the formation of the electronic educational resource are determined, taking into account the general and didactic patterns of teaching. As principles of the formation of educational material for obtaining information for the electronic educational resource, the article considers: the principle of methodological orientation, the principle of general scientific orientation, the principle of systemic nature, the principle of fundamentalization, the principle of accounting for intersubject communications, and the principle of minimization. The principles of the formation of the electronic training module of practical studies include: the principle of systematic and dose-based consistency, the principle of rational use of study time, and the principle of accessibility. The principles of the formation of the module for monitoring the electronic educational resource can be: the principle of the operationalization of goals, and the principle of unified identification diagnosis.

  1. Notes on human factors problems in process plant reliability and safety prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.; Taylor, J.R.

    1976-09-01

    The basis for plant operator reliability evaluation is described. Principles for plant design, necessary to permit reliability evaluation, are outlined. Five approaches to the plant operator reliability problem are described. Case stories, illustrating operator reliability problems, are given. (author)

  2. Development of a methodology for the application of the analysis of human reliability to individualized temporary storage facility; Desarrollo de una metodologia de aplicacion del Analisis de Fiabilidad Humana a una instalacion de Almacen Temporal Individualizado

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, P.; Dies, J.; Tapia, C.; Blas, A. de

    2014-07-01

    The paper presents a methodology developed so that HRA can be applied to an ATI (individualized temporary storage) facility without the need for experts during the modelling and quantification stages of the analysis. The developed methodology is based on ATHEANA and relies on the use of other methods for the analysis of human action and on in-depth analysis. (Author)

  3. Fundamental Principles of Alarm Design

    DEFF Research Database (Denmark)

    Us, Tolga; Jensen, Niels; Lind, Morten

    2011-01-01

    Traditionally alarms are designed on the basis of empirical guidelines rather than on a sound scientific framework rooted in a theoretical foundation for process and control system design. This paper proposes scientific principles and a methodology for design of alarms based on a functional...... be applied to any engineering system which can be modeled by MFM. The methodology provides a set of alarms which can facilitate event interpretation and operator support for abnormal situation management. The proposed design methodology provides the information content of the alarms, but does not deal...

  4. The DYLAM approach to systems safety and reliability assessment

    International Nuclear Information System (INIS)

    Amendola, A.

    1988-01-01

    A survey of the principal features and applications of DYLAM (Dynamic Logical Analytical Methodology) is presented. Its basic principles can be summarized as follows: after a particular modelling of the component states, computerized heuristic procedures generate stochastic configurations of the system, while the resulting physical processes are simultaneously simulated, both to account for the possible interactions between the physics and the states and to search for dangerous system configurations and their related probabilities. The association of probabilistic techniques for describing the states with physical equations for describing the process results in a very powerful tool for the safety and reliability assessment of systems potentially subjected to dangerous accidental transients. A comprehensive picture of DYLAM's capabilities across manifold applications can be obtained from a review of the case studies analyzed (LMFBR core accident, systems reliability assessment, accident simulation, man-machine interaction analysis, chemical reactor safety, etc.)

  5. Zymography Principles.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2017-01-01

    Zymography, the detection, identification, and even quantification of enzyme activity fractionated by gel electrophoresis, has received increasing attention in the last years, as revealed by the number of articles published. A number of enzymes are routinely detected by zymography, especially with clinical interest. This introductory chapter reviews the major principles behind zymography. New advances of this method are basically focused towards two-dimensional zymography and transfer zymography as will be explained in the rest of the chapters. Some general considerations when performing the experiments are outlined as well as the major troubleshooting and safety issues necessary for correct development of the electrophoresis.

  6. Basic principles

    International Nuclear Information System (INIS)

    Wilson, P.D.

    1996-01-01

    Some basic explanations are given of the principles underlying the nuclear fuel cycle, starting with the physics of atomic and nuclear structure and continuing with nuclear energy and reactors, fuel and waste management and finally a discussion of economics and the future. An important aspect of the fuel cycle concerns the possibility of ''closing the back end'' i.e. reprocessing the waste or unused fuel in order to re-use it in reactors of various kinds. The alternative, the ''oncethrough'' cycle, discards the discharged fuel completely. An interim measure involves the prolonged storage of highly radioactive waste fuel. (UK)

  7. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2, 'Human Reliability'. The group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the VDI joint committee on industrial systems technology (GIS). It is composed of representatives of industry, research institutes, technical control boards and universities, whose job it is to study how humans fit into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the branch of ergonomics dealing with human reliability in the use of technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  8. Reliability and Probabilistic Risk Assessment - How They Play Together

    Science.gov (United States)

    Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang

    2015-01-01

    PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC (loss of mission, loss of vehicle, and loss of crew) for launch vehicles. PRA is a system-scenario-based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict overall launch vehicle risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA began a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, to meet product reliability requirements or goals. It is a very broad design-support discipline with important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will perform its required function under stated conditions for a specified period of time.
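    The fault-tree arithmetic underlying such PRA studies can be sketched with a toy model. The gate structure and basic-event probabilities below are hypothetical, chosen purely for illustration, and are not taken from any actual NASA analysis:

```python
# Toy fault tree: the top event occurs if either of two redundant chains fails,
# and each chain fails only when both of its (independent) basic events occur.

def and_gate(probs):
    """Probability that all independent input events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Probability that at least one independent input event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical basic-event probabilities (per mission)
valve_fails, pump_fails = 1e-3, 5e-4
sensor_fails, logic_fails = 2e-4, 1e-4

top = or_gate([and_gate([valve_fails, pump_fails]),
               and_gate([sensor_fails, logic_fails])])
```

Real PRA tools add minimal cut sets, common-cause terms and uncertainty distributions on top of this basic gate algebra.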

  9. Functional principles of registry-based service discovery

    NARCIS (Netherlands)

    Sundramoorthy, V.; Tan, C.; Hartel, P.H.; Hartog, den J.I.; Scholten, J.

    2005-01-01

    As Service Discovery Protocols (SDP) are becoming increasingly important for ubiquitous computing, they must behave according to predefined principles. We present the functional Principles of Service Discovery for robust, registry-based service discovery. A methodology to guarantee adherence to these principles is also presented.

  10. Constructing Ethical Principles for Synthetic Biology

    DEFF Research Database (Denmark)

    Dige, Morten

    2010-01-01

    The ethical discussion over synbio naturally raises metaquestions or questions of methodology: Which ethical principles and values could or should function as orientation or guidelines in discussing these issues?...

  11. Agile foundations principles, practices and frameworks

    CERN Document Server

    Measey, Peter; Gray, Alex; Levy, Richard; Oliver, Les; Roberts, Barbara; Short, Michael; Wilmshurst, Darren; Wolf, Lazaro

    2015-01-01

    Agile practices transform the way organisations carry out business and respond to change. But to realise success, an Agile mindset needs to be adopted throughout an organisation. This book gives a comprehensive introduction to Agile principles and methodologies.

  12. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on the one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CER's and, where possible, CTR's in devising a suitable cost-effective policy.

  13. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines the main characteristics of these paradigms and methodologies, including the settings in which each may be used to best effect. Due to their popularity, power-flow-based MW-mile and short-run marginal cost pricing methodologies are covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.

  14. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  15. Safety and reliability assessment

    International Nuclear Information System (INIS)

    1979-01-01

    This report contains the papers delivered at the course on safety and reliability assessment held at the CSIR Conference Centre, Scientia, Pretoria. The following topics were discussed: safety standards; licensing; biological effects of radiation; what is a PWR; safety principles in the design of a nuclear reactor; radio-release analysis; quality assurance; the staffing, organisation and training for a nuclear power plant project; event trees, fault trees and probability; Automatic Protective Systems; sources of failure-rate data; interpretation of failure data; synthesis and reliability; quantification of human error in man-machine systems; dispersion of noxious substances through the atmosphere; criticality aspects of enrichment and recovery plants; and risk and hazard analysis. Extensive examples are given as well as case studies

  16. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Modern challenges to the theory and methodology of accounting are being met through the formation and implementation of new concepts intended to satisfy users' needs for both standard and unique information. The development of a methodology for sustainability accounting is a key aspect of managing an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for sustainable development accounting is offered in the article. The complex of methods and principles of sustainable development accounting, covering both systematized and non-standard provisions, is organized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  17. Principled Missing Data Treatments.

    Science.gov (United States)

    Lang, Kyle M; Little, Todd D

    2018-04-01

    We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.

  18. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    Progress report on LSB steganography methodology. In computer science, steganography is the science of concealing information within other data. The report addresses the reliable detection of LSB steganography in grayscale and color images, drawing on the paper by J. Fridrich, M. Goljan and R. Du, "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special...).
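    The LSB embedding technique the report analyzes can be illustrated with a minimal sketch. The flat list of grayscale pixel values and the bit string below are hypothetical stand-ins; real tools operate on actual image formats:

```python
# LSB steganography sketch: hide a bit string in the least significant bits
# of grayscale pixel values (0-255), changing each carrier pixel by at most 1.

def embed_lsb(pixels, bits):
    """Return a copy of pixels with bits written into the LSBs."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | int(b)
    return out

def extract_lsb(pixels, n_bits):
    """Read back the first n_bits least significant bits as a string."""
    return ''.join(str(p & 1) for p in pixels[:n_bits])

cover = [120, 121, 122, 123, 124, 125, 126, 127]   # hypothetical cover pixels
stego = embed_lsb(cover, '1011')
```

Detection methods such as the cited Fridrich-Goljan-Du approach exploit the statistical asymmetries this bit-flipping introduces.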

  19. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability-based code calibration. First, basic principles of structural reliability theory are introduced, and it is shown how the results of FORM-based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision-theoretical form, and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure, CodeCal, for the practical implementation of reliability-based code calibration of LRFD-based design codes.
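    The link between a reliability index and a failure probability, central to such code calibration, can be sketched for the simplest case: a linear limit state g = R - S with independent normal resistance R and load S. The numerical means and standard deviations below are hypothetical:

```python
import math

# For g = R - S with independent normal R and S, the reliability index is
#   beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
# and the failure probability is Pf = Phi(-beta), Phi the standard normal CDF.

def reliability_index(mu_r, sig_r, mu_s, sig_s):
    return (mu_r - mu_s) / math.sqrt(sig_r**2 + sig_s**2)

def failure_probability(beta):
    # standard normal CDF evaluated at -beta, via the complementary error function
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

beta = reliability_index(mu_r=350.0, sig_r=35.0, mu_s=200.0, sig_s=40.0)
pf = failure_probability(beta)
```

For nonlinear limit states, FORM iterates to the design point, but the beta-to-Pf conversion is the same.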

  20. Gesture & Principle

    DEFF Research Database (Denmark)

    Hvejsel, Marie Frier

    2018-01-01

    Installations and equipment threaten to undermine the primary spatial purpose and quality of architecture as a sensuous enrichment of everyday life. This calls for continuous critical positioning within the field, as well as a systematic method for acquiring knowledge about an architectural problem, whether as a student, professional, or architectural researcher. It is the hypothesis of this paper that, in its initial questioning of the task of the Greek tekton (as a masterbuilder) capable of bringing together aesthetics and technique in a given context, tectonic theory has unique potential in this matter. This potential is investigated through a rereading of the development of tectonic theory in architecture, carried out in relation to the present conditions and methodological challenges facing the discipline. As a result, this paper outlines a direction for the repositioning, development, and application...

  1. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas

  2. Robust Reliability or reliable robustness? - Integrated consideration of robustness and reliability aspects

    DEFF Research Database (Denmark)

    Kemmler, S.; Eifler, Tobias; Bertsche, B.

    2015-01-01

    The more reliable products are, the more robust they are, and vice versa. For a comprehensive understanding, and to exploit existing synergies between both domains, this paper discusses the basic principles of Reliability and Robust Design theory. The development of a comprehensive model will enable an integrated consideration of both domains

  3. Aerospace reliability applied to biomedicine.

    Science.gov (United States)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  4. Contact spectroscopy of high-temperature superconductors (Review). I - Physical and methodological principles of the contact spectroscopy of high-temperature superconductors. Experimental results for La(2-x)Sr(x)CuO4 and their discussion

    Science.gov (United States)

    Ianson, I. K.

    1991-03-01

    Research in the field of high-temperature superconductors based on methods of tunneling and microcontact spectroscopy is reviewed in a systematic manner. The theoretical principles of the methods are presented, and various types of contacts are described and classified. Attention is given to deviations of the measured volt-ampere characteristics from those predicted by simple theoretical models and from those observed for conventional superconductors. Results of measurements of the energy gap and of the fine structure of the derivatives of the volt-ampere characteristics are presented for La(2-x)Sr(x)CuO4.

  5. Proposed Reliability/Cost Model

    Science.gov (United States)

    Delionback, L. M.

    1982-01-01

    New technique estimates cost of improvement in reliability for complex system. Model format/approach is dependent upon use of subsystem cost-estimating relationships (CER's) in devising cost-effective policy. Proposed methodology should have application in broad range of engineering management decisions.

  6. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques are proposed for finding the reliability corresponding to alternative subsets of control and sensing strategies, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed-torque control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  7. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  8. Environmental Zoning: Some methodological implications

    NARCIS (Netherlands)

    Ike, Paul; Voogd, Henk

    1991-01-01

    The purpose of this article is to discuss some methodological problems of environmental zoning. The principle of environmental zoning will be elaborated. In addition an overview is given of a number of approaches that have been followed in practice to arrive at an integral judgement. Finally some

  9. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, the need for it, and its place in the system life cycle; reliability and failure rate, including reliability characteristics, chance failures, time-varying failure rates, failure modes, and replacement; reliability in engineering design; reliability testing under failure-rate assumptions; plotting of reliability data; prediction of system reliability; system maintenance; and failure topics such as relay failure and system safety analysis.
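    The constant-failure-rate ("chance failure") model covered in such introductions reduces to a one-line formula, R(t) = exp(-λt) with MTBF = 1/λ. A minimal sketch with a hypothetical failure rate:

```python
import math

# Constant failure rate model from the flat region of the bathtub curve:
# survival probability R(t) = exp(-lambda * t), mean time between failures 1/lambda.

def reliability(fail_rate, t):
    """Probability of surviving to time t with constant failure rate."""
    return math.exp(-fail_rate * t)

lam = 1e-4                              # hypothetical: failures per hour
mtbf = 1.0 / lam                        # 10,000 hours
r_mission = reliability(lam, 1000.0)    # survival over a 1000-hour mission
```

A useful sanity check: at t = MTBF the reliability is exp(-1), roughly 0.37, not 0.5, which is a common point of confusion for this model.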

  10. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  11. Advertisement without Ethical Principles?

    Directory of Open Access Journals (Sweden)

    Wojciech Słomski

    2007-10-01

    The article replies to the question of whether advertisement can exist without ethical principles, or whether ethics should be the basis of advertising. The ethical assessment of an advertisement depends not only on the content and form of the advertising message but also on the recipient's consciousness. Advertising appeals to the emotions more than to the intellect, thus restricting the scope for conscious choice based on rational premises, and in this sense it is morally bad. It is not that moral evil is immanent in advertising as such; rather, it lies in the mechanisms that make advertising effective. The only admissible form of advertisement would be reliable, full information about the advantages and flaws of the concrete advertised product. The most serious difficulty connected with the ethical assessment of advertising is that it is an indispensable link in the present economy: everyone who accepts the free market and perceives the positives of economic growth should also accept advertising. Advertisement is an element of economic activity, and consequently responsibility for its far-reaching results lies first of all with enterprises.

  12. The gauge principle vs. the equivalence principle

    International Nuclear Information System (INIS)

    Gates, S.J. Jr.

    1984-01-01

    Within the context of field theory, it is argued that the role of the equivalence principle may be replaced by the principle of gauge invariance to provide a logical framework for theories of gravitation

  13. Equivalence principles and electromagnetism

    Science.gov (United States)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  14. Metrological Reliability of Medical Devices

    Science.gov (United States)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  15. Reliability-based condition assessment of steel containment and liners

    International Nuclear Information System (INIS)

    Ellingwood, B.; Bhattacharya, B.; Zheng, R.

    1996-11-01

    Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs
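    The Markov model mentioned in the final sentence can be sketched as follows. The three damage states and the one-year transition probabilities below are hypothetical, chosen purely to illustrate how a state distribution is propagated over the service life:

```python
# Markov sketch of time-dependent damage: Good -> Degraded -> Failed,
# with hypothetical one-year transition probabilities; Failed is absorbing.

P = [                     # row i -> column j transition probabilities
    [0.95, 0.04, 0.01],   # Good
    [0.00, 0.90, 0.10],   # Degraded
    [0.00, 0.00, 1.00],   # Failed
]

def step(dist, P):
    """One transition of the state distribution: dist' = dist @ P."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0]    # component starts in the Good state
for _ in range(10):       # propagate over a 10-year service period
    dist = step(dist, P)

p_failed_10yr = dist[2]   # probability of having reached the Failed state
```

In a condition-assessment setting, inspection results would update `dist` before further propagation; here only the forward prediction is shown.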

  16. On the complex analysis of the reliability, safety, and economic efficiency of atomic electric power stations

    International Nuclear Information System (INIS)

    Emel'yanov, I.Ya.; Klemin, A.I.; Polyakov, E.F.

    1977-01-01

    The problem is posed of effectively increasing the engineering performance of nuclear electric power stations (APS). The principal components of the engineering performance of modern large APS are considered: economic efficiency, radiation safety, reliability, and their interrelationship. A nomenclature is proposed for the quantitative indices that most completely characterize the enumerated properties and are convenient for the analysis of engineering performance. The urgent problem of developing a methodology for the complex analysis and optimization of the principal performance components is considered; this methodology is designed to increase the efficiency of work on high-performance, competitive APS. The principle of complex optimization of the reliability, safety, and economic-efficiency indices is formulated, and specific recommendations are made for its practical realization. The structure of the complex quantitative analysis of the enumerated performance components is given. The urgency and promise of the complex approach to solving the problem of APS optimization are demonstrated, i.e., the creation of optimally reliable, acceptably safe, and maximally economically efficient stations

  17. The analysis of pricing principles at domestic industrial enterprises

    OpenAIRE

    I.M. Rjabchenko; V.V. Bozhkova

    2013-01-01

    Theoretical and methodological aspects of marketing pricing formation are investigated in the article. The aim of this research is the systematization of marketing pricing principles and the formation of corresponding proposals concerning the improvement of the pricing policy of domestic industrial enterprises. The results of the analysis: the authors note that pricing principles are an important element of pricing methodology which form basic posit...

  18. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  19. Overview of system reliability analyses for PSA

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2012-01-01

    Overall explanations are given of many matters relating to system reliability analysis. Systems engineering, operations research, industrial engineering, and quality control are briefly explained. Many system reliability analysis methods, including advanced ones, are introduced, with discussions of FMEA, reliability block diagrams, Markov models, Petri nets, Bayesian networks, goal tree-success tree, dynamic flowgraph methodology, the cell-to-cell mapping technique, GO-FLOW, and others. (author)

  20. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  1. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  2. Power Industry Reliability Coordination in Asia in a Market Environment

    OpenAIRE

    Hammons, Thomas J.; Voropai, Nikolai I.

    2010-01-01

    This paper addresses the problems of power supply reliability in a market environment. The specific features of economic interrelations between the power supply organization and consumers in terms of reliability assurance are examined and the principles of providing power supply reliability are formulated. The economic mechanisms of coordinating the interests of power supply organization and consumers to provide power supply reliability are discussed. Reliability of restructuring China's powe...

  3. A human reliability analysis of the Three Mile Island power plant accident considering the THERP and ATHEANA methodologies; Uma analise da confiabilidade humana do acidente na Usina de Three Mile Island II considerando as metodologias Therp e Atheana

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Renato Alves da

    2004-03-15

    The main purpose of this work is the study of human reliability using the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Event Analysis) methods, drawing on tables and case studies presented in the THERP Handbook to develop a qualitative and quantitative study of a nuclear power plant accident. This accident occurred at the TMI (Three Mile Island Unit 2) power plant, a PWR-type plant, on March 28th, 1979. The accident analysis revealed a series of incorrect actions, which resulted in the Unit 2 shutdown and the permanent loss of the reactor. This study also aims at enhancing the understanding of the THERP and ATHEANA methods and of their practical applications. In addition, it makes it possible to understand the influence of plant operational status on human failures, and of these failures on the equipment of a system, in this case a nuclear power plant. (author)
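
    The step-by-step logic of a THERP-style quantification can be illustrated with a toy calculation. This is a minimal sketch, assuming independent procedural steps and purely hypothetical nominal HEP values; the Handbook's actual tables and dependence model are not reproduced here:

```python
# Illustrative THERP-style task quantification (hypothetical values):
# the task succeeds only if every procedural step succeeds, so the
# overall human error probability (HEP) follows from the per-step HEPs.

def task_hep(step_heps):
    """Overall HEP of a task whose steps must all succeed.

    P(task failure) = 1 - prod(1 - HEP_i), assuming independent steps.
    """
    p_success = 1.0
    for hep in step_heps:
        p_success *= (1.0 - hep)
    return 1.0 - p_success

# Hypothetical nominal HEPs for three procedural steps
steps = [0.003, 0.01, 0.001]
print(round(task_hep(steps), 6))  # 0.013957
```

    A full THERP analysis would additionally model dependence between steps and apply recovery factors; the independence assumption above is the simplest possible case.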

  4. Development of a highly reliable CRT processor

    International Nuclear Information System (INIS)

    Shimizu, Tomoya; Saiki, Akira; Hirai, Kenji; Jota, Masayoshi; Fujii, Mikiya

    1996-01-01

    Although CRT processors have been employed by the main control board to reduce the operator's workload during monitoring, the control systems are still operated by hardware switches. For further advancement, direct controller operation through a display device is expected. A CRT processor providing direct controller operation must be as reliable as the hardware switches are. The authors are developing a new type of highly reliable CRT processor that enables direct controller operations. In this paper, we discuss the design principles behind a highly reliable CRT processor. The principles are defined by studies of software reliability and of the functional reliability of the monitoring and operation systems. The functional configuration of an advanced CRT processor is also addressed. (author)

  5. METHODOLOGICAL PROBLEMS OF E-LEARNING DIDACTICS

    Directory of Open Access Journals (Sweden)

    Sergey F. Sergeev

    2015-01-01

    The article is devoted to the discussion of the methodological problems of e-learning and didactic issues in the use of advanced networking and Internet technologies to create training systems and simulators, based on the methodological principles of non-classical and post-non-classical psychology and pedagogy.

  6. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
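
    The Bayesian building block behind such a tool can be sketched. The code below is not SRFYDO itself; it is a minimal illustration assuming binomial pass/fail component test data, conjugate Beta priors, independent components in series, and made-up test counts:

```python
# Minimal Bayesian series-system reliability sketch (not SRFYDO):
# pass/fail test data plus a Beta prior give a Beta posterior for each
# component's reliability; sampling the posteriors and multiplying the
# draws propagates uncertainty up to the series system.
import random

def posterior_system_samples(components, n_samples=10000, seed=1):
    """components: list of (a_prior, b_prior, successes, failures)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        r_sys = 1.0
        for a, b, s, f in components:
            # Beta(a + successes, b + failures) posterior on reliability
            r_sys *= rng.betavariate(a + s, b + f)
        samples.append(r_sys)
    return samples

# Two hypothetical components with vague Beta(1, 1) priors
comps = [(1, 1, 48, 2), (1, 1, 95, 5)]
draws = posterior_system_samples(comps)
mean = sum(draws) / len(draws)
print(round(mean, 3))  # posterior mean of system reliability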

  7. Reliability Assessment of Concrete Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Middleton, C. R.

    This paper is partly based on research performed for the Highways Agency, London, UK under the project DPU/9/44 "Revision of Bridge Assessment Rules Based on Whole Life Performance: concrete bridges". It contains the details of a methodology which can be used to generate Whole Life (WL) reliability profiles. These WL reliability profiles may be used to establish revised rules for concrete bridges. This paper is to some extent based on Thoft-Christensen et al. [1996] and Thoft-Christensen [1996].

  8. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers, and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  9. MEMS reliability: coming of age

    Science.gov (United States)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  10. Design for reliability: NASA reliability preferred practices for design and test

    Science.gov (United States)

    Lalli, Vincent R.

    1994-01-01

    This tutorial summarizes reliability experience from both NASA and industry and reflects engineering practices that support current and future civil space programs. These practices were collected from various NASA field centers and were reviewed by a committee of senior technical representatives from the participating centers (members are listed at the end). The material for this tutorial was taken from the publication issued by the NASA Reliability and Maintainability Steering Committee (NASA Reliability Preferred Practices for Design and Test. NASA TM-4322, 1991). Reliability must be an integral part of the systems engineering process. Although both disciplines must be weighed equally with other technical and programmatic demands, the application of sound reliability principles will be the key to the effectiveness and affordability of America's space program. Our space programs have shown that reliability efforts must focus on the design characteristics that affect the frequency of failure. Herein, we emphasize that these identified design characteristics must be controlled by applying conservative engineering principles.

  11. Survey of Transmission Cost Allocation Methodologies for Regional Transmission Organizations

    Energy Technology Data Exchange (ETDEWEB)

    Fink, S.; Porter, K.; Mudd, C.; Rogers, J.

    2011-02-01

    The report presents transmission cost allocation methodologies for reliability transmission projects, generation interconnection, and economic transmission projects for all Regional Transmission Organizations.

  12. Quantified reliability and risk assessment methodology in safety evaluation and licensing: survey of practice and trends in E.C. countries; partial contribution in decision making, perspective of safety goals

    International Nuclear Information System (INIS)

    Vinck, W.F.

    1982-01-01

    Quantified reliability analysis of structures and systems and the quantified risk-concept is increasingly developed and applied in safety evaluation and in the licensing/regulatory process where deterministic approaches are however still predominant. A description of the types of application and a survey of the diversified opinions and the problem areas (e.g. the validity of input data, uncertainties in consequence modelling, human factors, common mode failures, etc.) are given. The significance of quantified risk assessment and comparisons, as one of the contributors in the solution to acceptability of modern technology such as nuclear power production, is discussed. Other contributions, such as benefit assessment and cost-efficiency of risk reduction, are also put into perspective within the decision-making process and in the problem of actual acceptance of new technologies. The growing need of developing and agreeing on overall safety objectives (how safe is safe enough) is finally discussed, in the light of the increasing diversity of approaches in the interconnected areas of accident hypotheses/sequences, siting parameters and technical bases for emergency planning; the latter problem being also closely connected to decisional processes for acceptability and to actual acceptance

  13. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of USA in 1960 to assist the medical community in the estimation of the dose in organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for the calculation of absorbed doses in different tissues is explained.

  14. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  15. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  16. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

    This report presents the general methodology and best practice approaches which combine proven existing techniques for sampling and characterisation to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for the piloting of remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology, with the exploratory analysis and variogram data, and the identification of singular points and their location. The results obtained permit the assessment of a mapping to identify the contaminated surface and subsurface areas. It paves the way for radiological site characterisation, from the initial investigations based on historical and functional analysis through to checking that the remediation objectives have been met. It is followed by an example application drawn from feedback on the remediation of a contaminated site at the Fontenay aux Roses facility. It is supplemented by a glossary of the main terms used in the field, drawn from various publications and international standards. This technical report supports the ISO standard ISO/TC 85/SC 5 N 18557 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors)

  17. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to "translate" such abstract epistemological principles into more operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical generalization. Theoretically, the argumentation in the article is based on practice theory. The main part of the article describes three different examples of ways of generalizing on the basis of the same qualitative data material, with a particular focus on describing the methodological strategies.

  18. Load Control System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies; and, power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and, the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  19. THE EQUALITY PRINCIPLE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    CLAUDIA ANDRIŢOI

    2013-05-01

    The problem premises and the objectives followed: the idea of inserting the equality principle between the freedom and the justice principles is manifested in positive law in two stages, as a general idea of all judicial norms and as a requirement of the owner of a subjective right of the applicants of an objective law. Equality before the law and public authorities cannot involve the idea of standardization, of uniformity, of enlisting all citizens under the mark of the same judicial regime, regardless of their natural or socio-professional situation. Through the Beijing Platform and the position documents of the European Commission we have defined the integrative approach of equality as representing an active and visible integration of the gender perspective in all sectors and at all levels. The research methods used are: the conceptualist method, the logical method and the intuitive method, necessary as means of reasoning in order to argue our demonstration. We have to underline the fact that the system analysis of the research methods of the judicial phenomenon does not agree with "value ranking", because one value cannot be generalized in relation to another. At the same time, we must fight against a methodological extremism. The final purpose of this study is represented by the reaching of the perfecting/excellence stage by all individuals through the promotion of equality and freedom. This supposes that the existence of a non-discrimination favourable frame (fairness) represents a means and a condition of self-determination, and the state of perfection/excellency is a result of this self-determination; the condition necessary for the obtaining of this non-discrimination frame for all of us, in conditions of freedom for all individuals, is the same condition that promotes the state of perfection/excellency. In conclusion we may state the fact that the equality principle represents a true catalyst of the

  20. Seeking high reliability in primary care: Leadership, tools, and organization.

    Science.gov (United States)

    Weaver, Robert R

    2015-01-01

    Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls that arose and which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an

  1. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. Major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.
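
    As a baseline for the kind of quantification algorithms mentioned, two-terminal network reliability can be computed exactly by brute-force enumeration of link states. This sketch is illustrative only (the project's algorithms are not reproduced); the four-node bridge topology and the link availability of 0.9 are assumptions:

```python
# Exact two-terminal network reliability by enumerating all 2^m link
# states; feasible only for small graphs, but useful as a reference
# against which faster (e.g. factoring or bounding) algorithms are checked.
from itertools import product

def two_terminal_reliability(nodes, edges, s, t):
    """edges: list of (u, v, p_up). Exact, O(2^len(edges))."""
    total = 0.0
    for states in product([0, 1], repeat=len(edges)):
        p = 1.0
        adj = {n: [] for n in nodes}
        for up, (u, v, pe) in zip(states, edges):
            p *= pe if up else (1.0 - pe)
            if up:
                adj[u].append(v)
                adj[v].append(u)
        # Depth-first search from s to see whether t is reachable
        seen, stack = {s}, [s]
        while stack:
            for m in adj[stack.pop()]:
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        if t in seen:
            total += p
    return total

# Hypothetical 4-node bridge network, each link up with probability 0.9
edges = [(0, 1, .9), (0, 2, .9), (1, 2, .9), (1, 3, .9), (2, 3, .9)]
print(round(two_terminal_reliability(range(4), edges, 0, 3), 4))  # 0.9785
```

    The 0.9785 result matches the closed-form bridge-network decomposition, which is a common sanity check for such enumeration code.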

  2. Human reliability in complex systems: an overview

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1976-07-01

    A detailed analysis is presented of the main conceptual background underlying the areas of human reliability and human error. The concept of error is examined and generalized to that of human reliability, and some of the practical and methodological difficulties of reconciling the different standpoints of the human factors specialist and the engineer discussed. Following a survey of general reviews available on human reliability, quantitative techniques for prediction of human reliability are considered. An in-depth critical analysis of the various quantitative methods is then presented, together with the data bank requirements for human reliability prediction. Reliability considerations in process control and nuclear plant, and also areas of design, maintenance, testing and emergency situations are discussed. The effects of stress on human reliability are analysed and methods of minimizing these effects discussed. Finally, a summary is presented and proposals for further research are set out. (author)

  3. Applying franchising principles to improving water and sanitation services reliability

    CSIR Research Space (South Africa)

    Wall, K

    2008-11-01

    CSIR research has found that franchising partnerships could alleviate and address many challenges in the operation and maintenance of water services infrastructure. Franchising brings appropriate training to those on-site, and also offers backup off...

  4. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the activities on methodology development are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then, examples of methodology application by the regulatory side and the industry side are described. (author)

  5. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation (e.g. for the exponential distribution), reliability of systems, availability, preventive maintenance (replacement policies, minimal repair policy, shock models, spares, group maintenance and periodic inspection), analysis of common cause failures, and an analysis model for repair effects.
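
    One of the topics listed, reliability estimation for the exponential distribution, can be sketched with made-up failure times; the book's own treatment is not reproduced here:

```python
# Reliability estimation under an exponential lifetime model with
# complete (uncensored) failure-time data: the maximum-likelihood
# estimate of the failure rate is n / total time on test, and the
# reliability function is R(t) = exp(-lambda * t).
import math

def exponential_reliability(failure_times, t):
    lam = len(failure_times) / sum(failure_times)  # MLE of failure rate
    return math.exp(-lam * t)

times = [120.0, 95.0, 210.0, 160.0, 80.0]  # hypothetical hours to failure
mttf = sum(times) / len(times)              # mean time to failure
print(round(mttf, 1))                       # 133.0
print(round(exponential_reliability(times, 100.0), 3))  # 0.471
```

    With censored data the estimate would instead divide the number of observed failures by the total time on test including survivors, a refinement the sketch omits.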

  6. System reliability of corroding pipelines

    International Nuclear Information System (INIS)

    Zhou Wenxing

    2010-01-01

    A methodology is presented in this paper to evaluate the time-dependent system reliability of a pipeline segment that contains multiple active corrosion defects and is subjected to stochastic internal pressure loading. The pipeline segment is modeled as a series system with three distinctive failure modes due to corrosion, namely small leak, large leak and rupture. The internal pressure is characterized as a simple discrete stochastic process that consists of a sequence of independent and identically distributed random variables each acting over a period of one year. The magnitude of a given sequence follows the annual maximum pressure distribution. The methodology is illustrated through a hypothetical example. Furthermore, the impact of the spatial variability of the pressure loading and pipe resistances associated with different defects on the system reliability is investigated. The analysis results suggest that the spatial variability of pipe properties has a negligible impact on the system reliability. On the other hand, the spatial variability of the internal pressure, initial defect sizes and defect growth rates can have a significant impact on the system reliability.
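
    The kind of time-dependent simulation the abstract describes can be sketched with Monte Carlo. The paper's actual limit states, distributions and three distinct failure modes are not reproduced here; a single burst limit state stands in for them, and all parameter values below are hypothetical:

```python
# Monte Carlo sketch of time-dependent series-system reliability for a
# corroding pipeline segment: each defect's burst resistance degrades
# with corrosion growth, the annual maximum pressure is redrawn
# independently each year, and the segment survives a year only if
# every defect withstands that year's pressure.
import random

def prob_failure(n_defects=3, years=15, n_sims=20000, seed=7):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sims):
        # Hypothetical initial resistances (MPa) and annual losses (MPa/yr)
        resist = [rng.gauss(15.0, 1.0) for _ in range(n_defects)]
        growth = [abs(rng.gauss(0.25, 0.05)) for _ in range(n_defects)]
        for year in range(1, years + 1):
            pressure = rng.gauss(10.0, 0.6)  # annual maximum pressure (MPa)
            if any(r - g * year < pressure for r, g in zip(resist, growth)):
                failures += 1
                break  # series system: first exceedance fails the segment
    return failures / n_sims

print(prob_failure())  # cumulative failure probability over the horizon
```

    Correlating the resistances or growth rates across defects, as the paper's spatial-variability study does, would only require drawing them from a joint rather than independent distribution.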

  7. Standards in reliability and safety engineering

    International Nuclear Information System (INIS)

    O'Connor, Patrick

    1998-01-01

    This article explains how the highest 'world class' levels of reliability and safety are achieved, by adherence to the basic principles of excellence in design, production, support and maintenance, by continuous improvement, and by understanding that excellence and improvement lead to reduced costs. These principles are contrasted with the methods that have been developed and standardised, particularly military standards for reliability, ISO9000, and safety case regulations. The article concludes that the formal, standardised approaches are misleading and counterproductive, and recommends that they be replaced by a philosophy based on the realities of human performance

  8. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  9. Ethnography: principles, practice and potential.

    Science.gov (United States)

    Draper, Jan

    2015-05-06

    Ethnography is a methodology that is gaining popularity in nursing and healthcare research. It is concerned with studying people in their cultural context and how their behaviour, either as individuals or as part of a group, is influenced by this cultural context. Ethnography is a form of social research and has much in common with other forms of qualitative enquiry. While classical ethnography was characteristically concerned with describing 'other' cultures, contemporary ethnography has focused on settings nearer to home. This article outlines some of the underlying principles and practice of ethnography, and its potential for nursing and healthcare practice.

  10. Reliability Assessment and Reliability-Based Inspection and Maintenance of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Ramírez, José G. Rangel; Sørensen, John Dalsgaard

    2009-01-01

    Probabilistic methodologies represent an important tool to identify the suitable strategy to inspect and deal with the deterioration in structures such as offshore wind turbines (OWT). Reliability based methods such as Risk Based Inspection (RBI) planning may represent a proper methodology to opt...

  11. Intelligent instrumentation principles and applications

    CERN Document Server

    Bhuyan, Manabendra

    2011-01-01

    With the advent of microprocessors and digital-processing technologies as catalyst, classical sensors capable of simple signal conditioning operations have evolved rapidly to take on higher and more specialized functions including validation, compensation, and classification. This new category of sensor expands the scope of incorporating intelligence into instrumentation systems, yet with such rapid changes, there has developed no universal standard for design, definition, or requirement with which to unify intelligent instrumentation. Explaining the underlying design methodologies of intelligent instrumentation, Intelligent Instrumentation: Principles and Applications provides a comprehensive and authoritative resource on the scientific foundations from which to coordinate and advance the field. Employing a textbook-like language, this book translates methodologies to more than 80 numerical examples, and provides applications in 14 case studies for a complete and working understanding of the material. Beginn...

  12. Reliability of nuclear power plants and equipment

    International Nuclear Information System (INIS)

    1985-01-01

    The standard sets out the general principles, a list of reliability indexes, and the demands on their selection. Reliability indexes of nuclear power plants include the simple indexes of fail-safe operation, life, maintainability, and storage capability. All terms and notions are explained, and the methods of evaluating the indexes are briefly listed: statistical, and calculation-experimental. The dates when the standard comes into force in the individual CMEA countries are given. (M.D.)

  13. Study of evaluation techniques of software safety and reliability in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Youn, Cheong; Baek, Y. W.; Kim, H. C.; Park, N. J.; Shin, C. Y. [Chungnam National Univ., Taejon (Korea, Republic of)

    1999-04-15

    Software system development process and software quality assurance activities are examined in this study. In particular, software safety and reliability requirements in nuclear power plants are investigated. For this purpose, methodologies and tools which can be applied to the software analysis, design, implementation, testing, and maintenance steps are evaluated. Necessary tasks for each step are investigated. Duty, input, and detailed activity for each task are defined to establish a development process for high-quality software systems. This means applying the basic concepts of software engineering and principles of system development. This study establishes a guideline that can assure software safety and reliability requirements in digitalized nuclear plant systems and can be used as a guidebook of the software development process to assure software quality in many software development organizations.

  14. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational, or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  15. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational, or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  16. Radiation protection principles

    International Nuclear Information System (INIS)

    Ismail Bahari

    2007-01-01

    The presentation outlines the aspects of radiation protection principles. It discusses the following subjects: radiation hazards and risk; the objectives of radiation protection; and the three principles of the system - justification of practice, optimization of protection and safety, and dose limits.

  17. Principles of project management

    Science.gov (United States)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  18. On the Reliability of Implicit and Explicit Memory Measures.

    Science.gov (United States)

    Buchner, Axel; Wippich, Werner

    2000-01-01

    Studied the reliability of implicit and explicit memory tests in experiments involving these tests. Results with 168, 84, 120, and 128 undergraduates show that methodological artifacts may cause implicit memory tests to have lower reliability than explicit memory tests, but that implicit tests need not necessarily be less reliable. (SLD)

  19. The certainty principle (review)

    OpenAIRE

    Arbatsky, D. A.

    2006-01-01

    The certainty principle (2005) made it possible to conceptualize, on more fundamental grounds, both the Heisenberg uncertainty principle (1927) and the Mandelshtam-Tamm relation (1945). In this review I give a detailed explanation and discussion of the certainty principle, oriented to all physicists, both theorists and experimenters.

  20. Quantum Action Principle with Generalized Uncertainty Principle

    OpenAIRE

    Gu, Jie

    2013-01-01

    One of the common features in all promising candidates of quantum gravity is the existence of a minimal length scale, which naturally emerges with a generalized uncertainty principle, or equivalently a modified commutation relation. Schwinger's quantum action principle was modified to incorporate this modification, and was applied to the calculation of the kernel of a free particle, partly recovering the result previously studied using path integral.

  1. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
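    The single-loop idea described in this abstract can be sketched in a few lines: epistemic variables (uncertain distribution parameters) and aleatory variables are sampled together in one Monte Carlo loop, and failure is counted whenever the limit state goes negative. All numbers and distributions below are hypothetical illustrations, not taken from the paper.

    ```python
    import random

    def failure_probability(n_samples=200_000, seed=42):
        """Sketch of single-loop Monte Carlo reliability estimation with
        both aleatory and epistemic uncertainty (hypothetical example)."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_samples):
            # Epistemic: the load's mean is itself uncertain (sparse data),
            # represented here by an assumed uniform range of plausible values.
            load_mean = rng.uniform(9.0, 11.0)
            # Aleatory: inherent variability of load and capacity.
            load = rng.gauss(load_mean, 1.5)
            capacity = rng.gauss(15.0, 1.0)
            # Limit state g = capacity - load; failure when g < 0.
            if capacity - load < 0.0:
                failures += 1
        return failures / n_samples

    pf = failure_probability()
    ```

    The returned estimate mixes both uncertainty sources in one probability; keeping the epistemic sample fixed per outer loop instead would yield a family of conditional failure probabilities.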

  2. Analysis of NPP protection structure reliability under impact of a falling aircraft

    International Nuclear Information System (INIS)

    Shul'man, G.S.

    1996-01-01

    A methodology for evaluating the reliability of NPP protection structures under the impact of a falling aircraft is considered. The methodology is based on probabilistic analysis of all potential events. The problem is solved in three stages: determination of loads on structural units, calculation of the local reliability of the protection structures under the assigned loads, and estimation of the overall structure reliability. The proposed methodology may be applied at the NPP design stage and for determining the reliability of existing structures

  3. THE BASIC PRINCIPLES OF RESEARCH IN NEUROEDUCATION STUDIES

    Directory of Open Access Journals (Sweden)

    Ali Nouri

    2016-06-01

    Full Text Available The present paper assembles contributions from the areas of education, psychology, cognitive science, and of course, neuroeducation itself to introduce the basic principles of research in the field of neuroeducation studies. This is particularly important as a way of clarifying for researchers what neuroeducation, as a specific domain, can do that no other field can do as well or at all. Based on the literature reviewed, neuroeducational research can be understood as an interdisciplinary endeavor to develop an insightful understanding and holistic picture of problems related to learning and education. Epistemologically, it is thus based on an integrated methodological pluralism paradigm. This requires researchers to understand multiple methods and methodologies and to employ them as they formulate their own research projects. Researchers have a critical role to play in providing systematic evidence and conclusions that are scientifically valid and reliable, and educationally relevant and usable. One significant implication of this argument is the need to strengthen the quality of the research component in graduate programs in the field and to train interested researchers in the identification and formulation of relevant research questions.

  4. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high......). A collection of methodologies based on Physics-of-Failure (PoF) approach and mission profile analysis are presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study...... on a 2.3 MW wind power converter is discussed with emphasis on the reliability critical components IGBTs. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed....

  5. Implementation of corporate governance principles in Romania

    Directory of Open Access Journals (Sweden)

    Ramona Iulia Țarțavulea (Dieaconescu)

    2014-12-01

    Full Text Available The paper aims to conduct a study regarding the manner in which corporate governance principles are applied in Romania, in both the public and private sectors. In the first part of the paper, the corporate governance principles are presented as they are defined in Romania, in comparison with the main international sources of interest in the domain (OECD corporate governance principles, EU legal framework). The corporate governance (CG) principles refer to issues regarding board composition and transparency of scope, objectives, and policies; they define the relations between directors and managers, shareholders and stakeholders. The research methodology is based on both fundamental research and an empirical study on the implementation of corporate governance principles in companies from Romania. The main instrument of research is a corporate governance index, calculated based on a framework proposed by the author. The corporate governance principles are transposed into criteria that compose the framework for the CG index. The results of the study consist of scores for each CG principle and calculation of the CG index for seven companies selected from the public and private sectors in Romania. The results are analyzed and discussed in order to formulate general and particular recommendations. The main conclusion of this study is that a legal framework in the area of corporate governance regulation is needed in Romania. I consider that the main CG principles should be enforced by developing a mandatory legal framework.

  6. Probabilistic risk assessment course documentation. Volume 3. System reliability and analysis techniques, Session A - reliability

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs
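    The fault tree quantification this course is built around can be illustrated with a minimal sketch: independent basic-event probabilities are combined through AND and OR gates to give the top-event probability. The system layout and numbers below are hypothetical, chosen only to show the mechanics.

    ```python
    def or_gate(*probs):
        # P(at least one event) for independent basic events.
        q = 1.0
        for p in probs:
            q *= (1.0 - p)
        return 1.0 - q

    def and_gate(*probs):
        # P(all events occur) for independent basic events.
        prod = 1.0
        for p in probs:
            prod *= p
        return prod

    # Hypothetical system: loss of flow requires both redundant pumps
    # to fail (AND), or a single common power-supply failure (OR).
    pump_a, pump_b, power = 1e-2, 1e-2, 1e-3
    top = or_gate(and_gate(pump_a, pump_b), power)
    ```

    Here the redundant pair contributes 1e-4 and the single power failure 1e-3, so the common support system dominates top-event unavailability, the kind of insight fault tree quantification is meant to surface.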

  7. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
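    Step (i) above, combining the reliability measures of multiple sources, can be sketched with Bayesian odds updating; this is an assumed fusion rule for illustration, not necessarily the one used in the paper, and the prior and reliabilities are made up.

    ```python
    def combined_confidence(source_reliabilities, prior=0.5):
        """Fuse independent source reliabilities into one confidence level.
        Assumed model: each reliability r is read as
        P(source asserts X | X true) = r and P(source asserts X | X false) = 1 - r,
        so a confirming source multiplies the odds of X by r / (1 - r)."""
        odds = prior / (1.0 - prior)
        for r in source_reliabilities:
            odds *= r / (1.0 - r)
        return odds / (1.0 + odds)

    # Three moderately reliable documents all supporting the same element.
    conf = combined_confidence([0.7, 0.8, 0.6])
    ```

    Several mediocre but independent confirmations push the combined confidence well above any single source's reliability, which is the intuition behind propagating document truth values to system output.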

  8. Dimensional cosmological principles

    International Nuclear Information System (INIS)

    Chi, L.K.

    1985-01-01

    The dimensional cosmological principles proposed by Wesson require that the density, pressure, and mass of cosmological models be functions of the dimensionless variables which are themselves combinations of the gravitational constant, the speed of light, and the spacetime coordinates. The space coordinate is not the comoving coordinate. In this paper, the dimensional cosmological principle and the dimensional perfect cosmological principle are reformulated by using the comoving coordinate. The dimensional perfect cosmological principle is further modified to allow the possibility that mass creation may occur. Self-similar spacetimes are found to be models obeying the new dimensional cosmological principle

  9. Adsorption by powders and porous solids principles, methodology and applications

    CERN Document Server

    Rouquerol, Jean; Llewellyn, Philip; Maurin, Guillaume; Sing, Kenneth SW

    2013-01-01

    The declared objective of this book is to provide an introductory review of the various theoretical and practical aspects of adsorption by powders and porous solids with particular reference to materials of technological importance. The primary aim is to meet the needs of students and non-specialists who are new to surface science or who wish to use the advanced techniques now available for the determination of surface area, pore size and surface characterization. In addition, a critical account is given of recent work on the adsorptive properties of activated carbons, oxides, clays and zeolit

  10. [Basic principles and methodological considerations of health economic evaluations].

    Science.gov (United States)

    Loza, Cesar; Castillo-Portilla, Manuel; Rojas, José Luis; Huayanay, Leandro

    2011-01-01

    Health Economics is an essential instrument for health management, and economic evaluations can be considered as tools assisting the decision-making process for the allocation of resources in health. Currently, economic evaluations are increasingly being used worldwide, thus encouraging evidence-based decision-making and seeking efficient and rational alternatives within the framework of health services activities. In this review, we present an overview and define the basic types of economic evaluations, with emphasis on complete Economic Evaluations (EE). In addition, we review key concepts regarding the perspectives from which EE can be conducted, the types of costs that can be considered, the time horizon, discounting, assessment of uncertainty and decision rules. Finally, we describe concepts about the extrapolation and spread of economic evaluations in health.

  11. Reliability assessment based on subjective inferences

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    The reliability information which comes from subjective analysis is often incomplete prior information. This information can generally be assumed to exist in the form of either a stated prior mean of R (reliability) or a stated prior credibility interval on R. An efficient approach is developed to determine a complete beta prior distribution from the subjective information according to the principle of maximum entropy, and the reliability of a survival/failure product is assessed via Bayes theorem. Numerical examples are presented to illustrate the methods
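    The beta-prior-plus-Bayes scheme described above can be sketched as follows. The mapping from a stated prior mean to a beta distribution is an assumed simplification (a fixed prior equivalent sample size rather than the paper's maximum-entropy construction), and all numbers are hypothetical.

    ```python
    def beta_from_mean(mean, n0=2.0):
        # Assumed: encode the stated prior mean as Beta(a, b) with an
        # assumed prior equivalent sample size n0, so a / (a + b) = mean.
        a = mean * n0
        b = (1.0 - mean) * n0
        return a, b

    def posterior_mean(prior_mean, successes, failures, n0=2.0):
        # Conjugate Bayes update for survival/failure (pass/fail) test data:
        # Beta(a, b) prior + binomial data -> Beta(a + s, b + f) posterior.
        a, b = beta_from_mean(prior_mean, n0)
        return (a + successes) / (a + b + successes + failures)

    # Stated prior mean 0.9; observed 28 survivals and 2 failures.
    r_post = posterior_mean(0.9, successes=28, failures=2)
    ```

    The weak prior (n0 = 2) is quickly dominated by the 30 test outcomes, so the posterior reliability lands near the observed survival fraction.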

  12. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
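    The margin-versus-uncertainty comparison described above can be sketched with a standard reliability-index calculation: if the design margin is treated as normally distributed with the nominal (best estimate) value as its mean, the reliability is the probability that the margin stays positive. The normality assumption and the numbers are illustrative, not from the paper.

    ```python
    import math

    def mov_reliability(nominal_margin, margin_std):
        """Sketch: assumed normally distributed design margin M with the
        given best-estimate mean and statistical uncertainty (std dev).
        Reliability = P(M > 0) = Phi(mean / std), the standard normal CDF
        evaluated at the reliability index z."""
        z = nominal_margin / margin_std
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # Hypothetical MOV: 25% nominal thrust margin, 10% uncertainty.
    r = mov_reliability(nominal_margin=0.25, margin_std=0.10)
    ```

    Trending z over successive periodic verification tests would show whether degradation (e.g. from packing adjustments) is eroding the margin faster than expected.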

  13. MOV reliability evaluation and periodic verification scheduling

    International Nuclear Information System (INIS)

    Bunte, B.D.

    1996-01-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs

  14. The Maximum Entropy Principle and the Modern Portfolio Theory

    Directory of Open Access Journals (Sweden)

    Ailton Cassetari

    2003-12-01

    Full Text Available In this work, a capital allocation methodology based on the Principle of Maximum Entropy was developed. Shannon's entropy is used as the measure; its connections with the Modern Portfolio Theory are also discussed. In particular, the methodology is tested by making a systematic comparison to: (1) the mean-variance (Markowitz) approach and (2) the mean-VaR approach (capital allocation based on the Value at Risk concept). In principle, such confrontations show the plausibility and effectiveness of the developed method.
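    The maximum-entropy allocation idea can be sketched directly: maximize the Shannon entropy of the weights subject to a full-investment constraint and a target expected return. The Lagrangian solution has weights proportional to exp(lam * mu_i), with the multiplier found numerically; the asset returns and target below are hypothetical.

    ```python
    import math

    def maxent_weights(mu, target_return, lo=-50.0, hi=50.0, iters=200):
        """Sketch: maximize -sum(w ln w) s.t. sum(w) = 1, sum(w * mu) = target.
        The stationarity condition gives w_i proportional to exp(lam * mu_i);
        lam is found by bisection, since the implied return is monotone in lam."""
        def weights(lam):
            e = [math.exp(lam * m) for m in mu]
            s = sum(e)
            return [x / s for x in e]

        def implied_return(lam):
            return sum(w * m for w, m in zip(weights(lam), mu))

        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if implied_return(mid) < target_return:
                lo = mid
            else:
                hi = mid
        return weights(0.5 * (lo + hi))

    # Three hypothetical assets with expected returns 5%, 8%, 12%.
    w = maxent_weights([0.05, 0.08, 0.12], target_return=0.09)
    ```

    Unlike mean-variance optimization, this allocation needs no covariance estimate: it simply picks the most diversified (highest-entropy) weights consistent with the return target.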

  15. Equipment Reliability Program in NPP Krsko

    International Nuclear Information System (INIS)

    Skaler, F.; Djetelic, N.

    2006-01-01

    Operation that is safe, reliable, effective and acceptable to the public is the common message in the mission statements of commercial nuclear power plants (NPPs). To fulfill these goals, the nuclear industry, among other areas, has to focus on: (1) Human Performance (HU) and (2) Equipment Reliability (EQ). The performance objective of HU is as follows: the behaviors of all personnel result in safe and reliable station operation. While unwanted human behaviors in operations mostly result directly in an event, behavior flaws in the areas of maintenance or engineering usually cause decreased equipment reliability. Unsatisfactory human performance has led even the best-designed power plants into significant operating events, well-known examples of which can be found in the nuclear industry. Equipment reliability is today recognized as the key to success. While human performance at most NPPs has been improving since the start of WANO / INPO / IAEA evaluations, the open energy market has forced nuclear plants to reduce production costs and operate more reliably and effectively. The balance between these two (opposite) goals has made equipment reliability even more important for safe, reliable and efficient production. Nowadays, in a well-developed safety culture and human performance environment, insisting on on-line operation while ignoring some principles of safety could cost more than the electricity losses it avoids. In the last decade the leading USA nuclear companies have put a lot of effort into improving equipment reliability at their stations, primarily based on the INPO Equipment Reliability Program AP-913. The Equipment Reliability Program is the key program not only for safe and reliable operation, but also for Life Cycle Management and Aging Management on the way to nuclear power plant life extension. The purpose of the Equipment Reliability process is to identify, organize, integrate and coordinate equipment reliability activities (preventive and predictive maintenance, maintenance

  16. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  17. Principles of development of the industry of technogenic waste processing

    Directory of Open Access Journals (Sweden)

    Maria A. Bayeva

    2014-01-01

    Full Text Available Objective: to identify and substantiate the principles of development of the industry of technogenic waste processing. Methods: systemic analysis, synthesis, and the method of analogy. Results: basing on the analysis of Russian and foreign experience in the field of waste management and environmental protection, the basic principles of development of activities on technogenic waste processing are formulated: the principle of legal regulation, the principle of efficient technologies, the principle of ecological safety, and the principle of economic support. The importance of each principle is substantiated by describing the situation in this area and identifying the main problems and ways of their solution. Scientific novelty: the fundamental principles of development of the industry of technogenic waste processing are revealed, and measures of state support are proposed. Practical value: the presented theoretical conclusions and proposals are aimed primarily at the theoretical and methodological substantiation of, and practical solutions to, modern problems in the sphere of development of the industry of technogenic waste processing.

  18. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) to determine environmental levels of 239 Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically-induced precipitation phase, followed by anion exchange chromatography and employs a chemical tracer, 236 Pu. We have been able to establish an inverse correlation between Pu recovery and sample volume and our data confirms that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies, samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)

  19. Methodological foundations of target market enterprise orientation

    OpenAIRE

    N.V. Karpenko

    2012-01-01

    In the article the author determines the importance of maintaining a target market orientation, the content of which is based on marketing principles and envisages the interrelationship of the processes of market segmentation and positioning. The proposed methodological principles of segmentation implementation are the result of the author's own research, and the process of positioning is examined through a five-level system that contains three stages and two variants of organizational behavior.

  20. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Discussed are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which those factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  1. Multi-Disciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  2. EDUCATION IN SEARCH OF THE ADEQUACY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Y. V. Larin

    2014-01-01

    Full Text Available The paper discusses the acute methodology problem: elicitation of the fundamental principle of modern education. In the course of retrospective analysis, the author attempts to trace the essence and comparative historical specificity of the principle in question, and find out whether the currently declared one actually corresponds with the society's demands and time requirements. Consequently, the author singles out three successive historical types of education, each of them based on the respective ideological and methodological assumptions. The first one (the 17th – mid-19th century), based on the ontological system of the «Man and Nature», regards the man as a natural creature and proclaims a fundamental educational principle of adequacy to nature. The second type, formed by the end of the 19th century and based on the ontological system of the «Man and Society», takes the man as a social creature and puts forward a fundamental educational principle of adequacy to society. And finally, the multi-dimensional ontological system of the «Man-Nature-Culture-Society», developed in the mid-20th century, defines the man as a bio-socio-cultural creature and forms a basis for a new fundamental educational principle of adequacy to culture. The paper maintains that the principle of adequacy to nature corresponds with the classical period of education history; orientation on social adequacy represents its non-classical stage; and consequently, the principle of cultural adequacy signifies the post-non-classical phase. In conclusion, the author argues that resumption of the initial educational principle of adequacy to nature can be regarded as moving backward.

  3. EDUCATION IN SEARCH OF THE ADEQUACY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Y. V. Larin

    2015-03-01

    Full Text Available The paper discusses an acute methodological problem: elicitation of the fundamental principle of modern education. In the course of a retrospective analysis, the author attempts to trace the essence and comparative historical specificity of the principle in question, and to find out whether the currently declared one actually corresponds with the demands of society and the requirements of the time. Consequently, the author singles out three successive historical types of education, each of them based on the respective ideological and methodological assumptions. The first one (the 17th – mid-19th century), based on the ontological system of «Man and Nature», regards man as a natural creature and proclaims a fundamental educational principle of adequacy to nature. The second type, formed by the end of the 19th century and based on the ontological system of «Man and Society», takes man as a social creature and puts forward a fundamental educational principle of adequacy to society. And finally, the multi-dimensional ontological system of «Man-Nature-Culture-Society», developed in the mid-20th century, defines man as a bio-socio-cultural creature and forms the basis for a new fundamental educational principle of adequacy to culture. The paper maintains that the principle of adequacy to nature corresponds with the classical period of education history; orientation on social adequacy represents its non-classical stage; and, consequently, the principle of cultural adequacy signifies the post-non-classical phase. In conclusion, the author argues that resumption of the initial educational principle of adequacy to nature can be regarded as moving backward.

  4. Basic safety principles for nuclear power plants

    International Nuclear Information System (INIS)

    1988-01-01

    Nuclear power plant safety requires a continuing quest for excellence. All individuals concerned should constantly be alert to opportunities to reduce risks to the lowest practicable level. The quest, however, is most likely to be fruitful if it is based on an understanding of the underlying objectives and principles of nuclear safety, and the way in which its aspects are interrelated. This report is an attempt to provide a logical framework for such an understanding. The proposed objectives and principles of nuclear safety are interconnected and must be taken as a whole; they do not constitute a menu from which selection can be made. The report takes account of current issues and developments. It includes the concept of safety objectives and the use of probabilistic safety assessment. Reliability targets for safety systems are discussed. The concept of a 'safety culture' is crucial. Attention has been paid to the need for planning for accident management. The report contains objectives and principles. The objectives state what is to be achieved; the principles state how to achieve it. In each case, the basic principle is stated as briefly as possible. The accompanying discussion comments on the reasons for the principle and its importance, as well as exceptions, the extent of coverage and any necessary clarification. The discussion is as important as the principle it augments. 4 figs

  5. Philosophy of democracy and Principles of Democracy

    Directory of Open Access Journals (Sweden)

    Jarmila Chovancová

    2016-07-01

    Full Text Available As the title suggests, the article deals with the problems of democracy, its philosophy and its dominant principles. The author reflects on interpretations of democracy in societies with differing understandings of it. Democracy represents a form of government, a way of political life in which these principles are put into practice. Democracy and its separate principles are expressed in the ultimate legal rules of democratic countries. The principle of participation, as a democratic principle, rests on the fact that citizens have the right to participate in state administration either directly or via their elected representatives. This principle also ensures that citizens participating in state administration enjoy equal basic rights and liberties, and guarantees that no person can be excluded from participation in state administration or from access to elected or other posts. Methodology: the article uses the method of analysis, examining the dominant problems of democracy and its principles in democratic countries; the method of comparison, tracing the understanding of democracy from a historical perspective; and the method of synthesis, explaining how democracy is understood today.

  6. Biomechanics principles and practices

    CERN Document Server

    Peterson, Donald R

    2014-01-01

    Presents Current Principles and Applications. Biomedical engineering is considered to be the most expansive of all the engineering sciences. Its function involves the direct combination of core engineering sciences with knowledge of non-engineering disciplines such as biology and medicine. Drawing on material from the biomechanics section of The Biomedical Engineering Handbook, Fourth Edition, and utilizing the expert knowledge of respected published scientists in the application and research of biomechanics, Biomechanics: Principles and Practices discusses the latest principles and applications.

  7. Fusion research principles

    CERN Document Server

    Dolan, Thomas James

    2013-01-01

    Fusion Research, Volume I: Principles provides a general description of the methods and problems of fusion research. The book contains three main parts: Principles, Experiments, and Technology. The Principles part describes the conditions necessary for a fusion reaction, as well as the fundamentals of plasma confinement, heating, and diagnostics. The Experiments part details about forty plasma confinement schemes and experiments. The last part explores various engineering problems associated with reactor design, vacuum and magnet systems, materials, plasma purity, fueling, blankets, neutronics

  8. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  9. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  10. Of plants and reliability

    International Nuclear Information System (INIS)

    Schneider Horst

    2009-01-01

    Behind the political statements made about the transformer event at the Kruemmel nuclear power station (KKK) in the summer of 2009 there are fundamental issues of atomic law. Pursuant to Articles 20 and 28 of its Basic Law, Germany is a state in which the rule of law applies. Consequently, the aspects of atomic law associated with the incident merit a closer look, all the more so as the items concerned have been known for many years. Important aspects in the debate about the Kruemmel nuclear power plant are the fact that the transformer is considered part of the nuclear power station under atomic law and thus a ''plant'' subject to surveillance by the nuclear regulatory agencies, on the one hand, and the reliability under atomic law of the operator and the executive personnel responsible, on the other hand. Both ''plant'' and ''reliability'' are terms focusing on nuclear safety. Hence the question to what extent safety was affected in the Kruemmel incident. The classification of the event as 0 = no or only a very slight safety impact on the INES scale (INES = International Nuclear Event Scale) should not be used to put aside the safety issue once and for all. Points of fact and their technical significance must be considered prior to any legal assessment. Legal assessments and regulations are associated with facts and circumstances. Any legal examination is based on the facts as determined and elucidated. Any other procedure would be tantamount to an inadmissible legal advance conviction. Now, what is the position of political statements, i.e. political assessments and political responsibility? If everything is done the correct way, they come at the end, after exploration of the facts and evaluation under applicable law. Sometimes things are handled differently, with consequences which are not very helpful. 
In the light of the provisions about the rule of law as laid down in the Basic Law, the new federal government should be made to observe the proper sequence of

  11. Design for ASIC reliability for low-temperature applications

    Science.gov (United States)

    Chen, Yuan; Mojaradi, Mohammad; Westergard, Lynett; Billman, Curtis; Cozy, Scott; Burke, Gary; Kolawa, Elizabeth

    2005-01-01

    In this paper, we present a methodology to design for reliability for low temperature applications without requiring process improvement. The developed hot carrier aging lifetime projection model takes into account both the transistor substrate current profile and temperature profile to determine the minimum transistor size needed in order to meet reliability requirements. The methodology is applicable for automotive, military, and space applications, where there can be varying temperature ranges. A case study utilizing this methodology is given to design for reliability into a custom application-specific integrated circuit (ASIC) for a Mars exploration mission.
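    A minimal sketch of the idea, not the paper's actual model: a power-law hot-carrier lifetime that worsens at low temperature, combined with Miner's-rule damage accumulation over a mission temperature profile to bisect for the smallest transistor width that meets the lifetime requirement. All constants, the profile, and the model form are illustrative assumptions.

```python
import math

K_B = 8.617e-5  # Boltzmann constant [eV/K]

# Illustrative hot-carrier lifetime model (constants are hypothetical, not
# taken from the paper): lifetime falls as a power law of current density
# J = I/W and worsens at LOW temperature (negative activation energy).
A, M, EA = 5.0e7, 3.0, -0.15   # scale [h], power-law exponent, eV

def hc_lifetime_h(width_um, drive_ma, temp_k):
    j = drive_ma / width_um                       # current-density proxy
    return A * j ** (-M) * math.exp(EA / (K_B * temp_k))

def min_width(drive_ma, profile, mission_h, w_lo=0.1, w_hi=100.0):
    """Smallest width whose accumulated damage (Miner's rule over the
    mission temperature profile) stays below 1 for the whole mission."""
    def damage(w):
        return sum(frac * mission_h / hc_lifetime_h(w, drive_ma, t)
                   for t, frac in profile)
    for _ in range(60):                           # bisection on width
        mid = 0.5 * (w_lo + w_hi)
        w_lo, w_hi = (mid, w_hi) if damage(mid) > 1.0 else (w_lo, mid)
    return w_hi

# Hypothetical mission profile: (temperature [K], fraction of time)
profile = [(180.0, 0.4), (230.0, 0.4), (300.0, 0.2)]
w_min = min_width(drive_ma=1.0, profile=profile, mission_h=5 * 8760)
```

    The design lever here is purely geometric, which mirrors the abstract's point: reliability is bought by sizing transistors, not by changing the process.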

  12. Integrating evidence-based principles into the undergraduate ...

    African Journals Online (AJOL)

    Background. The research methodology module was reviewed as part of the overall revision of the undergraduate physiotherapy curriculum of Stellenbosch University. This created an ideal platform from which to assess how to align the principles of evidence-based practice (EBP) with research methodology. Fostering the ...

  13. Database principles programming performance

    CERN Document Server

    O'Neil, Patrick

    2014-01-01

    Database: Principles Programming Performance provides an introduction to the fundamental principles of database systems. This book focuses on database programming and the relationships between principles, programming, and performance.Organized into 10 chapters, this book begins with an overview of database design principles and presents a comprehensive introduction to the concepts used by a DBA. This text then provides grounding in many abstract concepts of the relational model. Other chapters introduce SQL, describing its capabilities and covering the statements and functions of the programmi

  14. Principles of ecotoxicology

    National Research Council Canada - National Science Library

    Walker, C. H

    2012-01-01

    "Now in its fourth edition, this exceptionally accessible text provides students with a multidisciplinary perspective and a grounding in the fundamental principles required for research in toxicology today...

  15. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  16. D5.3 Reading reliability report

    DEFF Research Database (Denmark)

    Cetin, Bilge Kartal; Galiotto, Carlo; Cetin, Kamil

    2010-01-01

    This deliverable presents a detailed description of the main causes of reading reliability degradation. Two main groups of impairments are recognized: those at the physical layer (e.g., fading, multipath, electromagnetic interference, shadowing due to obstacles, tag orientation misalignment, tag bending, metallic environments, etc.) and those at the medium access control sub-layer (e.g., collisions due to tag-to-tag, reader-to-reader and multiple readers-to-tag interference). The review presented in this deliverable covers previous reliability reports and existing definitions of RFID reading reliability. Performance metrics and methodologies for assessing reading reliability are further discussed. This document also presents a review of state-of-the-art RFID reading reliability improvement schemes. The solutions are classified into physical- (PHY), medium access control- (MAC), upper-, and cross-layer schemes.

  17. Structural Reliability of Wind Turbine Blades

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov

    turbine blades. The main purpose is to draw a clear picture of how reliability-based design of wind turbines can be done in practice. The objectives of the thesis are to create methodologies for efficient reliability assessment of composite materials and composite wind turbine blades, and to map the uncertainties in the processes, materials and external conditions that have an effect on the health of a composite structure. The study considers all stages in a reliability analysis, from defining models of structural components to obtaining the reliability index and calibration of partial safety factors by developing new models and standards or carrying out tests. The following aspects are covered in detail: ⋅ The probabilistic aspects of ultimate strength of composite laminates are addressed. Laminated plates are considered as a general structural reliability system where each layer in a laminate is a separate

  18. Reliability and risk treatment centered maintenance

    International Nuclear Information System (INIS)

    Pexa, Martin; Hladik, Tomas; Ales, Zdenek; Legat, Vaclav; Muller, Miroslav; Valasek, Petr; Havlu, Vit

    2014-01-01

    We propose a new methodology for the application of well-known tools - RCM, RBI and SIFpro - with the aim of treating risks by means of suitable maintenance. The basis of the new methodology is the joint application of all three methods at the same time, rather than separately as is typical today. The proposed methodology suggests having just one managing team for reliability and risk treatment centred maintenance (RRTCM), employing the existing RCM, RBI and SIFpro tools concurrently. This approach allows for a significant reduction in the duration of engineering activities. In the proposed methodology these activities are staged into five phases and structured to eliminate all duplication resulting from separate application of the three tools. The newly proposed methodology saves 45% to 50% of the engineering workload and yields significant financial savings.

  19. Reliability and risk treatment centered maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Pexa, Martin; Hladik, Tomas; Ales, Zdenek; Legat, Vaclav; Muller, Miroslav; Valasek, Petr [Czech University of Life Sciences Prague, Kamycka (Czech Republic); Havlu, Vit [Unipetrol A. S, Prague (Czech Republic)

    2014-10-15

    We propose a new methodology for the application of well-known tools - RCM, RBI and SIFpro - with the aim of treating risks by means of suitable maintenance. The basis of the new methodology is the joint application of all three methods at the same time, rather than separately as is typical today. The proposed methodology suggests having just one managing team for reliability and risk treatment centred maintenance (RRTCM), employing the existing RCM, RBI and SIFpro tools concurrently. This approach allows for a significant reduction in the duration of engineering activities. In the proposed methodology these activities are staged into five phases and structured to eliminate all duplication resulting from separate application of the three tools. The newly proposed methodology saves 45% to 50% of the engineering workload and yields significant financial savings.

  20. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its use in the community.

  1. APPLYING THE PRINCIPLES OF ACCOUNTING IN

    OpenAIRE

    NAGY CRISTINA MIHAELA; SABĂU CRĂCIUN; ”Tibiscus” University of Timişoara, Faculty of Economic Science

    2015-01-01

    The application of accounting principles (accounting principle on accrual basis; principle of business continuity; method consistency principle; prudence principle; independence principle; the principle of separate valuation of assets and liabilities; intangibility principle; non-compensation principle; the principle of substance over form; the principle of threshold significance) to companies that are in bankruptcy procedure has a number of particularities. Thus, some principl...

  2. Scaled CMOS Technology Reliability Users Guide

    Science.gov (United States)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
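    The FIT normalization mentioned above can be sketched as follows, assuming a single-mechanism Arrhenius acceleration model rather than the multiple-failure-mechanism model derived in the report; the test counts, temperatures and activation energy are hypothetical.

```python
import math

K_B = 8.617e-5  # Boltzmann constant [eV/K]

def arrhenius_af(ea_ev, t_stress_k, t_use_k):
    """Arrhenius acceleration factor between stress and use temperature."""
    return math.exp(ea_ev / K_B * (1.0 / t_use_k - 1.0 / t_stress_k))

def fit_rate(failures, devices, test_hours, af):
    """Point-estimate failure rate in FIT (failures per 1e9 device-hours),
    de-rated from stress to use conditions by the acceleration factor."""
    return failures * 1.0e9 / (devices * test_hours * af)

# Hypothetical retention test: 500 parts, 1000 h at 398 K, 3 soft fails;
# the use condition (328 K) and Ea = 0.6 eV are assumed, not from the report.
af = arrhenius_af(0.6, 398.0, 328.0)
fit = fit_rate(3, 500, 1000.0, af)
```

    A multiple-mechanism model would sum such terms, one per mechanism with its own activation energy, which is why separating the early-breakdown and main populations matters.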

  3. Accelerated reliability demonstration under competing failure modes

    International Nuclear Information System (INIS)

    Luo, Wei; Zhang, Chun-hua; Chen, Xun; Tan, Yuan-yuan

    2015-01-01

    The conventional reliability demonstration tests are difficult to apply to products with competing failure modes due to the complexity of the lifetime models. This paper develops a testing methodology based on the reliability target allocation for reliability demonstration under competing failure modes at accelerated conditions. The specified reliability at mission time and the risk caused by sampling of the reliability target for products are allocated for each failure mode. The risk caused by degradation measurement fitting of the target for a product involving performance degradation is equally allocated to each degradation failure mode. According to the allocated targets, the accelerated life reliability demonstration test (ALRDT) plans for the failure modes are designed. The accelerated degradation reliability demonstration test plans and the associated ALRDT plans for the degradation failure modes are also designed. Next, the test plan and the decision rules for the products are designed. Additionally, the effects of the discreteness of sample size and accepted number of failures for failure modes on the actual risks caused by sampling for the products are investigated. - Highlights: • Accelerated reliability demonstration under competing failure modes is studied. • The method is based on the reliability target allocation involving the risks. • The test plan for the products is based on the plans for all the failure modes. • Both failure mode and degradation failure modes are considered. • The error of actual risks caused by sampling for the products is small enough
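    One ingredient of such plans, sketched under simplifying assumptions: the classical zero-failure (success-run) sample size, combined with an equal multiplicative allocation of the system reliability target across independent competing failure modes. This is an illustrative allocation rule, not the paper's specific one.

```python
import math

def success_run_n(r_target, confidence):
    """Classical zero-failure (success-run) sample size: the smallest n
    with r_target**n <= 1 - confidence, i.e. n failure-free survivors
    demonstrate reliability r_target at the stated confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(r_target))

def allocate_equal(r_system, n_modes):
    """Equal multiplicative split of a system reliability target across
    independent competing failure modes (one illustrative rule)."""
    return r_system ** (1.0 / n_modes)

r_sys, conf_level, n_modes = 0.95, 0.90, 3
r_mode = allocate_equal(r_sys, n_modes)          # per-mode target
n_per_mode = success_run_n(r_mode, conf_level)   # parts tested per mode
```

    The example makes the abstract's point about discreteness concrete: rounding n up for each mode means the actual consumer's risk for the product differs slightly from the nominal allocated value.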

  4. Correcting Fallacies in Validity, Reliability, and Classification

    Science.gov (United States)

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  5. Methodological basis for formation of uniterruptible education content for future specialists of atomic-nuclear complex

    International Nuclear Information System (INIS)

    Burtebayev, N.; Burtebayeva, J.T.; Basharuly, R.; Altynsarin, Y.

    2009-01-01

    Full text: For a scientifically reliable determination of the content of an uninterruptible education system, the following levels of theoretical-methodological approach are, as a rule, used in combination: 1) the science-wide methodological level based on the dialectical laws of the theory of knowledge; 2) the science-wide methodological level based on the principles and provisions of system analysis; 3) the particular-science methodological level based on the laws and principles of a specific science [1]. Such a holistic approach, covering all levels of science methodology, is required to determine the content of uninterruptible education for future specialists of nuclear profile. Indeed, considering the problem of the content of uninterruptible education from the point of view of the first, science-wide methodological level, we shall primarily follow the requirements of the dialectical 'Law of common, special and single unity', where, firstly, the universal values in science, culture and technology forming the united invariant of education content of the world education space are positioned as the 'common' component of uninterruptible education content; secondly, the theoretical-practical achievements gained in the countries of a given region (for example, the Eurasian space) are positioned as the 'special' component of the content for the training of specialists of nuclear profile; thirdly, the content elements determined in accordance with the socio-economic order of the specific society, introducing the national interests of the specific country (for example, the Republic of Kazakhstan), are positioned as the 'single' component of the education content for future specialists of the atomic-nuclear complex.
Inseparable unity of the above mentioned components of the education content which have been determined in accordance with the laws, principles and provisions of all three levels of science-methodological approach assures the high level competence and the functional mobility of nuclear profile specialist

  6. The genetic difference principle.

    Science.gov (United States)

    Farrelly, Colin

    2004-01-01

    In the newly emerging debates about genetics and justice three distinct principles have begun to emerge concerning what the distributive aim of genetic interventions should be. These principles are: genetic equality, a genetic decent minimum, and the genetic difference principle. In this paper, I examine the rationale of each of these principles and argue that genetic equality and a genetic decent minimum are ill-equipped to tackle what I call the currency problem and the problem of weight. The genetic difference principle is the most promising of the three principles and I develop this principle so that it takes seriously the concerns of just health care and distributive justice in general. Given the strains on public funds for other important social programmes, the costs of pursuing genetic interventions and the nature of genetic interventions, I conclude that a more lax interpretation of the genetic difference principle is appropriate. This interpretation stipulates that genetic inequalities should be arranged so that they are to the greatest reasonable benefit of the least advantaged. Such a proposal is consistent with prioritarianism and provides some practical guidance for non-ideal societies--that is, societies that do not have the endless amount of resources needed to satisfy every requirement of justice.

  7. The principle of equivalence

    International Nuclear Information System (INIS)

    Unnikrishnan, C.S.

    1994-01-01

    Principle of equivalence was the fundamental guiding principle in the formulation of the general theory of relativity. What are its key elements? What are the empirical observations which establish it? What is its relevance to some new experiments? These questions are discussed in this article. (author). 11 refs., 5 figs

  8. The Dutch premium principle

    NARCIS (Netherlands)

    van Heerwaarden, A.E.; Kaas, R.

    1992-01-01

    A premium principle is derived, in which the loading for a risk is the reinsurance loading for an excess-of-loss cover. It is shown that the principle is well-behaved in the sense that it results in larger premiums for risks that are larger in stop-loss order or in stochastic dominance.
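    A hedged Monte Carlo sketch of the principle as stated: the loading is a fraction theta of the net stop-loss (excess-of-loss) premium at retention alpha·E[X]. The parameter values and the loss distribution below are illustrative, not from the paper.

```python
import random
import statistics

def dutch_premium(samples, theta=0.5, alpha=1.0):
    """Dutch premium principle: premium = E[X] + theta * E[(X - alpha*E[X])+],
    i.e. the loading is theta times the net stop-loss (excess-of-loss)
    premium at retention alpha * E[X]."""
    mu = statistics.fmean(samples)
    stop_loss = statistics.fmean(max(x - alpha * mu, 0.0) for x in samples)
    return mu + theta * stop_loss

# Exponential losses with mean 100: the loading is strictly positive,
# so the premium exceeds the pure premium E[X].
rng = random.Random(1)
losses = [rng.expovariate(1.0 / 100.0) for _ in range(100_000)]
prem = dutch_premium(losses)
```

    Because the loading is a stop-loss expectation, a risk that is larger in stop-loss order automatically receives a larger premium, which is the well-behavedness property the abstract refers to.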

  9. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  10. The anthropic principle

    International Nuclear Information System (INIS)

    Carr, B.J.

    1982-01-01

    The anthropic principle (the conjecture that certain features of the world are determined by the existence of Man) is discussed with the listing of the objections, and is stated that nearly all the constants of nature may be determined by the anthropic principle which does not give exact values for the constants but only their orders of magnitude. (J.T.)

  11. A Simulation Model for Machine Efficiency Improvement Using Reliability Centered Maintenance: Case Study of Semiconductor Factory

    Directory of Open Access Journals (Sweden)

    Srisawat Supsomboon

    2014-01-01

    Full Text Available The purpose of this study was to increase product quality by focusing on machine efficiency improvement. The principle of reliability centered maintenance (RCM) was applied to increase machine reliability. The objective was to create a preventive maintenance plan under the reliability centered maintenance method and to reduce defects. The study target was set to reduce the Lead PPM for a test machine by simulating the proposed preventive maintenance plan. A simulation optimization approach based on evolutionary algorithms was employed for the preventive maintenance technique selection process, to select the PM interval that gave the best total cost and Lead PPM values. The research methodology includes procedures such as prioritizing the critical components in the test machine, analyzing the damage and risk level by using Failure Mode and Effects Analysis (FMEA), calculating the suitable replacement period through reliability estimation, and optimizing the preventive maintenance plan. The results of the study show that the Lead PPM of the test machine can be reduced. The cost of preventive maintenance, cost of good product, and cost of lost product were all decreased.
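    The replacement-period step can be sketched with the standard age-replacement cost-rate model under a Weibull wear-out law (a generic textbook formulation, not the paper's simulation model); all parameters below are hypothetical.

```python
import math

def cost_rate(t_pm, beta, eta, c_pm, c_fail, steps=2000):
    """Long-run cost per hour of age replacement at interval t_pm for a
    Weibull(beta, eta) time to failure: (c_pm*R(T) + c_fail*F(T)) divided
    by the expected cycle length, the integral of R(t) over [0, T]."""
    R = lambda t: math.exp(-((t / eta) ** beta))
    dt = t_pm / steps
    mean_cycle = sum(R((i + 0.5) * dt) for i in range(steps)) * dt  # midpoint rule
    return (c_pm * R(t_pm) + c_fail * (1.0 - R(t_pm))) / mean_cycle

# Hypothetical wear-out mode: Weibull beta = 2.5, eta = 1000 h; a planned
# replacement costs 1 unit, an unplanned failure costs 10 units.
best_t = min(range(50, 2001, 10),
             key=lambda t: cost_rate(t, 2.5, 1000.0, 1.0, 10.0))
```

    A grid search stands in here for the evolutionary optimization used in the study; with an increasing failure rate (beta > 1) and failures costlier than planned replacements, the cost rate has an interior minimum well before the characteristic life.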

  12. Mach's holographic principle

    International Nuclear Information System (INIS)

    Khoury, Justin; Parikh, Maulik

    2009-01-01

    Mach's principle is the proposition that inertial frames are determined by matter. We put forth and implement a precise correspondence between matter and geometry that realizes Mach's principle. Einstein's equations are not modified and no selection principle is applied to their solutions; Mach's principle is realized wholly within Einstein's general theory of relativity. The key insight is the observation that, in addition to bulk matter, one can also add boundary matter. Given a space-time, and thus the inertial frames, we can read off both boundary and bulk stress tensors, thereby relating matter and geometry. We consider some global conditions that are necessary for the space-time to be reconstructible, in principle, from bulk and boundary matter. Our framework is similar to that of the black hole membrane paradigm and, in asymptotically anti-de Sitter space-times, is consistent with holographic duality.

  13. Variational principles in physics

    CERN Document Server

    Basdevant, Jean-Louis

    2007-01-01

    Optimization under constraints is an essential part of everyday life. Indeed, we routinely solve problems by striking a balance between contradictory interests, individual desires and material contingencies. This notion of equilibrium was dear to thinkers of the enlightenment, as illustrated by Montesquieu’s famous formulation: "In all magistracies, the greatness of the power must be compensated by the brevity of the duration." Astonishingly, natural laws are guided by a similar principle. Variational principles have proven to be surprisingly fertile. For example, Fermat used variational methods to demonstrate that light follows the fastest route from one point to another, an idea which came to be known as Fermat’s principle, a cornerstone of geometrical optics. Variational Principles in Physics explains variational principles and charts their use throughout modern physics. The heart of the book is devoted to the analytical mechanics of Lagrange and Hamilton, the basic tools of any physicist. Prof. Basdev...

  14. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  15. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
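    The second approach above, deriving system reliability from component reliability through a fault tree, can be sketched as follows. A minimal illustration for a fictitious device; the tree structure and all failure probabilities are invented for the example:

    ```python
    def or_gate(probs):
        """OR gate: the subsystem fails if ANY input fails (independent inputs)."""
        survive = 1.0
        for p in probs:
            survive *= (1.0 - p)
        return 1.0 - survive

    def and_gate(probs):
        """AND gate: the subsystem fails only if ALL inputs fail (independent inputs)."""
        fail = 1.0
        for p in probs:
            fail *= p
        return fail

    # Hypothetical tree for a fictitious converter: the system fails if the
    # controller fails OR both redundant switching legs fail.
    p_controller = 1e-3
    p_leg = 5e-2
    p_system = or_gate([p_controller, and_gate([p_leg, p_leg])])
    ```

    The redundant pair contributes only 0.05 × 0.05 = 0.0025 to the top event, so the single-point controller failure dominates; this is the kind of ranking of contributors the report describes.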

  16. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long-term operating behavior. (HP)

  17. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?

  18. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, Bayesian analysis. ... A good introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. ... includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future

  19. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  20. Reliability assessment of nuclear structural systems

    International Nuclear Information System (INIS)

    Reich, M.; Hwang, H.

    1983-01-01

    Reliability assessment of nuclear structural systems has been receiving more emphasis over the last few years. This paper deals with the recent progress made by the Structural Analysis Division of Brookhaven National Laboratory (BNL), in the development of a probability-based reliability analysis methodology for safety evaluation of reactor containments and other seismic category I structures. An important feature of this methodology is the incorporation of finite element analysis and random vibration theory. By utilizing this method, it is possible to evaluate the safety of nuclear structures under various static and dynamic loads in terms of limit state probability. Progress in other related areas, such as the establishment of probabilistic characteristics for various loads and structural resistance, are also described. Results of an application of the methodology to a realistic reinforced concrete containment subjected to dead and live loads, accidental internal pressures and earthquake ground accelerations are presented

  1. Methodological pluralism and structure of sociological theory

    Directory of Open Access Journals (Sweden)

    N. L. Polyakova

    2015-01-01

    In this paper, historical-sociological analysis is used to show the differences between theoretical and empirical sociology. There exist several basic traditions in theoretical sociology. The investigation of their competing theoretical and methodological principles, carried out in the paper, identifies some fundamental features of sociological theory as a whole.

  2. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  3. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, a gap in contemporary business process modeling approaches is identified, and general modeling principles that can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy, and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main features of the methodology are explained, together with significant problems met during the project. Drawing on these problems and on the results of the evaluation, the needed future development of the methodology is outlined.

  4. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  5. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: - variability: uncertainty due to heterogeneity, - lack of knowledge: uncertainty due to ignorance. It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he generalised, treating the variability uncertainty by the probability theory and the lack of knowledge uncertainty by the fuzzy theory. He cautioned, however, against the systematic use of probability theory which may lead to unjustifiable and illegitimate precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increased as the problem was getting more and more complex in terms of number of parameters or time steps, and that it was necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory
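    The distinction drawn above calls for two different propagation methods. A minimal sketch, assuming a toy model y = a·x², in which variability in x is propagated by Monte Carlo sampling while lack of knowledge about a is propagated as an interval (a simplification standing in for the fuzzy-theory treatment):

    ```python
    import random

    def f(x, a):
        """Hypothetical model output; a enters monotonically, so interval endpoints suffice."""
        return a * x ** 2

    # Variability: x fluctuates randomly -> propagate with Monte Carlo sampling
    random.seed(0)
    xs = [random.gauss(10.0, 1.0) for _ in range(10_000)]

    # Lack of knowledge: a is only known to lie in an interval -> propagate the bounds
    a_lo, a_hi = 0.8, 1.2

    mean_lo = sum(f(x, a_lo) for x in xs) / len(xs)  # lower envelope of the mean
    mean_hi = sum(f(x, a_hi) for x in xs) / len(xs)  # upper envelope of the mean
    ```

    The result is a band of answers rather than a single number, which reflects Chojnacki's caution: forcing the ignorance about a into a probability distribution would collapse the band into one precise but unjustifiable value.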

  6. Reliability engineering for nuclear and other high technology systems

    International Nuclear Information System (INIS)

    Lakner, A.A.; Anderson, R.T.

    1985-01-01

    This book is written for the reliability instructor, program manager, system engineer, design engineer, reliability engineer, nuclear regulator, probability risk assessment (PRA) analyst, general manager and others who are involved in system hardware acquisition, design and operation and are concerned with plant safety and operational cost-effectiveness. It provides criteria, guidelines and comprehensive engineering data affecting reliability; it covers the key aspects of system reliability as it relates to conceptual planning, cost tradeoff decisions, specification, contractor selection, design, test and plant acceptance and operation. It treats reliability as an integrated methodology, explicitly describing life cycle management techniques as well as the basic elements of a total hardware development program, including: reliability parameters and design improvement attributes, reliability testing, reliability engineering and control. It describes how these elements can be defined during procurement, and implemented during design and development to yield reliable equipment. (author)

  7. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two...... different, but by and large equally justifiable probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence...... is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  8. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  9. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at the Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance initiative in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to three major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program includes: tools to prevent human error; managerial observation and coaching; human factor analysis; quick information about events involving the human factor; human reliability timelines and performance indicators; and basic, periodic and extraordinary training in human factor reliability. (authors)

  10. Implantable biomedical microsystems design principles and applications

    CERN Document Server

    Bhunia, Swarup; Sawan, Mohamad

    2015-01-01

    Research and innovation in areas such as circuits, microsystems, packaging, biocompatibility, miniaturization, power supplies, remote control, reliability, and lifespan are leading to a rapid increase in the range of devices and corresponding applications in the field of wearable and implantable biomedical microsystems, which are used for monitoring, diagnosing, and controlling the health conditions of the human body. This book provides comprehensive coverage of the fundamental design principles and validation for implantable microsystems, as well as several major application areas. Each co

  11. Reliability of thermal interface materials: A review

    International Nuclear Information System (INIS)

    Due, Jens; Robinson, Anthony J.

    2013-01-01

    Thermal interface materials (TIMs) are used extensively to improve thermal conduction across two mating parts. They are particularly crucial in electronics thermal management since excessive junction-to-ambient thermal resistances can cause elevated temperatures which can negatively influence device performance and reliability. Of particular interest to electronic package designers is the thermal resistance of the TIM layer at the end of its design life. Estimates of this allow the package to be designed to perform adequately over its entire useful life. To this end, TIM reliability studies have been performed using accelerated stress tests. This paper reviews the body of work which has been performed on TIM reliability. It focuses on the various test methodologies with commentary on the results which have been obtained for the different TIM materials. Based on the information available in the open literature, a test procedure is proposed for TIM selection based on beginning and end of life performance. - Highlights: ► This paper reviews the body of work which has been performed on TIM reliability. ► Test methodologies for reliability testing are outlined. ► Reliability results for the different TIM materials are discussed. ► A test procedure is proposed for TIM selection based on beginning-of-life and end-of-life performance.

  12. Composite reliability evaluation for transmission network planning

    Directory of Open Access Journals (Sweden)

    Jiashen Teh

    2018-01-01

    As the penetration of wind power into the power system increases, the ability to assess the reliability impact of such interaction becomes more important. Composite reliability evaluations involving wind energy provide ample opportunities for assessing the benefits of different wind farm connection points. A connection to a weak area of the transmission network will require network reinforcement to absorb the additional wind energy. Traditionally, reinforcements are performed by constructing new transmission corridors. However, a new state-of-the-art technology such as the dynamic thermal rating (DTR) system provides a new reinforcement strategy, and this requires a new reliability assessment method. This paper demonstrates a methodology for assessing the cost and reliability of network reinforcement strategies that considers DTR systems when large-scale wind farms are connected to the existing power network. Sequential Monte Carlo simulations were performed, and all DTRs and wind speeds were simulated using an auto-regressive moving average (ARMA) model. Various reinforcement strategies were assessed in terms of both cost and reliability, with practical industrial standards used as guidelines when assessing costs. The proposed methodology is thus able to determine the optimal reinforcement strategies when both cost and reliability requirements are considered.
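    The ARMA-based wind speed simulation used inside such sequential Monte Carlo studies can be sketched as follows. A minimal ARMA(1,1) illustration with invented coefficients, not those of the paper:

    ```python
    import random

    def simulate_arma11(n, phi, theta, sigma, mean_speed):
        """Simulate hourly wind speeds as mean + ARMA(1,1) deviations:
        z_t = phi*z_{t-1} + e_t + theta*e_{t-1}, with e_t ~ N(0, sigma)."""
        random.seed(42)                    # reproducible sample path
        z, e_prev = 0.0, 0.0
        speeds = []
        for _ in range(n):
            e = random.gauss(0.0, sigma)
            z = phi * z + e + theta * e_prev
            e_prev = e
            speeds.append(max(0.0, mean_speed + z))  # wind speed cannot go negative
        return speeds

    # One simulated year (8760 hourly states) with invented coefficients
    speeds = simulate_arma11(n=8760, phi=0.9, theta=0.3, sigma=0.8, mean_speed=7.0)
    ```

    In a full composite study each hourly speed would be converted to turbine output through a power curve and combined with sampled line ratings and component states; the autocorrelation supplied by the AR term is what makes the sequential (rather than state-sampling) simulation worthwhile.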

  13. Limitations of Boltzmann's principle

    International Nuclear Information System (INIS)

    Lavenda, B.H.

    1995-01-01

    The usual form of Boltzmann's principle asserts that maximum entropy, or entropy reduction, occurs with maximum probability, implying a unimodal distribution. Boltzmann's principle cannot be applied to nonunimodal distributions, like the arcsine law, because the entropy may be concave only over a limited portion of the interval. The method of subordination shows that the arcsine distribution corresponds to a process with a single degree of freedom, thereby confirming the invalidation of Boltzmann's principle. The fractalization of time leads to a new distribution in which arcsine and Cauchy distributions can coexist simultaneously for nonintegral degrees of freedom between √2 and 2

  14. Biomedical engineering principles

    CERN Document Server

    Ritter, Arthur B; Valdevit, Antonio; Ascione, Alfred N

    2011-01-01

    Introduction: Modeling of Physiological Processes; Cell Physiology and Transport; Principles and Biomedical Applications of Hemodynamics; A Systems Approach to Physiology; The Cardiovascular System; Biomedical Signal Processing; Signal Acquisition and Processing; Techniques for Physiological Signal Processing; Examples of Physiological Signal Processing; Principles of Biomechanics; Practical Applications of Biomechanics; Biomaterials; Principles of Biomedical Capstone Design; Unmet Clinical Needs; Entrepreneurship: Reasons why Most Good Designs Never Get to Market; An Engineering Solution in Search of a Biomedical Problem

  15. [Bioethics of principles].

    Science.gov (United States)

    Pérez-Soba Díez del Corral, Juan José

    2008-01-01

    Bioethics emerges from the technological problems of acting on human life, and with it emerges the problem of determining moral limits, because these seem exterior to the practice itself. The bioethics of principles takes its rationality from teleological thinking and from autonomism. This divergence manifests the epistemological fragility and the great difficulty of "moral" thinking. This is evident in the formulation of the principle of autonomy, which lacks the ethical content of Kant's proposal. We need a new ethical rationality, with a new reflection on principles that emerge from basic ethical experiences.

  16. Principles of dynamics

    CERN Document Server

    Hill, Rodney

    2013-01-01

    Principles of Dynamics presents classical dynamics primarily as an exemplar of scientific theory and method. This book is divided into three major parts concerned with gravitational theory of planetary systems; general principles of the foundations of mechanics; and general motion of a rigid body. Some of the specific topics covered are Keplerian Laws of Planetary Motion; gravitational potential and potential energy; and fields of axisymmetric bodies. The principles of work and energy, fictitious body-forces, and inertial mass are also looked into. Other specific topics examined are kinematics

  17. Hamilton's principle for beginners

    International Nuclear Information System (INIS)

    Brun, J L

    2007-01-01

    I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a line. Next, students are challenged to add gravity to reinforce the argument and, finally, a two-dimensional motion in a vertical plane is considered. Furthermore these examples force us to be very clear about such an abstract principle
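    The simplest example mentioned above, a free particle moving along a line, can be written out explicitly; a minimal sketch, with m the mass and x(t) the path between fixed endpoints:

    ```latex
    S[x] = \int_{t_1}^{t_2} \tfrac{1}{2}\, m \dot{x}^{2}\, dt,
    \qquad
    \delta S = 0 \;\Longrightarrow\; \frac{d}{dt}\frac{\partial L}{\partial \dot{x}} - \frac{\partial L}{\partial x} = m\ddot{x} = 0,
    ```

    so the stationary paths are the straight lines x(t) = x₀ + vt, i.e. uniform motion. Adding gravity, as the paper suggests next, replaces L with ½mẋ² − mgx and yields ẍ = −g.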

  18. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for the development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy-based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to the critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real-world situations

  19. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article investigates the theoretical and methodological principles of situational analysis and argues for its necessity under modern conditions. The notion of "situational analysis" is defined. We conclude that situational analysis is a continuous system study whose purpose is to identify signs of dangerous situations; to evaluate such signs comprehensively, as influenced by a system of objective and subjective factors; to search for motivated, targeted actions that eliminate adverse effects of the system's exposure to the situation, now and in the future; and to develop the managerial actions needed to bring the system back to norm. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and searching functions in the process of situational analysis is demonstrated. The basic methodological elements of situational analysis are grounded. Substantiating these principal methodological elements will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic and in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by a particular period.

  20. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models
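    The contrast between second-moment and full-distribution theories can be made concrete. A minimal sketch of the classical second-moment (Cornell) reliability index for independent normal resistance and load, with invented numbers:

    ```python
    from statistics import NormalDist

    def cornell_beta(mu_r, sigma_r, mu_s, sigma_s):
        """Second-moment (Cornell) reliability index for the margin M = R - S,
        with resistance R and load S independent and normal."""
        return (mu_r - mu_s) / (sigma_r ** 2 + sigma_s ** 2) ** 0.5

    # Invented numbers: resistance 100 +/- 10, load 60 +/- 15 (same units)
    beta = cornell_beta(mu_r=100.0, sigma_r=10.0, mu_s=60.0, sigma_s=15.0)
    p_f = NormalDist().cdf(-beta)  # failure probability P(M < 0)
    ```

    The index uses only means and variances, which is exactly why, as the paper argues, it cannot absorb statistical uncertainty about the distributions themselves; a full-distribution or inductive treatment would replace the fixed moments with quantities inferred from the limited data.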

  1. Vaccinology: principles and practice

    National Research Council Canada - National Science Library

    Morrow, John

    2012-01-01

    ... principles to implementation. This is an authoritative textbook that details a comprehensive and systematic approach to the science of vaccinology focusing on not only basic science, but the many stages required to commercialize...

  2. On the invariance principle

    Energy Technology Data Exchange (ETDEWEB)

    Moller-Nielsen, Thomas [University of Oxford (United Kingdom)

    2014-07-01

    Physicists and philosophers have long claimed that the symmetries of our physical theories - roughly speaking, those transformations which map solutions of the theory into solutions - can provide us with genuine insight into what the world is really like. According to this 'Invariance Principle', only those quantities which are invariant under a theory's symmetries should be taken to be physically real, while those quantities which vary under its symmetries should not. Physicists and philosophers, however, are generally divided (or, indeed, silent) when it comes to explaining how such a principle is to be justified. In this paper, I spell out some of the problems inherent in other theorists' attempts to justify this principle, and sketch my own proposed general schema for explaining how - and when - the Invariance Principle can indeed be used as a legitimate tool of metaphysical inference.

  3. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  4. Minimum entropy production principle

    Czech Academy of Sciences Publication Activity Database

    Maes, C.; Netočný, Karel

    2013-01-01

    Roč. 8, č. 7 (2013), s. 9664-9677 ISSN 1941-6016 Institutional support: RVO:68378271 Keywords : MINEP Subject RIV: BE - Theoretical Physics http://www.scholarpedia.org/article/Minimum_entropy_production_principle

  5. Global ethics and principlism.

    Science.gov (United States)

    Gordon, John-Stewart

    2011-09-01

    This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together.

  6. A technical survey on issues of the quantitative evaluation of software reliability

    International Nuclear Information System (INIS)

    Park, J. K; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Park, J. H.; Kang, H. G.; Lee, K. Y.; Park, J. K.

    2000-04-01

    To develop a methodology for evaluating the reliability of software included in digital instrumentation and control (I and C) systems, many kinds of methodologies/techniques that have been proposed in the software reliability engineering field are analyzed to identify their strong and weak points. According to the analysis results, methodologies/techniques that can be directly applied to the evaluation of software reliability do not exist. Thus additional research to combine the most appropriate methodologies/techniques from existing ones would be needed to evaluate software reliability. (author)

  7. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO Methodology, how it can be used, and benefits of using it in the analysis of complex systems

  8. Microprocessors principles and applications

    CERN Document Server

    Debenham, Michael J

    1979-01-01

    Microprocessors: Principles and Applications deals with the principles and applications of microprocessors and covers topics ranging from computer architecture and programmed machines to microprocessor programming, support systems and software, and system design. A number of microprocessor applications are considered, including data processing, process control, and telephone switching. This book is comprised of 10 chapters and begins with a historical overview of computers and computing, followed by a discussion on computer architecture and programmed machines, paying particular attention to t

  9. Electrical and electronic principles

    CERN Document Server

    Knight, S A

    1991-01-01

    Electrical and Electronic Principles, 2, Second Edition covers the syllabus requirements of BTEC Unit U86/329, including the principles of control systems and elements of data transmission. The book first tackles series and parallel circuits, electrical networks, and capacitors and capacitance. Discussions focus on flux density, electric force, permittivity, Kirchhoff's laws, superposition theorem, arrangement of resistors, internal resistance, and powers in a circuit. The text then takes a look at capacitors in circuit, magnetism and magnetization, electromagnetic induction, and alternating v

  10. Microwave system engineering principles

    CERN Document Server

    Raff, Samuel J

    1977-01-01

    Microwave System Engineering Principles focuses on the calculus, differential equations, and transforms of microwave systems. This book discusses the basic nature and principles that can be derived from thermal noise; statistical concepts and binomial distribution; incoherent signal processing; basic properties of antennas; and beam widths and useful approximations. The fundamentals of propagation; LaPlace's Equation and Transmission Line (TEM) waves; interfaces between homogeneous media; modulation, bandwidth, and noise; and communications satellites are also deliberated in this text. This bo

  11. Electrical and electronic principles

    CERN Document Server

    Knight, SA

    1988-01-01

    Electrical and Electronic Principles, 3 focuses on the principles involved in electrical and electronic circuits, including impedance, inductance, capacitance, and resistance.The book first deals with circuit elements and theorems, D.C. transients, and the series circuits of alternating current. Discussions focus on inductance and resistance in series, resistance and capacitance in series, power factor, impedance, circuit magnification, equation of charge, discharge of a capacitor, transfer of power, and decibels and attenuation. The manuscript then examines the parallel circuits of alternatin

  12. Remark on Heisenberg's principle

    International Nuclear Information System (INIS)

    Noguez, G.

    1988-01-01

    Application of Heisenberg's principle to inertial frame transformations allows a distinction between three commutative groups of reciprocal transformations along one direction: Galilean transformations, dual transformations, and Lorentz transformations. These are three conjugate groups and for a given direction, the related commutators are all proportional to one single conjugation transformation which compensates for uniform and rectilinear motions. The three transformation groups correspond to three complementary ways of measuring space-time as a whole. Heisenberg's Principle then gets another explanation [fr

  13. Methodological Developments in Geophysical Assimilation Modeling

    Science.gov (United States)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to

  14. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...
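Of the uncertainty propagation methods listed, Monte Carlo simulation is the simplest to sketch. The toy example below (made-up medians and error factors, standard library only) propagates lognormal parameter uncertainty through a two-component series model:

```python
import math
import random

# Monte Carlo propagation of parameter uncertainty (illustrative sketch):
# each component failure probability is uncertain, modeled as lognormal and
# specified by a median and an error factor p95/median (a common PRA
# convention); sample both, combine into a two-component series-system
# failure probability, and summarize the resulting distribution.

random.seed(1)  # reproducible demonstration

def sample_lognormal(median, error_factor):
    sigma = math.log(error_factor) / 1.645   # 1.645 = z-score of the 95th percentile
    return median * math.exp(random.gauss(0.0, sigma))

N = 20_000
samples = sorted(
    1.0 - (1.0 - sample_lognormal(1e-3, 3.0)) * (1.0 - sample_lognormal(5e-4, 3.0))
    for _ in range(N)
)

mean = sum(samples) / N          # point estimate of system failure probability
p95 = samples[int(0.95 * N)]     # 95th percentile of the uncertainty distribution
```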

  15. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis and includes examples of the application of the systems approach.

  16. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

Reliability techniques were developed to meet the needs of the various engineering disciplines, although some practitioners had worked on reliability long before the word acquired its current meaning. The military, space, and nuclear industries were the first to engage with the topic, but this quiet revolution in product reliability was not confined to those environments; it has since spread to industry as a whole. Four decades ago, mass production, characteristic of modern industry, led to a drop in product reliability, partly because of the scale of production itself and partly because the newly introduced industrial techniques were not yet stabilized. Industry had to adapt to these two new requirements, creating products of medium complexity while assuring reliability commensurate with production costs and controls. Reliability became an integral part of the manufactured product. Against this background, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, giving the entire subject a unifying scientific basis. It consists of eight chapters plus statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and suits a diverse audience, as a textbook for both academic and industrial courses. (author)

  17. Qualitative methodology in a psychoanalytic single case study

    DEFF Research Database (Denmark)

    Grünbaum, Liselotte

    features and breaks in psychotherapy investigated. One aim of the study was to contribute to the development of a transparent and systematic methodology for the psychoanalytic case study by application of rigorous qualitative research methodology. To this end, inductive-deductive principles in line...

  18. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime at the plant

  19. Recommendations for certification or measurement of reliability for reliable digital archival repositories with emphasis on access

    Directory of Open Access Journals (Sweden)

    Paula Regina Ventura Amorim Gonçalez

    2017-04-01

Full Text Available Introduction: Considering the guidelines of ISO 16363:2012 (Space data and information transfer systems -- Audit and certification of trustworthy digital repositories) and the text of CONARQ Resolution 39 on certification of Reliable Digital Archival Repositories (RDC-Arq), the study verifies which technical recommendations should serve as the basis for a digital archival repository to be considered reliable. Objective: To identify requirements for the creation of Reliable Digital Archival Repositories, with emphasis on access to information, from ISO 16363:2012 and CONARQ Resolution 39. Methodology: The study consists of an exploratory, descriptive, and documentary theoretical investigation, since it is based on ISO 16363:2012 and CONARQ Resolution 39. With respect to the problem approach, the study is both qualitative and quantitative, since the data were collected, tabulated, and analyzed through interpretation of their contents. Results: A checklist of recommendations for reliability measurement and/or certification of an RDC-Arq is presented, focused on requirements with emphasis on access to information. Conclusions: The right to information, and access to reliable information, is a premise for digital archival repositories; the set of recommendations is therefore directed at archivists who work in digital repositories and wish to verify the requirements for evaluating a repository's reliability, and it can also guide information professionals in collecting requirements for repository reliability certification.

  20. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

Purpose: This article mainly describes the reliability of cutting tools in high speed turning by a normal distribution model.Design/methodology/approach: A series of experimental tests was done to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of cutting tools are derived. Further, the reliability of cutting tools at anytime for h...
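A hedged sketch of the normal-distribution reliability model named in the abstract; the mean and standard deviation of tool life below are illustrative values, not the paper's experimental results:

```python
import math

# Reliability of a cutting tool whose life T is modeled as normally
# distributed: R(t) = P(T > t) = 1 - Phi((t - mu) / sigma).

def tool_reliability(t, mu, sigma):
    """Survival probability R(t) for tool life T ~ Normal(mu, sigma)."""
    z = (t - mu) / sigma
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))   # 1 - standard normal CDF

mu, sigma = 30.0, 5.0                       # assumed tool life stats, minutes
r20 = tool_reliability(20.0, mu, sigma)     # ~0.977 survival at 20 min (z = -2)
r30 = tool_reliability(30.0, mu, sigma)     # exactly 0.5 at the mean life
```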

  1. Guide for generic application of Reliability Centered Maintenance (RCM) recommendations

    International Nuclear Information System (INIS)

    Schwan, C.A.; Toomey, G.E.; Morgan, T.A.; Darling, S.S.

    1991-02-01

    Previously completed reliability centered maintenance (RCM) studies form the basis for developing or refining a preventive maintenance program. This report describes a generic methodology that will help utilities optimize nuclear plant maintenance programs using RCM techniques. This guide addresses the following areas: history of the generic methodology development process, and use of the generic methodology for conducting system-to-system and component-to-component evaluations. 2 refs., 2 figs., 5 tabs

  2. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

Full Text Available The article considers methodological foundations of system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on the axiomatics of the cultural-historical psychology of L.S. Vygotsky and on transspective analysis, a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in the constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has shown that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept "dynamics of the paradigm of science" is introduced; it allows transitions to be acknowledged from the ordinary-binary logic characteristic of classical science to a binary-ternary logic, adequate to non-classical science, and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in SAP construction. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of the generative effect of selective interaction. The concept of "complementary interaction", applied in the natural as well as the humanitarian sciences, is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe, embracing all kinds of systems including living ones. Different levels of matter organization, representing semantic structures of various complexity, use one and the same principle of meaning making, through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for the nature and stages of emergence of the multidimensional life space of an individual, which comes as a foundation for generation of such features of

  3. A Principle of Intentionality.

    Science.gov (United States)

    Turner, Charles K

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett's model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone.

  4. The traveltime holographic principle

    Science.gov (United States)

    Huang, Yunsong; Schuster, Gerard T.

    2015-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the `traveltime holographic principle', by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
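One way to see the principle numerically: in a homogeneous medium the traveltime triangle inequality |τsp − τsq| ≤ τpq becomes tight when the boundary source s lies on the extension of the straight ray through p and q, so dense boundary sampling recovers the interior time. The sketch below checks this geometry for a unit-circle boundary; it is an illustration of the underlying idea, not the paper's general heterogeneous-medium algorithm:

```python
import math

# Homogeneous-medium check: for every boundary source s,
# |tau_sp - tau_sq| <= tau_pq, with equality when s lies on the extension
# of the straight ray through interior points p and q. Dense sampling of
# the boundary B therefore recovers the interior time from the exterior
# times alone, with no ray tracing between interior points.

v = 2.0                                   # constant velocity (assumed)
p, q = (0.3, -0.1), (-0.2, 0.4)           # interior points inside the unit circle

def tau(a, b):
    return math.dist(a, b) / v            # straight-ray traveltime

sources = [(math.cos(t), math.sin(t))     # dense sampling of the boundary B
           for t in (2 * math.pi * k / 3600 for k in range(3600))]

recovered = max(abs(tau(s, p) - tau(s, q)) for s in sources)
direct = tau(p, q)                        # recovered approaches this from below
```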

  5. Ethical principles of scientific communication

    Directory of Open Access Journals (Sweden)

    Baranov G. V.

    2017-03-01

Full Text Available The article presents the principles of ethical management of scientific communication. The author affirms the priority of the ethical principle of the social responsibility of the scientist.

  6. Seismic stops vs. snubbers, a reliable alternative

    International Nuclear Information System (INIS)

    Cloud, R.L.; Anderson, P.H.; Leung, J.S.M.

    1988-01-01

The Seismic Stops methodology has been developed to provide a reliable alternative for seismic support of nuclear power plant piping. The concept is based on using rigid passive supports with large clearances. These gaps permit unrestrained thermal expansion while limiting excessive seismic displacements. This type of restraint has performed successfully in fossil-fueled power plants. A simplified production analysis tool has been developed which evaluates the nonlinear piping response including the effect of the gapped supports. The methodology utilizes the response spectrum approach and has been incorporated into a piping analysis computer program, RLCA-GAP. Full scale shake table tests of piping specimens were performed to provide test correlation with the developed methodology. Analyses using RLCA-GAP were in good agreement with test results. A sample piping system was evaluated using the Seismic Stops methodology to replace the existing snubbers with passive gapped supports. To provide further correlation data, the sample system was also evaluated using nonlinear time history analysis. The correlation comparisons showed RLCA-GAP to be a viable methodology and a reliable alternative for snubber optimization and elimination. (orig.)
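The key nonlinearity, a support that is inactive within its clearance and stiff once the gap closes, can be sketched as a piecewise restoring-force function. The gap size and contact stiffness below are assumed illustrative values, not parameters from the paper:

```python
# Idealized restoring force of a gapped "Seismic Stop" style support: no
# force while the pipe moves freely within the clearance (unrestrained
# thermal expansion), stiff elastic contact once the gap closes.
# Gap size and contact stiffness are made-up illustrative values.

def gap_support_force(x, gap=0.005, k=5.0e7):
    """Support force (N) opposing pipe displacement x (m), symmetric gap."""
    if abs(x) <= gap:
        return 0.0                          # free movement inside the clearance
    overlap = abs(x) - gap                  # penetration past the clearance
    return -k * overlap if x > 0 else k * overlap
```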

  7. Life management and operational experience feedback - tools to enhance safety and reliability of the NPP

    International Nuclear Information System (INIS)

    Mach, P.

    1997-01-01

    Preparation has started of the Temelin power plant centralized equipment database. Principles of reliability centered maintenance are studied, and use of these activities will be made in the Plant Ageing Management Programme. The aims of the Programme are as follows: selection of important components subject to ageing, data collection, determination of dominant stressors, development, selection and validation of ageing evaluation methods, setup of experience feedback, determination of responsibilities, methodologies and strategy, elaboration of programme procedures and documentation, and maintenance of programme flexibility. Pilot studies of component ageing are under way: for the reactor pressure vessel, steam generator, pressurizer, piping, ECCS and cables. The organizational structure of the Operational Experience Feedback system is described, as are the responsibility of staff and sources of information. (M.D.)

  8. Parts and Components Reliability Assessment: A Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia

    2009-01-01

System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), to assess risks, perform design tradeoffs, and therefore ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems based on failure rate estimates published by the United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages when the system design is still in development and hard failure data is not yet available, or manufacturers are not contractually obliged by their customers to publish the reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success efficiently, at low cost, and on a tight schedule.
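A minimal parts-count sketch of the standard-based approach: in a series model the system failure rate is the sum of quantity-weighted part failure rates, from which MTBF and mission reliability follow. The part list and rates below are placeholders, not values from any handbook:

```python
import math

# Parts-count sketch of standard-based reliability prediction: in a series
# model, the system failure rate is the sum of quantity-weighted part
# failure rates. All rates are made-up placeholders.

parts = {                                  # name: (quantity, failures / 1e6 h)
    "microcontroller": (1, 0.50),
    "ceramic capacitor": (24, 0.002),
    "connector pin": (40, 0.001),
}

lam_sys = sum(n * lam for n, lam in parts.values())    # failures per 1e6 h
mtbf_hours = 1e6 / lam_sys                             # mean time between failures
r_one_year = math.exp(-lam_sys * 8760 / 1e6)           # survival over 8760 h
```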

  9. Reliability of application of inspection procedures

    Energy Technology Data Exchange (ETDEWEB)

    Murgatroyd, R A

    1988-12-31

This document deals with the reliability of application of inspection procedures. A method is described for ensuring that the inspection of defects, assessed using fracture mechanics, is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the likelihood of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and training period for inspectors be optimised. (TEC). 3 refs.

  10. Reliability of application of inspection procedures

    International Nuclear Information System (INIS)

    Murgatroyd, R.A.

    1988-01-01

This document deals with the reliability of application of inspection procedures. A method is described for ensuring that the inspection of defects, assessed using fracture mechanics, is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the likelihood of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and training period for inspectors be optimised. (TEC)

  11. Ethical principles and theories.

    Science.gov (United States)

    Schultz, R C

    1993-01-01

    Ethical theory about what is right and good in human conduct lies behind the issues practitioners face and the codes they turn to for guidance; it also provides guidance for actions, practices, and policies. Principles of obligation, such as egoism, utilitarianism, and deontology, offer general answers to the question, "Which acts/practices are morally right?" A re-emerging alternative to using such principles to assess individual conduct is to center normative theory on personal virtues. For structuring society's institutions, principles of social justice offer alternative answers to the question, "How should social benefits and burdens be distributed?" But human concerns about right and good call for more than just theoretical responses. Some critics (eg, the postmodernists and the feminists) charge that normative ethical theorizing is a misguided enterprise. However, that charge should be taken as a caution and not as a refutation of normative ethical theorizing.

  12. Principles of musical acoustics

    CERN Document Server

    Hartmann, William M

    2013-01-01

    Principles of Musical Acoustics focuses on the basic principles in the science and technology of music. Musical examples and specific musical instruments demonstrate the principles. The book begins with a study of vibrations and waves, in that order. These topics constitute the basic physical properties of sound, one of two pillars supporting the science of musical acoustics. The second pillar is the human element, the physiological and psychological aspects of acoustical science. The perceptual topics include loudness, pitch, tone color, and localization of sound. With these two pillars in place, it is possible to go in a variety of directions. The book treats in turn, the topics of room acoustics, audio both analog and digital, broadcasting, and speech. It ends with chapters on the traditional musical instruments, organized by family. The mathematical level of this book assumes that the reader is familiar with elementary algebra. Trigonometric functions, logarithms and powers also appear in the book, but co...

  13. Reliability allocation in nuclear power plants

    International Nuclear Information System (INIS)

    Bari, R.A.; Cho, N.Z.; Papazoglou, I.A.

    1985-01-01

    The technical feasibility of allocating reliability and risk to reactor systems, subsystems, components, operations, and structures is investigated. A methodology is discussed which identifies top level risk indices as objective functions and plant-specific performance variables as decision variables. These are related by a risk model which includes cost as a top level risk index. A multiobjective optimization procedure is used to find non-inferior solutions in terms of the objective functions and the decision variables. The approach is illustrated for a boiling water reactor plant. The use of the methodology for both operating reactors and for advanced designs is briefly discussed. 16 refs., 1 fig
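The noninferior-solution idea can be sketched on a toy allocation problem: enumerate redundancy choices, score each by system failure probability (risk) and cost, and keep the non-dominated set. All numbers below are made up for illustration:

```python
# Toy reliability-allocation problem: choose a redundancy level for each of
# two subsystems; each choice trades system failure probability against
# cost. The "noninferior" (Pareto-optimal) set contains the candidates not
# dominated in both objectives. All figures are illustrative.

def candidates():
    for n1 in (1, 2, 3):          # redundancy of subsystem 1 (p_fail 0.1 each)
        for n2 in (1, 2, 3):      # redundancy of subsystem 2 (p_fail 0.05 each)
            risk = 1 - (1 - 0.1 ** n1) * (1 - 0.05 ** n2)
            cost = 10 * n1 + 15 * n2
            yield (risk, cost, (n1, n2))

def noninferior(points):
    """Keep points not dominated in both risk and cost."""
    pts = list(points)
    return [a for a in pts
            if not any(b[0] <= a[0] and b[1] <= a[1] and
                       (b[0] < a[0] or b[1] < a[1]) for b in pts)]

frontier = noninferior(candidates())      # the noninferior solution set
```

A decision maker's preferences are then assessed over this (much smaller) frontier rather than over all candidates.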

  14. Reliability assessment for safety critical systems by statistical random testing

    International Nuclear Information System (INIS)

    Mills, S.E.

    1995-11-01

In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs

  15. Reliability assessment for safety critical systems by statistical random testing

    Energy Technology Data Exchange (ETDEWEB)

    Mills, S E [Carleton Univ., Ottawa, ON (Canada). Statistical Consulting Centre

    1995-11-01

In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs.

  16. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
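The cost-integration method described can be sketched as a one-dimensional minimization: total cost equals reserve capacity cost plus expected unserved energy valued at customers' outage cost. The cost figures and the exponential unserved-energy model below are assumed for illustration, not taken from the report:

```python
import math

# Toy version of the cost-integration idea: pick the reserve margin that
# minimizes annual capacity cost plus expected outage cost (unserved
# energy valued at the customers' "value of lost load"). All numbers and
# the exponential unserved-energy model are illustrative assumptions.

CAP_COST = 120_000.0    # $/MW-year of reserve capacity
VOLL = 20_000.0         # $/MWh of unserved energy (value of lost load)

def expected_unserved_mwh(reserve_mw):
    # Assumed: expected unserved energy falls off exponentially with reserve.
    return 5_000.0 * math.exp(-reserve_mw / 40.0)

def total_cost(reserve_mw):
    return CAP_COST * reserve_mw + VOLL * expected_unserved_mwh(reserve_mw)

best = min(range(0, 301), key=total_cost)   # search 0..300 MW in 1 MW steps
```

At the optimum, the marginal cost of one more MW of reserve just balances the marginal reduction in expected outage cost.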

  17. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  18. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  19. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  20. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  1. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component, a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  2. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  3. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
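    As a minimal illustration of the Bayesian approach to reliability data, the sketch below applies the standard Gamma-Poisson conjugate update to a constant failure rate; the prior parameters and exposure figures in the test are invented for the example and are not taken from the course.

```python
def posterior_failure_rate(prior_shape, prior_rate, n_failures, exposure_hours):
    """Gamma-Poisson conjugate update for a constant failure rate (per hour).

    Prior: lambda ~ Gamma(prior_shape, prior_rate); data: n_failures observed
    over exposure_hours. Returns the posterior (shape, rate) and posterior mean.
    """
    shape = prior_shape + n_failures       # shape accumulates observed failures
    rate = prior_rate + exposure_hours     # rate accumulates exposure time
    return shape, rate, shape / rate
```

    With sparse data the posterior mean stays close to the prior; as exposure grows, it converges to the observed failure count divided by the exposure time.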

  4. Mechanical engineering principles

    CERN Document Server

    Bird, John

    2014-01-01

    A student-friendly introduction to core engineering topics. This book introduces mechanical principles and technology through examples and applications, enabling students to develop a sound understanding of both engineering principles and their use in practice. These theoretical concepts are supported by 400 fully worked problems, 700 further problems with answers, and 300 multiple-choice questions, all of which add up to give the reader a firm grounding on each topic. The new edition is up to date with the latest BTEC National specifications and can also be used on undergraduate courses in mecha

  5. Itch Management: General Principles.

    Science.gov (United States)

    Misery, Laurent

    2016-01-01

    Like pain, itch is a challenging condition that needs to be managed. Within this setting, the first principle of itch management is to get an appropriate diagnosis to perform an etiology-oriented therapy. In several cases it is not possible to treat the cause, the etiology is undetermined, there are several causes, or the etiological treatment is not effective enough to alleviate itch completely. This is also why there is need for symptomatic treatment. In all patients, psychological support and associated pragmatic measures might be helpful. General principles and guidelines are required, yet patient-centered individual care remains fundamental. © 2016 S. Karger AG, Basel.

  6. Principles of Optics

    Science.gov (United States)

    Born, Max; Wolf, Emil

    1999-10-01

    Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous media and presents an account of the principles of diffraction tomography to which Emil Wolf has made a basic contribution. Several new appendices are also included. This new edition will be invaluable to advanced undergraduates, graduate students and researchers working in most areas of optics.

  7. Electrical principles 3 checkbook

    CERN Document Server

    Bird, J O

    2013-01-01

    Electrical Principles 3 Checkbook aims to introduce students to the basic electrical principles needed by technicians in electrical engineering, electronics, and telecommunications. The book first tackles circuit theorems, single-phase series A.C. circuits, and single-phase parallel A.C. circuits. Discussions focus on worked problems on parallel A.C. circuits, worked problems on series A.C. circuits, main points concerned with D.C. circuit analysis, worked problems on circuit theorems, and further problems on circuit theorems. The manuscript then examines three-phase systems and D.C. transients

  8. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  9. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and methodology guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro level to disaggregated end-use level metrics -- is presented to help shape the understanding of assessing energy efficiency. In each sector of activity: industry, commercial, residential, agriculture and transport, indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet addresses specifically issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.
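    A common building block for such indicators is an index decomposition that separates an activity effect from an intensity (efficiency) effect. The sketch below shows a single-sector LMDI-I decomposition under the assumption E = A × I; the function names and the numbers in the test are illustrative, not taken from the booklet.

```python
from math import log

def log_mean(a, b):
    """Logarithmic mean, the weighting used in LMDI-I."""
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi_decomposition(activity0, energy0, activity1, energy1):
    """Split the change in energy use E = A * I between a base year (0)
    and a target year (1) into an activity effect and an intensity effect.
    The two effects sum exactly to energy1 - energy0."""
    i0, i1 = energy0 / activity0, energy1 / activity1
    w = log_mean(energy1, energy0)
    activity_effect = w * log(activity1 / activity0)
    intensity_effect = w * log(i1 / i0)
    return activity_effect, intensity_effect
```

    A falling intensity effect alongside a rising activity effect is the typical signature of efficiency gains being partly offset by economic growth.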

  10. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer-assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  11. Organizing the Methodology Work at Higher School

    Directory of Open Access Journals (Sweden)

    O. A. Plaksina

    2012-01-01

    Full Text Available The paper considers the methodology components of organizing higher school training. The research and analysis of the existing methodology systems carried out by the authors reveal that their advantages and disadvantages are related to the type of the system-creating element of the methodology system's organizational structure. The optimal scheme of such a system has been developed in the context of Vocational School Reorganization, implying the specification and expansion of the set of basic design principles of any control system. Following the suggested organizational approach provides the grounds for teachers’ self-development and professional growth. The methodology of the approach allows using the given structure in any higher educational institution, providing the system's transition from simple functioning to a sustainable development mode.

  12. Risk assessment and reliability for low level radioactive waste disposal

    International Nuclear Information System (INIS)

    Gregory, P.O.; Jones, G.A.

    1986-01-01

    The reliability of critical design features at low-level radioactive waste disposal facilities is a major concern in the licensing of these structures. To date, no systematic methodology has been adopted to evaluate the geotechnical reliability of Uranium Mill Tailings Remedial Action (UMTRA) disposal facilities currently being designed and/or constructed. This paper discusses and critiques the deterministic methods currently used to evaluate UMTRA reliability. Because deterministic methods may not be applicable in some cases because of the unusually long design life of UMTRA facilities, it is proposed that a probabilistic risk assessment-based methodology be used as a secondary method to aid in the evaluating of geotechnical reliability of critical items. Similar methodologies have proven successful in evaluating the reliability of a variety of conventional earth structures. In this paper, an ''acceptable'' level of risk for UMTRA facilities is developed, an evaluation method is presented, and two example applications of the proposed methodology are provided for a generic UMTRA disposal facility. The proposed technique is shown to be a simple method which might be used to aid in reliability evaluations on a selective basis. Finally, other possible applications and the limitations of the proposed methodology are discussed

  13. Effect of communication on the reliability of nuclear power plant control room operations - pre study

    International Nuclear Information System (INIS)

    Kettunen, Jari; Pyy, Pekka

    1999-01-01

    The objective of the study presented in this paper is to investigate communication practices and their impact on human reliability and plant safety in a nuclear power plant environment. The study aims at developing a general systems approach towards the issue. The ultimate goal of the study is to contribute to the development of probabilistic safety assessment methodologies in the area of communications and crew co-operation. This paper outlines the results of the pre-study. The study is based on the use and further development of different modelling techniques and the application of generic systems engineering as well as crew resource management (CRM) principles. The results so far include a concise literature review on communication and crew performance, a presentation of some potential theoretical concepts and approaches for studying communication in relation to organisational reliability, causal failure sequences and human failure mechanisms, and an introduction of a General Communications Model (GCM) that is presented as a promising approach for studying the reliability and adequacy of communication transactions. Finally, some observations and recommendations concerning the next phases of the study are made (author) (ml)

  14. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology which could be applicable to establishing a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities, as well as undeclared facilities, in the country under review. In our view a country report should aim at providing detailed information on nuclear-related activities for each country examined taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know whether there is already an operating civil nuclear programme or not. In the first case, we have to check carefully whether nuclear material could be diverted, whether declared facilities are misused, or whether undeclared facilities are operated and undeclared activities conducted with the aim of manufacturing a nuclear weapon. In the second case, we should pay attention to the development of a civil nuclear project. A country report is based on a wide span of information (most of the time coming from open sources but sometimes also from confidential or private ones). Therefore, it is important to carefully check the nature and the credibility (reliability) of these sources through cross-check examination. Finally, it is necessary to merge information from different sources and apply an expertise filter. We have at our disposal many powerful tools to help us assess, understand and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to draw the best conclusions possible. The paper is followed by the slides of the presentation. (author)

  15. Justifying Design Decisions with Theory-based Design Principles

    OpenAIRE

    Schermann, Michael; Gehlert, Andreas; Pohl, Klaus; Krcmar, Helmut

    2014-01-01

    Although the role of theories in design research is recognized, we show that little attention has been paid to how to use theories when designing new artifacts. We introduce design principles as a new methodological approach to address this problem. Design principles extend the notion of design rationales, which document how a design decision emerged. We extend the concept of design rationales by using theoretical hypotheses to support or object to design decisions. Using the example of developing...

  16. The Principles of Readability

    Science.gov (United States)

    DuBay, William H.

    2004-01-01

    The principles of readability are in every style manual. Readability formulas are in every writing aid. What is missing is the research and theory on which they stand. This short review of readability research spans 100 years. The first part covers the history of adult literacy studies in the U.S., establishing the stratified nature of the adult…
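    One of the readability formulas alluded to above is the widely published Flesch Reading Ease score, computed from word, sentence, and syllable counts. The sketch below implements that standard formula; the counts used in the test are invented.

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Flesch Reading Ease: higher scores mean easier text
    (typical prose falls roughly between 0 and 100)."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
```

    Longer sentences and more syllables per word both lower the score, which is the formula's operationalization of reading difficulty.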

  17. Principles of electrodynamics

    CERN Document Server

    Schwartz, Melvin

    1972-01-01

    This advanced undergraduate- and graduate-level text by the 1988 Nobel Prize winner establishes the subject's mathematical background, reviews the principles of electrostatics, then introduces Einstein's special theory of relativity and applies it throughout the book in topics ranging from Gauss' theorem and Coulomb's law to electric and magnetic susceptibility.

  18. The Idiom Principle Revisited

    Science.gov (United States)

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  19. The Pauli Exclusion Principle

    Indian Academy of Sciences (India)

    his exclusion principle, the quantum theory was a mess. Moreover, it could ... This is a function of all the coordinates and 'internal variables' such as spin, of all the ... must remain basically the same (ie change by a phase factor at most) if we ...

  20. The traveltime holographic principle

    KAUST Repository

    Huang, Y.; Schuster, Gerard T.

    2014-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the ‘traveltime holographic principle’, by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.

  1. The Bohr Correspondence Principle

    Indian Academy of Sciences (India)

    IAS Admin

    Deepak Dhar. Keywords: correspondence principle, hydrogen atom, Kepler orbit. Deepak Dhar works at the Tata Institute of Fundamental Research, Mumbai. His research interests are mainly in the area of statistical physics. We consider the quantum-mechanical non-relativistic hydrogen atom. We show that for bound.

  2. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, Fundamental Safety Principles, published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. Safety measures and security measures have in common the aim of protecting human life and health and the environment. The safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  3. Principles of Protocol Design

    DEFF Research Database (Denmark)

    Sharp, Robin

    This is a new and updated edition of a book first published in 1994. The book introduces the reader to the principles used in the construction of a large range of modern data communication protocols, as used in distributed computer systems of all kinds. The approach taken is rather a formal one...

  4. The traveltime holographic principle

    KAUST Repository

    Huang, Y.

    2014-11-06

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the ‘traveltime holographic principle’, by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.

  5. Fermat's Principle Revisited.

    Science.gov (United States)

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)
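    The idea of replacing the calculus of variations with a function of discrete variables can be sketched numerically: minimize the travel time over candidate crossing points on a refraction interface and check that the minimizer satisfies Snell's law. The geometry and grid search below are illustrative assumptions, not the article's own example.

```python
import math

def travel_time(x, a, b, d, v1, v2):
    """Time for light from (0, a) to (d, -b), crossing the interface y=0 at (x, 0),
    with speed v1 above the interface and v2 below it."""
    return math.hypot(x, a) / v1 + math.hypot(d - x, b) / v2

def best_crossing(a, b, d, v1, v2, steps=200000):
    """Fermat's principle over a discrete variable: grid-search the crossing
    point that minimizes travel time (the function is convex in x)."""
    xs = [d * i / steps for i in range(steps + 1)]
    return min(xs, key=lambda x: travel_time(x, a, b, d, v1, v2))
```

    At the minimizing point the incidence and refraction angles satisfy sin θ1 / v1 = sin θ2 / v2, which is Snell's law recovered without any variational calculus.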

  6. Principles of economics textbooks

    DEFF Research Database (Denmark)

    Madsen, Poul Thøis

    2012-01-01

    Has the financial crisis already changed US principles of economics textbooks? Rather little has changed in individual textbooks, but taken as a whole ten of the best-selling textbooks suggest rather encompassing changes of core curriculum. A critical analysis of these changes shows how individual...

  7. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts who are experienced in these techniques, the analyses can be focused in certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs
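    A core step in quantifying internal-event core damage frequency is summing the minimal cut sets under the rare-event approximation. The sketch below is a generic illustration of that step, with invented event names and probabilities; it is not the NUREG-1150 model.

```python
def cut_set_frequency(cut_sets, probs):
    """Rare-event approximation: sum over minimal cut sets of the product of
    basic-event probabilities. The initiating-event frequency can be included
    as one of the events so the result is a frequency (per year)."""
    total = 0.0
    for cut_set in cut_sets:
        p = 1.0
        for event in cut_set:
            p *= probs[event]
        total += p
    return total
```

    The approximation slightly overstates the exact union probability, which is why it is conservative and acceptable when basic-event probabilities are small.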

  8. Reliability studies in research reactors

    International Nuclear Information System (INIS)

    Albuquerque, Tob Rodrigues de

    2013-01-01

    Fault trees and event trees are widely used in industry to model and evaluate the reliability of safety systems. Detailed analyses of nuclear installations require the combination of these two techniques. This study uses the FT (Fault Tree) and ET (Event Tree) methods to carry out a PSA (Probabilistic Safety Assessment) of research reactors. According to the IAEA (International Atomic Energy Agency), a PSA is divided into Level 1, Level 2 and Level 3. Conceptually, at Level 1 the safety systems act to prevent the occurrence of accidents; Level 2, known as the accident-management stage, seeks to minimize the consequences once an accident has happened; and at Level 3 the accident impacts are determined. This study focuses on the Level 1 analysis, seeking, through the acquisition of knowledge, to consolidate methodologies for future reliability studies. The Greek Research Reactor, GRR-1, is used as a case example. A LOCA (Loss of Coolant Accident) was chosen as the initiating event and, from it, possible accident sequences that could lead to core damage were developed using ETs. Moreover, for each of the affected systems, top-event probabilities were evaluated with FTs and propagated through the possible accident sequences. Estimates of importance measures for the basic events are also presented in this work. The studies of this research were conducted using the commercial computational tool SAPHIRE. The results obtained were considered satisfactory for assessing the performance or failure of the analyzed systems. (author)
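    The Level 1 quantification described above multiplies the initiating-event frequency by the branch probabilities along each event-tree sequence. A minimal sketch, with invented system names and numbers rather than the GRR-1 data:

```python
def sequence_frequency(initiator_freq, branch_outcomes, failure_probs):
    """Frequency (per year) of one event-tree sequence.

    branch_outcomes maps each safety system to True (the system fails on
    this branch) or False (it succeeds); failure_probs holds the per-demand
    failure probability of each system, e.g. from a fault-tree top event.
    """
    f = initiator_freq
    for system, fails in branch_outcomes.items():
        p = failure_probs[system]
        f *= p if fails else (1.0 - p)
    return f
```

    The core damage frequency is then the sum of the frequencies of all sequences whose end state is core damage.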

  9. A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    International Nuclear Information System (INIS)

    George-Williams, Hindolo; Patelli, Edoardo

    2016-01-01

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system specific or limited to simple systems since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology is presented. Here, each component is defined as a Semi-Markov stochastic process and via discrete-event simulation, the operation of the system is mimicked. The principles of flow conservation are invoked to determine flow across the system for every performance level change of its components using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability. - Highlights: • A discrete event simulation model based on load flow principles. • Model does not require system path or cut sets. • Applicable to binary and multi-state systems of any topology. • Supports multiple output systems with competing demand. • Model is intuitive and generally applicable.
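    The discrete-event core of the approach can be hinted at with a much-reduced sketch: a single two-state component with exponential dwell times. This is an illustrative simplification only; the paper's components are semi-Markov with multiple performance levels, and system flow is resolved with an interior-point algorithm, neither of which is attempted here.

```python
import random

def simulate_availability(mttf, mttr, horizon, seed=0):
    """Discrete-event simulation of one up/down component with exponential
    up times (mean mttf) and repair times (mean mttr). Returns the fraction
    of the horizon spent up, an estimate of long-run availability."""
    rng = random.Random(seed)
    t, up_time, state_up = 0.0, 0.0, True
    while t < horizon:
        mean_dwell = mttf if state_up else mttr
        dwell = rng.expovariate(1.0 / mean_dwell)   # sample next transition
        dwell = min(dwell, horizon - t)             # clip at end of horizon
        if state_up:
            up_time += dwell
        t += dwell
        state_up = not state_up                     # event: state change
    return up_time / horizon
```

    For this simple case the simulated value can be checked against the analytic availability MTTF / (MTTF + MTTR); the simulation approach pays off precisely when no such closed form exists.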

  10. The reliability of commonly used electrophysiology measures.

    Science.gov (United States)

    Brown, K E; Lohse, K R; Mayer, I M S; Strigaro, G; Desikan, M; Casula, E P; Meunier, S; Popa, T; Lamy, J-C; Odish, O; Leavitt, B R; Durr, A; Roos, R A C; Tabrizi, S J; Rothwell, J C; Boyd, L A; Orth, M

    Electrophysiological measures can help understand brain function both in healthy individuals and in the context of a disease. Given the amount of information that can be extracted from these measures and their frequent use, it is essential to know more about their inherent reliability. To understand the reliability of electrophysiology measures in healthy individuals. We hypothesized that measures of threshold and latency would be the most reliable and least susceptible to methodological differences between study sites. Somatosensory evoked potentials from 112 control participants; long-latency reflexes, transcranial magnetic stimulation with resting and active motor thresholds, motor evoked potential latencies, input/output curves, and short-latency sensory afferent inhibition and facilitation from 84 controls were collected at 3 visits over 24 months at 4 Track-On HD study sites. Reliability was assessed using intra-class correlation coefficients for absolute agreement, and the effects of reliability on statistical power are demonstrated for different sample sizes and study designs. Measures quantifying latencies, thresholds, and evoked responses at high stimulator intensities had the highest reliability, and required the smallest sample sizes to adequately power a study. Very few between-site differences were detected. Reliability and susceptibility to between-site differences should be evaluated for electrophysiological measures before including them in study designs. Levels of reliability vary substantially across electrophysiological measures, though there are few between-site differences. To address this, reliability should be used in conjunction with theoretical calculations to inform sample size and ensure studies are adequately powered to detect true change in measures of interest. Copyright © 2017 Elsevier Inc. All rights reserved.
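    Reliability of the kind assessed above is typically quantified with an intraclass correlation coefficient for absolute agreement. Below is a small sketch of ICC(2,1) computed from a subjects × raters (or subjects × visits) matrix, assuming the standard two-way random-effects formulation; the data used in the test are invented.

```python
def icc_a1(data):
    """ICC(2,1), absolute agreement, from a subjects x raters matrix
    (list of equal-length rows), via two-way ANOVA mean squares."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)    # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)    # between raters
    ss_total = sum((data[i][j] - grand) ** 2
                   for i in range(n) for j in range(k))
    ss_err = ss_total - ss_rows - ss_cols                     # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    Because it measures absolute agreement, a constant offset between raters lowers this ICC even when their scores are perfectly correlated.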

  11. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

    Reliability can also be considered for materials. Whereas the reliability of components is mainly characterized by the MTBF (mean time between failures), for materials this criterion is replaced by the analysis of possible failure mechanisms, such as physical or chemical reaction mechanisms, disturbances of physical or chemical equilibrium, and other interactions or changes of the system. The main tasks of a materials reliability analysis are therefore the prediction of the various causes of failure, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  12. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure, given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that captures the most essential features of the nature of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text.

  13. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

    Many results in mechanical design are obtained from a model of physical reality and from a numerical solution that leads to an evaluation of demands and resources. The goal of reliability analysis is to evaluate the confidence that can be placed in the chosen design, through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems of reliability, availability, maintainability and safety (RAMS)
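The probability-of-failure calculation described in this abstract can be illustrated, in its simplest form, by the classical resistance-load model. The sketch below is a generic illustration, not the method of the paper, and the numbers are hypothetical.

```python
import math

def failure_probability(mu_r, sigma_r, mu_s, sigma_s):
    """Failure probability for the linear limit state g = R - S,
    with independent normal resistance R and load effect S.
    beta is the Cornell reliability index; Pf = Phi(-beta)."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)
    pf = 0.5 * math.erfc(beta / math.sqrt(2))  # Phi(-beta) via erfc
    return beta, pf

# Illustrative numbers: resistance ~ N(100, 10), load effect ~ N(60, 8)
beta, pf = failure_probability(100, 10, 60, 8)
```

For nonlinear limit states or non-normal variables, first- and second-order reliability methods (FORM/SORM) generalize this index; the two-variable case above is the textbook starting point.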

  14. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

    RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report is not only stating the facts of the Significant System Events (ESS), but it moreover underlines the main elements dealing with the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  15. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually constructed manually by analysts, review by other analysts is widely used for verification. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; the modeling of a system for reliability and unavailability analysis therefore becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the RGGG (reliability graph with general gates) methodology. The difference is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection-line failures. It is also known that an RGGG model can be converted into an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated

  16. Extremum principles for irreversible processes

    International Nuclear Information System (INIS)

    Hillert, M.; Agren, J.

    2006-01-01

    Hamilton's extremum principle is a powerful mathematical tool in classical mechanics. Onsager's extremum principle may play a similar role in irreversible thermodynamics and may also become a valuable tool. His principle may formally be regarded as a principle of maximum rate of entropy production but does not have a clear physical interpretation. Prigogine's principle of minimum rate of entropy production has a physical interpretation when it applies, but is not strictly valid except for a very special case

  17. Similarity principles for equipment qualification by experience

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1988-07-01

    A methodology is developed for seismic qualification of nuclear plant equipment by applying similarity principles to existing experience data. Experience data are available from previous qualifications by analysis or testing, or from actual earthquake events. Similarity principles are defined in terms of excitation, equipment physical characteristics, and equipment response. Physical similarity is further defined in terms of a critical transfer function for response at a location on a primary structure, whose response can be assumed directly related to ultimate fragility of the item under elevated levels of excitation. Procedures are developed for combining experience data into composite specifications for qualification of equipment that can be shown to be physically similar to the reference equipment. Other procedures are developed for extending qualifications beyond the original specifications under certain conditions. Some examples for application of the procedures and verification of them are given for certain cases that can be approximated by a two degree of freedom simple primary/secondary system. Other examples are based on use of actual test data available from previous qualifications. Relationships of the developments with other previously-published methods are discussed. The developments are intended to elaborate on the rather broad revised guidelines developed by the IEEE 344 Standards Committee for equipment qualification in new nuclear plants. However, the results also contribute to filling a gap that exists between the IEEE 344 methodology and that previously developed by the Seismic Qualification Utilities Group. The relationship of the results to safety margin methodology is also discussed. (author)

  18. Reliability in perceptual analysis of voice quality.

    Science.gov (United States)

    Bele, Irene Velsvik

    2005-12-01

    This study focuses on speaking voice quality in male teachers (n = 35) and male actors (n = 36), who represent untrained and trained voice users, respectively, because we wanted to investigate both normal and supranormal voices. Both substantive and methodological aspects were considered. The study includes a method for perceptual voice evaluation, and a basic issue was rater reliability. A listening group of 10 listeners, 7 experienced speech-language therapists and 3 speech-language therapy students, evaluated the voices on 15 vocal characteristics using VA scales. Two sets of voice signals were investigated: text reading (at 2 loudness levels) and a sustained vowel (at 3 levels). The results indicated high interrater reliability for most perceptual characteristics. Both types of voice signals were evaluated reliably, although connected speech was evaluated somewhat more reliably than vowels, especially at the normal level. Experienced listeners tended to be more consistent in their ratings than the student raters. Some vocal characteristics achieved acceptable reliability even with a smaller panel of listeners. The perceptual characteristics grouped into 4 factors reflecting perceptual dimensions.
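Interrater reliability of the kind reported here is often quantified with an intraclass correlation coefficient. The following is a minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater) in plain Python; the scores are hypothetical, and the abstract does not state which ICC form the study used.

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is a list of rows (subjects), each a list of scores (one per rater)."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between-subject SS
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between-rater SS
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical scores: 4 voices, 3 raters in perfect agreement -> ICC = 1.0
perfect = [[2, 2, 2], [4, 4, 4], [5, 5, 5], [3, 3, 3]]
noisy = [[2, 3, 2], [4, 4, 5], [5, 5, 4], [3, 2, 3]]
```

Averaging over k raters, as the study's listening panel effectively does, corresponds to the ICC(2,k) variant, which is always at least as high as ICC(2,1).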

  19. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

    Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which have been outlined in the paper, have been successfully applied in variety of applications ranging from individual equipments to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievment based on the understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions

  20. The reliability of the Glasgow Coma Scale: a systematic review.

    Science.gov (United States)

    Reith, Florence C M; Van den Brande, Ruben; Synnot, Anneliese; Gruen, Russell; Maas, Andrew I R

    2016-01-01

    The Glasgow Coma Scale (GCS) provides a structured method for assessment of the level of consciousness. Its derived sum score is applied in research and adopted in intensive care unit scoring systems. Controversy exists on the reliability of the GCS. The aim of this systematic review was to summarize evidence on the reliability of the GCS. A literature search was undertaken in MEDLINE, EMBASE and CINAHL. Observational studies that assessed the reliability of the GCS, expressed by a statistical measure, were included. Methodological quality was evaluated with the consensus-based standards for the selection of health measurement instruments checklist and its influence on results considered. Reliability estimates were synthesized narratively. We identified 52 relevant studies that showed significant heterogeneity in the type of reliability estimates used, patients studied, setting and characteristics of observers. Methodological quality was good (n = 7), fair (n = 18) or poor (n = 27). In good quality studies, kappa values were ≥0.6 in 85%, and all intraclass correlation coefficients indicated excellent reliability. Poor quality studies showed lower reliability estimates. Reliability for the GCS components was higher than for the sum score. Factors that may influence reliability include education and training, the level of consciousness and type of stimuli used. Only 13% of studies were of good quality and inconsistency in reported reliability estimates was found. Although the reliability was adequate in good quality studies, further improvement is desirable. From a methodological perspective, the quality of reliability studies needs to be improved. From a clinical perspective, a renewed focus on training/education and standardization of assessment is required.
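The kappa statistics summarized in this review can be computed, for the two-rater case, with Cohen's formula: the observed agreement corrected for agreement expected by chance. A minimal sketch with hypothetical scores (not data from any of the included studies):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical scores
    (e.g. GCS components) to the same set of patients."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement from the raters' marginal category frequencies
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical motor-component scores from two observers
a = [1, 1, 2, 2, 3, 3]
b = [1, 1, 2, 2, 3, 1]
```

With these six paired ratings the observed agreement is 5/6 and the chance agreement 1/3, giving kappa = 0.75, above the ≥0.6 threshold the review uses for the good-quality studies.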

  1. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Full Text Available Abstract Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server to calculate the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple Web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to obtain complete rating data. I would welcome other researchers revising and enhancing the program.
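The Spearman-Brown prophecy formula mentioned in the Results is simple enough to state directly. A sketch follows, together with its standard rearrangement for estimating how many judges are needed to reach a target reliability (the inverse form is a textbook rearrangement, not a feature the abstract attributes to the program):

```python
def spearman_brown(r_single, k):
    """Predicted reliability of the average of k ratings,
    given the reliability r_single of a single rating."""
    return k * r_single / (1 + (k - 1) * r_single)

def judges_needed(r_single, r_target):
    """Solve the prophecy formula for k: how many judges must be
    averaged for the composite to reach reliability r_target."""
    return r_target * (1 - r_single) / (r_single * (1 - r_target))
```

For example, if a single judge's ratings have reliability 0.6, averaging two judges is predicted to raise the reliability to 0.75.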

  2. Structural systems reliability analysis

    International Nuclear Information System (INIS)

    Frangopol, D.

    1975-01-01

    For an exact evaluation of the reliability of a structure, it is necessary to determine the probability densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long period of observation. Where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP) [de
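The role of correlation in bounding system reliability can be illustrated by the classical first-order bounds for a series system: fully correlated failure modes give the lower bound on the failure probability, independent modes the upper bound. This is a generic sketch, assuming normally distributed safety margins, and not the specific bounds formulated in the paper:

```python
import math

def phi(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2))

def series_bounds(betas):
    """First-order bounds on the failure probability of a series system
    whose elements have reliability indices `betas`. Fully correlated
    elements give the lower bound; independent elements the upper bound."""
    pf = [phi(-b) for b in betas]
    lower = max(pf)
    upper = 1 - math.prod(1 - p for p in pf)  # <= sum(pf)
    return lower, upper

# Illustrative indices for two failure modes
lo_b, up_b = series_bounds([3.0, 3.5])
```

The true system failure probability lies between these bounds; the gap between them is exactly what load and resistance correlation data, when available, would narrow.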

  3. Reliability and maintainability

    International Nuclear Information System (INIS)

    1994-01-01

    Several communications in this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by Diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by the means of functional analysis; operation characteristic analysis for a power industry plant park, as a function of influence parameters

  4. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective of the report is to improve the failure data used in reliability calculations for the safety analyses of Swedish nuclear power plants. The work is based primarily on evaluations of failure reports, as well as on information provided by the operation and maintenance staff of each plant. The report presents charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  5. Application of PRINCE2 Project Management Methodology

    Directory of Open Access Journals (Sweden)

    Vaníčková Radka

    2017-09-01

    Full Text Available The paper describes the principles of setting up a project under PRINCE2 project management. The main aim of the paper is to implement the PRINCE2 methodology for use in an enterprise in the service industry. A partial aim is to choose a supplier for the project from among new travel guides. The result of the project activity is a sight-seeing tour/service that is more attractive for customers in the tourism industry, along with possible new job opportunities. The added value of the article is its description of applying the principles, processes and themes of PRINCE2 project management so that they may be used in the field.

  6. Reliability Assessment of 2400 MWth Gas-Cooled Fast Reactor Natural Circulation Decay Heat Removal in Pressurized Situations

    Directory of Open Access Journals (Sweden)

    C. Bassi

    2008-01-01

    Full Text Available As the 2400 MWth gas-cooled fast reactor concept makes use of passive safety features in combination with active safety systems, assessing the reliability and performance of natural circulation decay heat removal (NCDHR) within the ongoing probabilistic safety assessment in support of the reactor design, named “probabilistic engineering assessment” (PEA), constitutes a challenge. Within the 5th Framework Program for Research and Development (FPRD) of the European Community, a methodology has been developed to evaluate the reliability of passive systems characterized by a moving fluid and whose operation is based on physical principles, such as natural circulation. This reliability method for passive systems (RMPS) is based on the propagation of uncertainties through thermal-hydraulic (T-H) calculations. The aim of this exercise is ultimately to determine the performance reliability of the DHR system operating in a “passive” mode, taking into account the uncertainties of the parameters retained for the thermal-hydraulic calculations performed with the CATHARE 2 code. As the preliminary PEA results exhibit the weight of pressurized scenarios (i.e., with an intact primary circuit boundary) in the core damage frequency (CDF), the RMPS exercise first focuses on NCDHR performance at these T-H conditions.

  7. Reliability evaluation of a natural circulation system

    International Nuclear Information System (INIS)

    Jafari, Jalil; D'Auria, Francesco; Kazeminejad, Hossein; Davilu, Hadi

    2003-01-01

    This paper discusses a reliability study performed with reference to a passive thermohydraulic natural circulation (NC) system, named TTL-1. A methodology based on probabilistic techniques has been applied with the main purpose of optimizing the system design. The results obtained have been used to estimate the thermal-hydraulic reliability (TH-R) of the same system. A total of 29 relevant parameters (including nominal values and plausible ranges of variation) affecting the design and the NC performance of the TTL-1 loop were identified, and a probability of occurrence was assigned to each value based on expert judgment. Following procedures established for the uncertainty evaluation of thermal-hydraulic system code results, 137 system configurations were selected and each configuration was analyzed with the Relap5 best-estimate code. The reference system configuration and the failure criteria derived from the 'mission' of the passive system were adopted for the evaluation of the system TH-R. Four different definitions of a less-than-unity 'reliability value' (where unity represents the maximum achievable reliability) are proposed for the performance of the selected passive system, which is normally considered fully reliable, i.e. assigned a reliability value of one, in typical Probabilistic Safety Assessment (PSA) applications in nuclear reactor safety. The two 'point' TH-R values for the considered NC system were found to equal 0.70 and 0.85, i.e. values comparable with the reliability of a pump installed in an 'equivalent' forced-circulation (active) system having the same 'mission'. The design optimization study was completed by a regression analysis addressing the output of the 137 calculations: heat losses, undetected leakage, loop length, riser diameter, and equivalent diameter of the test section were found to be the most important parameters leading to the optimal system design and affecting the TH-R.
As added values for this work, the comparison has
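The uncertainty-propagation scheme used in studies like this one (sample the uncertain parameters, evaluate the thermal-hydraulic response, check the failure criterion over many configurations) can be sketched in miniature. The surrogate model, parameter distributions, and temperature limit below are purely illustrative stand-ins for a best-estimate code run such as Relap5 or CATHARE:

```python
import random

def th_reliability(n_runs=10_000, t_limit=650.0, seed=1):
    """Monte Carlo estimate of thermal-hydraulic reliability: the fraction
    of sampled configurations whose peak temperature stays below the limit.
    The closed-form surrogate below merely stands in for a code calculation."""
    rng = random.Random(seed)
    success = 0
    for _ in range(n_runs):
        power = rng.gauss(100.0, 5.0)     # decay power, kW (hypothetical)
        heat_loss = rng.gauss(10.0, 3.0)  # parasitic heat loss, kW (hypothetical)
        flow = rng.gauss(2.0, 0.2)        # NC mass flow, kg/s (hypothetical)
        # Illustrative surrogate: peak temperature rises with net power,
        # falls with natural-circulation flow
        t_peak = 400.0 + (power - heat_loss) / max(flow, 0.1) * 5.0
        if t_peak < t_limit:
            success += 1
    return success / n_runs

rel = th_reliability()
```

A less-than-unity estimate like this is exactly the kind of 'reliability value' the paper proposes in place of the conventional PSA assumption that a passive system is fully reliable.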

  8. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; reliability of non-repairable systems; reliability of repairable systems; reliability sampling tests; failure analysis, such as analysis by FMEA and FTA, with case studies; accelerated life testing, including basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies concerning replacement and inspection.

  9. Prime implicants in dynamic reliability analysis

    International Nuclear Information System (INIS)

    Tyrväinen, Tero

    2016-01-01

    This paper develops an improved definition of a prime implicant for the needs of dynamic reliability analysis. Reliability analyses often aim to identify minimal cut sets or prime implicants, which are minimal conditions that cause an undesired top event, such as a system's failure. Dynamic reliability analysis methods take the time-dependent behaviour of a system into account. This means that the state of a component can change in the analysed time frame and prime implicants can include the failure of a component at different time points. There can also be dynamic constraints on a component's behaviour. For example, a component can be non-repairable in the given time frame. If a non-repairable component needs to be failed at a certain time point to cause the top event, we consider that the condition that it is failed at the latest possible time point is minimal, and the condition in which it fails earlier non-minimal. The traditional definition of a prime implicant does not account for this type of time-related minimality. In this paper, a new definition is introduced and illustrated using a dynamic flowgraph methodology model. - Highlights: • A new definition of a prime implicant is developed for dynamic reliability analysis. • The new definition takes time-related minimality into account. • The new definition is needed in dynamic flowgraph methodology. • Results can be represented by a smaller number of prime implicants.
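For contrast with the time-dependent prime implicants discussed in this abstract, the static case, the minimal cut sets of an ordinary coherent fault tree, can be computed by MOCUS-style gate expansion followed by subset minimization. A sketch with a hypothetical two-train tree (no dynamic constraints are modeled here):

```python
from itertools import product

def cut_sets(node, tree):
    """Expand a static fault tree top-down (MOCUS-style). `tree` maps a gate
    name to ("AND"|"OR", [children]); names absent from `tree` are basic events."""
    if node not in tree:
        return [frozenset([node])]
    op, children = tree[node]
    child_sets = [cut_sets(c, tree) for c in children]
    if op == "OR":
        # Any child's cut set causes the gate
        return [s for sets in child_sets for s in sets]
    # AND: one cut set from each child must occur together
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Discard any cut set that has a proper subset in the collection."""
    unique = set(sets)
    return {s for s in unique if not any(t < s for t in unique)}

# Hypothetical tree: TOP fails only if both redundant trains G1 and G2 fail;
# basic event B is shared between the trains (a common-cause element)
tree = {
    "TOP": ("AND", ["G1", "G2"]),
    "G1": ("OR", ["A", "B"]),
    "G2": ("OR", ["B", "C"]),
}
mcs = minimal(cut_sets("TOP", tree))
```

The shared event B surfaces as a single-element minimal cut set, {B}, alongside {A, C}. The paper's point is that this subset-based notion of minimality is insufficient once component states vary over time, which is what its redefined prime implicant addresses.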

  10. Empirical evaluation and justification of methodologies in psychological science.

    Science.gov (United States)

    Proctor, R W; Capaldi, E J

    2001-11-01

    The purpose of this article is to describe a relatively new movement in the history and philosophy of science, naturalism, a form of pragmatism emphasizing that methodological principles are empirical statements. Thus, methodological principles must be evaluated and justified on the same basis as other empirical statements. On this view, methodological statements may be less secure than the specific scientific theories to which they give rise. The authors examined the feasibility of a naturalistic approach to methodology using logical and historical analysis and by contrasting theories that predict new facts versus theories that explain already known facts. They provide examples of how differences over methodological issues in psychology and in science generally may be resolved using a naturalistic, or empirical, approach.

  11. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  12. Design principles for riboswitch function.

    Directory of Open Access Journals (Sweden)

    Chase L Beisel

    2009-04-01

    Full Text Available Scientific and technological advances that enable the tuning of integrated regulatory components to match network and system requirements are critical to reliably control the function of biological systems. RNA provides a promising building block for the construction of tunable regulatory components based on its rich regulatory capacity and our current understanding of the sequence-function relationship. One prominent example of RNA-based regulatory components is riboswitches, genetic elements that mediate ligand control of gene expression through diverse regulatory mechanisms. While characterization of natural and synthetic riboswitches has revealed that riboswitch function can be modulated through sequence alteration, no quantitative frameworks exist to investigate or guide riboswitch tuning. Here, we combined mathematical modeling and experimental approaches to investigate the relationship between riboswitch function and performance. Model results demonstrated that the competition between reversible and irreversible rate constants dictates performance for different regulatory mechanisms. We also found that practical system restrictions, such as an upper limit on ligand concentration, can significantly alter the requirements for riboswitch performance, necessitating alternative tuning strategies. Previous experimental data for natural and synthetic riboswitches as well as experiments conducted in this work support model predictions. From our results, we developed a set of general design principles for synthetic riboswitches. Our results also provide a foundation from which to investigate how natural riboswitches are tuned to meet systems-level regulatory demands.

  13. Cell tracking. Principles and applications

    International Nuclear Information System (INIS)

    Grimm, Jan; Kircher, Moritz F.; Weissleder, Ralph

    2007-01-01

    Cell-based therapies, such as stem cell therapies or adoptive immunotherapies, are currently being explored as potential treatments for a variety of diseases such as Parkinson's disease, diabetes or cancer. However, quantitative and qualitative evaluation of adoptively transferred cells is indispensable for monitoring the efficiency of the treatment. Current approaches mostly analyze transferred cells from peripheral blood, which cannot assess whether transferred cells actually home to and stay in the targeted tissue. Using cell-labeling methods such as direct labeling or transfection with a marker gene, in conjunction with various imaging modalities (MRI, optical or nuclear imaging), labeled cells can be followed in vivo in real time, and their accumulation as well as their function in vivo can be monitored and quantified accurately. This method is usually referred to as "cell tracking" or "cell trafficking" and is also being applied in the basic biological sciences, exemplified by the evaluation of genes contributing to metastasis. This review focuses on the principles of this promising methodology and explains various approaches by highlighting recent examples. (orig.) [de

  14. Principles of geodynamics

    CERN Document Server

    Scheidegger, Adrian E

    1982-01-01

    Geodynamics is commonly thought to be one of the subjects which provide the basis for understanding the origin of the visible surface features of the Earth: the latter are usually assumed as having been built up by geodynamic forces originating inside the Earth ("endogenetic" processes) and then as having been degraded by geomorphological agents originating in the atmosphere and ocean ("exogenetic" agents). The modern view holds that the sequence of events is not as neat as it was once thought to be, and that, in effect, both geodynamic and geomorphological processes act simultaneously ("Principle of Antagonism"); however, the division of theoretical geology into the principles of geodynamics and those of theoretical geomorphology seems to be useful for didactic purposes. It has therefore been maintained in the present writer's works. This present treatise on geodynamics is the first part of the author's treatment of theoretical geology, the treatise on Theoretical Geomorphology (also published by the Sprin...

  15. Principles of systems science

    CERN Document Server

    Mobus, George E

    2015-01-01

    This pioneering text provides a comprehensive introduction to systems structure, function, and modeling as applied in all fields of science and engineering. Systems understanding is increasingly recognized as a key to a more holistic education and greater problem solving skills, and is also reflected in the trend toward interdisciplinary approaches to research on complex phenomena. The subject of systems science, as a basis for understanding the components and drivers of phenomena at all scales, should be viewed with the same importance as a traditional liberal arts education. Principles of Systems Science contains many graphs, illustrations, side bars, examples, and problems to enhance understanding. From basic principles of organization, complexity, abstract representations, and behavior (dynamics) to deeper aspects such as the relations between information, knowledge, computation, and system control, to higher order aspects such as auto-organization, emergence and evolution, the book provides an integrated...

  16. Common principles and multiculturalism.

    Science.gov (United States)

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We will discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea.

  17. Principles of Mobile Communication

    CERN Document Server

    Stüber, Gordon L

    2012-01-01

    This mathematically rigorous overview of physical layer wireless communications is now in a third, fully revised and updated edition. Along with coverage of basic principles sufficient for novice students, the volume includes plenty of finer details that will satisfy the requirements of graduate students aiming to research the topic in depth. It also has a role as a handy reference for wireless engineers. The content stresses core principles that are applicable to a broad range of wireless standards. Beginning with a survey of the field that introduces an array of issues relevant to wireless communications and which traces the historical development of today’s accepted wireless standards, the book moves on to cover all the relevant discrete subjects, from radio propagation to error probability performance and cellular radio resource management. A valuable appendix provides a succinct and focused tutorial on probability and random processes, concepts widely used throughout the book. This new edition, revised...

  18. Principles of mathematical modeling

    CERN Document Server

    Dym, Clive

    2004-01-01

    Science and engineering students depend heavily on concepts of mathematical modeling. In an age where almost everything is done on a computer, author Clive Dym believes that students need to understand and "own" the underlying mathematics that computers are doing on their behalf. His goal for Principles of Mathematical Modeling, Second Edition, is to engage the student reader in developing a foundational understanding of the subject that will serve them well into their careers. The first half of the book begins with a clearly defined set of modeling principles, and then introduces a set of foundational tools including dimensional analysis, scaling techniques, and approximation and validation techniques. The second half demonstrates the latest applications for these tools to a broad variety of subjects, including exponential growth and decay in fields ranging from biology to economics, traffic flow, free and forced vibration of mechanical and other systems, and optimization problems in biology, structures, an...

  19. Principles of Stellar Interferometry

    CERN Document Server

    Glindemann, Andreas

    2011-01-01

    Over the last decade, stellar interferometry has developed from a specialist tool into a mainstream observing technique, attracting scientists whose research benefits from milliarcsecond angular resolution. Stellar interferometry has become part of the astronomer’s toolbox, complementing single-telescope observations by providing unique capabilities that will advance astronomical research. This carefully written book is intended to provide a solid understanding of the principles of stellar interferometry to students starting an astronomical research project in this field or developing instruments, and to astronomers who use interferometry but are not interferometrists per se. Illustrated by excellent drawings and calculated graphs, the imaging process in stellar interferometers is explained starting from first principles of light propagation and diffraction; wave propagation through turbulence is described in detail using Kolmogorov statistics; the impact of turbulence on the imaging process is discussed both f...

  20. Principles of Fourier analysis

    CERN Document Server

    Howell, Kenneth B

    2001-01-01

    Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...