WorldWideScience

Sample records for reliability prediction methodologies

  1. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    International Nuclear Information System (INIS)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J.

    2000-09-01

Digital I and C systems are widely used in the non-safety systems of nuclear power plants, and their applications are expanding to safety-critical systems. The regulatory body is shifting its policy toward risk-informed regulation and may require Probabilistic Safety Assessment for digital I and C systems. However, no established reliability prediction methodology yet exists for digital I and C systems that covers both software and hardware. This survey report reviews a wide range of reliability prediction methods for electronic systems from the hardware point of view. Each method has its strong and weak points. The report presents the state of the art of prediction methods and focuses in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also surveys the state of the art of software tools that support reliability prediction.
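The Bellcore and MIL-HDBK-217F methods surveyed here both rest on a parts-count idea: the system failure rate is the sum of per-part failure rates, each a base rate scaled by pi-factors for environment and quality. A minimal sketch with purely illustrative numbers (not handbook values):

```python
# Hypothetical parts-count reliability prediction in the spirit of
# MIL-HDBK-217F. All base rates and pi-factors below are illustrative
# placeholders, not values from the handbook.

def part_failure_rate(lambda_b, pi_e, pi_q):
    """Failure rate of one part (failures per 10^6 hours)."""
    return lambda_b * pi_e * pi_q

def system_failure_rate(parts):
    """Parts-count method: sum the individual part failure rates."""
    return sum(part_failure_rate(*p) for p in parts)

# (base rate, environment factor, quality factor) -- illustrative only
parts = [
    (0.010, 4.0, 1.0),   # resistor
    (0.020, 4.0, 3.0),   # capacitor
    (0.150, 6.0, 2.0),   # microcircuit
]

lam = system_failure_rate(parts)   # failures per 10^6 hours
mtbf_hours = 1e6 / lam             # mean time between failures
```

The parts-stress variant refines this by computing each base rate from the part's actual electrical and thermal stress rather than a tabulated default.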

  3. An overall methodology for reliability prediction of mechatronic systems design with industrial application

    International Nuclear Information System (INIS)

    Habchi, Georges; Barthod, Christine

    2016-01-01

We propose in this paper an overall ten-step methodology dedicated to the analysis and quantification of reliability during the design phase of a mechatronic system, considered as a complex system. The ten steps of the methodology are detailed according to the downward side of the V-development cycle usually used for the design of complex systems. Two complementary phases of analysis cover the ten steps: qualitative analysis and quantitative analysis. The qualitative phase analyzes the functional and dysfunctional behavior of the system and then determines its different failure modes and degradation states, based on external and internal functional analysis, organic and physical implementation, and dependencies between components, with consideration of customer specifications and the mission profile. The quantitative phase calculates the reliability of the system and its components, based on the qualitative behavior patterns, and considering data gathering and processing and reliability targets. A systemic approach is used to calculate the reliability of the system, taking into account the different technologies of a mechatronic system (mechanical, electronic, electrical, etc.), dependencies and interactions between components, and external influencing factors. To validate the methodology, the ten steps are applied to an industrial system, the smart actuator of the Pack'Aero company. - Highlights: • A ten-step methodology for reliability prediction of mechatronic systems design. • Qualitative and quantitative analysis for reliability evaluation using PN and RBD. • A dependency matrix proposal, based on the collateral and functional interactions. • Models consider mission profile, deterioration, interactions and influencing factors. • Application and validation of the methodology on the “Smart Actuator” of PACK’AERO.
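The quantitative phase cited in the highlights uses Petri nets (PN) and reliability block diagrams (RBD). A minimal RBD sketch, with an assumed architecture and illustrative component reliabilities (neither is taken from the paper): series blocks multiply reliabilities, redundant (parallel) blocks multiply unreliabilities.

```python
# Hypothetical reliability block diagram evaluation. The mechatronic
# architecture (sensor + controller in series with a redundant pair of
# actuators) and all reliability values are illustrative assumptions.
from functools import reduce

def series(*rs):
    """Series structure: all blocks must survive."""
    return reduce(lambda a, b: a * b, rs)

def parallel(*rs):
    """Parallel (redundant) structure: at least one block survives."""
    return 1.0 - reduce(lambda a, b: a * b, (1.0 - r for r in rs))

r_sensor, r_controller, r_actuator = 0.99, 0.995, 0.97
r_system = series(r_sensor, r_controller,
                  parallel(r_actuator, r_actuator))
```

In the full methodology such RBD figures would be refined by the dependency matrix and degradation states; the sketch shows only the combinatorial step.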

  4. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented statistical tools to drive the process. USA RCM has integrated many Lean Six Sigma (L6S) tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The article explores these methodologies.

  5. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters presents a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs subject to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, a performance degradation model based on the mechanism of contamination failure and the material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with that under normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model to the experimental data. Finally, we obtained the reliability estimate of the SPTRs using the Weibull distribution. The proposed methodology makes it possible to estimate, in less than one year, the reliability of SPTRs designed to operate for more than 10 years.
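The acceleration step above can be sketched with the Arrhenius model: lifetimes observed at elevated temperature are mapped back to the use condition through an acceleration factor. The activation energy and lifetimes below are illustrative assumptions, not measured SPTR values.

```python
import math

# Hypothetical Arrhenius extrapolation from an accelerated test
# temperature to the use condition. Ea and the observed ADT lifetime
# are illustrative, not taken from the paper.
K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between stress and use temperatures."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# Lifetime of 1.2 years observed in ADT at 60 C, use condition 20 C:
af = acceleration_factor(0.5, 20.0, 60.0)
predicted_life_years = 1.2 * af
```

In the paper's full method, the degradation model is fitted at each stress level first and the Weibull distribution is then fitted to the extrapolated lifetimes; the sketch shows only the temperature extrapolation.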

  6. Study on the methodology for predicting and preventing errors to improve reliability of maintenance task in nuclear power plant

    International Nuclear Information System (INIS)

    Hanafusa, Hidemitsu; Iwaki, Toshio; Embrey, D.

    2000-01-01

The objective of this study was to develop an effective methodology for predicting and preventing errors in nuclear power plant maintenance tasks. A method was established by which chief maintenance personnel can predict and reduce errors when reviewing maintenance procedures, drawing on maintenance support systems and methods in other industries, including the aviation and chemical plant industries. The method involves the following seven steps: 1. Identification of maintenance tasks. 2. Specification of important tasks affecting safety. 3. Assessment of human errors occurring during important tasks. 4. Identification of performance degrading factors. 5. Division of important tasks into sub-tasks. 6. Extraction of errors using Predictive Human Error Analysis (PHEA). 7. Development of strategies for reducing errors and for recovering from errors. By way of a trial, this method was applied to a pump maintenance procedure in nuclear power plants. The method is believed to be capable of identifying the expected errors in important tasks and supporting the development of error reduction measures. By applying it, the number of accidents resulting from human errors during maintenance can be reduced. Moreover, a computer-based maintenance support platform was developed. (author)
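Once errors have been extracted per sub-task (steps 5 and 6), a simple screening quantification combines the sub-task human error probabilities (HEPs) under an independence assumption. A minimal sketch; the HEP values are illustrative, not from the study.

```python
# Hypothetical screening-level aggregation of sub-task HEPs into a
# task-level error probability, assuming independent sub-task failures.

def task_error_probability(heps):
    """P(at least one error) = 1 - product of sub-task success probs."""
    p_success = 1.0
    for p in heps:
        p_success *= (1.0 - p)
    return 1.0 - p_success

heps = [0.003, 0.001, 0.01]   # illustrative sub-task HEPs
p_task = task_error_probability(heps)
```

A full PHEA treatment would also credit recovery actions (step 7) and adjust each HEP for the performance degrading factors identified in step 4.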

  7. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model.
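The core comparison can be sketched as a Bayes factor: the ratio of the likelihood of the observation when the model prediction is taken as the mean, to its likelihood under an alternative hypothesis. The normal likelihoods, the alternative mean, and both standard deviations below are illustrative assumptions, not the paper's formulation.

```python
import math

# Hypothetical Bayes-factor computation for model acceptance. Both
# hypotheses are modeled as normal densities with assumed parameters.

def normal_pdf(x, mu, sigma):
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_factor(observed, predicted, sigma, mu_alt, sigma_alt):
    """Likelihood under 'model valid' over likelihood under alternative."""
    return (normal_pdf(observed, predicted, sigma)
            / normal_pdf(observed, mu_alt, sigma_alt))

# An observation close to the prediction favours acceptance (B > 1):
b = bayes_factor(observed=1.05, predicted=1.0, sigma=0.1,
                 mu_alt=1.5, sigma_alt=0.5)
```

With statistical uncertainty in the model, the paper goes further and treats the Bayes factor itself as a random variable; the sketch covers only the point-estimate case.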

  8. Reliability assessment of passive containment isolation system using APSRA methodology

    International Nuclear Information System (INIS)

    Nayak, A.K.; Jain, Vikas; Gartia, M.R.; Srivastava, A.; Prasad, Hari; Anthony, A.; Gaikwad, A.J.; Bhatia, S.; Sinha, R.K.

    2008-01-01

In this paper, a methodology known as APSRA (Assessment of Passive System ReliAbility) has been employed for evaluation of the reliability of passive systems. The methodology has been applied to the passive containment isolation system (PCIS) of the Indian advanced heavy water reactor (AHWR). In the APSRA methodology, the passive system reliability evaluation is based on the failure probability of the system to carry out the desired function. The methodology first determines the operational characteristics of the system and the failure conditions by assigning a predetermined failure criterion. The failure surface is predicted using a best estimate code considering deviations of the operating parameters from their nominal states, which affect the PCIS performance. APSRA proposes to compare the code predictions with the test data to generate the uncertainties on the failure parameter prediction, which is later considered in the code for accurate prediction of the failure surface of the system. Once the failure surface of the system is predicted, the cause of failure is examined through root diagnosis, which occurs mainly due to failure of mechanical components. The failure probability of these components is evaluated through a classical PSA treatment using the generic data. The reliability of the PCIS is evaluated from the probability of availability of the components for the success of the passive containment isolation system.

  9. Methodology for allocating reliability and risk

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1986-05-01

This report describes a methodology for reliability and risk allocation in nuclear power plants. The work investigates the technical feasibility of allocating reliability and risk, which are expressed in a set of global safety criteria and which may not necessarily be rigid, to various reactor systems, subsystems, components, operations, and structures in a consistent manner. The report also provides general discussions on the problem of reliability and risk allocation. The problem is formulated as a multiattribute decision analysis paradigm. The work mainly addresses the first two steps of a typical decision analysis, i.e., (1) identifying alternatives, and (2) generating information on outcomes of the alternatives, by performing a multiobjective optimization on a PRA model and reliability cost functions. The multiobjective optimization serves as the guiding principle to reliability and risk allocation. The concept of 'noninferiority' is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The final step of decision analysis, i.e., assessment of the decision maker's preferences, could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided, and several outstanding issues such as generic allocation, preference assessment, and uncertainty are discussed. 29 refs., 44 figs., 39 tabs.
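The noninferior (Pareto-optimal) screening described above can be sketched directly: an alternative is noninferior if no other alternative is at least as good in every attribute and strictly better in at least one. The (risk, cost) pairs below are illustrative, not from the report's risk model.

```python
# Hypothetical noninferior-set filter over (risk, cost) alternatives,
# where lower is better in both attributes. Values are illustrative.

def noninferior(alternatives):
    """Return the noninferior subset of (risk, cost) pairs."""
    result = []
    for a in alternatives:
        dominated = any(
            b != a
            and b[0] <= a[0] and b[1] <= a[1]
            and (b[0] < a[0] or b[1] < a[1])
            for b in alternatives
        )
        if not dominated:
            result.append(a)
    return result

alts = [(1e-5, 9.0), (2e-5, 4.0), (3e-5, 5.0), (5e-6, 20.0)]
frontier = noninferior(alts)   # (3e-5, 5.0) is dominated by (2e-5, 4.0)
```

Preference assessment, the final decision-analysis step, would then choose one point from this frontier rather than from the full alternative set.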

  10. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period.
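The Monte Carlo idea can be sketched in a few lines: each trial draws an initial strength and a sequence of annual extreme loads, and the structure fails when a load exceeds the degraded strength. The distributions and the linear degradation rate below are illustrative assumptions, not the report's models (which are adaptive and far more detailed).

```python
import random

# Hypothetical time-dependent reliability simulation with linear
# strength degradation. All distributions and rates are illustrative.

def failure_probability(n_trials=20000, years=40, rate=0.005, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        strength = rng.normalvariate(3.0, 0.3)   # initial strength
        for t in range(years):
            load = rng.expovariate(1.0)          # annual extreme load
            if load > strength * (1.0 - rate * t):
                failures += 1                    # first-passage failure
                break
    return failures / n_trials

pf = failure_probability()   # lifetime failure probability estimate
```

An inspection/maintenance policy would enter this loop as occasional resets of the degraded strength, which is how candidate strategies can be compared against a target failure probability.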

  11. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on the introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere.
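The survival-function bounds come from Hill's assumption A(n): the next observation falls in each interval between ordered past observations with probability 1/(n+1), which bounds P(X_future > t) between (number of observations exceeding t)/(n+1) and that count plus one over (n+1). A minimal sketch with illustrative failure times:

```python
# NPI-style lower and upper survival bounds for one future observation
# under Hill's A(n) assumption. The data values are illustrative.

def npi_survival_bounds(data, t):
    """Lower and upper bounds for P(X_future > t)."""
    n = len(data)
    greater = sum(1 for x in data if x > t)
    lower = greater / (n + 1)
    upper = min((greater + 1) / (n + 1), 1.0)
    return lower, upper

failure_times = [12.0, 35.0, 48.0, 60.0, 95.0]     # hours, illustrative
lo, hi = npi_survival_bounds(failure_times, 50.0)  # 2 of 5 exceed 50
```

Right-censored data and competing risks require the generalized versions discussed in the paper; the sketch covers only the uncensored case.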

  12. CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology

    Science.gov (United States)

    Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayaii, Sammy

    2006-01-01

    This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.

  13. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics, probability theory, psychology, and the elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future.

  14. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, bearing torque (which is induced by the hydrodynamic force), as well as other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspect ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flow-loop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shapes and disc aspect ratios, upstream elbow orientation and its proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on a 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper.
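The role of the nondimensional coefficients can be sketched with the usual scaling for hydrodynamic torque, T = Ct · ΔP · D³, with bearing and packing torques added to obtain the required actuation torque. The coefficient value and torque components below are illustrative assumptions, not numbers from the EPRI model.

```python
# Hypothetical use of a nondimensional torque coefficient to size a
# butterfly valve actuator. Ct, the pressure drop, and the bearing and
# packing torques are illustrative placeholders.

def hydrodynamic_torque(ct, dp_pa, d_m):
    """Hydrodynamic torque (N*m) = Ct * pressure drop * diameter^3."""
    return ct * dp_pa * d_m ** 3

def required_torque(ct, dp_pa, d_m, bearing_torque, packing_torque):
    """Total dynamic torque at one disc position."""
    return hydrodynamic_torque(ct, dp_pa, d_m) + bearing_torque + packing_torque

# 6-in. (0.152 m) disc at a fixed opening angle, 500 kPa drop:
t_req = required_torque(ct=0.05, dp_pa=5.0e5, d_m=0.152,
                        bearing_torque=40.0, packing_torque=10.0)
```

In the full methodology Ct varies with disc angle, disc shape, and upstream disturbances, so the calculation is repeated across the stroke to find the peak required torque.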

  15. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built and used to analyze the causal relationships between organizational factors and human reliability. Then, an inference model for HRA is built by combining the conceptual framework with Bayesian networks, which is used to execute causal and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that combining the conceptual model with Bayesian networks not only easily models the causal relationship between organizational factors and human reliability but also, in a given context, allows human operational reliability to be measured quantitatively and the most likely root causes of human error to be identified and prioritized. (authors)
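The organizational-factor idea can be sketched as a tiny two-layer Bayesian network evaluated by direct enumeration: an organizational factor (here, training quality) conditions a performance shaping factor (procedure adherence), which conditions human error. The network structure and every probability below are illustrative assumptions, not the paper's model.

```python
# Hypothetical two-layer Bayesian network for HRA, marginalized by
# brute-force enumeration. All conditional probabilities are
# illustrative placeholders.

p_training_good = 0.8
p_adherence_given_training = {True: 0.95, False: 0.6}
p_error_given_adherence = {True: 0.001, False: 0.02}

def p_human_error():
    """Marginal P(error) = sum over all parent configurations."""
    total = 0.0
    for training in (True, False):
        p_t = p_training_good if training else 1.0 - p_training_good
        for adherence in (True, False):
            p_a = p_adherence_given_training[training]
            if not adherence:
                p_a = 1.0 - p_a
            total += p_t * p_a * p_error_given_adherence[adherence]
    return total

p_err = p_human_error()
```

Diagnostic inference runs the same network the other way, computing the posterior probability of poor training given an observed error; a real model would use a Bayesian network library rather than enumeration.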

  16. Methodology for reliability, economic and environmental assessment of wave energy

    International Nuclear Information System (INIS)

    Thorpe, T.W.; Muirhead, S.

    1994-01-01

    As part of the Preliminary Actions in Wave Energy R and D for DG XII's Joule programme, methodologies were developed to facilitate assessment of the reliability, economics and environmental impact of wave energy. This paper outlines these methodologies, their limitations and areas requiring further R and D. (author)

  17. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

Many models have been developed for predicting software reliability, but each is restricted to particular methodologies and a limited number of parameters. A number of techniques and methodologies may be used for reliability prediction, so parameter selection deserves attention when estimating reliability: the predicted reliability of a system may increase or decrease depending on the parameters chosen. There is therefore a need to identify the factors that most heavily affect system reliability. Reusability is now widely studied across research areas and is the basis of Component-Based Systems (CBS); Component-Based Software Engineering (CBSE) concepts can save cost, time, and human effort. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small- as well as large-scale problems where accurate results are hard to obtain because of uncertainty or randomness. Soft computing techniques have also found many applications in medicine: clinical medicine makes significant use of fuzzy logic and neural network methodologies, basic medical science most frequently uses neural-network/genetic-algorithm hybrids, and medical scientists have shown considerable interest in applying soft computing in genetics, physiology, radiology, cardiology, and neurology. CBSE encourages users to reuse existing software when building new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques, namely Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC), and presents the working of these techniques.

  18. Reliability tasks from prediction to field use

    International Nuclear Information System (INIS)

    Guyot, Christian.

    1975-01-01

This tutorial paper is part of a series intended to raise awareness of reliability problems. Reliability, a probabilistic concept, is an important parameter of availability. Reliability prediction is an estimation process for evaluating design progress. It is only through the application of a reliability program that reliability objectives can be attained across the different stages of work: design, fabrication, and field use. The user is mainly interested in operational reliability. Indications are given on the support and treatment of data for electronic equipment at the C.E.A. Reliability engineering requires a particular state of mind, which must be formed and developed in a company in the same way as is done, for example, for safety. (in French)

  19. Transmission embedded cost allocation methodology with consideration of system reliability

    International Nuclear Information System (INIS)

    Hur, D.; Park, J.-K.; Yoo, C.-I.; Kim, B.H.

    2004-01-01

    In a vertically integrated utility industry, the cost of reliability, as a separate service, has not received much rigorous analysis. However, as a cornerstone of restructuring the industry, the transmission service pricing must change to be consistent with, and supportive of, competitive wholesale electricity markets. This paper focuses on the equitable allocation of transmission network embedded costs including the transmission reliability cost based on the contributions of each generator to branch flows under normal conditions as well as the line outage impact factor under a variety of load levels. A numerical example on a six-bus system is given to illustrate the applications of the proposed methodology. (author)

  20. Verification, validation, and reliability of predictions

    International Nuclear Information System (INIS)

    Pigford, T.H.; Chambre, P.L.

    1987-04-01

The objective of predicting long-term performance should be to make reliable determinations of whether the prediction falls within the criteria for acceptable performance. Establishing reliable predictions of long-term performance of a waste repository requires emphasis on valid theories to predict performance. The validation process must establish the validity of the theory, the parameters used in applying the theory, the arithmetic of calculations, and the interpretation of results; but validation of such performance predictions is not possible unless there are clear criteria for acceptable performance. Validation programs should emphasize identification of the substantive issues of prediction that need to be resolved. Examples relevant to waste package performance are predicting the life of waste containers and the time distribution of container failures, establishing the criteria for defining container failure, validating theories for time-dependent waste dissolution that depend on details of the repository environment, and determining the extent of congruent dissolution of radionuclides in the UO2 matrix of spent fuel. Prediction and validation should go hand in hand and should be done and reviewed frequently, as essential tools for the programs to design and develop repositories. 29 refs.

  1. A method of predicting the reliability of CDM coil insulation

    International Nuclear Information System (INIS)

    Kytasty, A.; Ogle, C.; Arrendale, H.

    1992-01-01

This paper presents a method of predicting the reliability of the Collider Dipole Magnet (CDM) coil insulation design. The method proposes a probabilistic treatment of electrical test data, stress analysis, material properties variability, and loading uncertainties to give the reliability estimate. The approach taken to predict reliability of design-related failure modes of the CDM is to form analytical models of the various possible failure modes and their related mechanisms or causes, and then statistically assess the contributions of the various contributing variables. The probability of the failure mode occurring is interpreted as the number of times one would expect certain extreme situations to combine and randomly occur. One of the more complex failure modes of the CDM is used to illustrate this methodology.

  2. Evaluating the reliability of predictions made using environmental transfer models

    International Nuclear Information System (INIS)

    1989-01-01

The development and application of mathematical models for predicting the consequences of releases of radionuclides into the environment from normal operations in the nuclear fuel cycle and in hypothetical accident conditions have increased dramatically in the last two decades. This Safety Practice publication has been prepared to provide guidance on the available methods for evaluating the reliability of environmental transfer model predictions. It provides a practical introduction to the subject, with particular emphasis on worked examples in the text. It is intended to supplement existing IAEA publications on environmental assessment methodology. 60 refs, 17 figs, 12 tabs

  3. Reliability and lifetime predictions of SLC klystrons

    International Nuclear Information System (INIS)

    Allen, M.A.; Callin, R.S.; Fowkes, W.R.; Lee, T.G.; Vlieks, A.E.

    1989-01-01

    The energy upgrade of SLAC, with the first of the new 67 MW SLAC Linear Collider (SLC) klystrons, began over four years ago. Today there are over 200 of these klystrons in operation. As a result, there is a wealth of klystron performance and failure information that enables reasonable predictions to be made on life expectancy and reliability. Data from initial tests, follow-up tests and daily operation monitoring on the accelerator is stored for analysis. Presented here are life expectancy predictions with particular emphasis on cathode life. Also, based on this data, the authors will discuss some of the principal modes of failure. 3 refs., 2 figs., 1 tab

  4. Reliability and lifetime predictions of SLC klystrons

    International Nuclear Information System (INIS)

    Allen, M.A.; Callin, R.S.; Fowkes, W.R.; Lee, T.G.; Vlieks, A.E.

    1989-03-01

The energy upgrade of SLAC, with the first of the new 67 MW SLAC Linear Collider (SLC) klystrons, began over four years ago. Today there are over 200 of these klystrons in operation. As a result, there is a wealth of klystron performance and failure information that enables reasonable predictions to be made on life expectancy and reliability. Data from initial tests, follow-up tests and daily operation monitoring on the accelerator is stored for analysis. Presented here are life expectancy predictions with particular emphasis on cathode life. Also, based on this data, we will discuss some of the principal modes of failure. 3 refs., 2 figs

  5. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random applied cyclic strain history will arise under any loading mode, even a deterministic loading history. A non-conservative evaluation might result in practice if the scatter is not considered. A methodology for strain-based fatigue reliability analysis that takes the scatter into account is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently extrapolated to the case of a deterministic CSS relation, as existing methods do. The non-conservatism of the deterministic CSS relation and the applicability of the present methodology are demonstrated by an analysis of the material test results.
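The deterministic backbone of the strain-life modeling is the Coffin-Manson law with Basquin's elastic term: the strain amplitude is the sum of an elastic and a plastic power-law term in the number of reversals, which can be inverted numerically for life. A minimal sketch; the material constants are generic illustrative values, not the 1Cr18Ni9Ti parameters.

```python
import math

# Hypothetical strain-life (Coffin-Manson + Basquin) calculation,
# solved for life by bisection on a log scale. Material constants
# are illustrative placeholders.

def strain_amplitude(n_rev, sf, e_mod, b, ef, c):
    """Total strain amplitude at n_rev = 2N reversals."""
    return (sf / e_mod) * n_rev ** b + ef * n_rev ** c

def life_reversals(target, sf=900.0, e_mod=2.0e5, b=-0.09, ef=0.4, c=-0.56):
    """Reversals to failure at a given strain amplitude (bisection)."""
    lo, hi = 1.0, 1e9
    for _ in range(200):
        mid = math.sqrt(lo * hi)   # geometric midpoint
        if strain_amplitude(mid, sf, e_mod, b, ef, c) > target:
            lo = mid               # amplitude too high -> life is longer
        else:
            hi = mid
    return lo

n2 = life_reversals(0.004)   # reversals to failure at 0.4% amplitude
```

The paper's probabilistic treatment replaces the fixed constants with probability-based curves, so the computed life becomes a distribution rather than a single number.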

  6. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

    A number of software reliability models have been developed to estimate and to predict software reliability. However, there are no established standard models to quantify software reliability. Most models estimate the quality of software in reliability figures such as remaining faults, failure rate, or mean time to next failure at the testing phase, and they consider them ultimate indicators of software reliability. Experience shows that there is a large gap between predicted reliability during development and reliability measured during operation, which means that predicted reliability, or so-called test reliability, is not operational reliability. Customers prefer operational reliability to test reliability. In this study, we propose a method that predicts operational reliability rather than test reliability by introducing the testing environment factor that quantifies the changes in environments

  7. A reliability assessment methodology for the VHTR passive safety system

    International Nuclear Information System (INIS)

    Lee, Hyungsuk; Jae, Moosung

    2014-01-01

The passive safety system of a VHTR (Very High Temperature Reactor), which has recently attracted worldwide attention, is currently being considered in the design of safety improvements for the next generation of nuclear power plants in Korea. The functionality of the passive system relies not on an external electrical support system but on the intelligent use of natural phenomena. Its function involves an ultimate heat sink for a passive secondary auxiliary cooling system, especially during a station blackout such as occurred in the Fukushima Daiichi reactor accidents. However, it is not easy to quantitatively evaluate the reliability of passive safety for the purpose of risk analysis alongside existing active system failures, since the classical reliability assessment method cannot be applied. Therefore, we present a new methodology to quantify this reliability based on reliability physics models. This evaluation framework is then applied to the conceptually designed VHTR in Korea. The Response Surface Method (RSM) is also utilized for evaluating the uncertainty of the maximum temperature of nuclear fuel. The proposed method could contribute to evaluating accident sequence frequencies and to designing new innovative nuclear systems, such as the reactor cavity cooling system (RCCS) of the VHTR to be designed and constructed in Korea.

  8. Improved FTA methodology and application to subsea pipeline reliability design.

    Science.gov (United States)

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.

  9. Seismic reliability assessment methodology for CANDU concrete containment structures

    International Nuclear Information System (INIS)

    Stephens, M.J.; Nessim, M.A.; Hong, H.P.

    1995-05-01

    A study was undertaken to develop a reliability-based methodology for the assessment of existing CANDU concrete containment structures with respect to seismic loading. The focus of the study was on defining appropriate specified values and partial safety factors for earthquake loading and resistance parameters. Key issues addressed in the work were the identification of an approach to select design earthquake spectra that satisfy consistent safety levels, and the use of structure-specific data in the evaluation of structural resistance. (author). 23 refs., 9 tabs., 15 figs

  10. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated over a four-year period. Within the 50 ensemble members, which are initialized every 12 hours and run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to a lead time of 3 days. At longer lead times the occurrence probabilities are overestimated by the EPS.
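The occurrence-probability step, fitting a bivariate normal to the clustered storm track positions, can be sketched as follows. The track coordinates, the target point and the radius are hypothetical, and since the study's exact integration scheme is not given here, a Monte Carlo estimate under the fitted distribution is used instead:

```python
import math
import random

def fit_bivariate_normal(points):
    """Moment estimates of mean and covariance from (lon, lat) samples."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return (mx, my), (sxx, sxy, syy)

def occurrence_probability(mean, cov, centre, radius, n=20000, seed=1):
    """Monte Carlo estimate of P(storm position within `radius` of
    `centre`) under the fitted bivariate normal."""
    (mx, my), (sxx, sxy, syy) = mean, cov
    # 2x2 Cholesky factor of the covariance matrix
    l11 = math.sqrt(sxx)
    l21 = sxy / l11
    l22 = math.sqrt(syy - l21 ** 2)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        x = mx + l11 * z1
        y = my + l21 * z1 + l22 * z2
        if math.hypot(x - centre[0], y - centre[1]) <= radius:
            hits += 1
    return hits / n

# Hypothetical storm-centre positions (degrees) from ensemble members:
tracks = [(10.2, 54.1), (11.0, 53.8), (9.7, 54.5), (10.5, 54.0), (10.1, 53.6)]
mean, cov = fit_bivariate_normal(tracks)
p = occurrence_probability(mean, cov, centre=mean, radius=2.0)
```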

  11. Final report on reliability and lifetime prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Kenneth T; Wise, Jonathan; Jones, Gary D.; Causa, Al G.; Terrill, Edward R.; Borowczak, Marc

    2012-12-01

    This document highlights the important results obtained from the subtask of the Goodyear CRADA devoted to better understanding reliability of tires and to developing better lifetime prediction methods. The overall objective was to establish the chemical and physical basis for the degradation of tires using standard as well as unique models and experimental techniques. Of particular interest was the potential application of our unique modulus profiling apparatus for assessing tire properties and for following tire degradation. During the course of this complex investigation, extensive relevant information was generated, including experimental results, data analyses and development of models and instruments. Detailed descriptions of the findings are included in this report.

  12. Reliability assessment of passive isolation condenser system of AHWR using APSRA methodology

    International Nuclear Information System (INIS)

    Nayak, A.K.; Jain, Vikas; Gartia, M.R.; Prasad, Hari; Anthony, A.; Bhatia, S.K.; Sinha, R.K.

    2009-01-01

In this paper, a methodology known as APSRA (Assessment of Passive System ReliAbility) is used for the evaluation of the reliability of the passive isolation condenser system of the Indian Advanced Heavy Water Reactor (AHWR). As per the APSRA methodology, the passive system reliability evaluation is based on the failure probability of the system to perform the design basis function. The methodology first determines the operational characteristics of the system and the failure conditions based on a predetermined failure criterion. The parameters that could degrade the system performance are identified and considered for analysis. Different modes of failure and their causes are identified. The failure surface is predicted using a best estimate code, considering deviations of the operating parameters from their nominal states that affect the isolation condenser system performance. Once the failure surface of the system is predicted, the causes of failure are examined through root diagnosis; these arise mainly from the failure of mechanical components. The reliability of the system is evaluated through a classical PSA treatment based on the failure probability of the components using generic data
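The failure-surface search in APSRA amounts to scanning deviations of the operating parameters against a failure criterion. A toy sketch, with a linear surrogate standing in for the best estimate code and hypothetical parameters (an inlet temperature and a non-condensable gas fraction), none of which come from the AHWR analysis:

```python
def map_failure_surface(performance, criterion, p1_values, p2_values):
    """Scan deviations of two operating parameters and record the points
    where the predicted performance violates the failure criterion
    (a coarse stand-in for repeated best-estimate code runs)."""
    failed = []
    for p1 in p1_values:
        for p2 in p2_values:
            if performance(p1, p2) < criterion:
                failed.append((p1, p2))
    return failed

# Hypothetical surrogate: heat removal (kW) falls with inlet temperature
# (deg C) and non-condensable gas fraction; failure below 100 kW.
perf = lambda t_in, x_nc: 260.0 - 1.5 * t_in - 300.0 * x_nc
pts = map_failure_surface(perf, 100.0,
                          p1_values=[40, 60, 80, 100],
                          p2_values=[0.0, 0.1, 0.2, 0.3])
```

The boundary of `pts` in the parameter grid approximates the failure surface; in APSRA itself each grid point would be a run of the best estimate code.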

  13. Software reliability prediction using SPN | Abbasabadee | Journal of ...

    African Journals Online (AJOL)

Software reliability prediction using SPN. ... In this research, a component reliability model based on SPN is proposed for the computation of software reliability. An isomorphic Markov ...

  14. Can shoulder dystocia be reliably predicted?

    Science.gov (United States)

    Dodd, Jodie M; Catcheside, Britt; Scheil, Wendy

    2012-06-01

To evaluate factors reported to increase the risk of shoulder dystocia, and to evaluate their predictive value at a population level. The South Australian Pregnancy Outcome Unit's population database from 2005 to 2010 was accessed to determine the occurrence of shoulder dystocia in addition to reported risk factors, including age, parity, self-reported ethnicity, presence of diabetes and infant birth weight. Odds ratios (and 95% confidence intervals) of shoulder dystocia were calculated for each risk factor, and these were then incorporated into a logistic regression model. Test characteristics for each variable in predicting shoulder dystocia were calculated. As a proportion of all births, the reported rate of shoulder dystocia increased significantly from 0.95% in 2005 to 1.38% in 2010 (P = 0.0002). Using a logistic regression model, induction of labour and infant birth weight greater than both 4000 and 4500 g were identified as significant independent predictors of shoulder dystocia. Risk factors, both alone and when incorporated into the logistic regression model, were poorly predictive of the occurrence of shoulder dystocia. While there are a number of factors associated with an increased risk of shoulder dystocia, none are of sufficient sensitivity or positive predictive value to allow their clinical use to reliably and accurately identify the occurrence of shoulder dystocia. © 2012 The Authors ANZJOG © 2012 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  15. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in the human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, the analysis of various accidents in history has found that the human component has been a contributing factor in their cause. Because of the need to understand the forms and probability of human error, the 1960s saw the beginning of the collection of generic data, which resulted in the development of the first generation of HRA methodologies. Later methods included additional performance shaping factors, and the interactions between them, in their models. Thus, by the mid-1990s, what are considered the second-generation methodologies emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. The gathering of the new human error probabilities therefore involves quantifying the nominal scenario and the cases of significant deviations, considered for their potential impact on the analyzed human failure events. As in the probabilistic safety analysis, the analysis of the sequences identified the more specific factors with the highest contribution to the human error probabilities. (Author)

  16. Program integration of predictive maintenance with reliability centered maintenance

    International Nuclear Information System (INIS)

    Strong, D.K. Jr; Wray, D.M.

    1990-01-01

    This paper addresses improving the safety and reliability of power plants in a cost-effective manner by integrating the recently developed reliability centered maintenance techniques with the traditional predictive maintenance techniques of nuclear power plants. The topics of the paper include a description of reliability centered maintenance (RCM), enhancing RCM with predictive maintenance, predictive maintenance programs, condition monitoring techniques, performance test techniques, the mid-Atlantic Reliability Centered Maintenance Users Group, test guides and the benefits of shared guide development

  17. Thermal sensation prediction by soft computing methodology.

    Science.gov (United States)

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

Thermal comfort in open urban areas depends strongly on environmental factors. It is therefore necessary to fulfill demands for suitable thermal comfort during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors. These factors are variable, changing throughout the year and the day, so there is a need to establish an algorithm for thermal comfort prediction according to the input variables. The prediction results could be used for planning the times at which urban areas are used. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) to forecast physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results are compared with some benchmark models. Based on the results, ELM can be used effectively in forecasting PET. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Predicting Cost/Reliability/Maintainability of Advanced General Aviation Avionics Equipment

    Science.gov (United States)

    Davis, M. R.; Kamins, M.; Mooz, W. E.

    1978-01-01

A methodology is provided for assisting NASA in estimating the cost, reliability, and maintenance (CRM) requirements for general aviation avionics equipment operating in the 1980's. Practical problems of predicting these factors are examined. The usefulness and shortcomings of different approaches for modeling cost and reliability estimates are discussed, together with special problems caused by the lack of historical data on the cost of maintaining general aviation avionics. Suggestions are offered on how NASA might proceed in assessing CRM implications in the absence of reliable generalized predictive models.

  19. Future of structural reliability methodology in nuclear power plant technology

    Energy Technology Data Exchange (ETDEWEB)

    Schueeller, G I [Technische Univ. Muenchen (Germany, F.R.); Kafka, P [Gesellschaft fuer Reaktorsicherheit m.b.H. (GRS), Garching (Germany, F.R.)

    1978-10-01

This paper presents the authors' personal view as to which areas of structural reliability in nuclear power plant design most urgently need to be advanced. Aspects of simulation modeling, design rules, codification and specification of reliability, system analysis, probabilistic structural dynamics, rare events and particularly the interaction of systems and structural reliability are discussed. As an example, some considerations of the interaction effects between the protective systems and the pressure vessel are stated. The paper concludes with recommendations for further research.

  20. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Based on the fact that tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis between GO methodology and fault tree methodology was also performed. The results showed that the setup of tie breakers was rational and necessary, and that, for analyzing the reliability of the power supply system, the modeling was much easier and the chart much more succinct for GO methodology than for fault tree methodology. (author)
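Once the minimal cut sets are known, the average unavailability follows from the standard rare-event approximation: sum, over the minimal cut sets, of the product of component unavailabilities. A sketch with a hypothetical two-train supply and tie breaker, not the facility's actual model:

```python
def system_unavailability(cut_sets, q):
    """Rare-event approximation of system unavailability: sum over
    minimal cut sets of the product of component unavailabilities."""
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for comp in cs:
            prod *= q[comp]
        total += prod
    return total

# Hypothetical component unavailabilities and minimal cut sets:
q = {"grid_A": 1e-3, "grid_B": 1e-3, "tie_breaker": 5e-4}
cut_sets = [("grid_A", "grid_B"), ("grid_A", "tie_breaker")]
q_sys = system_unavailability(cut_sets, q)
```

The approximation is accurate when individual unavailabilities are small, which is the usual regime for both GO and fault tree quantification.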

  1. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability with the failure data collected during the test, assuming that the test environments represent the operational profile well. Experience shows that the operational reliability is higher than the test reliability. The user's interest, however, is in the operational reliability rather than the test reliability. With the assumption that the difference in reliability results from the change of environment, testing environment factors comprising the aging factor and the coverage factor are defined in this study to predict the ultimate operational reliability with the failure data. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results are close to the actual data

  2. Reliability assessment of Passive Containment Cooling System of an Advanced Reactor using APSRA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Mukesh, E-mail: mukeshd@barc.gov.in [Reactor Engineering Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Chakravarty, Aranyak [School of Nuclear Studies and Application, Jadavpur University, Kolkata 700032 (India); Nayak, A.K. [Reactor Engineering Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Prasad, Hari; Gopika, V. [Reactor Safety Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)

    2014-10-15

    Highlights: • The paper deals with the reliability assessment of Passive Containment Cooling System of Advanced Heavy Water Reactor. • Assessment of Passive System ReliAbility (APSRA) methodology is used for reliability assessment. • Performance assessment of the PCCS is initially performed during a postulated design basis LOCA. • The parameters affecting the system performance are then identified and considered for further analysis. • The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. - Abstract: Passive Systems are increasingly playing a prominent role in the advanced nuclear reactor systems and are being utilised in normal operations as well as safety systems of the reactors following an accident. The Passive Containment Cooling System (PCCS) is one of the several passive safety features in an Advanced Reactor (AHWR). In this paper, the APSRA methodology has been employed for reliability evaluation of the PCCS of AHWR. Performance assessment of the PCCS is initially performed during a postulated design basis LOCA using the best-estimate code RELAP5/Mod 3.2. The parameters affecting the system performance are then identified and considered for further analysis. Based on some pre-determined failure criterion, the failure surface for the system is predicted using the best-estimate code taking into account the deviations of the identified parameters from their nominal states as well as the model uncertainties inherent to the best estimate code. Root diagnosis is then carried out to determine the various failure causes, which occurs mainly due to malfunctioning of mechanical components. The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. The reliability of the PCCS is then evaluated from the probability of availability of these components.

  3. Reliability assessment of Passive Containment Cooling System of an Advanced Reactor using APSRA methodology

    International Nuclear Information System (INIS)

    Kumar, Mukesh; Chakravarty, Aranyak; Nayak, A.K.; Prasad, Hari; Gopika, V.

    2014-01-01

    Highlights: • The paper deals with the reliability assessment of Passive Containment Cooling System of Advanced Heavy Water Reactor. • Assessment of Passive System ReliAbility (APSRA) methodology is used for reliability assessment. • Performance assessment of the PCCS is initially performed during a postulated design basis LOCA. • The parameters affecting the system performance are then identified and considered for further analysis. • The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. - Abstract: Passive Systems are increasingly playing a prominent role in the advanced nuclear reactor systems and are being utilised in normal operations as well as safety systems of the reactors following an accident. The Passive Containment Cooling System (PCCS) is one of the several passive safety features in an Advanced Reactor (AHWR). In this paper, the APSRA methodology has been employed for reliability evaluation of the PCCS of AHWR. Performance assessment of the PCCS is initially performed during a postulated design basis LOCA using the best-estimate code RELAP5/Mod 3.2. The parameters affecting the system performance are then identified and considered for further analysis. Based on some pre-determined failure criterion, the failure surface for the system is predicted using the best-estimate code taking into account the deviations of the identified parameters from their nominal states as well as the model uncertainties inherent to the best estimate code. Root diagnosis is then carried out to determine the various failure causes, which occurs mainly due to malfunctioning of mechanical components. The failure probabilities of the various components are assessed through a classical PSA treatment using generic data. The reliability of the PCCS is then evaluated from the probability of availability of these components

  4. Evaluation of methodologies for remunerating wind power's reliability in Colombia

    International Nuclear Information System (INIS)

    Botero B, Sergio; Isaza C, Felipe; Valencia, Adriana

    2010-01-01

Colombia strives to have enough firm capacity available to meet unexpected power shortages and peak demand; this is clear from mechanisms currently in place that provide monetary incentives (in the order of nearly US$ 14/MW h) to power producers that can guarantee electricity provision during scarcity periods. Yet wind power in Colombia cannot currently guarantee firm power, because an accepted methodology to calculate its potential firm capacity does not exist. In this paper we argue that developing such a methodology would provide an incentive for potential investors to enter into this low-carbon technology. This paper analyzes three methodologies currently used in energy markets around the world to calculate firm wind energy capacity: PJM, NYISO, and Spain. These methodologies were initially selected for their ability to fit within Colombian energy regulations. The objective of this work is to determine which of these methodologies makes most sense from an investor's perspective, to ultimately shed light on developing a methodology to be used in Colombia. To this end, the authors developed a methodology consisting of the elaboration of a wind model using Monte-Carlo simulation, based on known wind behaviour statistics of a region with adequate wind potential in Colombia. The simulation returns random generation data, representing the resource's inherent variability and simulating the historical data required to evaluate the mentioned methodologies, thus producing the technology's theoretical generation data. The document concludes that the evaluated methodologies are easy to implement and that they do not require historical data (important for Colombia, where there is almost no historical wind power data). It is also found that the Spanish methodology provides a higher Capacity Value (and therefore a higher return to investors).
The financial assessment results show that it is crucial that these types of incentives exist to make viable
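A wind model of the kind described, Monte-Carlo generation of synthetic hourly output in place of missing historical data, can be sketched as below. The Weibull wind parameters, the simplified power curve and the percentile-based firm-capacity definition are all illustrative assumptions, not the PJM, NYISO or Spanish rules evaluated in the paper:

```python
import random

def simulated_hourly_output(n_hours, capacity_mw=20.0, seed=7):
    """Toy Monte Carlo wind model: Weibull-distributed wind speed mapped
    through a simplified turbine power curve (cut-in 3 m/s, rated 12 m/s,
    cut-out 25 m/s). All parameters are illustrative."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_hours):
        v = rng.weibullvariate(7.0, 2.0)  # scale 7 m/s, shape 2
        if v < 3.0 or v > 25.0:
            p = 0.0
        elif v >= 12.0:
            p = capacity_mw
        else:
            p = capacity_mw * ((v - 3.0) / (12.0 - 3.0)) ** 3
        out.append(p)
    return out

def capacity_value(output, capacity_mw, quantile=0.9):
    """Firm capacity taken as the output level exceeded in `quantile`
    of all hours (one of several possible definitions), as a fraction
    of installed capacity."""
    ordered = sorted(output)
    idx = int((1.0 - quantile) * len(ordered))
    return ordered[idx] / capacity_mw

hours = simulated_hourly_output(8760)        # one synthetic year
cv = capacity_value(hours, capacity_mw=20.0)
```

Each market methodology in the paper would replace `capacity_value` with its own rule (e.g. averaging over peak-demand hours) applied to the same synthetic series.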

  5. Maintenance personnel performance simulation (MAPPS): a model for predicting maintenance performance reliability in nuclear power plants

    International Nuclear Information System (INIS)

    Knee, H.E.; Krois, P.A.; Haas, P.M.; Siegel, A.I.; Ryan, T.G.

    1983-01-01

    The NRC has developed a structured, quantitative, predictive methodology in the form of a computerized simulation model for assessing maintainer task performance. Objective of the overall program is to develop, validate, and disseminate a practical, useful, and acceptable methodology for the quantitative assessment of NPP maintenance personnel reliability. The program was organized into four phases: (1) scoping study, (2) model development, (3) model evaluation, and (4) model dissemination. The program is currently nearing completion of Phase 2 - Model Development

  6. Prediction of safety critical software operational reliability from test reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1999-01-01

It has been a critical issue to predict safety-critical software reliability in the nuclear engineering area. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability with the failure data collected during the test, assuming that the test environments represent the operational profile well. The user's interest, however, is in the operational reliability rather than the test reliability. Experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment from testing to operation, testing environment factors comprising the aging factor and the coverage factor are developed in this paper and used to predict the ultimate operational reliability with the failure data from the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig
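The basic idea, scaling the test-phase failure rate by a testing environment factor to obtain an operational estimate, can be sketched as follows. The single multiplicative factor and all numbers are illustrative assumptions; the paper's factor combines aging and coverage components in a more detailed way than shown here:

```python
import math

def operational_failure_rate(test_failure_rate, env_factor):
    """Scale the test-phase failure rate by a testing environment
    factor > 1: testing stresses the software beyond the operational
    profile, so the operational rate is lower."""
    return test_failure_rate / env_factor

def reliability(failure_rate, t):
    """Exponential reliability model: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * t)

# Illustrative numbers only:
lam_test = 1e-3   # failures per hour observed during testing
lam_op = operational_failure_rate(lam_test, env_factor=4.0)
r_1000h = reliability(lam_op, 1000.0)
```

With these numbers the predicted 1000-hour operational reliability is exp(-0.25), roughly 0.78, versus exp(-1.0), roughly 0.37, if the raw test rate were used unadjusted.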

  7. Reliable predictions of waste performance in a geologic repository

    International Nuclear Information System (INIS)

    Pigford, T.H.; Chambre, P.L.

    1985-08-01

    Establishing reliable estimates of long-term performance of a waste repository requires emphasis upon valid theories to predict performance. Predicting rates that radionuclides are released from waste packages cannot rest upon empirical extrapolations of laboratory leach data. Reliable predictions can be based on simple bounding theoretical models, such as solubility-limited bulk-flow, if the assumed parameters are reliably known or defensibly conservative. Wherever possible, performance analysis should proceed beyond simple bounding calculations to obtain more realistic - and usually more favorable - estimates of expected performance. Desire for greater realism must be balanced against increasing uncertainties in prediction and loss of reliability. Theoretical predictions of release rate based on mass-transfer analysis are bounding and the theory can be verified. Postulated repository analogues to simulate laboratory leach experiments introduce arbitrary and fictitious repository parameters and are shown not to agree with well-established theory. 34 refs., 3 figs., 2 tabs

  8. Reliability prediction system based on the failure rate model for electronic components

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Lee, Hwa Ki

    2008-01-01

Although many methodologies for predicting the reliability of electronic components have been developed, the resulting reliability figures can be subjective, depending on the particular set of circumstances, and it is therefore not easy to quantify reliability. Among the reliability prediction methods are the statistical-analysis-based method, the similarity analysis method based on an external failure rate database, and the method based on the physics-of-failure model. In this study, we developed a system by which the reliability of electronic components can be predicted, built on the statistical analysis method as the most straightforward approach to reliability prediction. The failure rate models applied are MIL-HDBK-217F N2, PRISM, and Telcordia (Bellcore), and these were compared with a general-purpose system in order to validate the effectiveness of the developed system. Being able to predict the reliability of electronic components from the design stage, the system that we have developed is expected to contribute to enhancing the reliability of electronic components
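A parts-count prediction in the spirit of MIL-HDBK-217F sums, over component types, the base failure rate times the applicable pi factors. The component entries and pi values below are illustrative placeholders, not the handbook's tabulated values:

```python
import math

def parts_count_failure_rate(parts):
    """MIL-HDBK-217F-style parts-count prediction: board failure rate is
    the sum over component types of quantity * base failure rate * the
    applicable pi factors (here just quality pi_Q and environment pi_E)."""
    return sum(n * lam_b * pi_q * pi_e for (n, lam_b, pi_q, pi_e) in parts)

# (quantity, base rate in failures per 1e6 h, pi_Q, pi_E) -- illustrative
# values, not taken from the handbook tables:
board = [
    (12, 0.010, 1.0, 2.0),   # film resistors
    (8,  0.020, 1.0, 4.0),   # ceramic capacitors
    (2,  0.050, 2.0, 4.0),   # microcircuits
]
lam = parts_count_failure_rate(board)          # failures per 1e6 h
r_one_year = math.exp(-lam * 1e-6 * 8760.0)    # series-system reliability
```

The exponential step assumes a constant failure rate and a series system, which is the standard interpretation of a parts-count result.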

  9. Design methodologies for reliability of SSL LED boards

    NARCIS (Netherlands)

    Jakovenko, J.; Formánek, J.; Perpiñà, X.; Jorda, X.; Vellvehi, M.; Werkhoven, R.J.; Husák, M.; Kunen, J.M.G.; Bancken, P.; Bolt, P.J.; Gasse, A.

    2013-01-01

This work presents a comparison of various LED board technologies from the thermal, mechanical and reliability points of view, provided by accurate 3-D modelling. LED boards are proposed as a possible technology replacement for the FR4 LED boards used in 400 lumen retrofit SSL lamps. Presented design

  10. Human reliability: an evaluation of its understanding and prediction

    International Nuclear Information System (INIS)

    Joksimovich, V.

    1987-01-01

    This paper presents a viewpoint on the state-of-the-art in human reliability. The bases for this viewpoint are, by and large, research projects conducted by the NUS for the Electric Power Research Institute (EPRI) primarily with the objective of further enhancing the credibility of PRA methodology. The presentation is divided into the following key sections: Background and Overview, Methodology and Data Base with emphasis on the simulator data base

  11. Use of PRA methodology for enhancing operational safety and reliability

    International Nuclear Information System (INIS)

    Chu, B.; Rumble, E.; Najafi, B.; Putney, B.; Young, J.

    1985-01-01

    This paper describes a broad scope, on-going R and D study, sponsored by the Electric Power Research Institute (EPRI) to utilize key features of the state-of-the-art plant information management and system analysis techniques to develop and demonstrate a practical engineering tool for assisting plant engineering and operational staff to perform their activities more effectively. The study is foreseen to consist of two major activities: to develop a user-friendly, integrated software system; and to demonstrate the applications of this software on-site. This integrated software, Reliability Analysis Program with In-Plant Data (RAPID), will consist of three types of interrelated elements: an Executive Controller which will provide engineering and operations staff users with interface and control of the other two software elements, a Data Base Manager which can acquire, store, select, and transfer data, and Applications Modules which will perform the specific reliability-oriented functions. A broad range of these functions has been envisaged. The immediate emphasis will be focused on four application modules: a Plant Status Module, a Technical Specification Optimization Module, a Reliability Assessment Module, and a Utility Module for acquiring plant data

  12. Conformal prediction for reliable machine learning theory, adaptations and applications

    CERN Document Server

    Balasubramanian, Vineeth; Vovk, Vladimir

    2014-01-01

    The conformal predictions framework is a recent development in machine learning that can associate a reliable measure of confidence with a prediction in any real-world pattern recognition application, including risk-sensitive applications such as medical diagnosis, face recognition, and financial risk prediction. Conformal Predictions for Reliable Machine Learning: Theory, Adaptations and Applications captures the basic theory of the framework, demonstrates how to apply it to real-world problems, and presents several adaptations, including active learning, change detection, and anomaly detection.

  13. Prediction of software operational reliability using testing environment factor

    International Nuclear Information System (INIS)

    Jung, Hoan Sung

    1995-02-01

    Software reliability is especially important to customers these days. The need to quantify the software reliability of safety-critical systems has received special attention, and reliability is rated as one of software's most important attributes. Since software is an intellectual product of human activity and is logically complex, failures are inevitable. No standard models have been established to prove the correctness and to estimate the reliability of software systems by analysis and/or testing. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability from the failure data collected during testing, assuming that the test environment well represents the operational profile. The user's interest, however, is in the operational reliability rather than the test reliability, and experience shows that the operational reliability is higher than the test reliability. With the assumption that this difference in reliability results from the change of environment, a testing environment factor comprising an aging factor and a coverage factor is defined in this work to predict the ultimate operational reliability from the failure data, by incorporating test environments applied beyond the operational profile into the testing environment factor. Test reliability can also be estimated with this approach without any model change. The application results are close to the actual data. The approach used in this thesis is expected to be applicable to ultra-high-reliability software systems used in nuclear power plants, airplanes, and other safety-critical applications

  14. SST prediction methodologies and verification considerations for ...

    African Journals Online (AJOL)

    Probabilistic hindcasts produced for two separate category thresholds are verified over a 24-year test period from 1978/79 to 2001/02 by investigating the various AGCM configurations' attributes of discrimination (whether the forecasts are discernibly different given different outcomes) and reliability (whether the confidence ...

  15. Optimization of reliability centered predictive maintenance scheme for inertial navigation system

    International Nuclear Information System (INIS)

    Jiang, Xiuhong; Duan, Fuhai; Tian, Heng; Wei, Xuedong

    2015-01-01

    The goal of this study is to propose a reliability centered predictive maintenance scheme for a complex structure Inertial Navigation System (INS) with several redundant components. GO Methodology is applied to build the INS reliability analysis model—GO chart. Components Remaining Useful Life (RUL) and system reliability are updated dynamically based on the combination of components lifetime distribution function, stress samples, and the system GO chart. Considering the redundant design in INS, maintenance time is based not only on components RUL, but also (and mainly) on the timing of when system reliability fails to meet the set threshold. The definition of components maintenance priority balances three factors: components importance to system, risk degree, and detection difficulty. Maintenance Priority Number (MPN) is introduced, which may provide quantitative maintenance priority results for all components. A maintenance unit time cost model is built based on components MPN, components RUL predictive model and maintenance intervals for the optimization of maintenance scope. The proposed scheme can be applied to serve as the reference for INS maintenance. Finally, three numerical examples prove the proposed predictive maintenance scheme is feasible and effective. - Highlights: • A dynamic PdM with a rolling horizon is proposed for INS with redundant components. • GO Methodology is applied to build the system reliability analysis model. • A concept of MPN is proposed to quantify the maintenance sequence of components. • An optimization model is built to select the optimal group of maintenance components. • The optimization goal is minimizing the cost of maintaining system reliability
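    The abstract does not give the formula behind the Maintenance Priority Number (MPN), so the sketch below makes an assumption: MPN is taken as the product of the three stated factors (importance to system, risk degree, detection difficulty), by analogy with the Risk Priority Number of classical FMEA. The component names and ratings are invented for illustration.

```python
# Hypothetical MPN sketch: rank INS components for maintenance.
# Assumption (not from the paper): MPN = importance * risk * detectability,
# each rated on a 1-10 scale, analogous to an FMEA RPN.

def mpn(importance, risk, detectability):
    """Higher MPN means the component should be maintained sooner."""
    return importance * risk * detectability

# Illustrative components with made-up (importance, risk, detectability) ratings
components = {
    "gyroscope": (9, 7, 4),
    "accelerometer": (8, 5, 3),
    "power_module": (6, 8, 2),
}

# Sort components by descending MPN to obtain a quantitative maintenance order
ranked = sorted(components, key=lambda c: mpn(*components[c]), reverse=True)
```

With these ratings the gyroscope (MPN 252) would be maintained first, reflecting how the product form lets any single high factor raise a component's priority.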

  16. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing reliability and performability. This paper illustrates the usage of the intuitionistic fuzzy degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed for complex software systems reliability optimization under various constraints.

  17. Reliability evaluation methodologies for ensuring container integrity of stored transuranic (TRU) waste

    International Nuclear Information System (INIS)

    Smith, K.L.

    1995-06-01

    This report provides methodologies for producing defensible estimates of expected transuranic waste storage container lifetimes at the Radioactive Waste Management Complex. These methodologies can be used to estimate transuranic waste container reliability (for integrity and degradation) and as an analytical tool to optimize waste container integrity. Container packaging and storage configurations, which directly affect waste container integrity, are also addressed. The methodologies presented provide a means for demonstrating compliance with Resource Conservation and Recovery Act waste storage requirements

  18. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    International Nuclear Information System (INIS)

    Knezevic, J.; Odoom, E.R.

    2001-01-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because unlike the fault tree methodology, the use of Petri nets allows efficient simultaneous generation of minimal cut and path sets

  19. Multivariate performance reliability prediction in real-time

    International Nuclear Information System (INIS)

    Lu, S.; Lu, H.; Kolarik, W.J.

    2001-01-01

    This paper presents a technique for predicting system performance reliability in real-time considering multiple failure modes. The technique includes on-line multivariate monitoring and forecasting of selected performance measures and conditional performance reliability estimates. The performance measures across time are treated as a multivariate time series. A state-space approach is used to model the multivariate time series. Recursive forecasting is performed by adopting Kalman filtering. The predicted mean vectors and covariance matrix of performance measures are used for the assessment of system survival/reliability with respect to the conditional performance reliability. The technique and modeling protocol discussed in this paper provide a means to forecast and evaluate the performance of an individual system in a dynamic environment in real-time. The paper also presents an example to demonstrate the technique
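    The abstract describes a state-space model with Kalman filtering whose filtered estimate feeds a conditional reliability assessment. The following is a minimal one-dimensional sketch of that idea, not the paper's multivariate protocol: a single performance measure follows a random-walk state, the filter updates mean and variance recursively, and reliability is the probability that the measure stays below a failure threshold. All noise variances, measurements, and the threshold are illustrative assumptions.

```python
# One-dimensional Kalman filter sketch for performance-based reliability.
# Q = process noise variance, R = measurement noise variance (invented values).
import math

def kalman_step(x, P, z, Q=0.01, R=0.1):
    # predict: random-walk state, so the mean is unchanged and variance grows
    x_pred, P_pred = x, P + Q
    # update with measurement z
    K = P_pred / (P_pred + R)            # Kalman gain
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

def reliability(x, P, threshold):
    # P(state < threshold) under the Gaussian filtered estimate
    return 0.5 * (1 + math.erf((threshold - x) / math.sqrt(2 * P)))

x, P = 0.0, 1.0                          # diffuse initial belief
for z in [0.2, 0.25, 0.3, 0.4]:          # observed degradation measures
    x, P = kalman_step(x, P, z)

R_cond = reliability(x, P, threshold=1.0)  # conditional performance reliability
```

The multivariate case in the paper replaces the scalars by mean vectors and covariance matrices, but the predict/update recursion has the same shape.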

  20. Performance reliability prediction for thermal aging based on kalman filtering

    International Nuclear Information System (INIS)

    Ren Shuhong; Wen Zhenhua; Xue Fei; Zhao Wensheng

    2015-01-01

    The performance reliability of the nuclear power plant main pipeline that failed due to thermal aging was studied by the performance degradation theory. Firstly, through the data obtained from the accelerated thermal aging experiments, the degradation process of the impact strength and fracture toughness of austenitic stainless steel material of the main pipeline was analyzed. The time-varying performance degradation model based on the state space method was built, and the performance trends were predicted by using Kalman filtering. Then, the multi-parameter and real-time performance reliability prediction model for the main pipeline thermal aging was developed by considering the correlation between the impact properties and fracture toughness, and by using the stochastic process theory. Thus, the thermal aging performance reliability and reliability life of the main pipeline with multi-parameter were obtained, which provides the scientific basis for the optimization management of the aging maintenance decision making for nuclear power plant main pipelines. (authors)

  1. Study of redundant Models in reliability prediction of HXMT's HES

    International Nuclear Information System (INIS)

    Wang Jinming; Liu Congzhan; Zhang Zhi; Ji Jianfeng

    2010-01-01

    First, two redundant equipment structures for HXMT's HES are proposed: block backup and dual-system cold redundancy. The reliability of each is then predicted using the parts count method, and the two proposals are compared and analyzed. It is concluded that a block-backup redundant equipment structure offers higher reliability and a longer service life. (authors)
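    The parts count method mentioned above sums part failure rates to get a system rate; redundancy is then credited on top of that. The sketch below, with invented failure rates, shows the basic arithmetic for a series system and for a two-unit cold standby with perfect switching, one common redundancy model (the HES block-backup figures themselves are not given in the abstract).

```python
# Parts-count sketch: series system rate is the sum of part rates;
# reliability over mission time t is exp(-lambda * t).
import math

part_rates = [0.5, 1.2, 0.8, 2.0]        # made-up rates, failures per 1e6 h
lam = sum(part_rates) / 1e6               # system failure rate, per hour

t = 20_000                                # mission time, hours
R_single = math.exp(-lam * t)             # single-string reliability

# Two-unit cold standby with perfect switching:
# R = (1 + lam * t) * exp(-lam * t)
R_cold_standby = (1 + lam * t) * math.exp(-lam * t)
```

The comparison makes the qualitative point of the record concrete: for the same parts list, a redundant arrangement yields a markedly higher mission reliability.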

  2. Predicting risk and human reliability: a new approach

    International Nuclear Information System (INIS)

    Duffey, R.; Ha, T.-S.

    2009-01-01

    Learning from experience describes human reliability and skill acquisition, and the resulting theory has been validated by comparison against millions of outcome data from multiple industries and technologies worldwide. The resulting predictions were used to benchmark the classic first generation human reliability methods adopted in probabilistic risk assessments. The learning rate, probabilities and response times are also consistent with the existing psychological models for human learning and error correction. The new approach also implies a finite lower bound probability that is not predicted by empirical statistical distributions that ignore the known and fundamental learning effects. (author)
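    The key claim above is that error rates decay with accumulated experience toward a finite lower bound, rather than toward zero as empirical fits without learning effects would suggest. A minimal sketch of such a learning curve follows; the exponential form and all constants are illustrative assumptions, not the authors' fitted values.

```python
# Hedged learning-curve sketch: outcome (error) rate decays with
# accumulated experience but is floored by a finite lower bound p_min.
import math

def error_rate(experience, p0=0.01, p_min=1e-5, k=3.0):
    """Rate after `experience` (normalized accumulated-experience units).

    p0 is the initial rate, p_min the irreducible floor, k the learning rate;
    all three values here are invented for illustration.
    """
    return p_min + (p0 - p_min) * math.exp(-k * experience)

early, late = error_rate(0.1), error_rate(5.0)
```

However much experience accumulates, the modeled rate never drops below `p_min`, which is the qualitative difference from distributions that ignore learning.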

  3. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, a method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply a Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration, minimizing the asymptotic variance of the decision variable under constraints on sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability over a long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs aimed at more accurate reliability estimation in both its objective function and its constraints. • The methods are applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration
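    A Gamma process has independent Gamma-distributed increments, so every sample path is strictly monotonic, which is why it suits wear-like degradation. The sketch below simulates such paths and applies the record's comparison idea (mean cumulative degradation versus an allowable level); the shape, scale, allowable level, and test length are all invented for illustration, not the paper's design.

```python
# Gamma-process degradation sketch: increments ~ Gamma(shape_rate * dt, scale),
# so cumulative degradation is strictly increasing along each path.
import random

random.seed(1)

def gamma_path(shape_rate, scale, t_end, dt=1.0):
    """Simulate one cumulative-degradation path up to t_end."""
    x, t = 0.0, 0.0
    while t < t_end:
        x += random.gammavariate(shape_rate * dt, scale)
        t += dt
    return x

allowable = 60.0     # allowable cumulative degradation implied by the target
paths = [gamma_path(shape_rate=0.5, scale=1.0, t_end=100) for _ in range(200)]
mean_deg = sum(paths) / len(paths)
demonstrated = mean_deg <= allowable     # simplistic pass/fail comparison
```

A real demonstration would set the allowable level from the required reliability and control the decision risks, as the paper describes; the simulation only shows the monotonic-path mechanics.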

  4. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to a reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability of a system with shared signals, and dynamic calculation formulas for the state probability of a two-state unit, are derived. A method to solve the fault event sets of the system is also presented, and all fault event sets of the offsite power supply system and their failure probabilities are obtained. The reliability of restoring offsite power after a grid stability failure is also calculated. The result shows that the GO methodology is simple and useful for the steady-state and dynamic reliability analysis of repairable systems

  5. A new methodology for predictive tool wear

    Science.gov (United States)

    Kim, Won-Sik

    turned with various cutting conditions, and the results were compared with the proposed analytical wear models. The crater surfaces after machining were studied carefully to shed light on the physics behind crater wear; the abrasive wear mechanism plays a major role in its development. Laser shock processing (LSP) was applied to locally relieve the deleterious tensile residual stresses on the crater surface of a coated tool and thus improve the hardness of the coating. This thesis shows that LSP indeed improves the wear resistance of CVD alumina-coated tool inserts, which carry residual stresses due to the high processing temperature. LSP uses a very short laser pulse with high energy density, which induces high-pressure stress-wave propagation; the residual stresses are relieved by the incident shock waves on the coating surface. Residual stress levels of an LSP-treated CVD alumina-coated carbide insert were evaluated with an X-ray diffractometer. Based on these results, LSP parameters such as the number of laser pulses and the laser energy density can be controlled to reduce residual stress. Crater wear measurements show that wear resistance increases for LSP-treated tool inserts. Because the hardness data are used to predict the wear, the improvement in hardness and wear resistance shows that the mechanism of crater wear also involves abrasive wear.

  6. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of pilot software incorporating typical features of critical software for nuclear power plant safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  7. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants; the GO methodology is one of them. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu in order to examine its applicability to piping systems. Through this analysis, the authors found some disadvantages of the GO methodology: its signals are on-to-off or off-to-on, so GO identifies the time point at which the state of a system changes but cannot treat a system whose state changes as off-on-off, and several computer runs are required to obtain the time-dependent failure probability of a system. To overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those of the GO methodology, but the meanings of signal and time point and the definitions of the operators are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given

  8. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. The fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on the fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
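    The core of any failure-rate allocation is apportioning a system-level target among subsystems. The sketch below shows a simple proportional scheme for a series system, where a fault tree analysis would supply the weighting factors; the weights, names, and target rate are illustrative assumptions, and the paper's dualistic-contrast optimization is not reproduced here.

```python
# Hypothetical proportional failure-rate allocation for a series system:
# each subsystem receives a share of the target rate in proportion to
# its weight (e.g., an importance measure from a fault tree analysis).

def allocate(target_rate, weights):
    total = sum(weights.values())
    return {name: target_rate * w / total for name, w in weights.items()}

# Larger weight = subsystem may tolerate a larger share of the failure budget
weights = {"fuel_system": 3.0, "cooling": 2.0, "control": 1.0}
alloc = allocate(target_rate=6e-5, weights=weights)   # target in failures/hour
```

For a series system the allocated rates sum back to the target, which is the consistency property any allocation scheme must preserve.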

  9. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.; El-Bassioni, A.

    1985-01-01

    This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided and several outstanding issues such as generic allocation and preference assessment are discussed
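    The noninferior (Pareto-optimal) set mentioned above contains the alternatives that cannot be improved in one objective without worsening another. A minimal sketch of that concept for two objectives, say (risk, cost) pairs to be minimized, follows; the candidate points are made up and the filter is brute-force rather than the paper's optimization on a PRA model.

```python
# Noninferiority sketch: a point p is noninferior if no other point q is
# at least as good in both objectives (lower risk AND lower cost) while
# differing from p.

def noninferior(points):
    def dominated(p, q):
        return q[0] <= p[0] and q[1] <= p[1] and q != p
    return [p for p in points if not any(dominated(p, q) for q in points)]

# Made-up (risk, cost) scores for candidate reliability allocations
alternatives = [(0.9, 10), (0.5, 30), (0.7, 20), (0.8, 25), (0.5, 40)]
front = noninferior(alternatives)
```

Restricting the decision maker's preference assessment to `front` is exactly the simplification the methodology exploits: dominated alternatives such as (0.8, 25) need never be considered.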

  10. System reliability prediction using data from non-identical environments

    International Nuclear Information System (INIS)

    Bergman, B.; Ringi, M.

    1997-01-01

    Since information changes one's mind and probability assessments reflect one's degree of belief, a reliability prediction model should incorporate all relevant information. Almost always ignored in existing reliability models is the dependence among component life lengths induced by a common but unknown environment. Furthermore, existing models seldom permit learning from components' performance in similar systems under knowledge of non-identical operating environments. In an earlier paper by the present authors the first type of aspect was taken into account; in this paper that model is generalised so that failure data generated from several similar systems in non-identical environments may be used for the prediction of any similar system in its specific environment

  11. Inter comparison of REPAS and APSRA methodologies for passive system reliability analysis

    International Nuclear Information System (INIS)

    Solanki, R.B.; Krishnamurthy, P.R.; Singh, Suneet; Varde, P.V.; Verma, A.K.

    2014-01-01

    The increasing use of passive systems in innovative nuclear reactors puts demands on the reliability assessment of these systems. Passive systems operate on driving forces such as natural circulation, gravity, and internally stored energy, which are moderately weaker than those of active components. Hence, phenomenological failures (virtual components) are as important as equipment failures (real components) in the evaluation of passive system reliability. The contribution of mechanical components to passive system reliability can be evaluated in a classical way using the available component reliability databases and well-known methods. On the other hand, different methods are required to evaluate the reliability of processes such as thermal hydraulics, due to the lack of adequate failure data. Research on the reliability assessment of passive systems and their integration into PSA is ongoing worldwide, but consensus has not been reached. Two of the most widely used methods are Reliability Evaluation of Passive Systems (REPAS) and Assessment of Passive System Reliability (APSRA). Both methods characterize the uncertainties in the design and process parameters governing the function of the passive system; however, they differ in the quantification of passive system reliability. Intercomparison among the available methods provides useful insights into the strengths and weaknesses of each. This paper highlights the results of a thermal-hydraulic analysis of a typical passive isolation condenser system carried out with the RELAP5 Mod 3.2 computer code applying the REPAS and APSRA methodologies. The failure surface is established for the passive system under consideration, and system reliability has been evaluated using both methods. 
Challenges involved in passive system reliability are identified, which require further attention in order to overcome the shortcomings of these

  12. Towards more accurate and reliable predictions for nuclear applications

    International Nuclear Information System (INIS)

    Goriely, S.

    2015-01-01

    The need for nuclear data far from the valley of stability, for applications such as nuclear astrophysics or future nuclear facilities, challenges the robustness as well as the predictive power of present nuclear models. Most of the nuclear data evaluation and prediction are still performed on the basis of phenomenological nuclear models. For the last decades, important progress has been achieved in fundamental nuclear physics, making it now feasible to use more reliable, but also more complex microscopic or semi-microscopic models in the evaluation and prediction of nuclear data for practical applications. In the present contribution, the reliability and accuracy of recent nuclear theories are discussed for most of the relevant quantities needed to estimate reaction cross sections and beta-decay rates, namely nuclear masses, nuclear level densities, gamma-ray strength, fission properties and beta-strength functions. It is shown that nowadays, mean-field models can be tuned at the same level of accuracy as the phenomenological models, renormalized on experimental data if needed, and therefore can replace the phenomenogical inputs in the prediction of nuclear data. While fundamental nuclear physicists keep on improving state-of-the-art models, e.g. within the shell model or ab initio models, nuclear applications could make use of their most recent results as quantitative constraints or guides to improve the predictions in energy or mass domain that will remain inaccessible experimentally. (orig.)

  13. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    Digital instrumentation and control (I and C) systems are increasingly being applied to nuclear power plants (NPPs) due to their advantages: zero drift, advanced data calculation capacity, and design flexibility. Accordingly, safety issues of software, the main part of a digital I and C system, have been raised. As with hardware components, a software failure in NPPs could lead to a large disaster; therefore failure rate testing and reliability assessment of software should be properly performed before adoption in NPPs. However, the reliability assessment of software is quite different from that of hardware, owing to the difference in nature between the two. One of the most significant differences is that software failures arise from design faults, as 'error crystals', whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has focused on the optimal release time considering the economy. However, the safety goal and public acceptance of NPPs are so distinctive from other industries that software in NPPs depends on a quantitative reliability value rather than on economy. The safety goal of NPPs is exceptionally high compared to other industries, so conventional software reliability assessment methodologies already used in other industries cannot meet it. Thus, a new reliability assessment methodology for the software of digital I and C systems in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to obtain their pros and cons, and then to assess the usefulness of each method for NPP software.

  14. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea

    2014-01-01

    Digital instrumentation and control (I and C) systems are increasingly being applied to nuclear power plants (NPPs) due to their advantages: zero drift, advanced data calculation capacity, and design flexibility. Accordingly, safety issues of software, the main part of a digital I and C system, have been raised. As with hardware components, a software failure in NPPs could lead to a large disaster; therefore failure rate testing and reliability assessment of software should be properly performed before adoption in NPPs. However, the reliability assessment of software is quite different from that of hardware, owing to the difference in nature between the two. One of the most significant differences is that software failures arise from design faults, as 'error crystals', whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has focused on the optimal release time considering the economy. However, the safety goal and public acceptance of NPPs are so distinctive from other industries that software in NPPs depends on a quantitative reliability value rather than on economy. The safety goal of NPPs is exceptionally high compared to other industries, so conventional software reliability assessment methodologies already used in other industries cannot meet it. Thus, a new reliability assessment methodology for the software of digital I and C systems in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to obtain their pros and cons, and then to assess the usefulness of each method for NPP software

  15. Reliability prediction for structures under cyclic loads and recurring inspections

    Directory of Open Access Journals (Sweden)

    Alberto W. S. Mello Jr

    2009-06-01

    This work presents a methodology for determining the reliability of fracture control plans for structures subjected to cyclic loads. It considers the variability of the parameters involved in the problem, such as the initial flaw and the crack growth curve. The probability of detection (POD) curve of the field non-destructive inspection method and the condition/environment are used as important factors for structural confidence. According to classical damage tolerance analysis (DTA), inspection intervals are based on detectable crack size and crack growth rate. However, all variables have uncertainties, which makes the final result stochastic. The material properties, flight loads, engineering tools, and even the reliability of the inspection methods are subject to uncertainties that can significantly affect the final maintenance schedule. The present methodology incorporates all the uncertainties in a simulation process, such as Monte Carlo, and establishes a relationship between the reliability of the overall maintenance program and the proposed inspection interval, forming a “cascade” chart. Due to the scatter, it also defines the confidence level of the “acceptable” risk. As an example, DTA results are presented for the upper cockpit longeron splice bolt of the BAF upgraded F-5EM. In this case, two possible inspection intervals were found: one that can be characterized as remote risk, with a probability of failure (integrity nonsuccess) of 1 in 10 million per flight hour; and the other as extremely improbable, with a probability of nonsuccess of 1 in 1 billion per flight hour, according to aviation standards. These two results are compared with classical military airplane damage tolerance requirements.
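    The Monte Carlo scheme described above can be sketched as follows: sample an initial flaw and a growth rate, grow the crack between inspections, apply a POD curve at each inspection, and count undetected crossings of the critical size. The linear growth law, the lognormal parameters, the POD form, and the interval are all invented for illustration; a real DTA would use a Paris-law crack growth curve and calibrated distributions.

```python
# Hedged Monte Carlo sketch of inspection-interval reliability.
import math
import random

random.seed(42)

def pod(a):
    # Hypothetical POD curve: detection likelier for larger crack size a
    return 1 - math.exp(-8 * a)

def one_life(interval, a_crit=1.0, hours=20_000):
    """Return True if the crack reaches critical size undetected (failure)."""
    a = random.lognormvariate(-4.0, 0.5)      # initial flaw size (invented)
    rate = random.lognormvariate(-10.5, 0.4)  # growth per flight hour (invented)
    t = 0
    while t < hours:
        t_next_insp = t + interval
        # time at which the crack would reach critical size (linear growth)
        t_crit = t + (a_crit - a) / rate
        if t_crit <= min(t_next_insp, hours):
            return True                        # failure before detection
        if t_next_insp >= hours:
            return False                       # survived the service life
        a += rate * interval
        t = t_next_insp
        if random.random() < pod(a):
            return False                       # detected and repaired
    return False

n = 5000
p_fail = sum(one_life(interval=4000) for _ in range(n)) / n
```

Repeating the estimate over a range of intervals yields the interval-versus-reliability relationship that the record's "cascade" chart summarizes.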

  16. Reliability Modeling of Electromechanical System with Meta-Action Chain Methodology

    Directory of Open Access Journals (Sweden)

    Genbao Zhang

    2018-01-01

    To establish a more flexible and accurate reliability model, reliability modeling and a solving algorithm based on the meta-action chain concept are used in this paper. Instead of estimating the reliability of the whole system only in the standard operating mode, it adopts the structure chain and the operating action chain for system reliability modeling. The failure information and structure information for each component are integrated into the model to overcome the fixed factors assumed in traditional modeling. In industrial applications, there may be different operating modes for a multicomponent system. The meta-action chain methodology can estimate system reliability under different operating modes by modeling the components with a variety of failure sensitivities. This approach is demonstrated by computing several electromechanical system cases. The results indicate that the process improves system reliability estimation; it is an effective tool for the reliability estimation problem in systems under various operating modes.

  17. An integrated methodology for the dynamic performance and reliability evaluation of fault-tolerant systems

    International Nuclear Information System (INIS)

    Dominguez-Garcia, Alejandro D.; Kassakian, John G.; Schindall, Joel E.; Zinchuk, Jeffrey J.

    2008-01-01

    We propose an integrated methodology for the reliability and dynamic performance analysis of fault-tolerant systems. This methodology uses a behavioral model of the system dynamics, similar to the ones used by control engineers to design the control system, but also incorporates artifacts to model the failure behavior of each component. These artifacts include component failure modes (and associated failure rates) and how those failure modes affect the dynamic behavior of the component. The methodology bases the system evaluation on the analysis of the dynamics of the different configurations the system can reach after component failures occur. For each of the possible system configurations, a performance evaluation of its dynamic behavior is carried out to check whether its properties, e.g., accuracy, overshoot, or settling time, which are called performance metrics, meet system requirements. Markov chains are used to model the stochastic process associated with the different configurations that a system can adopt when failures occur. This methodology not only enables an integrated framework for evaluating the dynamic performance and reliability of fault-tolerant systems, but also provides a method for guiding the system design process and further optimization. To illustrate the methodology, we present a case study of a lateral-directional flight control system for a fighter aircraft.
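The configuration-level Markov model can be illustrated with a three-state chain (nominal → degraded → lost); the state names, failure rates, and the choice of which configurations pass the performance metrics are assumptions for the sketch, not the paper's case study:

```python
from math import exp

# "nominal" -> "degraded" when a redundant actuator fails (rate L1, per hour);
# "degraded" -> "lost"   when the remaining actuator fails (rate L2).
# Suppose offline dynamic analysis showed that only "nominal" and "degraded"
# meet the performance metrics (overshoot, settling time).
L1, L2 = 1e-4, 5e-4   # illustrative failure rates

def config_probabilities(t):
    """Closed-form solution of the 3-state continuous-time Markov chain."""
    p_nom = exp(-L1 * t)
    p_deg = L1 / (L2 - L1) * (exp(-L1 * t) - exp(-L2 * t))
    p_lost = 1.0 - p_nom - p_deg
    return {"nominal": p_nom, "degraded": p_deg, "lost": p_lost}

def prob_meets_requirements(t, acceptable=("nominal", "degraded")):
    """Probability of being in a configuration whose dynamics pass the metrics."""
    p = config_probabilities(t)
    return sum(p[c] for c in acceptable)
```

This is the integration point of the methodology: the Markov chain supplies configuration probabilities, while the per-configuration dynamic analysis decides which configurations count as acceptable.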

  18. New methodology for fast prediction of wheel wear evolution

    Science.gov (United States)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics with time. However, one of the principal drawbacks of existing methodologies for calculating wear evolution is the computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is the substitution of calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is the substitution of the dynamic calculation (time-integration calculations) by a quasi-static calculation (the solution of the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction in computational cost while maintaining an acceptable level of accuracy (errors on the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow the conclusion that the proposed methodology is valid for an arbitrary vehicle running on an arbitrary track layout.
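The two-step idea (a few characteristic points plus quasi-static contact conditions) might be caricatured as an iterative wear-accumulation loop; the points, weights, wear rates, and reprofiling limit below are invented illustrations, not the paper's model:

```python
# Each characteristic point stands in for a class of track sections; its
# quasi-static wear rate would come from a contact calculation in the real
# methodology but is simply assumed here (mm of profile wear per 10^4 km).
CHAR_POINTS = [   # (weight of point in the line, wear rate mm per 10^4 km)
    (0.6, 0.020), # straight track
    (0.3, 0.055), # large-radius curves
    (0.1, 0.140), # tight curves
]
WEAR_LIMIT_MM = 2.0   # assumed reprofiling threshold

def km_to_reprofiling():
    """Accumulate weighted quasi-static wear until the reprofiling limit."""
    worn, km = 0.0, 0
    rate = sum(w * r for w, r in CHAR_POINTS)  # weighted wear per 10^4 km step
    while worn < WEAR_LIMIT_MM:
        worn += rate
        km += 10_000
        # In the full methodology the contact conditions (and hence `rate`)
        # would be re-evaluated here as the worn profile changes.
    return km
```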

  19. Developing Predictive Maintenance Expertise to Improve Plant Equipment Reliability

    International Nuclear Information System (INIS)

    Wurzbach, Richard N.

    2002-01-01

    On-line equipment condition monitoring is a critical component of the world-class production and safety histories of many successful nuclear plant operators. From addressing availability and operability concerns of nuclear safety-related equipment to increasing profitability through support system reliability and reduced maintenance costs, predictive maintenance programs have increasingly become a vital contribution to the maintenance and operation decisions of nuclear facilities. In recent years, significant advancements have been made in the quality and portability of many of the instruments being used, and software improvements have been made as well. However, the single most influential component of the success of these programs is the impact of a trained and experienced team of personnel putting this technology to work. Changes in the nature of the power generation industry brought on by competition, mergers, and acquisitions have taken the historically stable personnel environment of power generation and created a very dynamic situation. As a result, many facilities have seen significant turnover in key positions, including predictive maintenance personnel. It has become a challenge for many nuclear operators to maintain the consistent contribution of quality data and information from predictive maintenance that has become important in the overall equipment decision process. These challenges can be met through the implementation of quality training for predictive maintenance personnel and regular updating and re-certification of key technology holders. The use of data management tools and services aids in the sharing of information across sites within an operating company, and with experts who can contribute value-added data management and analysis. The overall effectiveness of predictive maintenance programs can be improved through the incorporation of newly developed comprehensive technology training courses. These courses address the use of

  20. Reliability and Lifetime Prediction of Remote Phosphor Plates in Solid-State Lighting Applications Using Accelerated Degradation Testing

    NARCIS (Netherlands)

    Yazdan Mehr, M.; van Driel, W.D.; Zhang, G.Q.

    2015-01-01

    A methodology, based on accelerated degradation testing, is developed to predict the lifetime of remote phosphor plates used in solid-state lighting (SSL) applications. Both thermal stress and light intensity are used to accelerate degradation reaction in remote phosphor plates. A reliability model,
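The thermal part of such accelerated degradation testing is classically extrapolated with an Arrhenius acceleration factor; a hedged sketch (the activation energy and temperatures are assumed values, not the paper's fitted parameters, and the light-intensity stress is ignored here):

```python
from math import exp

K_B = 8.617e-5  # Boltzmann constant, eV/K

def accel_factor(ea_ev, t_use_k, t_stress_k):
    """Arrhenius acceleration factor between a stress and a use temperature."""
    return exp(ea_ev / K_B * (1.0 / t_use_k - 1.0 / t_stress_k))

def predicted_use_life(hours_to_failure_at_stress, ea_ev, t_use_k, t_stress_k):
    """Extrapolate an accelerated-test lifetime to use conditions.

    E.g. a plate that degrades to its failure criterion after
    `hours_to_failure_at_stress` hours at the stress temperature is predicted
    to last accel_factor times longer at the (cooler) use temperature.
    """
    return hours_to_failure_at_stress * accel_factor(ea_ev, t_use_k, t_stress_k)
```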

  1. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors in accidents. There are two generations of Human Reliability Assessment (HRA) that have been developed. Those methodologies are classified by the differences of viewpoints of problem-solving, as the first generation and second generation. The accident analysis can be determined using three techniques of analysis; sequen...

  2. Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology

    Science.gov (United States)

    2015-11-19

    GAO-16-78R, Bulk Fuel Pricing. 441 G St. N.W., Washington, DC 20548. November 19, 2015. The Honorable Ashton Carter, Secretary of Defense. Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology. Dear Secretary Carter: Each fiscal year, the Office of the Under Secretary of Defense (Comptroller), in coordination with the Defense Logistics Agency, sets a standard price per barrel

  3. Remaining life prediction of I and C cables for reliability assessment of NPP systems

    International Nuclear Information System (INIS)

    Santhosh, T.V.; Ghosh, A.K.; Fernandes, B.G.

    2012-01-01

    Highlights: ► A framework for time-dependent reliability prediction of I and C cables for use in PSA of NPPs has been developed using stress–strength interference theory. ► The proposed methodology has been illustrated with accelerated thermal aging data on a typical XLPE cable. ► The behavior of insulation resistance when the degradation process is linear or exponential has also been modeled. ► The reliability index or probability of failure obtained from this approach can be used in system reliability evaluation to account for cable aging in PSA of NPPs. - Abstract: Instrumentation and control (I and C) cables are among the most important components in nuclear power plants (NPPs) because they provide power to safety-related equipment and transmit signals to and from various controllers to perform safety operations. I and C cables in NPPs are subjected to a variety of aging and degradation stressors that can produce immediate degradation or aging-related mechanisms causing the degradation of cable components over time. Although several life estimation techniques exist, there is currently no standard methodology or approach for estimating the time-dependent reliability of I and C cables that can be directly used in probabilistic safety assessment (PSA) applications. Hence, the objective of this study is to develop an approach to estimate and confirm the continued acceptable margin in cable insulation life over time under aging. This paper presents a framework based on structural reliability theory to quantify the lifetime of I and C cables subjected to thermal aging. Since cross-linked polyethylene (XLPE) cables are used extensively in Indian NPPs, the remaining lifetime evaluations have been carried out for a typical XLPE cable. However, the methodology can be extended to other cables such as polyvinyl chloride (PVC), ethylene propylene rubber (EPR), etc.
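The stress–strength interference computation underlying such a framework can be sketched in closed form when both stress and strength are taken as independent normal random variables (a common textbook assumption; the distribution choice and any numbers used below are not the paper's data):

```python
from math import erf, sqrt

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Stress-strength interference for independent normal stress and strength:

        R = P(strength > stress) = Phi(beta),
        beta = (mu_S - mu_s) / sqrt(sd_S^2 + sd_s^2)

    For cable insulation, the 'strength' could be elongation-at-break and the
    'stress' the degradation level at a given age; the margin R then shrinks
    as aging shifts the stress distribution toward the strength distribution.
    """
    beta = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    return 0.5 * (1.0 + erf(beta / sqrt(2.0)))  # standard normal CDF of beta
```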

  4. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) have been developed, classified by their different viewpoints of problem-solving as the first generation and the second generation. Accident analysis can be performed using three techniques: sequential techniques, epidemiological techniques and systemic techniques, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, as applied to marine accidents. The MOP model can effectively describe the relationships among the other factors that affect accidents, whereas the HEART methodology focuses only on human factors.

  5. Methodology for risk assessment and reliability applied for pipeline engineering design and industrial valves operation

    Energy Technology Data Exchange (ETDEWEB)

    Silveira, Dierci [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil). Escola de Engenharia Industrial e Metalurgia. Lab. de Sistemas de Producao e Petroleo e Gas], e-mail: dsilveira@metal.eeimvr.uff.br; Batista, Fabiano [CICERO, Rio das Ostras, RJ (Brazil)

    2009-07-01

    Two kinds of situations may be distinguished when estimating operating reliability in maneuvering industrial valves and the probability of undesired events in pipelines and industrial plants: situations in which the risk is identified in repetitive cycles of operations, and situations in which there is a permanent hazard due to project configurations introduced by decisions during the engineering design definition stage. Estimating reliability based on the influence of design options requires the choice of a numerical index, which may include a composite of human operating parameters based on biomechanics and ergonomics data. We first consider the design conditions under which plant or pipeline operator reliability concepts can be applied when operating industrial valves, and then describe in detail the ergonomics and biomechanics risks that lend themselves to engineering design database development and human reliability modeling and assessment. This database development and reliability modeling is based on a group of engineering design and biomechanics parameters likely to lead to over-exertion forces and awkward working postures, which are themselves associated with the functioning of a particular plant or pipeline. This ergonomics- and biomechanics-based approach for common industrial valve positioning in the plant layout is developed through a methodology to assess physical efforts and operator reach, combining various elementary operating situations. These procedures can be combined with genetic algorithm modeling and the four elements of man-machine systems: the individual, the task, the machinery and the environment. The proposed methodology should be viewed not as competing with traditional reliability and risk assessment but rather as complementary, since it provides parameters related to physical effort values for valve operation and to workspace design and usability. (author)

  6. The Reliability and Predictive Validity of the Stalking Risk Profile.

    Science.gov (United States)

    McEwan, Troy E; Shea, Daniel E; Daffern, Michael; MacKenzie, Rachel D; Ogloff, James R P; Mullen, Paul E

    2018-03-01

    This study assessed the reliability and validity of the Stalking Risk Profile (SRP), a structured measure for assessing stalking risks. The SRP was administered at the point of assessment or retrospectively from file review for 241 adult stalkers (91% male) referred to a community-based forensic mental health service. Interrater reliability was high for stalker type, and moderate-to-substantial for risk judgments and domain scores. Evidence for predictive validity and discrimination between stalking recidivists and nonrecidivists for risk judgments depended on follow-up duration. Discrimination was moderate (area under the curve = 0.66-0.68) and positive and negative predictive values were good over the full follow-up period (Mdn = 170.43 weeks). At 6 months, discrimination was better than chance only for judgments related to stalking of new victims (area under the curve = 0.75); however, high-risk stalkers still reoffended against their original victim(s) 2 to 4 times as often as low-risk stalkers. Implications for the clinical utility and refinement of the SRP are discussed.

  7. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    Science.gov (United States)

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and computing event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure, including non-Gaussian noise, while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments, and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
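The abstaining, cost-sensitive decision step can be caricatured as follows; this is not the paper's derived optimal policy, just an illustration of trading off false-alarm and missed-event costs while abstaining under high uncertainty (all thresholds and costs are assumptions):

```python
def decide(event_prob, ci_width, cost_fp=1.0, cost_fn=5.0, max_ci=0.3):
    """Toy abstaining decision rule.

    Abstain when the model's uncertainty about the event probability
    (here summarized as a credible-interval width) is too large; otherwise
    alarm when the expected cost of staying silent (missing the event)
    exceeds the expected cost of a false alarm.
    """
    if ci_width > max_ci:
        return "abstain"                      # not confident enough to decide
    expected_cost_alarm = (1.0 - event_prob) * cost_fp
    expected_cost_silence = event_prob * cost_fn
    return "alarm" if expected_cost_silence > expected_cost_alarm else "no-alarm"
```

With these costs the alarm threshold works out to event_prob > 1/6, i.e. a 5:1 miss-to-false-alarm cost ratio deliberately biases the rule toward alarming.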

  8. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and the system for printing out the results of GO-FLOW analysis in the form of figures or tables. The above functions are explained by analyzing sample systems, such as the PWR AFWS and BWR ECCS. In the appendices, the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in the GO-FLOW programs are described. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total GO-FLOW system, this methodology has become a powerful tool in a living PSA. (author) 54 refs

  9. A study on a reliability assessment methodology for the VHTR safety systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok

    2012-02-01

    The passive safety system of a 300 MWt VHTR (Very High Temperature Reactor), which has attracted worldwide attention recently, is actively being considered to improve the safety of next-generation nuclear power plant designs. Passive system functionality relies not on an external source of electrical support but on an intelligent use of natural phenomena such as convection, conduction, radiation, and gravity. It is not easy to quantitatively evaluate the reliability of passive safety systems for risk analysis, since the classical reliability assessment methods developed for active system failures are not applicable. Therefore, a new reliability methodology needs to be developed and applied to evaluate the reliability of the conceptually designed VHTR in this study. The preliminary evaluation and conceptualization are performed using the load and capacity concept from the reliability physics model. The response surface method (RSM) is also utilized to evaluate the maximum temperature of the nuclear fuel. The significant variables and their correlations are considered using the GAMMA+ code. The proposed method might contribute to designing the new passive system of the VHTR.
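The response-surface idea can be sketched by fitting a quadratic surrogate to a few hypothetical code runs of peak fuel temperature versus one significant input, then Monte Carlo sampling the cheap surrogate instead of the expensive code; the sample points, the input distribution, and the fuel-temperature limit below are illustrative assumptions, not GAMMA+ results:

```python
import random

# Assumed (input x, peak fuel temperature in deg C) pairs from three
# hypothetical code runs; x might be, e.g., a decay-heat multiplier.
SAMPLES = [(0.9, 1380.0), (1.0, 1450.0), (1.1, 1540.0)]

def fit_quadratic(pts):
    """Coefficients (a, b, c) of the quadratic a + b*x + c*x^2 through 3 points,
    assembled from the Lagrange basis polynomials."""
    coeffs = [0.0, 0.0, 0.0]
    for xa, ya in pts:
        xb, xc = [x for x, _ in pts if x != xa]
        d = (xa - xb) * (xa - xc)
        # coefficients of (x - xb)(x - xc) / d
        for i, ci in enumerate((xb * xc / d, -(xb + xc) / d, 1.0 / d)):
            coeffs[i] += ya * ci
    return coeffs

def failure_probability(limit_c=1600.0, n=100_000, seed=2):
    """Monte Carlo on the surrogate: P(peak fuel temperature > limit)."""
    a, b, c = fit_quadratic(SAMPLES)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.gauss(1.0, 0.05)   # uncertain input, assumed ~ N(1.0, 0.05)
        if a + b * x + c * x * x > limit_c:
            hits += 1
    return hits / n
```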

  10. Application of structural reliability and risk assessment to life prediction and life extension decision making

    International Nuclear Information System (INIS)

    Meyer, T.A.; Balkey, K.R.; Bishop, B.A.

    1987-01-01

    There can be numerous uncertainties involved in performing component life assessments. In addition, sufficient data may be unavailable to make a useful life prediction. Structural Reliability and Risk Assessment (SRRA) is primarily an analytical methodology, or tool, that quantifies the impact of uncertainties on the structural life of plant components and can address the lack of data in component life prediction. As a prelude to discussing the technical aspects of SRRA, a brief review of general component life prediction methods is first made to develop a better understanding of the role of SRRA in such evaluations. SRRA is then presented as it is applied in component life evaluations, with example applications discussed for both nuclear and non-nuclear components.

  11. A methodology and success/failure criteria for determining emergency diesel generator reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wyckoff, H. L. [Electric Power Research Institute, Palo Alto, California (United States)

    1986-02-15

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and success/failure criteria that are specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)

  12. A methodology and success/failure criteria for determining emergency diesel generator reliability

    International Nuclear Information System (INIS)

    Wyckoff, H.L.

    1986-01-01

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and success/failure criteria that are specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)
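From such a success/failure database, a start-reliability point estimate with an approximate one-sided lower confidence bound can be computed as follows (a normal-approximation sketch, not EPRI's actual statistical methodology; the demand counts are invented):

```python
from math import sqrt

def edg_reliability(successes, demands, z=1.645):
    """Point estimate and approximate one-sided lower confidence bound for
    EDG start reliability from demand data.

    Uses the normal approximation to the binomial; z = 1.645 corresponds to
    a one-sided 95% bound. For small failure counts an exact (e.g.
    Clopper-Pearson) bound would be preferable.
    """
    p = successes / demands
    half_width = z * sqrt(p * (1.0 - p) / demands)
    return p, max(0.0, p - half_width)
```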

  13. Reliability Centered Maintenance (RCM) Methodology and Application to the Shutdown Cooling System for APR-1400 Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Faragalla, Mohamed M.; Emmanuel, Efenji; Alhammadi, Ibrahim; Awwal, Arigi M.; Lee, Yong Kwan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2016-10-15

    The Shutdown Cooling System (SCS) is a safety-related system that is used in conjunction with the Main Steam and Main or Auxiliary Feedwater Systems to reduce the temperature of the Reactor Coolant System (RCS) in post-shutdown periods, from the hot shutdown operating temperature to the refueling temperature. In this paper the RCM methodology is applied to the SCS. The RCM analysis is performed based on a Failure Modes, Effects and Criticality Analysis (FMECA) at the component, system and plant levels. Logic Tree Analysis (LTA) is used to determine the optimum maintenance tasks. The main objectives of RCM are safety, preservation of system function, cost-effective maintenance of the plant components, and increased reliability and availability. The RCM methodology is useful for improving equipment reliability by strengthening the management of equipment condition, and it leads to a significant decrease in the amount of periodic maintenance, an extended maintenance cycle, a longer useful life of equipment, and a decrease in overall maintenance cost. It also focuses on the safety of the system by assigning a criticality index to the various components and then selecting maintenance activities based on the risk of failure involved. Therefore, it can be said that RCM introduces a maintenance plan designed for maximum safety in an economical manner, making the system more reliable. For the SCP, increasing the number of condition-monitoring tasks will improve the availability of the SCP. It is recommended to reduce the number of periodic maintenance activities.
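The criticality-ranking step of an FMECA-based RCM study can be sketched with risk priority numbers (RPN = severity × occurrence × detection on 1-10 scales); the components, failure modes, and scores below are invented for illustration, not taken from the APR-1400 analysis:

```python
FAILURE_MODES = [
    # (component, failure mode, severity, occurrence, detection)
    ("SC pump",         "bearing seizure", 8, 4, 3),
    ("SC pump",         "seal leakage",    5, 6, 2),
    ("Heat exchanger",  "tube fouling",    6, 5, 4),
    ("Isolation valve", "fails to open",   9, 2, 5),
]

def rank_by_rpn(modes):
    """Sort failure modes by descending risk priority number; the top entries
    are the candidates for condition monitoring rather than periodic tasks."""
    scored = [(sev * occ * det, comp, mode) for comp, mode, sev, occ, det in modes]
    return sorted(scored, reverse=True)
```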

  14. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program

  15. Life prediction and mechanical reliability of NT551 silicon nitride

    Science.gov (United States)

    Andrews, Mark Jay

    551. For the same reasons, the predicted and actual fatigue performance did not correlate well. The results of this study should not be considered a limitation of the life prediction algorithm, but emphasize the requirement that ceramics be homogeneous and strength-limiting flaws be uniformly distributed as a prerequisite for accurate life prediction and reliability analyses.

  16. Reliability of dipstick assay in predicting urinary tract infection

    Directory of Open Access Journals (Sweden)

    Anith Kumar Mambatta

    2015-01-01

    Full Text Available Aims: Urine dipstick analysis is a quick, cheap and useful test for predicting urinary tract infection (UTI) in hospitalized patients. Our aim is to evaluate the reliability (sensitivity) of urine dipstick analysis against urine culture in the diagnosis of UTI. Materials and Methods: Patients admitted to our hospital suspected of having UTI, with positive urine cultures, were included in this study over a 2-year period (January 2011 to December 2012). Dipstick urinalysis was done using Multistix 10 SG (Siemens) and a Clinitek Advantus analyzer. The sensitivity of dipstick nitrites, leukocyte esterase and blood in these culture-positive UTI patients was calculated retrospectively. Results: Urine dipstick analyses of 635 urine culture-positive patients were studied. The sensitivities of nitrite alone and leukocyte esterase alone were 23.31% and 48.5%, respectively. The sensitivity of blood alone in positive urine culture was 63.94%, the highest sensitivity for a single screening test. The presence of leukocyte esterase and/or blood increased the sensitivity to 72.28%. The sensitivity was highest when nitrite, leukocyte esterase and blood were considered together. Conclusions: The nitrite test and leukocyte esterase test, when used individually, are not reliable for ruling out UTI. Hence, symptomatic UTI patients with a negative dipstick assay should be subjected to urine culture for proper management.
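The sensitivity arithmetic, including the gain from combining tests with a logical OR, can be sketched on made-up per-patient data (the study reports only aggregate percentages, so the four patients below are purely illustrative):

```python
def sensitivity(screen_results):
    """Fraction of culture-positive patients flagged by the screen
    (true positives / (true positives + false negatives))."""
    return sum(screen_results) / len(screen_results)

# Hypothetical dipstick results for four culture-positive patients
# (True = screen positive). Invented for illustration only.
nitrite = [True, False, False, False]
leuko   = [True, True, False, False]
blood   = [True, True, True, False]

# "Any test positive" rule: can only flag at least as many true positives
# as the best single test, so sensitivity never decreases (specificity,
# not computable from culture-positive patients alone, typically suffers).
combined_any = [n or l or b for n, l, b in zip(nitrite, leuko, blood)]
```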

  17. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    Science.gov (United States)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

    In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on recent high-quality acoustic databases is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need further development for this method to become useful. Nonetheless, the authors propose that methods based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.

  18. Fast mission reliability prediction for Unmanned Aerial Vehicles

    International Nuclear Information System (INIS)

    Andrews, J.D.; Poole, J.; Chen, W.H.

    2013-01-01

    There is currently significant interest in the use of autonomous vehicles in many industrial sectors. One such example is the ever increasing use of Unmanned Aerial Vehicles (UAVs), particularly in military operations. This enables dangerous missions to be accomplished without risk to a pilot. UAVs also have potential civil applications, which would require their certification and a demonstration that they are able to respond safely to any potential circumstances. The aircraft would therefore need to be capable of responding safely to the occurrence of component failures, the emergence of threats such as other aircraft in the neighboring airspace, and changing weather conditions. The likelihood that an aircraft will successfully complete a mission can be predicted using phased mission analysis techniques. The predicted mission unreliability can be updated in response to changing circumstances. In the event that the likelihood of mission failure becomes too high, changes have to be made to the mission plan. If these calculations could be carried out fast enough, the quantification procedure could be used to establish an acceptable response to any new conditions. With a view to using the methodology in the context described above, this paper investigates ways in which phased mission analysis can be improved to reduce the calculation time. The methodology improves the processing capability for a UAV phased mission analysis by taking into account the specific characteristics of the fault tree structures which provide the causes of phase failure for a UAV mission. It also carries out as much of the quantification as possible in advance of the mission plan being formulated.
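Under a simplifying independence assumption between phases (which the paper's fault-tree-based method does not require; shared component failures couple the phases in practice), mission unreliability and a mid-mission replanning check reduce to a few lines:

```python
def mission_unreliability(phase_fail_probs):
    """Probability the mission fails in at least one phase, assuming
    independent phase failures: 1 - product of phase success probabilities."""
    success = 1.0
    for q in phase_fail_probs:
        success *= (1.0 - q)
    return 1.0 - success

def replan_if_needed(remaining_phase_probs, threshold=0.01):
    """Mid-mission check: recompute unreliability over the remaining phases
    (updated for failures or threats observed so far) and flag a replan when
    it exceeds an acceptable threshold. The threshold value is an assumption."""
    u = mission_unreliability(remaining_phase_probs)
    return u, u > threshold
```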

  19. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    Science.gov (United States)

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    Abstract We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors aimed to assess the reliability of soft tissue model based implant surgical guides and reported that accuracy was evaluated using software.1 We found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, the positive and negative likelihood ratios, and the odds ratio are among the tests used to evaluate the validity (accuracy) of a single test compared to a gold standard.2-4 It is not clear which of the above-mentioned estimates for validity analysis the reported twenty-two accurate sites (46.81%) relate to. Reliability (repeatability or reproducibility) is assessed by different statistical tests; Pearson r, least squares, and the paired t-test are all common mistakes in reliability analysis.5 Briefly, for quantitative variables the Intra Class Correlation Coefficient (ICC) and for qualitative variables weighted kappa should be used, with caution, because kappa has its own limitations too. Regarding reliability or agreement, it is good to know that for computing the kappa value only concordant cells are considered, whereas discordant cells should also be taken into account in order to reach a correct estimation of agreement (weighted kappa).2-4 As a take home message, for reliability and validity analysis, appropriate tests should be
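
    The letter's central point, that unweighted kappa scores only exact agreement and ignores how far apart discordant ratings are, can be illustrated with a small sketch of linearly weighted kappa. This is a minimal stdlib implementation written for illustration, not code taken from the cited references.

```python
def weighted_kappa(matrix, weights="linear"):
    """Cohen's kappa from a k x k agreement matrix of counts.

    weights="linear" penalizes a disagreement by |i - j| / (k - 1), so
    near misses count partially; weights="unweighted" scores only exact
    agreement on the diagonal.
    """
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    row_tot = [sum(matrix[i]) for i in range(k)]
    col_tot = [sum(matrix[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):
        if weights == "linear":
            return abs(i - j) / (k - 1)
        return 0.0 if i == j else 1.0

    observed = sum(w(i, j) * matrix[i][j] / n
                   for i in range(k) for j in range(k))
    expected = sum(w(i, j) * row_tot[i] * col_tot[j] / n ** 2
                   for i in range(k) for j in range(k))
    return 1.0 - observed / expected
```

For ordered categories where disagreements are mostly adjacent, the weighted version yields a higher (fairer) agreement estimate than the unweighted one, which is exactly the distinction the letter draws.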

  20. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective

  1. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  2. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.
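
    The evidence-combination step underlying all three versions of this abstract rests on Dempster's rule. The following is a minimal sketch of that rule for two basic belief assignments; the hypothesis sets and masses are invented for illustration, and the paper's full methodology layers an analytic hierarchy process on top of this combination.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic belief assignments.

    m1, m2: dicts mapping frozenset hypotheses to mass (each summing to 1).
    Mass assigned to conflicting (empty-intersection) pairs is removed
    and the remainder renormalized.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    norm = 1.0 - conflict
    return {h: m / norm for h, m in combined.items()}
```

With two judgments expressed as masses over {a}, {b}, and the uncommitted set {a, b}, combination concentrates belief on the hypotheses the sources jointly support, which is how the method reduces the subjectivity of a single analyst's judgment.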

  3. Functional components for a design strategy: Hot cell shielding in the high reliability safeguards methodology

    Energy Technology Data Exchange (ETDEWEB)

    Borrelli, R.A., E-mail: rborrelli@uidaho.edu

    2016-08-15

    The high reliability safeguards (HRS) methodology has been established for the safeguardability of advanced nuclear energy systems (NESs). HRS is being developed in order to integrate safety, security, and safeguards concerns, while also optimizing these with operational goals for facilities that handle special nuclear material (SNM). Currently, a commercial pyroprocessing facility is used as an example system. One of the goals in the HRS methodology is to apply intrinsic features of the system to a design strategy. This current study investigates the thickness of the hot cell walls that could adequately shield processed materials. This is an important design consideration that carries implications regarding the formation of material balance areas, the location of key measurement points, and material flow in the facility.

  4. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Kiwhan [Los Alamos National Laboratory; Beddingfield, David H. [Los Alamos National Laboratory; Geist, William H. [Los Alamos National Laboratory; Lee, Sang-Yoon [unaffiliated

    2012-07-03

    A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. The conclusions of this presentation are: (1) a fuel depletion methodology was established and its safeguards application demonstrated; (2) the fuel is proliferation resistant at high discharge burnup (~80 GWD/MtHM) - unfavorable isotopics, a high number of pebbles needed, and pebbles that are harder to reprocess; (3) spent fuel should remain under safeguards comparable to those of an LWR; and (4) diversion scenarios were not considered, but such analysis can be performed.

  5. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  6. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    International Nuclear Information System (INIS)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-01

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  7. Methodology for formulating predictions of stress corrosion cracking life

    International Nuclear Information System (INIS)

    Yamauchi, Kiyoshi; Hattori, Shigeo; Shindo, Takenori; Kuniya, Jiro

    1994-01-01

    This paper presents a methodology for formulating predictions to evaluate the stress corrosion cracking (SCC) potential of each light-water reactor component, where an index is introduced as a life index or F index. The index denotes the SCC time ratio of a given SCC system to be evaluated against a reference SCC system. The life index is expressed as the product of several subdivided life indexes, each corresponding to an SCC influencing factor. Each subdivided life index is constructed as a function of its influencing factor variable, obtained by analyzing experimental SCC life data. The methodology is termed the subdivided factor method. Application of the life index to SCC life data and field data showed that it is effective for evaluating the SCC potential, i.e. the SCC life. Accordingly, the proposed methodology can describe a phenomenon as a function of the variables of several influencing factors, whether or not formulae exist that unite them into a physical model. ((orig.))
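
    The multiplicative structure of the F index can be sketched as follows. The factor laws used here (power-law stress dependence and Arrhenius temperature dependence) and their constants are illustrative assumptions, not the functions the authors fitted to their experimental SCC data.

```python
import math

def scc_life_index(reference, target):
    """Life index F of a target SCC condition relative to a reference,
    built as a product of per-factor sub-indexes (subdivided factor
    method). Both factor laws below are illustrative assumptions:
    life ~ stress**-n and life ~ exp(Q / (R * T)).
    """
    def stress_life(s, n=4.0):             # assumed power-law exponent
        return s ** -n

    def temp_life(t_k, q=100e3, r=8.314):  # assumed activation energy, J/mol
        return math.exp(q / (r * t_k))

    f_stress = (stress_life(target["stress_MPa"])
                / stress_life(reference["stress_MPa"]))
    f_temp = temp_life(target["temp_K"]) / temp_life(reference["temp_K"])
    # F > 1 means longer expected SCC life than the reference system
    return f_stress * f_temp
```

The point of the product form is that each influencing factor contributes an independent sub-index, so new factors can be multiplied in without a unified physical model.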

  8. Clearance Prediction Methodology Needs Fundamental Improvement: Trends Common to Rat and Human Hepatocytes/Microsomes and Implications for Experimental Methodology.

    Science.gov (United States)

    Wood, F L; Houston, J B; Hallifax, D

    2017-11-01

    Although prediction of clearance using hepatocytes and liver microsomes has long played a decisive role in drug discovery, it is widely acknowledged that reliably accurate prediction is not yet achievable despite the predominance of hepatically cleared drugs. Physiologically mechanistic methodology tends to underpredict clearance by several fold, and empirical correction of this bias is confounded by imprecision across drugs. Understanding the causes of prediction uncertainty has been slow, possibly reflecting poor resolution of variables associated with donor source and experimental methods, particularly for the human situation. It has been reported that among published human hepatocyte predictions there was a tendency for underprediction to increase with increasing in vivo intrinsic clearance, suggesting an inherent limitation using this particular system. This implied an artifactual rate limitation in vitro, although preparative effects on cell stability and performance were not yet resolved from assay design limitations. Here, to resolve these issues further, we present an up-to-date and comprehensive examination of predictions from published rat as well as human studies (where n = 128 and 101 hepatocytes and n = 71 and 83 microsomes, respectively) to assess system performance more independently. We report a clear trend of increasing underprediction with increasing in vivo intrinsic clearance, which is similar both between species and between in vitro systems. Hence, prior concerns arising specifically from human in vitro systems may be unfounded and the focus of investigation in the future should be to minimize the potential in vitro assay limitations common to whole cells and subcellular fractions. Copyright © 2017 by The American Society for Pharmacology and Experimental Therapeutics.
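
    The in vitro-to-in vivo scaling that such underprediction statistics are computed against can be sketched with the standard well-stirred liver model. The human scaling factors below (microsomal protein per gram of liver, liver weight per kg body weight, hepatic blood flow) are typical literature values and should be treated as assumptions.

```python
def predicted_hepatic_clearance(clint_ul_min_mg, fu,
                                q_h=20.7,                    # hepatic blood flow, mL/min/kg
                                mg_protein_per_g_liver=40.0,
                                g_liver_per_kg=21.4):
    """Well-stirred model prediction of in vivo hepatic clearance from
    microsomal intrinsic clearance (uL/min/mg protein). Default scaling
    factors are typical human literature values, treated as assumptions.
    """
    # scale to whole-body intrinsic clearance, mL/min/kg
    clint = clint_ul_min_mg * mg_protein_per_g_liver * g_liver_per_kg / 1000.0
    # CLh = Qh * fu * CLint / (Qh + fu * CLint), bounded above by Qh
    return q_h * fu * clint / (q_h + fu * clint)
```

Dividing an observed in vivo clearance by this prediction gives the fold bias the survey examines; the trend the authors report is that this bias grows with the in vivo intrinsic clearance itself.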

  9. Reliability prediction of engineering systems with competing failure modes due to component degradation

    International Nuclear Information System (INIS)

    Son, Young Kap

    2011-01-01

    Reliability of an engineering system depends on two reliability metrics: the mechanical reliability, considering component failures, that a functional system topology is maintained and the performance reliability of adequate system performance in each functional configuration. Component degradation explains not only the component aging processes leading to failure in function, but also system performance change over time. Multiple competing failure modes for systems with degrading components in terms of system functionality and system performance are considered in this paper with the assumption that system functionality is not independent of system performance. To reduce errors in system reliability prediction, this paper tries to extend system performance reliability prediction methods in open literature through combining system mechanical reliability from component reliabilities and system performance reliability. The extended reliability prediction method provides a useful way to compare designs as well as to determine effective maintenance policy for efficient reliability growth. Application of the method to an electro-mechanical system, as an illustrative example, is explained in detail, and the prediction results are discussed. Both mechanical reliability and performance reliability are compared to total system reliability in terms of reliability prediction errors
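
    The paper's central idea, combining mechanical (survival) reliability with performance reliability, can be sketched for a single component whose performance degrades linearly with Gaussian scatter. The exponential failure law and linear drift here are illustrative assumptions, not the paper's models.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def total_reliability(t, lam, y0, drift, sigma, y_min):
    """Total reliability = mechanical reliability x performance reliability.

    Mechanical: exponential survival with rate lam (per hour).
    Performance: probability that a degrading output stays above y_min,
    modeled as Gaussian with mean y0 - drift * t and spread sigma.
    """
    r_mech = math.exp(-lam * t)
    r_perf = 1.0 - norm_cdf((y_min - (y0 - drift * t)) / sigma)
    return r_mech * r_perf
```

Tracking r_mech or r_perf alone overstates system reliability; the gap between either metric and their product is the prediction error the paper quantifies for its electro-mechanical example.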

  10. A methodology for the prediction of offshore wind energy resources

    Energy Technology Data Exchange (ETDEWEB)

    Watson, S J; Watson, G M [Rutherford Appleton Lab., Oxfordshire (United Kingdom); Holt, R.J. [Univ. of East Anglia, Climatic Research Unit, Norwich (United Kingdom)] Barthelmie, R.J. [Risoe National Lab., Dept. of Wind Energy and Atmospheric Physics, Roskilde (Denmark); Zuylen, E.J. van [Ecofys Energy and Environment, Utrecht (Netherlands)] Cleijne, J.W. [Kema Sustainable, Arnhem (Netherlands)

    1999-03-01

    There are increasing constraints on the development of wind power on land. Recently, there has been a move to develop wind power offshore, though measured wind speed data at potential offshore wind farm sites are sparse. We present a novel methodology for the prediction of offshore wind power resources which is being applied to European Union waters. The first stage is to calculate the geostrophic wind from long-term pressure fields over the sea area of interest. Secondly, the geostrophic wind is transformed to sea level using WAsP, taking account of near-shore topography. Finally, these values are corrected for land/sea climatology (stability) effects using an analytical Coastal Discontinuity Model (CDM). These values are further refined using high-resolution offshore data at selected sites. The final values are validated against existing offshore datasets. Preliminary results are presented of the geostrophic wind speed validation in European Union waters. (au)
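
    The first stage of the methodology, deriving the geostrophic wind from pressure fields, follows directly from geostrophic balance. A minimal sketch (constant sea-level air density assumed; the paper works from gridded long-term pressure fields rather than point gradients):

```python
import math

OMEGA = 7.2921e-5   # Earth's angular velocity, rad/s
RHO = 1.225         # sea-level air density, kg/m^3 (assumed constant)

def geostrophic_wind(dp_dx, dp_dy, lat_deg, rho=RHO):
    """Geostrophic wind speed (m/s) from horizontal pressure gradients
    in Pa/m: Vg = |grad p| / (rho * f), with f = 2 * Omega * sin(lat).
    """
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))
    return math.hypot(dp_dx, dp_dy) / (rho * abs(f))
```

A gradient of 1 hPa per 100 km at North Sea latitudes gives a geostrophic wind of roughly 7 m/s, which the later WAsP and stability steps then adjust to sea level.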

  11. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and highlight the lessons learned while applying the two techniques while providing high-level results. This includes the global benefits and deficiencies of the methods and practical problems encountered during the implementation of each technique.

  12. Field programmable gate array reliability analysis using the dynamic flow graph methodology

    Energy Technology Data Exchange (ETDEWEB)

    McNelles, Phillip; Lu, Lixuan [Faculty of Energy Systems and Nuclear Science, University of Ontario Institute of Technology (UOIT), Ontario (Canada)

    2016-10-15

    Field programmable gate array (FPGA)-based systems are thought to be a practical option to replace certain obsolete instrumentation and control systems in nuclear power plants. An FPGA is a type of integrated circuit, which is programmed after being manufactured. FPGAs have some advantages over other electronic technologies, such as analog circuits, microprocessors, and Programmable Logic Controllers (PLCs), for nuclear instrumentation and control, and safety system applications. However, safety-related issues for FPGA-based systems remain to be verified. Owing to this, modeling FPGA-based systems for safety assessment has now become an important point of research. One potential methodology is the dynamic flowgraph methodology (DFM). It has been used for modeling software/hardware interactions in modern control systems. In this paper, FPGA logic was analyzed using DFM. Four aspects of FPGAs are investigated: the 'IEEE 1164 standard', registers (D flip-flops), configurable logic blocks, and an FPGA-based signal compensator. The ModelSim simulations confirmed that DFM was able to accurately model those four FPGA properties, proving that DFM has the potential to be used in the modeling of FPGA-based systems. Furthermore, advantages of DFM over traditional reliability analysis methods and FPGA simulators are presented, along with a discussion of potential issues with using DFM for FPGA-based system modeling.

  13. Application of REPAS Methodology to Assess the Reliability of Passive Safety Systems

    Directory of Open Access Journals (Sweden)

    Franco Pierro

    2009-01-01

    Full Text Available The paper presents the Reliability Evaluation of Passive Safety System (REPAS) methodology developed by the University of Pisa. The general objective of REPAS is to characterize, in an analytical way, the performance of a passive system in order to increase confidence in its operation, and to compare the performance of active and passive systems as well as that of different passive systems. REPAS can be used in the design of passive safety systems to assess their adequacy and to optimize their costs. It may also provide numerical values that can be used in more complex safety assessment studies, and it can be seen as a support to Probabilistic Safety Analysis studies. With regard to this, some examples of the application of the methodology are reported in the paper. A best-estimate thermal-hydraulic code, RELAP5, has been used to support the analyses and to model the selected systems. Probability distributions have been assigned to the uncertain input parameters through engineering judgment. The Monte Carlo method has been used to propagate uncertainties, and Wilks' formula has been used to select the sample size. Failure criteria are defined in terms of nonfulfillment of the defined design targets.
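
    The sample-size step mentioned in the abstract can be made concrete: the first-order, one-sided form of Wilks' formula gives the smallest number of code runs needed so that the largest sampled output bounds a chosen population quantile with a chosen confidence. A minimal sketch:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number n of Monte Carlo runs such that the largest
    sampled output bounds the `coverage` quantile with probability
    `confidence` (first-order, one-sided Wilks criterion:
    1 - coverage**n >= confidence).
    """
    n = 1
    while coverage ** n > 1.0 - confidence:
        n += 1
    return n
```

The familiar 95%/95% criterion gives n = 59 runs; raising the confidence to 99% pushes the requirement to 90 runs.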

  14. A methodology based in particle swarm optimization algorithm for preventive maintenance focused in reliability and cost

    International Nuclear Information System (INIS)

    Luz, Andre Ferreira da

    2009-01-01

    In this work, a Particle Swarm Optimization (PSO) algorithm is developed for preventive maintenance optimization. The proposed methodology allows the use of flexible intervals between maintenance interventions, instead of the usual fixed periods, permitting a better adaptation of the schedule to the failure rates of components under aging. However, because of this flexibility, planning preventive maintenance becomes a more difficult task. Motivated by the fact that PSO has proved very competitive compared with other optimization tools, this work investigates its use as an alternative optimization tool. Since PSO works in a real, continuous space, using it for discrete optimization, in which the schedule may comprise a variable number of maintenance interventions, is a challenge; the PSO model developed in this work overcomes this difficulty. The proposed PSO searches for the best maintenance policy and considers several aspects, such as the probability of needing repair (corrective maintenance), the cost of such repairs, typical outage times, the costs of preventive maintenance, the impact of maintenance on the reliability of the system as a whole, and the probability of imperfect maintenance. To evaluate the proposed methodology, we investigate an electro-mechanical system consisting of three pumps and four valves, the High Pressure Injection System (HPIS) of a PWR. Results show that PSO is quite efficient in finding optimum preventive maintenance policies for the HPIS. (author)
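
    The optimizer itself, independent of the maintenance-scheduling encoding, can be sketched in a few lines. This is a generic continuous-space PSO over a box, shown minimizing a test function; the paper's contribution is the mapping of variable-length maintenance schedules onto such a continuous search space, which is not reproduced here.

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer over a box (illustrative sketch)."""
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]          # each particle's best position
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]  # swarm's best position so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                lo, hi = bounds[d]
                xs[i][d] = min(max(xs[i][d] + vs[i][d], lo), hi)
            v = f(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = xs[i][:], v
                if v < gval:
                    gbest, gval = xs[i][:], v
    return gbest, gval
```

In the maintenance setting, f would be a cost-plus-unreliability objective evaluated over a candidate schedule rather than the sphere function used below.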

  15. Recent Methodologies for Creep Deformation Analysis and Its Life Prediction

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Park, Jae-Young; Iung

    2016-01-01

    To design high-temperature creeping materials, various creep data are needed for codification, as follows: i) stress vs. creep rupture time for base metals and weldments (average and minimum), ii) stress vs. time to 1% total strain (average), iii) stress vs. time to onset of tertiary creep (minimum), iv) constitutive equations for conducting time- and temperature-dependent stress-strain analysis (average), and v) isochronous stress-strain curves (average). Also, elevated-temperature components such as those used in modern power generation plants are designed using allowable stresses under creep conditions. The allowable stress is usually estimated on the basis of up to 10^5 h creep rupture strength at the operating temperature. The master curve of the “sinh” function was found to have wider acceptance, with good flexibility in the low stress ranges beyond the experimental data. The proposed multi-C method for the Larson-Miller (LM) parameter gave better life prediction than a single-C method. These improved methodologies can be utilized to accurately predict the long-term creep life or strength of Gen-IV nuclear materials, which are designed for a life span of 60 years.
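
    The single-C Larson-Miller extrapolation that the multi-C method refines can be sketched directly; C = 20 is the classic assumed constant, and the temperatures and times below are illustrative.

```python
import math

def larson_miller(temp_k, rupture_h, c=20.0):
    """Larson-Miller parameter P = T * (C + log10(t_r)),
    with T in kelvin and rupture time t_r in hours."""
    return temp_k * (c + math.log10(rupture_h))

def rupture_hours(p, temp_k, c=20.0):
    """Invert the LM parameter to a rupture time at another temperature."""
    return 10.0 ** (p / temp_k - c)
```

A rupture point measured at one temperature can thus be extrapolated to a longer allowable life at a lower service temperature; the multi-C refinement fits C per data subset instead of fixing a single value across the whole stress range.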

  16. Validity and reliability of using photography for measuring knee range of motion: a methodological study

    Directory of Open Access Journals (Sweden)

    Adie Sam

    2011-04-01

    Full Text Available Abstract Background The clinimetric properties of knee goniometry are essential to appreciate in light of its extensive use in the orthopaedic and rehabilitative communities. Intra-observer reliability is thought to be satisfactory, but the validity and inter-rater reliability of knee goniometry often demonstrate unacceptable levels of variation. This study tests the validity and reliability of measuring knee range of motion using goniometry and photographic records. Methods Design: Methodological study assessing the validity and reliability of one method ('Marker Method'), which uses a skin marker over the greater trochanter, and another method ('Line of Femur Method'), which requires estimation of the line of the femur. Setting: Radiology and orthopaedic departments of two teaching hospitals. Participants: 31 volunteers (13 arthritic and 18 healthy subjects). Knee range of motion was measured radiographically and photographically using a goniometer. Three assessors were assessed for reliability and validity. Main outcomes: Agreement between methods and within raters was assessed using concordance correlation coefficients (CCCs). Agreement between raters was assessed using intra-class correlation coefficients (ICCs). 95% limits of agreement for the mean difference were computed for all paired comparisons. Results Validity (referenced to radiographs): Each method for all 3 raters yielded very high CCCs for flexion (0.975 to 0.988), and moderate to substantial CCCs for extension angles (0.478 to 0.678). The mean differences and 95% limits of agreement were narrower for flexion than for extension. Intra-rater reliability: For flexion and extension, very high CCCs were attained for all 3 raters for both methods, with slightly greater CCCs seen for flexion (CCCs varied from 0.981 to 0.998). Inter-rater reliability: For both methods, very high ICCs (min to max: 0.891 to 0.995) were obtained for flexion and extension. Slightly higher coefficients were obtained
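
    The agreement statistic used throughout these results, Lin's concordance correlation coefficient, differs from Pearson's r in that it penalizes systematic bias as well as scatter. A minimal stdlib sketch:

```python
def ccc(x, y):
    """Lin's concordance correlation coefficient for paired measurements.

    Measures agreement with the identity line y = x, so a constant bias
    between two raters lowers the score even when their Pearson
    correlation is perfect.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) / n
    sy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2.0 * sxy / (sx + sy + (mx - my) ** 2)
```

For example, a rater who consistently reads one degree high still correlates perfectly with the reference but has a CCC below 1, which is why the study pairs CCCs with limits of agreement.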

  17. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  18. A Reliable Methodology for Determining Seed Viability by Using Hyperspectral Data from Two Sides of Wheat Seeds.

    Science.gov (United States)

    Zhang, Tingting; Wei, Wensong; Zhao, Bin; Wang, Ranran; Li, Mingliu; Yang, Liming; Wang, Jianhua; Sun, Qun

    2018-03-08

    This study investigated the possibility of using visible and near-infrared (VIS/NIR) hyperspectral imaging techniques to discriminate viable and non-viable wheat seeds. Both sides of individual seeds were subjected to hyperspectral imaging (400-1000 nm) to acquire reflectance spectral data. Four spectral datasets, including the ventral groove side, reverse side, mean (the mean of two sides' spectra of every seed), and mixture datasets (two sides' spectra of every seed), were used to construct the models. Classification models, partial least squares discriminant analysis (PLS-DA), and support vector machines (SVM), coupled with some pre-processing methods and successive projections algorithm (SPA), were built for the identification of viable and non-viable seeds. Our results showed that the standard normal variate (SNV)-SPA-PLS-DA model had high classification accuracy for whole seeds (>85.2%) and for viable seeds (>89.5%), and that the prediction set was based on a mixed spectral dataset by only using 16 wavebands. After screening with this model, the final germination of the seed lot could be higher than 89.5%. Here, we develop a reliable methodology for predicting the viability of wheat seeds, showing that the VIS/NIR hyperspectral imaging is an accurate technique for the classification of viable and non-viable wheat seeds in a non-destructive manner.

  19. A critique of reliability prediction techniques for avionics applications

    Directory of Open Access Journals (Sweden)

    Guru Prasad PANDIAN

    2018-01-01

    Full Text Available Avionics (aeronautics and aerospace) industries must rely on components and systems of demonstrated high reliability. For this, handbook-based methods have traditionally been used to design for reliability, develop test plans, and define maintenance requirements and sustainment logistics. However, these methods have been criticized as flawed and as leading to inaccurate and misleading results. In its recent report on enhancing defense system reliability, the U.S. National Academy of Sciences discredited these methods, judging the Military Handbook (MIL-HDBK-217) and its progeny to be invalid and inaccurate. This paper discusses the issues that arise with the use of handbook-based methods in commercial and military avionics applications. Alternative approaches to reliability design (and its demonstration) are also discussed, including similarity analysis, testing, physics-of-failure, and data analytics for prognostics and systems health management.

  20. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
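As a concrete instance of the "open source software" point, one classical reliability estimate, Cronbach's alpha, takes only a few lines. The authors discuss several estimators and their trade-offs; alpha is shown here only because it is the most familiar, and the data are simulated.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return k / (k - 1) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(1)
trait = rng.normal(size=500)                          # latent "signal"
scores = trait[:, None] + rng.normal(size=(500, 4))   # 4 noisy test items
alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}")
```

With equal item variances of 2 and inter-item covariance of 1, the population value here is 0.80.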

  1. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and methodologic and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index [BMI (in kg/m²): 20-35]. Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and interindividual CVs were 20% and 25%, respectively. Increasing sample size, replication of reference and test foods, and length of blood sampling, as well as AUC calculation method, did not improve the CVs. Among the biological factors assessed, insulin index and glycated hemoglobin values explained 15% and 16% of the variability in mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that GI is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value
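The AUC step the GI calculation hinges on can be sketched as follows. The incremental (above-baseline) trapezoidal method below is a simplified version of the commonly recommended approach (below-baseline segments are clipped rather than geometrically "cut"), and the glucose curves are invented numbers, not study data.

```python
import numpy as np

def incremental_auc(times, glucose):
    """Trapezoidal area above the fasting baseline; below-baseline
    excursions are clipped to zero (a simplification of the 'cut' iAUC)."""
    incr = np.clip(np.asarray(glucose, float) - glucose[0], 0.0, None)
    return float(np.sum((incr[1:] + incr[:-1]) / 2.0 * np.diff(times)))

times = np.array([0, 15, 30, 45, 60, 90, 120])        # minutes post-ingestion
ref   = np.array([90, 130, 160, 150, 130, 110, 95])   # glucose, mg/dL (invented)
test  = np.array([90, 115, 140, 135, 120, 105, 92])   # white bread (invented)

# GI = 100 x iAUC(test food) / iAUC(glucose reference), per subject
gi = 100.0 * incremental_auc(times, test) / incremental_auc(times, ref)
print(f"GI = {gi:.1f}")
```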

  2. Conceptual Software Reliability Prediction Models for Nuclear Power Plant Safety Systems

    International Nuclear Information System (INIS)

    Johnson, G.; Lawrence, D.; Yu, H.

    2000-01-01

    The objective of this project is to develop a method to predict the potential reliability of software to be used in a digital system instrumentation and control system. The reliability prediction is to make use of existing measures of software reliability such as those described in IEEE Std 982 and 982.2. This prediction must be of sufficient accuracy to provide a value for uncertainty that could be used in a nuclear power plant probabilistic risk assessment (PRA). For the purposes of the project, reliability was defined to be the probability that the digital system will successfully perform its intended safety function (for the distribution of conditions under which it is expected to respond) upon demand with no unintended functions that might affect system safety. The ultimate objective is to use the identified measures to develop a method for predicting the potential quantitative reliability of a digital system. The reliability prediction models proposed in this report are conceptual in nature. That is, possible prediction techniques are proposed and trial models are built, but in order to become a useful tool for predicting reliability, the models must be tested, modified according to the results, and validated. Using methods outlined by this project, models could be constructed to develop reliability estimates for elements of software systems. This would require careful review and refinement of the models, development of model parameters from actual experience data or expert elicitation, and careful validation. By combining these reliability estimates (generated from the validated models for the constituent parts) in structural software models, the reliability of the software system could then be predicted. Modeling digital system reliability will also require that methods be developed for combining reliability estimates for hardware and software. 
System structural models must also be developed in order to predict system reliability based upon the reliability
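One elementary form of the hardware/software combination the report calls for is a series model with independent contributions. Both probabilities below are placeholders, and the independence of hardware and software failures is itself a modelling assumption that a real PRA would have to justify.

```python
# Series combination of independent hardware and software contributions to a
# digital channel's failure probability on demand (illustrative values only).
p_fail_hw = 1e-4    # hardware failure probability on demand (assumed)
p_fail_sw = 5e-5    # software failure probability on demand (assumed)

reliability = (1 - p_fail_hw) * (1 - p_fail_sw)
p_fail_system = 1 - reliability
print(f"system failure probability on demand: {p_fail_system:.2e}")
```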

  3. Reliability of case definitions for public health surveillance assessed by Round-Robin test methodology

    Directory of Open Access Journals (Sweden)

    Claus Hermann

    2006-05-01

    Full Text Available Abstract Background Case definitions have been recognized to be important elements of public health surveillance systems. They are intended to assure comparability and consistency of surveillance data and have a crucial impact on the sensitivity and the positive predictive value of a surveillance system. The reliability of case definitions has rarely been investigated systematically. Methods We conducted a Round-Robin test by asking all 425 local health departments (LHD) and the 16 state health departments (SHD) in Germany to classify a selection of 68 case examples using case definitions. By multivariate analysis we investigated factors linked to classification agreement with a gold standard, which was defined by an expert panel. Results A total of 7870 classifications were done by 396 LHD (93%) and all SHD. Reporting sensitivity was 90.0%, positive predictive value 76.6%. Polio case examples had the lowest reporting precision, salmonellosis case examples the highest (OR = 0.008; CI: 0.005–0.013). Case definitions with a check-list format of clinical criteria resulted in higher reporting precision than case definitions with a narrative description (OR = 3.08; CI: 2.47–3.83). Reporting precision was higher among SHD compared to LHD (OR = 1.52; CI: 1.14–2.02). Conclusion Our findings led to a systematic revision of the German case definitions and built the basis for general recommendations for the creation of case definitions. These include, among others, that testable yes/no criteria in a check-list format are likely to improve reliability, and that software used for data transmission should be designed in strict accordance with the case definitions. The findings of this study are largely applicable to case definitions in many other countries or international networks, as they share the same structural and editorial characteristics of the case definitions evaluated in this study before their revision.
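The two headline metrics can be reproduced from a 2x2 tally of classifications against the gold standard; the counts below are hypothetical, chosen only to show the arithmetic.

```python
# Hypothetical tally from a Round-Robin exercise, with each classification
# compared against the expert-panel gold standard (counts are invented):
true_pos, false_neg, false_pos = 450, 50, 137

sensitivity = true_pos / (true_pos + false_neg)   # share of gold-standard cases reported
ppv = true_pos / (true_pos + false_pos)           # share of reported cases that are genuine
print(f"sensitivity = {sensitivity:.1%}, PPV = {ppv:.1%}")
```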

  4. Test-Retest Reliability and Predictive Validity of the Implicit Association Test in Children

    Science.gov (United States)

    Rae, James R.; Olson, Kristina R.

    2018-01-01

    The Implicit Association Test (IAT) is increasingly used in developmental research despite minimal evidence of whether children's IAT scores are reliable across time or predictive of behavior. When test-retest reliability and predictive validity have been assessed, the results have been mixed, and because these studies have differed on many…

  5. A common reference population from four European Holstein populations increases reliability of genomic predictions

    DEFF Research Database (Denmark)

    Lund, Mogens Sandø; de Ross, Sander PW; de Vries, Alfred G

    2011-01-01

    Background Size of the reference population and reliability of phenotypes are crucial factors influencing the reliability of genomic predictions. It is therefore useful to combine closely related populations. Increased accuracies of genomic predictions depend on the number of individuals added to...

  6. Embedded mechatronic systems 1 analysis of failures, predictive reliability

    CERN Document Server

    El Hami, Abdelkhalak

    2015-01-01

    In operation, embedded mechatronic systems are stressed by loads from different sources: climatic (temperature, humidity), vibrational, electrical and electromagnetic. The failure mechanisms that these stresses induce in components should be identified and modeled for better control. AUDACE is a collaborative project of the Mov'eo cluster that addresses issues specific to the reliability of embedded mechatronic systems. AUDACE stands for analyzing the causes of failure of the components of onboard mechatronic systems. The goal of the project is to optimize the design of mechatronic devices for reliability. The projec

  7. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
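To make the numerical difficulty concrete, a deliberately minimal monotone (Rusanov-type) scheme for the hindered-settling conservation law alone is sketched below; the paper's method additionally handles the discontinuous bulk-flow flux, the singular feed source, compression and dispersion. The Vesilind-type flux parameters are assumed values, not calibrated ones.

```python
import numpy as np

# du/dt + d f(u)/dz = 0 in a closed batch column (hindered settling only).
v0, u_max, nexp = 1.0e-3, 0.66, 2.0            # assumed Vesilind-type parameters

def flux(u):
    return v0 * u * (1.0 - u / u_max) ** nexp

nz, depth, t_end = 100, 1.0, 200.0
dz = depth / nz
u = np.full(nz, 0.1)          # initially homogeneous suspension
a = v0                        # bound on |f'(u)| for this flux
dt = 0.4 * dz / a             # CFL-limited explicit time step
t = 0.0
while t < t_end:
    up = np.concatenate(([u[0]], u, [u[-1]]))                    # ghost cells
    # Rusanov numerical flux at each cell interface:
    fl = 0.5 * (flux(up[:-1]) + flux(up[1:])) - 0.5 * a * (up[1:] - up[:-1])
    fl[0] = fl[-1] = 0.0                                         # closed walls
    u -= dt / dz * (fl[1:] - fl[:-1])
    t += dt
mass = u.sum() * dz
print(f"total mass after settling: {mass:.6f}")   # conserved at 0.1
```

Because the scheme is conservative and monotone, total mass is preserved and the concentration stays within [0, u_max], two of the consistency properties the paper requires of a reliable SST discretization.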

  8. Engineering Design Handbook: Development Guide for Reliability. Part Three. Reliability Prediction

    Science.gov (United States)

    1976-01-01

    to t is p_a(t) = 1 - q_a(t) (10-6). This is the reliability of being closed, defined for this interval. The probability that a contact will be open ... Monte Carlo simulation. Few people can know all about all available programs. Specialists can assist in selecting a few from the many available.

  9. Reliability of blood pressure measurement and cardiovascular risk prediction

    NARCIS (Netherlands)

    van der Hoeven, N.V.

    2016-01-01

    High blood pressure is one of the leading risk factors for cardiovascular disease, but difficult to reliably assess because there are many factors which can influence blood pressure including stress, exercise or illness. The first part of this thesis focuses on possible ways to improve the

  10. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  11. An analysis of the human reliability on Three Mile Island II accident considering THERP and ATHEANA methodologies

    International Nuclear Information System (INIS)

    Fonseca, Renato Alves da; Alvim, Antonio Carlos Marques

    2005-01-01

    Research on human reliability analysis grows more important every day, as does the study of human factors and their contribution to incidents and accidents, particularly in complex or high-technology plants. The analysis developed here uses the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Event Analysis) methodologies, together with the tables and cases presented in the THERP Handbook, to develop a qualitative and quantitative study of a nuclear accident. The chosen accident is that of Three Mile Island (TMI). The accident analysis reveals a series of incorrect actions that resulted in the permanent loss of the reactor and the shutdown of Unit 2. This study also aims at enhancing the understanding of the THERP and ATHEANA methods and at their practical application. In addition, it makes it possible to understand the influence of plant operational status on human failures, and the influence of human failures on the equipment of a system, in this case a nuclear power plant. (author)
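A small, checkable piece of THERP that such an analysis applies is the dependence model, which adjusts the human error probability (HEP) of a subsequent task given failure of a preceding one. The equations below are the standard THERP dependence formulas; the nominal HEP is an assumed illustrative value, not one taken from the TMI analysis.

```python
# THERP's dependence model: the conditional probability of failing a second
# task given failure of the first, for each of the Handbook's dependence levels.
def conditional_hep(p, level):
    return {
        "zero": p,
        "low": (1 + 19 * p) / 20,
        "moderate": (1 + 6 * p) / 7,
        "high": (1 + p) / 2,
        "complete": 1.0,
    }[level]

nominal_hep = 0.003   # assumed nominal HEP for the second task
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s} {conditional_hep(nominal_hep, level):.4f}")
```

Even low dependence raises a 0.003 HEP by more than an order of magnitude, which is why dependence judgments dominate many THERP quantifications.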

  12. Methodology for Designing Models Predicting Success of Infertility Treatment

    OpenAIRE

    Alireza Zarinara; Mohammad Mahdi Akhondi; Hojjat Zeraati; Koorsh Kamali; Kazem Mohammad

    2016-01-01

    Abstract Background: Prediction models for infertility treatment success have been presented for over 25 years. There are scientific principles for designing and applying prediction models, and these are also used to predict the success rate of infertility treatment. The purpose of this study is to provide basic principles for designing a model to predict infertility treatment success. Materials and Methods: In this paper, the principles for developing predictive models are explained and...

  13. Improving the reliability of fishery predictions under climate change

    DEFF Research Database (Denmark)

    Brander, Keith

    2015-01-01

    The increasing number of publications assessing impacts of climate change on marine ecosystems and fisheries attests to rising scientific and public interest. A selection of recent papers, dealing more with biological than social and economic aspects, is reviewed here, with particular attention...... to the reliability of projections of climate impacts on future fishery yields. The 2014 Intergovernmental Panel on Climate Change (IPCC) report expresses high confidence in projections that mid- and high-latitude fish catch potential will increase by 2050 and medium confidence that low-latitude catch potential...... understanding of climate impacts, such as how to improve coupled models from physics to fish and how to strengthen confidence in analysis of time series...

  14. Natural circulation in water cooled nuclear power plants: Phenomena, models, and methodology for system reliability assessments

    International Nuclear Information System (INIS)

    2005-11-01

    In recent years it has been recognized that the application of passive safety systems (i.e. those whose operation takes advantage of natural forces such as convection and gravity), can contribute to simplification and potentially to improved economics of new nuclear power plant designs. Further, the IAEA Conference on The Safety of Nuclear Power: Strategy for the Future which was convened in 1991 noted that for new plants 'the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate'. Considering the weak driving forces of passive systems based on natural circulation, careful design and analysis methods must be employed to assure that the systems perform their intended functions. To support the development of advanced water cooled reactor designs with passive systems, investigations of natural circulation are an ongoing activity in several IAEA Member States. Some new designs also utilize natural circulation as a means to remove core power during normal operation. In response to the motivating factors discussed above, and to foster international collaboration on the enabling technology of passive systems that utilize natural circulation, an IAEA Coordinated Research Project (CRP) on Natural Circulation Phenomena, Modelling and Reliability of Passive Systems that Utilize Natural Circulation was started in early 2004. Building on the shared expertise within the CRP, this publication presents extensive information on natural circulation phenomena, models, predictive tools and experiments that currently support design and analyses of natural circulation systems and highlights areas where additional research is needed. 
Therefore, this publication serves both to provide a description of the present state of knowledge on natural circulation in water cooled nuclear power plants and to guide the planning and conduct of the CRP in

  15. MERMOS: an EDF project to update the PHRA methodology (Probabilistic Human Reliability Assessment)

    International Nuclear Information System (INIS)

    Le Bot, Pierre; Desmares, E.; Bieder, C.; Cara, F.; Bonnet, J.L.

    1998-01-01

    To account for successive evolutions in the emergency operation of its nuclear power plants, EDF has several times had to revise its PHRA methodology. This was notably the case when event-based procedures were abandoned in favour of state-based procedures. A more recent update was needed to obtain information on the safety of the new N4 unit type. The extent of the changes in operation for this unit type (especially the computerization of both the control room and the procedures) required a deep rethinking of existing PHRA methods. It also seemed necessary to base the design of the methods, more explicitly than in the past, on concepts developed in the human sciences. These are the main ambitions of the MERMOS project, started in 1996. The design effort for the new PHRA method is carried out by a multidisciplinary team involving reliability engineers, psychologists and ergonomists; an independent expert is in charge of the project review. The method, considered as the analysis tool dedicated to PHRA analysts, is one of the two outcomes of the project. The other is the formalization of the design approach for the method, aimed at a good appropriation of the method by the analysts. EDF's specificity in the field of PHRA, and more generally PSA, is that the method is used not by the designers but by analysts. Keeping track of the approach is also meant to guarantee its transposition to other EDF unit types such as the 900 or 1300 MW PWR. The PHRA method is based upon a model of emergency operation called the 'SAD model'. The formalization effort of the design approach led to its clarification and justification. The model describes and explains both the functioning and the dysfunctioning of emergency operation in PSA scenarios. It combines a systemic approach with what cognitive science calls distributed cognition. Collective aspects are considered an important feature in explaining the phenomena under study in operational dysfunction. The PHRA method is to be operational early next year (1998

  16. Investigating Postgraduate College Admission Interviews: Generalizability Theory Reliability and Incremental Predictive Validity

    Science.gov (United States)

    Arce-Ferrer, Alvaro J.; Castillo, Irene Borges

    2007-01-01

    The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…

  17. Reliability Prediction Approaches For Domestic Intelligent Electric Energy Meter Based on IEC62380

    Science.gov (United States)

    Li, Ning; Tong, Guanghua; Yang, Jincheng; Sun, Guodong; Han, Dongjun; Wang, Guixian

    2018-01-01

    The reliability of the intelligent electric energy meter is a crucial issue given its large-scale deployment and the safety of the national intelligent grid. This paper develops a reliability prediction procedure for a domestic intelligent electric energy meter according to IEC 62380, in particular determining the model parameters that reflect domestic working conditions. A case study is provided to demonstrate the effectiveness and validity of the approach.
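The overall shape of such a prediction can be sketched as a parts-count style sum of mission-profile-adjusted component failure rates. All component names, base rates and factors below are placeholders for illustration, not values taken from IEC 62380.

```python
# Parts-count style sketch: the meter failure rate is a sum of component
# failure rates, each adjusted by a mission-profile factor (illustrative values).
components = {
    "metering_ic":  (25.0, 1.2),   # (base rate in FIT, profile factor)
    "power_supply": (60.0, 1.5),
    "lcd":          (15.0, 1.0),
    "relay":        (40.0, 1.8),
}
lambda_total = sum(base * k for base, k in components.values())   # in FIT
mttf_hours = 1e9 / lambda_total    # 1 FIT = 1 failure per 1e9 device-hours
print(f"lambda = {lambda_total:.1f} FIT, MTTF = {mttf_hours:.3g} h")
```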

  18. A Novel Risk Scoring System Reliably Predicts Readmission Following Pancreatectomy

    Science.gov (United States)

    Valero, Vicente; Grimm, Joshua C.; Kilic, Arman; Lewis, Russell L.; Tosoian, Jeffrey J.; He, Jin; Griffin, James; Cameron, John L.; Weiss, Matthew J.; Vollmer, Charles M.; Wolfgang, Christopher L.

    2015-01-01

    Background Postoperative readmissions have been proposed by Medicare as a quality metric and may impact provider reimbursement. Since readmission following pancreatectomy is common, we sought to identify factors associated with readmission in order to establish a predictive risk scoring system (RSS). Study Design A retrospective analysis of 2,360 pancreatectomies performed at nine high-volume pancreatic centers between 2005 and 2011 was performed. Forty-five factors strongly associated with readmission were identified. To derive and validate a RSS, the population was randomly divided into two cohorts in a 4:1 fashion. A multivariable logistic regression model was constructed and scores were assigned based on the relative odds ratio of each independent predictor. A composite Readmission After Pancreatectomy (RAP) score was generated and then stratified to create risk groups. Results Overall, 464 (19.7%) patients were readmitted within 90 days. Eight pre- and postoperative factors, including prior myocardial infarction (OR 2.03), ASA Class ≥ 3 (OR 1.34), dementia (OR 6.22), hemorrhage (OR 1.81), delayed gastric emptying (OR 1.78), surgical site infection (OR 3.31), sepsis (OR 3.10) and short length of stay (OR 1.51), were independently predictive of readmission. The 32-point RAP score generated from the derivation cohort was highly predictive of readmission in the validation cohort (AUC 0.72). The low (0-3), intermediate (4-7) and high risk (>7) groups correlated to 11.7%, 17.5% and 45.4% observed readmission rates, respectively. Conclusions The RAP score reliably predicts readmission following pancreatectomy. Identification of patients with increased risk of readmission using the RAP score will allow efficient resource allocation aimed at attenuating readmission rates. It also has potential to serve as a new metric for comparative research and quality assessment. PMID:25797757
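The additive structure of such an RSS can be sketched as below. The point weights are illustrative stand-ins (the published RAP weights derive from the regression odds ratios), while the risk-group cut-offs and observed rates follow the abstract.

```python
# Additive risk score in the spirit of the RAP score (weights are invented).
POINTS = {
    "prior_mi": 3, "asa_ge3": 1, "dementia": 7, "hemorrhage": 2,
    "delayed_gastric_emptying": 2, "surgical_site_infection": 5,
    "sepsis": 5, "short_length_of_stay": 2,
}

def rap_score(patient):
    """Sum the points of every risk factor present in the patient record."""
    return sum(pts for factor, pts in POINTS.items() if patient.get(factor))

def risk_group(score):
    if score <= 3:
        return "low"            # ~11.7% observed readmission in the study
    if score <= 7:
        return "intermediate"   # ~17.5%
    return "high"               # ~45.4%

patient = {"sepsis": True, "surgical_site_infection": True}
score = rap_score(patient)
print(score, risk_group(score))   # 10 high
```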

  19. Supporting change processes in design: Complexity, prediction and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Eckert, Claudia M. [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: cme26@cam.ac.uk; Keller, Rene [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: rk313@cam.ac.uk; Earl, Chris [Open University, Department of Design and Innovation, Walton Hall, Milton Keynes MK7 6AA (United Kingdom)]. E-mail: C.F.Earl@open.ac.uk; Clarkson, P. John [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: pjc10@cam.ac.uk

    2006-12-15

    Change to existing products is fundamental to design processes. New products are often designed through change or modification to existing products. Specific parts or subsystems are changed to similar ones whilst others are directly reused. Design by modification applies particularly to safety critical products where the reuse of existing working parts and subsystems can reduce cost and risk. However change is rarely a matter of just reusing or modifying parts. Changing one part can propagate through the entire design leading to costly rework or jeopardising the integrity of the whole product. This paper characterises product change based on studies in the aerospace and automotive industry and introduces tools to aid designers in understanding the potential effects of change. Two ways of supporting designers are described: probabilistic prediction of the effects of change and visualisation of change propagation through product connectivities. Change propagation has uncertainties which are amplified by the choices designers make in practice as they implement change. Change prediction and visualisation is discussed with reference to complexity in three areas of product development: the structural backcloth of connectivities in the existing product (and its processes), the descriptions of the product used in design and the actions taken to carry out changes.

  20. Summary of the preparation of methodology for digital system reliability analysis for PSA purposes

    International Nuclear Information System (INIS)

    Hustak, S.; Babic, P.

    2001-12-01

    The report is structured as follows: Specific features of and requirements for the digital part of NPP Instrumentation and Control (I and C) systems (Computer-controlled digital technologies and systems of the NPP I and C system; Specific types of digital technology failures and preventive provisions; Reliability requirements for the digital parts of I and C systems; Safety requirements for the digital parts of I and C systems; Defence-in-depth). Qualitative analyses of NPP I and C system reliability and safety (Introductory system analysis; Qualitative requirements for and proof of NPP I and C system reliability and safety). Quantitative reliability analyses of the digital parts of I and C systems (Selection of a suitable quantitative measure of digital system reliability; Selected qualitative and quantitative findings regarding digital system reliability; Use of relations among the occurrences of the various types of failure). Mathematical section in support of the calculation of the various types of indices (Boolean reliability models, Markovian reliability models). Example of digital system analysis (Description of a selected protective function and the relevant digital part of the I and C system; Functional chain examined, its components and fault tree). (P.A.)
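Of the two model families listed in the mathematical section, the Markovian one is easy to illustrate: a two-state repairable component has a closed-form point availability. The failure and repair rates below are assumed illustrative values, not figures from the report.

```python
import math

# Two-state Markov model of a repairable digital module: working <-> failed.
lam = 1e-4   # failure rate, per hour (assumed)
mu = 1e-2    # repair rate, per hour (assumed)

def availability(t):
    """Point availability A(t), starting in the working state
    (closed-form solution of the two-state chain)."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

steady = mu / (lam + mu)   # long-run (steady-state) availability
print(f"A(100 h) = {availability(100.0):.6f}, steady state = {steady:.6f}")
```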

  1. Application case study of AP1000 automatic depressurization system (ADS) for reliability evaluation by GO-FLOW methodology

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, Muhammad, E-mail: hashimsajid@yahoo.com; Hidekazu, Yoshikawa, E-mail: yosikawa@kib.biglobe.ne.jp; Takeshi, Matsuoka, E-mail: mats@cc.utsunomiya-u.ac.jp; Ming, Yang, E-mail: myang.heu@gmail.com

    2014-10-15

    Highlights: • Discussion of the reasons why AP1000 is equipped with an ADS system, in comparison with a conventional PWR. • Clarification of full and partial depressurization of the reactor coolant system by the ADS. • Application case study of the four-stage ADS system for reliability evaluation in a LBLOCA. • The GO-FLOW tool is capable of evaluating the dynamic reliability of passive safety systems. • The calculated ADS reliability significantly increased the dynamic reliability of the PXS. - Abstract: The AP1000 nuclear power plant (NPP) uses passive means in its safety systems to ensure safety in events of transients or severe accidents. One of the safety systems unique to AP1000 compared with a conventional PWR is the four-stage Automatic Depressurization System (ADS); the ADS originally works as an active safety system. In the present study, the authors first discuss why the four-stage ADS system is added to the AP1000 plant, compared with a conventional PWR, from the standpoint of reliability. They then explain the full and partial depressurization of the RCS by the four-stage ADS in events of transients and loss-of-coolant accidents (LOCAs). Lastly, an application case study of the four-stage ADS system of AP1000 is conducted for the reliability evaluation of the ADS under postulated conditions of full RCS depressurization during a large-break loss-of-coolant accident (LBLOCA) in one of the RCS cold legs. In this case study, the reliability evaluation is made with the GO-FLOW methodology to determine the influence of the ADS system on the dynamic reliability of the passive core cooling system (PXS) of AP1000, i.e. what happens if the ADS system fails or actuates successfully. GO-FLOW is a success-oriented reliability analysis tool capable of evaluating system reliability/unavailability as an alternative to Fault Tree Analysis (FTA) and Event Tree Analysis (ETA). Under these specific conditions of LBLOCA, the GO-FLOW calculated reliability results indicated

  2. A G-function-based reliability-based design methodology applied to a cam roller system

    International Nuclear Information System (INIS)

    Wang, W.; Sui, P.; Wu, Y.T.

    1996-01-01

    Conventional reliability-based design optimization methods treat the reliability function as an ordinary function and apply existing mathematical programming techniques to solve the design problem. As a result, the conventional approach requires nested loops with respect to the g-function and is very time consuming. A new reliability-based design method is proposed in this paper that deals with the g-function directly instead of the reliability function. This approach has the potential to significantly reduce the number of calls for g-function calculations, since it requires only one full reliability analysis per design iteration. A cam roller system in a typical high-pressure fuel-injection diesel engine is designed using both the proposed and the conventional approach. The proposed method is much more efficient for this application
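Working with the g-function directly means evaluating the failure probability P(g(X) <= 0). A crude Monte Carlo version of that evaluation, for a toy linear limit state (not the cam roller model), looks like this:

```python
import numpy as np

# Monte Carlo estimate of P(g(X) <= 0) for a toy limit-state (g-)function
# with standard normal inputs; the function and sample size are invented.
rng = np.random.default_rng(42)

def g(x1, x2):
    return 3.0 + x1 - x2      # capacity minus demand

n = 200_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
pf = float(np.mean(g(x1, x2) <= 0.0))
print(f"estimated P(g <= 0) = {pf:.4f}")   # exact value is Phi(-3/sqrt(2)), about 0.017
```

Each sample is one g-function call, which is exactly the cost a method like the one proposed tries to minimize.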

  3. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    Directory of Open Access Journals (Sweden)

    Nielsen Morten

    2009-07-01

    Full Text Available Abstract Background Estimation of the reliability of specific real value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have implemented a method that predicts the relative surface accessibility of an amino acid and simultaneously predicts the reliability for each prediction, in the form of a Z-score. Results An ensemble of artificial neural networks has been trained on a set of experimentally solved protein structures to predict the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures where reliabilities are obtained by post-processing the output. Conclusion The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best public available method, Real-SPINE. Both methods associate a reliability score with the individual predictions. However, our implementation of reliability scores in the form of a Z-score is shown to be the more informative measure for discriminating good predictions from bad ones in the entire range from completely buried to fully exposed amino acids. This is evident when comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. 
For this subset, values of 0
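The Z-score idea can be illustrated with a toy ensemble: predictions whose members agree get a high reliability, expressed as a Z-score of the spread relative to the batch. Note that, unlike the paper's method (which learns the score during training), this sketch derives it by post-processing an ensemble, i.e. the simpler conventional approach the paper contrasts itself with; all numbers are invented.

```python
import numpy as np

# Toy ensemble of real-value predictors (e.g. relative surface accessibility).
# Each row is one residue; columns are ensemble members.
ensemble_preds = np.array([
    [0.31, 0.29, 0.33, 0.30],   # members agree  -> high reliability
    [0.10, 0.55, 0.80, 0.35],   # members differ -> low reliability
    [0.60, 0.62, 0.58, 0.64],   # members agree  -> high reliability
])
mean_pred = ensemble_preds.mean(axis=1)
spread = ensemble_preds.std(axis=1)
# Express reliability as a Z-score of each spread against the batch of spreads:
z = (spread.mean() - spread) / spread.std()
print(np.round(mean_pred, 3), np.round(z, 2))
```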

  4. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools, and researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model, which is based on Markov chains and COSMIC-FFP, to computer forensic tools. Essentially, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
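
    The underlying computation - propagating a discrete-time Markov chain with an absorbing failure state to obtain a reliability figure - can be sketched as follows (the transition matrix is invented for illustration and is not from the paper):

```python
def markov_reliability(P, start, failed, steps):
    """Probability that a chain starting in `start` has NOT been absorbed
    into the `failed` state after `steps` transitions.
    P is a row-stochastic transition matrix (list of lists)."""
    n = len(P)
    dist = [0.0] * n
    dist[start] = 1.0
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return 1.0 - dist[failed]

# States: 0 = healthy, 1 = degraded, 2 = failed (absorbing). Probabilities are illustrative.
P = [[0.95, 0.04, 0.01],
     [0.00, 0.90, 0.10],
     [0.00, 0.00, 1.00]]
r10 = markov_reliability(P, start=0, failed=2, steps=10)
```

Because state 2 is absorbing, the computed reliability can only decrease as the number of steps grows, mirroring a component wearing out in service.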

  5. Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology

    Science.gov (United States)

    Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.

    2016-05-01

    Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components dimensioning the lifetime of the system. The cooler's reliability is thus one of its most important parameters, and it has to increase to answer market needs. To achieve this, data identifying the weakest elements determining cooler reliability have to be collected. Yet data collected in the field are hardly usable due to a lack of information. A method for identifying reliability improvements therefore has to be set up that can be used even without field return. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expert analyses of RM2 failures occurring during accelerated ageing. Failure modes were then identified and corrective actions implemented. Besides this, the functions of the cooler were ranked with regard to their potential to increase its efficiency, and specific changes were introduced on the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on these two axes - weak spots for cooler reliability, and efficiency - allowed us to drastically increase the MTTF of the RM2 cooler. The large improvements in RM2 reliability are proven by both field return and reliability monitoring; these figures are discussed in the paper.

  6. Reliable prediction of adsorption isotherms via genetic algorithm molecular simulation.

    Science.gov (United States)

    LoftiKatooli, L; Shahsavand, A

    2017-01-01

    Conventional molecular simulation techniques such as grand canonical Monte Carlo (GCMC) rely strictly on purely random search inside the simulation box for predicting adsorption isotherms. This blind search is usually extremely time-consuming for providing a faithful approximation of the real isotherm and in some cases may lead to non-optimal solutions. A novel approach is presented in this article which does not use any of the classical steps of the standard GCMC method, such as displacement, insertion, and removal. The new approach is based on the well-known genetic algorithm to find the optimal configuration for adsorption of any adsorbate on a structured adsorbent under the prevailing pressure and temperature. The proposed approach treats the molecular simulation problem as a global optimization challenge. A detailed flow chart of our so-called genetic algorithm molecular simulation (GAMS) method is presented, which is entirely different from traditional molecular simulation approaches. Three real case studies (for adsorption of CO2 and H2 over various zeolites) are borrowed from the literature to clearly illustrate the superior performance of the proposed method over the standard GCMC technique. For the present method, the average absolute percentage errors are around 11% (RHO-H2), 5% (CHA-CO2), and 16% (BEA-CO2), while they were about 70%, 15%, and 40% for the standard GCMC technique, respectively.
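
    A minimal genetic-algorithm loop of the kind the GAMS approach builds on might look like the following sketch; the toy "energy" objective, the gene encoding, and all parameters are invented stand-ins for the real adsorbate-adsorbent potential, not the authors' code:

```python
import random

def toy_energy(config):
    # Stand-in objective: prefer configurations whose occupancies sum to a target "loading".
    return abs(sum(config) - 12)

def evolve(pop_size=30, genes=8, gens=60, seed=1):
    """Evolve a population of integer-coded configurations toward minimum energy."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 3) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=toy_energy)                 # selection: keep the fittest half
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, genes)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # occasional mutation
                child[rng.randrange(genes)] = rng.randint(0, 3)
            children.append(child)
        pop = survivors + children
    return min(pop, key=toy_energy)

best = evolve()
```

The elitist selection guarantees the best configuration found so far is never lost, which is the property that lets such a search converge faster than blind random insertion/removal moves.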

  7. In-plant reliability data base for nuclear power plant components: data collection and methodology report

    International Nuclear Information System (INIS)

    Drago, J.P.; Borkowski, R.J.; Pike, D.H.; Goldberg, F.F.

    1982-07-01

    The development of a component reliability data base for use in nuclear power plant probabilistic risk assessments and reliability studies is presented in this report. The sources of the data are the in-plant maintenance work request records from a sample of nuclear power plants. This data base is called the In-Plant Reliability Data (IPRD) system. Features of the IPRD system are compared with other data sources such as the Licensee Event Report system, the Nuclear Plant Reliability Data system, and IEEE Standard 500. Generic descriptions of nuclear power plant systems formulated for IPRD are given

  8. Methodological issues concerning the application of reliable laser particle sizing in soils

    Science.gov (United States)

    de Mascellis, R.; Impagliazzo, A.; Basile, A.; Minieri, L.; Orefice, N.; Terribile, F.

    2009-04-01

    During the past decade, the evolution of technologies has enabled laser diffraction (LD) to become a widespread means of measuring particle size distribution (PSD), replacing sedimentation and sieve analysis in many scientific fields, mainly due to its versatility, fast measurement and high reproducibility. Despite these developments, the soil science community has been quite reluctant to replace the good old sedimentation techniques (ST), possibly because of (i) the large complexity of the soil matrix, which induces different types of artefacts (aggregates, deflocculation dynamics, etc.), (ii) the difficulties in relating LD results to results obtained through sedimentation techniques and (iii) the limited size range of most LD equipment. More recently, LD granulometry has slowly been gaining appreciation in soil science, also because of innovations including an enlarged dynamic size range (0.01-2000 μm) and the ability to implement more powerful algorithms (e.g. Mie theory). Furthermore, LD PSD can be successfully used in the application of physically based pedo-transfer functions (i.e., the Arya and Paris model) for investigations of soil hydraulic properties, due to the direct determination of PSD in terms of volume percentage rather than mass percentage, thus eliminating the need to adopt the rough approximation of a single value for soil particle density in the prediction process. Most of the recent LD work performed in soil science deals with the comparison with sedimentation techniques and shows a general overestimation of the silt fraction together with a general underestimation of the clay fraction; these well-known results must be related to the different physical principles behind the two techniques. Despite these efforts, it is indeed surprising that little if any work is devoted to more basic methodological issues related to the high sensitivity of LD to the quantity and quality of the soil samples. Our work aims to

  9. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book starts with the question "what is reliability?", covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions of reliability, estimation of MTBF, probability distribution processes, down time, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
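
    The basic quantities such a text covers - failure rate, MTBF, the reliability function and availability - relate as follows under a constant-failure-rate (exponential) model; the numbers are illustrative:

```python
import math

failure_rate = 2e-4          # lambda, failures per hour (illustrative)
mtbf = 1.0 / failure_rate    # mean time between failures = 5000 h
mttr = 10.0                  # mean time to repair, hours (illustrative)

def reliability(t):
    """Probability of surviving to time t under a constant failure rate."""
    return math.exp(-failure_rate * t)

# Steady-state availability: fraction of time the item is up.
availability = mtbf / (mtbf + mttr)
```

A well-known consequence, R(MTBF) = e^-1 ≈ 0.37, shows that an item has only about a 37% chance of surviving one full MTBF without failure.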

  10. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, D.; Brunett, A.; Passerini, S.; Grelle, A.; Bucknor, M.

    2017-06-26

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  11. Advances in ranking and selection, multiple comparisons, and reliability methodology and applications

    CERN Document Server

    Balakrishnan, N; Nagaraja, HN

    2007-01-01

    S. Panchapakesan has made significant contributions to ranking and selection and has published in many other areas of statistics, including order statistics, reliability theory, stochastic inequalities, and inference. Written in his honor, the twenty invited articles in this volume reflect recent advances in these areas and form a tribute to Panchapakesan's influence and impact on these areas. Thematically organized, the chapters cover a broad range of topics from: Inference; Ranking and Selection; Multiple Comparisons and Tests; Agreement Assessment; Reliability; and Biostatistics. Featuring

  12. Seismic reliability assessment methodology for CANDU concrete containment structures - Phase II

    International Nuclear Information System (INIS)

    Hong, H.P.

    1996-07-01

    This study was undertaken to verify a set of load factors for reliability-based seismic evaluation of CANDU containment structures in Eastern Canada. Here, the new, site-specific, results of probabilistic seismic hazard assessment (response spectral velocity) were applied. It was found that the previously recommended load factors are relatively insensitive to the new seismic hazard information, and are adequate for a reliability-based seismic evaluation process. (author). 4 refs., 5 tabs., 9 figs

  13. Forest cover change prediction using hybrid methodology of ...

    Indian Academy of Sciences (India)

    to assess the present and future land use/land cover scenario of Gangtok, the sub-Himalayan capital of ... data is minimal. Finally, a combination of Markov modelling and SAVI was used to predict the probable land-use scenario in Gangtok in 2020 AD, which indicated that more ... to develop resource allocation strategies.

  14. A reliability-based preventive maintenance methodology for the projection spot welding machine

    Directory of Open Access Journals (Sweden)

    Fayzimatov Ulugbek

    2018-06-01

    Full Text Available The effective operation of a projection spot welding (PSW) machine is closely related to the effectiveness of its maintenance. Timely maintenance can prevent failures and improve the reliability and maintainability of the machine. Establishing the maintenance frequency for the welding machine is therefore one of the most important tasks for plant engineers. In this regard, reliability analysis of the welding machine can be used to establish preventive maintenance intervals (PMI) and to identify the critical parts of the system. In this reliability and maintainability study, an analysis of the PSW machine was carried out. The failure and repair data for the analysis were obtained from an automobile manufacturing company located in Uzbekistan. The machine was divided into three main sub-systems: electrical, pneumatic and hydraulic. Different distribution functions were tested for each sub-system and their parameters tabulated. Based on the estimated parameters of the analyzed distributions, PMIs for the PSW machine's sub-systems at different reliability levels were calculated. Finally, preventive measures for enhancing the reliability of the PSW machine's sub-systems are suggested.
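
    Assuming a Weibull fit (a common choice for such sub-system failure data; the shape and scale values below are invented, not from the study), a preventive maintenance interval at a chosen reliability level follows directly by inverting the reliability function:

```python
import math

def pmi(eta, beta, target_reliability):
    """Time at which the Weibull reliability R(t) = exp(-(t/eta)**beta)
    falls to the target level; used as the preventive maintenance interval."""
    return eta * (-math.log(target_reliability)) ** (1.0 / beta)

# Hypothetical wear-out sub-system: scale 900 h, shape 2.1 (beta > 1 means ageing).
for r in (0.90, 0.80, 0.70):
    print(f"R = {r:.2f} -> service every {pmi(900, 2.1, r):.0f} h")
```

Demanding a higher reliability level shortens the interval, which is exactly the trade-off the plant engineer tunes.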

  15. Methodology for time-dependent reliability analysis of accident sequences and complex reactor systems

    International Nuclear Information System (INIS)

    Paula, H.M.

    1984-01-01

    The work presented here is of direct use in probabilistic risk assessment (PRA) and is of value to utilities as well as the Nuclear Regulatory Commission (NRC). Specifically, this report presents a methodology and a computer program to calculate the expected number of occurrences for each accident sequence in an event tree. The methodology evaluates the time-dependent (instantaneous) and the average behavior of the accident sequence. The methodology accounts for standby safety system and component failures that occur (a) before they are demanded, (b) upon demand, and (c) during the mission (system operation). With respect to failures that occur during the mission, this methodology is unique in the sense that it models components that can be repaired during the mission. The expected number of system failures during the mission provides an upper bound for the probability of a system failure to run - the mission unreliability. The basic event modeling includes components that are continuously monitored, periodically tested, and those that are not tested or are otherwise nonrepairable. The computer program ASA allows practical applications of the method developed. This work represents a required extension of the presently available methodology and allows a more realistic PRA of nuclear power plants
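
    The standby-component contributions the methodology accounts for can be illustrated with the classic approximations for a periodically tested component and for failure-to-run during a mission (rates, intervals and the initiator frequency below are invented):

```python
def avg_unavailability_tested(lam, tau):
    """Average unavailability of a standby component with failure rate lam
    that is inspected every tau hours (classic lam*tau/2 approximation)."""
    return lam * tau / 2.0

def mission_unreliability_bound(lam, mission_time):
    """Expected number of failures during the mission, lam*T, which upper-bounds
    the probability of a failure to run (as noted in the abstract)."""
    return lam * mission_time

q_standby = avg_unavailability_tested(1e-5, 720)   # monthly test interval
q_run = mission_unreliability_bound(1e-4, 24)      # 24 h mission
sequence_freq = 0.1 * q_standby * q_run            # initiator frequency * failures
```

Multiplying an initiating-event frequency by such per-branch probabilities is how the expected number of occurrences of an accident sequence is accumulated along an event tree.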

  16. A methodology for noise prediction of turbofan engines.

    OpenAIRE

    Gustavo Di Fiore dos Santos

    2006-01-01

    A computational model is developed for the prediction of noise emission from an existing or new turbofan engine. This model allows the simulation of noise generation from high bypass ratio turbofan engines and is appropriate for use with the computational programs for gas turbine performance developed at ITA. Analytical and empirical methods are used for spectrum shape, spectrum level, overall noise and free-field directivity of noise. The most significant noise sources in turbofan engines are modeled: fan, com...

  17. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

    This paper discusses a new technique developed for generating well-defined RF large-voltage-swing signals for on-wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  18. The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale

    Science.gov (United States)

    Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine

    2013-01-01

    Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…

  19. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future

  20. Epileptic seizure prediction based on a bivariate spectral power methodology.

    Science.gov (United States)

    Bandarabadi, Mojtaba; Teixeira, Cesar A; Direito, Bruno; Dourado, Antonio

    2012-01-01

    The spectral power of 5 frequently considered frequency bands (Alpha, Beta, Gamma, Theta and Delta) for 6 EEG channels is computed, and all possible pairwise combinations among the 30-feature set are then used to create a 435-dimensional feature space. Two new feature selection methods are introduced to choose the best candidate features and to reduce the dimensionality of this feature space. The selected features are then fed to Support Vector Machines (SVMs) that classify the cerebral state into preictal and non-preictal classes. The outputs of the SVM are regularized using a method that accounts for the classification dynamics of the preictal class, also known as the "Firing Power" method. The results obtained using our feature selection approaches are compared with those obtained using the minimum Redundancy Maximum Relevance (mRMR) feature selection method. The results in a group of 12 patients of the EPILEPSIAE database, containing 46 seizures and 787 hours of multichannel recording for out-of-sample data, indicate the efficiency of the bivariate approach as well as of the two new feature selection methods. The best results presented a sensitivity of 76.09% (35 of 46 seizures predicted) and a false prediction rate of 0.15 h⁻¹.
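
    The "Firing Power" regularization amounts to a sliding-window fraction of preictal classifications, with an alarm raised only when that fraction crosses a threshold, so isolated misclassifications are suppressed. A sketch with an invented label sequence:

```python
def firing_power(svm_labels, window):
    """Fraction of samples classified preictal (1) in the trailing window."""
    out = []
    for i in range(len(svm_labels)):
        w = svm_labels[max(0, i - window + 1): i + 1]
        out.append(sum(w) / len(w))
    return out

labels = [0, 1, 0, 0, 1, 1, 1, 1, 0, 1]   # raw SVM outputs (illustrative)
fp = firing_power(labels, window=4)
alarms = [v >= 0.75 for v in fp]           # alarm only on sustained firing
```

The isolated positive at index 1 never triggers an alarm, while the sustained run ending at index 7 does - the behaviour that keeps the false prediction rate low.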

  1. Methodology for predicting cooling water effects on fish

    International Nuclear Information System (INIS)

    Cakiroglu, C.; Yurteri, C.

    1998-01-01

    The mathematical model presented here predicts the long-term effects of once-through cooling water systems on local fish populations. The fish life cycle model simulates the different life stages of fish by using appropriate expressions representing growth and mortality rates. The heart of the modeling approach is the prediction of the plant-caused reduction in the total fish population, by estimating recruitment to the adult population with and without entrainment of ichthyoplankton and impingement of small fish. The model was applied to a local fish species, the gilthead (Sparus aurata), for the case of a proposed power plant in the Aegean region of Turkey. The simulations indicate that entrainment and impingement may lead to a population reduction of about 2% to 8% in the long run. In many cases, an impact of this size can be considered rather unimportant. In the case of sensitive and ecologically valued species facing extinction, however, the necessary precautions should be taken to minimize or totally avoid such an impact
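
    The recruitment comparison at the heart of such a model - adult recruits with and without plant-induced mortality in the earliest life stage - can be sketched as follows (the stage survival rates and mortality fraction are invented, not from the paper):

```python
def adult_recruits(eggs, stage_survival, extra_mortality=0.0):
    """Recruits surviving all life stages; plant-induced entrainment adds
    extra mortality to the first (planktonic) stage."""
    n = eggs * stage_survival[0] * (1.0 - extra_mortality)
    for s in stage_survival[1:]:
        n *= s
    return n

# Hypothetical cohort: 1e6 eggs, three life stages with survival 1%, 20%, 50%.
baseline = adult_recruits(1e6, [0.01, 0.2, 0.5])
with_plant = adult_recruits(1e6, [0.01, 0.2, 0.5], extra_mortality=0.05)
reduction = 1.0 - with_plant / baseline
```

In this multiplicative structure, a 5% extra mortality in one stage translates directly into a 5% long-run reduction in recruits, which is the quantity the abstract reports in the 2-8% range.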

  2. A generic Approach for Reliability Predictions considering non-uniformly Deterioration Behaviour

    International Nuclear Information System (INIS)

    Krause, Jakob; Kabitzsch, Klaus

    2012-01-01

    Predictive maintenance offers the possibility of prognosticating the remaining time until a maintenance action on a machine has to be scheduled. Unfortunately, current predictive maintenance solutions are only suitable for very specific use cases, such as reliability predictions based on vibration monitoring. Furthermore, they do not consider the fact that machines may deteriorate non-uniformly, depending on external influences (e.g., the workpiece material in a milling machine or the changing fruit acid concentration in a bottling plant). In this paper, two concepts for a generic predictive maintenance solution that also considers non-uniform ageing behaviour are introduced. The first concept is based on system models representing the health state of a technical system. As these models are usually static (i.e., without a time dimension), their coefficients are determined periodically and the resulting time series is used as an ageing indicator. The second concept focuses on external influences (contexts) that change the behaviour of the previously mentioned ageing indicators, in order to increase the accuracy of reliability predictions. To this end, context-dependent time series models are determined and used to predict machine reliability. Both concepts were evaluated on data from an air ventilation system. It could thereby be shown that they are suitable for determining ageing indicators in a generic way and for incorporating external influences into the reliability prediction. Through this, the quality of reliability predictions can be significantly increased; in practice this leads to more accurate scheduling of maintenance actions. Furthermore, the generic character of the solutions makes the concepts suitable for a wide range of ageing processes.
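
    The first concept - periodically refitting a static model of the machine and tracking its coefficients over time as an ageing indicator - can be sketched as follows (the fan pressure/speed data are invented for illustration):

```python
def model_coefficient(xs, ys):
    """Least-squares slope of a static input/output model over one period."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical ventilation fan: pressure vs. speed, refit each week. A drifting
# coefficient (more pressure needed per unit speed) is the ageing indicator.
weeks = [
    ([1, 2, 3, 4], [2.0, 4.0, 6.1, 8.0]),
    ([1, 2, 3, 4], [2.2, 4.4, 6.5, 8.8]),
    ([1, 2, 3, 4], [2.4, 4.9, 7.2, 9.6]),
]
indicator = [model_coefficient(x, y) for x, y in weeks]
```

The time series `indicator` - not the raw sensor data - is then extrapolated (in the paper, with context-dependent time series models) to predict remaining useful life.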

  3. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In bearing procurement analysis, price and reliability must both be considered as decision criteria, since price determines the direct cost (the acquisition cost) while bearing reliability determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered in bearing procurement analysis. This paper explains a bearing evaluation method that uses total cost of ownership analysis to consider both price and maintenance cost as decision criteria. Furthermore, since failure data are scarce during the bearing evaluation phase, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, while for short-term planning the cheaper but less reliable one is preferable. This context dependence can give rise to conflict between stakeholders; thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
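
    Predicting reliability from the dynamic load rating is commonly done with the basic bearing rating-life formula, L10 = (C/P)^p × 10^6 revolutions (the exponent 3 applies to ball bearings); a sketch, with invented loads, speeds and ratings:

```python
def l10_life_hours(C, P, rpm, exponent=3.0):
    """Basic rating life: revolutions survived by 90% of a bearing population,
    (C/P)**p * 1e6, converted to hours at the given shaft speed."""
    revolutions = (C / P) ** exponent * 1e6
    return revolutions / (rpm * 60.0)

# Two hypothetical candidate bearings under the same 2 kN load at 1500 rpm.
cheap = l10_life_hours(C=10_000, P=2_000, rpm=1500)    # lower dynamic load rating
premium = l10_life_hours(C=14_000, P=2_000, rpm=1500)  # pricier, higher rating
```

The cubic exponent is why a modestly higher load rating buys a disproportionately longer life - the effect the total-cost-of-ownership comparison in the paper trades off against purchase price.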

  4. A Reliable and Valid Survey to Predict a Patient’s Gagging Intensity

    Directory of Open Access Journals (Sweden)

    Casey M. Hearing

    2014-07-01

    Full Text Available Objectives: The aim of this study was to devise a reliable and valid survey to predict the intensity of someone’s gag reflex. Material and Methods: A 10-question Predictive Gagging Survey was created, refined, and tested on 59 undergraduate participants. The questions focused on risk factors and experiences that would indicate the presence and strength of someone’s gag reflex. Reliability was assessed by administering the survey to a group of 17 participants twice, with 3 weeks separating the two administrations. Finally, the survey was given to 25 dental patients. In these cases, patients completed an informed consent form, filled out the survey, and then had a maxillary impression taken while their gagging response was quantified from 1 to 5 on the Fiske and Dickinson Gagging Intensity Index. Results: There was a moderate positive correlation between the Predictive Gagging Survey and Fiske and Dickinson’s Gagging Severity Index, r = +0.64, demonstrating the survey’s validity. Furthermore, the test-retest reliability was r = +0.96, demonstrating the survey’s reliability. Conclusions: The Predictive Gagging Survey is a 10-question survey about gag-related experiences and behaviours. We established that it is a reliable and valid method to assess the strength of someone’s gag reflex.

  5. ABOUT POSSIBILITY OF USAGE METHODOLOGICAL APPROACHES TO BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Ruslan Valentinovich Druzin

    2013-12-01

    Full Text Available Analysis of the most common foreign methods showed that they were designed to analyze enterprises under conditions of sustainable economic development with a low level of shadow economy. The most appropriate retrospective analysis results were obtained using the Springate model, the Lis ratio and the Beaver ratio. Analysis of domestic methods leads us to conclude that accounting for the insolvency criterion through a number of factors is difficult. Ukrainian researchers, like foreign ones, use indexes for bankruptcy prediction that are based on the convolution of values of different insolvency signs. However, we believe that using a single indicator as the result does not allow an insolvency diagnosis to be made, the reason being the high probability of an erroneous calculation due to the unreliability of the data used. Another problem of the domestic methods is their orientation toward official statistics, which increases the error due to the significant shadowing of the domestic economy.

  6. Rainfall prediction methodology with binary multilayer perceptron neural networks

    Science.gov (United States)

    Esteves, João Trevizoli; de Souza Rolim, Glauco; Ferraudo, Antonio Sergio

    2018-05-01

    Precipitation, over short periods of time, is a phenomenon associated with high levels of uncertainty and variability. Given its nature, traditional forecasting techniques are expensive and computationally demanding. This paper presents a soft computing technique to forecast the occurrence of rainfall over short time ranges with artificial neural networks (ANNs), in accumulated periods of 3 to 7 days for each climatic season, avoiding the need to predict its amount. The intent is to reduce the variance and raise the bias of the data, and to lighten the load on quantitative models by acting as a filter that removes the many zero rainfall values, which bias such models and reduce their performance. The models were developed with time series from ten agriculturally relevant regions in Brazil; these places have the longest available weather time series and are the most deficient in accurate climate predictions. Sixty years of daily mean air temperature and accumulated precipitation were available and were used to estimate potential evapotranspiration and the water balance; these were the variables used as inputs for the ANN models. The mean accuracy of the model over all the accumulated periods was 78% in summer, 71% in winter, 62% in spring and 56% in autumn. It was identified that the effect of continentality, the effect of altitude and the volume of normal precipitation have a direct impact on the accuracy of the ANNs. The models peak in performance in well-defined seasons but lose accuracy in transitional seasons and in places under the influence of macro-climatic and mesoclimatic effects, which indicates that this technique can be used to indicate the imminence of rainfall, with some limitations.
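
    The preprocessing step of converting a daily rainfall series into binary occurrence targets for accumulated periods can be sketched as follows (the daily values are invented; the paper's actual pipeline also feeds temperature-derived water-balance inputs to the ANNs):

```python
def occurrence_targets(daily_rain, period):
    """1 if any rain accumulates over each `period`-day window, else 0.
    Predicting this binary label sidesteps regressing on the many zero values."""
    targets = []
    for i in range(len(daily_rain) - period + 1):
        window = daily_rain[i: i + period]
        targets.append(1 if sum(window) > 0.0 else 0)
    return targets

daily = [0.0, 0.0, 4.2, 0.0, 0.0, 0.0, 0.0, 11.0, 3.1]  # mm, illustrative
targets_3day = occurrence_targets(daily, 3)
```

A classifier trained on such targets then acts as the "filter" the abstract describes, gating whether a quantitative rainfall-amount model needs to run at all.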

  7. An evaluation of the reliability and usefulness of external-initiator PRA [probabilistic risk analysis] methodologies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.

    1990-01-01

    The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally ''mature,'' and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab

  8. An evaluation of the reliability and usefulness of external-initiator PRA (probabilistic risk analysis) methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Budnitz, R.J.; Lambert, H.E. (Future Resources Associates, Inc., Berkeley, CA (USA))

    1990-01-01

    The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally ''mature,'' and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab.

  9. A reliability as an independent variable (RAIV) methodology for optimizing test planning for liquid rocket engines

    Science.gov (United States)

    Strunz, Richard; Herrmann, Jeffrey W.

    2011-12-01

    The hot fire test strategy for liquid rocket engines has always been a concern of space industry and agencies alike because no recognized standard exists. Previous hot fire test plans focused on the verification of performance requirements but did not explicitly include reliability as a dimensioning variable. The stakeholders are, however, concerned about a hot fire test strategy that balances reliability, schedule, and affordability. A multiple-criteria test planning model is presented that provides a framework for optimizing the hot fire test strategy with respect to stakeholder concerns. The Staged Combustion Rocket Engine Demonstrator, a program of the European Space Agency, is used as an example to provide a quantitative answer to the claim that a reduced-thrust-scale demonstrator is cost beneficial for a subsequent flight engine development. Scalability aspects of major subsystems are considered in the prior information definition within the Bayesian framework. The model is also applied to assess the impact of an increase in the demonstrated reliability level on schedule and affordability.

  10. Study on seismic reliability for foundation grounds and surrounding slopes of nuclear power plants. Proposal of evaluation methodology and integration of seismic reliability evaluation system

    International Nuclear Information System (INIS)

    Ohtori, Yasuki; Kanatani, Mamoru

    2006-01-01

This paper proposes an evaluation methodology for the annual probability of failure of soil structures subjected to earthquakes and integrates an analysis system for the seismic reliability of soil structures. The method is based on margin analysis, which evaluates the ground motion level at which the structure is damaged. First, a ground motion index that is strongly correlated with the damage or response of the specific structure is selected. The ultimate strength in terms of the selected ground motion index is then evaluated. Next, the variation of soil properties is taken into account in the evaluation of the seismic stability of structures. The variation of the safety factor (SF) is evaluated and then converted into the variation of the specific ground motion index. Finally, the fragility curve is developed and the annual probability of failure is evaluated by combining it with the seismic hazard curve. The system facilitates the assessment of seismic reliability. A random number generator, a dynamic analysis program and a stability analysis program are incorporated into one package. Once a structural model, the distribution of soil properties, input ground motions and so forth are defined, a list of safety factors for each sliding line is obtained. The Monte Carlo simulation (MCS), Latin hypercube sampling (LHS), point estimation method (PEM) and first-order second-moment (FOSM) techniques implemented in this system are also introduced. As numerical examples, a ground foundation and a surrounding slope are assessed using the proposed method and the integrated system. (author)
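The final step of the methodology in this record, combining a fragility curve with a seismic hazard curve to obtain an annual probability of failure, can be sketched numerically. This is a minimal illustration assuming a lognormal fragility and a power-law hazard curve; the function names and all parameter values are illustrative, not taken from the record.

```python
import math

def lognormal_fragility(a, median, beta):
    # P(failure | ground motion = a): lognormal CDF with median capacity
    # `median` (in g) and logarithmic standard deviation `beta`
    return 0.5 * (1.0 + math.erf(math.log(a / median) / (beta * math.sqrt(2.0))))

def annual_failure_probability(hazard, median, beta):
    # hazard: list of (pga, annual exceedance frequency), sorted by pga ascending;
    # the convolution sums fragility * (occurrence frequency of each PGA bin)
    pf = 0.0
    for (a0, h0), (a1, h1) in zip(hazard, hazard[1:]):
        a_mid = 0.5 * (a0 + a1)
        pf += lognormal_fragility(a_mid, median, beta) * (h0 - h1)
    return pf

# hypothetical hazard curve: exceedance frequency falls off as PGA (in g) rises
hazard = [(0.1 * i, 1e-2 * (0.1 / (0.1 * i)) ** 2) for i in range(1, 21)]
pf = annual_failure_probability(hazard, median=0.8, beta=0.4)
```

The same discretized convolution works with a fragility curve estimated empirically from MCS or LHS safety-factor samples instead of the assumed lognormal form.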

  11. Reliable prediction and determination of Norwegian lamb carcass composition and value

    International Nuclear Information System (INIS)

    Kongsro, Jørgen

    2008-01-01

The main objective of this work was to study prediction and determination of Norwegian lamb carcass composition with different techniques spanning from subjective appraisal to computer-intensive methods. There is an increasing demand, both from farmers and processors of meats, for a more objective and reliable system for prediction of muscle (lean meat), fat, bone and value of a lamb carcass. When introducing new technologies for determination of lamb carcass composition, the reference method used for calibration must be precise and reliable. The precision and reliability of the current dissection reference for lamb carcass classification and grading has never been quantified. A poor reference method will not benefit even the most optimal system for prediction and determination of lamb carcasses. To help achieve reliable systems, the uncertainty or errors in the reference method and measuring systems need to be quantified. Using proper calibration methods for the measuring systems, the uncertainty and modeling power can be determined for lamb carcasses. The results of the work presented in this thesis show that the current classification system using subjective appraisal (EUROP) is reliable; however, the accuracy with respect to carcass composition, especially for lean meat or muscle and carcass value, is poor. The reference method used for determining lamb carcass composition with respect to lamb carcass classification and grading is precise and reliable for carcass composition. For the composition and yield of sub-primal cuts, the reliability varied, and was especially poor for the breast cut. Further attention is needed for jointing and cutting of sub-primals to achieve even higher precision and reliability of the reference method. As an alternative to butcher or manual dissection, Computer Tomography (CT) showed promising results with respect to prediction of lamb carcass composition. This method is nicknamed “virtual dissection”. By utilizing the

  12. Reliability of computerized cephalometric outcome predictions of mandibular set-back surgery

    Directory of Open Access Journals (Sweden)

    Stefanović Neda

    2011-01-01

Introduction. A successful treatment outcome in dentofacial deformity patients commonly requires combined orthodontic-surgical therapy. This enables us to overcome functional, aesthetic and psychological problems. Since most patients state aesthetics as the primary motive for seeking therapy, cephalometric predictions of treatment outcome have become an essential part of treatment planning, especially in combined orthodontic-surgical cases. Objective. The aim of this study was to evaluate the validity and reliability of computerized orthognathic surgery outcome predictions generated using the Nemotec Dental Studio NX 2005 software. Methods. The sample of the study consisted of 31 patients diagnosed with mandibular prognathism who were surgically treated at the Hospital for Maxillofacial Surgery in Belgrade. The investigation was done on lateral cephalograms made before and after surgical treatment. Cephalograms were digitized and analyzed using computer software. According to measurements made on superimposed pre- and postsurgical cephalograms, the patients were re-treated within the software and the predictions were assessed by measuring seven angular and three linear parameters. Prediction measurements were then compared with the actual outcome. Results. The results showed statistically significant differences between posttreatment and predicted values for parameters referring to lower lip and mentolabial sulcus position. Conclusion. Computerized cephalometric predictions for hard-tissue structures in the sagittal and vertical planes, as well as the VTO parameters, generated using the Nemotec Dental Studio NX 2005 software are reliable, while lower lip and mentolabial sulcus position predictions are not reliable enough.

  13. Test-retest reliability and predictive validity of the Implicit Association Test in children.

    Science.gov (United States)

    Rae, James R; Olson, Kristina R

    2018-02-01

The Implicit Association Test (IAT) is increasingly used in developmental research despite minimal evidence of whether children's IAT scores are reliable across time or predictive of behavior. When test-retest reliability and predictive validity have been assessed, the results have been mixed, and because these studies have differed on many factors simultaneously (lag-time between testing administrations, domain, etc.), it is difficult to discern what factors may explain variability in existing test-retest reliability and predictive validity estimates. Across five studies (total N = 519; ages 6 to 11 years old), we manipulated two factors that have varied in previous developmental research: lag-time and domain. An internal meta-analysis of these studies revealed that, across three different methods of analyzing the data, mean test-retest (rs of .48, .38, and .34) and predictive validity (rs of .46, .20, and .10) effect sizes were significantly greater than zero. While lag-time did not moderate the magnitude of test-retest coefficients, whether we observed domain differences in test-retest reliability and predictive validity estimates was contingent on other factors, such as how we scored the IAT or whether we included estimates from a unique sample (i.e., a sample containing gender typical and gender diverse children). Recommendations are made for developmental researchers that utilize the IAT in their research.

  14. Prediction of health risks from accidents: A comprehensive assessment methodology

    International Nuclear Information System (INIS)

    MacFarlane, D.R.; Yuan, Y.C.

    1992-01-01

We have developed two computer programs to predict radiation risks to individuals and/or the collective population from exposures to accidental releases of radioactive materials. When used together, these two codes provide a consistent, comprehensive tool to estimate not only the risks to specific individuals but also the distribution of risks in the exposed population and the total number of individuals within a specific level of risk. Prompt and latent fatalities are estimated for the exposed population, and from these, the risk to an average individual can be derived. Uncertainty in weather conditions is considered by estimating both the "median" and the "maximum" population doses based on the frequency distribution of wind speeds and stabilities for a given site. The importance of including all dispersible particles (particles smaller than about 100 μm) in dose and health risk analyses of nonfiltered releases, for receptor locations within about 10 km of a release, has been investigated. The dose contribution of the large particles (> 10 μm) has been shown to be substantially greater than that of the small particles for dose receptors under various release and exposure conditions. These conditions include, particularly, elevated releases, strong-wind weather, and exposure pathways associated with ground-deposited material over extended periods of time.

  15. Predicting Flow Breakdown Probability and Duration in Stochastic Network Models: Impact on Travel Time Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston

    2011-01-01

This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model, by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that its duration can be characterized by a hazard model. By generating random flow breakdown at various levels and capturing the traffic characteristics at the onset of the breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
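The record does not give the functional forms used in the paper, but its two ingredients, a breakdown probability that grows with flow rate and a hazard-based duration model, can be sketched as follows. The logistic probability, the Weibull duration, and every parameter value here are assumptions for illustration only.

```python
import math
import random

def breakdown_probability(flow_rate, q50=1800.0, scale=150.0):
    # assumed logistic form: chance of breakdown rises with flow rate (veh/h/lane)
    return 1.0 / (1.0 + math.exp(-(flow_rate - q50) / scale))

def breakdown_duration(rng, shape=1.5, scale_min=12.0):
    # assumed Weibull duration model, sampled by inverting the survival function
    return scale_min * (-math.log(1.0 - rng.random())) ** (1.0 / shape)

def simulate_corridor(rng, flows, dt_min=5.0):
    # total minutes spent in breakdown over a profile of 5-minute flow observations
    total = 0.0
    for q in flows:
        if rng.random() < breakdown_probability(q) * dt_min / 60.0:
            total += breakdown_duration(rng)
    return total

rng = random.Random(42)
flows = [1200 + 50 * i for i in range(24)]  # rising demand, veh/h/lane
minutes = simulate_corridor(rng, flows)
```

Running such a simulation many times over a network yields the travel-time distribution from which reliability measures are computed.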

  16. Natural Circulation in Water Cooled Nuclear Power Plants Phenomena, models, and methodology for system reliability assessments

    Energy Technology Data Exchange (ETDEWEB)

    Jose Reyes

    2005-02-14

In recent years it has been recognized that the application of passive safety systems (i.e., those whose operation takes advantage of natural forces such as convection and gravity) can contribute to simplification and potentially to improved economics of new nuclear power plant designs. In 1991 the IAEA Conference on "The Safety of Nuclear Power: Strategy for the Future" noted that for new plants "the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate".

  17. Natural Circulation in Water Cooled Nuclear Power Plants Phenomena, models, and methodology for system reliability assessments

    International Nuclear Information System (INIS)

    Jose Reyes

    2005-01-01

In recent years it has been recognized that the application of passive safety systems (i.e., those whose operation takes advantage of natural forces such as convection and gravity) can contribute to simplification and potentially to improved economics of new nuclear power plant designs. In 1991 the IAEA Conference on "The Safety of Nuclear Power: Strategy for the Future" noted that for new plants "the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate".

  18. Development of equipment reliability process using predictive technologies at Hamaoka Nuclear Power Station

    International Nuclear Information System (INIS)

    Taniguchi, Yuji; Sakuragi, Futoshi; Hamada, Seiichi

    2014-01-01

The development of an equipment reliability (ER) process, specifically a condition-based maintenance (CBM) process integrating predictive maintenance (PdM) technologies, at Hamaoka Nuclear Power Station is introduced in this paper. The integration of predictive maintenance technologies such as vibration analysis, oil analysis and thermal monitoring is important for establishing strong maintenance strategies and for directing specific technical development. In addition, a practical example of CBM is also presented to demonstrate the advantages of the approach. (author)

  19. Reliability analysis and prediction of mixed mode load using Markov Chain Model

    International Nuclear Information System (INIS)

    Nikabdullah, N.; Singh, S. S. K.; Alebrahim, R.; Azizi, M. A.; K, Elwaleed A.; Noorani, M. S. M.

    2014-01-01

The aim of this paper is to present the reliability analysis and prediction of mixed-mode loading by using a simple two-state Markov Chain Model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analyzing and measuring failure, in order to increase the design life and to eliminate or reduce the likelihood of failures and safety risk. The mechanical failures of the crankshaft are due to high stress concentrations arising from high-cycle rotating bending and torsional stresses. The Markov Chain was used to model the two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is much more severe than torsional stress; therefore the probability criterion for the bending state is higher than for the torsion state. A statistical comparison between the developed Markov Chain Model and field data was done to observe the percentage of error. The reliability analysis and prediction derived from the Markov Chain Model are illustrated with the Weibull probability and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov Chain Model is able to generate data closely matching the field data with a minimal percentage of error, and for practical application the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed-mode loading
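A two-state Markov model of this kind can be sketched with a short simulation: the chain alternates between a bending-dominated and a torsion-dominated state, with a higher per-cycle failure probability in the bending state, and an empirical reliability curve is read off the simulated lives. All transition and failure probabilities below are illustrative assumptions, not values from the paper.

```python
import random

# Two states: bending-dominated and torsion-dominated loading.
# Transition probabilities and per-cycle failure probabilities are illustrative;
# bending is assumed more severe, as the record states.
P_TRANS = {"bending": {"bending": 0.9, "torsion": 0.1},
           "torsion": {"bending": 0.3, "torsion": 0.7}}
P_FAIL = {"bending": 1e-3, "torsion": 2.5e-4}

def cycles_to_failure(rng, max_cycles=50_000):
    # simulate one component: each load cycle may fail, then the state transitions
    state = "bending"
    for n in range(1, max_cycles + 1):
        if rng.random() < P_FAIL[state]:
            return n
        state = "bending" if rng.random() < P_TRANS[state]["bending"] else "torsion"
    return max_cycles

def reliability_curve(samples, n):
    # empirical R(n): fraction of simulated components surviving past n cycles
    return sum(1 for s in samples if s > n) / len(samples)

rng = random.Random(7)
samples = [cycles_to_failure(rng) for _ in range(1000)]
r_1000 = reliability_curve(samples, 1_000)
r_4000 = reliability_curve(samples, 4_000)
```

Fitting a Weibull distribution to the simulated lives would reproduce the probability, hazard-rate and bathtub plots the paper describes.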

  20. A Methodology and Toolkit for Deploying Reliable Security Policies in Critical Infrastructures

    Directory of Open Access Journals (Sweden)

    Faouzi Jaïdi

    2018-01-01

Substantial advances in Information and Communication Technologies (ICT) bring out novel concepts, solutions, trends, and challenges for integrating intelligent and autonomous systems in critical infrastructures. A new generation of ICT environments (such as smart cities, Internet of Things, edge-fog-social-cloud computing, and big data analytics) is emerging; it has different applications to critical domains (such as transportation, communication, finance, commerce, and healthcare) and different interconnections via multiple layers of public and private networks, forming a grid of critical cyberphysical infrastructures. Protecting sensitive and private data and services in critical infrastructures is, at the same time, a main objective and a great challenge for deploying secure systems. It essentially requires setting up trusted security policies. Unfortunately, security solutions should remain compliant and be regularly updated to follow and track the evolution of security threats. To address this issue, we propose an advanced methodology for deploying and monitoring the compliance of trusted access control policies. Our proposal extends the traditional life cycle of access control policies with pertinent activities. It integrates formal and semiformal techniques allowing the specification, the verification, the implementation, the reverse-engineering, the validation, the risk assessment, and the optimization of access control policies. To automate and facilitate the practice of our methodology, we introduce our system SVIRVRO that allows managing the extended life cycle of access control policies. We refer to an illustrative example to highlight the relevance of our contributions.

  1. Life Cycle Assessment for desalination: a review on methodology feasibility and reliability.

    Science.gov (United States)

    Zhou, Jin; Chang, Victor W-C; Fane, Anthony G

    2014-09-15

As concerns of natural resource depletion and environmental degradation caused by desalination increase, studies of the environmental sustainability of desalination are growing in importance. Life Cycle Assessment (LCA) is an ISO-standardized method and is widely applied to evaluate the environmental performance of desalination. This study reviews more than 30 desalination LCA studies since the 2000s and identifies two major issues in need of improvement. The first is feasibility, covering three elements that support the application of LCA to desalination: accounting methods, supporting databases, and life cycle impact assessment approaches. The second is reliability, addressing three essential aspects that drive uncertainty in results: the incompleteness of the system boundary, the unrepresentativeness of the database, and the omission of uncertainty analysis. This work can serve as a preliminary LCA reference for desalination specialists, and will also strengthen LCA as an effective method to evaluate the environmental footprint of desalination alternatives.

  2. An advanced human reliability analysis methodology: analysis of cognitive errors focused on

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

Conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticized for their deficiency in analyzing the cognitive errors which occur during an operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called 2nd-generation HRA method, including both qualitative analysis and quantitative assessment of cognitive errors, is being developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between the cognitive functions and the performance influencing factors. The application of the proposed method to two emergency operation tasks is presented.

  3. A note on the application of probabilistic structural reliability methodology to nuclear power plants

    International Nuclear Information System (INIS)

    Maurer, H.A.

    1978-01-01

The interest shown in the general prospects of primary energy in European countries prompted a description of the actual European situation. Explanations of the need for installation of nuclear power plants in most countries of the European Communities are given. Activities of the Commission of the European Communities to initiate a progressive harmonization of already existing European criteria, codes and complementary requirements, in order to improve the structural reliability of components and systems of nuclear power plants, are summarized. Finally, the applicability of a probabilistic safety analysis to facilitate decision-making on safety, by defining acceptable target and limit values coupled with a subjective estimate, as applied in the safety analyses performed in most European countries, is demonstrated. (Auth.)

  4. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Ha, J. H.; Kim, M. K.; Chung, B. S.; Oh, H. C.; Seo, M. R.

    2007-01-01

Analog I and C systems have been replaced by digital I and C systems because the digital systems have many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drift, have higher data handling and storage capabilities, and provide improved performance through their accuracy and computational capabilities. In addition, analog replacement parts have become more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into nuclear power plants, because digital systems are more complex than analog systems and their operation and failure modes are different. In particular, software, which can be the core of functionality in digital systems, does not wear out physically like hardware, and its failure modes are not yet clearly defined. Thus, research to develop methodologies for software reliability assessment is still proceeding in safety-critical areas such as nuclear systems, aerospace and medical devices. Among them, a software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified and requirements for its application to digital I and C systems are considered in this study.

  5. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    DEFF Research Database (Denmark)

    Petersen, Bent; Petersen, Thomas Nordahl; Andersen, Pernille

    2009-01-01

The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability ... comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0.79 and 0.74 are obtained using our method and the compared method, respectively. This tendency holds for any selected subset.

  6. Reliability: How much is it worth? Beyond its estimation or prediction, the (net) present value of reliability

    International Nuclear Information System (INIS)

    Saleh, J.H.; Marais, K.

    2006-01-01

In this article, we link an engineering concept, reliability, to a financial and managerial concept, net present value, by exploring the impact of a system's reliability on its revenue generation capability. The framework developed here for non-repairable systems quantitatively captures the value of reliability from a financial standpoint. We show that traditional present value calculations of engineering systems do not account for system reliability, and thus over-estimate a system's worth, which can lead to flawed investment decisions. It is therefore important to involve reliability engineers upfront, before investment decisions are made in technical systems. In addition, the analyses developed here help designers identify the optimal level of reliability that maximizes a system's net present value: the financial value reliability provides to the system minus the cost of achieving that level of reliability. Although we recognize that there are numerous considerations driving the specification of an engineering system's reliability, we contend that the financial analysis of reliability developed here should be made available to decision-makers to support in part, or at least be factored into, the specification of system reliability.
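The trade-off this record describes, the financial value reliability provides minus the cost of achieving it, can be sketched for a non-repairable system with an exponential reliability function R(t) = exp(-λt). The revenue rate, discount rate, cost model, and all parameter values below are illustrative assumptions, not the paper's.

```python
import math

def expected_pv_revenue(rate_per_year, failure_rate, discount_rate, horizon_years):
    # E[PV] = ∫ c * R(t) * e^(-r t) dt over the horizon, with R(t) = exp(-λ t):
    # discounting and unreliability combine into a single decay constant k = r + λ
    k = discount_rate + failure_rate
    return rate_per_year * (1.0 - math.exp(-k * horizon_years)) / k

def reliability_cost(failure_rate, base=1.0, alpha=0.05):
    # assumed cost model: driving the failure rate down gets increasingly expensive
    return base + alpha / failure_rate

def net_present_value(rate_per_year, failure_rate, discount_rate, horizon_years):
    return (expected_pv_revenue(rate_per_year, failure_rate,
                                discount_rate, horizon_years)
            - reliability_cost(failure_rate))

# sweep failure rates to locate the NPV-maximizing reliability level
candidates = [0.01 * i for i in range(1, 51)]
best = max(candidates, key=lambda lam: net_present_value(1.0, lam, 0.08, 15.0))
```

With these numbers the sweep lands at an interior optimum: too much reliability costs more than the extra revenue it protects, too little loses revenue early.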

  7. Reliability Prediction Of System And Component Of Process System Of RSG-GAS Reactor

    International Nuclear Information System (INIS)

    Sitorus Pane, Jupiter

    2001-01-01

The older the reactor, the higher the probability that its systems and components suffer loss of function or degradation. This phenomenon occurs because of wear, corrosion, and fatigue. Studies on component reliability have generally been performed deterministically and statistically. This paper describes an analysis using a statistical method, Cox regression, to predict the reliability of the components and the influence of environmental factors. The results showed that dynamic, non-safety-related, and mechanical components have a higher risk of failure, whereas static, safety-related, and electrical components have a lower risk of failure. The relative risk values for the variables component dynamics, quality, dummy 1 and dummy 2 are 1.54, 1.59, 1.50, and 0.83, respectively, compared to the other component types for each variable. Components with higher risk have lower reliability than those with lower risk.
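In a Cox proportional hazards model, a relative risk of 1.54 for dynamic components means their hazard is 1.54 times the baseline. A minimal sketch of how such a coefficient translates into a hazard ratio and a survival probability; the baseline survival value is an assumption, and only the 1.54 figure comes from the record.

```python
import math

def relative_risk(beta):
    # Cox model: h(t | x) = h0(t) * exp(beta * x), so exp(beta) is the hazard ratio
    return math.exp(beta)

def survival(baseline_survival, beta, x):
    # proportional hazards imply S(t | x) = S0(t) ** exp(beta * x)
    return baseline_survival ** math.exp(beta * x)

# coefficient for a "dynamic component" indicator (x = 1 dynamic, x = 0 static),
# chosen so the hazard ratio matches the 1.54 reported in the record
beta_dynamic = math.log(1.54)
s0 = 0.90  # assumed baseline survival probability at some reference time
s_dynamic = survival(s0, beta_dynamic, 1)
```

The lower survival for dynamic components mirrors the record's conclusion that higher-risk components have lower reliability.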

  8. Reliability of self-reported childhood physical abuse by adults and factors predictive of inconsistent reporting.

    Science.gov (United States)

    McKinney, Christy M; Harris, T Robert; Caetano, Raul

    2009-01-01

Little is known about the reliability of self-reported childhood physical abuse (CPA) or CPA reporting practices. We estimated the reliability and prevalence of self-reported CPA and identified factors predictive of inconsistent CPA reporting among 2,256 participants, using surveys administered in 1995 and 2000. Reliability of CPA was fair to moderate (kappa = 0.41). Using a positive report from either survey, the prevalence of moderate (61.8%) and severe (12.0%) CPA was higher than at either survey alone. Compared to consistent reporters of having experienced CPA, inconsistent reporters were less likely to be > or = 30 years old (vs. 18-29) or Black (vs. White) and more likely to have reported one type (vs. > or = 2) of CPA. These findings may assist researchers conducting and interpreting studies of CPA.
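The kappa statistic quoted in this record measures agreement between the two survey waves beyond what chance would produce. A minimal computation of Cohen's kappa from a 2x2 agreement table; the counts below are hypothetical, not the study's data.

```python
def cohens_kappa(table):
    # table[i][j]: count of respondents answering category i in wave 1 and j in wave 2
    n = sum(sum(row) for row in table)
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    # chance agreement from the marginal distributions of the two waves
    p_chance = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2
    return (p_observed - p_chance) / (1.0 - p_chance)

# hypothetical yes/no CPA report counts across the two survey waves
table = [[300, 150],
         [120, 430]]
kappa = cohens_kappa(table)
```

A value around 0.4 to 0.5, as here, falls in the "fair to moderate" band the record reports.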

  9. Reliability and Validity of the Load-Velocity Relationship to Predict the 1RM Back Squat.

    Science.gov (United States)

    Banyard, Harry G; Nosaka, Kazunori; Haff, G Gregory

    2017-07-01

Banyard, HG, Nosaka, K, and Haff, GG. Reliability and validity of the load-velocity relationship to predict the 1RM back squat. J Strength Cond Res 31(7): 1897-1904, 2017. This study investigated the reliability and validity of the load-velocity relationship to predict the free-weight back squat one repetition maximum (1RM). Seventeen strength-trained males performed three 1RM assessments on 3 separate days. All repetitions were performed to full depth with maximal concentric effort. Predicted 1RMs were calculated by entering the mean concentric velocity of the 1RM (V1RM) into an individualized linear regression equation, which was derived from the load-velocity relationship of 3 (20, 40, 60% of 1RM), 4 (20, 40, 60, 80% of 1RM), or 5 (20, 40, 60, 80, 90% of 1RM) incremental warm-up sets. The actual 1RM (140.3 ± 27.2 kg) was very stable between 3 trials (ICC = 0.99; SEM = 2.9 kg; CV = 2.1%; ES = 0.11). Predicted 1RM from 5 warm-up sets up to and including 90% of 1RM was the most reliable (ICC = 0.92; SEM = 8.6 kg; CV = 5.7%; ES = -0.02) and valid (r = 0.93; SEE = 10.6 kg; CV = 7.4%; ES = 0.71) of the predicted 1RM methods. However, all predicted 1RMs were significantly different (p ≤ 0.05; ES = 0.71-1.04) from the actual 1RM. Individual variation for the actual 1RM was small between trials, ranging from -5.6 to 4.8%, compared with the most accurate predictive method up to 90% of 1RM, which was more variable (-5.5 to 27.8%). Importantly, the V1RM (0.24 ± 0.06 m·s⁻¹) was unreliable between trials (ICC = 0.42; SEM = 0.05 m·s⁻¹; CV = 22.5%; ES = 0.14). The load-velocity relationship for the full depth free-weight back squat showed moderate reliability and validity but could not accurately predict 1RM, which was stable between trials. Thus, the load-velocity relationship 1RM prediction method used in this study cannot accurately modify sessional training loads because of large V1RM variability.
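The individualized prediction scheme this record describes, a linear load-velocity regression extrapolated to the velocity at 1RM, can be sketched as follows. The warm-up loads, velocities, and V1RM value below are hypothetical numbers for one athlete; only the overall procedure follows the record.

```python
def linear_fit(xs, ys):
    # ordinary least squares for y = a + b * x, returned as (a, b)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def predict_1rm(velocities, loads, v1rm):
    # regress load on mean concentric velocity across the warm-up sets,
    # then extrapolate the line down to the velocity expected at 1RM
    a, b = linear_fit(velocities, loads)
    return a + b * v1rm

# hypothetical warm-up data (loads in kg at 20-90% of a 140 kg 1RM; velocity in m/s)
loads = [28.0, 56.0, 84.0, 112.0, 126.0]
velocities = [1.10, 0.95, 0.76, 0.55, 0.40]
estimated_1rm = predict_1rm(velocities, loads, v1rm=0.24)
```

Note that with these plausible numbers the extrapolation overshoots the true 140 kg, consistent with the record's finding that predicted 1RMs differed significantly from the actual 1RM.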

  10. Prediction Methodology for Proton Single Event Burnout: Application to a STRIPFET Device

    CERN Document Server

    Siconolfi, Sara; Oser, Pascal; Spiezia, Giovanni; Hubert, Guillaume; David, Jean-Pierre

    2015-01-01

This paper presents a single event burnout (SEB) sensitivity characterization for power MOSFETs, independent of tests, through a prediction model derived from TCAD analysis and knowledge of the device topology. The methodology is applied to a STRIPFET device and compared to proton data obtained at PSI, showing good agreement in the order of magnitude of the proton SEB cross section, and thus validating the prediction model as an alternative device characterization with respect to SEB.

  11. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

Predictive methodologies for testing expected returns models are widely diffused in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data are concentrated only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps – time-series and cross-section regressions – with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model as compared to the 3-factor model, and the superiority of the 3-factor model as compared to the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects seem not to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor for the explanation of expected returns. These findings raise some questions, mainly due to the originality of the methodology in the local market and the fact that this subject is still incipient and polemic in the Brazilian academic environment.
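The two-step predictive procedure this record refers to (Fama and MacBeth, 1973) runs one cross-sectional regression of returns on factor exposures per period, then tests whether the time series of estimated premia differs from zero, using the dispersion of that time series as the standard error. A minimal single-factor sketch on simulated data; the asset count, period count, and true premium are illustrative.

```python
import math
import random

def cross_sectional_slope(xs, ys):
    # OLS slope of one period's returns on a single characteristic (with intercept)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def fama_macbeth(characteristics, returns_by_period):
    # step 2: one cross-sectional regression per period, then a t-statistic
    # from the time series of estimated premia (Fama-MacBeth standard errors)
    premia = [cross_sectional_slope(characteristics, rets)
              for rets in returns_by_period]
    t = len(premia)
    mean = sum(premia) / t
    var = sum((g - mean) ** 2 for g in premia) / (t - 1)
    return mean, mean / math.sqrt(var / t)

# simulated panel: 25 assets over 120 periods, true premium 0.5% per unit of beta
rng = random.Random(0)
betas = [0.5 + 0.05 * i for i in range(25)]
panel = [[0.005 * b + rng.gauss(0.0, 0.02) for b in betas] for _ in range(120)]
premium, t_stat = fama_macbeth(betas, panel)
```

In the full procedure the betas themselves come from first-step time-series regressions rather than being given, and the cross-sectional step uses several factors at once.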

  12. European Network of Excellence on NPP residual lifetime prediction methodologies (NULIFE)

    International Nuclear Information System (INIS)

    Badea, M.; Vidican, D.

    2006-01-01

    Within Europe massive investments in nuclear power have been made to meet present and future energy needs. The majority of nuclear reactors have been operating for longer than 20 years and their continuing safe operation depends crucially on effective lifetime management. Furthermore, to extend the economic return on investment and environmental benefits, it is necessary to ensure in advance the safe operation of nuclear reactors for 60 years, a period which is typically 20 years in excess of nominal design life. This depends on a clear understanding of, and predictive capability for, how safety margins may be maintained as components degrade under operational conditions. Ageing mechanisms, environment effects and complex loadings increase the likelihood of damage to safety relevant systems, structures and components. The ability to claim increased benefits from reduced conservatism via improved assessments is therefore of great value. Harmonisation and qualification are essential for industrial exploitation of approaches developed for life prediction methodology. Several European organisations and networks have been at the forefront of the development of advanced methodologies in this area. However, these efforts have largely been made at national level and their overall impact and benefit (in comparison to the situation in the USA) has been reduced by fragmentation. There is a need to restructure the networking approach in order to create a single organisational entity capable of working at European level to produce and exploit R and D in support of the safe and competitive operation of nuclear power plants. It is also critical to ensure the competitiveness of European plant life management (PLIM) services at international level, in particular with the USA and Asian countries. 
To address the above challenges, the European Network of Excellence on residual lifetime prediction methodologies (NULIFE) will: - Create a Europe-wide body in order to achieve scientific and

  13. The application of cognitive models to the evaluation and prediction of human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.; Reason, J.T.

    1986-01-01

    The first section of the paper provides a brief overview of a number of important principles relevant to human reliability modeling that have emerged from cognitive models, and presents a synthesis of these approaches in the form of a Generic Error Modeling System (GEMS). The next section illustrates the application of GEMS to some well known nuclear power plant (NPP) incidents in which human error was a major contributor. The way in which design recommendations can emerge from analyses of this type is illustrated. The third section describes the use of cognitive models in the classification of human errors for prediction and data collection purposes. The final section addresses the predictive modeling of human error as part of human reliability assessment in Probabilistic Risk Assessment

  14. General Inattentiveness Is a Long-Term Reliable Trait Independently Predictive of Psychological Health

    DEFF Research Database (Denmark)

    Jensen, Christian Gaden; Niclasen, Janni; Vangkilde, Signe

    2016-01-01

The Mindful Attention Awareness Scale (MAAS) measures perceived degree of inattentiveness in different contexts and is often used as a reversed indicator of mindfulness. MAAS is hypothesized to reflect a psychological trait or disposition when used outside attentional training contexts, but the long-term test-retest reliability of MAAS scores is virtually untested. It is unknown whether MAAS predicts psychological health after controlling for standardized socioeconomic status classifications. First, MAAS translated to Danish was validated psychometrically within a randomly invited healthy adult community sample (N = 490). Factor analysis confirmed that MAAS scores quantified a unifactorial construct of excellent composite reliability and consistent convergent validity. Structural equation modeling revealed that MAAS scores contributed independently to predicting psychological distress...

  15. Reliability of didactic grades to predict practical skills in an undergraduate dental college in Saudi Arabia.

    Science.gov (United States)

    Zawawi, Khalid H; Afify, Ahmed R; Yousef, Mohammed K; Othman, Hisham I; Al-Dharrab, Ayman A

    2015-01-01

This longitudinal study aimed to investigate the association between didactic grades and practical skills for dental students and whether didactic grades can reliably predict the dental students' practical performance. Didactic and practical grades for graduates from the Faculty of Dentistry, King Abdulaziz University, between the years 2009 and 2011 were collected. Four courses were selected: Dental Anatomy, Operative Dentistry, Prosthodontics, and Orthodontics. Pearson product-moment correlation analyses between didactic and practical scores were conducted. A significant correlation between didactic and practical scores was found only for the Dental Anatomy course; practical scores were significantly higher than didactic scores, and there were no other significant correlations between didactic and practical scores across subjects. Based on the findings of this study, the relationship between didactic grades and practical performance is course specific. Didactic grades do not reliably predict the students' practical skills. Measuring practical performances should be independent from didactic grading.

  16. A dynamic particle filter-support vector regression method for reliability prediction

    International Nuclear Information System (INIS)

    Wei, Zhao; Tao, Tao; ZhuoShu, Ding; Zio, Enrico

    2013-01-01

    Support vector regression (SVR) has been applied to time series prediction and some works have demonstrated the feasibility of its use to forecast system reliability. For accuracy of reliability forecasting, the selection of SVR's parameters is important. The existing research works on SVR's parameters selection divide the example dataset into training and test subsets, and tune the parameters on the training data. However, these fixed parameters can lead to poor prediction capabilities if the data of the test subset differ significantly from those of training. Differently, the novel method proposed in this paper uses particle filtering to estimate the SVR model parameters according to the whole measurement sequence up to the last observation instance. By treating the SVR training model as the observation equation of a particle filter, our method allows updating the SVR model parameters dynamically when a new observation comes. Because of the adaptability of the parameters to dynamic data pattern, the new PF–SVR method has superior prediction performance over that of standard SVR. Four application results show that PF–SVR is more robust than SVR to the decrease of the number of training data and the change of initial SVR parameter values. Also, even if there are trends in the test data different from those in the training data, the method can capture the changes, correct the SVR parameters and obtain good predictions. -- Highlights: •A dynamic PF–SVR method is proposed to predict the system reliability. •The method can adjust the SVR parameters according to the change of data. •The method is robust to the size of training data and initial parameter values. •Some cases based on both artificial and real data are studied. •PF–SVR shows superior prediction performance over standard SVR
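The core idea of the PF–SVR abstract, treating the regression model as the observation equation of a particle filter so that its hyperparameters are re-estimated as each new observation arrives, can be sketched in miniature. The sketch below is a hedged stand-in: a Nadaraya-Watson kernel smoother replaces the SVR to stay dependency-free, its bandwidth plays the role of the filtered hyperparameter, and all data and constants are illustrative, not from the paper.

```python
import math
import random

def nw_predict(xs, ys, x, h):
    """Nadaraya-Watson kernel smoother (stand-in for the SVR model)."""
    ws = [math.exp(-((x - xi) ** 2) / (2.0 * h * h)) for xi in xs]
    s = sum(ws)
    if s == 0.0:
        return sum(ys) / len(ys)
    return sum(w * y for w, y in zip(ws, ys)) / s

def pf_step(particles, xs, ys, x_new, y_new, obs_sigma=0.1, jitter=0.03):
    """One filter step: jitter the bandwidth particles, weight each by the
    likelihood of the newly observed point, then resample."""
    particles = [min(1.0, max(1e-3, p + random.gauss(0.0, jitter)))
                 for p in particles]
    weights = []
    for p in particles:
        err = y_new - nw_predict(xs, ys, x_new, p)
        weights.append(math.exp(-err * err / (2.0 * obs_sigma ** 2)) + 1e-12)
    total = sum(weights)
    weights = [w / total for w in weights]
    # systematic resampling back to equal weights
    cum, acc = [], 0.0
    for w in weights:
        acc += w
        cum.append(acc)
    n, out, i = len(particles), [], 0
    for k in range(n):
        pos = (k + random.random()) / n
        while cum[i] < pos:
            i += 1
        out.append(particles[i])
    return out

random.seed(0)
particles = [random.uniform(0.05, 0.8) for _ in range(200)]
xs, ys = [], []
for t in range(80):
    x = 0.1 * t
    y = math.sin(x) + random.gauss(0.0, 0.1)
    if xs:  # update the hyperparameter belief as each observation arrives
        particles = pf_step(particles, xs, ys, x, y)
    xs.append(x)
    ys.append(y)

h_hat = sum(particles) / len(particles)  # posterior-mean bandwidth
pred = nw_predict(xs, ys, 4.0, h_hat)
```

In the paper's method the SVR training model itself supplies the likelihood and the parameter vector is multidimensional; resampling to equal weights each step merely keeps the sketch short.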

  17. Evaluation of validity and reliability of a methodology for measuring human postural attitude and its relation to temporomandibular joint disorders

    Science.gov (United States)

    Fernández, Ramón Fuentes; Carter, Pablo; Muñoz, Sergio; Silva, Héctor; Venegas, Gonzalo Hernán Oporto; Cantin, Mario; Ottone, Nicolás Ernesto

    2016-01-01

    INTRODUCTION Temporomandibular joint disorders (TMJDs) are caused by several factors such as anatomical, neuromuscular and psychological alterations. A relationship has been established between TMJDs and postural alterations, a type of anatomical alteration. An anterior position of the head requires hyperactivity of the posterior neck region and shoulder muscles to prevent the head from falling forward. This compensatory muscular function may cause fatigue, discomfort and trigger point activation. To our knowledge, a method for assessing human postural attitude in more than one plane has not been reported. Thus, the aim of this study was to design a methodology to measure the external human postural attitude in frontal and sagittal planes, with proper validity and reliability analyses. METHODS The variable postures of 78 subjects (36 men, 42 women; age 18–24 years) were evaluated. The postural attitudes of the subjects were measured in the frontal and sagittal planes, using an acromiopelvimeter, grid panel and Fox plane. RESULTS The method we designed for measuring postural attitudes had adequate reliability and validity, both qualitatively and quantitatively, based on Cohen’s Kappa coefficient (> 0.87) and Pearson’s correlation coefficient (r = 0.824, > 80%). CONCLUSION This method exhibits adequate metrical properties and can therefore be used in further research on the association of human body posture with skeletal types and TMJDs. PMID:26768173

  18. Use of curium neutron flux from head-end pyroprocessing subsystems for the High Reliability Safeguards methodology

    Energy Technology Data Exchange (ETDEWEB)

    Borrelli, R.A., E-mail: r.angelo.borrelli@gmail.com

    2014-10-01

    The deployment of nuclear energy systems (NESs) is expanding around the world. Nations are investing in NESs as a means to establish energy independence, grow national economies, and address climate change. Transitioning to the advanced nuclear fuel cycle can meet growing energy demands and ensure resource sustainability. However, nuclear facilities in all phases of the advanced fuel cycle must be ‘safeguardable,’ where safety, safeguards, and security are integrated into a practical design strategy. To this end, the High Reliability Safeguards (HRS) approach is a continually developing safeguardability methodology that applies intrinsic design features and employs a risk-informed approach for systems assessment that is safeguards-motivated. Currently, a commercial pyroprocessing facility is used as the example system. This paper presents a modeling study that investigates the neutron flux associated with processed materials. The intent of these studies is to determine if the neutron flux will affect facility design, and subsequently, safeguardability. The results presented in this paper are for the head-end subsystems in a pyroprocessing facility. The collective results from these studies will then be used to further develop the HRS methodology.

  19. Predicting Dissertation Methodology Choice among Doctoral Candidates at a Faith-Based University

    Science.gov (United States)

    Lunde, Rebecca

    2017-01-01

    Limited research has investigated dissertation methodology choice and the factors that contribute to this choice. Quantitative research is based in mathematics and scientific positivism, and qualitative research is based in constructivism. These underlying philosophical differences posit the question if certain factors predict dissertation…

  20. Discrete Address Beacon System (DABS) Software System Reliability Modeling and Prediction.

    Science.gov (United States)

    1981-06-01

Service (ATARS) module because of its interim status. Reliability prediction models for software modules were derived and then verified by matching...System (ATCRBS) and thus can be introduced gradually and economically without major operational or procedural change. Since DABS uses monopulse...line analysis tools or are used during maintenance or pre-initialization were not modeled because they are not part of the mission software. The ATARS

  1. Reliability, Validity, and Predictive Utility of the 25-Item Criminogenic Cognitions Scale (CCS)

    OpenAIRE

    Tangney, June Price; Stuewig, Jeffrey; Furukawa, Emi; Kopelovich, Sarah; Meyer, Patrick; Cosby, Brandon

    2012-01-01

Theory, research, and clinical reports suggest that moral cognitions play a role in initiating and sustaining criminal behavior. The 25-item Criminogenic Cognitions Scale (CCS) was designed to tap 5 dimensions: Notions of Entitlement; Failure to Accept Responsibility; Short-Term Orientation; Insensitivity to Impact of Crime; and Negative Attitudes Toward Authority. Results from 552 jail inmates support the reliability, validity, and predictive utility of the measure. The CCS was linked to cri...

  2. A summary of methods of predicting reliability life of nuclear equipment with small samples

    International Nuclear Information System (INIS)

    Liao Weixian

    2000-03-01

Some nuclear equipment is manufactured in small batches, e.g., 1-3 sets. Its service life may be very difficult to determine experimentally for economic and technical reasons. A method combining theoretical analysis with material tests to predict the life of equipment is put forward, based on the fact that equipment consists of parts or elements made of different materials. The whole life of an equipment part consists of the crack forming life (i.e., the fatigue life or the damage accumulation life) and the crack extension life. Methods of predicting machine life are systematically summarized, with emphasis on those which use theoretical analysis to substitute for large-scale prototype experiments. Meanwhile, methods and steps of predicting reliability life are described, taking into consideration the randomness of various variables and parameters in engineering. Finally, the latest advances and trends in machine life prediction are discussed
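The decomposition the abstract describes, total life equal to crack-forming life plus crack-extension life, can be illustrated by numerically integrating Paris' law for the extension part. The material constants and the assumed initiation life below are generic steel-like placeholders, not values from the report.

```python
import math

def paris_crack_growth_life(a0, ac, C, m, dsigma, Y=1.0, steps=20000):
    """Cycles to grow a crack from size a0 to critical size ac under
    Paris' law da/dN = C * (dK)**m, with dK = Y * dsigma * sqrt(pi * a)."""
    da = (ac - a0) / steps
    a, cycles = a0, 0.0
    for _ in range(steps):
        dK = Y * dsigma * math.sqrt(math.pi * a)
        cycles += da / (C * dK ** m)   # invert da/dN and accumulate cycles
        a += da
    return cycles

# illustrative values: crack sizes in metres, stress range in MPa
initiation_life = 2.0e5   # assumed crack-forming life (placeholder)
extension_life = paris_crack_growth_life(
    a0=0.001, ac=0.02, C=1e-11, m=3.0, dsigma=100.0)
total_life = initiation_life + extension_life   # whole life of the part
```

For m = 3 the integral has a closed form, which makes the numerical result easy to check against 2(a0^-1/2 - ac^-1/2) / (C Δσ³ π^3/2).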

  3. An application of characteristic function in order to predict reliability and lifetime of aeronautical hardware

    Energy Technology Data Exchange (ETDEWEB)

    Żurek, Józef; Kaleta, Ryszard; Zieja, Mariusz [Air Force Institute of Technology ul. Księcia Bolesława 6 01-494 Warsaw (Poland)

    2016-06-08

    The forecasting of reliability and life of aeronautical hardware requires recognition of many and various destructive processes that deteriorate the health/maintenance status thereof. The aging of technical components of aircraft as an armament system proves of outstanding significance to reliability and safety of the whole system. The aging process is usually induced by many and various factors, just to mention mechanical, biological, climatic, or chemical ones. The aging is an irreversible process and considerably affects (i.e. reduces) reliability and lifetime of aeronautical equipment. Application of the characteristic function of the aging process is suggested to predict reliability and lifetime of aeronautical hardware. An increment in values of diagnostic parameters is introduced to formulate then, using the characteristic function and after some rearrangements, the partial differential equation. An analytical dependence for the characteristic function of the aging process is a solution to this equation. With the inverse transformation applied, the density function of the aging of aeronautical hardware is found. Having found the density function, one can determine the aeronautical equipment’s reliability and lifetime. The in-service collected or the life tests delivered data are used to attain this goal. Coefficients in this relationship are found using the likelihood function.

  4. An application of characteristic function in order to predict reliability and lifetime of aeronautical hardware

    International Nuclear Information System (INIS)

    Żurek, Józef; Kaleta, Ryszard; Zieja, Mariusz

    2016-01-01

    The forecasting of reliability and life of aeronautical hardware requires recognition of many and various destructive processes that deteriorate the health/maintenance status thereof. The aging of technical components of aircraft as an armament system proves of outstanding significance to reliability and safety of the whole system. The aging process is usually induced by many and various factors, just to mention mechanical, biological, climatic, or chemical ones. The aging is an irreversible process and considerably affects (i.e. reduces) reliability and lifetime of aeronautical equipment. Application of the characteristic function of the aging process is suggested to predict reliability and lifetime of aeronautical hardware. An increment in values of diagnostic parameters is introduced to formulate then, using the characteristic function and after some rearrangements, the partial differential equation. An analytical dependence for the characteristic function of the aging process is a solution to this equation. With the inverse transformation applied, the density function of the aging of aeronautical hardware is found. Having found the density function, one can determine the aeronautical equipment’s reliability and lifetime. The in-service collected or the life tests delivered data are used to attain this goal. Coefficients in this relationship are found using the likelihood function.

  5. Failure and reliability prediction by support vector machines regression of time series data

    International Nuclear Information System (INIS)

    Chagas Moura, Marcio das; Zio, Enrico; Lins, Isis Didier; Droguett, Enrique

    2011-01-01

    Support Vector Machines (SVMs) are kernel-based learning methods, which have been successfully adopted for regression problems. However, their use in reliability applications has not been widely explored. In this paper, a comparative analysis is presented in order to evaluate the SVM effectiveness in forecasting time-to-failure and reliability of engineered components based on time series data. The performance on literature case studies of SVM regression is measured against other advanced learning methods such as the Radial Basis Function, the traditional MultiLayer Perceptron model, Box-Jenkins autoregressive-integrated-moving average and the Infinite Impulse Response Locally Recurrent Neural Networks. The comparison shows that in the analyzed cases, SVM outperforms or is comparable to other techniques. - Highlights: → Realistic modeling of reliability demands complex mathematical formulations. → SVM is proper when the relation input/output is unknown or very costly to be obtained. → Results indicate the potential of SVM for reliability time series prediction. → Reliability estimates support the establishment of adequate maintenance strategies.
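The time-series setup this comparison relies on, embedding a reliability series into lagged input windows and regressing the next value on them, can be sketched with a plain linear autoregression fitted by ridge-regularised least squares (a dependency-free stand-in for the SVM regression; the decaying series and window length are invented for illustration).

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ar(series, window, ridge=1e-8):
    """Least-squares fit predicting series[t] from the previous `window` values."""
    X = [series[i:i + window] for i in range(len(series) - window)]
    y = series[window:]
    A = [[sum(row[i] * row[j] for row in X) + (ridge if i == j else 0.0)
          for j in range(window)] for i in range(window)]
    b = [sum(row[i] * yy for row, yy in zip(X, y)) for i in range(window)]
    return solve(A, b)

# illustrative exponentially decaying reliability series
series = [math.exp(-0.05 * t) for t in range(40)]
w = 3
coeffs = fit_ar(series[:-1], w)                       # train on all but last point
forecast = sum(c * v for c, v in zip(coeffs, series[-1 - w:-1]))
```

The held-out last point serves as a one-step-ahead check; an SVR would replace `fit_ar` with a kernel model over the same lagged windows.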

  6. Life prediction methodology for ceramic components of advanced vehicular heat engines: Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Khandelwal, P.K.; Provenzano, N.J.; Schneider, W.E. [Allison Engine Co., Indianapolis, IN (United States)

    1996-02-01

One of the major challenges involved in the use of ceramic materials is ensuring adequate strength and durability. This activity has developed methodology which can be used during the design phase to predict the structural behavior of ceramic components. The effort involved the characterization of injection molded and hot isostatic pressed (HIPed) PY-6 silicon nitride, the development of nondestructive evaluation (NDE) technology, and the development of analytical life prediction methodology. Four failure modes are addressed: fast fracture, slow crack growth, creep, and oxidation. The techniques deal with failures initiating at the surface as well as internal to the component. The life prediction methodology for fast fracture and slow crack growth have been verified using a variety of confirmatory tests. The verification tests were conducted at room and elevated temperatures up to a maximum of 1371 °C. The tests involved (1) flat circular disks subjected to bending stresses and (2) high speed rotating spin disks. Reasonable correlation was achieved for a variety of test conditions and failure mechanisms. The predictions associated with surface failures proved to be optimistic, requiring re-evaluation of the components' initial fast fracture strengths. Correlation was achieved for the spin disks which failed in fast fracture from internal flaws. Time dependent elevated temperature slow crack growth spin disk failures were also successfully predicted.

  7. Using personality item characteristics to predict single-item reliability, retest reliability, and self-other agreement

    NARCIS (Netherlands)

    de Vries, Reinout Everhard; Realo, Anu; Allik, Jüri

    2016-01-01

    The use of reliability estimates is increasingly scrutinized as scholars become more aware that test–retest stability and self–other agreement provide a better approximation of the theoretical and practical usefulness of an instrument than its internal reliability. In this study, we investigate item

  8. Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)

    Science.gov (United States)

    DeMott, Diana L.; Bigler, Mark A.

    2017-01-01

NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration included for environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value or placeholder as a preliminary estimate. This preliminary estimate or screening value is used to determine which placeholder needs a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment on the performance of critical human actions. This assessment needs to consider more than the time available; it includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists and internal human stresses. The more detailed assessment is expected to be more realistic than that based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. To determine what could be expected of future operations, input from individuals who had relevant experience and were familiar with the system and process previously implemented by NASA was used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the

  9. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.

  10. Comparing Structural Identification Methodologies for Fatigue Life Prediction of a Highway Bridge

    Directory of Open Access Journals (Sweden)

    Sai G. S. Pai

    2018-01-01

Accurate measurement-data interpretation leads to increased understanding of structural behavior and enhanced asset-management decision making. In this paper, four data-interpretation methodologies, residual minimization, traditional Bayesian model updating, modified Bayesian model updating (with an L∞-norm-based Gaussian likelihood function, and error-domain model falsification (EDMF, a method that rejects models that have unlikely differences between predictions and measurements, are compared. In the modified Bayesian model updating methodology, a correction is used in the likelihood function to account for the effect of a finite number of measurements on posterior probability density functions. The application of these data-interpretation methodologies for condition assessment and fatigue life prediction is illustrated on a highway steel–concrete composite bridge having four spans with a total length of 219 m. A detailed 3D finite-element plate and beam model of the bridge and weigh-in-motion data are used to obtain the time–stress response at a fatigue critical location along the bridge span. The time–stress response, presented as a histogram, is compared to measured strain responses either to update prior knowledge of model parameters using residual minimization and Bayesian methodologies or to obtain candidate model instances using the EDMF methodology. It is concluded that the EDMF and modified Bayesian model updating methodologies provide robust prediction of fatigue life compared with residual minimization and traditional Bayesian model updating in the presence of correlated non-Gaussian uncertainty. EDMF has additional advantages due to ease of understanding and applicability for practicing engineers, thus enabling incremental asset-management decision making over long service lives. Finally, parallel implementations of EDMF using grid sampling have lower computation times than implementations using adaptive sampling.
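The EDMF move at the heart of this comparison, rejecting model instances whose prediction-measurement residuals fall outside plausible bounds and then predicting with the surviving candidate set, can be made concrete in a few lines. The linear "stiffness" model, threshold, and numbers below are invented for illustration only.

```python
def falsify(candidates, predict, measurements, threshold):
    """Keep only model instances whose residuals stay within +/- threshold
    at every measurement location (error-domain model falsification)."""
    return [theta for theta in candidates
            if all(abs(predict(theta, x) - y) <= threshold
                   for x, y in measurements)]

# hypothetical stiffness-like parameter: response = load / theta
predict = lambda theta, load: load / theta
true_theta = 2.0
measurements = [(10.0, 10.0 / true_theta + 0.1),   # (load, noisy response)
                (20.0, 20.0 / true_theta - 0.2),
                (30.0, 30.0 / true_theta + 0.3)]
candidates = [1.0 + 0.05 * i for i in range(41)]   # grid over 1.0 .. 3.0
survivors = falsify(candidates, predict, measurements, threshold=0.5)

# candidate-set prediction for an unseen load: an interval, not a point
new_load = 40.0
lo = min(predict(th, new_load) for th in survivors)
hi = max(predict(th, new_load) for th in survivors)
```

The interval [lo, hi] is the falsification-style prediction: every unfalsified model instance contributes, which is why EDMF is robust when uncertainty is correlated and non-Gaussian.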

  11. Dynamic reliability assessment and prediction for repairable systems with interval-censored data

    International Nuclear Information System (INIS)

    Peng, Yizhen; Wang, Yu; Zi, YanYang; Tsui, Kwok-Leung; Zhang, Chuhua

    2017-01-01

    The ‘Test, Analyze and Fix’ process is widely applied to improve the reliability of a repairable system. In this process, dynamic reliability assessment for the system has been paid a great deal of attention. Due to instrument malfunctions, staff omissions and imperfect inspection strategies, field reliability data are often subject to interval censoring, making dynamic reliability assessment become a difficult task. Most traditional methods assume this kind of data as multiple normal distributed variables or the missing mechanism as missing at random, which may cause a large bias in parameter estimation. This paper proposes a novel method to evaluate and predict the dynamic reliability of a repairable system subject to interval-censored problem. First, a multiple imputation strategy based on the assumption that the reliability growth trend follows a nonhomogeneous Poisson process is developed to derive the distributions of missing data. Second, a new order statistic model that can transfer the dependent variables into independent variables is developed to simplify the imputation procedure. The unknown parameters of the model are iteratively inferred by the Monte Carlo expectation maximization (MCEM) algorithm. Finally, to verify the effectiveness of the proposed method, a simulation and a real case study for gas pipeline compressor system are implemented. - Highlights: • A new multiple imputation strategy was developed to derive the PDF of missing data. • A new order statistic model was developed to simplify the imputation procedure. • The parameters of the order statistic model were iteratively inferred by MCEM. • A real cases study was conducted to verify the effectiveness of the proposed method.
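The first ingredient of the proposed method, multiple imputation of interval-censored failure times under a nonhomogeneous Poisson process assumption, can be sketched crudely: draw exact times inside each inspection interval, fit a power-law process shape by maximum likelihood, and pool over imputations. The paper's order-statistic model and MCEM iteration are not reproduced here; plain uniform draws and illustrative interval counts stand in for them.

```python
import math
import random

def plp_beta_mle(times, T):
    """MLE of the power-law-process shape for failure times on (0, T]:
    beta_hat = n / sum(ln(T / t_i)); beta > 1 indicates deterioration."""
    return len(times) / sum(math.log(T / t) for t in times)

def impute_beta(interval_counts, T, n_imputations=50):
    """interval_counts: list of ((a, b), k) inspection intervals holding k
    failures each; uniform draws stand in for the conditional NHPP draw."""
    estimates = []
    for _ in range(n_imputations):
        times = [random.uniform(a, b)
                 for (a, b), k in interval_counts for _ in range(k)]
        estimates.append(plp_beta_mle(times, T))
    return sum(estimates) / len(estimates)   # pooled point estimate

random.seed(7)
# illustrative inspection record with a clearly increasing failure rate
counts = [((0, 100), 1), ((100, 200), 2), ((200, 300), 4), ((300, 400), 7)]
beta_hat = impute_beta(counts, T=400.0)
```

With counts rising across intervals the pooled shape estimate lands well above 1, i.e. the imputation correctly detects deterioration despite never observing an exact failure time.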

  12. A neural network based methodology to predict site-specific spectral acceleration values

    Science.gov (United States)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
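The basic framework the abstract describes, a feed-forward back-propagation network with one hidden layer, can be sketched in plain Python. The target function, layer size, and learning rate below are toy stand-ins, not the seismological inputs or outputs of the paper.

```python
import math
import random

random.seed(1)

H = 6                                                        # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]  # weight, bias
w2 = [random.uniform(-0.5, 0.5) for _ in range(H + 1)]                  # weights, bias

def forward(x):
    hidden = [math.tanh(w[0] * x + w[1]) for w in w1]
    return sum(wo * h for wo, h in zip(w2, hidden)) + w2[H], hidden

def train_step(x, y, lr):
    """One stochastic gradient step on squared error; returns the error."""
    out, hidden = forward(x)
    err = out - y
    for j in range(H):
        # back-propagate through tanh: d tanh(u)/du = 1 - tanh(u)**2
        g = err * w2[j] * (1.0 - hidden[j] ** 2)
        w1[j][0] -= lr * g * x
        w1[j][1] -= lr * g
        w2[j] -= lr * err * hidden[j]
    w2[H] -= lr * err
    return err * err

data = [(i / 19.0, (i / 19.0) ** 2) for i in range(20)]      # toy target y = x**2
mse_before = sum(train_step(x, y, lr=0.0) for x, y in data) / len(data)
for epoch in range(3000):
    for x, y in data:
        train_step(x, y, lr=0.1)
mse_after = sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)
```

In the paper's setting the inputs would be source and site parameters and the output a spectral acceleration value; the training loop and gradient algebra are unchanged.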

  13. Methodologies for predicting the part-load performance of aero-derivative gas turbines

    DEFF Research Database (Denmark)

    Haglind, Fredrik; Elmegaard, Brian

    2009-01-01

Prediction of the part-load performance of gas turbines is advantageous in various applications. Sometimes reasonable part-load performance is sufficient, while in other cases complete agreement with the performance of an existing machine is desirable. This paper is aimed at providing some guidance on methodologies for predicting part-load performance of aero-derivative gas turbines. Two different design models – one simple and one more complex – are created. Subsequently, for each of these models, the part-load performance is predicted using component maps and turbine constants, respectively. Comparisons with manufacturer data are made. With respect to the design models, the simple model, featuring a compressor, combustor and turbines, results in equally good performance prediction in terms of thermal efficiency and exhaust temperature as does a more complex model. As for part-load predictions, the results suggest...

  14. GenMol trademark supramolecular descriptors predicting reliable sensitivity of energetic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Benazet, Stephane; Jacob, Guy [SNPE Materiaux Energetiques, Vert Le Petit (France); Pepe, Gerard [CINaM UPR-CNRS 3118, Campus de Luminy Case, Marseille (France)

    2009-04-15

Structure/activity relationship methodology has been applied to the problem of predicting the sensitivity of energetic molecules. Knowledge of this parameter is of great importance for increasing the safety of synthesis and handling operations for such compounds. It has been shown that descriptors of the solid-state interactions and surface topology derived from GenMol {sup trademark} software calculations greatly enhance the correlation between measured and predicted sensitivity. As the structural parameters used to establish the descriptors are experimental ones, their physical significance is particularly well preserved, which allows the descriptors so defined to give a good prediction of impact or friction sensitivity. (Abstract Copyright [2009], Wiley Periodicals, Inc.)

  15. The 6-min push test is reliable and predicts low fitness in spinal cord injury.

    Science.gov (United States)

    Cowan, Rachel E; Callahan, Morgan K; Nash, Mark S

    2012-10-01

The objective of this study is to assess 6-min push test (6MPT) reliability, determine whether the 6MPT is sensitive to fitness differences, and assess if 6MPT distance predicts fitness level in persons with spinal cord injury (SCI) or disease. Forty individuals with SCI who could self-propel a manual wheelchair completed an incremental arm crank peak oxygen consumption assessment and two 6MPTs across 3 d (37% tetraplegia (TP), 63% paraplegia (PP), 85% men, 70% white, 63% Hispanic, mean age = 34 ± 10 yr, mean duration of injury = 13 ± 10 yr, and mean body mass index = 24 ± 5 kg/m²). Intraclass correlation and Bland-Altman plots assessed 6MPT distance (m) reliability. Mann-Whitney U test compared 6MPT distance (m) of high and low fitness groups for TP and PP. The fitness status prediction was developed using N = 30 and validated in N = 10 (validation group (VG)). A nonstatistical prediction approach, below or above a threshold distance (TP = 445 m and PP = 604 m), was validated statistically by binomial logistic regression. Accuracy, sensitivity, and specificity were computed to evaluate the threshold approach. Intraclass correlation coefficients exceeded 0.90 for the whole sample and the TP/PP subsets. High fitness persons propelled farther than low fitness persons for both TP/PP (both P < 0.05). Binomial logistic regression (P < 0.008) predicted the same fitness levels in the VG as the threshold approach. In the VG, overall accuracy was 70%. Eighty-six percent of low fitness persons were correctly identified (sensitivity), and 33% of high fitness persons were correctly identified (specificity). The 6MPT may be a useful tool for SCI clinicians and researchers. 6MPT distance demonstrates excellent reliability and is sensitive to differences in fitness level. 6MPT distances less than a threshold distance may be an effective approach to identify low fitness in person with SCI.
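The threshold rule the study validates, classify a person as low fitness when 6MPT distance falls below an injury-level-specific cutoff and score the rule by sensitivity and specificity, is easy to make concrete. Only the cutoffs (TP = 445 m, PP = 604 m) come from the abstract; the example records below are fabricated.

```python
THRESHOLD_M = {"TP": 445.0, "PP": 604.0}

def predicts_low_fitness(group, distance_m):
    """Below the group-specific 6MPT cutoff -> predicted low fitness."""
    return distance_m < THRESHOLD_M[group]

def sensitivity_specificity(records):
    """records: (group, distance_m, truly_low_fitness) tuples."""
    tp = fn = tn = fp = 0
    for group, dist, truly_low in records:
        pred_low = predicts_low_fitness(group, dist)
        if truly_low:
            tp += pred_low       # low fitness correctly flagged
            fn += not pred_low
        else:
            fp += pred_low
            tn += not pred_low   # high fitness correctly passed
    return tp / (tp + fn), tn / (tn + fp)

# fabricated validation records: (injury group, distance, truly low fitness)
records = [
    ("TP", 400.0, True), ("TP", 430.0, True), ("TP", 460.0, True),
    ("PP", 550.0, True), ("PP", 590.0, True), ("PP", 650.0, True),
    ("TP", 480.0, False), ("PP", 620.0, False), ("PP", 580.0, False),
]
sens, spec = sensitivity_specificity(records)
```

With this fabricated set, four of six truly low-fitness records fall below the cutoff (sensitivity 4/6) and two of three high-fitness records stay above it (specificity 2/3).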

  16. ARA and ARI imperfect repair models: Estimation, goodness-of-fit and reliability prediction

    International Nuclear Information System (INIS)

    Toledo, Maria Luíza Guerra de; Freitas, Marta A.; Colosimo, Enrico A.; Gilardoni, Gustavo L.

    2015-01-01

An appropriate maintenance policy is essential to reduce expenses and risks related to equipment failures. A fundamental aspect to be considered when specifying such policies is the ability to predict the reliability of the systems under study, based on a well-fitted model. In this paper, the classes of models Arithmetic Reduction of Age and Arithmetic Reduction of Intensity are explored. Likelihood functions for such models are derived, and a graphical method is proposed for model selection. A real data set involving failures in trucks used by a Brazilian mining company is analyzed, considering models with different memories. Parameters, namely the shape and scale of the Power Law Process and the repair efficiency, were estimated for the best-fitted model. Estimation of model parameters allowed us to derive reliability estimators to predict the behavior of the failure process. These results provide valuable information for the mining company and can be used to support decision making regarding preventive maintenance policy. - Highlights: • Likelihood functions for imperfect repair models are derived. • A goodness-of-fit technique is proposed as a tool for model selection. • Failures in trucks owned by a Brazilian mining company are modeled. • Estimation allowed deriving reliability predictors to forecast the future failure process of the trucks.
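
The Arithmetic Reduction of Age class mentioned here has a compact form for memory 1: each repair rewinds the system's effective age by a fraction ρ of the last failure time, with a Power Law Process as the baseline intensity. A hedged sketch; the parameter values are illustrative, not the truck estimates from the paper.

```python
import math

def plp_intensity(t, beta, eta):
    # Power Law Process baseline intensity (shape beta, scale eta)
    return (beta / eta) * (t / eta) ** (beta - 1)

def ara1_intensity(t, failure_times, rho, beta, eta):
    """Failure intensity under Arithmetic Reduction of Age with memory 1:
    each repair rewinds the effective age by rho times the last failure time.
    rho = 0 gives minimal repair (plain PLP); rho = 1 gives perfect repair."""
    last = max((s for s in failure_times if s < t), default=0.0)
    return plp_intensity(t - rho * last, beta, eta)
```

The likelihood for an observed failure history is then built from this intensity and its integral between successive failures, which is the computation the paper's estimation procedure performs.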

  17. Genotyping cows for the reference increase reliability of genomic prediction in a small breed

    DEFF Research Database (Denmark)

    Thomasen, Jørn Rind; Sørensen, Anders Christian; Lund, Mogens Sandø

    2013-01-01

We hypothesized that adding cows to the reference population in a breed with a small number of reference bulls would increase reliabilities of genomic breeding values and genetic gain. We tested this premise by comparing two strategies for maintaining the reference population for genetic gain......, inbreeding and reliabilities of genomic predictions: 1) Adding 60 progeny tested bulls each year (B), and 2) in addition to 60 progeny tested bulls, adding 2,000 genotyped cows per year (C). Two breeding schemes were tested: 1) A turbo scheme (T) with only genotyped young bulls used intensively, and 2...... compared to the H-B, at the same level of ∆F. T-C yielded 15% higher ∆G compared to T-B. Changing the breeding scheme from H-B to H-C increased ∆G by 5.5%. The lowest ∆F was observed with genotyping of cows. Reliabilities of GEBV in the C schemes showed a steep increase in reliability during the first...

  18. [Reliability and validity of the Braden Scale for predicting pressure sore risk].

    Science.gov (United States)

    Boes, C

    2000-12-01

For more accurate and objective pressure sore risk assessment, various risk assessment tools have been developed, mainly in the USA and Great Britain. The Braden Scale for Predicting Pressure Sore Risk is one such example. By means of an analysis of the German- and English-language literature on the Braden Scale, the scientific quality criteria of reliability and validity are traced and consequences for application of the scale in Germany are demonstrated. Analysis of 4 reliability studies shows an exclusive focus on interrater reliability. Further, even though the 19 validity studies examined cover many different settings, the examination is limited to the criteria of sensitivity and specificity (accuracy). Reported sensitivity and specificity levels range from 35 to 100%. The recommended cut-off points lie in the range of 10 to 19 points. The studies prove not to be comparable with each other. Furthermore, distortions can be found in these studies which affect the accuracy of the scale. The results of the analysis presented here show insufficient proof of reliability and validity in the American studies. In Germany, the Braden Scale has not yet been tested against scientific criteria. Such testing is needed before the scale is used in different German settings. In the course of such testing, the construction and study procedures of the American studies can serve as a basis, as can the problems identified in the analysis presented here.

  19. Impact of Relationships between Test and Reference Animals and between Reference Animals on Reliability of Genomic Prediction

    DEFF Research Database (Denmark)

    Wu, Xiaoping; Lund, Mogens Sandø; Sun, Dongxiao

This study investigated reliability of genomic prediction in various scenarios with regard to the relationship between test and reference animals and between animals within the reference population. Different reference populations were generated from EuroGenomics data and 1288 Nordic Holstein bulls...... as a common test population. A GBLUP model and a Bayesian mixture model were applied to predict genomic breeding values for bulls in the test data. Results showed that a closer relationship between test and reference animals led to a higher reliability, while a closer relationship between reference animal...... resulted in a lower reliability. Therefore, the design of the reference population is important for improving the reliability of genomic prediction. With regard to model, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction than the GBLUP model...

  20. A human reliability analysis of the Three Mile power plant accident considering the THERP and ATHEANA methodologies

    International Nuclear Information System (INIS)

    Fonseca, Renato Alves da

    2004-03-01

The main purpose of this work is the study of human reliability using the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Error Analysis) methods, together with tables and case studies presented in the THERP Handbook, to develop a qualitative and quantitative study of a nuclear power plant accident. This accident occurred at the TMI (Three Mile Island) Unit 2 power plant, a PWR-type plant, on March 28th, 1979. The accident analysis revealed a series of incorrect actions, which resulted in the Unit 2 shutdown and permanent loss of the reactor. This study also aims at enhancing the understanding of the THERP and ATHEANA methods and of their practical applications. In addition, it is possible to understand the influence of plant operational status on human failures, and of these on the equipment of a system, in this case a nuclear power plant. (author)

  1. A simulation methodology of spacer grid residual spring deflection for predictive and interpretative purposes

    International Nuclear Information System (INIS)

    Kim, K. T.; Kim, H. K.; Yoon, K. H.

    1994-01-01

    The in-reactor fuel rod support conditions against the fretting wear-induced damage can be evaluated by spacer grid residual spring deflection. In order to predict the spacer grid residual spring deflection as a function of burnup for various spring designs, a simulation methodology of spacer grid residual spring deflection has been developed and implemented in the GRIDFORCE program. The simulation methodology takes into account cladding creep rate, initial spring deflection, initial spring force, and spring force relaxation rate as the key parameters affecting the residual spring deflection. The simulation methodology developed in this study can be utilized as an effective tool in evaluating the capability of a newly designed spacer grid spring to prevent the fretting wear-induced damage

  2. Impact of relationships between test and training animals and among training animals on reliability of genomic prediction.

    Science.gov (United States)

    Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G

    2015-10-01

    One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction. © 2015 Blackwell Verlag GmbH.
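
The GBLUP predictions discussed in these records can be sketched directly from the genomic relationship matrix: breeding values of test animals are obtained by regressing their genomic relationships to the training animals against the training phenotypes. A minimal, self-contained illustration with simulated genotypes; all sizes and the variance ratio are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 50 training animals, 5 test animals, 10 markers.
n_train, n_test, n_snp = 50, 5, 10
M = rng.standard_normal((n_train + n_test, n_snp))   # centred genotype matrix
G = M @ M.T / n_snp                                  # genomic relationship matrix

true_g = (M @ rng.standard_normal(n_snp)) * 0.3      # simulated genetic values
y = true_g[:n_train] + rng.standard_normal(n_train)  # phenotypes, training only

lam = 1.0  # residual-to-genetic variance ratio (assumed known here)
G_rr = G[:n_train, :n_train]
G_tr = G[n_train:, :n_train]

# GBLUP: genomic breeding values of test animals from training phenotypes
gebv_test = G_tr @ np.linalg.solve(G_rr + lam * np.eye(n_train), y - y.mean())
```

A closer test-training relationship enlarges the entries of `G_tr` and hence the information transferred to the test animals, which is the mechanism behind the reliability patterns reported in these records.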

  3. Prediction of work metabolism from heart rate measurements in forest work: some practical methodological issues.

    Science.gov (United States)

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Auger, Isabelle; Leone, Mario

    2015-01-01

Individual heart rate (HR) to workload relationships were determined using 93 submaximal step-tests administered to 26 healthy participants attending physical activities in a university training centre (laboratory study) and 41 experienced forest workers (field study). Predicted maximum aerobic capacity (MAC) was compared to measured MAC from a maximal treadmill test (laboratory study) to test the effect of two age-predicted maximum HR equations (220-age and 207-0.7 × age) and two clothing insulation levels (0.4 and 0.91 clo) during the step-test. Work metabolism (WM) estimated from forest work HR was compared against concurrent work V̇O2 measurements while taking into account the HR thermal component. Results show that MAC and WM can be accurately predicted from work HR measurements and the simple regression models developed in this study (1% group mean prediction bias and up to 25% expected prediction bias for a single individual). Neither clothing insulation nor the choice of age-predicted maximum HR equation had an impact on predicted MAC. Practitioner summary: This study sheds light on four practical methodological issues faced by practitioners regarding the use of the HR methodology to assess WM in actual work environments. More specifically, the effect of wearing work clothes and the use of two different maximum HR prediction equations on the ability of a submaximal step-test to assess MAC are examined, as well as the accuracy of using an individual's step-test HR to workload relationship to predict WM from HR data collected during actual work in the presence of thermal stress.
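
The core of the HR methodology in this record is an individual linear calibration of heart rate against oxygen uptake from the step-test, later inverted on field HR recordings. A sketch with invented calibration points for one worker:

```python
# Hypothetical step-test calibration points for one worker:
# heart rate (beats/min) versus measured oxygen uptake (L/min).
hr  = [95, 110, 125, 140]
vo2 = [0.9, 1.3, 1.7, 2.1]

n = len(hr)
mean_hr, mean_vo2 = sum(hr) / n, sum(vo2) / n
slope = sum((h - mean_hr) * (v - mean_vo2) for h, v in zip(hr, vo2)) / \
        sum((h - mean_hr) ** 2 for h in hr)
intercept = mean_vo2 - slope * mean_hr

def predict_vo2(work_hr):
    """Estimate work metabolism (L O2/min) from a field heart-rate reading."""
    return intercept + slope * work_hr
```

In the field, the HR thermal component discussed in the abstract would first be subtracted from the recorded HR before this calibration is applied.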

  4. A new lifetime estimation model for a quicker LED reliability prediction

    Science.gov (United States)

    Hamon, B. H.; Mendizabal, L.; Feuillet, G.; Gasse, A.; Bataillou, B.

    2014-09-01

LED reliability and lifetime prediction is a key point for Solid State Lighting adoption. For this purpose, one hundred and fifty LEDs have been aged for a reliability analysis. LEDs were grouped into nine current-temperature stress conditions. Stress driving current was fixed between 350 mA and 1 A and ambient temperature between 85°C and 120°C. Using integrating sphere and I(V) measurements, a cross study of the evolution of electrical and optical characteristics has been done. Results show two main failure mechanisms regarding lumen maintenance. The first is the typically observed lumen depreciation; the second is a much quicker depreciation related to an increase in the leakage and non-radiative currents. Models of the typical lumen depreciation and of the leakage resistance depreciation have been built using electrical and optical measurements during the aging tests. The combination of those models enables a new method for quicker LED lifetime prediction. These two models have been used for LED lifetime predictions.
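
The slow "typical" lumen-depreciation branch described here is commonly modelled as an exponential decay fitted in log space and extrapolated to the 70% flux level (L70), in the spirit of the usual TM-21-style procedure. A sketch with invented readings for one stress condition:

```python
import math

# Hypothetical lumen-maintenance readings: hours versus fraction of initial flux.
t  = [1000.0, 2000.0, 3000.0, 4000.0]
lm = [0.99, 0.98, 0.97, 0.96]

# Least-squares fit of ln(lm) = ln(B) - alpha * t (exponential decay model)
ys = [math.log(v) for v in lm]
n = len(t)
mt, my = sum(t) / n, sum(ys) / n
alpha = -sum((x - mt) * (y - my) for x, y in zip(t, ys)) / \
        sum((x - mt) ** 2 for x in t)
B = math.exp(my + alpha * mt)

# Projected L70 life: hours until flux falls to 70 % of its initial value
l70 = math.log(B / 0.70) / alpha
```

The quick-failure branch tied to rising leakage currents would need its own model, as the abstract notes, before the two are combined into a lifetime prediction.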

  5. Physics-based process modeling, reliability prediction, and design guidelines for flip-chip devices

    Science.gov (United States)

    Michaelides, Stylianos

    Flip Chip on Board (FCOB) and Chip-Scale Packages (CSPs) are relatively new technologies that are being increasingly used in the electronic packaging industry. Compared to the more widely used face-up wirebonding and TAB technologies, flip-chips and most CSPs provide the shortest possible leads, lower inductance, higher frequency, better noise control, higher density, greater input/output (I/O), smaller device footprint and lower profile. However, due to the short history and due to the introduction of several new electronic materials, designs, and processing conditions, very limited work has been done to understand the role of material, geometry, and processing parameters on the reliability of flip-chip devices. Also, with the ever-increasing complexity of semiconductor packages and with the continued reduction in time to market, it is too costly to wait until the later stages of design and testing to discover that the reliability is not satisfactory. The objective of the research is to develop integrated process-reliability models that will take into consideration the mechanics of assembly processes to be able to determine the reliability of face-down devices under thermal cycling and long-term temperature dwelling. The models incorporate the time and temperature-dependent constitutive behavior of various materials in the assembly to be able to predict failure modes such as die cracking and solder cracking. In addition, the models account for process-induced defects and macro-micro features of the assembly. Creep-fatigue and continuum-damage mechanics models for the solder interconnects and fracture-mechanics models for the die have been used to determine the reliability of the devices. The results predicted by the models have been successfully validated against experimental data. The validated models have been used to develop qualification and test procedures for implantable medical devices. In addition, the research has helped develop innovative face

  6. Life Prediction/Reliability Data of Glass-Ceramic Material Determined for Radome Applications

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.

    2002-01-01

Brittle ceramic materials are candidates for a variety of structural applications over a wide range of temperatures. However, the process of slow crack growth, occurring in any loading configuration, limits the service life of structural components. Therefore, it is important to accurately determine the slow crack growth parameters required for component life prediction using an appropriate test methodology. This test methodology should also be useful in determining the influence of component processing and composition variables on the slow crack growth behavior of newly developed or existing materials, thereby allowing the component processing and composition to be tailored and optimized to specific needs. Through the American Society for Testing and Materials (ASTM), the authors recently developed two test methods to determine the life prediction parameters of ceramics. The two test standards, ASTM C 1368 for room temperature and ASTM C 1465 for elevated temperatures, were published in the 2001 Annual Book of ASTM Standards, Vol. 15.01. Briefly, the test method employs constant stress-rate (or dynamic fatigue) testing to determine flexural strengths as a function of the applied stress rate. The merit of this test method lies in its simplicity: strengths are measured in a routine manner in flexure at four or more applied stress rates, with an appropriate number of test specimens at each applied stress rate. The slow crack growth parameters necessary for life prediction are then determined from a simple relationship between the strength and the applied stress rate. Extensive life prediction testing was conducted at the NASA Glenn Research Center using the developed ASTM C 1368 test method to determine the life prediction parameters of a glass-ceramic material that the Navy will use for radome applications.
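
The "simple relationship" in constant stress-rate testing is a log-log regression: under slow crack growth, strength scales as (stress rate)^(1/(n+1)), so the fitted slope yields the crack-growth exponent n. A sketch with invented flexural-strength data (not the radome material's values):

```python
import math

# Hypothetical dynamic-fatigue data: applied stress rate (MPa/s) versus
# mean flexural strength (MPa), four rates as in ASTM C 1368.
rates     = [0.01, 0.1, 1.0, 10.0]
strengths = [150.0, 165.0, 181.0, 199.0]

xs = [math.log10(r) for r in rates]
ys = [math.log10(s) for s in strengths]
n_pts = len(xs)
mx, my = sum(xs) / n_pts, sum(ys) / n_pts
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

# Slow crack growth: strength ~ rate**(1/(n+1)), so the exponent is
n_scg = 1.0 / slope - 1.0
```

The intercept of the same regression supplies the second life-prediction parameter, so a single routine series of flexure tests fixes the whole model.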

  7. Notes on human factors problems in process plant reliability and safety prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.; Taylor, J.R.

    1976-09-01

    The basis for plant operator reliability evaluation is described. Principles for plant design, necessary to permit reliability evaluation, are outlined. Five approaches to the plant operator reliability problem are described. Case stories, illustrating operator reliability problems, are given. (author)

  8. Comparing methodologies for structural identification and fatigue life prediction of a highway bridge

    OpenAIRE

    Pai, Sai Ganesh Sarvotham; Nussbaumer, Alain; Smith, Ian F. C.

    2018-01-01

    Accurate measurement-data interpretation leads to increased understanding of structural behavior and enhanced asset-management decision making. In this paper, four data-interpretation methodologies, residual minimization, traditional Bayesian model updating, modified Bayesian model updating (with an L∞-norm-based Gaussian likelihood function), and error-domain model falsification (EDMF), a method that rejects models that have unlikely differences between predictions and measurements, are comp...

  9. Comparing Structural Identification Methodologies for Fatigue Life Prediction of a Highway Bridge

    OpenAIRE

    Pai, Sai G.S.; Nussbaumer, Alain; Smith, Ian F.C.

    2018-01-01

    Accurate measurement-data interpretation leads to increased understanding of structural behavior and enhanced asset-management decision making. In this paper, four data-interpretation methodologies, residual minimization, traditional Bayesian model updating, modified Bayesian model updating (with an L∞-norm-based Gaussian likelihood function), and error-domain model falsification (EDMF), a method that rejects models that have unlikely differences between predictions and measurements, are comp...

  10. Probabilistic fatigue life prediction methodology for notched components based on simple smooth fatigue tests

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Z. R.; Li, Z. X. [Dept.of Engineering Mechanics, Jiangsu Key Laboratory of Engineering Mechanics, Southeast University, Nanjing (China); Hu, X. T.; Xin, P. P.; Song, Y. D. [State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing University of Aeronautics and Astronautics, Nanjing (China)

    2017-01-15

A methodology for probabilistic fatigue life prediction of notched components based on smooth specimens is presented. Weakest-link theory incorporating the Walker strain model has been utilized in this approach. The effects of stress ratio and stress gradient have been considered. The Weibull distribution and the median rank estimator are used to describe fatigue statistics. Fatigue tests under different stress ratios were conducted on smooth and notched specimens of titanium alloy TC-1-1. The proposed procedures were checked against the test data of TC-1-1 notched specimens. Prediction results at 50% survival rate are all within a factor-of-two scatter band of the test results.
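
The median rank estimator and Weibull fit named in this record can be sketched in a few lines: ordered fatigue lives receive failure probabilities (i − 0.3)/(N + 0.4), and a Weibull probability plot turns the parameter fit into linear regression. All lives below are invented:

```python
import math

# Hypothetical fatigue lives (cycles) at one stress level, sorted ascending.
lives = sorted([1.2e5, 2.0e5, 2.9e5, 4.1e5, 6.0e5])
N = len(lives)

# Median rank estimator for the failure probability of the i-th ordered life.
F = [(i - 0.3) / (N + 0.4) for i in range(1, N + 1)]

# Weibull probability plot: ln(-ln(1-F)) versus ln(life) is linear with
# slope = shape parameter and intercept = -shape * ln(scale).
xs = [math.log(t) for t in lives]
ys = [math.log(-math.log(1.0 - f)) for f in F]
mx, my = sum(xs) / N, sum(ys) / N
shape = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
scale = math.exp(mx - my / shape)
```

The fitted distribution then feeds the weakest-link integration over the notch stress gradient, which is where the smooth-to-notched transfer in the paper happens.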

  11. Validated Loads Prediction Models for Offshore Wind Turbines for Enhanced Component Reliability

    DEFF Research Database (Denmark)

    Koukoura, Christina

To improve the reliability of offshore wind turbines, accurate prediction of their response is required. Therefore, validation of models with site measurements is imperative. In the present thesis a 3.6 MW pitch-regulated, variable-speed offshore wind turbine on a monopile foundation is built...... are used for the modification of the sub-structure/foundation design for possible material savings. First, the background of offshore wind engineering, including wind-wave conditions, support structure, blade loading and wind turbine dynamics, is presented. Second, a detailed description of the site...

  12. Methodological Challenges in Examining the Impact of Healthcare Predictive Analytics on Nursing-Sensitive Patient Outcomes.

    Science.gov (United States)

    Jeffery, Alvin D

    2015-06-01

    The expansion of real-time analytic abilities within current electronic health records has led to innovations in predictive modeling and clinical decision support systems. However, the ability of these systems to influence patient outcomes is currently unknown. Even though nurses are the largest profession within the healthcare workforce, little research has been performed to explore the impact of clinical decision support on their decisions and the patient outcomes associated with them. A scoping literature review explored the impact clinical decision support systems containing healthcare predictive analytics have on four nursing-sensitive patient outcomes (pressure ulcers, failure to rescue, falls, and infections). While many articles discussed variable selection and predictive model development/validation, only four articles examined the impact on patient outcomes. The novelty of predictive analytics and the inherent methodological challenges in studying clinical decision support impact are likely responsible for this paucity of literature. Major methodological challenges include (1) multilevel nature of intervention, (2) treatment fidelity, and (3) adequacy of clinicians' subsequent behavior. There is currently insufficient evidence to demonstrate efficacy of healthcare predictive analytics-enhanced clinical decision support systems on nursing-sensitive patient outcomes. Innovative research methods and a greater emphasis on studying this phenomenon are needed.

  13. Deformation and fracture map methodology for predicting cladding behavior during dry storage

    International Nuclear Information System (INIS)

    Chin, B.A.; Khan, M.A.; Tarn, J.C.L.

    1986-09-01

    The licensing of interim dry storage of light-water reactor spent fuel requires assurance that release limits of radioactive materials are not exceeded. The extent to which Zircaloy cladding can be relied upon as a barrier to prevent release of radioactive spent fuel and fission products depends upon its integrity. The internal pressure from helium and fission gases could become a source of hoop stress for creep rupture if pressures and temperatures were sufficiently high. Consequently, it is of interest to predict the condition of spent fuel cladding during interim storage for periods up to 40 years. To develop this prediction, deformation and fracture theories were used to develop maps. Where available, experimental deformation and fracture data were used to test the validity of the maps. Predictive equations were then developed and cumulative damage methodology was used to take credit for the declining temperature of spent fuel during storage. This methodology was then used to predict storage temperatures below which creep rupture would not be expected to occur except in fuel rods with pre-existing flaws. Predictions were also made and compared with results from tests conducted under abnormal conditions
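
The cumulative-damage step described here is the classic life-fraction rule: storage time at each temperature is divided by the creep-rupture life at that temperature, taking credit for the declining temperature. A sketch with an invented Arrhenius-type rupture-life curve; the constants are illustrative, not Zircaloy data.

```python
import math

def rupture_time_hours(temp_c):
    """Hypothetical Arrhenius-type creep-rupture life at a fixed hoop stress:
    life shortens rapidly as temperature rises (illustrative constants only)."""
    temp_k = temp_c + 273.15
    return 1e-8 * math.exp(20000.0 / temp_k)

def damage_fraction(temperature_history):
    """Cumulative damage (life-fraction) rule: sum dt / t_rupture(T) over the
    storage history; creep rupture is predicted when the sum reaches 1."""
    return sum(dt / rupture_time_hours(t) for t, dt in temperature_history)

# Declining storage temperature: 5 years at 380 C, then 35 years at 320 C.
history = [(380.0, 5 * 8760.0), (320.0, 35 * 8760.0)]
damage = damage_fraction(history)
```

Storage temperatures low enough that the summed fraction stays well below 1 over 40 years correspond to the "safe" limits the methodology predicts for unflawed rods.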

  14. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign has been carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform, consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board has been instrumented with a large set of sensitive accelerometers, and tests have been performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbation coming from a reaction wheel or cryo-cooler compressors, operated independently and then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status has been assessed by comparing test results with predictions based on the FEM approach. Very good results have been achieved, particularly for the case of a wheel in sweeping-rate operation, with test results over-predicted within a reasonable margin of less than a factor of two. Some limitations of the methodology have also been identified for sources operating at a fixed rate or coming with a small number of dominant harmonics, and recommendations have been issued in order to deal with model uncertainties and stay conservative.

  15. Safety analysis methodology with assessment of the impact of the prediction errors of relevant parameters

    International Nuclear Information System (INIS)

    Galia, A.V.

    2011-01-01

The best estimate plus uncertainty approach (BEAU) requires the use of extensive resources and is therefore usually applied in cases where the available safety margin obtained with a conservative methodology can be questioned. Outside the BEAU methodology, there is no clear approach on how to deal with the uncertainties resulting from prediction errors in the safety analyses performed for licensing submissions. However, the regulatory document RD-310 mentions that the analysis method shall account for uncertainties in the analysis data and models. A possible approach, simple and reasonable and representing only the author's views, is presented for taking into account the impact of prediction errors and other uncertainties when performing safety analysis in line with regulatory requirements. The approach proposes taking into account the prediction error of relevant parameters. Relevant parameters would be those plant parameters that are surveyed and are used to initiate the action of a mitigating system, or those that are representative of the most challenging phenomena for the integrity of a fission barrier. Examples of the application of the methodology are presented, involving a comparison between the results of the new approach and a best estimate calculation during the blowdown phase for two small breaks in a generic CANDU 6 station. The calculations are performed with the CATHENA computer code. (author)

  16. Nuclear power plant maintenance personnel reliability prediction (NPP/MPRP) effort at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.

    1981-01-01

    Human errors committed during maintenance activities are potentially a major contribution to the overall risk associated with the operation of a nuclear power plant (NPP). An NRC-sponsored program at Oak Ridge National Laboratory is attempting to develop a quantitative predictive technique to evaluate the contribution of maintenance errors to the overall NPP risk. The current work includes a survey of the requirements of potential users to ascertain the need for and content of the proposed quantitative model, plus an initial job/task analysis to determine the scope and applicability of various maintenance tasks. In addition, existing human reliability prediction models are being reviewed and assessed with respect to their applicability to NPP maintenance tasks. This paper discusses the status of the program and summarizes the results to date

  17. Reliability, Validity, and Predictive Utility of the 25-Item Criminogenic Cognitions Scale (CCS).

    Science.gov (United States)

    Tangney, June Price; Stuewig, Jeffrey; Furukawa, Emi; Kopelovich, Sarah; Meyer, Patrick; Cosby, Brandon

    2012-10-01

Theory, research, and clinical reports suggest that moral cognitions play a role in initiating and sustaining criminal behavior. The 25-item Criminogenic Cognitions Scale (CCS) was designed to tap 5 dimensions: Notions of Entitlement; Failure to Accept Responsibility; Short-Term Orientation; Insensitivity to Impact of Crime; and Negative Attitudes Toward Authority. Results from 552 jail inmates support the reliability, validity, and predictive utility of the measure. The CCS was linked to criminal justice system involvement and to self-report measures of aggression, impulsivity, and lack of empathy. Additionally, the CCS was associated with violent criminal history, antisocial personality, and clinicians' ratings of risk for future violence and psychopathy (PCL:SV). Furthermore, criminogenic thinking upon incarceration predicted subsequent official reports of inmate misconduct during incarceration. CCS scores varied somewhat by gender and race. Research and applied uses of the CCS are discussed.

  18. Predictable and reliable ECG monitoring over IEEE 802.11 WLANs within a hospital.

    Science.gov (United States)

    Park, Juyoung; Kang, Kyungtae

    2014-09-01

    Telecardiology provides mobility for patients who require constant electrocardiogram (ECG) monitoring. However, its safety is dependent on the predictability and robustness of data delivery, which must overcome errors in the wireless channel through which the ECG data are transmitted. We report here a framework that can be used to gauge the applicability of IEEE 802.11 wireless local area network (WLAN) technology to ECG monitoring systems in terms of delay constraints and transmission reliability. For this purpose, a medical-grade WLAN architecture achieved predictable delay through the combination of a medium access control mechanism based on the point coordination function provided by IEEE 802.11 and an error control scheme based on Reed-Solomon coding and block interleaving. The size of the jitter buffer needed was determined by this architecture to avoid service dropout caused by buffer underrun, through analysis of variations in transmission delay. Finally, we assessed this architecture in terms of service latency and reliability by modeling the transmission of uncompressed two-lead electrocardiogram data from the MIT-BIH Arrhythmia Database and highlight the applicability of this wireless technology to telecardiology.
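
The block-interleaving step in this architecture can be illustrated independently of the Reed-Solomon code: symbols are written into a matrix row by row and read out column by column, so a burst of channel errors is spread across many codewords. A generic sketch; the row count and codeword layout of the actual system are not specified here.

```python
def interleave(data, rows):
    """Block interleaver: write the symbol stream into a matrix row by row,
    read it out column by column. An error burst no longer than `rows` symbols
    then touches at most one symbol per row (i.e. per RS codeword)."""
    assert len(data) % rows == 0
    width = len(data) // rows
    matrix = [data[r * width:(r + 1) * width] for r in range(rows)]
    return [matrix[r][c] for c in range(width) for r in range(rows)]

def deinterleave(data, rows):
    """Inverse operation: write column by column, read row by row."""
    assert len(data) % rows == 0
    width = len(data) // rows
    cols = [data[c * rows:(c + 1) * rows] for c in range(width)]
    return [cols[c][r] for r in range(rows) for c in range(width)]
```

The deinterleaver at the receiver restores symbol order before RS decoding, converting one long burst into several short, correctable error patterns.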

  19. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications

    Energy Technology Data Exchange (ETDEWEB)

    Emery, John M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Coffin, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robbins, Brian A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carroll, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Field, Richard V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jeremy Yoo, Yung Suk [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kacher, Josh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.

  20. The prediction of reliability and residual life of reactor pressure components

    International Nuclear Information System (INIS)

    Nemec, J.; Antalovsky, S.

    1978-01-01

    The paper deals with the problem of PWR pressure components reliability and residual life evaluation and prediction. A physical model of damage cumulation which serves as a theoretical basis for all considerations presents two major aspects. The first one describes the dependence of the degree of damage in the crack leading-edge in pressure components on the reactor system load-time history, i.e. on the number of transient loads. Both stages, fatigue crack initiation and growth through the wall until the critical length is reached, are investigated. The crack is supposed to initiate at the flaws in a strength weld joint or in the bimetallic weld of the base ferritic steel and the austenitic stainless overlay cladding. The growth rates of developed cracks are analysed in respect to different load-time histories. Important cyclic properties of some steels are derived from the low-cycle fatigue theory. The second aspect is the load-time history-dependent process of precipitation, deformation and radiation aging, characterized entirely by the critical crack-length value mentioned above. The fracture point, defined by the equation ''crack-length=critical value'' and hence the residual life, can be evaluated using this model and verified by in-service inspection. The physical model described is randomized by considering all the parameters of the model as random. Monte Carlo methods are applied and fatigue crack initiation and growth is simulated. This permits evaluation of the reliability and residual life of the component. The distributions of material and load-time history parameters are needed for such simulation. Both the deterministic and computer-simulated probabilistic predictions of reliability and residual life are verified by prior-to-failure sequential testing of data coming from in-service NDT periodical inspections. (author)
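    The randomized damage-cumulation model described above can be sketched in miniature. The following is a hedged illustration, not the authors' model: it samples initial flaw size and a Paris-law growth coefficient, uses the closed-form (exponent m = 3) Paris integration for cycles-to-failure, and estimates failure probability by Monte Carlo. Every numerical value is hypothetical.

```python
import math
import random

# Illustrative Monte Carlo randomization of a crack-growth model in the
# spirit of the abstract. All parameter values are hypothetical.

def cycles_to_failure(a0, C, stress=200.0, a_crit=0.02):
    """Paris law da/dN = C * dK^3 with dK = stress * sqrt(pi * a),
    integrated in closed form from initial crack size a0 to a_crit (metres)."""
    k = C * stress ** 3 * math.pi ** 1.5
    return 2.0 * (a0 ** -0.5 - a_crit ** -0.5) / k

def failure_probability(n_transients, trials=5000, seed=1):
    """Fraction of sampled (flaw, material) pairs failing within n_transients."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        a0 = rng.lognormvariate(math.log(1e-3), 0.3)   # scatter in initial flaw size
        C = rng.lognormvariate(math.log(1e-11), 0.5)   # scatter in growth coefficient
        if cycles_to_failure(a0, C) <= n_transients:
            fails += 1
    return fails / trials

for n in (10_000, 100_000, 1_000_000):
    print(n, failure_probability(n))
```

    As in the abstract, randomizing the deterministic model turns a single residual-life estimate into a failure-probability curve over the number of transient loads.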

  1. DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Damir Kolich

    2016-03-01

    Full Text Available In order to accurately predict the costs of the thousands of interim products that are assembled in shipyards, it is necessary to use skilled engineers to develop detailed Gantt charts for each interim product separately, which takes many hours. It is helpful to develop a prediction tool to estimate the cost of interim products accurately and quickly without the need for skilled engineers. This will drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard or any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are done to make the data “user-friendly” for later prediction processing and the development of both accurate and robust models. The support vector machine is demonstrated to be the better model when there is a lower number of tuples. However, as the number of tuples is increased to over 10000, the artificial neural network model is recommended.

  2. Reliable and accurate point-based prediction of cumulative infiltration using soil readily available characteristics: A comparison between GMDH, ANN, and MLR

    Science.gov (United States)

    Rahmati, Mehdi

    2017-08-01

    Developing accurate and reliable pedo-transfer functions (PTFs) to predict soil non-readily available characteristics is one of the topics of greatest concern in soil science, and selecting appropriate predictors is a crucial factor in PTF development. The group method of data handling (GMDH), which finds an approximate relationship between a set of input and output variables, not only provides an explicit procedure to select the most essential PTF input variables, but also results in more accurate and reliable estimates than other commonly applied methodologies. Therefore, the current research aimed to apply GMDH, in comparison with multivariate linear regression (MLR) and artificial neural networks (ANN), to develop several PTFs to predict soil cumulative infiltration on a point basis at specific time intervals (0.5-45 min) using soil readily available characteristics (RACs). In this regard, soil infiltration curves as well as several soil RACs including soil primary particles (clay (CC), silt (Si), and sand (Sa)), saturated hydraulic conductivity (Ks), bulk (Db) and particle (Dp) densities, organic carbon (OC), wet-aggregate stability (WAS), electrical conductivity (EC), and soil antecedent (θi) and field saturated (θfs) water contents were measured at 134 different points in the Lighvan watershed, northwest of Iran. Then, applying the GMDH, MLR, and ANN methodologies, several PTFs were developed to predict cumulative infiltration using two sets of selected soil RACs, including and excluding Ks. According to the test data, results showed that the PTFs developed by the GMDH and MLR procedures using all soil RACs including Ks resulted in more accurate (with E values of 0.673-0.963) and reliable (with CV values lower than 11 percent) predictions of cumulative infiltration at different specific time steps. In contrast, the ANN procedure had lower accuracy (with E values of 0.356-0.890) and reliability (with CV values up to 50 percent) compared to GMDH and MLR. The results also revealed
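    The abstract grades its PTFs by E and CV values. Assuming E is the Nash-Sutcliffe model efficiency and CV the root-mean-square error normalized by the observed mean (the usual conventions in infiltration modeling; the paper may define them slightly differently), a minimal sketch of both metrics:

```python
# Two common goodness-of-fit metrics, assumed to match the abstract's E and CV.

def nash_sutcliffe(observed, predicted):
    """E = 1 - SSE / SST; 1 is perfect, <= 0 is no better than the mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

def coefficient_of_variation(observed, predicted):
    """RMSE normalized by the observed mean, in percent."""
    n = len(observed)
    rmse = (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5
    mean_obs = sum(observed) / n
    return 100.0 * rmse / mean_obs

obs = [2.0, 4.0, 6.0, 8.0, 10.0]    # hypothetical cumulative infiltration (mm)
pred = [2.5, 3.5, 6.0, 8.5, 9.5]
print(round(nash_sutcliffe(obs, pred), 3),
      round(coefficient_of_variation(obs, pred), 1))  # → 0.975 7.5
```

    On this reading, the reported GMDH/MLR results (E up to 0.963, CV below 11 percent) describe predictions that explain most of the observed variance with errors small relative to the mean.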

  3. Reliable test for prenatal prediction of fetal RhD type using maternal plasma from RhD negative women

    DEFF Research Database (Denmark)

    Clausen, Frederik Banch; Krog, Grethe Risum; Rieneck, Klaus

    2005-01-01

    The objective of this study was to establish a reliable test for prenatal prediction of fetal RhD type using maternal plasma from RhD negative women. This test is needed for future prenatal Rh prophylaxis.

  4. reliability reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  5. Predicting human miRNA target genes using a novel evolutionary methodology

    KAUST Repository

    Aigli, Korfiati; Kleftogiannis, Dimitrios A.; Konstantinos, Theofilatos; Spiros, Likothanassis; Athanasios, Tsakalidis; Seferina, Mavroudi

    2012-01-01

    The discovery of miRNAs has had a great impact on traditional biology. Typically, miRNAs have the potential to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. The experimental identification of their targets has many drawbacks, including cost, time, and low specificity, which is why many computational approaches have been developed so far. However, existing computational approaches do not include any advanced feature selection technique, and they face problems concerning their classification performance and their interpretability. In the present paper, we propose a novel hybrid methodology which combines genetic algorithms and support vector machines in order to locate the optimal feature subset while achieving high classification performance. The proposed methodology was compared with two of the most promising existing methodologies on the problem of predicting human miRNA targets. Our approach outperforms existing methodologies in terms of classification performance while selecting a much smaller feature subset. © 2012 Springer-Verlag.
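    The GA-wrapper idea above can be sketched compactly. The record couples the genetic algorithm with an SVM; to keep the sketch dependency-free, a simple nearest-centroid classifier stands in for the SVM, and the synthetic data, fitness penalty, and GA parameters are all invented for illustration.

```python
import random

# GA wrapper for feature-subset selection. Individuals are bitmasks over
# features; fitness is classifier accuracy minus a small per-feature penalty,
# so smaller subsets are preferred. Nearest-centroid replaces the paper's SVM.

def make_data(n=200, n_features=6, seed=0):
    rng = random.Random(seed)
    X, y = [], []
    for i in range(n):
        label = i % 2
        row = [rng.gauss(0, 1) for _ in range(n_features)]
        if label:
            row[0] += 3.0   # only features 0 and 1 carry the class signal
            row[1] += 3.0
        X.append(row)
        y.append(label)
    return X, y

def fitness(mask, X, y, penalty=0.01):
    feats = [j for j, bit in enumerate(mask) if bit]
    if not feats:
        return 0.0
    cent = {c: [sum(X[i][j] for i in range(len(X)) if y[i] == c) /
                sum(1 for lab in y if lab == c) for j in feats] for c in (0, 1)}
    correct = 0
    for i, row in enumerate(X):
        d = {c: sum((row[j] - cent[c][k]) ** 2 for k, j in enumerate(feats))
             for c in (0, 1)}
        correct += (min(d, key=d.get) == y[i])
    return correct / len(X) - penalty * len(feats)

def ga_select(X, y, n_features=6, pop_size=30, gens=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda m: fitness(m, X, y), reverse=True)
        nxt = scored[:2]                               # elitism
        while len(nxt) < pop_size:
            p1, p2 = rng.sample(scored[:10], 2)        # truncation selection
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            if rng.random() < 0.2:                     # bit-flip mutation
                child[rng.randrange(n_features)] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=lambda m: fitness(m, X, y))

X, y = make_data()
print(ga_select(X, y))   # expected to keep the two informative features
```

    The per-feature penalty is the mechanism behind "a much smaller feature subset": accuracy gains from noise features rarely outweigh their cost.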

  7. Miedema model based methodology to predict amorphous-forming-composition range in binary and ternary systems

    Energy Technology Data Exchange (ETDEWEB)

    Das, N., E-mail: nirupamd@barc.gov.in [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Mittra, J. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Murty, B.S. [Department of Metallurgical and Materials Engineering, IIT Madras, Chennai 600 036 (India); Pabi, S.K. [Department of Metallurgical and Materials Engineering, IIT Kharagpur, Kharagpur 721 302 (India); Kulkarni, U.D.; Dey, G.K. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India)

    2013-02-15

    Highlights: ► A methodology was proposed to predict amorphous forming compositions (AFCs). ► Chemical contribution to enthalpy of mixing ∝ enthalpy of amorphous phase for AFCs. ► Accuracy in the prediction of the AFC-range was noticed in the Al-Ni-Ti system. ► Mechanical alloying (MA) results of Al-Ni-Ti followed the predicted AFC-range. ► Earlier MA results of Al-Ni-Ti also conformed to the predicted AFC-range. - Abstract: From the earlier works on the prediction of amorphous forming composition range (AFCR) using the Miedema based model, and also on mechanical alloying experiments, it has been observed that all amorphous forming compositions of a given alloy system fall within a linear band when the chemical contribution to the enthalpy of the solid solution (ΔH^ss) is plotted against the enthalpy of mixing in the amorphous phase (ΔH^amor). On the basis of this observation, a methodology has been proposed in this article to identify the AFCR of a ternary system that is likely to be more precise than what can be obtained using the ΔH^amor - ΔH^ss < 0 criterion. MA experiments on various compositions of the Al-Ni-Ti system, producing amorphous, crystalline, and mixtures of amorphous plus crystalline phases, have been carried out and the phases have been characterized using X-ray diffraction and transmission electron microscopy techniques. Data from the present MA experiments, and also from the literature, have been used to validate the proposed approach. Also, the proximity of compositions producing a mixture of amorphous and crystalline phases to the boundary of the AFCR in the Al-Ni-Ti ternary has been found useful to validate the effectiveness of the prediction.
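    The screening logic above reduces to two checks per composition: the classic enthalpy criterion ΔH^amor - ΔH^ss < 0, refined by membership in the linear band observed in the (ΔH^ss, ΔH^amor) plane. A minimal sketch, with the caveat that the band slope and half-widths below are hypothetical placeholders, not the fitted values from the paper:

```python
# Two-stage amorphous-forming screen suggested by the abstract. The band
# parameters (slope, lower, upper) are invented for illustration; a real
# application would fit them to known amorphous-forming compositions.

def forms_amorphous(dH_ss_chem, dH_amor, slope=1.0, lower=-8.0, upper=2.0):
    """Return True if a composition passes both screens (enthalpies in kJ/mol)."""
    if dH_amor - dH_ss_chem >= 0:        # classic criterion: amorphous phase
        return False                     # must be enthalpically favored
    band_center = slope * dH_ss_chem     # linear band in the (dH_ss, dH_amor) plane
    return lower <= dH_amor - band_center <= upper

print(forms_amorphous(-20.0, -25.0))  # inside the (hypothetical) band → True
print(forms_amorphous(-20.0, -40.0))  # passes the < 0 screen but falls
                                      # outside the band → False
```

    The point of the band is exactly the second example: the simple inequality alone over-predicts the AFCR, and the band membership tightens it.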

  8. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable was low in several models. Only 2 models presented recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. There are numerous HF risk prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.
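    The review's headline metric, the C-statistic, has a simple operational definition for a binary outcome: the probability that a randomly chosen case is assigned a higher predicted risk than a randomly chosen non-case, with ties counting one half. A small self-contained sketch (the risk scores and outcomes are made up):

```python
# Concordance (C-) statistic for a binary outcome; equivalent to the area
# under the ROC curve. Ties in predicted risk contribute 0.5 per pair.

def c_statistic(risks, outcomes):
    cases = [r for r, o in zip(risks, outcomes) if o == 1]
    controls = [r for r, o in zip(risks, outcomes) if o == 0]
    concordant = sum((c > n) + 0.5 * (c == n) for c in cases for n in controls)
    return concordant / (len(cases) * len(controls))

risks = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]   # hypothetical predicted risks
outcomes = [1, 1, 0, 1, 0, 1, 0, 0]                # 1 = developed HF
print(c_statistic(risks, outcomes))  # → 0.8125
```

    On this scale, the review's threshold of 0.70 marks "acceptable" discrimination; 0.5 is a coin flip.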

  9. Development of core technology for KNGR system design; development of quantitative reliability evaluation methodologies of KNGR digital I and C components

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Choi, Jong Gyun; Kim, Ung Soo; Kim, Jong Hyun; Kim, Man Cheol; Lee, Seung Jun; Lee, Young Je; Ha, Jun Soo [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2002-03-01

    For digital systems to be applied in the nuclear industry, with its uniquely conservative approach to safety, reliability assessment of digital systems is a prerequisite. However, because digital systems show failure modes different from those of existing analog systems, the existing reliability assessment methods cannot be applied to them, which means that a new reliability assessment method for digital systems should be developed. The goal of this study is the development of a board-level reliability assessment method for digital systems and a related software tool. To achieve this goal, we have conducted research on the development of a database of hardware components for digital I and C systems, the development of a reliability assessment model for the board-level reliability prediction of digital systems, and the applicability to KNGR digital I and C systems. We developed a database for reliability assessment of digital hardware components, a reliability assessment method for digital systems that considers software and hardware together, and a software tool for the reliability assessment of digital systems, named RelPredic. We plan to apply the results of this study to the reliability assessment of digital systems in KNGR digital I and C systems. 13 refs., 71 figs., 31 tabs. (Author)

  10. How reliable must advanced nondestructive testing be? A concept for the prediction, validation and raised quality of NDT

    International Nuclear Information System (INIS)

    Nockemann, C.; Tillack, G.R.; Schnitger, D.; Heidt, H.

    1995-01-01

    A concept is proposed for the harmonious integration of the following three mainstays of NDT reliability: 1. Theoretical prediction of reliability as a function of physical parameters, by computer modelling of the test problem concerned and the NDT process, with maximisation by variation of the parameters. 2. Experimental evaluation of the reliability of NDT processes by the application of statistical methods to testing practice. 3. Increasing reliability by the combination of several NDT methods in a standard DV environment, with European interconnection and provision of a distributed databank system, and international exchange of experience via telecommunication. (orig.) [de

  11. Development of a methodology for conducting an integrated HRA/PRA --. Task 1, An assessment of human reliability influences during LP&S conditions PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., McLean, VA (United States)

    1993-06-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  12. Preeclampsia prediction in type 1 diabetes and diurnal blood pressure methodology

    DEFF Research Database (Denmark)

    Lauszus, Finn

    2016-01-01

    of the papers with the best, validated methodology on BP measurements, which is by no means guaranteed in numerous recent publications. Inherent characteristics of the measurements to be considered are reproducibility, consistency, precision, and trend over the scale of measurement. Studies on these issues suggest.... Preeclampsia is associated with urinary albumin excretion rate, reduced night/day ratio, and elevated diurnal blood pressure from the first trimester onwards. However, due to blunting of the diurnal variation, the night/day rhythm provides no good prediction of preeclampsia. Diurnal measurement is a valuable...

  13. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
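    The order-dependence the authors observe can be shown with a toy version of their setting. In the sketch below (all rates invented), each node of a debugging graph is the set of faults removed so far, and the program's failure intensity at a node is the sum of the remaining faults' individual rates; two debugging paths then hand a reliability model two very different data sequences.

```python
# Toy debugging-graph walk: different fault-removal orders produce different
# observed failure-intensity sequences, even though start and end agree.
# Individual fault rates are hypothetical.

rates = {"A": 0.50, "B": 0.30, "C": 0.15, "D": 0.05}  # failures per unit time

def intensity_along(order):
    """Failure intensity after each removal along one path of the graph."""
    remaining = dict(rates)
    seq = [sum(remaining.values())]
    for fault in order:
        del remaining[fault]
        seq.append(sum(remaining.values()))
    return [round(v, 2) for v in seq]

print(intensity_along("ABCD"))  # big faults first → [1.0, 0.5, 0.2, 0.05, 0.0]
print(intensity_along("DCBA"))  # small faults first → [1.0, 0.95, 0.8, 0.5, 0.0]
```

    Every path starts at 1.0 and ends at 0.0, but the interior points differ, so a model fitted along one path can predict poorly on another, which mirrors the paper's finding that faults fail at different rates and that those rates depend on the debugging stage at which they are evaluated.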

  14. Investigations and technical reviews on the reliability of prediction for migration behavior of radionuclides (H15)

    International Nuclear Information System (INIS)

    Tachikawa, Hirokazu

    2004-02-01

    The research plan for validation of the effects of colloids and organic materials, drawn up by the Japan Nuclear Fuel Cycle Development Institute, and its research outcomes were reviewed comprehensively by an expert committee established in the Nuclear Safety Research Association. Additionally, experimental investigations of the migration behavior of actinide elements and fission products in engineered barrier and natural barrier media, and of their solution chemistry, were carried out and discussed by the committee, in order to enhance the reliability of predictions of the migration behavior of radionuclides. The subjects investigated by the expert committee are as follows: (1) Research on solubility products of An(III) hydroxide. (2) Diffusion and electromigration behavior of plutonium in buffer material. (3) Analysis of the nuclide solubility in compacted bentonite. (4) Survey of the actual contamination by alpha emitters in steel materials. (author)

  15. Quantitative trait loci markers derived from whole genome sequence data increases the reliability of genomic prediction

    DEFF Research Database (Denmark)

    Brøndum, Rasmus Froberg; Su, Guosheng; Janss, Luc

    2015-01-01

    This study investigated the effect on the reliability of genomic prediction when a small number of significant variants from single marker analysis based on whole genome sequence data were added to the regular 54k single nucleotide polymorphism (SNP) array data. The extra markers were selected...... with the aim of augmenting the custom low-density Illumina BovineLD SNP chip (San Diego, CA) used in the Nordic countries. The single-marker analysis was done breed-wise on all 16 index traits included in the breeding goals for Nordic Holstein, Danish Jersey, and Nordic Red cattle plus the total merit index...... itself. Depending on the trait’s economic weight, 15, 10, or 5 quantitative trait loci (QTL) were selected per trait per breed and 3 to 5 markers were selected to tag each QTL. After removing duplicate markers (same marker selected for more than one trait or breed) and filtering for high pairwise linkage...

  16. Reflow Process Parameters Analysis and Reliability Prediction Considering Multiple Characteristic Values

    Directory of Open Access Journals (Sweden)

    Guo Yu

    2016-01-01

    Full Text Available As a major step in surface mount technology, the reflow process is the key factor affecting the quality of the final product. The setting parameters and the characteristic values of the temperature curve show a nonlinear relationship, so the impacts of the parameters on the characteristic values are analyzed and a parameter adjustment process based on orthogonal experiments is proposed in this paper. First, the setting parameters are determined and the orthogonal test is designed according to production conditions. Then each characteristic value of the temperature profile is calculated. Further, a multi-index orthogonal experiment is analyzed to acquire the setting parameters that have the greatest impact on PCBA product quality. Finally, reliability prediction is carried out considering the main influencing parameters, providing a theoretical basis for parameter adjustment and product quality evaluation in the engineering process.
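    The orthogonal-experiment analysis described above can be sketched at toy scale: run an L4(2^3) array, then compare the mean response at each factor level to rank the factors by influence. The factor names and response values below are invented, not taken from the paper.

```python
# Main-effects analysis on an L4(2^3) orthogonal array, in the spirit of the
# record's parameter study. Factors and responses are hypothetical.

L4 = [  # columns: conveyor speed, zone temperature, cooling rate (levels 0/1)
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
response = [4.2, 2.8, 5.1, 3.9]   # e.g. peak-temperature deviation; smaller is better

def main_effects(array, y):
    """Per-column difference between mean response at level 1 and level 0."""
    effects = []
    for col in range(len(array[0])):
        lo = [y[i] for i, run in enumerate(array) if run[col] == 0]
        hi = [y[i] for i, run in enumerate(array) if run[col] == 1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

effects = main_effects(L4, response)
ranked = sorted(range(3), key=lambda c: -abs(effects[c]))
print([round(e, 2) for e in effects], "most influential column:", ranked[0])
```

    Because the array is balanced, each column's effect estimate averages out the other factors, which is what lets four runs rank three factors; the largest-magnitude effect identifies the setting parameter to control most tightly.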

  17. Prediction method of long-term reliability in improving residual stresses by means of surface finishing

    International Nuclear Information System (INIS)

    Sera, Takehiko; Hirano, Shinro; Chigusa, Naoki; Okano, Shigetaka; Saida, Kazuyoshi; Mochizuki, Masahito; Nishimoto, Kazutoshi

    2012-01-01

    Surface finishing methods, such as Water Jet Peening (WJP), have been applied to welds in some major components of nuclear power plants as a countermeasure to Primary Water Stress Corrosion Cracking (PWSCC). In addition, the method of surface finishing (buffing treatment) is being standardized, and thus the buffing treatment has also been recognized as a well-established method of improving stress. On the other hand, the long-term stability of peening techniques has been confirmed by accelerated tests. However, the effectiveness of stress improvement by surface treatment is limited to thin layers, and the effect of the complicated residual stress distribution in the weld metal beneath the surface is not strictly taken into account for long-term stability. This paper, therefore, describes accelerated tests which confirmed that the long-term stability of the layer subjected to buffing treatment was equal to that subjected to WJP. The long-term reliability of the very thin stress-improved layer was also confirmed through a trial evaluation by thermal elastic-plastic creep analysis, even if the effect of the complicated residual stress distribution in the weld metal was excessively taken into account. Considering the above findings, an approach is proposed for constructing a prediction method for the long-term reliability of stress improvement by surface finishing. (author)

  18. A ductile fracture mechanics methodology for predicting pressure vessel and piping failure

    International Nuclear Information System (INIS)

    Landes, J.D.; Zhou, Z.

    1991-01-01

    This paper reports on a ductile fracture methodology, based on one used more generally for the prediction of fracture behavior, that was applied to predicting the fracture behavior of pressure vessel and piping components. The model uses the load versus displacement record from a fracture toughness test to develop inputs for predicting the behavior of the structural component. The principle of load separation is used to convert the test record into two pieces of information: calibration functions, which describe the structural deformation behavior, and fracture toughness, which describes the response of a crack-like flaw to the loading. These calibration functions and fracture toughness values, which relate to the test specimen, are then transformed to those appropriate to the structure. Computational procedures could often be used in this step but are not always necessary. The calibration functions and fracture toughness for the structure are recombined to predict a load versus displacement behavior for the structure. The input for the model was generated from tests of compact specimen geometries; this geometry is often used for fracture toughness testing. The predictions were done for five model structures.

  19. Comparison of two stochastic techniques for reliable urban runoff prediction by modeling systematic errors

    DEFF Research Database (Denmark)

    Del Giudice, Dario; Löwe, Roland; Madsen, Henrik

    2015-01-01

    In urban rainfall-runoff, commonly applied statistical techniques for uncertainty quantification mostly ignore systematic output errors originating from simplified models and erroneous inputs. Consequently, the resulting predictive uncertainty is often unreliable. Our objective is to present two approaches which use stochastic processes to describe systematic deviations and to discuss their advantages and drawbacks for urban drainage modeling. The two methodologies are an external bias description (EBD) and an internal noise description (IND, also known as stochastic gray-box modeling). They emerge from different fields and have not yet been compared in environmental modeling. To compare the two approaches, we develop a unifying terminology, evaluate them theoretically, and apply them to conceptual rainfall-runoff modeling in the same drainage system. Our results show that both approaches can...

  20. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    Science.gov (United States)

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration, whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported, with key details frequently not presented. The validation studies were characterised by poor design and inappropriate handling

  1. Anterior Cruciate Ligament Tear: Reliability of MR Imaging to Predict Stability after Conservative Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Hye Won; Ahn, Jin Hwan; Ahn, Joong Mo; Yoon, Young Cheol; Hong, Hyun Pyo; Yoo, So Young; Kim, Seon Woo [Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)

    2007-06-15

    The aim of this study is to evaluate the reliability of MR imaging to predict the stability of the torn anterior cruciate ligament (ACL) after complete recovery of the ligament's continuity. Twenty patients with 20 knee injuries (13 males and 7 females; age range, 20-54 years) were enrolled in the study. The inclusion criteria were a positive history of acute trauma, diagnosis of the ACL tear by both the physical examination and the MR imaging at the initial presentation, conservative treatment, complete recovery of the continuity of the ligament on the follow-up (FU) MR images and availability of the KT-2000 measurements. Two radiologists, who worked in consensus, graded the MR findings using a 3-point system for the signal intensity, sharpness, straightness and the thickness of the healed ligament. The insufficiency of the ACL was categorized into three groups according to the KT-2000 measurements. The statistical correlations between the grades of the MR findings and the degrees of ACL insufficiency were analyzed using the Cochran-Mantel-Haenszel test (p < 0.05). The p-values for each category of the MR findings according to the different groups of the KT-2000 measurements were 0.9180 for the MR signal intensity, 1.0000 for sharpness, 0.5038 for straightness and 0.2950 for thickness of the ACL. The MR findings were not significantly different between the different KT-2000 groups. MR imaging itself is not a reliable examination to predict the stability of the ACL rupture outcome, even when the MR images show an intact appearance of the ACL.

  2. Preparation of methodology for reliability analysis of selected digital segments of the instrumentation and control systems of NPPs. Pt. 1

    International Nuclear Information System (INIS)

    Hustak, S.; Patrik, M.; Babic, P.

    2000-12-01

    The report is structured as follows: (i) Introduction; (ii) Important notions relating to the safety and dependability of software systems for nuclear power plants (selected notions from IAEA Technical Report No. 397; safety aspects of software application; reliability/dependability aspects of digital systems); (iii) Peculiarities of digital systems and ways to a dependable performance of the required function (failures in the system and principles of defence against them; ensuring resistance of digital systems against failures at various hardware and software levels); (iv) The issue of analytical procedures to assess the safety and reliability of safety-related digital systems (safety and reliability assessment at an early stage of the project; general framework of reliability analysis of complex systems; choice of an appropriate quantitative measure of software reliability); (v) Selected qualitative and quantitative information about the reliability of digital systems (the use of relations between the incidence of various types of faults); and (vi) Conclusions and recommendations. (P.A.)

  3. A novel registration-based methodology for prediction of trabecular bone fabric from clinical QCT: A comprehensive analysis.

    Directory of Open Access Journals (Sweden)

    Vimal Chandran

    Full Text Available Osteoporosis leads to hip fractures in aging populations and is diagnosed by modern medical imaging techniques such as quantitative computed tomography (QCT). Hip fracture sites involve trabecular bone, whose strength is determined by volume fraction and orientation, known as fabric. However, bone fabric cannot be reliably assessed in clinical QCT images of the proximal femur. Accordingly, we propose a novel registration-based estimation of bone fabric designed to preserve tensor properties of bone fabric and to map bone fabric by a global and local decomposition of the gradient of a non-rigid image registration transformation. Furthermore, no comprehensive analysis on the critical components of this methodology has been previously conducted. Hence, the aim of this work was to identify the best registration-based strategy to assign bone fabric to the QCT image of a patient's proximal femur. The normalized correlation coefficient and curvature-based regularization were used for image-based registration, and the Frobenius norm of the stretch tensor of the local gradient was selected to quantify the distance among the proximal femora in the population. First, based on this distance, the closest, farthest and mean femora, with a distinction of sex, were chosen as alternative atlases to evaluate their influence on bone fabric prediction. Second, we analyzed different tensor mapping schemes for bone fabric prediction: identity, rotation-only, rotation and stretch tensor. Third, we investigated the use of a population average fabric atlas. A leave-one-out (LOO) evaluation study was performed with a dual QCT and HR-pQCT database of 36 pairs of human femora. The quality of the fabric prediction was assessed with three metrics: the tensor norm (TN) error, the degree of anisotropy (DA) error and the angular deviation of the principal tensor direction (PTD). The closest femur atlas (CTP) with a full rotation (CR) for fabric mapping delivered the best results with a TN error of 7
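The "rotation-only" mapping scheme compared in this study transports a fabric tensor with the rotation part of the local deformation, M' = R M Rᵀ. A minimal 2D sketch with invented values (the paper works with full 3D deformation gradients; this only illustrates the tensor transport rule):

```python
import math

def rotate_tensor_2d(m, theta):
    """Transport a symmetric 2x2 fabric tensor by M' = R M R^T,
    where R is a rotation by angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    r = [[c, -s], [s, c]]
    # R * M
    rm = [[sum(r[i][k] * m[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    # (R M) * R^T
    return [[sum(rm[i][k] * r[j][k] for k in range(2)) for j in range(2)]
            for i in range(2)]

# hypothetical anisotropic fabric tensor: stiff along x, softer along y
fabric = [[2.0, 0.0], [0.0, 1.0]]
out = rotate_tensor_2d(fabric, math.pi / 2)   # rotate by 90 degrees
print([[round(v, 6) for v in row] for row in out])  # [[1.0, 0.0], [0.0, 2.0]]
```

As expected, a 90-degree rotation swaps the principal directions while preserving the eigenvalues, which is exactly the tensor property the registration-based mapping is designed to keep.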

  4. Quantitative dynamic reliability evaluation of AP1000 passive safety systems by using FMEA and GO-FLOW methodology

    International Nuclear Information System (INIS)

    Hashim Muhammad; Yoshikawa, Hidekazu; Matsuoka, Takeshi; Yang Ming

    2014-01-01

    The passive safety systems utilized in advanced pressurized water reactor (PWR) designs such as the AP1000 should be more reliable than the active safety systems of conventional PWRs, because they offer fewer opportunities for hardware failures and human errors (less human intervention). The objectives of the present study are to evaluate the dynamic reliability of the AP1000 plant in order to check the effectiveness of its passive safety systems by comparing reliability-related issues with those of active safety systems in the event of major accidents: how should the dynamic reliability of passive safety systems be properly evaluated, and how do the reliability results of the AP1000 passive safety systems compare with those of the active safety systems of a conventional PWR? For this purpose, single-loop models of the AP1000 passive core cooling system (PXS) and passive containment cooling system (PCCS) are assumed separately for quantitative reliability evaluation. The transient behaviors of these passive safety systems are examined under a large-break loss-of-coolant accident in the cold leg. The analysis utilizes the qualitative failure mode and effect analysis (FMEA) method to identify potential failure modes, and the success-oriented reliability analysis tool GO-FLOW for quantitative reliability evaluation. The GO-FLOW analysis has been conducted separately for the PXS and PCCS under the same accident. The results show that the reliability of the AP1000 passive safety systems (PXS and PCCS) is increased by the redundancy and diversity of passive safety subsystems and components, and that the four-stage automatic depressurization system is the key subsystem for successful actuation of the PXS and PCCS. The PCCS of the AP1000 is more reliable than the containment spray system of a conventional PWR, and the GO-FLOW method can be utilized for reliability evaluation of passive safety systems. (author)
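The role of redundancy noted in this abstract can be illustrated with the standard k-out-of-n success model for independent, identical trains. This is a generic textbook sketch, not the GO-FLOW model itself, and the train success probability is invented:

```python
from math import comb

def k_out_of_n(k, n, p_train):
    """Probability that at least k of n independent trains, each with
    success probability p_train, operate successfully."""
    return sum(comb(n, i) * p_train**i * (1 - p_train)**(n - i)
               for i in range(k, n + 1))

p = 0.99  # hypothetical single-train success probability
print(f"1-of-2: {k_out_of_n(1, 2, p):.6f}")   # 0.999900
print(f"2-of-4: {k_out_of_n(2, 4, p):.10f}")
```

Even a modest single-train reliability yields a much higher system reliability once a second redundant train is added, which is the qualitative effect the GO-FLOW quantification confirms for the PXS and PCCS subsystems.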

  5. Methodology for predicting oily mixture properties in the mathematical modeling of molecular distillation

    Directory of Open Access Journals (Sweden)

    M. F. Gayol

    2017-06-01

    Full Text Available A methodology is presented for predicting the thermodynamic and transport properties of a multi-component oily mixture, in which the different mixture components are grouped into a small number of pseudocomponents. This prediction of properties is used in the mathematical modeling of molecular distillation, which consists of a system of partial differential equations based on the principles of Transport Phenomena and is solved by an implicit finite difference method using a computer code. The mathematical model was validated with experimental data, specifically the molecular distillation of a deodorizer distillate (DD) of sunflower oil. The results obtained were satisfactory, with errors of less than 10% with respect to the experimental data in a temperature range in which it is possible to apply the proposed method.
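The numerical building block named in this abstract, an implicit finite-difference step, can be sketched for a single 1D diffusion equation solved with the Thomas (tridiagonal) algorithm. The actual model couples transport equations for several pseudocomponents; this hypothetical fragment with invented values shows only the core solver step:

```python
def implicit_diffusion_step(u, d, dt, dx):
    """One backward-Euler step of u_t = d * u_xx with fixed (Dirichlet)
    boundary values, solved with the Thomas algorithm. The interior rows
    satisfy (1 + 2r) u_i - r u_{i-1} - r u_{i+1} = u_i_old, r = d*dt/dx^2."""
    n = len(u)
    r = d * dt / dx**2
    a = [-r] * n          # sub-diagonal (multiplies x[i-1])
    b = [1 + 2 * r] * n   # main diagonal
    c = [-r] * n          # super-diagonal (multiplies x[i+1])
    # boundary rows are identity rows: values stay fixed
    b[0] = b[-1] = 1.0
    c[0] = a[-1] = 0.0
    rhs = list(u)
    # forward elimination
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        rhs[i] -= m * rhs[i - 1]
    # back substitution
    x = [0.0] * n
    x[-1] = rhs[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (rhs[i] - c[i] * x[i + 1]) / b[i]
    return x

u0 = [0.0, 0.0, 1.0, 0.0, 0.0]   # initial concentration spike
u1 = implicit_diffusion_step(u0, d=1.0, dt=0.1, dx=1.0)
print([round(v, 4) for v in u1])  # [0.0, 0.0704, 0.8451, 0.0704, 0.0]
```

The implicit discretization is unconditionally stable, which is why it is a common choice for stiff transport models like the one described here.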

  6. Methodology for predicting oily mixture properties in the mathematical modeling of molecular distillation

    International Nuclear Information System (INIS)

    Gayol, M.F.; Pramparo, M.C.; Miró Erdmann, S.M.

    2017-01-01

    A methodology is presented for predicting the thermodynamic and transport properties of a multi-component oily mixture, in which the different mixture components are grouped into a small number of pseudocomponents. This prediction of properties is used in the mathematical modeling of molecular distillation, which consists of a system of partial differential equations based on the principles of Transport Phenomena and is solved by an implicit finite difference method using a computer code. The mathematical model was validated with experimental data, specifically the molecular distillation of a deodorizer distillate (DD) of sunflower oil. The results obtained were satisfactory, with errors of less than 10% with respect to the experimental data in a temperature range in which it is possible to apply the proposed method.

  7. A New Methodology Based on Imbalanced Classification for Predicting Outliers in Electricity Demand Time Series

    Directory of Open Access Journals (Sweden)

    Francisco Javier Duque-Pintor

    2016-09-01

    Full Text Available The occurrence of outliers in real-world phenomena is quite common. If these anomalous data are not properly treated, unreliable models can be generated. Many approaches in the literature are focused on a posteriori detection of outliers. However, a new methodology to a priori predict the occurrence of such data is proposed here. Thus, the main goal of this work is to predict the occurrence of outliers in time series by using, for the first time, imbalanced classification techniques. In this sense, the problem of forecasting outlying data has been transformed into a binary classification problem, in which the positive class represents the occurrence of outliers. Given that the number of outliers is much lower than the number of common values, the resultant classification problem is imbalanced. To create the training and test sets, robust statistical methods have been used to detect outliers in both sets. Once the outliers have been detected, the instances of the dataset are labeled accordingly. Namely, if any of the samples composing the next instance is detected as an outlier, the label is set to one. As a case study, the methodology has been tested on electricity demand time series from the Spanish electricity market, in which most of the outliers were properly forecast.
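The labeling step described in this abstract can be sketched with one common robust criterion: flag values farther than k scaled median absolute deviations (MAD) from the median. The paper does not specify which robust method it used, so this is an assumed, illustrative choice with invented demand values:

```python
def robust_outlier_labels(series, k=3.0):
    """Binary outlier labels using the median +/- k * 1.4826 * MAD rule
    (1.4826 makes the MAD consistent with a normal standard deviation)."""
    s = sorted(series)
    n = len(s)
    median = (s[n // 2] + s[(n - 1) // 2]) / 2
    devs = sorted(abs(x - median) for x in series)
    mad = (devs[n // 2] + devs[(n - 1) // 2]) / 2
    return [1 if abs(x - median) > k * 1.4826 * mad else 0 for x in series]

demand = [100, 102, 98, 101, 250, 99, 103, 97]   # 250 is an injected anomaly
print(robust_outlier_labels(demand))  # [0, 0, 0, 0, 1, 0, 0, 0]
```

The resulting 0/1 labels form the (heavily imbalanced) target that an imbalanced classifier is then trained to predict one step ahead.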

  8. A Duration Prediction Using a Material-Based Progress Management Methodology for Construction Operation Plans

    Directory of Open Access Journals (Sweden)

    Yongho Ko

    2017-04-01

    Full Text Available Precise and accurate prediction models for duration and cost enable contractors to improve their decision making for effective resource management in terms of sustainability in construction. Previous studies have been limited to cost-based estimations, but this study focuses on a material-based progress management method. Cost-based estimations typically used in construction, such as the earned value method, rely on comparing the planned budget with the actual cost. However, accurately planning budgets requires analysis of many factors, such as the financial status of the sectors involved. Furthermore, there is a higher possibility of changes in the budget than in the total amount of material used during construction, which is deduced from the quantity take-off from drawings and specifications. Accordingly, this study proposes a material-based progress management methodology, which was developed using different predictive analysis models (regression, neural network, and auto-regressive moving average), as well as datasets on material and labor, which can be extracted from daily work reports from contractors. A case study on actual datasets was conducted, and the results show that the proposed methodology can be efficiently used for progress management in construction.
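The simplest of the predictive models named in this abstract, linear regression, can be sketched as an ordinary-least-squares fit of cumulative progress against material quantity placed. The data below are invented for illustration, not from the case study:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

material = [50, 90, 140, 180, 240]    # hypothetical cumulative tonnes placed
progress = [10, 19, 30, 38, 52]       # hypothetical percent complete
m, b = fit_line(material, progress)
print(f"progress ~= {m:.3f} * tonnes + {b:.2f}")
print(f"at 300 t: {m * 300 + b:.1f} % complete")
```

In the methodology itself this regression competes against a neural network and an ARMA model, with material and labor quantities taken from daily work reports serving as the predictors.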

  9. Prediction of Global Damage and Reliability Based Upon Sequential Identification and Updating of RC Structures Subject to Earthquakes

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Skjærbæk, P. S.; Köylüoglu, H. U.

    The paper deals with the prediction of global damage and future structural reliability with special emphasis on sensitivity, bias and uncertainty of these predictions dependent on the statistically equivalent realizations of the future earthquake. The predictions are based on a modified Clough-Johnston single-degree-of-freedom (SDOF) oscillator with three parameters which are calibrated to fit the displacement response and the damage development in the past earthquake.

  10. Assessing the reliability of predictive activity coefficient models for molecules consisting of several functional groups

    Directory of Open Access Journals (Sweden)

    R. P. Gerber

    2013-03-01

    Full Text Available Currently, the most successful predictive models for activity coefficients are those based on functional groups, such as UNIFAC. However, these models require a large amount of experimental data for the determination of their parameter matrix. A more recent alternative is the models based on COSMO, for which only a small set of universal parameters must be calibrated. In this work, a recalibrated COSMO-SAC model was compared with the UNIFAC (Do) model employing experimental infinite dilution activity coefficient data for 2236 non-hydrogen-bonding binary mixtures at different temperatures. As expected, UNIFAC (Do) presented better overall performance, with a mean absolute error of 0.12 ln-units against 0.22 for our COSMO-SAC implementation. However, in cases involving molecules with several functional groups or when functional groups appear in an unusual way, the deviation for UNIFAC was 0.44 as opposed to 0.20 for COSMO-SAC. These results show that COSMO-SAC provides more reliable predictions for multi-functional or more complex molecules, reaffirming its future prospects.
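The comparison metric used in this abstract, mean absolute error in ln-units between predicted and experimental infinite-dilution activity coefficients, is straightforward to compute. The gamma values below are invented placeholders, not data from the paper:

```python
import math

def mae_ln_units(gamma_pred, gamma_exp):
    """Mean absolute error of ln(gamma) between predictions and
    experimental values, the unit used to compare UNIFAC and COSMO-SAC."""
    return sum(abs(math.log(p) - math.log(e))
               for p, e in zip(gamma_pred, gamma_exp)) / len(gamma_pred)

pred = [1.8, 5.2, 0.9]   # hypothetical predicted infinite-dilution gammas
expt = [2.0, 4.5, 1.0]   # hypothetical experimental values
print(round(mae_ln_units(pred, expt), 3))  # 0.118
```

Working in ln-units puts large and small activity coefficients on a comparable footing, which is why both the 0.12 and 0.22 figures in the abstract are quoted that way.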

  11. On the reliability of predictions of geomechanical response - project Cosa in perspective

    International Nuclear Information System (INIS)

    Knowles, N.C.; Lowe, M.J.S.; Come, B.

    1990-01-01

    Project COSA (Comparison of computer codes for Salt) was set up by the CEC as an international benchmark exercise to compare the reliability of predictions of the thermo-mechanical response of HLW repositories in salt. The first phase (COSA I) was conducted between 1984 and 1986, and attention was directed at code verification issues. The second phase (COSA II), carried out in the period 1986-1988, addressed code validation and other issues. Specifically, a series of experimental heat and pressure tests carried out at the Asse Mine in West Germany were modelled and predictions of the thermo-mechanical behaviour were compared. Ten European organisations participated. A key feature of this exercise was that, as far as possible, the calculations were performed blind (i.e. without any knowledge of the observed behaviour) using the best information available a priori to describe the physical situation to be modelled. Interest centred around the various constitutive models (of material behaviour) for rock-salt and the assumptions about the in situ state of stress. The paper gives an overview of the project, presents some broad conclusions and attempts to assess their significance. 17 refs., 6 figs., 2 tabs

  12. Advanced error-prediction LDPC with temperature compensation for highly reliable SSDs

    Science.gov (United States)

    Tokutomi, Tsukasa; Tanakamaru, Shuhei; Iwasaki, Tomoko Ogura; Takeuchi, Ken

    2015-09-01

    To improve the reliability of NAND Flash memory based solid-state drives (SSDs), error-prediction LDPC (EP-LDPC) has been proposed for multi-level-cell (MLC) NAND Flash memory (Tanakamaru et al., 2012, 2013), which is effective for long retention times. However, EP-LDPC is not as effective for triple-level cell (TLC) NAND Flash memory, because TLC NAND Flash has higher error rates and is more sensitive to program-disturb error. Therefore, advanced error-prediction LDPC (AEP-LDPC) has been proposed for TLC NAND Flash memory (Tokutomi et al., 2014). AEP-LDPC can correct errors more accurately by precisely describing the error phenomena. In this paper, the effects of AEP-LDPC are investigated in a 2×nm TLC NAND Flash memory with temperature characterization. Compared with LDPC-with-BER-only, the SSD's data-retention time is increased by 3.4× and 9.5× at room-temperature (RT) and 85 °C, respectively. Similarly, the acceptable BER is increased by 1.8× and 2.3×, respectively. Moreover, AEP-LDPC can correct errors with pre-determined tables made at higher temperatures to shorten the measurement time before shipping. Furthermore, it is found that one table can cover behavior over a range of temperatures in AEP-LDPC. As a result, the total table size can be reduced to 777 kBytes, which makes this approach more practical.

  13. On which term is the prediction of the behaviour of glass necessary and reliable?

    International Nuclear Information System (INIS)

    Lefevre, J.

    1997-01-01

    The author questions the ethics of the deep underground storage of high-level radioactive wastes. The time periods that are considered for the confinement are so long that it is completely impossible to predict the way of life of people at that time and the level of knowledge they will have reached. There is total agreement about the ethical principle of not jeopardizing the life and environment of future generations, but the difficulty is to draw the limits of this protection. In the regulations of most countries, 2 periods of time are defined: the first 500 years and 10,000 years. 500 years is the period of high heat release due to the decay of most fission products and is also a reasonable time during which the confining site (structures and packages) stays accessible. 10,000 years is considered as the period of time during which predictions are reliable; beyond this time uncertainties become too important and more and more numerous. (A.C.)

  14. A generic statistical methodology to predict the maximum pit depth of a localized corrosion process

    International Nuclear Information System (INIS)

    Jarrah, A.; Bigerelle, M.; Guillemot, G.; Najjar, D.; Iost, A.; Nianga, J.-M.

    2011-01-01

    Highlights: → We propose a methodology to predict the maximum pit depth in a corrosion process. → Generalized Lambda Distribution and the Computer Based Bootstrap Method are combined. → GLD fits a large variety of distributions both in their central and tail regions. → Minimum thickness preventing perforation can be estimated with a safety margin. → Considering its applications, this new approach can help to size industrial pieces. - Abstract: This paper outlines a new methodology to predict accurately the maximum pit depth related to a localized corrosion process. It combines two statistical methods: the Generalized Lambda Distribution (GLD), to determine a model of distribution fitting with the experimental frequency distribution of depths, and the Computer Based Bootstrap Method (CBBM), to generate simulated distributions equivalent to the experimental one. In comparison with conventionally established statistical methods that are restricted to the use of inferred distributions constrained by specific mathematical assumptions, the major advantage of the methodology presented in this paper is that both the GLD and the CBBM enable a statistical treatment of the experimental data without making any preconceived choice neither on the unknown theoretical parent underlying distribution of pit depth which characterizes the global corrosion phenomenon nor on the unknown associated theoretical extreme value distribution which characterizes the deepest pits. Considering an experimental distribution of depths of pits produced on an aluminium sample, estimations of maximum pit depth using a GLD model are compared to similar estimations based on the usual Gumbel and Generalized Extreme Value (GEV) methods proposed in the corrosion engineering literature. The GLD approach is shown to have smaller bias and dispersion in the estimation of the maximum pit depth than the Gumbel approach both for its realization and mean. This leads to comparing the GLD approach to the GEV one
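The bootstrap idea behind the CBBM can be sketched in simplified form: resample the measured pit depths with replacement, record the maximum of each replicate, and take a high quantile of those maxima as a design value. This hypothetical sketch uses plain empirical resampling in place of the paper's GLD fit, and the depth values are invented:

```python
import random

def bootstrap_max_depth(depths, n_boot=2000, quantile=0.95, seed=42):
    """High quantile of the distribution of resampled maxima; a crude
    stand-in for the GLD-based estimate of the deepest pit."""
    rng = random.Random(seed)
    maxima = sorted(max(rng.choices(depths, k=len(depths)))
                    for _ in range(n_boot))
    return maxima[int(quantile * (n_boot - 1))]

depths = [12, 15, 9, 22, 18, 25, 11, 30, 14, 20]  # pit depths, micrometres
print(bootstrap_max_depth(depths))  # 30 here: the sample maximum dominates
```

The limitation this simplified version exposes is exactly the one the GLD addresses: empirical resampling can never produce a pit deeper than the deepest one observed, whereas a fitted tail distribution can extrapolate beyond the sample.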

  15. Development of a predictive methodology for identifying high radon exhalation potential areas

    International Nuclear Information System (INIS)

    Ielsch, G.

    2001-01-01

    Radon 222 is a radioactive natural gas originating from the decay of radium 226, which itself originates from the decay of uranium 238 naturally present in rocks and soil. Inhalation of radon gas and its decay products is a potential health risk for man. Radon can accumulate in confined environments such as buildings, and is responsible for one third of the total radiological exposure of the general public to radiation. The problem of how to manage this risk then arises. The main difficulty encountered is due to the large variability of exposure to radon across the country. A prediction needs to be made of areas with the highest density of buildings with high radon levels. Exposure to radon varies depending on the degree of confinement of the habitat, the lifestyle of the occupants and particularly emission of radon from the surface of the soil on which the building is built. The purpose of this thesis is to elaborate a methodology for determining areas presenting a high potential for radon exhalation at the surface of the soil. The methodology adopted is based on quantification of radon exhalation at the surface, starting from a precise characterization of the main local geological and pedological parameters that control the radon source and its transport to the ground/atmosphere interface. The methodology proposed is innovative in that it combines a cartographic analysis, parameters integrated into a Geographic Information System, and a simplified model for vertical transport of radon by diffusion through pores in the soil. This methodology has been validated on two typical areas, in different geological contexts, and gives forecasts that generally agree with field observations. This makes it possible to identify areas with a high exhalation potential within a range of a few square kilometers. (author)

  16. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits the widespread use of digital systems in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary of the two phases is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic.

  17. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits the widespread use of digital systems in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary of the two phases is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the dynamic safety system (DSS) shows that the estimated reliability of the system is quite reasonable and realistic. (author)
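The high-level phase described in this abstract combines subsystem reliabilities through fault-tree gates. A generic sketch (assuming independent failures; the subsystem probabilities below are invented, not from the paper) shows how AND/OR gates propagate failure probabilities to the top event:

```python
def or_gate(*p_fail):
    """Top event fails if ANY input fails (independence assumed)."""
    q = 1.0
    for p in p_fail:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*p_fail):
    """Top event fails only if ALL inputs fail (independence assumed)."""
    q = 1.0
    for p in p_fail:
        q *= p
    return q

# hypothetical subsystem failure probabilities, as would be supplied
# by the low-level (software control flow) analysis
p_cpu, p_sw, p_io = 1e-4, 5e-4, 2e-4
# redundant I/O pair feeding a single processing chain
p_top = or_gate(p_cpu, p_sw, and_gate(p_io, p_io))
print(f"system failure probability: {p_top:.3e}")
print(f"system reliability: {1 - p_top:.6f}")
```

Note how the redundant AND-gated pair contributes only p_io squared, so the OR of the series elements dominates the top-event probability.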

  18. Reliability of steam-turbine rotors. Task 1. Lifetime prediction analysis system. Final report

    International Nuclear Information System (INIS)

    Nair, P.K.; Pennick, H.G.; Peters, J.E.; Wells, C.H.

    1982-12-01

    Task 1 of RP 502, Reliability of Steam Turbine Rotors, resulted in the development of a computerized lifetime prediction analysis system (STRAP) for the automatic evaluation of rotor integrity based upon the results of a boresonic examination of near-bore defects. Concurrently, an advanced boresonic examination system (TREES), designed to acquire data automatically for lifetime analysis, was developed and delivered to the maintenance shop of a major utility. This system and a semi-automated, state-of-the-art system (BUCS) were evaluated on two retired rotors as part of the Task 2 effort. A modified nonproprietary version of STRAP, called SAFER, is now available for rotor lifetime prediction analysis. STRAP and SAFER share a common fracture analysis postprocessor for rapid evaluation of either conventional boresonic amplitude data or TREES cell data. The final version of this postprocessor contains general stress intensity correlations for elliptical cracks in a radial stress gradient and provision for elastic-plastic instability of the ligament between an imbedded crack and the bore surface. Both linear elastic and ligament rupture models were developed for rapid analysis of linkup within three-dimensional clusters of defects. Bore stress-rupture criteria are included, but a creep-fatigue crack growth data base is not available. Physical and mechanical properties of air-melt 1CrMoV forgings are built into the program; however, only bounding values of fracture toughness versus temperature are available. Owing to the lack of data regarding the probability of flaw detection for the boresonic systems and of quantitative verification of the flaw linkup analysis, automatic evaluation of boresonic results is not recommended, and the lifetime prediction system is currently restricted to conservative, deterministic analysis of specified flaw geometries

  19. Assimilation of satellite data to increase the reliability of the wave predictions in the Black Sea

    Science.gov (United States)

    Rusu, Liliana; Raileanu, Alina

    2015-04-01

    The results provided by the present work show that the data assimilation approach implemented leads to a significant enhancement of the reliability of the numerical wave predictions. The work is still ongoing, and besides the fact that all the satellites are now considered for assimilation, the scheme is also adapted in order to be able to provide forecast products. Keywords: Black Sea, wave models, SWAN, data assimilation, satellite data. ACKNOWLEDGEMENT: This work was supported by a grant of the Romanian Ministry of National Education, CNCS - UEFISCDI, project number PN-II-ID-PCE-2012-4-0089 (project DAMWAVE).

  20. HitPredict version 4: comprehensive reliability scoring of physical protein–protein interactions from more than 100 species

    OpenAIRE

    López, Yosvany; Nakai, Kenta; Patil, Ashwini

    2015-01-01

    HitPredict is a consolidated resource of experimentally identified, physical protein–protein interactions with confidence scores to indicate their reliability. The study of genes and their inter-relationships using methods such as network and pathway analysis requires high quality protein–protein interaction information. Extracting reliable interactions from most of the existing databases is challenging because they either contain only a subset of the available interactions, or a mixture of p...

  1. A pediatric FOUR score coma scale: interrater reliability and predictive validity.

    Science.gov (United States)

    Czaikowski, Brianna L; Liang, Hong; Stewart, C Todd

    2014-04-01

    The Full Outline of UnResponsiveness (FOUR) Score is a coma scale that consists of four components (eye and motor response, brainstem reflexes, and respiration). It was originally validated among the adult population and recently in a pediatric population. To enhance clinical assessment of pediatric intensive care unit patients, including those intubated and/or sedated, at our children's hospital, we modified the FOUR Score Scale for this population. This modified scale would provide many of the same advantages as the original, such as interrater reliability, simplicity, and elimination of the verbal component that is not compatible with the Glasgow Coma Scale (GCS), creating a more valuable neurological assessment tool for the nursing community. Our goal was to potentially provide greater information than the formerly used GCS when assessing critically ill, neurologically impaired patients, including those sedated and/or intubated. Experienced pediatric intensive care unit nurses were trained as "expert raters." Two different nurses assessed each subject using the Pediatric FOUR Score Scale (PFSS), GCS, and Richmond Agitation Sedation Scale at three different time points. Data were compared with the Pediatric Cerebral Performance Category (PCPC) assessed by another nurse. Our hypothesis was that the PFSS and PCPC should highly correlate and the GCS and PCPC should correlate less strongly. Study results show that the PFSS has excellent interrater reliability for trained nurse-rater pairs and predicts poor outcome and in-hospital mortality under various situations, but there were no statistically significant differences between the PFSS and the GCS. However, the PFSS does have the potential to provide greater neurological assessment in the intubated and/or sedated patient based on the outcomes of our study.
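Interrater reliability of the kind reported in this study is commonly quantified with Cohen's kappa, which corrects raw agreement for agreement expected by chance. The two rater score lists below are invented for illustration; they are not data from the study:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same subjects:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    cats = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# hypothetical scale scores assigned by two nurse raters to 8 patients
a = [4, 3, 4, 2, 4, 1, 3, 4]
b = [4, 3, 3, 2, 4, 1, 3, 4]
print(round(cohens_kappa(a, b), 3))  # 0.818
```

A kappa above roughly 0.8 is conventionally read as near-perfect agreement, which is the kind of threshold an "excellent interrater reliability" claim would be benchmarked against.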

  2. The Biopharmaceutics Classification System: subclasses for in vivo predictive dissolution (IPD) methodology and IVIVC.

    Science.gov (United States)

    Tsume, Yasuhiro; Mudie, Deanna M; Langguth, Peter; Amidon, Greg E; Amidon, Gordon L

    2014-06-16

    The Biopharmaceutics Classification System (BCS) has found widespread utility in drug discovery, product development and drug product regulatory sciences. The classification scheme captures the two most significant factors influencing oral drug absorption: solubility and intestinal permeability, and it has proven to be a very useful and widely accepted starting point for drug product development and drug product regulation. The mechanistic base of the BCS approach has, no doubt, contributed to its widespread acceptance and utility. Nevertheless, underneath the simplicity of BCS are many detailed complexities, both in vitro and in vivo, which must be evaluated and investigated for any given drug and drug product. In this manuscript we propose a simple extension of the BCS classes to include sub-specification of acid (a), base (b) and neutral (c) for classes II and IV. Sub-classification for Classes I and III (high solubility drugs as currently defined) is generally not needed except perhaps in borderline solubility cases. It is well known that the pKa physical property of a drug (API) has a significant impact on the aqueous solubility and dissolution of drug from the drug product both in vitro and in vivo for BCS Class II and IV acids and bases, and this is the basis, we propose, for a sub-classification extension of the original BCS classification. This BCS sub-classification is particularly important for in vivo predictive dissolution methodology development due to the complex and variable in vivo environment in the gastrointestinal tract, with its changing pH, buffer capacity, luminal volume, surfactant luminal conditions, permeability profile along the gastrointestinal tract and variable transit and fasted and fed states. We believe this sub-classification is a step toward developing a more science-based mechanistic in vivo predictive dissolution (IPD) methodology. Such a dissolution methodology can be used by development scientists to assess the likelihood of a
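The proposed sub-classification can be expressed as a small decision function: derive the BCS class from the solubility/permeability flags, then append an a/b/c suffix (acid, base, neutral) for Classes II and IV as the authors propose. This is a hypothetical sketch of the scheme's logic, not code from the paper:

```python
def bcs_subclass(high_solubility, high_permeability, ionization):
    """Return a BCS class string with the proposed a/b/c suffix for
    Classes II and IV (ionization: 'acid', 'base' or 'neutral')."""
    base_class = {(True, True): "I", (False, True): "II",
                  (True, False): "III", (False, False): "IV"}[
                  (high_solubility, high_permeability)]
    suffix = {"acid": "a", "base": "b", "neutral": "c"}[ionization]
    # sub-classification is only proposed for the low-solubility classes
    return base_class + (suffix if base_class in ("II", "IV") else "")

print(bcs_subclass(False, True, "acid"))   # IIa: a poorly soluble acid
print(bcs_subclass(True, True, "base"))    # I: no suffix for Class I
```

The suffix matters because an acidic Class II drug dissolves better at intestinal pH than at gastric pH, while a basic one behaves the opposite way, so an in vivo predictive dissolution test must be set up differently for IIa and IIb compounds.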

  3. Improving Predictions with Reliable Extrapolation Schemes and Better Understanding of Factorization

    Science.gov (United States)

    More, Sushant N.

    New insights into the inter-nucleon interactions, developments in many-body technology, and the surge in computational capabilities has led to phenomenal progress in low-energy nuclear physics in the past few years. Nonetheless, many calculations still lack a robust uncertainty quantification which is essential for making reliable predictions. In this work we investigate two distinct sources of uncertainty and develop ways to account for them. Harmonic oscillator basis expansions are widely used in ab-initio nuclear structure calculations. Finite computational resources usually require that the basis be truncated before observables are fully converged, necessitating reliable extrapolation schemes. It has been demonstrated recently that errors introduced from basis truncation can be taken into account by focusing on the infrared and ultraviolet cutoffs induced by a truncated basis. We show that a finite oscillator basis effectively imposes a hard-wall boundary condition in coordinate space. We accurately determine the position of the hard-wall as a function of oscillator space parameters, derive infrared extrapolation formulas for the energy and other observables, and discuss the extension of this approach to higher angular momentum and to other localized bases. We exploit the duality of the harmonic oscillator to account for the errors introduced by a finite ultraviolet cutoff. Nucleon knockout reactions have been widely used to study and understand nuclear properties. Such an analysis implicitly assumes that the effects of the probe can be separated from the physics of the target nucleus. This factorization between nuclear structure and reaction components depends on the renormalization scale and scheme, and has not been well understood. But it is potentially critical for interpreting experiments and for extracting process-independent nuclear properties. 
We use a class of unitary transformations called the similarity renormalization group (SRG) transformations to
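    The infrared extrapolation described above has a well-known leading-order form, E(L) ≈ E∞ + a₀·exp(−2·k∞·L), where L is the effective hard-wall radius induced by the truncated basis. The NumPy sketch below is illustrative only: the constants, the L grid, and the scan-and-linearize fitting procedure are invented for this example, not taken from the work.

    ```python
    import numpy as np

    # Synthetic truncated-basis energies following the leading-order
    # infrared form E(L) = E_inf + a0 * exp(-2 * k_inf * L).
    # All numbers are invented so the recovered constants can be checked.
    E_inf_true, a0, k_inf = -28.0, 40.0, 1.0
    L = np.linspace(6.0, 10.0, 9)                    # effective radii (fm)
    E = E_inf_true + a0 * np.exp(-2.0 * k_inf * L)

    # Fit by scanning E_inf and linearizing:
    # ln(E - E_inf) = ln(a0) - 2 * k_inf * L
    best = None
    for E_inf in np.arange(-28.5, -27.5, 0.001):
        resid = E - E_inf
        if (resid <= 0).any():                       # log undefined: skip
            continue
        slope, intercept = np.polyfit(L, np.log(resid), 1)
        sse = np.sum((np.log(resid) - (slope * L + intercept)) ** 2)
        if best is None or sse < best[0]:
            best = (sse, E_inf, -slope / 2.0)

    _, E_inf_fit, k_fit = best                       # asymptotic energy, decay constant
    ```

    On exact synthetic data the scan recovers the asymptotic energy and decay constant essentially perfectly; on real truncated-basis results the quality of the fit itself becomes part of the uncertainty estimate.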

  4. Introducing the MINDER research project: Methodologies for Improvement of Non-residential buildings' Daily Energy Efficiency Reliability

    OpenAIRE

    Berker, Thomas; Gansmo, Helen Jøsok; Junghans, Antje

    2014-01-01

    In the Norwegian building sector, we are currently witnessing the transition from a realization gap - the gap between the availability of solutions and their implementation - to a reliability gap: the gap between the building's potential performance as commissioned and its actual performance in daily use. When new solutions do not live up to their promises, it is not only the performance of the individual building that is at stake. The reliability gap can easily grow into a credibility g...

  5. Predicting the accumulated number of plugged tubes in a steam generator using statistical methodologies

    International Nuclear Information System (INIS)

    Ferng, Y.-M.; Fan, C.N.; Pei, B.S.; Li, H.-N.

    2008-01-01

    A steam generator (SG) plays a significant role not only with respect to the primary-to-secondary heat transfer but also as a fission product barrier to prevent the release of radionuclides. Tube plugging is an efficient way to avoid releasing radionuclides when SG tubes are severely degraded. However, this remedial action may decrease the SG heat transfer capability, especially under transient or accident conditions. It is therefore crucial for the plant staff to understand the trend in plugged tubes for SG operation and maintenance. Statistical methodologies are proposed in this paper to predict this trend. The accumulated number of SG plugged tubes versus operation time is predicted using the Weibull and log-normal distributions, which correspond well with plant measured data from a selected pressurized water reactor (PWR). With the help of these predictions, the accumulated number of SG plugged tubes can be reasonably extrapolated to the 40-year operation lifetime (or even beyond 40 years) of a PWR. This information can assist plant policymakers in determining whether or when an SG must be replaced.
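    As a rough illustration of the extrapolation idea (not the paper's fitted parameters), the accumulated number of plugged tubes can be modelled with a Weibull cumulative distribution and evaluated at the 40-year mark. Every number below is invented:

    ```python
    import numpy as np

    # Illustrative Weibull model for the accumulated number of plugged tubes.
    # eta (scale, years), beta (shape) and the tube count are assumptions,
    # not values from the cited PWR data.
    eta, beta = 60.0, 2.2
    n_tubes = 5000                      # total tubes in the steam generator

    def plugged(t_years):
        """Expected accumulated plugged tubes after t_years of operation."""
        return n_tubes * (1.0 - np.exp(-(t_years / eta) ** beta))

    at_40 = plugged(40.0)               # extrapolated value at the 40-year lifetime
    ```

    With fitted parameters in hand, the same one-liner gives the extrapolation beyond 40 years that the abstract mentions.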

  6. Improved Methodology of Weather Window Prediction for Offshore Operations Based on Probabilities of Operation Failure

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    2017-01-01

    The offshore wind industry is building and planning new wind farms further offshore due to increasing demand on sustainable energy production and already occupied prime resource locations closer to shore. Costs of operation and maintenance, transport and installation of offshore wind turbines...... already contribute significantly to the cost of produced electricity and will continue to increase, due to moving further offshore, if the current techniques of predicting offshore wind farm accessibility are to stay the same. The majority of offshore operations are carried out by specialized ships...... that must be hired for the duration of the operation. Therefore, offshore wind farm accessibility and costs of offshore activities are primarily driven by the expected number of operational hours offshore and waiting times for weather windows, suitable for offshore operations. Having more reliable weather...

  7. Power Prediction Model for Turning EN-31 Steel Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    M. Hameedullah

    2010-01-01

    Full Text Available Power consumption in turning EN-31 steel (a material extensively used in the automotive industry) with a tungsten carbide tool under different cutting conditions was experimentally investigated. The experimental runs were planned according to a 2⁴ + 8 added centre point factorial design of experiments, replicated thrice. The data collected were statistically analyzed using the Analysis of Variance technique, and first-order and second-order power consumption prediction models were developed using response surface methodology (RSM). It is concluded that the second-order model is more accurate than the first-order model and fits well with the experimental data. The model can be used in the automotive industries for deciding the cutting parameters for minimum power consumption and hence maximum productivity.

  8. Feasibility Study on the Satellite Rainfall Data for Prediction of Sediment- Related Disaster by the Japanese Prediction Methodology

    Science.gov (United States)

    Shimizu, Y.; Ishizuka, T.; Osanai, N.; Okazumi, T.

    2014-12-01

    In this study, the sediment-related disaster prediction method currently practiced in Japan, which is based on ground-gauged rainfall data, was coupled with satellite rainfall data and applied to domestic large-scale sediment-related disasters. The study confirmed the feasibility of this integrated method. In Asia, large-scale sediment-related disasters that can sweep away an entire settlement occur frequently. Leyte Island suffered a huge landslide in 2004, and Typhoon Morakot in 2009 caused huge landslides in Taiwan. In the event of such sediment-related disasters, immediate responses by central and local governments are crucial for crisis management. In general, there are not enough rainfall gauge stations in developing countries, so national and local governments have little information with which to determine the risk level of water-induced disasters in their service areas. In the Japanese methodology, a criterion is set by combining two indices: a short-term rainfall index and a long-term rainfall index. The short-term rainfall index is defined as the 60-minute total rainfall; the long-term rainfall index is the soil-water index, an estimate of how much fallen rainfall is retained in the soil. In July 2009, a high-density sediment-related disaster, a debris flow, occurred in Hofu City of Yamaguchi Prefecture, in the western region of Japan. This event was calculated by the Japanese standard methodology and then analyzed for feasibility. Hourly satellite-based rainfall underestimates rainfall compared with ground-based data, but the long-term indices correlate well with each other. Therefore, this study confirmed that it is possible to deliver information on the risk level of sediment-related disasters such as shallow landslides and debris flows. The prediction method tested in this study is expected to assist timely emergency responses to rainfall-induced natural disasters in sparsely gauged areas. As the Global Precipitation Measurement (GPM) Plan

  9. A new methodology for predicting flow induced vibration in industrial components

    International Nuclear Information System (INIS)

    Gay, N.

    1997-12-01

    Flow induced vibration damage is a major concern for designers and operators of industrial components. For example, nuclear power plant operators currently have to deal with such flow induced vibration problems in steam generator tube bundles, control rods or nuclear fuel assemblies. Some methodologies have thus been recently proposed to obtain an accurate description of flow induced vibration phenomena. These methodologies are based on unsteady semi-analytical models for fluid-dynamic forces, associated with non-dimensional fluid force coefficients generally obtained from experiments. The aim is to determine the forces induced by the flow on the structure, and then to take account of these forces to derive the dynamic behaviour of the component under flow excitation. The approach is based on a general model for fluid-dynamic forces, using several non-dimensional parameters that cannot be reached through computation. These parameters are then determined experimentally on simplified test sections, representative of the component, of the flow and of the fluid-elastic coupling phenomena. Predictive computations for the industrial component can then be performed for various operating configurations, by applying laws of similarity. The major physical mechanisms involved in complex fluid-structure interaction phenomena have been understood and modelled. (author)

  10. Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCabe, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-10-16

    This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, The National Center of Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.

  11. Life prediction methodology for thermal-mechanical fatigue and elevated temperature creep design

    Science.gov (United States)

    Annigeri, Ravindra

    Nickel-based superalloys are used for hot section components of gas turbine engines. Life prediction techniques are necessary to assess service damage in superalloy components resulting from thermal-mechanical fatigue (TMF) and elevated temperature creep. A new TMF life model based on continuum damage mechanics has been developed and applied to IN 738 LC substrate material with and without coating. The model also characterizes TMF failure in bulk NiCoCrAlY overlay and NiAl aluminide coatings. The inputs to the TMF life model are mechanical strain range, hold time, peak cycle temperatures and maximum stress measured from the stabilized or mid-life hysteresis loops. A viscoplastic model is used to predict the stress-strain hysteresis loops. A flow rule used in the viscoplastic model characterizes the inelastic strain rate as a function of the applied stress and a set of three internal stress variables known as back stress, drag stress and limit stress. Test results show that the viscoplastic model can reasonably predict time-dependent stress-strain response of the coated material and stress relaxation during hold times. In addition to the TMF life prediction methodology, a model has been developed to characterize the uniaxial and multiaxial creep behavior. An effective stress defined as the applied stress minus the back stress is used to characterize the creep recovery and primary creep behavior. The back stress has terms representing strain hardening, dynamic recovery and thermal recovery. Whenever the back stress is greater than the applied stress, the model predicts a negative creep rate observed during multiple stress and multiple temperature cyclic tests. The model also predicted the rupture time and the remaining life that are important for life assessment. The model has been applied to IN 738 LC, Mar-M247, bulk NiCoCrAlY overlay coating and 316 austenitic stainless steel. 
The proposed model predicts creep response with reasonable accuracy for a wide range of
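    The effective-stress idea in the record above (creep rate driven by the applied stress minus the back stress, with the rate reversing when the back stress dominates) can be sketched in a few lines. This is a generic Norton-type power law with invented parameter values, not the paper's calibrated viscoplastic model:

    ```python
    def creep_rate(sigma_applied, sigma_back, A=1e-9, n=5.0):
        """Norton-type creep rate (1/h) driven by the effective stress (MPa).

        A and n are illustrative material constants, not values for
        IN 738 LC or any alloy from the study.
        """
        sigma_eff = sigma_applied - sigma_back
        # The sign convention reproduces the behaviour described above:
        # whenever the back stress exceeds the applied stress, the
        # effective stress (and hence the creep rate) goes negative.
        sign = 1.0 if sigma_eff >= 0.0 else -1.0
        return sign * A * abs(sigma_eff) ** n

    forward = creep_rate(300.0, 100.0)   # loaded: forward creep
    reverse = creep_rate(80.0, 100.0)    # after partial unload: negative creep
    ```

    In the full model the back stress itself evolves through strain hardening, dynamic recovery and thermal recovery terms; here it is simply an input.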

  12. Regression methodology in groundwater composition estimation with composition predictions for Romuvaara borehole KR10

    Energy Technology Data Exchange (ETDEWEB)

    Luukkonen, A.; Korkealaakso, J.; Pitkaenen, P. [VTT Communities and Infrastructure, Espoo (Finland)

    1997-11-01

    Teollisuuden Voima Oy selected five investigation areas for preliminary site studies (1987–1992). The more detailed site investigation project, launched at the beginning of 1993 and presently supervised by Posiva Oy, concentrates on three investigation areas. Romuvaara at Kuhmo is one of the present target areas, and the geochemical, structural and hydrological data used in this study are extracted from there. The aim of the study is to develop suitable methods for groundwater composition estimation based on a group of known hydrogeological variables. The input variables used are related to the host type of groundwater, hydrological conditions around the host location, mixing potentials between different types of groundwater, and minerals equilibrated with the groundwater. The output variables are electrical conductivity, Ca, Mg, Mn, Na, K, Fe, Cl, S, HS, SO{sub 4}, alkalinity, {sup 3}H, {sup 14}C, {sup 13}C, Al, Sr, F, Br and I concentrations, and pH of the groundwater. The methodology is to associate the known hydrogeological conditions (i.e. input variables) with the known water compositions (output variables), and to evaluate mathematical relations between these groups. Output estimations are done with two separate procedures: partial least squares regressions on the principal components of input variables, and by training neural networks with input-output pairs. Coefficients of linear equations and trained networks are optional methods for actual predictions. The quality of the output predictions is monitored with confidence limit estimations, evaluated from input variable covariances and output variances, and with charge balance calculations. Groundwater compositions in Romuvaara borehole KR10 are predicted at 10 metre intervals with both prediction methods. 46 refs.
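    The first procedure in the record above (regression on the principal components of the input variables) can be illustrated with a minimal principal-component regression in NumPy. The data, the number of retained components, and all variable names are invented stand-ins, not the Romuvaara inputs:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Invented stand-ins: 60 samples of 5 hydrogeological inputs driven by
    # two latent factors, and one output (e.g. a Cl concentration).
    latent = rng.normal(size=(60, 2))
    X = latent @ rng.normal(size=(2, 5)) + 0.05 * rng.normal(size=(60, 5))
    y = X @ np.array([1.5, -2.0, 0.3, 0.5, -0.7]) + 0.1 * rng.normal(size=60)

    # 1. Centre the inputs and take the leading principal components.
    X_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - X_mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = 3                                   # components to retain (a choice)
    scores = Xc @ Vt[:k].T                  # projections onto the components

    # 2. Least-squares regression on the component scores.
    beta, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)

    def predict(X_new):
        return (X_new - X_mean) @ Vt[:k].T @ beta + y_mean

    r2 = 1.0 - np.sum((y - predict(X)) ** 2) / np.sum((y - y_mean) ** 2)
    ```

    Partial least squares proper chooses components using the output as well as the inputs, but the project-then-regress structure is the same.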

  13. Thermal Protection for Mars Sample Return Earth Entry Vehicle: A Grand Challenge for Design Methodology and Reliability Verification

    Science.gov (United States)

    Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.

    2017-01-01

    Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron diameter escaping into the Earth environment be lower than 1 in 1,000,000 for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1 in 1,000, and the demonstrated reliability for previous human Earth return systems was closer to 1 in 100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit by MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design which allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to survive ground impact without breaking apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.

  14. Progress in Methodologies for the Assessment of Passive Safety System Reliability in Advanced Reactors. Results from the Coordinated Research Project on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors

    International Nuclear Information System (INIS)

    2014-09-01

    Strong reliance on inherent and passive design features has become a hallmark of many advanced reactor designs, including several evolutionary designs and nearly all advanced small and medium sized reactor (SMR) designs. Advanced nuclear reactor designs incorporate several passive systems in addition to active ones — not only to enhance the operational safety of the reactors but also to eliminate the possibility of serious accidents. Accordingly, the assessment of the reliability of passive safety systems is a crucial issue to be resolved before their extensive use in future nuclear power plants. Several physical parameters affect the performance of a passive safety system, and their values at the time of operation are unknown a priori. The functions of passive systems are based on basic physical laws and thermodynamic principles, and they may not experience the same kinds of failures as active systems. Hence, consistent efforts are required to qualify the reliability of passive systems. To support the development of advanced nuclear reactor designs with passive systems, investigations into their reliability using various methodologies are being conducted in several Member States with advanced reactor development programmes. These efforts include reliability methods for passive systems by the French Atomic Energy and Alternative Energies Commission, reliability evaluation of passive safety systems by the University of Pisa, Italy, and assessment of passive system reliability by the Bhabha Atomic Research Centre, India. These different approaches seem to demonstrate a consensus on some aspects. However, the developers of the approaches have been unable to agree on the definition of reliability in a passive system. Based on these developments and in order to foster collaboration, the IAEA initiated the Coordinated Research Project (CRP) on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors in 2008. The

  15. HitPredict version 4: comprehensive reliability scoring of physical protein-protein interactions from more than 100 species.

    Science.gov (United States)

    López, Yosvany; Nakai, Kenta; Patil, Ashwini

    2015-01-01

    HitPredict is a consolidated resource of experimentally identified, physical protein-protein interactions with confidence scores to indicate their reliability. The study of genes and their inter-relationships using methods such as network and pathway analysis requires high quality protein-protein interaction information. Extracting reliable interactions from most of the existing databases is challenging because they either contain only a subset of the available interactions, or a mixture of physical, genetic and predicted interactions. Automated integration of interactions is further complicated by varying levels of accuracy of database content and lack of adherence to standard formats. To address these issues, the latest version of HitPredict provides a manually curated dataset of 398 696 physical associations between 70 808 proteins from 105 species. Manual confirmation was used to resolve all issues encountered during data integration. For improved reliability assessment, this version combines a new score derived from the experimental information of the interactions with the original score based on the features of the interacting proteins. The combined interaction score performs better than either of the individual scores in HitPredict as well as the reliability score of another similar database. HitPredict provides a web interface to search proteins and visualize their interactions, and the data can be downloaded for offline analysis. Data usability has been enhanced by mapping protein identifiers across multiple reference databases. Thus, the latest version of HitPredict provides a significantly larger, more reliable and usable dataset of protein-protein interactions from several species for the study of gene groups. Database URL: http://hintdb.hgc.jp/htp. © The Author(s) 2015. Published by Oxford University Press.

  16. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  17. Hybrid intelligent methodology to design translation invariant morphological operators for Brazilian stock market prediction.

    Science.gov (United States)

    Araújo, Ricardo de A

    2010-12-01

    This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. Predicting losing and gaining river reaches in lowland New Zealand based on a statistical methodology

    Science.gov (United States)

    Yang, Jing; Zammit, Christian; Dudley, Bruce

    2017-04-01

    The phenomenon of losing and gaining in rivers normally takes place in lowlands, where there are often various, sometimes conflicting, uses for water resources, e.g., agriculture, industry, recreation, and maintenance of ecosystem function. To better support water allocation decisions, it is crucial to understand the location and seasonal dynamics of these losses and gains. We present a statistical methodology to predict losing and gaining river reaches in New Zealand based on 1) information surveys with surface water and groundwater experts from regional government, 2) a collection of river/watershed characteristics, including climate, soil and hydrogeologic information, and 3) the random forests technique. The surveys on losing and gaining reaches were conducted face-to-face at 16 New Zealand regional government authorities, and climate, soil, river geometry, and hydrogeologic data from various sources were collected and compiled to represent river/watershed characteristics. The random forests technique was used to build the statistical relationship between river reach status (gain and loss) and river/watershed characteristics, and then to predict the status of Strahler order one river reaches without prior losing/gaining information. Results show that the model has a classification error of around 10% for "gain" and "loss". The results will assist further research, and water allocation decisions in lowland New Zealand.
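    The bootstrap-and-vote mechanics behind random forests can be caricatured with a bagged ensemble of one-level decision trees. This NumPy sketch uses synthetic "reach" data and majority voting; a real random forest also grows deep trees and subsamples features at each split, and the study itself would have used a full implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fit_stump(X, y):
        """Best single-feature threshold split by training accuracy."""
        best, best_err = None, np.inf
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                left = X[:, j] <= t
                if left.all() or not left.any():
                    continue
                ll = int(round(y[left].mean()))    # majority label per side
                rl = int(round(y[~left].mean()))
                err = np.mean(np.where(left, ll, rl) != y)
                if err < best_err:
                    best_err, best = err, (j, t, ll, rl)
        return best

    def predict_stump(stump, X):
        j, t, ll, rl = stump
        return np.where(X[:, j] <= t, ll, rl)

    def fit_forest(X, y, n_trees=25):
        """Bagging: each stump is trained on a bootstrap resample."""
        n = len(y)
        return [fit_stump(X[rng.integers(0, n, n)][:, :],
                          y[rng.integers(0, n, n)]) if False else
                fit_stump(*(lambda idx: (X[idx], y[idx]))(rng.integers(0, n, n)))
                for _ in range(n_trees)]

    def predict_forest(forest, X):
        votes = np.stack([predict_stump(s, X) for s in forest])
        return (votes.mean(axis=0) >= 0.5).astype(int)   # majority vote

    # Synthetic reaches: 3 invented characteristics; 1 = "gain", 0 = "loss"
    X = rng.normal(size=(200, 3))
    y = (X[:, 0] + 0.3 * X[:, 1] > 0).astype(int)
    forest = fit_forest(X, y)
    acc = np.mean(predict_forest(forest, X) == y)
    ```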

  19. Methodology for predicting ultimate pressure capacity of the ACR-1000 containment structure

    International Nuclear Information System (INIS)

    Saudy, A.M.; Awad, A.; Elgohary, M.

    2006-01-01

    The Advanced CANDU Reactor, or ACR-1000, is developed by Atomic Energy of Canada Limited (AECL) to be the next step in the evolution of the CANDU product line. It is based on proven CANDU technology and incorporates advanced design technologies. The ACR containment structure is an essential element of the overall defence-in-depth approach to reactor safety, and is a physical barrier against the release of radioactive material to the environment. Therefore, it is important to provide a robust design with an adequate margin of safety. One of the key design requirements of the ACR containment structure is to have an ultimate pressure capacity that is at least twice the design pressure. Using standard design codes, the containment structure is expected to behave elastically at least up to 1.5 times the design pressure. Beyond this pressure level, the concrete containment structure with reinforcements and post-tensioning tendons behaves in a highly non-linear manner and exhibits a complex response when cracks initiate and propagate. Predicting the structural non-linear response involves at least two critical features: the structural idealization by the geometry and material property models, and the adopted solution algorithm. Therefore, detailed idealization of the concrete structure is needed in order to accurately predict its ultimate pressure capacity. This paper summarizes the analysis methodology to be carried out to establish the ultimate pressure capacity of the ACR containment structure and to confirm that the structure meets the specified design requirements. (author)

  20. An appraisal of wind speed distribution prediction by soft computing methodologies: A comparative study

    International Nuclear Information System (INIS)

    Petković, Dalibor; Shamshirband, Shahaboddin; Anuar, Nor Badrul; Saboohi, Hadi; Abdul Wahab, Ainuddin Wahid; Protić, Milan; Zalnezhad, Erfan; Mirhashemi, Seyed Mohammad Amin

    2014-01-01

    Highlights: • Probabilistic distribution functions of wind speed. • Two parameter Weibull probability distribution. • To build an effective prediction model of distribution of wind speed. • Support vector regression application as probability function for wind speed. - Abstract: The probabilistic distribution of wind speed is among the more significant wind characteristics in examining wind energy potential and the performance of wind energy conversion systems. When the wind speed probability distribution is known, the wind energy distribution can be easily obtained. Therefore, the probability distribution of wind speed is a very important piece of information required in assessing wind energy potential. For this reason, a large number of studies have been established concerning the use of a variety of probability density functions to describe wind speed frequency distributions. Although the two-parameter Weibull distribution comprises a widely used and accepted method, solving the function is very challenging. In this study, the polynomial and radial basis functions (RBF) are applied as the kernel function of support vector regression (SVR) to estimate two parameters of the Weibull distribution function according to previously established analytical methods. Rather than minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound, so as to achieve generalized performance. According to the experimental results, enhanced predictive accuracy and capability of generalization can be achieved using the SVR approach compared to other soft computing methodologies
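    For context on the analytical estimates the SVR models are trained against: the two Weibull parameters (shape k, scale c) can be recovered from wind-speed samples by linearizing the CDF, since ln(−ln(1−F)) = k·ln(v) − k·ln(c). A NumPy sketch on synthetic data (the sample size, plotting positions, and true parameters are all choices made for this illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic hourly wind speeds from a known Weibull distribution
    # (k: shape, c: scale in m/s) so the recovered values can be checked.
    k_true, c_true = 2.0, 7.5
    v = c_true * rng.weibull(k_true, size=5000)

    # Empirical CDF with median-rank-style plotting positions, then the
    # linearization ln(-ln(1 - F)) = k*ln(v) - k*ln(c).
    v_sorted = np.sort(v)
    F = (np.arange(1, len(v) + 1) - 0.5) / len(v)
    x = np.log(v_sorted)
    z = np.log(-np.log(1.0 - F))

    slope, intercept = np.polyfit(x, z, 1)
    k_est = slope                       # shape parameter
    c_est = np.exp(-intercept / slope)  # scale parameter (m/s)
    ```

    Once k and c are known, the wind-speed (and hence wind-energy) distribution follows directly, which is the point the abstract makes.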

  1. Use of USLE/GIS methodology for predicting soil loss in a semiarid agricultural watershed.

    Science.gov (United States)

    Erdogan, Emrah H; Erpul, Günay; Bayramin, Ilhami

    2007-08-01

    The Universal Soil Loss Equation (USLE) is an erosion model to estimate the average soil loss that would generally result from splash, sheet, and rill erosion on agricultural plots. Recently, use of the USLE has been extended as a useful tool for predicting soil losses and planning control practices in agricultural watersheds through the effective integration of GIS-based procedures that estimate the factor values on a grid cell basis. This study was performed in the Kazan Watershed located in central Anatolia, Turkey, to predict soil erosion risk by the USLE/GIS methodology for planning conservation measures in the site. Rain erosivity (R), soil erodibility (K), and cover management factor (C) values of the model were calculated from the erosivity map, soil map, and land use map of Turkey, respectively. R values were site-specifically corrected using DEM and climatic data. The topographical and hydrological effects on soil loss were characterized by the LS factor, evaluated with the flow accumulation tool using DEM and watershed delineation techniques. From the resulting soil loss map of the watershed, the magnitude of soil erosion was estimated in terms of the different soil units and land uses, and the most erosion-prone areas, where irreversible soil losses occurred, were reasonably located in the Kazan watershed. This could be very useful for deciding restoration practices to control soil erosion at the sites most severely affected.
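    Once the factor grids are in place, the per-cell USLE computation is a simple raster product, A = R·K·LS·C·P. The 3×3 grids and the hotspot threshold below are invented for illustration; a real application would read the rasters from the GIS layers the record describes:

    ```python
    import numpy as np

    # Invented 3x3 factor grids (metric-unit convention, A in t/ha/yr).
    R  = np.full((3, 3), 650.0)                 # rain erosivity
    K  = np.array([[0.20, 0.30, 0.30],
                   [0.20, 0.25, 0.30],
                   [0.15, 0.20, 0.25]])         # soil erodibility
    LS = np.array([[0.5, 1.2, 2.0],
                   [0.4, 1.0, 1.8],
                   [0.3, 0.8, 1.5]])            # slope length/steepness
    C  = np.full((3, 3), 0.15)                  # cover management
    P  = np.ones((3, 3))                        # support practice (none)

    A = R * K * LS * C * P                      # per-cell soil loss, t/ha/yr
    hotspots = A > 50.0                         # flag the most erosion-prone cells
    ```

    Thresholding the soil-loss raster is one simple way to locate the "most erosion-prone areas" mentioned in the abstract.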

  2. Determination of cadmium relative bioavailability in contaminated soils and its prediction using in vitro methodologies.

    Science.gov (United States)

    Juhasz, Albert L; Weber, John; Naidu, Ravi; Gancarz, Dorota; Rofe, Allan; Todor, Damian; Smith, Euan

    2010-07-01

    In this study, cadmium (Cd) relative bioavailability in contaminated (n = 5) and spiked (n = 2) soils was assessed using an in vivo mouse model following administration of feed containing soil or Cd acetate (reference material) over a 15 day exposure period. Cadmium relative bioavailability varied depending on whether the accumulation of Cd in the kidneys, liver, or kidney plus liver was used for relative bioavailability calculations. When kidney plus liver Cd concentrations were used, Cd relative bioavailability ranged from 10.1 to 92.1%. Cadmium relative bioavailability was higher (14.4-115.2%) when kidney Cd concentrations were used, whereas lower values (7.2-76.5%) were derived when liver Cd concentrations were employed in calculations. Following in vivo studies, four in vitro methodologies (SBRC, IVG, PBET, and DIN), encompassing both gastric and intestinal phases, were assessed for their ability to predict Cd relative bioavailability. Pearson correlations demonstrated a strong linear relationship between Cd relative bioavailability and Cd bioaccessibility (0.62-0.91); however, stronger in vivo-in vitro relationships were observed when Cd relative bioavailability was calculated using kidney plus liver Cd concentrations. While all in vitro assays could predict Cd relative bioavailability with varying degrees of confidence (r² = 0.348-0.835), large y intercepts were calculated for a number of in vitro assays, which is undesirable for in vivo-in vitro predictive models. However, determination of Cd bioaccessibility using the intestinal phase of the PBET assay resulted in a small y intercept (5.14; slope = 1.091) and the best estimate of in vivo Cd relative bioavailability (r² = 0.835).
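    The in vivo-in vitro comparison above is an ordinary linear regression: in vitro bioaccessibility (%) predicting in vivo relative bioavailability (%), judged by its slope, y intercept, and r². The seven (x, y) pairs below are invented to mimic that shape; the abstract's PBET-intestinal fit reportedly gave intercept 5.14 and slope 1.091:

    ```python
    import numpy as np

    # Invented paired measurements (percent scales).
    bioaccessibility = np.array([ 5., 15., 25., 40., 55., 70., 80.])
    rba_in_vivo      = np.array([10., 22., 33., 48., 66., 81., 92.])

    # Degree-1 least-squares fit: slope and y intercept.
    slope, intercept = np.polyfit(bioaccessibility, rba_in_vivo, 1)

    # Coefficient of determination r^2 for the fitted line.
    pred = slope * bioaccessibility + intercept
    ss_res = np.sum((rba_in_vivo - pred) ** 2)
    ss_tot = np.sum((rba_in_vivo - rba_in_vivo.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    ```

    A slope near 1 with a small intercept means the in vitro assay tracks the in vivo value almost one-to-one, which is why the PBET intestinal phase was preferred.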

  3. Simple knowledge-based descriptors to predict protein-ligand interactions. Methodology and validation

    Science.gov (United States)

    Nissink, J. Willem M.; Verdonk, Marcel L.; Klebe, Gerhard

    2000-11-01

    A new type of shape descriptor is proposed to describe the spatial orientation of non-covalent interactions. It is built from simple, anisotropic Gaussian contributions that are parameterised by 10 adjustable values. The descriptors have been used to fit propensity distributions derived from scatter data stored in the IsoStar database. This database holds composite pictures of possible interaction geometries between a common central group and various interacting moieties, as extracted from small-molecule crystal structures. These distributions can be related to probabilities for the occurrence of certain interaction geometries among different functional groups. A fitting procedure is described that generates the descriptors in a fully automated way. For this purpose, we apply a similarity index that is tailored to the problem, the Split Hodgkin Index. It accounts separately for similarity in regions of high and low propensity. Although dependent on the division into these two subregions, the index is robust and performs better than the regular Hodgkin index. The reliability and coverage of the fitted descriptors were assessed using SuperStar. SuperStar usually operates on the raw IsoStar data to calculate propensity distributions, e.g., for a binding site in a protein. For our purpose we modified the code to have it operate on our descriptors instead. This resulted in a substantial reduction in calculation time (by a factor of five to eight) compared to the original implementation. A validation procedure was performed on a set of 130 protein-ligand complexes, using four representative interacting probes to map the properties of the various binding sites: ammonium nitrogen, alcohol oxygen, carbonyl oxygen, and methyl carbon. The predicted 'hot spots' for the binding of these probes were compared to the actual arrangement of ligand atoms in experimentally determined protein-ligand complexes. 
Results indicate that the version of SuperStar that applies to
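The regular Hodgkin index underlying the abstract's Split Hodgkin Index has a standard closed form. A sketch follows, in which the split threshold and the averaging of the two sub-scores are assumptions, since the abstract does not give those details:

```python
# Sketch of the Hodgkin similarity index and a "split" variant along the
# lines described in the abstract: the grid is divided into high- and
# low-propensity regions and the index is evaluated on each part.
# The split threshold and the averaging of the two parts are assumptions.

def hodgkin(a, b):
    """Regular Hodgkin index between two propensity grids (lists)."""
    num = 2 * sum(x * y for x, y in zip(a, b))
    den = sum(x * x for x in a) + sum(y * y for y in b)
    return num / den if den else 0.0

def split_hodgkin(a, b, threshold):
    """Evaluate the Hodgkin index separately on high- and low-propensity
    points of the reference grid `a`, then average the two scores."""
    hi = [(x, y) for x, y in zip(a, b) if x >= threshold]
    lo = [(x, y) for x, y in zip(a, b) if x < threshold]
    parts = [hodgkin([p for p, _ in part], [q for _, q in part])
             for part in (hi, lo) if part]
    return sum(parts) / len(parts)

grid_a = [0.1, 0.2, 0.9, 1.5, 0.05, 1.2]
grid_b = [0.15, 0.1, 1.0, 1.3, 0.0, 1.1]
similarity = split_hodgkin(grid_a, grid_b, threshold=0.5)
```

Scoring the two subregions separately keeps a large, flat low-propensity background from masking disagreement in the sparse high-propensity peaks.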

  4. [Reliability of the PROFUND index to predict 4-year mortality in polypathological patients].

    Science.gov (United States)

    Díez-Manglano, Jesús; Del Corral Beamonte, Esther; Ramos Ibáñez, Rosa; Lambán Aranda, María Pilar; Toyas Miazza, Carla; Rodero Roldán, María Del Mar; Ortiz Domingo, Concepción; Munilla López, Eulalia; de Escalante Yangüela, Begoña

    2016-09-16

    To determine the usefulness of the PROFUND index to assess the risk of overall death after 4 years in polypathological patients. Multicenter prospective cohort study (Internal Medicine and Geriatrics). Polypathological patients admitted between March 1st and June 30th, 2011 were included. For each patient, data concerning age, sex, living at home or in a nursing residence, polypathology categories, Charlson, Barthel and Lawton-Brody indexes, Pfeiffer questionnaire, socio-familial Gijon scale, delirium, number of drugs, and hemoglobin and creatinine values were gathered, and the PROFUND index was calculated. The follow-up lasted 4 years. We included 441 patients, 324 from Internal Medicine and 117 from Geriatrics, with a mean age of 80.9 (8.7) years. Of them, 245 (55.6%) were women. Heart (62.7%), neurological (41.4%) and respiratory (37.3%) diseases were the most frequent. Geriatrics inpatients were older, more dependent, and presented greater cognitive deterioration. After 4 years, 335 (76%) patients had died. Mortality was associated with age, dyspnoea, Barthel index <60, delirium, advanced neoplasia and ≥4 admissions in the last year. The area under the curve of the PROFUND index was 0.748 (95% CI 0.689-0.806, P<.001) in Internal Medicine and 0.517 (95% CI 0.369-0.666, P=.818) in Geriatrics patients, respectively. The PROFUND index is a reliable tool for predicting long-term overall mortality in polypathological patients from Internal Medicine, but not from Geriatrics departments. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
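The reported areas under the curve can be computed as the normalized Mann-Whitney U statistic: the probability that a randomly chosen deceased patient scores higher on the index than a randomly chosen survivor. A sketch with invented index scores:

```python
# AUC of a prognostic index via the rank (Mann-Whitney) formulation.
# Scores below are invented for illustration; they are not PROFUND data.

def auc_mann_whitney(scores_pos, scores_neg):
    """AUC as the normalized Mann-Whitney U statistic (ties count 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

died = [9, 11, 7, 14, 10]      # hypothetical index scores, deceased
survived = [3, 6, 5, 8, 2]     # hypothetical index scores, alive
auc = auc_mann_whitney(died, survived)
```

An AUC near 0.5, like the 0.517 found in Geriatrics, means the index ranks deceased and surviving patients no better than chance.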

  5. Complex proximal humerus fractures: Hertel's criteria reliability to predict head necrosis.

    Science.gov (United States)

    Campochiaro, G; Rebuzzi, M; Baudi, P; Catani, F

    2015-09-01

    The risk of post-traumatic humeral head avascular necrosis (AVN), regardless of the treatment, has a high reported incidence. In 2004, Hertel et al. stated that the most relevant predictors of ischemia after intracapsular fracture treated with osteosynthesis are the calcar length, medial hinge integrity and some specific fracture types. Based on Hertel's model, the purpose of this study is to evaluate both its reliability and weaknesses in our series of 267 fractures, assessing how the anatomical configuration of the fracture, the quality of reduction and its maintenance were predictive of osteonecrosis development, and so to suggest a treatment choice algorithm. A retrospective study, level of evidence IV, was conducted to assess the radiographic features of 267 fractures, treated from 2004 to 2010 with open reduction and internal fixation by angular stability plates and screws, following Hertel's criteria. The average age was 65.2 years. The average follow-up was 28.3 ± 17.0 months. The percentage of AVN and the quality and maintenance of the reduction obtained during surgery were evaluated. The AVN incidence was 3.7 %. No significant correlation with gender, age or fracture type was found. At the last follow-up X-ray, all of Hertel's good predictors were present in only 30 % of the AVN group versus 4.7 % of the non-AVN group. The reduction was poor in 50 % of the AVN group versus 3.4 % of the non-AVN group. Patients with AVN were symptomatic, and three needed a second surgery. Hertel's criteria are important in surgical planning, but they are not sufficient: an accurate evaluation of the calcar area fracture in three planes is required. All fractures involving the calcar area should be studied with CT.

  6. Artificial neural network and response surface methodology modeling in mass transfer parameters predictions during osmotic dehydration of Carica papaya L.

    Directory of Open Access Journals (Sweden)

    J. Prakash Maran

    2013-09-01

    Full Text Available In this study, a comparative approach was made between artificial neural networks (ANN) and response surface methodology (RSM) to predict the mass transfer parameters of osmotic dehydration of papaya. The effects of process variables such as temperature, osmotic solution concentration, and agitation speed on water loss, weight reduction, and solid gain during osmotic dehydration were investigated using a three-level three-factor Box-Behnken experimental design. The same design was utilized to train a feed-forward multilayered perceptron (MLP) ANN with a back-propagation algorithm. The predictive capabilities of the two methodologies were compared in terms of root mean square error (RMSE), mean absolute error (MAE), standard error of prediction (SEP), model predictive error (MPE), chi-square statistic (χ²), and coefficient of determination (R²) based on the validation data set. The results showed that a properly trained ANN model is more accurate in prediction than the RSM model.
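The comparison metrics named above have standard definitions on a validation set of observed versus predicted values. A sketch with made-up numbers (not the papaya data):

```python
# RMSE, MAE and R^2 computed from observed and predicted validation values.
# The numbers here are invented to illustrate the calculation only.
import math

def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def r_squared(obs, pred):
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

observed = [12.1, 15.4, 18.9, 22.3, 25.0]   # e.g. water loss (%)
ann_pred = [12.0, 15.6, 18.7, 22.5, 24.9]   # hypothetical ANN predictions
rsm_pred = [11.5, 16.2, 18.0, 23.1, 24.2]   # hypothetical RSM predictions
```

A model "more accurate in prediction" in the abstract's sense has lower RMSE/MAE and higher R² on the same validation set.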

  7. An overview of the reliability prediction related aspects of high power IGBTs in wind power applications

    DEFF Research Database (Denmark)

    Busca, Christian; Teodorescu, Remus; Blaabjerg, Frede

    2011-01-01

    Reliability is becoming more and more important as the size and number of installed Wind Turbines (WTs) increases. Very high reliability is especially important for offshore WTs because the maintenance and repair of such WTs in case of failures can be very expensive. WT manufacturers need...

  8. Reliability Evaluation on Creep Life Prediction of Alloy 617 for a Very High Temperature Reactor

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Hong, Sung-Deok; Kim, Yong-Wan; Park, Jae-Young; Kim, Seon-Jin

    2012-01-01

    This paper evaluates the reliability of creep rupture life under service conditions of Alloy 617, which is considered one of the candidate materials for use in a very high temperature reactor (VHTR) system. A Z-parameter, which represents the deviation of creep rupture data from the master curve, was used for the reliability analysis of the creep rupture data of Alloy 617. A Service-condition Creep Rupture Interference (SCRI) model, which can consider both the scattering of the creep rupture data and the fluctuations of temperature and stress under any service conditions, was also used for evaluating the reliability of creep rupture life. The statistical analysis showed that the scatter of the creep rupture data, expressed by the Z-parameter, followed a normal distribution. The reliability values decreased rapidly with increasing amplitudes of temperature and stress fluctuations, and the results established that reliability also decreased with increasing service time.
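The Z-parameter idea, deviations of log rupture life from the master curve treated as normally distributed, leads to a simple reliability estimate. The sketch below uses illustrative parameters, not Alloy 617 data, and omits the SCRI model's treatment of temperature and stress fluctuations:

```python
# Reliability as the probability that actual rupture life exceeds the
# intended service time, given normally distributed scatter about the
# master curve. All parameter values are illustrative placeholders.
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def creep_reliability(log_life_master, sigma_z, log_service_time):
    """P(log rupture life > log service time) with N(master, sigma_z) scatter."""
    return 1.0 - normal_cdf(log_service_time, mu=log_life_master, sigma=sigma_z)

# Master-curve prediction: 10^5 h to rupture; scatter sigma_z = 0.15 decades.
r_60k = creep_reliability(5.0, 0.15, math.log10(60_000))
r_95k = creep_reliability(5.0, 0.15, math.log10(95_000))
```

The comparison of the two service times reproduces the abstract's qualitative finding: reliability falls as service time approaches the predicted rupture life.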

  9. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study

    OpenAIRE

    Douglas H. Marin dos Santos; Álvaro N. Atallah

    2015-01-01

    The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information to be made available on the ...

  10. Reliability Evaluation Methodologies of Fault Tolerant Techniques of Digital I and C Systems in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Bo Gyung; Kang, Hyun Gook; Seong, Poong Hyun; Lee, Seung Jun

    2011-01-01

    Since the reactor protection system was converted from analog to digital technology, the digital reactor protection system has 4 redundant channels, and each channel has several modules. Because the digital plant protection system (DPPS) uses complex components, various fault-tolerant techniques are necessary to improve its availability and reliability. Several studies have examined the effects of fault-tolerant techniques. However, these effects have not yet been properly considered in most fault tree models. The various fault-tolerant techniques used in digital systems in NPPs should be reflected in the fault tree analysis to obtain lower system unavailability and a more reliable PSA. When fault-tolerant techniques are modeled in a fault tree, the modules covered by each fault-tolerant technique, the fault coverage, the detection period, and the fault recovery should be considered. Further work will concentrate on various aspects of fault tree modeling: identifying other important factors and developing a new theory for constructing the fault tree model
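One common way fault coverage and detection period enter a module's unavailability in a fault tree is the simplification sketched below (detected faults repaired quickly, undetected faults latent until a periodic test). This is a generic textbook form, not the paper's specific model:

```python
# Average unavailability of a module with partial fault coverage:
# detected faults contribute lambda*c*MTTR; undetected faults remain
# latent for half a test interval on average. A common simplification,
# not the specific model proposed in the paper.

def module_unavailability(failure_rate, coverage, repair_time, test_interval):
    """Average unavailability of a module with partial fault coverage.

    failure_rate  : failures per hour (lambda)
    coverage      : fraction of faults caught by the fault-tolerant technique
    repair_time   : mean time to repair a detected fault (h)
    test_interval : periodic test interval revealing undetected faults (h)
    """
    detected = coverage * failure_rate * repair_time
    latent = (1.0 - coverage) * failure_rate * test_interval / 2.0
    return detected + latent

q_high = module_unavailability(1e-5, 0.99, 8.0, 720.0)   # good coverage
q_low = module_unavailability(1e-5, 0.60, 8.0, 720.0)    # poor coverage
```

The comparison shows why modeling coverage matters: with a long test interval, even a modest drop in coverage inflates the basic-event unavailability by an order of magnitude.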

  11. The reliability, validity, sensitivity, specificity and predictive values of the Chinese version of the Rowland Universal Dementia Assessment Scale.

    Science.gov (United States)

    Chen, Chia-Wei; Chu, Hsin; Tsai, Chia-Fen; Yang, Hui-Ling; Tsai, Jui-Chen; Chung, Min-Huey; Liao, Yuan-Mei; Chi, Mei-Ju; Chou, Kuei-Ru

    2015-11-01

    The purpose of this study was to translate the Rowland Universal Dementia Assessment Scale into Chinese and to evaluate the psychometric properties (reliability and validity) and the diagnostic properties (sensitivity, specificity and predictive values) of the Chinese version. The accurate detection of early dementia requires screening tools with favourable cross-cultural linguistic properties and appropriate sensitivity, specificity, and predictive values, particularly for Chinese-speaking populations. This was a cross-sectional, descriptive study. Overall, 130 participants suspected to have cognitive impairment were enrolled. A test-retest for determining reliability was scheduled four weeks after the initial test. Content validity was determined by five experts, whereas construct validity was established using the contrasted-group technique. The participants' clinical diagnoses were used as the standard in calculating the sensitivity, specificity, positive predictive value and negative predictive value. The study revealed that the Chinese version of the Rowland Universal Dementia Assessment Scale exhibited a test-retest reliability of 0.90, an internal consistency reliability of 0.71, an inter-rater reliability (kappa value) of 0.88 and a content validity index of 0.97. The patient and healthy contrast groups exhibited significant differences in cognitive ability. The optimal cut-off points for the Chinese version in the tests for mild cognitive impairment and dementia were 24 and 22, respectively; moreover, for these two conditions, the sensitivities of the scale were 0.79 and 0.76, the specificities were 0.91 and 0.81, the areas under the curve were 0.85 and 0.78, the positive predictive values were 0.99 and 0.83, and the negative predictive values were 0.96 and 0.91, respectively. 
The Chinese version of the Rowland Universal Dementia Assessment Scale
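The diagnostic properties reported above all follow from a 2x2 confusion matrix at a given cut-off. A sketch with invented counts:

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix.
# Counts below are invented solely to illustrate the arithmetic; they are
# not the study's data.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical screening result at some cut-off:
m = diagnostic_metrics(tp=38, fp=7, fn=10, tn=75)
```

Unlike sensitivity and specificity, the predictive values depend on how prevalent the condition is in the sample, which is why all four are reported separately.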

  12. Nonparametric predictive inference for reliability of a k-out-of-m:G system with multiple component types

    International Nuclear Information System (INIS)

    Aboalkhair, Ahmad M.; Coolen, Frank P.A.; MacPhee, Iain M.

    2014-01-01

    Nonparametric predictive inference for system reliability has recently been presented, with specific focus on k-out-of-m:G systems. The reliability of systems is quantified by lower and upper probabilities of system functioning, given binary test results on components, taking uncertainty about component functioning and indeterminacy due to limited test information explicitly into account. Thus far, the systems considered were series configurations of subsystems, with each subsystem i a k_i-out-of-m_i:G system consisting of only one type of component. Key results are briefly summarized in this paper, and as an important generalization, new results are presented for a single k-out-of-m:G system consisting of components of multiple types. The important aspects of redundancy and diversity for such systems are discussed. - Highlights: • New results on nonparametric predictive inference for system reliability. • Prediction of system reliability based on test data for components. • New insights on system redundancy optimization and diversity. • Components that appear inferior in tests may be included to enhance redundancy
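For the classical special case of independent, identical components with a known functioning probability p (the paper itself works with lower and upper probabilities inferred from test data instead), k-out-of-m:G reliability reduces to a binomial tail sum:

```python
# Classical k-out-of-m:G reliability for independent, identical components.
# This is the textbook point-probability case, not the paper's imprecise
# (lower/upper probability) formulation.
from math import comb

def k_out_of_m_reliability(k, m, p):
    """P(at least k of m independent components function), each with prob p."""
    return sum(comb(m, j) * p ** j * (1 - p) ** (m - j) for j in range(k, m + 1))

r = k_out_of_m_reliability(2, 4, 0.9)   # 2-out-of-4:G with p = 0.9
```

For a mixed-type system of the kind the paper generalizes to, one would sum over the joint distribution of functioning components per type instead of a single binomial.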

  13. Support Vector Machine Analysis of Functional Magnetic Resonance Imaging of Interoception Does Not Reliably Predict Individual Outcomes of Cognitive Behavioral Therapy in Panic Disorder with Agoraphobia

    Directory of Open Access Journals (Sweden)

    Benedikt Sundermann

    2017-06-01

    Full Text Available Background: The approach of applying multivariate pattern analyses based on neuroimaging data for outcome prediction holds out the prospect of improving therapeutic decisions in mental disorders. Patients suffering from panic disorder with agoraphobia (PD/AG) often exhibit an increased perception of bodily sensations. The purpose of this investigation was to assess whether multivariate classification applied to a functional magnetic resonance imaging (fMRI) interoception paradigm can predict individual responses to cognitive behavioral therapy (CBT) in PD/AG. Methods: This analysis is based on pretreatment fMRI data during an interoceptive challenge from a multicenter trial of the German PANIC-NET. Patients with DSM-IV PD/AG were dichotomized as responders (n = 30) or non-responders (n = 29) based on the primary outcome (Hamilton Anxiety Scale reduction ≥50%) after 6 weeks of CBT (2 h/week). fMRI parametric maps were used as features for response classification with linear support vector machines (SVM), with or without automated feature selection. Predictive accuracies were assessed using cross-validation and permutation testing. The influence of methodological parameters and the predictive ability for specific interoception-related symptom reduction were further evaluated. Results: SVM did not reach sufficient overall predictive accuracies (38.0–54.2%) for anxiety reduction in the primary outcome. In the exploratory analyses, better accuracies (66.7%) were achieved for predicting interoception-specific symptom relief as an alternative outcome domain. Subtle information regarding this alternative response criterion, but not the primary outcome, was revealed by post hoc univariate comparisons. Conclusion: In contrast to reports on other neurofunctional probes, SVM based on an interoception paradigm was not able to reliably predict individual response to CBT. These results speak against the clinical applicability of this technique.
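The permutation testing mentioned above can be sketched as follows: shuffle the labels many times, recompute the accuracy each time, and report the fraction of shuffles doing at least as well as the observed accuracy. The tiny threshold "classifier" and the data here are purely illustrative, not an SVM or the trial's data:

```python
# Permutation test for classification accuracy: is the observed accuracy
# better than chance? The "classifier" is a fixed threshold rule on one
# feature; in the study this role is played by a cross-validated SVM.
import random

def accuracy(labels, preds):
    return sum(l == p for l, p in zip(labels, preds)) / len(labels)

def permutation_p_value(feature, labels, threshold, n_perm=2000, seed=0):
    rng = random.Random(seed)
    preds = [1 if x >= threshold else 0 for x in feature]
    observed = accuracy(labels, preds)
    hits = 0
    for _ in range(n_perm):
        shuffled = labels[:]
        rng.shuffle(shuffled)
        if accuracy(shuffled, preds) >= observed:
            hits += 1
    # +1 correction keeps the p-value away from an impossible zero
    return observed, (hits + 1) / (n_perm + 1)

feature = [0.2, 0.4, 0.3, 0.9, 0.8, 0.7, 0.1, 0.85]
labels = [0, 0, 0, 1, 1, 1, 0, 1]          # responder = 1 (hypothetical)
obs_acc, p_value = permutation_p_value(feature, labels, threshold=0.5)
```

Accuracies in the 38-54% range for a balanced two-class problem, as reported for the primary outcome, would not survive such a test.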

  14. Reliability and Validity of Digital Imagery Methodology for Measuring Starting Portions and Plate Waste from School Salad Bars.

    Science.gov (United States)

    Bean, Melanie K; Raynor, Hollie A; Thornton, Laura M; Sova, Alexandra; Dunne Stewart, Mary; Mazzeo, Suzanne E

    2018-04-12

    Scientifically sound methods for investigating dietary consumption patterns from self-serve salad bars are needed to inform school policies and programs. To examine the reliability and validity of digital imagery for determining starting portions and plate waste of self-serve salad bar vegetables (which have variable starting portions) compared with manual weights. In a laboratory setting, 30 mock salads with 73 vegetables were made, and consumption was simulated. Each component (initial and removed portion) was weighed; photographs of weighed reference portions and pre- and post-consumption mock salads were taken. Seven trained independent raters visually assessed images to estimate starting portions to the nearest ¼ cup and percentage consumed in 20% increments. These values were converted to grams for comparison with weighed values. Intraclass correlations between weighed and digital imagery-assessed portions and plate waste were used to assess interrater reliability and validity. Pearson's correlations between weights and digital imagery assessments were also examined. Paired samples t tests were used to evaluate mean differences (in grams) between digital imagery-assessed portions and measured weights. Interrater reliabilities were excellent for starting portions and plate waste with digital imagery. For accuracy, intraclass correlations were moderate, with lower accuracy for determining starting portions of leafy greens compared with other vegetables. However, accuracy of digital imagery-assessed plate waste was excellent. Digital imagery assessments were not significantly different from measured weights for estimating overall vegetable starting portions or waste; however, digital imagery assessments slightly underestimated starting portions (by 3.5 g) and waste (by 2.1 g) of leafy greens. This investigation provides preliminary support for use of digital imagery in estimating starting portions and plate waste from school salad bars. Results might inform

  15. The validity and reliability of the type 2 diabetes and health promotion scale Turkish version: a methodological study.

    Science.gov (United States)

    Yildiz, Esra; Kavuran, Esin

    2018-03-01

    Health promotion is important for maintaining health and preventing complications in patients with type 2 diabetes. The aim of the present study was to examine the psychometrics of a recently developed tool that can be used to screen for a health-promoting lifestyle in patients with type 2 diabetes. Data were collected from outpatients attending diabetes clinics. The Type 2 Diabetes and Health Promotion Scale (T2DHPS) and a demographic questionnaire were administered to 295 participants. Forward-backward translation of the original English version was used to develop a Turkish version. Internal consistency of the scale was assessed by Cronbach's alpha, and exploratory and confirmatory factor analyses were used to assess the validity of the Turkish version of the Type 2 Diabetes and Health Promotion Scale. Kaiser-Meyer-Olkin (KMO) and Bartlett's sphericity tests showed that the sample met the criteria required for factor analysis. The reliability coefficient for the total scale was 0.84, and alpha coefficients for the subscales ranged from 0.57 to 0.92. A six-factor solution was obtained that explained 59.3% of the total variance. The ratio of the chi-square statistic to degrees of freedom (χ²/df) was 3.30 (χ² = 1157.48, df = 350); the root mean square error of approximation (RMSEA) was 0.061; and the goodness-of-fit index (GFI) and comparative fit index (CFI) values were both 0.91. The Turkish version of the T2DHPS is a valid and reliable tool that can be used to assess patients' health-promoting lifestyle behaviours. Validity and reliability studies in different cultures and regions are recommended. © 2017 Nordic College of Caring Science.

  16. Artificial neural network methodology: Application to predict magnetic properties of nanocrystalline alloys

    International Nuclear Information System (INIS)

    Hamzaoui, R.; Cherigui, M.; Guessasma, S.; ElKedim, O.; Fenineche, N.

    2009-01-01

    This paper is dedicated to the optimization of the magnetic properties of iron-based magnetic materials with regard to milling and coating process conditions using artificial neural network methodology. Fe-20 wt.% Ni and Fe-6.5 wt.% Si alloys were obtained using two high-energy ball milling technologies, namely a P4 vario planetary ball mill from Fritsch and a planetary ball mill from Retsch. Further processing of the Fe-Si powder allowed the spraying of the feedstock material using the high-velocity oxy-fuel (HVOF) process to obtain a relatively dense coating. Input parameters were the disc (Ω) and vial (ω) rotation speeds for the milling technique, and the spray distance and oxygen flow rate in the case of the coating process. Two main magnetic parameters are optimized, namely the saturation magnetization and the coercivity. The predicted results clearly depict coupled effects of the input parameters on the magnetic parameters. In particular, the increase of saturation magnetization is correlated with the increase of the product Ωω (shock power) and the product of the spray parameters. The largest coercivity values are correlated with the increase of the ratio Ω/ω (shock mode process) and the increase of the product of the spray parameters.

  17. Methodological advances in predicting flow-induced dynamics of plants using mechanical-engineering theory.

    Science.gov (United States)

    de Langre, Emmanuel

    2012-03-15

    The modeling of fluid-structure interactions, such as flow-induced vibrations, is a well-developed field of mechanical engineering. Many methods exist, and it seems natural to apply them to model the behavior of plants, and potentially other cantilever-like biological structures, under flow. Overcoming this disciplinary divide, and the application of such models to biological systems, will significantly advance our understanding of ecological patterns and processes and improve our predictive capabilities. Nonetheless, several methodological issues must first be addressed, which I describe here using two practical examples that have strong similarities: one from agricultural sciences and the other from nuclear engineering. Very similar issues arise in both: individual and collective behavior, small and large space and time scales, porous modeling, standard and extreme events, trade-off between the surface of exchange and individual or collective risk of damage, variability, hostile environments and, in some aspects, evolution. The conclusion is that, although similar issues do exist, which need to be exploited in some detail, there is a significant gap that requires new developments. It is obvious that living plants grow in and adapt to their environment, which certainly makes plant biomechanics fundamentally distinct from classical mechanical engineering. Moreover, the selection processes in biology and in human engineering are truly different, making the issue of safety different as well. A thorough understanding of these similarities and differences is needed to work efficiently in the application of a mechanistic approach to ecology.

  18. Thermal aging of some decommissioned reactor components and methodology for life prediction

    International Nuclear Information System (INIS)

    Chung, H.M.

    1989-03-01

    Since realistic aging of cast stainless steel components to end-of-life or life-extension conditions cannot be produced, it is customary to simulate thermal aging embrittlement by accelerated aging at ∼400°C. In this investigation, field components obtained from decommissioned reactors have been examined after service of up to 22 yr to provide a benchmark for the laboratory simulation. The primary and secondary aging processes were found to be identical to those of the laboratory-aged specimens, and the kinetic characteristics were also similar. The extent of the aging embrittlement processes and other key factors that are known to influence the embrittlement kinetics have been compared for the decommissioned reactor components and materials aged under accelerated conditions. On the basis of the study, a mechanistic understanding of the causes of the complex behavior in the kinetics and activation energy of aging (i.e., the temperature dependence of aging embrittlement between the accelerated and reactor-operating conditions) is presented. A mechanistic correlation developed thereon is compared with a number of available empirical correlations to provide insight for the development of a better methodology for life prediction of the reactor components. 18 refs., 18 figs., 5 tabs
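Accelerated aging at ∼400°C is mapped to service temperature through Arrhenius-type kinetics. The sketch below uses an illustrative activation energy, not a value from the report:

```python
# Arrhenius acceleration factor: how many hours at the service temperature
# one hour of accelerated aging is equivalent to, assuming a single
# thermally activated process. The activation energy Q is an illustrative
# placeholder, not a value taken from the report.
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(q_ev, t_service_c, t_aging_c):
    """Ratio of service time to aging time producing equivalent embrittlement."""
    t_s = t_service_c + 273.15
    t_a = t_aging_c + 273.15
    return math.exp((q_ev / K_B) * (1.0 / t_s - 1.0 / t_a))

# E.g., Q = 1.0 eV, service at 290 C, accelerated aging at 400 C:
af = acceleration_factor(1.0, 290.0, 400.0)
```

The strong sensitivity of this factor to the assumed activation energy is exactly why the report's benchmark against real decommissioned components matters.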

  19. Experience and benefits from using the EPRI MOV Performance Prediction Methodology in nuclear power plants

    International Nuclear Information System (INIS)

    Walker, T.; Damerell, P.S.

    1999-01-01

    The EPRI MOV Performance Prediction Methodology (PPM) is an effective tool for evaluating design basis thrust and torque requirements for MOVs. Use of the PPM has become more widespread in US nuclear power plants as they close out their Generic Letter (GL) 89-10 programs and address MOV periodic verification per GL 96-05. The PPM has also been used at plants outside the US, many of which are implementing programs similar to US plants' GL 89-10 programs. The USNRC Safety Evaluation of the PPM and the USNRC's discussion of the PPM in GL 96-05 make the PPM an attractive alternative to differential pressure (DP) testing, which can be costly and time-consuming. Significant experience and benefits, which are summarized in this paper, have been gained using the PPM. Although use of PPM requires a commitment of resources, the benefits of a solidly justified approach and a reduced need for DP testing provide a substantial safety and economic benefit. (author)

  20. Response surface methodology approach for structural reliability analysis: An outline of typical applications performed at CEC-JRC, Ispra

    International Nuclear Information System (INIS)

    Lucia, A.C.

    1982-01-01

    The paper presents the main results of the work carried out at JRC-Ispra on the specific problems posed by the application of the response surface methodology to the exploration of structural and nuclear reactor safety codes. Several relevant studies have been completed: assessment of structural behaviour in the case of seismic occurrences; determination of the probability of coherent blockage in LWR fuel elements due to a LOCA occurrence; analysis of ATWS consequences in PWR reactors by means of the ALMOD code; and analysis of the first wall for an experimental fusion reactor by means of the Bersafe code. (orig.)

  1. The occipitofrontal circumference: reliable prediction of the intracranial volume in children with syndromic and complex craniosynostosis.

    Science.gov (United States)

    Rijken, Bianca Francisca Maria; den Ottelander, Bianca Kelly; van Veelen, Marie-Lise Charlotte; Lequin, Maarten Hans; Mathijssen, Irene Margreet Jacqueline

    2015-05-01

    OBJECT Patients with syndromic and complex craniosynostosis are characterized by the premature fusion of one or more cranial sutures. These patients are at risk for developing elevated intracranial pressure (ICP). Several factors are known to contribute to elevated ICP in these patients, including craniocerebral disproportion, hydrocephalus, venous hypertension, and obstructive sleep apnea. However, the causal mechanism is unknown, and patients develop elevated ICP even after skull surgery. In clinical practice, the occipitofrontal circumference (OFC) is used as an indirect measure of intracranial volume (ICV) to evaluate skull growth. However, it remains unknown whether OFC is a reliable predictor of ICV in patients with a severe skull deformity. Therefore, in this study the authors evaluated the relation between ICV and OFC. METHODS Eighty-four CT scans obtained in 69 patients with syndromic and complex craniosynostosis treated at the Erasmus University Medical Center-Sophia Children's Hospital were included. The ICV was calculated from CT scans by using autosegmentation with an HU threshold. CT scans and OFC measurements were matched based on a maximum amount of time allowed between these examinations, which was dependent on age. A Pearson correlation coefficient was calculated to evaluate the correlations between OFC and ICV. The predictive value of OFC, age, and sex on ICV was then further evaluated using a univariate linear mixed model. The significant factors in the univariate analysis were subsequently entered in a multivariate mixed model. RESULTS The correlations found between OFC and ICV were r = 0.908 for the total group (p < 0.001), r = 0.981 for Apert (p < 0.001), r = 0.867 for Crouzon-Pfeiffer (p < 0.001), r = 0.989 for Muenke (p < 0.001), r = 0.858 for Saethre-Chotzen syndrome (p = 0.001), and r = 0.917 for complex craniosynostosis (p < 0.001). Age and OFC were significant predictors of ICV in the univariate linear mixed

  2. A Tutorial on Nonlinear Time-Series Data Mining in Engineering Asset Health and Reliability Prediction: Concepts, Models, and Algorithms

    Directory of Open Access Journals (Sweden)

    Ming Dong

    2010-01-01

    Full Text Available The primary objective of engineering asset management is to optimize assets' service delivery potential and to minimize the related risks and costs over their entire life through the development and application of asset health and usage management, in which health and reliability prediction plays an important role. In real-life situations, where an engineering asset operates under dynamic operational and environmental conditions, the lifetime of an engineering asset is generally described by monitored nonlinear time-series data and is subject to high levels of uncertainty and unpredictability. It has been proven that data mining techniques are very useful for extracting the relevant features that can be used as parameters for asset diagnosis and prognosis. In this paper, a tutorial on nonlinear time-series data mining in engineering asset health and reliability prediction is given. In addition to an overview of health and reliability prediction techniques for engineering assets, this tutorial focuses on the concepts, models, algorithms, and applications of hidden Markov models (HMMs) and hidden semi-Markov models (HSMMs) in engineering asset health prognosis, which are representative of recent engineering asset health prediction techniques.
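The forward algorithm at the core of HMM-based prognosis can be sketched in a few lines; the two-state degradation model below is a toy example, not taken from the tutorial:

```python
# Forward algorithm for a discrete HMM: hidden states could be degradation
# levels of an asset and observations discretized condition-monitoring
# readings. The two-state model below is a toy example for illustration.

def forward_likelihood(init, trans, emit, observations):
    """P(observation sequence) under a discrete HMM via the forward algorithm."""
    alpha = [init[s] * emit[s][observations[0]] for s in range(len(init))]
    for obs in observations[1:]:
        alpha = [
            emit[j][obs] * sum(alpha[i] * trans[i][j] for i in range(len(init)))
            for j in range(len(init))
        ]
    return sum(alpha)

init = [0.9, 0.1]                      # healthy, degraded
trans = [[0.95, 0.05], [0.0, 1.0]]     # degradation assumed irreversible
emit = [[0.8, 0.2], [0.3, 0.7]]        # P(obs | state), obs 1 = "high vibration"
lik = forward_likelihood(init, trans, emit, [0, 0, 1, 1])
```

HSMMs extend this scheme with explicit state-duration distributions, which is what makes them attractive for remaining-life prognosis.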

  3. Reliability prediction of large fuel cell stack based on structure stress analysis

    Science.gov (United States)

    Liu, L. F.; Liu, B.; Wu, C. W.

    2017-09-01

    The aim of this paper is to improve the reliability of a Proton Exchange Membrane Fuel Cell (PEMFC) stack by designing the clamping force and the thickness difference between the membrane electrode assembly (MEA) and the gasket. The stack reliability is directly determined by the component reliability, which is affected by the material properties and the contact stress. The component contact stress is a random variable because it is usually affected by many uncertain factors in the production and clamping process. We investigated the influence of the parameter variation coefficients on the probability distribution of the contact stress using an equivalent stiffness model and the first-order second-moment method. The optimal contact stress, at which component reliability is highest, is obtained from the stress-strength interference model. To achieve this optimal contact stress between the contacting components, the component thickness and the stack clamping force are then designed accordingly. Finally, a detailed description is given of how to design the MEA and gasket dimensions to obtain the highest stack reliability. This work provides valuable guidance in the design of stack structures for highly reliable fuel cell stacks.
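The stress-strength interference idea reduces, for normally distributed stress and strength, to evaluating the standard normal CDF of a reliability index. A minimal sketch, with hypothetical contact-stress numbers rather than the paper's values:

```python
import math

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Stress-strength interference with normal variables:
    R = P(strength > stress) = Phi(beta), beta the reliability index."""
    beta = (mu_strength - mu_stress) / math.sqrt(sd_strength ** 2 +
                                                 sd_stress ** 2)
    return 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))  # normal CDF

# Hypothetical gasket contact-stress design point (MPa)
R = interference_reliability(2.0, 0.2, 1.2, 0.15)
```

The first-order second-moment method supplies the mean and standard deviation of the contact stress from the component tolerances; sweeping the clamping force then amounts to moving `mu_stress` until `R` peaks.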

  4. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Douglas H. Marin dos Santos

    2015-06-01

    Full Text Available The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information through the ClinicalTrials.gov platform (available at https://www.clinicaltrials.gov). We accessed ClinicalTrials.gov and evaluated the compliance of researchers and sponsors with the FDAAA. Our sample comprised 243 protocols of clinical trials of the biological monoclonal antibodies (mAb) adalimumab, bevacizumab, infliximab, rituximab, and trastuzumab. We demonstrate that the new legislation has positively affected transparency in clinical research, through a significant increase in publication and online reporting rates after the enactment of the law. Poorly designed trials, however, remain a challenge to be overcome, owing to a high prevalence of methodological flaws. These flaws affect the quality of the clinical information available, breaching the ethical duties of sponsors and researchers, as well as the human right to health.

  5. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study.

    Science.gov (United States)

    Marin Dos Santos, Douglas H; Atallah, Álvaro N

    2015-01-01

    The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information through the ClinicalTrials.gov platform (available at https://www.clinicaltrials.gov). We accessed ClinicalTrials.gov and evaluated the compliance of researchers and sponsors with the FDAAA. Our sample comprised 243 protocols of clinical trials of the biological monoclonal antibodies (mAb) adalimumab, bevacizumab, infliximab, rituximab, and trastuzumab. We demonstrate that the new legislation has positively affected transparency in clinical research, through a significant increase in publication and online reporting rates after the enactment of the law. Poorly designed trials, however, remain a challenge to be overcome, owing to a high prevalence of methodological flaws. These flaws affect the quality of the clinical information available, breaching the ethical duties of sponsors and researchers, as well as the human right to health.

  6. Dataset size and composition impact the reliability of performance benchmarks for peptide-MHC binding predictions

    DEFF Research Database (Denmark)

    Kim, Yohan; Sidney, John; Buus, Søren

    2014-01-01

    Background: It is important to accurately determine the performance of peptide:MHC binding predictions, as this enables users to compare and choose between different prediction methods and provides estimates of the expected error rate. Two common approaches to determine prediction performance...... are cross-validation, in which all available data are iteratively split into training and testing data, and the use of blind sets generated separately from the data used to construct the predictive method. In the present study, we have compared cross-validated prediction performances generated on our last...
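The cross-validation protocol mentioned above, in which all available data are iteratively split into training and testing sets, can be sketched generically (the fold scheme here is a plain contiguous k-fold split, an illustrative choice rather than the benchmarking paper's exact procedure):

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous, near-equal folds."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cv_splits(n, k):
    """Yield (train, test) index lists; each item is tested exactly once."""
    folds = k_fold_indices(n, k)
    for i, test in enumerate(folds):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

# 3-fold split of a hypothetical set of 10 binding measurements
splits = list(cv_splits(10, 3))
```

The blind-set alternative the abstract contrasts with this would instead hold out data collected after the model was built, which is exactly what makes dataset size and composition matter for the reliability of the benchmark.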

  7. An overview of the IAEA Safety Series on procedures for evaluating the reliability of predictions made by environmental transfer models

    International Nuclear Information System (INIS)

    Hoffman, F.W.; Hofer, E.

    1987-10-01

    The International Atomic Energy Agency is preparing a Safety Series publication on practical approaches for evaluating the reliability of the predictions made by environmental radiological assessment models. This publication identifies factors that affect the reliability of these predictions and discusses methods for quantifying uncertainty. Emphasis is placed on understanding the quantity of interest specified by the assessment question and on distinguishing between stochastic variability and lack of knowledge about either the true value or the true distribution of values for the quantity of interest. Among the many approaches discussed, model testing against independent data sets (model validation) is considered the best method for evaluating the accuracy of model predictions. Analytical and numerical methods for propagating the uncertainties in model parameters are presented, and the strengths and weaknesses of model intercomparison exercises are also discussed. It is recognized that subjective judgment is employed throughout the entire modelling process, and quantitative reliability statements must be subjectively obtained when models are applied to situations different from those under which they have been tested. (6 refs.)
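The numerical parameter-uncertainty propagation mentioned above is most simply done by Monte Carlo sampling. The sketch below propagates two lognormal parameters through a toy deposition-times-transfer model; the model form and distribution parameters are invented for illustration, not taken from the Safety Series publication:

```python
import random

random.seed(42)

def propagate(model, param_dists, n=10000):
    """Monte Carlo propagation: sample parameters, collect model outputs."""
    outputs = []
    for _ in range(n):
        params = {name: dist() for name, dist in param_dists.items()}
        outputs.append(model(**params))
    return outputs

# Hypothetical transfer model: concentration = deposition * transfer factor
dists = {
    "deposition": lambda: random.lognormvariate(0.0, 0.5),   # Bq/m^2
    "transfer":   lambda: random.lognormvariate(-4.0, 0.3),  # d/L
}
samples = propagate(lambda deposition, transfer: deposition * transfer, dists)
samples.sort()
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]
```

Summarizing the output as percentiles (median, 95th) rather than a single point is what lets an assessor attach a quantitative reliability statement to the model prediction.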

  8. Safety and reliability in the 90s: will past experience or prediction meet our needs?

    International Nuclear Information System (INIS)

    Walter, M.H.; Cox, R.F.

    1990-01-01

    Twenty-six papers are presented in the proceedings of the 1990 Safety and Reliability Society Symposium. The papers selected provide current thinking on improved methods for the identification, quantification, and management of risks, based on the safety culture developed across a range of industries during the last decade. In particular, organizational and management factors feature in a large number of the papers. Two papers, one on the safety of all the operating plants at Sellafield's irradiated nuclear fuel handling and reprocessing site and one on the selection of field component reliability data for use in nuclear safety studies, are selected and indexed separately. (author)

  9. Methodology for experimental validation of a CFD model for predicting noise generation in centrifugal compressors

    International Nuclear Information System (INIS)

    Broatch, A.; Galindo, J.; Navarro, R.; García-Tíscar, J.

    2014-01-01

    Highlights: • A DES of a turbocharger compressor working at peak pressure point is performed. • In-duct pressure signals are measured in a steady flow rig with 3-sensor arrays. • Pressure spectra comparison is performed as a validation for the numerical model. • A suitable comparison methodology is developed, relying on pressure decomposition. • Whoosh noise at outlet duct is detected in experimental and numerical spectra. - Abstract: Centrifugal compressors working on the surge side of the map generate a broadband noise in the range of 1–3 kHz, known as whoosh noise. This noise is perceived in strongly downsized engines operating at particular conditions (full load, tip-in and tip-out maneuvers). A 3-dimensional CFD model of a centrifugal compressor is built to analyze fluid phenomena related to whoosh noise. A detached eddy simulation is performed with the compressor operating at the peak pressure point of 160 krpm. A steady flow rig mounted in an anechoic chamber is used to obtain experimental measurements as a means of validation for the numerical model. In-duct pressure signals are obtained in addition to standard averaged global variables. The numerical simulation provides global variables showing excellent agreement with experimental measurements. Pressure spectra comparison is performed to assess the noise prediction capability of the numerical model. The influence of the type and position of the virtual pressure probes is evaluated. Pressure decomposition is required by the simulations to obtain meaningful spectra. Different techniques for obtaining pressure components are analyzed. At the simulated conditions, a broadband noise in the 1–3 kHz frequency band is detected in the experimental measurements. This whoosh noise is also captured by the numerical model.

  10. RANS Based Methodology for Predicting the Influence of Leading Edge Erosion on Airfoil Performance

    Energy Technology Data Exchange (ETDEWEB)

    Langel, Christopher M. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; Chow, Raymond C. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; van Dam, C. P. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; Maniaci, David Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Wind Energy Technologies Dept.

    2017-10-01

    The impact of surface roughness on flows over aerodynamically designed surfaces is of interest in a number of different fields. It has long been known that surface roughness will likely accelerate the laminar-turbulent transition process by creating additional disturbances in the boundary layer. However, there are very few tools available to predict the effects surface roughness will have on boundary layer flow. There are numerous implications of the premature appearance of a turbulent boundary layer. Increases in local skin friction, boundary layer thickness, and turbulent mixing can impact global flow properties, compounding the effects of surface roughness. With this motivation, an investigation into the effects of surface roughness on boundary layer transition has been conducted. The effort involved both an extensive experimental campaign and the development of a high-fidelity roughness model implemented in a RANS solver. Vast amounts of experimental data were generated at the Texas A&M Oran W. Nicks Low Speed Wind Tunnel for the calibration and validation of the roughness model described in this work, as well as for future efforts. The present work focuses on the development of the computational model, including a description of the calibration process. The primary methodology presented introduces a scalar field variable and an associated transport equation that interacts with a correlation-based transition model. The additional equation allows non-local effects of surface roughness to be accounted for downstream of rough wall sections while maintaining a "local" formulation. The scalar field is determined through a boundary condition function that has been calibrated against flat plate cases with sand grain roughness. The model was initially tested on a NACA 0012 airfoil with roughness strips applied to the leading edge. Further calibration of the roughness model was performed using results from the companion experimental study on a NACA 633-418 airfoil.

  11. Crack Growth-Based Predictive Methodology for the Maintenance of the Structural Integrity of Repaired and Nonrepaired Aging Engine Stationary Components

    National Research Council Canada - National Science Library

    Barron, Michael

    1999-01-01

    .... Specifically, the FAA's goal was to develop "Crack Growth-Based Predictive Methodologies for the Maintenance of the Structural Integrity of Repaired and Nonrepaired Aging Engine Stationary Components...

  12. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
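Two accuracy metrics commonly used in this M&V setting (and defined in ASHRAE Guideline 14) are the coefficient of variation of the RMSE and the normalized mean bias error. A minimal sketch with hypothetical daily meter and model values, not the study's 29-building data:

```python
import math

def cvrmse(actual, predicted):
    """Coefficient of variation of RMSE, in percent."""
    n = len(actual)
    mean = sum(actual) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return 100.0 * rmse / mean

def nmbe(actual, predicted):
    """Normalized mean bias error, in percent (positive = underprediction)."""
    n = len(actual)
    mean = sum(actual) / n
    return 100.0 * sum(a - p for a, p in zip(actual, predicted)) / (n * mean)

# Hypothetical daily kWh: meter data vs. baseline-model predictions
meter = [120.0, 135.0, 128.0, 150.0, 142.0]
model = [118.0, 138.0, 125.0, 149.0, 145.0]
```

Comparing models with both metrics illustrates the paper's point: a model can have small scatter (low CV(RMSE)) yet be biased (large NMBE), so the best performer depends on the metric chosen.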

  13. Theory and methodology in predicting the religious tourism in Buddhist regions of Russia

    Directory of Open Access Journals (Sweden)

    Petr E. Tsarkov

    2015-11-01

    Full Text Available The article reviews the theoretical and methodological aspects of forecasting touristic migrations. The theoretical approach is designed according to the anthropological theory of cultural exchange; the tourism forecasting methodology was developed by the author on the basis of an interdisciplinary approach. The author offers an original approach that assesses the touristic potential of the Buddhist regions of Russia on the basis of their aesthetic appeal.

  14. Development of a Methodology for Predicting Forest Area for Large-Area Resource Monitoring

    Science.gov (United States)

    William H. Cooke

    2001-01-01

    The U.S. Department of Agriculture, Forest Service, Southern Research Station, appointed a remote-sensing team to develop an image-processing methodology for mapping forest lands over large geographic areas. The team has presented a repeatable methodology, which is based on regression modeling of Advanced Very High Resolution Radiometer (AVHRR) and Landsat Thematic...

  15. Feasibility, Reliability and Predictive Value Of In-Ambulance Heart Rate Variability Registration

    NARCIS (Netherlands)

    Yperzeele, Laetitia; van Hooff, Robbert-Jan; De Smedt, Ann; Nagels, Guy; Hubloue, Ives; De Keyser, Jacques; Brouns, Raf

    2016-01-01

    Background Heart rate variability (HRV) is a parameter of autonomic nervous system function. A decrease of HRV has been associated with disease severity, risk of complications and prognosis in several conditions. Objective We aim to investigate the feasibility and the reliability of in-ambulance HRV

  16. Reliabilities of genomic prediction using combined reference data of the Nordic Red dairy cattle production

    DEFF Research Database (Denmark)

    Brøndum, Rasmus Froberg; Rius-Vilarrasa, E; Strandén, I

    2011-01-01

    This study investigated the possibility of increasing the reliability of direct genomic values (DGV) by combining reference populations. The data were from 3,735 bulls from the Danish, Swedish, and Finnish Red dairy cattle populations. Single nucleotide polymorphism markers were fitted as random varia...

  17. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate, including reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement. It also covers reliability in engineering design, reliability testing under failure-rate assumptions, plotting of reliability data, prediction of system reliability, system maintenance, failure analysis including relay failures, and analysis of system safety.

  18. Reliability of nine programs of topological predictions and their application to integral membrane channel and carrier proteins.

    Science.gov (United States)

    Reddy, Abhinay; Cho, Jaehoon; Ling, Sam; Reddy, Vamsee; Shlykov, Maksim; Saier, Milton H

    2014-01-01

    We evaluated topological predictions for nine different programs, HMMTOP, TMHMM, SVMTOP, DAS, SOSUI, TOPCONS, PHOBIUS, MEMSAT-SVM (hereinafter referred to as MEMSAT), and SPOCTOPUS. These programs were first evaluated using four large topologically well-defined families of secondary transporters, and the three best programs were further evaluated using topologically more diverse families of channels and carriers. In the initial studies, the order of accuracy was: SPOCTOPUS > MEMSAT > HMMTOP > TOPCONS > PHOBIUS > TMHMM > SVMTOP > DAS > SOSUI. Some families, such as the Sugar Porter Family (2.A.1.1) of the Major Facilitator Superfamily (MFS; TC #2.A.1) and the Amino Acid/Polyamine/Organocation (APC) Family (TC #2.A.3), were correctly predicted with high accuracy while others, such as the Mitochondrial Carrier (MC) (TC #2.A.29) and the K(+) transporter (Trk) families (TC #2.A.38), were predicted with much lower accuracy. For small, topologically homogeneous families, SPOCTOPUS and MEMSAT were generally most reliable, while with large, more diverse superfamilies, HMMTOP often proved to have the greatest prediction accuracy. We next developed a novel program, TM-STATS, that tabulates HMMTOP, SPOCTOPUS or MEMSAT-based topological predictions for any subdivision (class, subclass, superfamily, family, subfamily, or any combination of these) of the Transporter Classification Database (TCDB; www.tcdb.org) and examined the following subclasses: α-type channel proteins (TC subclasses 1.A and 1.E), secreted pore-forming toxins (TC subclass 1.C) and secondary carriers (subclass 2.A). Histograms were generated for each of these subclasses, and the results were analyzed according to subclass, family and protein. The results provide an update of topological predictions for integral membrane transport proteins as well as guides for the development of more reliable topological prediction programs, taking family-specific characteristics into account. © 2014 S. Karger AG, Basel.

  19. Measurement of Cue-Induced Craving in Human Methamphetamine-Dependent Subjects: New Methodological Hopes for Reliable Assessment of Treatment Efficacy

    Directory of Open Access Journals (Sweden)

    Zahra Alam Mehrjerdi

    2011-09-01

    Full Text Available Methamphetamine (MA) is a highly addictive psychostimulant drug with crucial impacts on individuals at various levels. Exposure to methamphetamine-associated cues in the laboratory can elicit measurable craving and autonomic reactivity in most individuals with methamphetamine dependence, and this cue reactivity can model how craving leads to continued drug-seeking behavior and relapse in real environments, but research on this notion is still limited. In this brief article, the authors review studies of cue-induced craving in human methamphetamine-dependent subjects in a laboratory-based approach. Craving for methamphetamine is elicited by a variety of methods in the laboratory, such as paraphernalia, verbal and visual cues, and imagery scripts. In this article, we review the studies applying different cues as the main methods of craving induction in laboratory settings. The literature reviewed here provides strong evidence that craving for methamphetamine in laboratory conditions is significantly evoked by different cues. Cue-induced craving has important treatment and clinical implications for psychotherapists and clinicians, given the role of induced craving in evoking an intense desire or urge to use methamphetamine during or after a period of a successful craving-prevention program. Elicited craving for methamphetamine in laboratory conditions is significantly influenced by methamphetamine-associated cues and results in a rapid craving response toward methamphetamine use. This approach can serve as a core component of laboratory-based assessment of treatment efficacy for methamphetamine-dependent patients. In addition, laboratory settings for studying craving can bridge the gap between less reliable preclinical animal-model studies and budget-demanding randomized clinical trials.

  20. Predicting climate-induced range shifts: model differences and model reliability.

    Science.gov (United States)

    Joshua J. Lawler; Denis White; Ronald P. Neilson; Andrew R. Blaustein

    2006-01-01

    Predicted changes in the global climate are likely to cause large shifts in the geographic ranges of many plant and animal species. To date, predictions of future range shifts have relied on a variety of modeling approaches with different levels of model accuracy. Using a common data set, we investigated the potential implications of alternative modeling approaches for...

  1. Climatic Reliability of Electronics: Early Prediction and Control of Contamination and humidity effects

    DEFF Research Database (Denmark)

    Verdingovas, Vadimas

    were to a significant extent guided by the climatic reliability issues the electronic companies are currently facing. The research in this thesis is focused on the synergistic effects of process related contamination, humidity, potential bias, and PCBA design related aspects, while various tests...... assuming parasitic circuit due to water layer formation on the PCBA surface. The chapters 2-5 review the factors influencing the climatic reliability of electronics namely humidity interaction with materials and ionic contamination on the PCBA surface, common types and sources of ionic contamination...... in electronics, the test methods and techniques, and failure mechanisms related to climate and contamination. Chapter 6 summarizes the materials and experimental methods employed in this thesis. The results of various investigations are presented as individual research papers as published or in the draft form...

  2. Physical attraction to reliable, low variability nervous systems: Reaction time variability predicts attractiveness.

    Science.gov (United States)

    Butler, Emily E; Saville, Christopher W N; Ward, Robert; Ramsey, Richard

    2017-01-01

    The human face cues a range of important fitness information, which guides mate selection towards desirable others. Given humans' high investment in the central nervous system (CNS), cues to CNS function should be especially important in social selection. We tested if facial attractiveness preferences are sensitive to the reliability of human nervous system function. Several decades of research suggest an operational measure for CNS reliability is reaction time variability, which is measured by standard deviation of reaction times across trials. Across two experiments, we show that low reaction time variability is associated with facial attractiveness. Moreover, variability in performance made a unique contribution to attractiveness judgements above and beyond both physical health and sex-typicality judgements, which have previously been associated with perceptions of attractiveness. In a third experiment, we empirically estimated the distribution of attractiveness preferences expected by chance and show that the size and direction of our results in Experiments 1 and 2 are statistically unlikely without reference to reaction time variability. We conclude that an operating characteristic of the human nervous system, reliability of information processing, is signalled to others through facial appearance. Copyright © 2016 Elsevier B.V. All rights reserved.
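The operational measure in this study, within-person reaction time variability, is simply the standard deviation of a participant's RTs across trials. A minimal sketch with invented RT series for two hypothetical participants:

```python
import math

def rt_variability(rts):
    """Within-person reaction time variability: sample SD of RTs (ms)."""
    n = len(rts)
    mean = sum(rts) / n
    return math.sqrt(sum((t - mean) ** 2 for t in rts) / (n - 1))

# Hypothetical RT series (ms): a steady responder vs. a variable one
steady   = [410.0, 405.0, 415.0, 408.0, 412.0, 409.0]
variable = [350.0, 520.0, 290.0, 610.0, 400.0, 475.0]
```

On this measure the "steady" participant has a far lower variability score, which, per the study's finding, would predict higher facial-attractiveness ratings independently of rated health and sex-typicality.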

  3. New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity.

    Science.gov (United States)

    Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R

    2004-11-21

    Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatic disciplines. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have now become a vital component of modern computational immunovaccinology. Historically, such approaches have been built around semi-qualitative classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D(b), H2-K(b) and H2-K(k). As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online (http://www.jenner.ac.uk/MHCPred).

  4. Predicting Proliferation: High Reliability Forecasting Models of Nuclear Proliferation as a Policy & Analytical Aid

    OpenAIRE

    Center on Contemporary Conflict; Gartzke, Erik

    2015-01-01

    Performer: University of California at San Diego Project Lead: Erik Gartzke Project Cost: $121,000 FY15-16 Objective: Scholars have spent decades studying and explaining nuclear proliferation. This project will develop a model to predict the behavior of states regarding their pursuit and acquisition of nuclear weapons. An accurate prediction model will allow for action against potential suppliers, interdiction of nuclear trade, intelligence collection on covert nuclea...

  5. Predicting protein complexes from weighted protein-protein interaction graphs with a novel unsupervised methodology: Evolutionary enhanced Markov clustering.

    Science.gov (United States)

    Theofilatos, Konstantinos; Pavlopoulou, Niki; Papasavvas, Christoforos; Likothanassis, Spiros; Dimitrakopoulos, Christos; Georgopoulos, Efstratios; Moschopoulos, Charalampos; Mavroudi, Seferina

    2015-03-01

    Proteins are considered to be the most important individual components of biological systems, and they combine to form physical protein complexes which are responsible for certain molecular functions. Despite the wide availability of protein-protein interaction (PPI) information, not much information is available about protein complexes. Experimental methods are limited in terms of time, efficiency, cost, and performance. Existing computational methods have provided encouraging preliminary results, but they face certain disadvantages: they require parameter tuning, some cannot handle weighted PPI data, and others do not allow a protein to participate in more than one protein complex. In the present paper, we propose a new, fully unsupervised methodology for predicting protein complexes from weighted PPI graphs. The proposed methodology, called evolutionary enhanced Markov clustering (EE-MC), is a hybrid combination of an adaptive evolutionary algorithm and a state-of-the-art clustering algorithm named enhanced Markov clustering. EE-MC was compared with state-of-the-art methodologies when applied to datasets from the human and the yeast Saccharomyces cerevisiae organisms. Using publicly available datasets, EE-MC outperformed existing methodologies (in some datasets the separation metric was increased by 10-20%). Moreover, when applied to new human datasets its performance was encouraging in the prediction of protein complexes which consist of proteins with high functional similarity. Specifically, 5,737 protein complexes were predicted and 72.58% of them are enriched for at least one gene ontology (GO) function term. EE-MC is by design able to overcome intrinsic limitations of existing methodologies such as their inability to handle weighted PPI networks, their constraint to assign every protein to exactly one cluster, and the difficulties they face concerning parameter tuning. This fact was experimentally validated and moreover, new
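EE-MC builds on the plain Markov clustering (MCL) procedure, which alternates matrix "expansion" (squaring a column-stochastic matrix) with "inflation" (element-wise powering and renormalization) until clusters emerge. The sketch below is plain MCL on a toy graph of two triangles joined by one edge, not the EE-MC algorithm itself:

```python
def normalize(m):
    """Column-normalize a square matrix so each column sums to 1."""
    n = len(m)
    out = [[0.0] * n for _ in range(n)]
    for j in range(n):
        s = sum(m[i][j] for i in range(n))
        for i in range(n):
            out[i][j] = m[i][j] / s
    return out

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mcl(adj, inflation=2.0, iters=20):
    """Plain Markov clustering: alternate expansion and inflation."""
    m = normalize([[v + (1.0 if i == j else 0.0)  # add self-loops
                    for j, v in enumerate(row)] for i, row in enumerate(adj)])
    for _ in range(iters):
        m = matmul(m, m)                                          # expansion
        m = normalize([[v ** inflation for v in row] for row in m])  # inflation
    clusters = set()
    for row in m:
        members = frozenset(j for j, v in enumerate(row) if v > 1e-6)
        if members:
            clusters.add(members)
    return clusters

# Two 3-cliques (0-1-2 and 3-4-5) bridged by the edge 2-3
adj = [[0, 1, 1, 0, 0, 0],
       [1, 0, 1, 0, 0, 0],
       [1, 1, 0, 1, 0, 0],
       [0, 0, 1, 0, 1, 1],
       [0, 0, 0, 1, 0, 1],
       [0, 0, 0, 1, 1, 0]]
clusters = mcl(adj)
```

The inflation exponent is the parameter whose tuning EE-MC automates with its evolutionary layer; edge weights from a weighted PPI graph would enter directly through the adjacency matrix.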

  6. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Full Text Available Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of methodologies for sample preparation and of reliable, fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, protocols developed for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biological samples of clinical importance include blood serum, lens capsule tissue, and urine. The sample preparation protocol followed by FACE analysis allows quantification with optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of the FACE results was performed by capillary electrophoresis and high-performance liquid chromatography techniques.

  7. Life prediction methodology for ceramic components of advanced heat engines. Phase 1: Volume 1, Final report

    Energy Technology Data Exchange (ETDEWEB)

    Cuccio, J.C.; Brehm, P.; Fang, H.T. [Allied-Signal Aerospace Co., Phoenix, AZ (United States). Garrett Engine Div.] [and others]

    1995-03-01

    Emphasis of this program is to develop and demonstrate ceramics life prediction methods, including fast fracture, stress rupture, creep, oxidation, and nondestructive evaluation. Significant advancements were made in these methods and their predictive capabilities successfully demonstrated.

  8. Reliability of Modern Scores to Predict Long-Term Mortality After Isolated Aortic Valve Operations.

    Science.gov (United States)

    Barili, Fabio; Pacini, Davide; D'Ovidio, Mariangela; Ventura, Martina; Alamanni, Francesco; Di Bartolomeo, Roberto; Grossi, Claudio; Davoli, Marina; Fusco, Danilo; Perucci, Carlo; Parolari, Alessandro

    2016-02-01

    Contemporary scores for estimating perioperative death have been proposed to predict long-term death as well. The aim of the study was to evaluate the performance of the updated European System for Cardiac Operative Risk Evaluation II, The Society of Thoracic Surgeons Predicted Risk of Mortality score, and the Age, Creatinine, Left Ventricular Ejection Fraction score for predicting long-term mortality in a contemporary cohort of isolated aortic valve replacement (AVR). We also sought to develop for each score a simple algorithm based on predicted perioperative risk to predict long-term survival. Complete data on 1,444 patients who underwent isolated AVR in a 7-year period were retrieved from three prospective institutional databases and linked with the Italian Tax Register Information System. Data were evaluated with performance analyses and time-to-event semiparametric regression. Survival was 83.0% ± 1.1% at 5 years and 67.8% ± 1.9% at 8 years. Discrimination and calibration of all three scores worsened for prediction of death at 1 year and 5 years. Nonetheless, a significant relationship was found between long-term survival and quartiles of scores (p System for Cardiac Operative Risk Evaluation II, 1.34 (95% CI, 1.28 to 1.40) for the Society of Thoracic Surgeons score, and 1.08 (95% CI, 1.06 to 1.10) for the Age, Creatinine, Left Ventricular Ejection Fraction score. The predicted risk generated by the European System for Cardiac Operative Risk Evaluation II, The Society of Thoracic Surgeons score, and the Age, Creatinine, Left Ventricular Ejection Fraction score cannot be considered a direct estimate of the long-term risk of death. Nonetheless, the three scores can be used to derive an estimate of the long-term risk of death in patients who undergo isolated AVR with the use of a simple algorithm. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
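The discrimination assessed in such performance analyses is conventionally summarized by a concordance (c-) statistic, i.e. the probability that a patient who died was assigned a higher predicted risk than one who survived. A minimal sketch with a hypothetical score/outcome sample, not the study's data:

```python
def c_statistic(scores, outcomes):
    """Concordance (ROC area) of predicted risk vs. observed event (1/0)."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    # Count concordant pairs; ties contribute half.
    concordant = sum(1.0 if p > n else 0.5 if p == n else 0.0
                     for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical predicted risks and 5-year death indicators
risks  = [0.22, 0.05, 0.31, 0.08, 0.15, 0.03]
deaths = [1, 0, 1, 0, 1, 0]
auc = c_statistic(risks, deaths)
```

A c-statistic near 0.5 means no discrimination and 1.0 means perfect ranking; the study's finding is that all three scores drift toward the lower end as the prediction horizon lengthens from perioperative to 1 and 5 years.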

  9. A New Approach for Reliability Life Prediction of Rail Vehicle Axle by Considering Vibration Measurement

    Directory of Open Access Journals (Sweden)

    Meral Bayraktar

    2014-01-01

The effect of vibration on the axle has been considered. Vibration measurements at different speeds have been performed on the axle of a running rail vehicle to obtain displacement, acceleration, time, and frequency response. Based on the experimental work, equivalent stress has been used to estimate the life of the axle for 90% and 10% reliability. Calculated life values of the rail vehicle axle have been compared with real-life data, and it is found that the life of a vehicle axle that takes the vibration effects into account is in good agreement with the actual life of the axle.
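A reliability-stratified fatigue-life estimate of the kind described can be sketched with Basquin's S-N law and a log-life scatter factor. All numbers below (strength coefficient, exponent, scatter factor, stress level) are illustrative assumptions, not the axle data from the study:

```python
import math

SIGMA_F = 900.0   # fatigue strength coefficient, MPa (assumed)
B_EXP = -0.09     # fatigue strength exponent (assumed)

def cycles_to_failure(stress_amplitude_mpa, reliability=0.5):
    """Median fatigue life from Basquin's law, sigma_a = sigma_f' * (2N)^b,
    shifted by an assumed log-life scatter factor for 90%/10% reliability."""
    n_median = 0.5 * (stress_amplitude_mpa / SIGMA_F) ** (1.0 / B_EXP)
    # Assumed scatter: 90%-reliability life is median/3, 10% is median*3.
    scatter = {0.5: 1.0, 0.9: 3.0, 0.1: 1.0 / 3.0}[reliability]
    return n_median / scatter

n90 = cycles_to_failure(250.0, reliability=0.9)   # conservative estimate
n10 = cycles_to_failure(250.0, reliability=0.1)   # optimistic estimate
```

The equivalent stress amplitude derived from the vibration measurements would be substituted for the illustrative 250 MPa.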

  10. On the accuracy and reliability of predictions by control-system theory.

    Science.gov (United States)

    Bourbon, W T; Copeland, K E; Dyer, V R; Harman, W K; Mosley, B L

    1990-12-01

    In three experiments we used control-system theory (CST) to predict the results of tracking tasks on which people held a handle to keep a cursor even with a target on a computer screen. 10 people completed a total of 104 replications of the task. In each experiment, there were two conditions: in one, only the handle affected the position of the cursor; in the other, a random disturbance also affected the cursor. From a person's performance during Condition 1, we derived constants used in the CST model to predict the results of Condition 2. In two experiments, predictions occurred a few minutes before Condition 2; in one experiment, the delay was 1 yr. During a 1-min. experimental run, the positions of handle and cursor, produced by the person, were each sampled 1800 times, once every 1/30 sec. During a modeling run, the model predicted the positions of the handle and target for each of the 1800 intervals sampled in the experimental run. In 104 replications, the mean correlation between predicted and actual positions of the handle was .996; SD = .002.
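The tracking loop described can be sketched as a simple integrating controller: the handle moves at a rate proportional to the cursor-target error, the gain is fitted from a no-disturbance run, and the fitted model then predicts a disturbed run. The gain, target trajectory, sampling constants beyond 30 Hz/1800 samples, and noise levels are illustrative assumptions, not the study's values:

```python
import math
import random

DT = 1.0 / 30.0      # 30 Hz sampling, as in the study
STEPS = 1800         # one 1-minute run

def run(gain, disturbance, noise, seed):
    """One tracking run: the handle moves at a rate proportional to the
    target-cursor error (an integrating controller)."""
    rng = random.Random(seed)
    handle, trace = 0.0, []
    for i in range(STEPS):
        target = 10.0 * math.sin(2 * math.pi * 3 * i / STEPS)  # slow target
        cursor = handle + disturbance[i]
        handle += gain * (target - cursor) * DT + noise * rng.gauss(0.0, 1.0)
        trace.append(handle)
    return trace

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Condition 1 (no disturbance): record a "person" run, then fit their gain.
no_dist = [0.0] * STEPS
person_c1 = run(gain=4.0, disturbance=no_dist, noise=0.05, seed=1)
errors, deltas, prev = [], [], 0.0
for i, h in enumerate(person_c1):
    target = 10.0 * math.sin(2 * math.pi * 3 * i / STEPS)
    errors.append(target - prev)        # cursor equalled the handle here
    deltas.append(h - prev)
    prev = h
k_est = sum(d * e for d, e in zip(deltas, errors)) / (DT * sum(e * e for e in errors))

# Condition 2: a smooth random disturbance also moves the cursor; the model
# fitted on Condition 1 predicts the person's handle positions.
rng = random.Random(42)
dist, d = [], 0.0
for _ in range(STEPS):
    d += rng.gauss(0.0, 0.02)
    dist.append(d)
person_c2 = run(gain=4.0, disturbance=dist, noise=0.05, seed=2)
model_c2 = run(gain=k_est, disturbance=dist, noise=0.0, seed=0)
r = pearson(person_c2, model_c2)
```

As in the experiments, the model-to-data correlation is near 1 because the same closed-loop dynamics dominate both traces.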

  11. Accuracy and Reliability in the Prediction of End-of-Life Performance of Solar Generators

    Science.gov (United States)

    Rapp, Etienne

    2008-09-01

The end-of-life power analysis of solar arrays is calculated using a combination of arithmetic and root-square sums of loss factors. These loss factors are sometimes linked to degradations, sometimes linked to uncertainties. The uncertainties of the degradations are taken into account by considering contractual "worst cases". This paper lays the first stones for a more "metrological" evaluation of the probable performance associated with a standard uncertainty. The shift from silicon to triple-junction solar cells induces some changes in the degradation parameters of solar arrays: * The triple-junction cells are more sensitive to UV darkening than silicon ones. * The cell voltage is higher and the current is lower. The cell strings are therefore shorter, and there are more strings in parallel. This induces some changes in the reliability analyses and risk management. * The failure modes and failure rates of these cells have to be compared and discussed. We attempt to define improved rules for designing solar arrays for end-of-life performance, giving better knowledge of the margins and better reliability.

  12. Predictive Reliability Assessment of the Automatic Clutch on a Primary Sodium Pump Drive

    International Nuclear Information System (INIS)

    Westwell, P.

    1975-01-01

This paper examines the reliability of a group of three clutch couplings, each mounted between a pony motor and the main drive for the primary sodium pumps. The sodium pump specification requires that continuously running AC pony motors be fitted to give a guaranteed 10% drive to the pumps in the event of a main supply failure. The drive to the main shaft is via 3:1 reduction gearing, such that a six-pole pony motor running at 300 rpm would drive the main shaft at 100 rpm, i.e., 10% of its rated speed. In order that the pony motor drive can be permanently energised during normal operation, a free-wheeling clutch is fitted between the motor and the reduction gearing. The type of clutch chosen is the Synchro-Self Shifting (SSS) clutch, shown in Figure 1. This type of clutch has proved itself under fairly onerous operating conditions, but is normally mounted on a horizontal driving shaft, whereas in this case, because of space limitations, it is necessary to mount it vertically. The reliability target set is that the chance of losing all three independent back-up pony motor drives on loss of main supplies should fall within the 10⁻⁵ to 10⁻⁶ band. Since the electrical supplies and other parts of the pony motor drives have been assessed within this target and some doubts were expressed about the clutch, it was necessary to look at this in some detail.

  13. Reliable B cell epitope predictions: impacts of method development and improved benchmarking

    DEFF Research Database (Denmark)

    Kringelum, Jens Vindahl; Lundegaard, Claus; Lund, Ole

    2012-01-01

biomedical applications such as rational vaccine design, development of disease diagnostics and immunotherapeutics. However, experimental mapping of epitopes is resource intensive, making in silico methods an appealing complementary approach. To date, the reported performance of methods for in silico mapping...... evaluation data set improved from 0.712 to 0.727. Our results thus demonstrate that, given proper benchmark definitions, B-cell epitope prediction methods achieve highly significant predictive performances, suggesting these tools to be a powerful asset in rational epitope discovery. The updated version

  14. Development of Stronger and More Reliable Cast Austenitic Stainless Steels (H-Series) Based on Scientific Design Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Muralidharan, G.; Sikka, V.K.; Pankiw, R.I.

    2006-04-15

Mechanical and Corrosion Properties (ORNL/TM-2005/81/R1). The final report on another related project at the University of Tennessee by George Pharr, Easo George, and Michael Santella has been published as Development of Combinatorial Methods for Alloy Design and Optimization (ORNL/TM-2005-133). The goal of the project was to increase the high-temperature strength of the H-Series of cast austenitic stainless steels by 50% and their upper use temperature by 86 to 140°F (30 to 60°C). Meeting such a goal is expected to result in energy savings of 38 trillion Btu/year by 2020 and energy cost savings of $185 million/year. The goal of the project was achieved by using the alloy design methods developed at ORNL, based on precise microcharacterization and identification of critical microstructure/property relationships, and combining them with modern computational science-based tools that calculate phases, phase fractions, and phase compositions from alloy compositions. The combined approach of microcharacterization of phases and computational phase prediction permits rapid improvement of a current alloy composition and provides the long-term benefit of customizing alloys within grades for specific applications. The project was appropriate for the domestic industry because the current H-Series alloys have reached their limits both in high-temperature-strength properties and in upper use temperature. The desire of Duraloy's industrial customers to improve process efficiency, while reducing cost, requires that the current alloys be taken to the next level of strength and that the upper use temperature limit be increased. This project addressed a specific topic from the subject call: to develop materials for manufacturing processes that will increase high-temperature strength, fatigue resistance, corrosion resistance, and wear resistance. The outcome of the project would benefit manufacturing processes in the chemical, steel, and heat-treating industries.

  15. Prediction of Osteoporosis through Radiographic Assessment of Proximal Femoral Morphology and Texture in Elderly; is it Valid and Reliable

    Directory of Open Access Journals (Sweden)

    Özkan Köse

    2015-08-01

Objective: The purpose of this study was to determine the best predictive radiographic measurement method for identifying the presence of osteoporosis, and to test the inter-observer and intra-observer reliability and the validity of these methods in postmenopausal women. Materials and Methods: Ninety-two elderly female patients who presented with hip pain were included. Hip radiographs were used to determine the values of the Singh index (SI), canal-to-calcar ratio (CCR), and cortical thickness index (CTI). All measurements were performed by two independent observers on two separate occasions, at least 4 weeks apart. Bone mineral density (BMD) was assessed by DEXA. In the first part of the analysis, the reliability of all the measurement methods was tested. In the second part, the correlation coefficient (Pearson r) was used to determine the relationship between the measurement methods and BMD. Finally, ROC curve analysis was performed to determine the sensitivity, specificity, and threshold values for each radiographic measurement method. Results: Intra-observer reliability analysis of SI revealed a kappa coefficient of 0.359 for observer A and 0.224 for observer B. Inter-observer reliability analysis of SI revealed a kappa coefficient of 0.070 for observer A and 0.051 for observer B. The intra-observer and inter-observer reliability was good and excellent for CTI and CCR for both observers (ICC: 0.920 and ICC: 0.936). There was no correlation between SI and BMD (p=0.818). On the other hand, there was a significant correlation between CTI and CCR and BMD (p=0.001). All measured indices were significantly different (p<0.05) between osteoporotic and non-osteoporotic patients. A CTI value less than 0.3 or a CCR value less than 0.47 reflects the presence of osteoporosis with 100% sensitivity and 98% specificity. Conclusion: SI is not reliable and does not correlate with BMD. However, both CTI and CCR showed good and excellent reliability, and each index correlated well with the actual BMD.
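The kappa coefficients reported for the Singh index measure chance-corrected agreement between two categorical gradings. A minimal sketch of Cohen's kappa on illustrative repeat gradings (not the study's data) shows the computation:

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two sets of categorical ratings
    (e.g. repeat Singh index grades by one observer)."""
    assert len(r1) == len(r2)
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement from the raters' marginal grade frequencies.
    expected = sum(c1[c] * c2[c] for c in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative Singh grades from two grading sessions:
first  = [3, 4, 4, 5, 2, 3, 4, 5, 3, 2]
second = [3, 5, 4, 4, 3, 3, 4, 5, 2, 2]
kappa = cohen_kappa(first, second)
```

Kappa suits the ordinal SI grades, while the continuous CTI and CCR measurements are assessed with the intraclass correlation coefficient instead.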

  16. Serum interleukin -8 is not a reliable marker for prediction of vesicoureteral reflux in children with febrile urinary tract infection

    Directory of Open Access Journals (Sweden)

    Abolfazl Mahyar

    2015-12-01

Objective: In view of the side effects of voiding cystourethrography (VCUG), identification of noninvasive markers predicting the presence of vesicoureteral reflux (VUR) is important. This study was conducted to determine the predictive value of serum interleukin-8 (IL-8) in the diagnosis of VUR in children with a first febrile urinary tract infection (UTI). Materials and Methods: Eighty children with a first febrile UTI were divided into two groups, with and without VUR, based on the results of VCUG. The sensitivity, specificity, positive and negative predictive values, positive and negative likelihood ratios, and accuracy of IL-8 for prediction of VUR were investigated. Results: Of the 80 children with febrile UTI, 30 (37.5%) had VUR. There was no significant difference between the children with and without VUR, or between the low- and high-grade VUR groups, in terms of serum concentration of IL-8 (P>0.05). Based on the ROC curve, the sensitivity, specificity, positive likelihood ratio, and accuracy of serum IL-8 were lower than those of erythrocyte sedimentation rate and C-reactive protein. Multivariate logistic regression analysis showed a significant positive correlation only between erythrocyte sedimentation rate and VUR. Conclusions: This study showed no significant difference between the children with and without VUR in terms of the serum concentration of IL-8. Therefore, it seems that serum IL-8 is not a reliable marker for prediction of VUR.
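The screening statistics investigated here all follow from a single 2x2 table of marker result against VCUG outcome. A minimal sketch, with hypothetical counts (80 children, 30 with VUR, as in the study's group sizes, but invented test results):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Screening-test statistics from a 2x2 table (marker vs. VCUG result)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),                       # positive predictive value
        "npv": tn / (tn + fn),                       # negative predictive value
        "lr_pos": sens / (1.0 - spec),               # positive likelihood ratio
        "lr_neg": (1.0 - sens) / spec,               # negative likelihood ratio
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts, chosen for illustration only -- not the IL-8 results:
metrics = diagnostic_metrics(tp=24, fp=10, fn=6, tn=40)
```

Comparing these quantities across candidate markers (IL-8, ESR, CRP) is what the study's ROC analysis does at each threshold.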

  17. Prediction of sand production onset in petroleum reservoirs using a reliable classification approach

    Directory of Open Access Journals (Sweden)

    Farhad Gharagheizi

    2017-06-01

It is shown that the developed model can accurately predict sand production in a real field. The results of this study indicate that implementation of LSSVM modeling can effectively help completion designers make a timely sand-control plan with the least deterioration of production.

  18. Identification of the best DFT functionals for a reliable prediction of lignin vibrational properties

    DEFF Research Database (Denmark)

    Barsberg, Soren

    2015-01-01

    Lignin is the most abundant aromatic plant polymer on earth. Useful information on its structure and interactions is gained by vibrational spectroscopy and relies on the quality of band assignments. B3LYP predictions were recently shown to support band assignments. Further progress calls...

  19. Are Available Models Reliable for Predicting the FRP Contribution to the Shear Resistance of RC Beams?

    DEFF Research Database (Denmark)

    Sas, G.; Täljsten, Björn; Barros, J.

    2009-01-01

    In this paper the trustworthiness of the existing theory for predicting the fiber-reinforced plastic contribution to the shear resistance of reinforced concrete beams is discussed. The most well-known shear models for external bonded reinforcement are presented, commented on, and compared...

  20. Human reliability in complex systems: an overview

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1976-07-01

A detailed analysis is presented of the main conceptual background underlying the areas of human reliability and human error. The concept of error is examined and generalized to that of human reliability, and some of the practical and methodological difficulties of reconciling the different standpoints of the human factors specialist and the engineer are discussed. Following a survey of general reviews available on human reliability, quantitative techniques for prediction of human reliability are considered. An in-depth critical analysis of the various quantitative methods is then presented, together with the data bank requirements for human reliability prediction. Reliability considerations in process control and nuclear plant, and also areas of design, maintenance, testing and emergency situations, are discussed. The effects of stress on human reliability are analysed and methods of minimizing these effects discussed. Finally, a summary is presented and proposals for further research are set out. (author)

  1. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01


    ABSTRACT

Uniaxial compressive strength (UCS) deals with materials' ability to withstand axially-directed pushing forces and is considered one of rock materials' most important mechanical properties. However, the UCS test is an expensive, very time-consuming test to perform in the laboratory and requires high-quality core samples having regular geometry. Empirical equations have thus been proposed for predicting UCS as a function of rocks' index properties. An analytical hierarchy process and multiple regression analysis based methodology was used (as opposed to traditional linear regression methods) on data-sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine-onyx samples were selected from morphological environments considering their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and index properties; positive correlation coefficients of 0.85 and 0.83 between the variables were determined by regression analysis. The methodology provided an appropriate alternative for quantitative estimation of UCS and avoided the need for tedious and time-consuming laboratory testing.
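The regression step of such a methodology can be sketched as an ordinary least-squares fit of UCS against index properties. The two predictors and all data points below are hypothetical (generated from an exact linear relation so the fit is easy to verify), not the Turkish carbonate data:

```python
def fit_ols(X, y):
    """Ordinary least squares (with intercept) via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    rows = [[1.0] + list(x) for x in X]
    p = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for i in reversed(range(p)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, p))) / A[i][i]
    return beta

# Hypothetical index properties: (point-load index MPa, porosity %) -> UCS MPa,
# generated from UCS = 20 + 15*Is - 2*n so the recovered coefficients are known.
samples = [((2, 5), 40.0), ((3, 4), 57.0), ((4, 8), 64.0),
           ((5, 2), 91.0), ((6, 6), 98.0), ((3, 7), 51.0)]
beta = fit_ols([x for x, _ in samples], [u for _, u in samples])
```

In the paper's workflow, the hierarchy analysis would first select which index properties enter `X`.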



  2. The nuclear power plant maintenance personnel reliability prediction (NPP/MPRP) effort at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.

    1982-01-01

    Human errors committed during maintenance activities are potentially a major contribution to the overall risk associated with the operation of a nuclear power plant (NPP). An NRC-sponsored program at Oak Ridge National Laboratory is attempting to develop a quantitative predictive technique to evaluate the contribution of maintenance errors to the overall NPP risk. The current work includes a survey of the requirements of potential users to ascertain the need for and content of the proposed quantitative model, plus an initial job/task analysis to determine the scope and applicability of various maintenance tasks. In addition, existing human reliability prediction models are being reviewed and assessed with respect to their applicability to NPP maintenance tasks. This paper discusses the status of the program and summarizes the results to date

  3. Prediction of selectivity from morphological conditions: Methodology and a case study on cod (Gadus morhua)

    DEFF Research Database (Denmark)

    Herrmann, Bent; Krag, Ludvig Ahm; Frandsen, Rikke

    2009-01-01

The FISHSELECT methodology, tools, and software were developed and used to measure the morphological parameters that determine the ability of cod to penetrate different mesh types, sizes, and openings. The shape of one cross-section at the cod's head was found to explain 97.6% of the mesh...

  4. Reliability residual-life prediction method for thermal aging based on performance degradation

    International Nuclear Information System (INIS)

    Ren Shuhong; Xue Fei; Yu Weiwei; Ti Wenxin; Liu Xiaotian

    2013-01-01

This paper studies the main pipeline of a nuclear power plant. The residual life of a main pipeline failing by thermal aging has been studied using performance degradation theory and Bayesian updating methods. Firstly, the thermal-aging impact-property degradation process of the main pipeline's austenitic stainless steel has been analyzed from accelerated thermal aging test data. Then, a thermal-aging residual-life prediction model based on the impact-property degradation data is built by Bayesian updating methods. Finally, these models are applied in practical situations. It is shown that the proposed methods are feasible and that the prediction accuracy meets the needs of the project. It also provides a foundation for scientific aging management of the main pipeline. (authors)
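The Bayesian updating step can be sketched with a conjugate normal model: a prior on the degradation rate from accelerated tests is combined with a field measurement, and the posterior rate yields a residual-life estimate. All numbers (impact-energy threshold, prior, measurement) are hypothetical illustrations, not plant data:

```python
import math

# Hypothetical numbers for illustration (not the study's data):
E0 = 200.0          # initial Charpy impact energy, J
E_FAIL = 80.0       # failure threshold, J
mu0, s0 = 3.0, 1.0  # prior on degradation rate r (J/year) from accelerated tests

def update_rate(mu, s, t, e_obs, s_meas):
    """Conjugate normal update of the degradation rate, given one field
    measurement e_obs = E0 - r*t + noise with std s_meas."""
    r_obs = (E0 - e_obs) / t        # rate implied by the observation
    s_obs = s_meas / t              # its standard deviation
    w = 1 / s ** 2 + 1 / s_obs ** 2
    mu_post = (mu / s ** 2 + r_obs / s_obs ** 2) / w
    return mu_post, math.sqrt(1 / w)

# One inspection at 10 years measuring 160 J with 5 J measurement noise:
mu1, s1 = update_rate(mu0, s0, t=10.0, e_obs=160.0, s_meas=5.0)
# Point estimate of residual life from the inspection until the threshold:
residual_life = (160.0 - E_FAIL) / mu1
```

Each further inspection would tighten the posterior (`s1 < s0`) and refine the residual-life estimate, which is the essence of the updating scheme.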

  5. The Reliability and Predictive Ability of a Biomarker of Oxidative DNA Damage on Functional Outcomes after Stroke Rehabilitation

    Science.gov (United States)

    Hsieh, Yu-Wei; Lin, Keh-Chung; Korivi, Mallikarjuna; Lee, Tsong-Hai; Wu, Ching-Yi; Wu, Kuen-Yuh

    2014-01-01

We evaluated the reliability of 8-hydroxy-2′-deoxyguanosine (8-OHdG), and determined its ability to predict functional outcomes in stroke survivors. The rehabilitation effect on 8-OHdG and functional outcomes were also assessed. Sixty-one stroke patients received a 4-week rehabilitation. Urinary 8-OHdG levels were determined by liquid chromatography–tandem mass spectrometry. The test-retest reliability of 8-OHdG was good (interclass correlation coefficient = 0.76). Upper-limb motor function and muscle power determined by the Fugl-Meyer Assessment (FMA) and Medical Research Council (MRC) scales before rehabilitation showed significant negative correlation with 8-OHdG (r = −0.38, r = −0.30; p < 0.05). After rehabilitation, we found a fair and significant correlation between 8-OHdG and FMA (r = −0.34) and 8-OHdG and pain (r = 0.26, p < 0.05). Baseline 8-OHdG was significantly correlated with post-treatment FMA, MRC, and pain scores (r = −0.34, −0.31, and 0.25; p < 0.05), indicating its ability to predict functional outcomes. 8-OHdG levels were significantly decreased, and functional outcomes were improved after rehabilitation. The exploratory study findings conclude that 8-OHdG is a reliable and promising biomarker of oxidative stress and could be a valid predictor of functional outcomes in patients. Monitoring of behavioral indicators along with biomarkers may have crucial benefits in translational stroke research. PMID:24743892
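The test-retest reliability reported here is an intraclass correlation coefficient. A minimal one-way random-effects ICC(1,1) on hypothetical repeat measurements (not the study's 8-OHdG values) illustrates the computation:

```python
def icc_oneway(pairs):
    """One-way random-effects ICC(1,1) for test-retest pairs (k = 2):
    (MSB - MSW) / (MSB + (k-1) * MSW) from a one-way ANOVA decomposition."""
    n, k = len(pairs), 2
    grand = sum(a + b for a, b in pairs) / (n * k)
    msb = k * sum(((a + b) / k - grand) ** 2 for a, b in pairs) / (n - 1)
    msw = sum((a - (a + b) / 2) ** 2 + (b - (a + b) / 2) ** 2
              for a, b in pairs) / n
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical repeat urinary 8-OHdG measurements per patient:
pairs = [(5.1, 5.4), (7.2, 6.8), (3.9, 4.2), (6.5, 6.9), (8.1, 7.6), (4.4, 4.1)]
icc = icc_oneway(pairs)
```

Values near 1 indicate that between-patient variation dominates measurement noise; conventionally, 0.76 (as in the study) is rated good reliability.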

  6. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling.

    Science.gov (United States)

    Knoops, Paul G M; Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W F; Jeelani, Owase; Dunaway, David J; Schievano, Silvia

    2018-01-01

Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a correlation analysis assessed the various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction under-estimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of the facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction, and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face.
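The key idea, replacing one deterministic prediction with a range obtained by sampling uncertain inputs, can be sketched without a real FEM. The toy surrogate below stands in for the solver, and the stiffness-ratio interval is an invented uncertainty, not a parameter from the paper:

```python
import random
import statistics

def soft_tissue_response(advancement_mm, stiffness_ratio):
    """Toy surrogate for the FEM: soft-tissue displacement as a fraction of
    the bony advancement, attenuated by an uncertain tissue stiffness ratio."""
    return advancement_mm * 0.9 / (1.0 + stiffness_ratio)

rng = random.Random(0)
advancement = 5.0                     # mm of maxillary advancement (illustrative)
# Sample the uncertain input uniformly, as in a DOE over its assumed range:
samples = [soft_tissue_response(advancement, rng.uniform(0.2, 0.6))
           for _ in range(10_000)]
lo, hi = min(samples), max(samples)   # the reported minimum-maximum band
median = statistics.median(samples)
```

Reporting `(lo, median, hi)` instead of a single number is what turns the deterministic prediction into the probability range the paper advocates.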

  7. The Reliability and Predictive Ability of a Biomarker of Oxidative DNA Damage on Functional Outcomes after Stroke Rehabilitation

    Directory of Open Access Journals (Sweden)

    Yu-Wei Hsieh

    2014-04-01

We evaluated the reliability of 8-hydroxy-2'-deoxyguanosine (8-OHdG), and determined its ability to predict functional outcomes in stroke survivors. The rehabilitation effect on 8-OHdG and functional outcomes were also assessed. Sixty-one stroke patients received a 4-week rehabilitation. Urinary 8-OHdG levels were determined by liquid chromatography–tandem mass spectrometry. The test-retest reliability of 8-OHdG was good (interclass correlation coefficient = 0.76). Upper-limb motor function and muscle power determined by the Fugl-Meyer Assessment (FMA) and Medical Research Council (MRC) scales before rehabilitation showed significant negative correlation with 8-OHdG (r = −0.38, r = −0.30; p < 0.05). After rehabilitation, we found a fair and significant correlation between 8-OHdG and FMA (r = −0.34) and 8-OHdG and pain (r = 0.26, p < 0.05). Baseline 8-OHdG was significantly correlated with post-treatment FMA, MRC, and pain scores (r = −0.34, −0.31, and 0.25; p < 0.05), indicating its ability to predict functional outcomes. 8-OHdG levels were significantly decreased, and functional outcomes were improved after rehabilitation. The exploratory study findings conclude that 8-OHdG is a reliable and promising biomarker of oxidative stress and could be a valid predictor of functional outcomes in patients. Monitoring of behavioral indicators along with biomarkers may have crucial benefits in translational stroke research.

  8. CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool.

    Directory of Open Access Journals (Sweden)

    Manuel Stemmer

Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5' end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, efficient and reliable in silico selection and evaluation of the targeting site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation, non-homologous end-joining as well as homology-directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high-quality target sites.

  9. Is epicardial adipose tissue, assessed by echocardiography, a reliable method for visceral adipose tissue prediction?

    Science.gov (United States)

    Silaghi, Alina Cristina; Poantă, Laura; Valea, Ana; Pais, Raluca; Silaghi, Horatiu

    2011-03-01

Epicardial adipose tissue is an ectopic fat store at the heart surface in direct contact with the coronary arteries. It is considered a metabolically active tissue, being a local source of pro-inflammatory factors that contribute to the pathogenesis of coronary artery disease. The aim of our study was to establish correlations between echocardiographic assessment of epicardial adipose tissue and anthropometric and ultrasound measurements of the central and peripheral fat depots. The study was conducted on 22 patients with or without coronary artery disease. Epicardial adipose tissue was measured using an Aloka Prosound α 10 machine with a 3.5-7.5 MHz variable-frequency transducer, and subcutaneous and visceral fat with an Esaote Megas GPX machine and a 3.5-7.5 MHz variable-frequency transducer. Epicardial adipose tissue measured by echocardiography is correlated with waist circumference (p < 0.05) and with visceral adipose tissue thickness measured by ultrasonography (US), and is not correlated with body mass index (p = 0.315), hip and thigh circumference, or subcutaneous fat thickness measured by US. Our study confirms that US assessment of epicardial fat correlates with anthropometric and US measurements of the central fat, representing an indirect but reliable marker of the visceral fat.

  10. Modified Inverse First Order Reliability Method (I-FORM) for Predicting Extreme Sea States.

    Energy Technology Data Exchange (ETDEWEB)

    Eckert-Gallup, Aubrey Celia; Sallaberry, Cedric Jean-Marie; Dallman, Ann Renee; Neary, Vincent Sinclair

    2014-09-01

Environmental contours describing extreme sea states are generated as the input for numerical or physical model simulations as a part of the standard current practice for designing marine structures to survive extreme sea states. Such environmental contours are characterized by combinations of significant wave height (Hs) and energy period (Te) values calculated for a given recurrence interval using a set of data based on hindcast simulations or buoy observations over a sufficient period of record. The use of the inverse first-order reliability method (IFORM) is standard design practice for generating environmental contours. In this paper, the traditional application of the IFORM to generating environmental contours representing extreme sea states is described in detail and its merits and drawbacks are assessed. The application of additional methods for analyzing sea state data, including the use of principal component analysis (PCA) to create an uncorrelated representation of the data under consideration, is proposed. A reexamination of the components of the IFORM application to the problem at hand, including the use of new distribution fitting techniques, is shown to contribute to the development of more accurate and reasonable representations of extreme sea states for use in survivability analysis for marine structures. Keywords: Inverse FORM, Principal Component Analysis, Environmental Contours, Extreme Sea State Characterization, Wave Energy Converters
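The traditional IFORM construction maps a circle of radius beta in standard-normal space through the fitted sea-state distributions. A minimal sketch follows; the Weibull marginal for Hs, the lognormal conditional for Te, the 3-hour sea-state duration, and all parameter values are illustrative assumptions, not fitted to any real record:

```python
import math
from statistics import NormalDist

def iform_contour(return_years, n_points=8):
    """Environmental (Hs, Te) contour via inverse FORM: points on a circle
    of radius beta in standard-normal (u1, u2) space are mapped through
    assumed marginal/conditional sea-state distributions."""
    states_per_year = 365.25 * 24 / 3            # one sea state every 3 h
    p_exceed = 1.0 / (return_years * states_per_year)
    beta = NormalDist().inv_cdf(1.0 - p_exceed)  # contour radius
    contour = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        u1, u2 = beta * math.cos(theta), beta * math.sin(theta)
        # Hs: Weibull marginal (scale 2.5 m, shape 1.4 -- illustrative),
        # via the inverse-CDF transform of Phi(u1).
        p1 = NormalDist().cdf(u1)
        hs = 2.5 * (-math.log(1.0 - p1)) ** (1.0 / 1.4)
        # Te given Hs: lognormal whose median grows with Hs (illustrative).
        mu = math.log(6.0 + 1.2 * math.sqrt(hs))
        te = math.exp(mu + 0.15 * u2)
        contour.append((hs, te))
    return beta, contour

beta, contour = iform_contour(return_years=100)
hs_max = max(hs for hs, _ in contour)
```

The PCA variant proposed in the paper would apply the same construction in the rotated, uncorrelated coordinates rather than directly to (Hs, Te).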

  11. Microgravity spheroids as a reliable, long-term tool for predictive toxicology

    DEFF Research Database (Denmark)

    Fey, S. J.; Wrzesinski, Krzysztof

    2013-01-01

    those seen in vivo. Studies with 5 common drugs (acetaminophen, amiodarone, metformin, phenformin, and valproic acid) have shown that they are more predictive of lethally-toxic plasma levels in vivo than published studies using primary human hepatocytes. Shotgun proteomics has revealed that the gain...... this time they are metabolically stable for at least 24 days more; grow slowly (a doubling time of >20 days); produce physiological levels of urea, cholesterol and ATP; exhibit stable gene expression (for selected liver relevant genes); and can post translationally modify proteins in a manner which mirrors...

  12. Linear Interaction Energy Based Prediction of Cytochrome P450 1A2 Binding Affinities with Reliability Estimation.

    Directory of Open Access Journals (Sweden)

    Luigi Capoferri

Prediction of human Cytochrome P450 (CYP) binding affinities of small ligands, i.e., substrates and inhibitors, represents an important task for predicting drug-drug interactions. A quantitative assessment of the ligand binding affinity towards different CYPs can provide an estimate of inhibitory activity or an indication of isoforms prone to interact with the substrate or inhibitors. However, the accuracy of global quantitative models for CYP substrate binding or inhibition based on traditional molecular descriptors can be limited, because of the lack of information on the structure and flexibility of the catalytic site of CYPs. Here we describe the application of a method that combines protein-ligand docking, Molecular Dynamics (MD) simulations and Linear Interaction Energy (LIE) theory, to allow for quantitative CYP affinity prediction. Using this combined approach, a LIE model for human CYP 1A2 was developed and evaluated, based on a structurally diverse dataset for which the estimated experimental uncertainty was 3.3 kJ mol-1. For the computed CYP 1A2 binding affinities, the model showed a root mean square error (RMSE) of 4.1 kJ mol-1 and a standard error in prediction (SDEP) in cross-validation of 4.3 kJ mol-1. A novel approach that includes information on both structural ligand description and protein-ligand interaction was developed for estimating the reliability of predictions, and was able to identify compounds from an external test set with a SDEP for the predicted affinities of 4.6 kJ mol-1 (corresponding to 0.8 pKi units).
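The LIE estimate itself is a simple linear combination of ensemble-averaged interaction-energy differences, dG = alpha*&lt;dV_vdW&gt; + beta*&lt;dV_el&gt; + gamma. In the sketch below, alpha = 0.18 is a commonly quoted LIE default, while beta, gamma, and all the energy/affinity data are purely illustrative, not the fitted CYP 1A2 model:

```python
import math

# Illustrative LIE coefficients (kJ/mol); not the paper's fitted parameters.
ALPHA, BETA_C, GAMMA = 0.18, 0.50, -2.0

def lie_dg(d_vdw, d_el):
    """LIE estimate: dG = alpha*<dV_vdW> + beta*<dV_el> + gamma,
    from bound-minus-free ensemble-averaged interaction energies."""
    return ALPHA * d_vdw + BETA_C * d_el + GAMMA

# Hypothetical ensemble-averaged energy differences and reference affinities:
ligands = [  # (<dV_vdW>, <dV_el>, experimental dG), all kJ/mol
    (-95.0, -20.0, -28.5),
    (-80.0, -35.0, -33.0),
    (-110.0, -10.0, -26.0),
    (-70.0, -25.0, -27.5),
]
preds = [lie_dg(v, e) for v, e, _ in ligands]
rmse = math.sqrt(sum((p - dg) ** 2
                     for p, (_, _, dg) in zip(preds, ligands)) / len(ligands))
```

In the paper's workflow, the averages would come from MD simulations of the docked and free ligand, and the coefficients would be fitted to the training set; the RMSE computed here corresponds to the 4.1 kJ mol-1 figure reported for the real model.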

  13. Methodology for maintenance analysis based on hydroelectric power stations reliability; Metodologia para realizar analisis de mantenimiento basado en confiabilidad en centrales hidroelectricas

    Energy Technology Data Exchange (ETDEWEB)

    Rea Soto, Rogelio; Calixto Rodriguez, Roberto; Sandoval Valenzuela, Salvador; Velasco Flores, Rocio; Garcia Lizarraga, Maria del Carmen [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2012-07-01

    A methodology to carry out Reliability Centered Maintenance (RCM) studies for hydroelectric power plants is presented. The methodology is an implementation/extension, by the authors, of the guidelines proposed by the Engineering Society for Advanced Mobility Land, Sea and Space in the SAE-JA1012 standard. To answer the first five questions set out in that standard, the component failure modes and mechanisms documented in the failure-data collection guide of the ISO-14224 standard are taken as a basis. This approach standardizes the description of equipment failure mechanisms, both in the RCM study and in the process of collecting failure and maintenance data, which feeds back into the continuous-improvement cycle of the RCM process. The use of risk matrices to rank the importance of each failure mechanism according to its risk level is also proposed.
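
    The risk-matrix ranking the record proposes can be sketched as a probability-times-severity index. The 1-5 scales and the example failure mechanisms below are assumptions for illustration, not entries from the SAE-JA1012 or ISO-14224 standards.

    ```python
    # Sketch of risk-matrix ranking for RCM failure mechanisms. Scales and
    # example entries are hypothetical.

    failure_modes = [
        # (description, probability 1-5, severity 1-5)
        ("turbine bearing overheating", 2, 5),
        ("governor valve sticking",     4, 3),
        ("cooling pump seal leak",      5, 2),
    ]

    def risk_score(prob, sev):
        # simple multiplicative risk index from the probability/severity matrix
        return prob * sev

    ranked = sorted(failure_modes, key=lambda m: risk_score(m[1], m[2]),
                    reverse=True)
    for desc, p, s in ranked:
        print(f"{risk_score(p, s):>2}  {desc}")
    ```

    Maintenance effort is then prioritized from the top of the ranked list downward.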

  14. Performance of synchrotron x-ray monochromators under heat load: How reliable are the predictions?

    International Nuclear Information System (INIS)

    Freund, A.K.; Hoszowska, J.; Migliore, J.-S.; Mocella, V.; Zhang, L.; Ferrero, C.

    2000-01-01

    With the ongoing development of insertion devices with smaller gaps, the heat load generated by modern synchrotron sources increases continuously. To predict the overall performance of experiments on beamlines it is of crucial importance to be able to predict the efficiency of x-ray optics, and in particular that of crystal monochromators. We report on a detailed comparison between theory and experiment for a water-cooled silicon crystal exposed to bending magnet radiation of up to 237 W total power and 1.3 W/mm2 power density. The thermal deformation has been calculated with the code ANSYS and its output has been injected into a finite difference code based on the Takagi-Taupin diffraction theory for distorted crystals. Several slit settings, filters and reflection orders were used to vary the geometrical conditions and the x-ray penetration depth in the crystal. In general, good agreement has been observed between the calculated and the measured values for the rocking curve width.

  15. Sparse Power-Law Network Model for Reliable Statistical Predictions Based on Sampled Data

    Directory of Open Access Journals (Sweden)

    Alexander P. Kartun-Giles

    2018-04-01

    Full Text Available A projective network model is a model that enables predictions to be made based on a subsample of the network data, with the predictions remaining unchanged if a larger sample is taken into consideration. An exchangeable model is a model that does not depend on the order in which nodes are sampled. Despite a large variety of non-equilibrium (growing) and equilibrium (static) sparse complex network models that are widely used in network science, how to reconcile sparseness (constant average degree) with the desired statistical properties of projectivity and exchangeability is currently an outstanding scientific problem. Here we propose a network process with hidden variables which is projective and can generate sparse power-law networks. Despite the model not being exchangeable, it can be closely related to exchangeable uncorrelated networks as indicated by its information theory characterization and its network entropy. The use of the proposed network process as a null model is here tested on real data, indicating that the model offers a promising avenue for statistical network modelling.
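
    The general hidden-variable idea can be sketched as follows: each node receives a hidden variable from a heavy-tailed distribution, and edges are drawn with probability proportional to the product of the endpoints' hidden variables, scaled by 1/N so the average degree stays constant as the network grows. This is a generic illustration of the hidden-variable construction only, not the authors' specific projective process.

    ```python
    # Generic hidden-variable sparse network sketch: edge (i, j) appears
    # with probability ~ h_i*h_j / (<h>*N), which keeps the expected average
    # degree ~ <h> regardless of N (sparseness). Hypothetical parameters.
    import random

    random.seed(1)
    N = 1000
    h = [random.paretovariate(2.5) for _ in range(N)]  # heavy-tailed hidden vars
    mean_h = sum(h) / N

    edges = set()
    for i in range(N):
        for j in range(i + 1, N):
            p = min(1.0, h[i] * h[j] / (mean_h * N))
            if random.random() < p:
                edges.add((i, j))

    avg_degree = 2 * len(edges) / N
    print(f"average degree ~ {avg_degree:.2f}")
    ```

    Because the expected degree of node i is approximately h_i, a heavy-tailed choice of hidden variables yields a power-law degree distribution while the average degree remains bounded.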

  16. Accurate and Reliable Prediction of the Binding Affinities of Macrocycles to Their Protein Targets.

    Science.gov (United States)

    Yu, Haoyu S; Deng, Yuqing; Wu, Yujie; Sindhikara, Dan; Rask, Amy R; Kimura, Takayuki; Abel, Robert; Wang, Lingle

    2017-12-12

    Macrocycles have been emerging as a very important drug class in the past few decades, largely due to their expanded chemical diversity benefiting from advances in synthetic methods. Macrocyclization has been recognized as an effective way to restrict the conformational space of acyclic small molecule inhibitors with the hope of improving potency, selectivity, and metabolic stability. Because of their relatively larger size as compared to typical small molecule drugs and the complexity of the structures, efficient sampling of the accessible macrocycle conformational space and accurate prediction of their binding affinities to their target protein receptors pose a great challenge of central importance in computational macrocycle drug design. In this article, we present a novel method for relative binding free energy calculations between macrocycles with different ring sizes and between the macrocycles and their corresponding acyclic counterparts. We have applied the method to seven pharmaceutically interesting data sets taken from recent drug discovery projects including 33 macrocyclic ligands covering a diverse chemical space. The predicted binding free energies are in good agreement with experimental data with an overall root-mean-square error (RMSE) of 0.94 kcal/mol. This is to our knowledge the first time that the free energy of the macrocyclization of linear molecules has been directly calculated with rigorous physics-based free energy calculation methods, and we anticipate the outstanding accuracy demonstrated here across a broad range of target classes may have significant implications for macrocycle drug discovery.

  17. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large scale structures. Structural loading variations over the lifetime of the plant are considered to be more difficult to analyse than those for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions which enter this problem are considered. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  18. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    International Nuclear Information System (INIS)

    Wimmer, Thomas; Srimathveeravalli, Govindarajan; Gutta, Narendra; Ezell, Paula C.; Monette, Sebastien; Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C.; Coleman, Jonathan A.; Solomon, Stephen B.

    2015-01-01

    Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes was simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005), with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  19. Computational Efficient Upscaling Methodology for Predicting Thermal Conductivity of Nuclear Waste forms

    International Nuclear Information System (INIS)

    Li, Dongsheng; Sun, Xin; Khaleel, Mohammad A.

    2011-01-01

    This study evaluated different upscaling methods for predicting the thermal conductivity of loaded nuclear waste form, a heterogeneous material system, and compared their efficiency and accuracy. The thermal conductivity of loaded nuclear waste form is an important input to the waste form Integrated Performance and Safety Code (IPSC). The effective thermal conductivity, obtained from microstructure information and the local thermal conductivities of the different components, is critical in predicting the life and performance of waste form during storage: how the heat generated during storage is dissipated depends directly on thermal conductivity, which in turn determines the mechanical deformation behavior, corrosion resistance and aging performance. Several methods, including the Taylor model, Sachs model, self-consistent model, and statistical upscaling models, were developed and implemented. In the absence of experimental data, predictions from the finite element method (FEM) were used as the reference for determining the accuracy of the upscaling models. Micrographs from different waste loadings were used in the prediction of thermal conductivity. The results demonstrated that, in terms of efficiency, the boundary models (Taylor and Sachs) outperform the self-consistent model, the statistical upscaling method and FEM. Balancing computational resources against accuracy, statistical upscaling is a computationally efficient method for predicting the effective thermal conductivity of nuclear waste form.
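
    The Taylor- and Sachs-type boundary models mentioned above can be sketched for a two-phase composite: the Taylor (uniform-field, Voigt-type) estimate is the volume-weighted arithmetic mean of the phase conductivities, and the Sachs (uniform-flux, Reuss-type) estimate is the harmonic mean. The phase values below are hypothetical, not data from the study.

    ```python
    # Taylor (arithmetic-mean, upper bound) and Sachs (harmonic-mean, lower
    # bound) estimates of effective thermal conductivity for a two-phase
    # waste form. Phase fractions and conductivities are illustrative.

    def taylor_bound(fractions, conductivities):
        """Volume-weighted arithmetic mean (uniform temperature gradient)."""
        return sum(f * k for f, k in zip(fractions, conductivities))

    def sachs_bound(fractions, conductivities):
        """Volume-weighted harmonic mean (uniform heat flux)."""
        return 1.0 / sum(f / k for f, k in zip(fractions, conductivities))

    # Hypothetical glass matrix (k = 1.1 W/m-K) with 30% ceramic load (k = 3.0).
    f = [0.7, 0.3]
    k = [1.1, 3.0]
    upper = taylor_bound(f, k)
    lower = sachs_bound(f, k)
    print(f"effective k bounds: {lower:.3f} - {upper:.3f} W/m-K")
    ```

    Any admissible effective conductivity (including self-consistent and FEM predictions) must fall between these two bounds, which is what makes them cheap sanity checks.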

  20. Understanding and predicting metallic whisker growth and its effects on reliability : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Michael, Joseph Richard; Grant, Richard P.; Rodriguez, Mark Andrew; Pillars, Jamin; Susan, Donald Francis; McKenzie, Bonnie Beth; Yelton, William Graham

    2012-01-01

    Tin (Sn) whiskers are conductive Sn filaments that grow from Sn-plated surfaces, such as surface finishes on electronic packages. The phenomenon of Sn whiskering has become a concern in recent years due to requirements for lead (Pb)-free soldering and surface finishes in commercial electronics. Pure Sn finishes are more prone to whisker growth than their Sn-Pb counterparts and high profile failures due to whisker formation (causing short circuits) in space applications have been documented. At Sandia, Sn whiskers are of interest due to increased use of Pb-free commercial off-the-shelf (COTS) parts and possible future requirements for Pb-free solders and surface finishes in high-reliability microelectronics. Lead-free solders and surface finishes are currently being used or considered for several Sandia applications. Despite the long history of Sn whisker research and the recently renewed interest in this topic, a comprehensive understanding of whisker growth remains elusive. This report describes recent research on characterization of Sn whiskers with the aim of understanding the underlying whisker growth mechanism(s). The report is divided into four sections and an Appendix. In Section 1, the Sn plating process is summarized. Specifically, the Sn plating parameters that were successful in producing samples with whiskers will be reviewed. In Section 2, the scanning electron microscopy (SEM) of Sn whiskers and time-lapse SEM studies of whisker growth will be discussed. This discussion includes the characterization of straight as well as kinked whiskers. In Section 3, a detailed discussion is given of SEM/EBSD (electron backscatter diffraction) techniques developed to determine the crystallography of Sn whiskers. In Section 4, these SEM/EBSD methods are employed to determine the crystallography of Sn whiskers, with a statistically significant number of whiskers analyzed. This is the largest study of Sn whisker crystallography ever reported. This section includes a

  1. Evaluation of a Propolis Water Extract Using a Reliable RP-HPLC Methodology and In Vitro and In Vivo Efficacy and Safety Characterisation

    Science.gov (United States)

    Rocha, Bruno Alves; Bueno, Paula Carolina Pires; Vaz, Mirela Mara de Oliveira Lima Leite; Nascimento, Andresa Piacezzi; Ferreira, Nathália Ursoli; Moreno, Gabriela de Padua; Rodrigues, Marina Rezende; Costa-Machado, Ana Rita de Mello; Barizon, Edna Aparecida; Campos, Jacqueline Costa Lima; de Oliveira, Pollyanna Francielli; Acésio, Nathália de Oliveira; Martins, Sabrina de Paula Lima; Tavares, Denise Crispim; Berretta, Andresa Aparecida

    2013-01-01

    Since the beginning of propolis research, several groups have studied its antibacterial, antifungal, and antiviral properties. However, most of these studies have only employed propolis ethanolic extract (PEE), leading to little knowledge about the biological activities of propolis water extract (PWE). Based on this, in a previous study, we demonstrated the anti-inflammatory and immunomodulatory activities of PWE. In order to better understand the equilibrium between effectiveness and toxicity, which is essential for a new medicine, the characteristics of PWE were analyzed. We developed and validated an RP-HPLC method to chemically characterize PWE and PEE, evaluated the in vitro antioxidant/antimicrobial activity of both extracts, and assessed the safety of PWE by determining its genotoxic potential using in vitro and in vivo mammalian micronucleus assays. We conclude that the proposed analytical methodology is reliable, and both extracts showed similar chemical composition. The extracts presented antioxidant and antimicrobial effects, while PWE demonstrated higher antioxidant activity and was more efficacious against most of the microorganisms tested than PEE. Finally, PWE was shown to be safe using micronucleus assays. PMID:23710228

  2. Evaluation of a Propolis Water Extract Using a Reliable RP-HPLC Methodology and In Vitro and In Vivo Efficacy and Safety Characterisation

    Directory of Open Access Journals (Sweden)

    Bruno Alves Rocha

    2013-01-01

    Full Text Available Since the beginning of propolis research, several groups have studied its antibacterial, antifungal, and antiviral properties. However, most of these studies have only employed propolis ethanolic extract (PEE), leading to little knowledge about the biological activities of propolis water extract (PWE). Based on this, in a previous study, we demonstrated the anti-inflammatory and immunomodulatory activities of PWE. In order to better understand the equilibrium between effectiveness and toxicity, which is essential for a new medicine, the characteristics of PWE were analyzed. We developed and validated an RP-HPLC method to chemically characterize PWE and PEE, evaluated the in vitro antioxidant/antimicrobial activity of both extracts, and assessed the safety of PWE by determining its genotoxic potential using in vitro and in vivo mammalian micronucleus assays. We conclude that the proposed analytical methodology is reliable, and both extracts showed similar chemical composition. The extracts presented antioxidant and antimicrobial effects, while PWE demonstrated higher antioxidant activity and was more efficacious against most of the microorganisms tested than PEE. Finally, PWE was shown to be safe using micronucleus assays.

  3. Advanced diagnostics and predictive maintenance to improve availability and reliability of ENEL plants

    Energy Technology Data Exchange (ETDEWEB)

    Cenci, V.; Ghironi, M.; Guidi, L.; Lauro, M.; Pestonesi, D. [ENEL (Italy). Generation and Energy Management Division

    2007-07-01

    This paper reviews the ENEL Generation and Energy Management strategy for diagnostics and predictive maintenance of power plants and provides a comprehensive description of effective applications and systems. Exploiting the most advanced information and communication technologies makes it possible to capture weak and hidden signals, and powerful processing can be used to discover forewarning symptoms and identify anomalies both in the process and, above all, inside the devices. The following systems and applications are presented together with results and impact on plant profitability: expert system for the diagnostics of plant main machinery; advanced diagnostics of 'intelligent' fieldbus devices such as on/off valve motor-driven actuators, control-valve positioners and pneumatic actuators, and transmitters; control loop and control valve diagnostics to investigate valve friction with an estimation of the residual time to failure; multisensorial diagnostics for coal transport and storage systems aimed at preventing fires and structural damage; and wireless sensor networks for the diagnostics of medium and small size components. 4 refs., 14 figs., 1 tab.

  4. Positive Skin Test or Specific IgE to Penicillin Does Not Reliably Predict Penicillin Allergy

    DEFF Research Database (Denmark)

    Tannert, Line Kring; Mørtz, Charlotte G; Skov, Per Stahl

    2017-01-01

    INTRODUCTION: According to guidelines, patients are diagnosed with penicillin allergy if skin test (ST) result or specific IgE (s-IgE) to penicillin is positive. However, the true sensitivity and specificity of these tests are presently not known. OBJECTIVE: To investigate the clinical relevance...... of a positive ST result and positive s-IgE and to study the reproducibility of ST and s-IgE. METHODS: A sample of convenience of 25 patients with positive penicillin ST results, antipenicillin s-IgE results, or both was challenged with their culprit penicillin. Further 19 patients were not challenged......-IgE measured (T0), and then skin tested and had s-IgE measured 4 weeks later (T1). RESULTS: Only 9 (36%) of 25 were challenge positive. There was an increased probability of being penicillin allergic if both ST result and s-IgE were positive at T0. Positive ST result or positive s-IgE alone did not predict...

  5. Inter-Investigator Reliability of Anthropometric Prediction of 1RM Bench Press in College Football Players.

    Science.gov (United States)

    Schumacher, Richard M; Arabas, Jana L; Mayhew, Jerry L; Brechue, William F

    2016-01-01

    The purpose of this study was to determine the effect of inter-investigator differences in anthropometric assessments on the prediction of one-repetition maximum (1RM) bench press in college football players. Division-II players (n = 34, age = 20.4 ± 1.2 y, 182.3 ± 6.6 cm, 99.1 ± 18.4 kg) were measured for selected anthropometric variables and 1RM bench press at the conclusion of a heavy resistance training program. Triceps, subscapular, and abdominal skinfolds were measured in triplicate by three investigators and used to estimate %fat. Arm circumference was measured around a flexed biceps muscle and was corrected for triceps skinfold to estimate muscle cross-sectional area (CSA). Chest circumference was measured at mid-expiration. Significant differences among the testers were evident in six of the nine anthropometric variables, with the least experienced tester being significantly different from the other testers on seven variables, although average differences among investigators ranged from 1-2% for circumferences to 4-9% for skinfolds. The two more experienced testers were significantly different on only one variable. Overall agreement among testers was high (ICC > 0.895) for each variable, with low coefficients of variation (CV) for the prediction of 1RM bench press using a non-performance anthropometric equation. Minimal experience in anthropometry may not impede strength and conditioning specialists from accurately estimating 1RM bench press.

  6. Semen molecular and cellular features: these parameters can reliably predict subsequent ART outcome in a goat model

    Directory of Open Access Journals (Sweden)

    Mereu Paolo

    2009-11-01

    Full Text Available Currently, the assessment of sperm function in a raw or processed semen sample is not able to reliably predict sperm ability to withstand freezing and thawing procedures and in vivo fertility and/or assisted reproductive biotechnologies (ART) outcome. The aim of the present study was to investigate which parameters, among a battery of analyses, could predict subsequent spermatozoa in vitro fertilization ability and hence blastocyst output in a goat model. Ejaculates were obtained by artificial vagina from 3 adult goats (Capra hircus) aged 2 years (A, B and C). In order to assess the predictive value of viability, computer-assisted sperm analyzer (CASA) motility parameters and intracellular ATP concentration before and after thawing, and of DNA integrity after thawing, on subsequent embryo output after an in vitro fertility test, a logistic regression analysis was used. Individual differences in semen parameters were evident for semen viability after thawing and DNA integrity. Results of the IVF test showed that spermatozoa collected from A and B led to higher cleavage rates (0

  7. Reliability of Degree-Day Models to Predict the Development Time of Plutella xylostella (L.) under Field Conditions.

    Science.gov (United States)

    Marchioro, C A; Krechemer, F S; de Moraes, C P; Foerster, L A

    2015-12-01

    The diamondback moth, Plutella xylostella (L.), is a cosmopolitan pest of brassicaceous crops occurring in regions with highly distinct climate conditions. Several studies have investigated the relationship between temperature and P. xylostella development rate, providing degree-day models for populations from different geographical regions. However, there are no data available to date to demonstrate the suitability of such models to make reliable projections on the development time for this species in field conditions. In the present study, 19 models available in the literature were tested regarding their ability to accurately predict the development time of two cohorts of P. xylostella under field conditions. Only 11 out of the 19 models tested accurately predicted the development time for the first cohort of P. xylostella, but only seven for the second cohort. Five models correctly predicted the development time for both cohorts evaluated. Our data demonstrate that the accuracy of the models available for P. xylostella varies widely and therefore should be used with caution for pest management purposes.
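
    A degree-day model of the kind tested in the record accumulates daily heat units above a base temperature until a thermal constant K is reached. The sketch below uses the simple daily-average method; the base temperature and thermal constant are placeholders, not parameters of any published P. xylostella model.

    ```python
    # Minimal degree-day accumulation sketch. Tbase and K are hypothetical.

    def degree_days(tmin, tmax, tbase):
        """One day's degree-days by the simple average method."""
        return max(0.0, (tmin + tmax) / 2.0 - tbase)

    def days_to_complete(daily_temps, tbase, K):
        """Return the day on which accumulated degree-days first reach K."""
        total = 0.0
        for day, (tmin, tmax) in enumerate(daily_temps, start=1):
            total += degree_days(tmin, tmax, tbase)
            if total >= K:
                return day
        return None  # development not completed within the record

    temps = [(12.0, 24.0)] * 40  # hypothetical constant weather
    print(days_to_complete(temps, tbase=7.0, K=290.0))
    ```

    Field validation, as in the study above, amounts to comparing the predicted completion day against observed development times under real temperature records.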

  8. Assessing the reliability, predictive and construct validity of historical, clinical and risk management-20 (HCR-20) in Mexican psychiatric inpatients.

    Science.gov (United States)

    Sada, Andrea; Robles-García, Rebeca; Martínez-López, Nicolás; Hernández-Ramírez, Rafael; Tovilla-Zarate, Carlos-Alfonso; López-Munguía, Fernando; Suárez-Alvarez, Enrique; Ayala, Xochitl; Fresán, Ana

    2016-08-01

    Assessing dangerousness to gauge the likelihood of future violent behaviour has become an integral part of clinical mental health practice in forensic and non-forensic psychiatric settings, and one of the most effective instruments for this is the Historical, Clinical and Risk Management-20 (HCR-20). To examine the HCR-20 factor structure in Mexican psychiatric inpatients and to obtain its predictive validity and reliability for use in this population, 225 patients diagnosed with psychotic, affective or personality disorders were included. The HCR-20 was applied at hospital admission and violent behaviours were assessed during psychiatric hospitalization using the Overt Aggression Scale (OAS). Construct validity, predictive validity and internal consistency were determined. Violent behaviour remained more severe in patients classified in the high-risk group during hospitalization. Fifteen items displayed adequate communalities in the original designated domains of the HCR-20, and the internal consistency of the instrument was high. The HCR-20 is a suitable instrument for predicting violence risk in Mexican psychiatric inpatients.

  9. A human hemi-cornea model for eye irritation testing: quality control of production, reliability and predictive capacity.

    Science.gov (United States)

    Engelke, M; Zorn-Kruppa, M; Gabel, D; Reisinger, K; Rusche, B; Mewes, K R

    2013-02-01

    We have developed a 3-dimensional human hemi-cornea which comprises an immortalized epithelial cell line and keratocytes embedded in a collagen stroma. In the present study, we have used MTT reduction of the whole tissue to clarify whether the production of this complex 3-D-model is transferable into other laboratories and whether these tissues can be constructed reproducibly. Our results demonstrate the reproducible production of the hemi-cornea model according to standard operation procedures using 15 independent batches of reconstructed hemi-cornea models in two independent laboratories each. Furthermore, the hemi-cornea tissues have been treated with 20 chemicals of different eye-irritating potential under blind conditions to assess the performance and limitations of our test system comparing three different prediction models. The most suitable prediction model revealed an overall in vitro-in vivo concordance of 80% and 70% in the participating laboratories, respectively, and an inter-laboratory concordance of 80%. Sensitivity of the test was 77% and specificity was between 57% and 86% to discriminate classified from non-classified chemicals. We conclude that additional physiologically relevant endpoints in both epithelium and stroma have to be developed for the reliable prediction of all GHS classes of eye irritation in one stand alone test system. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Reliability of CKD-EPI predictive equation in estimating chronic kidney disease prevalence in the Croatian endemic nephropathy area.

    Science.gov (United States)

    Fuček, Mirjana; Dika, Živka; Karanović, Sandra; Vuković Brinar, Ivana; Premužić, Vedran; Kos, Jelena; Cvitković, Ante; Mišić, Maja; Samardžić, Josip; Rogić, Dunja; Jelaković, Bojan

    2018-02-15

    Chronic kidney disease (CKD) is a significant public health problem and it is not possible to precisely predict its progression to terminal renal failure. According to current guidelines, CKD stages are classified based on the estimated glomerular filtration rate (eGFR) and albuminuria. The aims of this study were to determine the reliability of a predictive equation in estimating CKD prevalence in Croatian areas with endemic nephropathy (EN), to compare the results with non-endemic areas, and to determine whether the prevalence of CKD stages 3-5 was increased in subjects with EN. A total of 1573 inhabitants of the Croatian Posavina rural area from 6 endemic and 3 non-endemic villages were enrolled. Participants were classified according to the modified criteria of the World Health Organization for EN. Estimated GFR was calculated using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation. The results showed a very high CKD prevalence in the Croatian rural area (19%). CKD prevalence was significantly higher in EN than in non-EN villages, with the lowest eGFR value in the diseased subgroup. eGFR correlated significantly with the diagnosis of EN. Kidney function assessment using the CKD-EPI predictive equation proved to be a good marker in differentiating the study subgroups and remains one of the diagnostic criteria for EN.
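
    The CKD-EPI equation used in the study is a published formula combining serum creatinine, age and sex. A sketch of the 2009 creatinine form is below; the example inputs are hypothetical, and the coefficients should be checked against the original publication before any clinical use.

    ```python
    # CKD-EPI 2009 creatinine equation (eGFR in mL/min/1.73 m^2).
    # Coefficients per the published equation; example inputs hypothetical.

    def ckd_epi_egfr(scr_mg_dl, age, female, black=False):
        kappa = 0.7 if female else 0.9
        alpha = -0.329 if female else -0.411
        ratio = scr_mg_dl / kappa
        egfr = (141.0 * min(ratio, 1.0) ** alpha
                * max(ratio, 1.0) ** -1.209
                * 0.993 ** age)
        if female:
            egfr *= 1.018
        if black:
            egfr *= 1.159
        return egfr

    # Hypothetical 60-year-old woman with serum creatinine 0.7 mg/dL:
    print(round(ckd_epi_egfr(0.7, 60, female=True), 1))
    ```

    Staging then follows the eGFR thresholds in the guidelines (e.g. stages 3-5 below 60 mL/min/1.73 m^2), which is how the study's prevalence figures are derived.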

  11. Systematic methodology and property prediction of fatty systems for process design/analysis in the oil and fat industry

    DEFF Research Database (Denmark)

    Díaz Tovar, Carlos Axel; Ceriani, Roberta; Gani, Rafiqul

    2010-01-01

    A systematic model-based methodology has been developed and its application highlighted through the solvent recovery section of a soybean oil extraction process, with emphasis on the effect of design variables on the process performance. First, the most representative compounds present in the vegetable oil were defined. Basic and critical properties were then computed by means of appropriate property prediction software. Temperature-dependent properties were modeled using and extending available correlations. The process model was developed through the PRO II commercial simulator and validated......

  12. Methodology for Analyzing and Predicting the Runoff and Sediment into a Reservoir

    Directory of Open Access Journals (Sweden)

    Chun-Feng Hao

    2017-06-01

    Full Text Available With the rapid economic growth in China, a large number of hydropower projects have been planned and constructed. Sediment deposition in reservoirs is one of the most important disputes during construction and operation, because many of the rivers are heavily sediment-laden. The analysis and prediction of the runoff and sediment into a reservoir is of great significance for reservoir operation. With knowledge of the incoming runoff and sediment characteristics, the regulator can adjust the reservoir discharge to guarantee the water supply, and flush more sediment at appropriate times. In this study, the long-term characteristics of runoff and sediment, including trend, jump point, and change cycle, are analyzed using various statistical approaches, such as accumulated anomaly analysis, the Fisher ordered clustering method, and Maximum Entropy Spectral Analysis (MESA). Based on these characteristics, a prediction model is established using the Auto-Regressive Integrated Moving Average (ARIMA) method. The whole analysis and prediction system is applied to the Three Gorges Project (TGP), one of the biggest hydropower-complex projects in the world. Taking the hydrologic series from 1955 to 2010 as the research objective, the results show that both the runoff and the sediment are decreasing, and the reduction rate of sediment is much higher. Runoff and sediment into the TGP display cyclic variations over time, with a cycle of about a decade, but change points for runoff and sediment appear in 1991 and 2001, respectively. Prediction models are thus built based on monthly average hydrologic series from 2003 to 2010. ARIMA(1, 1, 1) × (1, 1, 1)12 and ARIMA(0, 1, 1) × (0, 1, 1)12 are selected for the runoff and sediment predictions, respectively, and the parameters of the models are also calibrated. The analysis of the autocorrelation and partial autocorrelation coefficients of the residuals indicates that the models built in this study are feasible.
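
    Behind a seasonal model such as ARIMA(0, 1, 1) × (0, 1, 1)12 is the differencing step w_t = (1 - B)(1 - B^12) y_t, which removes both the trend and the annual cycle before the moving-average terms are fitted. The sketch below demonstrates only this differencing step on synthetic monthly data, not the TGP series or a full model fit.

    ```python
    # Seasonal differencing (1 - B)(1 - B^12) on a synthetic monthly series
    # with trend + annual cycle + noise. Not the TGP data.
    import numpy as np

    rng = np.random.default_rng(0)
    months = np.arange(120)
    y = (100.0 - 0.1 * months
         + 20.0 * np.sin(2 * np.pi * months / 12)
         + rng.normal(0, 1, 120))

    d1 = np.diff(y)          # (1 - B) y : first (non-seasonal) difference
    w = d1[12:] - d1[:-12]   # (1 - B^12) d1 : seasonal lag-12 difference

    # Trend and seasonality are removed: mean ~ 0, variance far below raw.
    print(round(float(w.mean()), 3), round(float(w.var()), 2))
    ```

    The ARIMA orders (p, d, q) × (P, D, Q)12 are then chosen from the autocorrelation and partial autocorrelation structure of w, as the record describes for the residual diagnostics.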

  13. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    Science.gov (United States)

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer, primarily affecting women. Continuous research is going on for detecting breast cancer in the early stage, as the possibility of cure in early stages is bright. There are two main objectives of this study: first, to establish statistics for breast cancer, and second, to find methodologies that can be helpful in early-stage detection of breast cancer based on previous studies. The breast cancer statistics for incidence and mortality of the UK, US, India and Egypt were considered for this study. The findings show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening, but in the case of India and Egypt the picture is less positive because of lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms, which provides a strong bridge for improving the classification and detection accuracy of breast cancer data.

  14. Positive Skin Test or Specific IgE to Penicillin Does Not Reliably Predict Penicillin Allergy.

    Science.gov (United States)

    Tannert, Line Kring; Mortz, Charlotte Gotthard; Skov, Per Stahl; Bindslev-Jensen, Carsten

    According to guidelines, patients are diagnosed with penicillin allergy if the skin test (ST) result or specific IgE (s-IgE) to penicillin is positive. However, the true sensitivity and specificity of these tests are presently not known. To investigate the clinical relevance of a positive ST result and positive s-IgE and to study the reproducibility of ST and s-IgE. A sample of convenience of 25 patients with positive penicillin ST results, antipenicillin s-IgE results, or both was challenged with their culprit penicillin. A further 19 patients were not challenged but were deemed allergic on the basis of a recent anaphylactic reaction or delayed reactions to skin testing. Another sample of convenience of 18 patients, 17 overlapping with the 25 challenged, with initial skin testing and s-IgE a median of 25 (range, 3-121) months earlier (T₋₁), was repeat skin tested and had s-IgE measured (T₀), and then skin tested and had s-IgE measured 4 weeks later (T₁). Only 9 (36%) of 25 were challenge positive. There was an increased probability of being penicillin allergic if both ST result and s-IgE were positive at T₀. A positive ST result or positive s-IgE alone did not predict penicillin allergy. Among the 18 patients repeatedly tested, 46.2% (12 of 25) of positive ST results at T₋₁ were reproducibly positive at T₀. For s-IgE, 54.2% (14 of 24) of positive measurements were still positive at T₀, and 7 converted to positive at T₁. The best predictor of a clinically significant (IgE-mediated) penicillin allergy is a combination of a positive case history with simultaneously positive ST result and s-IgE, or a positive challenge result. Copyright © 2017 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  15. Prediction of the phase equilibria of methane hydrates using the direct phase coexistence methodology

    Energy Technology Data Exchange (ETDEWEB)

    Michalis, Vasileios K.; Costandy, Joseph; Economou, Ioannis G., E-mail: ioannis.economou@qatar.tamu.edu [Chemical Engineering Program, Texas A and M University at Qatar, P.O. Box 23847, Doha (Qatar); Tsimpanogiannis, Ioannis N.; Stubos, Athanassios K. [Environmental Research Laboratory, National Center for Scientific Research NCSR “Demokritos,” Aghia Paraskevi, Attiki GR-15310 (Greece)

    2015-01-28

    The direct phase coexistence method is used for the determination of the three-phase coexistence line of sI methane hydrates. Molecular dynamics (MD) simulations are carried out in the isothermal–isobaric ensemble in order to determine the coexistence temperature (T₃) at four different pressures, namely, 40, 100, 400, and 600 bar. Methane bubble formation that results in supersaturation of water with methane is generally avoided. The observed stochasticity of the hydrate growth and dissociation processes, which can be misleading in the determination of T₃, is treated with long simulations in the range of 1000–4000 ns and a relatively large number of independent runs. Statistical averaging of 25 runs per pressure results in T₃ predictions that are found to deviate systematically by approximately 3.5 K from the experimental values. This is in good agreement with the deviation of 3.15 K between the prediction of the TIP4P/Ice water force field used and the experimental melting temperature of ice Ih. The current results offer the most consistent and accurate predictions from MD simulation for the determination of T₃ of methane hydrates. Methane solubility values are also calculated at the predicted equilibrium conditions and are found to be in good agreement with continuum-scale models.

  16. An EMD-ANN based prediction methodology for DR driven smart household load demand

    NARCIS (Netherlands)

    Tascikaraoglu, A.; Paterakis, N.G.; Catalaõ, J.P.S.; Erdinç, O.; Bakirtzis, A.G.

    2015-01-01

    This study proposes a model for the prediction of smart household load demand influenced by a dynamic pricing demand response (DR) program. Price-based DR programs have a considerable impact on household demand pattern due to the expected choice of customers or their home energy management systems

  17. Genome-based prediction of common diseases: Methodological considerations for future research

    NARCIS (Netherlands)

    A.C.J.W. Janssens (Cécile); P. Tikka-Kleemola (Päivi)

    2009-01-01

    textabstractThe translation of emerging genomic knowledge into public health and clinical care is one of the major challenges for the coming decades. At the moment, genome-based prediction of common diseases, such as type 2 diabetes, coronary heart disease and cancer, is still not informative. Our

  18. ESTABLISHING EMPIRICAL RELATION TO PREDICT TEMPERATURE DIFFERENCE OF VORTEX TUBE USING RESPONSE SURFACE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    PRABAKARAN J.

    2012-12-01

    Full Text Available A vortex tube is a device that produces cold and hot air simultaneously from a source of compressed air. In this work an attempt has been made to investigate the effect of three controllable input variables, namely diameter of the orifice, diameter of the nozzle and inlet pressure, on the cold-side temperature difference using Response Surface Methodology (RSM). Experiments were conducted using a central composite design with three factors at three levels. The influence of the vital parameters and the interactions among them are investigated using analysis of variance (ANOVA). The proposed mathematical model fits the experimental values well within a 95% confidence interval. It is found that the inlet pressure and the diameter of the nozzle are the significant factors that affect the performance of the vortex tube.
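
    The second-order (quadratic) response-surface model form used in such RSM studies can be sketched directly. In the sketch below the coded coefficients are hypothetical placeholders, not the fitted values from this work:

```python
def response_surface(x, b0, lin, quad, inter):
    """Second-order RSM model in coded variables:
    y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj) for i < j."""
    y = b0
    n = len(x)
    for i in range(n):
        y += lin[i] * x[i] + quad[i] * x[i] ** 2
    k = 0
    for i in range(n):
        for j in range(i + 1, n):
            y += inter[k] * x[i] * x[j]
            k += 1
    return y

# Hypothetical coded coefficients for (orifice diameter, nozzle diameter, inlet pressure)
b0, lin, quad, inter = 14.0, [1.2, 2.5, 4.0], [-0.8, -1.1, -0.5], [0.3, 0.1, 0.2]
dT_center = response_surface([0, 0, 0], b0, lin, quad, inter)  # center point -> b0
dT_high_p = response_surface([0, 0, 1], b0, lin, quad, inter)  # +1 coded pressure
```

In an actual study the coefficients would be estimated by least squares from the central composite design runs and screened with ANOVA, as the abstract describes.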

  19. A comparative study of finite element methodologies for the prediction of torsional response of bladed rotors

    International Nuclear Information System (INIS)

    Scheepers, R.; Heyns, P. S.

    2016-01-01

    The prevention of torsional vibration-induced fatigue damage to turbo-generators requires determining natural frequencies by either field testing or mathematical modelling. Torsional excitation methods, measurement techniques and mathematical modelling are active fields of research. However, these aspects are mostly considered in isolation and often without experimental verification. The objective of this work is to compare one-dimensional (1D), full three-dimensional (3D) and 3D cyclic symmetric (3DCS) finite element (FE) methodologies for torsional vibration response. Results are compared to experimental results for a small-scale test rotor. It is concluded that 3D approaches are feasible given current computing technology and require less simplification, with potentially increased accuracy. The accuracy of 1D models may be reduced due to simplifications, but faster solution times are obtained. For high levels of accuracy, model updating using field test results is recommended.

  20. Prediction of ultrasonic pulse velocity for enhanced peat bricks using adaptive neuro-fuzzy methodology.

    Science.gov (United States)

    Motamedi, Shervin; Roy, Chandrabhushan; Shamshirband, Shahaboddin; Hashim, Roslan; Petković, Dalibor; Song, Ki-Il

    2015-08-01

    Ultrasonic pulse velocity is affected by defects in material structure. This study applied soft computing techniques to predict the ultrasonic pulse velocity for various peat and cement content mixtures over several curing periods. First, this investigation constructed a process to simulate the ultrasonic pulse velocity with an adaptive neuro-fuzzy inference system (ANFIS). Then, an ANFIS network was developed whose input and output layers consisted of four and one neurons, respectively. The four inputs were cement, peat, and sand content (%) and curing period (days). The simulation results showed efficient performance of the proposed system. The ANFIS and experimental results were compared through the coefficient of determination and root-mean-square error. In conclusion, use of the ANFIS network enhances the prediction of strength. The simulation results confirmed the effectiveness of the suggested strategies. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
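
    The single-integral form described above can be implemented directly with the trapezoidal rule. The sketch below is a generic illustration, not the authors' code: P(slip) = ∫ f_req(u) · F_avail(u) du, i.e. the probability that the available friction falls below the required friction. Normal distributions (with illustrative means and SDs, not values from the study) are used here only so the result can be checked against the closed form:

```python
import math

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def norm_cdf(x, mu, sd):
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2))))

def p_slip(pdf_req, cdf_avail, lo=0.0, hi=1.5, n=2000):
    """Trapezoidal rule for P(slip) = integral of f_req(u) * F_avail(u) du.
    Works for any combination of friction distributions, as the study proposes."""
    h = (hi - lo) / n
    total = 0.5 * (pdf_req(lo) * cdf_avail(lo) + pdf_req(hi) * cdf_avail(hi))
    for k in range(1, n):
        u = lo + k * h
        total += pdf_req(u) * cdf_avail(u)
    return total * h

# Hypothetical normal friction distributions (illustrative parameter values)
mu_a, sd_a, mu_r, sd_r = 0.50, 0.08, 0.30, 0.06
p = p_slip(lambda u: norm_pdf(u, mu_r, sd_r), lambda u: norm_cdf(u, mu_a, sd_a))

# For two normals the closed form is Phi((mu_r - mu_a) / sqrt(sd_a^2 + sd_r^2))
exact = norm_cdf(mu_r, mu_a, math.sqrt(sd_a ** 2 + sd_r ** 2))
```

Swapping either lambda for a non-normal density or CDF exercises the study's main point: the input distributions need not be assumed normal, and the result is sensitive to that choice.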

  2. An analytical methodology to predict the coating characteristics of plasma-sprayed ceramic powders

    International Nuclear Information System (INIS)

    Varacalle, D.J. Jr.

    1990-01-01

    Experimental and analytical studies have been conducted at the Idaho National Engineering Laboratory (INEL) to investigate gas, particle, and coating dynamics in the plasma spray process. Nine experiments were conducted using a Taguchi statistical parametric approach. The thermal plasma produced by the commercial plasma spray torch and the related plasma/particle interaction were then numerically modeled from the cathode tip to varied standoff distances in the free plume for the nine experiments, which ranged in power from 28 to 43 kW. The flow and temperature fields in the plasma were solved using the governing conservation equations with suitable boundary conditions. This information was then used as boundary conditions to solve the plasma/particle interaction problem for the nine experiments. The particle dynamics (10- to 75-μm particles) for a yttria-stabilized zirconia powder were then simulated by computer. Particle morphology is discussed with respect to the changes in the process parameters. The predicted temperature and velocity of the zirconia particles were then used as initial conditions to a coating dynamics code. The code predicts the thickness and porosity of the zirconia coatings for the specific process parameters. The predicted coating characteristics exhibit reasonable correlation with the actual characteristics obtained from the Taguchi experimental studies. 12 refs., 7 figs., 6 tabs

  3. Forecasting method for global radiation time series without training phase: Comparison with other well-known prediction methodologies

    International Nuclear Information System (INIS)

    Voyant, Cyril; Motte, Fabrice; Fouilloy, Alexis; Notton, Gilles; Paoli, Christophe; Nivet, Marie-Laure

    2017-01-01

    Integration of unpredictable renewable energy sources into electrical networks intensifies the complexity of grid management due to their intermittent and unforeseeable nature. Because of the strong increase in solar power generation, the prediction of solar yields becomes more and more important. Electrical operators need an estimation of the future production. For nowcasting and short-term forecasting, the usual techniques based on machine learning need large historical data sets of good quality during the training phase of the predictors. However, such data are not always available and require advanced maintenance of meteorological stations, making these methods inapplicable for poorly instrumented or isolated sites. In this work, we propose intuitive methodologies based on the Kalman filter (also known as linear quadratic estimation), able to predict a global radiation time series without the need for historical data. The accuracy of these methods is compared to other classical data-driven methods, for different horizons of prediction and time steps. The proposed approach shows interesting capabilities, improving the prediction quasi-systematically. For one- to 10-h horizons, Kalman model performances are competitive in comparison to more sophisticated models such as ANN, which require both consistent historical data sets and computational resources. - Highlights: • Solar radiation forecasting with time series formalism. • Trainless approach compared to machine learning methods. • Very simple method dedicated to solar irradiation forecasting with high accuracy.
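
    A minimal scalar Kalman filter of the kind the abstract alludes to can produce one-step-ahead forecasts with no training phase. The sketch below assumes a random-walk "local level" state model; the process and measurement variances q and r are assumed values for illustration, not the authors' tuning:

```python
def kalman_forecast(obs, q=0.01, r=0.25):
    """Scalar local-level Kalman filter.
    State: underlying irradiation level; q = process variance, r = measurement
    variance. Returns one-step-ahead predictions: preds[t] is issued before
    obs[t] is seen (preds[0] is trivially the first observation)."""
    x, p = obs[0], 1.0          # initialise on the first observation
    preds = [x]
    for z in obs[1:]:
        # predict step: random-walk state model, so the level estimate is
        # carried forward and its uncertainty grows by q
        p = p + q
        preds.append(x)
        # update step: blend prediction and new measurement via the gain
        k = p / (p + r)         # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
    return preds
```

No historical archive is needed: the filter starts from the first observation and adapts online, which is exactly the property that makes such approaches attractive for poorly instrumented sites.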

  4. Methodologic Guide for Evaluating Clinical Performance and Effect of Artificial Intelligence Technology for Medical Diagnosis and Prediction.

    Science.gov (United States)

    Park, Seong Ho; Han, Kyunghwa

    2018-03-01

    The use of artificial intelligence in medicine is currently an issue of great interest, especially with regard to the diagnostic or predictive analysis of medical images. Adoption of an artificial intelligence tool in clinical practice requires careful confirmation of its clinical utility. Herein, the authors explain key methodology points involved in a clinical evaluation of artificial intelligence technology for use in medicine, especially high-dimensional or overparameterized diagnostic or predictive models in which artificial deep neural networks are used, mainly from the standpoints of clinical epidemiology and biostatistics. First, statistical methods for assessing the discrimination and calibration performances of a diagnostic or predictive model are summarized. Next, the effects of disease manifestation spectrum and disease prevalence on the performance results are explained, followed by a discussion of the difference between evaluating the performance with use of internal and external datasets, the importance of using an adequate external dataset obtained from a well-defined clinical cohort to avoid overestimating the clinical performance as a result of overfitting in high-dimensional or overparameterized classification model and spectrum bias, and the essentials for achieving a more robust clinical evaluation. Finally, the authors review the role of clinical trials and observational outcome studies for ultimate clinical verification of diagnostic or predictive artificial intelligence tools through patient outcomes, beyond performance metrics, and how to design such studies. © RSNA, 2018.

  5. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The different methods have been categorized into time-independent, time-dependent and hybrid methods, of which the last group comprises methods that use additional data beyond historical earthquake statistics. This categorization distinguishes pure statistical approaches, for which historical earthquake data are the only direct data source, from algorithms that incorporate further information, e.g. spatial data of fault distributions, or physical models like static triggering to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods which can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used or not. Target temporal scales are identified, as well as the publication history. All these different aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and to give an overview of the state-of-the-art.
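
    One of the reviewed ingredients, the modified Omori law for aftershock rate decay, is compact enough to sketch. The parameter values K, c and p below are illustrative placeholders, not values from the catalogue:

```python
def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate n(t) = K / (t + c)**p (t in days)."""
    return K / (t + c) ** p

def omori_count(T, K=100.0, c=0.1, p=1.1):
    """Expected number of aftershocks in [0, T], closed form for p != 1."""
    return K / (p - 1.0) * (c ** (1.0 - p) - (T + c) ** (1.0 - p))

# Numeric sanity check of the closed form via the trapezoidal rule
n, T = 30000, 30.0
h = T / n
numeric = 0.5 * (omori_rate(0.0) + omori_rate(T)) * h
numeric += sum(omori_rate(k * h) for k in range(1, n)) * h
```

Time-dependent forecasting methods of the kind catalogued here build on such decay laws; the time-independent group relies on long-term rates instead.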

  6. Prediction of failure enthalpy and reliability of irradiated fuel rod under reactivity-initiated accidents by means of statistical approach

    International Nuclear Information System (INIS)

    Nam, Cheol; Choi, Byeong Kwon; Jeong, Yong Hwan; Jung, Youn Ho

    2001-01-01

    During the last decade, the failure behavior of high-burnup fuel rods under RIA has been an extensive concern since observations of fuel rod failures at low enthalpy. Great importance is placed on the failure prediction of fuel rods from the standpoint of licensing criteria and safety in extending burnup achievement. To address the issue, a statistics-based methodology is introduced to predict the failure probability of irradiated fuel rods. Based on RIA simulation results in the literature, a failure enthalpy correlation for irradiated fuel rods is constructed as a function of oxide thickness, fuel burnup, and pulse width. From the failure enthalpy correlation, a single damage parameter, equivalent enthalpy, is defined to reflect the effects of the three primary factors as well as peak fuel enthalpy. Moreover, the failure distribution function with equivalent enthalpy is derived, applying a two-parameter Weibull statistical model. Using these equations, a sensitivity analysis is carried out to estimate the effects of burnup, corrosion, peak fuel enthalpy, pulse width and the cladding materials used.
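
    The two-parameter Weibull failure distribution over the equivalent-enthalpy damage parameter can be sketched as follows. The scale and shape parameters below are hypothetical placeholders, not the calibrated values from the study:

```python
import math

def failure_probability(eq_enthalpy, scale, shape):
    """Two-parameter Weibull failure distribution:
    F(h) = 1 - exp(-(h / scale)**shape),
    where h is the equivalent enthalpy damage parameter."""
    if eq_enthalpy <= 0.0:
        return 0.0
    return 1.0 - math.exp(-(eq_enthalpy / scale) ** shape)

# Hypothetical parameters: scale = 150 (characteristic enthalpy, cal/g), shape = 4
prob = failure_probability(120.0, 150.0, 4.0)
```

By construction F(scale) = 1 − 1/e ≈ 0.632, so the scale parameter is the equivalent enthalpy at which roughly 63% of rods would be predicted to fail; the shape parameter controls how sharply failure probability rises around that point.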

  7. Do in-training evaluation reports deserve their bad reputations? A study of the reliability and predictive ability of ITER scores and narrative comments.

    Science.gov (United States)

    Ginsburg, Shiphra; Eva, Kevin; Regehr, Glenn

    2013-10-01

    Although scores on in-training evaluation reports (ITERs) are often criticized for poor reliability and validity, ITER comments may yield valuable information. The authors assessed across-rotation reliability of ITER scores in one internal medicine program, ability of ITER scores and comments to predict postgraduate year three (PGY3) performance, and reliability and incremental predictive validity of attendings' analysis of written comments. Numeric and narrative data from the first two years of ITERs for one cohort of residents at the University of Toronto Faculty of Medicine (2009-2011) were assessed for reliability and predictive validity of third-year performance. Twenty-four faculty attendings rank-ordered comments (without scores) such that each resident was ranked by three faculty. Mean ITER scores and comment rankings were submitted to regression analyses; dependent variables were PGY3 ITER scores and program directors' rankings. Reliabilities of ITER scores across nine rotations for 63 residents were 0.53 for both postgraduate year one (PGY1) and postgraduate year two (PGY2). Interrater reliabilities across three attendings' rankings were 0.83 for PGY1 and 0.79 for PGY2. There were strong correlations between ITER scores and comments within each year (0.72 and 0.70). Regressions revealed that PGY1 and PGY2 ITER scores collectively explained 25% of variance in PGY3 scores and 46% of variance in PGY3 rankings. Comment rankings did not improve predictions. ITER scores across multiple rotations showed decent reliability and predictive validity. Comment ranks did not add to the predictive ability, but correlation analyses suggest that trainee performance can be measured through these comments.

  8. A core-monitoring based methodology for predictions of graphite weight loss in AGR moderator bricks

    Energy Technology Data Exchange (ETDEWEB)

    McNally, K., E-mail: kevin.mcnally@hsl.gsi.gov.uk [Health and Safety Laboratory, Harpur Hill, Buxton, Derbyshire SK17 9JN (United Kingdom); Warren, N. [Health and Safety Laboratory, Harpur Hill, Buxton, Derbyshire SK17 9JN (United Kingdom); Fahad, M.; Hall, G.; Marsden, B.J. [Nuclear Graphite Research Group, School of MACE, University of Manchester, Manchester M13 9PL (United Kingdom)

    2017-04-01

    Highlights: • A statistically-based methodology for estimating graphite density is presented. • Graphite shrinkage is accounted for using a finite element model. • Differences in weight loss forecasts were found when compared to the existing model. - Abstract: Physically based models, resolved using the finite element (FE) method are often used to model changes in dimensions and the associated stress fields of graphite moderator bricks within a reactor. These models require inputs that describe the loading conditions (temperature, fluence and weight loss ‘field variables’), and coded relationships describing the behaviour of graphite under these conditions. The weight loss field variables are calculated using a reactor chemistry/physics code FEAT DIFFUSE. In this work the authors consider an alternative data source of weight loss: that from a longitudinal dataset of density measurements made on small samples trepanned from operating reactors during statutory outages. A nonlinear mixed-effect model is presented for modelling the age and depth-related trends in density. A correction that accounts for irradiation-induced dimensional changes (axial and radial shrinkage) is subsequently applied. The authors compare weight loss forecasts made using FEAT DIFFUSE with those based on an alternative statistical model for a layer four moderator brick for the Hinkley Point B, Reactor 3. The authors compare the two approaches for the weight loss distribution through the brick with a particular focus on the interstitial keyway, and for the average (over the volume of the brick) weight loss.

  9. Methodology for predicting market transformation due to implementation of energy efficiency standards and labels

    International Nuclear Information System (INIS)

    Mahlia, T.M.I.

    2004-01-01

    There are many papers that have been published on energy efficiency standards and labels. However, a very limited number of articles on the subject have discussed the transformation of appliance energy efficiency in the market after the programs are implemented. This paper is an attempt to investigate the market transformation due to implementation of minimum energy efficiency standards and energy labels. Even though the paper only investigates room air conditioners as a case study, the method is also applicable for predicting market transformation for other household electrical appliances

  10. Methodology to predict the initiation of multiple transverse fractures from horizontal wellbores

    Energy Technology Data Exchange (ETDEWEB)

    Crosby, D. G.; Yang, Z.; Rahman, S. S. [Univ. of New South Wales (Australia)

    2001-10-01

    The criterion based on Drucker and Prager which is designed to predict the pressure required to initiate secondary multiple transverse fractures in close proximity to primary fractures is discussed. Results based on this criterion compare favorably with those measured during a series of laboratory-scale hydraulic fracture interaction tests. It is concluded that the multiple fracture criterion and laboratory results demonstrate that transversely fractured horizontal wellbores have a limited capacity to resist the initiation of multiple fractures from adjacent perforations, or intersecting induced and natural fractures. 23 refs., 1 tab., 9 figs.

  11. Prediction methodologies for target scene generation in the aerothermal targets analysis program (ATAP)

    Science.gov (United States)

    Hudson, Douglas J.; Torres, Manuel; Dougherty, Catherine; Rajendran, Natesan; Thompson, Rhoe A.

    2003-09-01

    The Air Force Research Laboratory (AFRL) Aerothermal Targets Analysis Program (ATAP) is a user-friendly, engineering-level computational tool that features integrated aerodynamics, six-degree-of-freedom (6-DoF) trajectory/motion, convective and radiative heat transfer, and thermal/material response to provide an optimal blend of accuracy and speed for design and analysis applications. ATAP is sponsored by the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility at Eglin AFB, where it is used with the CHAMP (Composite Hardbody and Missile Plume) technique for rapid infrared (IR) signature and imagery predictions. ATAP capabilities include an integrated 1-D conduction model for up to 5 in-depth material layers (with options for gaps/voids with radiative heat transfer), fin modeling, several surface ablation modeling options, a materials library with over 250 materials, options for user-defined materials, selectable/definable atmosphere and earth models, multiple trajectory options, and an array of aerodynamic prediction methods. All major code modeling features have been validated with ground-test data from wind tunnels, shock tubes, and ballistics ranges, and flight-test data for both U.S. and foreign strategic and theater systems. Numerous applications include the design and analysis of interceptors, booster and shroud configurations, window environments, tactical missiles, and reentry vehicles.
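
    The integrated 1-D conduction model mentioned above can be illustrated with an explicit finite-difference step. This is a generic sketch, not the ATAP implementation; the grid, Fourier number and temperatures are illustrative:

```python
def conduction_step(T, fo):
    """One explicit finite-difference (FTCS) step of 1-D heat conduction.
    fo = alpha * dt / dx**2 is the grid Fourier number; the scheme is
    stable for fo <= 0.5. End temperatures are held fixed (Dirichlet)."""
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + fo * (T[i + 1] - 2.0 * T[i] + T[i - 1])
    return new

# Aerodynamically heated surface at 500 K over a 300 K substrate (illustrative)
T = [500.0] + [300.0] * 9
for _ in range(200):
    T = conduction_step(T, fo=0.4)
```

A tool like ATAP layers several such material slabs (with contact or radiative gaps between them) and couples the surface node to the convective and radiative heating computed from the trajectory; the sketch keeps only the in-depth conduction core.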

  12. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
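
    The core arithmetic of a series-system reliability roll-up of this kind can be sketched as follows. This is a simplified illustration, not SRFYDO itself: component reliabilities are estimated as Beta-posterior means from hypothetical pass/fail test data and multiplied for the series system:

```python
def beta_posterior_mean(successes, trials, a=1.0, b=1.0):
    """Posterior mean of a component reliability under a Beta(a, b) prior
    with binomial test data; a = b = 1 is a uniform prior."""
    return (successes + a) / (trials + a + b)

def series_reliability(components):
    """Series system: every component must work, so reliabilities multiply."""
    r = 1.0
    for successes, trials in components:
        r *= beta_posterior_mean(successes, trials)
    return r

# Hypothetical test data: (successes, trials) for two components in series
system_r = series_reliability([(9, 10), (19, 20)])
```

A full analysis like SRFYDO's would carry the entire posterior distributions (hence the uncertainty estimates it reports) and model reliability as a function of age and usage covariates, rather than collapsing each component to a point estimate.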

  13. Predictive methodology to address PWSCC of Alloy 600 locations in PWRS

    International Nuclear Information System (INIS)

    Rao, G.V.

    1992-01-01

    Contributing factors to primary water stress corrosion cracking (PWSCC) are susceptible microstructure, temperature, and residual and applied stresses. In order to predict PWSCC of Inconel 600 components in PWR-type reactors, a number of steps were taken. All Inconel 600 components were located, and their fabrication history, weld procedures and material properties were identified. Service temperatures and approximate stresses were determined. Precise service stress evaluations of Inconel 600 locations were made by finite element and other analytical methods. Using data analysis, relative PWSCC susceptibility evaluations of Inconel 600 locations were made on the basis of the Westinghouse RSI model. Finally, a prioritized inspection plan for Inconel 600 locations was developed and recommendations provided. 11 figs., 2 tabs

  14. Predicting activities of daily living for cancer patients using an ontology-guided machine learning methodology.

    Science.gov (United States)

    Min, Hua; Mobahi, Hedyeh; Irvin, Katherine; Avramovic, Sanja; Wojtusiak, Janusz

    2017-09-16

    Bio-ontologies are becoming increasingly important in knowledge representation and in the machine learning (ML) field. This paper presents a ML approach that incorporates bio-ontologies and its application to the SEER-MHOS dataset to discover patterns of patient characteristics that impact the ability to perform activities of daily living (ADLs). Bio-ontologies are used to provide computable knowledge for ML methods to "understand" biomedical data. This retrospective study included 723 cancer patients from the SEER-MHOS dataset. Two ML methods were applied to create predictive models for ADL disabilities for the first year after a patient's cancer diagnosis. The first method is a standard rule learning algorithm; the second is that same algorithm additionally equipped with methods for reasoning with ontologies. The models showed that a patient's race, ethnicity, smoking preference, treatment plan and tumor characteristics including histology, staging, cancer site, and morphology were predictors for ADL performance levels one year after cancer diagnosis. The ontology-guided ML method was more accurate at predicting ADL performance levels than the method without ontologies. This study demonstrated that bio-ontologies can be harnessed to provide medical knowledge for ML algorithms. The presented method demonstrates that encoding specific types of hierarchical relationships to guide rule learning is possible, and can be extended to other types of semantic relationships present in biomedical ontologies. The ontology-guided ML method achieved better performance than the method without ontologies. The presented method can also be used to promote the effectiveness and efficiency of ML in healthcare, in which use of background knowledge and consistency with existing clinical expertise is critical.

  15. Laser Reliability Prediction

    Science.gov (United States)

    1975-08-01


  16. Predictive Models for Different Roughness Parameters During Machining Process of Peek Composites Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Mata-Cabrera Francisco

    2013-10-01

Full Text Available Polyetheretherketone (PEEK) composite belongs to a group of high performance thermoplastic polymers and is widely used in structural components. To improve the mechanical and tribological properties, short fibers are added as reinforcement to the material. Due to its functional properties and potential applications, it is important to investigate the machinability of non-reinforced PEEK (PEEK), PEEK reinforced with 30% carbon fibers (PEEK CF30), and PEEK reinforced with 30% glass fibers (PEEK GF30) to determine the optimal conditions for the manufacture of the parts. The present study establishes the relationship between the cutting conditions (cutting speed and feed rate) and the roughness parameters (Ra, Rt, Rq, Rp) by developing second order mathematical models. The experiments were planned as per a full factorial design of experiments, and an analysis of variance has been performed to check the adequacy of the models. These state the adequacy of the derived models to obtain predictions for roughness parameters within the ranges of parameters investigated during the experiments. The experimental results show that the most influential cutting parameter is the feed rate; furthermore, they prove that glass fiber reinforcements produce worse machinability.
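The second-order response-surface models described here have the general form Ra = b0 + b1·v + b2·f + b11·v² + b22·f² + b12·v·f. A minimal sketch of fitting such a model by least squares, on synthetic data (the ranges, coefficients, and noise level are assumed, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.uniform(100, 300, 30)    # cutting speed (m/min), assumed range
f = rng.uniform(0.05, 0.3, 30)   # feed rate (mm/rev), assumed range
# synthetic "true" surface: feed rate dominates, as the study reports
ra = 0.5 + 0.001 * v + 8.0 * f + 0.002 * v * f + rng.normal(0, 0.05, 30)

# design matrix for the full second-order model
X = np.column_stack([np.ones_like(v), v, f, v**2, f**2, v * f])
coef, *_ = np.linalg.lstsq(X, ra, rcond=None)

def predict(v_, f_):
    """Evaluate the fitted response surface at one cutting condition."""
    return coef @ np.array([1.0, v_, f_, v_**2, f_**2, v_ * f_])
```

The fitted `coef` plays the role of the regression coefficients reported from the factorial experiments; ANOVA on the residuals would then check model adequacy.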

  17. A Compact Methodology to Understand, Evaluate, and Predict the Performance of Automatic Target Recognition

    Science.gov (United States)

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Chen, Yiping; Zhuang, Zhaowen; Cheng, Yongqiang; Deng, Bin; Wang, Liandong; Zeng, Yonghu; Gao, Lei

    2014-01-01

This paper offers a compact mechanism for carrying out performance evaluation of an automatic target recognition (ATR) system: (a) a standard description of the ATR system's output is suggested, a quantity to indicate the operating condition is presented based on the principle of feature extraction in pattern recognition, and a series of indexes to assess the output in different aspects are developed with the application of statistics; (b) performance of the ATR system is interpreted by a quality factor based on knowledge of engineering mathematics; (c) through a novel utility called "context-probability" estimation proposed based on probability, performance prediction for an ATR system is realized. The simulation result shows that the performance of an ATR system can be accounted for and forecasted by the above-mentioned measures. Compared to existing technologies, the novel method can offer more objective performance conclusions for an ATR system. These conclusions may be helpful in knowing the practical capability of the tested ATR system. At the same time, the generalization performance of the proposed method is good. PMID:24967605

  18. Re-establishing the pecking order: Niche models reliably predict suitable habitats for the reintroduction of red-billed oxpeckers.

    Science.gov (United States)

    Kalle, Riddhika; Combrink, Leigh; Ramesh, Tharmalingam; Downs, Colleen T

    2017-03-01

Distributions of avian mutualists are affected by changes in biotic interactions and environmental conditions driven directly/indirectly by human actions. The range contraction of red-billed oxpeckers (Buphagus erythrorhynchus) in South Africa is partly a result of the widespread use of acaracides (i.e., mainly cattle dips), toxic to both ticks and oxpeckers. We predicted the habitat suitability of red-billed oxpeckers in South Africa using ensemble models to assist the ongoing reintroduction efforts and to identify new reintroduction sites for population recovery. The distribution of red-billed oxpeckers was influenced by moderate to high tree cover, woodland habitats, and starling density (a proxy for cavity-nesting birds) with regard to nest-site characteristics. Consumable resources (host and tick density), bioclimate, surface water body density, and proximity to protected areas were other influential predictors. Our models estimated 42,576.88-98,506.98 km² of highly suitable habitat (0.5-1) covering the majority of Limpopo, Mpumalanga, North West, a substantial portion of northern KwaZulu-Natal (KZN) and the Gauteng Province. Niche models reliably predicted suitable habitat in 40%-61% of the reintroduction sites where breeding is currently successful. Ensemble, boosted regression trees and generalized additive models predicted few suitable areas in the Eastern Cape and south of KZN that are part of the historic range. A few southern areas in the Northern Cape, outside the historic range, also had suitable sites predicted. Our models are a promising decision support tool for guiding reintroduction programs at macroscales. Apart from active reintroductions, conservation programs should encourage farmers and/or landowners to use oxpecker-compatible agrochemicals and set up adequate nest boxes to facilitate the population recovery of the red-billed oxpecker, particularly in human-modified landscapes. To ensure long-term conservation success, we suggest that

  19. A General Design Methodology for Synchronous Early-Completion-Prediction Adders in Nano-CMOS DSP Architectures

    Directory of Open Access Journals (Sweden)

    Mauro Olivieri

    2013-01-01

    Full Text Available Synchronous early-completion-prediction adders (ECPAs are used for high clock rate and high-precision DSP datapaths, as they allow a dominant amount of single-cycle operations even if the worst-case carry propagation delay is longer than the clock period. Previous works have also demonstrated ECPA advantages for average leakage reduction and NBTI effects reduction in nanoscale CMOS technologies. This paper illustrates a general systematic methodology to design ECPA units, targeting nanoscale CMOS technologies, which is not available in the current literature yet. The method is fully compatible with standard VLSI macrocell design tools and standard adder structures and includes automatic definition of critical test patterns for postlayout verification. A design example is included, reporting speed and power data superior to previous works.
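The premise behind early-completion-prediction adders is that the worst-case carry propagation is rare: for random operands, the longest carry-propagate chain is usually far shorter than the adder width, so most additions can complete in a single short cycle. This can be illustrated with a toy simulation (a demonstration of the premise, not the paper's design methodology):

```python
import random

def longest_carry_chain(a, b, width):
    """Length of the longest run of bit positions that propagate a
    carry (positions where a_i XOR b_i == 1)."""
    longest = run = 0
    for i in range(width):
        if ((a >> i) & 1) ^ ((b >> i) & 1):
            run += 1
            longest = max(longest, run)
        else:
            run = 0
    return longest

random.seed(1)
WIDTH = 64
chains = [longest_carry_chain(random.getrandbits(WIDTH),
                              random.getrandbits(WIDTH), WIDTH)
          for _ in range(2000)]
avg = sum(chains) / len(chains)
# The average longest chain grows roughly like log2(width),
# far below the 64-bit worst case.
```

This is why an ECPA can clock faster than the worst-case ripple delay and pay a second cycle only on the rare long-chain operands.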

  20. Remote Sensing-based Methodologies for Snow Model Adjustments in Operational Streamflow Prediction

    Science.gov (United States)

    Bender, S.; Miller, W. P.; Bernard, B.; Stokes, M.; Oaida, C. M.; Painter, T. H.

    2015-12-01

    Water management agencies rely on hydrologic forecasts issued by operational agencies such as NOAA's Colorado Basin River Forecast Center (CBRFC). The CBRFC has partnered with the Jet Propulsion Laboratory (JPL) under funding from NASA to incorporate research-oriented, remotely-sensed snow data into CBRFC operations and to improve the accuracy of CBRFC forecasts. The partnership has yielded valuable analysis of snow surface albedo as represented in JPL's MODIS Dust Radiative Forcing in Snow (MODDRFS) data, across the CBRFC's area of responsibility. When dust layers within a snowpack emerge, reducing the snow surface albedo, the snowmelt rate may accelerate. The CBRFC operational snow model (SNOW17) is a temperature-index model that lacks explicit representation of snowpack surface albedo. CBRFC forecasters monitor MODDRFS data for emerging dust layers and may manually adjust SNOW17 melt rates. A technique was needed for efficient and objective incorporation of the MODDRFS data into SNOW17. Initial development focused in Colorado, where dust-on-snow events frequently occur. CBRFC forecasters used retrospective JPL-CBRFC analysis and developed a quantitative relationship between MODDRFS data and mean areal temperature (MAT) data. The relationship was used to generate adjusted, MODDRFS-informed input for SNOW17. Impacts of the MODDRFS-SNOW17 MAT adjustment method on snowmelt-driven streamflow prediction varied spatially and with characteristics of the dust deposition events. The largest improvements occurred in southwestern Colorado, in years with intense dust deposition events. Application of the method in other regions of Colorado and in "low dust" years resulted in minimal impact. The MODDRFS-SNOW17 MAT technique will be implemented in CBRFC operations in late 2015, prior to spring 2016 runoff. 
Collaborative investigation of remote sensing-based adjustment methods for the CBRFC operational hydrologic forecasting environment will continue over the next several years.
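A hypothetical sketch of the kind of MODDRFS-to-MAT adjustment described above: map a retrieved dust radiative forcing to an additive temperature offset through a linear relationship, then feed the adjusted MAT to SNOW17. The sensitivity constant is an invented placeholder, not the CBRFC's fitted relationship.

```python
# assumed sensitivity (deg C of MAT offset per W/m^2 of dust forcing);
# a placeholder, not the CBRFC/JPL-derived value
DEG_C_PER_WM2 = 0.02

def adjusted_mat(mat_c, dust_forcing_wm2):
    """Return mean areal temperature adjusted upward in proportion to
    dust radiative forcing, mimicking faster melt under the lowered
    snow-surface albedo that a temperature-index model cannot see."""
    return mat_c + DEG_C_PER_WM2 * dust_forcing_wm2

# zero forcing leaves the input unchanged; strong forcing warms it,
# which accelerates the SNOW17 temperature-index melt computation
```

The point of the construction is that SNOW17 need not be modified internally: the remotely sensed albedo information enters only through its temperature input.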

  1. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

Excerpts: "...inverters connected in a chain" (Figure 3: "Typical graph showing frequency versus square root of..."); "...developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and... or FIT of the device. In other words an accurate estimate of the device lifetime was found and thus the reliability that can be conveniently..."

  2. The reliability of test results from simple test samples in predicting the fatigue performance of automotive components

    International Nuclear Information System (INIS)

    Fourlaris, G.; Ellwood, R.; Jones, T.B.

    2007-01-01

The use of high strength steels (HSS) in automotive components is steadily increasing as automotive designers use modern steel grades to improve structural performance, reduce vehicle weight and enhance crash performance. Weight reduction can be achieved by substituting mild steel with a thinner gauge HSS; however, it must be ensured that no deterioration in performance, including fatigue capability, occurs. In this study, tests have been carried out to determine the effects that gauge and material strength have on the fatigue performance of a fusion welded automotive suspension arm. Current finite element (FE) modelling and fatigue prediction techniques have been evaluated to determine their reliability when used for thin strip steels. Results have shown the fatigue performance of welded components to be independent of the strength of the parent material for the steel grades studied, with material thickness and joining process the key features determining the fatigue performance. The correlation between the fatigue performance of simple welded samples under uniaxial, constant amplitude loading and complex components under biaxial in-service road load data has been shown to be unreliable. This study also indicates that with the application of modern technologies, such as tailor-welded blanks (TWB), significant weight savings can be achieved. This is demonstrated by a 19% weight reduction with no detrimental effect on the fatigue performance.

  3. Reliability prediction for the vehicles equipped with advanced driver assistance systems (ADAS and passive safety systems (PSS

    Directory of Open Access Journals (Sweden)

    Balbir S. Dhillon

    2012-10-01

Full Text Available The human error has been reported as a major root cause of road accidents in today's world. The human driver in road vehicles, which are composed of human, mechanical and electrical components, is constantly exposed to changing surroundings (e.g., road conditions, environment) which deteriorate the driver's capacities, leading to a potential accident. The auto industries and transportation authorities have realized that, similar to other complex and safety sensitive transportation systems, road vehicles need to rely on both advanced technologies, i.e., Advanced Driver Assistance Systems (ADAS), and Passive Safety Systems (PSS) (e.g., seatbelts, airbags) in order to mitigate the risk of accidents and casualties. In this study, the advantages and disadvantages of ADAS as active safety systems, as well as of passive safety systems in road vehicles, have been discussed. Also, this study proposes models that analyze the interactions between the human driver, ADAS Warning and Crash Avoidance Systems, and PSS in the design of vehicles. Thereafter, mathematical models have been developed to make reliability predictions at any given time on the road for vehicles equipped with ADAS and PSS. Finally, the implications of this study for the improvement of vehicle designs and the prevention of casualties are discussed.
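One simple way to express the driver/ADAS interaction described above is a parallel (redundant) reliability structure with constant failure rates: a mishap requires both the driver and the safety layer to fail. The rates below are illustrative placeholders, not values from the study.

```python
import math

def reliability(rate_per_hr, t_hr):
    """R(t) = exp(-lambda * t) for a constant failure rate."""
    return math.exp(-rate_per_hr * t_hr)

def vehicle_reliability(t_hr, lam_driver=1e-3, lam_adas=1e-4):
    """Parallel combination of driver and ADAS safety layer:
    R_sys = 1 - (1 - R_driver) * (1 - R_adas)."""
    r_d = reliability(lam_driver, t_hr)
    r_a = reliability(lam_adas, t_hr)
    return 1.0 - (1.0 - r_d) * (1.0 - r_a)

# the combined system is always at least as reliable as the driver alone
```

Richer models of this kind add state transitions for warnings, crash avoidance, and passive protection, but the parallel structure is the basic building block.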

  4. Parts and Components Reliability Assessment: A Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia

    2009-01-01

System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA) and Fault Tree Analysis (FTA), to assess risks, perform design tradeoffs, and therefore to ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standards-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by the United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development and hard failure data is not yet available, or when manufacturers are not contractually obliged by their customers to publish the reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success in an efficient manner, at low cost, and on a tight schedule.
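The standards-based parts-count idea can be sketched in a few lines: a series system's failure rate is the sum of its parts' rates, and MTBF is its reciprocal. The part list and failure rates below are placeholders, not values from MIL-HDBK-217F or Bellcore.

```python
# part failure rates in failures per 10^6 hours (assumed placeholders)
PARTS = {
    "microcontroller": 0.25,
    "voltage_regulator": 0.10,
    "connector": 0.05,
    "capacitor_bank": 0.08,
}

# series-system failure rate: any part failure fails the system
lam_total = sum(PARTS.values())

# mean time between failures, in hours
mtbf_hours = 1e6 / lam_total
```

Handbook methods refine the base rates with environment, quality, and stress factors, but the additive series structure is the core of the prediction.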

  5. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high... A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis are presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study... on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical components, the IGBTs. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed.
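PoF-based lifetime models for IGBT modules under thermal cycling are often of Coffin-Manson/Arrhenius form: a power law in junction temperature swing multiplied by an exponential term in mean junction temperature. A sketch with invented constants (illustrative placeholders, not calibrated values from the paper):

```python
import math

A = 3.0e14        # assumed scale factor
ALPHA = -5.0      # assumed Coffin-Manson exponent
EA_OVER_K = 7000  # assumed activation energy / Boltzmann constant (K)

def cycles_to_failure(delta_tj_k, tj_mean_k):
    """Nf = A * dTj^alpha * exp(Ea / (k * Tm)): predicted number of
    thermal cycles to failure for a given swing and mean junction
    temperature."""
    return A * delta_tj_k**ALPHA * math.exp(EA_OVER_K / tj_mean_k)

# a larger temperature swing shortens predicted life sharply
```

Mission profile analysis then counts the temperature cycles a converter actually sees (e.g., by rainflow counting) and accumulates damage against this lifetime curve.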

  6. Increasing of prediction reliability of calcium carbonate scale formation in heat exchanger of secondary coolant circuits of thermal and nuclear power plants

    International Nuclear Information System (INIS)

    Tret'yakov, O.V.; Kritskij, V.G.; Styazhkin, P.S.

    1991-01-01

Calcium carbonate scale formation in the secondary circuit heat exchangers of thermal and nuclear power plants is investigated. A model of calcium carbonate scale formation is developed that provides quite reliable prediction of the process and the possibility of controlling it by adjusting the parameters of the hydrochemical regime (HCR). The results can be used when designing an automatic control system for the HCR.

  7. Korean Brain Aging Study for the Early Diagnosis and Prediction of Alzheimer's Disease: Methodology and Baseline Sample Characteristics.

    Science.gov (United States)

    Byun, Min Soo; Yi, Dahyun; Lee, Jun Ho; Choe, Young Min; Sohn, Bo Kyung; Lee, Jun-Young; Choi, Hyo Jung; Baek, Hyewon; Kim, Yu Kyeong; Lee, Yun-Sang; Sohn, Chul-Ho; Mook-Jung, Inhee; Choi, Murim; Lee, Yu Jin; Lee, Dong Woo; Ryu, Seung-Ho; Kim, Shin Gyeom; Kim, Jee Wook; Woo, Jong Inn; Lee, Dong Young

    2017-11-01

The Korean Brain Aging Study for the Early Diagnosis and Prediction of Alzheimer's disease (KBASE) aimed to recruit 650 individuals, aged from 20 to 90 years, to search for new biomarkers of Alzheimer's disease (AD) and to investigate how multi-faceted lifetime experiences and bodily changes contribute to the brain changes or brain pathologies related to the AD process. All participants received comprehensive clinical and neuropsychological evaluations, multi-modal brain imaging, including magnetic resonance imaging, magnetic resonance angiography, [11C]Pittsburgh compound B-positron emission tomography (PET), and [18F]fluorodeoxyglucose-PET, blood and genetic marker analyses at baseline, and a subset of participants underwent actigraph monitoring and completed a sleep diary. Participants are to be followed annually with clinical and neuropsychological assessments, and biannually with the full KBASE assessment, including neuroimaging and laboratory tests. As of March 2017, in total, 758 individuals had volunteered for this study. Among them, 591 participants-291 cognitively normal (CN) old-aged individuals, 74 CN young- and middle-aged individuals, 139 individuals with mild cognitive impairment (MCI), and 87 individuals with AD dementia (ADD)-were enrolled at baseline, after excluding 162 individuals. A subset of participants (n=275) underwent actigraph monitoring. The KBASE cohort is a prospective, longitudinal cohort study that recruited participants with a wide age range and a wide distribution of cognitive status (CN, MCI, and ADD) and it has several strengths in its design and methodologies. Details of the recruitment, study methodology, and baseline sample characteristics are described in this paper.

  8. How to quantify exposure to traumatic stress? Reliability and predictive validity of measures for cumulative trauma exposure in a post-conflict population.

    Science.gov (United States)

    Wilker, Sarah; Pfeiffer, Anett; Kolassa, Stephan; Koslowski, Daniela; Elbert, Thomas; Kolassa, Iris-Tatjana

    2015-01-01

    While studies with survivors of single traumatic experiences highlight individual response variation following trauma, research from conflict regions shows that almost everyone develops posttraumatic stress disorder (PTSD) if trauma exposure reaches extreme levels. Therefore, evaluating the effects of cumulative trauma exposure is of utmost importance in studies investigating risk factors for PTSD. Yet, little research has been devoted to evaluate how this important environmental risk factor can be best quantified. We investigated the retest reliability and predictive validity of different trauma measures in a sample of 227 Ugandan rebel war survivors. Trauma exposure was modeled as the number of traumatic event types experienced or as a score considering traumatic event frequencies. In addition, we investigated whether age at trauma exposure can be reliably measured and improves PTSD risk prediction. All trauma measures showed good reliability. While prediction of lifetime PTSD was most accurate from the number of different traumatic event types experienced, inclusion of event frequencies slightly improved the prediction of current PTSD. As assessing the number of traumatic events experienced is the least stressful and time-consuming assessment and leads to the best prediction of lifetime PTSD, we recommend this measure for research on PTSD etiology.
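The two exposure measures compared in the study can be sketched directly: the number of distinct traumatic event types experienced versus a frequency-weighted score. The event names and counts below are invented for illustration.

```python
def event_type_count(history):
    """Number of different traumatic event types experienced."""
    return sum(1 for n in history.values() if n > 0)

def frequency_score(history):
    """Sum of reported event frequencies across all event types."""
    return sum(history.values())

# hypothetical survivor history: event type -> reported frequency
survivor = {"abduction": 1, "shelling": 12, "witnessed_violence": 4,
            "forced_labor": 0}
# the type count treats one shelling like twelve; the frequency
# score does not, which is the trade-off the study evaluates
```

Either measure can then serve as the cumulative-exposure predictor in a PTSD risk model; the study found the simpler type count predicted lifetime PTSD best.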

  9. How to quantify exposure to traumatic stress? Reliability and predictive validity of measures for cumulative trauma exposure in a post-conflict population

    Directory of Open Access Journals (Sweden)

    Sarah Wilker

    2015-11-01

Full Text Available Background: While studies with survivors of single traumatic experiences highlight individual response variation following trauma, research from conflict regions shows that almost everyone develops posttraumatic stress disorder (PTSD) if trauma exposure reaches extreme levels. Therefore, evaluating the effects of cumulative trauma exposure is of utmost importance in studies investigating risk factors for PTSD. Yet, little research has been devoted to evaluate how this important environmental risk factor can be best quantified. Methods: We investigated the retest reliability and predictive validity of different trauma measures in a sample of 227 Ugandan rebel war survivors. Trauma exposure was modeled as the number of traumatic event types experienced or as a score considering traumatic event frequencies. In addition, we investigated whether age at trauma exposure can be reliably measured and improves PTSD risk prediction. Results: All trauma measures showed good reliability. While prediction of lifetime PTSD was most accurate from the number of different traumatic event types experienced, inclusion of event frequencies slightly improved the prediction of current PTSD. Conclusions: As assessing the number of traumatic events experienced is the least stressful and time-consuming assessment and leads to the best prediction of lifetime PTSD, we recommend this measure for research on PTSD etiology.

  10. STRAPS v1.0: evaluating a methodology for predicting electron impact ionisation mass spectra for the aerosol mass spectrometer

    Directory of Open Access Journals (Sweden)

    D. O. Topping

    2017-06-01

Full Text Available Our ability to model the chemical and thermodynamic processes that lead to secondary organic aerosol (SOA) formation is thought to be hampered by the complexity of the system. While there are fundamental models now available that can simulate the tens of thousands of reactions thought to take place, validation against experiments is highly challenging. Techniques capable of identifying individual molecules such as chromatography are generally only capable of quantifying a subset of the material present, making it unsuitable for a carbon budget analysis. Integrative analytical methods such as the Aerosol Mass Spectrometer (AMS) are capable of quantifying all mass, but because of their inability to isolate individual molecules, comparisons have been limited to simple data products such as total organic mass and the O : C ratio. More detailed comparisons could be made if more of the mass spectral information could be used, but because a discrete inversion of AMS data is not possible, this activity requires a system of predicting mass spectra based on molecular composition. In this proof-of-concept study, the ability to train supervised methods to predict electron impact ionisation (EI) mass spectra for the AMS is evaluated. Supervised Training Regression for the Arbitrary Prediction of Spectra (STRAPS) is not built from first principles. A methodology is constructed whereby the presence of specific mass-to-charge ratio (m/z) channels is fitted as a function of molecular structure before the relative peak height for each channel is similarly fitted using a range of regression methods. The widely used AMS mass spectral database is used as a basis for this, using unit mass resolution spectra of laboratory standards. Key to the fitting process is choice of structural information, or molecular fingerprint. Our approach relies on using supervised methods to automatically optimise the relationship between spectral characteristics and these molecular

  11. STRAPS v1.0: evaluating a methodology for predicting electron impact ionisation mass spectra for the aerosol mass spectrometer

    Science.gov (United States)

    Topping, David O.; Allan, James; Rami Alfarra, M.; Aumont, Bernard

    2017-06-01

    Our ability to model the chemical and thermodynamic processes that lead to secondary organic aerosol (SOA) formation is thought to be hampered by the complexity of the system. While there are fundamental models now available that can simulate the tens of thousands of reactions thought to take place, validation against experiments is highly challenging. Techniques capable of identifying individual molecules such as chromatography are generally only capable of quantifying a subset of the material present, making it unsuitable for a carbon budget analysis. Integrative analytical methods such as the Aerosol Mass Spectrometer (AMS) are capable of quantifying all mass, but because of their inability to isolate individual molecules, comparisons have been limited to simple data products such as total organic mass and the O : C ratio. More detailed comparisons could be made if more of the mass spectral information could be used, but because a discrete inversion of AMS data is not possible, this activity requires a system of predicting mass spectra based on molecular composition. In this proof-of-concept study, the ability to train supervised methods to predict electron impact ionisation (EI) mass spectra for the AMS is evaluated. Supervised Training Regression for the Arbitrary Prediction of Spectra (STRAPS) is not built from first principles. A methodology is constructed whereby the presence of specific mass-to-charge ratio (m/z) channels is fitted as a function of molecular structure before the relative peak height for each channel is similarly fitted using a range of regression methods. The widely used AMS mass spectral database is used as a basis for this, using unit mass resolution spectra of laboratory standards. Key to the fitting process is choice of structural information, or molecular fingerprint. Our approach relies on using supervised methods to automatically optimise the relationship between spectral characteristics and these molecular fingerprints. 
Therefore
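The two-stage fitting idea behind STRAPS — first predict whether a peak is present in a given m/z channel from a molecular fingerprint, then regress the peak's relative height — can be sketched on synthetic data. The fingerprints and spectra below are toy constructions, not the AMS mass spectral database.

```python
import numpy as np

rng = np.random.default_rng(42)
n_mol, n_bits = 200, 16
# toy binary molecular fingerprints
fp = rng.integers(0, 2, size=(n_mol, n_bits)).astype(float)

# synthetic ground truth for one m/z channel: fingerprint bit 3
# switches the peak on, bits 0 and 5 set its relative height
present = fp[:, 3] > 0
height = np.where(present, 0.2 + 0.5 * fp[:, 0] + 0.3 * fp[:, 5], 0.0)

# stage 1: score peak presence by correlating each bit with presence
w = fp.T @ (present.astype(float) - present.mean())
scores = fp @ w
predict_present = scores > np.median(scores)

# stage 2: least-squares height model on molecules with a peak
X = np.column_stack([np.ones(int(present.sum())), fp[present]])
coef, *_ = np.linalg.lstsq(X, height[present], rcond=None)
```

The real method replaces these crude estimators with proper classifiers and regressors per channel, but the fingerprint-in, channel-by-channel-spectrum-out structure is the same.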

  12. Prediction of material removal rate and surface roughness for wire electrical discharge machining of nickel using response surface methodology

    Directory of Open Access Journals (Sweden)

    Thangam Chinnadurai

    2016-12-01

Full Text Available This study focuses on investigating the effects of process parameters, namely, Peak current (Ip), Pulse on time (Ton), Pulse off time (Toff), Water pressure (Wp), Wire feed rate (Wf), Wire tension (Wt), Servo voltage (Sv) and Servo feed setting (Sfs), on the Material Removal Rate (MRR) and Surface Roughness (SR) for Wire electrical discharge machining (Wire-EDM) of nickel using Taguchi method. Response Surface Methodology (RSM) is adopted to evolve mathematical relationships between the wire cutting process parameters and the output variables of the weld joint to determine the welding input parameters that lead to the desired optimal wire cutting quality. Besides, using response surface plots, the interaction effects of process parameters on the responses are analyzed and discussed. The statistical software Mini-tab is used to establish the design and to obtain the regression equations. The developed mathematical models are tested by analysis-of-variance (ANOVA) method to check their appropriateness and suitability. Finally, a comparison is made between measured and calculated results, which are in good agreement. This indicates that the developed models can predict the responses accurately and precisely within the limits of cutting parameter being used.
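The ANOVA adequacy check mentioned in these records partitions total variation into regression and residual sums of squares and forms an F statistic. A sketch on synthetic data (the parameter ranges are assumed, and only two of the eight factors are shown for brevity):

```python
import numpy as np

rng = np.random.default_rng(7)
ip = rng.uniform(1, 12, 27)      # peak current (A), assumed range
ton = rng.uniform(105, 126, 27)  # pulse on time (us), assumed range
# synthetic material removal rate with additive noise
mrr = 0.8 * ip + 0.05 * ton + rng.normal(0, 0.3, 27)

X = np.column_stack([np.ones(27), ip, ton])
beta, *_ = np.linalg.lstsq(X, mrr, rcond=None)
fit = X @ beta

ss_total = np.sum((mrr - mrr.mean()) ** 2)
ss_resid = np.sum((mrr - fit) ** 2)
ss_reg = ss_total - ss_resid

df_reg, df_resid = 2, 27 - 3
f_stat = (ss_reg / df_reg) / (ss_resid / df_resid)
r_squared = 1 - ss_resid / ss_total
# a large F and an R^2 near 1 indicate an adequate regression model
```

Comparing `f_stat` against the F distribution's critical value at the chosen significance level is the formal version of the adequacy test these studies report.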

  13. Prediction of material removal rate and surface roughness for wire electrical discharge machining of nickel using response surface methodology

    International Nuclear Information System (INIS)

    Chinnadurai, T.; Vendan, S.A.

    2016-01-01

    This study focuses on investigating the effects of process parameters, namely, Peak current (Ip), Pulse on time (Ton), Pulse off time (Toff), Water pressure (Wp), Wire feed rate (Wf), Wire tension (Wt), Servo voltage (Sv) and Servo feed setting (Sfs), on the Material Removal Rate (MRR) and Surface Roughness (SR) for Wire electrical discharge machining (Wire-EDM) of nickel using Taguchi method. Response Surface Methodology (RSM) is adopted to evolve mathematical relationships between the wire cutting process parameters and the output variables of the weld joint to determine the welding input parameters that lead to the desired optimal wire cutting quality. Besides, using response surface plots, the interaction effects of process parameters on the responses are analyzed and discussed. The statistical software Mini-tab is used to establish the design and to obtain the regression equations. The developed mathematical models are tested by analysis-of-variance (ANOVA) method to check their appropriateness and suitability. Finally, a comparison is made between measured and calculated results, which are in good agreement. This indicates that the developed models can predict the responses accurately and precisely within the limits of cutting parameter being used. (Author)

  14. Prediction of material removal rate and surface roughness for wire electrical discharge machining of nickel using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Chinnadurai, T.; Vendan, S.A.

    2016-07-01

    This study focuses on investigating the effects of process parameters, namely, Peak current (Ip), Pulse on time (Ton), Pulse off time (Toff), Water pressure (Wp), Wire feed rate (Wf), Wire tension (Wt), Servo voltage (Sv) and Servo feed setting (Sfs), on the Material Removal Rate (MRR) and Surface Roughness (SR) for Wire electrical discharge machining (Wire-EDM) of nickel using Taguchi method. Response Surface Methodology (RSM) is adopted to evolve mathematical relationships between the wire cutting process parameters and the output variables of the weld joint to determine the welding input parameters that lead to the desired optimal wire cutting quality. Besides, using response surface plots, the interaction effects of process parameters on the responses are analyzed and discussed. The statistical software Mini-tab is used to establish the design and to obtain the regression equations. The developed mathematical models are tested by analysis-of-variance (ANOVA) method to check their appropriateness and suitability. Finally, a comparison is made between measured and calculated results, which are in good agreement. This indicates that the developed models can predict the responses accurately and precisely within the limits of cutting parameter being used. (Author)

  15. Mechanical behavior and wear prediction of stir cast Al–TiB2 composites using response surface methodology

    International Nuclear Information System (INIS)

    Suresh, S.; Shenbaga Vinayaga Moorthi, N.; Vettivel, S.C.; Selvakumar, N.

    2014-01-01

Graphical abstract: - Highlights: • Various experiments were conducted on Al6061–TiB2 composite. • XRD and EDS studies confirm the crystalline size and elements present. • SEM, EDS and OM observations were used to study the characteristics. • Curve fitting and RSM design methods are effectively used to develop the model. - Abstract: Al6061 was reinforced with various percentages of TiB2 particles by using a high energy stir casting method. The characterization was performed through X-ray Diffraction, Energy Dispersive Spectrum and Scanning Electron Microscope. The mechanical behaviors such as hardness and tensile strength and the tribological behavior were investigated. Wear experiments were conducted by using a pin-on-disc wear tester at varying load. The curve fitting technique was used to develop the respective polynomial and power law equations. The wear mechanism of the specimen was studied through SEM. Response Surface Methodology was used to minimize the number of experimental conditions and develop the mathematical models between the key process parameters, namely weight percentage of TiB2, load and sliding distance. Analysis of Variance technique was applied to check the validity of the developed model. The mathematical model developed for the specific wear rate was predicted at 99.5% confidence level and some useful conclusions were made.
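The power-law curve-fitting step mentioned above can be sketched as a log-log least-squares fit of wear against load, w = k·F^n. The constants and loads below are illustrative, not measured Al–TiB2 values.

```python
import math

loads = [10.0, 20.0, 30.0, 40.0]           # applied load (N), assumed
wear = [2.0e-3 * f ** 1.4 for f in loads]  # synthetic power-law data

# linear least squares on log w = log k + n * log F
xs = [math.log(f) for f in loads]
ys = [math.log(w) for w in wear]
n_pts = len(xs)
mx, my = sum(xs) / n_pts, sum(ys) / n_pts
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
log_k = my - slope * mx
k = math.exp(log_k)
# slope recovers the exponent n, and k the prefactor
```

With real scatter in the data the recovered exponent carries uncertainty, which is where the RSM/ANOVA confidence statements come in.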

  16. The reliability of a segmentation methodology for assessing intramuscular adipose tissue and other soft-tissue compartments of lower leg MRI images.

    Science.gov (United States)

    Karampatos, Sarah; Papaioannou, Alexandra; Beattie, Karen A; Maly, Monica R; Chan, Adrian; Adachi, Jonathan D; Pritchard, Janet M

    2016-04-01

    To determine the reliability of a magnetic resonance (MR) image segmentation protocol for quantifying intramuscular adipose tissue (IntraMAT), subcutaneous adipose tissue, total muscle and intermuscular adipose tissue (InterMAT) of the lower leg, ten axial lower leg MRI slices were obtained from 21 postmenopausal women using a 1 Tesla peripheral MRI system. Images were analyzed using sliceOmatic™ software. The average cross-sectional areas of the tissues were computed for the ten slices. Intra-rater and inter-rater reliability were determined and expressed as the standard error of measurement (SEM) (absolute reliability) and intraclass correlation coefficient (ICC) (relative reliability). Intra-rater and inter-rater ICCs for IntraMAT were 0.991 (95% confidence interval [CI] 0.978-0.996); for the other soft-tissue compartments, the ICCs were all >0.90. The protocol is therefore reliable for quantifying soft-tissue compartments of the lower leg. A standard operating procedure manual is provided to assist users, and SEM values can be used to estimate sample size and determine confidence in repeated measurements in future research.
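    The SEM reported above relates to the ICC through the standard formula SEM = SD·sqrt(1 − ICC). A minimal sketch, using the ICC reported for IntraMAT but an assumed between-subject SD (not a value from the paper):

```python
import math

def sem(sd: float, icc: float) -> float:
    """Absolute reliability: SEM = SD * sqrt(1 - ICC)."""
    return sd * math.sqrt(1.0 - icc)

# e.g. an IntraMAT cross-sectional area with assumed SD = 1.2 cm^2
# and the reported ICC = 0.991
print(round(sem(1.2, 0.991), 3))  # -> 0.114
```

The SEM in the measurement's own units is what lets future studies size samples and judge whether a change between repeated scans exceeds measurement noise.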

  17. Using image analysis as a tool for assessment of prognostic and predictive biomarkers for breast cancer: How reliable is it?

    Directory of Open Access Journals (Sweden)

    Mark C Lloyd

    2010-01-01

    Full Text Available Background : Estrogen receptor (ER, progesterone receptor (PR and human epidermal growth factor receptor-2 (HER2 are important and well-established prognostic and predictive biomarkers for breast cancers and routinely tested on patient's tumor samples by immunohistochemical (IHC study. The accuracy of these test results has substantial impact on patient management. A critical factor that contributes to the result is the interpretation (scoring of IHC. This study investigates how computerized image analysis can play a role in a reliable scoring, and identifies potential pitfalls with common methods. Materials and Methods : Whole slide images of 33 invasive ductal carcinoma (IDC (10 ER and 23 HER2 were scored by pathologist under the light microscope and confirmed by another pathologist. The HER2 results were additionally confirmed by fluorescence in situ hybridization (FISH. The scoring criteria were adherent to the guidelines recommended by the American Society of Clinical Oncology/College of American Pathologists. Whole slide stains were then scored by commercially available image analysis algorithms from Definiens (Munich, Germany and Aperio Technologies (Vista, CA, USA. Each algorithm was modified specifically for each marker and tissue. The results were compared with the semi-quantitative manual scoring, which was considered the gold standard in this study. Results : For the HER2 positive group, each algorithm scored 23/23 cases within the range established by the pathologist. For ER, both algorithms scored 10/10 cases within range. The performance of each algorithm varies somewhat from the percentage of staining as compared to the pathologist's reading. Conclusions : Commercially available computerized image analysis can be useful in the evaluation of ER and HER2 IHC results. In order to achieve accurate results either manual pathologist region selection is necessary, or an automated region selection tool must be employed. Specificity can

  18. General inattentiveness is a long-term reliable trait independently predictive of psychological health: Danish validation studies of the Mindful Attention Awareness Scale.

    Science.gov (United States)

    Jensen, Christian Gaden; Niclasen, Janni; Vangkilde, Signe Allerup; Petersen, Anders; Hasselbalch, Steen Gregers

    2016-05-01

    The Mindful Attention Awareness Scale (MAAS) measures the perceived degree of inattentiveness in different contexts and is often used as a reversed indicator of mindfulness. MAAS is hypothesized to reflect a psychological trait or disposition when used outside attentional training contexts, but the long-term test-retest reliability of MAAS scores is virtually untested. It is also unknown whether MAAS predicts psychological health after controlling for standardized socioeconomic status classifications. First, MAAS translated to Danish was validated psychometrically within a randomly invited healthy adult community sample (N = 490). Factor analysis confirmed that MAAS scores quantified a unifactorial construct of excellent composite reliability and consistent convergent validity. Structural equation modeling revealed that MAAS scores contributed independently to predicting psychological distress and mental health, after controlling for age, gender, income, socioeconomic occupational class, stressful life events, and social desirability (β = 0.32-0.42). General inattentiveness thus appears to be a long-term reliable trait independently predictive of psychological health. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  19. Fatigue methodology for life predictions for the wheel-rail contact area in large offshore turret bearings

    Directory of Open Access Journals (Sweden)

    T. Lassen

    2016-10-01

    Full Text Available This report presents a fatigue life prediction method for large roller bearings applied in the turret turntable of large loading buoy units. The contact points between wheel and rail in these bearings are subjected to a multi-axial fluctuating stress situation, and both surface wear and fatigue cracking may occur. A methodology based on the Dang Van fatigue criterion is adopted. The criterion is based on an equivalent stress defined as a combination of the fluctuation of the shear stress from its mean value on a critical plane and the associated hydrostatic stress at the given time. The present work supports the theoretical model with extensive laboratory testing. Both full-scale testing of wheel on rail and small-scale testing for characterizing the steel material were carried out. An experimental program was conducted with the high-strength stainless steel S165M. The Dang Van stress concept is applied in combination with the Random Fatigue Limit Method (RFLM) for life data analysis. This approach makes it possible to include both finite lives and run-outs in a rational manner, without presuming the existence of a fatigue limit in advance of the data. This gives a non-linear S-N curve on a log-log scale in the very high cycle regime close to the fatigue limit. It is demonstrated that the scatter in the fatigue limit decreases when the Dang Van stress concept is applied, and that the fatigue limit occurs beyond 10⁷ cycles.
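    The Dang Van equivalent stress described above can be sketched in a few lines; the material constants a and b and the stress histories below are assumed for illustration only, not values from the study:

```python
import numpy as np

# Minimal sketch of the Dang Van multiaxial criterion: the equivalent
# stress combines the shear-stress fluctuation about its mean on a
# critical plane with the instantaneous hydrostatic stress,
#   sigma_eq(t) = |tau(t) - mean(tau)| + a * sigma_H(t) <= b,
# where a and b are material constants (assumed values below).
def dang_van_margin(tau, sigma_h, a, b):
    """Return the minimum safety margin b - max_t sigma_eq(t)."""
    tau_fluct = np.abs(tau - tau.mean())
    eq = tau_fluct + a * sigma_h
    return b - eq.max()

# synthetic one-cycle stress histories (MPa)
t = np.linspace(0.0, 2.0*np.pi, 200)
tau = 120.0 * np.sin(t)            # shear stress on the critical plane
sigma_h = 80.0 + 60.0 * np.sin(t)  # hydrostatic stress, in phase here
margin = dang_van_margin(tau, sigma_h, a=0.3, b=250.0)
print(round(margin, 1))  # -> 88.0 (positive: no predicted fatigue damage)
```

A negative margin at any instant of the cycle would indicate predicted fatigue initiation under the criterion.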

  20. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability - what it is, why it is needed, and how it is achieved and measured - the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are then described for different industries. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the following chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters consider 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  1. Methodology of comprehensive evaluation of the effectiveness and reliability of production lines of preparation of sea water for the cultivation of aquatic organisms

    Directory of Open Access Journals (Sweden)

    S. D. Ugryumova

    2016-01-01

    Full Text Available The factors affecting the efficiency and reliability of technical systems are examined, and the stages of development and modernization of production lines that correspond to specific stages of evaluating effectiveness and reliability are set out. Several methods for determining indicators of the efficiency and reliability of equipment in technological lines of the fisheries sector are considered: forecasting methods, structural methods, physical methods, the logical-probabilistic method (the method of I.A. Ryabinin) and the topological method. Their advantages and disadvantages are discussed, allowing the most suitable method to be chosen for series-connected process lines that prepare sea water for the cultivation of aquatic organisms. The modernized technological line for preparing sea water differs from the typical seawater line in hatcheries (Far East) in its large amount of instrumentation: salinity and temperature sensors; turbidity meters that continuously monitor turbidity in the range of 50÷100 EMF (30÷60 mg/l by kaolin); flow sensors signalling the volume level of the filtrate and the backfill layer; analyzers of the chemical composition of sea water; analyzers of suspended mechanical impurities; signalling sensors of acidity and oxygen content; and replaceable coarse and fine filters with auxiliary equipment. A program of comprehensive evaluation of the effectiveness and reliability of production lines revealed that the modernization of the production line for preparing sea water for the cultivation of aquatic organisms improved its efficiency by an average of 1.71%, reduced the amount of manual labor by 15.1%, allowed the process to be controlled, provided rapid and efficient purification of sea water, and reduced the cost of replacement filter media.

  2. Brain GABA Detection in vivo with the J-editing 1H MRS Technique: A Comprehensive Methodological Evaluation of Sensitivity Enhancement, Macromolecule Contamination and Test-Retest Reliability

    Science.gov (United States)

    Shungu, Dikoma C.; Mao, Xiangling; Gonzales, Robyn; Soones, Tacara N.; Dyke, Jonathan P.; van der Veen, Jan Willem; Kegeles, Lawrence S.

    2016-01-01

    Abnormalities in brain γ-aminobutyric acid (GABA) have been implicated in various neuropsychiatric and neurological disorders. However, in vivo GABA detection by proton magnetic resonance spectroscopy (1H MRS) presents significant challenges arising from low brain concentration, overlap by much stronger resonances, and contamination by mobile macromolecule (MM) signals. This study addresses these impediments to reliable brain GABA detection with the J-editing difference technique on a 3T MR system in healthy human subjects by (a) assessing the sensitivity gains attainable with an 8-channel phased-array head coil, (b) determining the magnitude and anatomic variation of the contamination of GABA by MM, and (c) estimating the test-retest reliability of measuring GABA with this method. Sensitivity gains and test-retest reliability were examined in the dorsolateral prefrontal cortex (DLPFC), while MM levels were compared across three cortical regions: the DLPFC, the medial prefrontal cortex (MPFC) and the occipital cortex (OCC). A 3-fold higher GABA detection sensitivity was attained with the 8-channel head coil compared to the standard single-channel head coil in DLPFC. Despite significant anatomic variation in GABA+MM and MM across the three brain regions, the MM fraction of GABA+MM was relatively stable across the three voxels, ranging from 41% to 49%, a non-significant regional variation (p = 0.58). The test-retest reliability of GABA measurement, expressed either as ratios to voxel tissue water (W) or total creatine, was found to be very high for both the single-channel coil and the 8-channel phased-array coil. For the 8-channel coil, for example, Pearson's correlation coefficient of test vs. retest for GABA/W was 0.98 (R2 = 0.96, p = 0.0007), the percent coefficient of variation (CV) was 1.25%, and the intraclass correlation coefficient (ICC) was 0.98. Similar reliability was also found for the co-edited resonance of combined glutamate and glutamine (Glx) for both coils.

  3. Influence of model specifications on the reliabilities of genomic prediction in a Swedish-Finnish red breed cattle population

    DEFF Research Database (Denmark)

    Rius-Vilarrasa, E; Strandberg, E; Fikse, W F

    2012-01-01

    Using a combined multi-breed reference population, this study explored the influence of model specification and the effect of including a polygenic effect on the reliability of genomic breeding values (DGV and GEBV). The combined reference population consisted of 2986 Swedish Red Breed (SRB) and ...

  4. Measuring the Performance of Attention Networks with the Dalhousie Computerized Attention Battery (DalCAB): Methodology and Reliability in Healthy Adults.

    Science.gov (United States)

    Jones, Stephanie A H; Butler, Beverly C; Kintzel, Franziska; Johnson, Anne; Klein, Raymond M; Eskes, Gail A

    2016-01-01

    Attention is an important, multifaceted cognitive domain that has been linked to three distinct, yet interacting, networks: alerting, orienting, and executive control. The measurement of attention and deficits of attention within these networks is critical to the assessment of many neurological and psychiatric conditions in both research and clinical settings. The Dalhousie Computerized Attention Battery (DalCAB) was created to assess attentional functions related to the three attention networks using a range of tasks including: simple reaction time, go/no-go, choice reaction time, dual task, flanker, item and location working memory, and visual search. The current study provides preliminary normative data, test-retest reliability (intraclass correlations) and practice effects in DalCAB performance 24 h after baseline for healthy young adults (n = 96, 18-31 years). Performance on the DalCAB tasks demonstrated Good to Very Good test-retest reliability for mean reaction time, while accuracy and difference measures (e.g., switch costs, interference effects, and working memory load effects) were most reliable for tasks that require more extensive cognitive processing (e.g., choice reaction time, flanker, dual task, and conjunction search). Practice effects were common and pronounced at the 24-h interval. In addition, performance related to specific within-task parameters of the DalCAB sub-tests provides preliminary support for future formal assessment of the convergent validity of our interpretation of the DalCAB as a potential clinical and research assessment tool for measuring aspects of attention related to the alerting, orienting, and executive control networks.

  5. Measuring the performance of attention networks with the Dalhousie Computerized Attention Battery (DalCAB: Methodology and reliability in healthy adults

    Directory of Open Access Journals (Sweden)

    Stephanie Anne Holland Jones

    2016-06-01

    Full Text Available Attention is an important, multifaceted cognitive domain that has been linked to three distinct, yet interacting, networks: alerting, orienting, and executive control. The measurement of attention and deficits of attention within these networks is critical to the assessment of many neurological and psychiatric conditions in both research and clinical settings. The Dalhousie Computerized Attention Battery (DalCAB was created to assess attentional functions related to the three attention networks using a range of tasks including: simple reaction time, go/no-go, choice reaction time, dual task, flanker, item and location working memory and visual search. The current study provides preliminary normative data, test-retest reliability (intraclass correlations and practice effects in DalCAB performance 24 hours after baseline for healthy young adults (n = 96, 18-31 years. Performance on the DalCAB tasks demonstrated Good to Excellent test-retest reliability for mean reaction time, while accuracy and difference measures (e.g., switch costs, interference effects and working memory load effects were most reliable for tasks that require more extensive cognitive processing (e.g., choice reaction time, flanker, dual task, and conjunction search. Practice effects were common and pronounced at the 24-hour interval. In addition, performance related to specific within-task parameters of the DalCAB sub-tests provides preliminary support for future formal assessment of the convergent validity of our interpretation of the DalCAB as a potential clinical and research assessment tool for measuring aspects of attention related to the alerting, orienting and executive control networks. Keywords: computerized assessment; attention; orienting; alerting; executive function

  6. Predicting brain age with deep learning from raw imaging data results in a reliable and heritable biomarker

    NARCIS (Netherlands)

    Cole, James H.; Poudel, Rudra P. K.; Tsagkrasoulis, Dimosthenis; Caan, Matthan W. A.; Steves, Claire; Spector, Tim D.; Montana, Giovanni

    2017-01-01

    Machine learning analysis of neuroimaging data can accurately predict chronological age in healthy people. Deviations from healthy brain ageing have been associated with cognitive impairment and disease. Here we sought to further establish the credentials of 'brain-predicted age' as a biomarker of

  7. Safety and reliability of pressure components with special emphasis on the contribution of component and large specimen testing to structural integrity assessment methodology. Vol. 1 and 2

    International Nuclear Information System (INIS)

    1987-01-01

    The 51 papers of the 13th MPA-Seminar contribute to structural integrity assessment methodology, with special emphasis on component and large-specimen testing. Eight of the papers deal with fracture mechanics, 6 with dynamic loading, 13 with nondestructive testing, 2 with radiation embrittlement, 5 with pipe failure, 4 with components, 2 with thermal shock loading, 5 with high-temperature behaviour, 4 with the integrity of vessels and 3 with the integrity of welded joints. In particular, the fracture behaviour of steel materials is verified. All papers are separately indexed and analysed for the database. (DG) [de

  8. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  9. Reliability Models Applied to a System of Power Converters in Particle Accelerators

    OpenAIRE

    Siemaszko, D; Speiser, M; Pittet, S

    2012-01-01

    Several reliability models are studied when applied to a power system containing a large number of power converters. A methodology is proposed and illustrated in the case study of a novel linear particle accelerator designed for reaching high energies. The proposed methods result in the prediction of both reliability and availability of the considered system for optimisation purposes.
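    A minimal sketch of the kind of prediction described, for a chain of converters that must all operate (series logic) with exponential failure times; the failure rate, converter count and repair time are assumed values, not those of the study:

```python
import math

def series_reliability(lam: float, n: int, t_hours: float) -> float:
    """R_sys(t) = exp(-n * lambda * t) for n independent series units."""
    return math.exp(-n * lam * t_hours)

def availability(mtbf: float, mttr: float) -> float:
    """Steady-state availability of a repairable unit: MTBF/(MTBF+MTTR)."""
    return mtbf / (mtbf + mttr)

lam = 1e-5   # failures per hour per converter (assumed)
n = 100      # converters in series (assumed)
print(round(series_reliability(lam, n, 1000.0), 3))   # -> 0.368
print(round(availability(mtbf=1/lam, mttr=24.0), 4))  # -> 0.9998
```

The sharp drop of the series reliability with n is what drives redundancy and availability optimisation in such systems.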

  10. Predictive analysis on the electric energy distribution systems reliability: applying the synerGEE system; Analisis predictivo de la confiabilidad en los sistemas de distribucion de energia electrica: aplicando el sistema synerGEE

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Andrade, Carlos

    2008-12-15

    Electrical distribution systems ought to deliver electric power as economically as possible, with an acceptable degree of service quality and continuity. Nevertheless, their faults represent one of the main causes of customer unavailability. At present, a wide range of deterministic criteria based on past behavior are used to improve system reliability, but they do not reflect the stochastic nature of system behavior and are applied without an adequate balance between reliability and economy. To obtain this balance, a minimum-cost planning methodology is required that considers the predictive analysis of different investment alternatives in addition to the past behavior of the system, guaranteeing that the limited economic resources available are used to achieve the greatest possible degree of reliability. In this work, this problem is approached with the fundamentals and methodologies needed to assess the effects of design and operating criteria on the main reliability indices used by the principal utilities around the world, with emphasis on the need to optimize economic resources. The use of the SynerGEE™ system is investigated, proving it a useful tool for predictive reliability analysis. Given the lack of experience with this type of analysis in Mexico, distribution engineers have to become familiar with the concepts of reliability engineering and their application to distribution system modeling, and must acquire the ability to use modern simulation tools, allowing them to evaluate the behavior of these systems with sufficient analytical rigor. To help them in this task, a series of well-known study cases is presented.

  11. Reliability of application of inspection procedures

    Energy Technology Data Exchange (ETDEWEB)

    Murgatroyd, R A

    1988-12-31

    This document deals with the reliability of the application of inspection procedures. A method is described for ensuring that the inspection of defects assessed by fracture mechanics is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the likelihood of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC). 3 refs.

  12. Reliability of application of inspection procedures

    International Nuclear Information System (INIS)

    Murgatroyd, R.A.

    1988-01-01

    This document deals with the reliability of the application of inspection procedures. A method is described for ensuring that the inspection of defects assessed by fracture mechanics is reliable. The Systematic Human Error Reduction and Prediction Analysis (SHERPA) methodology is applied to every task performed by the inspector to estimate the likelihood of error. It appears essential that inspection procedures be sufficiently rigorous to avoid substantial errors, and that the selection procedures and the training period for inspectors be optimised. (TEC)

  13. Note of the methodological flaws in the paper entitled "GSTT1 and GSTM1 polymorphisms predict treatment outcome for breast cancer: a systematic review and meta-analysis".

    Science.gov (United States)

    Qiu, Mali; Wu, Xu; Qu, Xiaobing

    2016-09-01

    With great interest, we read the paper "GSTT1 and GSTM1 polymorphisms predict treatment outcome for breast cancer: a systematic review and meta-analysis" (by Hu XY et al.), which reached the important conclusion that GSTM1 null and GSTT1/GSTM1 double null polymorphisms might be significantly associated with an increased tumor response in breast cancer. The result is encouraging. Nevertheless, several methodological flaws in this meta-analysis are worth noting.

  14. Limited Sampling Strategy for the Prediction of Area Under the Curve (AUC) of Statins: Reliability of a Single Time Point for AUC Prediction for Pravastatin and Simvastatin.

    Science.gov (United States)

    Srinivas, N R

    2016-02-01

    Statins are widely prescribed medicines and are also available in fixed-dose combinations with other drugs to treat several chronic ailments. Given the safety issues associated with statins, it may be important to assess the feasibility of a single-time-point concentration strategy for the prediction of exposure (area under the curve; AUC). The peak concentration (Cmax) was used to establish a relationship with AUC separately for pravastatin and simvastatin using published pharmacokinetic data. The regression equations generated for the statins were used to predict AUC values from various literature references. The fold difference of the observed divided by predicted values, along with the correlation coefficient (r), was used to judge the feasibility of the single-time-point approach. Both pravastatin and simvastatin showed excellent correlation of Cmax vs. AUC values, with r ≥ 0.9638; more than 81% of the predicted values fell within a narrow fold-difference range (>0.75-fold) of the observed values, and the correlation was excellent for pravastatin (r = 0.9708, n = 115). On the basis of the present work, it is feasible to develop a single concentration-time-point strategy that coincides with Cmax occurrence for both pravastatin and simvastatin from a therapeutic drug monitoring perspective. © Georg Thieme Verlag KG Stuttgart · New York.
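    The single-time-point idea can be sketched as a simple linear regression of AUC on Cmax; the data below are synthetic, not the published pharmacokinetic values:

```python
import numpy as np

# Synthetic Cmax/AUC pairs standing in for published pharmacokinetic data.
cmax = np.array([10.0, 15.0, 22.0, 30.0, 41.0, 55.0])   # ng/mL
auc = np.array([48.0, 70.0, 101.0, 140.0, 195.0, 262.0])  # ng*h/mL

# regression equation AUC = slope*Cmax + intercept, plus correlation r
slope, intercept = np.polyfit(cmax, auc, 1)
r = np.corrcoef(cmax, auc)[0, 1]

def predict_auc(c: float) -> float:
    """Predict exposure from a single Cmax observation."""
    return slope * c + intercept

# fold difference = observed / predicted, the acceptance metric used above
fold = auc / np.array([predict_auc(c) for c in cmax])
print(all(0.9 < f < 1.1 for f in fold))  # -> True for this synthetic set
```

In practice the fold-difference distribution across independent literature datasets, not the training data, is what validates the strategy.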

  15. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; the constant failure rate (CFR) and the exponential distribution; the increasing failure rate (IFR) and the normal and Weibull distributions; maintainability and availability; reliability testing and estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; reliability design; and functional failure analysis by FTA.
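    As a short illustration of two of the distributions listed (standard textbook formulas, not material taken from this book):

```python
import math

def weibull_reliability(t: float, beta: float, eta: float) -> float:
    """R(t) = exp(-(t/eta)^beta); beta = 1 reduces to the exponential (CFR) case."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t: float, beta: float, eta: float) -> float:
    """h(t) = (beta/eta) * (t/eta)^(beta-1); increasing for beta > 1 (IFR)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# beta = 1: constant failure rate, so R(eta) = exp(-1)
print(round(weibull_reliability(500.0, 1.0, 500.0), 4))  # -> 0.3679
# beta = 2: wear-out behaviour, hazard grows with operating time
print(weibull_hazard(100.0, 2.0, 500.0) < weibull_hazard(400.0, 2.0, 500.0))  # -> True
```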

  16. Determination of Multiphase Flow Meter Reliability and Development of Correction Charts for the Prediction of Oilfield Fluid Flow Rates

    Directory of Open Access Journals (Sweden)

    Samuel S. MOFUNLEWI

    2008-06-01

    Full Text Available The aim of field testing a Multiphase Flow Meter (MPFM is to show whether its accuracy compares favourably with that of the Test Separator in accurately measuring the three production phases (oil, gas and water, as well as determining meter reliability in a field environment. This study evaluates field test results of the MPFM as compared to reference conventional test separators. Generally, the results show that the MPFM compares favourably with the Test Separator within the specified range of accuracy. At the moment, there is no legislation covering meter-proving techniques for the MPFM. However, this study has developed calibration charts that can be used to correct and improve meter accuracy.

  17. Rapid and reliable predictions of the radiological consequences of accidents as an aid to decisions on countermeasures

    International Nuclear Information System (INIS)

    Kelly, G.N.

    1990-01-01

    The rapid and reliable assessment of the potential radiological consequences of an accident at a nuclear installation is an essential input to timely decisions on the effective introduction of countermeasures. There have been considerable improvements over the past decade or so in the methods used for such assessments and, in particular, in the development of computerized systems. The need for such systems is described, together with their current state of development and possible future trends. This topic has featured prominently within the CEC's Radiation Protection Research Programme and is likely to do so for the foreseeable future. The main features of this research, its achievements to date and future directions are described

  18. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    Topics include exponential distributions, the Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, and Bayesian analysis. A good introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. Reliability assessment includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future ...

  19. Reliable Prediction of Insulin Resistance by a School-Based Fitness Test in Middle-School Children

    Directory of Open Access Journals (Sweden)

    Todd Varness

    2009-01-01

    Full Text Available Objectives. (1 Determine the predictive value of a school-based test of cardiovascular fitness (CVF for insulin resistance (IR; (2 compare a "school-based" prediction of IR to a "laboratory-based" prediction, using various measures of fitness and body composition. Methods. Middle school children (n=82 performed the Progressive Aerobic Cardiovascular Endurance Run (PACER, a school-based CVF test, and underwent evaluation of maximal oxygen consumption treadmill testing (VO2 max, body composition (percent body fat and BMI z score, and IR (derived homeostasis model assessment index [HOMA-IR]. Results. PACER showed a strong correlation with VO2 max/kg (rs = 0.83, P<.001 and with HOMA-IR (rs = −0.60, P<.001. Multivariate regression analysis revealed that a school-based model (using PACER and BMI z score predicted IR similar to a laboratory-based model (using VO2 max/kg of lean body mass and percent body fat. Conclusions. The PACER is a valid school-based test of CVF, is predictive of IR, and has a similar relationship to IR when compared to complex laboratory-based testing. Simple school-based measures of childhood fitness (PACER and fatness (BMI z score could be used to identify childhood risk for IR and evaluate interventions.
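    The derived HOMA-IR index referenced above follows a standard formula, HOMA-IR = fasting glucose (mg/dL) × fasting insulin (µU/mL) / 405; the example values below are illustrative, not data from the study:

```python
def homa_ir(glucose_mg_dl: float, insulin_uu_ml: float) -> float:
    """HOMA-IR = fasting glucose (mg/dL) * fasting insulin (uU/mL) / 405."""
    return glucose_mg_dl * insulin_uu_ml / 405.0

# e.g. fasting glucose 90 mg/dL, fasting insulin 12 uU/mL
print(round(homa_ir(90.0, 12.0), 2))  # -> 2.67
```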

  20. Reliable Prediction of Insulin Resistance by a School-Based Fitness Test in Middle-School Children

    Directory of Open Access Journals (Sweden)

    Allen DavidB

    2009-09-01

    Full Text Available Objectives. (1 Determine the predictive value of a school-based test of cardiovascular fitness (CVF for insulin resistance (IR; (2 compare a "school-based" prediction of IR to a "laboratory-based" prediction, using various measures of fitness and body composition. Methods. Middle school children (n=82 performed the Progressive Aerobic Cardiovascular Endurance Run (PACER, a school-based CVF test, and underwent evaluation of maximal oxygen consumption treadmill testing (VO2 max, body composition (percent body fat and BMI z score, and IR (derived homeostasis model assessment index [HOMA-IR]. Results. PACER showed a strong correlation with VO2 max/kg (rs = 0.83 and with HOMA-IR (rs = −0.60. Multivariate regression analysis revealed that a school-based model (using PACER and BMI z score predicted IR similar to a laboratory-based model (using VO2 max/kg of lean body mass and percent body fat. Conclusions. The PACER is a valid school-based test of CVF, is predictive of IR, and has a similar relationship to IR when compared to complex laboratory-based testing. Simple school-based measures of childhood fitness (PACER and fatness (BMI z score could be used to identify childhood risk for IR and evaluate interventions.

  1. Methodology for predicting the life of waste-package materials, and components using multifactor accelerated life tests

    International Nuclear Information System (INIS)

    Thomas, R.E.; Cote, R.W.

    1983-09-01

    Accelerated life tests are essential for estimating the service life of waste-package materials and components. A recommended methodology for generating accelerated life tests is described in this report. The objective of the methodology is to define an accelerated life test program that is scientifically and statistically defensible. The methodology is carried out by a select team of scientists and usually requires 4 to 12 man-months of effort. Specific agendas for the successive meetings of the team are included in the report for use by the team manager. The agendas include assignments for the team scientists and a different set of assignments for the team statistician. The report also includes descriptions of factorial tables, hierarchical trees, and associated mathematical models that are proposed as technical tools to guide the efforts of the design team.

  2. Humidity build-up in electronic enclosures exposed to different geographical locations by RC modelling and reliability prediction

    DEFF Research Database (Denmark)

    Conseil-Gudla, H.; Staliulionis, Z.; Mohanty, S.

    2018-01-01

    according to this steady state (25 °C and 60% RH) have been calculated for the different climates, and the protection offered by the enclosures has been estimated under different casing materials and resistor-capacitor (RC) simulation. This method offers a way to predict the average value of failure rate...
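The resistor-capacitor analogy mentioned in this record treats moisture transport through the casing as a resistance charging the internal air volume as a capacitance, so the interior humidity follows the exterior with a first-order lag. A minimal sketch of that response (the time constant and RH values are hypothetical placeholders, not results from the paper):

```python
import math

def interior_rh(rh_outside, rh_initial, t_hours, tau_hours):
    """First-order RC response of enclosure humidity.

    The interior relative humidity relaxes exponentially from its initial
    value toward the (assumed constant) exterior value with time constant tau.
    """
    return rh_outside + (rh_initial - rh_outside) * math.exp(-t_hours / tau_hours)

# Hypothetical enclosure: 40% RH inside, 80% RH outside, tau = 24 h.
# After one time constant, about 63% of the humidity gap has closed.
rh_after_one_tau = interior_rh(80.0, 40.0, 24.0, 24.0)
```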

  3. Examining the Predictive Validity of GRE Scores on Doctoral Education: Students' Success and Methodology Choices in the Dissertation Process

    Science.gov (United States)

    Rockinson-Szapkiw, Amanda J.; Bray, Oliver R., Jr.; Spaulding, Lucinda S.

    2014-01-01

    This study examines how GRE scores can be used to better understand Education doctoral candidates' methodology choices for the dissertation as well as their persistence behaviors. Candidates' of one online doctoral education program were examined. Results of a MANOVA suggested that there is no difference in GRE scores based on doctoral candidates'…

  4. Prediction of the shape of inline wave force and free surface elevation using First Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Ghadirian, Amin; Bredmose, Henrik; Schløer, Signe

    2017-01-01

theory, that is, the most likely time history of inline force around a force peak of given value. The results of FORM and NewForce are linearly identical and show only minor deviations at second order. The FORM results are then compared to wave averaged measurements of the same criteria for crest height......In design of substructures for offshore wind turbines, the extreme wave loads which are of interest in Ultimate Limit States are often estimated by choosing extreme events from linear random sea states and replacing them by either stream function wave theory or the NewWave theory of a certain...... design wave height. As these wave theories suffer from limitations such as symmetry around the crest, other methods to estimate the wave loads are needed. In the present paper, the First Order Reliability Method, FORM, is used systematically to estimate the most likely extreme wave shapes. Two parameters...
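At its core, FORM locates the most probable point on the failure surface and reports a reliability index β; for a linear limit state g = R − S with independent normal variables the index has a closed form. A minimal illustration of that core quantity (a generic sketch, not the paper's wave-kinematics formulation):

```python
import math

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """Hasofer-Lind reliability index and failure probability for the
    linear limit state g = R - S with independent normal resistance R
    and load S."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)
    return beta, std_normal_cdf(-beta)
```

For nonlinear limit states, such as the extreme wave shapes studied above, the design point is found iteratively rather than in closed form, but the reliability index plays the same role.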

  5. Uma metodologia bayesiana para estudos de confiabilidade na fase de projeto: aplicação em um produto eletrônico A bayesian methodology for reliability studies in the design phase: application to an electronic product

    Directory of Open Access Journals (Sweden)

    Ruth Myriam Ramírez Pongo

    1997-12-01

    . This is particularly true when the product technology limits the acceleration factor, as with electronic products, for example. The methodology proposed in this paper combines test results, which are routinely performed during the product development cycle, with additional relevant information that is useful in the assessment of its reliability. In order to illustrate the methodology, it was applied to an electronic equipment, assessing its reliability during the design phase. The computations were performed considering component reliabilities, attribute test data, and also judgement of the product development team.
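When the additional evidence takes the form of attribute (pass/fail) test data, the Bayesian combination described above is naturally expressed as a conjugate Beta-binomial update. A minimal sketch (the prior parameters in the example are hypothetical, and the paper's full model also folds in component-level reliabilities):

```python
def posterior_reliability(a_prior, b_prior, n_tests, n_failures):
    """Beta-binomial update of a reliability estimate.

    A Beta(a, b) prior on the success probability, combined with n_tests
    attribute trials of which n_failures failed, yields a Beta posterior;
    the function returns the posterior mean.
    """
    a_post = a_prior + (n_tests - n_failures)
    b_post = b_prior + n_failures
    return a_post / (a_post + b_post)

# Hypothetical: prior mean 0.9 (Beta(9, 1)), then 20 design-phase tests, 0 failures
updated = posterior_reliability(9.0, 1.0, 20, 0)  # posterior mean 29/30
```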

  6. Development of Reliability Based Life Prediction Methods for Thermal and Environmental Barrier Coatings in Ceramic Matrix Composites

    Science.gov (United States)

    Shah, Ashwin

    2001-01-01

    A literature survey related to EBC/TBC (environmental barrier coating/thermal barrier coating) life models and failure mechanisms in EBC/TBC was completed, the initial work plan for the proposed EBC/TBC life prediction methods development was developed, and the finite element model for the thermal/stress analysis of the GRC-developed EBC system was prepared. A technical report for these activities is given in the subsequent sections.

  7. Improving the reliability of female fertility breeding values using type and milk yield traits that predict energy status in Australian Holstein cattle.

    Science.gov (United States)

    González-Recio, O; Haile-Mariam, M; Pryce, J E

    2016-01-01

    The objectives of this study were (1) to propose changing the selection criteria trait for evaluating fertility in Australia from calving interval to conception rate at d 42 after the beginning of the mating season and (2) to use type traits as early fertility predictors, to increase the reliability of estimated breeding values for fertility. The breeding goal in Australia is conception within 6 wk of the start of the mating season. Currently, the Australian model to predict fertility breeding values (expressed as a linear transformation of calving interval) is a multitrait model that includes calving interval (CVI), lactation length (LL), calving to first service (CFS), first nonreturn rate (FNRR), and conception rate. However, CVI has a lower genetic correlation with the breeding goal (conception within 6 wk of the start of the mating season) than conception rate. Milk yield, type, and fertility data from 164,318 cows sired by 4,766 bulls were used. Principal component analysis and genetic correlation estimates between type and fertility traits were used to select type traits that could subsequently be used in a multitrait analysis. Angularity, foot angle, and pin set were chosen as type traits to include in an index with the traits that are included in the multitrait fertility model: CVI, LL, CFS, FNRR, and conception rate at d 42 (CR42). An index with these 8 traits is expected to achieve an average bull first proof reliability of 0.60 on the breeding objective (conception within 6 wk of the start of the mating season) compared with reliabilities of 0.39 and 0.45 for CR42 only or the current 5-trait Australian model. Subsequently, we used the first eigenvector of a principal component analysis with udder texture, bone quality, angularity, and body condition score to calculate an energy status indicator trait. The inclusion of the energy status indicator trait composite in a multitrait index with CVI, LL, CFS, FNRR, and CR42 achieved a 12-point increase in

  8. In vitro dissolution methodology, mini-Gastrointestinal Simulator (mGIS), predicts better in vivo dissolution of a weak base drug, dasatinib.

    Science.gov (United States)

    Tsume, Yasuhiro; Takeuchi, Susumu; Matsui, Kazuki; Amidon, Gregory E; Amidon, Gordon L

    2015-08-30

    USP apparatus I and II are gold standard methodologies for determining the in vitro dissolution profiles of test drugs. However, it is difficult to use in vitro dissolution results to predict in vivo dissolution, particularly the pH-dependent solubility of weak acid and base drugs, because the USP apparatus contains one vessel with a fixed pH for the test drug, limiting insight into in vivo drug dissolution of weak acid and weak base drugs. This discrepancy underscores the need to develop new in vitro dissolution methodology that better predicts in vivo response to assure the therapeutic efficacy and safety of oral drug products. Thus, the development of the in vivo predictive dissolution (IPD) methodology is necessitated. The major goals of in vitro dissolution are to ensure the performance of oral drug products and the support of drug formulation design, including bioequivalence (BE). Orally administered anticancer drugs, such as dasatinib and erlotinib (tyrosine kinase inhibitors), are used to treat various types of cancer. These drugs are weak bases that exhibit pH-dependent and high solubility in the acidic stomach and low solubility in the small intestine (>pH 6.0). Therefore, these drugs supersaturate and/or precipitate when they move from the stomach to the small intestine. Also of importance, gastric acidity for cancer patients may be altered with aging (reduction of gastric fluid secretion) and/or co-administration of acid-reducing agents. These may result in changes to the dissolution profiles of weak base and the reduction of drug absorption and efficacy. In vitro dissolution methodologies that assess the impact of these physiological changes in the GI condition are expected to better predict in vivo dissolution of oral medications for patients and, hence, better assess efficacy, toxicity and safety concerns. The objective of this present study is to determine the initial conditions for a mini-Gastrointestinal Simulator (mGIS) to assess in vivo
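The pH dependence described here follows the Henderson-Hasselbalch relation: the total solubility of a monoprotic weak base rises steeply as pH falls below its pKa, which is why such drugs dissolve in the stomach and precipitate in the intestine. A minimal sketch with illustrative numbers (the intrinsic solubility and pKa are hypothetical, not measured dasatinib values, and real solubility is capped by the salt form):

```python
def weak_base_solubility(s_intrinsic, pka, ph):
    """Total solubility of a monoprotic weak base.

    S_total = S_intrinsic * (1 + 10**(pKa - pH)): below the pKa the
    ionized fraction dominates and solubility grows tenfold per pH unit.
    """
    return s_intrinsic * (1.0 + 10.0 ** (pka - ph))

# Hypothetical weak base: S0 = 0.01 mg/mL, pKa = 6.8
s_gastric = weak_base_solubility(0.01, 6.8, 2.0)     # acidic stomach: high
s_intestinal = weak_base_solubility(0.01, 6.8, 6.5)  # small intestine: low
```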

  9. PTCH1 is a reliable marker for predicting imatinib response in chronic myeloid leukemia patients in chronic phase.

    Directory of Open Access Journals (Sweden)

    Juan M Alonso-Dominguez

    Full Text Available Patched homolog 1 gene (PTCH1) expression and the ratio of PTCH1 to Smoothened (SMO) expression have been proposed as prognostic markers of the response of chronic myeloid leukemia (CML) patients to imatinib. We compared these measurements in a realistic cohort of 101 patients with CML in chronic phase (CP) using a simplified qPCR method, and confirmed the prognostic power of each in a competing risk analysis. Gene expression levels were measured in peripheral blood samples at diagnosis. The PTCH1/SMO ratio did not improve PTCH1 prognostic power (area under the receiver operating characteristic curve 0.71 vs. 0.72). In order to reduce the number of genes to be analyzed, PTCH1 was the selected measurement. High and low PTCH1 expression groups had significantly different cumulative incidences of imatinib failure (IF), which was defined as discontinuation of imatinib due to lack of efficacy (5% vs. 25% at 4 years, P = 0.013), probabilities of achieving a major molecular response (81% vs. 53% at first year, P = 0.02), and proportions of early molecular failure (14% vs. 43%, P = 0.015). Every progression to an advanced phase (n = 3) and CML-related death (n = 2) occurred in the low PTCH1 group (P<0.001 for both comparisons). PTCH1 was an independent prognostic factor for the prediction of IF. We also validated previously published thresholds for PTCH1 expression. Therefore, we confirmed that PTCH1 expression can predict the imatinib response in CML patients in CP by applying a more rigorous statistical analysis. Thus, PTCH1 expression is a promising molecular marker for predicting the imatinib response in CML patients in CP.
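The AUC values quoted above (0.71 vs. 0.72) come from ROC analysis; the area under the curve can be computed nonparametrically as the Mann-Whitney probability that a randomly chosen case scores higher than a randomly chosen control. A self-contained sketch of that estimator:

```python
def auc(case_scores, control_scores):
    """Mann-Whitney estimate of the area under the ROC curve.

    Counts the fraction of (case, control) pairs in which the case
    scores higher, with ties counted as half a win.
    """
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))
```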

  10. El análisis de criticidad, una metodología para mejorar la confiabilidad operacional // Criticality analysis , a methodology to improve the operational reliability.

    Directory of Open Access Journals (Sweden)

    R. Huerta Mendoza

    2000-10-01

establish priorities, and to focus the effort that guarantees success while maximizing profitability. Keywords: reliability, criticality, safety, environment, risk, availability, improvement.___________________________________________________________________Abstract: Criticality analysis is a methodology that makes it possible to establish a hierarchy or priority ranking of processes, systems and equipment, creating a structure that facilitates effective and correct decision making by directing effort and resources toward the areas where it is most important or necessary to improve operational reliability, based on the current situation. Improving the operational reliability of any installation, or of its systems and components, is associated with four fundamental aspects: human reliability, process reliability, design reliability and maintenance reliability. Unfortunately, resources, both economic and human, are rarely unlimited, so these four aspects cannot all be improved at the same time in every area of a company. The criteria used to carry out a criticality analysis are mainly associated with safety, environment, production, operating and maintenance costs, failure rate and repair time. These criteria are related through a mathematical equation that generates a score for each evaluated element. The resulting list, the product of team work, makes it possible to align and harmonize criteria for establishing priorities, and to focus the effort that guarantees success while maximizing profitability. Key words: PDVSA, reliability, criticality, safety, environment, risk, availability, improvement, changes.
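The "mathematical equation" mentioned in the abstract is typically of the form criticality = failure frequency × aggregated consequence, with the consequence built from the safety, environment, production and cost criteria. A hypothetical sketch of such a scoring scheme (the structure and weights below are illustrative, not the exact equation of the paper):

```python
def criticality_score(failure_freq, operational_impact, flexibility,
                      maintenance_cost, safety_env_impact):
    """Illustrative criticality score.

    consequence = (operational impact x operational flexibility)
                  + maintenance cost + safety/environmental impact,
    and criticality = failure frequency x consequence. All inputs are
    ordinal scores assigned by the analysis team.
    """
    consequence = (operational_impact * flexibility
                   + maintenance_cost + safety_env_impact)
    return failure_freq * consequence

# Hypothetical pump: frequent failures, high production impact
score = criticality_score(4, 7, 2, 2, 8)  # 4 * (14 + 2 + 8) = 96
```

Ranking all equipment by such a score yields the prioritized list the abstract describes.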

  11. Qualification of a full plant nodalization for the prediction of the core exit temperature through a scaling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, J., E-mail: jordi.freixa-terradas@upc.edu; Martínez-Quiroga, V., E-mail: victor.martinez.quiroga@upc.edu; Reventós, F., E-mail: francesc.reventos@upc.edu

    2016-11-15

    Highlights: • Core exit temperature is used in PWRs as an indication of core heat up. • Qualification of full scale nuclear reactors by means of a scaling methodology. • Scaling of RELAP5 calculations to full scale power plants. - Abstract: System codes and their necessary power plant nodalizations are an essential step in thermal hydraulic safety analysis. In order to assess the safety of a particular power plant, in addition to the validation and verification of the code, the nodalization of the system needs to be qualified. Since most existing experimental data come from scaled-down facilities, any qualification process must therefore address scale considerations. The Group of Thermal Hydraulic Studies at the Technical University of Catalonia has developed a scaling-up methodology (SCUP) for the qualification of full-scale nodalizations through a systematic procedure based on the extrapolation of post-test simulations of Integral Test Facility experiments. In the present work, the SCUP methodology will be employed to qualify the nodalization of the Ascó NPP, a Pressurized Water Reactor (PWR), for the reproduction of an important safety phenomenon which is the effectiveness of the Core Exit Temperature (CET) as an Accident Management (AM) indicator. Given the difficulties in placing measurements in the core region, CET measurements are used as a criterion for the initiation of safety operational procedures during accidental conditions in PWR. However, the CET response has some limitations in detecting inadequate core cooling simply because the measurement is not taken in the position where the cladding exposure occurs. In order to apply the SCUP methodology, the OECD/NEA ROSA-2 Test 3, an SBLOCA in the hot leg, has been selected as a starting point. This experiment was conducted at the Large Scale Test Facility (LSTF), a facility operated by the Japanese Atomic Energy Agency (JAEA) and was focused on the assessment of the effectiveness of AM actions triggered by

  12. On-Line Flutter Prediction Tool for Wind Tunnel Flutter Testing using Parameter Varying Estimation Methodology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes to develop an on-line flutter prediction tool for wind tunnel model using the parameter varying estimation (PVE) technique to...

  13. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature and multimodal (speech and face systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
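One straightforward way to exploit unimodal reliability estimates in fusion is to use them as weights on the per-modality match scores. The sketch below is a generic reliability-weighted fusion, not necessarily the authors' exact decision-level scheme:

```python
def fuse_scores(scores, reliabilities):
    """Reliability-weighted fusion of match scores from several modalities.

    Each modality's score is weighted by the estimated reliability of its
    own decision; the result is a single fused score in the same range.
    """
    total = sum(reliabilities)
    return sum(s * r for s, r in zip(scores, reliabilities)) / total

def verify(scores, reliabilities, threshold=0.5):
    """Accept the identity claim if the fused score clears the threshold."""
    return fuse_scores(scores, reliabilities) >= threshold

# Face match is strong and its classifier deems itself reliable;
# the noisy speech channel contributes little to the fused score.
decision = verify([0.9, 0.2], [0.8, 0.2])
```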

  14. Report on an Assessment of the Application of EPP Results from the Strain Limit Evaluation Procedure to the Prediction of Cyclic Life Based on the SMT Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jetter, R. I. [R. I. Jetter Consulting, Pebble Beach, CA (United States); Messner, M. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Sham, T. -L. [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Y. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods; thus greatly simplifying evaluation of elevated temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705°C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950°C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for an EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of computer machine precision in EPP calculations.

  15. The Electricity Feed Law levy - how reliably can it be predicted?; Wie verlaesslich laesst sich die EEG-Umlage prognostizieren?

    Energy Technology Data Exchange (ETDEWEB)

    Bause, Rainer; Schulz, Woldemar [Amprion GmbH, Dortmund (Germany); Buehler, Holger [EnBW TNG, Stuttgart (Germany); Hodurek, Claus [50Hertz Transmission GmbH, Berlin (Germany); Kiessling, Axel [TenneT TSO GmbH, Bayreuth (Germany)

    2011-10-15

    By paying their monthly electricity bill German consumers are promoting the transition to tomorrow's resource-efficient electricity supply system, in which the largest part of electricity used is to come from renewable energy resources. Every year Germany's transmission system operators publish what is referred to as the "EEG-Umlage" (report on the Electricity Feed Law levy, or EEG levy), which shows how much every German household will be paying for the promotion of renewable energies in the coming year. The determination of the EEG levy involves uncertainties and imponderabilities which have to be taken into account in its calculation. The crucial task is to find a suitable systematic scheme for predicting the renewable energy yield.

  16. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-03-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release are performed. The analysis showed that the possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of the Scandinavia and Northwest Russia could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  17. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-06-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release (e.g. 1 Bq) are performed. The analysis showed that the possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of the Scandinavia and Northwest Russia could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  18. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: 'Kursk' submarine study

    Directory of Open Access Journals (Sweden)

    A. Baklanov

    2003-01-01

    Full Text Available There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release (e.g. 1 Bq) are performed. The analysis showed that the possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of the Scandinavia and Northwest Russia could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  19. Prediction of dosage-based parameters from the puff dispersion of airborne materials in urban environments using the CFD-RANS methodology

    Science.gov (United States)

    Efthimiou, G. C.; Andronopoulos, S.; Bartzis, J. G.

    2018-02-01

    One of the key issues of recent research on the dispersion inside complex urban environments is the ability to predict dosage-based parameters from the puff release of an airborne material from a point source in the atmospheric boundary layer inside the built-up area. The present work addresses the question of whether the computational fluid dynamics (CFD)-Reynolds-averaged Navier-Stokes (RANS) methodology can be used to predict ensemble-average dosage-based parameters that are related with the puff dispersion. RANS simulations with the ADREA-HF code were, therefore, performed, where a single puff was released in each case. The present method is validated against the data sets from two wind-tunnel experiments. In each experiment, more than 200 puffs were released from which ensemble-averaged dosage-based parameters were calculated and compared to the model's predictions. The performance of the model was evaluated using scatter plots and three validation metrics: fractional bias, normalized mean square error, and factor of two. The model presented a better performance for the temporal parameters (i.e., ensemble-average times of puff arrival, peak, leaving, duration, ascent, and descent) than for the ensemble-average dosage and peak concentration. The majority of the obtained values of validation metrics were inside established acceptance limits. Based on the obtained model performance indices, the CFD-RANS methodology as implemented in the code ADREA-HF is able to predict the ensemble-average temporal quantities related to transient emissions of airborne material in urban areas within the range of the model performance acceptance criteria established in the literature. The CFD-RANS methodology as implemented in the code ADREA-HF is also able to predict the ensemble-average dosage, but the dosage results should be treated with some caution; as in one case, the observed ensemble-average dosage was under-estimated slightly more than the acceptance criteria. Ensemble
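The three validation metrics named above have standard definitions in the model-evaluation literature; a minimal sketch of their computation (illustrative, not the evaluation code used with ADREA-HF):

```python
def validation_metrics(obs, pred):
    """Fractional bias (FB), normalized mean square error (NMSE), and the
    factor-of-two fraction (FAC2) for paired observed/predicted values."""
    n = len(obs)
    mean_o = sum(obs) / n
    mean_p = sum(pred) / n
    fb = (mean_o - mean_p) / (0.5 * (mean_o + mean_p))
    nmse = sum((o - p) ** 2 for o, p in zip(obs, pred)) / (n * mean_o * mean_p)
    fac2 = sum(1 for o, p in zip(obs, pred) if 0.5 <= p / o <= 2.0) / n
    return fb, nmse, fac2
```

A perfect model gives FB = 0, NMSE = 0 and FAC2 = 1; the acceptance limits referred to in the abstract are of this kind, with the urban dispersion literature commonly quoting thresholds on the order of |FB| ≤ 0.67, NMSE ≤ 6 and FAC2 ≥ 0.3.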

  20. The mechanical behavior and reliability prediction of the HTR graphite component at various temperature and neutron dose ranges

    International Nuclear Information System (INIS)

    Fang, Xiang; Yu, Suyuan; Wang, Haitao; Li, Chenfeng

    2014-01-01

    Highlights: • The mechanical behavior of graphite component in HTRs under high temperature and neutron irradiation conditions is simulated. • The computational process of mechanical analysis is introduced. • Deformation, stresses and failure probability of the graphite component are obtained and discussed. • Various temperature and neutron dose ranges are selected in order to investigate the effect of in-core conditions on the results. - Abstract: In a pebble-bed high temperature gas-cooled reactor (HTR), nuclear graphite serves as the main structural material of the side reflectors. The reactor core is made up of a large number of graphite bricks. In the normal operation case of the reactor, the maximum temperature of the helium coolant commonly reaches about 750 °C. After around 30 years’ full power operation, the peak value of in-core fast neutron cumulative dose reaches 1 × 10²² n cm⁻² (EDN). Such high temperature and neutron irradiation strongly impact the behavior of the graphite component, causing obvious deformation. The temperature and neutron dose are unevenly distributed inside a graphite brick, resulting in stress concentrations. The deformation and stress concentration can both greatly affect safety and reliability of the graphite component. In addition, most of the graphite properties (such as Young's modulus and coefficient of thermal expansion) change remarkably under high temperature and neutron irradiations. The irradiation-induced creep also plays a very important role during the whole process, and provides a significant impact on the stress accumulation. In order to simulate the behavior of graphite component under various in-core conditions, all of the above factors must be considered carefully. In this paper, the deformation, stress distribution and failure probability of a side graphite component are studied at various temperature points and neutron dose levels. 400 °C, 500 °C, 600 °C and 750 °C are selected as the
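Failure probability for brittle graphite components is conventionally computed from Weibull statistics of strength; a minimal sketch (the Weibull modulus and characteristic strength in the example are hypothetical, not values for any specific nuclear graphite grade):

```python
import math

def weibull_failure_probability(stress, sigma_0, m, volume_ratio=1.0):
    """Two-parameter Weibull failure probability for a brittle component.

    P_f = 1 - exp(-(V/V0) * (stress/sigma_0)**m), where m is the Weibull
    modulus and sigma_0 the characteristic strength of the reference
    volume V0; larger stressed volumes raise the failure probability.
    """
    return 1.0 - math.exp(-volume_ratio * (stress / sigma_0) ** m)

# Hypothetical: peak stress at 60% of characteristic strength, m = 10
p_f = weibull_failure_probability(0.6 * 20.0, 20.0, 10.0)  # small but non-zero
```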

  1. An experimental evaluation of powder flow predictions in small-scale process equipment based on Jenike's hopper design methodology

    DEFF Research Database (Denmark)

    Søgaard, Søren Vinter; Olesen, Niels Erik; Hirschberg, Cosima

    2017-01-01

    . The comparison of the observed and predicted critical outlet diameters showed good agreement for the powder with the best flowability when linear extrapolation of the flow function was applied. In contrast, the predicted critical outlet diameter was slightly overestimated compared to the experimentally observed...... diameter for the two more cohesive powders. A likely reason for this overestimation is that the flow function probably has a non-linear convex upward shape for these two powders at very small consolidation stresses. These findings illustrate the relevance of measuring shear and wall shear stresses at very...... small consolidation stresses to improve the flow behavior predictions for small-scale process equipment typically used during production of solid state pharmaceuticals....
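In Jenike's design methodology the critical outlet diameter follows from the stress at which a cohesive arch over the outlet can no longer support itself. A minimal sketch using the common approximation H(θ) ≈ 2 + θ/60 for conical hoppers (the numeric inputs in the example are hypothetical, not the paper's measured powders):

```python
def critical_outlet_diameter(fc_crit, bulk_density, half_angle_deg, g=9.81):
    """Jenike critical outlet diameter for mass flow in a conical hopper.

    B = H(theta) * fc / (rho_b * g), where fc is the critical unconfined
    yield strength in Pa (from the intersection of the flow function with
    the hopper flow factor line), rho_b the bulk density in kg/m^3, and
    H(theta) ~ 2 + theta/60 a common approximation to Jenike's H-function.
    """
    h = 2.0 + half_angle_deg / 60.0
    return h * fc_crit / (bulk_density * g)

# Hypothetical powder: fc = 500 Pa, rho_b = 600 kg/m^3, 15 deg half angle
b_crit = critical_outlet_diameter(500.0, 600.0, 15.0)  # roughly 0.19 m
```

The abstract's point is that linearly extrapolating the flow function to the low stresses of small-scale equipment can make fc, and hence B, too large for cohesive powders.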

  2. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. To achieve this objective, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
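The system-reliability computation described above — several disciplinary failure modes combined into one system failure probability — can be illustrated with a crude Monte Carlo sketch. The limit states and distributions below are invented for illustration, and NESSUS itself uses fast probability integration rather than plain sampling:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical random inputs (not from the NESSUS study).
load = rng.normal(100.0, 10.0, n)   # structural load
temp = rng.normal(350.0, 15.0, n)   # wall temperature
flow = rng.normal(5.0, 0.5, n)      # coolant flow rate

# One limit state per discipline: failure when g < 0.
g_struct = 120.0 - load             # strength margin
g_heat = 380.0 - temp               # temperature limit
g_flow = flow - 4.0                 # minimum flow requirement

# Series system: any single mode's failure fails the system.
fails = (g_struct < 0) | (g_heat < 0) | (g_flow < 0)
p_f = fails.mean()
system_reliability = 1.0 - p_f
```

Each limit state plays the role of one discipline's behavior model; the equivalence exploited by NESSUS is that all three reduce to the same "margin below zero means failure" formulation.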

  3. The reliability of finite element analysis results of the low impact test in predicting the energy absorption performance of thin-walled structures

    Energy Technology Data Exchange (ETDEWEB)

    Alipour, R.; Farokhi Nejad, A.; Izman, S. [Universiti Teknologi Malaysia, Johor Bahru (Malaysia)

    2015-05-15

    The application of dual-phase steels (DPS) such as DP600 in the form of thin-walled structures in automotive components is continuously increasing, as vehicle designers use modern steel grades and low-weight structures to improve structural performance, reduce vehicle weight, and enhance crash performance. The high cost of broad experimental investigations in this area can be avoided by using computers for structural analysis, substituting finite element analysis (FEA) for many of the experiments. Nevertheless, it must be verified that the selected method, including the element type and solution methodology, is capable of predicting the real condition. In this paper, numerical and experimental studies are conducted to determine the effect of element type selection and solution methodology on the results of finite element analysis, in order to investigate the energy absorption behavior of a DP600 thin-walled structure with three different geometries under low-impact loading. The outcomes indicated that the combination of the implicit method and solid elements is in better agreement with the experiments. In addition, combining shell element types with the implicit method reduces the simulation time remarkably, although the error relative to the experiments increases to some extent.

  4. Single-leg lateral, horizontal, and vertical jump assessment: reliability, interrelationships, and ability to predict sprint and change-of-direction performance.

    Science.gov (United States)

    Meylan, Cesar; McMaster, Travis; Cronin, John; Mohammad, Nur Ikhwan; Rogers, Cailyn; Deklerk, Melissa

    2009-07-01

    The purposes of this study were to determine the reliability of unilateral vertical, horizontal, and lateral countermovement jump assessments, the interrelationship between these tests, and their usefulness as predictors of sprint (10 m) and change-of-direction (COD) performance for 80 men and women physical education students. Jump performance was assessed on a contact mat, and sprint and COD performances were assessed using timing lights. With regard to the reliability statistics, the largest coefficient of variation (CV) was observed for the vertical jump (CV = 6.7-7.2%) in both genders, whereas the sprint and COD assessments had the smallest variability (CV = 0.8-2.8%). All intraclass correlation coefficients (ICC) were greater than 0.85, except for the men's COD assessment with the alternate leg. The shared variance between the single-leg vertical, horizontal, and lateral jumps for men and women was less than 50%, indicating that the jumps are relatively independent of one another and represent different leg strength/power qualities. The ability of the jumps to predict sprint and COD performance was limited (R2 < 43%). It would seem that the ability to change direction with 1 leg is relatively independent of a COD with the other leg, especially in the women (R < 30%) of this study. However, if 1 jump assessment were selected to predict sprint and COD performance in a test battery, the single-leg horizontal countermovement jump would seem the logical choice, given the results of this study. Many of the findings in this study have interesting diagnostic and training implications for the strength and conditioning coach.
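The reliability statistics reported above (within-subject CV and ICC) can be reproduced in outline as follows. The jump data are illustrative stand-ins, and the ICC form shown is the one-way random-effects ICC(1,1), which may differ from the exact model the authors used:

```python
import numpy as np

def cv_percent(trials):
    """Mean within-subject coefficient of variation (%) across
    repeated trials; trials has shape (subjects, trials)."""
    x = np.asarray(trials, dtype=float)
    per_subject_cv = x.std(axis=1, ddof=1) / x.mean(axis=1)
    return 100.0 * per_subject_cv.mean()

def icc_oneway(trials):
    """One-way random-effects ICC(1,1) for test-retest reliability."""
    x = np.asarray(trials, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Between-subjects and within-subject mean squares.
    ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Illustrative repeated-trial jump heights (cm): 3 subjects x 2 trials.
jump_heights = [[30.0, 31.0], [40.0, 41.0], [50.0, 49.0]]
cv = cv_percent(jump_heights)
icc = icc_oneway(jump_heights)
```

Consistent subject ranking with small trial-to-trial noise, as in this toy data, yields a low CV and an ICC near 1, the pattern reported for most of the study's assessments.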

  5. Predicting interatrial septum rotation: is the position of the heart or the direction of the coronary sinus reliable?: Implications for interventional electrophysiologists from CT studies.

    Science.gov (United States)

    Sun, Huan; Wang, Yanjing; Zhang, Zhenming; Liu, Lin; Yang, Ping

    2015-04-01

    Determining the location of the interatrial septum (IAS) is crucial for cardiac electrophysiology procedures. Empirical methods of predicting IAS orientation depend on anatomical landmarks, including determining it from the direction of the coronary sinus (CS) and the position of the heart (e.g., vertical or transverse). However, the reliability of these methods for predicting IAS rotation warrants further study. The purpose of this study was to assess the clinical utility of the relationship between IAS orientation, CS direction, and heart position. Data from 115 patients undergoing coronary computed tomography (CT) angiography with no evidence of cardiac structural disease were collected and analyzed. Angulations describing IAS orientation, CS direction, and heart position were measured. The relationships between IAS orientation and each of the other two parameters were subsequently analyzed. The mean angulations for IAS orientation, CS direction, and heart position were 36.8 ± 7.3° (range 19.1-53.6), 37.7 ± 6.6° (range 21.3-50.1), and 37.1 ± 8.3° (range 19.2-61.0), respectively. We found a significant correlation between IAS orientation and CS direction (r = 0.928; P < 0.001), with the linear regression equation IAS orientation = 2.01 + 1.03 × CS direction (r² = 0.86). No correlation was observed between IAS orientation and heart position (P = 0.86). In patients without structural heart disease, CS direction may be a reliable predictor of IAS orientation, and may serve as a helpful reference for clinicians during invasive electrophysiological procedures. Further study is warranted to clarify the relationship between IAS orientation and heart position. © 2015 Wiley Periodicals, Inc.
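The reported relationship — a least-squares line relating IAS orientation to CS direction, together with Pearson's r — can be sketched as follows, using invented paired angulations rather than the study's patient data:

```python
import numpy as np

# Hypothetical paired angulation measurements (degrees) -- illustrative
# stand-ins, not the 115-patient CT data.
cs_direction = np.array([25.0, 30.0, 35.0, 40.0, 45.0, 50.0])
ias_orientation = np.array([27.5, 33.0, 38.4, 43.0, 48.6, 53.1])

# Least-squares fit of the form reported in the abstract:
# IAS orientation = intercept + slope * CS direction.
slope, intercept = np.polyfit(cs_direction, ias_orientation, 1)

# Pearson correlation and coefficient of determination.
r = np.corrcoef(cs_direction, ias_orientation)[0, 1]
r_squared = r ** 2
```

With a slope near 1, the CS angulation tracks the IAS angulation almost degree-for-degree, which is what makes the CS a convenient bedside landmark in the study's conclusion.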

  6. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. An SSL system is composed of an LED engine with micro-electronic driver(s) that supply power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  7. Ares I-X Launch Abort System, Crew Module, and Upper Stage Simulator Vibroacoustic Flight Data Evaluation, Comparison to Predictions, and Recommendations for Adjustments to Prediction Methodology and Assumptions

    Science.gov (United States)

    Smith, Andrew; Harrison, Phil

    2010-01-01

    The National Aeronautics and Space Administration (NASA) Constellation Program (CxP) has identified a series of tests to provide insight into the design and development of the Crew Launch Vehicle (CLV) and Crew Exploration Vehicle (CEV). Ares I-X was selected as the first suborbital development flight test to help meet CxP objectives. The Ares I-X flight test vehicle (FTV) is an early operational model of CLV, with specific emphasis on CLV and ground operation characteristics necessary to meet Ares I-X flight test objectives. The in-flight part of the test includes a trajectory to simulate maximum dynamic pressure during flight and perform a stage separation of the Upper Stage Simulator (USS) from the First Stage (FS). The in-flight test also includes recovery of the FS. The random vibration response from the Ares I-X flight will be reconstructed for a few specific locations that were instrumented with accelerometers. This recorded data will be helpful in validating and refining vibration prediction tools and methodology. Measured vibroacoustic environments associated with lift off and ascent phases of the Ares I-X mission will be compared with pre-flight vibration predictions. The measured flight data were recorded as time histories, which will be converted into power spectral density plots for comparison with the maximum predicted environments. The maximum predicted environments are documented in the Vibroacoustics and Shock Environment Data Book, AI1-SYS-ACO, v4.10. Vibration predictions made using the statistical energy analysis (SEA) VAOne computer program will also be incorporated in the comparisons. Ascent and lift off measured acoustics will also be compared to predictions to assess whether any discrepancies between the predicted vibration levels and measured vibration levels are attributable to inaccurate acoustic predictions. These comparisons will also be helpful in assessing whether adjustments to prediction methodologies are needed to improve agreement between the
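Converting recorded time histories into power spectral density plots, as described above, is typically done with Welch's averaged-periodogram method. Below is a NumPy-only sketch; the window length, overlap, and the synthetic 50 Hz signal are arbitrary choices, not mission parameters:

```python
import numpy as np

def psd_welch(x, fs, nperseg=256):
    """Welch-style averaged-periodogram PSD estimate using only NumPy.

    x  -- time history (e.g. acceleration in g)
    fs -- sample rate (Hz)
    Returns (frequencies in Hz, one-sided PSD in units^2/Hz).
    """
    window = np.hanning(nperseg)
    scale = fs * (window ** 2).sum()        # density normalization
    step = nperseg // 2                     # 50% segment overlap
    segments = [
        x[i:i + nperseg] * window
        for i in range(0, len(x) - nperseg + 1, step)
    ]
    spectra = [np.abs(np.fft.rfft(s)) ** 2 / scale for s in segments]
    psd = np.mean(spectra, axis=0)
    psd[1:-1] *= 2.0                        # fold to one-sided spectrum
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, psd

# Synthetic 50 Hz tone as a stand-in for measured flight data.
fs = 1024.0
t = np.arange(0, 4.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50.0 * t)
f, pxx = psd_welch(x, fs)
```

Integrating the PSD over frequency recovers the signal's mean-square value, which is the basic consistency check applied before comparing measured spectra against maximum predicted environments.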

  8. Prediction of radiation levels in residences: A methodological comparison of CART [Classification and Regression Tree Analysis] and conventional regression

    International Nuclear Information System (INIS)

    Janssen, I.; Stebbings, J.H.

    1990-01-01

    In environmental epidemiology, trace and toxic substance concentrations frequently have very highly skewed distributions ranging over one or more orders of magnitude, and prediction by conventional regression is often poor. Classification and Regression Tree Analysis (CART) is an alternative in such contexts. To compare the techniques, two Pennsylvania data sets and three independent variables are used: house radon progeny (RnD) and gamma levels as predicted by construction characteristics in 1330 houses; and ∼200 house radon (Rn) measurements as predicted by topographic parameters. CART may identify structural variables of interest not identified by conventional regression, and vice versa, but in general the regression models are similar. CART has major advantages in dealing with other common characteristics of environmental data sets, such as missing values, continuous variables requiring transformations, and large sets of potential independent variables. CART is most useful in the identification and screening of independent variables, greatly reducing the need for cross-tabulations and nested breakdown analyses. There is no need to discard cases with missing values for the independent variables because surrogate variables are intrinsic to CART. The tree-structured approach is also independent of the scale on which the independent variables are measured, so that transformations are unnecessary. CART identifies important interactions as well as main effects. The major advantages of CART appear to be in exploring data. Once the important variables are identified, conventional regression seems to lead to similar results that are more interpretable by most audiences. 12 refs., 8 figs., 10 tabs.
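The core of CART's advantage over a straight regression line — recursively finding the squared-error-minimizing split — can be shown with a single-split sketch on invented, threshold-shaped data (real CART grows a full tree and prunes it):

```python
import numpy as np

def best_split(x, y):
    """Find the single split of x that minimizes the squared error of
    y -- the step CART applies recursively to grow a regression tree."""
    best_threshold, best_sse = None, np.inf
    for threshold in np.unique(x)[:-1]:
        left, right = y[x <= threshold], y[x > threshold]
        sse = ((left - left.mean()) ** 2).sum() + \
              ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_threshold, best_sse = threshold, sse
    return best_threshold, best_sse

# Hypothetical radon-like data: a predictor with a threshold effect
# (e.g. a geology indicator) that a straight line fits poorly.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.0, 2.2, 1.9, 2.1, 9.8, 10.2, 9.9, 10.1])

threshold, sse_tree = best_split(x, y)

# Compare with a least-squares line on the same data.
slope, intercept = np.polyfit(x, y, 1)
sse_line = ((y - (slope * x + intercept)) ** 2).sum()
```

On step-shaped data like this, a single split captures the structure almost exactly while the fitted line leaves large residuals, illustrating why CART can find structural variables that conventional regression misses.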

  9. A Post-Harvest Prediction Mass Loss Model for Tomato Fruit Using A Numerical Methodology Centered on Approximation Error Minimization

    Directory of Open Access Journals (Sweden)

    Francisco Javier Bucio

    2017-10-01

    Due to its nutritional and economic value, the tomato is considered one of the main vegetables in terms of production and consumption in the world. For this reason, this study takes fruit maturation, parametrized by mass loss, as its case study. This process develops in the fruit mainly after harvest. Since that parameter affects the economic value of the crop, the scientific community has been progressively approaching the issue. However, there is not yet a state-of-the-art practical model allowing the prediction of tomato fruit mass loss. This study proposes a prediction model for tomato mass loss over a continuous and definite time-frame using regression methods. The model is based on a combination of adjustment methods such as least-squares polynomial regression leading to error estimation, and cross-validation techniques. Experimental results from a sample of 50 tomato fruits studied over a 54-day period were compared to results from the model using a second-order polynomial approach, which was found to provide the optimal data fit, with a resulting efficiency of ~97%. The model also allows the design of precise logistic strategies centered on post-harvest tomato mass loss prediction, usable by producers, distributors, and consumers.
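The model-selection procedure described above — least-squares polynomial fits compared by cross-validation error — can be sketched as follows. The mass-loss data are synthetic stand-ins, not the study's 50-fruit measurements:

```python
import numpy as np

# Hypothetical mass-loss observations (% of initial mass) over storage
# days -- illustrative stand-ins for the 54-day tomato data: a
# quadratic trend plus small fixed measurement noise.
days = np.arange(0, 55, 6, dtype=float)
mass_loss = 0.002 * days ** 2 + 0.1 * days + \
    np.array([0.0, 0.1, -0.1, 0.05, -0.05, 0.1, -0.1, 0.0, 0.05, 0.02])

def loo_rmse(x, y, degree):
    """Leave-one-out RMSE for a polynomial of the given degree -- the
    cross-validation step used to choose the model order."""
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i          # hold out point i
        coeffs = np.polyfit(x[mask], y[mask], degree)
        errors.append(np.polyval(coeffs, x[i]) - y[i])
    return float(np.sqrt(np.mean(np.square(errors))))

# Pick the degree with the smallest held-out error.
best_degree = min(range(1, 5), key=lambda d: loo_rmse(days, mass_loss, d))
```

For data with a genuine quadratic trend, the cross-validation error of the linear fit stays large while the second-order fit drops to the noise level, mirroring the study's finding that a second-order polynomial gave the optimal fit.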

  10. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that
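The redundancy-versus-repairability comparison above can be sketched with a k-out-of-n model: each robot is a series system of modules, and the team succeeds if enough robots survive the mission. The module reliabilities below are hypothetical, not values from the study:

```python
import math

def robot_reliability(module_reliabilities):
    """A robot works only if every one of its modules works
    (series system of independent modules)."""
    r = 1.0
    for m in module_reliabilities:
        r *= m
    return r

def team_reliability(robot_r, n_robots, n_required):
    """Probability that at least n_required of n_robots identical,
    independent robots survive (k-out-of-n binomial model)."""
    return sum(
        math.comb(n_robots, k) * robot_r ** k * (1 - robot_r) ** (n_robots - k)
        for k in range(n_required, n_robots + 1)
    )

# Hypothetical module reliabilities for one robot's modules.
r_robot = robot_reliability([0.99, 0.97, 0.95])

# Mission needs 3 working robots: baseline team vs. one spare robot.
baseline = team_reliability(r_robot, n_robots=3, n_required=3)
with_spare = team_reliability(r_robot, n_robots=4, n_required=3)
```

Even this toy model exposes the tradeoff the record describes: one spare robot raises mission reliability substantially, and a fuller analysis would weigh that gain against the cheaper option of carrying spare components for repair.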