WorldWideScience

Sample records for model providing reliable

  1. Bring Your Own Device - Providing Reliable Model of Data Access

    Directory of Open Access Journals (Sweden)

    Stąpór Paweł

    2016-10-01

    Full Text Available The article presents a model of Bring Your Own Device (BYOD) as a network model which provides the user with reliable access to network resources. BYOD is a dynamically developing model that can be applied in many areas. A research network was launched in order to carry out a test in which the Work Folders service was used as a service of the BYOD model. This service allows the user to synchronize files between the device and the server. Access to the network is provided through wireless communication using the 802.11n standard. The obtained results are presented and analyzed in this article.

  2. Reliability constrained decision model for energy service provider incorporating demand response programs

    International Nuclear Information System (INIS)

    Mahboubi-Moghaddam, Esmaeil; Nayeripour, Majid; Aghaei, Jamshid

    2016-01-01

    Highlights: • The operation of Energy Service Providers (ESPs) in electricity markets is modeled. • Demand response as the cost-effective solution is used for the energy service provider. • The market price uncertainty is modeled using the robust optimization technique. • The reliability of the distribution network is embedded into the framework. • The simulation results demonstrate the benefits of the robust framework for ESPs. - Abstract: Demand response (DR) programs are becoming a critical concept for the efficiency of current electric power industries; therefore, their various capabilities and barriers have to be investigated. In this paper, an effective decision model is presented for the strategic behavior of energy service providers (ESPs) to demonstrate how to participate in the day-ahead electricity market and how to allocate demand in the smart distribution network. Since the market price affects DR and vice versa, a new two-step sequential framework is proposed, in which the unit commitment (UC) problem is solved to forecast the expected locational marginal prices (LMPs), and subsequently a DR program is applied to optimize the total cost of providing energy for the distribution network customers. This total cost includes the cost of power purchased from the market and from distributed generation (DG) units, the incentive cost paid to the customers, and the compensation cost of power interruptions. To obtain the compensation cost, the reliability evaluation of the distribution network is embedded into the framework using some innovative constraints. Furthermore, to consider the unexpected behaviors of the other market participants, the LMPs are modeled as uncertain parameters using the robust optimization technique, which is more practical compared to the conventional stochastic approach. The simulation results demonstrate the significant benefits of the presented framework for the strategic performance of ESPs.
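
    To make the two-step idea concrete, here is a minimal Python sketch of the second (allocation) step posed as a linear program, assuming the LMPs have already been forecast in step one. The reduction to this simple form and every number in it are illustrative assumptions, not the authors' formulation.

      # Hedged sketch of an ESP allocation step (hypothetical data throughout).
      from scipy.optimize import linprog

      lmp = [42.0, 55.0, 61.0]        # forecast LMPs ($/MWh), assumed from step one
      dg_cost = 48.0                  # DG marginal cost ($/MWh), assumed
      incentive = 25.0                # incentive per MWh of curtailed demand, assumed
      demand = [120.0, 150.0, 170.0]  # hourly demand (MWh)
      dg_max, dr_max = 60.0, 20.0     # DG capacity and curtailment limit per hour

      # Decision variables per hour t: grid purchase g_t, DG output d_t, curtailment r_t
      n = len(lmp)
      c = []
      for price in lmp:
          c += [price, dg_cost, incentive]
      A_eq, b_eq = [], []
      for t in range(n):              # energy balance: g_t + d_t + r_t = demand_t
          row = [0.0] * (3 * n)
          row[3 * t:3 * t + 3] = [1.0, 1.0, 1.0]
          A_eq.append(row)
          b_eq.append(demand[t])
      bounds = [(0, None), (0, dg_max), (0, dr_max)] * n

      res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print(f"total cost: {res.fun:.1f} $")
      for t in range(n):
          g, d, r = res.x[3 * t:3 * t + 3]
          print(f"hour {t}: grid={g:.0f}  DG={d:.0f}  DR={r:.0f} MWh")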

  3. A stochastic simulation model for reliable PV system sizing providing for solar radiation fluctuations

    International Nuclear Information System (INIS)

    Kaplani, E.; Kaplanis, S.

    2012-01-01

    Highlights: ► Solar radiation data for European cities follow the Extreme Value or Weibull distribution. ► Simulation model for the sizing of SAPV systems based on energy balance and stochastic analysis. ► Simulation of PV generator-loads-battery storage system performance for all months. ► Minimum peak power and battery capacity required for reliable SAPV sizing for various European cities. ► Peak power and battery capacity reduced by more than 30% for operation at a 95% success rate. -- Abstract: The large fluctuations observed in daily solar radiation profiles strongly affect the reliability of PV system sizing. Increasing the reliability of the PV system requires a higher installed peak power (P_m) and a larger battery storage capacity (C_L). This leads to increased costs and makes PV technology less competitive. This research paper presents a new stochastic simulation model for stand-alone PV systems, developed to determine the minimum installed P_m and C_L for the PV system to be energy independent. The stochastic simulation model makes use of knowledge acquired from an in-depth statistical analysis of the solar radiation data for the site, and simulates the energy delivered, the excess energy burnt, the load profiles and the state of charge of the battery system for the month the sizing is applied, and the PV system performance for the entire year. The simulation model provides the user with values for the autonomy factor d, simulating PV performance in order to determine the minimum P_m and C_L depending on the requirements of the application, i.e. operation with critical or non-critical loads. The model makes use of NASA's Surface meteorology and Solar Energy database for the years 1990–2004 for various cities in Europe with different climates. The results obtained with this new methodology indicate a substantial reduction in installed peak power and battery capacity, both for critical and non-critical operation, when compared to
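
    A minimal Python sketch of the energy-balance idea follows, assuming (per the highlights) a Weibull model of daily radiation; it tracks the battery state of charge day by day and estimates a loss-of-load probability for one candidate (P_m, C_L) pair. All parameter values are invented, and the loop is a deliberate simplification of the authors' model.

      # Hedged Monte Carlo sketch of stand-alone PV sizing (hypothetical values).
      import numpy as np

      rng = np.random.default_rng(0)
      shape, scale = 2.0, 4.5          # assumed Weibull fit of equivalent sun hours
      pm_kw, cl_kwh = 1.2, 6.0         # candidate peak power P_m and capacity C_L
      load_kwh, eff = 3.0, 0.8         # daily load and overall system efficiency

      days, failures = 10_000, 0
      soc = cl_kwh                     # start with a full battery
      for _ in range(days):
          h_sun = rng.weibull(shape) * scale             # sampled equivalent sun hours
          soc = min(cl_kwh, soc + pm_kw * h_sun * eff)   # charge; excess is burnt
          if soc >= load_kwh:
              soc -= load_kwh
          else:
              failures += 1                              # loss-of-load day
              soc = 0.0
      print(f"loss-of-load probability: {failures / days:.3f}")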

  4. Travel time reliability modeling.

    Science.gov (United States)

    2011-07-01

    This report includes three papers as follows: 1. Guo F., Rakha H., and Park S. (2010), "A Multi-state Travel Time Reliability Model," Transportation Research Record: Journal of the Transportation Research Board, n 2188, pp. 46-54. 2. Park S.,...

  5. Reliability and Model Fit

    Science.gov (United States)

    Stanley, Leanne M.; Edwards, Michael C.

    2016-01-01

    The purpose of this article is to highlight the distinction between the reliability of test scores and the fit of psychometric measurement models, reminding readers why it is important to consider both when evaluating whether test scores are valid for a proposed interpretation and/or use. It is often the case that an investigator judges both the…

  6. Supply chain reliability modelling

    Directory of Open Access Journals (Sweden)

    Eugen Zaitsev

    2012-03-01

    Full Text Available Background: Today it is virtually impossible to operate alone at the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistics operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as to the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum supply plan using an economic criterion, together with a model for evaluating the probability of non-failure operation of the supply chain. Methods: The mathematical model and algorithms to find the optimum supply plan were developed and formulated using an economic criterion and the model for evaluating the probability of non-failure operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations which should be taken into account during supply planning with the supplier's functional reliability is presented.
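
    The probability model referred to above can be illustrated in miniature: if every link of a serial supply channel must perform for a delivery to succeed, channel reliability is the product of the link reliabilities. The Python sketch below, with invented link values, shows only this elementary building block, not the paper's optimization.

      # Non-failure probability of a serial supply chain (illustrative values).
      def chain_reliability(links):
          r = 1.0
          for p in links:        # every link must work, assuming independence
              r *= p
          return r

      plan_a = [0.99, 0.97, 0.95]    # supplier, carrier, warehouse (hypothetical)
      plan_b = [0.99, 0.99, 0.90]
      print("plan A:", round(chain_reliability(plan_a), 4))
      print("plan B:", round(chain_reliability(plan_b), 4))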

  7. Reliability Modeling of Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik

    components. Thus, models of reliability should be developed and applied in order to quantify the residual life of the components. Damage models based on physics of failure combined with stochastic models describing the uncertain parameters are imperative for development of cost-optimal decision tools...... for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied....... Further, reliability modeling of load sharing systems is considered and a theoretical model is proposed based on sequential order statistics and structural systems reliability methods. Procedures for reliability estimation are detailed and presented in a collection of research papers....

  8. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  9. PROVIDING RELIABILITY OF HUMAN RESOURCES IN PRODUCTION MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Anna MAZUR

    2014-07-01

    Full Text Available People are the most valuable asset of an organization, and the results of a company mostly depend on them. The human factor can also be a weak link in the company and a cause of high risk for many of its processes. Reliability of the human factor in the manufacturing process depends on many factors. The authors include aspects of human error, safety culture, knowledge, communication skills, teamwork and the role of leadership in the developed model of human resource reliability in the management of the production process. Based on a case study and the results of research and observation, the authors present risk areas defined in a specific manufacturing process and the results of an evaluation of the reliability of human resources in that process.

  10. Reliability Modeling of Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik

    Cost reductions for offshore wind turbines are a substantial requirement in order to make offshore wind energy more competitive compared to other energy supply methods. During the 20 – 25 years of wind turbines useful life, Operation & Maintenance costs are typically estimated to be a quarter...... the actions should be made and the type of actions requires knowledge on the accumulated damage or degradation state of the wind turbine components. For offshore wind turbines, the action times could be extended due to weather restrictions and result in damage or degradation increase of the remaining...... for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied...

  11. Nonspecialist Raters Can Provide Reliable Assessments of Procedural Skills

    DEFF Research Database (Denmark)

    Mahmood, Oria; Dagnæs, Julia; Bube, Sarah

    2018-01-01

    BACKGROUND: Competency-based learning has become a crucial component in medical education. Despite the advantages of competency-based learning, there are still challenges that need to be addressed. Currently, the common perception is that specialist assessment is needed for evaluating procedural ...... study suggests that nonspecialist raters can provide reliable and valid assessments of video recorded cystoscopies. This could make mastery learning and competency-based education more feasible....

  12. Reliability in the Rasch Model

    Czech Academy of Sciences Publication Activity Database

    Martinková, Patrícia; Zvára, K.

    2007-01-01

    Roč. 43, č. 3 (2007), s. 315-326 ISSN 0023-5954 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : Cronbach's alpha * Rasch model * reliability Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.552, year: 2007 http://dml.cz/handle/10338.dmlcz/135776
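
    For readers unfamiliar with the first keyword, a standard Cronbach's alpha computation is sketched below in Python with made-up item scores; this is the textbook formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), not the paper's Rasch-based treatment.

      # Cronbach's alpha from a respondents-by-items score matrix (invented data).
      import numpy as np

      def cronbach_alpha(scores):
          """scores: 2-D array, rows = respondents, columns = items."""
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_vars = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      data = [[1, 1, 1, 0],
              [1, 0, 1, 1],
              [0, 0, 1, 0],
              [1, 1, 1, 1],
              [0, 0, 0, 0]]
      print(round(cronbach_alpha(data), 3))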

  13. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process as a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Applications of the model to inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the model's application to a software reliability analysis
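
    MERF itself is not reproduced in this record, so the Python sketch below illustrates the NHPP view of software failures with the simpler Goel-Okumoto mean value function m(t) = a(1 - exp(-bt)), under which the probability of no failure in (t, t+x] is exp(-(m(t+x) - m(t))). The parameters are invented and the functional form is a stand-in, not MERF.

      # Generic NHPP software reliability illustration (assumed parameters).
      import math

      a, b = 120.0, 0.05               # expected total faults, detection rate

      def m(t):                        # expected cumulative failures by time t
          return a * (1.0 - math.exp(-b * t))

      def reliability(x, t):           # P(no failure in (t, t+x]) for an NHPP
          return math.exp(-(m(t + x) - m(t)))

      print("expected failures by t=40:", round(m(40), 1))
      print("R(5 | t=40):", round(reliability(5, 40), 3))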

  14. Master/slave clock arrangement for providing reliable clock signal

    Science.gov (United States)

    Abbey, Duane L. (Inventor)

    1977-01-01

    The outputs of two like frequency oscillators are combined to form a single reliable clock signal, with one oscillator functioning as a slave under the control of the other to achieve phase coincidence when the master is operative and in a free-running mode when the master is inoperative so that failure of either oscillator produces no effect on the clock signal.

  15. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  16. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  17. Reliability models for Space Station power system

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kim, Y.; Wagner, H.

    1987-01-01

    This paper presents a methodology for the reliability evaluation of Space Station power system. The two options considered are the photovoltaic system and the solar dynamic system. Reliability models for both of these options are described along with the methodology for calculating the reliability indices.

  18. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
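
    A minimal numerical sketch of the Bayes-factor comparison follows, for the simplest time-independent case: a model-predicted failure probability tested against binomial data, with a vague Beta alternative. Data and priors are invented, and the paper's treatment of statistical uncertainty in the Bayes factor goes well beyond this.

      # Hedged Bayes-factor sketch for reliability model acceptance.
      import math
      from scipy.special import betaln, comb

      p0 = 0.02                # model-predicted failure probability (assumed)
      n, k = 200, 6            # test runs and observed failures (made-up data)

      # Log-likelihood of the data if the model prediction is exact (H0)
      log_h0 = math.log(comb(n, k)) + k * math.log(p0) + (n - k) * math.log(1 - p0)
      # Log marginal likelihood under a uniform Beta(1, 1) alternative (H1)
      a_, b_ = 1.0, 1.0
      log_h1 = math.log(comb(n, k)) + betaln(k + a_, n - k + b_) - betaln(a_, b_)

      bf = math.exp(log_h0 - log_h1)   # > 1 favours accepting the model
      print(f"Bayes factor (model vs. vague alternative): {bf:.2f}")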

  19. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  20. Reliability state space model of Power Transformer

    OpenAIRE

    REENA JHARANIYA; M.AHFAZ KHAN

    2011-01-01

    In an electrical power network, the transformer is one of the most important pieces of equipment in the power system, and its running status directly affects the reliability of the power system. Reliability of a power system is considerably influenced by its equipment. Power transformers are among the most critical and expensive equipment of a power system, and their proper function is vital for the substations and utilities. Therefore, a reliability model of the power transformer is very important i...

  1. Generation and analysis of large reliability models

    Science.gov (United States)

    Palumbo, Daniel L.; Nicol, David M.

    1990-01-01

    An effort has been underway for several years at NASA's Langley Research Center to extend the capability of Markov modeling techniques for reliability analysis to the designers of highly reliable avionic systems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG), a software tool which uses as input a graphical, object-oriented block diagram of the system, is discussed. RMG uses an automated failure modes-effects analysis algorithm to produce the reliability model from the graphical description. Also considered is the ASSURE software tool, a parallel processing program which uses the ASSIST modeling language and SURE semi-Markov solution technique. An executable failure modes-effects analysis is used by ASSURE. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that large system architectures can now be analyzed.

  2. Virtual private networks can provide reliable IT connections.

    Science.gov (United States)

    Kabachinski, Jeff

    2006-01-01

    A VPN is a private network that uses a public network, such as the Internet, to connect remote sites and users together. Instead of using a dedicated hard-wired connection as in a trusted connection or leased lines, a VPN uses a virtual connection routed through the Internet from the organization's private network to the remote site or employee. Typical VPN services allow for security in terms of data encryption as well as means to authenticate, authorize, and account for all the traffic. VPN services allow the organization to use whatever network operating system it wishes, as the VPN also encapsulates the data into the protocols needed to transport it across public lines. The intention of this IT World article was to give the reader an introduction to VPNs. Keep in mind that there are no standard models for a VPN. You're likely to come across many vendors presenting the virtues of their VPN applications and devices when you Google "VPN." However, the general uses, concepts, and principles outlined here should give you a fighting chance to read through the marketing language in the online ads and "white papers."

  3. Reliability modeling of an engineered barrier system

    International Nuclear Information System (INIS)

    Ananda, M.M.A.; Singh, A.K.; Flueck, J.A.

    1993-01-01

    The Weibull distribution is widely used in reliability literature as a distribution of time to failure, as it allows for both increasing failure rate (IFR) and decreasing failure rate (DFR) models. It has also been used to develop models for an engineered barrier system (EBS), which is known to be one of the key components in a deep geological repository for high level radioactive waste (HLW). The EBS failure time can more realistically be modelled by an IFR distribution, since the failure rate for the EBS is not expected to decrease with time. In this paper, we use an IFR distribution to develop a reliability model for the EBS
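
    The IFR property invoked here is easy to exhibit: for a Weibull lifetime with shape parameter greater than one, the hazard h(t) = f(t)/R(t) grows with time. The Python sketch below uses hypothetical parameters, not values from the paper.

      # Increasing failure rate of a Weibull lifetime (hypothetical parameters).
      from scipy.stats import weibull_min

      shape, scale = 2.5, 1000.0       # assumed EBS lifetime parameters (years)
      dist = weibull_min(shape, scale=scale)

      for t in (100, 500, 900):
          hazard = dist.pdf(t) / dist.sf(t)    # h(t) = f(t) / R(t)
          print(f"t={t:>4} yr  R(t)={dist.sf(t):.3f}  h(t)={hazard:.2e}/yr")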

  4. Towards a reliable animal model of migraine

    DEFF Research Database (Denmark)

    Olesen, Jes; Jansen-Olesen, Inger

    2012-01-01

    The pharmaceutical industry shows a decreasing interest in the development of drugs for migraine. One of the reasons for this could be the lack of reliable animal models for studying the effect of acute and prophylactic migraine drugs. The infusion of glyceryl trinitrate (GTN) is the best validated...... and most studied human migraine model. Several attempts have been made to transfer this model to animals. The different variants of this model are discussed as well as other recent models....

  5. An interval-valued reliability model with bounded failure rates

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, Victor

    2012-01-01

    The approach to deriving interval-valued reliability measures described in this paper is distinctive from other imprecise reliability models in that it overcomes the issue of having to impose an upper bound on time to failure. It rests on the presupposition that a constant interval-valued failure...... rate is known possibly along with other reliability measures, precise or imprecise. The Lagrange method is used to solve the constrained optimization problem to derive new reliability measures of interest. The obtained results call for an exponential-wise approximation of failure probability density...... function if only partial failure information is available. An example is provided. © 2012 Copyright Taylor and Francis Group, LLC....
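
    One elementary consequence of a bounded constant failure rate can be shown directly: if the rate is only known to lie in an interval, the exponential survival function is bracketed by the two endpoint rates. The sketch below, with invented rates, illustrates just this bracketing; the paper's Lagrange-based derivation of sharper measures is not reproduced.

      # Survival bounds from an interval-valued constant failure rate.
      import math

      lam_lo, lam_hi = 1e-4, 5e-4      # assumed failure-rate interval (per hour)
      for t in (1_000, 5_000, 10_000):
          r_lower = math.exp(-lam_hi * t)    # worst case uses the upper rate
          r_upper = math.exp(-lam_lo * t)
          print(f"t={t:>6} h  R(t) in [{r_lower:.3f}, {r_upper:.3f}]")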

  6. Achieving Success in Measurement and Reliability Modeling

    OpenAIRE

    Keller, Ted; Munson, John C.; Schneidewind, Norman; Stark, George

    1993-01-01

    Panel Session at the International Symposium on Software Reliability Engineering 1993, Saturday: 6 November 1993, 0830-1000 and 1030-1200 The NASA Space Shuttle on-board software is one of the nation’s most safety-critical software systems. The process which produces this software has been rated at maturity level five. Among the quality assurance methods that are used to ensure the software is free of safetycritical faults is the use of reliability modelling and predi...

  7. Space Vehicle Reliability Modeling in DIORAMA

    Energy Technology Data Exchange (ETDEWEB)

    Tornga, Shawn Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-12

    When modeling system performance of space based detection systems it is important to consider spacecraft reliability. As space vehicles age the components become prone to failure for a variety of reasons such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust fuel supplies. Typically failure is divided into two categories: engineering mistakes and technology surprise. This document will report on a method of simulating space vehicle reliability in the DIORAMA framework.

  8. Overcoming some limitations of imprecise reliability models

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, Victor

    2011-01-01

    The application of imprecise reliability models is often hindered by the rapid growth in imprecision that occurs when many components constitute a system and by the fact that time to failure is bounded from above. The latter results in the necessity to explicitly introduce an upper bound on time...... to failure which is in reality a rather arbitrary value. The practical meaning of the models of this kind is brought to question. We suggest an approach that overcomes the issue of having to impose an upper bound on time to failure and makes the calculated lower and upper reliability measures more precise...

  9. System reliability time-dependent models

    International Nuclear Information System (INIS)

    Debernardo, H.D.

    1991-06-01

    A probabilistic methodology for safety system technical specification evaluation was developed. The method for Surveillance Test Interval (S.T.I.) evaluation basically consists of optimizing the S.T.I. of the system's most important periodically tested components. For Allowed Outage Time (A.O.T.) calculations, the method uses system reliability time-dependent models (a computer code called FRANTIC III). A new approximation to compute system unavailability, called Independent Minimal Cut Sets (A.C.I.), was also developed. This approximation is better than the Rare Event Approximation (A.E.R.) and the extra computing cost is negligible. A.C.I. was joined to FRANTIC III to replace A.E.R. in future applications. The case study evaluations verified that this methodology provides a useful probabilistic assessment of surveillance test intervals and allowed outage times for many plant components. The studied system is a typical configuration of nuclear power plant safety systems (two-out-of-three logic). Because of the good results, these procedures will be used by the Argentine nuclear regulatory authorities in the evaluation of the technical specifications of the Atucha I and Embalse nuclear power plant safety systems. (Author) [es
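
    As background on what FRANTIC-style codes evaluate, the textbook approximation for the time-averaged unavailability of a periodically tested standby component is sketched below with invented values; this generic formula is an illustration only, not the A.C.I. method itself.

      # Average unavailability of a periodically tested standby component.
      lam = 1e-5        # standby failure rate (per hour), assumed
      T = 720.0         # surveillance test interval (hours), assumed
      tau_test = 2.0    # test duration per interval (hours), assumed
      tau_rep = 24.0    # mean repair time after a detected failure (hours)

      # Contributions: undetected failures between tests, test downtime, repair
      q_avg = lam * T / 2 + tau_test / T + lam * tau_rep
      print(f"average unavailability ~ {q_avg:.2e}")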

  10. Centralized Bayesian reliability modelling with sensor networks

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Sečkárová, Vladimíra

    2013-01-01

    Roč. 19, č. 5 (2013), s. 471-482 ISSN 1387-3954 R&D Projects: GA MŠk 7D12004 Grant - others:GA MŠk(CZ) SVV-265315 Keywords : Bayesian modelling * Sensor network * Reliability Subject RIV: BD - Theory of Information Impact factor: 0.984, year: 2013 http://library.utia.cas.cz/separaty/2013/AS/dedecius-0392551.pdf

  11. Stochastic models in reliability and maintenance

    CERN Document Server

    2002-01-01

    Our daily lives are sustained by high-technology systems. Computer systems are typical examples of such systems, and we enjoy our modern lives by using many of them. Much more importantly, we have to maintain such systems without failure, yet we cannot predict when they will fail or how to fix them without delay. A stochastic process is a set of outcomes of a random experiment indexed by time, and is one of the key tools needed to analyze future behavior quantitatively. Reliability and maintainability technologies are of great interest and importance for the maintenance of such systems. Many mathematical models have been and will be proposed to describe reliability and maintainability systems by using stochastic processes. The theme of this book is "Stochastic Models in Reliability and Maintainability." This book consists of 12 chapters on this theme from different viewpoints of stochastic modeling. Chapter 1 is devoted to "Renewal Processes," under which cla...

  12. Reliability modelling and simulation of switched linear system ...

    African Journals Online (AJOL)

    Thus, constructing a subsystem Markov model and matching its parameters with the specified safety factors provides the basis for the entire system analysis. For the system simulation, temporal databases and predictive control algorithm are designed. The simulation results are analyzed to assess the reliability of the system ...

  13. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications, providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  14. Cost Calculation Model for Logistics Service Providers

    Directory of Open Access Journals (Sweden)

    Zoltán Bokor

    2012-11-01

    Full Text Available The exact calculation of logistics costs has become a real challenge in logistics and supply chain management. It is essential to gain reliable and accurate costing information to attain efficient resource allocation within the logistics service provider companies. Traditional costing approaches, however, may not be sufficient to reach this aim in case of complex and heterogeneous logistics service structures. So this paper intends to explore the ways of improving the cost calculation regimes of logistics service providers and show how to adopt the multi-level full cost allocation technique in logistics practice. After determining the methodological framework, a sample cost calculation scheme is developed and tested by using estimated input data. Based on the theoretical findings and the experiences of the pilot project it can be concluded that the improved costing model contributes to making logistics costing more accurate and transparent. Moreover, the relations between costs and performances also become more visible, which enhances the effectiveness of logistics planning and controlling significantly

  15. Reliability model of SNS linac (spallation neutron source-ORNL)

    International Nuclear Information System (INIS)

    Pitigoi, A.; Fernandez, P.

    2015-01-01

    A reliability model of the SNS LINAC (Spallation Neutron Source at Oak Ridge National Laboratory) has been developed using the Risk Spectrum reliability analysis software, and an analysis of the accelerator system's reliability has been performed. The analysis results have been evaluated by comparing them with the SNS operational data. This paper presents the main results and conclusions, focusing on the identification of design weaknesses, and provides recommendations to improve the reliability of the MYRRHA linear accelerator. The reliability results show that the most affected SNS LINAC parts/systems are: 1) the SCL (superconducting linac) and the front-end systems: IS, LEBT (low-energy beam transport line), MEBT (medium-energy beam transport line), diagnostics and controls; 2) the RF systems (especially the SCL RF system); 3) power supplies and PS controllers. These results are in line with the records in the SNS logbook. The reliability issue that needs to be enforced in the linac design is the redundancy of the systems, subsystems and components most affected by failures. For compensation purposes, there is a need for intelligent fail-over redundancy implementation in controllers. Enough diagnostics have to be implemented to allow reliable functioning of the redundant solutions and to ensure the compensation function

  16. Interrater reliability of early intervention providers scoring the Alberta Infant Motor Scale.

    Science.gov (United States)

    Blanchard, Y; Neilan, E; Busanich, J; Garavuso, L; Klimas, D

    2004-01-01

    This study was designed to examine the interrater reliability of early intervention providers' scoring of the Alberta Infant Motor Scale (AIMS) and to examine whether training on the AIMS would improve their interrater reliability. Eight early intervention providers were randomly assigned to two groups. Participants in Group 1 scored the AIMS on seven videotapes of infants prior to receiving training, and after training on another set of seven videotapes of infants. Participants in Group 2 scored the AIMS on all 14 videotapes of the infants after receiving training. Overall interrater reliability before and after training was high, with intraclass correlation coefficients ranging from 0.98 to 0.99. Detailed examination of the results showed that training improved the reliability of the supine subscale in a subgroup of infants between the ages of five and seven months. Training also had an effect on the classification of infants as normal or abnormal in their motor development based on their percentile rankings. The AIMS manual provides sufficient information to attain high interrater reliability without training, but revisions regarding scoring are strongly recommended.

  17. The Impact of Process Capability on Service Reliability for Critical Infrastructure Providers

    Science.gov (United States)

    Houston, Clemith J., Jr.

    2013-01-01

    This study investigated the relationship between organizational processes that have been identified as promoting resiliency and their impact on service reliability within the scope of critical infrastructure providers. The importance of critical infrastructure to the nation is evident from the body of research and is supported by instances where…

  18. Standardized Patients Provide a Reliable Assessment of Athletic Training Students' Clinical Skills

    Science.gov (United States)

    Armstrong, Kirk J.; Jarriel, Amanda J.

    2016-01-01

    Context: Providing students reliable objective feedback regarding their clinical performance is of great value for ongoing clinical skill assessment. Since a standardized patient (SP) is trained to consistently portray the case, students can be assessed and receive immediate feedback within the same clinical encounter; however, no research, to our…

  19. Reliability modeling and analysis of smart power systems

    CERN Document Server

    Karki, Rajesh; Verma, Ajit Kumar

    2014-01-01

    The volume presents the research work in understanding, modeling and quantifying the risks associated with different ways of implementing smart grid technology in power systems in order to plan and operate a modern power system with an acceptable level of reliability. Power systems throughout the world are undergoing significant changes creating new challenges to system planning and operation in order to provide reliable and efficient use of electrical energy. The appropriate use of smart grid technology is an important drive in mitigating these problems and requires considerable research acti

  20. System Statement of Tasks of Calculating and Providing the Reliability of Heating Cogeneration Plants in Power Systems

    Science.gov (United States)

    Biryuk, V. V.; Tsapkova, A. B.; Larin, E. A.; Livshiz, M. Y.; Sheludko, L. P.

    2018-01-01

    A set of mathematical models for calculating the reliability indexes of structurally complex multifunctional combined installations in heat and power supply systems was developed. Reliability of energy supply is considered a required condition for the creation and operation of heat and power supply systems. The optimal value of the power supply system coefficient F is based on an economic assessment of the consumers' losses caused by the under-supply of electric power and the additional system expenses for the creation and operation of an emergency capacity reserve. Rationing of the reliability indexes (RI) of industrial heat supply is based on the concept of a technological safety margin for production processes. The rationed RI values for the heat supply of communal consumers are defined on the basis of the air temperature level inside the heated premises. The complex allows a number of practical tasks to be solved for providing reliability of heat supply to consumers. A probabilistic model is developed for calculating the reliability indexes of combined multipurpose heat and power plants in heat-and-power supply systems. The complex of models and calculation programs can be used to solve a wide range of specific tasks concerning the optimization of schemes and parameters of combined heat and power plants and systems, as well as to determine the efficiency of various redundancy methods to ensure the specified reliability of power supply.

  1. Inter-Rater Reliability of Provider Interpretations of Irritable Bowel Syndrome Food and Symptom Journals.

    Science.gov (United States)

    Zia, Jasmine; Chung, Chia-Fang; Xu, Kaiyuan; Dong, Yi; Schenk, Jeanette M; Cain, Kevin; Munson, Sean; Heitkemper, Margaret M

    2017-11-04

    There are currently no standardized methods for identifying trigger food(s) from irritable bowel syndrome (IBS) food and symptom journals. The primary aim of this study was to assess the inter-rater reliability of providers' interpretations of IBS journals. A second aim was to describe whether these interpretations varied for each patient. Eight providers reviewed 17 IBS journals and rated how likely key food groups (fermentable oligo-di-monosaccharides and polyols, high-calorie, gluten, caffeine, high-fiber) were to trigger IBS symptoms for each patient. Agreement of trigger food ratings was calculated using Krippendorff's α-reliability estimate. Providers were also asked to write down recommendations they would give to each patient. Estimates of agreement of trigger food likelihood ratings were poor (average α = 0.07). Most providers gave similar trigger food likelihood ratings for over half the food groups. Four providers gave the exact same written recommendation(s) (range 3-7) to over half the patients. Inter-rater reliability of provider interpretations of IBS food and symptom journals was poor. Providers favored certain trigger food likelihood ratings and written recommendations. This supports the need for a more standardized method for interpreting these journals and/or more rigorous techniques to accurately identify personalized IBS food triggers.
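
    A sketch of the agreement computation named in the abstract follows, using the open-source krippendorff Python package (its availability and API are assumptions here); the ratings matrix is invented, with raters in rows, journals in columns, and NaN marking a missing rating.

      # Krippendorff's alpha for ordinal likelihood ratings (invented data).
      import numpy as np
      import krippendorff   # assumed installed: pip install krippendorff

      ratings = np.array([
          [3, 4, 2, 5, np.nan],
          [2, 4, 1, 5, 3],
          [3, 5, 2, 4, 3],
      ])
      alpha = krippendorff.alpha(reliability_data=ratings,
                                 level_of_measurement="ordinal")
      print(f"Krippendorff's alpha: {alpha:.2f}")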

  2. Inter-Rater Reliability of Provider Interpretations of Irritable Bowel Syndrome Food and Symptom Journals

    Directory of Open Access Journals (Sweden)

    Jasmine Zia

    2017-11-01

    Full Text Available There are currently no standardized methods for identifying trigger food(s) from irritable bowel syndrome (IBS) food and symptom journals. The primary aim of this study was to assess the inter-rater reliability of providers' interpretations of IBS journals. A second aim was to describe whether these interpretations varied for each patient. Eight providers reviewed 17 IBS journals and rated how likely key food groups (fermentable oligo-di-monosaccharides and polyols, high-calorie, gluten, caffeine, high-fiber) were to trigger IBS symptoms for each patient. Agreement of trigger food ratings was calculated using Krippendorff's α-reliability estimate. Providers were also asked to write down recommendations they would give to each patient. Estimates of agreement of trigger food likelihood ratings were poor (average α = 0.07). Most providers gave similar trigger food likelihood ratings for over half the food groups. Four providers gave the exact same written recommendation(s) (range 3–7) to over half the patients. Inter-rater reliability of provider interpretations of IBS food and symptom journals was poor. Providers favored certain trigger food likelihood ratings and written recommendations. This supports the need for a more standardized method for interpreting these journals and/or more rigorous techniques to accurately identify personalized IBS food triggers.

  3. Some notes on unobserved parameters (frailties) in reliability modeling

    International Nuclear Information System (INIS)

    Cha, Ji Hwan; Finkelstein, Maxim

    2014-01-01

    Unobserved random quantities (frailties) often appear in various reliability problems, especially when dealing with the failure rates of items from heterogeneous populations. As the failure rate is a conditional characteristic, the distributions of these random quantities, similarly to Bayesian approaches, are updated in accordance with the corresponding survival information. In some instances, apart from their statistical meaning, frailties can also have useful interpretations describing the underlying lifetime model. We discuss and clarify these issues in a reliability context, and present and analyze several meaningful examples. We consider the proportional hazards model with a random factor; the stress–strength model, where the unobserved strength of a system can be viewed as frailty; a parallel system with a random number of components; and, finally, the first passage time problem for the Wiener process with random parameters. - Highlights: • We discuss and clarify the notion of frailty in a reliability context and present and analyze several meaningful examples. • The paper provides a new insight and general perspective on reliability models with unobserved parameters. • The main message of the paper is well illustrated by several meaningful examples and emphasized by detailed discussion

  4. Modeling high-Power Accelerators Reliability-SNS LINAC (SNS-ORNL); MAX LINAC (MYRRHA)

    Energy Technology Data Exchange (ETDEWEB)

    Pitigoi, A. E.; Fernandez Ramos, P.

    2013-07-01

    Improving reliability has recently become a very important objective in the field of particle accelerators. The particle accelerators in operation are constantly undergoing modifications, and improvements are implemented using new technologies, more reliable components or redundant schemes (to obtain more reliability, strength, more power, etc.). A reliability model of the SNS (Spallation Neutron Source) LINAC has been developed and an analysis of the accelerator systems' reliability has been performed within the MAX project, using the Risk Spectrum reliability analysis software. The analysis results have been evaluated by comparison with the SNS operational data. Results and conclusions are presented in this paper, oriented towards identifying design weaknesses and providing recommendations for improving the reliability of the MYRRHA linear accelerator. The SNS reliability model developed for the MAX preliminary design phase indicates possible avenues for further investigation that could be needed to improve the reliability of high-power accelerators, in view of the future reliability targets of ADS accelerators.

  5. Evaluating the reliability of predictions made using environmental transfer models

    International Nuclear Information System (INIS)

    1989-01-01

    The development and application of mathematical models for predicting the consequences of releases of radionuclides into the environment from normal operations in the nuclear fuel cycle and in hypothetical accident conditions have increased dramatically in the last two decades. This Safety Practice publication has been prepared to provide guidance on the available methods for evaluating the reliability of environmental transfer model predictions. It provides a practical introduction to the subject, and particular emphasis has been given to worked examples in the text. It is intended to supplement existing IAEA publications on environmental assessment methodology. 60 refs, 17 figs, 12 tabs

  6. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and to the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional level as well as at a system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified

  7. Motion Reliability Modeling and Evaluation for Manipulator Path Planning Task

    OpenAIRE

    Li, Tong; Jia, Qingxuan; Chen, Gang; Sun, Hanxu

    2015-01-01

    Motion reliability as a criterion can reflect the accuracy of a manipulator in completing operations. Since the path planning task plays a significant role in the operations of a manipulator, the motion reliability evaluation of the path planning task is discussed in the paper. First, a modeling method for motion reliability is proposed by taking factors related to the position accuracy of the manipulator into account. In the model, multidimensional integration of the PDF is carried out to calculate motion reliability. Co...

  8. Human reliability data collection and modelling

    International Nuclear Information System (INIS)

    1991-09-01

    The main purpose of this document is to review and outline the current state of the art of Human Reliability Assessment (HRA) as used for quantitative assessment of the safe and economical operation of nuclear power plants. Another objective is to consider Human Performance Indicators (HPI), which can alert plant managers and regulators to departures from states of normal and acceptable operation. These two objectives are met in the three sections of this report. The first objective has been divided into two areas, based on the location of the human actions being considered. That is, the modelling and data collection associated with control room actions are addressed first in chapter 1, while actions outside the control room (including maintenance) are addressed in chapter 2. Both chapters 1 and 2 present a brief outline of the current status of HRA for these areas, and the major outstanding issues. Chapter 3 discusses HPI. Such performance indicators can signal, at various levels, changes in factors which influence human performance. The final section of this report consists of papers presented by the participants of the Technical Committee Meeting. A separate abstract was prepared for each of these papers. Refs, figs and tabs

  9. Reliability modelling - PETROBRAS 2010 integrated gas supply chain

    Energy Technology Data Exchange (ETDEWEB)

    Faertes, Denise; Heil, Luciana; Saker, Leonardo; Vieira, Flavia; Risi, Francisco; Domingues, Joaquim; Alvarenga, Tobias; Carvalho, Eduardo; Mussel, Patricia

    2010-09-15

    The purpose of this paper is to present the innovative reliability modeling of the Petrobras 2010 integrated gas supply chain. The model represents a challenge in terms of complexity and software robustness. It was jointly developed by the PETROBRAS Gas and Power Department and Det Norske Veritas. It was carried out with the objective of evaluating the security of supply of the 2010 gas network design, which was conceived to connect the Brazilian Northeast and Southeast regions. To provide best-in-class analysis, state-of-the-art software was used to quantify the availability and the efficiency of the overall network and its individual components.

  10. Providing reliable energy in a time of constraints : a North American concern

    International Nuclear Information System (INIS)

    Egan, T.; Turk, E.

    2008-04-01

    The reliability of the North American electricity grid was discussed. Government initiatives designed to control carbon dioxide (CO2) and other emissions in some regions of Canada may lead to electricity supply constraints in other regions. A lack of investment in transmission infrastructure has resulted in constraints within the North American transmission grid, and the growth of smaller projects is now raising concerns about transmission capacity. Labour supply shortages in the electricity industry are also creating concerns about the long-term security of the electricity market. Measures to address constraints must be considered in the current context of the North American electricity system. The extensive transmission interconnects and integration between the United States and Canada will provide a framework for greater trade and market opportunities between the 2 countries. Coordinated actions and increased integration will enable Canada and the United States to increase the reliability of electricity supply. However, both countries must work cooperatively to increase generation supply using both mature and emerging technologies. The cross-border transmission grid must be enhanced by increasing transmission capacity as well as by implementing new reliability rules, building new infrastructure, and ensuring infrastructure protection. Barriers to cross-border electricity trade must be identified and avoided. Demand-side and energy efficiency measures must also be implemented. It was concluded that both countries must focus on developing strategies for addressing the environmental concerns related to electricity production. 6 figs

  11. Building and integrating reliability models in a Reliability-Centered-Maintenance approach

    International Nuclear Information System (INIS)

    Verite, B.; Villain, B.; Venturini, V.; Hugonnard, S.; Bryla, P.

    1998-03-01

    Electricite de France (EDF) has recently developed its OMF-Structures method, designed to optimize risk-based preventive maintenance of passive structures such as pipes and supports. In particular, the reliability performance of components needs to be determined; this is a two-step process consisting of a qualitative sort followed by a quantitative evaluation, involving two types of models. Initially, degradation models are widely used to exclude some components from the field of preventive maintenance. The reliability of the remaining components is then evaluated by means of quantitative reliability models. The results are then included in a risk indicator that is used to directly optimize preventive maintenance tasks. (author)

  12. Model of reliability assessment in ultrasonic nondestructive inspection

    International Nuclear Information System (INIS)

    Park, Ik Keun; Park, Un Su; Kim, Hyun Mook; Park, Yoon Won; Kang, Suk Chull; Choi, Young Hwan; Lee, Jin Ho

    2001-01-01

    An ultrasonic inspection system consists of the operator, the equipment and the procedure, and the reliability of ultrasonic inspection results is affected by the capability of each of these elements. Furthermore, the reliability of nondestructive testing is influenced by the inspection environment, the materials involved and the types of defect. It is therefore very difficult to estimate the reliability of NDT, owing to these various factors. In this study, the probability of detection, obtained using a logistic probability model and Monte Carlo simulation, was used to estimate the reliability of ultrasonic inspection. The utility of the NDT reliability assessment is verified by analysis of the data from a round robin test to which these models were applied
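
    The logistic probability model mentioned above is conventionally written in NDT work as POD(a) = 1/(1 + exp(-(b0 + b1 ln a))) for flaw size a; the Python sketch below evaluates a few points of such a curve with invented coefficients, not the study's fitted values.

      # Log-logistic probability-of-detection curve (hypothetical coefficients).
      import math

      b0, b1 = -8.0, 3.0               # assumed regression coefficients

      def pod(a_mm):                   # detection probability for flaw size a
          return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a_mm))))

      for a in (5, 10, 20, 40):
          print(f"flaw {a:>2} mm: POD = {pod(a):.2f}")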

  13. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    Science.gov (United States)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to relying on the prediction from one model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global-based BMA (BMA_G) prediction, which is superior to the ensemble mean prediction. Additionally, the high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
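
    The combination step of BMA can be sketched in a few lines: member predictions are averaged with weights learned in training. In the Python illustration below, inverse-squared-error weights stand in for the likelihood-based EM weights of full BMA, and all numbers are invented.

      # Simplified BMA-style combination of ensemble stage forecasts.
      import numpy as np

      predictions = np.array([4.8, 5.3, 5.0, 5.6])   # member forecasts (m)
      rmse = np.array([0.30, 0.55, 0.40, 0.70])      # training errors per member

      weights = 1.0 / rmse**2          # stand-in for EM-estimated BMA weights
      weights /= weights.sum()         # normalize to sum to one
      print("weights:", np.round(weights, 3))
      print(f"combined stage forecast: {weights @ predictions:.2f} m")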

  14. Reliability physics and engineering time-to-failure modeling

    CERN Document Server

    McPherson, J W

    2013-01-01

    Reliability Physics and Engineering provides critically important information that is needed for designing and building reliable cost-effective products. Key features include: materials/device degradation; degradation kinetics; time-to-failure modeling; statistical tools; failure-rate modeling; accelerated testing; ramp-to-failure testing; important failure mechanisms for integrated circuits; important failure mechanisms for mechanical components; conversion of dynamic stresses into static equivalents; small design changes producing major reliability improvements; screening methods; heat generation and dissipation; and sampling plans and confidence intervals. This textbook includes numerous example problems with solutions. Also, exercise problems along with the answers are included at the end of each chapter. Relia...

  15. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for, and a basis for developing requirements for, the next generation of HRA methods. It is argued that a model-based approach providing explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; owing to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation

  16. Using the Weibull distribution reliability, modeling and inference

    CERN Document Server

    McCool, John I

    2012-01-01

    Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
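    A short sketch of the core workflow such a text covers, using scipy's built-in Weibull maximum-likelihood fit; the simulated lifetimes and parameter values are arbitrary, not examples from the book.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated lifetimes (hours) from a Weibull with shape 1.8, scale 1000.
lifetimes = stats.weibull_min.rvs(1.8, scale=1000.0, size=200, random_state=rng)

# Maximum-likelihood fit; floc=0 pins the location at zero, giving the
# usual two-parameter Weibull used in reliability work.
shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0)
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.1f} h")

# Reliability at t = 500 h: R(t) = exp(-(t/eta)**beta)
t = 500.0
print(f"R({t:.0f} h) = {np.exp(-(t / scale) ** shape):.3f}")
```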

  17. An approximation formula for a class of Markov reliability models

    Science.gov (United States)

    White, A. L.

    1984-01-01

    A way of treating a small but often-used class of reliability models, and of algebraically approximating system reliability, is shown. The models considered are appropriate for redundant, reconfigurable digital control systems that operate for a short period of time without maintenance; for such systems the method gives a formula in terms of component fault rates, system recovery rates, and system operating time.
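    For concreteness, here is a small continuous-time Markov model of the kind the abstract describes, solved numerically; the duplex-with-recovery structure and all rates are assumed, and the closing comment only gives the flavor of an algebraic approximation, not the paper's exact formula.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative duplex system: two active units with fault rate lam each; a
# fault is either recovered (rate delta, reconfiguring to simplex) or, if a
# second fault arrives first, the system fails. All values are assumed.
lam, delta, t = 1e-4, 3600.0, 10.0   # /hour, /hour, hours

# States: 0 = duplex, 1 = fault pending recovery, 2 = simplex, 3 = failed
Q = np.array([
    [-2 * lam,  2 * lam,         0.0,   0.0],
    [ 0.0,     -(delta + lam),   delta, lam],
    [ 0.0,      0.0,            -lam,   lam],
    [ 0.0,      0.0,             0.0,   0.0],
])

p0 = np.array([1.0, 0.0, 0.0, 0.0])
p_t = p0 @ expm(Q * t)
print(f"unreliability at t = {t} h: {p_t[3]:.3e}")
# For lam*t << 1 and delta >> lam this is roughly
# 2*lam**2*t/delta + lam**2*t**2 -- the kind of closed-form
# approximation in fault rates, recovery rates, and operating time
# that the paper derives.
```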

  18. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature of mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in its load transmission path, system-component relationship, system functioning manner, and time-dependent system configuration. First, the present paper defines the time-domain series system, to which the traditional series system reliability model is not adequate. Then, a system-specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material prior/posterior strength expression, time-dependent and system-specific load-strength interference analysis, and treatment of statistically dependent failure events. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth-root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system-specific reliability model and the traditional series system reliability model are illustrated by several numerical examples. - Highlights: • A new type of series system, i.e. the time-domain multi-configuration series system, is defined, which is of great significance to reliability modeling. • A multi-level statistical-analysis-based reliability modeling method is presented for gear transmission systems. • Several system-specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  19. Reliability models applicable to space telescope solar array assembly system

    Science.gov (United States)

    Patil, S. A.

    1986-01-01

    A complex system may consist of a number of subsystems with several components in series, in parallel, or in a combination of series and parallel. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to parallel and series models; hence, the models can be specialized to parallel, series and combination systems. The models are developed by assuming the failure rates of the components to be functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) system. The STSA consists of 20 identical solar panel assemblies (SPAs). The reliabilities of the SPAs are determined by the reliabilities of solar cell strings, interconnects, and diodes. Estimates of the reliability of the system for one to five years are calculated by using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
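    The sketch below reads the record under one common convention, namely that a subsystem fails once k of its n components have failed, so k=1 reproduces a series system and k=n a parallel one; the string/assembly counts and reliabilities are illustrative, not the STSA values.

```python
import math

def subsystem_reliability(n, k, p):
    """Reliability of a subsystem that fails once k of its n components
    (each with reliability p) have failed; k=1 gives series, k=n parallel."""
    q = 1.0 - p
    return sum(math.comb(n, i) * q**i * p**(n - i) for i in range(k))

# Illustrative numbers only: 20 assemblies in series, each assembly
# tolerating up to 2 of its 10 strings failing.
p_string = 0.98
p_spa = subsystem_reliability(10, 3, p_string)
p_system = p_spa ** 20
print(f"SPA reliability: {p_spa:.5f}, system reliability: {p_system:.4f}")
```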

  20. Evaluation of mobile ad hoc network reliability using propagation-based link reliability model

    International Nuclear Information System (INIS)

    Padmavathy, N.; Chaturvedi, Sanjay K.

    2013-01-01

    A wireless mobile ad hoc network (MANET) is a collection of autonomous nodes (that can move randomly around the area of deployment), making the topology highly dynamic; nodes communicate with each other by forming a single-hop/multi-hop network and maintain connectivity in a decentralized manner. A MANET is modelled using geometric random graphs rather than random graphs because link existence in a MANET is a function of the geometric distance between the nodes and the transmission range of the nodes. Among the many factors that contribute to MANET reliability is the robustness of the links between the mobile nodes of the network. Recently, the reliability of such networks has been evaluated for imperfect nodes (transceivers) with a binary model of communication links based on the transmission range of the mobile nodes and the distance between them. However, in reality, the probability of successful communication decreases as the signal strength deteriorates due to noise, fading or interference effects, even within the nodes' transmission range. Hence, in this paper, a propagation-based link reliability model, rather than a binary model, with nodes following a known failure distribution, is used to evaluate the network reliability (2TR_m, ATR_m and AoTR_m) of a MANET through Monte Carlo simulation. The method is illustrated with an application and some imperative results are also presented
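    A toy Monte Carlo two-terminal reliability estimate in the spirit of this abstract; the Gaussian-decay link-reliability form, node count, and all parameters are assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(3)

def link_prob(d, r, alpha=2.0):
    # Propagation-based link reliability: success probability decays with
    # distance instead of the binary "connected iff d <= r" rule (assumed form).
    return np.exp(-alpha * (d / r) ** 2) if d <= r else 0.0

def two_terminal_reliability(n=30, area=100.0, r=35.0, p_node=0.95, trials=2000):
    hits = 0
    for _ in range(trials):
        pts = rng.uniform(0, area, size=(n, 2))
        up = rng.random(n) < p_node                 # imperfect transceivers
        adj = [[] for _ in range(n)]                # random graph for this trial
        for i in range(n):
            for j in range(i + 1, n):
                d = np.hypot(*(pts[i] - pts[j]))
                if up[i] and up[j] and rng.random() < link_prob(d, r):
                    adj[i].append(j); adj[j].append(i)
        seen, stack = {0}, [0]                      # search from terminal 0
        while stack:
            v = stack.pop()
            for w in adj[v]:
                if w not in seen:
                    seen.add(w); stack.append(w)
        hits += (n - 1) in seen                     # reached terminal n-1?
    return hits / trials

print(f"estimated 2-terminal reliability: {two_terminal_reliability():.3f}")
```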

  1. Developing Fast and Reliable Flood Models

    DEFF Research Database (Denmark)

    Thrysøe, Cecilie; Toke, Jens; Borup, Morten

    2016-01-01

    State-of-the-art flood modelling in urban areas is based on distributed physically based models. However, their usage is impeded by high computational demands and numerical instabilities, which make calculations both difficult and time consuming. To address these challenges we develop and test a ... accuracy. The model shows no instability, hence larger time steps can be applied, which reduces the computational time by more than a factor of 1400. In conclusion, surrogate models show great potential for usage in urban water modelling.

  2. Wireless Channel Modeling Perspectives for Ultra-Reliable Communications

    DEFF Research Database (Denmark)

    Eggers, Patrick Claus F.; Popovski, Petar

    2018-01-01

    Ultra-Reliable Communication (URC) is one of the distinctive features of the upcoming 5G wireless communication. The level of reliability, going down to packet error rates (PER) of $10^{-9}$, should be sufficiently convincing in order to remove cables in an industrial setting or provide remote...

  3. Reliability modeling of Clinch River breeder reactor electrical shutdown systems

    International Nuclear Information System (INIS)

    Schatz, R.A.; Duetsch, K.L.

    1974-01-01

    The initial simulation of the probabilistic properties of the Clinch River Breeder Reactor Plant (CRBRP) electrical shutdown systems is described. A model of the reliability (and availability) of the systems is presented, utilizing success-state and continuous-time, discrete-state Markov modeling techniques as significant elements of an overall reliability assessment process capable of demonstrating the achievement of program goals. This model is examined for its sensitivity to safe/unsafe failure rates, subsystem redundant configurations, test and repair intervals, monitoring by reactor operators, and the control exercised over system reliability by design modifications and the selection of system operating characteristics. (U.S.)

  4. Conceptual Software Reliability Prediction Models for Nuclear Power Plant Safety Systems

    International Nuclear Information System (INIS)

    Johnson, G.; Lawrence, D.; Yu, H.

    2000-01-01

    The objective of this project is to develop a method to predict the potential reliability of software to be used in a digital instrumentation and control system. The reliability prediction is to make use of existing measures of software reliability such as those described in IEEE Std 982 and 982.2. This prediction must be of sufficient accuracy to provide a value for uncertainty that could be used in a nuclear power plant probabilistic risk assessment (PRA). For the purposes of the project, reliability was defined to be the probability that the digital system will successfully perform its intended safety function (for the distribution of conditions under which it is expected to respond) upon demand, with no unintended functions that might affect system safety. The ultimate objective is to use the identified measures to develop a method for predicting the potential quantitative reliability of a digital system. The reliability prediction models proposed in this report are conceptual in nature; that is, possible prediction techniques are proposed and trial models are built, but in order to become a useful tool for predicting reliability, the models must be tested, modified according to the results, and validated. Using methods outlined by this project, models could be constructed to develop reliability estimates for elements of software systems. This would require careful review and refinement of the models, development of model parameters from actual experience data or expert elicitation, and careful validation. By combining these reliability estimates (generated from the validated models for the constituent parts) in structural software models, the reliability of the software system could then be predicted. Modeling digital system reliability will also require that methods be developed for combining reliability estimates for hardware and software. System structural models must also be developed in order to predict system reliability based upon the reliability...

  5. Models for Battery Reliability and Lifetime

    Energy Technology Data Exchange (ETDEWEB)

    Smith, K.; Wood, E.; Santhanagopalan, S.; Kim, G. H.; Neubauer, J.; Pesaran, A.

    2014-03-01

    Models describing battery degradation physics are needed to more accurately understand how battery usage and next-generation battery designs can be optimized for performance and lifetime. Such lifetime models may also reduce the cost of battery aging experiments and shorten the time required to validate battery lifetime. Models for chemical degradation and mechanical stress are reviewed. Experimental analysis of aging data from a commercial iron-phosphate lithium-ion (Li-ion) cell elucidates the relative importance of several mechanical stress-induced degradation mechanisms.
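    As a hedged illustration of the kind of lifetime model reviewed here (not NREL's actual model), a common empirical form combines a square-root-of-time calendar-fade term with a throughput-proportional cycling term; all coefficients below are invented.

```python
import numpy as np

def relative_capacity(t_days, cycles_per_day, a=2.5e-3, b=6.0e-5):
    """Remaining relative capacity under an assumed empirical fade model."""
    calendar = a * np.sqrt(t_days)          # SEI-growth-like calendar term
    cycling = b * cycles_per_day * t_days   # cycling/throughput term
    return 1.0 - calendar - cycling

for years in (1, 2, 5, 8):
    cap = relative_capacity(years * 365.0, cycles_per_day=1.0)
    print(f"{years} y: {cap:.3f} of initial capacity")
```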

  6. Motion Reliability Modeling and Evaluation for Manipulator Path Planning Task

    Directory of Open Access Journals (Sweden)

    Tong Li

    2015-01-01

    Full Text Available Motion reliability, as a criterion, can reflect the accuracy of a manipulator in completing operations. Since the path planning task plays a significant role in manipulator operations, the motion reliability evaluation of the path planning task is discussed in this paper. First, a modeling method for motion reliability is proposed that takes factors related to the position accuracy of the manipulator into account. In the model, a multidimensional integral of the PDF is carried out to calculate motion reliability. Considering the complexity of the multidimensional integral, the equivalent extreme value approach is introduced, with which the multidimensional integral is converted into a one-dimensional integral for convenient calculation. Then a method based on the maximum entropy principle is proposed for model calculation, with which the PDF can be obtained efficiently at the state of maximum entropy. As a result, the evaluation of motion reliability can be achieved by a one-dimensional integral of the PDF. Simulations on a particular path planning task are carried out, with which the feasibility and effectiveness of the proposed methods are verified. In addition, the modeling method, which takes the factors related to position accuracy into account, can represent the contributions of these factors to motion reliability, and the model calculation method can achieve motion reliability evaluation with high precision and efficiency.

  7. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    Science.gov (United States)

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  8. Software reliability growth model for safety systems of nuclear reactor

    International Nuclear Information System (INIS)

    Thirugnana Murthy, D.; Murali, N.; Sridevi, T.; Satya Murty, S.A.V.; Velusamy, K.

    2014-01-01

    The demand for complex software systems has increased more rapidly than the ability to design, implement, test, and maintain them, and the reliability of software systems has become a major concern for our modern society. Software failures have impaired several high-visibility programs in the space, telecommunications, defense and health industries; besides the costs involved, they set back projects. This paper discusses the need for systematic approaches to measuring and assuring software reliability, ways of quantifying it, and its use for improvement and control of the software development and maintenance process, which consumes a major share of project development resources. It covers reliability models, with a focus on 'reliability growth'. This includes data collection on reliability, statistical estimation and prediction, metrics and attributes of product architecture, design, software development, and the operational environment. Besides its use for operational decisions like deployment, it includes guiding software architecture, development, testing and verification and validation. (author)
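    The classic Goel-Okumoto NHPP is one concrete example of the reliability-growth models this paper surveys (not necessarily the one the authors adopt); the sketch fits it to made-up cumulative fault counts.

```python
import numpy as np
from scipy.optimize import curve_fit

# Goel-Okumoto NHPP mean value function m(t) = a*(1 - exp(-b*t)):
# a = total expected faults, b = fault detection rate. Data are illustrative.
t = np.arange(1.0, 11.0)                                        # test weeks
cum_faults = np.array([12, 21, 28, 34, 38, 41, 44, 45, 47, 48], dtype=float)

def m(t, a, b):
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(m, t, cum_faults, p0=(50.0, 0.3))
print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.3f}")
print(f"faults remaining after week 10: {a - m(10.0, a, b):.1f}")
```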

  9. A Reliability Based Model for Wind Turbine Selection

    Directory of Open Access Journals (Sweden)

    A.K. Rajeevan

    2013-06-01

    Full Text Available A wind turbine generator's output at a specific site depends on many factors, particularly the cut-in, rated and cut-out wind speed parameters; hence power output varies from turbine to turbine. The objective of this paper is to develop a mathematical relationship between reliability and wind power generation. The analytical computation of monthly wind power is obtained from a Weibull statistical model using the cubic mean cube root of wind speed. The reliability calculation is based on failure probability analysis. There are many different types of wind turbines commercially available in the market. From a reliability point of view, to obtain optimum reliability in power generation, it is desirable to select a wind turbine generator which is best suited to a site. The mathematical relationship developed in this paper can be used for site-matching turbine selection from a reliability point of view.
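    A minimal sketch of the cubic-mean-cube-root calculation mentioned here, using the Weibull moment formula E[v^3] = c^3 Γ(1 + 3/k); it ignores the cut-in/rated/cut-out limits the abstract emphasizes, and all numbers are assumed site/turbine values.

```python
import math

k, c = 2.0, 7.5                    # Weibull shape and scale (m/s), assumed
rho, A, Cp = 1.225, 1810.0, 0.40   # air density, rotor area (m^2), power coeff.

# Cubic mean cube root of wind speed: (E[v^3])**(1/3), with
# E[v^3] = c**3 * Gamma(1 + 3/k) for a Weibull distribution.
v3 = c**3 * math.gamma(1.0 + 3.0 / k)
mean_power = 0.5 * rho * A * Cp * v3      # W, ignoring cut-in/rated/cut-out
hours = 30 * 24
print(f"cubic mean cube root: {v3 ** (1/3):.2f} m/s")
print(f"monthly energy: {mean_power * hours / 1e6:.1f} MWh")
```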

  10. System Reliability Analysis Capability and Surrogate Model Application in RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Huang, Dongli [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gleicher, Frederick [Idaho National Lab. (INL), Idaho Falls, ID (United States); Wang, Bei [Idaho National Lab. (INL), Idaho Falls, ID (United States); Adbel-Khalik, Hany S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pascucci, Valerio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-11-01

    This report collects the effort performed to improve the reliability analysis capabilities of the RAVEN code and explores new opportunities in the usage of surrogate models, by extending the current RAVEN capabilities to multi-physics surrogate models and the construction of surrogate models for high-dimensionality fields.

  11. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests, in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work, with characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.
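    A bare-bones version of the "classical" pseudo-failure-time analysis described: fit a linear degradation path per unit and extrapolate to a failure threshold. The data, threshold, and rates are synthetic, only loosely in the spirit of the train-wheel example.

```python
import numpy as np

rng = np.random.default_rng(4)

threshold = 77.0                      # failure threshold (illustrative)
times = np.arange(0.0, 10.0, 1.0)     # inspection epochs
n_units = 8
true_rates = rng.normal(1.5, 0.3, n_units)
paths = 100.0 - true_rates[:, None] * times \
        + rng.normal(0, 0.2, (n_units, times.size))

pseudo_lifetimes = []
for y in paths:
    slope, intercept = np.polyfit(times, y, 1)     # per-unit degradation rate
    pseudo_lifetimes.append((threshold - intercept) / slope)
pseudo_lifetimes = np.array(pseudo_lifetimes)

print("pseudo failure times:", np.round(pseudo_lifetimes, 1))
print(f"fraction surviving 14 time units: {(pseudo_lifetimes > 14).mean():.2f}")
```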

  12. A Novel OBDD-Based Reliability Evaluation Algorithm for Wireless Sensor Networks on the Multicast Model

    Directory of Open Access Journals (Sweden)

    Zongshuai Yan

    2015-01-01

    Full Text Available The two-terminal reliability calculation for wireless sensor networks (WSNs) is a #P-hard problem. The reliability calculation of WSNs on the multicast model suffers an even worse combinatorial explosion of node states than the calculation on the unicast model, yet many real WSNs require the multicast model to deliver information. This research first provides a formal definition of the WSN on the multicast model. Next, a symbolic OBDD_Multicast algorithm is proposed to evaluate the reliability of WSNs on the multicast model. Furthermore, our construction of OBDD_Multicast avoids the problem of invalid expansion, reducing the number of subnetworks by identifying the redundant paths of two adjacent nodes and s-t unconnected paths. Experiments show that OBDD_Multicast both reduces the complexity of the WSN reliability analysis and has a lower running time than Xing's OBDD (ordered binary decision diagram)-based algorithm.

  13. Estimation of some stochastic models used in reliability engineering

    International Nuclear Information System (INIS)

    Huovinen, T.

    1989-04-01

    This work aims to study the estimation of some stochastic models used in reliability engineering. In reliability engineering, continuous probability distributions have been used as models for the lifetimes of technical components. We consider here the following distributions: exponential, 2-mixture exponential, conditional exponential, Weibull, lognormal and gamma. The maximum likelihood method is used to estimate the distributions from observed data, which may be either complete or censored. We consider models based on homogeneous Poisson processes, such as the gamma-Poisson and lognormal-Poisson models, for the analysis of failure intensity, and also a beta-binomial model for the analysis of failure probability. The parameters of three of these models are estimated by the method of matching moments and, in the case of the gamma-Poisson and beta-binomial models, also by the maximum likelihood method. A great many mathematical and statistical problems that arise in reliability engineering can be solved by utilizing point processes. Here we consider the statistical analysis of non-homogeneous Poisson processes to describe the failure behaviour of a set of components with a Weibull intensity function, using the method of maximum likelihood to estimate the parameters of the Weibull model. A common cause failure can seriously reduce the reliability of a system; we consider a binomial failure rate (BFR) model, an application of marked point processes, for modelling common cause failures in a system. The parameters of the binomial failure rate model are estimated with the maximum likelihood method
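    One of the simplest cases covered by such a study, the exponential model with right censoring, has a closed-form maximum-likelihood estimate (number of failures divided by total time on test), sketched below with synthetic data.

```python
import numpy as np

rng = np.random.default_rng(5)

# MLE of an exponential failure rate from right-censored data:
# lambda_hat = (# failures) / (total time on test).
true_rate, censor_time = 0.02, 60.0
t = rng.exponential(1.0 / true_rate, size=100)
failed = t <= censor_time
observed = np.minimum(t, censor_time)      # censored observations

lam_hat = failed.sum() / observed.sum()
print(f"lambda_hat = {lam_hat:.4f} (true {true_rate})")
```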

  14. Effective stimuli for constructing reliable neuron models.

    Directory of Open Access Journals (Sweden)

    Shaul Druckmann

    2011-08-01

    Full Text Available The rich dynamical nature of neurons poses major conceptual and technical challenges for unraveling their nonlinear membrane properties. Traditionally, various current waveforms have been injected at the soma to probe neuron dynamics, but the rationale for selecting specific stimuli has never been rigorously justified. The present experimental and theoretical study proposes a novel framework, inspired by learning theory, for objectively selecting the stimuli that best unravel the neuron's dynamics. The efficacy of stimuli is assessed in terms of their ability to constrain the parameter space of biophysically detailed conductance-based models that faithfully replicate the neuron's dynamics as attested by their ability to generalize well to the neuron's response to novel experimental stimuli. We used this framework to evaluate a variety of stimuli in different types of cortical neurons, ages and animals. Despite their simplicity, a set of stimuli consisting of step and ramp current pulses outperforms synaptic-like noisy stimuli in revealing the dynamics of these neurons. The general framework that we propose paves a new way for defining, evaluating and standardizing effective electrical probing of neurons and will thus lay the foundation for a much deeper understanding of the electrical nature of these highly sophisticated and non-linear devices and of the neuronal networks that they compose.

  15. Learning reliable manipulation strategies without initial physical models

    Science.gov (United States)

    Christiansen, Alan D.; Mason, Matthew T.; Mitchell, Tom M.

    1990-01-01

    A description is given of a robot, possessing limited sensory and effectory capabilities but no initial model of the effects of its actions on the world, that acquires such a model through exploration, practice, and observation. By acquiring an increasingly correct model of its actions, it generates increasingly successful plans to achieve its goals. In an apparently nondeterministic world, achieving reliability requires the identification of reliable actions and a preference for using such actions. Furthermore, by selecting its training actions carefully, the robot can significantly improve its learning rate.

  16. Interactive Reliability Model for Whisker-toughened Ceramics

    Science.gov (United States)

    Palko, Joseph L.

    1993-01-01

    Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.

  17. A new model for reliability optimization of series-parallel systems with non-homogeneous components

    International Nuclear Information System (INIS)

    Feizabadi, Mohammad; Jahromi, Abdolhamid Eshraghniaye

    2017-01-01

    In discussions related to reliability optimization using redundancy allocation, one of the structures that has attracted the attention of many researchers is the series-parallel structure. In models previously presented for the reliability optimization of series-parallel systems, there is a restrictive assumption that all components of a subsystem must be homogeneous. This constraint limits system designers in selecting components and prevents achieving higher levels of reliability. In this paper, a new model is proposed for the reliability optimization of series-parallel systems which makes possible the use of non-homogeneous components in each subsystem. As a result of this flexibility, the process of supplying system components will be easier. To solve the proposed model, since the redundancy allocation problem (RAP) belongs to the NP-hard class of optimization problems, a genetic algorithm (GA) is developed. The computational results of the designed GA are indicative of the high performance of the proposed model in increasing system reliability and decreasing costs. - Highlights: • In this paper, a new model is proposed for reliability optimization of series-parallel systems. • In previous models, there is a restrictive assumption that all components of a subsystem must be homogeneous. • The presented model allows the subsystems' components to be non-homogeneous where conditions require it. • The computational results demonstrate the high performance of the proposed model in improving reliability and reducing costs.
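    A toy brute-force rendering of the relaxation the paper proposes: a single parallel subsystem whose slots may mix two component types under a cost budget. All values are illustrative, and the paper itself uses a GA rather than enumeration.

```python
from itertools import combinations_with_replacement
from math import prod

# Each component type: (reliability, cost); values are illustrative.
types = {"A": (0.90, 5.0), "B": (0.80, 3.0)}
budget = 12.0

best = None
for n in (1, 2, 3):                               # up to 3 parallel slots
    for combo in combinations_with_replacement(types, n):
        cost = sum(types[c][1] for c in combo)
        if cost > budget:
            continue
        # Parallel subsystem fails only if every chosen component fails.
        rel = 1.0 - prod(1.0 - types[c][0] for c in combo)
        if best is None or rel > best[0]:
            best = (rel, cost, combo)

rel, cost, combo = best
print(f"best mix {combo}: reliability {rel:.4f} at cost {cost}")
# Here the mixed design ('A','B','B') beats the best homogeneous design
# that fits the budget -- the benefit of allowing non-homogeneous parts.
```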

  18. Modeling and Forecasting (Un)Reliable Realized Covariances for More Reliable Financial Decisions

    DEFF Research Database (Denmark)

    Bollerslev, Tim; Patton, Andrew J.; Quaedvlieg, Rogier

    We propose a new framework for modeling and forecasting common financial risks based on (un)reliable realized covariance measures constructed from high-frequency intraday data. Our new approach explicitly incorporates the effect of measurement errors and time-varying attenuation biases into the c... turnover and statistically superior positions compared to existing procedures. Translating these statistical improvements into economic gains, we find that under empirically realistic assumptions a risk-averse investor would be willing to pay up to 170 basis points per year to shift to using the new class...

  19. Using LISREL to Evaluate Measurement Models and Scale Reliability.

    Science.gov (United States)

    Fleishman, John; Benson, Jeri

    1987-01-01

    The LISREL program was used to examine measurement model assumptions and to assess the reliability of the Coopersmith Self-Esteem Inventory for Children, Form B. Data on 722 third- to sixth-graders from over 70 schools in a large urban school district were used. The LISREL program assessed (1) the nature of the basic measurement model for the scale and (2) scale invariance across…

  20. Travel Time Reliability for Urban Networks : Modelling and Empirics

    NARCIS (Netherlands)

    Zheng, F.; Liu, Xiaobo; van Zuylen, H.J.; Li, Jie; Lu, Chao

    2017-01-01

    The importance of travel time reliability in traffic management, control, and network design has received a lot of attention in the past decade. In this paper, a network travel time distribution model based on the Johnson curve system is proposed. The model is applied to field travel time data...
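    A small sketch of fitting a Johnson (SU) curve to travel times with scipy and reading off a reliability-style percentile; the data are synthetic, not the field data used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Synthetic right-skewed travel times (seconds), standing in for field data.
tt = 300 + stats.lognorm.rvs(0.5, scale=120, size=2000, random_state=rng)

# Fit a Johnson SU distribution and extract a 95th-percentile "planning time".
params = stats.johnsonsu.fit(tt)
p95 = stats.johnsonsu.ppf(0.95, *params)
print(f"median {np.median(tt):.0f} s, Johnson-SU 95th percentile {p95:.0f} s")
```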

  1. A Multistate Model of Reliability of Farming Machinery

    Directory of Open Access Journals (Sweden)

    Durczak Karol

    2018-01-01

    Full Text Available The article describes a multistate model of the reliability of farming machinery as a deductive stochastic model of the process of changes in the technical conditions observed during operation. These conditions determine the capacity of machinery to fulfil its functions while keeping safety and maintaining acceptable costs of possible repairs. The theory of semi-Markov processes was used to solve the problem. After detailed analysis of the symptoms of damage to exemplary groups of farming machinery (rotary mowers, rotary harrows and harvesting presses), we proposed an optimal four-state reliability model to describe changes in technical conditions. In contrast to the classic reliability theory, which allows only two states of technical usability (either a machine is fit to function or not), we also allowed intermediate states, because not all types of damage affect the functionality of machinery. This approach increases the probability of technical usability of machinery and rationally delays the moment of premature repair.

  2. Reliability Analysis of Wireless Sensor Networks Using Markovian Model

    Directory of Open Access Journals (Sweden)

    Jin Zhu

    2012-01-01

    Full Text Available This paper investigates the reliability analysis of wireless sensor networks whose topology switches among possible connections governed by a Markovian chain. We give the quantitative relations between network topology, data acquisition rate, nodes' computation ability, and network reliability. By applying the Lyapunov method, sufficient conditions for network reliability are proposed for such topology-switching networks with constant or varying data acquisition rates. With these conditions satisfied, the quantity of data transported over a wireless network node will not exceed the node capacity, so that reliability is ensured. Our theoretical work helps to provide a deeper understanding of real-world wireless sensor networks and may find application in the fields of network design and topology control.

  3. Reliability Modeling of Microelectromechanical Systems Using Neural Networks

    Science.gov (United States)

    Perera, J. Sebastian

    2000-01-01

    Microelectromechanical systems (MEMS) are a broad and rapidly expanding field that is currently receiving a great deal of attention because of the potential to significantly improve the ability to sense, analyze, and control a variety of processes, such as heating and ventilation systems, automobiles, medicine, aeronautical flight, military surveillance, weather forecasting, and space exploration. MEMS are very small and are a blend of electrical and mechanical components, with electrical and mechanical systems on one chip. This research establishes reliability estimation and prediction for MEMS devices at the conceptual design phase using neural networks. At the conceptual design phase, before devices are built and tested, traditional methods of quantifying reliability are inadequate because the device is not in existence and cannot be tested to establish the reliability distributions. A novel approach using neural networks is created to predict the overall reliability of a MEMS device based on its components and each component's attributes. The methodology begins with collecting attribute data (fabrication process, physical specifications, operating environment, property characteristics, packaging, etc.) and reliability data for many types of microengines. The data are partitioned into training data (the majority) and validation data (the remainder). A neural network is applied to the training data (both attributes and reliability); the attributes become the system inputs and the reliability data (cycles to failure) the system output. After the neural network is trained with sufficient data, the validation data are used to verify that the neural network provides accurate reliability estimates. The reliability of a new proposed MEMS device can then be estimated by using the appropriate trained neural networks developed in this work.

  4. Exponentiated Weibull distribution approach based inflection S-shaped software reliability growth model

    Directory of Open Access Journals (Sweden)

    B.B. Sagar

    2016-09-01

    Full Text Available The aim of this paper is to estimate the number of defects in software and remove them successfully. The paper incorporates a Weibull distribution approach along with inflection S-shaped Software Reliability Growth Models (SRGMs); in this combination, a two-parameter Weibull distribution methodology is used. The Relative Prediction Error (RPE) is calculated to assess the validity of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, showing that the proposed software reliability growth model gives better estimates for defect removal. This paper thus presents a software reliability growth model that includes features of both the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, and provides help to researchers and software industries in developing highly reliable software products.

  5. Reliability of four models for clinical gait analysis.

    Science.gov (United States)

    Kainz, Hans; Graham, David; Edwards, Julie; Walsh, Henry P J; Maine, Sheanna; Boyd, Roslyn N; Lloyd, David G; Modenese, Luca; Carty, Christopher P

    2017-05-01

    Three-dimensional gait analysis (3DGA) has become a common clinical tool for treatment planning in children with cerebral palsy (CP). Many clinical gait laboratories use the conventional gait analysis model (e.g. the Plug-in-Gait model), which uses Direct Kinematics (DK) for joint kinematic calculations, whereas musculoskeletal models, mainly used for research, use Inverse Kinematics (IK). Musculoskeletal IK models have the advantage of enabling additional analyses which might improve clinical decision-making in children with CP. Before any new model can be used in a clinical setting, its reliability has to be evaluated and compared to a commonly used clinical gait model (e.g. the Plug-in-Gait model), which was the purpose of this study. Two testers performed 3DGA in eleven CP and seven typically developing participants on two occasions. Intra- and inter-tester standard deviations (SD) and standard errors of measurement (SEM) were used to compare the reliability of two DK models (Plug-in-Gait and a six degrees-of-freedom model solved using Vicon software) and two IK models (two modifications of 'gait2392' solved using OpenSim). All models showed good reliability (mean SEM of 3.0° over all analysed models and joint angles). Variations in joint kinetics were smaller in typically developing than in CP participants. The modified 'gait2392' model, which included all the joint rotations commonly reported in clinical 3DGA, showed reasonably reliable joint kinematic and kinetic estimates, and allows additional musculoskeletal analysis of surgically adjustable parameters, e.g. muscle-tendon lengths; it is, therefore, a suitable model for clinical gait analysis. Copyright © 2017. Published by Elsevier B.V.
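    For reference, the reliability statistics quoted in records like this can be computed as below: an ICC(2,1) from a two-way ANOVA decomposition and SEM = SD·sqrt(1 − ICC), one common definition. The angle data are invented, not from the study.

```python
import numpy as np

# Test-retest data: 2 sessions x 5 subjects (illustrative joint angles, deg).
x = np.array([[12.1, 33.5, 21.0, 8.4, 27.3],
              [13.0, 31.9, 22.4, 9.1, 26.0]])

k, n = x.shape                       # k sessions, n subjects
grand = x.mean()
ms_subj = k * ((x.mean(axis=0) - grand) ** 2).sum() / (n - 1)
ms_sess = n * ((x.mean(axis=1) - grand) ** 2).sum() / (k - 1)
resid = x - x.mean(axis=0) - x.mean(axis=1)[:, None] + grand
ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))

# Shrout & Fleiss ICC(2,1), then SEM from the overall SD.
icc = (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err
                            + k * (ms_sess - ms_err) / n)
sem = x.std(ddof=1) * np.sqrt(1.0 - icc)
print(f"ICC(2,1) = {icc:.2f}, SEM = {sem:.2f} deg")
```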

  6. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    Directory of Open Access Journals (Sweden)

    Hai An

    2016-08-01

    Full Text Available Aiming to resolve the problem that a variety of uncertainty variables coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article, together with a convergent solving method. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index definition is presented based on the random–fuzzy–interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and the index is solved using a modified limit-step-length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. In the end, a numerical example demonstrates that the hybrid reliability index is applicable to the wear reliability assessment of mechanisms, where truncated random variables, fuzzy random variables, and interval variables coexist. The demonstration also shows the good convergence of the iterative algorithm proposed in this article.

  7. A model for assessing human cognitive reliability in PRA studies

    International Nuclear Information System (INIS)

    Hannaman, G.W.; Spurgin, A.J.; Lukic, Y.

    1985-01-01

    This paper summarizes the status of a research project sponsored by EPRI as part of the Probabilistic Risk Assessment (PRA) technology improvement program and conducted by NUS Corporation to develop a model of Human Cognitive Reliability (HCR). The model was synthesized from features identified in a review of existing models. The model development was based on the hypothesis that the key factors affecting crew response times are separable. The inputs to the model consist of key parameters the values of which can be determined by PRA analysts for each accident situation being assessed. The output is a set of curves which represent the probability of control room crew non-response as a function of time for different conditions affecting their performance. The non-response probability is then a contributor to the overall non-success of operating crews to achieve a functional objective identified in the PRA study. Simulator data and some small scale tests were utilized to illustrate the calibration of interim HCR model coefficients for different types of cognitive processing since the data were sparse. The model can potentially help PRA analysts make human reliability assessments more explicit. The model incorporates concepts from psychological models of human cognitive behavior, information from current collections of human reliability data sources and crew response time data from simulator training exercises

  8. The reliability of the Adelaide in-shoe foot model.

    Science.gov (United States)

    Bishop, Chris; Hillier, Susan; Thewlis, Dominic

    2017-07-01

    Understanding the biomechanics of the foot is essential for many areas of research and clinical practice, such as orthotic interventions and footwear development. Despite the widespread attention paid to the biomechanics of the foot during gait, how the foot moves inside the shoe largely remains unknown. This study investigated the reliability of the Adelaide In-Shoe Foot Model, which was designed to quantify in-shoe foot kinematics and kinetics during walking. Intra-rater reliability was assessed in 30 participants over five walking trials whilst wearing shoes during two data collection sessions separated by one week. Sufficient reliability for use was interpreted as a coefficient of multiple correlation and intra-class correlation coefficient of >0.61. Inter-rater reliability was investigated separately in a second sample of 10 adults by two researchers with experience in applying markers for the purpose of motion analysis. The results indicated good consistency in waveform estimation for most kinematic and kinetic data, as well as good inter- and intra-rater reliability. The exceptions are the peak medial ground reaction force, the minimum abduction angle and the peak abduction/adduction external hindfoot joint moments, which showed less than acceptable repeatability. Based on our results, the Adelaide In-Shoe Foot Model can be used with confidence for 24 commonly measured biomechanical variables during shod walking. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Construction of a reliable model pyranometer for irradiance ...

    African Journals Online (AJOL)

    There is a problem of availability of sufficient and functional instruments for measuring solar radiation in Nigeria due to the high importation costs and maintenance. In order to alleviate this problem, the design, construction and testing of a reliable model pyranometer (RMP001) was done in Mubi, Adamawa State of Nigeria.

  10. Fuse Modeling for Reliability Study of Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad; Iannuzzo, Francesco; Blaabjerg, Frede

    2017-01-01

    This paper describes a comprehensive modeling approach for the reliability of fuses used in power electronic circuits. When fuses are subjected to current pulses, cyclic temperature stress is introduced into the fuse element and will wear out the component. Furthermore, the fuse may be used in a large v...

  11. A Radiographic Healing Classification for Osteochondritis Dissecans of the Knee Provides Good Interobserver Reliability

    Science.gov (United States)

    Ramski, David E.; Ganley, Theodore J.; Carey, James L.

    2017-01-01

    Background: Recent studies have examined radiographic factors associated with healing of osteochondritis dissecans (OCD) lesions of the knee. However, there is still no gold standard in determining the healing status of an OCD lesion. Purpose: We examined temporally associated patterns of healing to (1) evaluate the practicality of a classification system and (2) elucidate any associations between healing pattern and patient age, sex, lesion location, treatment type, and physeal patency. Study Design: Cohort study (diagnosis); Level of evidence, 3. Methods: We retrospectively screened 489 patients from 2006 to 2010 for a total of 41 consecutive knee OCD lesions that met inclusion criteria, including at least 3 consecutive radiographic series (mean patient age, 12.8 years; range, 7.8-17.1 years; mean follow-up, 75.1 weeks). Radiographs were arranged in sequential order for ratings by 2 orthopaedic sports medicine specialists. Healing patterns were rated as boundary resolution, increasing radiodensity of progeny fragment, combined, or not applicable. Repeat ratings were conducted 3 weeks later. Results: Patients were most commonly adolescent males aged 13 to 17 years, with a medial femoral condyle lesion that was treated operatively. Interobserver reliability of the healing classification was good (intraclass correlation coefficient, 0.67; 95% CI, 0.55-0.79). Boundary and radiodensity healing was observed for all ages, sexes, lesion locations, treatment types, and physeal patency states. Conclusion: This study evaluated a valuable radiographic paradigm—boundary resolution, increasing radiodensity of progeny fragment, or combined—for assessment of OCD lesion healing. The proposed system of healing classification demonstrated good inter- and intraobserver reliability. Healing patterns were not significantly associated with any particular age, sex, lesion location, treatment type, or physeal patency status. The development of a classification system for knee OCD may

  12. Modular reliability modeling of the TJNAF personnel safety system

    International Nuclear Information System (INIS)

    Cinnamon, J.; Mahoney, K.

    1997-01-01

    A reliability model for the Thomas Jefferson National Accelerator Facility (formerly CEBAF) personnel safety system has been developed. The model, which was implemented using an Excel spreadsheet, allows simulation of all or parts of the system. The modularity of the model's implementation allows rapid "what if" case studies to simulate changes in safety system parameters such as redundancy, diversity, and failure rates. Particular emphasis is given to the prediction of failure modes which would result in the failure of both of the redundant safety interlock systems. In addition to calculating the predicted reliability of the safety system, the model also calculates the availability of the same system. Such calculations allow the user to make tradeoff studies between reliability and availability, and to target resources to improving those parts of the system which would most benefit from redesign or upgrade. The model includes calculated data, manufacturer's data, and Jefferson Lab field data. This paper describes the model, the methods used, and a comparison of calculated to actual data for the Jefferson Lab personnel safety system. Examples are given to illustrate the model's utility and ease of use

  13. Modeling and Analysis of Component Faults and Reliability

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter

    2016-01-01

    This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.
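    Once minimal cut sets are available, a system failure probability follows by inclusion-exclusion over the cut sets (assuming independent faults); a small sketch with made-up components and cut sets, not the chapter's example:

```python
from itertools import combinations

# Component fault probabilities and minimal cut sets (illustrative).
p = {"pump": 1e-3, "valve": 5e-4, "sensor": 2e-3, "cpu": 1e-4}
cut_sets = [{"pump", "valve"}, {"sensor", "cpu"}, {"pump", "cpu"}]

def prob(events):
    """P(all components in the union of these cut sets fail), independence."""
    out = 1.0
    for c in set().union(*events):
        out *= p[c]
    return out

# Inclusion-exclusion over all non-empty subsets of cut sets.
q = 0.0
for r in range(1, len(cut_sets) + 1):
    for subset in combinations(cut_sets, r):
        q += (-1) ** (r + 1) * prob(subset)
print(f"system failure probability: {q:.3e}")
```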

  14. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. As the size of a software system grows and the number of faults detected during the testing phase becomes large, the change in the number of faults detected and removed through each debugging becomes small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of stochastic differential equations, performs comparatively better than the existing NHPP-based models.

  15. THE SIMULATION DIAGNOSTIC METHODS AND REGENERATION WAYS OF REINFORCED-CONCRETE CONSTRUCTIONS OF BRIDGES IN PROVIDING THEIR OPERATING RELIABILITY AND LONGEVITY

    Directory of Open Access Journals (Sweden)

    B. V. Savchinskiy

    2010-03-01

    Full Text Available On the basis of an analysis of existing diagnostic methods and regeneration techniques for reinforced-concrete bridge structures, recommendations are offered on the introduction of new, modern technologies for the renewal of reinforced-concrete bridge structures so as to provide their operating reliability and longevity.

  16. Performance and reliability model checking and model construction

    NARCIS (Netherlands)

    Hermanns, H.; Gnesi, Stefania; Schieferdecker, Ina; Rennoch, Axel

    2000-01-01

    Continuous-time Markov chains (CTMCs) are widely used to describe stochastic phenomena in many diverse areas. They are used to estimate performance and reliability characteristics of various nature, for instance to quantify throughputs of manufacturing systems, to locate bottlenecks in communication

  17. Maintenance overtime policies in reliability theory models with random working cycles

    CERN Document Server

    Nakagawa, Toshio

    2015-01-01

    This book introduces a new concept of replacement in maintenance and reliability theory. Replacement overtime, where replacement occurs at the first completion of a working cycle over a planned time, is a new research topic in maintenance theory and also serves to provide a fresh optimization technique in reliability engineering. In comparing replacement overtime with standard and random replacement techniques theoretically and numerically, 'Maintenance Overtime Policies in Reliability Theory' highlights the key benefits to be gained by adopting this new approach and shows how they can be applied to inspection policies, parallel systems and cumulative damage models. Utilizing the latest research in replacement overtime by internationally recognized experts, readers are introduced to new topics and methods, and learn how to practically apply this knowledge to actual reliability models. This book will serve as an essential guide to a new subject of study for graduate students and researchers and also provides a...

  18. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    ... conservativeness level, the conservative probability of failure obtained from Section 4 must be maintained. The mathematical formulation of the conservative model ... A PDF and a probability of failure are selected from these predicted output PDFs at a user-specified conservativeness level for validation.

  19. Advanced techniques in reliability model representation and solution

    Science.gov (United States)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.

  20. Predicting Flow Breakdown Probability and Duration in Stochastic Network Models: Impact on Travel Time Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston

    2011-01-01

    This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model, by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that the duration can be characterized by a hazard model. By generating random flow breakdowns at various levels and capturing the traffic characteristics at the onset of the breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
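    A sketch of the two model ingredients with assumed functional forms: a logistic breakdown probability in flow rate and a Weibull-hazard breakdown duration. Parameters are illustrative, not calibrated values from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

def p_breakdown(flow_vph, f50=1900.0, slope=0.01):
    # Assumed logistic form: breakdown probability rises with flow rate.
    return 1.0 / (1.0 + np.exp(-slope * (flow_vph - f50)))

def sample_duration(shape=1.4, scale=12.0):
    # Breakdown duration (minutes) from a Weibull hazard model (assumed).
    return scale * rng.weibull(shape)

flows = rng.normal(1850.0, 150.0, size=5)     # 5 simulation intervals
for f in flows:
    if rng.random() < p_breakdown(f):
        print(f"flow {f:,.0f} vph -> breakdown, {sample_duration():.1f} min")
    else:
        print(f"flow {f:,.0f} vph -> no breakdown")
```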

  1. Reliability assessment of competing risks with generalized mixed shock models

    International Nuclear Information System (INIS)

    Rafiee, Koosha; Feng, Qianmei; Coit, David W.

    2017-01-01

    This paper investigates reliability modeling for systems subject to dependent competing risks considering the impact from a new generalized mixed shock model. Two dependent competing risks are soft failure due to a degradation process, and hard failure due to random shocks. The shock process contains fatal shocks that can cause hard failure instantaneously, and nonfatal shocks that impact the system in three different ways: 1) damaging the unit by immediately increasing the degradation level, 2) speeding up the deterioration by accelerating the degradation rate, and 3) weakening the unit strength by reducing the hard failure threshold. While the first impact from nonfatal shocks comes from each individual shock, the other two impacts are realized when the condition for a new generalized mixed shock model is satisfied. Unlike most existing mixed shock models that consider a combination of two shock patterns, our new generalized mixed shock model includes three classic shock patterns. According to the proposed generalized mixed shock model, the degradation rate and the hard failure threshold can simultaneously shift multiple times, whenever the condition for one of these three shock patterns is satisfied. An example using micro-electro-mechanical systems devices illustrates the effectiveness of the proposed approach with sensitivity analysis. - Highlights: • A rich reliability model for systems subject to dependent failures is proposed. • The degradation rate and the hard failure threshold can shift simultaneously. • The shift is triggered by a new generalized mixed shock model. • The shift can occur multiple times under the generalized mixed shock model.
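    A simplified Monte Carlo rendering of the competing-risks structure described (not the paper's full model): linear degradation, Poisson shocks, and a single trigger after m0 nonfatal shocks that both accelerates degradation and lowers the hard-failure threshold. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(horizon=200.0):
    rate, soft_thr, hard_thr = 0.05, 10.0, 3.0   # assumed parameters
    m0, shocks, x, t = 4, 0, 0.0, 0.0
    while True:
        dt = rng.exponential(1.0 / 0.2)          # next shock (rate 0.2 / unit)
        t_soft = (soft_thr - x) / rate           # time for degradation alone
        if t + min(dt, t_soft) > horizon:
            return "survived"
        if t_soft < dt:                          # degradation wins the race
            return "soft"
        t += dt
        x += rate * dt
        w = rng.exponential(1.0)                 # shock magnitude
        if w > hard_thr:                         # shock exceeds strength
            return "hard"
        x += 0.2 * w                             # damage from nonfatal shock
        if x > soft_thr:
            return "soft"
        shocks += 1
        if shocks == m0:                         # generalized-shock trigger:
            rate *= 1.5                          # faster degradation and
            hard_thr *= 0.8                      # weaker hard-failure threshold

outcomes = [simulate() for _ in range(5000)]
for kind in ("soft", "hard", "survived"):
    print(kind, f"{outcomes.count(kind) / len(outcomes):.3f}")
```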

  2. Markerless motion capture can provide reliable 3D gait kinematics in the sagittal and frontal plane

    DEFF Research Database (Denmark)

    Sandau, Martin; Koblauch, Henrik; Moeslund, Thomas B.

    2014-01-01

    specific articulated model was generated with three rotational and three translational degrees of freedom for each limb segment and without any constraints to the range of motion. This approach was tested on 3D gait analysis and compared to a marker based method. The experiment included ten healthy...

  3. Using satellites to provide reliable daily water use estimates at field scales

    Science.gov (United States)

    The ability to accurately map daily crop water use over agricultural landscapes at scales resolving individual farm fields is a significant challenge, but it is increasingly relevant in a future scenario of reduced water availability and increased atmospheric demand. Remote sensing provides a robust...

  4. Testing the reliability of ice-cream cone model

    Science.gov (United States)

    Pan, Zonghao; Shen, Chenglong; Wang, Chuanbing; Liu, Kai; Xue, Xianghui; Wang, Yuming; Wang, Shui

    2015-04-01

    The properties of Coronal Mass Ejections (CMEs) are important not only to the physics of the events themselves but also to space-weather prediction. Several models (such as the cone model and the GCS model) have been proposed to remove the projection effects from the properties observed by spacecraft. From SOHO/LASCO observations, we obtain the 'real' 3D parameters of all the FFHCMEs (front-side full halo Coronal Mass Ejections) within the 24th solar cycle up to July 2012 using the ice-cream cone model. Since methods that derive 3D parameters from multi-satellite, multi-angle CME observations have higher accuracy, we use the GCS model to obtain the real propagation parameters of these CMEs in 3D space and compare the results with those from the ice-cream cone model. We then discuss the reliability of the ice-cream cone model.

  5. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

    Part of the U.S. Department of Energy’s (DOE’s) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  6. Creation and Reliability Analysis of Vehicle Dynamic Weighing Model

    Directory of Open Access Journals (Sweden)

    Zhi-Ling XU

    2014-08-01

    Full Text Available In this paper, the portable axle-load meter of a dynamic weighing system is modeled in ADAMS, and the weighing process is simulated while controlling a single variable, yielding simulation weighing data at different speeds and weights. In parallel, a portable weighing system with the same parameters is used to obtain actual measurements. Comparative analysis of the simulation and measured results under the same conditions shows that, at 30 km/h or less, the simulation and measured values differ by no more than 5 %. This not only verifies the reliability of the dynamic weighing model, but also makes it possible to improve the efficiency of algorithm studies by simulating with the dynamic weighing model.

  7. Providing Reliability Services through Demand Response: A Preliminary Evaluation of the Demand Response Capabilities of Alcoa Inc.

    Energy Technology Data Exchange (ETDEWEB)

    Starke, Michael R [ORNL; Kirby, Brendan J [ORNL; Kueck, John D [ORNL; Todd, Duane [Alcoa; Caulfield, Michael [Alcoa; Helms, Brian [Alcoa

    2009-02-01

    Demand response is the largest underutilized reliability resource in North America. Historic demand response programs have focused on reducing overall electricity consumption (increasing efficiency) and shaving peaks but have not typically been used for immediate reliability response. Many of these programs have been successful, but demand response remains a limited resource. The Federal Energy Regulatory Commission (FERC) report, 'Assessment of Demand Response and Advanced Metering' (FERC 2006), found that only five percent of customers are on some form of demand response program. Collectively they represent an estimated 37,000 MW of response potential. These programs reduce overall energy consumption, lower greenhouse gas emissions by allowing fossil-fuel generators to operate at increased efficiency, and reduce stress on the power system during periods of peak loading. As the country continues to restructure energy markets with sophisticated marginal cost models that attempt to minimize total energy costs, the ability of demand response to create meaningful shifts in the supply and demand equations is critical to creating a sustainable and balanced economic response to energy issues. Restructured energy market prices are set by the cost of the next incremental unit of energy, so that as additional generation is brought into the market, the cost for the entire market increases. The benefit of demand response is that it reduces overall demand and shifts the entire market to a lower pricing level. This can be very effective in mitigating price volatility or scarcity pricing as the power system responds to changing demand schedules, loss of large generators, or loss of transmission. As a global producer of alumina, primary aluminum, and fabricated aluminum products, Alcoa Inc. has the capability to provide demand response services through its manufacturing facilities and, uniquely, through its aluminum smelting facilities. For a typical aluminum smelter

  8. Fuzzy Goal Programming Approach in Selective Maintenance Reliability Model

    Directory of Open Access Journals (Sweden)

    Neha Gupta

    2013-12-01

    Full Text Available In the present paper, we consider the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and discuss two different models. In the first model the reliabilities of the subsystems are considered as different objectives. In the second model the cost and the time spent on repairing the components are considered as two different objectives. The two models are formulated as multi-objective nonlinear programming problems (MONLPP), and a fuzzy goal programming method is used to work out the compromise allocation in the multi-objective selective maintenance reliability model: we define the membership function of each objective, transform the membership functions into equivalent linear membership functions by first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.

  9. DJFS: Providing Highly Reliable and High‐Performance File System with Small‐Sized NVRAM

    Directory of Open Access Journals (Sweden)

    Junghoon Kim

    2017-11-01

    Full Text Available File systems and applications try to implement their own update protocols to guarantee data consistency, which is one of the most crucial aspects of computing systems. However, we found that storage devices are substantially under-utilized when preserving data consistency because they generate massive storage write traffic with many disk cache flush operations and force-unit-access (FUA) commands. In this paper, we present DJFS (Delta-Journaling File System), which provides both a high level of performance and data consistency for different applications. We made three technical contributions to achieve our goal. First, to remove all storage accesses with disk cache flush operations and FUA commands, DJFS uses small-sized NVRAM for a file system journal. Second, to reduce the access latency and space requirements of NVRAM, DJFS journals compressed deltas of the modified blocks. Finally, to relieve explicit checkpointing overhead, DJFS aggressively reflects the checkpoint transactions to the file system area in units of a specified region. Our evaluation on the TPC-C SQLite benchmark shows that, using our novel optimization schemes, DJFS outperforms Ext4 by up to 64.2 times with only 128 MB of NVRAM.

  10. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGM) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and fits actual software failure data well. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. A numerical experiment investigates the fitting ability of the SRGMs with normal distribution on 16 failure time data sets collected in real software projects
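    As an illustration of the model class (not the authors' EM algorithm), the sketch below fits an NHPP-type SRGM whose mean value function is a scaled normal CDF, m(t) = ω·Φ((t − μ)/σ), by direct maximum likelihood on synthetic failure times.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# NHPP-SRGM with normal mean value function m(t) = omega * Phi((t-mu)/sigma).
# Direct MLE on synthetic event times; the paper itself develops an EM
# algorithm, which this sketch does not reproduce.
rng = np.random.default_rng(0)
times = np.sort(rng.normal(50.0, 15.0, size=40))   # synthetic failure times
T = times.max() * 1.1                              # observation horizon

def neg_log_lik(theta):
    omega, mu, sigma = theta
    if omega <= 0 or sigma <= 0:
        return np.inf
    lam = omega * norm.pdf((times - mu) / sigma) / sigma   # intensity at events
    m_T = omega * norm.cdf((T - mu) / sigma)               # expected count by T
    return -(np.sum(np.log(lam)) - m_T)                    # NHPP log-likelihood

res = minimize(neg_log_lik, x0=[len(times), times.mean(), times.std()],
               method="Nelder-Mead")
omega, mu, sigma = res.x
print(f"omega={omega:.1f}, mu={mu:.1f}, sigma={sigma:.1f}")
```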

  11. Reliability Assessment of IGBT Modules Modeled as Systems with Correlated Components

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2013-01-01

    configuration. The estimated system reliability by the proposed method is a conservative estimate. Application of the suggested method could be extended for reliability estimation of systems composing of welding joints, bolts, bearings, etc. The reliability model incorporates the correlation between...

  12. Power Electronic Packaging Design, Assembly Process, Reliability and Modeling

    CERN Document Server

    Liu, Yong

    2012-01-01

    Power Electronic Packaging presents an in-depth overview of power electronic packaging design, assembly, reliability and modeling. Since there is a drastic difference between IC fabrication and power electronic packaging, the book systematically introduces typical power electronic packaging design, assembly, reliability and failure analysis and material selection so readers can clearly understand each task's unique characteristics. Power electronic packaging is one of the fastest growing segments in the power electronic industry, due to the rapid growth of power integrated circuit (IC) fabrication, especially for applications like portable, consumer, home, computing and automotive electronics. This book also covers how advances in both semiconductor content and power advanced package design have helped cause advances in power device capability in recent years. The author extrapolates the most recent trends in the book's areas of focus to highlight where further improvement in materials and techniques can d...

  13. Reliability analysis and prediction of mixed mode load using Markov Chain Model

    International Nuclear Information System (INIS)

    Nikabdullah, N.; Singh, S. S. K.; Alebrahim, R.; Azizi, M. A.; K, Elwaleed A.; Noorani, M. S. M.

    2014-01-01

    The aim of this paper is to present the reliability analysis and prediction of mixed-mode loading using a simple two-state Markov chain model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analyzing and measuring failure, so as to increase design life, eliminate or reduce the likelihood of failures, and reduce safety risk. The mechanical failures of the crankshaft are due to high bending and torsional stress concentrations arising from high-cycle rotating bending and torsional stress. The Markov chain was used to model the two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is more severe than torsional stress; therefore the failure probability for the bending state is higher than for the torsion state. A statistical comparison between the developed Markov chain model and field data was made to observe the percentage error. The reliability analysis and prediction derived from the Markov chain model are illustrated with the Weibull probability and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov chain model generates data close to the field data with a minimal percentage error and, for practical application, the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed-mode loading.
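    A two-state Markov chain of this kind is easy to sketch; the transition probabilities and per-cycle failure probabilities below are invented for illustration, not the paper's field-calibrated values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-state Markov chain sketch (state 0: bending-dominated loading,
# state 1: torsion-dominated loading). All probabilities are assumed.
P = np.array([[0.7, 0.3],        # bending -> {bending, torsion}
              [0.4, 0.6]])       # torsion -> {bending, torsion}
P_FAIL = np.array([2e-4, 5e-5])  # per-cycle failure probability per state

def cycles_to_failure() -> int:
    state, n = 0, 0
    while True:
        n += 1
        if rng.random() < P_FAIL[state]:
            return n
        state = rng.choice(2, p=P[state])

lives = np.array([cycles_to_failure() for _ in range(200)])
print(f"median life: {np.median(lives):,.0f} cycles")
N = 5_000
print(f"empirical reliability R({N:,}) = {(lives > N).mean():.3f}")
```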

  14. Reliable software systems via chains of object models with provably correct behavior

    International Nuclear Information System (INIS)

    Yakhnis, A.; Yakhnis, V.

    1996-01-01

    This work addresses specification and design of reliable safety-critical systems, such as nuclear reactor control systems. Reliability concerns are addressed in complementary fashion by different fields. Reliability engineers build software reliability models, etc. Safety engineers focus on prevention of potential harmful effects of systems on the environment. Software/hardware correctness engineers focus on production of reliable systems on the basis of mathematical proofs. The authors think that correctness may be a crucial guiding issue in the development of reliable safety-critical systems. However, purely formal approaches are not adequate for the task, because they neglect the connection with the informal customer requirements. They address this as follows. First, on the basis of the requirements, they build a model of the system interactions with the environment, where the system is viewed as a black box. They will provide foundations for automated tools which will (a) demonstrate to the customer that all of the scenarios of system behavior are presented in the model, (b) uncover scenarios not present in the requirements, and (c) uncover inconsistent scenarios. The developers will work with the customer until the black box model no longer possesses scenarios of types (b) and (c) above. Second, the authors will build a chain of several increasingly detailed models, where the first model is the black box model and the last model serves to automatically generate proved executable code. The behavior of each model will be proved to conform to the behavior of the previous one. They build each model as a cluster of interactive concurrent objects, thus allowing both top-down and bottom-up development

  15. Stochastic reliability and maintenance modeling essays in honor of Professor Shunji Osaki on his 70th birthday

    CERN Document Server

    Nakagawa, Toshio

    2013-01-01

    In honor of the work of Professor Shunji Osaki, Stochastic Reliability and Maintenance Modeling provides a comprehensive study of the legacy of and ongoing research in stochastic reliability and maintenance modeling. Covering associated application areas such as dependable computing, performance evaluation, software engineering, and communication engineering, distinguished researchers review and build on Professor Shunji Osaki's contributions over the last four decades. Fundamental yet significant research results are presented and discussed clearly alongside new ideas and topics on stochastic reliability and maintenance modeling to inspire future research. Across 15 chapters readers gain the knowledge and understanding to apply reliability and maintenance theory to computer and communication systems. Stochastic Reliability and Maintenance Modeling is ideal for graduate students and researchers in reliability engineering, and workers, managers and engineers engaged in computer, maintenance and management wo...

  16. Stochastic process corrosion growth models for pipeline reliability

    International Nuclear Information System (INIS)

    Bazán, Felipe Alexander Vargas; Beck, André Teófilo

    2013-01-01

    Highlights: •Novel non-linear stochastic process corrosion growth model is proposed. •Corrosion rate modeled as random Poisson pulses. •Time to corrosion initiation and inherent time-variability properly represented. •Continuous corrosion growth histories obtained. •Model is shown to precisely fit actual corrosion data at two time points. -- Abstract: Linear random variable corrosion models are extensively employed in reliability analysis of pipelines. However, linear models grossly neglect well-known characteristics of the corrosion process. Herein, a non-linear model is proposed, where corrosion rate is represented as a Poisson square wave process. The resulting model represents inherent time-variability of corrosion growth, produces continuous growth and leads to mean growth at less-than-one power of time. Different corrosion models are adjusted to the same set of actual corrosion data for two inspections. The proposed non-linear random process corrosion growth model leads to the best fit to the data, while better representing problem physics
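    The Poisson square wave idea, a corrosion rate that jumps to a new random level at Poisson arrival times and stays constant in between, can be simulated directly; all parameters below are illustrative, not fitted to pipeline inspection data, and the assumed initiation time is invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Corrosion rate as a Poisson square wave: the rate jumps to a new random
# level at Poisson-distributed arrival times and is constant in between.
T_INIT    = 2.0    # years to corrosion initiation (assumed)
T_END     = 30.0   # service life horizon (years)
ARRIVAL   = 0.5    # pulse arrivals per year (assumed)
MEAN_RATE = 0.08   # mean corrosion rate per pulse (mm/yr, assumed)

def final_depth() -> float:
    t, depth = T_INIT, 0.0
    rate = rng.exponential(MEAN_RATE)
    while t < T_END:
        dt = min(rng.exponential(1.0 / ARRIVAL), T_END - t)
        depth += rate * dt                  # continuous growth at current rate
        t += dt
        rate = rng.exponential(MEAN_RATE)   # new random rate level
    return depth

depths = np.array([final_depth() for _ in range(5000)])
print(f"mean depth at 30 yr: {depths.mean():.2f} mm, "
      f"95th percentile: {np.percentile(depths, 95):.2f} mm")
```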

  17. Recent progress in human reliability models for nuclear power safety

    International Nuclear Information System (INIS)

    Bersini, U.; Devooght, J.; Smidts, C.

    1988-01-01

    The importance of the human factor in the safety of nuclear power plants hardly needs to be stressed after the Three Mile Island and Chernobyl accidents. Following TMI, such progress was made that Chernobyl did not reveal significant faults in the design or operation of Western nuclear power plants. Post-TMI progress concerns: design of control rooms, development of simulators for training operators, use of computer-aided diagnostics, a better understanding of procedural safety, the collection of human error data, etc. We shall concentrate here on the specific point of modelling human errors for incorporation in the standard tools of reliability and safety engineering (e.g. fault trees). The Rasmussen report (WASH 1400) has already included human error in the analysis of fault and event trees, and since then new models have been developed and tested. Human reliability methods, which first appeared in the early 1980s, model operator behavior during routine tasks and quantify the operator's error probability. Here three of these methods are briefly described: THERP, SLIM and MAPPS. 17 refs

  18. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the
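    The multicollinearity problem described here is easy to reproduce: the sketch below builds two strongly correlated synthetic metrics (hypothetical stand-ins for LOC and statement count) and shows how sensitive the regression coefficients become to a small change in the data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two highly correlated synthetic software metrics: lines of code and
# statement count, plus a fault count driven by program size.
n = 60
loc    = rng.normal(1000, 300, n)
stmts  = 0.8 * loc + rng.normal(0, 20, n)   # near-linear in LOC
faults = 0.01 * loc + rng.normal(0, 3, n)

X = np.column_stack([np.ones(n), loc, stmts])
beta = np.linalg.lstsq(X, faults, rcond=None)[0]
print("correlation(loc, stmts) =", f"{np.corrcoef(loc, stmts)[0, 1]:.3f}")
print("coefficients [const, loc, stmts] =", np.round(beta, 4))

# Refit after perturbing a single observation: with near-collinear
# predictors the individual coefficients can swing noticeably.
faults2 = faults.copy()
faults2[0] += 5.0
beta2 = np.linalg.lstsq(X, faults2, rcond=None)[0]
print("perturbed coefficients           =", np.round(beta2, 4))
```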

  19. Reliable low precision simulations in land surface models

    Science.gov (United States)

    Dawson, Andrew; Düben, Peter D.; MacLeod, David A.; Palmer, Tim N.

    2017-12-01

    Weather and climate models must continue to increase in both resolution and complexity in order that forecasts become more accurate and reliable. Moving to lower numerical precision may be an essential tool for coping with the demand for ever increasing model complexity in addition to increasing computing resources. However, there have been some concerns in the weather and climate modelling community over the suitability of lower precision for climate models, particularly for representing processes that change very slowly over long time-scales. These processes are difficult to represent using low precision due to time increments being systematically rounded to zero. Idealised simulations are used to demonstrate that a model of deep soil heat diffusion that fails when run in single precision can be modified to work correctly using low precision, by splitting up the model into a small higher precision part and a low precision part. This strategy retains the computational benefits of reduced precision whilst preserving accuracy. This same technique is also applied to a full complexity land surface model, resulting in rounding errors that are significantly smaller than initial condition and parameter uncertainties. Although lower precision will present some problems for the weather and climate modelling community, many of the problems can likely be overcome using a straightforward and physically motivated application of reduced precision.
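    The failure mode described, tiny increments rounding to zero against a large state, and the proposed remedy, keeping only a small part of the model in higher precision, can be reproduced generically (this is an illustration, not the paper's soil scheme):

```python
import numpy as np

# A slowly varying process: many tiny increments added to a large state.
# In float16 each increment rounds away relative to the state; keeping
# only the accumulator in float64 recovers the correct answer.
state0 = np.float16(300.0)   # e.g. a deep-soil temperature (K), assumed
dT = 1e-4                    # tiny per-step increment
steps = 100_000

low = np.float16(state0)
for _ in range(steps):
    low = np.float16(low + np.float16(dT))   # increment rounds to zero

anomaly = np.float64(0.0)                    # small high-precision part
for _ in range(steps):
    anomaly += dT                            # accumulate in float64
mixed = float(state0) + anomaly

print(f"float16 only : {float(low):.3f}")    # stays ~300.0
print(f"mixed scheme : {mixed:.3f}")         # ~310.0, the correct result
```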

  20. On New Cautious Structural Reliability Models in the Framework of imprecise Probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev V.; Kozine, Igor

    2010-01-01

    Uncertainty of parameters in engineering design has been modeled in different frameworks such as interval analysis, fuzzy set and possibility theories, random set theory and imprecise probability theory. The authors of this paper for many years have been developing new imprecise reliability......? Developing models enabling us to answer these two questions has been the focus of the new research, the results of which are described in the paper. In this paper we describe new models for computing structural reliability based on measurements of values of stress and strength, taking account of the fact...... that the number of observations may be rather small. The approach to developing the models is based on using the imprecise Bayesian inference models (Walley 1996). These models provide a rich supply of coherent imprecise inferences that are expressed in terms of posterior upper and lower prob...

  1. Reliable critical sized defect rodent model for cleft palate research.

    Science.gov (United States)

    Mostafa, Nesrine Z; Doschak, Michael R; Major, Paul W; Talwar, Reena

    2014-12-01

    Suitable animal models are necessary to test the efficacy of new bone grafting therapies in cleft palate surgery. Rodent models of cleft palate are available but have limitations. This study compared and modified mid-palate cleft (MPC) and alveolar cleft (AC) models to determine the most reliable and reproducible model for bone grafting studies. The published MPC model (9 × 5 × 3 mm³) lacked sufficient information for the rats tested. Our initial studies utilizing the AC model (7 × 4 × 3 mm³) in 8- and 16-week-old Sprague Dawley (SD) rats revealed injury to adjacent structures. After comparing anteroposterior and transverse maxillary dimensions in 16-week-old SD and Wistar rats, virtual planning was performed to modify the MPC and AC defect dimensions, taking the adjacent structures into consideration. Modified MPC (7 × 2.5 × 1 mm³) and AC (5 × 2.5 × 1 mm³) defects were employed in 16-week-old Wistar rats and healing was monitored by micro-computed tomography and histology. Maxillary dimensions in SD and Wistar rats were not significantly different. Preoperative virtual planning enhanced postoperative surgical outcomes. Bone healing occurred at the defect margin, leaving a central bone void and confirming the critical-size nature of the modified MPC and AC defects. The presented modifications to the MPC and AC models created clinically relevant and reproducible defects. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  2. Reliability of Bolton analysis evaluation in tridimensional virtual models.

    Science.gov (United States)

    Brandão, Marianna Mendonca; Sobral, Marcio Costal; Vogel, Carlos Jorge

    2015-10-01

    The present study aimed at evaluating the reliability of Bolton analysis in tridimensional virtual models, comparing it with the manual method carried out with dental casts. The present investigation was performed using 56 pairs of dental casts produced from the dental arches of patients in perfect conditions and randomly selected from Universidade Federal da Bahia, School of Dentistry, Orthodontics Postgraduate Program. Manual measurements were obtained with the aid of a digital Cen-Tech 4″® caliper (Harbor Freight Tools, Calabasas, CA, USA). Subsequently, samples were digitized on a 3Shape® R-700T scanner (Copenhagen, Denmark) and digital measures were obtained by Ortho Analyzer software. Data were subject to statistical analysis and results revealed that there were no statistically significant differences between measurements, with p-values equal to p = 0.173 and p = 0.239 for total and anterior proportions, respectively. Based on these findings, it is possible to deduce that Bolton analysis performed on tridimensional virtual models is as reliable as measurements obtained from dental casts, with satisfactory agreement.

  3. Reliability of Bolton analysis evaluation in tridimensional virtual models

    Science.gov (United States)

    Brandão, Marianna Mendonca; Sobral, Marcio Costal; Vogel, Carlos Jorge

    2015-01-01

    Objective: The present study aimed at evaluating the reliability of Bolton analysis in tridimensional virtual models, comparing it with the manual method carried out with dental casts. Methods: The present investigation was performed using 56 pairs of dental casts produced from the dental arches of patients in perfect conditions and randomly selected from Universidade Federal da Bahia, School of Dentistry, Orthodontics Postgraduate Program. Manual measurements were obtained with the aid of a digital Cen-Tech 4″® caliper (Harbor Freight Tools, Calabasas, CA, USA). Subsequently, samples were digitized on a 3Shape® R-700T scanner (Copenhagen, Denmark) and digital measures were obtained by Ortho Analyzer software. Results: Data were subject to statistical analysis and results revealed that there were no statistically significant differences between measurements, with p-values equal to p = 0.173 and p = 0.239 for total and anterior proportions, respectively. Conclusion: Based on these findings, it is possible to deduce that Bolton analysis performed on tridimensional virtual models is as reliable as measurements obtained from dental casts, with satisfactory agreement. PMID:26560824

  4. Reliability of Bolton analysis evaluation in tridimensional virtual models

    Directory of Open Access Journals (Sweden)

    Marianna Mendonca Brandão

    2015-10-01

    Full Text Available Objective: The present study aimed at evaluating the reliability of Bolton analysis in tridimensional virtual models, comparing it with the manual method carried out with dental casts. Methods: The present investigation was performed using 56 pairs of dental casts produced from the dental arches of patients in perfect conditions and randomly selected from Universidade Federal da Bahia, School of Dentistry, Orthodontics Postgraduate Program. Manual measurements were obtained with the aid of a digital Cen-Tech 4″® caliper (Harbor Freight Tools, Calabasas, CA, USA). Subsequently, samples were digitized on a 3Shape® R-700T scanner (Copenhagen, Denmark) and digital measures were obtained by Ortho Analyzer software. Results: Data were subject to statistical analysis and results revealed that there were no statistically significant differences between measurements, with p-values equal to p = 0.173 and p = 0.239 for total and anterior proportions, respectively. Conclusion: Based on these findings, it is possible to deduce that Bolton analysis performed on tridimensional virtual models is as reliable as measurements obtained from dental casts, with satisfactory agreement.

  5. Reliability model for offshore wind farms; Paalidelighedsmodel for havvindmoelleparker

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, P.; Lundtang Paulsen, J.; Lybech Toegersen, M.; Krogh, T. [Risoe National Lab., Roskilde (Denmark); Raben, N.; Donovan, M.H.; Joergensen, L. [SEAS (Denmark); Winther-Jensen, M.

    2002-05-01

    A method for the prediction of the mean availability of an offshore wind farm has been developed. The factors comprised are the reliability of the single turbine, the strategy for preventive maintenance, the climate, the number of repair teams, and the type of boats available for transport. The mean availability is defined as the sum of the fractions of time where each turbine is available for production. The project has been carried out together with SEAS Wind Technique, and their site Roedsand has been chosen as the example for the work. A climate model has been created based on actual site measurements. The prediction of the availability is done with a Monte Carlo simulation. Software was developed for the preparation of the climate model from weather measurements as well as for the Monte Carlo simulation. Three examples have been simulated, one with guessed parameters, and the other two with parameters closer to the Roedsand case. (au)
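    A Monte Carlo availability estimate of this general kind can be sketched as follows; the failure rate, repair time, and weather-access probability are invented, and the real model's climate series and repair-team/boat logistics are collapsed into a single daily access probability.

```python
import numpy as np

rng = np.random.default_rng(11)

# Monte Carlo sketch of offshore wind farm availability. Illustrative
# parameters only; the real model uses a measured climate series and
# explicit repair-team/boat constraints.
N_TURBINES = 72
MTBF_H     = 1900.0   # mean time between failures per turbine (h), assumed
REPAIR_H   = 24.0     # mean active repair time (h), assumed
P_ACCESS   = 0.6      # prob. a given day offers a weather window, assumed
HOURS      = 8760.0   # one year

def turbine_downtime() -> float:
    t, down = 0.0, 0.0
    while True:
        t += rng.exponential(MTBF_H)          # next failure
        if t >= HOURS:
            return down
        wait = 0.0
        while rng.random() > P_ACCESS:        # wait full days for access
            wait += 24.0
        rep = wait + rng.exponential(REPAIR_H)
        down += rep
        t += rep

avail = np.mean([1.0 - turbine_downtime() / HOURS
                 for _ in range(N_TURBINES * 50)])
print(f"estimated mean availability: {avail:.3f}")
```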

  6. Usage models in reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Pulkkinen, U.; Korhonen, J.

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA-project report series will handle the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles, and handling of programmable automation in plant PSA studies. In this report the issues related to statistical testing, and especially automated test case generation, are considered. The goal is to find an efficient method for building usage models for the generation of a statistically significant set of test cases and to gather practical experiences with this method by applying it in a case study. The scope of the study also includes tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.)
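    A usage model is typically a Markov chain over user-visible states, and test cases are random walks through it. A minimal sketch with invented states and transition probabilities (not the report's case study):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical usage model of an operator interface as a Markov chain.
STATES = ["idle", "select_mode", "adjust_setpoint", "confirm", "exit"]
P = {
    "idle":            [("select_mode", 0.8), ("exit", 0.2)],
    "select_mode":     [("adjust_setpoint", 0.7), ("idle", 0.3)],
    "adjust_setpoint": [("confirm", 0.6), ("adjust_setpoint", 0.3), ("idle", 0.1)],
    "confirm":         [("idle", 0.9), ("exit", 0.1)],
}

def generate_test_case(max_len: int = 30) -> list:
    """One statistically representative test case: a walk to 'exit'."""
    state, path = "idle", ["idle"]
    while state != "exit" and len(path) < max_len:
        nxt, probs = zip(*P[state])
        state = str(rng.choice(nxt, p=probs))
        path.append(state)
    return path

for i in range(3):
    print(f"test case {i}: " + " -> ".join(generate_test_case()))
```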

  7. Development of web-based reliability data analysis algorithm model and its application

    International Nuclear Information System (INIS)

    Hwang, Seok-Won; Oh, Ji-Yong; Moosung-Jae

    2010-01-01

    For this study, a database model of plant reliability was developed for the effective acquisition and management of plant-specific data that can be used in various applications of plant programs as well as in Probabilistic Safety Assessment (PSA). Through the development of a web-based reliability data analysis algorithm, this approach systematically gathers specific plant data such as component failure history, maintenance history, and shift diary. First, for the application of the developed algorithm, this study reestablished the raw data types, data deposition procedures and features of the Enterprise Resource Planning (ERP) system process. The component codes and system codes were standardized to make statistical analysis between different types of plants possible. This standardization contributes to the establishment of a flexible database model that allows the customization of reliability data for various applications depending on component types and systems. In addition, this approach makes it possible for users to perform trend analyses and data comparisons for the significant plant components and systems. The validation of the algorithm is performed through a comparison of the importance measure value (Fussell-Vesely) from the mathematical calculation and that from the algorithm application. The development of a reliability database algorithm is one of the best approaches for providing systemic management of plant-specific reliability data with transparency and continuity. This proposed algorithm reinforces the relationships between raw data and application results so that it can provide a comprehensive database that offers everything from basic plant-related data to final customized data.
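    For illustration, Fussell-Vesely importance can be approximated from minimal cut sets with the rare-event approximation: FV_i ≈ (sum of probabilities of cut sets containing component i) / (sum over all cut sets). A toy sketch with hypothetical cut sets and probabilities:

```python
from itertools import chain

# Hypothetical minimal cut sets and basic-event probabilities.
CUT_SETS = [{"pump_A", "pump_B"}, {"valve"}, {"pump_A", "power"}]
P = {"pump_A": 1e-2, "pump_B": 2e-2, "valve": 1e-4, "power": 5e-3}

def cut_prob(cs):
    p = 1.0
    for e in cs:
        p *= P[e]
    return p

top = sum(cut_prob(cs) for cs in CUT_SETS)   # rare-event approximation

for comp in sorted(set(chain.from_iterable(CUT_SETS))):
    fv = sum(cut_prob(cs) for cs in CUT_SETS if comp in cs) / top
    print(f"FV({comp}) = {fv:.3f}")
```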

  8. Study on reliability analysis based on multilevel flow models and fault tree method

    International Nuclear Information System (INIS)

    Chen Qiang; Yang Ming

    2014-01-01

    Multilevel flow models (MFM) and the fault tree method describe system knowledge in different forms, so the two methods express an equivalent logic of system reliability under the same boundary conditions and assumptions. Based on this, and combined with the characteristics of MFM, a method for mapping MFM to fault trees was put forward, providing a way to establish fault trees rapidly and to realize qualitative reliability analysis based on MFM. Taking the safety injection system of a pressurized water reactor nuclear power plant as an example, its MFM was established and its reliability was analyzed qualitatively. The analysis result shows that the logic of mapping MFM to fault trees is correct. The MFM is easily understood, created and modified. Compared with the traditional fault tree analysis, the workload is greatly reduced and modeling time is saved. (authors)

  9. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming an indispensable tool for modeling and analyzing (critical) systems. However the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need to not only capture the complex dynamic behavior of the system components, but they must be also easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system components behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis
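    The discrete-time idea can be illustrated by dividing the mission time into n intervals and letting each component node take the value of its failure interval (or "survived"); a gate node then combines the component distributions. A sketch for a two-component parallel system with assumed exponential components (a hand-rolled illustration, not a full BN engine):

```python
import numpy as np

# Discrete-time BN sketch: divide [0, T] into n intervals; a component
# node takes values 0..n-1 (its failure interval) or n (survives past T).
n, T = 10, 1000.0
edges = np.linspace(0.0, T, n + 1)

def node_pmf(rate):
    cdf = 1.0 - np.exp(-rate * edges)
    return np.append(np.diff(cdf), 1.0 - cdf[-1])   # last entry: survival

pA, pB = node_pmf(1e-3), node_pmf(2e-3)

# Parallel system: it fails in the LATER of the two failure intervals.
# Enumerate the joint distribution of the independent component nodes.
p_sys = np.zeros(n + 1)
for i, a in enumerate(pA):
    for j, b in enumerate(pB):
        p_sys[max(i, j)] += a * b

print("P(system fails by T) =", f"{p_sys[:n].sum():.4f}")
print("P(system survives T) =", f"{p_sys[n]:.4f}")
```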

  10. System reliability assessment with an approximate reasoning model

    Energy Technology Data Exchange (ETDEWEB)

    Eisenhawer, S.W.; Bott, T.F.; Helm, T.M.; Boerigter, S.T.

    1998-12-31

    The projected service life of weapons in the US nuclear stockpile will exceed the original design life of their critical components. Interim metrics are needed to describe weapon states for use in simulation models of the nuclear weapons complex. The authors present an approach to this problem based upon the theory of approximate reasoning (AR) that allows meaningful assessments to be made in an environment where reliability models are incomplete. AR models are designed to emulate the inference process used by subject matter experts. The emulation is based upon a formal logic structure that relates evidence about components. This evidence is translated using natural language expressions into linguistic variables that describe membership in fuzzy sets. The authors introduce a metric that measures the acceptability of a weapon to nuclear deterrence planners. Implication rule bases are used to draw a series of forward chaining inferences about the acceptability of components, subsystems and individual weapons. They describe each component in the AR model in some detail and illustrate its behavior with a small example. The integration of the acceptability metric into a prototype model to simulate the weapons complex is also described.

  11. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being on par with conventional processes such as welding and casting, the primary reason being the unreliability of the process. While...... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input...... parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....

  12. Reliability of Coulomb stress changes inferred from correlated uncertainties of finite-fault source models

    KAUST Repository

    Woessner, J.

    2012-07-14

    Static stress transfer is one physical mechanism to explain triggered seismicity. Coseismic stress-change calculations strongly depend on the parameterization of the causative finite-fault source model. These models are uncertain due to uncertainties in input data, model assumptions, and modeling procedures. However, fault model uncertainties have usually been ignored in stress-triggering studies and have not been propagated to assess the reliability of Coulomb failure stress change (ΔCFS) calculations. We show how these uncertainties can be used to provide confidence intervals for co-seismic ΔCFS-values. We demonstrate this for the MW = 5.9 June 2000 Kleifarvatn earthquake in southwest Iceland and systematically map these uncertainties. A set of 2500 candidate source models from the full posterior fault-parameter distribution was used to compute 2500 ΔCFS maps. We assess the reliability of the ΔCFS-values from the coefficient of variation (CV) and deem ΔCFS-values to be reliable where they are at least twice as large as the standard deviation (CV ≤ 0.5). Unreliable ΔCFS-values are found near the causative fault and between lobes of positive and negative stress change, where a small change in fault strike causes ΔCFS-values to change sign. The most reliable ΔCFS-values are found away from the source fault in the middle of positive and negative ΔCFS-lobes, a likely general pattern. Using the reliability criterion, our results support the static stress-triggering hypothesis. Nevertheless, our analysis also suggests that results from previous stress-triggering studies not considering source model uncertainties may have led to a biased interpretation of the importance of static stress-triggering.
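    The reliability criterion is straightforward to apply to an ensemble of ΔCFS maps: keep cells where the ensemble |mean| is at least twice the ensemble standard deviation (CV ≤ 0.5). A sketch with a synthetic ensemble standing in for the 2500 candidate-model maps:

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic ensemble standing in for 2500 candidate-source dCFS maps.
n_models, ny, nx = 2500, 50, 50
true_field = rng.normal(0.0, 1.0, (ny, nx))
ensemble = true_field + rng.normal(0.0, 0.8, (n_models, ny, nx))

mean = ensemble.mean(axis=0)
std  = ensemble.std(axis=0)

# CV = std / |mean|; guard against division by zero with +inf (unreliable).
cv = np.divide(std, np.abs(mean),
               out=np.full_like(std, np.inf),
               where=np.abs(mean) > 0)
reliable = cv <= 0.5          # |mean| at least twice the std

print(f"reliable cells: {reliable.mean():.1%}")
```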

  13. Comprehensive Care For Joint Replacement Model - Provider Data

    Data.gov (United States)

    U.S. Department of Health & Human Services — Comprehensive Care for Joint Replacement Model - provider data. This data set includes provider data for two quality measures tracked during an episode of care:...

  14. Providing all global energy with wind, water, and solar power, Part II: Reliability, system and transmission costs, and policies

    International Nuclear Information System (INIS)

    Delucchi, Mark A.; Jacobson, Mark Z.

    2011-01-01

    This is Part II of two papers evaluating the feasibility of providing all energy for all purposes (electric power, transportation, and heating/cooling), everywhere in the world, from wind, water, and the sun (WWS). In Part I, we described the prominent renewable energy plans that have been proposed and discussed the characteristics of WWS energy systems, the global demand for and availability of WWS energy, quantities and areas required for WWS infrastructure, and supplies of critical materials. Here, we discuss methods of addressing the variability of WWS energy to ensure that power supply reliably matches demand (including interconnecting geographically dispersed resources, using hydroelectricity, using demand-response management, storing electric power on site, over-sizing peak generation capacity and producing hydrogen with the excess, storing electric power in vehicle batteries, and forecasting weather to project energy supplies), the economics of WWS generation and transmission, the economics of WWS use in transportation, and policy measures needed to enhance the viability of a WWS system. We find that the cost of energy in a 100% WWS system will be similar to the cost today. We conclude that barriers to a 100% conversion to WWS power worldwide are primarily social and political, not technological or even economic. - Research highlights: → We evaluate the feasibility of global energy supply from wind, water, and solar energy. → WWS energy can be supplied reliably and economically to all energy-use sectors. → The social cost of WWS energy generally is less than the cost of fossil-fuel energy. → Barriers to 100% WWS power worldwide are socio-political, not techno-economic.

  15. Practical applications of age-dependent reliability models and analysis of operational data

    International Nuclear Information System (INIS)

    Lannoy, A.; Nitoi, M.; Backstrom, O.; Burgazzi, L.; Couallier, V.; Nikulin, M.; Derode, A.; Rodionov, A.; Atwood, C.; Fradet, F.; Antonov, A.; Berezhnoy, A.; Choi, S.Y.; Starr, F.; Dawson, J.; Palmen, H.; Clerjaud, L.

    2005-01-01

    The purpose of the workshop was to present the experience of practical application of time-dependent reliability models. The program of the workshop comprised the following sessions: -) aging management and aging PSA (Probabilistic Safety Assessment), -) modeling, -) operating experience, and -) accelerated aging tests. In order to introduce the aging effect of a particular component into the PSA model, it has been proposed to use constant unavailability values over a short period of time (one year, for example), calculated on the basis of age-dependent reliability models. As for modeling, it appears that the problem with overly detailed statistical models is the lack of data for the required parameters. As for operating experience, several methods of operating experience analysis were presented (algorithms for reliability data elaboration and statistical identification of aging trends). As for accelerated aging tests, it was demonstrated that a combination of operating experience analysis with the results of accelerated aging tests on naturally aged equipment could provide a good basis for continued operation of instrumentation and control systems

  16. Practical applications of age-dependent reliability models and analysis of operational data

    Energy Technology Data Exchange (ETDEWEB)

    Lannoy, A.; Nitoi, M.; Backstrom, O.; Burgazzi, L.; Couallier, V.; Nikulin, M.; Derode, A.; Rodionov, A.; Atwood, C.; Fradet, F.; Antonov, A.; Berezhnoy, A.; Choi, S.Y.; Starr, F.; Dawson, J.; Palmen, H.; Clerjaud, L

    2005-07-01

    The purpose of the workshop was to present the experience of practical application of time-dependent reliability models. The program of the workshop comprised the following sessions: -) aging management and aging PSA (Probabilistic Safety Assessment), -) modeling, -) operating experience, and -) accelerated aging tests. In order to introduce the aging effect of a particular component into the PSA model, it has been proposed to use constant unavailability values over a short period of time (one year, for example), calculated on the basis of age-dependent reliability models. As for modeling, it appears that the problem with overly detailed statistical models is the lack of data for the required parameters. As for operating experience, several methods of operating experience analysis were presented (algorithms for reliability data elaboration and statistical identification of aging trends). As for accelerated aging tests, it was demonstrated that a combination of operating experience analysis with the results of accelerated aging tests on naturally aged equipment could provide a good basis for continued operation of instrumentation and control systems.

  17. Customer-Provider Strategic Alignment: A Maturity Model

    Science.gov (United States)

    Luftman, Jerry; Brown, Carol V.; Balaji, S.

    This chapter presents a new model for assessing the maturity of a customer-provider relationship from a collaborative service delivery perspective: the Customer-Provider Strategic Alignment Maturity (CPSAM) Model. This model builds on recent research for effectively managing the customer-provider relationship in IT service outsourcing contexts and a validated model for assessing alignment across internal IT service units and their business customers within the same organization. After reviewing relevant literature by service science and information systems researchers, the six overarching components of the maturity model are presented: value measurements, governance, partnership, communications, human resources and skills, and scope and architecture. A key assumption of the model is that all of the components need to be addressed to assess and improve customer-provider alignment. Examples of specific metrics for measuring the maturity level of each component over the five levels of maturity are also presented.

  18. A Reliable Mouse Model of Hind limb Gangrene.

    Science.gov (United States)

    Parikh, Punam P; Castilla, Diego; Lassance-Soares, Roberta M; Shao, Hongwei; Regueiro, Manuela; Li, Yan; Vazquez-Padron, Roberto; Webster, Keith A; Liu, Zhao-Jun; Velazquez, Omaida C

    2018-04-01

    Lack of a reliable hind limb gangrene animal model limits preclinical studies of gangrene, a severe form of critical limb ischemia. We develop a novel mouse hind limb gangrene model to facilitate translational studies. BALB/c, FVB, and C57BL/6 mice underwent femoral artery ligation (FAL) with or without administration of N(G)-nitro-L-arginine methyl ester (L-NAME), an endothelial nitric oxide synthase inhibitor. Gangrene was assessed using standardized ischemia scores ranging from 0 (no gangrene) to 12 (forefoot gangrene). Laser Doppler imaging (LDI) and DiI perfusion quantified hind limb reperfusion postoperatively. BALB/c developed gangrene with FAL-only (n = 11/11, 100% gangrene incidence), showing a mean limb ischemia score of 12 on postoperative days (PODs) 7 and 14, with LDI ranging from 0.08 to 0.12 on the respective PODs. Most FVB did not develop gangrene with FAL-only (n = 3/9, 33% gangrene incidence) but did with FAL and L-NAME (n = 9/9, 100% gangrene incidence). Mean limb ischemia scores for FVB undergoing FAL with L-NAME were significantly higher than for FVB receiving FAL-only. LDI score and capillary density by POD 28 were significantly lower in FVB undergoing FAL with L-NAME. C57BL/6 did not develop gangrene with either FAL-only or FAL and L-NAME. Reproducible murine gangrene models may elucidate molecular mechanisms of gangrene development, facilitating therapeutic intervention. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. A Survey of Software Reliability Modeling and Estimation

    Science.gov (United States)

    1983-09-01

    "... of Software Reliability and Its Exorcism," Proceedings of the Joint Automatic Control Conference, 1977. Published by the IEEE, New York, 1977, pp. 225-231.

  20. A risk assessment model for selecting cloud service providers

    OpenAIRE

    Cayirci, Erdal; Garaga, Alexandr; Santana de Oliveira, Anderson; Roudier, Yves

    2016-01-01

    The Cloud Adoption Risk Assessment Model is designed to help cloud customers in assessing the risks that they face by selecting a specific cloud service provider. It evaluates background information obtained from cloud customers and cloud service providers to analyze various risk scenarios. This facilitates decision making in selecting the cloud service provider with the most preferable risk profile based on aggregated risks to security, privacy, and service delivery. Based on this model we ...

  1. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of nuclear power generation depends heavily on human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards, and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and in insights from probabilistic safety assessments (PSAs), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA), was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  2. Integrating software reliability concepts into risk and reliability modeling of digital instrumentation and control systems used in nuclear power plants

    International Nuclear Information System (INIS)

    Arndt, S. A.

    2006-01-01

    As software-based digital systems are becoming more and more common in all aspects of industrial process control, including the nuclear power industry, it is vital that the current state of the art in quality, reliability, and safety analysis be advanced to support the quantitative review of these systems. Several research groups throughout the world are working on the development and assessment of software-based digital system reliability methods and their applications in the nuclear power, aerospace, transportation, and defense industries. However, these groups are hampered by the fact that software experts and probabilistic safety assessment experts view reliability engineering very differently. This paper discusses the characteristics of a common vocabulary and modeling framework. (authors)

  3. Using multi-model averaging to improve the reliability of catchment scale nitrogen predictions

    Science.gov (United States)

    Exbrayat, J.-F.; Viney, N. R.; Frede, H.-G.; Breuer, L.

    2013-01-01

    Hydro-biogeochemical models are used to foresee the impact of mitigation measures on water quality. Usually, scenario-based studies rely on single model applications. This is done in spite of the widely acknowledged advantage of ensemble approaches to cope with structural model uncertainty issues. As an attempt to demonstrate the reliability of such multi-model efforts in the hydro-biogeochemical context, this methodological contribution proposes an adaptation of the reliability ensemble averaging (REA) philosophy to nitrogen losses predictions. A total of 4 models are used to predict the total nitrogen (TN) losses from the well-monitored Ellen Brook catchment in Western Australia. Simulations include re-predictions of current conditions and a set of straightforward management changes targeting fertilisation scenarios. Results show that, in spite of good calibration metrics, one of the models provides a very different response to management changes. This behaviour leads the simple average of the ensemble members to also predict reductions in TN export that are not in agreement with the other models. However, considering the convergence of model predictions in the more sophisticated REA approach assigns more weight to previously less well-calibrated models that are more in agreement with each other. This method also avoids having to disqualify any of the ensemble members.
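    A minimal REA-style weighting can be sketched as follows: each model's weight combines a performance term (calibration error) with a convergence term (distance to the weighted consensus), iterated to a fixed point. The numbers are illustrative, not the study's.

```python
import numpy as np

# REA-style weighted averaging (sketch). Each model predicts a change in
# TN export; weights combine performance (calibration error eps) and
# convergence (distance to the weighted consensus). Values are invented.
pred = np.array([-0.20, -0.18, -0.25, +0.10])   # 4 models' predicted changes
eps  = np.array([0.03, 0.05, 0.04, 0.04])       # calibration errors

w = 1.0 / eps                                    # performance-only start
for _ in range(50):
    consensus = np.sum(w * pred) / np.sum(w)
    dist = np.abs(pred - consensus) + 1e-6       # convergence term
    w = (1.0 / eps) * (1.0 / dist)               # combined REA-style weight

print("normalized weights :", np.round(w / w.sum(), 3))
print("consensus change   :", f"{consensus:+.3f}")
# The outlying model (+0.10) is progressively downweighted rather than
# simply averaged in, mirroring the behaviour described in the abstract.
```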

  4. Using multi-model averaging to improve the reliability of catchment scale nitrogen predictions

    Directory of Open Access Journals (Sweden)

    J.-F. Exbrayat

    2013-01-01

    Full Text Available Hydro-biogeochemical models are used to foresee the impact of mitigation measures on water quality. Usually, scenario-based studies rely on single model applications. This is done in spite of the widely acknowledged advantage of ensemble approaches to cope with structural model uncertainty issues. As an attempt to demonstrate the reliability of such multi-model efforts in the hydro-biogeochemical context, this methodological contribution proposes an adaptation of the reliability ensemble averaging (REA) philosophy to nitrogen losses predictions. A total of 4 models are used to predict the total nitrogen (TN) losses from the well-monitored Ellen Brook catchment in Western Australia. Simulations include re-predictions of current conditions and a set of straightforward management changes targeting fertilisation scenarios. Results show that, in spite of good calibration metrics, one of the models provides a very different response to management changes. This behaviour leads the simple average of the ensemble members to also predict reductions in TN export that are not in agreement with the other models. However, considering the convergence of model predictions in the more sophisticated REA approach assigns more weight to previously less well-calibrated models that are more in agreement with each other. This method also avoids having to disqualify any of the ensemble members.

  5. Modeling Parameters of Reliability of Technological Processes of Hydrocarbon Pipeline Transportation

    Directory of Open Access Journals (Sweden)

    Shalay Viktor

    2016-01-01

    Full Text Available On the basis of system analysis methods and parametric reliability theory, mathematical modeling of the operation of oil and gas equipment for reliability monitoring was conducted using dispatching data. To check the goodness of fit of the empirical distributions, an algorithm and mathematical methods of analysis are worked out for on-line use under changing operating conditions. An analysis of the physical cause-and-effect mechanism between the key factors and the changing parameters of the technical systems of oil and gas facilities is made, and the basic types of distributions of the technical parameters are defined. Assessment of the adequacy of the distribution type for the analyzed parameters is provided by using the Kolmogorov criterion, as the most universal, accurate and adequate way to verify the distributions of continuous processes in complex multiple-element technical systems. The calculation methods are provided for supervision by independent bodies for risk assessment and facility safety.
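    Checking whether monitored data are consistent with a hypothesized distribution via the Kolmogorov criterion is a short exercise with scipy; the data below are synthetic stand-ins for dispatching records, and note that fitting and testing on the same sample makes the test optimistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic stand-in for dispatching data: times between equipment alarms.
samples = rng.weibull(1.5, size=200) * 120.0   # hours

# Fit a candidate distribution and apply the Kolmogorov-Smirnov test.
shape, loc, scale = stats.weibull_min.fit(samples, floc=0.0)
stat, p = stats.kstest(samples, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {stat:.3f}, p-value = {p:.3f}")
# A large p-value means no evidence against the fitted Weibull model
# (strictly, the p-value is biased upward when parameters are fitted
# from the same data; shown here for illustration only).
```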

  6. An auto-focusing heuristic model to increase the reliability of a scientific mission

    International Nuclear Information System (INIS)

    Gualdesi, Lavinio

    2006-01-01

    Researchers invest a lot of time and effort on the design and development of components used in a scientific mission. To capitalize on this investment and on the operational experience of the researchers, it is useful to adopt a quantitative data base to monitor the history and usage of the components. This work describes a model to monitor the reliability level of components. The model is very flexible and allows users to compose systems using the same components in different configurations as required by each mission. This tool provides availability and reliability figures for the configuration requested, derived from historical data of the components' previous performance. The system is based on preliminary checklists to establish standard operating procedures (SOP) for all components life phases. When an infringement to the SOP occurs, a quantitative ranking is provided in order to quantify the risk associated with this deviation. The final agreement between field data and expected performance of the component makes the model converge onto a heuristic monitoring system. The model automatically focuses on points of failure at the detailed component element level, calculates risks, provides alerts when a demonstrated risk to safety is encountered, and advises when there is a mismatch between component performance and mission requirements. This model also helps the mission to focus resources on critical tasks where they are most needed.

  7. An auto-focusing heuristic model to increase the reliability of a scientific mission

    Science.gov (United States)

    Gualdesi, Lavinio

    2006-11-01

    Researchers invest a lot of time and effort on the design and development of components used in a scientific mission. To capitalize on this investment and on the operational experience of the researchers, it is useful to adopt a quantitative data base to monitor the history and usage of the components. This work describes a model to monitor the reliability level of components. The model is very flexible and allows users to compose systems using the same components in different configurations as required by each mission. This tool provides availability and reliability figures for the configuration requested, derived from historical data of the components' previous performance. The system is based on preliminary checklists to establish standard operating procedures (SOP) for all components life phases. When an infringement to the SOP occurs, a quantitative ranking is provided in order to quantify the risk associated with this deviation. The final agreement between field data and expected performance of the component makes the model converge onto a heuristic monitoring system. The model automatically focuses on points of failure at the detailed component element level, calculates risks, provides alerts when a demonstrated risk to safety is encountered, and advises when there is a mismatch between component performance and mission requirements. This model also helps the mission to focus resources on critical tasks where they are most needed.

  8. Modeling, implementation, and validation of arterial travel time reliability : [summary].

    Science.gov (United States)

    2013-11-01

    Travel time reliability (TTR) has been proposed as a better measure of a facility's performance than a statistical measure like peak hour demand. TTR is based on more information about average traffic flows and longer time periods, thus inc...

  9. Modeling, implementation, and validation of arterial travel time reliability.

    Science.gov (United States)

    2013-11-01

    Previous research funded by Florida Department of Transportation (FDOT) developed a method for estimating travel time reliability for arterials. This method was not initially implemented or validated using field data. This project evaluated and r...

  10. Development of system reliability models for railway bridges.

    Science.gov (United States)

    2012-07-01

    Performance of the railway transportation network depends on the reliability of railway bridges, which can be affected by various forms of deterioration and extreme environmental conditions. More than half of the railway bridges in the US were built ...

  11. Study of redundant Models in reliability prediction of HXMT's HES

    International Nuclear Information System (INIS)

    Wang Jinming; Liu Congzhan; Zhang Zhi; Ji Jianfeng

    2010-01-01

    Two redundant equipment structures for HXMT's HES are first proposed: block backup and dual-system cold redundancy. Reliability is then predicted using the parts count method, and a comparative analysis of the two proposals is performed. It is concluded that a block backup redundant equipment structure offers higher reliability and a longer service life. (authors)
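    A minimal sketch of the parts count prediction and of the gain from a two-unit cold-standby (block backup) arrangement; part counts and failure rates (failures per 10^6 h) are illustrative, and perfect switching is assumed:

      import math

      parts = {                    # part type: (quantity, failure rate /1e6 h)
          "resistor":  (40, 0.002),
          "capacitor": (25, 0.008),
          "ic":        (10, 0.050),
      }
      lam = sum(n * rate for n, rate in parts.values()) * 1e-6  # series system

      t = 20_000.0                                      # mission time, hours
      r_single  = math.exp(-lam * t)                    # single block
      r_standby = math.exp(-lam * t) * (1.0 + lam * t)  # 2-unit cold standby
      print(r_single, r_standby)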

  12. Parameter estimation of component reliability models in PSA model of Krsko NPP

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Vrbanic, I.

    2001-01-01

    In the paper, the uncertainty analysis of component reliability models for independent failures is shown. The present approach to parameter estimation of component reliability models at NPP Krsko is presented. Mathematical approaches for different types of uncertainty analysis are introduced and used in accordance with some predisposed requirements. Results of the uncertainty analyses are shown in an example for time-related components. Bayesian estimation with numerical estimation of the posterior, which can be approximated with an appropriate probability distribution (in this paper the lognormal distribution), proved to be the most appropriate uncertainty analysis. (author)
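    A sketch of the numerical Bayesian step described here for a time-related (failure-rate) parameter, with an illustrative Jeffreys-type prior and made-up data; the lognormal approximation is matched to the moments of the gridded posterior:

      import numpy as np

      k, T = 3, 5.0e4                           # failures over T component-hours
      lam = np.linspace(1e-7, 5e-4, 20_000)
      post = lam**(k - 0.5) * np.exp(-lam * T)  # Poisson likelihood x Jeffreys prior
      post /= post.sum()                        # normalise on the grid

      mean = (lam * post).sum()
      var = ((lam - mean)**2 * post).sum()
      # Moment-matched lognormal approximation to the numerical posterior:
      s2 = np.log(1.0 + var / mean**2)
      mu = np.log(mean) - 0.5 * s2
      print(mean, mu, np.sqrt(s2))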

  13. Modeling the bathtub shape hazard rate function in terms of reliability

    International Nuclear Information System (INIS)

    Wang, K.S.; Hsu, F.S.; Liu, P.P.

    2002-01-01

    In this paper, a general form of bathtub-shaped hazard rate function is proposed in terms of reliability. The degradation of system reliability comes from different failure mechanisms, in particular those related to (1) random failures, (2) cumulative damage, (3) man-machine interference, and (4) adaptation. The first item refers to the modeling of unpredictable failures in a Poisson process, i.e. it is represented by a constant. Cumulative damage emphasizes failures owing to strength deterioration, so the probability of the system sustaining the normal operating load decreases with time; it depends on the failure probability, 1-R. This representation captures the memory characteristic of the second failure cause. Man-machine interference may have a positive effect on the failure rate due to learning and correction, or a negative one as a consequence of inappropriate human habits in system operation; it is suggested that this item is correlated with the reliability, R, as well as with the failure probability. Adaptation concerns continuous adjustment between mating subsystems: when a new system is put on duty, hidden defects are exposed and eventually disappear, so the reliability decays together with a decreasing failure rate, expressed as a power of reliability. Each of these phenomena brings about failures independently and is described by an additive term in the hazard rate function h(R); the overall failure behavior, governed by a number of parameters, is thus found by fitting the evidence data. The proposed model is meaningful in capturing the physical phenomena occurring during the system lifetime and provides simpler and more effective parameter fitting than the usually adopted 'bathtub' procedures. Five examples of different types of failure mechanisms are used in the validation of the proposed model. Satisfactory results are found from the comparisons.
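    Read literally, the four additive contributions suggest a hazard rate of the following general shape; this is a plausible reading of the abstract, not necessarily the authors' exact parameterisation (c_0 to c_3 and m are fitting parameters):

      h(R) = \underbrace{c_0}_{\text{random}}
           + \underbrace{c_1\,(1-R)}_{\text{cumulative damage}}
           + \underbrace{c_2\,R\,(1-R)}_{\text{man-machine interference}}
           + \underbrace{c_3\,R^{m}}_{\text{adaptation}}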

  14. Reliability model analysis and primary experimental evaluation of laser triggered pulse trigger

    International Nuclear Information System (INIS)

    Chen Debiao; Yang Xinglin; Li Yuan; Li Jin

    2012-01-01

    A high-performance pulse trigger can enhance the performance and stability of the PPS. Since it is necessary to evaluate the reliability of the LTGS pulse trigger, we establish a reliability analysis model of this pulse trigger based on the CARMES software; the reliability evaluation accords with the statistical results. (authors)

  15. Probabilistic reliability modeling for oil exploration & production (E&P) facilities in the tallgrass prairie preserve.

    Science.gov (United States)

    Zambrano, Lyda; Sublette, Kerry; Duncan, Kathleen; Thoma, Greg

    2007-10-01

    The aging domestic oil production infrastructure represents a high risk to the environment because of the type of fluids being handled (oil and brine) and the potential for accidental release of these fluids into sensitive ecosystems. Currently, there is not a quantitative risk model directly applicable to onshore oil exploration and production (E&P) facilities. We report on a probabilistic reliability model created for onshore exploration and production (E&P) facilities. Reliability theory, failure modes and effects analysis (FMEA), and event trees were used to develop the model estimates of the failure probability of typical oil production equipment. Monte Carlo simulation was used to translate uncertainty in input parameter values to uncertainty in the model output. The predicted failure rates were calibrated to available failure rate information by adjusting probability density function parameters used as random variates in the Monte Carlo simulations. The mean and standard deviation of normal variate distributions from which the Weibull distribution characteristic life was chosen were used as adjustable parameters in the model calibration. The model was applied to oil production leases in the Tallgrass Prairie Preserve, Oklahoma. We present the estimated failure probability due to the combination of the most significant failure modes associated with each type of equipment (pumps, tanks, and pipes). The results show that the estimated probability of failure for tanks is about the same as that for pipes, but that pumps have much lower failure probability. The model can provide necessary equipment reliability information for proactive risk management at the lease level by providing quantitative information to base allocation of maintenance resources to high-risk equipment that will minimize both lost production and ecosystem damage.
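    The calibration device described above, a Weibull time to failure whose characteristic life is itself drawn from a normal distribution inside a Monte Carlo loop, can be sketched as follows; the shape parameter, normal parameters and horizon are illustrative only, not the study's calibrated values:

      import numpy as np

      rng = np.random.default_rng(7)

      def failure_prob(t_years, beta, eta_mean, eta_sd, n=100_000):
          # Characteristic life eta is a normal random variate (the adjustable
          # calibration parameters); truncate at a small positive value.
          eta = np.maximum(rng.normal(eta_mean, eta_sd, n), 1e-6)
          t_fail = eta * rng.weibull(beta, n)   # Weibull(beta) scaled by eta
          return float(np.mean(t_fail <= t_years))

      print(failure_prob(10.0, beta=2.0, eta_mean=25.0, eta_sd=5.0))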

  16. A new and reliable animal model for optic nerve injury.

    Science.gov (United States)

    Yan, Hua; Li, Fengling; Zhang, Linlin

    2012-10-01

    FPI. Optic nerve injury was demonstrated by F-VEP and MRI, and confirmed histologically. Our model is a simple, reliable, reproducible, and stable tool for use in investigations on the mechanism(s) of and treatment for optic nerve injury.

  17. Reliability-cost models for the power switching devices of wind power converters

    DEFF Research Database (Denmark)

    Ma, Ke; Blaabjerg, Frede

    2012-01-01

    In order to satisfy the growing reliability requirements for the wind power converters with more cost-effective solution, the target of this paper is to establish a new reliability-cost model which can connect the relationship between reliability performances and corresponding semiconductor cost...... temperature mean value Tm and fluctuation amplitude ΔTj of power devices, are presented. With the proposed reliability-cost model, it is possible to enable future reliability-oriented design of the power switching devices for wind power converters, and also an evaluation benchmark for different wind power...

  18. On new cautious structural reliability models in the framework of imprecise probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev; Kozine, Igor

    2010-01-01

    New imprecise structural reliability models are described in this paper. They are developed based on the imprecise Bayesian inference and are imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models. The models are applied to computing cautious structural reliability ...

  19. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - II: Application to IFMIF reliability assessment

    International Nuclear Information System (INIS)

    Cacuci, D. G.; Cacuci, D. G.; Balan, I.; Ionescu-Bujor, M.

    2008-01-01

    In Part II of this work, the adjoint sensitivity analysis procedure developed in Part I is applied to perform sensitivity analysis of several dynamic reliability models of systems of increasing complexity, culminating with the consideration of the International Fusion Materials Irradiation Facility (IFMIF) accelerator system. Section II presents the main steps of a procedure for the automated generation of Markov chains for reliability analysis, including the abstraction of the physical system, construction of the Markov chain, and the generation and solution of the ensuing set of differential equations; all of these steps have been implemented in a stand-alone computer code system called QUEFT/MARKOMAG-S/MCADJSEN. This code system has been applied to sensitivity analysis of dynamic reliability measures for a paradigm '2-out-of-3' system comprising five components and also to a comprehensive dynamic reliability analysis of the IFMIF accelerator system facilities for the average availability and, respectively, the system's availability at the final mission time. The QUEFT/MARKOMAG-S/MCADJSEN has been used to efficiently compute sensitivities to 186 failure and repair rates characterizing components and subsystems of the first-level fault tree of the IFMIF accelerator system. (authors)

  20. Reliability models for a nonrepairable system with heterogeneous components having a phase-type time-to-failure distribution

    International Nuclear Information System (INIS)

    Kim, Heungseob; Kim, Pansoo

    2017-01-01

    This research paper presents practical stochastic models for designing and analyzing the time-dependent reliability of nonrepairable systems. The models are formulated for nonrepairable systems with heterogeneous components having phase-type time-to-failure distributions by a structured continuous time Markov chain (CTMC). The versatility of the phase-type distributions enhances the flexibility and practicality of the systems. By virtue of these benefits, studies in reliability engineering can be more advanced than the previous studies. This study attempts to solve a redundancy allocation problem (RAP) by using these new models. The implications of mixing components, redundancy levels, and redundancy strategies are simultaneously considered to maximize the reliability of a system. An imperfect switching case in a standby redundant system is also considered. Furthermore, the experimental results for a well-known RAP benchmark problem are presented to demonstrate the approximating error of the previous reliability function for a standby redundant system and the usefulness of the current research. - Highlights: • Phase-type time-to-failure distribution is used for components. • Reliability model for nonrepairable system is developed using Markov chain. • System is composed of heterogeneous components. • Model provides the real value of standby system reliability not an approximation. • Redundancy allocation problem is used to show usefulness of this model.
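    For reference, the phase-type survival function underlying such CTMC models is R(t) = alpha exp(T t) 1, with T the sub-generator over the transient states and alpha the initial phase distribution; a toy two-phase sketch with illustrative rates:

      import numpy as np
      from scipy.linalg import expm

      alpha = np.array([1.0, 0.0])          # initial phase distribution
      T = np.array([[-0.02,  0.02],
                    [ 0.00, -0.05]])        # transient-state sub-generator (/h)

      def reliability(t):
          return float(alpha @ expm(T * t) @ np.ones(2))

      print(reliability(10.0), reliability(100.0))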

  1. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    Science.gov (United States)

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-18

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient for combining evidence from different sensors. However, when the evidence highly conflicts, it may produce a counterintuitive result. To address the issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
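    A sketch of the combination step: reliability weights (fixed by hand below; the paper derives them from the evidence distance and the belief entropy) first average the sensor reports, and Dempster's rule then combines the averaged evidence:

      from itertools import product

      def dempster(m1, m2):
          # Dempster's rule for two mass functions over frozenset focal elements.
          out, conflict = {}, 0.0
          for (a, x), (b, y) in product(m1.items(), m2.items()):
              if a & b:
                  out[a & b] = out.get(a & b, 0.0) + x * y
              else:
                  conflict += x * y
          return {s: v / (1.0 - conflict) for s, v in out.items()}

      def weighted_average(masses, weights):
          frames = set().union(*masses)
          wsum = sum(weights)
          return {f: sum(w * m.get(f, 0.0) for m, w in zip(masses, weights)) / wsum
                  for f in frames}

      A, B = frozenset({"A"}), frozenset({"B"})
      avg = weighted_average([{A: 0.9, B: 0.1}, {A: 0.0, B: 1.0}],  # conflicting
                             weights=[0.8, 0.2])                    # illustrative
      print(dempster(avg, avg))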

  2. Reliability and Maintainability Analysis: A Conceptual Design Model

    Science.gov (United States)

    1972-03-01

    is likely to be probabilistic in nature, a plenitude of possibilities - and problems - remain. Let us consider two of the more popular attempts which... characteristics would affect the available reliability/maintainability alternatives... The use of other criterion functions than the one used in the

  3. Phd study of reliability and validity: One step closer to a standardized music therapy assessment model

    DEFF Research Database (Denmark)

    Jacobsen, Stine Lindahl

    The paper will present a PhD study concerning the reliability and validity of the music therapy assessment model "Assessment of Parenting Competences" (APC) in the area of families with emotionally neglected children. This study had a multiple strategy design with a philosophical base of critical realism...... and pragmatism. The fixed design for this study was a between and within groups design in testing the APC's reliability and validity. The two different groups were parents with neglected children and parents with non-neglected children. The flexible design had a multiple case study strategy specifically...... with interplay of turns between parent and child as the case under study, comparing clinical and non-clinical groups and looking for differences in patterns of interaction. The flexible design informed the fixed design and led to further valuable statistical analysis. The presenter will provide an overview...

  4. Mathematical modeling and reliability analysis of a 3D Li-ion battery

    Directory of Open Access Journals (Sweden)

    RICHARD HONG PENG LIANG

    2014-02-01

    Full Text Available The three-dimensional (3D) Li-ion battery presents an effective solution to issues affecting its two-dimensional counterparts, as it is able to attain high energy capacities for the same areal footprint without sacrificing power density. A 3D battery has key structural features extending in and fully utilizing 3D space, allowing it to achieve greater reliability and longevity. This study applies an electrochemical-thermal coupled model to a checkerboard array of alternating positive and negative electrodes in a 3D architecture with either square or circular electrodes. The mathematical model comprises the transient conservation of charge, species, and energy together with electroneutrality, constitutive relations and relevant initial and boundary conditions. A reliability analysis carried out to simulate malfunctioning of either a positive or negative electrode reveals that although there are deviations in electrochemical and thermal behavior for electrodes adjacent to the malfunctioning electrode as compared to those in a fully-functioning array, there is little effect on electrodes further away, demonstrating the redundancy that a 3D electrode array provides. The results demonstrate that the implementation of 3D batteries allows them to reliably and safely deliver power even if a component malfunctions, a strong advantage over conventional 2D batteries.

  5. Modeling the reliability and maintenance costs of wind turbines using Weibull analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)

    1996-12-31

    A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
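    A minimal sketch of the Weibull workflow the report describes, fitting hypothetical component failure ages and converting the result into an expected annual repair cost; all numbers are illustrative:

      import math
      import numpy as np
      from scipy import stats

      ages_h = np.array([8_200., 11_500., 6_900., 14_300., 9_800.])  # hypothetical
      beta, _, eta = stats.weibull_min.fit(ages_h, floc=0.0)

      mtbf = eta * math.gamma(1.0 + 1.0 / beta)        # mean time between failures
      hours_per_year, repair_cost = 8_760.0, 4_000.0   # illustrative repair cost
      annual_cost = hours_per_year / mtbf * repair_cost
      print(f"beta={beta:.2f}, eta={eta:.0f} h, MTBF={mtbf:.0f} h, "
            f"cost per turbine-year ~ {annual_cost:.0f}")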

  6. Discrete and continuous reliability models for systems with identically distributed correlated components

    International Nuclear Information System (INIS)

    Fiondella, Lance; Xing, Liudong

    2015-01-01

    Many engineers and researchers base their reliability models on the assumption that components of a system fail in a statistically independent manner. This assumption is often violated in practice because environmental and system specific factors contribute to correlated failures, which can lower the reliability of a fault tolerant system. A simple method to quantify the impact of correlation on system reliability is needed to encourage models explicitly incorporating correlated failures. Previous approaches to model correlation are limited to systems consisting of two or three components or assume that the majority of the subsets of component failures are statistically independent. This paper proposes a method to model the reliability of systems with correlated identical components, where components possess the same reliability and also exhibit a common failure correlation parameter. Both discrete and continuous models are proposed. The method is demonstrated through a series of examples, including derivations of analytical expressions for several common structures such as k-out-of-n: good and parallel systems. The continuous models consider the role of correlation on reliability and metrics, including mean time to failure, availability, and mean residual life. These examples illustrate that the method captures the impact of component correlation on system reliability and related metrics. - Highlights: • Reliability of systems with identical but correlated components are studied. • Correlation lowers the reliability and mean time to failure of fault tolerant systems. • Correlation lowers the availability and mean residual life of fault tolerant systems
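    One simple way to reproduce the qualitative effect reported here (correlation lowering k-out-of-n reliability) is a common-shock construction in which, with probability rho, all components share a single up/down draw; this is an illustrative surrogate for correlated identical components, not the paper's exact formulation:

      import numpy as np

      rng = np.random.default_rng(0)

      def koon_reliability(k, n, r, rho, trials=200_000):
          common = rng.random(trials) < rho           # trials with a shared draw
          shared = rng.random(trials) < r             # one draw for all n units
          indep  = rng.random((trials, n)) < r        # independent draws
          up = np.where(common[:, None], shared[:, None], indep).sum(axis=1)
          return float(np.mean(up >= k))

      print(koon_reliability(2, 3, r=0.9, rho=0.0))   # independent baseline
      print(koon_reliability(2, 3, r=0.9, rho=0.3))   # positively correlated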

  7. Levels of Interaction Provided by Online Distance Education Models

    Science.gov (United States)

    Alhih, Mohammed; Ossiannilsson, Ebba; Berigel, Muhammet

    2017-01-01

    Interaction plays a significant role in fostering usability and quality in online education. It is one of the quality standards used to reveal the evidence of practice in online distance education models. This research study aims to evaluate the levels of interaction in the practices of distance education centres. It is aimed to provide online distance…

  8. Modeling of seismic hazards for dynamic reliability analysis

    International Nuclear Information System (INIS)

    Mizutani, M.; Fukushima, S.; Akao, Y.; Katukura, H.

    1993-01-01

    This paper investigates the appropriate indices of seismic hazard curves (SHCs) for seismic reliability analysis. In most seismic reliability analyses of structures, the seismic hazards are defined in the form of SHCs of peak ground accelerations (PGAs). Usually PGAs play a significant role in characterizing ground motions. However, PGA is not always a suitable index of seismic motions. When random vibration theory developed in the frequency domain is employed to obtain statistics of responses, it is more convenient for the implementation of dynamic reliability analysis (DRA) to utilize an index which can be determined in the frequency domain. In this paper, we summarize relationships among the indices which characterize ground motions. The relationships between the indices and the magnitude M are arranged as well. In this consideration, duration time plays an important role in relating two distinct classes, i.e. the energy class and the power class. Fourier and energy spectra are involved in the energy class, and power and response spectra and PGAs are involved in the power class. These relationships are also investigated by using ground motion records. Through these investigations, we have shown the efficiency of employing the total energy as an index of SHCs, which can be determined in the time and frequency domains and has less variance than the other indices. In addition, we have proposed a procedure of DRA based on total energy. (author)

  9. Sensitivity of Reliability Estimates in Partially Damaged RC Structures subject to Earthquakes, using Reduced Hysteretic Models

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.; Skjærbæk, P. S.

    The subject of the paper is the investigation of the sensitivity of structural reliability estimation by a reduced hysteretic model for a reinforced concrete frame under an earthquake excitation.

  10. Reliability Analysis of a Composite Blade Structure Using the Model Correction Factor Method

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimiroy; Friis-Hansen, Peter; Berggreen, Christian

    2010-01-01

    This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternate approach to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which...

  11. Cloud-based calculators for fast and reliable access to NOAA's geomagnetic field models

    Science.gov (United States)

    Woods, A.; Nair, M. C.; Boneh, N.; Chulliat, A.

    2017-12-01

    While the Global Positioning System (GPS) provides accurate point locations, it does not provide pointing directions. Therefore, the absolute directional information provided by the Earth's magnetic field is of primary importance for navigation and for the pointing of technical devices such as aircrafts, satellites and lately, mobile phones. The major magnetic sources that affect compass-based navigation are the Earth's core, its magnetized crust and the electric currents in the ionosphere and magnetosphere. NOAA/CIRES Geomagnetism (ngdc.noaa.gov/geomag/) group develops and distributes models that describe all these important sources to aid navigation. Our geomagnetic models are used in variety of platforms including airplanes, ships, submarines and smartphones. While the magnetic field from Earth's core can be described in relatively fewer parameters and is suitable for offline computation, the magnetic sources from Earth's crust, ionosphere and magnetosphere require either significant computational resources or real-time capabilities and are not suitable for offline calculation. This is especially important for small navigational devices or embedded systems, where computational resources are limited. Recognizing the need for a fast and reliable access to our geomagnetic field models, we developed cloud-based application program interfaces (APIs) for NOAA's ionospheric and magnetospheric magnetic field models. In this paper we will describe the need for reliable magnetic calculators, the challenges faced in running geomagnetic field models in the cloud in real-time and the feedback from our user community. We discuss lessons learned harvesting and validating the data which powers our cloud services, as well as our strategies for maintaining near real-time service, including load-balancing, real-time monitoring, and instance cloning. We will also briefly talk about the progress we achieved on NOAA's Big Earth Data Initiative (BEDI) funded project to develop API

  12. Skill and reliability of climate model ensembles at the Last Glacial Maximum and mid-Holocene

    Directory of Open Access Journals (Sweden)

    J. C. Hargreaves

    2013-03-01

    Full Text Available Paleoclimate simulations provide us with an opportunity to critically confront and evaluate the performance of climate models in simulating the response of the climate system to changes in radiative forcing and other boundary conditions. Hargreaves et al. (2011) analysed the reliability of the Paleoclimate Modelling Intercomparison Project (PMIP2) model ensemble with respect to the MARGO sea surface temperature data synthesis (MARGO Project Members, 2009) for the Last Glacial Maximum (LGM, 21 ka BP). Here we extend that work to include a new comprehensive collection of land surface data (Bartlein et al., 2011), and introduce a novel analysis of the predictive skill of the models. We include output from the PMIP3 experiments, from the two models for which suitable data are currently available. We also perform the same analyses for the PMIP2 mid-Holocene (6 ka BP) ensembles and available proxy data sets. Our results are predominantly positive for the LGM, suggesting that as well as the global mean change, the models can reproduce the observed pattern of change on the broadest scales, such as the overall land–sea contrast and polar amplification, although the more detailed sub-continental scale patterns of change remain elusive. In contrast, our results for the mid-Holocene are substantially negative, with the models failing to reproduce the observed changes with any degree of skill. One cause of this problem could be that the globally- and annually-averaged forcing anomaly is very weak at the mid-Holocene, and so the results are dominated by the more localised regional patterns in the parts of the globe for which data are available. The root cause of the model-data mismatch at these scales is unclear. If the proxy calibration is itself reliable, then representativity error in the data-model comparison, and missing climate feedbacks in the models are other possible sources of error.

  13. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  14. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety- Critical Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2006-01-01

    As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of the digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For reliability estimation of safety-critical software (the software that is used in safety-critical digital systems), Bayesian Belief Networks (BBNs) seem to be most widely used. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, we can instead directly estimate the reliability of the software using software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally known that software reliability growth models cannot be applied to safety-critical software due to the small number of failure data expected from the testing of safety-critical software, we try to find possibilities and corresponding limitations of applying software reliability growth models to safety-critical software.
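    For concreteness, a sketch of fitting the Goel-Okumoto NHPP mean-value function m(t) = a(1 - exp(-b t)) to hypothetical cumulative failure counts and reading off the expected residual faults:

      import numpy as np
      from scipy.optimize import curve_fit

      def m(t, a, b):                       # Goel-Okumoto mean-value function
          return a * (1.0 - np.exp(-b * t))

      t = np.arange(1.0, 11.0)              # test weeks (hypothetical data)
      n = np.array([5, 9, 12, 15, 17, 18, 19, 20, 20, 21], dtype=float)

      (a, b), _ = curve_fit(m, t, n, p0=(25.0, 0.2))
      print(f"a={a:.1f}, b={b:.3f}, residual faults ~ {a - m(t[-1], a, b):.1f}")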

  15. Reliability Modeling and Optimization Strategy for Manufacturing System Based on RQR Chain

    Directory of Open Access Journals (Sweden)

    Yihai He

    2015-01-01

    Full Text Available Accurate and dynamic reliability modeling for a running manufacturing system is the prerequisite for implementing preventive maintenance. However, existing studies could not output the reliability value in real time because they abandon the quality inspection data originating in the operation process of the manufacturing system. Therefore, this paper presents an approach to model manufacturing system reliability dynamically based on operation data of process quality and output data of product reliability. Firstly, after explaining the importance of quality variations in the manufacturing process as the linkage between manufacturing system reliability and product inherent reliability, the RQR chain, which represents the relationships between them, is put forward, and the product qualified probability is proposed to further quantify the impacts of quality variation in the manufacturing process on the reliability of the manufacturing system. Secondly, the impact of the qualified probability on the product inherent reliability is expounded, and a modeling approach for manufacturing system reliability based on the qualified probability is presented. Thirdly, a preventive maintenance optimization strategy for the manufacturing system, driven by the loss from manufacturing quality variation, is proposed. Finally, the validity of the proposed approach is verified by a reliability analysis and optimization example of an engine cover manufacturing system.

  16. Time-dependent reliability analysis of nuclear reactor operators using probabilistic network models

    International Nuclear Information System (INIS)

    Oka, Y.; Miyata, K.; Kodaira, H.; Murakami, S.; Kondo, S.; Togo, Y.

    1987-01-01

    Human factors are very important for the reliability of a nuclear power plant. Human behavior has an essentially time-dependent nature. The details of thinking and decision-making processes are important for detailed analysis of human reliability. They have, however, not been well considered by the conventional methods of human reliability analysis. The present paper describes models for time-dependent and detailed human reliability analysis. Recovery by an operator is taken into account, and two-operator models are also presented.

  17. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software increases at nuclear power plants (NPPs), the necessity for including software reliability and/or safety into the NPP Probabilistic Safety Assessment (PSA) rises. This work proposes an application procedure of software reliability growth models (RGMs), which are most widely used to quantify software reliability, to NPP PSA. Through the proposed procedure, it can be determined if a software reliability growth model can be applied to the NPP PSA before its real application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA

  18. RELY: A reliability modeling system for analysis of sodium-sulfur battery configurations

    Energy Technology Data Exchange (ETDEWEB)

    Hostick, C.J.; Huber, H.D.; Doggett, W.H.; Dirks, J.A.; Dovey, J.F.; Grinde, R.B.; Littlefield, J.S.; Cuta, F.M.

    1987-06-01

    In support of the Office of Energy Storage and Distribution of the US Department of Energy (DOE), Pacific Northwest Laboratory has produced a microcomputer-based software package, called RELY, to assess the impact of sodium-sulfur cell reliability on constant current discharge battery performance. The Fortran-based software operates on IBM microcomputers and IBM-compatibles that have a minimum of 512K of internal memory. The software package has three models that provide the following: (1) a description of the failure distribution parameters used to model cell failure, (2) a Monte Carlo simulation of battery life, and (3) a detailed discharge model for a user-specified battery discharge cycle. 6 refs., 31 figs., 4 tabs.

  19. Stochastic modeling for reliability shocks, burn-in and heterogeneous populations

    CERN Document Server

    Finkelstein, Maxim

    2013-01-01

    Focusing on shocks modeling, burn-in and heterogeneous populations, Stochastic Modeling for Reliability naturally combines these three topics in the unified stochastic framework and presents numerous practical examples that illustrate recent theoretical findings of the authors.  The populations of manufactured items in industry are usually heterogeneous. However, the conventional reliability analysis is performed under the implicit assumption of homogeneity, which can result in distortion of the corresponding reliability indices and various misconceptions. Stochastic Modeling for Reliability fills this gap and presents the basics and further developments of reliability theory for heterogeneous populations. Specifically, the authors consider burn-in as a method of elimination of ‘weak’ items from heterogeneous populations. The real life objects are operating in a changing environment. One of the ways to model an impact of this environment is via the external shocks occurring in accordance with some stocha...

  20. Estimating the Parameters of Software Reliability Growth Models Using the Grey Wolf Optimization Algorithm

    OpenAIRE

    Alaa F. Sheta; Amal Abdel-Raouf

    2016-01-01

    In this age of technology, building quality software is essential to competing in the business market. One of the major principles required for any quality and business software product for value fulfillment is reliability. Estimating software reliability early during the software development life cycle saves time and money as it prevents spending larger sums fixing a defective software product after deployment. The Software Reliability Growth Model (SRGM) can be used to predict the number of...

  1. Model of Providing Assistive Technologies in Special Education Schools.

    Science.gov (United States)

    Lersilp, Suchitporn; Putthinoi, Supawadee; Chakpitak, Nopasit

    2015-05-14

    Most students diagnosed with disabilities in Thai special education schools received assistive technologies, but this did not guarantee the greatest benefits. The purpose of this study was to survey the provision, use and needs of assistive technologies, as well as the perspectives of key informants regarding a model of providing them in special education schools. The participants were selected by the purposive sampling method, and they comprised 120 students with visual, physical, hearing or intellectual disabilities from four special education schools in Chiang Mai, Thailand; and 24 key informants such as parents or caregivers, teachers, school principals and school therapists. The instruments consisted of an assistive technology checklist and a semi-structured interview. Results showed that a category of assistive technologies was provided for students with disabilities, with the highest being "services", followed by "media" and then "facilities". Furthermore, mostly students with physical disabilities were provided with assistive technologies, but those with visual disabilities needed it more. Finally, the model of providing assistive technologies was composed of 5 components: Collaboration; Holistic perspective; Independent management of schools; Learning systems and a production manual for users; and Development of an assistive technology center, driven by 3 major sources such as Government and Private organizations, and Schools.

  2. Analysis and Application of Mechanical System Reliability Model Based on Copula Function

    Directory of Open Access Journals (Sweden)

    An Hai

    2016-10-01

    Full Text Available There are complicated correlations in mechanical systems. Using the advantages of the copula function to solve the related issues, this paper proposes a mechanical system reliability model based on the copula function, makes a detailed study of the serial and parallel mechanical system models, and obtains their reliability functions respectively. Finally, application research is carried out for the serial mechanical system reliability model to prove its validity by example. Using copula theory for mechanical system reliability modeling, and studying the distributions of the random variables (the marginal distributions of the mechanical product's life) and the dependence structure of the variables separately, can reduce the difficulty of multivariate probabilistic modeling and analysis and make the modeling and analysis process clearer.
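    A sketch of the serial-system case: the joint survival of two dependent lifetimes is obtained by applying a copula (here Gumbel, treated as the survival copula of the two lifetimes) to the marginal reliabilities; the marginals and theta are illustrative:

      import numpy as np

      def gumbel(u, v, theta):              # Gumbel copula, theta >= 1
          return np.exp(-(((-np.log(u))**theta
                           + (-np.log(v))**theta)**(1.0 / theta)))

      def weibull_R(t, beta, eta):
          return np.exp(-(t / eta)**beta)

      t = 5_000.0
      R1, R2 = weibull_R(t, 1.5, 20_000.0), weibull_R(t, 2.2, 15_000.0)
      theta = 2.0                           # theta = 1 recovers independence
      print(gumbel(R1, R2, theta), R1 * R2) # dependent vs independent series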

  3. 3D Modeling of Lower Extremities With Biplanar Radiographs: Reliability of Measures on Subsequent Examinations.

    Science.gov (United States)

    Westberry, David E; Carpenter, Ashley M

    2017-08-02

    Biplanar radiography with 3-dimensional (3D) modeling (EOS) provides a comprehensive assessment of lower limb alignment in an upright weight-bearing position with less radiation than conventional radiography. A study was performed to assess the consistency and reliability of 2 lower extremity 3D biplanar radiograph models created at least 1 year apart in a pediatric population. All patients who had 2 lower extremity radiographic evaluations with EOS performed at visits a minimum of 1 year apart were reviewed. Digital radiographs, of lower extremities in both frontal and sagittal planes, were acquired simultaneously, using the EOS system. The 3D reconstruction of the images was achieved utilizing the SterEOS software. Pelvic position, femoral and tibial anatomy, and the torsional profile were evaluated and compared using t tests. In total, 53 patients with a mean age of 11.7 years (range, 6.1 to 18.9 y) met inclusion criteria. When comparing 3D models between visits, minimal differences were noted in proximal femoral anatomy and pelvic alignment (pelvic incidence, sacral slope, sagittal tilt, neck shaft angle). Expected differences in femoral and tibial length corresponded with normal longitudinal growth between visits. Sagittal plane knee position varied widely between examinations. Femoral and/or tibial rotational osteotomies were performed in 37% of extremities between examinations. After femoral derotational osteotomy, a significant difference in femoral anteversion was appreciated when comparing preoperative and postoperative 3D models. However, this difference was less than the expected difference based on the anatomic correction achieved intraoperatively. No differences were noted in tibial torsion measures after tibial derotational osteotomy. The 3D modeling based on biplanar radiographs provides consistent and reliable measures of pelvic and hip joint anatomy of the lower extremity. Patient positioning may influence the reproducibility of knee alignment

  4. Reliability Based Optimal Design of Vertical Breakwaters Modelled as a Series System Failure

    DEFF Research Database (Denmark)

    Christiani, E.; Burcharth, H. F.; Sørensen, John Dalsgaard

    1996-01-01

    Reliability based design of monolithic vertical breakwaters is considered. Probabilistic models of important failure modes such as sliding and rupture failure in the rubble mound and the subsoil are described. Characterisation of the relevant stochastic parameters is presented, and relevant design...... variables are identified and an optimal system reliability formulation is presented. An illustrative example is given....

  5. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability in order to reach the best-fit distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the Temperature Alarm Circuit (TAC) data set, whereas the Exponential distribution is found to be the best fit for modeling the failure rate.

  6. Development of an Environment for Software Reliability Model Selection

    Science.gov (United States)

    1992-09-01

    now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling... Hardware can be repaired by spare modules, which is not the case for software... Preventive maintenance is very important

  7. Powering stochastic reliability models by discrete event simulation

    DEFF Research Database (Denmark)

    Kozine, Igor; Wang, Xiaoyun

    2012-01-01

    it difficult to find a solution to the problem. The power of modern computers and recent developments in discrete-event simulation (DES) software enable to diminish some of the drawbacks of stochastic models. In this paper we describe the insights we have gained based on using both Markov and DES models...

  8. Cognitive modelling: a basic complement of human reliability analysis

    International Nuclear Information System (INIS)

    Bersini, U.; Cacciabue, P.C.; Mancini, G.

    1988-01-01

    In this paper the issues identified in modelling humans and machines are discussed from the perspective of considering human errors in managing complex plants during incidental as well as normal conditions. The dichotomy between the use of a cognitive versus a behaviouristic model approach is discussed, and the complementary aspects rather than the differences of the two methods are identified. A cognitive model based on a hierarchical goal-oriented approach and driven by fuzzy logic methodology is presented as the counterpart to the 'classical' THERP methodology for studying human errors. This cognitive model is discussed at length and its fundamental components, i.e. the High Level Decision Making and the Low Level Decision Making models, are reviewed. Finally, the inadequacy of the 'classical' THERP methodology to deal with cognitive errors is discussed on the basis of a simple test case. For the same case the cognitive model is then applied, showing the flexibility and adequacy of the model in dynamic configurations with time-dependent failures of components and the consequent need for changing strategy during the transient itself. (author)

  9. A logical model provides insights into T cell receptor signaling.

    Directory of Open Access Journals (Sweden)

    Julio Saez-Rodriguez

    2007-08-01

    Full Text Available Cellular decisions are determined by complex molecular interaction networks. Large-scale signaling networks are currently being reconstructed, but the kinetic parameters and quantitative data that would allow for dynamic modeling are still scarce. Therefore, computational studies based upon the structure of these networks are of great interest. Here, a methodology relying on a logical formalism is applied to the functional analysis of the complex signaling network governing the activation of T cells via the T cell receptor, the CD4/CD8 co-receptors, and the accessory signaling receptor CD28. Our large-scale Boolean model, which comprises 94 nodes and 123 interactions and is based upon well-established qualitative knowledge from primary T cells, reveals important structural features (e.g., feedback loops and network-wide dependencies) and recapitulates the global behavior of this network for an array of published data on T cell activation in wild-type and knock-out conditions. More importantly, the model predicted unexpected signaling events after antibody-mediated perturbation of CD28 and after genetic knockout of the kinase Fyn that were subsequently experimentally validated. Finally, we show that the logical model reveals key elements and potential failure modes in network functioning and provides candidates for missing links. In summary, our large-scale logical model for T cell activation proved to be a promising in silico tool, and it inspires immunologists to ask new questions. We think that it holds valuable potential in foreseeing the effects of drugs and network modifications.
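    The logical formalism referred to here amounts to Boolean update rules over the network's nodes; a toy three-node chain (hypothetical rules, not the 94-node T cell model) illustrates the mechanics of a synchronous update:

      # Toy synchronous Boolean network; node names and rules are illustrative.
      rules = {
          "TCR": lambda s: s["TCR"],        # receptor input, held fixed
          "Fyn": lambda s: s["TCR"],        # activated downstream of the receptor
          "ERK": lambda s: s["Fyn"],        # activated downstream of Fyn
      }

      def step(state):
          return {node: rule(state) for node, rule in rules.items()}

      state = {"TCR": True, "Fyn": False, "ERK": False}
      for _ in range(3):
          state = step(state)
          print(state)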

  10. Reliability of Current Biokinetic and Dosimetric Models for Radionuclides: A Pilot Study

    Energy Technology Data Exchange (ETDEWEB)

    Leggett, Richard Wayne [ORNL; Eckerman, Keith F [ORNL; Meck, Robert A. [U.S. Nuclear Regulatory Commission

    2008-10-01

    This report describes the results of a pilot study of the reliability of the biokinetic and dosimetric models currently used by the U.S. Nuclear Regulatory Commission (NRC) as predictors of dose per unit internal or external exposure to radionuclides. The study examines the feasibility of critically evaluating the accuracy of these models for a comprehensive set of radionuclides of concern to the NRC. Each critical evaluation would include: identification of discrepancies between the models and current databases; characterization of uncertainties in model predictions of dose per unit intake or unit external exposure; characterization of variability in dose per unit intake or unit external exposure; and evaluation of prospects for development of more accurate models. Uncertainty refers here to the level of knowledge of a central value for a population, and variability refers to quantitative differences between different members of a population. This pilot study provides a critical assessment of models for selected radionuclides representing different levels of knowledge of dose per unit exposure. The main conclusions of this study are as follows: (1) To optimize the use of available NRC resources, the full study should focus on radionuclides most frequently encountered in the workplace or environment. A list of 50 radionuclides is proposed. (2) The reliability of a dose coefficient for inhalation or ingestion of a radionuclide (i.e., an estimate of dose per unit intake) may depend strongly on the specific application. Multiple characterizations of the uncertainty in a dose coefficient for inhalation or ingestion of a radionuclide may be needed for different forms of the radionuclide and different levels of information of that form available to the dose analyst. (3) A meaningful characterization of variability in dose per unit intake of a radionuclide requires detailed information on the biokinetics of the radionuclide and hence is not feasible for many infrequently

  11. National Water Model: Providing the Nation with Actionable Water Intelligence

    Science.gov (United States)

    Aggett, G. R.; Bates, B.

    2017-12-01

    The National Water Model (NWM) provides national, street-level detail of water movement through time and space. Operating hourly, this flood of information offers enormous benefits in the form of water resource management, natural disaster preparedness, and the protection of life and property. The Geo-Intelligence Division at the NOAA National Water Center supplies forecasters and decision-makers with timely, actionable water intelligence through the processing of billions of NWM data points every hour. These datasets include current streamflow estimates, short and medium range streamflow forecasts, and many other ancillary datasets. The sheer amount of NWM data produced yields a dataset too large to allow for direct human comprehension. As such, it is necessary to undergo model data post-processing, filtering, and data ingestion by visualization web apps that make use of cartographic techniques to bring attention to the areas of highest urgency. This poster illustrates NWM output post-processing and cartographic visualization techniques being developed and employed by the Geo-Intelligence Division at the NOAA National Water Center to provide national actionable water intelligence.

  12. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    Science.gov (United States)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. Till now, only the two-parameter and three-parameter Weibull distribution models have been widely used. Due to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a variety of engine failure modes can be taken into account with a mixed Weibull distribution model, making it a good statistical analysis model. In addition to the concept of the dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimation more accurate, thus greatly improving the precision of the mixed distribution reliability model. All of this is advantageous to popularizing the Weibull distribution model in engineering applications.
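    A sketch of the mixed-Weibull reliability function itself, with illustrative weights covering an early-failure and a wear-out subpopulation:

      import numpy as np

      def mixed_weibull_R(t, components):
          # components: iterable of (weight w, shape beta, scale eta), sum w = 1
          return sum(w * np.exp(-(t / eta)**beta) for w, beta, eta in components)

      components = [(0.3, 0.8, 1_500.0),    # early-failure subpopulation
                    (0.7, 3.0, 9_000.0)]    # wear-out subpopulation
      for t in (100.0, 1_000.0, 5_000.0):
          print(t, mixed_weibull_R(t, components))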

  13. Modeling and reliability analysis of three phase z-source AC-AC converter

    Directory of Open Access Journals (Sweden)

    Prasad Hanuman

    2017-12-01

    Full Text Available This paper presents small-signal modeling using the state space averaging technique and a reliability analysis of a three-phase z-source ac-ac converter. By controlling the shoot-through duty ratio, the converter can operate in buck-boost mode and maintain the desired output voltage during voltage sag and surge conditions. It has a faster dynamic response and higher efficiency than the traditional voltage regulator. Small-signal analysis derives the different control transfer functions, which leads to the design of a suitable controller for a closed-loop system under supply voltage variation. The closed-loop system of the converter with a PID controller eliminates the transients in output voltage and provides a steady-state regulated output. The proposed model was designed in RT-LAB and executed in a field-programmable gate array (FPGA)-based real-time digital simulator at a fixed time step of 10 μs and a constant switching frequency of 10 kHz. The simulator was developed using very high speed integrated circuit hardware description language (VHDL), making it versatile and portable. Hardware-in-the-loop (HIL) simulation results are presented to corroborate the MATLAB simulation results under supply voltage variation of the three-phase z-source ac-ac converter. The reliability analysis is applied to the converter to find the failure rates of its different components.

  14. Observation Likelihood Model Design and Failure Recovery Scheme Toward Reliable Localization of Mobile Robots

    Directory of Open Access Journals (Sweden)

    Chang-bae Moon

    2010-12-01

    Full Text Available Although there have been many researches on mobile robot localization, it is still difficult to obtain reliable localization performance in a human co-existing real environment. Reliability of localization is highly dependent upon the developer's experience, because uncertainty is caused by a variety of reasons. We have developed a range-sensor-based integrated localization scheme for various indoor service robots. Through this experience, we found that there are several significant experimental issues. In this paper, we provide useful solutions for the following questions, which are frequently faced in practical applications: 1) How to design an observation likelihood model? 2) How to detect localization failure? 3) How to recover from localization failure? We present design guidelines for the observation likelihood model. Localization failure detection and recovery schemes are presented, focusing on abrupt wheel slippage. Experiments were carried out in a typical office building environment. The proposed scheme to identify the localizer status is useful in practical environments. Moreover, the semi-global localization is a computationally efficient recovery scheme from localization failure. The results of experiments and analysis clearly present the usefulness of the proposed solutions.
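    A common design for such a range-sensor observation likelihood mixes a Gaussian 'hit' term with uniform random and max-range terms; the sketch below uses illustrative mixture weights, not the paper's tuned values:

      import numpy as np

      def beam_likelihood(z, z_exp, z_max=10.0,
                          w_hit=0.8, w_rand=0.15, w_max=0.05, sigma=0.08):
          p_hit = (np.exp(-0.5 * ((z - z_exp) / sigma)**2)
                   / (sigma * np.sqrt(2.0 * np.pi)))   # Gaussian around expected
          p_rand = 1.0 / z_max                         # unexplained measurements
          p_max = 1.0 if z >= z_max else 0.0           # max-range readings
          return w_hit * p_hit + w_rand * p_rand + w_max * p_max

      print(beam_likelihood(2.05, 2.0))                # near the expected range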

  16. Reliability modeling of digital component in plant protection system with various fault-tolerant techniques

    International Nuclear Information System (INIS)

    Kim, Bo Gyung; Kang, Hyun Gook; Kim, Hee Eun; Lee, Seung Jun; Seong, Poong Hyun

    2013-01-01

    Highlights: • Integrated fault coverage is introduced to reflect the characteristics of fault-tolerant techniques in the reliability model of the digital protection system in NPPs. • The integrated fault coverage considers the process of a fault-tolerant technique from fault detection through fail-safe generation. • With integrated fault coverage, the unavailability of a repairable component of the DPS can be estimated. • The newly developed reliability model reveals the effects of fault-tolerant techniques explicitly for risk analysis. • The reliability model makes it possible to track changes in unavailability as diverse factors vary. - Abstract: With the improvement of digital technologies, the digital protection system (DPS) incorporates multiple sophisticated fault-tolerant techniques (FTTs) to increase fault detection and to help the system safely perform its required functions in spite of the possible presence of faults. Fault detection coverage is a vital reliability attribute of an FTT. However, fault detection coverage alone is insufficient to reflect the effects of various FTTs in a reliability model. To reflect the characteristics of FTTs in the reliability model, integrated fault coverage is introduced. The integrated fault coverage considers the process of an FTT from fault detection through fail-safe generation. A model has been developed to estimate the unavailability of a repairable component of the DPS using the integrated fault coverage. The newly developed model can quantify unavailability under a diversity of conditions. Sensitivity studies are performed to ascertain the important variables that affect the integrated fault coverage and unavailability
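
    As a rough, textbook-style sketch of how a coverage factor enters a repairable-component unavailability estimate (a simplified constant-rate approximation, not the paper's integrated-coverage model): faults detected by the fault-tolerant technique are repaired with mean time to repair MTTR, while undetected faults remain latent until a periodic proof test.

```python
def unavailability(lam, coverage, mttr, proof_test_interval):
    """Approximate steady-state unavailability of a repairable component.

    lam                 : assumed constant failure rate (1/h)
    coverage            : fraction of faults caught by the fault-tolerant
                          technique (0..1)
    mttr                : mean time to repair detected faults (h)
    proof_test_interval : interval revealing undetected faults (h)
    """
    detected = coverage * lam * mttr                          # repaired quickly
    undetected = (1.0 - coverage) * lam * proof_test_interval / 2.0
    return detected + undetected

# Hypothetical numbers: lam = 1e-5 /h, 99% coverage, 8 h repair, annual test.
print(f"U = {unavailability(1e-5, 0.99, 8.0, 8760.0):.2e}")
```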

  17. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion of failure rate in software reliability growth models.

  18. Reliability Modeling Development and Its Applications for Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    Science.gov (United States)

    Liu, Donhang

    2014-01-01

    This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.

  19. Microstructural Modeling of Brittle Materials for Enhanced Performance and Reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Teague, Melissa Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rodgers, Theron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grutzik, Scott Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Meserole, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    Brittle failure is often influenced by variable microstructure-scale stresses that are difficult to measure. Recent advances in photoluminescence spectroscopy (PLS), including improved confocal laser measurement and rapid spectroscopic data collection, have established the potential to map stresses with microscale spatial resolution (<2 microns). Advanced PLS was successfully used to investigate both residual and externally applied stresses in polycrystalline alumina at the microstructure scale. The measured average stresses matched those estimated from beam theory to within one standard deviation, validating the technique. Modeling the residual stresses within the microstructure produced general agreement with the experimentally measured results. Microstructure-scale modeling is primed to take advantage of advanced PLS for its refinement and validation, eventually enabling microstructure modeling to become a predictive tool for brittle materials.

  20. Governance, Government, and the Search for New Provider Models

    Directory of Open Access Journals (Sweden)

    Richard B. Saltman

    2016-01-01

    Full Text Available A central problem in designing effective models of provider governance in health systems has been to ensure an appropriate balance between the concerns of public sector and/or government decision-makers, on the one hand, and of non-governmental health services actors in civil society and private life, on the other. In tax-funded European health systems up to the 1980s, the state and other public sector decision-makers played a dominant role in health service provision, typically operating hospitals through national or regional governments on a command-and-control basis. In a number of countries, however, this state role has started to change, with governments first stepping out of direct service provision and now de facto pushed to focus more on steering provider organizations than on direct public management. In this new approach to provider governance, the state has pulled back into a regulatory role that introduces market-like incentives and management structures, which then apply to both public and private sector providers alike. This article examines some of the main operational complexities in implementing this new governance strategy, specifically from a service provision (as opposed to a mostly financing or even regulatory) perspective. After briefly reviewing some of the key theoretical dilemmas, the paper presents two case studies where this new approach was put into practice: primary care in Sweden and hospitals in Spain. The article concludes that good governance today needs to reflect practical operational realities if it is to have the desired effect on health sector reform outcomes.

  1. Parametric and semiparametric models with applications to reliability, survival analysis, and quality of life

    CERN Document Server

    Nikulin, M; Mesbah, M; Limnios, N

    2004-01-01

    Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.

  2. Modeling reliability measurement of interface on information system: Towards the forensic of rules

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan

    2018-02-01

    Today almost all machines depend on software. A combined software and hardware system also depends on rules, that is, the procedures for its use. If a procedure or program can be reliably characterized by involving the concepts of graphs, logic, and probability, then regulatory strength can also be measured accordingly. Therefore, this paper initiates an enumeration model to measure the reliability of interfaces, based on the case of information systems governed by the rules of use set by the relevant agencies. The enumeration model is derived from software reliability calculation.

  3. Reliability Evaluation for the Surface to Air Missile Weapon Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Deng Jianjun

    2015-01-01

    Full Text Available Fuzziness and randomness are integrated by using digital characteristics such as expected value, entropy, and hyper-entropy. A cloud model adapted to reliability evaluation is put forward for the surface-to-air missile weapon. The cloud scale for qualitative evaluation is constructed, and the quantitative and qualitative variables in the system reliability evaluation are placed in correspondence. Practical calculation results show that this approach analyzes the reliability of the surface-to-air missile weapon more effectively, and that the model expressed by cloud theory is more consistent with the uncertainty of human thinking.
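
    For context, the standard forward normal cloud generator underlying such cloud models turns the three digital characteristics (Ex, En, He) into quantitative "cloud drops" with membership degrees; a minimal sketch with hypothetical characteristic values follows.

```python
import numpy as np

def forward_normal_cloud(ex, en, he, n_drops=1000, rng=None):
    """Generate cloud drops (x, membership) from Ex, En, He.

    ex : expected value, en : entropy, he : hyper-entropy.
    """
    rng = rng or np.random.default_rng(0)
    en_prime = rng.normal(en, he, n_drops)               # randomized entropy
    x = rng.normal(ex, np.abs(en_prime))                 # drop positions
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))  # membership degrees
    return x, mu

# Hypothetical reliability scale: Ex = 0.85, En = 0.05, He = 0.01.
x, mu = forward_normal_cloud(0.85, 0.05, 0.01)
print(x[:3], mu[:3])
```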

  4. Graph-theoretical models to calculate the reliability and availability of power plants

    International Nuclear Information System (INIS)

    Vetterkind, D.

    1977-01-01

    The paper has three main subjects, namely: (1) the establishment of reliability models; (2) the modification and use of stochastic networks for the mathematical description of multistage and/or non-Markov processes and physical reliability models; and (3) reliability analysis in two examples of application. Some important characteristics of power plants are quantified with the aid of short surveys on the availability of coal-fired and nuclear power plant units and with the help of exemplary investigations of the time curves of steam generator availability, maintenance success, and the distributions of operating and outage periods of steam generator components. (orig./HP) [de

  5. Electromagnetic Model Reliably Predicts Radar Scattering Characteristics of Airborne Organisms

    Science.gov (United States)

    Mirkovic, Djordje; Stepanian, Phillip M.; Kelly, Jeffrey F.; Chilson, Phillip B.

    2016-10-01

    The radar scattering characteristics of aerial animals are typically obtained from controlled laboratory measurements of a freshly harvested specimen. These measurements are tedious to perform, difficult to replicate, and typically yield only a small subset of the full azimuthal, elevational, and polarimetric radio scattering data. As an alternative, biological applications of radar often assume that the radar cross sections of flying animals are isotropic, since sophisticated computer models are required to estimate the 3D scattering properties of objects having complex shapes. Using the method of moments implemented in the WIPL-D software package, we show for the first time that such electromagnetic modeling techniques (typically applied to man-made objects) can accurately predict organismal radio scattering characteristics from an anatomical model: here the Brazilian free-tailed bat (Tadarida brasiliensis). The simulated scattering properties of the bat agree with controlled measurements and radar observations made during a field study of bats in flight. This numerical technique can produce the full angular set of quantitative polarimetric scattering characteristics, while eliminating many practical difficulties associated with physical measurements. Such a modeling framework can be applied for bird, bat, and insect species, and will help drive a shift in radar biology from a largely qualitative and phenomenological science toward quantitative estimation of animal densities and taxonomic identification.

  6. Appraisal and Reliability of Variable Engagement Model Prediction ...

    African Journals Online (AJOL)

    The variable engagement model, which is based on the stress-crack opening displacement relationship and describes the behaviour of randomly oriented steel fibre composites subjected to uniaxial tension, has been evaluated so as to determine the safety indices associated when the fibres are subjected to pullout and with ...

  7. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subjected to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified uncertainties can be implemented in probabilistic reliability assessments.

  8. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Considered are four different wave models, and validation data are collected from published scientific research. The bias and the root-mean-square error as well as the scatter index are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example it is shown how the estimated uncertainties can be implemented in probabilistic reliability assessments.

  9. Proposition of a multicriteria model to select logistics services providers

    Directory of Open Access Journals (Sweden)

    Miriam Catarina Soares Aharonovitz

    2014-06-01

    Full Text Available This study aims to propose a multicriteria model for selecting logistics service providers through the development of a decision tree. The methodology consists of a survey, which yielded a sample of 181 responses. The sample was analyzed using statistical methods, among them descriptive statistics, multivariate analysis, analysis of variance, and parametric tests for comparing means. Based on these results, it was possible to obtain the decision tree and information to support the multicriteria analysis. The AHP (Analytic Hierarchy Process) was applied to determine the influence of the data and thus ensure better consistency in the analysis. The decision tree categorizes the criteria according to decision level (strategic, tactical, and operational). Furthermore, it allows a generic evaluation of the importance of each criterion in the supplier selection process from the point of view of contractors of logistics services.
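
    To make the AHP step concrete, the sketch below computes criterion weights from a pairwise comparison matrix via the principal eigenvector and checks consistency with Saaty's consistency ratio; the 3x3 judgment matrix is hypothetical, not taken from the survey.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio for an AHP judgment matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)                 # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                             # normalized priority weights
    n = pairwise.shape[0]
    ci = (vals.real[k] - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
    return w, ci / ri                        # weights, consistency ratio

# Hypothetical judgments: cost vs. delivery reliability vs. service quality.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3.0, 1.0, 2.0],
              [1 / 5.0, 1 / 2.0, 1.0]])
w, cr = ahp_weights(A)
print(w.round(3), f"CR = {cr:.3f}")  # CR < 0.1 indicates acceptable consistency
```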

  10. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data are extracted from the production trait evaluation of Nordic Red dairy cattle. Genotyped bulls with daughters are used as training animals, and genotyped bulls and producing cows as candidate animals. For simplicity, the size of the data is chosen so that the full inverses of the mixed model equation coefficient matrices can be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was ...

  11. Reliability of linear measurements on a virtual bilateral cleft lip and palate model

    NARCIS (Netherlands)

    Oosterkamp, B.C.M.; van der Meer, W.J.; Rutenfrans, M.; Dijkstra, P.U.

    Objective: To assess the reliability and validity of measurements performed on three-dimensional virtual models of neonatal bilateral cleft lip and palate patients, compared with measurements performed on plaster cast models. Materials and Methods: Ten high-quality plaster cast models of bilateral

  12. Reliable modeling of the electronic spectra of realistic uranium complexes

    Science.gov (United States)

    Tecmer, Paweł; Govind, Niranjan; Kowalski, Karol; de Jong, Wibe A.; Visscher, Lucas

    2013-07-01

    We present an EOMCCSD (equation of motion coupled cluster with singles and doubles) study of excited states of the small [UO2]2+ and [UO2]+ model systems as well as the larger UVIO2(saldien) complex. In addition, the triples contribution within the EOMCCSDT and CR-EOMCCSD(T) (completely renormalized EOMCCSD with non-iterative triples) approaches for the [UO2]2+ and [UO2]+ systems as well as the active-space variant of the CR-EOMCCSD(T) method—CR-EOMCCSd(t)—for the UVIO2(saldien) molecule are investigated. The coupled cluster data were employed as benchmark to choose the "best" appropriate exchange-correlation functional for subsequent time-dependent density functional (TD-DFT) studies on the transition energies for closed-shell species. Furthermore, the influence of the saldien ligands on the electronic structure and excitation energies of the [UO2]+ molecule is discussed. The electronic excitations as well as their oscillator dipole strengths modeled with TD-DFT approach using the CAM-B3LYP exchange-correlation functional for the [UVO2(saldien)]- with explicit inclusion of two dimethyl sulfoxide molecules are in good agreement with the experimental data of Takao et al. [Inorg. Chem. 49, 2349 (2010), 10.1021/ic902225f].

  13. Assessing the reliability of predictive activity coefficient models for molecules consisting of several functional groups

    Directory of Open Access Journals (Sweden)

    R. P. Gerber

    2013-03-01

    Full Text Available Currently, the most successful predictive models for activity coefficients are those based on functional groups, such as UNIFAC. However, these models require a large amount of experimental data for the determination of their parameter matrix. A more recent alternative is the class of models based on COSMO, for which only a small set of universal parameters must be calibrated. In this work, a recalibrated COSMO-SAC model was compared with the UNIFAC (Do) model employing experimental infinite dilution activity coefficient data for 2236 non-hydrogen-bonding binary mixtures at different temperatures. As expected, UNIFAC (Do) presented better overall performance, with a mean absolute error of 0.12 ln-units against 0.22 for our COSMO-SAC implementation. However, in cases involving molecules with several functional groups or when functional groups appear in an unusual way, the deviation for UNIFAC was 0.44 as opposed to 0.20 for COSMO-SAC. These results show that COSMO-SAC provides more reliable predictions for multi-functional or more complex molecules, reaffirming its future prospects.

  14. Modeling Manufacturing Impacts on Aging and Reliability of Polyurethane Foams

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R.; Roberts, Christine Cardinal; Mondy, Lisa Ann; Soehnel, Melissa Marie; Johnson, Kyle; Lorenzo, Henry T.

    2016-10-01

    Polyurethane is a complex multiphase material that evolves from a viscous liquid to a system of percolating bubbles, which are created via a CO2 generating reaction. The continuous phase polymerizes to a solid during the foaming process generating heat. Foams introduced into a mold increase their volume up to tenfold, and the dynamics of the expansion process may lead to voids and will produce gradients in density and degree of polymerization. These inhomogeneities can lead to structural stability issues upon aging. For instance, structural components in weapon systems have been shown to change shape as they age depending on their molding history, which can threaten critical tolerances. The purpose of this project is to develop a Cradle-to-Grave multiphysics model, which allows us to predict the material properties of foam from its birth through aging in the stockpile, where its dimensional stability is important.

  15. Liquefaction of Tangier soils by using physically based reliability analysis modelling

    Directory of Open Access Journals (Sweden)

    Dubujet P.

    2012-07-01

    Full Text Available Approaches widely used to characterize the propensity of soils to liquefaction are mainly empirical. The potential for liquefaction is assessed by using correlation formulas based on field tests such as the standard and cone penetration tests. These correlations, however, depend on the site for which they were derived. In order to adapt them to other sites, where seismic case histories are not available, further investigation is required. In this work, a rigorous one-dimensional model of the soil dynamics leading to the liquefaction phenomenon is considered. Field tests consisting of core sampling and cone penetration testing were performed. They provided the necessary data for numerical simulations performed using the DeepSoil software package. Using reliability analysis, the probability of liquefaction was estimated, and the obtained results were used to adapt the Juang method to the particular case of sandy soils located in Tangier.

  16. Microgrid Reliability Modeling and Battery Scheduling Using Stochastic Linear Programming

    Energy Technology Data Exchange (ETDEWEB)

    Cardoso, Goncalo; Stadler, Michael; Siddiqui, Afzal; Marnay, Chris; DeForest, Nicholas; Barbosa-Povoa, Ana; Ferrao, Paulo

    2013-05-23

    This paper describes the introduction of stochastic linear programming into Operations DER-CAM, a tool used to obtain optimal operating schedules for a given microgrid under local economic and environmental conditions. This application follows previous work on the optimal scheduling of a lithium-iron-phosphate battery given the output uncertainty of a 1 MW molten carbonate fuel cell. Both are in the Santa Rita Jail microgrid, located in Dublin, California. This fuel cell has proven unreliable, partially justifying the consideration of storage options. Several stochastic DER-CAM runs are executed to compare different scenarios with values obtained by a deterministic approach. Results indicate that using a stochastic approach provides a conservative yet more lucrative battery schedule, with lower expected energy bills given fuel cell outages and potential savings exceeding 6 percent.
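
    The flavor of such a two-stage stochastic program can be sketched in a few lines: a first-stage battery discharge schedule is chosen before price/outage uncertainty resolves, and scenario-dependent grid purchases cover the remainder. The toy data (loads, scenario prices, battery limits) are hypothetical and unrelated to DER-CAM's actual formulation.

```python
import numpy as np
from scipy.optimize import linprog

T, S = 4, 3                                   # periods, price scenarios
load = np.array([40.0, 60.0, 80.0, 50.0])     # kW, hypothetical
price = np.array([[0.10, 0.12, 0.30, 0.15],   # $/kWh per scenario
                  [0.10, 0.20, 0.22, 0.12],
                  [0.08, 0.15, 0.40, 0.18]])
prob = np.array([0.3, 0.4, 0.3])              # scenario probabilities
E_cap, P_max = 100.0, 50.0                    # battery energy/power limits

# Decision vector: [x_t (first-stage discharge), g_{s,t} (grid purchases)].
n = T + S * T
c = np.zeros(n)
for s in range(S):
    c[T + s * T: T + (s + 1) * T] = prob[s] * price[s]  # expected energy cost

A_ub, b_ub = [], []
for s in range(S):                            # demand: x_t + g_{s,t} >= load_t
    for t in range(T):
        row = np.zeros(n)
        row[t] = -1.0
        row[T + s * T + t] = -1.0
        A_ub.append(row)
        b_ub.append(-load[t])
row = np.zeros(n)
row[:T] = 1.0                                 # battery energy budget
A_ub.append(row)
b_ub.append(E_cap)

bounds = [(0.0, P_max)] * T + [(0.0, None)] * (S * T)
res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds)
print("discharge schedule:", res.x[:T].round(1),
      "expected cost: $%.2f" % res.fun)
```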

  17. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the original ...

  18. Herbarium records are reliable sources of phenological change driven by climate and provide novel insights into species' phenological cueing mechanisms.

    Science.gov (United States)

    Davis, Charles C; Willis, Charles G; Connolly, Bryan; Kelly, Courtland; Ellison, Aaron M

    2015-10-01

    Climate change has resulted in major changes in the phenology of some species but not others. Long-term field observational records provide the best assessment of these changes, but geographic and taxonomic biases limit their utility. Plant specimens in herbaria have been hypothesized to provide a wealth of additional data for studying phenological responses to climatic change. However, no study to our knowledge has comprehensively addressed whether herbarium data are accurate measures of phenological response and thus applicable to addressing such questions. We compared flowering phenology determined from field observations (years 1852-1858, 1875, 1878-1908, 2003-2006, 2011-2013) and herbarium records (1852-2013) of 20 species from New England, United States. Earliest flowering date estimated from herbarium records faithfully reflected field observations of first flowering date and substantially increased the sampling range across climatic conditions. Additionally, although most species demonstrated a response to interannual temperature variation, long-term temporal changes in phenological response were not detectable. Our findings support the use of herbarium records for understanding plant phenological responses to changes in temperature, and also importantly establish a new use of herbarium collections: inferring primary phenological cueing mechanisms of individual species (e.g., temperature, winter chilling, photoperiod). These latter data are lacking from most investigations of phenological change, but are vital for understanding differential responses of individual species to ongoing climate change. © 2015 Botanical Society of America.

  19. Intraclass reliability for assessing how well Taiwan constrained hospital-provided medical services using statistical process control chart techniques

    Directory of Open Access Journals (Sweden)

    Chien Tsair-Wei

    2012-05-01

    Full Text Available Abstract Background Few studies discuss the indicators used to assess the effect of cost containment in healthcare across hospitals in a single-payer national healthcare system with constrained medical resources. We present the intraclass correlation coefficient (ICC) to assess how well Taiwan constrained hospital-provided medical services in such a system. Methods A custom Excel-VBA routine recording the distances, in standard deviations (SDs), from the central line (the mean over the previous 12 months) of a control chart was used to construct and scale annual medical expenditures sequentially from 2000 to 2009 for 421 hospitals in Taiwan and to generate the ICC. The ICC was then used to evaluate Taiwan's year-based convergent power to keep hospital-provided medical services constrained. A bubble chart of SDs for a specific month was generated to present the effects of using control charts in a national healthcare system. Results ICCs were generated for Taiwan's year-based convergent power to constrain its medical services from 2000 to 2009. All hospital groups showed a gradually well-controlled supply of services, decreasing from 0.772 to 0.415. The bubble chart identified outlier hospitals that required investigation of possibly excessive reimbursements in a specific time period. Conclusion We recommend using the ICC to annually assess a nation's year-based convergent power to constrain medical services across hospitals. Using sequential control charts to regularly monitor hospital reimbursements is required to achieve financial control in a single-payer nationwide healthcare system.

  20. Maintenance personnel performance simulation (MAPPS): a model for predicting maintenance performance reliability in nuclear power plants

    International Nuclear Information System (INIS)

    Knee, H.E.; Krois, P.A.; Haas, P.M.; Siegel, A.I.; Ryan, T.G.

    1983-01-01

    The NRC has developed a structured, quantitative, predictive methodology in the form of a computerized simulation model for assessing maintainer task performance. The objective of the overall program is to develop, validate, and disseminate a practical, useful, and acceptable methodology for the quantitative assessment of NPP maintenance personnel reliability. The program was organized into four phases: (1) scoping study, (2) model development, (3) model evaluation, and (4) model dissemination. The program is currently nearing completion of Phase 2, Model Development.

  1. A modelling approach to find stable and reliable soil organic carbon values for further regionalization.

    Science.gov (United States)

    Bönecke, Eric; Franko, Uwe

    2015-04-01

    Soil organic matter (SOM) and soil organic carbon (SOC) may be the most important components for describing the fertility of agriculturally used soils. SOC is sensitive to temporal and spatial changes due to varying weather conditions and uneven crop and soil management practices, and it remains difficult to obtain a reliable delineation of its spatial variability. Furthermore, soil organic carbon is an essential initial parameter for dynamic modelling, e.g. for understanding carbon and nitrogen processes. However, obtaining and using this information requires cost- and time-intensive field and laboratory work. The objective of this study is to assess an approach that reduces the effort of laboratory and field analyses by using a method to find stable initial soil organic carbon values for further soil process modelling and regionalization at the field scale. Strategies, techniques and tools to produce reliable high-resolution soil organic carbon maps while reducing cost constraints are hence still receiving increasing attention in scientific research. Although combining effective sampling schemes with geophysical sensing techniques is nowadays a widely used practice for describing the within-field variability of soil organic carbon, large uncertainties remain a challenge, even at the field scale, in both science and agriculture. An analytical and modelling approach might therefore facilitate and improve this strategy at small and large field scales. This study presents a method for finding reliable steady-state values of soil organic carbon at particular points using the proven soil process model CANDY (Franko et al. 1995). It focuses on an iterative algorithm of adjusting the key driving components: soil physical properties, meteorological data and management information, for which we quantified the inputs and losses of soil carbon (manure, crop residues, other organic inputs, decomposition, leaching). Furthermore, this approach can be combined with geophysical

  2. The model case IRS-RWE for the determination of reliability data in practical operation

    International Nuclear Information System (INIS)

    Hoemke, P.; Krause, H.

    1975-11-01

    Reliability and availability analyses are carried out to assess the safety of nuclear power plants. The first part of this paper deals with the accuracy requirements for the input data of such analyses, and the second part with the prototype collection of reliability data, the 'Model case IRS-RWE'. The objectives and the structure of the data collection are described. The present results show that the estimation of reliability data in power plants is possible and gives reasonable results. (orig.) [de

  3. Statistical Model Selection for Better Prediction and Discovering Science Mechanisms That Affect Reliability

    Directory of Open Access Journals (Sweden)

    Christine M. Anderson-Cook

    2015-08-01

    Full Text Available Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves first identifying potential candidate inputs, then second organizing data for the analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. Finally, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.

  4. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  5. Study on quantitative reliability analysis by multilevel flow models for nuclear power plants

    International Nuclear Information System (INIS)

    Yang Ming; Zhang Zhijian

    2011-01-01

    Multilevel Flow Models (MFM) is a goal-oriented system modeling method. MFM explicitly describes how a system performs the required functions under stated conditions for a stated period of time. This paper presents a novel system reliability analysis method based on MFM (MRA). The proposed method allows the system knowledge to be described at different levels of abstraction, which makes the reliability model easy to understand, establish, modify and extend. The success probabilities of all main goals and sub-goals can be obtained from a single quantitative analysis. The proposed method is suitable for system analysis and scheme comparison for complex industrial systems such as nuclear power plants. (authors)

  6. Approximation Model Building for Reliability & Maintainability Characteristics of Reusable Launch Vehicles

    Science.gov (United States)

    Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.

    2000-01-01

    This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating, and for multidisciplinary design optimization.

  7. Simultaneous parameter and tolerance optimization of structures via probability-interval mixed reliability model

    DEFF Research Database (Denmark)

    Luo, Yangjun; Wu, Xiaoxiang; Zhou, Mingdong

    2015-01-01

    Based on a probability-interval mixed reliability model, the imprecision of design parameters is modeled as interval uncertainties fluctuating within allowable tolerance bounds. The optimization model is defined so as to minimize the total manufacturing cost under mixed reliability index constraints, which are further transformed into their equivalent formulations by using the performance measure approach. The optimization problem is then solved with sequential approximate programming. Meanwhile, a numerically stable algorithm based on the trust region method is proposed to efficiently update the target performance ...

  8. Stochastic models and reliability parameter estimation applicable to nuclear power plant safety

    International Nuclear Information System (INIS)

    Mitra, S.P.

    1979-01-01

    A set of stochastic models and related estimation schemes for reliability parameters are developed. The models are applicable for evaluating the reliability of nuclear power plant systems. Reliability information is extracted from model parameters, which are estimated from the type and nature of failure data that is generally available or could be compiled in nuclear power plants. Principally, two aspects of nuclear power plant reliability have been investigated: (1) the statistical treatment of in-plant component and system failure data; (2) the analysis and evaluation of common mode failures. The model inputs are failure data, classified as either time-type or demand-type failure data. Failures of components and systems in nuclear power plants are, in general, rare events. This gives rise to sparse failure data. Estimation schemes for treating sparse data, whenever necessary, have been considered. The following five problems have been studied: 1) distribution of sparse failure rate component data; 2) failure rate inference and reliability prediction from time-type failure data; 3) analyses of demand-type failure data; 4) a common mode failure model applicable to time-type failure data; and 5) estimation of common mode failures from 'near-miss' demand-type failure data
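
    A standard way to handle such sparse time-type failure data is a conjugate gamma-Poisson (Bayesian) update of the failure rate; the sketch below uses hypothetical prior parameters and observed counts, and is illustrative rather than the report's actual scheme.

```python
from scipy import stats

# Gamma prior on the failure rate lambda: shape a0, rate b0 (hypothetical,
# e.g. elicited from generic industry data).
a0, b0 = 0.5, 1.0e5          # prior mean a0/b0 = 5e-6 failures/hour

# Sparse plant-specific evidence: n failures in T cumulative operating hours.
n, T = 1, 2.0e5

a_post, b_post = a0 + n, b0 + T          # conjugate gamma-Poisson update
posterior = stats.gamma(a=a_post, scale=1.0 / b_post)

print(f"posterior mean = {posterior.mean():.2e} /h")
print(f"90% interval   = ({posterior.ppf(0.05):.2e}, "
      f"{posterior.ppf(0.95):.2e}) /h")
```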

  9. Do Lumped-Parameter Models Provide the Correct Geometrical Damping?

    DEFF Research Database (Denmark)

    Andersen, Lars

    This paper concerns the formulation of lumped-parameter models for rigid footings on homogeneous or stratified soil. Such models only contain a few degrees of freedom, which makes them ideal for inclusion in aero-elastic codes for wind turbines and other models applied to fast evaluation of structural response during excitation and the geometrical damping related to free vibrations of a hexagonal footing. The optimal order of a lumped-parameter model is determined for each degree of freedom, i.e. horizontal and vertical translation as well as torsion and rocking. In particular, the necessity of coupling ...

  10. Indirect test method does not provide reliable data. Rated thermal capacity of storage furnaces; Indirekte Pruefmethode fuehrt nicht zu sicheren Kennwerten. Nennwaermeleistung von Speicherfeuerstaetten

    Energy Technology Data Exchange (ETDEWEB)

    Sprung, J.; Koenig, N. [Fraunhofer-Institut fuer Bauphysik, Stuttgart (Germany)

    2002-11-01

    Decentralised furnaces fuelled with biogenic fuels are important in building construction today, as the heat demand of modern buildings has decreased. Efficient planning and operation requires reliable data on the achievable thermal capacity. The contribution shows that tests according to current specifications will not provide reliable data, especially for furnaces with heat storage capability. [German original] Decentralised furnaces fired with biogenic fuels are gaining importance due to the low heat demand in modern residential construction. To plan and operate this type of furnace effectively, reliable data on the achievable thermal output are required. In the following it is shown that the indirect test method for determining thermal output, hitherto applied to single furnaces fired with solid fuels and contained in the relevant standards, is not usable, in particular for furnaces with heat-storing properties. (orig.)

  11. Computer aided reliability, availability, and safety modeling for fault-tolerant computer systems with commentary on the HARP program

    Science.gov (United States)

    Shooman, Martin L.

    1991-01-01

    Many of the most challenging reliability problems of our present decade involve complex distributed systems such as interconnected telephone switching computers, air traffic control centers, aircraft and space vehicles, and local area and wide area computer networks. In addition to the challenge of complexity, modern fault-tolerant computer systems require very high levels of reliability, e.g., avionic computers with MTTF goals of one billion hours. Most analysts find that it is too difficult to model such complex systems without computer aided design programs. In response to this need, NASA has developed a suite of computer aided reliability modeling programs beginning with CARE 3 and including a group of new programs such as: HARP, HARP-PC, the Reliability Analyst's Workbench (a combination of the model solvers SURE, STEM, and PAWS with the common front-end model ASSIST), and the Fault Tree Compiler. This work studies the HARP program and investigates how well the user can model systems with it. One of the important objectives is to study how user-friendly this program is, e.g., how easy it is to model the system, provide the input information, and interpret the results. The experiences of the author and his graduate students who used HARP in two graduate courses are described. Some brief comparisons were made with the ARIES program, which the students also used. Theoretical studies of the modeling techniques used in HARP are also included. Of course, no answer can be any more accurate than the fidelity of the model; thus an Appendix is included which discusses modeling accuracy. A broad viewpoint is taken, and all problems which occurred in the use of HARP are discussed. Such problems include: computer system problems, installation manual problems, user manual problems, program inconsistencies, program limitations, confusing notation, long run times, accuracy problems, etc.

  12. An adaptive neuro fuzzy model for estimating the reliability of component-based software systems

    Directory of Open Access Journals (Sweden)

    Kirti Tyagi

    2014-01-01

    Full Text Available Although many algorithms and techniques have been developed for estimating the reliability of component-based software systems (CBSSs), much more research is needed. Accurate estimation of the reliability of a CBSS is difficult because it depends on two factors: component reliability and glue code reliability. Moreover, reliability is a real-world phenomenon with many associated real-time problems. Soft computing techniques can help to solve problems whose solutions are uncertain or unpredictable. A number of soft computing approaches for estimating CBSS reliability have been proposed. These techniques learn from the past and capture existing patterns in data. The two basic elements of soft computing are neural networks and fuzzy logic. In this paper, we propose a model for estimating CBSS reliability, known as an adaptive neuro-fuzzy inference system (ANFIS), that is based on these two basic elements of soft computing, and we compare its performance with that of a plain FIS (fuzzy inference system) for different data sets.

  13. Life cycle reliability assessment of new products—A Bayesian model updating approach

    International Nuclear Information System (INIS)

    Peng, Weiwen; Huang, Hong-Zhong; Li, Yanfeng; Zuo, Ming J.; Xie, Min

    2013-01-01

    The rapidly increasing pace and continuously evolving reliability requirements of new products have made life cycle reliability assessment of new products an imperative yet difficult task. While much work has been done to estimate the reliability of new products separately in specific stages, a gap exists in carrying out life cycle reliability assessment throughout all life cycle stages. We present a Bayesian model updating approach (BMUA) for life cycle reliability assessment of new products. Novel features of this approach are the development of Bayesian information toolkits by separately including a "reliability improvement factor" and an "information fusion factor", which allow the integration of subjective information in a specific life cycle stage and the transition of integrated information between adjacent life cycle stages. They lead to the unique characteristic of the BMUA that information generated throughout the life cycle stages is integrated coherently. To illustrate the approach, an application to the life cycle reliability assessment of a newly developed Gantry Machining Center is shown.
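
    In its simplest conjugate form, such stage-to-stage Bayesian updating can be sketched with a beta-binomial model: the posterior from one life cycle stage, optionally discounted to reflect design changes, becomes the prior of the next. The discount factor below is a hypothetical stand-in for the paper's "information fusion factor", not its actual formulation.

```python
from scipy import stats

def update_stage(a, b, successes, failures, fusion=1.0):
    """Beta-binomial update of reliability across one life cycle stage.

    (a, b)  : Beta prior parameters carried over from the previous stage
    fusion  : hypothetical discount in (0, 1] weakening old information
              before it enters the new stage (stands in for an
              "information fusion factor").
    """
    a, b = fusion * a, fusion * b          # discounted carry-over
    return a + successes, b + failures     # conjugate update

# Design stage -> prototype tests -> field data (all counts hypothetical).
a, b = 1.0, 1.0                            # vague initial prior
a, b = update_stage(a, b, successes=18, failures=2)               # prototype
a, b = update_stage(a, b, successes=95, failures=1, fusion=0.5)   # field
print(f"posterior mean reliability = {a / (a + b):.3f}")
print(f"90% credible interval = {stats.beta(a, b).interval(0.90)}")
```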

  14. Do Lumped-Parameter Models Provide the Correct Geometrical Damping?

    DEFF Research Database (Denmark)

    Andersen, Lars

    2007-01-01

    This paper concerns the formulation of lumped-parameter models for rigid footings on homogeneous or stratified soil, with focus on horizontal sliding and rocking. Such models only contain a few degrees of freedom, which makes them ideal for inclusion in aero-elastic codes for wind turbines. The lumped-parameter models are assessed with respect to the prediction of the maximum response during excitation and the geometrical damping related to free vibrations of a footing.

  15. Modeling Optimal Scheduling for Pumping System to Minimize Operation Cost and Enhance Operation Reliability

    Directory of Open Access Journals (Sweden)

    Yin Luo

    2012-01-01

    Full Text Available Traditional pump scheduling models neglect operational reliability, which directly relates to the unscheduled maintenance cost and the wear cost during operation. To address this, based on the assumption that vibration directly relates to operational reliability and the degree of wear, operational reliability can be expressed as the normalized vibration level. The behavior of vibration with the operating point was studied, and it can be concluded that the idealized flow-versus-vibration plot has a distinct bathtub shape. There is a narrow sweet spot (80 to 100 percent of BEP) where low vibration levels are obtained, and vibration also follows a law similar to the square of the rotation speed in the absence of resonance phenomena. Operational reliability can then be modeled as a function of the capacity and rotation speed of the pump, and this function is added to the traditional model to form the new one. Compared with the traditional method, the results show that the new model corrects the schedule produced by the traditional model and makes the pump operate at low vibration, so that operational reliability increases and maintenance cost decreases.

  16. Multinary systems and reliability models from coherence to some kind of non-coherence

    International Nuclear Information System (INIS)

    Mazars, N.

    1986-01-01

    First restricted to models for binary systems, reliability theory is being generalized to multinary systems of multinary components. After a general viewpoint on reliability models for multinary systems, generalizations of coherence are examined. First studied in terms of structure functions, binary coherent systems can be fully characterized in terms of their minimal path (cut) sets as well as in terms of their life functions. Various fundamental notions, such as minimal path (cut) sets and relevance, are first introduced in terms of structure functions. Binary decompositions are studied and used to characterize broad-sense coherence in terms of sets. Binary-type coherence, homogeneous coherence and the various types of strict-sense coherence are reviewed and fully characterized in various ways. Life functions lead to models useful for reliability calculations. Methods for determining, exactly or approximately, the reliability characteristics of multinary coherent systems are studied from both fundamental models of reliability, where possible. Furthermore, some kinds of non-coherent multinary systems are suggested. This analysis may be of interest in the nuclear field.

  17. [Reliability study in the measurement of the cusp inclination angle of a chairside digital model].

    Science.gov (United States)

    Xinggang, Liu; Xiaoxian, Chen

    2018-02-01

    This study aims to evaluate the reliability of the software Picpick in the measurement of the cusp inclination angle of a digital model. Twenty-one trimmed models were used as experimental objects. The chairside digital impression was then used for the acquisition of 3D digital models, and the software Picpick was employed for the measurement of the cusp inclination of these models. The measurements were repeated three times, and the results were compared with a gold standard, which was a manually measured experimental model cusp angle. The intraclass correlation coefficient (ICC) was calculated. The paired t test value of the two measurement methods was 0.91. The ICCs between the two measurement methods and three repeated measurements were greater than 0.9. The digital model achieved a smaller coefficient of variation (9.9%). The software Picpick is reliable in measuring the cusp inclination of a digital model.

  18. Model of the reliability analysis of the distributed computer systems with architecture "client-server"

    Science.gov (United States)

    Kovalev, I. V.; Zelenkov, P. V.; Karaseva, M. V.; Tsarev, M. Yu; Tsarev, R. Yu

    2015-01-01

    The paper considers the problem of reliability analysis of distributed computer systems with client-server architecture. A distributed computer system is a set of hardware and software implementing the following main functions: processing, storage, transmission and protection of data. This paper discusses the "client-server" architecture of distributed computer systems. The paper presents a scheme of distributed computer system functioning represented as a graph, where vertices are the functional states of the system and arcs are transitions from one state to another depending on the prevailing conditions. In the reliability analysis we consider reliability indicators such as the probability of the system transitioning into the stopping and accident states, as well as the intensities of these transitions. The proposed model allows us to obtain relations for the reliability parameters of the distributed computer system without any assumptions about the distribution laws of random variables or the number of elements in the system.
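
    As a minimal sketch of this kind of state-graph analysis, the snippet below evaluates a continuous-time Markov chain with hypothetical states and transition intensities (the paper itself deliberately avoids assuming distribution laws, so this is only an illustrative special case).

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = operating, 1 = degraded, 2 = stopped, 3 = accident.
# Q[i, j] is the transition intensity i -> j (per hour, hypothetical).
Q = np.array([[-1.1e-3, 1.0e-3, 1.0e-4, 0.0],
              [5.0e-2, -5.2e-2, 1.5e-3, 5.0e-4],
              [1.0e-1, 0.0, -1.0e-1, 0.0],
              [0.0, 0.0, 0.0, 0.0]])         # accident is absorbing

p0 = np.array([1.0, 0.0, 0.0, 0.0])          # start in the operating state
for t in (100.0, 1000.0, 10000.0):           # hours
    pt = p0 @ expm(Q * t)                    # transient state probabilities
    print(f"t={t:>7.0f} h  P(stopped)={pt[2]:.2e}  P(accident)={pt[3]:.2e}")
```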

  19. Reliability modelling of redundant safety systems without automatic diagnostics incorporating common cause failures and process demand.

    Science.gov (United States)

    Alizadeh, Siamak; Sriramula, Srinivas

    2017-11-01

    Redundant safety systems are commonly used in the process industry to respond to hazardous events. In redundant systems composed of identical units, Common Cause Failures (CCFs) can significantly influence system performance with regard to reliability and safety. However, their impact has often been overlooked due to the inherent complexity of modelling common cause induced failures. This article develops a reliability model for a redundant safety system using a Markov analysis approach. The proposed model incorporates process demands in conjunction with CCF for the first time and evaluates their impact on the reliability quantification of safety systems without automatic diagnostics. The reliability of the Markov model is quantified by considering the Probability of Failure on Demand (PFD) as a measure for low-demand systems. The safety performance of the model is analysed using the Hazardous Event Frequency (HEF) to evaluate the frequency of entering a hazardous state that will lead to an accident if the situation is not controlled. The utilisation of the Markov model is demonstrated for a simple case study of a pressure protection system, and it is shown that the proposed approach gives sufficiently accurate results for all demand rates, durations, component failure rates and corresponding repair rates for the low-demand mode of operation. The Markov model proposed in this paper assumes the absence of automatic diagnostics, along with a multiple-stage repair strategy for CCFs and restoration of the system from the hazardous state to an "as good as new" state. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
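
    For orientation, the standard IEC 61508-style analytic approximation for a 1oo2 (one-out-of-two) low-demand system with a beta-factor CCF model provides a quick cross-check of such Markov results; the rates below are hypothetical, and the formula is the textbook approximation rather than the article's model.

```python
def pfd_1oo2(lam_du, beta, proof_test_interval, mttr=0.0):
    """Approximate average PFD of a 1oo2 redundant safety system.

    lam_du : dangerous undetected failure rate per channel (1/h)
    beta   : fraction of failures that are common cause (beta-factor model)
    Standard low-demand approximation with no automatic diagnostics.
    """
    t_ce = proof_test_interval / 2.0 + mttr          # mean downtime per channel
    independent = ((1.0 - beta) * lam_du) ** 2 * (
        proof_test_interval ** 2) / 3.0              # both channels fail
    common_cause = beta * lam_du * t_ce              # CCF defeats redundancy
    return independent + common_cause

# Hypothetical: lam_DU = 2e-6 /h, beta = 5%, annual proof test, 8 h repair.
print(f"PFD_avg = {pfd_1oo2(2e-6, 0.05, 8760.0, 8.0):.2e}")
```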

  20. Bayesian Hierarchical Scale Mixtures of Log-Normal Models for Inference in Reliability with Stochastic Constraint

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2017-06-01

    Full Text Available This paper develops Bayesian inference in reliability for a class of scale mixtures of log-normal failure time (SMLNFT) models with stochastic (or uncertain) constraints on their reliability measures. The class is comprehensive and includes existing failure time (FT) models (such as log-normal, log-Cauchy, and log-logistic FT models) as well as new models that are robust in terms of heavy-tailed FT observations. Since classical frequentist approaches to reliability analysis based on the SMLNFT model with stochastic constraint are intractable, the Bayesian method is pursued utilizing a Markov chain Monte Carlo (MCMC) sampling-based approach. This paper introduces a two-stage maximum entropy (MaxEnt) prior, which elicits the a priori uncertain constraint, and develops a Bayesian hierarchical SMLNFT model using the prior. The paper also proposes an MCMC method for Bayesian inference of the SMLNFT model reliability and calls attention to properties of the MaxEnt prior that are useful for the method development. Finally, two data sets are used to illustrate how the proposed methodology works.

  1. Comparative analysis among deterministic and stochastic collision damage models for oil tanker and bulk carrier reliability

    Directory of Open Access Journals (Sweden)

    A. Campanile

    2018-01-01

    The influence of collision damage models on oil tanker and bulk carrier reliability is investigated, considering the IACS deterministic model against GOALDS/IMO database statistics for collision events, which substantiate the probabilistic model. Statistical properties of the hull girder residual strength are determined by Monte Carlo simulation, based on random generation of damage dimensions and a modified form of the incremental-iterative method, to account for neutral axis rotation and equilibrium of the horizontal bending moment due to cross-section asymmetry after collision events. Reliability analysis is performed to investigate the influence of the statistical properties of collision penetration depth and height on hull girder sagging/hogging failure probabilities. Besides, the influence of corrosion on hull girder residual strength and reliability is also discussed, focussing on gross, hull girder net and local net scantlings, respectively. The ISSC double hull oil tanker and single side bulk carrier, assumed as test cases in the ISSC 2012 report, are taken as reference ships.

  2. Reliability-Based Design of Wind Turbine Foundations – Computational Modelling

    DEFF Research Database (Denmark)

    Vahdatirad, Mohammad Javad

    … of fossil fuels causing pollution, environmental degradation, and climate change, and finally mixed messages regarding declining domestic and foreign oil reserves. Therefore, the wind power industry is becoming a key player as the green energy producer in many developed countries. However, consumers demand … a reduction in foundation cost, and optimizing foundation structural design is the best solution for cost effectiveness. An optimized wind turbine foundation design should provide a suitable target reliability level. Unfortunately, the reliability level is not identified in most current deterministic design … on the foundation reliability have already been characterized. Given that usually a conservative result has already been obtained through the current deterministic design methodologies, a reliability-based design can be suggested to quantify the uncertainties related to the design parameters …

  3. PCA as a practical indicator of OPLS-DA model reliability.

    Science.gov (United States)

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
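
    The noise-injection experiment described above is easy to reproduce in outline. The sketch below builds synthetic two-group data, adds increasing Gaussian noise, and tracks the PCA scores-space distance between group centroids; the OPLS-DA cross-validation side of the study (Q2 statistics, permutation testing) is omitted, and all data are synthetic stand-ins for the NMR sets.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        n, p = 30, 100
        # Two synthetic groups separated along the first five variables (assumed data)
        shift = np.r_[np.full(5, 2.0), np.zeros(p - 5)]
        X = np.vstack([rng.normal(0, 1, (n, p)), rng.normal(0, 1, (n, p)) + shift])
        labels = np.array([0] * n + [1] * n)

        for noise_sd in (0.0, 1.0, 2.0, 4.0, 8.0):
            Xn = X + rng.normal(0, 1, X.shape) * noise_sd
            scores = PCA(n_components=2).fit_transform(Xn)
            # Proxy for group separation: centroid distance in scores space
            d = np.linalg.norm(scores[labels == 0].mean(0) - scores[labels == 1].mean(0))
            print(f"noise sd = {noise_sd:4.1f}  ->  PCA scores-space group distance = {d:.2f}")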

  4. A treatment schedule of conventional physical therapy provided to enhance upper limb sensorimotor recovery after stroke: expert criterion validity and intra-rater reliability.

    Science.gov (United States)

    Donaldson, Catherine; Tallis, Raymond C; Pomeroy, Valerie M

    2009-06-01

    Inadequate description of treatment hampers progress in stroke rehabilitation. The aim was to develop a valid, reliable, standardised treatment schedule of the conventional physical therapy provided for the paretic upper limb after stroke. Eleven neurophysiotherapists participated in the established methodology: semi-structured interviews, focus groups and piloting of a draft treatment schedule in clinical practice. Different physiotherapists (n=13) used the treatment schedule to record treatment given to stroke patients with mild, moderate and severe upper limb paresis. Rating of the adequacy of the treatment schedule was made using a visual analogue scale (0 to 100 mm). Mean (95% confidence interval) visual analogue scores were calculated (expert criterion validity). For intra-rater reliability, each physiotherapist observed a video tape of their treatment and immediately completed a treatment schedule recording form on two separate occasions, 4 to 6 weeks apart. The Kappa statistic was calculated for intra-rater reliability. The treatment schedule consists of a one-page A4 recording form and a user booklet detailing 50 treatment activities. Expert criterion validity was 79 (95% confidence interval 74 to 84). Intra-rater agreement was substantial (Kappa = 0.81). Further work is needed to investigate generalisability beyond this geographical area.

  5. Diverse Data Sets Can Yield Reliable Information through Mechanistic Modeling: Salicylic Acid Clearance

    Science.gov (United States)

    Raymond, G. M.; Bassingthwaighte, J. B.

    2016-01-01

    This is a practical example of a powerful research strategy: putting together data from studies covering a diversity of conditions can yield a scientifically sound grasp of the phenomenon when the individual observations failed to provide definitive understanding. The rationale is that defining a realistic, quantitative, explanatory hypothesis for the whole set of studies brings about a “consilience” of the often competing hypotheses considered for individual data sets. An internally consistent conjecture linking multiple data sets simultaneously provides stronger evidence on the characteristics of a system than does analysis of individual data sets limited to narrow ranges of conditions. Our example examines three very different data sets on the clearance of salicylic acid from humans: a high concentration set from aspirin overdoses; a set with medium concentrations from a research study on the influences of the route of administration and of sex on the clearance kinetics; and a set on low dose aspirin for cardiovascular health. Three models were tested: (1) a first order reaction, (2) a Michaelis-Menten (M-M) approach, and (3) an enzyme kinetic model with forward and backward reactions. The reaction rates found from model 1 were distinctly different for the three data sets, having no commonality. The M-M model 2 fitted each of the three data sets but gave a reliable estimate of the Michaelis constant only for the medium level data (Km = 24±5.4 mg/L); analyzing the three data sets together with model 2 gave Km = 18±2.6 mg/L. (Estimating parameters using larger numbers of data points in an optimization increases the degrees of freedom, constraining the range of the estimates). Using the enzyme kinetic model (3) increased the number of free parameters but nevertheless improved the goodness of fit to the combined data sets, giving tighter constraints, and a lower estimated Km = 14.6±2.9 mg/L, demonstrating that fitting diverse data sets with a single model

  6. Diverse Data Sets Can Yield Reliable Information through Mechanistic Modeling: Salicylic Acid Clearance.

    Science.gov (United States)

    Raymond, G M; Bassingthwaighte, J B

    This is a practical example of a powerful research strategy: putting together data from studies covering a diversity of conditions can yield a scientifically sound grasp of the phenomenon when the individual observations failed to provide definitive understanding. The rationale is that defining a realistic, quantitative, explanatory hypothesis for the whole set of studies brings about a "consilience" of the often competing hypotheses considered for individual data sets. An internally consistent conjecture linking multiple data sets simultaneously provides stronger evidence on the characteristics of a system than does analysis of individual data sets limited to narrow ranges of conditions. Our example examines three very different data sets on the clearance of salicylic acid from humans: a high concentration set from aspirin overdoses; a set with medium concentrations from a research study on the influences of the route of administration and of sex on the clearance kinetics; and a set on low dose aspirin for cardiovascular health. Three models were tested: (1) a first order reaction, (2) a Michaelis-Menten (M-M) approach, and (3) an enzyme kinetic model with forward and backward reactions. The reaction rates found from model 1 were distinctly different for the three data sets, having no commonality. The M-M model 2 fitted each of the three data sets but gave a reliable estimate of the Michaelis constant only for the medium level data (Km = 24±5.4 mg/L); analyzing the three data sets together with model 2 gave Km = 18±2.6 mg/L. (Estimating parameters using larger numbers of data points in an optimization increases the degrees of freedom, constraining the range of the estimates). Using the enzyme kinetic model (3) increased the number of free parameters but nevertheless improved the goodness of fit to the combined data sets, giving tighter constraints, and a lower estimated Km = 14.6±2.9 mg/L, demonstrating that fitting diverse data sets with a single model
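
    A sketch of fitting model (2), the Michaelis-Menten form, to pooled concentration/elimination-rate data with scipy, in the spirit of the combined-fit argument made in the two records above. The data are synthetic stand-ins for the three study sets; only the pooling effect on parameter precision is illustrated.

        import numpy as np
        from scipy.optimize import curve_fit

        def mm_rate(C, Vmax, Km):
            # Michaelis-Menten elimination rate as a function of concentration
            return Vmax * C / (Km + C)

        rng = np.random.default_rng(2)
        # Synthetic low-, medium- and high-concentration sets (assumed stand-ins)
        C = np.concatenate([rng.uniform(1, 10, 20), rng.uniform(20, 80, 20), rng.uniform(150, 500, 20)])
        rate = mm_rate(C, Vmax=60.0, Km=15.0) * rng.normal(1.0, 0.08, C.size)

        # Pooling all three concentration ranges constrains both parameters at once
        (Vmax_hat, Km_hat), cov = curve_fit(mm_rate, C, rate, p0=[50.0, 20.0])
        se = np.sqrt(np.diag(cov))
        print(f"Vmax = {Vmax_hat:.1f} ± {se[0]:.1f}, Km = {Km_hat:.1f} ± {se[1]:.1f} mg/L")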

  7. Model organoids provide new research opportunities for ductal pancreatic cancer

    NARCIS (Netherlands)

    Boj, Sylvia F|info:eu-repo/dai/nl/304074799; Hwang, Chang-Il; Baker, Lindsey A; Engle, Dannielle D; Tuveson, David A; Clevers, Hans|info:eu-repo/dai/nl/07164282X

    We recently established organoid models from normal and neoplastic murine and human pancreas tissues. These organoids exhibit ductal- and disease stage-specific characteristics and, after orthotopic transplantation, recapitulate the full spectrum of tumor progression. Pancreatic organoid technology

  8. Statistical and RBF NN models : providing forecasts and risk assessment

    OpenAIRE

    Marček, Milan

    2009-01-01

    Forecast accuracy of economic and financial processes is a popular measure for quantifying the risk in decision making. In this paper, we develop forecasting models based on statistical (stochastic) methods, sometimes called hard computing, and on a soft method using granular computing. We consider the accuracy of forecasting models as a measure for risk evaluation. It is found that the risk estimation process based on soft methods is simplified and less critical to the question w...

  9. The transparency, reliability and utility of tropical rainforest land-use and land-cover change models.

    Science.gov (United States)

    Rosa, Isabel M D; Ahmed, Sadia E; Ewers, Robert M

    2014-06-01

    Land-use and land-cover (LULC) change is one of the largest drivers of biodiversity loss and carbon emissions globally. We use the tropical rainforests of the Amazon, the Congo basin and South-East Asia as a case study to investigate spatial predictive models of LULC change. Current predictions differ in their modelling approaches, are highly variable and often poorly validated. We carried out a quantitative review of 48 modelling methodologies, considering model spatio-temporal scales, inputs, calibration and validation methods. In addition, we requested model outputs from each of the models reviewed and carried out a quantitative assessment of model performance for tropical LULC predictions in the Brazilian Amazon. We highlight existing shortfalls in the discipline and uncover three key points that need addressing to improve the transparency, reliability and utility of tropical LULC change models: (1) a lack of openness with regard to describing and making available the model inputs and model code; (2) the difficulties of conducting appropriate model validations; and (3) the difficulty that users of tropical LULC models face in obtaining the model predictions to help inform their own analyses and policy decisions. We further draw comparisons between tropical LULC change models in the tropics and the modelling approaches and paradigms in other disciplines, and suggest that recent changes in the climate change and species distribution modelling communities may provide a pathway that tropical LULC change modellers may emulate to further improve the discipline. Climate change models have exerted considerable influence over public perceptions of climate change and now impact policy decisions at all political levels. We suggest that tropical LULC change models have an equally high potential to influence public opinion and impact the development of land-use policies based on plausible future scenarios, but, to do that reliably may require further improvements in the

  10. Linear and evolutionary polynomial regression models to forecast coastal dynamics: Comparison and reliability assessment

    Science.gov (United States)

    Bruno, Delia Evelina; Barca, Emanuele; Goncalves, Rodrigo Mikosz; de Araujo Queiroz, Heithor Alexandre; Berardi, Luigi; Passarella, Giuseppe

    2018-01-01

    In this paper, the Evolutionary Polynomial Regression data modelling strategy has been applied to study small scale, short-term coastal morphodynamics, given its capability to treat a wide database of known information non-linearly. Simple linear and multilinear regression models were also applied to achieve a balance between the computational load and the reliability of the estimations of the three models. In fact, even though it is easy to imagine that the more complex the model, the more the prediction improves, sometimes a "slight" worsening of estimations can be accepted in exchange for the time saved in data organization and computational load. The models' outcomes were validated through a detailed statistical error analysis, which revealed a slightly better estimation of the polynomial model with respect to the multilinear model, as expected. On the other hand, even though the data organization was identical for the two models, the multilinear one required a simpler simulation setting and a faster run time. Finally, the most reliable evolutionary polynomial regression model was used in order to make some conjectures about the increase in uncertainty with the extension of the extrapolation time of the estimation. The overlapping rate between the confidence band of the mean of the known coast position and the prediction band of the estimated position can be a good index of the weakness in producing reliable estimations when the extrapolation time increases too much. The proposed models and tests have been applied to a coastal sector located near Torre Colimena in the Apulia region, southern Italy.

  11. Physics-Based Stress Corrosion Cracking Component Reliability Model cast in an R7-Compatible Cumulative Damage Framework

    International Nuclear Information System (INIS)

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Toloczko, Mychailo B.; Johnson, Kenneth I.; Sanborn, Scott E.

    2011-01-01

    This is a working report drafted under the Risk-Informed Safety Margin Characterization pathway of the Light Water Reactor Sustainability Program, describing statistical models of passive component reliability. The Risk-Informed Safety Margin Characterization (RISMC) pathway is a set of activities defined under the U.S. Department of Energy Light Water Reactor Sustainability Program. The overarching objective of RISMC is to support plant life-extension decision-making by providing a state-of-knowledge characterization of safety margins in key systems, structures, and components (SSCs). The methodology emerging from the RISMC pathway is not a conventional probabilistic risk assessment (PRA)-based one; rather, it relies on a reactor systems simulation framework in which the physical conditions of normal reactor operations, as well as accident environments, are explicitly modeled subject to uncertainty characterization. RELAP 7 (R7) is the platform being developed at Idaho National Laboratory to model these physical conditions. Adverse effects of aging systems could be particularly significant in those SSCs for which management options are limited; that is, components for which replacement, refurbishment, or other means of rejuvenation are least practical. These include various passive SSCs, such as piping components. Pacific Northwest National Laboratory is developing passive component reliability models intended to be compatible with the R7 framework. In the R7 paradigm, component reliability must be characterized in the context of the physical environments that R7 predicts. So, while conventional reliability models are parametric, relying on the statistical analysis of service data, RISMC reliability models must be physics-based and driven by the physical boundary conditions that R7 provides, thus allowing full integration of passives into the R7 multi-physics environment. The model must also be cast in a form compatible with the cumulative damage framework that R7

  12. Efficient surrogate models for reliability analysis of systems with multiple failure modes

    International Nuclear Information System (INIS)

    Bichon, Barron J.; McFarland, John M.; Mahadevan, Sankaran

    2011-01-01

    Despite many advances in the field of computational reliability analysis, the efficient estimation of the reliability of a system with multiple failure modes remains a persistent challenge. Various sampling and analytical methods are available, but they typically require accepting a tradeoff between accuracy and computational efficiency. In this work, a surrogate-based approach is presented that simultaneously addresses the issues of accuracy, efficiency, and unimportant failure modes. The method is based on the creation of Gaussian process surrogate models that are required to be locally accurate only in the regions of the component limit states that contribute to system failure. This approach to constructing surrogate models is demonstrated to be both an efficient and accurate method for system-level reliability analysis. - Highlights: → Extends efficient global reliability analysis to systems with multiple failure modes. → Constructs locally accurate Gaussian process models of each response. → Highly efficient and accurate method for assessing system reliability. → Effectiveness is demonstrated on several test problems from the literature.
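
    A compact sketch of the surrogate idea: fit a Gaussian process to a small number of limit-state evaluations and estimate the system failure probability by Monte Carlo sampling of the cheap surrogate. The component limit-state functions and the 40-run budget are invented for illustration, and the adaptive refinement near the limit state that the method above relies on is omitted.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(3)

        def g_sys(x):
            # Hypothetical series system: failure if ANY component limit state < 0
            g1 = 3.0 - x[:, 0] - x[:, 1]         # component mode 1 (assumed)
            g2 = 2.5 + x[:, 0] - 0.5 * x[:, 1]   # component mode 2 (assumed)
            return np.minimum(g1, g2)

        # Small design of experiments on the "expensive" model (40-run budget assumed)
        X_doe = rng.normal(0, 1, (40, 2))
        gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X_doe, g_sys(X_doe))

        # Monte Carlo on the surrogate instead of the expensive model
        X_mc = rng.normal(0, 1, (200_000, 2))
        pf = float((gp.predict(X_mc) < 0).mean())
        print(f"estimated system failure probability: {pf:.4f}")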

  13. Intra-observer reliability and agreement of manual and digital orthodontic model analysis.

    Science.gov (United States)

    Koretsi, Vasiliki; Tingelhoff, Linda; Proff, Peter; Kirschneck, Christian

    2018-01-23

    Digital orthodontic model analysis is gaining acceptance in orthodontics, but its reliability depends on the digitalisation hardware and software used. We thus investigated intra-observer reliability and agreement/conformity of a particular digital model analysis workflow in relation to traditional manual plaster model analysis. Forty-eight plaster casts of the upper/lower dentition were collected. Virtual models were obtained with orthoX®scan (Dentaurum) and analysed with ivoris®analyze3D (Computer konkret). Manual model analyses were done with a dial caliper (0.1 mm). Common parameters were measured on each plaster cast and its virtual counterpart five times each by an experienced observer. We assessed intra-observer reliability within method (ICC), agreement/conformity between methods (Bland-Altman analyses and Lin's concordance correlation), and changing bias (regression analyses). Intra-observer reliability was substantial within each method (ICC ≥ 0.7), except for five manual outcomes (12.8 per cent). Bias between methods was statistically significant, but less than 0.5 mm for 87.2 per cent of the outcomes. In general, larger tooth sizes were measured digitally. The total differences for maxilla and mandible had wide limits of agreement (-3.25/6.15 and -2.31/4.57 mm), but bias between methods was mostly smaller than intra-observer variation within each method, with substantial conformity of manual and digital measurements in general. No changing bias was detected. Although both workflows were reliable, the investigated digital workflow proved to be more reliable and yielded on average larger tooth sizes. Averaged differences between methods were within 0.5 mm for directly measured outcomes, but wide ranges are expected for some computed space parameters due to cumulative error.

  14. A simulation model for reliability evaluation of Space Station power systems

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kumar, Mudit; Wagner, H.

    1988-01-01

    A detailed simulation model for the hybrid Space Station power system is presented which allows photovoltaic and solar dynamic power sources to be mixed in varying proportions. The model considers the dependence of reliability and storage characteristics during the sun and eclipse periods, and makes it possible to model the charging and discharging of the energy storage modules in a relatively accurate manner on a continuous basis.

  15. Comparing the reliability and accuracy of clinical measurements using plaster model and the digital model system based on crowding severity.

    Science.gov (United States)

    Liang, Yu-Ming; Rutchakitprakarn, Lalita; Kuang, Shou-Hsin; Wu, Tzu-Ying

    2018-01-25

    This study aims to clarify whether the 3Shape™ digital model system can be applied in orthodontic diagnostic analysis with certainty, especially under different crowding conditions. Reliability, accuracy and efficiency of the 3Shape™ digital model system were assessed by comparing them with traditional plaster casts. Twenty-nine plaster casts with permanent dentition were transformed into digital models by a 3Shape™ D800 scanner. All 29 models were categorized into mild-crowding (arch length discrepancy less than 3 mm), moderate-crowding (between 3 mm and 8 mm) and severe-crowding (greater than 8 mm) groups. Fourteen linear measurements were made manually using a digital caliper on plaster casts and virtually using the 3Shape™ Ortho Analyzer software by two examiners. The Intra-class Correlation Coefficient (ICC) was used to evaluate intra-examiner reliability, inter-examiner reliability and reliability between the two model systems. A paired t test was used to evaluate accuracy between the two model systems. The Kruskal-Wallis test followed by the Mann-Whitney U test was used to evaluate the measurement differences between the three groups in the two model systems. Both intra-examiner and inter-examiner reliability were generally excellent for all measurements made on the 3Shape™ digital model and the plaster cast (ICC: 0.752-0.993). Reliability between the different model systems was also excellent (ICC: 0.897-0.998). Half of the accuracy tests showed statistically significant differences between the two model systems; the mandibular required space showed a significant difference (p = 0.012) between the mild crowding group (0.27 ± 0.01 mm) and the severe crowding group (0.20 ± 0.09 mm). However, the differences were less than 0.5 mm and would not affect clinical decisions. Using the 3Shape™ digital model system instead of plaster casts for orthodontic diagnostic measurements is clinically acceptable.

  16. Modeling reliability of power systems substations by using stochastic automata networks

    International Nuclear Information System (INIS)

    Šnipas, Mindaugas; Radziukynas, Virginijus; Valakevičius, Eimutis

    2017-01-01

    In this paper, the stochastic automata networks (SAN) formalism is applied to model the reliability of power system substations. The proposed strategy reduces the size of the state space of the Markov chain model and simplifies system specification. Two case studies of standard substation configurations are considered in detail. SAN models with different assumptions were created. The SAN approach is compared with an exact reliability calculation using a minimal path set method. Modeling results showed that total independence of the automata can be assumed for relatively small power system substations with reliable equipment. In this case, the implementation of a Markov chain model using the SAN method is a relatively easy task. - Highlights: • We present a methodology for applying the stochastic automata network formalism to create Markov chain models of power systems. • The stochastic automata network approach is combined with minimal path sets and structural functions. • Two models of substation configurations with different model assumptions are presented to illustrate the proposed methodology. • Modeling results of systems with independent automata and functional transition rates are similar. • The conditions under which total independence of automata can be assumed are addressed.
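
    The minimal path set calculation that the SAN results are benchmarked against can be sketched directly. The example below applies inclusion-exclusion over the path sets of a small, hypothetical substation-like network, assuming independent components; the component names and availabilities are invented.

        from itertools import combinations

        # Hypothetical minimal path sets and component availabilities (assumed)
        path_sets = [{"line1", "busbar", "breakerA"}, {"line2", "busbar", "breakerB"}]
        avail = {"line1": 0.98, "line2": 0.98, "busbar": 0.999, "breakerA": 0.99, "breakerB": 0.99}

        def union_prob(sets, p):
            # Inclusion-exclusion over unions of minimal path sets (independence assumed)
            total = 0.0
            for k in range(1, len(sets) + 1):
                for combo in combinations(sets, k):
                    comps = set().union(*combo)
                    term = 1.0
                    for c in comps:
                        term *= p[c]
                    total += (-1) ** (k + 1) * term
            return total

        print(f"system availability = {union_prob(path_sets, avail):.6f}")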

  17. A testing-coverage software reliability model considering fault removal efficiency and error generation.

    Science.gov (United States)

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many NHPP-based software reliability growth models (SRGMs) have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) during the testing phase, the fault detection rate commonly changes over time; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal; that is, they seldom consider imperfect fault removal efficiency. In practical software development, fault removal efficiency cannot always be perfect: detected failures might not be removed completely, the original faults might remain, and new faults might be introduced meanwhile, which is referred to as the imperfect debugging phenomenon. In this study, a model is developed that incorporates the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation, using testing coverage to express the fault detection rate and using fault removal efficiency to account for fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results show that the model gives better fitting and predictive performance.
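
    As a baseline for the class of NHPP SRGMs discussed above, the sketch below fits the classic Goel-Okumoto mean value function m(t) = a(1 - exp(-bt)) to cumulative failure counts. The proposed model replaces the exponential term with a testing-coverage function and adds fault-introduction and removal-efficiency parameters, which are not reproduced here; the failure counts are illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        def m(t, a, b):
            # Goel-Okumoto mean value function: expected cumulative failures by time t
            return a * (1.0 - np.exp(-b * t))

        # Weekly cumulative failure counts from a testing phase (illustrative data)
        t = np.arange(1, 16, dtype=float)
        failures = np.array([5, 11, 16, 21, 24, 28, 30, 33, 34, 36, 37, 38, 39, 39, 40], dtype=float)

        (a_hat, b_hat), _ = curve_fit(m, t, failures, p0=[50.0, 0.1])
        print(f"estimated fault content a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
        print(f"predicted remaining faults after week 15: {a_hat - m(15.0, a_hat, b_hat):.1f}")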

  18. Blooms' separation of the final exam of Engineering Mathematics II: Item reliability using Rasch measurement model

    Science.gov (United States)

    Fuaad, Norain Farhana Ahmad; Nopiah, Zulkifli Mohd; Tawil, Norgainy Mohd; Othman, Haliza; Asshaari, Izamarlina; Osman, Mohd Hanif; Ismail, Nur Arzilah

    2014-06-01

    In engineering studies and research, mathematics is one of the main elements used to express physical, chemical and engineering laws. It is therefore essential for engineering students to have a strong knowledge of the fundamentals of mathematics in order to apply that knowledge to real-life issues. However, previous results of the Mathematics Pre-Test show that engineering students lack fundamental knowledge in certain topics in mathematics. Due to this, apart from making improvements in the methods of teaching and learning, studies on the construction of questions (items) should also be emphasized. The purpose of this study is to assist lecturers in the process of item development, to monitor the separation of items based on Blooms' Taxonomy, and to measure the reliability of the items themselves using the Rasch Measurement Model as a tool. Using the Rasch Measurement Model, the final exam questions of Engineering Mathematics II (Linear Algebra) for semester 2 of session 2012/2013 were analysed, and the results provide details on the extent to which the content of the items gives useful information about students' ability. This study reveals that the items used in the Engineering Mathematics II (Linear Algebra) final exam are well constructed, but the separation of the items raises concern and needs further attention, as there is a big gap between items at several levels of Blooms' cognitive skill.

  19. Human reliability-based MC and A models for detecting insider theft

    International Nuclear Information System (INIS)

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-01-01

    Material control and accounting (MC and A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC and A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC and A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC and A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.

  20. Capturing cognitive causal paths in human reliability analysis with Bayesian network models

    International Nuclear Information System (INIS)

    Zwirglmaier, Kilian; Straub, Daniel; Groth, Katrina M.

    2017-01-01

    In the last decade, Bayesian networks (BNs) have been identified as a powerful tool for human reliability analysis (HRA), with multiple advantages over traditional HRA methods. In this paper we illustrate how BNs can be used to include additional, qualitative causal paths to provide traceability. The proposed framework provides the foundation to resolve several needs frequently expressed by the HRA community. First, the developed extended BN structure reflects the causal paths found in the cognitive psychology literature, thereby addressing the need for causal traceability and a strong scientific basis in HRA. Secondly, the use of node reduction algorithms allows the BN to be condensed to a level of detail at which quantification is as straightforward as the techniques used in existing HRA. We illustrate the framework by developing a BN version of the "critical data misperceived" crew failure mode in the IDHEAS HRA method, which is currently under development at the US NRC. We illustrate how the model could be quantified with a combination of expert probabilities and information from operator performance databases such as SACADA. This paper lays the foundations necessary to expand the cognitive and quantitative foundations of HRA. - Highlights: • A framework for building traceable BNs for HRA, based on cognitive causal paths. • A qualitative BN structure directly showing these causal paths is developed. • Node reduction algorithms are used to make the BN structure quantifiable. • The BN is quantified through expert estimates and observed data (Bayesian updating). • The framework is illustrated for a crew failure mode of IDHEAS.
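
    A hand-rolled miniature of quantifying one crew failure mode with a two-parent BN node, by straightforward enumeration. The parent nodes, their probabilities and the CPT entries are invented for illustration and are not taken from IDHEAS or SACADA.

        from itertools import product

        # Hypothetical parents of the failure-mode node (names and numbers invented)
        p_high_workload = 0.3
        p_poor_hmi = 0.2
        cpt = {  # P(data misperceived | workload high?, HMI poor?)
            (True, True): 0.15, (True, False): 0.05,
            (False, True): 0.04, (False, False): 0.01,
        }

        # Marginalise over the parents (inference by enumeration)
        p_fail = 0.0
        for workload, hmi in product([True, False], repeat=2):
            p_parents = (p_high_workload if workload else 1 - p_high_workload) \
                      * (p_poor_hmi if hmi else 1 - p_poor_hmi)
            p_fail += cpt[(workload, hmi)] * p_parents
        print(f"P(data misperceived) = {p_fail:.4f}")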

  1. A case study review of technical and technology issues for transition of a utility load management program to provide system reliability resources in restructured electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Weller, G.H.

    2001-07-15

    Utility load management programs--including direct load control and interruptible load programs--were employed by utilities in the past as system reliability resources. With electricity industry restructuring, the context for these programs has changed; the market that was once controlled by vertically integrated utilities has become competitive, raising the question: can existing load management programs be modified so that they can effectively participate in competitive energy markets? In the short run, modified and/or improved operation of load management programs may be the most effective form of demand-side response available to the electricity system today. However, in light of recent technological advances in metering, communication, and load control, utility load management programs must be carefully reviewed in order to determine appropriate investments to support this transition. This report investigates the feasibility of and options for modifying an existing utility load management system so that it might provide reliability services (i.e. ancillary services) in the competitive markets that have resulted from electricity industry restructuring. The report is a case study of Southern California Edison's (SCE) load management programs. SCE was chosen because it operates one of the largest load management programs in the country and it operates them within a competitive wholesale electricity market. The report describes a wide range of existing and soon-to-be-available communication, control, and metering technologies that could be used to facilitate the evolution of SCE's load management programs and systems to provision of reliability services. The fundamental finding of this report is that, with modifications, SCE's load management infrastructure could be transitioned to provide critical ancillary services in competitive electricity markets, employing currently or soon-to-be available load control technologies.

  2. Conceptual Models of the Individual Public Service Provider

    DEFF Research Database (Denmark)

    Andersen, Lotte Bøgh; Bhatti, Yosef; Petersen, Ole Helby

    Individual public service providers’ motivation can be conceptualized as either extrinsic, autonomous or prosocial, and the question is how we can best theoretically understand this complexity without losing too much coherence and parsimony. Drawing on Allison’s approach (1969), three perspectives … are used to gain insight into the motivation of public service providers, namely principal-agent theory, self-determination theory and public service motivation theory. We situate the theoretical discussions in the context of public service providers being transferred to private organizations … as a consequence of outsourcing by the public sector. Although this empirical setting is interesting in itself, here it serves primarily as grist for a wider discussion on strategies for applying multiple theoretical approaches and crafting a theoretical synthesis. The key contribution of the paper is thus …

  3. A Review of the Progress with Statistical Models of Passive Component Reliability

    Directory of Open Access Journals (Sweden)

    Bengt O.Y. Lydell

    2017-03-01

    During the past 25 years, in the context of probabilistic safety assessment, efforts have been directed towards the establishment of comprehensive pipe failure event databases as a foundation for exploratory research to better understand how to effectively organize a piping reliability analysis task. The focused pipe failure database development efforts have progressed well with the development of piping reliability analysis frameworks that utilize the full body of service experience data, fracture mechanics analysis insights, and expert elicitation results that are rolled into an integrated and risk-informed approach to the estimation of piping reliability parameters with full recognition of the embedded uncertainties. The discussion in this paper builds on a major collection of operating experience data (more than 11,000 pipe failure records) and the associated lessons learned from data analysis and data applications spanning three decades. The piping reliability analysis lessons learned have been obtained from the derivation of pipe leak and rupture frequencies for corrosion resistant piping in a raw water environment, loss-of-coolant-accident frequencies given degradation mitigation, high-energy pipe break analysis, moderate-energy pipe break analysis, and numerous plant-specific applications of a statistical piping reliability model framework. Conclusions are presented regarding the feasibility of determining and incorporating aging effects into probabilistic safety assessment models.

  4. A review of the progress with statistical models of passive component reliability

    International Nuclear Information System (INIS)

    Lydell, Bengt O. Y.

    2017-01-01

    During the past 25 years, in the context of probabilistic safety assessment, efforts have been directed towards establishment of comprehensive pipe failure event databases as a foundation for exploratory research to better understand how to effectively organize a piping reliability analysis task. The focused pipe failure database development efforts have progressed well with the development of piping reliability analysis frameworks that utilize the full body of service experience data, fracture mechanics analysis insights, expert elicitation results that are rolled into an integrated and risk-informed approach to the estimation of piping reliability parameters with full recognition of the embedded uncertainties. The discussion in this paper builds on a major collection of operating experience data (more than 11,000 pipe failure records) and the associated lessons learned from data analysis and data applications spanning three decades. The piping reliability analysis lessons learned have been obtained from the derivation of pipe leak and rupture frequencies for corrosion resistant piping in a raw water environment, loss-of-coolant-accident frequencies given degradation mitigation, high-energy pipe break analysis, moderate-energy pipe break analysis, and numerous plant-specific applications of a statistical piping reliability model framework. Conclusions are presented regarding the feasibility of determining and incorporating aging effects into probabilistic safety assessment models

  5. Mathematical Model of Equipment Unit Reliability for Determination of Optimum Overhaul Periods

    Directory of Open Access Journals (Sweden)

    M. A. Pasiouk

    2009-01-01

    The paper proposes a mathematical model of equipment unit reliability that takes into account the effect of the operational mode and the main influencing factors. Its application contributes to the reduction of operating costs, optimization of overhaul periods, prolongation of service life and rational usage of fleet resources.
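
    One standard way to cast the overhaul-period question is the classic age-replacement formulation, sketched below under an assumed Weibull lifetime: choose the interval T that minimises the long-run cost per unit time of planned versus failure-driven overhauls. This is a generic illustration, not necessarily the record's exact model; all parameters are assumed.

        import numpy as np
        from scipy.integrate import quad
        from scipy.optimize import minimize_scalar

        beta, eta = 2.5, 8000.0          # Weibull shape/scale of time to failure [h] (assumed)
        c_planned, c_failure = 1.0, 6.0  # relative overhaul costs (assumed)

        R = lambda t: np.exp(-(t / eta) ** beta)  # survival function

        def cost_rate(T):
            # Long-run cost per hour for overhaul at age T or at failure
            expected_cycle = quad(R, 0, T)[0]
            return (c_planned * R(T) + c_failure * (1 - R(T))) / expected_cycle

        res = minimize_scalar(cost_rate, bounds=(100.0, 20000.0), method="bounded")
        print(f"optimum overhaul period ~ {res.x:.0f} h (cost rate {res.fun:.2e} per h)")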

  6. Probabilistic modelling of combined sewer overflow using the First Order Reliability Method

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld; Jensen, Jacob Birk

    2007-01-01

    uncertainties on an application of the commercial urban drainage model MOUSE combined with the probabilistic First Order Reliability Method (FORM). Applying statistical characteristics on several years of rainfall, it is possible to derive a parameterization of the rainfall input and the failure probability...
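
    The FORM step can be illustrated independently of the drainage model. The sketch below runs the Hasofer-Lind/Rackwitz-Fiessler iteration on a toy linear limit state in standard normal space; in the study itself, g(u) would be evaluated through the MOUSE model with the parameterised rainfall input.

        import numpy as np
        from scipy.stats import norm

        def g(u):
            # Toy linear limit state in standard normal space (assumed)
            return 6.0 - u[0] - 1.5 * u[1]

        def grad(u, h=1e-6):
            # Finite-difference gradient of g
            return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h) for e in np.eye(len(u))])

        # Hasofer-Lind / Rackwitz-Fiessler iteration for the design point
        u = np.zeros(2)
        for _ in range(50):
            dg = grad(u)
            u_new = (dg @ u - g(u)) / (dg @ dg) * dg
            if np.linalg.norm(u_new - u) < 1e-8:
                u = u_new
                break
            u = u_new

        beta = np.linalg.norm(u)  # reliability index
        print(f"beta = {beta:.3f}, Pf = Phi(-beta) = {norm.cdf(-beta):.3e}")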

  7. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Science.gov (United States)

    2011-05-18

    ... COMMISSION NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection... issued for public comment a document entitled: NUREG/CR-XXXX, ``Development of Quantitative Software... development of regulatory guidance for using risk information related to digital systems in the licensing...

  8. The importance of data quality for generating reliable distribution models for rare, elusive, and cryptic species

    Science.gov (United States)

    Keith B. Aubry; Catherine M. Raley; Kevin S. McKelvey

    2017-01-01

    The availability of spatially referenced environmental data and species occurrence records in online databases enable practitioners to easily generate species distribution models (SDMs) for a broad array of taxa. Such databases often include occurrence records of unknown reliability, yet little information is available on the influence of data quality on SDMs generated...

  9. Reliability modeling of safety-critical network communication in a digitalized nuclear power plant

    International Nuclear Information System (INIS)

    Lee, Sang Hun; Kim, Hee Eun; Son, Kwang Seop; Shin, Sung Min; Lee, Seung Jun; Kang, Hyun Gook

    2015-01-01

    The Engineered Safety Feature-Component Control System (ESF-CCS), which uses a network communication system for the transmission of safety-critical information from group controllers (GCs) to loop controllers (LCs), was recently developed. However, the ESF-CCS has not been applied to nuclear power plants (NPPs) because the network communication failure risk in the ESF-CCS has yet to be fully quantified. Therefore, this study was performed to identify the potential hazardous states for network communication between GCs and LCs and to develop quantification schemes for various network failure causes. To estimate the risk effects of network communication failures in the ESF-CCS, a fault-tree model of an ESF-CCS signal failure in the containment spray actuation signal condition was developed for the case study. Based on a specified range of periodic inspection periods for network modules and the baseline probability of software failure, a sensitivity study was conducted to analyze the risk effect of network failure between GCs and LCs on ESF-CCS signal failure. This study is expected to provide insight into the development of a fault-tree model for network failures in digital I&C systems and the quantification of the risk effects of network failures for safety-critical information transmission in NPPs. - Highlights: • Network reliability modeling framework for digital I&C system in NPP is proposed. • Hazardous states of network protocol between GC and LC in ESF-CCS are identified. • Fault-tree model of ESF-CCS signal failure in ESF actuation condition is developed. • Risk effect of network failure on ESF-CCS signal failure is analyzed.

  10. Experimental studies on power transformer model winding provided with MOVs

    Directory of Open Access Journals (Sweden)

    G.H. Kusumadevi

    2017-05-01

    The surge voltage distribution across a HV transformer winding due to the appearance of fast transient voltages (rise time of the order of 1 μs) is highly non-uniform along the length of the winding at the initial instant of the surge. In order to achieve a nearly uniform initial voltage distribution along the length of the HV winding, investigations have been carried out on a transformer model winding. By connecting similar metal oxide varistors (MOVs) across sections of the HV transformer model winding, it is possible to improve the initial surge voltage distribution along the length of the winding. Transformer windings with α values of 5.3, 9.5 and 19 have been analyzed. The experimental studies have been carried out using a high speed oscilloscope of good accuracy. With MOVs connected, the initial voltage distribution across sections of the winding remains nearly uniform along the length of the winding. Results of fault diagnostics carried out with and without MOVs connected across sections of the winding are also reported.

  11. Reliability analysis of nuclear component cooling water system using semi-Markov process model

    International Nuclear Information System (INIS)

    Veeramany, Arun; Pandey, Mahesh D.

    2011-01-01

    Research highlights: → A semi-Markov process (SMP) model is used to evaluate the system failure probability of the nuclear component cooling water (NCCW) system. → SMP is used because it can solve a reliability block diagram with a mixture of redundant repairable and non-repairable components. → The primary objective is to demonstrate that SMP can consider a Weibull failure time distribution for components, while a Markov model cannot. → Result: the variability in component failure time is directly proportional to the NCCW system failure probability. → The result can be utilized as an initiating event probability in probabilistic safety assessment projects. - Abstract: A reliability analysis of the nuclear component cooling water (NCCW) system is carried out. A semi-Markov process model is used in the analysis because it has the potential to solve a reliability block diagram with a mixture of repairable and non-repairable components. With Markov models it is only possible to assume an exponential profile for component failure times. An advantage of the proposed model is the ability to assume a Weibull distribution for the failure times of components. In an attempt to reduce the number of states in the model, it is shown that the poly-Weibull distribution arises. The objective of the paper is to determine the system failure probability under these assumptions. Monte Carlo simulation is used to validate the model result. This result can be utilized as an initiating event probability in probabilistic safety assessment projects.
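
    A Monte Carlo cross-check in the spirit of the record's validation step: simulate a hypothetical 1oo2 redundant train in which each unit alternates Weibull times to failure and exponential repairs, and estimate the probability that both units are down at a mission time. All parameters are assumed values.

        import numpy as np

        rng = np.random.default_rng(4)
        T_mission = 8760.0          # one year [h]
        shape, scale = 1.8, 5000.0  # Weibull time-to-failure parameters (assumed)
        mttr = 48.0                 # mean exponential repair time [h] (assumed)

        def unit_down_at(t_query):
            # Simulate one unit's alternating up/down renewal process to t_query
            t, up = 0.0, True
            while True:
                dur = scale * rng.weibull(shape) if up else rng.exponential(mttr)
                if t + dur > t_query:
                    return not up
                t += dur
                up = not up

        n = 200_000
        both_down = sum(unit_down_at(T_mission) and unit_down_at(T_mission) for _ in range(n))
        print(f"P(both trains down at t = {T_mission:.0f} h) ~ {both_down / n:.2e}")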

  12. Reliability analysis to resolve difficulty in choosing from alternative deflection models of RC beams

    Science.gov (United States)

    Kim, Jung J.; Reda Taha, Mahmoud M.; Noh, Hyuk-Chun; Ross, Timothy J.

    2013-05-01

    The probability of failure in reliability analysis depends on the integration of the joint probability density function (PDF) of the uncertain variables over the violation regions of the limit state functions corresponding to these variables. There may be uncertainty in choosing the computational models of the resultants that involve the uncertain variables and are incorporated in the limit state function. This uncertainty is not random, but can be considered an epistemic uncertainty, since it represents ambiguity in choosing from among alternative computational models; such an uncertainty is known as "non-specificity". In this study, non-specificity of computational models is implemented in reliability analysis for determining the deflections of reinforced concrete (RC) beams. A methodology to quantify this non-specificity is presented using possibility theory. Three deflection computational models, which account for the rigidity of concrete under tension using an effective moment of inertia, are selected. A limit state for a deflection limit is formulated for each deflection model and the probability of exceeding the deflection limit is calculated for each. Using possibility distributions, the three probabilities of exceeding a deflection limit are integrated and a new set of probabilities of exceeding a deflection limit is determined, where each probability is associated with a new metric that describes model non-specificity, called the degree of confirmation. A case study illustrating the new reliability analysis to compute the non-specificity of a computational model is presented.

  13. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment; the nature of human error; classification of errors in man-machine systems; practical aspects; human reliability modelling in complex situations; quantification and examination of human reliability; judgement-based approaches; holistic techniques; and decision analytic approaches. (UK)

  14. Meeting Human Reliability Requirements through Human Factors Design, Testing, and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Boring

    2007-06-01

    In the design of novel systems, it is important for the human factors engineer to work in parallel with the human reliability analyst to arrive at the safest achievable design that meets design team safety goals and certification or regulatory requirements. This paper introduces the System Development Safety Triptych, a checklist of considerations for the interplay of human factors and human reliability through design, testing, and modeling in product development. This paper also explores three phases of safe system development, corresponding to the conception, design, and implementation of a system.

  15. Assessment of Electronic Circuits Reliability Using Boolean Truth Table Modeling Method

    International Nuclear Information System (INIS)

    El-Shanshoury, A.I.

    2011-01-01

    This paper explores the use of the Boolean Truth Table modeling Method (BTTM) in the analysis of qualitative data. The method is widely used in certain fields, especially electrical and electronic engineering. Our work focuses on the evaluation of power supply circuit reliability using the BTTM, which involves systematic attempts to falsify and identify hypotheses on the basis of truth tables constructed from qualitative data. Reliability parameters, such as the system failure rate, are estimated for the power supply case study. All possible state combinations (operating and failed states) of the major components in the circuit were listed and their effects on the overall system were studied.
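
    A small illustration of the truth-table approach for a hypothetical power-supply circuit (transformer, two parallel rectifier branches, output filter): enumerate all component state combinations, apply the structure function, and sum the probabilities of the working states. Component names and reliabilities are invented.

        from itertools import product

        # Hypothetical power-supply components and reliabilities (assumed values)
        rel = {"transformer": 0.995, "rectifierA": 0.97, "rectifierB": 0.97, "filter": 0.99}
        names = list(rel)

        def system_works(state):
            s = dict(zip(names, state))
            # Structure function: transformer and filter in series, rectifiers in parallel
            return s["transformer"] and (s["rectifierA"] or s["rectifierB"]) and s["filter"]

        # Boolean truth table: sum the probabilities of all working state combinations
        R_sys = 0.0
        for state in product([True, False], repeat=len(names)):
            p = 1.0
            for name, up in zip(names, state):
                p *= rel[name] if up else 1 - rel[name]
            if system_works(state):
                R_sys += p
        print(f"system reliability = {R_sys:.6f}")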

  16. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    Energy Technology Data Exchange (ETDEWEB)

    Knezevic, J.; Odoom, E.R

    2001-07-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising the fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory, because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow the efficient simultaneous generation of minimal cut and path sets.
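
    A sketch of the fuzzy arithmetic involved, for a single OR gate with triangular fuzzy failure rates and repair times, using alpha-cut interval arithmetic. The OR-gate Lambda-Tau expressions (lambda = lambda1 + lambda2, tau = (lambda1*tau1 + lambda2*tau2)/(lambda1 + lambda2)) are the commonly quoted ones, the numbers are illustrative, and the tau intervals computed here are conservative outer bounds.

        import numpy as np

        # Triangular fuzzy numbers (low, mode, high): failure rates [1/h], repair times [h]
        lam1, tau1 = (1e-5, 2e-5, 3e-5), (4.0, 5.0, 6.0)
        lam2, tau2 = (2e-5, 3e-5, 4e-5), (7.0, 8.0, 9.0)

        def alpha_cut(tri, a):
            lo, m, hi = tri
            return np.array([lo + a * (m - lo), hi - a * (hi - m)])

        for a in (0.0, 0.5, 1.0):
            l1, l2 = alpha_cut(lam1, a), alpha_cut(lam2, a)
            t1, t2 = alpha_cut(tau1, a), alpha_cut(tau2, a)
            lam_sys = l1 + l2  # OR gate: system failure rate interval
            # Conservative outer bounds for tau = (l1*t1 + l2*t2) / (l1 + l2)
            tau_lo = (l1[0] * t1[0] + l2[0] * t2[0]) / (l1[1] + l2[1])
            tau_hi = (l1[1] * t1[1] + l2[1] * t2[1]) / (l1[0] + l2[0])
            print(f"alpha={a:.1f}: lambda in [{lam_sys[0]:.1e}, {lam_sys[1]:.1e}] /h, tau in [{tau_lo:.2f}, {tau_hi:.2f}] h")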

  17. System principles, mathematical models and methods to ensure high reliability of safety systems

    Science.gov (United States)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of components designed for the detection, localization, tracking, collection, and processing of information from monitoring, telemetry, and control systems. They are required to be highly reliable in order to correctly perform data aggregation, processing and analysis for subsequent decision-making support. During the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Different types of components perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of task completion and eliminates common cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models and algorithms can be used to solve optimal redundancy problems on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.

  18. Data Applicability of Heritage and New Hardware for Launch Vehicle System Reliability Models

    Science.gov (United States)

    Al Hassan Mohammad; Novack, Steven

    2015-01-01

    Many launch vehicle systems are designed and developed using both heritage and new hardware. In most cases, the heritage hardware undergoes modifications to fit new functional system requirements, impacting the failure rates and, ultimately, the reliability data. New hardware, which lacks historical data, is often compared to similar systems when estimating failure rates. Some qualification of the applicability of the data source to the current system should be made. Accurately characterizing the reliability data applicability and quality under these circumstances is crucial to developing model estimates that support confident decisions on design changes and trade studies. This presentation demonstrates a data-source classification method that ranks reliability data according to applicability and quality criteria for a new launch vehicle. The method accounts for similarities/dissimilarities in source and applicability, as well as operating environments such as vibration, acoustic regime, and shock. This classification approach is followed by uncertainty-importance routines to assess the need for additional data to reduce uncertainty.

  19. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - I: Theory

    International Nuclear Information System (INIS)

    Cacuci, D. G.; Ionescu-Bujor, M.

    2008-01-01

    The development of the adjoint sensitivity analysis procedure (ASAP) for generic dynamic reliability models based on Markov chains is presented, together with applications of this procedure to the analysis of several systems of increasing complexity. The general theory is presented in Part I of this work and is accompanied by a paradigm application to the dynamic reliability analysis of a simple binary component, namely a pump functioning on an 'up/down' cycle until it fails irreparably. This paradigm example admits a closed form analytical solution, which permits a clear illustration of the main characteristics of the ASAP for Markov chains. In particular, it is shown that the ASAP for Markov chains presents outstanding computational advantages over other procedures currently in use for sensitivity and uncertainty analysis of the dynamic reliability of large-scale systems. This conclusion is further underscored by the large-scale applications presented in Part II. (authors)

  20. Assessing sensor reliability for multisensor data fusion within the transferable belief model.

    Science.gov (United States)

    Elouedi, Zied; Mellouli, Khaled; Smets, Philippe

    2004-02-01

    This paper presents a method for assessing the reliability of a sensor in a classification problem based on the transferable belief model. First, we develop a method for the evaluation of the reliability of a sensor when considered alone. The method is based on finding the discounting factor minimizing the distance between the pignistic probabilities computed from the discounted beliefs and the actual values of data. Next, we develop a method for assessing the reliability of several sensors that are supposed to work jointly and their readings are aggregated. The discounting factors are computed on the basis of minimizing the distance between the pignistic probabilities computed from the combined discounted belief functions and the actual values of data.
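
    A minimal version of the single-sensor assessment on a two-class frame: discount the sensor's mass functions by a factor alpha, compute the pignistic probabilities, and choose the alpha minimising the squared distance to the true class indicators. The sensor readings and class labels are invented.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Sensor masses over frame {A, B}: columns m(A), m(B), m(A u B) (invented readings)
        M = np.array([[0.7, 0.1, 0.2],
                      [0.2, 0.6, 0.2],
                      [0.5, 0.3, 0.2]])
        truth = np.array([[1, 0], [0, 1], [0, 1]])  # actual classes of the three cases

        def pignistic_after_discount(alpha):
            # Discounting: scale focal masses by (1 - alpha), move alpha to the frame
            mA, mB = (1 - alpha) * M[:, 0], (1 - alpha) * M[:, 1]
            mAB = (1 - alpha) * M[:, 2] + alpha
            # Pignistic transform: split the mass on {A u B} equally between A and B
            return np.column_stack([mA + mAB / 2, mB + mAB / 2])

        loss = lambda a: ((pignistic_after_discount(a) - truth) ** 2).sum()
        res = minimize_scalar(loss, bounds=(0.0, 1.0), method="bounded")
        print(f"discounting factor alpha = {res.x:.3f} (sensor reliability {1 - res.x:.3f})")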

  1. Combined Strategy for a Reliable Evaluation of Spinal Cord Injury Using an in vivo Model.

    Science.gov (United States)

    Gomez, Rosa M; Ghotme, Kemel; Nino, Jackeline J; Quiroz-Padilla, Maria; Vargas, Daniela; Dominguez, Andy R; Barreto, George E; Sanchez, Magdy Y

    2018-01-26

    A complete neurological exam contributes to establishing spinal cord injury severity and extent by identifying the damage to the sensory and motor pathways involved, in order to support more case-specific and precise pharmacological therapy. However, assessment of neurologic function in spinal cord injury models is usually reported using sensory or motor tests independently. A reliable integral method is needed to precisely evaluate the location and severity of the injury at baseline and, in further assessments, to establish the degree of spontaneous recovery. A combination of sensation-based and motor-based tests was used to evaluate impaired neurologic function after spinal cord injury and the degree of spontaneous recovery, at different stages, in an in vivo model. The combined neurologic evaluation was useful for establishing the location and severity of the injury in all animals and also for detecting degrees of spontaneous recovery at different stages after the injury. Neurological function was compared across time points (days) and groups using the BBB motor score, latency of posture maintenance, locomotion, and latency of grooming presentation, before and after the injury. Our results suggest that a combined assessment strategy, including sensory and motor tests, can lead to better evaluation of spinal cord injury severity and location, documentation of the extent of spontaneous recovery following SCI, and identification of specific motor and sensory pathway integrity. In conclusion, a combined assessment strategy provides a concise method for evaluating the impact of interventions in experimental models of SCI.

  2. A Reliability Model for Ni-BaTiO3-Based (BME) Ceramic Capacitors

    Science.gov (United States)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with base-metal electrodes (BMEs) for potential NASA space project applications requires an in-depth understanding of their reliability. The reliability of an MLCC is defined as the ability of the dielectric material to retain its insulating properties under stated environmental and operational conditions for a specified period of time t. In this presentation, a general mathematical expression of a reliability model for a BME MLCC is developed and discussed. The reliability model consists of three parts: (1) a statistical distribution that describes the individual variation of properties in a test group of samples (Weibull, lognormal, normal, etc.); (2) an acceleration function that describes how a capacitor's reliability responds to external stresses such as applied voltage and temperature (all units in the test group should follow the same acceleration function if they share the same failure mode, independent of individual units); and (3) the effect and contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size S. In general, a two-parameter Weibull statistical distribution model is used to describe a BME capacitor's reliability as a function of time. The acceleration function that relates a capacitor's reliability to external stresses depends on the failure mode. Two failure modes have been identified in BME MLCCs: catastrophic and slow degradation. A catastrophic failure is characterized by a time-accelerating increase in leakage current that is mainly due to existing processing defects (voids, cracks, delamination, etc.), i.e., extrinsic defects. A slow degradation failure is characterized by a near-linear increase in leakage current with stress time; this is caused by the electromigration of oxygen vacancies (intrinsic defects).
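
    A minimal sketch of such a model follows: a two-parameter Weibull reliability function combined with the Prokopowicz-Vaskas empirical voltage/temperature acceleration law commonly used for MLCCs. The exponent, activation energy, test conditions, and Weibull parameters are all illustrative assumptions.

```python
import numpy as np

# Two-parameter Weibull reliability: R(t) = exp(-(t/eta)**beta).
def reliability(t, eta, beta):
    return np.exp(-(t / eta) ** beta)

# Prokopowicz-Vaskas empirical voltage/temperature acceleration law,
# commonly applied to MLCCs; n and Ea below are illustrative only.
def acceleration_factor(V_test, V_use, T_test, T_use, n=3.0, Ea=1.1):
    k = 8.617e-5  # Boltzmann constant, eV/K
    return (V_test / V_use) ** n * np.exp((Ea / k) * (1 / T_use - 1 / T_test))

# Suppose a life test at 2x rated voltage and 125 C (398 K) gives
# eta = 1e4 h and beta = 1.2; extrapolate the scale to use conditions.
AF = acceleration_factor(V_test=100.0, V_use=50.0, T_test=398.0, T_use=318.0)
eta_use = 1e4 * AF
print(f"AF = {AF:.0f}, R(10 years) = {reliability(87600.0, eta_use, 1.2):.6f}")
```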

  3. Modeling Travel Time Reliability of Road Network Considering Connected Vehicle Guidance Characteristics Indexes

    Directory of Open Access Journals (Sweden)

    Jiangfeng Wang

    2017-01-01

    Full Text Available Travel time reliability (TTR) is one of the important indexes for effectively evaluating the performance of a road network, and TTR can be effectively improved using real-time traffic guidance information. Compared with traditional traffic guidance, connected vehicle (CV) guidance can provide travelers with more timely and accurate travel information, which can further improve the travel efficiency of the road network. Five CV characteristic indexes are selected as explanatory variables: the Congestion Level (CL), Penetration Rate (PR), Compliance Rate (CR), release Delay Time (DT), and Following Rate (FR). Based on the five explanatory variables, a TTR model is proposed using the multilogistic regression method, and the prediction accuracy and the impact of the characteristic indexes on TTR are analyzed using a CV guidance scenario. The simulation results indicate that 80% of the RMSE is concentrated within the interval of 0 to 0.0412. The correlation analysis of the characteristic indexes shows that the influence of CL, PR, CR, and DT on TTR is significant. PR and CR have a positive effect on TTR, with average improvement rates of about 77.03% and 73.20% as PR and CR increase, respectively, while CL and DT have a negative effect on TTR, with TTR decreasing by 31.21% as DT increases from 0 to 180 s.

  4. Reliability and Efficiency of Generalized Rumor Spreading Model on Complex Social Networks

    International Nuclear Information System (INIS)

    Naimi, Yaghoob; Naimi, Mohammad

    2013-01-01

    We introduce a generalized rumor spreading model and investigate some properties of this model on different complex social networks. Unlike previous rumor models, in which both the spreader-spreader (SS) and the spreader-stifler (SR) interactions have the same rate α, we define α(1) and α(2) for SS and SR interactions, respectively. The effect of varying α(1) and α(2) on the final density of stiflers is investigated. Furthermore, the influence of the topological structure of the network on rumor spreading is studied by analyzing the behavior of several global parameters such as reliability and efficiency. Our results show that while networks with homogeneous connectivity patterns reach a higher reliability, scale-free topologies need less time to reach a steady state with respect to the rumor. (interdisciplinary physics and related areas of science and technology)
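
    For intuition, the following mean-field (homogeneous-mixing) sketch integrates a rumor model with separate SS and SR stifling rates. It is a simplification under stated assumptions, not the paper's network formulation, and all parameter values are invented.

```python
from scipy.integrate import solve_ivp

# Mean-field rumor model: ignorants i, spreaders s, stiflers r,
# average degree k. Spreading I+S -> 2S at rate lam; stifling
# S+S -> R+S at rate a1 and S+R -> 2R at rate a2 (distinct rates).
def rhs(t, y, lam, a1, a2, k):
    i, s, r = y
    di = -lam * k * i * s
    ds = lam * k * i * s - a1 * k * s * s - a2 * k * s * r
    dr = a1 * k * s * s + a2 * k * s * r
    return [di, ds, dr]

y0 = [0.999, 0.001, 0.0]  # almost everyone ignorant at t = 0
sol = solve_ivp(rhs, (0.0, 50.0), y0, args=(0.3, 0.2, 0.1, 6.0))
i_inf, s_inf, r_inf = sol.y[:, -1]
print(f"final density of stiflers (rumor reach): {r_inf:.3f}")
```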

  5. LED Lighting System Reliability Modeling and Inference via Random Effects Gamma Process and Copula Function

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2015-01-01

    Full Text Available Light-emitting diode (LED) lamps have attracted increasing interest in the field of lighting systems due to their low energy consumption and long lifetime. For different functions (i.e., illumination and color), a lamp may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes challenging. In this paper, we assume that the system has two performance characteristics, each governed by a random-effects Gamma process, where the random effects capture unit-to-unit differences. The dependency between the performance characteristics is described by a Frank copula function, and a reliability assessment model is proposed via the copula function. As the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data is given to demonstrate the usefulness and validity of the proposed model and method.
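
    A rough Monte Carlo sketch of the modeling idea: two gamma-process marginals with a multiplicative random effect, coupled through a Frank copula sampled by conditional inversion. Thresholds and parameters are illustrative, and the paper's MCMC estimation step is not reproduced.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

def frank_pair(n, theta):
    """Sample n pairs (u, v) from a Frank copula by conditional inversion."""
    u, w = rng.uniform(size=(2, n))
    a = np.exp(-theta * u)
    v = -np.log1p(w * (np.exp(-theta) - 1.0) / (a - w * (a - 1.0))) / theta
    return u, v

def system_reliability(t, n=50_000, theta=5.0):
    """P(both degradation paths stay below their thresholds at time t).
    Marginals: gamma processes X_j(t) ~ Gamma(shape=a_j * t * eff, scale=b_j),
    with a lognormal random effect eff for unit-to-unit variation.
    All parameter values are illustrative."""
    u, v = frank_pair(n, theta)
    eff = np.exp(0.2 * rng.normal(size=n))          # random effects
    x1 = gamma.ppf(u, a=0.8 * t * eff, scale=1.0)   # e.g. lumen decay proxy
    x2 = gamma.ppf(v, a=0.5 * t * eff, scale=1.2)   # e.g. color shift proxy
    return float(np.mean((x1 < 30.0) & (x2 < 25.0)))

for t in (10, 20, 30, 40):
    print(t, round(system_reliability(t), 3))
```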

  6. Reliability and Efficiency of Generalized Rumor Spreading Model on Complex Social Networks

    Science.gov (United States)

    Yaghoob, Naimi; Mohammad, Naimi

    2013-07-01

    We introduce a generalized rumor spreading model and investigate some properties of this model on different complex social networks. Unlike previous rumor models, in which both the spreader-spreader (SS) and the spreader-stifler (SR) interactions have the same rate α, we define α(1) and α(2) for SS and SR interactions, respectively. The effect of varying α(1) and α(2) on the final density of stiflers is investigated. Furthermore, the influence of the topological structure of the network on rumor spreading is studied by analyzing the behavior of several global parameters such as reliability and efficiency. Our results show that while networks with homogeneous connectivity patterns reach a higher reliability, scale-free topologies need less time to reach a steady state with respect to the rumor.

  7. An Open Modelling Approach for Availability and Reliability of Systems - OpenMARS

    CERN Document Server

    Penttinen, Jussi-Pekka; Gutleber, Johannes

    2018-01-01

    This document introduces and gives the specification for OpenMARS, an open modelling approach for the availability and reliability of systems. It supports the most common risk assessment and operation modelling techniques. Uniquely, OpenMARS allows combining and connecting models defined with different techniques. This ensures that a modeller has a high degree of freedom to accurately describe the modelled system without the limitations imposed by an individual technique. Here the OpenMARS model definition is specified with a tool-independent tabular format, which supports managing models developed in a collaborative fashion. Our research originates in the Future Circular Collider (FCC) study, where we developed the unique features of our concept to model the availability and luminosity production of particle colliders. We were motivated to describe our approach in detail as we see potential further applications in performance and energy efficiency analyses of large scientific infrastructures or industrial processes...

  8. Bayesian reliability modeling and assessment solution for NC machine tools under small-sample data

    Science.gov (United States)

    Yang, Zhaojun; Kan, Yingnan; Chen, Fei; Xu, Binbin; Chen, Chuanhai; Yang, Chuangui

    2015-11-01

    Although Markov chain Monte Carlo (MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control (NC) machine tools is proposed on the basis of Bayes theory. An expert-judgment process for fusing multi-source prior information is developed to obtain the prior distributions of the Weibull parameters and to reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to the two-parameter Weibull distribution to derive formulas for the parameters' posterior distributions and to overcome the computational difficulty of high-dimensional integration. The method is then applied to real data from a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures (MTBF). The relative error of the proposed method is 5.8020×10⁻⁴ compared with the MTBF obtained by the MCMC algorithm, indicating that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is simple, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.
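
    The grid-approximation step can be sketched as follows: evaluate the Weibull log-likelihood on a (shape, scale) grid, normalize to a posterior (a flat prior stands in for the paper's expert-informed prior), and average the implied MTBF. The failure times are invented.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

# Small failure-time sample (hours), illustrative only.
t = np.array([120.0, 340.0, 410.0, 560.0, 890.0])

beta = np.linspace(0.3, 4.0, 200)[:, None]     # shape grid (rows)
eta = np.linspace(50.0, 3000.0, 400)[None, :]  # scale grid (cols)

# Weibull log-likelihood evaluated on the whole grid at once.
loglik = (len(t) * (np.log(beta) - beta * np.log(eta))
          + (beta - 1.0) * np.sum(np.log(t))
          - np.sum((t[:, None, None] / eta) ** beta, axis=0))

post = np.exp(loglik - loglik.max())
post /= post.sum()  # normalized grid posterior (flat prior)

# Posterior mean of MTBF = eta * Gamma(1 + 1/beta).
mtbf_grid = eta * gamma_fn(1.0 + 1.0 / beta)
print(f"posterior-mean MTBF = {np.sum(post * mtbf_grid):.0f} h")
```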

  9. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation (e.g., for the exponential distribution), reliability of systems, availability, preventive maintenance (replacement policies, minimal repair policy, shock models, spares, group maintenance, and periodic inspection), analysis of common-cause failures, and analysis models of repair effect.

  10. Development of Probabilistic Reliability Models of Photovoltaic System Topologies for System Adequacy Evaluation

    Directory of Open Access Journals (Sweden)

    Ahmad Alferidi

    2017-02-01

    Full Text Available The contribution of solar power in electric power systems has been increasing rapidly due to its environmentally friendly nature. Photovoltaic (PV systems contain solar cell panels, power electronic converters, high power switching and often transformers. These components collectively play an important role in shaping the reliability of PV systems. Moreover, the power output of PV systems is variable, so it cannot be controlled as easily as conventional generation due to the unpredictable nature of weather conditions. Therefore, solar power has a different influence on generating system reliability compared to conventional power sources. Recently, different PV system designs have been constructed to maximize the output power of PV systems. These different designs are commonly adopted based on the scale of a PV system. Large-scale grid-connected PV systems are generally connected in a centralized or a string structure. Central and string PV schemes are different in terms of connecting the inverter to PV arrays. Micro-inverter systems are recognized as a third PV system topology. It is therefore important to evaluate the reliability contribution of PV systems under these topologies. This work utilizes a probabilistic technique to develop a power output model for a PV generation system. A reliability model is then developed for a PV integrated power system in order to assess the reliability and energy contribution of the solar system to meet overall system demand. The developed model is applied to a small isolated power unit to evaluate system adequacy and capacity level of a PV system considering the three topologies.

  11. Analytical models for coupling reliability in identical two-magnet systems during slow reversals

    Science.gov (United States)

    Kani, Nickvash; Naeemi, Azad

    2017-12-01

    This paper follows previous works which investigated the strength of dipolar coupling in two-magnet systems. While those works focused on qualitative analyses, this manuscript elucidates reversal through dipolar coupling, culminating in analytical expressions for reversal reliability in identical two-magnet systems. The dipolar field generated by a mono-domain magnetic body can be represented by a tensor containing both longitudinal and perpendicular field components; this field changes orientation and magnitude based on the magnetization of neighboring nanomagnets. While the dipolar field does reduce to its longitudinal component at short time-scales, for slow magnetization reversals the simple longitudinal field representation greatly underestimates the scope of parameters that ensure reliable coupling. For the first time, analytical models that map the geometric and material parameters required for reliable coupling in two-magnet systems are developed. It is shown that in biaxial nanomagnets, the x̂ and ŷ components of the dipolar field contribute to the coupling, while all three dimensions contribute to the coupling between a pair of uniaxial magnets. Additionally, the ratio of the longitudinal and perpendicular components of the dipolar field is also very important: if the perpendicular components in the dipolar tensor are too large, the nanomagnet pair may come to rest in an undesirable meta-stable state away from the free axis. The analytical models formulated in this manuscript map the minimum and maximum parameters for reliable coupling. Using these models, it is shown that there is a very small range of material parameters which can facilitate reliable coupling between perpendicular-magnetic-anisotropy nanomagnets; hence, in-plane nanomagnets are more suitable for coupled systems.

  12. A holistic framework of degradation modeling for reliability analysis and maintenance optimization of nuclear safety systems

    International Nuclear Information System (INIS)

    Lin, Yanhui

    2016-01-01

    Components of nuclear safety systems are in general highly reliable, which makes it difficult to model their degradation and failure behaviors given the limited amount of data available. Moreover, the complexity of this modeling task is increased by the fact that these systems are often subject to multiple competing degradation processes, that these processes can be dependent under certain circumstances, and that they are influenced by a number of external factors (e.g., temperature, stress, mechanical shocks, etc.). In this complicated problem setting, this PhD work aims to develop a holistic framework of models and computational methods for the reliability-based analysis and maintenance optimization of nuclear safety systems, taking into account the available knowledge on the systems, their degradation and failure behaviors, their dependencies, the external influencing factors, and the associated uncertainties. The original scientific contributions of the work are: (1) for single components, we integrate random shocks into multi-state physics models for component reliability analysis, considering general dependencies between the degradation and two types of random shocks; (2) for multi-component systems (with a limited number of components): (a) a piecewise-deterministic Markov process modeling framework is developed to treat degradation dependency in a system whose degradation processes are modeled by physics-based models and multi-state models; (b) epistemic uncertainty due to incomplete or imprecise knowledge is considered and a finite-volume scheme is extended to assess the (fuzzy) system reliability; (c) the mean absolute deviation importance measures are extended for components with multiple dependent competing degradation processes and subject to maintenance; (d) the optimal maintenance policy considering epistemic uncertainty and degradation dependency is derived by combining the finite-volume scheme, differential evolution, and non-dominated sorting differential evolution; (e) …

  13. Fuzzy sets as extension of probabilistic models for evaluating human reliability

    International Nuclear Information System (INIS)

    Przybylski, F.

    1996-11-01

    On the basis of a survey of established quantification methodologies for evaluating human reliability, a new computerized methodology was developed in which user uncertainties are given differentiated consideration. In this quantification method, FURTHER (FUzzy sets Related To Human Error Rate prediction), user uncertainties are quantified separately from model and data uncertainties. Fuzzy sets are applied as tools, which, however, remain hidden from the method's user. In the quantification process the user only chooses an action pattern, performance shaping factors, and natural language expressions. The established method HEART (Human Error Assessment and Reduction Technique) serves as the foundation of the fuzzy set approach FURTHER. In this method, the selection of a basic task together with its basic error probability, the decision on how correct the basic task's selection is, the selection of a performance shaping factor, and the decision on how correct that selection is and how important the performance shaping factor is were identified as aspects of fuzzification. This fuzzification is made on the basis of data collection and information from the literature, as well as estimation by competent persons. To verify the amount of additional information gained by the use of fuzzy sets, a benchmark session was conducted in which twelve actions were assessed by five test persons. For the same degree of detail in the action modelling process, the bandwidths of the interpersonal evaluations are smaller in FURTHER than in HEART. The uncertainties of the individual results could not yet be reduced. The benchmark sessions conducted so far showed plausible results. Further testing of the fuzzy set approach using better-confirmed fuzzy sets can only be achieved in future practical application; adequate procedures, however, are provided. (orig.) [de]

  14. Modeling and simulation for microelectronic packaging assembly manufacturing, reliability and testing

    CERN Document Server

    Liu, Sheng

    2011-01-01

    Although there is an increasing need for modeling and simulation in the IC package design phase, most assembly processes and various reliability tests are still based on the time-consuming "test and try out" method to obtain the best solution. Modeling and simulation can easily enable virtual Design of Experiments (DoE) to achieve the optimal solution. This greatly reduces cost and production time, especially for new product development. Using modeling and simulation will become increasingly necessary for future advances in 3D package development. In this book, Liu and Liu allow people...

  15. Reliability of measurements made on scanned cast models using the 3 Shape R 700 scanner.

    Science.gov (United States)

    Lemos, L S; Rebello, I M C R; Vogel, C J; Barbosa, M C

    2015-01-01

    In dentistry, the latest technological advancements have been incorporated primarily into diagnostic tools such as virtual dental models. The aim of this study was to evaluate the reliability of measurements made on digital cast models scanned with the 3 Shape R 700 scanner (3 Shape, Copenhagen, Denmark), which uses a non-destructive laser beam to reproduce model surfaces so that the plaster model is not destroyed. The sample consisted of 26 cast models, and 6 linear measurements were made on the cast models and compared with the same measurements on the digital models. The measurements assessed were: (1) distance between mandibular canines; (2) distance between mandibular molars; (3) distance between canine and maxillary molar; (4) buccal-lingual diameter of the maxillary central incisor; (5) distance between two points of the incisive papillae of the maxillary and mandibular central incisors; and (6) distance between the buccal surface of the maxillary central incisor and the buccal surface of the mandibular antagonist (overjet). Student's t-test or the Wilcoxon test was used at the 5% significance level, and Lin's concordance test at a 95% confidence interval. The overjet measurement was the only one that showed a statistically significant difference (p < 0.05). Measurements made on digital models scanned with the 3 Shape R 700 scanner are reliable and can be considered an alternative to cast models for performing measurements and analyses in orthodontic practice.

  16. Interrater Reliability of the Postoperative Epidural Fibrosis Classification: A Histopathologic Study in the Rat Model

    OpenAIRE

    Sae-Jung, Surachai; Jirarattanaphochai, Kitti; Sumananont, Chat; Wittayapairoj, Kriangkrai; Sukhonthamarn, Kamolsak

    2015-01-01

    Study Design: Agreement study. Purpose: To validate the interrater reliability of the histopathological classification of post-laminectomy epidural fibrosis in an animal model. Overview of Literature: Epidural fibrosis is a common cause of failed back surgery syndrome. Many animal experiments have been developed to investigate the prevention of epidural fibrosis. One of the common outcome measurements is epidural fibrous adherence grading, but the classification has not yet been validated...

  17. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiments, and mixtures of Weibull...

  18. A comparison between Markovian models and Bayesian networks for treating some dependent events in reliability evaluations

    International Nuclear Information System (INIS)

    Duarte, Juliana P.; Leite, Victor C.; Melo, P.F. Frutuoso e

    2013-01-01

    Bayesian networks have become a very handy tool for solving problems in various application areas. This paper discusses the use of Bayesian networks to treat dependent events in reliability engineering typically modeled by Markovian models. Dependent events play an important role, for example, when treating load-sharing systems, bridge systems, common-cause failures, and switching systems (those for which a standby component is activated after the main one fails by means of a switching mechanism). Repair plays an important role in all these cases (for example, through the number of repairmen). All Bayesian network calculations are performed by means of the Netica™ software, from Norsys Software Corporation, with Fortran 90 used to evaluate them over time. The discussion considers the development of time-dependent reliability figures of merit, which are easily obtained through Markovian models but not through Bayesian networks, because the latter need probability figures as input rather than failure and repair rates. Bayesian networks produced results in very good agreement with those of Markov models and pivotal decomposition. Static and discrete-time Bayesian networks (DTBNs) were used in order to check their capability to model specific situations, such as switching failures in cold-standby systems. The DTBN was more flexible for modeling systems where the time of occurrence of an event is important, for example, standby failure and repair. However, the static network model showed results as good as those of the DTBN through a much simpler approach. (author)

  19. Automatic creation of Markov models for reliability assessment of safety instrumented systems

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2008-01-01

    After the release of new international functional safety standards like IEC 61508, people care more about the safety and availability of safety instrumented systems. Markov analysis is a powerful and flexible technique to assess the reliability measures of safety instrumented systems, but it is error-prone and time-consuming to create Markov models manually. This paper presents a new technique to automatically create Markov models for the reliability assessment of safety instrumented systems. Many safety-related factors, such as failure modes, self-diagnostics, restorations, common cause, and voting, are included in the Markov models. A framework is generated first, based on voting, failure modes, and self-diagnostics; then repairs and common-cause failures are incorporated into the framework to build a complete Markov model. Eventual simplification of the Markov models can be done by state merging. Examples given in this paper show how explosively the size of a Markov model increases as the system becomes even slightly more complicated, as well as the advantage of automatic creation of Markov models.
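
    The spirit of the technique, generating the Markov model from component-level rules instead of enumerating transitions by hand, can be sketched for a 1oo2 voted pair with repair and common-cause failure; the rates are illustrative and the construction is far simpler than the paper's.

```python
import numpy as np
from itertools import product
from scipy.linalg import expm

# Rates (per hour), illustrative: lam = dangerous failure, mu = repair,
# beta = common-cause fraction (beta-factor model).
lam, mu, beta = 1e-4, 1e-2, 0.05

# Generate the state space and transitions from rules, not by hand.
states = list(product(("ok", "failed"), repeat=2))
index = {s: i for i, s in enumerate(states)}
Q = np.zeros((len(states), len(states)))

for s in states:
    i = index[s]
    # independent failure and repair of each channel
    for ch in range(2):
        other = s[:ch] + (("failed" if s[ch] == "ok" else "ok"),) + s[ch + 1:]
        rate = (1 - beta) * lam if s[ch] == "ok" else mu
        Q[i, index[other]] += rate
    # common-cause failure takes both working channels down at once
    if s == ("ok", "ok"):
        Q[i, index[("failed", "failed")]] += beta * lam
    Q[i, i] = -Q[i].sum()  # diagonal closes the generator row

p = expm(Q * 8760.0)[index[("ok", "ok")]]  # state distribution after 1 year
unavail = p[index[("failed", "failed")]]   # 1oo2 is down only if both fail
print(f"states: {len(states)}, unavailability = {unavail:.2e}")
```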

  20. Human reliability in non-destructive inspections of nuclear power plant components: modeling and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Soares, Wellington Antonio; Marques, Raíssa Oliveira; Silva Júnior, Silvério Ferreira da; Raso, Amanda Laureano, E-mail: vasconv@cdtn.br, E-mail: soaresw@cdtn.br, E-mail: raissaomarques@gmail.com, E-mail: silvasf@cdtn.br, E-mail: amandaraso@hotmail.com [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    Non-destructive inspection (NDI) is one of the key elements in ensuring the quality of engineering systems and their safe use. NDI is a very complex task, during which the inspectors have to rely on their sensory, perceptual, cognitive, and motor skills. It requires high vigilance, since it is often carried out on large components, over long periods of time, in hostile environments, and under workplace restrictions. A successful NDI requires careful planning, the choice of appropriate NDI methods and inspection procedures, and qualified and trained inspection personnel. A failure of NDI to detect critical defects in safety-related components of nuclear power plants, for instance, may lead to catastrophic consequences for workers, the public, and the environment. Therefore, ensuring that NDI methods are reliable and capable of detecting all critical defects is of utmost importance. Despite the increased use of automation in NDI, human inspectors, and thus human factors, still play an important role in NDI reliability. Human reliability is the probability of humans conducting specific tasks with satisfactory performance. Many techniques are suitable for modeling and analyzing human reliability in the NDI of nuclear power plant components, among which Failure Modes and Effects Analysis (FMEA) and the Technique for Human Error Rate Prediction (THERP) can be highlighted. The application of these techniques is illustrated in an example of qualitative and quantitative studies to improve typical NDI of pipe segments of a core cooling system of a nuclear power plant by acting on human factors issues. (author)

  1. Human reliability in non-destructive inspections of nuclear power plant components: modeling and analysis

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Soares, Wellington Antonio; Marques, Raíssa Oliveira; Silva Júnior, Silvério Ferreira da; Raso, Amanda Laureano

    2017-01-01

    Non-destructive inspection (NDI) is one of the key elements in ensuring the quality of engineering systems and their safe use. NDI is a very complex task, during which the inspectors have to rely on their sensory, perceptual, cognitive, and motor skills. It requires high vigilance, since it is often carried out on large components, over long periods of time, in hostile environments, and under workplace restrictions. A successful NDI requires careful planning, the choice of appropriate NDI methods and inspection procedures, and qualified and trained inspection personnel. A failure of NDI to detect critical defects in safety-related components of nuclear power plants, for instance, may lead to catastrophic consequences for workers, the public, and the environment. Therefore, ensuring that NDI methods are reliable and capable of detecting all critical defects is of utmost importance. Despite the increased use of automation in NDI, human inspectors, and thus human factors, still play an important role in NDI reliability. Human reliability is the probability of humans conducting specific tasks with satisfactory performance. Many techniques are suitable for modeling and analyzing human reliability in the NDI of nuclear power plant components, among which Failure Modes and Effects Analysis (FMEA) and the Technique for Human Error Rate Prediction (THERP) can be highlighted. The application of these techniques is illustrated in an example of qualitative and quantitative studies to improve typical NDI of pipe segments of a core cooling system of a nuclear power plant by acting on human factors issues. (author)

  2. A model for reliability analysis and calculation applied in an example from chemical industry

    Directory of Open Access Journals (Sweden)

    Pejović Branko B.

    2010-01-01

    Full Text Available The subject of this paper is reliability design for the polymerization processes that occur in reactors in the chemical industry. The designed model is used to determine reliability characteristics and indicators, which enables identification of the basic factors behind poor process performance. This reduces the anticipated losses through the ability to control them, and enables improvement of production quality, which is the major goal of the paper. The reliability analysis and calculation use a deductive method based on a fault tree scheme of the system built from inductive conclusions, involving standard logical symbols and the rules of Boolean algebra and mathematical logic. The paper concludes with the results of the work in the form of a quantitative and qualitative reliability analysis of the observed process, which provides complete information on the probability of the top event in the process and supports objective decision-making and alternative solutions.
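
    A minimal sketch of the top-event calculation: a small illustrative fault tree (not the paper's) evaluated exactly by enumerating basic-event states, with the minimal-cut-set rare-event approximation for comparison.

```python
from itertools import product

# Basic-event probabilities, illustrative only.
P = {"pump_fails": 0.05, "valve_sticks": 0.02,
     "sensor_fails": 0.10, "operator_error": 0.08}

def top_event(state):
    """TOP = (pump_fails OR valve_sticks) AND (sensor_fails OR operator_error)."""
    return ((state["pump_fails"] or state["valve_sticks"])
            and (state["sensor_fails"] or state["operator_error"]))

# Exact top-event probability by enumerating all basic-event states
# (feasible for small trees).
names = list(P)
p_top = 0.0
for bits in product((True, False), repeat=len(names)):
    state = dict(zip(names, bits))
    prob = 1.0
    for n, b in zip(names, bits):
        prob *= P[n] if b else (1.0 - P[n])
    if top_event(state):
        p_top += prob
print(f"P(top event) = {p_top:.6f}")

# Rare-event approximation from the minimal cut sets, for comparison.
approx = sum(P[a] * P[b] for a in ("pump_fails", "valve_sticks")
             for b in ("sensor_fails", "operator_error"))
print(f"cut-set approximation = {approx:.6f}")
```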

  3. The Application of the Model Correction Factor Method to a Reliability Analysis of a Composite Blade Structure

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimiroy; Friis-Hansen, Peter; Berggreen, Christian

    2009-01-01

    This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternative to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which...

  4. Immunization of stromal cell targeting fibroblast activation protein providing immunotherapy to breast cancer mouse model.

    Science.gov (United States)

    Meng, Mingyao; Wang, Wenju; Yan, Jun; Tan, Jing; Liao, Liwei; Shi, Jianlin; Wei, Chuanyu; Xie, Yanhua; Jin, Xingfang; Yang, Li; Jin, Qing; Zhu, Huirong; Tan, Weiwei; Yang, Fang; Hou, Zongliu

    2016-08-01

    Unlike heterogeneous tumor cells, cancer-associated fibroblasts (CAF) are genetically more stable, which makes them a reliable target for tumor immunotherapy. Fibroblast activation protein (FAP), which is restrictively expressed by tumor cells and CAF in vivo and plays a prominent role in tumor initiation, progression, and metastasis, can function as a tumor rejection antigen. In the current study, we constructed artificial FAP(+) stromal cells which mimicked the FAP(+) CAF in vivo. We immunized a breast cancer mouse model with FAP(+) stromal cells to perform immunotherapy against FAP(+) cells in the tumor microenvironment. By forced expression of FAP, we obtained FAP(+) stromal cells whose phenotype was CD11b(+)/CD34(+)/Sca-1(+)/FSP-1(+)/MHC class I(+). Interestingly, the proliferation capacity of the fibroblasts was significantly enhanced by FAP. In the breast cancer-bearing mouse model, vaccination with FAP(+) stromal cells significantly inhibited the growth of allograft tumors and reduced lung metastasis. T cell depletion assays suggested that both CD4(+) and CD8(+) T cells were involved in the tumor cytotoxic immune response. Furthermore, tumor tissue from FAP-immunized mice revealed that targeting FAP(+) CAF induced apoptosis and decreased collagen type I and CD31 expression in the tumor microenvironment. These results indicate that immunization with FAP(+) stromal cells led to disruption of the tumor microenvironment. Our study may provide a novel strategy for immunotherapy of a broad range of cancers.

  5. Trapezoidal Numerical Integration of Fire Radiative Power (FRP) Provides More Reliable Estimation of Fire Radiative Energy (FRE) and so Biomass Consumption Than Conventional Estimation Methods

    Science.gov (United States)

    Sathyachandran, S. K.; Roy, D. P.; Boschetti, L.

    2014-12-01

    The Fire Radiative Power (FRP) [MW] is a measure of the rate of biomass combustion and can be retrieved from ground-based and satellite observations using middle infrared measurements. The temporal integral of FRP is the Fire Radiative Energy (FRE) [MJ], which is related linearly to the total biomass consumption and so to pyrogenic emissions. Satellite-derived biomass consumption and emissions estimates have conventionally been derived by computing the summed total FRP, or the average FRP (arithmetic average of FRP retrievals), over spatial geographic grids for fixed time periods. These two methods are prone to estimation bias, especially under irregular sampling conditions such as those provided by polar-orbiting satellites, because the FRP can vary rapidly in space and time as a function of fire behavior. Linear temporal integration of FRP, taking into account when the FRP values were observed and using the trapezoidal rule for numerical integration, has been suggested as an alternative FRE estimation method. In this study, FRP data measured rapidly with a dual-band radiometer over eight prescribed fires are used to compute FRE values using the sum, mean, and trapezoidal estimation approaches under a variety of simulated irregular sampling conditions. The estimated values are compared to biomass consumption measurements for each of the eight fires to provide insight into which method provides more accurate and precise biomass consumption estimates. The three methods are also applied to continental MODIS FRP data to study their differences using polar-orbiting satellite data. The research findings indicate that trapezoidal FRP numerical integration provides the most reliable estimator.
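
    The difference between the estimators is easy to reproduce on synthetic data. In the sketch below, the fire curve, sample times, and compositing conventions are invented stand-ins for the paper's radiometer data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic fire: FRP [MW] rises and decays over ~3 hours.
t = np.linspace(0.0, 10800.0, 2000)                    # seconds
frp = 500.0 * (t / 3600.0) * np.exp(1.0 - t / 3600.0)  # MW
fre_true = np.trapz(frp, t)                            # reference FRE [MJ]

# Sparse, irregular sampling, as from a polar-orbiting satellite.
idx = np.sort(rng.choice(len(t), size=8, replace=False))
t_s, frp_s = t[idx], frp[idx]

fre_trapz = np.trapz(frp_s, t_s)          # time-weighted linear integration
fre_mean = frp_s.mean() * (t[-1] - t[0])  # average FRP x compositing period

print(f"true FRE     {fre_true:12.0f} MJ")
print(f"trapezoidal  {fre_trapz:12.0f} MJ")
print(f"mean-FRP     {fre_mean:12.0f} MJ")
```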

  6. Testing comparison models of DASS-12 and its reliability among adolescents in Malaysia.

    Science.gov (United States)

    Osman, Zubaidah Jamil; Mukhtar, Firdaus; Hashim, Hairul Anuar; Abdul Latiff, Latiffah; Mohd Sidik, Sherina; Awang, Hamidin; Ibrahim, Normala; Abdul Rahman, Hejar; Ismail, Siti Irma Fadhilah; Ibrahim, Faisal; Tajik, Esra; Othman, Norlijah

    2014-10-01

    The 21-item Depression, Anxiety and Stress Scale (DASS-21) is frequently used in non-clinical research to measure mental health factors among adults. However, previous studies have concluded that the 21 items are not stable for the adolescent population. Thus, the aims of this study are to examine the factor structure and to report the reliability of a refined version of the DASS consisting of 12 items. A total of 2850 students (aged 13 to 17 years) from the three major ethnic groups in Malaysia completed the DASS-21. The study was conducted at 10 randomly selected secondary schools in the northern state of Peninsular Malaysia. The study population comprised secondary school students (Forms 1, 2 and 4) from the selected schools. Based on the results of the EFA stage, 12 items were included in a final CFA to test the fit of the model. Using maximum likelihood procedures to estimate the model, the selected fit indices indicated a close model fit (χ² = 132.94, df = 57, p < .001; CFI = .96; RMR = .02; RMSEA = .04). Moreover, significant loadings of all the unstandardized regression weights implied acceptable convergent validity. Besides the convergent validity of the items, discriminant validity of the subscales was also evident from the moderate latent factor inter-correlations, which ranged from .62 to .75. The subscale reliability was further estimated using Cronbach's alpha, and adequate reliability of the subscales was obtained (Total = .76; Depression = .68; Anxiety = .53; Stress = .52). The new version of the 12-item DASS for adolescents in Malaysia (DASS-12) is reliable and has a stable factor structure, and thus it is a useful instrument for distinguishing between depression, anxiety and stress.

  7. Accelerated Monte Carlo system reliability analysis through machine-learning-based surrogate models of network connectivity

    International Nuclear Information System (INIS)

    Stern, R.E.; Song, J.; Work, D.B.

    2017-01-01

    The two-terminal reliability problem in system reliability analysis is known to be computationally intractable for large infrastructure graphs. Monte Carlo techniques can estimate the probability of a disconnection between two points in a network by selecting a representative sample of network component failure realizations and determining the source-terminal connectivity of each realization. To reduce the runtime required for the Monte Carlo approximation, this article proposes an approximate framework in which the connectivity check of each sample is estimated using a machine-learning-based classifier. The framework is implemented using both a support vector machine (SVM) and a logistic-regression-based surrogate model. Numerical experiments are performed on the California gas distribution network using the epicenter and magnitude of the 1989 Loma Prieta earthquake as well as randomly generated earthquakes. It is shown that the SVM and logistic regression surrogate models are able to predict network connectivity with accuracies of 99%, and are 1–2 orders of magnitude faster than using a Monte Carlo method with an exact connectivity check. - Highlights: • Surrogate models of network connectivity are developed by machine-learning algorithms. • The developed surrogate models can reduce the runtime required for Monte Carlo simulations. • Support vector machines and logistic regression are employed to develop the surrogate models. • A numerical example of the California gas distribution network demonstrates the proposed approach. • The developed models have accuracies of 99%, and are 1–2 orders of magnitude faster than MCS.
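
    A toy version of the framework: label a training sample of failure realizations with an exact connectivity check, train an SVM surrogate, and push the Monte Carlo sample through the cheap classifier. The graph, failure probabilities, and sample sizes are invented, and networkx/scikit-learn stand in for whatever tooling the authors used.

```python
import networkx as nx
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Toy infrastructure network; edges fail independently in each realization.
G = nx.random_geometric_graph(60, 0.25, seed=2)
edges = list(G.edges)
p_fail = rng.uniform(0.2, 0.6, size=len(edges))  # illustrative fragilities
source, terminal = 0, 59

def sample(n):
    """Draw failure realizations and label source-terminal connectivity."""
    X = rng.uniform(size=(n, len(edges))) > p_fail  # True = edge survives
    y = np.empty(n, dtype=int)
    for k, row in enumerate(X):
        H = G.edge_subgraph([e for e, up in zip(edges, row) if up])
        y[k] = int(source in H and terminal in H
                   and nx.has_path(H, source, terminal))
    return X.astype(float), y

# Train the surrogate on a modest exactly-labelled sample ...
X_tr, y_tr = sample(2000)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)

# ... then push the large Monte Carlo sample through the cheap classifier.
X_mc, y_exact = sample(5000)   # exact labels kept only to check accuracy
y_hat = clf.predict(X_mc)
print(f"surrogate accuracy: {(y_hat == y_exact).mean():.3f}")
print(f"P(source-terminal disconnection) ~ {1.0 - y_hat.mean():.3f}")
```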

  8. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

    RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report not only states the facts of the Significant System Events (ESS) but also underlines the main elements bearing on the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  9. High-Speed Shaft Bearing Loads Testing and Modeling in the NREL Gearbox Reliability Collaborative: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    McNiff, B.; Guo, Y.; Keller, J.; Sethuraman, L.

    2014-12-01

    Bearing failures in the high-speed output stage of the gearbox are plaguing the wind turbine industry. Accordingly, the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) has performed an experimental and theoretical investigation of loads within these bearings. The purpose of this paper is to describe the instrumentation, calibrations, data post-processing, and initial results from this testing and modeling effort. Measured high-speed shaft (HSS) torque, bending, and bearing loads are related to model predictions. Of additional interest is examining whether the shaft measurements can be simply related to bearing load measurements, eliminating the need for invasive modifications of the bearing races for such instrumentation.

  10. Final Report: System Reliability Model for Solid-State Lighting (SSL) Luminaires

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J. Lynn [RTI International, Research Triangle Park, NC (United States)

    2017-05-31

    The primary objective of this project was to develop and validate reliability models and accelerated stress testing (AST) methodologies for predicting the lifetime of integrated SSL luminaires. The study examined the likely failure modes for SSL luminaires, including abrupt failure, excessive lumen depreciation, unacceptable color shifts, and increased power consumption. Data on the relative distribution of these failure modes were acquired through extensive accelerated stress tests and combined with industry data and other sources of information on LED lighting. These data were compiled and utilized to build models of the aging behavior of key luminaire optical and electrical components.

  11. Incorporation of Markov reliability models for digital instrumentation and control systems into existing PRAs

    International Nuclear Information System (INIS)

    Bucci, P.; Mangan, L. A.; Kirschenbaum, J.; Mandelli, D.; Aldemir, T.; Arndt, S. A.

    2006-01-01

    Markov models have the ability to capture the statistical dependence between failure events that can arise in the presence of complex dynamic interactions between components of digital instrumentation and control systems. One obstacle to the use of such models in an existing probabilistic risk assessment (PRA) is that most of the currently available PRA software is based on the static event-tree/fault-tree methodology which often cannot represent such interactions. We present an approach to the integration of Markov reliability models into existing PRAs by describing the Markov model of a digital steam generator feedwater level control system, how dynamic event trees (DETs) can be generated from the model, and how the DETs can be incorporated into an existing PRA with the SAPHIRE software. (authors)

  12. Human-centered modeling in human reliability analysis: some trends based on case studies

    International Nuclear Information System (INIS)

    Mosneron-Dupin, F.; Reer, B.; Heslinga, G.; Straeter, O.; Gerdes, V.; Saliou, G.; Ullwer, W.

    1997-01-01

    As an informal working group of researchers from France, Germany, and The Netherlands created in 1993, the EARTH association is investigating significant subjects in the field of human reliability analysis (HRA). Our initial review of cases from nuclear operating experience showed that decision-based unrequired actions (DUA) contribute significantly to risk; our evaluation of current HRA methods, on the other hand, showed that these methods do not cover such actions adequately. In particular, practice-oriented guidelines for their predictive identification are lacking. We assumed that a basic cause of these difficulties was that the methods rely on a limited representation of the stimulus-organism-response (SOR) paradigm. We proposed a human-centered model, which better highlights the active role of the operators and the importance of their culture, attitudes, and goals. This orientation was encouraged by our review of current HRA research activities. We therefore decided to pursue progress by identifying cognitive tendencies in the context of operating and simulator experience. For this purpose, advanced approaches for retrospective event analysis were discussed and some orientations for improvement were proposed. By analyzing cases, various cognitive tendencies were identified, together with useful information about their context. Some of them match psychological findings already published in the literature; some are not covered adequately by the literature that we reviewed. Finally, this exploratory study shows that contextual and case-illustrated findings about cognitive tendencies provide useful help for the predictive identification of DUA in HRA. More research should be carried out to complement our findings and to elaborate more detailed and systematic guidelines for using them in HRA studies.

  13. Evaluation of Fatigue Life Reliability of Steering Knuckle Using Pearson Parametric Distribution Model

    Directory of Open Access Journals (Sweden)

    E. A. Azrulhisham

    2010-01-01

    Full Text Available The steering module is the part of the automotive suspension system that provides a means for accurate vehicle placement and stability control. Components such as the steering knuckle are subject to fatigue failures due to cyclic loads arising from various driving conditions. This paper describes a method used in the fatigue life reliability evaluation of the knuckle used in a passenger car steering system. An accurate representation of Belgian pave service loads, in terms of a response-time history signal, was obtained from an accredited test track using road load data acquisition. The acquired service load data were replicated on a durability test rig, and the S-N method was used to estimate the fatigue life. A Pearson system was developed to evaluate the predicted fatigue life reliability by considering the variations in material properties. Considering the random loads experienced by the steering knuckle, it is found that the shortest life appears in the vertical load direction, with the lowest fatigue life reliability between 14,000 and 16,000 cycles. Taking into account the inconsistency of the material properties, the proposed method is capable of providing the probability of failure of mass-produced parts.
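
    As an illustration, the sketch below fits a Pearson type III curve (one member of the Pearson system, standing in for the paper's full system) to synthetic fatigue lives and reads off reliabilities in the 14,000-16,000-cycle range.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic fatigue lives (cycles) under vertical loading; invented data
# centered near the 14,000-16,000-cycle range discussed in the abstract.
lives = rng.gamma(shape=25.0, scale=600.0, size=40)

# Fit a Pearson type III distribution to the sample.
skew, loc, scale = stats.pearson3.fit(lives)
print(f"fitted Pearson III: skew={skew:.2f}, loc={loc:.0f}, scale={scale:.0f}")

# Reliability = P(life > n cycles), from the survival function.
for n in (14000, 15000, 16000):
    R = stats.pearson3.sf(n, skew, loc=loc, scale=scale)
    print(f"reliability at {n:>6d} cycles: {R:.3f}")
```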

  14. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    Science.gov (United States)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1 − α)·100% two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of three test statistics, namely Cramér-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
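
    The maximum-likelihood estimates for this power-law (Crow-AMSAA-type) process are short enough to sketch; the failure times and test duration below are invented, not the shuttle data.

```python
import numpy as np

# Weibull (power-law) intensity u(t) = lam * beta * t**(beta - 1),
# time-terminated testing: failures at times t_i in (0, T].
t = np.array([120.0, 480.0, 700.0, 1300.0, 1900.0, 2400.0])  # illustrative
T = 3000.0  # total test time

n = len(t)
beta_hat = n / np.sum(np.log(T / t))  # shape parameter (growth index)
lam_hat = n / T ** beta_hat           # scale parameter
# Instantaneous MTBF at the end of test = 1 / intensity at T.
mtbf = 1.0 / (lam_hat * beta_hat * T ** (beta_hat - 1.0))

print(f"beta = {beta_hat:.3f}  (beta < 1 indicates reliability growth)")
print(f"instantaneous MTBF at T: {mtbf:.0f} h")
```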

  15. Reliability analysis of laser ultrasonics for train axle diagnostics based on model assisted POD curves

    Science.gov (United States)

    Malik, M. S.; Cavuto, A.; Martarelli, M.; Pandarese, G.; Revel, G. M.

    2014-05-01

    High-speed train axles are integrated for a lifetime, and it is time- and resource-consuming to conduct in-service inspection with high accuracy. Laser ultrasonics has been proposed as a solution: a subset of non-contact measuring methods that is effective also for hard-to-reach areas and has recently proved effective with a Laser Doppler Vibrometer (LDV) or air-coupled probes in reception. A reliability analysis of laser ultrasonics for this specific application is performed here. The research is mainly based on a numerical study of the effect of high-energy laser pulses on the surface of a steel axle and of the behavior of the ultrasonic waves in detecting possible defects. The Probability of Detection (POD) concept is used as an estimate of the reliability of the inspection method. In particular, Model Assisted Probability of Detection (MAPOD), a modified form of POD in which models are used to infer results for a decisive statistical approach to the POD curve, is adopted here. This paper implements the approach by taking inputs from limited experiments conducted on a high-speed train axle using laser ultrasonics (source: pulsed Nd:YAG; reception: high-frequency LDV) to calibrate a multiphysics FE model, and by using the calibrated model to generate data samples statistically representative of damaged train axles. The simulated flaws are in accordance with the real defects present on the axle. A set of flaws of different depths has been modeled in order to assess the laser ultrasonics POD for this specific application.

  16. Assessment of High Reliability Organizations Model in Farabi Eye Hospital, Tehran, Iran.

    Science.gov (United States)

    Mousavi, Seyed Mohammad Hadi; Jabbarvand Behrouz, Mahmoud; Zerati, Hojjat; Dargahi, Hossein; Asadollahi, Akram; Mousavi, Seyed Ahmad; Ashrafi, Elham; Aliyari, Abolfazl

    2018-01-01

    A high-reliability organization (HRO) is a distinct paradigm that can support medical error reduction and patient safety improvement. Hospitals, as vital organizations in the health care system, can transform into HROs to achieve optimal performance and maximum safety and to manage unpredicted events efficiently. The aim of this research was therefore to determine the knowledge of the managers and staff of Farabi Eye Hospital, Tehran, Iran about the HRO model, and the extent to which the HRO model had been established in this hospital in 2015-2016. In this descriptive-analytical and cross-sectional study, data were collected through an HRO questionnaire and checklist. The validity of the questionnaire and checklist was confirmed by an expert panel, and the reliability of the questionnaire by Cronbach's alpha (0.85). The collected data were analyzed with Spearman's correlation coefficient and the Mann-Whitney test using SPSS version 19. Most of the respondents were familiar with the HRO model to some extent, and only 18.8% had a high level of knowledge in this regard. In addition, there was no significant correlation between the knowledge of staff and managers and the establishment of the HRO model in Farabi Eye Hospital. The managers and staff of Farabi Eye Hospital did not have a high level of knowledge of the HRO model and had little information about the functions and characteristics of these organizations. We therefore suggest that HRO training courses and workshops be held in this hospital to increase the knowledge of managers and staff and to support better establishment of the HRO model.

  17. Reliability-Based Design of Wind Turbine Foundations – Computational Modelling

    DEFF Research Database (Denmark)

    Vahdatirad, Mohammad Javad

    Among renewable green energy generators, wind turbines are the most technically and economically efficient. Therefore, wind power plants are experiencing a competitive increasing trend in global growth. The gas and oil industry is shrouded by political conflict, not the least of which is burning … increased cost-effectiveness in wind turbines, and an optimized design must be implemented for the expensive structural components. The traditional wind turbine foundation typically expends 25-30% of the total wind turbine budget; thus it is one of the most costly fabrication components. Therefore …, a reduction in foundation cost by optimizing the foundation structural design is the best solution for cost-effectiveness. An optimized wind turbine foundation design should provide a suitable target reliability level. Unfortunately, the reliability level is not identified in most current deterministic design…

  18. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliability of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though a few models exist for estimating the reliability of digital systems, we develop a new integrated model that is more realistic than the existing ones. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, with the boundary between the two phases being the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DDS) shows that the estimated reliability of the system is quite reasonable and realistic.

  19. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. This concept offers a worthwhile point of departure for three adjustments to a literature model in terms of maintenance time, workforce performance, and return on workforce investments, which together explain the results obtained. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better solution quality from the DE algorithm compared with those of the genetic algorithm and the particle swarm optimisation algorithm, demonstrating the superiority of the proposed procedure over them. Second, the analytical discourse, framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is a novelty. The work provides more insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems, and demonstrates the capacity to generate substantially helpful information for practice.
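
    A toy version of the optimization step, using SciPy's differential evolution on an invented workforce-cost objective with a penalized reliability constraint; it illustrates the DE mechanics only, not the paper's fuzzy goal programming formulation or data.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Choose regular / overtime / contract man-hours x to minimize cost while
# keeping a stylized human-reliability proxy above a threshold.
# All coefficients are illustrative, not taken from the paper.
cost = np.array([20.0, 30.0, 45.0])    # cost per man-hour
effect = np.array([1.0, 0.8, 0.6])     # effectiveness per man-hour

def reliability(x):
    """Saturating proxy for workforce reliability (illustrative)."""
    return 1.0 - np.exp(-effect @ x / 400.0)

def objective(x):
    """Cost plus a quadratic penalty when reliability drops below 0.95."""
    penalty = 1e6 * max(0.0, 0.95 - reliability(x)) ** 2
    return cost @ x + penalty

res = differential_evolution(objective, bounds=[(0, 1000)] * 3, seed=4)
x = res.x
print(f"man-hours: {np.round(x, 1)}, cost = {cost @ x:.0f}, "
      f"reliability = {reliability(x):.3f}")
```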

  20. Reliability of a Novel Model for Drug Release from 2D HPMC-Matrices

    Directory of Open Access Journals (Sweden)

    Rumiana Blagoeva

    2010-04-01

    A novel model of drug release from 2D-HPMC matrices is considered. A detailed mathematical description of matrix swelling and the effect of the initial drug loading are introduced. A numerical approach to the solution of the posed nonlinear 2D problem is used, based on finite element domain approximation and a time difference method. The reliability of the model is investigated in two steps: numerical evaluation of the water uptake parameters, and evaluation of the drug release parameters against available experimental data. The proposed numerical procedure for fitting the model is validated by performing different numerical examples of drug release in two cases (with and without taking into account initial drug loading). The goodness of fit, evaluated by the coefficient of determination, is shown to be very good with few exceptions. The obtained results show better model fitting when accounting for the effect of initial drug loading (especially for larger values).
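
    The paper's model is a nonlinear 2D finite element problem, but the fitting-and-goodness-of-fit step it relies on can be illustrated with a much simpler release law. A hedged sketch, assuming a generic power-law release profile and synthetic data:

```python
# Fit a generic power-law release profile and report R^2; data are
# synthetic and the model is a stand-in for the paper's 2D FEM model.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.5, 1, 2, 4, 6, 8, 12])                           # hours
release = np.array([0.12, 0.19, 0.30, 0.45, 0.55, 0.63, 0.75])   # fraction released

def power_law(t, k, n):
    return k * t ** n

(k, n), _ = curve_fit(power_law, t, release, p0=(0.1, 0.5))
pred = power_law(t, k, n)
ss_res = np.sum((release - pred) ** 2)
ss_tot = np.sum((release - release.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot     # coefficient of determination
print(f"k={k:.3f}, n={n:.3f}, R^2={r2:.4f}")
```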

  1. Algorithms for Bayesian network modeling and reliability assessment of infrastructure systems

    International Nuclear Information System (INIS)

    Tien, Iris; Der Kiureghian, Armen

    2016-01-01

    Novel algorithms are developed to enable the modeling of large, complex infrastructure systems as Bayesian networks (BNs). These include a compression algorithm that significantly reduces the memory storage required to construct the BN model, and an updating algorithm that performs inference on compressed matrices. These algorithms address one of the major obstacles to widespread use of BNs for system reliability assessment, namely the exponentially increasing amount of information that needs to be stored as the number of components in the system increases. The proposed compression and inference algorithms are described and applied to example systems to investigate their performance compared to that of existing algorithms. Orders of magnitude savings in memory storage requirement are demonstrated using the new algorithms, enabling BN modeling and reliability analysis of larger infrastructure systems. - Highlights: • Novel algorithms developed for Bayesian network modeling of infrastructure systems. • Algorithm presented to compress information in conditional probability tables. • Updating algorithm presented to perform inference on compressed matrices. • Algorithms applied to example systems to investigate their performance. • Orders of magnitude savings in memory storage requirement demonstrated.
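
    The core idea of the compression algorithm can be illustrated simply: columns of a system's conditional probability table often contain long runs of identical values, which run-length encoding stores compactly. The sketch below is a generic RLE, not the authors' full algorithm (which combines run-length and phrase compression and performs inference on the compressed form):

```python
# Generic run-length encoding of one CPT column; the authors' algorithm
# additionally uses phrase compression and inference on the compressed form.
def rle_encode(column):
    runs = []
    for v in column:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

# System-state column over 2**4 component-state combinations (illustrative).
cpt_column = [1] * 11 + [0] * 5
print(rle_encode(cpt_column))   # [[1, 11], [0, 5]]
```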

  2. Demands placed on waste package performance testing and modeling by some general results on reliability analysis

    International Nuclear Information System (INIS)

    Chesnut, D.A.

    1991-09-01

    Waste packages for a US nuclear waste repository are required to provide reasonable assurance of maintaining substantially complete containment of radionuclides for 300 to 1000 years after closure. The waiting time to failure for complex failure processes affecting engineered or manufactured systems is often found to be an exponentially-distributed random variable. Assuming that this simple distribution can be used to describe the behavior of a hypothetical single-barrier waste package, calculations presented in this paper show that the mean time to failure (the only parameter needed to completely specify an exponential distribution) would have to be more than 10^7 years in order to provide reasonable assurance of meeting this requirement. With two independent barriers, each would need to have a mean time to failure of only 10^5 years to provide the same reliability. Other examples illustrate how multiple barriers can provide a strategy for not only achieving but demonstrating regulatory compliance.
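
    The barrier arithmetic is easy to verify numerically. Assuming exponentially distributed times to failure and a 1000-year containment period, a single barrier with a 10^7-year mean time to failure and two independent barriers each with a 10^5-year mean give roughly the same failure probability:

```python
# Verify the single- vs. double-barrier comparison for a 1000-year period,
# assuming exponentially distributed times to failure.
import math

t = 1000.0
p_single = 1.0 - math.exp(-t / 1e7)          # one barrier, MTTF = 1e7 years
p_double = (1.0 - math.exp(-t / 1e5)) ** 2   # two independent barriers, MTTF = 1e5 years each
print(f"single barrier P(fail by 1000 yr) ~ {p_single:.2e}")   # ~1.0e-4
print(f"double barrier P(fail by 1000 yr) ~ {p_double:.2e}")   # ~9.9e-5
```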

  3. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, highlights the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be...

  4. solveME: fast and reliable solution of nonlinear ME models

    DEFF Research Database (Denmark)

    Yang, Laurence; Ma, Ding; Ebrahim, Ali

    2016-01-01

    Background: Genome-scale models of metabolism and macromolecular expression (ME) significantly expand the scope and predictive capabilities of constraint-based modeling. ME models present considerable computational challenges: they are much (>30 times) larger than corresponding metabolic reconstructions (M models), are multiscale, and growth maximization is a nonlinear programming (NLP) problem, mainly due to macromolecule dilution constraints. Results: Here, we address these computational challenges. We develop a fast and numerically reliable solution method for growth maximization in ME models... methods will accelerate the wide-spread adoption of ME models for researchers in these fields. Electronic supplementary material: The online version of this article (doi:10.1186/s12859-016-1240-1) contains supplementary material, which is available to authorized users.

  5. Reproducibility, reliability and validity of measurements obtained from Cecile3 digital models

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Watanabe-Kanno

    2009-09-01

    The aim of this study was to determine the reproducibility, reliability and validity of measurements made on digital models compared to plaster models. Fifteen pairs of plaster models were obtained from orthodontic patients with permanent dentition before treatment. These were digitized to be evaluated with the program Cécile3 v2.554.2 beta. Two examiners measured, three times each, the mesiodistal width of all the teeth present, the intercanine, interpremolar and intermolar distances, and overjet and overbite. The plaster models were measured using a digital vernier caliper. Student's t-test for paired samples and the intraclass correlation coefficient (ICC) were used for statistical analysis. The ICCs of the digital models were 0.84 ± 0.15 (intra-examiner) and 0.80 ± 0.19 (inter-examiner). The average mean difference of the digital models was 0.23 ± 0.14 and 0.24 ± 0.11 for each examiner, respectively. When the two types of measurements were compared, the values obtained from the digital models were lower than those obtained from the plaster models (p < 0.05), although the differences were considered clinically insignificant (differences < 0.1 mm). The Cécile digital models are a clinically acceptable alternative for use in orthodontics.

  6. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient, called the hazardous risk coefficient, covers anticipated hazards which may occur in the future; this risk is deduced from criteria for consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment is useful for optimizing financial losses and the timing of maintenance actions.

  7. Reliability Models Of Aging Passive Components Informed By Materials Degradation Metrics To Support Long-Term Reactor Operations

    International Nuclear Information System (INIS)

    Unwin, Stephen D.; Lowry, Peter P.; Toyooka, Michael Y.

    2012-01-01

    This paper describes a methodology for the synthesis of nuclear power plant service data with expert-elicited materials degradation information to estimate the future failure rates of passive components. This method should be an important resource for long-term plant operations and reactor life extension. Conventional probabilistic risk assessments (PRAs) are not well suited to addressing long-term reactor operations. Since passive structures and components are among those for which replacement can be least practical, they might be expected to contribute increasingly to risk in an aging plant; yet passives receive limited treatment in PRAs. Furthermore, PRAs produce only snapshots of risk based on the assumption of time-independent component failure rates. This assumption is unlikely to be valid in aging systems. The treatment of aging passive components in PRA presents challenges. Service data to quantify component reliability models are sparse, and this is exacerbated by the greater data demands of age-dependent reliability models. Another factor is that there can be numerous potential degradation mechanisms associated with the materials and operating environment of a given component. This deepens the data problem, since risk-informed management of component aging will demand an understanding of the long-term risk significance of individual degradation mechanisms. In this paper we describe a Bayesian methodology that integrates metrics of materials degradation susceptibility with available plant service data to estimate age-dependent passive component reliabilities. Integration of these models into conventional PRA will provide a basis for materials degradation management informed by predicted long-term operational risk.

  8. The constant failure rate model for fault tree evaluation as a tool for unit protection reliability assessment

    International Nuclear Information System (INIS)

    Vichev, S.; Bogdanov, D.

    2000-01-01

    The purpose of this paper is to introduce the fault tree analysis method as a tool for unit protection reliability estimation. The constant failure rate model is applied for reliability assessment, and especially availability assessment. For that purpose, an example of a unit primary equipment structure and a fault tree example for a simplified unit protection system are presented. (author)
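
    A minimal sketch of the kind of calculation involved, with illustrative rates rather than values from the paper: under the constant failure rate model, steady-state unavailability is λ/(λ+μ), and an AND gate over two redundant protection channels multiplies their unavailabilities.

```python
# Constant failure rate model: steady-state availability of one protection
# channel, then an AND gate over two redundant channels. Rates are
# illustrative, not taken from the paper.
lam = 2e-5   # failure rate (per hour)
mu = 0.05    # repair rate (per hour)

unavailability = lam / (lam + mu)   # per channel
availability = 1.0 - unavailability
top_event = unavailability ** 2     # both redundant channels down
print(f"channel availability = {availability:.6f}")
print(f"top event probability = {top_event:.2e}")
```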

  9. Assessment of the reliability of reproducing two-dimensional resistivity models using an image processing technique.

    Science.gov (United States)

    Ishola, Kehinde S; Nawawi, Mohd Nm; Abdullah, Khiruddin; Sabri, Ali Idriss Aboubakar; Adiat, Kola Abdulnafiu

    2014-01-01

    This study attempts to combine the results of geophysical images obtained from three commonly used electrode configurations using an image processing technique, in order to assess their capability to reproduce two-dimensional (2-D) resistivity models. All the inverse resistivity models were processed using the PCI Geomatica software package, commonly used for remote sensing data sets. Preprocessing of the 2-D inverse models was carried out to facilitate further processing and statistical analyses. Four raster layers were created; three of these layers were used as the input images and the fourth layer was used as the output of the combined images. The data sets were merged using a basic statistical approach. The interpreted results show that all images resolved and reconstructed the essential features of the models. An assessment of the accuracy of the images for the four geologic models was performed using four criteria: the mean absolute error, the mean percentage absolute error, the resistivity values of the reconstructed blocks, and their displacements from the true models. Generally, the blocks of the images from the maximum approach give the smallest estimated errors. Also, the displacement of the reconstructed blocks from the true blocks is smallest, and the reconstructed resistivities of the blocks are closer to those of the true blocks, than for any other combination used. It is thus corroborated that when inverse resistivity models are combined, more reliable and detailed information about the geologic models is obtained than by using individual data sets.

  10. Reliable likelihood ratios for statistical model-based voice activity detector with low false-alarm rate

    Directory of Open Access Journals (Sweden)

    Kim Younggwan

    2011-01-01

    The role of the statistical model-based voice activity detector (SMVAD) is to detect speech regions from input signals using statistical models of noise and noisy speech. The decision rule of the SMVAD is based on the likelihood ratio test (LRT). The LRT-based decision rule may cause detection errors because of the statistical properties of noise and speech signals. In this article, we first analyze the reasons why the detection errors occur and then propose two modified decision rules using reliable likelihood ratios (LRs). We also propose an effective weighting scheme considering the spectral characteristics of noise and speech signals. In the experiments, with almost no additional computation, the proposed methods show significant performance improvements in various noise conditions. Experimental results also show that the proposed weighting scheme provides additional performance improvement over the two proposed SMVADs.
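
    The baseline LRT decision that such detectors build on can be sketched compactly. Below is a hedged example of a Sohn-style per-bin Gaussian log likelihood ratio averaged into a frame decision; the variances are assumed known here (real detectors estimate them), and the threshold is illustrative.

```python
# Per-bin Gaussian log likelihood ratios averaged into a frame decision
# (Sohn-style statistic). Variances are assumed known; the threshold is
# illustrative.
import numpy as np

def frame_is_speech(spectrum, noise_var, speech_var, threshold=0.5):
    gamma = np.abs(spectrum) ** 2 / noise_var        # a posteriori SNR
    xi = speech_var / noise_var                      # a priori SNR
    log_lr = gamma * xi / (1.0 + xi) - np.log1p(xi)  # per-bin log LR
    return log_lr.mean() > threshold                 # geometric-mean test

rng = np.random.default_rng(0)
noise_only = rng.normal(size=129) + 1j * rng.normal(size=129)
print(frame_is_speech(noise_only, noise_var=2.0, speech_var=8.0))  # False
```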

  11. Reliable likelihood ratios for statistical model-based voice activity detector with low false-alarm rate

    Science.gov (United States)

    Kim, Younggwan; Suh, Youngjoo; Kim, Hoirin

    2011-12-01

    The role of the statistical model-based voice activity detector (SMVAD) is to detect speech regions from input signals using the statistical models of noise and noisy speech. The decision rule of SMVAD is based on the likelihood ratio test (LRT). The LRT-based decision rule may cause detection errors because of statistical properties of noise and speech signals. In this article, we first analyze the reasons why the detection errors occur and then propose two modified decision rules using reliable likelihood ratios (LRs). We also propose an effective weighting scheme considering spectral characteristics of noise and speech signals. In the experiments proposed in this study, with almost no additional computations, the proposed methods show significant performance improvement in various noise conditions. Experimental results also show that the proposed weighting scheme provides additional performance improvement over the two proposed SMVADs.

  12. Reliability and Validity Evidence for Achievement Goal Models in High School Physical Education Settings

    Science.gov (United States)

    Guan, Jianmin; McBride, Ron; Xiang, Ping

    2007-01-01

    Although empirical research in academic areas provides support for both a 3-factor and a 4-factor achievement goal model, both models were proposed and tested with collegiate samples. Little is known about the generalizability of either model to high-school-level samples. This study was designed to examine whether the 3-factor model…

  13. Identifying the effects of parameter uncertainty on the reliability of riverbank stability modelling

    Science.gov (United States)

    Samadi, A.; Amiri-Tokaldany, E.; Darby, S. E.

    2009-05-01

    Bank retreat is a key process in fluvial dynamics affecting a wide range of physical, ecological and socioeconomic issues in the fluvial environment. To predict the undesirable effects of bank retreat and to inform effective measures to prevent it, a wide range of bank stability models has been presented in the literature. These models typically express bank stability by defining a factor of safety as the ratio of the resisting to the driving forces acting on the incipient failure block. These forces are affected by a range of controlling factors that include the bank profile (bank height and angle), the geotechnical properties of the bank materials, and the hydrological status of the riverbanks. In this paper we evaluate the extent to which uncertainties in the parameterization of these controlling factors feed through to influence the reliability of the resulting bank stability estimate. This is achieved by employing a simple model of riverbank stability with respect to planar failure (the most common type of bank stability model) in a series of sensitivity tests and Monte Carlo analyses to identify, for each model parameter, the range of values that induces significant changes in the simulated factor of safety. These identified parameter value ranges are compared to empirically derived parameter uncertainties to determine whether they are likely to confound the reliability of the resulting bank stability calculations. Our results show that parameter uncertainties are high enough that the likelihood of generating unreliable predictions is typically very high (>~80% for predictions requiring a precision of <±15%). Because parameter uncertainties derive primarily from the natural variability of the parameters, rather than from measurement errors, much more careful attention should be paid to field sampling strategies, so that parameter uncertainties and the consequent prediction unreliabilities can be quantified more...
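
    The type of analysis described can be sketched compactly: propagate parameter uncertainty through a planar-failure factor of safety by Monte Carlo sampling and estimate the probability that FS < 1. The geometry, forces and distributions below are hypothetical:

```python
# Monte Carlo propagation of geotechnical uncertainty through a planar
# failure factor of safety. Geometry, forces and distributions are
# hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

beta = np.radians(45.0)   # failure plane angle
L = 5.0                   # failure plane length (m)
W = 180.0                 # weight of failure block (kN per m of bank)
U = 20.0                  # pore-water force on the failure plane (kN/m)

c = rng.normal(18.0, 4.0, n)                             # cohesion (kPa)
phi = rng.normal(np.radians(30.0), np.radians(4.0), n)   # friction angle

# FS = resisting / driving forces on the incipient failure block.
fs = (c * L + (W * np.cos(beta) - U) * np.tan(phi)) / (W * np.sin(beta))
print("P(FS < 1) =", np.mean(fs < 1.0))
```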

  14. Reliability of some ageing nuclear power plant systems: a simple stochastic model

    International Nuclear Information System (INIS)

    Suarez Antola, R.

    2007-01-01

    The random number of failure-related events in certain repairable ageing systems, like certain nuclear power plant components, during a given time interval may often be modelled by a compound Poisson distribution. One of these is the Polya-Aeppli distribution. The derivation of a stationary Polya-Aeppli distribution as a limiting distribution of rare events for stationary Bernoulli trials with first-order Markov dependence is considered. If the parameters of the Polya-Aeppli distribution are suitable functions of time, we could expect the resulting distribution to account for the distribution of failure-related events in an ageing system. Assuming that a critical number of damages produces an emergent failure, the above-mentioned results can be applied in a reliability analysis. It is natural to ask under what conditions a Polya-Aeppli distribution could be a limiting distribution for non-homogeneous Bernoulli trials with first-order Markov dependence. In this paper this problem is analyzed, and possible applications of the obtained results to ageing or deteriorating nuclear power plant components are considered. The two traditional ways of modelling repairable systems in reliability theory, the "as bad as old" concept, which assumes that the replaced component is in exactly the same condition as the aged component before failure, and the "as good as new" concept, which assumes that the new component is in the same condition as the replaced component when it was new, are briefly discussed in relation to the findings of the present work.

  15. Physics-Based Stress Corrosion Cracking Component Reliability Model cast in an R7-Compatible Cumulative Damage Framework

    Energy Technology Data Exchange (ETDEWEB)

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Toloczko, Mychailo B.; Johnson, Kenneth I.; Sanborn, Scott E.

    2011-07-01

    This is a working report drafted under the Risk-Informed Safety Margin Characterization pathway of the Light Water Reactor Sustainability Program, describing statistical models of passive component reliabilities.

  16. Characterization of System Level Single Event Upset (SEU) Responses using SEU Data, Classical Reliability Models, and Space Environment Data

    Science.gov (United States)

    Berg, Melanie; Label, Kenneth; Campola, Michael; Xapsos, Michael

    2017-01-01

    We propose a method for the application of single event upset (SEU) data towards the analysis of complex systems using transformed reliability models (from the time domain to the particle fluence domain) and space environment data.
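
    One simple instance of such a transformation, under an assumed exponential reliability model in the fluence domain: with a per-device SEU cross-section σ and an environment flux, fluence accumulates linearly in time, so R(Φ) = exp(−σΦ) maps directly to a time-domain survival probability. The numbers below are illustrative:

```python
# Exponential reliability model in the fluence domain mapped to time via a
# constant flux. Cross-section, flux and duration are illustrative.
import math

sigma = 1e-12       # per-device SEU cross-section (cm^2)
flux = 1e3          # particle flux (cm^-2 s^-1)
mission_t = 3.15e7  # about one year, in seconds

fluence = flux * mission_t       # accumulated fluence (cm^-2)
r = math.exp(-sigma * fluence)   # R(Phi) = exp(-sigma * Phi)
print(f"one-year SEU survival probability: {r:.4f}")
```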

  17. Reliability Demonstration for an Eddy Current NDE Technique Using A Computational Electromagnetic Model-Assisted Approach (Postprint)

    National Research Council Canada - National Science Library

    Aldrin, John C; Knopp, Jeremy; Lindgren, Eric; Annis, Charles; Sabbagh, Harold A; Sabbagh, Elias H; Murphy, R. K

    2006-01-01

    ...) inspection system validation. POD studies provide engineers with data on the reliability of detecting cracks across a distribution of sizes that can be used as an input to probabilistic risk assessment analysis...

  18. Halo Models of Large Scale Structure and Reliability of Cosmological N-Body Simulations

    Directory of Open Access Journals (Sweden)

    José Gaite

    2013-05-01

    Halo models of the large-scale structure of the Universe are critically examined, focusing on the definition of halos as smooth distributions of cold dark matter. This definition is essentially based on the results of cosmological N-body simulations. By a careful analysis of the standard assumptions of halo models and N-body simulations, and by taking into account previous studies of the self-similarity of the cosmic web structure, we conclude that N-body cosmological simulations are not fully reliable in the range of scales where halos appear. Therefore, to have a consistent definition of halos, it is necessary either to define them as entities of arbitrary size with a grainy rather than smooth structure, or to define their size in terms of small-scale baryonic physics.

  19. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified as first and second generation according to their differing viewpoints on problem-solving. Accident analysis can be carried out with three techniques: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model as applied to marine accidents. The MOP model can effectively describe the relationships among the other factors which affect accidents, whereas the HEART methodology focuses only on human factors.

  20. Alternative approaches to reliability modeling of a multiple engineered barrier system

    International Nuclear Information System (INIS)

    Ananda, M.M.A.; Singh, A.K.

    1994-01-01

    The lifetime of the engineered barrier system used for containment of high-level radioactive waste will significantly impact the total performance of a geological repository facility. Currently two types of design are under consideration for an engineered barrier system: a single engineered barrier system and a multiple engineered barrier system. A multiple engineered barrier system consists of several metal barriers and the waste form (cladding). Some recent work shows that a significant improvement in performance can be achieved by utilizing multiple engineered barrier systems. Considering sequential failures of each barrier, we model the reliability of the multiple engineered barrier system. Weibull and exponential lifetime distributions are used throughout the analysis. Furthermore, the number of failed engineered barrier systems in a repository at a given time is modeled using a Poisson approximation.

  1. A Simplified Network Model for Travel Time Reliability Analysis in a Road Network

    Directory of Open Access Journals (Sweden)

    Kenetsu Uchida

    2017-01-01

    This paper proposes a simplified network model for analyzing travel time reliability in a road network. A risk-averse driver is assumed in the simplified model. The risk-averse driver chooses a path by taking into account both the path travel time variance and the mean path travel time. The uncertainty addressed in this model is that of traffic flows (i.e., stochastic demand flows). In the simplified network model, the path travel time variance is not calculated by considering the travel time covariance between every pair of links in the network; it is calculated by considering only the travel time covariance between adjacent links. Numerical experiments are carried out to illustrate the applicability and validity of the proposed model. The experiments introduce the path choice behavior of a risk-neutral driver and several types of risk-averse drivers. It is shown that the mean link flows calculated by introducing the risk-neutral driver differ as a whole from those calculated by introducing the several types of risk-averse drivers. It is also shown that the mean link flows calculated by the simplified network model are almost the same as the flows calculated using the exact path travel time variance.
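
    The simplified variance computation can be stated in a few lines: the path travel time variance is the sum of link variances plus twice the covariances of adjacent link pairs only, and a risk-averse driver's perceived cost adds a weighted dispersion term to the mean. The figures and the mean-plus-weighted-standard-deviation form below are illustrative assumptions:

```python
# Path travel time variance from link variances plus adjacent-link
# covariances only; all figures and the mean + theta*std cost form are
# illustrative.
import numpy as np

link_mean = np.array([10.0, 15.0, 8.0, 12.0])   # mean travel time per link
link_var = np.array([4.0, 6.0, 3.0, 5.0])       # variance per link
adj_cov = np.array([1.2, 0.8, 1.5])             # cov of links (1,2), (2,3), (3,4)

path_mean = link_mean.sum()
path_var = link_var.sum() + 2.0 * adj_cov.sum()  # non-adjacent covariances ignored

theta = 1.0   # risk-aversion weight of the driver
perceived_cost = path_mean + theta * np.sqrt(path_var)
print(path_mean, path_var, perceived_cost)
```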

  2. A reliable facility location design model with site-dependent disruption in the imperfect information context

    Science.gov (United States)

    Yun, Lifen; Wang, Xifu; Fan, Hongqiang; Li, Xiaopeng

    2017-01-01

    This paper proposes a reliable facility location design model under imperfect information with site-dependent disruptions; i.e., each facility is subject to a unique disruption probability that varies across the space. In the imperfect information contexts, customers adopt a realistic “trial-and-error” strategy to visit facilities; i.e., they visit a number of pre-assigned facilities sequentially until they arrive at the first operational facility or give up looking for the service. This proposed model aims to balance initial facility investment and expected long-term operational cost by finding the optimal facility locations. A nonlinear integer programming model is proposed to describe this problem. We apply a linearization technique to reduce the difficulty of solving the proposed model. A number of problem instances are studied to illustrate the performance of the proposed model. The results indicate that our proposed model can reveal a number of interesting insights into the facility location design with site-dependent disruptions, including the benefit of backup facilities and system robustness against variation of the loss-of-service penalty. PMID:28486564
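
    The "trial-and-error" expected cost for a single customer can be sketched directly from the description: facilities are visited in their assigned order, each term weighted by the probability that all earlier facilities were disrupted, with a penalty if every assigned facility fails. Distances, disruption probabilities and the penalty below are illustrative:

```python
# Expected cost of serving one customer under the trial-and-error strategy;
# distances, disruption probabilities and penalty are illustrative.
def expected_cost(distances, fail_probs, penalty):
    cost, p_all_failed = 0.0, 1.0
    for d, q in zip(distances, fail_probs):
        # Reach the r-th facility operational, all earlier ones disrupted.
        cost += p_all_failed * (1.0 - q) * d
        p_all_failed *= q
    return cost + p_all_failed * penalty   # customer gives up

print(expected_cost([10.0, 25.0, 40.0], [0.05, 0.10, 0.20], penalty=100.0))
```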

  3. A Mid-Layer Model for Human Reliability Analysis: Understanding the Cognitive Causes of Human Failure Events

    Energy Technology Data Exchange (ETDEWEB)

    Stacey M. L. Hendrickson; April M. Whaley; Ronald L. Boring; James Y. H. Chang; Song-Hua Shen; Ali Mosleh; Johanna H. Oxstrand; John A. Forester; Dana L. Kelly; Erasmia L. Lois

    2010-06-01

    The Office of Nuclear Regulatory Research (RES) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency or guidance for the use of multiple methods. As part of this effort an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method’s middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.

  4. A mid-layer model for human reliability analysis: understanding the cognitive causes of human failure events

    International Nuclear Information System (INIS)

    Shen, Song-Hua; Chang, James Y.H.; Boring, Ronald L.; Whaley, April M.; Lois, Erasmia; Langfitt Hendrickson, Stacey M.; Oxstrand, Johanna H.; Forester, John Alan; Kelly, Dana L.; Mosleh, Ali

    2010-01-01

    The Office of Nuclear Regulatory Research (RES) at the US Nuclear Regulatory Commission (USNRC) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency or guidance for the use of multiple methods. As part of this effort an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method's middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.

  5. Microgrid Design Analysis Using Technology Management Optimization and the Performance Reliability Model

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jensen, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    Microgrids are a focus of localized energy production that supports resiliency, security, local control, and increased access to renewable resources (among other potential benefits). The Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) Joint Capability Technology Demonstration (JCTD) program between the Department of Defense (DOD), Department of Energy (DOE), and Department of Homeland Security (DHS) resulted in the preliminary design and deployment of three microgrids at military installations. This paper is focused on the analysis process and supporting software used to determine optimal designs for energy surety microgrids (ESMs) in the SPIDERS project. There are two key pieces of software: an existing software application developed by Sandia National Laboratories (SNL) called Technology Management Optimization (TMO) and a new simulation developed for SPIDERS called the performance reliability model (PRM). TMO is a decision support tool that performs multi-objective optimization over a mixed discrete/continuous search space for which the performance measures are unrestricted in form. The PRM is able to statistically quantify the performance and reliability of a microgrid operating in islanded mode (disconnected from any utility power source). Together, these two software applications were used as part of the ESM process to generate the preliminary designs presented by the SNL-led DOE team to the DOD. Acknowledgements: Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: * Mike Hightower, who has been the key driving force for Energy Surety Microgrids * Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations * Merrill Smith, U.S. Department of Energy SPIDERS Program Manager * Ross Roley and Rich Trundy from U.S. Pacific Command * Bill Waugaman and Bill Beary from U.S. Northern Command * Tarek Abdallah, Melanie...

  6. The importance of data quality for generating reliable distribution models for rare, elusive, and cryptic species.

    Directory of Open Access Journals (Sweden)

    Keith B Aubry

    The availability of spatially referenced environmental data and species occurrence records in online databases enables practitioners to easily generate species distribution models (SDMs) for a broad array of taxa. Such databases often include occurrence records of unknown reliability, yet little information is available on the influence of data quality on SDMs generated for rare, elusive, and cryptic species that are prone to misidentification in the field. We investigated this question for the fisher (Pekania pennanti), a forest carnivore of conservation concern in the Pacific States that is often confused with the more common Pacific marten (Martes caurina). Fisher occurrence records supported by physical evidence (verifiable records) were available from a limited area, whereas occurrence records of unknown quality (unscreened records) were available from throughout the fisher's historical range. We reserved 20% of the verifiable records to use as a test sample for both models and generated SDMs with each dataset using Maxent. The verifiable model performed substantially better than the unscreened model based on multiple metrics, including AUCtest values (0.78 and 0.62, respectively), evaluation of training and test gains, and statistical tests of how well each model predicted test localities. In addition, the verifiable model was consistent with our knowledge of the fisher's habitat relations and potential distribution, whereas the unscreened model indicated a much broader area of high-quality habitat (indices > 0.5) that included large expanses of high-elevation habitat that fishers do not occupy. Because Pacific martens remain relatively common in upper-elevation habitats in the Cascade Range and Sierra Nevada, the SDM based on unscreened records likely reflects primarily a conflation of marten and fisher habitat. Consequently, accurate identifications are far more important than the spatial extent of occurrence records for generating reliable SDMs.

  7. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.

  8. Development of corrosion models to ensure reliable performance of nuclear power plants

    International Nuclear Information System (INIS)

    Kritzky, V.G.; Stjazhkin, P.S.

    1993-01-01

    The safety and reliability of the coolant circuits in nuclear power plants depend strongly on corrosion and corrosion product transfer processes. Various empirical models have been developed which are applicable to particular sets of operational conditions. In our laboratory a corrosion model has been worked out which is based on the thermodynamic properties of the compounds participating in the corrosion process, and on the assumption that the corrosion process is controlled by the solubilities of the corrosion products forming the surface oxide layer. The validity of the model has been verified against retrospective experimental data obtained for a series of structural materials such as carbon and stainless steels and Cu, Al and Zr alloys. When hydriding is taken into account, the model satisfactorily describes the stress corrosion cracking process in water-salt environments.

  9. Event and fault tree model for reliability analysis of the Greek research reactor

    International Nuclear Information System (INIS)

    Fault trees and event trees are widely used in industry to model and evaluate the reliability of safety systems. Detailed analyses in nuclear installations require the combination of these two techniques. This work uses fault tree (FT) and event tree (ET) methods to perform a Probabilistic Safety Assessment (PSA) of research reactors. PSA according to the IAEA (International Atomic Energy Agency) is divided into Level 1, Level 2 and Level 3. At Level 1, safety systems act to prevent the accident; at Level 2, the accident has occurred and one seeks to minimize the consequences, known as the accident management stage; and at Level 3, the consequences are determined. This paper focuses on Level 1 studies and seeks, through the acquisition of knowledge, to consolidate methodologies for future reliability studies. The Greek Research Reactor, GRR-1, was used as a case example. A LOCA (Loss of Coolant Accident) was chosen as the initiating event, and from there the possible accident sequences that could lead to core damage were developed using an event tree. Furthermore, for each of the affected systems in these sequences, a fault tree was built and the probability of each top event of the FT was evaluated. The studies were conducted using the commercial computational tool SAPHIRE. The results thus obtained for the performance or failure to act of the analyzed systems were considered satisfactory. This work is directed at the Greek Research Reactor due to data availability. (author)

  10. Reliability of pulse oximetry during cardiopulmonary resuscitation in a piglet model of neonatal cardiac arrest.

    Science.gov (United States)

    Hassan, Mohammad Ahmad; Mendler, Marc; Maurer, Miriam; Waitz, Markus; Huang, Li; Hummler, Helmut D

    2015-01-01

    Pulse oximetry is widely used in intensive care and emergency conditions to monitor arterial oxygenation and to guide oxygen therapy. Our aim was to study the reliability of pulse oximetry in comparison with CO-oximetry in newborn piglets during cardiopulmonary resuscitation (CPR). In a prospective cohort study of 30 healthy newborn piglets, cardiac arrest was induced, and thereafter each piglet received CPR for 20 min. Arterial oxygen saturation was monitored continuously by pulse oximetry (SpO2). Arterial blood was analyzed for functional oxygen saturation (SaO2) every 2 min. SpO2 was compared with coinciding SaO2 values, and bias was considered present whenever the difference (SpO2 - SaO2) was beyond ±5%. Bias values were low at the baseline measurements (mean: 2.5 ± 4.6%), with higher precision and accuracy compared with values across the rest of the experiment. Two minutes after cardiac arrest, there was a marked decrease in precision and accuracy as well as an increase in bias, up to 13 ± 34%, reaching a maximum of 45.6 ± 28.3% after 10 min over a mean SaO2 range of 29-58%. Pulse oximetry showed increased bias and decreased accuracy and precision during CPR in a model of neonatal cardiac arrest. We recommend further studies to clarify the exact mechanisms of these false readings in order to improve the reliability of pulse oximetry during the marked desaturation and hypoperfusion found during CPR. © 2014 S. Karger AG, Basel.
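
    A hedged sketch of the agreement statistics used in such studies: bias as the mean of SpO2 − SaO2, precision as its standard deviation, and the fraction of paired readings beyond the ±5% criterion. The paired readings below are invented for illustration, not the study's data:

```python
# Bias, precision and the +/-5% criterion for paired SpO2/SaO2 readings;
# the values below are invented for illustration.
import numpy as np

spo2 = np.array([96.0, 94.0, 80.0, 62.0, 55.0, 70.0])
sao2 = np.array([95.0, 93.0, 71.0, 49.0, 41.0, 52.0])

diff = spo2 - sao2
bias = diff.mean()
precision = diff.std(ddof=1)
beyond = np.mean(np.abs(diff) > 5.0)   # fraction beyond the +/-5% criterion
print(f"bias={bias:.1f}%, precision={precision:.1f}%, beyond 5%: {beyond:.0%}")
```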

  11. Pipeline integrity model-a formative approach towards reliability and life assessment

    International Nuclear Information System (INIS)

    Sayed, A.M.; Jaffery, M.A.

    2005-01-01

    Pipes form an integral part of the transmission medium in the oil and gas industry. This holds true for both the upstream and downstream segments of this global energy business. With the aging of this asset base, emphasis on its operational aspects has come under intense consideration by operators and regulators. Moreover, the advent of the information age and growth in global trade have lifted barriers and pushed the industry towards better utilization of resources. This has made optimized solutions a priority for business and technical managers worldwide. There is a paradigm shift from the mere development of 'smart materials' to 'low life-cycle-cost materials'. The force inducing this change is a rational one: the recovery of development costs is no longer a problem in a global community; rather, it is the pay-off time which matters most to the material's end users. This means that decision makers are not evaluating just the price offered, but are keen to judge the entire life cycle cost of a product. The integrity of pipes is affected by factors such as corrosion, fatigue-crack growth, stress-corrosion cracking, and mechanical damage. Extensive research in the area of reliability and life assessment has been carried out. A number of models concerned with the reliability of pipes have been developed and are being used by a number of pipeline operators worldwide. Yet it must be emphasised that there is no substitute for sound engineering judgment and allowance for factors of safety. The ability of a laid pipe network to transport the intended fluid under pre-defined conditions for the entire envisaged project life is referred to as the reliability of the system. Reliability is built into the product through extensive benchmarking against industry standard codes. The construction of pipes for oil and gas service is regulated through the American Petroleum Institute's Specification for Line Pipe. Subsequently, specific programs have been...

  12. Probabilistic physics-of-failure models for component reliabilities using Monte Carlo simulation and Weibull analysis: a parametric study

    International Nuclear Information System (INIS)

    Hall, P.L.; Strutt, J.E.

    2003-01-01

    In reliability engineering, component failures are generally classified in one of three ways: (1) early life failures; (2) failures having random onset times; and (3) late life or 'wear out' failures. When the time distribution of failures of a population of components is analysed in terms of a Weibull distribution, these failure types may be associated with shape parameters β having values β < 1, β ≈ 1 and β > 1, respectively. Early life failures are frequently attributed to poor design (e.g. poor materials selection) or problems associated with manufacturing or assembly processes. We describe a methodology for the implementation of physics-of-failure models of component lifetimes in the presence of parameter and model uncertainties. This treats uncertain parameters as random variables described by some appropriate statistical distribution, which may be sampled using Monte Carlo methods. The number of simulations required depends upon the desired accuracy of the predicted lifetime. Provided that the number of sampled variables is relatively small, an accuracy of 1-2% can be obtained using typically 1000 simulations. The resulting collection of times-to-failure is then sorted into ascending order and fitted to a Weibull distribution to obtain a shape factor β and a characteristic lifetime η. Examples are given of the results obtained using three different models: (1) the Eyring-Peck (EP) model for corrosion of printed circuit boards; (2) a power-law corrosion growth (PCG) model which represents the progressive deterioration of oil and gas pipelines; and (3) a random shock-loading model of mechanical failure. It is shown that for any specific model the values of the Weibull shape parameters obtained may be strongly dependent on the degree of uncertainty of the underlying input parameters. Both the EP and PCG models can yield a wide range of values of β, from β > 1, characteristic of wear-out behaviour, to β < 1, characteristic of early-life failure, depending on the degree of...
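
    The parametric procedure can be sketched end-to-end: sample uncertain inputs, compute times-to-failure from a physics-of-failure law, then fit a Weibull distribution to the lifetimes. The power-law growth model and parameter distributions below are illustrative stand-ins, not the paper's calibrated EP or PCG models:

```python
# Sample uncertain inputs, compute times-to-failure from an illustrative
# power-law growth model d(t) = k * t**m (failure when depth reaches the
# wall thickness), then fit a Weibull distribution to the lifetimes.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(42)
n = 1000

wall = 10.0                                              # wall thickness (mm)
k = rng.lognormal(mean=np.log(0.8), sigma=0.3, size=n)   # growth coefficient
m = rng.normal(0.6, 0.05, size=n)                        # growth exponent

t_fail = (wall / k) ** (1.0 / m)    # invert d(t) = wall
beta, loc, eta = weibull_min.fit(t_fail, floc=0.0)
print(f"shape beta = {beta:.2f}, characteristic life eta = {eta:.1f}")
```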

  13. Applying the crew reliability model for team error analysis in the modernized main control room of advanced nuclear power plants

    International Nuclear Information System (INIS)

    Yang Chihwei; Kao Tsumu; Huang Huiwen

    2013-01-01

    This study implemented a crew reliability model (CRM) for analyzing human errors in the modernized main control room of advanced nuclear power plants. Instrumentation and control systems in the main control room have recently changed significantly with the digitalization of human-system interfaces. Ensuring the safe operation of nuclear power plants is an important driving force behind these changes. Probabilistic risk assessment (PRA) is one of the most common methods for responding to these changes; PRA uses human reliability analysis (HRA) to assess human risk. In emergency situations, failure to detect a problem can have significant influence on process control, and considerable effort has been invested in attempting to minimize this error through improved interface design, training, and the allocation of responsibilities within a control room team. This study provides a direction for addressing crew errors. Furthermore, this study found that implementing the CRM fully accounts for the influence of team errors on the target system. The proposed model can be applied to specific systems in conjunction with a consideration of the critical elements: design basis accidents, critical human actions, human error modes, and performance shaping factors. This model can be used to assist human error analysis in the main control room. Advanced technologies can reduce the occurrence of the human errors seen with traditional human-system interfaces. However, a highly integrated control room may hide some potential human errors that need to be further investigated. Furthermore, the use of a single example in this study is insufficient; investigation of further examples in a future study would be useful for verification and validation of the proposed model. (author)

  14. A review of the models for evaluating organizational factors in human reliability analysis

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves da; Melo, Paulo Fernando Ferreira Frutuoso e

    2009-01-01

    Human factors should be evaluated on three hierarchical levels. The first level concerns the cognitive behavior of individual human beings during the control of processes that occur through the man-machine interface. Here, one evaluates human errors through human reliability models of the first and second generations, like THERP, ASEP and HCR (first generation) and ATHEANA and CREAM (second generation). On the second level, the focus is on the cognitive behavior of human beings working in groups, as in nuclear power plants; the focus here is on the anthropological aspects that govern the interaction among human beings. On the third level, one is interested in the influence that the organizational culture exerts on human beings as well as on the tasks being performed; here, one adds to the factors of the second level the economic and political aspects that shape the company's organizational culture. Nowadays, HRA methodologies incorporate organizational factors at the group and organization levels through performance shaping factors. This work makes a critical evaluation of the deficiencies concerning human factors and evaluates the potential of quantitative techniques that have been proposed in the last decade to model organizational factors, including the interaction among groups, with the intention of eliminating this chronic deficiency of HRA models. Two important techniques are discussed in this context: STAMP, based on system theory, and FRAM, which aims at modeling the nonlinearities of socio-technical systems. (author)

  15. Reliability in the Power System Modeled in a Multi- Stage Stochastic Mixed Integer Programming Model

    DEFF Research Database (Denmark)

    Simonsen Nielsen, Michael Pascal

    The contribution of this article is that it takes the characteristics of the power system into account at different stages, which gives a more realistic presentation of the welfare gains obtainable by an optimal operation/dispatch of the power system. The article utilizes a Multi-Stage Stochastic Mixed Integer Programming model that handles uncertainty in a flexible and practical way. The method applied relies on state-of-the-art modeling within this field, extended by the use of decomposition...

  16. Factor structure and internal reliability of an exercise health belief model scale in a Mexican population

    Directory of Open Access Journals (Sweden)

    Oscar Armando Esparza-Del Villar

    2017-03-01

    Background: Mexico is one of the countries with the highest rates of overweight and obesity around the world, with a combined prevalence of 68.8% in men and 73% in women. This is a public health problem, since there are several health-related consequences of not exercising, like cardiovascular diseases or some types of cancer. All of these problems can be prevented by promoting exercise, so it is important to evaluate models of health behaviors to achieve this goal. Among several models, the Health Belief Model is one of the most studied for promoting health-related behaviors. This study validates the first exercise scale based on the Health Belief Model (HBM) in Mexicans, with the objective of studying and analyzing this model in Mexico. Methods: Items for the scale, called the Exercise Health Belief Model Scale (EHBMS), were developed by a health research team, then the items were applied to a sample of 746 participants, male and female, from five cities in Mexico. The factor structure of the items was analyzed with an exploratory factor analysis and the internal reliability with Cronbach's alpha. Results: The exploratory factor analysis reported the expected factor structure based on the HBM. The KMO index (0.92) and Bartlett's sphericity test (p < 0.01) indicated an adequate and normally distributed sample. Items had adequate factor loadings, ranging from 0.31 to 0.92, and the internal consistencies of the factors were also acceptable, with alpha values ranging from 0.67 to 0.91. Conclusions: The EHBMS is a validated scale that can be used to measure exercise based on the HBM in Mexican populations.
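
    The internal-reliability step reduces to a short computation. A minimal sketch of Cronbach's alpha for one factor's items, run on synthetic correlated data rather than the EHBMS responses:

```python
# Cronbach's alpha for the items of one factor, on synthetic data.
import numpy as np

def cronbach_alpha(items):
    # items: rows = respondents, columns = items of a single factor
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                    # common factor
data = latent + rng.normal(scale=0.8, size=(200, 5))  # 5 correlated items
print(f"alpha = {cronbach_alpha(data):.2f}")
```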

  17. Factor structure and internal reliability of an exercise health belief model scale in a Mexican population.

    Science.gov (United States)

    Villar, Oscar Armando Esparza-Del; Montañez-Alvarado, Priscila; Gutiérrez-Vega, Marisela; Carrillo-Saucedo, Irene Concepción; Gurrola-Peña, Gloria Margarita; Ruvalcaba-Romero, Norma Alicia; García-Sánchez, María Dolores; Ochoa-Alcaraz, Sergio Gabriel

    2017-03-01

    Mexico is one of the countries with the highest rates of overweight and obesity around the world, with a combined prevalence of 68.8% in men and 73% in women. This is a public health problem, since there are several health-related consequences of not exercising, like cardiovascular diseases or some types of cancer. All of these problems can be prevented by promoting exercise, so it is important to evaluate models of health behaviors to achieve this goal. Among several models, the Health Belief Model is one of the most studied for promoting health-related behaviors. This study validates the first exercise scale based on the Health Belief Model (HBM) in Mexicans, with the objective of studying and analyzing this model in Mexico. Items for the scale, called the Exercise Health Belief Model Scale (EHBMS), were developed by a health research team, then the items were applied to a sample of 746 participants, male and female, from five cities in Mexico. The factor structure of the items was analyzed with an exploratory factor analysis and the internal reliability with Cronbach's alpha. The exploratory factor analysis reported the expected factor structure based on the HBM. The KMO index (0.92) and Bartlett's sphericity test (p < 0.01) indicated an adequate and normally distributed sample. Items had adequate factor loadings, ranging from 0.31 to 0.92, and the internal consistencies of the factors were also acceptable, with alpha values ranging from 0.67 to 0.91. The EHBMS is a validated scale that can be used to measure exercise based on the HBM in Mexican populations.

  18. Modelling a reliability system governed by discrete phase-type distributions

    International Nuclear Information System (INIS)

    Ruiz-Castro, Juan Eloy; Perez-Ocon, Rafael; Fernandez-Villodre, Gemma

    2008-01-01

    We present an n-unit system with one online unit and the others in cold standby, attended by a repairman. When the online unit fails it goes to repair, and a standby unit instantaneously becomes the online one. The operational and repair times follow discrete phase-type distributions. Given that any discrete distribution defined on the positive integers is a discrete phase-type distribution, the system can be considered a general one. A model with an unlimited number of units is considered for approximating a system with a great number of units. We show that the process that governs the system is a quasi-birth-and-death process. For this system, performance and reliability measures, the up and down periods, and the costs involved are calculated in matrix-algorithmic form. We show that the discrete case is not a trivial case of the continuous one. The results given in this paper have been implemented computationally with Matlab.
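
    A small numerical sketch of a discrete phase-type lifetime, with an illustrative two-phase representation (α, T): the survival function is αT^n·1, the per-phase failure probabilities are t = (I − T)·1, and the mean lifetime is α(I − T)^{-1}·1.

```python
# Two-phase discrete phase-type lifetime (illustrative alpha and T).
import numpy as np

alpha = np.array([1.0, 0.0])         # initial phase distribution
T = np.array([[0.90, 0.08],          # substochastic transitions among
              [0.00, 0.85]])         # transient (working) phases
t = 1.0 - T.sum(axis=1)              # per-phase failure probabilities

def survival(n):
    # P(lifetime > n) = alpha @ T^n @ 1
    return alpha @ np.linalg.matrix_power(T, n) @ np.ones(2)

def pmf(n):
    # P(lifetime = n) = alpha @ T^(n-1) @ t, for n >= 1
    return alpha @ np.linalg.matrix_power(T, n - 1) @ t

print([round(survival(n), 4) for n in (1, 5, 10)])
print("mean lifetime:", alpha @ np.linalg.inv(np.eye(2) - T) @ np.ones(2))
```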

  19. Evaluation of human reliability analysis methods addressing cognitive error modelling and quantification

    International Nuclear Information System (INIS)

    Reer, B.; Mertens, J.

    1996-05-01

    Actions and errors by the operating personnel, which are of significance for the safety of a technical system, are classified according to various criteria. Each type of action thus identified is briefly discussed with respect to its quantifiability by state-of-the-art human reliability analysis (HRA) within a probabilistic safety assessment (PSA). The principal limits of quantifying human actions are discussed, with special emphasis on data quality and cognitive error modelling. In this connection, the basic procedure for an HRA under realistic conditions is briefly described. With respect to the quantitative part of an HRA, the determination of error probabilities, an evaluative description of the standard method THERP (Technique for Human Error Rate Prediction) is given using eight evaluation criteria. Furthermore, six newer developments (EdF's PHRA, HCR, HCR/ORE, SLIM, HEART, INTENT) are briefly described and roughly evaluated. The report concludes with a catalogue of requirements for HRA methods. (orig.) [de]

  20. Modelling a reliability system governed by discrete phase-type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz-Castro, Juan Eloy [Departamento de Estadistica e Investigacion Operativa, Universidad de Granada, 18071 Granada (Spain)], E-mail: jeloy@ugr.es; Perez-Ocon, Rafael [Departamento de Estadistica e Investigacion Operativa, Universidad de Granada, 18071 Granada (Spain)], E-mail: rperezo@ugr.es; Fernandez-Villodre, Gemma [Departamento de Estadistica e Investigacion Operativa, Universidad de Granada, 18071 Granada (Spain)

    2008-11-15

    We present an n-unit system with one online unit and the others in cold standby. There is one repairman. When the online unit fails it goes to repair, and a standby unit instantaneously becomes the online one. The operational and repair times follow discrete phase-type distributions. Given that any discrete distribution defined on the positive integers is a discrete phase-type distribution, the system can be considered a general one. A model with an unlimited number of units is considered for approximating a system with a large number of units. We show that the process that governs the system is a quasi-birth-and-death process. For this system, performance reliability measures, the up and down periods, and the involved costs are calculated in matrix-algorithmic form. We show that the discrete case is not a trivial version of the continuous one. The results given in this paper have been implemented computationally with Matlab.

  1. A multi-objective reliable programming model for disruption in supply chain

    Directory of Open Access Journals (Sweden)

    Emran Mohammadi

    2013-05-01

    Full Text Available One of the primary concerns in supply chain management is handling risk components properly. There are various sources of risk in a supply chain, such as natural disasters, unexpected incidents, etc. When a series of facilities are built and deployed, one or more of them may fail at any time due to bad weather conditions, labor strikes, economic crises, sabotage or terrorist attacks, and changes in ownership of the system. The objective of risk management is to reduce these effects to an acceptable level. To mitigate this risk, we propose a reliable capacitated supply chain network design (RSCND) model that considers random disruption risks at both distribution centers and suppliers. The proposed study considers three objective functions, and the implementation is verified using sample instances.

  2. MiRANN: a reliable approach for improved classification of precursor microRNA using Artificial Neural Network model.

    Science.gov (United States)

    Rahman, Md Eamin; Islam, Rashedul; Islam, Shahidul; Mondal, Shakhinur Islam; Amin, Md Ruhul

    2012-04-01

    MicroRNA (miRNA) is a special class of short noncoding RNA that serves the pivotal function of regulating gene expression. The computational prediction of new miRNA candidates involves various methods, such as learning methods and methods using expression data. This article proposes a reliable model, MiRANN, which is a supervised machine learning approach. MiRANN uses known pre-miRNAs as the positive set and a novel negative set drawn from human CDS regions. The set of known miRNAs is now so large and diversified that it covers almost all characteristics of unknown miRNAs, which increases the quality of the result (99.9% accuracy, 99.8% sensitivity, 100% specificity) and provides a more reliable prediction. MiRANN performs better than other state-of-the-art approaches and is presented as a highly promising tool for predicting novel miRNAs. We have also tested our results using a previously published negative set. MiRANN breaks new ground in using ANNs for predicting pre-miRNAs, with a promise of better performance. Copyright © 2012 Elsevier Inc. All rights reserved.
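
    The record does not give MiRANN's exact network architecture or feature set, but the general supervised set-up can be sketched with scikit-learn; the features below are random stand-ins for sequence-derived descriptors such as k-mer frequencies:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(42)
X_pos = rng.normal(0.6, 0.15, size=(500, 20))   # "pre-miRNA" class (synthetic)
X_neg = rng.normal(0.4, 0.15, size=(500, 20))   # "CDS-derived" class (synthetic)
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 500 + [0] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy = {accuracy_score(y_te, pred):.3f}, "
      f"sensitivity = {recall_score(y_te, pred):.3f}")
```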

  3. Reliability of some ageing nuclear power plant system: a simple stochastic model

    Energy Technology Data Exchange (ETDEWEB)

    Suarez-Antola, Roberto [Catholic University of Uruguay, Montevideo (Uruguay). School of Engineering and Technologies; Ministerio de Industria, Energia y Mineria, Montevideo (Uruguay). Direccion Nacional de Energia y Tecnologia Nuclear; E-mail: rsuarez@ucu.edu.uy

    2007-07-01

    The random number of failure-related events in certain repairable ageing systems, such as certain nuclear power plant components, during a given time interval may often be modelled by a compound Poisson distribution. One of these is the Polya-Aeppli distribution. The derivation of a stationary Polya-Aeppli distribution as a limiting distribution of rare events for stationary Bernoulli trials with first-order Markov dependence is considered. If the parameters of the Polya-Aeppli distribution are suitable functions of time, we could expect the resulting distribution to account for the distribution of failure-related events in an ageing system. Assuming that a critical number of damages produces an emergent failure, the above-mentioned results can be applied in a reliability analysis. It is natural to ask under what conditions a Polya-Aeppli distribution could be a limiting distribution for non-homogeneous Bernoulli trials with first-order Markov dependence. In this paper this problem is analyzed, and possible applications of the obtained results to ageing or deteriorating nuclear power plant components are considered. The two traditional ways of modelling repairable systems in reliability theory, the 'as bad as old' concept, which assumes that the replaced component is in exactly the same condition as the aged component before failure, and the 'as good as new' concept, which assumes that the new component is in the same condition the replaced component was in when it was new, are briefly discussed in relation to the findings of the present work. (author)
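
    The Polya-Aeppli distribution can be viewed as a compound Poisson distribution whose cluster sizes are shifted-geometric, which suggests a simple way to sample it. A hedged sketch with made-up parameters, checking the empirical mean against the theoretical value lambda/(1 - rho):

```python
import numpy as np

def sample_polya_aeppli(lam, rho, size, rng):
    """Poisson(lam) clusters, each contributing a shifted-geometric
    (support {1, 2, ...}) number of events with parameter 1 - rho."""
    counts = rng.poisson(lam, size=size)
    out = np.zeros(size, dtype=int)
    for i, n in enumerate(counts):
        if n > 0:
            out[i] = rng.geometric(1.0 - rho, size=n).sum()
    return out

rng = np.random.default_rng(1)
x = sample_polya_aeppli(lam=2.0, rho=0.3, size=50_000, rng=rng)
print(f"empirical mean {x.mean():.3f} vs theoretical {2.0 / (1 - 0.3):.3f}")
```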

  4. Human reliability data, human error and accident models--illustration through the Three Mile Island accident analysis

    International Nuclear Information System (INIS)

    Le Bot, Pierre

    2004-01-01

    Our first objective is to provide a panorama of the Human Reliability data used in EDF's Probabilistic Safety Studies, and then, since these concepts are at the heart of Human Reliability and its methods, to revisit the notion of human error and the understanding of accidents. We are not sure today that it is actually possible to provide a foolproof and productive theoretical framework in this field. Consequently, the aim of this article is to suggest potential paths of action and to provide information on EDF's progress along those paths, which enables us to produce the most useful Human Reliability analyses possible while taking into account current knowledge in the Human Sciences. The second part of this article illustrates our point of view as EDF researchers through the analysis of the most famous civil nuclear accident, the Three Mile Island Unit 2 accident in 1979. Analysis of this accident allowed us to validate our positions regarding the need to move, in the case of an accident, from the concept of human error to that of systemic failure in the operation of systems such as a nuclear power plant. These concepts rely heavily on the notion of distributed cognition, and we explain how we applied it. These concepts were implemented in the MERMOS Human Reliability Probabilistic Assessment methods used in the latest EDF Probabilistic Human Reliability Assessment. Besides the fact that it is not very productive to focus exclusively on individual psychological error, the design of the MERMOS method and its implementation have confirmed two things: the significance of qualitative data collection for Human Reliability, and the central role held by Human Reliability experts in building knowledge about emergency operation, which in effect consists of Human Reliability data collection. The latest conclusion derived from the implementation of MERMOS is that, considering the difficulty of building 'generic' Human Reliability data in the field we are involved in, the best

  5. Reliable groundwater levels: failures and lessons learned from modeling and monitoring studies

    Science.gov (United States)

    Van Lanen, Henny A. J.

    2017-04-01

    Adequate management of groundwater resources requires an a priori assessment of the impacts of intended groundwater abstractions. Usually, groundwater flow modeling is used to simulate the influence of the planned abstraction on groundwater levels, and model performance is tested against observed groundwater levels. Where a multi-aquifer system occurs, groundwater levels in the different aquifers have to be monitored through observation wells with filters at different depths, i.e. above the impermeable clay layer (phreatic water level) and beneath it (artesian aquifer level). A reliable artesian level can only be measured if the space between the outer wall of the borehole (a vertical narrow shaft) and the observation well is refilled with impermeable material at the correct depth in the post-drilling phase, to prevent a vertical hydraulic connection between the artesian and phreatic aquifers. We were involved in a case of improper refilling, which made it impossible to monitor reliable artesian aquifer levels. At the location of the artesian observation well, a freely overflowing spring was seen, which implied that leakage from the artesian aquifer was affecting the artesian groundwater level. Careful checking of the monitoring sites in a study area is therefore a prerequisite for using observations in model performance assessment. After model testing, the groundwater model is forced with the proposed groundwater abstractions (sites, extraction rates). The abstracted groundwater volume is compensated by a reduction of groundwater flow to the drainage network, and the model simulates the associated groundwater tables. The drawdown of the groundwater level is calculated by comparing the simulated groundwater level with and without groundwater abstraction. In lowland areas, such as vast areas of the Netherlands, the groundwater model has to consider a variable drainage network, which means that small streams only carry water during the wet winter season and run dry during the summer. The main streams drain groundwater

  6. A Test–Retest Reliability Study of Human Experimental Models of Histaminergic and Non-histaminergic Itch

    DEFF Research Database (Denmark)

    Andersen, Hjalte H.; Sørensen, Anne Kathrine R.; Nielsen, Gebbie A.R.

    2017-01-01

    Numerous exploratory, proof-of-concept and interventional studies have used histaminergic and non-histaminergic human models of itch. However, no reliability studies for such surrogate models have been conducted. This study investigated the test–retest reliability of the response to histamine- and cowhage- (5, 15, 25 spiculae) induced itch in healthy volunteers. Cowhage spiculae were individually applied with tweezers and 1% histamine was applied with a skin prick test (SPT) lancet, both on the volar forearm. The intensity of itch was recorded on a visual analogue scale, together with the self-reported area. For a test–retest observation period of one week, cowhage (ICC = 0.57–0.77, CVbetween = 97%, CVwithin = 41%) and SPT-delivered histamine (ICC = 0.83–0.93, CVbetween = 97%, CVwithin = 20%) exhibited moderate-to-excellent intra-individual reliability and moderate inter-individual reliability for the itch intensity.
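
    The intraclass correlation coefficients quoted above can be computed from a subjects-by-sessions score matrix. A minimal sketch of the two-way random, absolute-agreement ICC(2,1) on simulated ratings (the study's raw data are, of course, not reproduced here):

```python
import numpy as np

def icc_2_1(y: np.ndarray) -> float:
    """Two-way random, absolute-agreement, single-rater ICC(2,1)."""
    n, k = y.shape
    grand = y.mean()
    row_m, col_m = y.mean(axis=1), y.mean(axis=0)
    msr = k * ((row_m - grand) ** 2).sum() / (n - 1)   # subjects mean square
    msc = n * ((col_m - grand) ** 2).sum() / (k - 1)   # sessions mean square
    sse = ((y - row_m[:, None] - col_m[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                    # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(7)
subject = rng.normal(5.0, 2.0, size=(30, 1))             # stable itch level
ratings = subject + rng.normal(0.0, 1.0, size=(30, 2))   # two sessions + noise
print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")
```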

  7. Reliable Dual Tensor Model Estimation in Single and Crossing Fibers Based on Jeffreys Prior

    Science.gov (United States)

    Yang, Jianfei; Poot, Dirk H. J.; Caan, Matthan W. A.; Su, Tanja; Majoie, Charles B. L. M.; van Vliet, Lucas J.; Vos, Frans M.

    2016-01-01

    Purpose This paper presents and studies a framework for reliable modeling of diffusion MRI using a data-acquisition adaptive prior. Methods Automated relevance determination estimates the mean of the posterior distribution of a rank-2 dual tensor model exploiting Jeffreys prior (JARD). This data-acquisition prior is based on the Fisher information matrix and enables the assessment whether two tensors are mandatory to describe the data. The method is compared to Maximum Likelihood Estimation (MLE) of the dual tensor model and to FSL’s ball-and-stick approach. Results Monte Carlo experiments demonstrated that JARD’s volume fractions correlated well with the ground truth for single and crossing fiber configurations. In single fiber configurations JARD automatically reduced the volume fraction of one compartment to (almost) zero. The variance in fractional anisotropy (FA) of the main tensor component was thereby reduced compared to MLE. JARD and MLE gave a comparable outcome in data simulating crossing fibers. On brain data, JARD yielded a smaller spread in FA along the corpus callosum compared to MLE. Tract-based spatial statistics demonstrated a higher sensitivity in detecting age-related white matter atrophy using JARD compared to both MLE and the ball-and-stick approach. Conclusions The proposed framework offers accurate and precise estimation of diffusion properties in single and dual fiber regions. PMID:27760166

  8. A Correlated Model for Evaluating Performance and Energy of Cloud System Given System Reliability

    Directory of Open Access Journals (Sweden)

    Hongli Zhang

    2015-01-01

    Full Text Available The serious issue of energy consumption in high performance computing systems has attracted much attention. Performance and energy saving have become important measures of a computing system. In the cloud computing environment, systems usually allocate various resources (such as CPU, memory, and storage) to multiple virtual machines (VMs) for executing tasks. Therefore, the problem of resource allocation for running VMs has a significant influence on both system performance and energy consumption. For different processor utilizations assigned to a VM, there exists a tradeoff between energy consumption and task completion time when a given task is executed by the VMs. Moreover, hardware failure, software failure and restoration characteristics also have a clear influence on overall performance and energy. In this paper, a correlated model is built to analyze both performance and energy in the VM execution environment given a reliability restriction, and an optimization model is presented to derive the most effective processor utilization for the VM. Then, the tradeoff between energy saving and task completion time is studied and balanced when the VMs execute given tasks. Numerical examples are presented to illustrate the performance-energy correlated model and to evaluate the expected task completion time and energy consumption.

  9. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission’s (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are (1) distinct from individual human errors, and (2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research whose objective is to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  10. Model-Based Heterogeneous Data Fusion for Reliable Force Estimation in Dynamic Structures under Uncertainties.

    Science.gov (United States)

    Khodabandeloo, Babak; Melvin, Dyan; Jo, Hongki

    2017-11-17

    Direct measurements of external forces acting on a structure are infeasible in many cases. The Augmented Kalman Filter (AKF) has several attractive features that can be utilized to solve the inverse problem of identifying applied forces, as it requires only the dynamic model and the measured responses of the structure at a few locations. However, the AKF intrinsically suffers from numerical instabilities when accelerations, the most common response measurements in structural dynamics, are the only measured responses. Although displacement measurements can be used to overcome the instability issue, absolute displacement measurements are challenging and expensive for full-scale dynamic structures. In this paper, a reliable model-based data fusion approach that reconstructs the dynamic forces applied to structures using heterogeneous structural measurements (i.e., strains and accelerations) in combination with the AKF is investigated. The way of incorporating multi-sensor measurements into the AKF is formulated, and the formulation is then implemented and validated through numerical examples that consider possible uncertainties in numerical modeling and sensor measurement. A planar truss example was chosen to clearly explain the formulation, although the method and formulation are applicable to other structures as well.
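
    Although the paper's exact formulation is not reproduced in this record, the idea of augmenting the state with the unknown force and fusing strain and acceleration measurements can be sketched for a single-degree-of-freedom oscillator; all system and noise parameters below are invented for illustration:

```python
import numpy as np

m, c, k, dt = 1.0, 0.4, 50.0, 0.01      # SDOF mass, damping, stiffness (assumed)
g_strain = 2.0                           # assumed strain-to-displacement gain

# Discrete-time model for the augmented state z = [disp, vel, force]
A = np.array([[1.0,         dt,               0.0],
              [-dt * k / m, 1.0 - dt * c / m, dt / m],
              [0.0,         0.0,              1.0]])
H = np.array([[g_strain, 0.0,    0.0],        # strain ~ displacement
              [-k / m,   -c / m, 1.0 / m]])   # acceleration equation of motion
Q = np.diag([1e-8, 1e-8, 1e-2])               # random-walk noise on the force
R = np.diag([1e-4, 1e-2])                     # strain / acceleration noise

rng = np.random.default_rng(3)
z_true, z_hat, P = np.zeros(3), np.zeros(3), np.eye(3)
for step in range(2000):
    z_true = A @ z_true
    z_true[2] = 5.0 * np.sin(2 * np.pi * 0.5 * step * dt)  # true applied force
    y = H @ z_true + rng.multivariate_normal(np.zeros(2), R)
    z_hat, P = A @ z_hat, A @ P @ A.T + Q                   # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)            # Kalman gain
    z_hat = z_hat + K @ (y - H @ z_hat)                     # fuse both sensors
    P = (np.eye(3) - K @ H) @ P
print(f"estimated force {z_hat[2]:+.2f} vs true {z_true[2]:+.2f}")
```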

  11. Techniques for modeling the reliability of fault-tolerant systems with the Markov state-space approach

    Science.gov (United States)

    Butler, Ricky W.; Johnson, Sally C.

    1995-01-01

    This paper presents a step-by-step tutorial on the methods and tools used for the reliability analysis of fault-tolerant systems. The approach used in this paper is the Markov (or semi-Markov) state-space method. The paper is intended for design engineers with a basic understanding of computer architecture and fault tolerance, but little knowledge of reliability modeling. The representation of architectural features in mathematical models is emphasized. This paper does not present details of the mathematical solution of complex reliability models. Instead, it describes the use of several recently developed computer programs, SURE, ASSIST, STEM, and PAWS, that automate the generation and solution of these models.
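
    As a toy instance of the Markov state-space approach (independent of the SURE/ASSIST/STEM/PAWS tools named above), the reliability of a two-unit cold-standby system with repair can be obtained by exponentiating the generator matrix; the rates below are illustrative only:

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1   # per-hour failure and repair rates (assumed)
# States: 0 = both units good, 1 = one failed (under repair), 2 = system failed
Q = np.array([[-lam,          lam,  0.0],
              [  mu, -(mu + lam),   lam],
              [ 0.0,          0.0,  0.0]])   # state 2 is absorbing

p0 = np.array([1.0, 0.0, 0.0])
for t in (100.0, 1000.0, 10000.0):
    p_t = p0 @ expm(Q * t)                   # transient state probabilities
    print(f"t = {t:>7.0f} h   R(t) = {1.0 - p_t[2]:.6f}")
```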

  12. Do Diametric Measurements Provide Sufficient and Reliable Tumor Assessment? An Evaluation of Diametric, Areametric, and Volumetric Variability of Lung Lesion Measurements on Computerized Tomography Scans

    Directory of Open Access Journals (Sweden)

    Aaron Frenette

    2015-01-01

    Full Text Available Diametric analysis is the standard approach to tumor measurement on medical imaging. However, the availability of newer, more sophisticated techniques may prove advantageous. An evaluation of diameter, area, and volume was performed on 64 different lung lesions by three trained users. These measurements were obtained using a free DICOM viewer and standardized measuring procedures. Measurement variability was then studied using the relative standard deviation (RSD) and the intraclass correlation. Volumetric measurements were shown to be more precise than diametric ones. With minimal RSD and variance between different users, volumetric analysis was demonstrated to be a reliable measurement technique. Additionally, the diameters were used to calculate an estimated area and volume; the estimated area and volume were then compared against the actual measured values. The results of this study showed that the estimated values did not track the actual values: the estimated area deviated from the measured area by an average of 43.5%, and the estimated volume by 88.03%. This variance was widely scattered and without trend. These results suggest that diametric measurements cannot be reliably correlated with actual tumor size. Access to appropriate software capable of producing volume measurements has improved drastically and shows great potential in the clinical assessment of tumors. Its applicability merits further consideration.

  13. A systematic procedure for the incorporation of common cause events into risk and reliability models

    International Nuclear Information System (INIS)

    Fleming, K.N.; Mosleh, A.; Deremer, R.K.

    1986-01-01

    Common cause events are an important class of dependent events with respect to their contribution to system unavailability and to plant risk. Unfortunately, these events have not been treated with any kind of consistency in applied risk studies over the past decade. Many probabilistic risk assessments (PRAs) have not included these events at all, and those that have did not employ the kind of systematic procedures that are needed to achieve consistency, accuracy, and credibility in this area of PRA methodology. In this paper, the authors report on the progress recently made in the development of a systematic approach for incorporating common cause events into applied risk and reliability evaluations. This approach takes advantage of experience from recently completed PRAs and is the result of a project, sponsored by the Electric Power Research Institute (EPRI), in which procedures for dependent events analysis are being developed. Described in this paper is a general framework for system-level common cause failure (CCF) analysis and its application to a three-train auxiliary feedwater system. Within this general framework, three parametric CCF models are compared, including the basic parameter (BP), multiple Greek letter (MGL), and binomial failure rate (BFR) models. Pitfalls of not following the recommended procedure are discussed, and some old issues, such as the benefits of redundancy and diversity, are reexamined. (orig.)
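
    For a three-train group, the multiple Greek letter model partitions a component's total failure probability into independent, double, and triple contributions. A short sketch with illustrative beta and gamma values (not the report's numbers):

```python
# Multiple Greek Letter (MGL) partitioning for a group of m = 3 trains;
# Qt, beta and gamma below are invented for illustration.
Qt, beta, gamma = 1e-3, 0.10, 0.30

Q1 = (1 - beta) * Qt                  # independent failure of one train
Q2 = 0.5 * beta * (1 - gamma) * Qt    # each specific pair (1 / C(2,1) factor)
Q3 = beta * gamma * Qt                # common cause failure of all three

# Probability that all three trains fail (rare-event approximation):
q_all = Q1**3 + 3 * Q1 * Q2 + Q3
print(f"Q1={Q1:.2e}, Q2={Q2:.2e}, Q3={Q3:.2e}, P(all 3 fail) ~ {q_all:.2e}")
```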

  14. Interrater Reliability of the Postoperative Epidural Fibrosis Classification: A Histopathologic Study in the Rat Model.

    Science.gov (United States)

    Sae-Jung, Surachai; Jirarattanaphochai, Kitti; Sumananont, Chat; Wittayapairoj, Kriangkrai; Sukhonthamarn, Kamolsak

    2015-08-01

    Agreement study. To validate the interrater reliability of the histopathological classification of the post-laminectomy epidural fibrosis in an animal model. Epidural fibrosis is a common cause of failed back surgery syndrome. Many animal experiments have been developed to investigate the prevention of epidural fibrosis. One of the common outcome measurements is the epidural fibrous adherence grading, but the classification has not yet been validated. Five identical sets of histopathological digital files of L5-L6 laminectomized adult Sprague-Dawley rats, representing various degrees of postoperative epidural fibrous adherence were randomized and evaluated by five independent assessors masked to the study processes. Epidural fibrosis was rated as grade 0 (no fibrosis), grade 1 (thin fibrous band), grade 2 (continuous fibrous adherence for less than two-thirds of the laminectomy area), or grade 3 (large fibrotic tissue for more than two-thirds of the laminectomy area). A statistical analysis was performed. Four hundred slides were independently evaluated by each assessor. The percent agreement and intraclass correlation coefficient (ICC) between each pair of assessors varied from 73.5% to 81.3% and from 0.81 to 0.86, respectively. The overall ICC was 0.83 (95% confidence interval, 0.81-0.86). The postoperative epidural fibrosis classification showed almost perfect agreement among the assessors. This classification can be used in research involving the histopathology of postoperative epidural fibrosis; for example, for the development of preventions of postoperative epidural fibrosis or treatment in an animal model.

  15. Reliability of thickness of oxide layer of stainless steels with chromium using cellular automaton model

    International Nuclear Information System (INIS)

    Lan, K. C.; Chen, Y.; Yu, G. P.; Hung, T. C.

    2012-01-01

    A cellular automaton (CA) model based on a stochastic approach was proposed to simulate the oxidation and corrosion of stainless steels with different chromium contents in flowing lead-bismuth eutectic (LBE). Chromium is a crucial alloying element added to the stainless steels and nickel-based alloys that have been proposed for use in advanced nuclear reactors to improve oxidation and corrosion resistance. To verify the reliability of the oxide-layer thickness predicted by the CA model, the influence of the stochastic character on the simulation results was investigated by varying the chromium content of the structural material. Ten independent simulations were run for each specific environment. Stable and reasonable results were obtained according to the chi-square goodness-of-fit test: the chi-square statistic of the oxide-layer thickness for each case was significantly smaller than the critical chi-square value at a confidence level of 95% (χ²(α, ν) = χ²(0.05, 9) = 16.92). (authors)
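
    The goodness-of-fit check described above can be reproduced in outline: compare a chi-square statistic computed over repeated simulation outputs against the critical value χ²(0.05, 9) = 16.92. The thickness values below are simulated stand-ins, not the paper's data:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(5)
thickness = rng.normal(loc=12.0, scale=0.4, size=10)  # 10 CA runs, micrometres

expected = np.full(10, thickness.mean())
stat = ((thickness - expected) ** 2 / expected).sum()  # Pearson-type statistic
critical = chi2.ppf(0.95, df=9)                        # = 16.92 for nu = 9
print(f"chi2 = {stat:.3f}, critical = {critical:.2f}, "
      f"consistent: {stat < critical}")
```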

  16. Reliability of delirium rating scale (DRS) and delirium rating scale-revised-98 (DRS-R98) using variance-based multivariate modelling.

    Science.gov (United States)

    Adamis, Dimitrios; Slor, Chantal J; Leonard, Maeve; Witlox, Joost; de Jonghe, Jos F M; Macdonald, Alastair J D; Trzepacz, Paula; Meagher, David

    2013-07-01

    Delirium's characteristic fluctuation in symptom severity complicates the assessment of the test-retest reliability of scales using classical analyses, but the application of modelling to longitudinal data offers a new approach. We evaluated the test-retest reliability of the delirium rating scale (DRS) and the delirium rating scale-revised-98 (DRS-R98), two widely used instruments with high validity and inter-rater reliability. Two existing longitudinal datasets for each scale included DSM-IV criteria for delirium diagnosis and repeated measurements using the DRS or DRS-R98. To estimate the reliability coefficients RT and RΛ for each scale, we used macros provided by Dr. Laenen at http://www.ibiostat.be/software/measurement.asp. For each dataset a linear mixed-effects model was fitted to estimate the variance-covariance parameters. The datasets included a total of 531 cases, with between 4 and 9 measurement points across studies, covering both delirious and non-delirious patients. Comorbid dementia in the datasets varied from 27% to 55%. Overall RT values for the DRS were 0.71 and 0.50, and for the DRS-R98 0.75 and 0.84. RΛ values for the DRS were 0.99 and 0.98, and for the DRS-R98 0.92 and 0.96. Individual RT measures for the DRS-R98 and DRS across visits within studies showed a wider range than the overall values. Our models found high overall reliability for both scales. Multiple factors impact a scale's reliability values, including sample size, repeated measurements, and patient population, in addition to rater variability. Copyright © 2013 Elsevier Ltd. All rights reserved.
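
    A variance-based reliability coefficient of this kind can be illustrated with a random-intercept mixed model, taking the reliability as the between-subject share of total variance. This is a schematic stand-in for the Laenen macros, run on simulated data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n_pat, n_visits = 100, 5
patient = np.repeat(np.arange(n_pat), n_visits)
severity = rng.normal(15.0, 5.0, n_pat)              # stable patient level
score = severity[patient] + rng.normal(0.0, 3.0, n_pat * n_visits)
df = pd.DataFrame({"patient": patient, "score": score})

fit = smf.mixedlm("score ~ 1", df, groups=df["patient"]).fit()
var_between = float(fit.cov_re.iloc[0, 0])           # subject variance
var_within = float(fit.scale)                        # residual variance
print(f"R = {var_between / (var_between + var_within):.2f}")
```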

  17. Providing or designing? Constructing models in primary maths education

    NARCIS (Netherlands)

    van Dijk, I.M.A.W.; van Oers, H.J.M.; Terwel, J.

    2003-01-01

    The goal of this exploratory study was to uncover the construction processes that occur when pupils are taught to work with models in primary maths education. Two approaches were studied: 'providing models' versus 'designing models in co-construction'. A qualitative observational study involved two

  18. A simulation model for reliability-based appraisal of an energy policy: The case of Lebanon

    International Nuclear Information System (INIS)

    Hamdan, H.A.; Ghajar, R.F.; Chedid, R.B.

    2012-01-01

    The Lebanese Electric Power System (LEPS) has suffered from technical and financial deficiencies for decades. It mirrors the problems encountered in many developing countries with inadequate or absent power system planning: incomplete and poorly operating infrastructure, compounded by political instability, heavy debts, unavailability of financing for desired projects, and inefficiency in operation. Upgrading and developing the system necessitates the adoption of a comprehensive energy policy that introduces solutions to a diversity of problems addressing the technical, financial, administrative and governance aspects of the system. In this paper, an energy policy for Lebanon is proposed and evaluated based on an integration of energy modeling and financial modeling. The paper utilizes the Load Modification Technique (LMT) as a probabilistic tool to assess the impact of policy implementation on energy production, overall cost, technical/commercial losses and reliability. Scenarios reflecting the implementation of policy projects are assessed, and their impacts are compared with business-as-usual scenarios which assume that no new investment takes place in the sector. Conclusions are drawn on the usefulness of the proposed evaluation methodology and the effectiveness of the adopted energy policy for Lebanon and other developing countries suffering from similar power system problems. - Highlights: ► An evaluation methodology based on a probabilistic simulation tool is proposed. ► A business-as-usual scenario for a given study period of the LEPS was modeled. ► Mitigation scenarios reflecting implementation of the energy policy are modeled. ► The policy is simulated and compared with business-as-usual scenarios of the LEPS. ► Results reflect the usefulness of the proposed methodology and the adopted energy policy.
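
    The paper's LMT implementation is not reproduced here, but the underlying probabilistic idea, convolving unit outages into a capacity outage probability table and intersecting it with the load duration curve to obtain a loss-of-load probability, can be sketched with invented units and loads:

```python
import numpy as np

# Units as (capacity MW, forced outage rate); capacities must be multiples
# of the discretization step. All values are illustrative.
units = [(100, 0.05), (100, 0.05), (150, 0.08), (200, 0.10)]
total_cap = sum(c for c, _ in units)
step = 50

# Capacity outage probability table built by discrete convolution
outage_p = np.zeros(total_cap // step + 1)
outage_p[0] = 1.0
for cap, q in units:
    shifted = np.zeros_like(outage_p)
    shifted[cap // step:] = outage_p[: len(outage_p) - cap // step]
    outage_p = (1 - q) * outage_p + q * shifted

rng = np.random.default_rng(2)
load = rng.uniform(200, 450, size=8760)          # synthetic hourly load, MW

lolp = 0.0
for n_out, p in enumerate(outage_p):
    available = total_cap - n_out * step
    lolp += p * np.mean(load > available)        # P(load exceeds capacity)
print(f"LOLP = {lolp:.5f}  (LOLE ~ {lolp * 8760:.1f} h/yr)")
```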

  19. A Model to Partly but Reliably Distinguish DDOS Flood Traffic from Aggregated One

    Directory of Open Access Journals (Sweden)

    Ming Li

    2012-01-01

    Full Text Available Reliably distinguishing DDOS flood traffic from aggregated traffic is keenly desired for the reliable prevention of DDOS attacks. By reliable distinguishing, we mean that flood traffic can be distinguished from aggregated traffic with a predetermined probability. The basis for reliably distinguishing flood traffic from aggregated traffic is the reliable detection of signs of DDOS flood attacks. As is known, reliably distinguishing DDOS flood traffic from aggregated traffic is a tough task, mainly due to the effects of flash-crowd traffic. For this reason, this paper studies reliable detection in an underlying DiffServ network that uses static-priority schedulers. In this network environment, we present a method for the reliable detection of signs of DDOS flood attacks for a given class with a given priority. Two assumptions are introduced in this study. One is that flash-crowd traffic does not have all priorities but only some. The other is that attack traffic has all priorities in all classes; otherwise an attacker cannot completely achieve its DDOS goal. Further, we suppose that the protected site is equipped with a sensor that has a signature library of the legitimate traffic with the priorities flash-crowd traffic does not have. On this basis, we are able to reliably distinguish attack traffic from aggregated traffic with the priorities that flash-crowd traffic does not have, according to a given detection probability.

  20. Towards a High Reliable Enforcement of Safety Regulations - A Workflow Meta Data Model and Probabilistic Failure Management Approach

    Directory of Open Access Journals (Sweden)

    Heiko Henning Thimm

    2016-10-01

    Full Text Available Today’s companies are able to automate the enforcement of Environmental, Health and Safety (EH&S) duties through the use of workflow management technology. This approach requires specifying activities that are combined into workflow models for EH&S enforcement duties. In order to meet the given safety regulations, these activities must be completed correctly and within the given deadlines. Otherwise, activity failures emerge, which may lead to breaches of safety regulations. A novel domain-specific workflow meta data model is proposed. The model enables a system to detect and predict activity failures through the use of data about the company, failure statistics, and activity proxies. Since the detection and prediction methods are based on the evaluation of constraints specified on EH&S regulations, a system approach is proposed that builds on the integration of a Workflow Management System (WMS) with an EH&S Compliance Information System. The main principles of failure detection and prediction are described. For EH&S managers, the system shall provide insight into the current failure situation. This can help to prevent and mitigate critical situations, such as safety enforcement measures that are behind their deadlines. As a result, a more reliable enforcement of safety regulations can be achieved.

  1. Are mixed explicit/implicit solvation models reliable for studying phosphate hydrolysis? A comparative study of continuum, explicit and mixed solvation models.

    Energy Technology Data Exchange (ETDEWEB)

    Kamerlin, Shina C. L.; Haranczyk, Maciej; Warshel, Arieh

    2009-05-01

    Phosphate hydrolysis is ubiquitous in biology. However, despite intensive research on this class of reactions, the precise nature of the reaction mechanism remains controversial. In this work, we have examined the hydrolysis of three homologous phosphate diesters. The solvation free energy was simulated by means of either an implicit solvation model (COSMO), hybrid quantum mechanical/molecular mechanical free energy perturbation (QM/MM-FEP), or a mixed solvation model in which N water molecules were explicitly included in the ab initio description of the reacting system (where N = 1–3), with the remainder of the solvent being modelled implicitly as a continuum. Both COSMO and QM/MM-FEP reproduce ΔGobs within an error of about 2 kcal/mol. However, we demonstrate that in order to obtain any form of reliable results from a mixed model, it is essential to carefully select the explicit water molecules from short QM/MM runs that act as a model for the true infinite system. Additionally, the mixed models tend to become increasingly inaccurate as more explicit water molecules are placed into the system. Thus, our analysis indicates that this approach provides an unreliable way of modelling phosphate hydrolysis in solution.

  2. Forecasting consequences of accidental release: how reliable are current assessment models

    International Nuclear Information System (INIS)

    Rohwer, P.S.; Hoffman, F.O.; Miller, C.W.

    1983-01-01

    This paper focuses on uncertainties in the model output used to assess accidents. We begin by reviewing the historical development of assessment models and the associated interest in their uncertainties as these evolutionary processes occurred in the United States. This is followed by a description of the sources of uncertainty in assessment calculations. Types of models appropriate for the assessment of accidents are identified. A summary is provided of results from our analysis of the uncertainty in results obtained with current methodology for assessing routine and accidental radionuclide releases to the environment. We conclude with a discussion of preferred procedures and suggested future directions to improve the state of the art of radiological assessments.

  3. JUPITER: Joint Universal Parameter IdenTification and Evaluation of Reliability - An Application Programming Interface (API) for Model Analysis

    Science.gov (United States)

    Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.

    2006-01-01

    The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel-processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; and determining the additional data needed to improve selected model

  4. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book covers reliability engineering: the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR, the normal distribution and the Weibull distribution; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.
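
    To connect the CFR/IFR terminology above with formulas: for a Weibull model, the reliability and hazard functions follow directly from the shape parameter. A small sketch (parameter values invented):

```python
import numpy as np

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta); beta < 1 gives DFR, beta = 1 CFR, beta > 1 IFR."""
    return np.exp(-(t / eta) ** beta)

def weibull_hazard(t, beta, eta):
    """h(t) = (beta/eta) * (t/eta)^(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

t = np.array([100.0, 500.0, 1000.0])
for beta in (0.8, 1.0, 2.0):   # decreasing, constant, increasing failure rate
    print(f"beta={beta}: R={np.round(weibull_reliability(t, beta, 1000.0), 3)}"
          f", h={np.round(weibull_hazard(t, beta, 1000.0), 5)}")
```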

  5. Evaluation of MCF10A as a Reliable Model for Normal Human Mammary Epithelial Cells.

    Directory of Open Access Journals (Sweden)

    Ying Qu

    Full Text Available Breast cancer is the most common cancer in women and a leading cause of cancer-related deaths for women worldwide. Various cell models have been developed to study breast cancer tumorigenesis, metastasis, and drug sensitivity. The MCF10A human mammary epithelial cell line is a widely used in vitro model for studying normal breast cell function and transformation. However, there is limited knowledge about whether MCF10A cells reliably represent normal human mammary cells. MCF10A cells were grown in monolayer, suspension (mammosphere culture, three-dimensional (3D "on-top" Matrigel, 3D "cell-embedded" Matrigel, or mixed Matrigel/collagen I gel. Suspension culture was performed with the MammoCult medium and low-attachment culture plates. Cells grown in 3D culture were fixed and subjected to either immunofluorescence staining or embedding and sectioning followed by immunohistochemistry and immunofluorescence staining. Cells or slides were stained for protein markers commonly used to identify mammary progenitor and epithelial cells. MCF10A cells expressed markers representing luminal, basal, and progenitor phenotypes in two-dimensional (2D culture. When grown in suspension culture, MCF10A cells showed low mammosphere-forming ability. Cells in mammospheres and 3D culture expressed both luminal and basal markers. Surprisingly, the acinar structure formed by MCF10A cells in 3D culture was positive for both basal markers and the milk proteins β-casein and α-lactalbumin. MCF10A cells exhibit a unique differentiated phenotype in 3D culture which may not exist or be rare in normal human breast tissue. Our results raise a question as to whether the commonly used MCF10A cell line is a suitable model for human mammary cell studies.

  6. Evaluation of MCF10A as a Reliable Model for Normal Human Mammary Epithelial Cells.

    Science.gov (United States)

    Qu, Ying; Han, Bingchen; Yu, Yi; Yao, Weiwu; Bose, Shikha; Karlan, Beth Y; Giuliano, Armando E; Cui, Xiaojiang

    2015-01-01

    Breast cancer is the most common cancer in women and a leading cause of cancer-related deaths for women worldwide. Various cell models have been developed to study breast cancer tumorigenesis, metastasis, and drug sensitivity. The MCF10A human mammary epithelial cell line is a widely used in vitro model for studying normal breast cell function and transformation. However, there is limited knowledge about whether MCF10A cells reliably represent normal human mammary cells. MCF10A cells were grown in monolayer, suspension (mammosphere culture), three-dimensional (3D) "on-top" Matrigel, 3D "cell-embedded" Matrigel, or mixed Matrigel/collagen I gel. Suspension culture was performed with the MammoCult medium and low-attachment culture plates. Cells grown in 3D culture were fixed and subjected to either immunofluorescence staining or embedding and sectioning followed by immunohistochemistry and immunofluorescence staining. Cells or slides were stained for protein markers commonly used to identify mammary progenitor and epithelial cells. MCF10A cells expressed markers representing luminal, basal, and progenitor phenotypes in two-dimensional (2D) culture. When grown in suspension culture, MCF10A cells showed low mammosphere-forming ability. Cells in mammospheres and 3D culture expressed both luminal and basal markers. Surprisingly, the acinar structure formed by MCF10A cells in 3D culture was positive for both basal markers and the milk proteins β-casein and α-lactalbumin. MCF10A cells exhibit a unique differentiated phenotype in 3D culture which may not exist or be rare in normal human breast tissue. Our results raise a question as to whether the commonly used MCF10A cell line is a suitable model for human mammary cell studies.

  7. An Appropriate Wind Model for Wind Integrated Power Systems Reliability Evaluation Considering Wind Speed Correlations

    Directory of Open Access Journals (Sweden)

    Rajesh Karki

    2013-02-01

    Full Text Available Adverse environmental impacts of carbon emissions are causing increasing concern to the general public throughout the world. Electric energy generation from conventional energy sources is considered to be a major contributor to these harmful emissions. High emphasis is therefore being placed on green alternatives, such as wind and solar. Wind energy is perceived as a promising alternative. This energy technology and its applications have undergone significant research and development over the past decade. As a result, many modern power systems include a significant portion of power generation from wind energy sources. The impact of wind generation on overall system performance increases substantially as wind penetration continues to rise to relatively high levels. It becomes increasingly important to accurately model the wind behavior, the interaction with other wind sources and conventional sources, and the characteristics of the energy demand in order to carry out a realistic evaluation of system reliability. Power systems with high wind penetration are often connected to multiple wind farms at different geographic locations. Wind speed correlations between the different wind farms largely affect the total wind power generation characteristics of such systems, and should therefore be an important parameter in the wind modeling process. This paper evaluates the effect of the correlation between multiple wind farms on the adequacy indices of wind-integrated systems. The paper also proposes a simple and appropriate probabilistic analytical model that incorporates wind correlations and can be used for the adequacy evaluation of systems integrating multiple wind farms.
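
    One common way to generate correlated wind speeds for two farms, not necessarily the paper's method, is a Gaussian copula with Weibull marginals; the shape/scale values and the correlation below are assumptions:

```python
import numpy as np
from scipy.stats import norm, weibull_min

rng = np.random.default_rng(8)
rho = 0.7                                  # assumed wind-speed correlation
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=8760)   # hourly, one year
u = norm.cdf(z)                            # correlated uniform marginals

# Map to Weibull wind-speed marginals (shape c, scale per site, assumed)
site1 = weibull_min.ppf(u[:, 0], c=2.0, scale=8.0)
site2 = weibull_min.ppf(u[:, 1], c=2.2, scale=7.0)
print(f"resulting correlation ~ {np.corrcoef(site1, site2)[0, 1]:.2f}")
```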

  8. CKow -- A More Transparent and Reliable Model for Chemical Transfer to Meat and Milk

    Energy Technology Data Exchange (ETDEWEB)

    Rosenbaum, Ralph K.; McKone, Thomas E.; Jolliet, Olivier

    2009-03-01

    The objective of this study is to increase the understanding and transparency of modeling chemical biotransfer into meat and milk, and to explicitly confront the uncertainties in exposure assessments of chemicals that require such estimates. In cumulative exposure assessments that include food pathways, much of the overall uncertainty is attributable to the estimation of transfer into biota and through food webs. Currently, the most commonly used meat and milk biotransfer models date back two decades and, in spite of their widespread use in multimedia exposure models, few attempts have been made to advance or improve the outdated and highly uncertain Kow regressions used in these models. Furthermore, in the range of Kow where meat and milk become the dominant human exposure pathways, these models often provide unrealistic rates and do not properly reflect the transfer dynamics. To address these issues, we developed a dynamic three-compartment cow model (called CKow), distinguishing lactating and non-lactating cows. For chemicals without available overall removal rates in the cow, a correlation is derived from measured values reported in the literature to predict this parameter from Kow. Results on carry-over rates (COR) and biotransfer factors (BTF) demonstrate that a steady-state ratio between animal intake and meat concentration is almost never reached. For meat, empirical data collected in short-term experiments need to be adjusted to provide estimates of average longer-term behaviour. The performance of the new model in matching measurements is improved relative to existing models, thus reducing uncertainty. The CKow model is straightforward to apply at steady state for milk and dynamically for realistic exposure durations for meat COR.
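
    The point that the steady-state carry-over rate is almost never reached can be illustrated with a one-compartment caricature of such a model (the actual CKow model has three compartments; all rates below are invented):

```python
k_out = 0.05     # overall removal rate, 1/day (assumed)
intake = 1.0     # chemical intake rate, mg/day (assumed)
f_milk = 0.4     # fraction of removal leaving via milk (assumed)

dt, days = 0.1, 30
B = 0.0                                   # body burden, mg
for _ in range(int(days / dt)):
    B += dt * (intake - k_out * B)        # dB/dt = I - k_out * B

cor_now = f_milk * k_out * B / intake     # carry-over rate into milk
print(f"COR after {days} d: {cor_now:.2f} (steady-state limit {f_milk:.2f})")
```

    With a removal time constant of 20 days, the carry-over rate after a typical 30-day feeding trial is still well below its asymptote, which is exactly why short-term empirical data need adjustment.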

  9. New Provider Models for Sweden and Spain: Public, Private or Non-profit? Comment on "Governance, Government, and the Search for New Provider Models".

    Science.gov (United States)

    Jeurissen, Patrick P T; Maarse, Hans

    2016-06-29

    Sweden and Spain experiment with different provider models to reform healthcare provision. Both models have in common that they extend the role of the for-profit sector in healthcare. As the analysis of Saltman and Duran demonstrates, privatisation is an ambiguous and contested strategy that is used for quite different purposes. In our comment, we emphasize that their analysis leaves questions open on the consequences of privatisation for the performance of healthcare and the role of the public sector in healthcare provision. Furthermore, we briefly address the absence of the option of healthcare provision by not-for-profit providers in the privatisation strategy of Sweden and Spain. © 2016 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  10. Hot Spot Temperature and Grey Target Theory-Based Dynamic Modelling for Reliability Assessment of Transformer Oil-Paper Insulation Systems: A Practical Case Study

    Directory of Open Access Journals (Sweden)

    Lefeng Cheng

    2018-01-01

    Full Text Available This paper develops a novel dynamic correction method for the reliability assessment of large oil-immersed power transformers. First, with the transformer oil-paper insulation system (TOPIS) as the target of evaluation and the winding hot spot temperature (HST) as the core quantity, an HST-based static ageing failure model is built according to the Weibull distribution and the Arrhenius reaction law, in order to describe the transformer ageing process and calculate the winding HST, from which the failure rate and life expectancy of the TOPIS are obtained. A grey-target-theory-based dynamic correction model is then developed, combined with Dissolved Gas Analysis (DGA) data from the transformer oil, to dynamically modify the life expectancy calculated by the static model, so that a correspondence between the state grade and the life expectancy correction coefficient of the TOPIS can be built. Furthermore, the life expectancy loss recovery factor is introduced to correct the life expectancy of the TOPIS again. Lastly, a practical case study of an operating transformer has been undertaken, in which the failure rate curve after the dynamic corrections is obtained for the reliability assessment of this transformer. The curve shows a better ability to track the actual reliability level of the transformer, thus verifying the validity of the proposed method and providing a new way to assess transformer reliability. This contribution presents a novel model for the reliability assessment of the TOPIS, in which the DGA data, as a source of information for the dynamic correction, are processed based on grey target theory; thus the internal faults of the power transformer can be diagnosed accurately and its life expectancy updated in time, ensuring that the dynamic assessment values can closely track and reflect the actual operating state of the power transformer.
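
    The link between winding HST and ageing is commonly expressed through an Arrhenius-type acceleration factor. A sketch using the IEEE C57.91-style constant for thermally upgraded paper, referenced to a 110 °C hot spot (a standard textbook relation, not necessarily the paper's exact ageing model):

```python
import math

def aging_acceleration(hot_spot_c: float) -> float:
    """Arrhenius-type ageing acceleration factor, IEEE C57.91 style,
    for thermally upgraded paper referenced to 110 degC."""
    return math.exp(15000.0 / 383.0 - 15000.0 / (hot_spot_c + 273.0))

for hst in (98.0, 110.0, 120.0, 130.0):
    print(f"HST {hst:5.1f} degC -> F_AA = {aging_acceleration(hst):6.2f}")
```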

  11. Implementation of a cognitive human reliability model in dynamic probabilistic risk assessment of a nuclear power plant (ADS-IDA)

    Science.gov (United States)

    Shukri, Tariq Mohamad

    This research has resulted in the development of ADS-IDA, an integrated software package for performing dynamic probabilistic risk assessment (PRA). Unique features of ADS-IDA include (1) a modular structure allowing the code to be used for generic applications, although the current implementation includes modules designed for nuclear power plant risk assessment, and (2) implementation of the IDA human reliability model, a cognitive model of operator actions during accident scenarios. ADS-IDA can be used to: (1) perform a full-scale dynamic PRA of a nuclear power plant (or of other systems, by replacing the NPP module with the appropriate model and providing the knowledge bases (KBs) associated with the operator model); (2) identify and analyze human errors and their causes, including errors of commission and omission, as well as the effect of the timing of operator responses; (3) run cases not covered by current PRAs, such as scenarios involving instrument failure or miscalibration; (4) test the relevance of the emergency operating procedures (or abnormal operating procedures, if available); (5) investigate the effect of different system failure modes, and particularly the time of system mode transition (e.g., failure, termination of operation, repair), on the progression of the accident; and (6) analyze the impact of items 2, 3, and 5 above on the evolution of accident scenarios and the final state of the plant. ADS-IDA was used to perform a dynamic PRA of an accident initiator at an actual nuclear power plant, demonstrating the capabilities of the methodology both in modeling plant behavior and in simulating operator errors of commission and omission.

  12. Hierarchical nanoreinforced composites for highly reliable large wind turbines: Computational modelling and optimization

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon

    2014-01-01

    The major precondition for the successful development of wind energy in Europe is the high reliability of wind turbines, in particular large off-shore turbines. The qualitative enhancement of the reliability of wind turbine blades can be achieved by the development of new, highly damage-resistant materials with modified, hybrid or nanomodified structures. In this project, we seek to explore the potential of hybrid (carbon/glass), nanoreinforced and hierarchical composites (with secondary CNT, graphene or nanoclay reinforcement) as future materials for highly reliable large wind turbines. Using 3D multiscale...

  13. Are reactive transport models reliable tools for reconstructing historical contamination scenarios?

    Science.gov (United States)

    Clement, P.

    2009-12-01

    models to reconstruct the historical concentration levels. In this presentation, I will first briefly review the details of the contamination problem and the modeling results. Later, I will use the field study to answer the following questions: 1) Are reactive transport modeling tools sufficiently reliable for reconstructing historical VOC contamination at field sites? 2) What are the benefits of using reactive transport models for resolving policy questions related to a groundwater risk/exposure assessment problem? Finally, we will use this example to answer a rhetorical question: how much complexity is too much complexity?

  14. The LNT model provides the best approach for practical implementation of radiation protection.

    Science.gov (United States)

    Martin, C J

    2005-01-01

    This contribution argues the case that, at the present time, the linear-no-threshold (LNT) model provides the only rational framework on which practical radiation protection can be organized. Political, practical and healthcare difficulties with attempting to introduce an alternative approach, e.g. a threshold model, are discussed.

  15. Effectiveness of Video Modeling Provided by Mothers in Teaching Play Skills to Children with Autism

    Science.gov (United States)

    Besler, Fatma; Kurt, Onur

    2016-01-01

    Video modeling is an evidence-based practice that can be used to provide instruction to individuals with autism. Studies show that this instructional practice is effective in teaching many types of skills, such as self-help skills, social skills, and academic skills. However, in previous studies, the videos used in the video modeling process were…

  16. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by two different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE) in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because the computational effort of LEW is low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data were used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, this led to better identifiability of parameters and consequently a better model structure than that of the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show comparable efficiency in representing discharge. However, the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. GRACE
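
    The GLUE procedure mentioned above amounts to Monte Carlo sampling of parameter sets and retention of "behavioural" ones above a likelihood threshold. A toy sketch with an invented bucket model and a Nash-Sutcliffe criterion (all parameters and thresholds are illustrative):

```python
import numpy as np

rng = np.random.default_rng(9)

def toy_bucket_model(rain, capacity, k):
    """Linear-reservoir runoff model with a simple storage threshold."""
    store, q = 0.0, np.zeros_like(rain)
    for i, r in enumerate(rain):
        store = min(store + r, capacity)   # storage capped at threshold
        q[i] = k * store                   # linear-reservoir outflow
        store -= q[i]
    return q

rain = rng.gamma(0.5, 4.0, size=365)
q_obs = toy_bucket_model(rain, 80.0, 0.05) + rng.normal(0, 0.1, 365)

samples, behavioural = 5000, []
for _ in range(samples):
    cap, k = rng.uniform(20, 150), rng.uniform(0.01, 0.2)
    q_sim = toy_bucket_model(rain, cap, k)
    ns = 1 - np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)
    if ns > 0.7:                           # GLUE behavioural threshold
        behavioural.append((cap, k, ns))
print(f"{len(behavioural)} behavioural parameter sets out of {samples}")
```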

  17. An overview of the IAEA Safety Series on procedures for evaluating the reliability of predictions made by environmental transfer models

    International Nuclear Information System (INIS)

    Hoffman, F.W.; Hofer, E.

    1987-10-01

    The International Atomic Energy Agency is preparing a Safety Series publication on practical approaches for evaluating the reliability of the predictions made by environmental radiological assessment models. This publication identifies factors that affect the reliability of these predictions and discusses methods for quantifying uncertainty. Emphasis is placed on understanding the quantity of interest specified by the assessment question and distinguishing between stochastic variability and lack of knowledge about either the true value or the true distribution of values for the quantity of interest. Among the many approaches discussed, model testing using independent data sets (model validation) is considered the best method for evaluating the accuracy of model predictions. Analytical and numerical methods for propagating the uncertainties in model parameters are presented, and the strengths and weaknesses of model intercomparison exercises are also discussed. It is recognized that subjective judgment is employed throughout the entire modelling process, and quantitative reliability statements must be subjectively obtained when models are applied to situations different from those under which they have been tested. (6 refs.)

  18. A Markovian Approach Applied to Reliability Modeling of Bidirectional DC-DC Converters Used in PHEVs and Smart Grids

    Directory of Open Access Journals (Sweden)

    M. Khalilzadeh

    2016-12-01

    Full Text Available In this paper, a stochastic approach is proposed for the reliability assessment of bidirectional DC-DC converters, including fault-tolerant ones. This type of converter can be used in a smart DC grid, feeding DC loads such as home appliances and plug-in hybrid electric vehicles (PHEVs). The reliability of bidirectional DC-DC converters is of particular importance, given the key role that DC grids are expected to play in the modern Smart Grid. Markov processes are suggested for reliability modeling and, consequently, for calculating the expected effective lifetime of bidirectional converters. A three-leg bidirectional interleaved converter using data from the Toyota Prius 2012 hybrid electric vehicle is used as a case study. In addition, the influence of environment and ambient temperature on converter lifetime is studied. The impact of modeling the reliability of the converter, and of adding reliability constraints, on the technical design procedure of the converter is also investigated. In order to investigate the effect of increasing the number of legs on the lifetime of the converter, single-leg to five-leg interleaved DC-DC converters are studied from an economic perspective, and the results are extrapolated to six- and seven-leg converters. The proposed method can be generalized so that the number of legs and of input and output capacitors can be arbitrary.
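
A rough illustration of the Markov approach described above (not the paper's specific converter model; the two transient states, the failure rate, and the absence of repair below are assumptions for the sketch): the expected lifetime is obtained from the generator sub-matrix restricted to the transient states.

```python
# Minimal Markov reliability sketch for a converter that tolerates one
# module failure. States and rates are illustrative, not from the paper.
import numpy as np

lam = 2e-5   # per-module failure rate (1/h), assumed
mu = 0.0     # no repair in this sketch

# Q_T: generator sub-matrix over the transient states.
# State 0 = both modules healthy, state 1 = one module failed (degraded).
# The absorbing "converter down" state is left implicit.
Q_T = np.array([
    [-(2 * lam),      2 * lam],   # healthy -> degraded
    [        mu, -(mu + lam)],    # degraded -> down
])

# Mean time to failure from each starting state: solve Q_T @ m = -1.
mttf = np.linalg.solve(Q_T, -np.ones(2))
print(f"MTTF from the healthy state: {mttf[0]:.0f} h")
```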

  19. Applications and Extensions of Signature Theory to Modeling and Inference Problems in Engineering Reliability

    Science.gov (United States)

    2012-01-26

    systems burned in to a given ordered component failure time. In a reliability economics framework, we illustrate how one might compare a new system... rate” (IFR) members of this parametric family). (8) Signature-based representations for the reliability of systems with heterogeneous components... and the search for a uniformly optimal network (UON) could, at least conceptually, be based on them. In Boland, Samaniego and Vestrup (2003), a new

  20. Development of a foraging model framework to reliably estimate daily food consumption by young fishes

    Science.gov (United States)

    Deslauriers, David; Rosburg, Alex J.; Chipps, Steven R.

    2017-01-01

    We developed a foraging model for young fishes that incorporates handling and digestion rate to estimate daily food consumption. Feeding trials were used to quantify functional feeding response, satiation, and gut evacuation rate. Once parameterized, the foraging model was then applied to evaluate effects of prey type, prey density, water temperature, and fish size on daily feeding rate by age-0 (19–70 mm) pallid sturgeon (Scaphirhynchus albus). Prey consumption was positively related to prey density (for fish >30 mm) and water temperature, but negatively related to prey size and the presence of sand substrate. Model evaluation results revealed good agreement between observed estimates of daily consumption and those predicted by the model (r2 = 0.95). Model simulations showed that fish feeding on Chironomidae or Ephemeroptera larvae were able to gain mass, whereas fish feeding solely on zooplankton lost mass under most conditions. By accounting for satiation and digestive processes in addition to handling time and prey density, the model provides realistic estimates of daily food consumption that can prove useful for evaluating rearing conditions for age-0 fishes.
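
The coupling of a functional feeding response with satiation and gut evacuation can be sketched as follows. This is a generic Holling type II response with a simple hourly gut-capacity cap; all parameter values are invented for illustration and are not the authors' parameterization for pallid sturgeon.

```python
# Sketch: daily consumption limited by both encounter/handling (Holling
# type II) and gut processing. Parameters are illustrative, not from the study.
def type2_feeding_rate(prey_density, attack_rate=0.8, handling_time=0.02):
    """Prey items consumed per hour (Holling type II functional response)."""
    return attack_rate * prey_density / (1.0 + attack_rate * handling_time * prey_density)

def daily_consumption(prey_density, gut_capacity=50.0, evac_rate=0.15, hours=24):
    """Hourly time step: intake is capped by remaining gut capacity (satiation)."""
    gut, eaten = 0.0, 0.0
    for _ in range(hours):
        gut *= (1.0 - evac_rate)                       # digestion/evacuation
        intake = min(type2_feeding_rate(prey_density),
                     gut_capacity - gut)               # satiation constraint
        gut += intake
        eaten += intake
    return eaten

print(f"Daily intake at 100 prey/m^2: {daily_consumption(100):.1f} items")
```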

  1. Maximizing Energy Savings Reliability in BC Hydro Industrial Demand-side Management Programs: An Assessment of Performance Incentive Models

    Science.gov (United States)

    Gosman, Nathaniel

    of alternative performance incentive program models to manage DSM risk in BC. Three performance incentive program models were assessed and compared to BC Hydro's current large industrial DSM incentive program, Power Smart Partners -- Transmission Project Incentives, itself a performance incentive-based program. Together, the selected program models represent a continuum of program design and implementation in terms of the schedule and level of incentives provided, the duration and rigour of measurement and verification (M&V), energy efficiency measures targeted and involvement of the private sector. A multi-criteria assessment framework was developed to rank the capacity of each program model to manage BC large industrial DSM risk factors. DSM risk management rankings were then compared to program cost-effectiveness, targeted energy savings potential in BC and survey results from BC industrial firms on the program models. The findings indicate that the reliability of DSM energy savings in the BC large industrial sector can be maximized through performance incentive program models that: (1) offer incentives jointly for capital and low-cost operations and maintenance (O&M) measures, (2) allow flexible lead times for project development, (3) utilize rigorous M&V methods capable of measuring variable load, process-based energy savings, (4) use moderate contract lengths that align with effective measure life, and (5) integrate energy management software tools capable of providing energy performance feedback to customers to maximize the persistence of energy savings. While this study focuses exclusively on the BC large industrial sector, the findings of this research have applicability to all energy utilities serving large, energy-intensive industrial sectors.

  2. FRAMES-2.0 Software System: Providing Password Protection and Limited Access to Models and Simulations

    International Nuclear Information System (INIS)

    Whelan, Gene; Pelton, Mitch A.

    2007-01-01

    One of the most important concerns for regulatory agencies is the concept of reproducibility (i.e., reproducibility means credibility) of an assessment. One aspect of reproducibility deals with tampering with the assessment. In other words, when multiple groups are engaged in an assessment, it is important to lock down the problem that is to be solved and/or to restrict the models that are to be used to solve the problem. The objective of this effort is to provide the U.S. Nuclear Regulatory Commission (NRC) with a means to limit user access to models and to provide a mechanism to constrain the conceptual site models (CSMs) when appropriate. The purpose is to provide the user (i.e., NRC) with the ability to "lock down" the CSM (i.e., the picture containing linked icons), restrict access to certain models, or both.

  3. Onboard Robust Visual Tracking for UAVs Using a Reliable Global-Local Object Model.

    Science.gov (United States)

    Fu, Changhong; Duan, Ran; Kircali, Dogan; Kayacan, Erdal

    2016-08-31

    In this paper, we present a novel onboard robust visual algorithm for long-term arbitrary 2D and 3D object tracking using a reliable global-local object model for unmanned aerial vehicle (UAV) applications, e.g., autonomous tracking and chasing of a moving target. The first main component of this novel algorithm is a global matching and local tracking approach: the algorithm initially finds feature correspondences using an improved binary descriptor developed for global feature matching, while an iterative Lucas-Kanade optical flow algorithm is employed for local feature tracking. The second main module is an efficient local geometric filter (LGF), which handles outlier feature correspondences based on a new forward-backward pairwise dissimilarity measure, thereby maintaining pairwise geometric consistency. In the proposed LGF module, hierarchical agglomerative clustering, i.e., bottom-up aggregation, is applied using an effective single-link method. The third proposed module is a heuristic local outlier factor (to the best of our knowledge, utilized for the first time to deal with outlier features in a visual tracking application), which further maximizes the representation of the target object; here, outlier feature detection is formulated as a binary classification problem over the output features of the LGF module. Extensive UAV flight experiments show that the proposed visual tracker achieves real-time frame rates of more than thirty-five frames per second on an i7 processor with 640 × 512 image resolution and compares favorably with the most popular state-of-the-art trackers in terms of robustness, efficiency and accuracy.

  4. What are healthcare providers' understandings and experiences of compassion? The healthcare compassion model: a grounded theory study of healthcare providers in Canada.

    Science.gov (United States)

    Sinclair, Shane; Hack, Thomas F; Raffin-Bouchal, Shelley; McClement, Susan; Stajduhar, Kelli; Singh, Pavneet; Hagen, Neil A; Sinnarajah, Aynharan; Chochinov, Harvey Max

    2018-03-14

    Healthcare providers are considered the primary conduit of compassion in healthcare. Although most healthcare providers desire to provide compassion, and patients and families expect to receive it, an evidence-based understanding of the construct and its associated dimensions from the perspective of healthcare providers is needed. The aim of this study was to investigate healthcare providers' perspectives and experiences of compassion in order to generate an empirically derived, clinically informed model. Data were collected via focus groups with frontline healthcare providers and interviews with peer-nominated exemplary compassionate healthcare providers. Data were independently and collectively analysed by the research team in accordance with Straussian grounded theory. 57 healthcare providers were recruited from urban and rural palliative care services spanning hospice, home care, hospital-based consult teams, and a dedicated inpatient unit within Alberta, Canada. Five categories and 13 associated themes were identified, illustrated in the Healthcare Provider Compassion Model depicting the dimensions of compassion and their relationship to one another. Compassion was conceptualised as a virtuous and intentional response to know a person, to discern their needs and ameliorate their suffering through relational understanding and action. An empirical foundation of healthcare providers' perspectives on providing compassionate care was generated. While the dimensions of the Healthcare Provider Compassion Model were congruent with the previously developed Patient Model, further insight into compassion is now evident. The Healthcare Provider Compassion Model provides a model to guide clinical practice and research focused on developing interventions, measures and resources to improve it.

  5. Does a population survey provide reliable influenza vaccine uptake rates among high-risk groups? A case-study of the Netherlands.

    NARCIS (Netherlands)

    Kroneman, M.W.; Essen, G.A. van; Tacken, M.A.J.B.; Paget, W.J.; Verheij, R.

    2004-01-01

    All European countries have recommendations for influenza vaccination among the elderly and chronically ill. However, only a few countries are able to provide data on influenza uptake among these groups. The aim of our study is to investigate whether a population survey is an effective method of obtaining vaccination uptake rates in the different risk groups.

  6. Reliability Assessment of Solder Joints in Power Electronic Modules by Crack Damage Model for Wind Turbine Applications

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2011-01-01

    Wind turbine reliability is an important issue for wind energy cost minimization, especially through reduction of operation and maintenance costs for critical components and by increasing wind turbine availability. To develop an optimal operation and maintenance plan for critical components, it is necessary to understand the physics of their failure and to be able to develop reliability prediction models. Such a model is proposed in this paper for an IGBT power electronic module. IGBTs are critical components in wind turbine converter systems. These are multi-layered devices where layers are soldered… and propagation processes is discussed. A statistical analysis is performed for appropriate model parameter selection. Based on the proposed model, a layout for component life prediction with crack movement is described in detail.

  7. Social models provide a norm of appropriate food intake for young women.

    Directory of Open Access Journals (Sweden)

    Lenny R Vartanian

    Full Text Available It is often assumed that social models influence people's eating behavior by providing a norm of appropriate food intake, but this hypothesis has not been directly tested. In three experiments, female participants were exposed to a low-intake model, a high-intake model, or no model (control condition). Experiments 1 and 2 used a remote-confederate manipulation and were conducted in the context of a cookie taste test. Experiment 3 used a live confederate and was conducted in the context of a task during which participants were given incidental access to food. Participants also rated the extent to which their food intake was influenced by a variety of factors (e.g., hunger, taste, how much others ate). In all three experiments, participants in the low-intake conditions ate less than did participants in the high-intake conditions, and also reported a lower perceived norm of appropriate intake. Furthermore, perceived norms of appropriate intake mediated the effects of the social model on participants' food intake. Despite the observed effects of the social models, participants were much more likely to indicate that their food intake was influenced by taste and hunger than by the behavior of the social models. Thus, social models appear to influence food intake by providing a norm of appropriate eating behavior, but people may be unaware of the influence of a social model on their behavior.

  8. How to Obtain a 100% Reliable Grid with Clean, Renewable Wind, Water, and Solar Providing 100% of all Raw Energy for All Purposes

    Science.gov (United States)

    Jacobson, M. Z.; Delucchi, M. A.; Cameron, M. A.; Frew, B. A.

    2016-12-01

    The greatest concern facing the large-scale integration of wind, water, and solar (WWS) into a power grid is the high cost of avoiding load loss caused by WWS variability and uncertainty. This talk discusses the recent development of a new grid integration model to address this issue. The model finds low-cost, no-load-loss, non-unique solutions to this problem upon electrification of all U.S. energy sectors (electricity, transportation, heating/cooling, and industry) while accounting for wind and solar time-series data from a 3-D global weather model that simulates extreme events and competition among wind turbines for available kinetic energy. Solutions are obtained by prioritizing storage for heat (in soil and water); cold (in ice and water); and electricity (in phase-change materials, pumped hydro, hydropower, and hydrogen); and using demand response. No natural gas, biofuels, or stationary batteries are needed. The resulting 2050-2055 U.S. electricity social cost for a full system is much less than for fossil fuels. These results hold for many conditions, suggesting that low-cost, stable 100% WWS systems should work many places worldwide. The paper this talk is based on was published in PNAS, 112, 15,060-15,065, 2015, doi:10.1073/pnas.1510028112.

  9. A New Software Reliability Growth Model: Multigeneration Faults and a Power-Law Testing-Effort Function

    Directory of Open Access Journals (Sweden)

    Fan Li

    2016-01-01

    Full Text Available Software reliability growth models (SRGMs) based on a nonhomogeneous Poisson process (NHPP) are widely used to describe the stochastic failure behavior and assess the reliability of software systems. For these models, the testing-effort effect and the fault interdependency play significant roles. Considering a power-law function of testing effort and the interdependency of multigeneration faults, we propose a modified SRGM to reconsider the reliability of open source software (OSS) systems and then validate the model's performance using several real-world data sets. Our empirical experiments show that the model fits the failure data well and presents a high-level prediction capability. We also formally examine the optimal policy of software release, considering both the testing cost and the reliability requirement. By conducting sensitivity analysis, we find that if the testing-effort effect or the fault interdependency were ignored, the best time to release the software would be seriously delayed and more resources would be misplaced in testing it.
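
A minimal sketch of the kind of mean value function involved, using a standard testing-effort-dependent NHPP form with a power-law testing-effort function W(t) = alpha * t**beta. The multigeneration-fault terms of the proposed model are not reproduced here, and all parameter values are invented for illustration.

```python
import numpy as np

# Classic testing-effort-dependent NHPP: m(t) = a * (1 - exp(-b * W(t))),
# with power-law cumulative testing effort W(t) = alpha * t**beta.
# Parameter values are illustrative, not fitted to any data set.
a, b = 120.0, 0.05          # total expected faults, fault detection rate
alpha, beta = 4.0, 0.7      # testing-effort scale and power-law exponent

def W(t):
    return alpha * t**beta                  # cumulative testing effort at time t

def m(t):
    return a * (1.0 - np.exp(-b * W(t)))    # expected cumulative faults by time t

for t in (10, 50, 100):
    print(f"t = {t:3d}: expected faults detected = {m(t):6.1f}")
```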

  10. Monitoring the censored lognormal reliability data in a three-stage process using AFT model

    Science.gov (United States)

    Goodarzi, Azam; Amiri, Amirhossein; Asadzadeh, Shervin

    2017-09-01

    Improving product reliability is the main concern in both manufacturing and service processes, and it is achieved by monitoring reliability-related quality characteristics. Nowadays, products or services are the result of processes with dependent stages, referred to as multistage processes. In these processes, the quality characteristic in each stage is affected by the quality characteristic in the previous stages, which is known as the cascade property. Two regression-adjusted control schemes are applied to monitor the output quality variables of interest. Moreover, censoring is among the main limitations in monitoring reliability-related quality characteristics, meaning that the real values of some observations are not recorded. Hence, right-censored observations are used to extend the monitoring schemes under both fixed and variable competing risks. In this paper, the accelerated failure time (AFT) model is used to relate the reliability-related quality characteristic, which follows a lognormal distribution, to the incoming variables. Then, two cause-selecting control charts are developed to monitor the outgoing quality variables when censoring happens in each reliability-related stage. The performance of the control charts is evaluated and compared through extensive simulation studies under censored and non-censored scenarios.

  11. Value-added strategy models to provide quality services in senior health business.

    Science.gov (United States)

    Yang, Ya-Ting; Lin, Neng-Pai; Su, Shyi; Chen, Ya-Mei; Chang, Yao-Mao; Handa, Yujiro; Khan, Hafsah Arshed Ali; Elsa Hsu, Yi-Hsin

    2017-06-20

    The rapid population aging is now a global issue. The increase in the elderly population will impact the health care industry and health enterprises; various senior needs will promote the growth of the senior health industry. Most senior health studies are focused on the demand side and scarcely on supply. Our study selected quality enterprises focused on aging health and analyzed the different strategies they use to provide excellent quality services. We selected 33 quality senior health enterprises in Taiwan and investigated their quality service strategies by face-to-face semi-structured in-depth interviews with the CEO and managers of each enterprise in 2013. Setting: a total of 33 senior health enterprises in Taiwan. Participants: overall, 65 CEOs and managers of the 33 enterprises were interviewed individually. Interventions: none. Main outcome measures: core values and vision, organization structure, quality services provided, and strategies for quality services. This study's results indicated four types of value-added strategy models adopted by senior enterprises to offer quality services: (i) residential care and co-residence model, (ii) home care and living in place model, (iii) community e-business experience model and (iv) virtual and physical portable device model. The common feature of these four strategy models is that the services provided are elderly centered. These models offer virtual and physical integration, and also offer total solutions for the elderly and their caregivers. Through investigation of successful strategy models for providing quality services to seniors, we identified opportunities to develop innovative service models and their successful characteristics, and policy implications were summarized. The observations from this study will serve as a primary evidence base for enterprises developing their senior market, and also for promoting value co-creation through dialogue between customers and those that deliver services.

  12. Principles of performance and reliability modeling and evaluation: essays in honor of Kishor Trivedi on his 70th birthday

    CERN Document Server

    Puliafito, Antonio

    2016-01-01

    This book presents the latest key research into the performance and reliability aspects of dependable fault-tolerant systems and features commentary on the fields studied by Prof. Kishor S. Trivedi during his distinguished career. Analyzing system evaluation as a fundamental tenet in the design of modern systems, this book uses performance and dependability as common measures and covers novel ideas, methods, algorithms, techniques, and tools for the in-depth study of the performance and reliability aspects of dependable fault-tolerant systems. It identifies the current challenges that designers and practitioners must face in order to ensure the reliability, availability, and performance of systems, with special focus on their dynamic behaviors and dependencies, and provides system researchers, performance analysts, and practitioners with the tools to address these challenges in their work. With contributions from Prof. Trivedi's former PhD students and collaborators, many of whom are internationally recognize...

  13. Validity and reliability of an application review process using dedicated reviewers in one stage of a multi-stage admissions model.

    Science.gov (United States)

    Zeeman, Jacqueline M; McLaughlin, Jacqueline E; Cox, Wendy C

    2017-11-01

    With increased emphasis placed on non-academic skills in the workplace, a need exists to identify an admissions process that evaluates these skills. This study assessed the validity and reliability of an application review process involving three dedicated application reviewers in a multi-stage admissions model. A multi-stage admissions model was utilized during the 2014-2015 admissions cycle. After advancing through the academic review, each application was independently reviewed by two dedicated application reviewers utilizing a six-construct rubric (written communication, extracurricular and community service activities, leadership experience, pharmacy career appreciation, research experience, and resiliency). Rubric scores were extrapolated to a three-tier ranking to select candidates for on-site interviews. Kappa statistics were used to assess interrater reliability. A three-facet Many-Facet Rasch Model (MFRM) determined reviewer severity, candidate suitability, and rubric construct difficulty. The kappa statistic for candidates' tier rank score (n = 388 candidates) was 0.692 with a perfect agreement frequency of 84.3%. There was substantial interrater reliability between reviewers for the tier ranking (kappa: 0.654-0.710). Highest construct agreement occurred in written communication (kappa: 0.924-0.984). A three-facet MFRM analysis explained 36.9% of variance in the ratings, with 0.06% reflecting application reviewer scoring patterns (i.e., severity or leniency), 22.8% reflecting candidate suitability, and 14.1% reflecting construct difficulty. Utilization of dedicated application reviewers and a defined tiered rubric provided a valid and reliable method to effectively evaluate candidates during the application review process. These analyses provide insight into opportunities for improving the application review process among schools and colleges of pharmacy. Copyright © 2017 Elsevier Inc. All rights reserved.
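
For reference, the kappa statistic used here measures chance-corrected agreement between raters. A minimal computation for two reviewers assigning candidates to three tiers looks like the sketch below; the ratings are invented for illustration, not the study's data.

```python
# Cohen's kappa for two raters on a three-tier scale (illustrative data).
from collections import Counter

rater1 = [1, 1, 2, 3, 2, 1, 3, 2, 2, 1, 3, 3]
rater2 = [1, 1, 2, 3, 2, 2, 3, 2, 1, 1, 3, 3]

n = len(rater1)
# Observed agreement: fraction of candidates given the same tier.
p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
# Expected agreement under chance, from each rater's marginal tier frequencies.
c1, c2 = Counter(rater1), Counter(rater2)
p_exp = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
kappa = (p_obs - p_exp) / (1.0 - p_exp)
print(f"observed = {p_obs:.3f}, expected = {p_exp:.3f}, kappa = {kappa:.3f}")
```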

  14. Measuring the Quality of Services Provided for Outpatients in Kowsar Clinic in Ardebil City Based on the SERVQUAL Model

    Directory of Open Access Journals (Sweden)

    Hasan Ghobadi

    2014-12-01

    Full Text Available Background & objectives: Today, the concept of quality of services is particularly important in health care, and customer satisfaction can be defined by comparing expectations of the services with perceptions of the services provided. The aim of this study was to evaluate the quality of services provided for outpatients in the Kowsar clinic of Ardebil city based on the SERVQUAL model. Methods: This descriptive study was conducted on 650 patients referred to the outpatient clinic from July to September 2013, using the standardized SERVQUAL questionnaire (1988) with confirmed reliability and validity. The paired t-test and Friedman test were used for data analysis with SPSS software. Results: 56.1% of respondents were male and 43.9% were female. The mean age of patients was 33 ± 11.91 years; 68.9% of patients were from Ardabil, and 27.3% held a bachelor's degree or higher. The results showed a significant difference between the perceptions and expectations of the patients for all five dimensions of service quality (tangibility, reliability, assurance, responsiveness, and empathy) in the studied clinic (p < 0.001). The largest mean gap was in empathy and the smallest in assurance. Conclusion: Given the observed differences in quality, managers and planners have to evaluate their performance more accurately in order to plan future actions better. In fact, any effort to reduce the gap between the expectations and perceptions of patients results in greater satisfaction, loyalty and further visits to the organization.
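
The SERVQUAL gap computation itself is simple: for each dimension, subtract paired expectation scores from perception scores and test the difference, here with the same paired t-test the abstract mentions. The Likert scores below are invented for illustration.

```python
import numpy as np
from scipy import stats

# SERVQUAL gap analysis for one dimension. Each position pairs a patient's
# expectation and perception score (1-5 Likert); values are illustrative.
expectations = np.array([5, 5, 4, 5, 4, 5, 5, 4, 5, 5], dtype=float)
perceptions  = np.array([3, 4, 3, 4, 3, 4, 3, 3, 4, 3], dtype=float)

gap = perceptions - expectations          # negative gap = unmet expectation
t_stat, p_val = stats.ttest_rel(perceptions, expectations)
print(f"mean gap = {gap.mean():+.2f}, paired t = {t_stat:.2f}, p = {p_val:.4f}")
```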

  15. Social network analysis via multi-state reliability and conditional influence models

    International Nuclear Information System (INIS)

    Schneider, Kellie; Rainwater, Chase; Pohl, Ed; Hernandez, Ivan; Ramirez-Marquez, Jose Emmanuel

    2013-01-01

    This paper incorporates multi-state reliability measures into the assessment of a social network in which influence is treated as a multi-state commodity that flows through the network. The reliability of the network is defined as the probability that at least a certain level of influence reaches an intended target. We consider an individual's influence level as a function of the influence levels received from preceding actors in the network. We define several communication functions which describe the level of influence a particular actor will pass along to other actors within the network. Illustrative examples are presented, and the network reliability under the various communication influence levels is computed using exhaustive enumeration for a small example and Monte Carlo simulation for larger, more realistic sized examples.
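
A Monte Carlo sketch of the idea: influence flows as a commodity through a small directed network, and reliability is the probability that at least a threshold level reaches the target. The four-actor topology, transmission probability, and attenuation rule below are assumptions for illustration, not the paper's communication functions.

```python
import random

# Estimate the probability that at least `threshold` influence reaches the
# target (actor 3) from the source (actor 0). Network and rule are illustrative.
edges = {0: [1, 2], 1: [3], 2: [3]}      # source 0 -> {1, 2} -> target 3
p_transmit = 0.8                          # chance a tie carries influence
attenuation = 0.5                         # fraction of influence passed on

def simulate(source_influence=2.0):
    influence = {0: source_influence, 1: 0.0, 2: 0.0, 3: 0.0}
    for node in (0, 1, 2):                # process actors in topological order
        for nbr in edges.get(node, []):
            if random.random() < p_transmit:
                influence[nbr] += attenuation * influence[node]
    return influence[3]

n, threshold = 100_000, 1.0
hits = sum(simulate() >= threshold for _ in range(n))
print(f"Estimated influence reliability: {hits / n:.3f}")
```

For this toy network the target only reaches the threshold when both two-hop paths transmit fully, so the estimate converges to 0.8**4 ≈ 0.41; larger networks are where the Monte Carlo approach pays off, exactly as the abstract notes for its realistic-sized examples.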

  16. Towards a generic, reliable CFD modelling methodology for waste-fired grate boilers

    DEFF Research Database (Denmark)

    Rajh, Boštjan; Yin, Chungen; Samec, Niko

    Computational Fluid Dynamics (CFD) is increasingly used in industry for detailed understanding of the combustion process and for appropriate design and optimization of Waste-to-Energy (WtE) plants. In this paper, CFD modelling of waste wood combustion in a 13 MW grate-fired boiler in a WtE plant is presented. To reduce the risk of slagging, optimize the temperature control and enhance turbulent mixing, part of the flue gas is recycled into the grate boiler. In the simulation, a 1D in-house bed model is developed to simulate the conversion of the waste wood in the fuel bed on the grate, which provides… of the increased CO2 and H2O vapour concentrations on radiative heat transfer in the boiler. The impacts of full buoyancy on turbulence are also investigated. As a validation effort, the temperature profiles at different ports inside the furnace are measured and the experimental values are compared with the CFD…

  17. Reliable before-fabrication forecasting of normal and touch mode MEMS capacitive pressure sensor: modeling and simulation

    Science.gov (United States)

    Jindal, Sumit Kumar; Mahajan, Ankush; Raghuwanshi, Sanjeev Kumar

    2017-10-01

    An analytical model and numerical simulation of the performance of MEMS capacitive pressure sensors in both normal and touch modes are required to predict the expected behavior of the sensor prior to fabrication. Obtaining such information should be based on a complete analysis of performance parameters such as the deflection of the diaphragm, the change of capacitance when the diaphragm deflects, and the sensitivity of the sensor. In the literature, limited work has been carried out on the above-stated issue; moreover, due to the approximation factors of the polynomials used, a tolerance error cannot be overlooked. Reliable before-fabrication forecasting requires exact mathematical calculation of the parameters involved. A second-order polynomial equation is calculated mathematically for the key performance parameters of both modes. This eliminates the approximation factor, and an exact result can be studied while maintaining high accuracy. The elimination of approximation factors and the approach to exact results are based on a new design parameter (δ) that we propose. The design parameter gives an initial hint to designers on how the sensor will behave once it is fabricated. The complete work is supported by extensive mathematical detailing of all the parameters involved. Finally, we verified our claims using MATLAB® simulation. Since MATLAB® adequately supports the simulation of the proposed design approach, the more complicated finite element method is not used.
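
For intuition, a normal-mode capacitance calculation for a circular diaphragm can be sketched by integrating the parallel-plate formula over an assumed small-deflection profile w(r) = w0 * (1 - (r/a)**2)**2. The geometry and values below are illustrative and do not reproduce the paper's second-order polynomial model or its design parameter (δ).

```python
import numpy as np

# Capacitance of a deflected circular diaphragm in normal mode.
# All geometry values are assumptions for the sketch.
eps0 = 8.854e-12        # vacuum permittivity (F/m)
a = 500e-6              # diaphragm radius (m)
d = 2e-6                # capacitor gap at rest (m)
w0 = 0.8e-6             # centre deflection (m); must stay below d in normal mode

r = np.linspace(0.0, a, 2000)
w = w0 * (1.0 - (r / a) ** 2) ** 2          # clamped-plate deflection profile
# C = integral of eps0 / (d - w(r)) over the plate area, in polar coordinates.
C = np.trapz(2.0 * np.pi * r * eps0 / (d - w), r)
C0 = eps0 * np.pi * a**2 / d                 # undeflected reference capacitance
print(f"C0 = {C0 * 1e12:.2f} pF, deflected C = {C * 1e12:.2f} pF")
```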

  18. Does a population survey provide reliable influenza vaccine uptake rates among high-risk groups? A case-study of The Netherlands.

    Science.gov (United States)

    Kroneman, Madelon W; van Essen, Gerrit A; Tacken, Margot A J B; Paget, W John; Verheij, Robert

    2004-06-02

    All European countries have recommendations for influenza vaccination among the elderly and chronically ill. However, only a few countries are able to provide data on influenza uptake among these groups. The aim of our study is to investigate whether a population survey is an effective method of obtaining vaccination uptake rates in the different risk groups and to find out what reasons people give as to why they have accepted or refused influenza vaccination and whether this varies among the risk groups. A mail questionnaire was sent out to households in The Netherlands, the response rate was 73%. This resulted in data for 4037 individuals on influenza and influenza vaccination during the 2001-2002 influenza season. The uptake rates and size of different risk groups from the panel survey were comparable with other national representative sources (from the National Information Network of GPs (LINH) and Statistics Netherlands (CBS)). The main reason cited for undergoing vaccination was the existence of a chronic condition. The main reasons for refraining from vaccination were having enough resistance to flu and ignorance about the recommendations. In The Netherlands, the GP is the main administrator of influenza vaccines. We believe that population surveys may be useful for revealing influenza vaccination uptake rates for the groups at risk. When combined with questions about reasons for undergoing vaccination, the results may provide useful policy information and can be used to direct vaccination campaigns at under-vaccinated risk groups or to target the information campaign more effectively.

  19. A Complex of Business Process Management Models for a Service-Providing IT Company

    OpenAIRE

    Yatsenko Roman M.; Balykov Oleksii H.

    2017-01-01

    The article presents an analysis of a complex of business process management models that are designed to improve the performance of service-providing IT companies. This class of enterprises was selected because of their significant contribution to the Ukrainian economy: third place in the structure of exports, significant budget revenues, high development dynamics, and prospects in the global marketplace. The selected complex of models is designed as a sequence of stages that must be accompli...

  20. A reliability centered maintenance model applied to the auxiliary feedwater system of a nuclear power plant

    International Nuclear Information System (INIS)

    Araujo, Jefferson Borges

    1998-01-01

    The main objective of maintenance in a nuclear power plant is to assure that structures, systems and components will perform their design functions with reliability and availability in order to achieve safe and economical electric power generation. Reliability Centered Maintenance (RCM) is a method of systematic review to develop or optimize Preventive Maintenance Programs. This study presents the objectives, concepts, organization and methods used in the development of an RCM application to nuclear power plants. Some examples of this application are included, considering the Auxiliary Feedwater System of a generic two-loop PWR nuclear power plant of Westinghouse design. (author)

  1. Application of the Multidimensional Scaling Model in Mapping the Brand Positioning of Internet Service Providers

    Directory of Open Access Journals (Sweden)

    Robertus Tang Herman

    2010-03-01

    Full Text Available In this high-tech era, there have been tremendous advances in technology-based products and services. The Internet is one of them, and it has opened the world's eyes to a new borderless marketplace. High competition among internet service providers has pushed companies to create competitive advantages and brilliant marketing strategies. They undertake positioning mapping to describe a product or service's position among its many competitors, and the right positioning strategy becomes a powerful weapon for winning the battle. This research is designed to create a positioning map based on perceptual mapping. The researcher uses multidimensional scaling and image mapping to achieve this research goal, with non-probability sampling in Jakarta. Based on the non-attribute approach, the research findings show a similarity between two of the brands; thus, those brands compete directly against one another. On the other hand, the CBN and Netzap providers show some differences from the others, and some brands require improvements in terms of network reliability.
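
A sketch of the non-attribute MDS step: given a brand-by-brand dissimilarity matrix, a two-dimensional perceptual map can be produced with scikit-learn. The matrix and brand names below are invented for illustration, not the study's survey data.

```python
import numpy as np
from sklearn.manifold import MDS

# Illustrative pairwise dissimilarities between four hypothetical ISP brands.
brands = ["ISP-A", "ISP-B", "ISP-C", "ISP-D"]
dissim = np.array([
    [0.0, 1.2, 3.1, 2.8],
    [1.2, 0.0, 2.9, 3.0],
    [3.1, 2.9, 0.0, 1.0],
    [2.8, 3.0, 1.0, 0.0],
])

# Metric MDS on the precomputed dissimilarity matrix -> 2-D map coordinates.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)
for name, (x, y) in zip(brands, coords):
    print(f"{name}: ({x:+.2f}, {y:+.2f})")
```

Brands plotted close together on the resulting map are perceived as similar, which is how the competing pairs in the abstract would show up.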

  2. Material and structural mechanical modelling and reliability of thin-walled bellows at cryogenic temperatures. Application to LHC compensation system

    CERN Document Server

    Garion, Cédric; Skoczen, Blazej

    The present thesis is dedicated to the behaviour of austenitic stainless steels at cryogenic temperatures. The plastic strain induced martensitic transformation and ductile damage are taken into account in an elastic-plastic material model. The kinetic law of the γ→α′ transformation and the evolution laws of mixed kinematic/isotropic hardening are established. The damage issue is analysed in different ways: via a mesoscopic isotropic or orthotropic model and via a microscopic approach. The material parameters are measured from 316L fine gauge sheet at three levels of temperature: 293 K, 77 K and 4.2 K. The model is applied to the thin-walled corrugated shells used in the LHC interconnections. The influence of the material properties on stability is studied by a modal analysis. The reliability of the components, defined by the Weibull distribution law, is analysed from fatigue tests. The impact on reliability of geometrical imperfections and thermo-mechanical loads is also analysed.

  3. [Interview instrument provides no reliable assessment of suicide risk. Scientific support is lacking according to report from the Swedish Council on Health Technology Assessment (SBU)].

    Science.gov (United States)

    Runeson, Bo

    2015-12-08

    Identifying individuals at risk of future suicide or suicide attempts is of clinical importance. Instruments have been developed to facilitate the assessment of the risk of future suicidal acts. A systematic review was conducted using the standard methods of the Swedish Council on Health Technology Assessment (SBU). The ability of each instrument to predict the risk of future suicide or suicide attempt was assessed at follow-up. The methodological quality of eligible studies was assessed; studies with moderate or low risk of bias were analysed in accordance with GRADE. None of the included studies provided scientific evidence to support that any instrument has sufficient accuracy to predict future suicidal behaviour. There is strong evidence that the SAD PERSONS Scale has very low sensitivity; most persons who go on to commit suicidal acts are not identified.

  4. Cluster-based upper body marker models for three-dimensional kinematic analysis: Comparison with an anatomical model and reliability analysis.

    Science.gov (United States)

    Boser, Quinn A; Valevicius, Aïda M; Lavoie, Ewen B; Chapman, Craig S; Pilarski, Patrick M; Hebert, Jacqueline S; Vette, Albert H

    2018-02-27

    Quantifying angular joint kinematics of the upper body is a useful method for assessing upper limb function. Joint angles are commonly obtained via motion capture, tracking markers placed on anatomical landmarks. This method is associated with limitations including administrative burden, soft tissue artifacts, and intra- and inter-tester variability. An alternative method involves the tracking of rigid marker clusters affixed to body segments, calibrated relative to anatomical landmarks or known joint angles. The accuracy and reliability of applying this cluster method to the upper body has, however, not been comprehensively explored. Our objective was to compare three different upper body cluster models with an anatomical model, with respect to joint angles and reliability. Non-disabled participants performed two standardized functional upper limb tasks with anatomical and cluster markers applied concurrently. Joint angle curves obtained via the marker clusters with three different calibration methods were compared to those from an anatomical model, and between-session reliability was assessed for all models. The cluster models produced joint angle curves which were comparable to and highly correlated with those from the anatomical model, but exhibited notable offsets and differences in sensitivity for some degrees of freedom. Between-session reliability was comparable between all models, and good for most degrees of freedom. Overall, the cluster models produced reliable joint angles that, however, cannot be used interchangeably with anatomical model outputs to calculate kinematic metrics. Cluster models appear to be an adequate, and possibly advantageous alternative to anatomical models when the objective is to assess trends in movement behavior. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Model of load balancing using reliable algorithm with multi-agent system

    Science.gov (United States)

    Afriansyah, M. F.; Somantri, M.; Riyadi, M. A.

    2017-04-01

    Massive technology development scales with the growth of internet users, which increases network traffic activity and, with it, the load on the system. The use of a reliable algorithm and mobile agents in distributed load balancing is a viable solution to handle the load issue in a large-scale system. A mobile agent works to collect resource information and can migrate according to a given task. We propose a reliable load-balancing algorithm using least time first byte (LFB) combined with information from the mobile agent. In the system overview, the methodology consisted of defining the identification system, the specification requirements, the network topology and the design of the system infrastructure. The simulation sent 1800 requests in 10 s from users to the server and collected the data for analysis. The software simulation was based on Apache JMeter, observing the response time and reliability of each server and then comparing them with the existing method. Results of the performed simulation show that the LFB method with mobile agents can perform load balancing efficiently across all backend servers without bottlenecks, with a low risk of server overload, and reliably.
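
A minimal sketch of a least-time-first-byte style selection rule: each request is routed to the backend with the lowest recent average first-byte latency. The Backend class, latencies, and update rule here are invented for illustration; the paper's LFB and mobile-agent protocol details are not specified.

```python
import random
import statistics

class Backend:
    """A backend server tracking its recent time-to-first-byte samples."""
    def __init__(self, name, base_latency):
        self.name = name
        self.base_latency = base_latency
        self.samples = [base_latency]

    def ttfb(self):
        # Recent mean time-to-first-byte over a sliding window.
        return statistics.mean(self.samples[-20:])

    def handle(self):
        # Simulate serving a request and record the observed latency.
        self.samples.append(self.base_latency * random.uniform(0.8, 1.5))

backends = [Backend("srv1", 0.05), Backend("srv2", 0.12), Backend("srv3", 0.08)]
counts = {b.name: 0 for b in backends}
for _ in range(1800):                      # mirror the 1800-request simulation
    target = min(backends, key=lambda b: b.ttfb())
    target.handle()
    counts[target.name] += 1
print(counts)                              # faster servers absorb more traffic
```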

  6. Assessment of physician competency in patient education: reliability and validity of a model-based instrument

    NARCIS (Netherlands)

    Wouda, Jan C.; Zandbelt, Linda C.; Smets, Ellen M. A.; van de Wiel, Harry B. M.

    2011-01-01

    Establish the inter-rater reliability and the concept, convergent and construct validity of an instrument for assessing the competency of physicians in patient education. Three raters assessed the quality of patient education in 30 outpatient consultations with the CELI instrument. This instrument

  7. Assessment of physician competency in patient education : Reliability and validity of a model-based instrument

    NARCIS (Netherlands)

    Wouda, Jan C.; Zandbelt, Linda C.; Smets, Ellen M. A.; van de Wiel, Harry B. M.

    2011-01-01

    Objective: Establish the inter-rater reliability and the concept, convergent and construct validity of an instrument for assessing the competency of physicians in patient education. Methods: Three raters assessed the quality of patient education in 30 outpatient consultations with the CELI

  8. ETARA PC version 3.3 user's guide: Reliability, availability, maintainability simulation model

    Science.gov (United States)

    Hoffman, David J.; Viterna, Larry A.

    1991-01-01

    A user's manual describing an interactive, menu-driven, personal computer based Monte Carlo reliability, availability, and maintainability simulation program called event time availability reliability (ETARA) is discussed. Given a reliability block diagram representation of a system, ETARA simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair intervals as a function of exponential and/or Weibull distributions. Availability parameters such as equivalent availability, state availability (percentage of time as a particular output state capability), continuous state duration and number of state occurrences can be calculated. Initial spares allotment and spares replenishment on a resupply cycle can be simulated. The number of block failures are tabulated both individually and by block type, as well as total downtime, repair time, and time waiting for spares. Also, maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can be calculated over a cumulative period of time or at specific points in time.
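
In the spirit of ETARA (though not its actual implementation), a single reliability block with Weibull failure times and a fixed mean repair time can be simulated by Monte Carlo sampling; the mission length and all distribution parameters below are assumptions.

```python
import random

# Monte Carlo availability of one block over a mission: alternate sampled
# Weibull failure intervals and exponential repair intervals.
def simulate_availability(mission=8760.0, shape=1.5, scale=5000.0, mttr=24.0):
    t, up_time = 0.0, 0.0
    while t < mission:
        ttf = random.weibullvariate(scale, shape)    # time to next failure
        up = min(ttf, mission - t)                   # clip at end of mission
        up_time += up
        t += up
        if t < mission:
            t += random.expovariate(1.0 / mttr)      # repair interval
    return up_time / mission

runs = 5000
est = sum(simulate_availability() for _ in range(runs)) / runs
print(f"Estimated availability: {est:.4f}")
```

Extending the sketch toward ETARA's scope would mean wiring such blocks into a reliability block diagram and adding spares accounting, which is what the tabulated outputs in the abstract describe.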

  9. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its usage in the community.

  10. On the reliability of spacecraft swarms

    NARCIS (Netherlands)

    Engelen, S.; Gill, E.K.A.; Verhoeven, C.J.M.

    2012-01-01

    Satellite swarms, consisting of a large number of identical, miniaturized and simple satellites, are claimed to provide an implementation for specific space missions which require high reliability. However, a consistent model of how reliability and availability on mission level is linked to cost-

  11. Using Models to Provide Predicted Ranges for Building-Human Interfaces: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Scheib, J.; Pless, S.; Schott, M.

    2013-09-01

    Most building energy consumption dashboards provide only a snapshot of building performance, whereas some provide more detailed historic data with which to compare current usage. This paper will discuss the Building Agent(tm) platform, which has been developed and deployed in a campus setting at the National Renewable Energy Laboratory as part of an effort to maintain the aggressive energy performance achieved in newly constructed office buildings and laboratories. The Building Agent(tm) provides aggregated and coherent access to building data, including electric energy, thermal energy, temperatures, humidity, lighting levels, and occupant feedback, which are displayed in various manners for visitors, building occupants, facility managers, and researchers. This paper focuses on the development of visualizations for facility managers, or an energy performance assurance role, where metered data are used to generate models that provide live predicted ranges of building performance by end use. These predicted ranges provide simple, visual context for displayed performance data without requiring users to also assess historical information or trends. Several energy modelling techniques were explored, including static lookup-based performance targets, reduced-order models derived from historical data using main-effect variables such as solar radiance for lighting performance, and integrated energy models using a whole-building energy simulation program.

  12. Reliability Testing Strategy - Reliability in Software Engineering

    OpenAIRE

    Taylor-Sakyi, Kevin

    2016-01-01

    This paper presents the core principles of reliability in software engineering - outlining why reliability testing is critical and specifying the process of measuring reliability. The paper provides insight for both novice and experts in the software engineering field for assessing failure intensity as well as predicting failure of software systems. Measurements are conducted by utilizing information from an operational profile to further enhance a test plan and test cases, all of which this ...

  13. Comparing consumer-directed and agency models for providing supportive services at home.

    Science.gov (United States)

    Benjamin, A E; Matthias, R; Franke, T M

    2000-04-01

    To examine the service experiences and outcomes of low-income Medicaid beneficiaries with disabilities under two different models for organizing home-based personal assistance services: agency-directed and consumer-directed. A survey of a random sample of 1,095 clients, age 18 and over, who receive services in California's In-Home Supportive Services (IHSS) program funded primarily by Medicaid. Other data were obtained from the California Management and Payrolling System (CMIPS). The sample was stratified by service model (agency-directed or consumer-directed), client age (over or under age 65), and severity. Data were collected on client demographics, condition/functional status, and supportive service experience. Outcome measures were developed in three areas: safety, unmet need, and service satisfaction. Factor analysis was used to reduce multiple outcome measures to nine dimensions. Multiple regression analysis was used to assess the effect of service model on each outcome dimension, taking into account the client-provider relationship, client demographics, and case mix. Recipients of IHSS services as of mid-1996 were interviewed by telephone. The survey was conducted in late 1996 and early 1997. On various outcomes, recipients in the consumer-directed model report more positive outcomes than those in the agency model, or they report no difference. Statistically significant differences emerge on recipient safety, unmet needs, and service satisfaction. A family member present as a paid provider is also associated with more positive reported outcomes within the consumer-directed model, but model differences persist even when this is taken into account. Although both models have strengths and weaknesses, from a recipient perspective the consumer-directed model is associated with more positive outcomes. Although health professionals have expressed concerns about the capacity of consumer direction to assure quality, particularly with respect to safety, meeting unmet

  14. MODEL OF PROVIDING WITH DEVELOPMENT STRATEGY FOR INFORMATION TECHNOLOGIES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    A. A. Kuzkin

    2015-03-01

    Full Text Available Subject of research: The paper presents research and assessment tools for evaluating how well an organization is provided with a development strategy for its information technologies. Method: A corresponding assessment model is developed which takes into consideration the equilibrium of IT processes according to selected efficiency factors of information technology application. Basic results: The model's peculiarity resides in applying neuro-fuzzy approximators, in which conclusions are drawn upon fuzzy logic and membership functions are adjusted through the use of neural networks. For adequacy testing of the suggested model, a due-diligence result analysis was carried out for the IT strategy executed in the "Navigator" group of companies at the stage of implementation and support of new technologies and production methods. Data visualization with a circle diagram is applied for comparative evaluation of the analysis results. The chosen model's adequacy is proved by the agreement between the predictive assessments of IT-strategy performance targets, derived by means of the fuzzy cognitive model over a 12-month planning horizon, and the real values of these targets upon expiry of the given planning term. Practical significance: Application of the developed model makes it possible to assess the sustainability of the process of providing the required IT-strategy realization level based upon fuzzy cognitive map analysis, and to reveal changing tendencies in an organization's IT objectives over the stated planning interval.

  15. Assimilation of Remotely Sensed Soil Moisture Profiles into a Crop Modeling Framework for Reliable Yield Estimations

    Science.gov (United States)

    Mishra, V.; Cruise, J.; Mecikalski, J. R.

    2017-12-01

    Much effort has been expended recently on the assimilation of remotely sensed soil moisture into operational land surface models (LSM). These efforts have normally been focused on the use of data derived from the microwave bands and results have often shown that improvements to model simulations have been limited due to the fact that microwave signals only penetrate the top 2-5 cm of the soil surface. It is possible that model simulations could be further improved through the introduction of geostationary satellite thermal infrared (TIR) based root zone soil moisture in addition to the microwave deduced surface estimates. In this study, root zone soil moisture estimates from the TIR based Atmospheric Land Exchange Inverse (ALEXI) model were merged with NASA Soil Moisture Active Passive (SMAP) based surface estimates through the application of informational entropy. Entropy can be used to characterize the movement of moisture within the vadose zone and accounts for both advection and diffusion processes. The Principle of Maximum Entropy (POME) can be used to derive complete soil moisture profiles and, fortuitously, only requires a surface boundary condition as well as the overall mean moisture content of the soil column. A lower boundary can be considered a soil parameter or obtained from the LSM itself. In this study, SMAP provided the surface boundary while ALEXI supplied the mean and the entropy integral was used to tie the two together and produce the vertical profile. However, prior to the merging, the coarse resolution (9 km) SMAP data were downscaled to the finer resolution (4.7 km) ALEXI grid. The disaggregation scheme followed the Soil Evaporative Efficiency approach and again, all necessary inputs were available from the TIR model. The profiles were then assimilated into a standard agricultural crop model (Decision Support System for Agrotechnology, DSSAT) via the ensemble Kalman Filter. The study was conducted over the Southeastern United States for the
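
A minimal ensemble Kalman filter analysis step of the kind used for the assimilation, shown for a toy four-layer soil-moisture state with an observation of the surface layer only. The dimensions, error variances, and perturbed-observation formulation are assumptions for the sketch, not the study's DSSAT/SMAP/ALEXI configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_state = 40, 4                           # ensemble size, soil layers

X = rng.normal(0.25, 0.05, (n_state, n_ens))     # prior ensemble (vol. soil moisture)
H = np.array([[1.0, 0.0, 0.0, 0.0]])             # observe the surface layer only
R = np.array([[0.02**2]])                        # observation error variance
y = np.array([0.30])                             # assimilated observation

# Sample covariance of the ensemble and the resulting Kalman gain.
A = X - X.mean(axis=1, keepdims=True)
P = A @ A.T / (n_ens - 1)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

# Perturbed-observation update applied to each ensemble member.
y_pert = y[:, None] + rng.normal(0.0, 0.02, (1, n_ens))
X_a = X + K @ (y_pert - H @ X)
print("Analysis mean profile:", X_a.mean(axis=1).round(3))
```

Because the ensemble covariance couples the layers, the single surface observation also nudges the deeper layers, which is the mechanism that lets a surface product update a root-zone profile.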

  16. Hyperspectral Imaging Provides Early Prediction of Random Axial Flap Necrosis in a Preclinical Model.

    Science.gov (United States)

    Chin, Michael S; Chappell, Ava G; Giatsidis, Giorgio; Perry, Dylan J; Lujan-Hernandez, Jorge; Haddad, Anthony; Matsumine, Hajime; Orgill, Dennis P; Lalikos, Janice F

    2017-06-01

    Necrosis remains a significant complication in cutaneous flap procedures. Monitoring, and ideally prediction, of vascular compromise in the early postoperative period may allow surgeons to limit the impact of complications by prompt intervention. Hyperspectral imaging could be a reliable, effective, and noninvasive method for predicting flap survival postoperatively. In this preclinical study, the authors demonstrate that hyperspectral imaging is able to correlate early skin perfusion changes and ultimate flap survival in a preclinical model. Thirty-one hairless, immunocompetent, adult male mice were used. Random pattern dorsal skin flaps were elevated and sutured back into place with a silicone barrier. Hyperspectral imaging and digital images were obtained 30 minutes, 24 hours, or 72 hours after flap elevation and before sacrifice on postoperative day 7. Areas of high deoxygenated hemoglobin change (124; 95 percent CI, 118 to 129) seen at 30 minutes after surgery were associated with greater than 50 percent flap necrosis at postoperative day 7. Areas demarcated by high deoxygenated hemoglobin at 30 minutes postoperatively had a statistically significant correlation with areas of macroscopic necrosis on postoperative day 7. Analysis of images obtained at 24 and 72 hours did not show similar changes. These findings suggest that early changes in deoxygenated hemoglobin seen with hyperspectral imaging may predict the region and extent of flap necrosis. Further clinical studies are needed to determine whether hyperspectral imaging is applicable to the clinical setting.

  17. Development of a morphology-based modeling technique for tracking solid-body displacements: examining the reliability of a potential MRI-only approach for joint kinematics assessment.

    Science.gov (United States)

    Mahato, Niladri K; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian

    2016-05-18

    Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with T1 and fast contrast-enhanced pulse sequences. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences. Specifically, the average CVs for translation were 4

  18. Development of a morphology-based modeling technique for tracking solid-body displacements: examining the reliability of a potential MRI-only approach for joint kinematics assessment

    International Nuclear Information System (INIS)

    Mahato, Niladri K.; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian

    2016-01-01

    Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three-dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with T1 and fast contrast-enhanced pulse sequences. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences. Specifically, the average CVs for translation were 4

  19. Simulation model for transcervical laryngeal injection providing real-time feedback.

    Science.gov (United States)

    Ainsworth, Tiffiny A; Kobler, James B; Loan, Gregory J; Burns, James A

    2014-12-01

    This study aimed to develop and evaluate a model for teaching transcervical laryngeal injections. A 3-dimensional printer was used to create a laryngotracheal framework based on de-identified computed tomography images of a human larynx. The arytenoid cartilages and intrinsic laryngeal musculature were created in silicone from clay casts and thermoplastic molds. The thyroarytenoid (TA) muscle was created with electrically conductive silicone using metallic filaments embedded in silicone. Wires connected TA muscles to an electrical circuit incorporating a cell phone and speaker. A needle electrode completed the circuit when inserted in the TA during simulated injection, providing real-time feedback of successful needle placement by producing an audible sound. Face validation by the senior author confirmed appropriate tactile feedback and anatomical realism. Otolaryngologists pilot tested the model and completed presimulation and postsimulation questionnaires. The high-fidelity simulation model provided tactile and audio feedback during needle placement, simulating transcervical vocal fold injections. Otolaryngology residents demonstrated higher comfort levels with transcervical thyroarytenoid injection on postsimulation questionnaires. This is the first study to describe a simulator for developing transcervical vocal fold injection skills. The model provides real-time tactile and auditory feedback that aids in skill acquisition. Otolaryngologists reported increased confidence with transcervical injection after using the simulator. © The Author(s) 2014.

  20. The analysis and choice of models of durability at probes of reliability of drive lines

    OpenAIRE

    Pastukhov, A.

    2014-01-01

    The most pressing problem in modern agricultural machinery engineering is to increase the durability of parts of transport and process machinery and equipment by the criteria of wear, corrosion and fatigue strength, while simultaneously reducing their mass and dimensions. The continuous increase in power, working speed and other performance indicators of machinery and equipment, together with the associated growth in component loading, results in the need to use, in the course of reliability testing of transmis...

  1. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents; hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) have been developed; these methodologies are classified as first and second generation according to their problem-solving viewpoints. Accident analysis can be carried out using three techniques of analysis: sequen...

  2. The reliable solution and computation time of variable parameters logistic model

    Science.gov (United States)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, denoted T_c) obtained by applying double-precision computation to a variable-parameters logistic map (VPLM). Firstly, by using the proposed method, we obtain the reliable solutions for the logistic map. Secondly, we construct 10,000 samples of reliable experiments from a time-dependent, non-stationary-parameters VPLM and then calculate the mean T_c. The results indicate that, for each different initial value, the T_c values of the VPLM are generally different. However, the mean T_c tends to a constant value when the sample number is large enough. The maximum, minimum, and probability distribution functions of T_c are also obtained, which can help us to identify the robustness of applying nonlinear time-series theory to forecasting with the VPLM output. In addition, the T_c of the fixed-parameter experiments of the logistic map is obtained, and the results suggest that this T_c matches the value predicted by the theoretical formula.
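
    To make the idea concrete, here is a minimal sketch (not the authors' code; the tolerance, precision and parameter values are illustrative assumptions) of estimating T_c by iterating the logistic map in double precision alongside a high-precision reference:

```python
# Estimate the reliable computation time T_c of the logistic map
# x_{n+1} = r * x_n * (1 - x_n): the first step where a double-precision
# trajectory departs from a 200-digit mpmath reference by more than `tol`.
from mpmath import mp, mpf

def reliable_steps(r=4.0, x0=0.2, tol=1e-3, max_steps=200):
    mp.dps = 200                       # 200 significant digits for the reference
    x_double, x_ref = x0, mpf(x0)      # both start from the same binary value
    for n in range(1, max_steps + 1):
        x_double = r * x_double * (1.0 - x_double)
        x_ref = mpf(r) * x_ref * (1 - x_ref)
        if abs(x_double - float(x_ref)) > tol:
            return n                   # first unreliable step: a proxy for T_c
    return max_steps

print(reliable_steps())  # typically a few dozen steps for the chaotic r = 4
```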

  3. Reliability and acceptability of a five-station multiple mini-interview model for residency program recruitment

    Directory of Open Access Journals (Sweden)

    Julian Diaz Fraga

    2013-12-01

    Full Text Available Background: Standard interviews are used by most residency programs in the United States to assess non-cognitive competencies, but variability in interviewer skill, interviewer bias, interviewer leniency or stringency, and context specificity limit their reliability. Aim: To investigate the reliability and acceptability of a five-station multiple mini-interview (MMI) model for resident selection into an internal medicine residency program in the United States. Setting: One independent academic medical center. Participants: Two hundred and thirty-seven applicants and 17 faculty interviewers. Program description: Five 10-min MMI stations with five different interviewers blinded to the candidate's records, and one traditional 20-min interview with the program director. Candidates were rated on two items: interpersonal and communication skills, and overall performance. Program evaluation: Generalizability data showed that the reliability of our process was high (>0.9). The results of anonymous surveys demonstrated that both applicants and interviewers consider the MMI a fair and more effective tool to evaluate non-cognitive traits, and prefer the MMI to standard interviews. Discussion: The MMI process for residency interviews can generate reliable interview results using only five stations, and it is acceptable and preferred over standard interview modalities by the applicants and faculty members of one US residency program.

  4. A Stochastic Reliability Model for Application in a Multidisciplinary Optimization of a Low Pressure Turbine Blade Made of Titanium Aluminide

    Directory of Open Access Journals (Sweden)

    Christian Dresbach

    Full Text Available Abstract Currently, there are many research activities dealing with gamma titanium aluminide (γ-TiAl) alloys as new materials for low-pressure turbine (LPT) blades. Even though the scatter in mechanical properties of such intermetallic alloys is more pronounced than in conventional metallic alloys, stochastic investigations on γ-TiAl alloys are very rare. For this reason, we analyzed the scatter in static and dynamic mechanical properties of the cast alloy Ti-48Al-2Cr-2Nb. It was found that this alloy shows a size effect in strength which is less pronounced than the size effect of brittle materials. A weakest-link approach is enhanced to describe a scalable size effect under multiaxial stress states and implemented in a post-processing tool for reliability analysis of real components. The presented approach is a first applicable reliability model for semi-brittle materials. The developed reliability tool was integrated into a multidisciplinary optimization of the geometry of an LPT blade. Some processes of the optimization were distributed in a wide area network, so that specialized tools for each discipline could be employed. The optimization results show that it is possible to increase the aerodynamic efficiency and the structural mechanics reliability at the same time, while ensuring the blade can be manufactured in an investment casting process.

  5. A New Biobjective Model to Optimize Integrated Redundancy Allocation and Reliability-Centered Maintenance Problems in a System Using Metaheuristics

    Directory of Open Access Journals (Sweden)

    Shima MohammadZadeh Dogahe

    2015-01-01

    Full Text Available A novel integrated model is proposed to optimize the redundancy allocation problem (RAP) and the reliability-centered maintenance (RCM) problem simultaneously. A system of both repairable and nonrepairable components has been considered. In this system, electronic components are nonrepairable while mechanical components are mostly repairable. For nonrepairable components, a redundancy allocation problem is dealt with to determine the optimal redundancy strategy and the number of redundant components to be implemented in each subsystem. In addition, a maintenance scheduling problem is considered for repairable components in order to identify the best maintenance policy and optimize system reliability. Both active and cold standby redundancy strategies have been taken into account for electronic components. Also, the net present value of the secondary cost, including operational and maintenance costs, has been calculated. The problem is formulated as a biobjective mathematical programming model aiming to reach a tradeoff between system reliability and cost. Three metaheuristic algorithms are employed to solve the proposed model: Nondominated Sorting Genetic Algorithm (NSGA-II), Multiobjective Particle Swarm Optimization (MOPSO), and Multiobjective Firefly Algorithm (MOFA). Several test problems are solved using the mentioned algorithms to test the efficiency and effectiveness of the solution approaches, and the obtained results are analyzed.

  6. A Tutorial on Nonlinear Time-Series Data Mining in Engineering Asset Health and Reliability Prediction: Concepts, Models, and Algorithms

    Directory of Open Access Journals (Sweden)

    Ming Dong

    2010-01-01

    Full Text Available The primary objective of engineering asset management is to optimize assets' service delivery potential and to minimize the related risks and costs over their entire life through the development and application of asset health and usage management, in which health and reliability prediction plays an important role. In real-life situations, where an engineering asset operates under dynamic operational and environmental conditions, the lifetime of an engineering asset is generally described by monitored nonlinear time-series data and is subject to high levels of uncertainty and unpredictability. It has been proved that the application of data mining techniques is very useful for extracting relevant features which can be used as parameters for asset diagnosis and prognosis. In this paper, a tutorial on nonlinear time-series data mining in engineering asset health and reliability prediction is given. Besides providing an overview of health and reliability prediction techniques for engineering assets, this tutorial focuses on concepts, models, algorithms, and applications of hidden Markov models (HMMs) and hidden semi-Markov models (HSMMs) in engineering asset health prognosis, which are representative of recent engineering asset health prediction techniques.
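
    As a pointer to the kind of machinery the tutorial covers, the sketch below runs the standard forward algorithm for a two-state health HMM; the transition, emission and observation values are invented for illustration and are not from the tutorial:

```python
# Forward algorithm for a toy health-prognosis HMM: health states are hidden,
# condition-monitoring symptoms are observed.
import numpy as np

A = np.array([[0.9, 0.1],      # state transitions: healthy -> degraded
              [0.0, 1.0]])     # degraded is absorbing
B = np.array([[0.8, 0.2],      # P(symptom | state): rows=states, cols=symptoms
              [0.3, 0.7]])
pi = np.array([1.0, 0.0])      # asset starts healthy

obs = [0, 0, 1, 1, 1]          # observed symptom sequence (0=normal, 1=alarm)
alpha = pi * B[:, obs[0]]      # forward variable at t=0
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

print("P(observations)  =", alpha.sum())
print("P(degraded now)  =", alpha[1] / alpha.sum())
```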

  7. Relationship between anaerobic parameters provided from MAOD and critical power model in specific table tennis test.

    Science.gov (United States)

    Zagatto, A M; Gobatto, C A

    2012-08-01

    The aim of this study was to verify the validity of the curvature constant parameter (W'), calculated from 2-parameter mathematical equations of the critical power model, in estimating the anaerobic capacity and anaerobic work capacity from a table tennis-specific test. Specifically, we aimed to i) compare constants estimated from three critical intensity models in a table tennis-specific test (Cf); ii) correlate each estimated W' with the maximal accumulated oxygen deficit (MAOD); iii) correlate each W' with the total amount of anaerobic work (W_ANAER) performed in each exercise bout of the Cf test. Nine national-standard male table tennis players participated in the study. MAOD was 63.0(10.8) mL·kg-1 and W' values were 32.8(6.6) balls for the linear-frequency model, 38.3(6.9) balls for the linear-total-balls model, and 48.7(8.9) balls for the nonlinear 2-parameter model. Estimated W' from the nonlinear 2-parameter model was significantly different from W' from the other 2 models, and no W' estimate correlated significantly with MAOD or W_ANAER (P > 0.13). Thus, W' estimated from the 2-parameter mathematical equations did not correlate with MAOD or W_ANAER in table tennis-specific tests, indicating that W' may not provide a strong and valid estimation of anaerobic capacity and anaerobic work capacity. © Georg Thieme Verlag KG Stuttgart · New York.
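
    For readers unfamiliar with the 2-parameter critical power equations, the sketch below fits both linear forms by least squares; the exhaustion times and work totals are made-up numbers, not the study's data:

```python
# Two linear forms of the 2-parameter critical power model, fit by least
# squares: (1) W_tot = CP * t + W'   (linear total-work model)
#          (2) f     = W' * (1/t) + CP, with f = W_tot / t  (linear-frequency)
import numpy as np

t_lim = np.array([120.0, 240.0, 480.0, 900.0])   # times to exhaustion (s), hypothetical
w_tot = np.array([260.0, 430.0, 790.0, 1420.0])  # total work or balls, hypothetical

CP, W_prime = np.polyfit(t_lim, w_tot, 1)        # slope = CP, intercept = W'

f = w_tot / t_lim
W_prime_f, CP_f = np.polyfit(1.0 / t_lim, f, 1)  # slope = W', intercept = CP

print(f"total-work fit: CP={CP:.2f}, W'={W_prime:.1f}")
print(f"frequency fit:  CP={CP_f:.2f}, W'={W_prime_f:.1f}")
```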

  8. Test Reliability at the Individual Level

    Science.gov (United States)

    Hu, Yueqin; Nesselroade, John R.; Erbacher, Monica K.; Boker, Steven M.; Burt, S. Alexandra; Keel, Pamela K.; Neale, Michael C.; Sisk, Cheryl L.; Klump, Kelly

    2016-01-01

    Reliability has a long history as one of the key psychometric properties of a test. However, a given test might not measure people equally reliably. Test scores from some individuals may have considerably greater error than others. This study proposed two approaches using intraindividual variation to estimate test reliability for each person. A simulation study suggested that the parallel tests approach and the structural equation modeling approach recovered the simulated reliability coefficients. Then in an empirical study, where forty-five females were measured daily on the Positive and Negative Affect Schedule (PANAS) for 45 consecutive days, separate estimates of reliability were generated for each person. Results showed that reliability estimates of the PANAS varied substantially from person to person. The methods provided in this article apply to tests measuring changeable attributes and require repeated measures across time on each individual. This article also provides a set of parallel forms of PANAS. PMID:28936107
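
    A minimal sketch of the parallel-tests idea for one person, assuming a days-by-items score matrix like the PANAS design (the data here are simulated, not the study's):

```python
# Per-person reliability via the parallel-tests approach: correlate two
# parallel half-tests across repeated daily measurements, then step up
# with the Spearman-Brown formula.
import numpy as np

rng = np.random.default_rng(0)
days, items = 45, 10
true_affect = rng.normal(size=days)                         # daily latent state
scores = true_affect[:, None] + rng.normal(size=(days, items))  # item noise

half_a = scores[:, ::2].mean(axis=1)    # odd-numbered items
half_b = scores[:, 1::2].mean(axis=1)   # even-numbered items
r = np.corrcoef(half_a, half_b)[0, 1]
reliability = 2 * r / (1 + r)           # Spearman-Brown step-up
print(f"person-level reliability ~ {reliability:.2f}")
```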

  9. Wind farms providing secondary frequency regulation: Evaluating the performance of model-based receding horizon control

    International Nuclear Information System (INIS)

    Shapiro, Carl R.; Meneveau, Charles; Gayme, Dennice F.; Meyers, Johan

    2016-01-01

    We investigate the use of wind farms to provide secondary frequency regulation for a power grid. Our approach uses model-based receding horizon control of a wind farm that is tested using a large eddy simulation (LES) framework. In order to enable real-time implementation, the control actions are computed based on a time-varying one-dimensional wake model. This model describes wake advection and interactions, both of which play an important role in wind farm power production. This controller is implemented in an LES model of an 84-turbine wind farm represented by actuator disk turbine models. Differences between the velocities at each turbine predicted by the wake model and measured in LES are used for closed-loop feedback. The controller is tested on two types of regulation signals, “RegA” and “RegD”, obtained from PJM, an independent system operator in the eastern United States. Composite performance scores, which are used by PJM to qualify plants for regulation, are used to evaluate the performance of the controlled wind farm. Our results demonstrate that the controlled wind farm consistently performs well, passing the qualification threshold for all fast-acting RegD signals. For the RegA signal, which changes over slower time scales, the controlled wind farm's average performance surpasses the threshold, but further work is needed to enable the controlled system to achieve qualifying performance all of the time. (paper)

  10. Biomass transformation webs provide a unified approach to consumer-resource modelling.

    Science.gov (United States)

    Getz, Wayne M

    2011-02-01

    An approach to modelling food web biomass flows among live and dead compartments within and among species is formulated using metaphysiological principles that characterise population growth in terms of basal metabolism, feeding, senescence and exploitation. This leads to a unified approach to modelling interactions among plants, herbivores, carnivores, scavengers, parasites and their resources. Also, dichotomising sessile miners from mobile gatherers of resources, with relevance to feeding and starvation time scales, suggests a new classification scheme involving 10 primary categories of consumer types. These types, in various combinations, rigorously distinguish scavenger from parasite, herbivory from phytophagy and detritivore from decomposer. Application of the approach to particular consumer-resource interactions is demonstrated, culminating in the construction of an anthrax-centred food web model, with parameters applicable to Etosha National Park, Namibia, where deaths of elephants and zebra from the bacterial pathogen, Bacillus anthracis, provide significant subsidies to jackals, vultures and other scavengers. © 2010 Blackwell Publishing Ltd/CNRS.

  11. Biomass transformation webs provide a unified approach to consumer–resource modelling

    Science.gov (United States)

    Getz, Wayne M.

    2011-01-01

    An approach to modelling food web biomass flows among live and dead compartments within and among species is formulated using metaphysiological principles that characterise population growth in terms of basal metabolism, feeding, senescence and exploitation. This leads to a unified approach to modelling interactions among plants, herbivores, carnivores, scavengers, parasites and their resources. Also, dichotomising sessile miners from mobile gatherers of resources, with relevance to feeding and starvation time scales, suggests a new classification scheme involving 10 primary categories of consumer types. These types, in various combinations, rigorously distinguish scavenger from parasite, herbivory from phytophagy and detritivore from decomposer. Application of the approach to particular consumer–resource interactions is demonstrated, culminating in the construction of an anthrax-centred food web model, with parameters applicable to Etosha National Park, Namibia, where deaths of elephants and zebra from the bacterial pathogen, Bacillus anthracis, provide significant subsidies to jackals, vultures and other scavengers. PMID:21199247

  12. Modeling Key Drivers of Cholera Transmission Dynamics Provides New Perspectives for Parasitology.

    Science.gov (United States)

    Rinaldo, Andrea; Bertuzzo, Enrico; Blokesch, Melanie; Mari, Lorenzo; Gatto, Marino

    2017-08-01

    Hydroclimatological and anthropogenic factors are key drivers of waterborne disease transmission. Information on human settlements and host mobility on waterways along which pathogens and hosts disperse, and relevant hydroclimatological processes, can be acquired remotely and included in spatially explicit mathematical models of disease transmission. In the case of epidemic cholera, such models allowed the description of complex disease patterns and provided insight into the course of ongoing epidemics. The inclusion of spatial information in models of disease transmission can aid in emergency management and the assessment of alternative interventions. Here, we review the study of drivers of transmission via spatially explicit approaches and argue that, because many parasitic waterborne diseases share the same drivers as cholera, similar principles may apply. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. A Complex of Business Process Management Models for a Service-Providing IT Company

    Directory of Open Access Journals (Sweden)

    Yatsenko Roman M.

    2017-10-01

    Full Text Available The article presents an analysis of a complex of business process management models that are designed to improve the performance of service-providing IT companies. This class of enterprises was selected because of their significant contribution to the Ukrainian economy: third place in the structure of exports, significant budget revenues, high development dynamics, and prospects in the global marketplace. The selected complex of models is designed as a sequence of stages that must be accomplished in order to optimize business processes. The first stage is an analysis of the nature of the process approach, approaches to strategic management, and the characteristics of service-providing IT companies. The second stage is to build the formal and hierarchical models to define the characteristics of the business processes and their structure, respectively. The third stage is to evaluate individual business processes (information model) and the entire business process system (multi-level assessment of business processes). The fourth stage is to optimize the business processes at each level: strategic, tactical and operational. The fifth stage is to restructure the business processes after optimization. The sixth (final) stage is to analyze the efficiency of the restructured system of business processes.

  14. Pharmacists providing care in the outpatient setting through telemedicine models: a narrative review

    Directory of Open Access Journals (Sweden)

    Littauer SL

    2017-12-01

    Full Text Available Telemedicine refers to the delivery of clinical services using technology that allows two-way, real-time, interactive communication between the patient and the clinician at a distant site. Commonly, telemedicine is used to improve access to general and specialty care for patients in rural areas. This review aims to provide an overview of existing telemedicine models involving the delivery of care by pharmacists via telemedicine (including telemonitoring and video, but excluding follow-up telephone calls) and to highlight the main areas of chronic-disease management where these models have been applied. Studies within the areas of hypertension, diabetes, asthma, anticoagulation and depression were identified, but only two randomized controlled trials with adequate sample size demonstrating the positive impact of telemonitoring combined with pharmacist care in hypertension were identified. The evidence for the impact of pharmacist-based telemedicine models is sparse and weak, with the studies conducted presenting serious threats to internal and external validity. Therefore, no definitive conclusions about the impact of pharmacist-led telemedicine models can be made at this time. In the United States, the increasing shortage of primary care providers and specialists represents an opportunity for pharmacists to assume a more prominent role managing patients with chronic disease in the ambulatory care setting. However, lack of reimbursement may pose a barrier to the provision of care by pharmacists using telemedicine.

  15. Agent-based organizational modelling for analysis of safety culture at an air navigation service provider

    International Nuclear Information System (INIS)

    Stroeve, Sybert H.; Sharpanskykh, Alexei; Kirwan, Barry

    2011-01-01

    Assessment of safety culture is done predominantly by questionnaire-based studies, which tend to reveal attitudes on immaterial characteristics (values, beliefs, norms). There is a need for a better understanding of the implications of the material aspects of an organization (structures, processes, etc.) for safety culture and their interactions with the immaterial characteristics. This paper presents a new agent-based organizational modelling approach for integrated and systematic evaluation of material and immaterial characteristics of socio-technical organizations in safety culture analysis. It uniquely considers both the formal organization and the value- and belief-driven behaviour of individuals in the organization. Results are presented of a model for safety occurrence reporting at an air navigation service provider. Model predictions consistent with questionnaire-based results are achieved. A sensitivity analysis provides insight in organizational factors that strongly influence safety culture indicators. The modelling approach can be used in combination with attitude-focused safety culture research, towards an integrated evaluation of material and immaterial characteristics of socio-technical organizations. By using this approach an organization is able to gain a deeper understanding of causes of diverse problems and inefficiencies both in the formal organization and in the behaviour of organizational agents, and to systematically identify and evaluate improvement options.

  16. A reliability model of the Angra 1 power system by the device of stages optimized by genetic algorithms

    International Nuclear Information System (INIS)

    Crossetti, Patricia Guimaraes

    2006-01-01

    This thesis proposes a probabilistic model to perform the reliability analysis of nuclear power plant systems under aging. This work analyses the Angra 1 power system. Systems subject to aging consist of components whose failure rates are not all constant, thus giving rise to non-Markovian models. Genetic algorithms were used to optimize the application of the device of stages. Two approaches, MCEF and MCEV, were used in the optimization. The results obtained for the Angra 1 power system show that the probability of a station blackout is negligible. (author)
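
    The device of stages replaces a non-exponential (aging) holding time with a chain of exponential stages so the overall model becomes Markovian. The sketch below illustrates the principle with an Erlang chain; the GA-based choice of stage structure used in the thesis is not reproduced here:

```python
# Approximate an aging holding time by k exponential stages in series.
# An Erlang(k, rate=k/mean) chain keeps the target mean while its
# coefficient of variation falls toward 1/sqrt(k) as k grows.
import numpy as np

def erlang_samples(k, mean, n, rng):
    """Sum of k exponential stages, each with mean `mean / k`."""
    return rng.exponential(scale=mean / k, size=(n, k)).sum(axis=1)

rng = np.random.default_rng(0)
for k in (1, 3, 10):
    s = erlang_samples(k, mean=5.0, n=100_000, rng=rng)
    print(f"k={k:2d}: mean={s.mean():.2f}, cv={s.std() / s.mean():.2f}")
# k (and more general stage layouts) would be chosen to fit observed data.
```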

  17. Modeling fMRI signals can provide insights into neural processing in the cerebral cortex.

    Science.gov (United States)

    Vanni, Simo; Sharifian, Fariba; Heikkinen, Hanna; Vigário, Ricardo

    2015-08-01

    Every stimulus or task activates multiple areas in the mammalian cortex. These distributed activations can be measured with functional magnetic resonance imaging (fMRI), which has the best spatial resolution among the noninvasive brain imaging methods. Unfortunately, the relationship between the fMRI activations and distributed cortical processing has remained unclear, both because the coupling between neural and fMRI activations has remained poorly understood and because fMRI voxels are too large to directly sense the local neural events. To get an idea of the local processing given the macroscopic data, we need models to simulate the neural activity and to provide output that can be compared with fMRI data. Such models can describe neural mechanisms as mathematical functions between input and output in a specific system, with little correspondence to physiological mechanisms. Alternatively, models can be biomimetic, including biological details with straightforward correspondence to experimental data. After careful balancing between complexity, computational efficiency, and realism, a biomimetic simulation should be able to provide insight into how biological structures or functions contribute to actual data processing as well as to promote theory-driven neuroscience experiments. This review analyzes the requirements for validating system-level computational models with fMRI. In particular, we study mesoscopic biomimetic models, which include a limited set of details from real-life networks and enable system-level simulations of neural mass action. In addition, we discuss how recent developments in neurophysiology and biophysics may significantly advance the modelling of fMRI signals. Copyright © 2015 the American Physiological Society.

  18. An Inventory Model for Deteriorating Item with Reliability Consideration and Trade Credit

    Directory of Open Access Journals (Sweden)

    S. R. Singh

    2014-10-01

    Full Text Available In today's global market everybody wants to buy products of high quality, and to achieve a high level of product quality suppliers have to invest in improving the reliability of their production processes. In the present article we study a reliable production process with stock-dependent unit production and holding costs. Demand is an exponential function of time, and an infinite production process with a non-instantaneous deterioration rate is considered in this paper. The whole study has been done under the effect of trade credit. The main objective of this paper is to optimize the total relevant cost for a reliable production process. A numerical example and sensitivity analysis are given at the end of this paper.

  19. Natural Circulation in Water Cooled Nuclear Power Plants Phenomena, models, and methodology for system reliability assessments

    Energy Technology Data Exchange (ETDEWEB)

    Jose Reyes

    2005-02-14

    In recent years it has been recognized that the application of passive safety systems (i.e., those whose operation takes advantage of natural forces such as convection and gravity) can contribute to simplification and potentially to improved economics of new nuclear power plant designs. In 1991 the IAEA Conference on ''The Safety of Nuclear Power: Strategy for the Future'' noted that for new plants ''the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate''.

  20. Reliability analysis on passive residual heat removal of AP1000 based on Grey model

    Energy Technology Data Exchange (ETDEWEB)

    Qi, Shi; Zhou, Tao; Shahzad, Muhammad Ali; Li, Yu [North China Electric Power Univ., Beijing (China). School of Nuclear Science and Engineering; Beijing Key Laboratory of Passive Safety Technology for Nuclear Energy, Beijing (China); Jiang, Guangming [Nuclear Power Institute of China, Chengdu (China). Science and Technology on Reactor System Design Technology Laboratory

    2017-06-15

    It is common to base the design of passive systems on natural laws of physics, such as gravity, heat conduction, and inertia. For AP1000, a Generation-III reactor, such systems have an inherent safety associated with them due to the simplicity of their structures. However, there is a fairly large amount of uncertainty in the operating conditions of these passive safety systems. In some cases, a small deviation in the design or operating conditions can affect the function of the system. The reliability of the passive residual heat removal system is analysed.
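
    As background, here is a compact GM(1,1) grey-model sketch of the kind such analyses build on; the series values are hypothetical and this is not the paper's implementation:

```python
# GM(1,1) grey prediction: x1 is the cumulative sum of the series, the
# whitened equation x0(k) = -a * z(k) + b is fit by least squares, and the
# forecast is recovered by differencing the fitted cumulative series.
import numpy as np

def gm11_forecast(x0, steps=3):
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)
    z = 0.5 * (x1[1:] + x1[:-1])                 # background values
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat, prepend=0.0)          # back to the original series

series = [2.87, 3.28, 3.34, 3.62, 3.85]          # hypothetical observations
print(gm11_forecast(series))                     # fitted values + 3 forecasts
```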

  1. Natural Circulation in Water Cooled Nuclear Power Plants Phenomena, models, and methodology for system reliability assessments

    International Nuclear Information System (INIS)

    Jose Reyes

    2005-01-01

    In recent years it has been recognized that the application of passive safety systems (i.e., those whose operation takes advantage of natural forces such as convection and gravity) can contribute to simplification and potentially to improved economics of new nuclear power plant designs. In 1991 the IAEA Conference on ''The Safety of Nuclear Power: Strategy for the Future'' noted that for new plants ''the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate''.

  2. Bayesian updating of reliability of civil infrastructure facilities based on condition-state data and fault-tree model

    International Nuclear Information System (INIS)

    Ching Jianye; Leu, S.-S.

    2009-01-01

    This paper considers a difficult but practical circumstance of civil infrastructure management: deterioration/failure data of the infrastructure system are absent, while only condition-state data of its components are available. The goal is to develop a framework for estimating the time-varying reliabilities of civil infrastructure facilities under such a circumstance. A novel method of analyzing time-varying condition-state data that only report the operational/non-operational status of the components is proposed to update the reliabilities of civil infrastructure facilities. The proposed method assumes that degradation arrivals can be modeled as a Poisson process with unknown time-varying arrival rate and damage impact, and that the target system can be represented as a fault-tree model. To accommodate large uncertainties, a Bayesian algorithm is proposed, and the reliability of the infrastructure system can be quickly updated based on the condition-state data. Use of the new method is demonstrated with a real-world example of a hydraulic spillway gate system.
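
    A toy illustration of the Bayesian updating idea, reduced to a conjugate Gamma-Poisson model of the degradation arrival rate (the paper's algorithm additionally handles time-varying rates and a full fault tree; all numbers below are invented):

```python
# Conjugate Gamma-Poisson updating of a degradation arrival rate from
# component condition-state inspections: for a Gamma(a, b) prior and k
# degradations observed over a window of length t, the posterior is
# Gamma(a + k, b + t).
a0, b0 = 2.0, 1.0          # Gamma prior on the arrival rate (events / year)

inspections = [            # (observation window in years, degradations found)
    (1.0, 0), (1.0, 1), (0.5, 0), (2.0, 3),
]

a, b = a0, b0
for t, k in inspections:
    a, b = a + k, b + t    # conjugate update after each inspection
    print(f"posterior mean rate = {a / b:.3f} / yr")
```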

  3. Monte Carlo modeling provides accurate calibration factors for radionuclide activity meters

    International Nuclear Information System (INIS)

    Zagni, F.; Cicoria, G.; Lucconi, G.; Infantino, A.; Lodi, F.; Marengo, M.

    2014-01-01

    Accurate determination of calibration factors for radionuclide activity meters is crucial for quantitative studies and in the optimization step of radiation protection, as these detectors are widespread in radiopharmacy and nuclear medicine facilities. In this work we developed the Monte Carlo model of a widely used activity meter, using the Geant4 simulation toolkit. More precisely the “PENELOPE” EM physics models were employed. The model was validated by means of several certified sources, traceable to primary activity standards, and other sources locally standardized with spectrometry measurements, plus other experimental tests. Great care was taken in order to accurately reproduce the geometrical details of the gas chamber and the activity sources, each of which is different in shape and enclosed in a unique container. Both relative calibration factors and ionization current obtained with simulations were compared against experimental measurements; further tests were carried out, such as the comparison of the relative response of the chamber for a source placed at different positions. The results showed a satisfactory level of accuracy in the energy range of interest, with the discrepancies lower than 4% for all the tested parameters. This shows that an accurate Monte Carlo modeling of this type of detector is feasible using the low-energy physics models embedded in Geant4. The obtained Monte Carlo model establishes a powerful tool for first instance determination of new calibration factors for non-standard radionuclides, for custom containers, when a reference source is not available. Moreover, the model provides an experimental setup for further research and optimization with regards to materials and geometrical details of the measuring setup, such as the ionization chamber itself or the containers configuration. - Highlights: • We developed a Monte Carlo model of a radionuclide activity meter using Geant4. • The model was validated using several

  4. The climate4impact platform: Providing, tailoring and facilitating climate model data access

    Science.gov (United States)

    Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael

    2017-04-01

    One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing services, web coverage services and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted to specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfill this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. It provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, Use Cases and best practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface with ADAGUC web mapping tools. In the latest version, statistical downscaling services, provided by the Santander Meteorology Group Downscaling Portal, were integrated. An innovative interface to integrate statistical downscaling services will be released in the upcoming version. The latter will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the results of the global climate model simulations of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data. The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European

  5. How reliable are satellite precipitation estimates for driving hydrological models: a verification study over the Mediterranean area

    Science.gov (United States)

    Camici, Stefania; Ciabatta, Luca; Massari, Christian; Brocca, Luca

    2017-04-01

    Satellite rainfall products (SRPs), including TMPA 3B42-RT, CMORPH, PERSIANN and a new soil moisture-derived rainfall dataset obtained through the application of the SM2RAIN algorithm (Brocca et al., 2014) to the ASCAT (Advanced SCATterometer) soil moisture product, are used in the analysis. The performances obtained with SRPs are compared with those obtained by using ground data during the 6-year period from 2010 to 2015. In addition, the performance obtained by an integration of the above-mentioned SRPs is also investigated, to see whether merged rainfall observations are able to improve flood simulation. A preliminary analysis was also carried out using the IMERG early-run product of the GPM mission. The results highlight that SRPs should be used with caution for rainfall-runoff modelling in the Mediterranean region. Bias correction and model recalibration are necessary steps, even though they are not always sufficient to achieve satisfactory performance. Indeed, some of the products provide unreliable outcomes, mainly in smaller basins, for flood modelling in the Mediterranean area. The best performances are obtained by integrating different SRPs, and particularly by merging the TMPA 3B42-RT and SM2RAIN-ASCAT products. The promising results of the integrated product are expected to increase confidence in the use of SRPs in hydrological modelling, even in challenging areas such as the Mediterranean. REFERENCES Brocca, L., Ciabatta, L., Massari, C., Moramarco, T., Hahn, S., Hasenauer, S., Kidd, R., Dorigo, W., Wagner, W., Levizzani, V. (2014). Soil as a natural rain gauge: estimating global rainfall from satellite soil moisture data. Journal of Geophysical Research, 119(9), 5128-5141, doi:10.1002/2014JD021489. Masseroni, D., Cislaghi, A., Camici, S., Massari, C., Brocca, L. (2017). A reliable rainfall-runoff model for flood forecasting: review and application to a semiurbanized watershed at high flood risk in Italy. Hydrology Research, in press, doi:10.2166/nh.2016.037.

  6. Will building new reservoirs always help increase the water supply reliability? - insight from a modeling-based global study

    Science.gov (United States)

    Zhuang, Y.; Tian, F.; Yigzaw, W.; Hejazi, M. I.; Li, H. Y.; Turner, S. W. D.; Vernon, C. R.

    2017-12-01

    More and more reservoirs are being built or planned to help meet the increasing water demand all over the world. However, is building new reservoirs always helpful to water supply? To address this question, the river routing module of the Global Change Assessment Model (GCAM) has been extended with a simple yet physically based reservoir scheme accounting for irrigation, flood control and hydropower operations at each individual reservoir. The new GCAM river routing model has been applied over the global domain with runoff inputs from the Variable Infiltration Capacity model. The simulated streamflow is validated at 150 global river basins where observed streamflow data are available. The model performance is significantly improved at 77 basins and worsened at 35 basins. To facilitate the analysis of additional reservoir storage impacts at the basin level, a lumped version of the GCAM reservoir model has been developed, representing a single lumped reservoir at each river basin with the regulation capacity of all reservoirs combined. A Sequent Peak Analysis is used to estimate how much additional reservoir storage is required to satisfy the current water demand. For basins with a water deficit, the water supply reliability can be improved with additional storage. However, there is a threshold storage value at each basin beyond which the reliability stops increasing, suggesting that building new reservoirs will not help further relieve the water stress. Findings from this research can be helpful to the future planning and management of new reservoirs.
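
    The Sequent Peak Analysis itself is short enough to sketch; the inflow and demand volumes below are illustrative only:

```python
# Sequent Peak Analysis: the required storage is the largest cumulative
# shortfall of inflow relative to demand over the record.
def sequent_peak(inflow, demand):
    """Return the minimum storage that satisfies `demand` given `inflow`."""
    deficit, storage = 0.0, 0.0
    for q, d in zip(inflow, demand):
        deficit = max(0.0, deficit + d - q)  # cumulative shortfall so far
        storage = max(storage, deficit)      # largest shortfall = storage needed
    return storage

inflow = [5, 7, 3, 1, 0, 2, 6, 9]   # e.g. monthly runoff volumes
demand = [4] * len(inflow)          # constant monthly demand
print(sequent_peak(inflow, demand)) # -> 10.0
```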

  7. Capabilities of stochastic rainfall models as data providers for urban hydrology

    Science.gov (United States)

    Haberlandt, Uwe

    2017-04-01

    For the planning of urban drainage systems using hydrological models, long, continuous precipitation series with high temporal resolution are needed. Since observed time series are often too short or not available everywhere, the use of synthetic precipitation is a common alternative. This contribution compares three precipitation models regarding their suitability to provide 5-minute continuous rainfall time series for a) sizing of drainage networks for urban flood protection and b) dimensioning of combined sewage systems for pollution reduction. The rainfall models are a parametric stochastic model (Haberlandt et al., 2008), a non-parametric probabilistic approach (Bárdossy, 1998) and a stochastic downscaling of dynamically simulated rainfall (Berg et al., 2013); all models are operated both as single-site and multi-site generators. The models are applied with regionalised parameters, assuming that there is no station at the target location. Rainfall and discharge characteristics are utilised to evaluate model performance. The simulation results are compared against results obtained from reference rainfall stations not used for parameter estimation. The rainfall simulations are carried out for the federal states of Baden-Württemberg and Lower Saxony in Germany, and the discharge simulations for the drainage networks of the cities of Hamburg, Brunswick and Freiburg. Altogether, the results show comparable simulation performance for the three models: good capabilities for single-site simulations but low skill for multi-site simulations. Remarkably, there is no significant difference in simulation performance between the tasks of flood protection and pollution reduction, so the models are ultimately able to simulate both the extremes and the long-term characteristics of rainfall equally well. Bárdossy, A., 1998. Generating precipitation time series using simulated annealing. Wat. Resour. Res., 34(7): 1737-1744. Berg, P., Wagner, S., Kunstmann, H., Schädler, G

  8. Bayesian networks in reliability

    Energy Technology Data Exchange (ETDEWEB)

    Langseth, Helge [Department of Mathematical Sciences, Norwegian University of Science and Technology, N-7491 Trondheim (Norway)]. E-mail: helgel@math.ntnu.no; Portinale, Luigi [Department of Computer Science, University of Eastern Piedmont ' Amedeo Avogadro' , 15100 Alessandria (Italy)]. E-mail: portinal@di.unipmn.it

    2007-01-15

    Over the last decade, Bayesian networks (BNs) have become a popular tool for modelling many kinds of statistical problems. We have also seen a growing interest for using BNs in the reliability analysis community. In this paper we will discuss the properties of the modelling framework that make BNs particularly well suited for reliability applications, and point to ongoing research that is relevant for practitioners in reliability.
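
    A toy fragment of the kind of reliability BN the paper discusses: a common-cause node feeding two components, with the system failing when both components fail. All probabilities are invented for illustration:

```python
# Exact inference by enumeration over the common-cause node: components are
# conditionally independent given the common cause, so
# P(system fails) = sum_cc P(cc) * P(component fails | cc)^2.
P_cc = 0.01                        # P(common cause present)
P_fail = {True: 0.2, False: 0.02}  # P(component failure | common cause)

p_sys = 0.0
for cc, p in ((True, P_cc), (False, 1 - P_cc)):
    p_sys += p * P_fail[cc] ** 2
print(f"P(system failure) = {p_sys:.5f}")   # 0.01*0.04 + 0.99*0.0004
```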

  9. Analytical modeling provides new insight into complex mutual coupling between surface loops at ultrahigh fields.

    Science.gov (United States)

    Avdievich, N I; Pfrommer, A; Giapitzakis, I A; Henning, A

    2017-10-01

    Ultrahigh-field (UHF) (≥7 T) transmit (Tx) human head surface loop phased arrays improve both the Tx efficiency (B1+/√P) and homogeneity in comparison with single-channel quadrature Tx volume coils. For multi-channel arrays, decoupling becomes one of the major problems during the design process. Further insight into the coupling between array elements and its dependence on various factors can facilitate array development. The evaluation of the entire impedance matrix Z for an array loaded with a realistic voxel model or phantom is a time-consuming procedure when performed using electromagnetic (EM) solvers. This motivates the development of an analytical model, which could provide a quick assessment of the Z-matrix. In this work, an analytical model based on dyadic Green's functions was developed and validated using an EM solver and bench measurements. The model evaluates the complex coupling, including both the electric (mutual resistance) and magnetic (mutual inductance) coupling. Validation demonstrated that the model does well to describe the coupling at lower fields (≤3 T). At UHFs, the model also performs well for a practical case of low magnetic coupling. Based on the modeling, the geometry of a 400-MHz, two-loop transceiver array was optimized, such that, by simply overlapping the loops, both the mutual inductance and the mutual resistance were compensated at the same time. As a result, excellent decoupling (below -40 dB) was obtained without any additional decoupling circuits. An overlapped array prototype was compared (signal-to-noise ratio, Tx efficiency) favorably to a gapped array, a geometry which has been utilized previously in designs of UHF Tx arrays. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Providing a more complete view of ice-age palaeoclimates using model inversion and data interpolation

    Science.gov (United States)

    Cleator, Sean; Harrison, Sandy P.; Roulstone, Ian; Nichols, Nancy K.; Prentice, Iain Colin

    2017-04-01

    Site-based pollen records have been used to provide quantitative reconstructions of Last Glacial Maximum (LGM) climates, but there are too few such records to provide continuous climate fields for the evaluation of climate model simulations. Furthermore, many of the reconstructions were made using modern-analogue techniques, which do not account for the direct impact of CO2 on water-use efficiency and therefore reconstruct considerably drier conditions under low CO2 at the LGM than indicated by other sources of information. We have shown that it is possible to correct analogue-based moisture reconstructions for this effect by inverting a simple light-use efficiency model of productivity, based on the principle that the rate of water loss per unit carbon gain of a plant is the same under conditions of the true moisture, palaeotemperature and palaeo CO2 concentration as under reconstructed moisture, modern CO2 concentration and modern temperature (Prentice et al., 2016). In this study, we use data from the Bartlein et al. (2011) dataset, which provides reconstructions of one or more of six climate variables (mean annual temperature, mean temperature of the warmest and coldest months, the length of the growing seasons, mean annual precipitation, and the ratio of actual to potential evapotranspiration) at individual LGM sites. We use the SPLASH water-balance model to derive a moisture index (MI) at each site from mean annual precipitation and monthly values of sunshine fraction and average temperature, and correct this MI using the Prentice et al. (2016) inversion approach. We then use a three-dimensional variational (3D-Var) data assimilation scheme with the SPLASH model and Prentice et al. (2016) inversion approach to derive reconstructions of all six climate variables at each site, using the Bartlein et al. (2011) data set as a target. We use two alternative background climate states (or priors): modern climate derived from the CRU CL v2.0 data set (New et al., 2002

  11. Family child care home providers as role models for children: Cause for concern?

    Directory of Open Access Journals (Sweden)

    Alison Tovar

    2017-03-01

    Full Text Available Health behaviors associated with chronic disease, particularly healthy eating and regular physical activity, are important role modeling opportunities for individuals working in child care programs. Prior studies have not explored these risk factors in family child care home (FCCH) providers, who care for vulnerable and at-risk populations. To address this gap, we describe the socio-demographic and health risk behavior profiles in a sample of providers (n = 166 FCCHs) taken from baseline data of an ongoing cluster-randomized controlled intervention (2011–2016) in North Carolina. Data were collected during on-site visits where providers completed self-administered questionnaires (socio-demographics, physical activity, fruit and vegetable consumption, number of hours of sleep per night, and perceived stress) and had their height and weight measured. A risk score (range: 0–6; 0 no risk to 6 high risk) was calculated based on how many of the following were present: not having health insurance, being overweight/obese, not meeting physical activity, fruit and vegetable, and sleep recommendations, and having high stress. Mean and frequency distributions of participant and FCCH characteristics were calculated. Close to one third (29.3%) of providers reported not having health insurance. Almost all providers (89.8%) were overweight or obese, with approximately half not meeting guidelines for physical activity, fruit and vegetable consumption, and sleep. Over half reported a “high” stress score. The mean risk score was 3.39 (±1.2), with close to half of the providers having a risk score of 4, 5 or 6 (45.7%). These results stress the need to promote the health of these important care providers.

  12. SURROGATE MODEL DEVELOPMENT AND VALIDATION FOR RELIABILITY ANALYSIS OF REACTOR PRESSURE VESSELS

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, William M.; Riley, Matthew E.; Spencer, Benjamin W.

    2016-07-01

    In nuclear light water reactors (LWRs), the reactor coolant, core and shroud are contained within a massive, thick-walled steel vessel known as a reactor pressure vessel (RPV). Given the tremendous size of these structures, RPVs typically contain a large population of pre-existing flaws introduced in the manufacturing process. After many years of operation, irradiation-induced embrittlement makes these vessels increasingly susceptible to fracture initiation at the locations of the pre-existing flaws. Because of the uncertainty in the loading conditions, flaw characteristics and material properties, probabilistic methods are widely accepted and used in assessing RPV integrity. The Fracture Analysis of Vessels – Oak Ridge (FAVOR) computer program developed by researchers at Oak Ridge National Laboratory is widely used for this purpose. This program can be used to perform deterministic and probabilistic risk-informed analyses of the structural integrity of an RPV subjected to a range of thermal-hydraulic events. FAVOR uses a one-dimensional representation of the global response of the RPV, which is appropriate for the beltline region, which experiences the most embrittlement, and employs an influence coefficient technique to rapidly compute stress intensity factors for axis-aligned surface-breaking flaws. The Grizzly code is currently under development at Idaho National Laboratory (INL) to be used as a general multiphysics simulation tool to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled RPVs. Grizzly can be used to model the thermo-mechanical response of an RPV under transient conditions observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local 3D models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in

  13. A multi-state model for the reliability assessment of a distributed generation system via universal generating function

    International Nuclear Information System (INIS)

    Li, Yan-Fu; Zio, Enrico

    2012-01-01

    The current and future developments of electric power systems are pushing the boundaries of reliability assessment to consider distribution networks with renewable generators. Given the stochastic features of these elements, most modeling approaches rely on Monte Carlo simulation. The computational costs associated with the simulation approach force the treatment of mostly small-sized systems, i.e. those with a limited number of lumped components of a given renewable technology (e.g. wind or solar) whose behavior is described by a binary state, working or failed. In this paper, we propose an analytical multi-state modeling approach for the reliability assessment of distributed generation (DG). The approach allows consideration of a number of diverse energy generation technologies distributed over the system. Multiple states are used to describe the randomness in the generation units, due to the stochastic nature of the generation sources and of the mechanical degradation/failure behavior of the generation systems. The universal generating function (UGF) technique is used for the individual component multi-state modeling. A multiplication-type composition operator is introduced to combine the UGFs for the mechanical degradation and renewable generation source states into the UGF of the renewable generator power output. The overall multi-state DG system UGF is then constructed, and classical reliability indices (e.g. loss of load expectation (LOLE), expected energy not supplied (EENS)) are computed from the DG system generation and load UGFs. An application of the model is shown on a DG system adapted from the IEEE 34-node distribution test feeder.
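
    A minimal sketch of the UGF technique underlying the model: a UGF maps performance levels to probabilities, composition combines levels and multiplies probabilities, and reliability indices follow by expectation. The states and numbers below are invented for illustration:

```python
# UGFs as dicts {performance level: probability}; parallel composition
# sums output levels, and a loss-of-load-style index is then a simple sum.
from itertools import product

def compose(u1, u2, op=lambda g1, g2: g1 + g2):
    """Combine two UGFs with a composition operator (default: summed output)."""
    out = {}
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        g = op(g1, g2)
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

wind  = {0: 0.3, 1.5: 0.5, 3.0: 0.2}   # MW output states of a wind unit
solar = {0: 0.4, 1.0: 0.6}             # MW output states of a PV unit
system = compose(wind, solar)

load = 2.0                              # MW demand
p_short = sum(p for g, p in system.items() if g < load)
print(system, f"P(output < load) = {p_short:.2f}")   # -> 0.50 here
```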

  13. NSG Mice Provide a Better Spontaneous Model of Breast Cancer Metastasis than Athymic (Nude) Mice.

    Directory of Open Access Journals (Sweden)

    Madhavi Puchalapalli

    Full Text Available Metastasis is the most common cause of mortality in breast cancer patients worldwide. To identify improved mouse models for breast cancer growth and spontaneous metastasis, we examined growth and metastasis of both estrogen receptor positive (T47D) and negative (MDA-MB-231, SUM1315, and CN34BrM) human breast cancer cells in nude and NSG mice. Both primary tumor growth and spontaneous metastases were increased in NSG mice compared to nude mice. In addition, a pattern of metastasis similar to that observed in human breast cancer patients (metastases to the lungs, liver, bones, brain, and lymph nodes) was found in NSG mice. Furthermore, there was an increase in the metastatic burden in NSG compared to nude mice that were injected with MDA-MB-231 breast cancer cells in an intracardiac experimental metastasis model. These data demonstrate that NSG mice provide a better model for studying human breast cancer metastasis than the current nude mouse model.

  15. Reliability of circumplex axes

    OpenAIRE

    Strack, M.; Jacobs, I.; Grosse Holtforth, M.

    2013-01-01

    We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing int...
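
    The reliability estimate described here reduces to the share of observed variance attributable to the axes component. A toy computation with invented variance components:

```python
# Variance components from the tau-equivalent CFA model (invented numbers):
components = {"general": 0.20, "axes": 0.35, "scale": 0.15,
              "block": 0.10, "item": 0.20}

# Only the axes component counts toward circumplex-axis reliability:
reliability = components["axes"] / sum(components.values())
print(f"axes reliability = {reliability:.2f}")
```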

  16. State and Alternative Fuel Provider Fleets - Fleet Compliance Annual Report: Model Year 2015, Fiscal Year 2016

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-01

    The U.S. Department of Energy (DOE) regulates covered state government and alternative fuel provider fleets, pursuant to the Energy Policy Act of 1992 (EPAct), as amended. Covered fleets may meet their EPAct requirements through one of two compliance methods: Standard Compliance or Alternative Compliance. For model year (MY) 2015, the compliance rate with this program for the 301 reporting fleets was 100%. Of these, 294 fleets used Standard Compliance and exceeded their aggregate MY 2015 acquisition requirements by 8% through acquisitions alone. The seven covered fleets that used Alternative Compliance exceeded their aggregate MY 2015 petroleum use reduction requirements by 46%.

  17. Estimations of parameters in Pareto reliability model in the presence of masked data

    International Nuclear Information System (INIS)

    Sarhan, Ammar M.

    2003-01-01

    Estimation of the parameters of the individual lifetime distributions of system components in a series system is considered in this paper, based on masked system life test data. We consider a series system of two independent components, each of which has a Pareto-distributed lifetime. The maximum likelihood and Bayes estimators for the parameters, and for the reliability of the system's components at a specific time, are obtained. Symmetrical triangular prior distributions are assumed for the unknown parameters in obtaining their Bayes estimators. Large simulation studies are performed in order to: (i) explain how the theoretical results obtained can be utilized; (ii) compare the maximum likelihood and Bayes estimates of the underlying parameters; and (iii) study the influence of the masking level and the sample size on the accuracy of the estimates obtained.
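
    As a hedged illustration of the estimation setting (ignoring the masking, which is the paper's actual contribution), the closed-form maximum likelihood estimator of a Pareto shape parameter from fully observed lifetimes with known scale looks like this:

```python
import numpy as np

def pareto_shape_mle(lifetimes, x_min):
    """MLE of the Pareto shape alpha for samples with known scale x_min:
    alpha_hat = n / sum(log(x_i / x_min))."""
    x = np.asarray(lifetimes, dtype=float)
    return len(x) / np.sum(np.log(x / x_min))

rng = np.random.default_rng(0)
alpha_true, x_min = 2.5, 1.0
# Classical Pareto sample: x_min * (1 + Lomax), using numpy's pareto sampler.
sample = x_min * (1.0 + rng.pareto(alpha_true, size=500))
print(f"alpha_hat = {pareto_shape_mle(sample, x_min):.2f}")  # close to 2.5
```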

  18. A Development of reliability evaluation model for power plant fan pitch blade control actuator

    International Nuclear Information System (INIS)

    Son, Tae Ha; Huh, Jun Young

    2007-01-01

    This paper describes the development of countermeasures, following analysis and maintenance, that enable safe operation of a power plant. In order to operate the power plant in a stable and reliable way, the best condition of the governor system can be maintained through response characteristic analysis of the control device for the pitch blade control hydraulic actuator. The fan pitch blade control hydraulic actuator of a 500 MW large-scale boiler is frequently operated under normal operating conditions. Problems or malfunctions of the pitch blade control hydraulic actuators lead to a decline in boiler thermal efficiency and unexpected power plant trips. The inlet and outlet gas can be controlled using the fan pitch blade control hydraulic actuators in order to regulate the internal pressure of the furnace and control the frequency in a power plant that burns soft coal as its power source.

  19. A Chronological Reliability Model to Assess Operating Reserve Allocation to Wind Power Plants: Preprint

    International Nuclear Information System (INIS)

    Milligan, M. R.

    2001-01-01

    As the use of wind power plants increases worldwide, it is important to understand the effect these power sources have on the operation of the grid. This paper focuses on the operating reserve impact of wind power plants. Many probabilistic methods have been applied to power system analysis, and some of these are the basis of reliability analysis. This paper builds on a probabilistic technique to allocate the operating reserve burden among power plants in the grid. The method was originally posed by Strbac and Kirschen [1] and uses an allocation that prorates the reserve burden based on expected energy not delivered. Extending this method to include wind power plants allows the reserve burden to be allocated among different plants using the same method, yet incorporates information about the intermittent nature of wind power plants.
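
    A minimal sketch of the prorating idea attributed to Strbac and Kirschen: each plant's share of the operating reserve burden is proportional to its expected energy not delivered. All plant figures are invented.

```python
# Expected energy not delivered (EEND) per plant, MWh (invented numbers):
eend = {"coal_1": 12.0, "gas_1": 5.0, "wind_1": 40.0, "wind_2": 33.0}

total_reserve_mw = 150.0  # system operating reserve to allocate
total_eend = sum(eend.values())

# Prorate the reserve burden in proportion to each plant's EEND:
allocation = {plant: total_reserve_mw * e / total_eend
              for plant, e in eend.items()}
for plant, mw in allocation.items():
    print(f"{plant}: {mw:.1f} MW of reserve burden")
```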

  20. A numerical algorithm of fuzzy reliability

    International Nuclear Information System (INIS)

    Jiang Qimi; Chen, C.-H.

    2003-01-01

    In this paper, a computational model of fuzzy reliability is presented, focusing on engineering problems with random general stress and fuzzy general strength. The mathematical basis of this computational model is that the fuzzy probability can be computed with the methods of conventional probability by means of a mathematical transition. Based on this computational model, a numerical algorithm is given which can be applied to compute the fuzzy reliability of mechanical components, sensors, electronic units, etc. This establishes a basis for the reliability analysis of systems consisting of components with fuzzy reliability. As an example, a case study of the fuzzy reliability analysis of a kind of sensor used in railway systems is provided to verify the logic of this algorithm. The computational results show that this algorithm fits engineering experience.
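
    A sketch of the stress-strength computation under one reading of the "mathematical transition" mentioned above: the fuzzy strength's membership function is normalized into an equivalent density, after which reliability is the usual probability that strength exceeds stress. Distribution shapes and parameters are assumptions.

```python
import numpy as np

x = np.linspace(0.0, 300.0, 3001)
dx = x[1] - x[0]

# Random stress: normal density (invented parameters).
mu_s, sd_s = 120.0, 15.0
f_stress = np.exp(-0.5 * ((x - mu_s) / sd_s) ** 2) / (sd_s * np.sqrt(2 * np.pi))

# Fuzzy strength: triangular membership on [150, 180, 220], normalized to
# integrate to one so it can be treated as a density (the "transition").
mem = np.interp(x, [150.0, 180.0, 220.0], [0.0, 1.0, 0.0])
f_strength = mem / (mem.sum() * dx)

# R = P(strength > stress) = integral of f_stress(s) * P(strength > s) ds
surv_strength = 1.0 - np.cumsum(f_strength) * dx
R = float(np.sum(f_stress * surv_strength) * dx)
print(f"fuzzy reliability ~ {R:.4f}")
```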

  1. A Combined Reliability Model of VSC-HVDC Connected Offshore Wind Farms Considering Wind Speed Correlation

    DEFF Research Database (Denmark)

    Guo, Yifei; Gao, Houlei; Wu, Qiuwei

    2017-01-01

    and WTGs outage. The wind speed correlation between different WFs is included in the two-dimensional multistate WF model by using an improved k-means clustering method. Then, the entire system with two WFs and a three-terminal VSC-HVDC system is modeled as a multi-state generation unit. The proposed model...
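
    A sketch of how a wind-speed series might be clustered into a small number of states for such a multi-state wind farm model; plain 1-D k-means is used here, whereas the paper uses an improved variant, and the data are synthetic.

```python
import numpy as np

def kmeans_1d(data, k, iters=100, seed=0):
    """Plain 1-D k-means: returns sorted state centers and state probabilities."""
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    centers = rng.choice(data, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(data[:, None] - centers[None, :]), axis=1)
        # Keep a center in place if its cluster happens to be empty:
        centers = np.array([data[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    order = np.argsort(centers)
    probs = np.array([(labels == j).mean() for j in order])
    return centers[order], probs

rng = np.random.default_rng(1)
wind_speed = rng.weibull(2.0, size=10_000) * 8.0   # synthetic m/s series
states, probs = kmeans_1d(wind_speed, k=5)
for v, p in zip(states, probs):
    print(f"state {v:5.1f} m/s with probability {p:.3f}")
```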

  2. Generic finite element models of orthodontic mini-implants: Are they reliable?

    Science.gov (United States)

    Albogha, Mhd Hassan; Takahashi, Ichiro

    2015-11-05

    Many finite element (FE) studies have used different settings for modeling orthodontic mini-implants (OMIs). This study compares different approaches to modeling OMIs with the FE method, to exhibit the role of key factors in the modeling process. A computerized tomography (CT) dataset of a living human was used to develop a subject-specific FE model of a bone specimen, and a micro-CT scan was used to generate the geometry of the OMI. Another five models were developed to assess the effect of changing different settings in the FE modeling process. These five models differed from the subject-specific model in either: (i) the method of assigning bone properties, (ii) the geometries and constraint conditions, or (iii) the simulation method of bone-implant contact (BIC). The models presented significant differences in maximum principal strain distribution. These differences were most apparent when the models differed either in the nature of the BIC or in the method of assigning bone properties. The models differing only in bone geometry showed differences in the intensity of strain rather than in its distribution pattern. There is a need for assessment and validation of all FE modeling approaches currently used for simulating the mechanical environment in the bone surrounding OMIs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Inter-rater reliability of healthcare professional skills' portfolio assessments: The Andalusian Agency for Healthcare Quality model

    Directory of Open Access Journals (Sweden)

    Antonio Almuedo-Paz

    2014-07-01

    Full Text Available This study aims to determine the reliability of the assessment criteria used for a portfolio at the Andalusian Agency for Healthcare Quality (ACSA). Data: all competence certification processes, regardless of discipline. Period: 2010-2011. Three types of evidence were assessed: 368 certificates, 17,895 reports and 22,642 clinical practice reports (N = 3,010 candidates). Each item was evaluated in pairs by the ACSA team of raters using two categories: valid and invalid. Results: The percentage agreement in assessments of certificates was 89.9%, while for reports it was 85.1% and for clinical practice reports 81.7%. The inter-rater agreement coefficients (kappa) ranged from 0.468 to 0.711. Discussion: The results of this study show that the inter-rater reliability of the assessments varies from fair to good. Compared with other similar studies, the results put the reliability of the model in a comfortable position. Among the improvements incorporated, the progressive automation of evaluations must be highlighted.
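
    For reference, pairwise agreement on a two-category (valid/invalid) rating task of the kind described is commonly summarized with Cohen's kappa. The counts below are invented, not ACSA data.

```python
def cohens_kappa(n11, n10, n01, n00):
    """Cohen's kappa for two raters and two categories.
    n11: both say valid, n00: both say invalid, n10/n01: disagreements."""
    n = n11 + n10 + n01 + n00
    p_obs = (n11 + n00) / n               # observed agreement
    p_valid_1 = (n11 + n10) / n           # rater 1 marginal for "valid"
    p_valid_2 = (n11 + n01) / n           # rater 2 marginal for "valid"
    p_exp = p_valid_1 * p_valid_2 + (1 - p_valid_1) * (1 - p_valid_2)
    return (p_obs - p_exp) / (1 - p_exp)  # chance-corrected agreement

print(f"kappa = {cohens_kappa(300, 25, 20, 55):.3f}")  # invented counts
```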

  4. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    Science.gov (United States)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth’s climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves--even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m^2 / MPa * s, whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant’s vegetative organs in jeopardy during drought events. Analysis of the thickness-to-span ratio of Asteroxylon’s tracheids suggests that environmental conditions of reduced relative
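
    A sketch of the kind of fluid-flow relation such hydraulic models are built on: Hagen-Poiseuille conductivity of an idealized cylindrical conduit, normalized by lumen area so the units match those quoted above. With water viscosity of about 1e-3 Pa·s, a 10 μm lumen radius already gives roughly 0.0125 m^2/(MPa·s), the order of the threshold mentioned; the radii below are illustrative, not Asteroxylon measurements.

```python
def specific_conductivity(radius_m, viscosity_pa_s=1.0e-3):
    """Hagen-Poiseuille conductivity of a cylindrical conduit, normalized
    by lumen area: k = r**2 / (8 * eta). Returned in m^2 / (MPa * s)."""
    k_si = radius_m**2 / (8.0 * viscosity_pa_s)  # m^2 / (Pa * s)
    return k_si * 1.0e6                          # m^2 / (MPa * s)

# Illustrative tracheid radii (micrometres -> metres):
for r_um in (5.0, 10.0, 20.0):
    k = specific_conductivity(r_um * 1e-6)
    print(f"r = {r_um:4.1f} um -> k ~ {k:.4f} m^2/(MPa*s)")
```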

  5. Telecommunications system reliability engineering theory and practice

    CERN Document Server

    Ayers, Mark L

    2012-01-01

    "Increasing system complexity require new, more sophisticated tools for system modeling and metric calculation. Bringing the field up to date, this book provides telecommunications engineers with practical tools for analyzing, calculating, and reporting availability, reliability, and maintainability metrics. It gives the background in system reliability theory and covers in-depth applications in fiber optic networks, microwave networks, satellite networks, power systems, and facilities management. Computer programming tools for simulating the approaches presented, using the Matlab software suite, are also provided"

  6. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    International Nuclear Information System (INIS)

    Hofer, E.; Hoffman, F.O.

    1987-02-01

    The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these distinctions have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model.
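
    A minimal sketch of propagating Type 2 (lack-of-knowledge) parameter uncertainty through a deterministic model: sample the uncertain parameters from subjective distributions, run the model, and report quantiles of the prediction. The toy model and its distributions are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Subjective (Type 2) distributions for two uncertain parameters:
transfer = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=n)  # unitless
intake   = rng.uniform(0.5, 2.0, size=n)                        # kg/day

dose = transfer * intake * 365.0   # toy deterministic model, per year

lo, med, hi = np.percentile(dose, [5, 50, 95])
print(f"prediction: median {med:.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")
```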

  7. Reliable NonLinear Model-Predictive Control via Validated Simulation

    OpenAIRE

    Alexandre dit Sandretto, Julien

    2017-01-01

    Model-Predictive Control (MPC) is one of the most advanced control techniques available today. Indeed, MPC approaches are well known for their robustness and stability properties. Nevertheless, Nonlinear Model-Predictive Control (NMPC), the extension of MPC to the nonlinear world, still poses challenging theoretical, computational and implementation issues. With the help of validated simulation, which can handle nonlinear models, a new algorithm for a robust-by-construction control strategy based on NMPC i...

  8. Multi-objective optimization of generalized reliability design problems using feature models-A concept for early design stages

    International Nuclear Information System (INIS)

    Limbourg, Philipp; Kochs, Hans-Dieter

    2008-01-01

    Reliability optimization problems such as the redundancy allocation problem (RAP) have been of considerable interest in the past. However, due to the restrictions of the design space formulation, they may not be applicable in all practical design problems. A method with high modelling freedom for rapid design screening is desirable, especially in early design stages. This work presents a novel approach to reliability optimization. Feature modelling, a specification method originating from software engineering, is applied for the fast specification and enumeration of complex design spaces. It is shown how feature models can describe not only arbitrary RAPs but also much more complex design problems. The design screening is accomplished by a multi-objective evolutionary algorithm for probabilistic objectives. Comparing averages or medians may hide the true characteristics of these distributions. Therefore the algorithm uses solely the probability of one system dominating another to approximate the Pareto-optimal set. We illustrate the approach by specifying a RAP and a more complex design space and screening them with the evolutionary algorithm.
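
    A toy version of the screening task (deterministic objectives only; the paper's feature models and probabilistic dominance are beyond this sketch): enumerate a small redundancy-allocation design space and keep the designs that are Pareto-optimal in reliability and cost. Component data are invented.

```python
from itertools import product

# Two subsystems in series; 1-3 redundant copies of each component type.
r_comp = {"A": 0.90, "B": 0.85}   # invented component reliabilities
c_comp = {"A": 4.0,  "B": 3.0}    # invented unit costs

designs = []
for n_a, n_b in product(range(1, 4), repeat=2):
    rel = (1 - (1 - r_comp["A"]) ** n_a) * (1 - (1 - r_comp["B"]) ** n_b)
    cost = n_a * c_comp["A"] + n_b * c_comp["B"]
    designs.append(((n_a, n_b), rel, cost))

def dominated(d, others):
    """d is dominated if some design is at least as reliable and as cheap,
    and strictly better in one of the two objectives."""
    return any(o[1] >= d[1] and o[2] <= d[2] and (o[1] > d[1] or o[2] < d[2])
               for o in others)

pareto = sorted((d for d in designs if not dominated(d, designs)),
                key=lambda d: d[2])
for (n_a, n_b), rel, cost in pareto:
    print(f"A x{n_a}, B x{n_b}: reliability={rel:.4f}, cost={cost:.1f}")
```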

  9. Development of a Kalman Filter in the Gauss-Helmert Model for Reliability Analysis in Orientation Determination with Smartphone Sensors.

    Science.gov (United States)

    Ettlinger, Andreas; Neuner, Hans; Burgess, Thomas

    2018-01-31

    The topic of indoor positioning and indoor navigation by using observations from smartphone sensors is very challenging as the determined trajectories can be subject to significant deviations compared to the route travelled in reality. Especially the calculation of the direction of movement is the critical part of pedestrian positioning approaches such as Pedestrian Dead Reckoning ("PDR"). Due to distinct systematic effects in filtered trajectories, it can be assumed that there are systematic deviations present in the observations from smartphone sensors. This article has two aims: one is to enable the estimation of partial redundancies for each observation as well as for observation groups. Partial redundancies are a measure for the reliability indicating how well systematic deviations can be detected in single observations used in PDR. The second aim is to analyze the behavior of partial redundancy by modifying the stochastic and functional model of the Kalman filter. The equations relating the observations to the orientation are condition equations, which do not exhibit the typical structure of the Gauss-Markov model ("GMM"), wherein the observations are linear and can be formulated as functions of the states. To calculate and analyze the partial redundancy of the observations from smartphone-sensors used in PDR, the system equation and the measurement equation of a Kalman filter as well as the redundancy matrix need to be derived in the Gauss-Helmert model ("GHM"). These derivations are introduced in this article and lead to a novel Kalman filter structure based on condition equations, enabling reliability assessment of each observation.
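
    For orientation, the classical Gauss-Markov counterpart of the quantity derived in the paper: partial redundancies are the diagonal of the redundancy matrix R = I - A (A^T P A)^{-1} A^T P, and they sum to the total redundancy n - u. The paper's contribution is the analogous derivation inside a Gauss-Helmert Kalman filter; the sketch below shows only the simpler GMM version on an invented problem.

```python
import numpy as np

# Invented linear model: n = 4 observations of u = 2 unknowns.
A = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
P = np.diag([1.0, 1.0, 0.5, 0.5])   # observation weights (inverse variances)

N = A.T @ P @ A                                    # normal-equation matrix
R = np.eye(4) - A @ np.linalg.solve(N, A.T @ P)    # redundancy matrix

partial_redundancies = np.diag(R)   # reliability measure per observation
print(partial_redundancies, "sum =", partial_redundancies.sum())  # sum = n - u = 2
```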

  10. Providing Context for Complexity: Using Infographics and Conceptual Models to Teach Global Change Processes

    Science.gov (United States)

    Bean, J. R.; White, L. D.

    2015-12-01

    Understanding modern and historical global changes requires interdisciplinary knowledge of the physical and life sciences. The Understanding Global Change website from the UC Museum of Paleontology will use a focal infographic that unifies diverse content often taught in separate K-12 science units. This visualization tool provides scientists with a structure for presenting research within the broad context of global change, and supports educators with a framework for teaching and assessing student understanding of complex global change processes. This new approach to teaching the science of global change is currently being piloted and refined based on feedback from educators and scientists in anticipation of a 2016 website launch. Global change concepts are categorized within the infographic as causes of global change (e.g., burning of fossil fuels, volcanism), ongoing Earth system processes (e.g., ocean circulation, the greenhouse effect), and the changes scientists measure in Earth's physical and biological systems (e.g., temperature, extinctions/radiations). The infographic will appear on all website content pages and provides a template for the creation of flowcharts, which are conceptual models that allow teachers and students to visualize the interdependencies and feedbacks among processes in the atmosphere, hydrosphere, biosphere, and geosphere. The development of this resource is timely given that the newly adopted Next Generation Science Standards emphasize cross-cutting concepts, including model building, and Earth system science. Flowchart activities will be available on the website to scaffold inquiry-based lessons, determine student preconceptions, and assess student content knowledge. The infographic has already served as a learning and evaluation tool during professional development workshops at UC Berkeley, Stanford University, and the Smithsonian National Museum of Natural History. At these workshops, scientists and educators used the infographic

  11. Sparse Power-Law Network Model for Reliable Statistical Predictions Based on Sampled Data

    Directory of Open Access Journals (Sweden)

    Alexander P. Kartun-Giles

    2018-04-01

    Full Text Available A projective network model is a model that enables predictions to be made based on a subsample of the network data, with the predictions remaining unchanged if a larger sample is taken into consideration. An exchangeable model is a model that does not depend on the order in which nodes are sampled. Despite a large variety of non-equilibrium (growing) and equilibrium (static) sparse complex network models that are widely used in network science, how to reconcile sparseness (constant average degree) with the desired statistical properties of projectivity and exchangeability is currently an outstanding scientific problem. Here we propose a network process with hidden variables which is projective and can generate sparse power-law networks. Despite the model not being exchangeable, it can be closely related to exchangeable uncorrelated networks, as indicated by its information-theoretic characterization and its network entropy. The use of the proposed network process as a null model is here tested on real data, indicating that the model offers a promising avenue for statistical network modelling.
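
    A sketch of a sparse hidden-variable network process of the general kind discussed: each node draws a power-law weight w, and nodes link with probability w_i * w_j / (N * <w>), capped at one, which keeps the mean degree constant as N grows. This is the generic recipe, not the authors' specific projective construction.

```python
import numpy as np

rng = np.random.default_rng(7)
N, gamma, w_min = 2_000, 2.5, 1.0

# Power-law hidden variables via inverse-CDF sampling: P(w) ~ w**(-gamma).
w = w_min * (1.0 - rng.random(N)) ** (-1.0 / (gamma - 1.0))
mean_w = w.mean()

edges = []
for i in range(N):
    # p_ij = min(1, w_i * w_j / (N * <w>)): expected degree of node i
    # is about w_i, independent of N, so the network stays sparse.
    p = np.minimum(1.0, w[i] * w[i + 1:] / (N * mean_w))
    js = np.nonzero(rng.random(N - i - 1) < p)[0] + i + 1
    edges.extend((i, int(j)) for j in js)

degrees = np.bincount(np.array(edges).ravel(), minlength=N)
print(f"{len(edges)} edges, mean degree {degrees.mean():.2f}")
```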

  12. Analysis of the reliability of the local effect model for the use in carbon ion treatment planning systems.

    Science.gov (United States)

    Russo, G; Attili, A; Bourhaleb, F; Marchetto, F; Peroni, C; Schmitt, E; Bertrand, D

    2011-02-01

    In radiotherapy with carbon ions, biological effects of treatments have to be predicted. For this purpose, one of the most used models is the local effect model (LEM) developed at the Gesellschaft für Schwerionenforschung (GSI), Germany. At the Istituto Nazionale di Fisica Nucleare, Italy, the reliability of the last published version of LEM (LEM III) in reproducing radiobiological data has been checked under both monoenergetic and spread-out Bragg peak (SOBP) carbon-ion irradiation. The reproduction of the monoenergetic measurements with the LEM was rather successful for some cell lines, while it failed for the less-radioresistant ones. The SOBP experimental trend was predicted by the LEM, but a large shift between model curves and measured points was observed.

  13. MODEL REQUEST FOR PROPOSALS TO PROVIDE ENERGY AND OTHER ATTRIBUTES FROM AN OFFSHORE WIND POWER PROJECT

    Energy Technology Data Exchange (ETDEWEB)

    Jeremy Firestone; Dawn Kurtz Crompton

    2011-10-22

    This document provides a model RFP for new generation. The 'base' RFP is for a single-source offshore wind RFP. Required modifications are noted should a state or utility seek multi-source bids (e.g., all renewables or all sources). The model is premised on proposals meeting threshold requirements (e.g., a MW range of generating capacity and a range in terms of years), RFP issuer preferences (e.g., likelihood of commercial operation by a date certain, price certainty, and reduction in congestion), and evaluation criteria, along with a series of plans (e.g., site, environmental effects, construction, community outreach, interconnection, etc.). The Model RFP places the most weight on project risk (45%), followed by project economics (35%), and environmental and social considerations (20%). However, if a multi-source RFP is put forward, the sponsor would need to either add per-MWh technology-specific, life-cycle climate (CO2), environmental and health impact costs to bid prices under the 'Project Economics' category or it should increase the weight given to the 'Environmental and Social Considerations' category.
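
    A trivial sketch of how the stated 45/35/20 weighting might be applied when scoring bids; the per-category scores (on a common 0-100 scale) are invented.

```python
weights = {"project_risk": 0.45, "project_economics": 0.35,
           "environment_social": 0.20}

def score(bid):
    """Weighted-sum evaluation over the three RFP categories."""
    return sum(weights[k] * bid[k] for k in weights)

bid_a = {"project_risk": 72, "project_economics": 80, "environment_social": 65}
bid_b = {"project_risk": 85, "project_economics": 60, "environment_social": 70}
print(f"bid A: {score(bid_a):.1f}, bid B: {score(bid_b):.1f}")
```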

  14. Mathematical modeling provides kinetic details of the human immune response to vaccination

    Directory of Open Access Journals (Sweden)

    Dustin Le

    2015-01-01

    Full Text Available With major advances in experimental techniques to track antigen-specific immune responses, many basic questions on the kinetics of virus-specific immunity in humans remain unanswered. To gain insights into the kinetics of T and B cell responses in human volunteers, we combine mathematical models and experimental data from recent studies employing vaccines against yellow fever and smallpox. The yellow fever virus-specific CD8 T cell population expanded slowly, with an average doubling time of 2 days, peaking 2.5 weeks post immunization. Interestingly, we found that the peak of the yellow fever-specific CD8 T cell response is determined by the rate of T cell proliferation and not by the precursor frequency of antigen-specific cells, as has been suggested in several studies in mice. We also found that while the frequency of virus-specific T cells increases slowly, the slow increase can still accurately explain clearance of yellow fever virus in the blood. Our additional mathematical model describes well the kinetics of the virus-specific antibody-secreting cell and antibody response to vaccinia virus in vaccinated individuals, suggesting that most antibodies at 3 months post immunization are derived from the population of circulating antibody-secreting cells. Taken together, our analysis provides novel insights into the mechanisms by which live vaccines induce immunity to viral infections and highlights challenges of applying methods of mathematical modeling to current, state-of-the-art yet limited immunological data.
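
    A sketch of the growth kinetics quoted above: exponential expansion with a 2-day doubling time up to a peak at about 2.5 weeks, followed by contraction. The initial frequency and the decay rate are assumptions, not the authors' fitted values.

```python
import numpy as np

doubling_days, t_peak = 2.0, 17.5   # values quoted in the abstract
r = np.log(2) / doubling_days       # net expansion rate, per day
f0, decay = 1e-6, 0.10              # assumed start frequency and decay rate

def cd8_frequency(t_days):
    """Piecewise expansion/contraction of the virus-specific CD8 frequency."""
    peak = f0 * np.exp(r * t_peak)
    if t_days <= t_peak:
        return f0 * np.exp(r * t_days)
    return peak * np.exp(-decay * (t_days - t_peak))

for t in (7, 14, 17.5, 30):
    print(f"day {t:5.1f}: frequency {cd8_frequency(t):.2e}")
```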

  15. Mathematical modeling provides kinetic details of the human immune response to vaccination.

    Science.gov (United States)

    Le, Dustin; Miller, Joseph D; Ganusov, Vitaly V

    2014-01-01

    With major advances in experimental techniques to track antigen-specific immune responses, many basic questions on the kinetics of virus-specific immunity in humans remain unanswered. To gain insights into the kinetics of T and B cell responses in human volunteers, we combined mathematical models and experimental data from recent studies employing vaccines against yellow fever and smallpox. The yellow fever virus-specific CD8 T cell population expanded slowly, with an average doubling time of 2 days, peaking 2.5 weeks post immunization. Interestingly, we found that the peak of the yellow fever-specific CD8 T cell response was determined by the rate of T cell proliferation and not by the precursor frequency of antigen-specific cells, as has been suggested in several studies in mice. We also found that while the frequency of virus-specific T cells increased slowly, the slow increase could still accurately explain clearance of yellow fever virus in the blood. Our additional mathematical model described well the kinetics of the virus-specific antibody-secreting cell and antibody response to vaccinia virus in vaccinated individuals, suggesting that most antibodies at 3 months post immunization were derived from the population of circulating antibody-secreting cells. Taken together, our analysis provided novel insights into mechanisms by which live vaccines induce immunity to viral infections and highlighted challenges of applying methods of mathematical modeling to current, state-of-the-art yet limited immunological data.

  16. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high ... A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis are presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical components (IGBTs). Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed.
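
    A sketch of a typical PoF-style lifetime estimate for IGBT modules of the kind such methodologies formalize: a Coffin-Manson-Arrhenius relation for cycles to failure under thermal cycling, accumulated over a mission profile with Miner's rule. All coefficients and the profile are generic illustrative assumptions, not values from the paper.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def cycles_to_failure(dTj, Tj_mean_K, A=3.0e13, n=5.0, Ea=0.12):
    """Coffin-Manson-Arrhenius: Nf = A * dTj**-n * exp(Ea / (kB * Tjm)).
    A, n and Ea are generic illustrative coefficients."""
    return A * dTj ** (-n) * np.exp(Ea / (K_B * Tj_mean_K))

# Hypothetical annual mission profile: (junction temp swing K, mean K, cycles/yr)
profile = [(30.0, 330.0, 2.0e6),
           (50.0, 340.0, 2.0e5),
           (80.0, 350.0, 1.0e4)]

# Miner's rule: damage fractions accumulate linearly across cycle classes.
damage = sum(n_cyc / cycles_to_failure(dT, Tm) for dT, Tm, n_cyc in profile)
print(f"annual damage = {damage:.3f} -> lifetime ~ {1.0/damage:.1f} years")
```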

  17. Guarana Provides Additional Stimulation over Caffeine Alone in the Planarian Model

    Science.gov (United States)

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R.; Constable, Mic Andre; Mulligan, Margaret E.; Voura, Evelyn B.

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  18. Guarana provides additional stimulation over caffeine alone in the planarian model.

    Directory of Open Access Journals (Sweden)

    Dimitrios Moustakas

    Full Text Available The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose.

  19. Increasing the reliability of ecological models using modern software engineering techniques

    Science.gov (United States)

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  20. Reliability of simulated robustness testing in fast liquid chromatography, using state-of-the-art column technology, instrumentation and modelling software.

    Science.gov (United States)

    Kormány, Róbert; Fekete, Jenő; Guillarme, Davy; Fekete, Szabolcs

    2014-02-01

    The goal of this study was to evaluate the accuracy of simulated robustness testing using commercial modelling software (DryLab) and state-of-the-art stationary phases. For this purpose, a mixture of amlodipine and its seven related impurities was analyzed on short narrow-bore columns (50×2.1 mm, packed with sub-2 μm particles) providing short analysis times. The performance of the commercial modelling software for robustness testing was systematically compared to experimental measurements and DoE-based predictions. We demonstrated that the reliability of the predictions was good, since the predicted retention times and resolutions were in good agreement with the experimental ones at the edges of the design space, and the retention time relative errors were low on average. Using the modelling software, we proved that the separation was feasible on all columns within the same analysis time (less than 4 min), by proper adjustment of variables. Copyright © 2013 Elsevier B.V. All rights reserved.