Suitability Analysis of Continuous-Use Reliability Growth Projection Models
2015-03-26
exists for all types, shapes, and sizes. The primary focus of this study is a comparison of reliability growth projection models designed for... requirements to use reliability growth models, recent studies have noted trends in reliability failures throughout the DoD. In [14] Dr. Michael Gilmore... so a strict exponential distribution was used to stay within their assumptions. In reality, however, reliability growth models often must be used
Nanowire growth process modeling and reliability models for nanodevices
Fathi Aghdam, Faranak
Nowadays, nanotechnology is becoming an inescapable part of everyday life. The main barrier to its rapid growth is our inability to produce nanoscale materials in a reliable and cost-effective way. In fact, the current yield of nano-devices is very low (around 10%), which makes fabrication of nano-devices very expensive and uncertain. To overcome this challenge, the first and most important step is to investigate how to control variations in nano-structure synthesis. The main directions of reliability research in nanotechnology can be classified from either a material perspective or a device perspective. The first direction focuses on restructuring materials and/or optimizing process conditions at the nano-level (nanomaterials). The other direction is linked to nano-devices and includes the creation of nano-electronic and electro-mechanical systems at nano-level architectures while taking into account the reliability of future products. In this dissertation, we have investigated two topics spanning nano-materials and nano-devices. In the first research work, we have studied the optimization of one of the most important nanowire growth processes using statistical methods. Research on nanowire growth with patterned arrays of catalyst has shown that the wire-to-wire spacing is an important factor affecting the quality of the resulting nanowires. To improve the process yield and the length uniformity of fabricated nanowires, it is important to reduce the resource competition between nanowires during the growth process. We have proposed a physical-statistical nanowire-interaction model that considers the shadowing effect and the shared substrate diffusion area to determine the optimal pitch that would ensure minimum competition between nanowires. A sigmoid function is used in the model, and the least squares method is used to estimate the model parameters. The estimated model is then used to determine the optimal spatial arrangement of catalyst arrays.
Modeling Reliability Growth in Accelerated Stress Testing
2013-12-01
projection models for both continuous use and discrete use systems found anywhere in the literature. The review comprises a synopsis of over 80... pertaining to the research that may have been unfamiliar to the reader. The chapter has provided a synopsis of the research accomplished in the fields of... Cox, "Analysis of the probability and risk of cause specific failure," International Journal of Radiation Oncology, Biology, Physics, vol. 29, no. 5
Exponential order statistic models of software reliability growth
Miller, D. R.
1986-01-01
Failure times of a software reliability growth process are modeled as order statistics of independent, non-identically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but there are many additional examples as well. Various characterizations, properties and examples of this class of models are developed and presented.
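The Exponential Order Statistic (EOS) construction described above is easy to sketch in code. The snippet below (an illustrative sketch; the fault counts and rates are assumed values, not from the paper) draws one realization of an EOS process: each fault has its own independent exponential detection time, and the observed failure history is those times in sorted order. Using identical rates recovers a Jelinski-Moranda-style realization; heterogeneous rates give one of the more general members of the class.

```python
import random

def eos_failure_times(rates, rng):
    """One realization of an Exponential Order Statistic (EOS) model:
    fault i has an independent Exp(rates[i]) detection time; the observed
    failure process is those times in sorted order."""
    return sorted(rng.expovariate(r) for r in rates)

rng = random.Random(42)

# Jelinski-Moranda as a special case: N faults with identical rate phi.
N, phi = 20, 0.5
jm_times = eos_failure_times([phi] * N, rng)

# A heterogeneous case: fault i is detected at rate phi / (i + 1),
# so some faults are much harder to find than others.
het_times = eos_failure_times([phi / (i + 1) for i in range(N)], rng)
```

In this framing, the modeling freedom is entirely in the choice of the rate vector, which is what unifies the named models as special cases.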
Stochastic Differential Equation-Based Flexible Software Reliability Growth Model
P. K. Kapur
2009-01-01
Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. When the software system is large and the number of faults detected during the testing phase becomes large, the change in the number of faults detected and removed through each debugging becomes sufficiently small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, built on the concept of stochastic differential equations, performs comparatively better than the existing NHPP-based models.
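The continuous-state idea above can be illustrated with a minimal Euler-Maruyama simulation. This is a generic sketch, not the paper's generalized Erlang model: it uses a simple Itô SDE of the form dN = b(a - N)dt + σ(a - N)dW, where N(t) is the cumulative detected-fault count, a is the initial fault content, and all parameter values are assumed for illustration.

```python
import math
import random

def simulate_sde_srgm(a, b, sigma, T, steps, rng):
    """Euler-Maruyama simulation of a simple Ito-type SRGM sketch:
        dN = b*(a - N) dt + sigma*(a - N) dW
    N(t): cumulative detected faults; a: initial fault content."""
    dt = T / steps
    N = 0.0
    path = [N]
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment
        N += b * (a - N) * dt + sigma * (a - N) * dW
        N = min(max(N, 0.0), a)              # keep the state in [0, a]
        path.append(N)
    return path

path = simulate_sde_srgm(a=100, b=0.1, sigma=0.05, T=50, steps=500,
                         rng=random.Random(1))
```

The drift term reproduces the exponential-type mean growth toward a, while the diffusion term captures the per-debugging randomness that the SDE formulation is meant to model.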
Open Source Software Reliability Growth Model by Considering Change- Point
Mashaallah Basirzadeh
2012-01-01
Software reliability growth modeling techniques are maturing rapidly, and software reliability growth models have been used extensively for closed source software. The design and development of open source software (OSS) differs from that of closed source software. We observed some basic characteristics of open source software: (i) more instruction executions and code coverage take place over time, (ii) release early, release often, (iii) frequent addition of patches, (iv) heterogeneity in fault density and effort expenditure, (v) frequent release activities seem to have changed the bug dynamics significantly, and (vi) bug reporting on the bug tracking system drastically increases and decreases. For this reason, the number of bugs reported on the bug tracking system remains irregular and fluctuates. Therefore, the fault detection/removal process cannot be smooth and may change at some time point called the change-point. In this paper, an instruction-execution-dependent software reliability growth model has been developed that considers a change-point in order to cater to the diverse and huge user profile, the irregular state of the bug tracking system, and the heterogeneity in fault distribution. We have analyzed actual software failure count data to show numerical examples of software reliability assessment for OSS. We also compare our model with conventional models in terms of goodness-of-fit for actual data. We have shown that the proposed model can assist in improving the quality of OSS systems developed under open source projects.
A Markov chain model for reliability growth and decay
Siegrist, K.
1982-01-01
A mathematical model is developed to describe a complex system undergoing a sequence of trials in which there is interaction between the internal states of the system and the outcomes of the trials. For example, the model might describe a system undergoing testing that is redesigned after each failure. The basic assumptions for the model are that the state of the system after a trial depends probabilistically only on the state before the trial and on the outcome of the trial, and that the outcome of a trial depends probabilistically only on the state of the system before the trial. It is shown that under these basic assumptions, the successive states form a Markov chain, and the successive states and outcomes jointly form a Markov chain. General results are obtained for the transition probabilities, steady-state distributions, etc. A special case studied in detail describes a system that has two possible states ('repaired' and 'unrepaired') undergoing trials that have three possible outcomes ('inherent failure', 'assignable-cause failure' and 'success'). For this model, the reliability function is computed explicitly and an optimal repair policy is obtained.
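The two-state special case lends itself to a small numerical sketch. The snippet below (illustrative only; the transition probabilities are assumed values, not taken from the paper) builds a state transition matrix for the 'unrepaired'/'repaired' chain and finds its steady-state distribution by power iteration, which is one of the general quantities the paper derives.

```python
def stationary(P, iters=10_000):
    """Stationary distribution of a 2-state Markov chain by power iteration.
    P[i][j] is the probability of moving from state i to state j."""
    pi = [0.5, 0.5]
    for _ in range(iters):
        pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
              pi[0] * P[0][1] + pi[1] * P[1][1]]
    return pi

# Assumed numbers for illustration: from 'unrepaired' (state 0) an
# assignable-cause failure triggers repair with probability 0.3; a
# 'repaired' system (state 1) stays repaired with probability 0.9.
P = [[0.7, 0.3],
     [0.1, 0.9]]
pi = stationary(P)   # long-run fraction of time in each state
```

For these numbers the chain spends 25% of trials unrepaired and 75% repaired in the long run, the kind of steady-state quantity from which a repair policy can be evaluated.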
B.B. Sagar
2016-09-01
The aim of this paper was to estimate the number of defects in software and remove them successfully. This paper incorporates a Weibull distribution approach along with the inflection S-shaped software reliability growth model (SRGM). In this combination, a two-parameter Weibull distribution is used. The relative prediction error (RPE) is calculated as the validity criterion of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, showing that the proposed software reliability growth model gives better estimates for defect removal. This paper presents a software reliability growth model that combines the Weibull distribution with the inflection S-shaped SRGM to estimate the defects of a software system, and provides help to researchers and software industries in developing highly reliable software products.
Alaa F. Sheta
2016-04-01
In this age of technology, building quality software is essential to competing in the business market. One of the major principles required for any quality business software product to fulfill its value is reliability. Estimating software reliability early in the software development life cycle saves time and money, as it prevents spending larger sums fixing a defective software product after deployment. A software reliability growth model (SRGM) can be used to predict the number of failures that may be encountered during the software testing process. In this paper we explore the advantages of the Grey Wolf Optimization (GWO) algorithm in estimating an SRGM's parameters, with the objective of minimizing the difference between the estimated and the actual number of failures of the software system. We evaluated three different software reliability growth models: the Exponential Model (EXPM), the Power Model (POWM) and the Delayed S-Shaped Model (DSSM). In addition, we used three different datasets to conduct an experimental study showing the effectiveness of our approach.
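The GWO-for-SRGM idea can be sketched end to end. The following is a minimal sketch of the standard GWO update (wolves move toward the three best solutions, alpha, beta and delta), used here to fit the EXPM mean value function m(t) = a(1 - e^{-bt}) by minimizing the sum of squared errors; it is not the authors' implementation, and the synthetic data, bounds and GWO settings are all assumed for illustration.

```python
import math
import random

def expm_mean(t, a, b):
    """Exponential (EXPM) mean value function m(t) = a * (1 - exp(-b*t))."""
    return a * (1.0 - math.exp(-b * t))

def sse(params, data):
    """Sum of squared errors between observed and predicted failure counts."""
    a, b = params
    return sum((m - expm_mean(t, a, b)) ** 2 for t, m in data)

def gwo(obj, lb, ub, n_wolves=20, iters=100, rng=random.Random(7)):
    """Minimal Grey Wolf Optimizer sketch."""
    dim = len(lb)
    wolves = [[rng.uniform(lb[d], ub[d]) for d in range(dim)]
              for _ in range(n_wolves)]
    best_pos, best_score, history = None, float("inf"), []
    for it in range(iters):
        wolves.sort(key=obj)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        score = obj(alpha)
        if score < best_score:
            best_score, best_pos = score, list(alpha)
        history.append(best_score)
        a_ctl = 2.0 - 2.0 * it / iters          # control parameter: 2 -> 0
        new = []
        for w in wolves:
            pos = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = 2 * a_ctl * rng.random() - a_ctl
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - w[d])
                    x += leader[d] - A * D
                pos.append(min(max(x / 3.0, lb[d]), ub[d]))  # average, clip
            new.append(pos)
        wolves = new
    return best_pos, best_score, history

# Synthetic failure-count data generated from a=100, b=0.1 (assumed values).
data = [(t, expm_mean(t, 100.0, 0.1)) for t in range(1, 21)]
best_pos, best_score, history = gwo(lambda p: sse(p, data),
                                    lb=[1.0, 1e-3], ub=[300.0, 1.0])
```

Swapping `expm_mean` for the POWM or DSSM mean value function changes nothing else in the loop, which is the appeal of a black-box optimizer for this task.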
[Anonymous]
2007-01-01
Several software reliability growth models (SRGMs) have been developed to monitor reliability growth during the testing phase of software development. Most of the existing research assumes that a similar testing effort is required for each debugging effort. In practice, however, different types of faults may require different amounts of testing effort for their detection and removal. Consequently, faults are classified into three categories on the basis of severity: simple, hard and complex. This categorization may be extended to r types of faults on the basis of severity. Although some existing research has incorporated the concept that the fault removal rate (FRR) differs across fault types, it assumes that the FRR remains constant during the overall testing period. On the contrary, it has been observed that as testing progresses, the FRR changes due to changes in testing strategy, skill, environment and personnel resources. In this paper, a general discrete SRGM is proposed for errors of different severity in software systems using the change-point concept. The models are then formulated for two particular environments and validated on two real-life data sets. The results show a better fit and wider applicability of the proposed models to different types of failure datasets.
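A change-point mean value function is simple to write down. The sketch below (a continuous-time, single-fault-type illustration with assumed parameters, not the paper's general discrete multi-severity model) uses an exponential-type m(t) whose fault removal rate switches from b1 to b2 at the change-point tau, constructed so that m is continuous at tau.

```python
import math

def changepoint_mean(t, a, b1, b2, tau):
    """Exponential-type mean value function with one change-point at tau:
    the fault removal rate is b1 before tau and b2 after, with m(t)
    continuous at tau."""
    if t <= tau:
        return a * (1.0 - math.exp(-b1 * t))
    return a * (1.0 - math.exp(-b1 * tau - b2 * (t - tau)))

# Assumed values: removal speeds up after testing strategy changes at t=10.
a, b1, b2, tau = 100.0, 0.05, 0.15, 10.0
pre = changepoint_mean(10.0, a, b1, b2, tau)
post = changepoint_mean(10.0 + 1e-9, a, b1, b2, tau)
later = changepoint_mean(30.0, a, b1, b2, tau)
```

The continuity constraint at tau is what distinguishes a change-point model from simply fitting two unrelated curves to the two testing phases.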
Fang, Chih-Chiang; Yeh, Chun-Wu
2016-09-01
The quantitative evaluation of a software reliability growth model is frequently accompanied by a confidence interval for fault detection, which provides helpful information to software developers and testers undertaking software development and software quality control. However, the explanation of the variance estimation of software fault detection has not been transparent in previous studies, and this affects the derivation of the confidence interval for the mean value function, which the current study addresses. Software engineers in such a case cannot evaluate the potential hazard based on the stochasticity of the mean value function, and this might reduce the practicability of the estimation. Hence, stochastic differential equations are utilised for confidence interval estimation of the software fault-detection process. The proposed model is estimated and validated using real data sets to show its flexibility.
Reliability Growth in Space Life Support Systems
Jones, Harry W.
2014-01-01
A hardware system's failure rate often increases over time due to wear and aging, but not always. Some systems instead show reliability growth, a decreasing failure rate with time, due to effective failure analysis and remedial hardware upgrades. Reliability grows when failure causes are removed by improved design. A mathematical reliability growth model allows the reliability growth rate to be computed from the failure data. The space shuttle was extensively maintained, refurbished, and upgraded after each flight and it experienced significant reliability growth during its operational life. In contrast, the International Space Station (ISS) is much more difficult to maintain and upgrade and its failure rate has been constant over time. The ISS Carbon Dioxide Removal Assembly (CDRA) reliability has slightly decreased. Failures on ISS and with the ISS CDRA continue to be a challenge.
Research on Exponential Reliability Growth Models
韩庆田; 李文强; 曹文静
2012-01-01
A product's reliability growth test usually has several stages, each conducted after improvements in design, process, materials and other aspects, so that reliability increases continually. Considering the reliability growth characteristics of a product during the development phase, and based on the properties of the Duane learning curve, an exponential reliability growth model was studied, and maximum likelihood estimates of the parameters and a reliability assessment method were given. The case study shows that the model is simple, consistent with engineering practice, and suitable for reliability growth assessment of small-sample products.
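Duane-type growth is usually fitted with the Crow/AMSAA power-law NHPP, for which the maximum likelihood estimates have a closed form. The sketch below (illustrative failure times, not from the paper) computes the time-truncated MLEs beta = n / sum(ln(T/t_i)) and lambda = n / T^beta, plus the demonstrated MTBF at the end of test.

```python
import math

def crow_amsaa_mle(times, T):
    """MLEs for the Crow/AMSAA power-law NHPP, time-truncated test:
    intensity rho(t) = lam * beta * t**(beta - 1), with failures at
    `times` observed over (0, T]."""
    n = len(times)
    beta = n / sum(math.log(T / t) for t in times)
    lam = n / T ** beta
    mtbf = 1.0 / (lam * beta * T ** (beta - 1))  # demonstrated MTBF at T
    return beta, lam, mtbf

# Assumed failure times (hours) over a 1000-hour reliability growth test.
times = [25.0, 70.0, 140.0, 260.0, 450.0, 700.0, 960.0]
beta, lam, mtbf = crow_amsaa_mle(times, T=1000.0)
```

A fitted beta below 1 (as with these inter-failure times, which lengthen over the test) indicates a decreasing failure intensity, i.e. reliability growth.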
[Anonymous]
2007-01-01
Failure of a safety-critical system can lead to big losses. Very high software reliability is required for automating the operation of systems such as aircraft controllers and nuclear reactor controller software systems. Fault-tolerant software is used to increase the overall reliability of software systems. Fault tolerance is achieved using schemes such as fault recovery (the recovery block scheme), fault masking (N-version programming (NVP)) or a combination of both (the hybrid scheme). Such software incorporates the ability of the system to survive even a failure. Many researchers in the field of software engineering have done excellent work studying the reliability of fault-tolerant systems. Most of them consider stable system reliability. Few attempts have been made in reliability modeling to study reliability growth for an NVP system. Recently, a model was proposed to analyze the reliability growth of an NVP system incorporating the effect of fault removal efficiency. In that model, a proportion of the number of failures is assumed to be a measure of fault generation, whereas an appropriate measure of fault generation should be the proportion of faults removed. In this paper, we first propose a testing efficiency model incorporating the effects of imperfect fault debugging and error generation. Using this model, a software reliability growth model (SRGM) is developed to model the reliability growth of an NVP system. The proposed model is useful for practical applications and can provide measures of debugging effectiveness and the additional workload or skilled professionals required. It is very important for a developer to determine the optimal release time of the software to improve its performance in terms of competition and cost. In this paper, we also formulate the optimal software release time problem for a 3VP system under a fuzzy environment and discuss a fuzzy optimization technique for solving the problem with a numerical illustration.
Supply chain reliability modelling
Eugen Zaitsev
2012-03-01
Background: Today it is virtually impossible to operate alone at the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability as well as the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum supply plan using an economic criterion, together with a model for evaluating the probability of failure-free operation of the supply chain. Methods: The mathematical model and algorithms were developed and formulated using an economic criterion and the model for evaluating the probability of failure-free operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of a goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The sequence of operations that should be taken into account during supply planning with the supplier's functional reliability was presented.
[Anonymous]
2007-01-01
Since the early 1970s, tremendous growth has been seen in research on software reliability growth modeling. In general, software reliability growth models (SRGMs) are applicable to the late stages of testing in software development, and they can provide useful information about how to improve the reliability of software products. A number of SRGMs have been proposed in the literature to represent the time-dependent fault identification/removal phenomenon; still, new models are being proposed that could fit a greater number of reliability growth curves. Often, it is assumed that detected faults are immediately corrected when mathematical models are developed. This assumption may not be realistic in practice, because the time to remove a detected fault depends on the complexity of the fault, the skill and experience of the personnel, the size of the debugging team, the technique, and so on. Thus, a detected fault need not be immediately removed, and removal may lag the fault detection process by a delay effect factor. In this paper, we first review how different software reliability growth models have been developed in which the fault detection process depends not only on the residual fault content but also on the testing time, and show how these models can be reinterpreted as delayed fault detection models by using a delay effect factor. Based on the power function of testing time concept, we propose four new SRGMs that assume the presence of two types of faults in the software: leading and dependent faults. Leading faults are those that can be removed upon a failure being observed. Dependent faults, however, are masked by leading faults and can only be removed after the corresponding leading fault has been removed, with a debugging time lag. These models have been tested on real software error data to show their goodness of fit, predictive validity and applicability.
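The effect of a removal lag is easy to see by comparing the classical delayed S-shaped mean value function, m(t) = a(1 - (1 + bt)e^{-bt}), with the lag-free exponential (Goel-Okumoto) model a(1 - e^{-bt}). This is a generic textbook comparison with assumed parameters, not the paper's four new models.

```python
import math

def delayed_s_shaped(t, a, b):
    """Delayed S-shaped mean value function m(t) = a*(1 - (1 + b*t)*exp(-b*t)):
    fault removal lags detection, producing an S-shaped growth curve."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

def goel_okumoto(t, a, b):
    """Exponential model without a removal lag, for comparison."""
    return a * (1.0 - math.exp(-b * t))

a, b = 100.0, 0.2   # assumed fault content and detection rate
# Early in testing, the removal lag keeps the delayed model below the
# exponential one...
early = (delayed_s_shaped(5, a, b), goel_okumoto(5, a, b))
# ...while both curves approach the total fault content a for large t.
late = (delayed_s_shaped(60, a, b), goel_okumoto(60, a, b))
```

The extra (1 + bt) factor is exactly the delay effect: it suppresses early removals while leaving the asymptote a unchanged.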
RESEARCH ON RELIABILITY GROWTH FOR SYNCHRONOUSLY DEVELOPED MULTI-SYSTEMS
MA Xiao-ning; LÜ Zhen-zhou; YUE Zhu-feng
2005-01-01
An advanced reliability growth model, i.e., an exponential model, was presented to estimate the model parameters for multi-systems that were synchronously tested, synchronously censored, and synchronously improved. In the presented method, the data produced during the reliability growth process were taken into consideration sufficiently, including the failure numbers, safety numbers and failure times at each censored time. If the multi-systems were synchronously improved many times, and the reliability growth of each system fitted the AMSAA (Army Material Systems Analysis Activity) model, the failure time of each system could reasonably be considered to follow an exponential distribution between two adjoining censored times. The nonparametric method was employed to obtain the reliability at each censored time of the synchronous multi-systems. Point estimates of the model parameters, a and b, were given by the least squares method, and a confidence interval for the parameter b was given as well. An engineering illustration was used to compare the result of the presented method with those of the available models. The result shows that the presented exponential growth model fits the AMSAA-BISE (Army Material Systems Analysis Activity-Beijing Institute of Structure and Environment) model rather well, and the two models are suitable for estimating reliability growth for synchronously developed multi-systems.
Reliability Modeling of Wind Turbines
Kostandyan, Erik
…and uncertainties are quantified. Further, estimation of annual failure probability for structural components taking into account possible faults in electrical or mechanical systems is considered. For a representative structural failure mode, a probabilistic model is developed that incorporates grid loss failures… components. Thus, models of reliability should be developed and applied in order to quantify the residual life of the components. Damage models based on physics of failure combined with stochastic models describing the uncertain parameters are imperative for development of cost-optimal decision tools… for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied…
Clavijo Michelangeli, Jose A; Bhakta, Mehul; Gezan, Salvador A; Boote, Kenneth J; Vallejos, C Eduardo
2013-11-01
The lack of dependable morphological indicators for the onset and end of seed growth has hindered modeling work in the common bean (Phaseolus vulgaris L.). We have addressed this problem through the use of mathematical growth functions to analyse and identify critical developmental stages, which can be linked to existing developmental indices. We performed this study under greenhouse conditions with an Andean and a Mesoamerican genotype of contrasting pod and seed phenotypes, and three selected recombinant inbred lines. Pods from tagged flowers were harvested at regular time intervals for various measurements. Differences in flower production and seed and pod growth trajectories among genotypes were detected via comparisons of parameters of fitted growth functions. Regardless of the genotype, the end of pod elongation marked the beginning of seed growth, which lasted until pods displayed a sharp decline in color, or pod hue angle. These results suggest that the end of pod elongation and the onset of color change are reliable indicators of important developmental transitions in the seed, even for widely differing pod phenotypes. We also provide a set of equations that can be used to model different aspects of reproductive growth and development in the common bean. © 2013 John Wiley & Sons Ltd.
Fatigue Reliability of Deck Structures Subjected to Correlated Crack Growth
G.Q. Feng; Y. Garbatov; C. Guedes Soares
2013-01-01
The objective of this work is to analyse the fatigue reliability of deck structures subjected to correlated crack growth. The stress intensity factors of the correlated cracks are obtained by finite element analysis, and based on them the geometry correction functions are derived. Monte Carlo simulations are applied to predict the statistical descriptors of correlated cracks based on the Paris-Erdogan equation. A probabilistic model of crack growth as a function of time is used to analyse the fatigue reliability of deck structures accounting for crack propagation correlation. A deck structure is modelled as a series system of stiffened panels, where a stiffened panel is regarded as a parallel system composed of plates and longitudinals. It is shown that the method developed here can be conveniently applied to the fatigue reliability assessment of structures subjected to correlated crack growth.
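The Monte Carlo over the Paris-Erdogan law can be sketched for a single (uncorrelated) crack. The snippet below integrates da/dN = C(ΔK)^m with ΔK = YΔσ√(πa) in blocks of cycles, treats the Paris coefficient C as lognormally distributed, and estimates the probability that the crack reaches a critical size within a given service life. All material and loading values are assumed for illustration, and this omits the correlation structure that is the paper's focus.

```python
import math
import random

def cycles_to_critical(a0, a_crit, C, m, dsigma, Y=1.0,
                       dN=1000, max_cycles=10_000_000):
    """Integrate the Paris-Erdogan law da/dN = C*(dK)**m, with
    dK = Y*dsigma*sqrt(pi*a), in blocks of dN cycles until the crack
    length reaches a_crit (units: metres, MPa)."""
    a, n = a0, 0
    while a < a_crit and n < max_cycles:
        dK = Y * dsigma * math.sqrt(math.pi * a)
        a += C * dK ** m * dN
        n += dN
    return n

rng = random.Random(3)
# Assumed deterministic inputs and an assumed lognormal scatter on C.
m, dsigma, a0, a_crit = 3.0, 80.0, 0.001, 0.02
life, trials, failures = 2_000_000, 200, 0
for _ in range(trials):
    C = rng.lognormvariate(math.log(1e-11), 0.3)
    if cycles_to_critical(a0, a_crit, C, m, dsigma) <= life:
        failures += 1
p_fail = failures / trials   # Monte Carlo fatigue failure probability
```

Extending this to the correlated case amounts to drawing the Paris parameters (and geometry functions) jointly across cracks instead of independently per trial.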
Hybrid reliability model for fatigue reliability analysis of steel bridges
曹珊珊; 雷俊卿
2016-01-01
A kind of hybrid reliability model is presented to solve fatigue reliability problems of steel bridges. The cumulative damage model is one kind of model used in fatigue reliability analysis, and its parameter characteristics can be described as probabilistic and interval. A two-stage hybrid reliability model is given, with a theoretical foundation and a solving algorithm, to solve hybrid reliability problems. The theoretical foundation is established by the consistency relationships of the interval reliability model and the probability reliability model with normally distributed variables. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. With consideration of the parameter characteristics of the S-N curve, the cumulative damage model with hybrid variables is given based on standards from different countries. Lastly, a case of the steel structure of the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on the AASHTO specifications.
Reliability Modeling of Wind Turbines
Kostandyan, Erik
Cost reductions for offshore wind turbines are a substantial requirement in order to make offshore wind energy more competitive compared to other energy supply methods. During the 20-25 years of a wind turbine's useful life, Operation & Maintenance costs are typically estimated to be a quarter… the actions should be made, and the type of actions requires knowledge of the accumulated damage or degradation state of the wind turbine components. For offshore wind turbines, the action times could be extended due to weather restrictions and result in damage or degradation increase of the remaining… for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied…
Overcoming some limitations of imprecise reliability models
Kozine, Igor; Krymsky, Victor
2011-01-01
The application of imprecise reliability models is often hindered by the rapid growth in imprecision that occurs when many components constitute a system, and by the fact that time to failure is bounded from above. The latter results in the necessity to explicitly introduce an upper bound on time to failure, which is in reality a rather arbitrary value. The practical meaning of models of this kind is brought into question. We suggest an approach that overcomes the issue of having to impose an upper bound on time to failure and makes the calculated lower and upper reliability measures more precise. The main assumption is that the failure rate is bounded. The Lagrange method is used to solve the non-linear program. Finally, an example is provided.
Reliability growth of thin film resistors contact
Lugin A. N.
2010-10-01
The necessity of resistive layer growth under the contact and in the contact zone of the resistive element is shown, in order to reduce peak values of current flow and power dissipation in the contact of a thin film resistor and thereby increase the resistor's stability against parametric and catastrophic failures.
Software reliability models for critical applications
Pham, H.; Pham, M.
1991-12-01
This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of the proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
A Software Reliability Model Using Quantile Function
Bijamma Thomas
2014-01-01
We study a class of software reliability models using the quantile function. Various distributional properties of the class of distributions are studied, and we also discuss its reliability characteristics. Inference procedures for the model parameters based on L-moments are studied. We apply the proposed model to a real data set.
Analysis on Some of Software Reliability Models
[Anonymous]
2001-01-01
The software reliability and maintainability evaluation tool (SRMET 3.0), developed by the Software Evaluation and Test Center of the China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. Numerical characteristics of all these models are studied in depth, and corresponding numerical algorithms for each model are also given.
Delivery Time Reliability Model of Logistics Network
Liusan Wu; Qingmei Tan; Yuehui Zhang
2013-01-01
Natural disasters like earthquakes and floods will surely destroy the existing traffic network, usually accompanied by delivery delay or even network collapse. A logistics-network-related delivery time reliability model defined by a shortest-time entropy is proposed as a means to estimate the actual delivery time reliability. The less the entropy is, the stronger the delivery time reliability remains, and vice versa. The shortest delivery time is computed separately based on two different assum...
SOFTWARE RELIABILITY MODEL FOR COMPONENT INTERACTION MODE
Wang Qiang; Lu Yang; Xu Zijun; Han Jianghong
2011-01-01
With the rapid progress of component technology, the software development methodology of assembling large numbers of components into complex software systems has matured. However, how to accurately assess application reliability from the system architecture together with the component reliabilities has become a knotty problem. In this paper, the defects in formal descriptions of software architecture and the limitations in existing model assumptions are both analyzed. Moreover, a new software reliability model called the Component Interaction Mode (CIM) is proposed. With this model, the problem that existing component-based software reliability analysis models cannot handle component interactions with non-independent failures and non-random control transitions is resolved. Finally, practical examples are presented to illustrate the effectiveness of the model.
Modeling of reliable multicasting services
Barkauskaite, Monika; Zhang, Jiang; Wessing, Henrik
2010-01-01
This paper addresses network survivability for multicast transport over MPLS-TP ring topology networks. Protection mechanisms standardized for unicast are not fully suitable for multicast point-to-multipoint transmission, and multicast schemes are not standardized yet. Therefore, this paper investigates one of the proficient protection schemes and uses OPNET Modeler for analyzing and designing networks with the chosen protection method. For failure detection and protection-switching initiation, OAM (Operation, Administration and Maintenance) functions will be added to the system model. ...
Wendelberger, J.R.
1998-12-01
In reliability modeling, the term availability is used to represent the fraction of time that a process is operating successfully. Several different definitions have been proposed for different types of availability. One commonly used measure of availability is cumulative availability, which is defined as the ratio of the amount of time that a system is up and running to the total elapsed time. During the startup phase of a process, cumulative availability may be treated as a growth process. A procedure for modeling cumulative availability as a function of time is proposed. Estimates of other measures of availability are derived from the estimated cumulative availability function. The use of empirical Bayes techniques to improve the resulting estimates is also discussed.
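The cumulative-availability measure defined in this abstract is straightforward to compute from an up/down event log. The sketch below is illustrative only (the event data and function are not from the report): it evaluates A(t) = uptime / elapsed time at the end of each interval.

```python
# Sketch (illustrative, not from the report): cumulative availability
# as the fraction of elapsed time the process has been up, evaluated
# at the end of each up/down interval.

def cumulative_availability(intervals):
    """intervals: list of (duration, is_up) pairs in chronological order.
    Returns a list of (elapsed_time, cumulative_availability) pairs."""
    elapsed = up = 0.0
    out = []
    for duration, is_up in intervals:
        elapsed += duration
        if is_up:
            up += duration
        out.append((elapsed, up / elapsed))
    return out

# During a startup phase, availability grows as early failures are fixed:
log = [(2.0, True), (1.0, False), (5.0, True), (0.5, False), (10.0, True)]
for t, a in cumulative_availability(log):
    print(f"t={t:5.1f}  A(t)={a:.3f}")
```

A growth-curve model fitted to these (t, A(t)) points would then yield the estimated cumulative availability function from which other availability measures are derived.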
MING Zhimao; TAO Junyong; ZHANG Yunan; YI Xiaoshan; CHEN Xun
2009-01-01
To improve reliability before mass production begins, new armament systems require methods for handling the multi-stage, diverse-population statistical problems of system reliability growth. Aimed at the high-expense, small-sample test processes of complex system development, specific methods are studied for processing the statistical information of Bayesian reliability growth across diverse populations. First, according to the characteristics of reliability growth during product development, the Bayesian method is used to integrate multi-stage test information and the order relations of the distribution parameters. A Gamma-Beta prior distribution is then proposed based on the non-homogeneous Poisson process (NHPP) corresponding to the reliability growth process. The posterior distribution of the reliability parameters is obtained for each product stage, and the reliability parameters are evaluated from the posterior distribution. Finally, the proposed Bayesian approach for multi-stage reliability growth testing is applied to a small-sample test process in the astronautics field. The results of a numerical example show that the presented model can synthesize the diverse information and pave the way for applying the Bayesian model to multi-stage reliability growth test evaluation with small sample sizes. The method is useful for evaluating multi-stage system reliability and for making reliability growth plans rationally.
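To give a flavor of the multi-stage Bayesian updating this abstract describes, the sketch below applies the standard conjugate Gamma-Poisson update to a stage-wise constant failure intensity. This is not the paper's Gamma-Beta/NHPP formulation, which is more elaborate; the prior and the per-stage test data are made up for illustration.

```python
# Illustrative sketch only: a conjugate Gamma-Poisson update for a
# failure intensity lambda, applied stage by stage. The paper's
# Gamma-Beta prior under an NHPP is NOT reproduced here.

def gamma_poisson_update(alpha, beta, failures, exposure):
    """Gamma(alpha, rate beta) prior on lambda; observing `failures`
    events over `exposure` test hours gives a Gamma posterior."""
    return alpha + failures, beta + exposure

alpha, beta = 2.0, 10.0                      # weakly informative prior, mean 0.2
stages = [(4, 50.0), (2, 80.0), (1, 120.0)]  # (failures, test hours) per stage
for n, t in stages:
    alpha, beta = gamma_poisson_update(alpha, beta, n, t)
    print(f"posterior mean lambda = {alpha / beta:.4f}")
```

The declining posterior mean across stages mirrors the reliability growth the test program is meant to demonstrate, with each stage's small sample reinforced by the information carried forward from earlier stages.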
Towards a reliable animal model of migraine
Olesen, Jes; Jansen-Olesen, Inger
2012-01-01
The pharmaceutical industry shows a decreasing interest in the development of drugs for migraine. One of the reasons for this could be the lack of reliable animal models for studying the effect of acute and prophylactic migraine drugs. The infusion of glyceryl trinitrate (GTN) is the best validated and most studied human migraine model. Several attempts have been made to transfer this model to animals. The different variants of this model are discussed, as well as other recent models.
Space Vehicle Reliability Modeling in DIORAMA
Tornga, Shawn Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]
2016-07-12
When modeling system performance of space based detection systems it is important to consider spacecraft reliability. As space vehicles age the components become prone to failure for a variety of reasons such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust fuel supplies. Typically failure is divided into two categories: engineering mistakes and technology surprise. This document will report on a method of simulating space vehicle reliability in the DIORAMA framework.
Delivery Time Reliability Model of Logistics Network
Liusan Wu
2013-01-01
Natural disasters like earthquakes and floods will surely destroy the existing traffic network, usually accompanied by delivery delays or even network collapse. A logistics-network-related delivery time reliability model defined by a shortest-time entropy is proposed as a means to estimate the actual delivery time reliability. The smaller the entropy, the stronger the delivery time reliability, and vice versa. The shortest delivery time is computed separately under two different assumptions. If a path is considered without capacity restriction, the shortest delivery time is positively related to the length of the shortest path; if a path is considered with capacity restriction, a minimax programming model is built to compute the shortest delivery time. Finally, an example is used to confirm the validity and practicality of the proposed approach.
Methods for fast, reliable growth of Sn whiskers
Bozack, M. J.; Snipes, S. K.; Flowers, G. N.
2016-10-01
We report several methods to reliably grow dense fields of high-aspect-ratio tin whiskers for research purposes in a period of days to weeks. The techniques offer marked improvements over previous means to grow whiskers, which have struggled against the highly variable incubation period of tin whiskers and slow growth rate. Control of the film stress is the key to fast-growing whiskers, owing to the fact that whisker incubation and growth are fundamentally a stress-relief phenomenon. The ability to grow high-density fields of whiskers (10^3-10^6/cm^2) in a reasonable period of time (days, weeks) has accelerated progress in whisker growth and aided in the development of whisker mitigation strategies.
Reliability block diagrams to model disease management.
Sonnenberg, A; Inadomi, J M; Bauerfeind, P
1999-01-01
Studies of diagnostic or therapeutic procedures in the management of any given disease tend to focus on one particular aspect of the disease and ignore the interaction between the multitude of factors that determine its final outcome. The present article introduces a mathematical model that accounts for the joint contribution of various medical and non-medical components to the overall disease outcome. A reliability block diagram is used to model patient compliance, endoscopic screening, and surgical therapy for dysplasia in Barrett's esophagus. The overall probability that a patient with Barrett's esophagus will comply with a screening program, be correctly diagnosed with dysplasia, and undergo successful therapy is 37%. The reduction in the overall success rate, despite the fact that the majority of components are assumed to function with reliability rates of 80% or more, is a reflection of the multitude of serial subsystems involved in disease management. Each serial component influences the overall success rate in a linear fashion. Building multiple parallel pathways into the screening program raises its overall success rate to 91%. Parallel arrangements render systems less sensitive to diagnostic or therapeutic failures. A reliability block diagram provides the means to model the contributions of many heterogeneous factors to disease outcome. Since no medical system functions perfectly, redundancy provided by parallel subsystems assures a greater overall reliability.
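The series/parallel arithmetic behind a reliability block diagram like the one in this article can be sketched in a few lines. The component values below are illustrative, not the article's figures; the point is how a chain of serial blocks erodes overall success and how a redundant parallel pathway recovers it.

```python
# Sketch of reliability-block-diagram arithmetic. Component
# reliabilities here are made-up values, not those in the article.

from math import prod

def series(reliabilities):
    # A chain of serial blocks succeeds only if every block succeeds.
    return prod(reliabilities)

def parallel(reliabilities):
    # Redundant parallel blocks fail only if all of them fail.
    return 1 - prod(1 - r for r in reliabilities)

# Five serial steps at 80-90% each already drag the overall rate far
# below any single step:
chain = [0.85, 0.80, 0.90, 0.80, 0.85]
print(f"serial chain: {series(chain):.2f}")
# Duplicating the whole pathway in parallel recovers much of the loss:
print(f"two parallel chains: {parallel([series(chain)] * 2):.2f}")
```

This is the same mechanism the article reports: many serial subsystems multiply down to a low overall rate, while parallel pathways make the system tolerant of individual failures.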
A Censored Nonparametric Software Reliability Model
(no author listed)
2006-01-01
This paper analyses the effect of censoring on the estimation of failure rate and presents a framework for a censored nonparametric software reliability model. The model is based on nonparametric testing for a monotonically decreasing failure rate and on weighted kernel failure-rate estimation under the monotonically-decreasing constraint. Not only does the model make few assumptions and impose weak constraints, but the number of residual defects in the software system can also be estimated. The numerical experiment and real-data analysis show that the model performs well with censored data.
AX-5 space suit reliability model
Reinhardt, AL; Magistad, John
1990-01-01
The AX-5 is an all-metal extravehicular activity (EVA) space suit currently under consideration for use on Space Station Freedom. A reliability model was developed based on the suit's unique design and on projected joint cycle requirements. Three AX-5 space suit component joints were cycled under simulated load conditions in accordance with NASA's advanced space suit evaluation plan. This paper will describe the reliability model developed, the results of the cycle testing, and an interpretation of the model and test results in terms of projected Mean Time Between Failure for the AX-5. A discussion of the maintenance implications and life cycle for the AX-5 based on this projection is also included.
Web software reliability modeling with random impulsive shocks
Jianfeng Yang; Ming Zhao; Wensheng Hu
2014-01-01
As web-server based business has rapidly developed and been popularized, how to evaluate and improve the reliability of web servers has become extremely important. Although a large number of software reliability growth models (SRGMs), including those combined with multiple change-points (CPs), are available, these conventional SRGMs cannot be directly applied to web software reliability analysis because of the complex web operational profile. To characterize the web operational profile precisely, it should be realized that the workload of a web server is normally non-homogeneous and often observed with a pattern of random impulsive shocks. A web software reliability model with random impulsive shocks and its statistical analysis method are developed. In the proposed model, the web server workload is characterized by a geometric Brownian motion process. Based on a real data set from the IIS server logs of the ICRMS website (www.icrms.cn), the proposed model is demonstrated to be powerful for estimating impulsive shocks and web software reliability.
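A geometric Brownian motion workload process of the kind this abstract uses can be simulated with the exact discretisation W(t+dt) = W(t)·exp((μ − σ²/2)dt + σ√dt·Z). The sketch below is a minimal illustration; the drift, volatility, and initial workload are made-up values, not estimates from the paper's IIS log data.

```python
# Minimal sketch: simulating a geometric Brownian motion (GBM)
# workload path. Parameters are illustrative, not fitted values.

import math
import random

def gbm_path(w0, mu, sigma, dt, steps, rng):
    """Exact GBM discretisation:
    W_{t+dt} = W_t * exp((mu - sigma^2/2)*dt + sigma*sqrt(dt)*Z)."""
    w, path = w0, [w0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        w *= math.exp((mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
        path.append(w)
    return path

rng = random.Random(42)
# Two simulated days of hourly workload samples (dt in days):
path = gbm_path(100.0, 0.05, 0.3, 1/24, 48, rng)
print(f"final workload: {path[-1]:.1f} requests/s")
```

Because GBM multiplies rather than adds increments, the simulated workload stays positive and exhibits the bursty, shock-like fluctuations the model is intended to capture.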
Stochastic models in reliability and maintenance
2002-01-01
Our daily lives are maintained by high-technology systems; computer systems are typical examples, and we enjoy our modern lives by using many of them. Much more importantly, we have to maintain such systems without failure, yet we cannot predict when they will fail or how to fix them without delay. A stochastic process is a set of outcomes of a random experiment indexed by time, and is one of the key tools for analyzing future behavior quantitatively. Reliability and maintainability technologies are of great interest and importance to the maintenance of such systems. Many mathematical models have been, and will be, proposed to describe reliability and maintainability systems using stochastic processes. The theme of this book is "Stochastic Models in Reliability and Maintainability." The book consists of 12 chapters on this theme from different viewpoints of stochastic modeling. Chapter 1 is devoted to "Renewal Processes," under which cla...
Applying reliability models to the maintenance of Space Shuttle software
Schneidewind, Norman F.
1992-01-01
Software reliability models provide the software manager with a powerful tool for predicting, controlling, and assessing the reliability of software during maintenance. We show how a reliability model can be effectively employed for reliability prediction and the development of maintenance strategies using the Space Shuttle Primary Avionics Software Subsystem as an example.
Temporal reliability of cytokines and growth factors in EDTA plasma
Marrangoni Adele M
2010-11-01
Background: Cytokines are involved in the development of chronic diseases, including cancer. It is important to evaluate the temporal reproducibility of cytokines in plasma prior to conducting epidemiologic studies utilizing these markers. Findings: We assessed the temporal reliability of CRP, 22 cytokines and their soluble receptors (IL-1α, IL-1β, IL-1RA, IL-2, sIL-2R, IL-4, IL-5, IL-6, sIL-6R, IL-7, IL-8, IL-10, IL-12p40, IL-12p70, IL-13, IL-15, IL-17, TNFα, sTNF-R1, sTNF-R2, IFNα, IFNγ) and eight growth factors (GM-CSF, EGF, bFGF, G-CSF, HGF, VEGF, EGFR, ErbB2) in repeated EDTA plasma samples collected an average of two years apart from 18 healthy women (age range: 42-62) enrolled in a prospective cohort study. We also estimated the correlation between serum and plasma biomarker levels using 18 paired clinical samples from postmenopausal women (age range: 75-86). Twenty-six assays were able to detect their analytes in at least 70% of samples. Of those 26 assays, we observed moderate to high intra-class correlation coefficients (ICCs, ranging from 0.53-0.89) for 22 assays, and low ICCs (0-0.47) for four assays. Serum and plasma levels were highly correlated (r > 0.6) for most markers, except for seven assays. Conclusions: For 22 of the 31 biomarkers, a single plasma measurement is a reliable estimate of a woman's average level over a two-year period.
Rethinking cell growth models.
Kafri, Moshe; Metzl-Raz, Eyal; Jonas, Felix; Barkai, Naama
2016-11-01
The minimal description of a growing cell consists of self-replicating ribosomes translating the cellular proteome. While neglecting all other cellular components, this model provides key insights into the control and limitations of growth rate. It shows, for example, that growth rate is maximized when ribosomes work at full capacity, explains the linear relation between growth rate and the ribosome fraction of the proteome and defines the maximal possible growth rate. This ribosome-centered model also highlights the challenge of coordinating cell growth with related processes such as cell division or nutrient production. Coordination is promoted when ribosomes don't translate at maximal capacity, as it allows escaping strict exponential growth. Recent data support the notion that multiple cellular processes limit growth. In particular, increasing transcriptional demand may be as deleterious as increasing translational demand, depending on growth conditions. Consistent with the idea of trade-off, cells may forgo maximal growth to enable more efficient interprocess coordination and faster adaptation to changing conditions. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
PV industry growth and module reliability in Thailand
Chenvidhya, Dhirayut; Seapan, Manit; Sangpongsanont, Yaowanee; Chenvidhya, Tanokkorn; Limsakul, Chamnan; Songprakorp, Roongrojana
2015-09-01
PV installations in Thailand now exceed 1.2 GWp of cumulative capacity, driven by the National Renewable Energy Program and its targets. In the latest Alternative Energy Development Plan (AEDP), the PV electricity production target has increased from 2 GWp to 3 GWp. With this rapid growth, customers and manufacturers seek standard module testing; over one thousand PV modules per annum have been tested since 2012. The normal tests include type approval testing according to the TIS standard, acceptance testing, and testing for local standard development. In type testing, most module failures were found during the damp heat test. In annual evaluation testing, power degradation and delamination of 0 to 6 percent from nameplate were found after 0 to 5 years of field deployment. For thin-film modules in operation for less than 5 years, degradation and delamination were found in the range of 0 to 13 percent (about 5 percent on average) from nameplate. However, for the PV modules at the reference site on campus, operated for 12 years, power degradation ranged from 10 to 15 percent. Therefore, long-term performance assessment needs to be considered to ensure system reliability.
Power Electronic Packaging Design, Assembly Process, Reliability and Modeling
Liu, Yong
2012-01-01
Power Electronic Packaging presents an in-depth overview of power electronic packaging design, assembly, reliability and modeling. Since there is a drastic difference between IC fabrication and power electronic packaging, the book systematically introduces typical power electronic packaging design, assembly, reliability and failure analysis, and material selection, so readers can clearly understand each task's unique characteristics. Power electronic packaging is one of the fastest-growing segments in the power electronics industry, due to the rapid growth of power integrated circuit (IC) fabrication, especially for applications like portable, consumer, home, computing and automotive electronics. This book also covers how advances in both semiconductor content and advanced power package design have driven advances in power device capability in recent years. The author extrapolates the most recent trends in the book's areas of focus to highlight where further improvement in materials and techniques can d...
Experts’ Opinions on the Reliability Gap and Some Practical Guidelines on Reliability Growth
1988-09-01
Juran " (36:21). The following seven points explain the basic elements of Taguchi’s quality philosophy: 1. An important dimension of the quality of a...guidelines (experts’ point of view) for an effective reliability program (Questions 6 to 10 ) During the interviews, I asked the experts to rank order...reliability starting point would you use? Would you use a starting reliability value of 10 % of the cumulative Mean Time Between Maintenance (Inherent)? 20
solveME: fast and reliable solution of nonlinear ME models
Yang, Laurence; Ma, Ding; Ebrahim, Ali
2016-01-01
reconstructions (M models), are multiscale, and growth maximization is a nonlinear programming (NLP) problem, mainly due to macromolecule dilution constraints. Results: Here, we address these computational challenges. We develop a fast and numerically reliable solution method for growth maximization in ME models...
Stochastic ontogenetic growth model
West, B. J.; West, D.
2012-02-01
An ontogenetic growth model (OGM) for a thermodynamically closed system is generalized to satisfy both the first and second laws of thermodynamics. The hypothesized stochastic ontogenetic growth model (SOGM) is shown to entail the interspecies allometry relation by explicitly averaging the basal metabolic rate and the total body mass over the steady-state probability density for the total body mass (TBM). This is the first derivation of the interspecies metabolic allometric relation from a dynamical model, and the asymptotic steady-state distribution of the TBM is fit to data and shown to be inverse power law.
Bendell, A
1986-01-01
Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo
Equivalent reliability polynomials modeling EAS and their geometries
Hassan Zahir Abdul Haddi
2015-07-01
In this paper we introduce two equivalent techniques for evaluating the reliability of electrical aircraft systems (EAS): (i) a graph theory technique, and (ii) a simplifying diffeomorphism technique. Geometric modeling of reliability models is based on algebraic hypersurfaces, whose intrinsic properties are able to select those models that are relevant for applications. The basic idea is to cover the reliability hypersurfaces by exponentially decaying curves. Most of the calculations in this paper were made using Maple and Matlab software.
Economic Growth Models Transition
Coralia Angelescu
2006-03-01
The transitional recession in the countries of Eastern Europe has been much longer than expected. Legacy and recent policy mistakes have both contributed to the slow progress. As structural reforms and gradual institution building have taken hold, the post-socialist economies have started to recover, with some leading countries building momentum toward faster growth. There is a possibility that, in the wider context of globalization, several of these emerging market economies will be able to catch up with the more advanced industrial economies in a matter of one or two generations. Over the past few years, most candidate countries have made progress in the transition to a competitive market economy, macroeconomic stabilization and structural reform. However, their income levels have remained far below those in the Member States. Measured by per capita income in purchasing power standards, there has been very limited catching up over the past fourteen years. Prior to this, the distinctions between the Solow-Swan model and the endogenous growth model are discussed. The interdependence between transition and integration is examined in this study. Finally, some measures of macroeconomic policy for sustainable growth are proposed in correlation with the real macroeconomic situation of the Romanian economy. Our study considers real convergence for the Romanian economy and recommends adequate policies to achieve fast real convergence and sustainable growth.
Reliability models of belt drive systems under slipping failure mode
Peng Gao
2017-01-01
Conventional reliability assessment and reliability-based optimal design of belt drives are based on the stress-strength interference model. However, the stress-strength interference model is essentially a static model, and the sensitivity of belt drive reliability to design parameters needs further investigation. In this article, time-dependent factors that contribute to the dynamic characteristics of reliability are pointed out. Moreover, dynamic reliability models and failure rate models of belt drive systems under the failure mode of slipping are developed. Furthermore, dynamic sensitivity models of belt drive reliability based on the proposed dynamic reliability models are proposed. In addition, numerical examples are given to illustrate the proposed models and to analyze the influences of design parameters on the dynamic characteristics of reliability, failure rate, and sensitivity functions. The results show that the statistical properties of design parameters have different influences on the reliability and failure rate of belt drives for different values of design parameters and different operational durations.
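The classical stress-strength interference calculation this abstract builds on has a closed form when stress and strength are both normally distributed: R = P(S > s) = Φ((μ_S − μ_s) / √(σ_S² + σ_s²)). The sketch below implements that formula; the tension and load values are illustrative, not taken from the article.

```python
# Sketch of the static stress-strength interference model: reliability
# is the probability that strength S exceeds stress s, here for normal
# S and s. Numbers are illustrative only.

import math

def normal_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    z = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
    return normal_cdf(z)

# Belt tension capacity vs. transmitted load (arbitrary units):
print(f"R = {interference_reliability(120.0, 10.0, 90.0, 8.0):.4f}")
```

The article's point is that this calculation is static: making μ and σ functions of operating time (wear, fatigue) is what turns it into the dynamic reliability and sensitivity models the paper develops.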
Combined HW/SW Reliability Models.
1982-04-01
Singularity of Some Software Reliability Models and Parameter Estimation Method
(no author listed)
2000-01-01
According to the principle that "failure data is the basis of software reliability analysis," we built a software reliability expert system (SRES) by adopting artificial intelligence technology. By reasoning out conclusions from the fitting results on the failure data of a software project, the SRES can recommend to users "the most suitable model" as a software reliability measurement model. We believe that the SRES can well overcome the inconsistency in applications of software reliability models. We report investigation results on the singularity and parameter estimation methods of the experimental models in the SRES.
A MANAGEMENT APPROACH TO RELIABILITY GROWTH FOR COMPLEX ELECTROMECHANICAL SYSTEMS
A.C. Rooney
2012-01-01
ENGLISH ABSTRACT: This paper proposes a reliability management process for the development of complex electromechanical systems. Specific emphasis is placed on the development of these systems in an environment of limited development resources and where small production quantities are envisaged.
The results of this research provide a management strategy for reliability engineering activities within a systems engineering environment, where concurrent engineering techniques are used to reduce development cycles and costs.
AFRIKAANSE OPSOMMING (translated): This article proposes a process for managing reliability during the development of complex electromechanical systems. An environment of limited development resources and small production quantities is emphasized.
The results of this research propose a management strategy for reliability management in a systems engineering environment where concurrent engineering techniques are used to limit the development cycle and costs.
Assessment of stochastically updated finite element models using reliability indicator
Hua, X. G.; Wen, Q.; Ni, Y. Q.; Chen, Z. Q.
2017-01-01
Finite element (FE) model updating techniques have been a viable approach to correcting an initial mathematical model based on test data. Validation of the updated FE models is usually conducted by comparing model predictions with independent test data that have not been used for model updating. This approach of model validation cannot be readily applied in the case of a stochastically updated FE model. Recognizing that structural reliability is a major decision factor throughout the lifecycle of a structure, this study investigates the use of structural reliability as a measure for assessing the quality of stochastically updated FE models. A recently developed perturbation method for stochastic FE model updating is first applied to attain the stochastically updated models by using the measured modal parameters with uncertainty. The reliability index and failure probability for predefined limit states are computed for the initial and the stochastically updated models, respectively, and are compared with those obtained from the 'true' model to assess the quality of the two models. Numerical simulation of a truss bridge is provided as an example. The simulated modal parameters involving different uncertainty magnitudes are used to update an initial model of the bridge. It is shown that the reliability index obtained from the updated model is much closer to the true reliability index than that obtained from the initial model in the case of small uncertainty magnitude; in the case of large uncertainty magnitude, the reliability index computed from the initial model rather than from the updated model is closer to the true value. The present study confirms the usefulness of measurement-calibrated FE models and at the same time also highlights the importance of uncertainty reduction in test data for reliable model updating and reliability evaluation.
A Note on Structural Equation Modeling Estimates of Reliability
Yang, Yanyun; Green, Samuel B.
2010-01-01
Reliability can be estimated using structural equation modeling (SEM). Two potential problems with this approach are that estimates may be unstable with small sample sizes and biased with misspecified models. A Monte Carlo study was conducted to investigate the quality of SEM estimates of reliability by themselves and relative to coefficient…
Reliability models applicable to space telescope solar array assembly system
Patil, S. A.
1986-01-01
A complex system may consist of a number of subsystems with several components in series, in parallel, or in a combination of both. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to parallel and series models; hence, the models can be specialized to parallel, series, and combination systems. The models are developed by assuming the failure rates of the components to be functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) System. The STSA consists of 20 identical solar panel assemblies (SPAs). The reliabilities of the SPAs are determined by the reliabilities of solar cell strings, interconnects, and diodes. Estimates of the reliability of the system for one to five years are calculated using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
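The "k failures out of n components" subsystem model described here can be sketched with the binomial distribution, under one common reading: the subsystem survives while fewer than k of its n identical components have failed, so k = 1 gives a series model and k = n a parallel model. The component reliability below is an illustrative constant (the paper uses time-dependent failure rates).

```python
# Sketch of a k-out-of-n failure model: the subsystem works while
# fewer than k of n identical components (each reliable with
# probability p) have failed. p is illustrative, and constant here,
# whereas the paper treats failure rates as functions of time.

from math import comb

def subsystem_reliability(n, k, p):
    """P(fewer than k failures among n components)."""
    q = 1.0 - p
    return sum(comb(n, j) * q**j * p**(n - j) for j in range(k))

p = 0.9
print(subsystem_reliability(5, 1, p))  # series: equals p**5
print(subsystem_reliability(5, 5, p))  # parallel: equals 1 - (1-p)**5
print(subsystem_reliability(5, 3, p))  # tolerates up to two failures
```

Evaluating p as a function of mission time and combining such subsystem terms across the 20 SPAs is, in spirit, how the multi-year STSA estimates are assembled.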
Modeling of humidity-related reliability in enclosures with electronics
Hygum, Morten Arnfeldt; Popok, Vladimir
2015-01-01
Reliability of electronics that operate outdoors is strongly affected by environmental factors such as temperature and humidity. Fluctuations of these parameters can lead to water condensation inside enclosures. Therefore, modelling of humidity distribution in a container with air and freely exposed ... to predict the humidity-related reliability of a printed circuit board (PCB) located in a cabinet by combining structural reliability methods and non-linear diffusion models. This framework can thus be used for reliability prediction from a climatic point of view. The proposed numerical approach is then tested ...
Models of travel time and reliability for freight transport
Terziev, M.N.; Roberts, P.O.
1976-12-01
The model produces a probability distribution of the trip time associated with the shipment of freight between a given origin and destination by a given mode and route. Using distributions of the type produced by the model, it is possible to determine two important measures of the quality of service offered by the carrier. These measures are the mean travel time and the reliability of delivery, where the reliability measure describes the spread of the travel-time distribution. The model described herein was developed originally as part of the railroad rationalization study conducted at MIT and sponsored by the Federal Railroad Administration. This work built upon earlier research on railroad reliability models. Because of the predominantly rail background of this model, the initial discussion focuses on the problem of modeling rail trip-time reliability. It is then shown that the model can also be used to study truck and barge operations.
Developing Fast and Reliable Flood Models
Thrysøe, Cecilie; Toke, Jens; Borup, Morten
2016-01-01
State-of-the-art flood modelling in urban areas is based on distributed, physically based models. However, their use is impeded by high computational demands and numerical instabilities, which make calculations both difficult and time consuming. To address these challenges we develop and test ... is modelled by response surface surrogates, which are empirical data-driven models. These are trained on the volume-discharge relations using piecewise linear functions. (ii) The surface flooding is modelled by lower-fidelity physically based surrogates, based on surface depressions and flow paths. ... A surrogate model is set up for a case study area in Aarhus, Denmark, to replace a MIKE FLOOD model. The drainage surrogates are able to reproduce the MIKE URBAN results for a set of rain inputs. The coupled drainage-surface surrogate model lacks details in the surface description, which reduces its overall ...
Modelling and Simulation of Scraper Reliability for Maintenance
HUANG Liang-pei; LU Zhong-hai; GONG Zheng-li
2011-01-01
A scraper conveyor is a kind of heavy machinery that continuously transports goods and is widely used in mines, ports, and storage enterprises. Since the scraper failure rate directly affects production costs and production capacity, the evaluation and prediction of scraper conveyor reliability are important for these enterprises. In this paper, the reliabilities of different parts are classified and discussed according to their structural characteristics and failure factors. Based on the component's time-to-failure density function, a reliability model of the scraper chain is constructed to track the age distribution of the part population and the reliability change of the scraper chain. Based on the stress-strength interference model, and considering the decrease of strength due to fatigue, a dynamic reliability model of components such as gears and axles is developed to observe the change of part reliability over the service time of the scraper. Finally, a system reliability model of the scraper is established to simulate and calculate scraper reliability for maintenance.
Transformer real-time reliability model based on operating conditions
HE Jian; CHENG Lin; SUN Yuan-zhang
2007-01-01
Operational reliability evaluation theory reflects the real-time reliability level of a power system. The component failure rate varies with operating conditions. The impact of real-time operating conditions, such as ambient temperature and transformer MVA (megavolt-ampere) loading, on transformer insulation life is studied in this paper. A formula for the transformer failure rate based on the winding hottest-spot temperature (HST) is given, yielding a real-time reliability model of the transformer based on operating conditions. The work is illustrated using the 1979 IEEE Reliability Test System. Changes in operating conditions are simulated using an hourly load curve and a temperature curve, and curves of real-time reliability indices are obtained by operational reliability evaluation.
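A common Arrhenius-style formulation relates the winding hot-spot temperature to relative insulation aging; the sketch below uses the IEEE C57.91-style constants with a 110 °C reference and is an assumption for illustration, since the paper's exact failure-rate formula is not reproduced here:

```python
import math

def aging_acceleration_factor(hst_celsius: float) -> float:
    """Relative insulation aging rate versus the 110 C reference winding
    hot-spot temperature (Arrhenius form; constants follow the common
    IEEE C57.91-style values -- an assumption, not the paper's formula)."""
    return math.exp(15000.0 / 383.0 - 15000.0 / (hst_celsius + 273.0))

def equivalent_aging_hours(hourly_hst) -> float:
    """Equivalent hours of aging at the reference temperature, accumulated
    over an hourly hot-spot temperature profile."""
    return sum(aging_acceleration_factor(t) for t in hourly_hst)
```

Driving such a factor with the hourly load and ambient-temperature curves gives a time-varying failure-rate input of the kind the paper feeds into operational reliability evaluation.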
Models for Battery Reliability and Lifetime
Smith, K.; Wood, E.; Santhanagopalan, S.; Kim, G. H.; Neubauer, J.; Pesaran, A.
2014-03-01
Models describing battery degradation physics are needed to more accurately understand how battery usage and next-generation battery designs can be optimized for performance and lifetime. Such lifetime models may also reduce the cost of battery aging experiments and shorten the time required to validate battery lifetime. Models for chemical degradation and mechanical stress are reviewed. Experimental analysis of aging data from a commercial iron-phosphate lithium-ion (Li-ion) cell elucidates the relative importance of several mechanical stress-induced degradation mechanisms.
An Interactive Whiteboard Model Survey: Reliable Development
Bih-Yaw Shih
2012-04-01
Applications and practices of interactive whiteboards (IWBs) in school learning have been an important focus and development trend in developed countries in recent years. There is little research and discussion of IWB teaching materials for course teaching and teaching effectiveness. In academic studies, there is considerable practical teaching sharing for subjects such as language learning, mathematical learning, and physical science learning; however, empirical research on educational acceptance of interactive whiteboards is rarely seen. Given this importance, we summarize previous literature to establish a theoretical model for IWBs. Variables in this model are then discussed to find the interactions between them. The contribution of the study is an innovative model for educational acceptance of interactive whiteboards using hybrid TAM, ECM, and Flow models.
Construction of a reliable model pyranometer for irradiance ...
USER
2010-03-22
Design, construction, and testing of a reliable model pyranometer (RMP001) was done in Mubi, Adamawa ... Pyranometers are widely used in meteorology and climate ...
Reliability Modeling and Analysis of SCI Topological Network
Hongzhe Xu
2012-03-01
The problem of reliability modeling of Scalable Coherent Interface (SCI) rings and topological networks is studied. Reliability models of three SCI rings are developed, and the factors that influence the reliability of SCI rings are studied. By calculating the shortest-path matrix and the path-quantity matrix of different types of SCI network topology, the communication characteristics of the SCI network are obtained. The survivability of the SCI topological network is studied for node-damage and edge-damage situations.
MODELING HUMAN RELIABILITY ANALYSIS USING MIDAS
Ronald L. Boring; Donald D. Dudenhoeffer; Bruce P. Hallbert; Brian F. Gore
2006-05-01
This paper summarizes an emerging collaboration between Idaho National Laboratory and NASA Ames Research Center regarding the utilization of high-fidelity MIDAS simulations for modeling control room crew performance at nuclear power plants. The key envisioned uses for MIDAS-based control room simulations are: (i) the estimation of human error with novel control room equipment and configurations, (ii) the investigative determination of risk significance in recreating past event scenarios involving control room operating crews, and (iii) the certification of novel staffing levels in control rooms. It is proposed that MIDAS serves as a key component for the effective modeling of risk in next generation control rooms.
Quantitative metal magnetic memory reliability modeling for welded joints
Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng
2016-03-01
Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and difficult to evaluate quantitatively. To promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along longitudinal and horizontal lines with a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution; K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented based on improved stress-strength interference theory. The reliability degree R gradually decreases as the residual life ratio T decreases, and the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
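The stress-strength interference idea underlying the model can be sketched for the simplest Gaussian case; this is an illustration only, since the paper uses an improved interference theory with its own MMM-derived parameters:

```python
import math

def stress_strength_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """R = P(strength > stress) for independent Gaussian strength and stress:
    R = Phi((mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2))."""
    z = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
```

Because K_vs is found to be Gaussian, it can play the role of the stress-side variable in an interference model of this shape.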
A Reliability Based Model for Wind Turbine Selection
A.K. Rajeevan
2013-06-01
A wind turbine generator's output at a specific site depends on many factors, particularly the cut-in, rated, and cut-out wind speed parameters; hence power output varies from turbine to turbine. The objective of this paper is to develop a mathematical relationship between reliability and wind power generation. The analytical computation of monthly wind power is obtained from a Weibull statistical model using the cubic mean cube root of wind speed. The reliability calculation is based on failure probability analysis. Many different types of wind turbines are commercially available. From a reliability point of view, to obtain optimum reliability in power generation, it is desirable to select the wind turbine generator best suited to a site. The mathematical relationship developed in this paper can be used for site-matching turbine selection from a reliability point of view.
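The Weibull cubic-mean-cube-root quantity named above can be sketched using the moment identity E[v^n] = c^n Γ(1 + n/k); the air density value is a standard placeholder, and the per-unit-area power ignores turbine-specific cut-in/rated/cut-out effects:

```python
import math

def cubic_mean_cube_root(c: float, k: float) -> float:
    """(E[v^3])**(1/3) for Weibull-distributed wind speed with scale c and
    shape k, via the moment identity E[v^n] = c**n * Gamma(1 + n/k)."""
    return c * math.gamma(1.0 + 3.0 / k) ** (1.0 / 3.0)

def mean_wind_power_density(c: float, k: float, rho: float = 1.225) -> float:
    """Mean available wind power per unit swept area, 0.5 * rho * E[v^3] (W/m^2)."""
    return 0.5 * rho * cubic_mean_cube_root(c, k) ** 3
```

Because v^3 is convex, the cubic mean cube root always exceeds the plain mean speed c·Γ(1 + 1/k), which is why it, rather than the mean speed, is the right input for energy estimates.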
Reliability Analysis and Modeling of ZigBee Networks
Lin, Cheng-Min
The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low-rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adopted the ZigBee open standard to develop various services to promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important, because these services will stop if the system and network reliability is unstable. The ZigBee standard has three kinds of networks: star, tree, and mesh. The paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes each layer's reliability and mean time to failure (MTTF). Channel resource usage, device role, network topology, and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. In star or tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve the reliability problem. For mesh networks, whose complexity is higher than that of the others, a division technique is applied: a mesh network is divided into several non-reducible series systems and edge-parallel systems, so its reliability is easily solved using series-parallel systems through the proposed scheme. The numerical results demonstrate that reliability increases for mesh networks when the number of edges in the parallel systems increases, while reliability drops quickly when the number of edges and the number of nodes increase for all three networks. Greater resource usage is another factor that decreases reliability. However, lower network reliability will occur due to ...
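Series and parallel RBD reductions of the kind used for the star and tree networks are straightforward to compute; the decomposition below is a hypothetical example, not the paper's actual mesh division scheme:

```python
from math import prod

def series(rels):
    """Series structure: every block must work."""
    return prod(rels)

def parallel(rels):
    """Parallel structure: at least one block must work."""
    return 1.0 - prod(1.0 - r for r in rels)

# Hypothetical reduction: a coordinator node in series with a group of
# three redundant links (illustrative reliabilities).
r_sys = series([0.999, parallel([0.95, 0.95, 0.95])])
```

Nesting these two reductions is exactly what a series-parallel decomposition of a divided mesh amounts to: each non-reducible series block collapses to one number, and the edge-parallel groups collapse around them.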
System Reliability Analysis Capability and Surrogate Model Application in RAVEN
Rabiti, Cristian; Alfonsi, Andrea; Huang, Dongli; Gleicher, Frederick; Wang, Bei; Adbel-Khalik, Hany S.; Pascucci, Valerio; Smith, Curtis L.
2015-11-01
This report collects the work performed to improve the reliability analysis capabilities of the RAVEN code and explores new opportunities in the use of surrogate models, by extending the current RAVEN capabilities to multi-physics surrogate models and the construction of surrogate models for high-dimensionality fields.
Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.
Chatzis, Sotirios P; Andreou, Andreas S
2015-11-01
Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.
Effective stimuli for constructing reliable neuron models.
Shaul Druckmann
2011-08-01
The rich dynamical nature of neurons poses major conceptual and technical challenges for unraveling their nonlinear membrane properties. Traditionally, various current waveforms have been injected at the soma to probe neuron dynamics, but the rationale for selecting specific stimuli has never been rigorously justified. The present experimental and theoretical study proposes a novel framework, inspired by learning theory, for objectively selecting the stimuli that best unravel the neuron's dynamics. The efficacy of stimuli is assessed in terms of their ability to constrain the parameter space of biophysically detailed conductance-based models that faithfully replicate the neuron's dynamics as attested by their ability to generalize well to the neuron's response to novel experimental stimuli. We used this framework to evaluate a variety of stimuli in different types of cortical neurons, ages and animals. Despite their simplicity, a set of stimuli consisting of step and ramp current pulses outperforms synaptic-like noisy stimuli in revealing the dynamics of these neurons. The general framework that we propose paves a new way for defining, evaluating and standardizing effective electrical probing of neurons and will thus lay the foundation for a much deeper understanding of the electrical nature of these highly sophisticated and non-linear devices and of the neuronal networks that they compose.
Quasi-Bayesian software reliability model with small samples
ZHANG Jin; TU Jun-xiang; CHEN Zhuo-ning; YAN Xiao-guang
2009-01-01
In traditional Bayesian software reliability models, it was assumed that all probabilities are precise. In practical applications, the parameters of the probability distributions are often uncertain due to strong dependence on experts' subjective judgments and sparse statistical data. In this paper, a quasi-Bayesian software reliability model is presented that uses interval-valued probabilities to clearly quantify experts' prior beliefs about possible intervals of the parameters of the probability distributions. The model integrates experts' judgments with statistical data to obtain more convincing assessments of software reliability with small samples. For some actual data sets, the presented model yields better predictions than the Jelinski-Moranda (JM) model using maximum likelihood (ML).
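For reference, the classical Jelinski-Moranda baseline cited above can be fitted by profile maximum likelihood; this is a simple grid-search sketch with an illustrative search bound, not the paper's quasi-Bayesian method:

```python
import math

def jm_profile_mle(t, n_max=200):
    """Profile maximum-likelihood fit of the Jelinski-Moranda model to
    interfailure times t: the hazard before the i-th failure (1-based) is
    phi * (N - i + 1).  For fixed N the MLE of phi is closed-form
    (phi = n / sum((N - i + 1) * t_i)), so we grid-search over N.
    n_max is an illustrative upper bound on the search."""
    n = len(t)
    best = None
    for N in range(n, n_max + 1):
        weights = [N - i for i in range(n)]  # equals N - i + 1 for 1-based i
        phi = n / sum(w * ti for w, ti in zip(weights, t))
        ll = sum(math.log(phi * w) - phi * w * ti
                 for w, ti in zip(weights, t))
        if best is None or ll > best[0]:
            best = (ll, N, phi)
    return best[1], best[2]  # estimated total fault count N, per-fault rate phi

# Illustrative data showing reliability growth (lengthening interfailure times).
N_hat, phi_hat = jm_profile_mle([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
```

When the data show no growth (roughly constant interfailure times), the profile likelihood pushes N toward the search bound, which is one reason interval-valued and Bayesian treatments of sparse failure data are attractive.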
Modeling and Analysis of Component Faults and Reliability
Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter;
2016-01-01
This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.
Modeling Exponential Population Growth
McCormick, Bonnie
2009-01-01
The concept of population growth patterns is a key component of understanding evolution by natural selection and population dynamics in ecosystems. The National Science Education Standards (NSES) include standards related to population growth in sections on biological evolution, interdependence of organisms, and science in personal and social…
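The exponential model at the center of this lesson is N(t) = N0·e^(rt); a minimal sketch with illustrative parameters:

```python
import math

def exponential_population(n0: float, r: float, t: float) -> float:
    """N(t) = N0 * exp(r * t): unconstrained growth at per-capita rate r."""
    return n0 * math.exp(r * t)

# At r = 0.5 per unit time, the population doubles every ln(2)/r time units.
doubling_time = math.log(2.0) / 0.5
```

The fixed doubling time is the signature of exponential growth that distinguishes it from logistic, resource-limited growth in classroom comparisons.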
An interval-valued reliability model with bounded failure rates
Kozine, Igor; Krymsky, Victor
2012-01-01
The approach to deriving interval-valued reliability measures described in this paper is distinctive from other imprecise reliability models in that it overcomes the issue of having to impose an upper bound on time to failure. It rests on the presupposition that a constant interval-valued failure ... function if only partial failure information is available. An example is provided.
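The basic effect of bounding a constant failure rate is that the reliability measure itself becomes interval-valued; a minimal sketch (the paper's model is richer, handling partial failure information without an upper bound on time to failure):

```python
import math

def reliability_interval(lam_low: float, lam_high: float, t: float):
    """If a constant failure rate is only known to lie in
    [lam_low, lam_high], the survival probability at time t is bounded
    by the interval [exp(-lam_high * t), exp(-lam_low * t)]."""
    return (math.exp(-lam_high * t), math.exp(-lam_low * t))

lo, hi = reliability_interval(0.01, 0.02, 10.0)
```

The exponential is monotone decreasing in the rate, so the upper rate bound gives the pessimistic reliability and the lower rate bound the optimistic one.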
Kwon, Oh Sang; Oh, Jun Kyu; Kim, Mi Hyang; Park, So Hyun; Pyo, Hyun Keol; Kim, Kyu Han; Cho, Kwang Hyun; Eun, Hee Chul
2006-02-01
Of the numerous assays used to assess hair growth, the hair follicle organ culture model is one of the most popular and powerful in vitro systems. Changes in hair growth are commonly employed as a measurement of follicular activity. The hair cycle stage of mouse vibrissa follicles in vivo is known to determine subsequent hair growth and follicle behavior in vitro, and it is recommended that follicles be taken at precisely the same cyclic stage. This study was performed to evaluate whether categorization of human hair follicles by their growth in vivo could be used to select follicles of a defined anagen stage for more consistent culture. Occipital scalp samples were obtained from three subjects, 2 weeks after hair bleaching. Hair growth and follicle length of isolated anagen VI follicles were measured under a videomicroscope. Follicles were categorized into four groups according to hair growth, and some were cultured ex vivo for 6 days. Follicles showed considerable variation with respect to hair growth and follicle length; however, these two variables were relatively well correlated. Hair growth in culture was closely related to the hair growth rate in vivo. Moreover, minoxidil demonstrated a significant increase of hair growth in categorized hair follicles assumed to be at a similar early anagen VI stage of the hair cycle. Selection of follicles at a defined stage based on hair-growth rate would permit a more reliable outcome in human hair follicle organ culture.
Singularity of Software Reliability Models LVLM and LVQM
Anonymous
2000-01-01
According to the principle that "the failure data is the basis of software reliability analysis", we built a software reliability expert system (SRES) by adopting artificial intelligence technology. By reasoning from the fitting results of the failure data of a software project, the SRES can recommend to users "the most suitable model" as a software reliability measurement model. We believe that the SRES can overcome the inconsistency in applications of software reliability models. We report investigation results on the singularity and parameter estimation methods of the models LVLM and LVQM.
Learning reliable manipulation strategies without initial physical models
Christiansen, Alan D.; Mason, Matthew T.; Mitchell, Tom M.
1990-01-01
A description is given of a robot, possessing limited sensory and effectory capabilities but no initial model of the effects of its actions on the world, that acquires such a model through exploration, practice, and observation. By acquiring an increasingly correct model of its actions, it generates increasingly successful plans to achieve its goals. In an apparently nondeterministic world, achieving reliability requires the identification of reliable actions and a preference for using such actions. Furthermore, by selecting its training actions carefully, the robot can significantly improve its learning rate.
Coverage Modeling and Reliability Analysis Using Multi-state Function
Anonymous
2007-01-01
Fault tree analysis is an effective method for predicting the reliability of a system. It gives a pictorial representation and logical framework for analyzing reliability, and it has long been used for the quantitative and qualitative analysis of the failure modes of critical systems. In this paper, we propose a new general coverage model (GCM) based on hardware-independent faults. Using this model, an effective software tool can be constructed to detect, locate, and recover faults in a faulty system. The model can be applied to identify the key components whose failure can cause the failure of the system, using failure mode and effects analysis (FMEA).
Modeling HVDC links in composite reliability evaluation: issues and solutions
Reis, Lineu B. de [Sao Paulo Univ., SP (Brazil). Escola Politecnica; Ramos, Dorel S. [Centrais Eletricas de Sao Paulo, SP (Brazil); Morozowski Filho, Marciano [Santa Catarina Univ., Florianopolis, SC (Brazil)
1992-12-31
This paper deals with theoretical and practical aspects of HVDC link modeling for composite (generation and transmission) system reliability evaluation. The conceptual framework used in the analysis, as well as the practical aspects, are illustrated through an application example. Initially, two distinct HVDC link operation models are described: synchronous and asynchronous. An analysis of the most significant internal failure modes and their effects on HVDC link transmission capability is presented, and a reliability model is proposed. Finally, historical performance data of the Itaipu HVDC system are shown. 6 refs., 5 figs., 8 tabs.
Modeling microbial growth and dynamics.
Esser, Daniel S; Leveau, Johan H J; Meyer, Katrin M
2015-11-01
Modeling has become an important tool for widening our understanding of microbial growth in the context of applied microbiology and related to such processes as safe food production, wastewater treatment, bioremediation, or microbe-mediated mining. Various modeling techniques, such as primary, secondary and tertiary mathematical models, phenomenological models, mechanistic or kinetic models, reactive transport models, Bayesian network models, artificial neural networks, as well as agent-, individual-, and particle-based models have been applied to model microbial growth and activity in many applied fields. In this mini-review, we summarize the basic concepts of these models using examples and applications from food safety and wastewater treatment systems. We further review recent developments in other applied fields focusing on models that explicitly include spatial relationships. Using these examples, we point out the conceptual similarities across fields of application and encourage the combined use of different modeling techniques in hybrid models as well as their cross-disciplinary exchange. For instance, pattern-oriented modeling has its origin in ecology but may be employed to parameterize microbial growth models when experimental data are scarce. Models could also be used as virtual laboratories to optimize experimental design analogous to the virtual ecologist approach. Future microbial growth models will likely become more complex to benefit from the rich toolbox that is now available to microbial growth modelers.
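As an example of a primary growth model of the kind surveyed here, the logistic equation can be integrated numerically; the parameters below are illustrative, and forward Euler stands in for whatever solver a real study would use:

```python
def logistic_growth(n0: float, r: float, K: float, t_end: float, dt: float = 0.01):
    """Forward-Euler integration of the logistic model dN/dt = r*N*(1 - N/K),
    a classic primary growth model (illustrative parameters)."""
    n, t, out = n0, 0.0, [(0.0, n0)]
    while t < t_end:
        n += dt * r * n * (1.0 - n / K)  # growth slows as N approaches K
        t += dt
        out.append((t, n))
    return out

# E.g., a culture growing from 1e3 cells toward a carrying capacity of 1e9.
trajectory = logistic_growth(n0=1e3, r=0.8, K=1e9, t_end=40.0)
```

Secondary models would then make r and K functions of environmental conditions such as temperature or pH, which is how the primary/secondary hierarchy in predictive food microbiology is built.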
Design of a Human Reliability Assessment model for structural engineering
De Haan, J.; Terwel, K.C.; Al-Jibouri, S.H.S.
2013-01-01
It is generally accepted that humans are the "weakest link" in structural design and construction processes. Despite this, few models are available to quantify human error within engineering processes. This paper demonstrates the use of a quantitative Human Reliability Assessment model within structural engineering.
A random effects generalized linear model for reliability compositive evaluation
ZHAO Hui; YU Dan
2009-01-01
This paper first proposes a random effects generalized linear model to evaluate the storage life of one kind of highly reliable product with small sample sizes by combining multi-source information on products coming from the same population but stored in different environments. The relevant algorithms are also provided. Simulation results manifest the soundness and effectiveness of the proposed model.
Weinberger, Christopher Robert
2013-08-01
Tin, lead, and lead-tin solders are the most commonly used solders due to their low melting temperatures. However, due to toxicity problems, lead must now be removed from solder materials. This has led to the re-emergence of the issue of tin whisker growth. Tin whiskers are a microelectronic packaging issue because they can lead to shorts if they grow to sufficient length. However, the cause of tin whisker growth is still not well understood, and there is a lack of robust methods to determine when and if whiskering will be a problem. This report summarizes some of the leading theories on whisker growth and attempts to provide some ideas toward establishing the role microstructure plays in whisker growth.
Hai An
2016-08-01
Aiming to resolve the problem that a variety of uncertainty variables coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article, together with a convergent solving method. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index is defined based on the random–fuzzy–interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and the index is solved using a modified limit-step-length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. Finally, a numerical example demonstrates that the hybrid reliability index is applicable to the wear reliability assessment of mechanisms in which truncated random variables, fuzzy random variables, and interval variables coexist; the demonstration also shows the good convergence of the iterative algorithm proposed in this article.
Schneidewind, Norman F.
1997-01-01
The purpose of this handbook is threefold. Specifically, it: serves as a reference guide for implementing standard software reliability practices at Marine Corps Tactical Systems Support Activity and aids in applying the software reliability model; serves as a tool for managing the software reliability program; and serves as a training aid. U.S. Marine Corps Tactical Systems Support Activity, Camp Pendleton, CA.
Jia Mingming; Li Xudong; Lü Hang
2014-01-01
Corrosion damage accelerates the initiation and growth of fatigue cracks in aircraft aluminum alloy structures under fatigue loading, threatening structural safety. Addressing the inherently stochastic nature of fatigue crack growth under corrosion, data from pre-corrosion fatigue tests on pre-corroded LD10CS alloy were analyzed, and a reliability-based method for characterizing the corrosion fatigue crack growth rate (FCG) is proposed. Comparison of the predicted FCG with experimental results indicates that the proposed method can give the lower and upper limits of the FCG of LD10CS with corrosion damage, and hence the lower and upper limits of the fatigue crack growth life of aluminum alloy components of this material, providing a basis for evaluating the life of such components.
Statistical models and methods for reliability and survival analysis
Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo
2013-01-01
Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory discussing their applications and providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, and stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical ...
Modelling application for cognitive reliability and error analysis method
Fabio De Felice
2013-10-01
The automation of production systems has delegated the execution of highly repetitive and standardized tasks to machines. In the last decade, however, the failure of the fully automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators are heightened, because the role requires problem-solving and decision-making ability. The human operator is thus the core of a cognitive process that leads to decisions and influences the safety of the whole system as a function of the operator's reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.
Modeling and Simulation Reliable Spacecraft On-Board Computing
Park, Nohpill
1999-01-01
The proposed project will investigate modeling- and simulation-driven testing and fault tolerance schemes for spacecraft on-board computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space segment capacity, and distance-insensitive cost. To realize the capabilities mentioned above, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast, and cost-effective on-board computing system, which is known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay), and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before a fault tolerance scheme is employed. Testing and fault tolerance strategies should be driven by accurate performance models (throughput, delay, reliability, and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module, and a module for fault tolerance, all of which interact through a central graphical user interface.
Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm
Raj Kumar
2012-12-01
In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures: classical as well as Bayesian. The quasi-Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov chain Monte Carlo methods). R functions are developed to study the statistical properties, model validation and comparison tools of the model, and the output analysis of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
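A minimal sketch of the classical (maximum likelihood) side of such an analysis, on synthetic data rather than the paper's data set, using SciPy's built-in `gumbel_r` fitter instead of the quasi-Newton-Raphson implementation described above:

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical failure data: synthetic draws, not the paper's real data set.
rng = np.random.default_rng(42)
data = gumbel_r.rvs(loc=100.0, scale=20.0, size=500, random_state=rng)

# Maximum likelihood estimates of the Gumbel location and scale parameters.
loc_hat, scale_hat = gumbel_r.fit(data)

# Log-likelihood at the MLE, usable for model comparison (e.g., via AIC).
loglik = np.sum(gumbel_r.logpdf(data, loc=loc_hat, scale=scale_hat))
aic = 2 * 2 - 2 * loglik
print(loc_hat, scale_hat, aic)
```

The same likelihood would serve as the kernel of the Bayesian analysis once a prior is placed on the two parameters.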
Modeling and Forecasting (Un)Reliable Realized Covariances for More Reliable Financial Decisions
Bollerslev, Tim; Patton, Andrew J.; Quaedvlieg, Rogier
We propose a new framework for modeling and forecasting common financial risks based on (un)reliable realized covariance measures constructed from high-frequency intraday data. Our new approach explicitly incorporates the effect of measurement errors and time-varying attenuation biases...... turnover and statistically superior positions compared to existing procedures. Translating these statistical improvements into economic gains, we find that under empirically realistic assumptions a risk-averse investor would be willing to pay up to 170 basis points per year to shift to using the new class...
Fuse Modeling for Reliability Study of Power Electronics Circuits
Bahman, Amir Sajjad; Iannuzzo, Francesco; Blaabjerg, Frede
2017-01-01
This paper describes a comprehensive modeling approach on reliability of fuses used in power electronic circuits. When fuses are subjected to current pulses, cyclic temperature stress is introduced to the fuse element and will wear out the component. Furthermore, the fuse may be used in a large...
Semigroup Method for a Mathematical Model in Reliability Analysis
Geni Gupur; LI Xue-zhi
2001-01-01
The system consisting of a reliable machine, an unreliable machine and a storage buffer with infinitely many workpieces has been studied. The existence of a unique positive time-dependent solution of the model corresponding to the system has been obtained by using the C0-semigroup theory of linear operators in functional analysis.
Towards Sustainable Growth Business Models
Kamp-Roelands, N.; Balkenende, J.P.; Van Ommen, P.
2012-03-15
The Dutch Sustainable Growth Coalition (DSGC) has the following objectives: The DSGC aims to pro-actively drive sustainable growth business models along three lines: (1) Shape. DSGC member companies aim to connect economic profitability with environmental and social progress on the basis of integrated sustainable growth business models; (2) Share. DSGC member companies aim for joint advocacy of sustainable growth business models both internationally and nationally; and (3) Stimulate. DSGC member companies aim to stimulate and influence the policy debate on enabling sustainable growth - with a view to finding solutions to the environmental and social challenges we are facing. This is their first report. The vision, actions and mission of DSGC are documented in the Manifesto in Chapter 2 of this publication. Chapter 3 contains an overview of key features of an integrated sustainable growth business model and the roadmap towards such a model. In Chapter 4, project examples of DSGC members are presented, providing insight into the hands-on reality of implementing the good practices. Chapter 5 offers an overview of how the Netherlands provides an enabling environment for sustainable growth business models. Chapter 6 offers the key conclusions.
Effective turbulence models and fatigue reliability in wind farms
Sørensen, John Dalsgaard; Frandsen, Sten Tronæs; Tarp-Johansen, N.J.
2008-01-01
intensity in wakes behind wind turbines can imply a significant reduction in the fatigue lifetime of wind turbines placed in wakes. In this paper the design code model in the wind turbine code [IEC 61400-1, Wind turbine generator systems - Part 1: Safety requirements, 2005] is evaluated from...... a probabilistic point of view, including the importance of modeling the SN-curve by a bi-linear model. Fatigue models relevant for welded, cast steel and fiber-reinforced details are considered. Further, the influence on the fatigue reliability is investigated from modeling the fatigue response by a stochastic...
Reliability modeling and analysis of smart power systems
Karki, Rajesh; Verma, Ajit Kumar
2014-01-01
The volume presents the research work in understanding, modeling and quantifying the risks associated with different ways of implementing smart grid technology in power systems in order to plan and operate a modern power system with an acceptable level of reliability. Power systems throughout the world are undergoing significant changes creating new challenges to system planning and operation in order to provide reliable and efficient use of electrical energy. The appropriate use of smart grid technology is an important drive in mitigating these problems and requires considerable research acti
Probabilistic Modeling of Fatigue Damage Accumulation for Reliability Prediction
Vijay Rathod
2011-01-01
A methodology for probabilistic modeling of fatigue damage accumulation for single stress level and multistress level loading is proposed in this paper. The methodology uses the linear damage accumulation model of Palmgren-Miner, a probabilistic S-N curve, and an approach for a one-to-one transformation of probability density functions to achieve the objective. The damage accumulation is modeled as a nonstationary process, as both the expected damage accumulation and its variability change with time. The proposed methodology is then used for reliability prediction under single stress level and multistress level loading, utilizing a dynamic statistical model of cumulative fatigue damage. The reliability prediction under both types of loading is demonstrated with examples.
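The Palmgren-Miner linear damage rule at the core of the methodology can be sketched deterministically; the Basquin S-N constants and load blocks below are hypothetical:

```python
# Palmgren-Miner linear damage accumulation for multistress-level loading.
# Basquin-type S-N curve: N(S) = C * S**(-m); C and m are hypothetical.
C, m = 1e12, 3.0

def cycles_to_failure(stress):
    return C * stress ** (-m)

# (stress amplitude, applied cycles) for each loading block
blocks = [(200.0, 5e4), (300.0, 1e4), (250.0, 2e4)]

# Miner's rule: sum the per-block damage fractions n_i / N_i
damage = sum(n / cycles_to_failure(S) for S, n in blocks)
print(damage)  # failure is predicted when the sum reaches 1.0
```

The probabilistic version replaces C (or the whole S-N curve) with a random quantity, which makes the damage sum a nonstationary stochastic process as described above.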
Strength Reliability Analysis of Turbine Blade Using Surrogate Models
Wei Duan
2014-05-01
There are many stochastic parameters that affect the reliability of steam turbine blade performance in practical operation. In order to improve the reliability of blade design, it is necessary to take these stochastic parameters into account. In this study, a variable cross-section twisted blade is investigated, and geometrical parameters, material parameters and load parameters are considered as random variables. A reliability analysis method combining the Finite Element Method (FEM), a surrogate model and Monte Carlo Simulation (MCS) is applied to the blade reliability analysis. Based on the blade finite element parametrical model and the experimental design, two kinds of surrogate models, Polynomial Response Surface (PRS) and Artificial Neural Network (ANN), are applied to construct approximate analytical expressions between the blade responses (including maximum stress and deflection) and the random input variables, which act as a surrogate for the finite element solver to drastically reduce the number of simulations required. The surrogate is then used for most of the samples needed in the Monte Carlo method, and the statistical parameters and cumulative distribution functions of the maximum stress and deflection are obtained by Monte Carlo simulation. Finally, a probabilistic sensitivity analysis, which combines the magnitude of the gradient and the width of the scatter range of the random input variables, is applied to evaluate how much the maximum stress and deflection of the blade are influenced by the random nature of the input parameters.
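A toy version of the surrogate-plus-MCS workflow, with a 1-D quadratic standing in for the blade FEM; the solver, stress limit, and input distribution are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_solver(x):
    # stand-in for a finite element evaluation of maximum blade stress
    return 50.0 + 4.0 * x + 0.8 * x**2

# Design of experiments: a small number of "expensive" solver runs
x_doe = np.linspace(-3.0, 3.0, 9)
y_doe = expensive_solver(x_doe)

# Polynomial response surface (PRS) fitted by least squares
coeffs = np.polyfit(x_doe, y_doe, deg=2)
surrogate = np.poly1d(coeffs)

# Monte Carlo through the cheap surrogate instead of the solver
x_mc = rng.normal(0.0, 1.0, 100_000)
y_mc = surrogate(x_mc)
prob_exceed = np.mean(y_mc > 60.0)  # estimated P(stress > limit)
print(prob_exceed)
```

The 100,000 surrogate evaluations here would be prohibitively expensive if each required a finite element solve, which is the motivation for the two-stage approach.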
Reliability-based design optimization with progressive surrogate models
Kanakasabai, Pugazhendhi; Dhingra, Anoop K.
2014-12-01
Reliability-based design optimization (RBDO) has traditionally been solved as a nested (bilevel) optimization problem, which is a computationally expensive approach. Unilevel and decoupled approaches for solving the RBDO problem have also been suggested in the past to improve the computational efficiency. However, these approaches also require a large number of response evaluations during optimization. To alleviate the computational burden, surrogate models have been used for reliability evaluation. These approaches involve construction of surrogate models for the reliability computation at each point visited by the optimizer in the design variable space. In this article, a novel approach to solving the RBDO problem is proposed based on a progressive sensitivity surrogate model. The sensitivity surrogate models are built in the design variable space outside the optimization loop using the kriging method or the moving least squares (MLS) method based on sample points generated from low-discrepancy sampling (LDS) to estimate the most probable point of failure (MPP). During the iterative deterministic optimization, the MPP is estimated from the surrogate model for each design point visited by the optimizer. The surrogate sensitivity model is also progressively updated for each new iteration of deterministic optimization by adding new points and their responses. Four example problems are presented showing the relative merits of the kriging and MLS approaches and the overall accuracy and improved efficiency of the proposed approach.
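The quantity the progressive surrogate approximates, the most probable point of failure (MPP), is the point on the limit state closest to the origin in standard normal space; a minimal sketch with a hypothetical linear limit state:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical limit state in standard normal space; g(u) = 0 is the
# failure surface (this g is an illustration, not from the article).
def g(u):
    return 3.0 - u[0] - 0.5 * u[1]

# MPP search: minimize ||u||^2 subject to g(u) = 0 (FORM formulation)
res = minimize(lambda u: np.dot(u, u), x0=[0.0, 0.0],
               constraints={"type": "eq", "fun": g})
u_star = res.x
beta = np.linalg.norm(u_star)  # reliability index
print(u_star, beta)
```

In the article's scheme, this search is performed against the sensitivity surrogate rather than the true responses, so each deterministic design iteration avoids fresh response evaluations.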
MODELLING SOCIAL CAPITAL AND GROWTH
Chou, Yuan K.
2002-01-01
This paper proposes three theoretical growth models incorporating social capital, based on varied expositions on the concept of social capital and the empirical evidence gathered to date. In these models, social capital impacts growth by assisting in the accumulation of human capital, by affecting financial development through its effects on collective trust and social norms, and by facilitating networking between firms that result in the creation and diffusion of business and technological i...
Bring Your Own Device - Providing Reliable Model of Data Access
Stąpór Paweł
2016-10-01
The article presents a model of Bring Your Own Device (BYOD) as a network model which provides the user reliable access to network resources. BYOD is a dynamically developing model which can be applied in many areas. A research network was set up in order to carry out tests in which the Work Folders service was used as a service of the BYOD model. This service allows the user to synchronize files between the device and the server. Access to the network is provided over wireless communication using the 802.11n standard. The obtained results are presented and analyzed in this article.
Nakamura, Daisuke; Suzumura, Akitoshi; Shigetoh, Keisuke [Toyota Central R and D Labs., Inc., Nagakute, Aichi 480-1192 (Japan)
2015-02-23
Highly reliable low-cost protective coatings have been sought after for use in crucibles and susceptors for bulk and epitaxial film growth processes involving wide bandgap materials. Here, we propose a production technique for ultra-thick (50–200 μmt) tantalum carbide (TaC) protective coatings on graphite substrates, which consists of TaC slurry application and subsequent sintering processes, i.e., a wet ceramic process. Structural analysis of the sintered TaC layers indicated that they have a dense granular structure containing coarse grain with sizes of 10–50 μm. Furthermore, no cracks or pinholes penetrated through the layers, i.e., the TaC layers are highly reliable protective coatings. The analysis also indicated that no plastic deformation occurred during the production process, and the non-textured crystalline orientation of the TaC layers is the origin of their high reliability and durability. The TaC-coated graphite crucibles were tested in an aluminum nitride (AlN) sublimation growth process, which involves extremely corrosive conditions, and demonstrated their practical reliability and durability in the AlN growth process as a TaC-coated graphite. The application of the TaC-coated graphite materials to crucibles and susceptors for use in bulk AlN single crystal growth, bulk silicon carbide (SiC) single crystal growth, chemical vapor deposition of epitaxial SiC films, and metal-organic vapor phase epitaxy of group-III nitrides will lead to further improvements in crystal quality and reduced processing costs.
NHPP-Based Software Reliability Models Using Equilibrium Distribution
Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi
Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
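For context, a baseline NHPP SRM (the classic Goel-Okumoto model, not the equilibrium-distribution variant proposed here) can be fitted to cumulative fault counts; the data set below is hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

# Goel-Okumoto mean value function: m(t) = a * (1 - exp(-b t)),
# where a is the expected total fault content and b the detection rate.
def mean_value(t, a, b):
    return a * (1.0 - np.exp(-b * t))

# Hypothetical cumulative fault counts at the end of each test week
t = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
cum_faults = np.array([8, 15, 20, 24, 27, 29, 31, 32, 33, 34], dtype=float)

(a_hat, b_hat), _ = curve_fit(mean_value, t, cum_faults, p0=(40.0, 0.2))
remaining = a_hat - mean_value(t[-1], a_hat, b_hat)  # expected residual faults
print(a_hat, b_hat, remaining)
```

The proposed approach replaces the fault-detection time distribution implicit in m(t) with its equilibrium distribution, keeping the same estimation machinery.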
Simulation modeling of reliability and efficiency of mine ventilation systems
Ushakov, V.K. (Moskovskii Gornyi Institut (USSR))
1991-06-01
Discusses a method developed by the MGI institute for computerized simulation of operation of ventilation systems used in deep underground coal mines. The modeling is aimed at assessment of system reliability and efficiency (probability of failure-free operation and stable air distribution). The following stages of the simulation procedure are analyzed: development of a scheme of the ventilation system (type, aerodynamic characteristics and parameters that describe system elements, e.g. ventilation tunnels, ventilation equipment, main blowers etc., dynamics of these parameters depending among others on mining and geologic conditions), development of mathematical models that describe system characteristics as well as external factors and their effects on the system, development of a structure of the simulated ventilation system, development of an algorithm, development of the final computer program for simulation of a mine ventilation system. Use of the model for forecasting reliability of air supply and efficiency of mine ventilation is discussed. 2 refs.
Lazzaroni, Massimo
2012-01-01
This book gives a practical guide for designers and users in Information and Communication Technology context. In particular, in the first Section, the definition of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In ICT context, the failure rate for a given system can be
The establishment of reliability model for LED lamps
Jian, Hao; Lei, Jing; Yao, Wang; Qun, Gao; Hongliang, Ke; Xiaoxun, Wang; Yanchao, Zhang; Qiang, Sun; Zhijun, Xu
2016-07-01
In order to determine which distributions and model-building methods are more suitable for analyzing the accelerated aging of LED lamps, three methods of establishing a reliability model (an approximate method, an analytical method and a two-stage method) are used in this paper to analyze the experimental data under both the Weibull distribution and the Lognormal distribution. Ten LED lamps are selected for the accelerated aging experiment, and the luminous fluxes are measured at an accelerated aging temperature. The AIC information criterion is adopted in the evaluation of the models. The results show that the accuracies of the analytical method and the two-stage method are higher than that of the approximate method, with the widths of the confidence intervals of the unknown parameters of the reliability model being smallest for the two-stage method. In a comparison between the two types of distributions, the accuracies are nearly identical. Project supported by the National High Technology Research and Development Program of China (Nos. 2015AA03A101, 2013AA03A116), the Cuican Project of Chinese Academy of Sciences (No. KZCC-EW-102), and the Jilin Province Science and Technology Development Plan Item (No. 20130206018GX).
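The Weibull-versus-lognormal comparison via AIC can be sketched as follows, on synthetic lifetimes rather than the paper's LED flux measurements:

```python
import numpy as np
from scipy.stats import weibull_min, lognorm

# Hypothetical lifetimes drawn from a Weibull law (not the LED data).
rng = np.random.default_rng(1)
lifetimes = weibull_min.rvs(c=2.0, scale=1000.0, size=300, random_state=rng)

def aic(dist, data, **fit_kwargs):
    # Fit by maximum likelihood and return AIC = 2k - 2 log L.
    # The parameter count includes the fixed location, but it is the
    # same for both candidate fits, so the comparison is unaffected.
    params = dist.fit(data, **fit_kwargs)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik

aic_weibull = aic(weibull_min, lifetimes, floc=0)
aic_lognorm = aic(lognorm, lifetimes, floc=0)
print(aic_weibull, aic_lognorm)  # lower AIC -> preferred distribution
```

Since the synthetic data are truly Weibull, the Weibull fit should win the AIC comparison here; on real degradation data the outcome is an empirical question, as the paper's near-identical accuracies suggest.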
Bayesian synthetic evaluation of multistage reliability growth with instant and delayed fix modes
[No author listed]
2008-01-01
In the multistage reliability growth tests with instant and delayed fix modes, the failure data can be assumed to follow Weibull processes with different parameters at different stages. For the Weibull process within a stage, by the proper selection of prior distribution form and the parameters, a concise posterior distribution form is obtained, thus simplifying the Bayesian analysis. In the multistage tests, the improvement factor is used to convert the posterior of one stage to the prior of the subsequent stage. The conversion criterion is carefully analyzed to determine the distribution parameters of the subsequent stage's variable reasonably. Based on the mentioned results, a new synthetic Bayesian evaluation program and algorithm framework is put forward to evaluate the multistage reliability growth tests with instant and delayed fix modes. The example shows the effectiveness and flexibility of this method.
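The stage-to-stage prior conversion described above can be sketched under strong simplifying assumptions: exponential times between failures within a stage, a conjugate Gamma prior on the failure rate, and a scalar improvement factor. All numbers and the conversion rule itself are hypothetical illustrations, not the paper's Weibull-process formulation:

```python
# Gamma-exponential conjugate update within one test stage, followed by a
# simple improvement-factor conversion to the next stage's prior.
def posterior(a, b, failure_times):
    # Gamma(a, b) prior on the failure rate; exponential likelihood:
    # posterior is Gamma(a + n, b + total test time)
    return a + len(failure_times), b + sum(failure_times)

a, b = 2.0, 100.0                     # hypothetical stage-1 prior
a, b = posterior(a, b, [30.0, 45.0, 80.0])
post_mean_rate = a / b                # stage-1 posterior mean failure rate

improvement = 1.3                     # assumed reliability-growth factor
# Next stage's prior keeps the shape but scales the expected rate down,
# reflecting fixes introduced between stages.
a2, b2 = a, b * improvement
print(post_mean_rate, a2 / b2)
```

The paper's contribution is a principled version of exactly this conversion step for Weibull processes with instant and delayed fix modes.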
Human Performance Modeling for Dynamic Human Reliability Analysis
Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory
2015-08-01
Part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.
Harris, D.O.; Lim, E.Y.; Dedhia, D.D.; Woo, H.H.; Chou, C.K.
1982-06-01
The efforts concentrated on modifications of the stratified Monte Carlo code called PRAISE (Piping Reliability Analysis Including Seismic Events) to make it more widely applicable to probabilistic fracture mechanics analysis of nuclear reactor piping. Pipe failures are considered to occur as the result of crack-like defects introduced during fabrication that escape detection during inspections. The code modifications allow the following factors, in addition to those considered in earlier work, to be treated: other materials, failure criteria and subcritical crack growth characteristics; welding residual and vibratory stresses; and longitudinal welds (the original version considered only circumferential welds). The fracture mechanics background for the code modifications is included, and details of the modifications themselves are provided. Additionally, an updated version of the PRAISE user's manual is included. The revised code, known as PRAISE-B, was then applied to a variety of piping problems, including various size lines subject to stress corrosion cracking and vibratory stresses. Analyses including residual stresses and longitudinal welds were also performed.
Testing the reliability of ice-cream cone model
Pan, Zonghao; Shen, Chenglong; Wang, Chuanbing; Liu, Kai; Xue, Xianghui; Wang, Yuming; Wang, Shui
2015-04-01
The properties of Coronal Mass Ejections (CMEs) are important not only to the physics itself but also to space-weather prediction. Several models (such as the cone model, the GCS model, and so on) have been developed to remove the projection effects from the properties observed by spacecraft. According to SOHO/LASCO observations, we obtain the "real" 3D parameters of all the FFHCMEs (front-side full-halo Coronal Mass Ejections) within the 24th solar cycle up to July 2012, using the ice-cream cone model. Considering that methods which obtain 3D parameters from multi-satellite, multi-angle CME observations have higher accuracy, we use the GCS model to obtain the real propagation parameters of these CMEs in 3D space and compare the results with those from the ice-cream cone model. We then discuss the reliability of the ice-cream cone model.
Czochralski crystal growth: Modeling study
Dudukovic, M. P.; Ramachandran, P. A.; Srivastava, R. K.; Dorsey, D.
1986-01-01
The modeling study of Czochralski (Cz) crystal growth is reported. The approach was to relate, in a quantitative manner, crystal quality to operating conditions and geometric variables, using models based on first principles. The finite element method is used for all calculations.
Model uncertainty in growth empirics
Prüfer, P.
2008-01-01
This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high gro
Aircraft conceptual design modelling incorporating reliability and maintainability predictions
Vaziry-Zanjany , Mohammad Ali (F)
1996-01-01
A computer assisted conceptual aircraft design program has been developed (CACAD). It has an optimisation capability, with extensive break-down in maintenance costs. CACAD's aim is to optimise the size, and configurations of turbofan-powered transport aircraft. A methodology was developed to enhance the reliability of current aircraft systems, and was applied to avionics systems. R&M models of thermal management were developed and linked with avionics failure rate and its ma...
A literature review on inventory modeling with reliability consideration
Imtiaz Ahmed
2014-01-01
Inventories are materials stored either waiting for processing or experiencing processing, and in some cases held for future delivery. Inventories are treated both as a blessing and an evil. Because they are like money placed in a drawer, assets tied up in investments, incurring costs for the care of the stored material, and subject to spoilage and obsolescence, industries have developed a spate of programs aimed at reducing inventory levels and increasing efficiency on the shop floor. Nevertheless, inventories do serve positive purposes: they provide a stable source of the input required for production, and less frequent replenishment may reduce ordering costs because of economies of scale. Finished goods inventories provide for better customer service. Formulating a suitable inventory model is therefore one of the major concerns for an industry. Considering the reliability of any process is likewise an important trend in current research activities. Inventory models can be both deterministic and probabilistic, and both must account for the reliability of the associated production process. This paper discusses the major works in the field of inventory modeling driven by reliability considerations, ranging from the very beginning to the latest works just published.
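The textbook EOQ model is the baseline that the reliability-driven inventory models in this review extend; the demand, cost, and reliability figures below are hypothetical:

```python
import math

# Classic economic order quantity: Q* = sqrt(2 D K / h)
def eoq(demand_rate, order_cost, holding_cost):
    return math.sqrt(2.0 * demand_rate * order_cost / holding_cost)

D, K, h = 12_000.0, 150.0, 2.4   # units/yr, cost/order, holding cost/unit/yr
q_star = eoq(D, K, h)

# Crude illustration of a process-reliability adjustment: if only a
# fraction p of produced items conforms, inflate the order size to cover
# the expected shortfall (a simplification of the published models).
p = 0.95
q_adjusted = q_star / p
print(q_star, q_adjusted)
```

The surveyed models refine this kind of adjustment by treating defect rates, screening, and process deterioration explicitly rather than as a single scaling factor.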
DESIGNING, MODELLING AND OPTIMISING OF AN INTEGRATED RELIABILITY REDUNDANT SYSTEM
G. Sankaraiah
2012-01-01
ENGLISH ABSTRACT: The reliability of a system is generally treated as a function of cost, but in many real-life situations reliability will depend on a variety of factors. It is therefore interesting to probe the hidden impact of constraints other than cost, such as weight, volume, and space. This paper attempts to study the impact of multiple constraints on system reliability. For the purposes of analysis, an integrated redundant reliability system is considered, modelled, and solved by applying a Lagrangian multiplier that gives a real-valued solution for the number of components, for the reliability at each stage, and for the system. The problem is further studied by using a heuristic algorithm and an integer programming method, and is validated by sensitivity analysis to present an integer solution.
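The integer side of such a redundancy allocation can be sketched by brute-force enumeration; the component reliabilities, costs, weights, and resource limits below are hypothetical:

```python
from itertools import product
import math

# Series system of 3 stages, each with n_i identical parallel components.
r = [0.80, 0.85, 0.90]        # component reliability per stage (hypothetical)
cost = [4.0, 5.0, 6.0]        # cost per component
weight = [3.0, 2.0, 4.0]      # weight per component
C_MAX, W_MAX = 60.0, 45.0     # cost and weight constraints

def system_reliability(n):
    # Stage i survives unless all n_i parallel components fail.
    return math.prod(1.0 - (1.0 - ri) ** ni for ri, ni in zip(r, n))

# Enumerate feasible integer allocations and keep the most reliable one.
best = max(
    (n for n in product(range(1, 6), repeat=3)
     if sum(c * k for c, k in zip(cost, n)) <= C_MAX
     and sum(w * k for w, k in zip(weight, n)) <= W_MAX),
    key=system_reliability,
)
print(best, system_reliability(best))
```

The Lagrangian approach in the article gives a real-valued allocation directly; enumeration or integer programming, as here, recovers the integer solution that the sensitivity analysis validates.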
Li, Qiuying; Pham, Hoang
2017-01-01
In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency, combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many NHPP-based software reliability growth models (SRGMs) have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) it is a common phenomenon that the fault detection rate changes during the testing phase; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. But few SRGMs in the literature differentiate between fault detection and fault removal, i.e., they seldom consider imperfect fault removal efficiency. In practical software development, however, fault removal efficiency cannot always be perfect: the failures detected might not be removed completely, the original faults might still exist, and new faults might be introduced meanwhile, which is referred to as the imperfect debugging phenomenon. In this study, a model incorporating the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to model fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results show that the model gives better fitting and predictive performance.
Reliability Assessment of IGBT Modules Modeled as Systems with Correlated Components
Kostandyan, Erik; Sørensen, John Dalsgaard
2013-01-01
configuration. The estimated system reliability by the proposed method is a conservative estimate. Application of the suggested method could be extended for reliability estimation of systems composing of welding joints, bolts, bearings, etc. The reliability model incorporates the correlation between...
Reliability modelling - PETROBRAS 2010 integrated gas supply chain
Faertes, Denise; Heil, Luciana; Saker, Leonardo; Vieira, Flavia; Risi, Francisco; Domingues, Joaquim; Alvarenga, Tobias; Carvalho, Eduardo; Mussel, Patricia
2010-09-15
The purpose of this paper is to present the innovative reliability modeling of the Petrobras 2010 integrated gas supply chain. The model represents a challenge in terms of complexity and software robustness. It was jointly developed by the PETROBRAS Gas and Power Department and Det Norske Veritas, with the objective of evaluating the security of supply of the 2010 gas network design conceived to connect the Brazilian Northeast and Southeast regions. To provide best-in-class analysis, state-of-the-art software was used to quantify the availability and the efficiency of the overall network and its individual components.
Reliability physics and engineering time-to-failure modeling
McPherson, J W
2013-01-01
Reliability Physics and Engineering provides critically important information that is needed for designing and building reliable cost-effective products. Key features include: · Materials/Device Degradation · Degradation Kinetics · Time-To-Failure Modeling · Statistical Tools · Failure-Rate Modeling · Accelerated Testing · Ramp-To-Failure Testing · Important Failure Mechanisms for Integrated Circuits · Important Failure Mechanisms for Mechanical Components · Conversion of Dynamic Stresses into Static Equivalents · Small Design Changes Producing Major Reliability Improvements · Screening Methods · Heat Generation and Dissipation · Sampling Plans and Confidence Intervals This textbook includes numerous example problems with solutions. Also, exercise problems along with the answers are included at the end of each chapter. Relia...
Reliability modeling of hydraulic system of drum shearer machine
SEYED HADI Hoseinie; MOHAMMAD Ataie; REZA Khalookakaei; UDAY Kumar
2011-01-01
The hydraulic system plays an important role in supplying power and transmitting it to the other working parts of a coal shearer machine. In this paper, the reliability of the hydraulic system of a drum shearer was analyzed. A case study was done in the Tabas Coal Mine in Iran for failure data collection. The results of the statistical analysis show that the time-between-failures (TBF) data of this system followed the three-parameter Weibull distribution. There is about a 54% chance that the hydraulic system of the drum shearer will not fail for the first 50 h of operation. The developed model shows that the reliability of the hydraulic system falls to zero after approximately 1 650 hours of operation. The failure rate of this system decreases as time increases. Therefore, corrective maintenance (run-to-failure) was selected as the best maintenance strategy for it.
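The reported behaviour can be reproduced qualitatively with the three-parameter Weibull reliability function; the shape, scale, and location values below are hypothetical stand-ins, not the fitted Tabas mine parameters:

```python
import math

# Three-parameter Weibull reliability: R(t) = exp(-((t - gamma) / eta)**beta)
def reliability(t, beta, eta, gamma):
    if t <= gamma:
        return 1.0
    return math.exp(-(((t - gamma) / eta) ** beta))

# Hypothetical parameters; shape beta < 1 gives a decreasing failure rate,
# consistent with the behaviour described in the abstract.
beta, eta, gamma = 0.9, 80.0, 0.0
r50 = reliability(50.0, beta, eta, gamma)
r1650 = reliability(1650.0, beta, eta, gamma)
print(r50, r1650)  # R(t) decays toward zero for large t
```

A decreasing failure rate (beta below 1) is what justifies the run-to-failure strategy chosen in the study: preventive replacement of a component whose hazard is falling offers no benefit.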
Charge transport model to predict intrinsic reliability for dielectric materials
Ogden, Sean P. [Howard P. Isermann Department of Chemical and Biological Engineering, Rensselaer Polytechnic Institute, Troy, New York 12180 (United States); GLOBALFOUNDRIES, 400 Stonebreak Rd. Ext., Malta, New York 12020 (United States); Borja, Juan; Plawsky, Joel L., E-mail: plawsky@rpi.edu; Gill, William N. [Howard P. Isermann Department of Chemical and Biological Engineering, Rensselaer Polytechnic Institute, Troy, New York 12180 (United States); Lu, T.-M. [Department of Physics, Rensselaer Polytechnic Institute, Troy, New York 12180 (United States); Yeap, Kong Boon [GLOBALFOUNDRIES, 400 Stonebreak Rd. Ext., Malta, New York 12020 (United States)
2015-09-28
Several lifetime models, mostly empirical in nature, are used to predict reliability for low-k dielectrics used in integrated circuits. There is a dispute over which model provides the most accurate prediction for device lifetime at operating conditions. As a result, there is a need to transition from the use of these largely empirical models to one built entirely on theory. Therefore, a charge transport model was developed to predict the device lifetime of low-k interconnect systems. The model is based on electron transport and donor-type defect formation. Breakdown occurs when a critical defect concentration accumulates, resulting in electron tunneling and the emptying of positively charged traps. The enhanced local electric field lowers the barrier for electron injection into the dielectric, causing a positive feedforward failure. The charge transport model is able to replicate experimental I-V and I-t curves, capturing the current decay at early stress times and the rapid current increase at failure. The model is based on field-driven and current-driven failure mechanisms and uses a minimal number of parameters. All the parameters have some theoretical basis or have been measured experimentally and are not directly used to fit the slope of the time-to-failure versus applied field curve. Despite this simplicity, the model is able to accurately predict device lifetime for three different sources of experimental data. The simulation's predictions at low fields and very long lifetimes show that the use of a single empirical model can lead to inaccuracies in device reliability.
Yang, Hongfei; Chang, Edward C
2014-01-01
We examined the factor structure, reliability, and validity of the Chinese version of the Personal Growth Initiative Scale-II (CPGIS-II) using data from a sample of 927 Chinese university students. Consistent with previous findings, confirmatory factor analyses supported a 4-factor model of the CPGIS-II. Reliability analyses indicated that the 4 CPGIS-II subscales, namely Readiness for Change, Planfulness, Using Resources, and Intentional Behavior, demonstrated good internal consistency reliability and adequate test-retest reliability across a 4-week period. In addition, evidence for convergent and incremental validity was found in relation to measures of positive and negative psychological adjustment. Finally, results of hierarchical regression analyses indicated that the 4 personal growth initiative dimensions, especially planfulness, accounted for additional unique variance in psychological adjustment beyond resilience. Some implications for using the CPGIS-II in Chinese are discussed.
A Model of Ship Auxiliary System for Reliable Ship Propulsion
Dragan Martinović
2012-03-01
Full Text Available The main purpose of a vessel is to transport goods and passengers at minimum cost. Out of the analysis of relevant global databases on ship machinery failures, it is obvious that the most frequent failures occur precisely on the generator-running diesel engines. Any failure in the electrical system can leave the ship without propulsion, even if the main engine is working properly. In that case, the consequences could be devastating: higher running expenses, damage to the ship, oil spill or substantial marine pollution. These are the reasons why solutions that will prevent the ship being unable to manoeuvre during her exploitation should be implemented. Therefore, it is necessary to define a propulsion restoration model which would not depend on the primary electrical energy. The paper provides a model of the marine auxiliary system for more reliable propulsion. This includes starting, reversing and stopping of the propulsion engine. The proposed solution of reliable propulsion model based on the use of a shaft generator and an excitation engine enables the restoration of propulsion following total failure of the electrical energy primary production system, and the self-propelled ship navigation. A ship is an important factor in the Technology of Transport, and the implementation of this model increases safety, reduces downtime, and significantly decreases hazards of pollution damage.KEYWORDSreliable propulsion, failure, ship auxiliary system, control, propulsion restoration
Modeling growth in biological materials
Jones, Gareth Wyn; Chapman, S. Jonathan
2012-01-01
The biomechanical modeling of growing tissues has recently become an area of intense interest. In particular, the interplay between growth patterns and mechanical stress is of great importance, with possible applications to arterial mechanics, embryo morphogenesis, tumor development, and bone remodeling. This review aims to give an overview of the theories that have been used to model these phenomena, categorized according to whether the tissue is considered as a continuum object or a collect...
Using the Weibull distribution reliability, modeling and inference
McCool, John I
2012-01-01
Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution
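One workhorse technique such a text covers is median-rank regression: plot sorted failure times on a linearized Weibull scale and fit a straight line to estimate shape and scale. A hedged pure-stdlib sketch, using Benard's median-rank approximation and synthetic data:

```python
import math
import random

def weibull_mrr(failure_times):
    """Median-rank regression: least-squares line on the linearized Weibull
    plot, ln(-ln(1 - F_i)) vs ln(t_i), returning (shape, scale) estimates."""
    t = sorted(failure_times)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        f = (i - 0.3) / (n + 0.4)  # Benard's median-rank approximation
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - f)))
    mx, my = sum(xs) / n, sum(ys) / n
    shape = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    scale = math.exp(mx - my / shape)  # from intercept = -shape * ln(scale)
    return shape, scale

# Synthetic failure data drawn from a known Weibull via inverse-CDF sampling
random.seed(1)
true_shape, true_scale = 2.0, 100.0
data = [true_scale * (-math.log(1.0 - random.random())) ** (1.0 / true_shape)
        for _ in range(200)]
shape_hat, scale_hat = weibull_mrr(data)
print(round(shape_hat, 2), round(scale_hat, 1))
```

With 200 simulated failures the estimates land close to the generating parameters; with the handful of failures typical in practice, the book's inferential machinery (confidence bounds, censoring) becomes essential.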
Reliability prediction from burn-in data fit to reliability models
Bernstein, Joseph
2014-01-01
This work will educate chip and system designers on a method for accurately predicting circuit and system reliability in order to estimate failures that will occur in the field as a function of operating conditions at the chip level. This book will combine the knowledge taught in many reliability publications and illustrate how to use the knowledge presented by the semiconductor manufacturing companies in combination with the HTOL end-of-life testing that is currently performed by the chip suppliers as part of their standard qualification procedure and make accurate reliability predictions. Th
Modeling service time reliability in urban ferry system
Chen, Yifan; Luo, Sida; Zhang, Mengke; Shen, Hanxia; Xin, Feifei; Luo, Yujie
2017-09-01
The urban ferry system can carry a large number of travelers, which may alleviate the pressure on road traffic. As an indicator of its service quality, service time reliability (STR) plays an essential part in attracting travelers to the ferry system. A wide array of studies have been conducted to analyze the STR of land transportation. However, the STR of ferry systems has received little attention in the transportation literature. In this study, a model was established to obtain the STR in urban ferry systems. First, the probability density function (PDF) of the service time provided by ferry systems was constructed. Considering the deficiency of the queuing theory, this PDF was determined by Bayes’ theorem. Then, to validate the function, the results of the proposed model were compared with those of the Monte Carlo simulation. With the PDF, the reliability could be determined mathematically by integration. Results showed how the factors including the frequency, capacity, time schedule and ferry waiting time affected the STR under different degrees of congestion in ferry systems. Based on these results, some strategies for improving the STR were proposed. These findings are of great significance to increasing the share of ferries among various urban transport modes.
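The Monte Carlo comparison the authors describe can be sketched generically: draw waiting and crossing times, and count the fraction of trips finishing within a service-time threshold. The distributional choices and all numbers below are assumptions for illustration, not the paper's calibrated model:

```python
import random

def service_time_reliability(threshold, headway, cross_mean, cross_sd,
                             n=100_000, seed=42):
    """Monte Carlo estimate of P(waiting + crossing time <= threshold).
    Waiting ~ Uniform(0, headway) for random arrivals at a fixed-headway
    ferry; crossing ~ Normal(cross_mean, cross_sd), truncated at zero."""
    rng = random.Random(seed)
    ok = sum(1 for _ in range(n)
             if rng.uniform(0.0, headway)
                + max(0.0, rng.gauss(cross_mean, cross_sd)) <= threshold)
    return ok / n

# Hypothetical line: 20 min headway, 15 +/- 3 min crossing, 30 min target
print(round(service_time_reliability(30.0, 20.0, 15.0, 3.0), 3))
```

Raising the frequency (shrinking the headway) directly raises the estimate, mirroring the strategy discussion in the abstract.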
System reliability assessment with an approximate reasoning model
Eisenhawer, S.W.; Bott, T.F.; Helm, T.M.; Boerigter, S.T.
1998-12-31
The projected service life of weapons in the US nuclear stockpile will exceed the original design life of their critical components. Interim metrics are needed to describe weapon states for use in simulation models of the nuclear weapons complex. The authors present an approach to this problem based upon the theory of approximate reasoning (AR) that allows meaningful assessments to be made in an environment where reliability models are incomplete. AR models are designed to emulate the inference process used by subject matter experts. The emulation is based upon a formal logic structure that relates evidence about components. This evidence is translated using natural language expressions into linguistic variables that describe membership in fuzzy sets. The authors introduce a metric that measures the acceptability of a weapon to nuclear deterrence planners. Implication rule bases are used to draw a series of forward chaining inferences about the acceptability of components, subsystems and individual weapons. They describe each component in the AR model in some detail and illustrate its behavior with a small example. The integration of the acceptability metric into a prototype model to simulate the weapons complex is also described.
Reliability assessment using degradation models: bayesian and classical approaches
Marta Afonso Freitas
2010-04-01
Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work, with characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.
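The simplest "classical" method usually presented for degradation data is the approximate (pseudo-failure-time) approach: fit a simple path model to each unit, extrapolate to the failure threshold, and analyze the resulting pseudo failure times. A sketch with simulated linear wear paths; all numbers are invented for illustration:

```python
import random

def pseudo_failure_times(paths, times, threshold):
    """Approximate degradation analysis: fit a straight line to each unit's
    degradation path, then extrapolate the threshold-crossing time."""
    n = len(times)
    mx = sum(times) / n
    sxx = sum((t - mx) ** 2 for t in times)
    out = []
    for y in paths:
        my = sum(y) / n
        slope = sum((t - mx) * (yi - my) for t, yi in zip(times, y)) / sxx
        intercept = my - slope * mx
        out.append((threshold - intercept) / slope)  # solve a + b*t = threshold
    return out

# Simulated wear paths: each unit degrades linearly at a unit-specific rate
random.seed(0)
times = [0, 100, 200, 300, 400]          # inspection times, hours
paths = []
for _ in range(20):
    rate = random.gauss(0.010, 0.002)    # wear rate, mm per hour
    paths.append([rate * t + random.gauss(0.0, 0.05) for t in times])

pf = pseudo_failure_times(paths, times, threshold=5.0)  # failure at 5 mm wear
print(round(sum(pf) / len(pf)))          # mean pseudo failure time, hours
```

The pseudo failure times can then be fed to any lifetime model (e.g. Weibull), which is exactly where the article's more rigorous classical and Bayesian alternatives improve on this two-stage shortcut.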
Testing mechanistic models of growth in insects
Maino, James L.; Kearney, Michael R.
2015-01-01
Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory to many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compare...
Reliable Estimation of Prediction Uncertainty for Physicochemical Property Models.
Proppe, Jonny; Reiher, Markus
2017-07-11
One of the major challenges in computational science is to determine the uncertainty of a virtual measurement, that is, the prediction of an observable based on calculations. As highly accurate first-principles calculations are in general unfeasible for most physical systems, one usually resorts to parametric property models of observables, which require calibration by incorporating reference data. The resulting predictions and their uncertainties are sensitive to systematic errors such as inconsistent reference data, parametric model assumptions, or inadequate computational methods. Here, we discuss the calibration of property models in the light of bootstrapping, a sampling method that can be employed for identifying systematic errors and for reliable estimation of the prediction uncertainty. We apply bootstrapping to assess a linear property model linking the (57)Fe Mössbauer isomer shift to the contact electron density at the iron nucleus for a diverse set of 44 molecular iron compounds. The contact electron density is calculated with 12 density functionals across Jacob's ladder (PWLDA, BP86, BLYP, PW91, PBE, M06-L, TPSS, B3LYP, B3PW91, PBE0, M06, TPSSh). We provide systematic-error diagnostics and reliable, locally resolved uncertainties for isomer-shift predictions. Pure and hybrid density functionals yield average prediction uncertainties of 0.06-0.08 mm s(-1) and 0.04-0.05 mm s(-1), respectively, the latter being close to the average experimental uncertainty of 0.02 mm s(-1). Furthermore, we show that both model parameters and prediction uncertainty depend significantly on the composition and number of reference data points. Accordingly, we suggest that rankings of density functionals based on performance measures (e.g., the squared coefficient of correlation, r(2), or the root-mean-square error, RMSE) should not be inferred from a single data set. This study presents the first statistically rigorous calibration analysis for theoretical M
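The pairs (case-resampling) bootstrap can be illustrated on a toy linear property model: refit the calibration line on resampled data and read the spread of predictions as the uncertainty estimate. The data below are synthetic stand-ins, not the Mössbauer set:

```python
import random
import statistics

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def bootstrap_prediction(xs, ys, x_new, n_boot=2000, seed=7):
    """Pairs bootstrap: refit the calibration line on resampled (x, y) pairs
    and summarize the spread of predictions at x_new."""
    rng = random.Random(seed)
    n = len(xs)
    preds = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        a, b = fit_line([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(a + b * x_new)
    return statistics.mean(preds), statistics.stdev(preds)

# Synthetic calibration data (a stand-in for density/isomer-shift pairs)
random.seed(1)
xs = [i / 10 for i in range(30)]
ys = [0.5 + 2.0 * x + random.gauss(0.0, 0.1) for x in xs]
mean_pred, unc = bootstrap_prediction(xs, ys, x_new=1.5)
print(round(mean_pred, 2), round(unc, 3))
```

Because each resample can change both parameters and uncertainty, this directly demonstrates the paper's point that calibration results depend on the composition of the reference set.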
Modeling turkey growth with the relative growth rate.
Maruyama, K; Potts, W J; Bacon, W L; Nestor, K E
1998-01-01
Six sigmoidal growth curves and two growth curves derived from a two-phase relative growth rate model were evaluated using experimental body-weight data from male and female turkeys of two genetic lines: a fast-growing (F) line and a randombred control (RBC) line from which the F line was developed. When their root mean square error was compared to that of the local regression smoother, all six sigmoidal growth curves (the logistic, Gompertz, von Bertalanffy, Richards, Weibull, and Morgan-Mercer-Flodin) demonstrated a lack of fit. The primary source of the systematic lack of fit was identified with nonparametric estimates of the relative growth rate (the growth rate as a fraction of the body weight) of 20 turkeys. When the relative growth rate was estimated from the above sigmoidal growth curves, none could accommodate the features of the nonparametric estimates of the relative growth rate. Based on these features of the relative growth rate, two new growth curves were derived from a segmented two-phase model. Both models, in which the relative growth rate decreases in two linear phases with slopes beta1 and beta2 joined at time = kappa, gave growth curves that fit the experimental data acceptably. The linear-linear model with a smooth transition gave a better fit than the model with an abrupt transition. When the growth curves of male and female turkeys were compared, beta1, beta2, and kappa were smaller in males. When the F line was compared to the RBC line, beta1 and kappa were smaller and beta2 was closer to zero, indicating that the relative growth rate declined rapidly until about 61 days of age in the F line, while it declined less rapidly until about 71 days of age in the RBC line.
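The segmented two-phase model described here, with the relative growth rate falling along two straight lines joined at time kappa, integrates in closed form because dW/dt = RGR(t) * W implies W(t) = W0 * exp(integral of RGR). A sketch of the abrupt-transition variant with invented, roughly turkey-like parameter values (not the paper's estimates):

```python
import math

def two_phase_rgr(t, r0, beta1, beta2, kappa):
    """Relative growth rate falling along two joined linear phases."""
    if t <= kappa:
        return r0 + beta1 * t
    return r0 + beta1 * kappa + beta2 * (t - kappa)

def weight(t, w0, r0, beta1, beta2, kappa):
    """Body weight from dW/dt = RGR(t) * W, i.e. W(t) = w0 * exp(int RGR dt)."""
    if t <= kappa:
        area = r0 * t + 0.5 * beta1 * t * t
    else:
        area = (r0 * kappa + 0.5 * beta1 * kappa ** 2
                + (r0 + beta1 * kappa) * (t - kappa)
                + 0.5 * beta2 * (t - kappa) ** 2)
    return w0 * math.exp(area)

# Invented parameters (hypothetical, NOT fitted values from the paper)
w0, r0, beta1, beta2, kappa = 0.06, 0.09, -0.001, -0.0002, 60.0
for day in (0, 60, 140):
    print(day, round(weight(day, w0, r0, beta1, beta2, kappa), 2))
```

Fitting would replace the hard-coded parameters with least-squares estimates; the paper's preferred variant additionally smooths the transition at kappa.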
Numerical Model based Reliability Estimation of Selective Laser Melting Process
Mohanty, Sankhya; Hattel, Jesper Henri
2014-01-01
Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While … of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established.
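The Monte Carlo uncertainty-propagation step can be sketched with a toy response model standing in for the calibrated finite-volume simulation: sample the uncertain inputs, push them through the model, and report the fraction of outcomes inside a specification window. The melt-depth formula and every number below are illustrative assumptions:

```python
import random

def melt_depth(power_w, speed_mm_s):
    """Toy stand-in for the calibrated numerical model: melt depth scales
    with line energy (power / speed). Purely illustrative."""
    return 0.4 * power_w / speed_mm_s  # mm

def process_reliability(n=50_000, seed=3):
    """Sample uncertain inputs, propagate them through the model, and report
    the fraction of melt depths inside the specification window."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        p = rng.gauss(200.0, 10.0)   # laser power, W
        v = rng.gauss(800.0, 40.0)   # scan speed, mm/s
        if 0.09 <= melt_depth(p, v) <= 0.11:
            ok += 1
    return ok / n

print(round(process_reliability(), 3))
```

In the paper the response model is the calibrated thermal simulation itself; the sampling-and-counting logic is the same.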
Fatigue reliability based on residual strength model with hybrid uncertain parameters
Jun Wang; Zhi-Ping Qiu
2012-01-01
The aim of this paper is to evaluate the fatigue reliability with hybrid uncertain parameters based on a residual strength model. By solving the non-probabilistic set-based reliability problem and analyzing the reliability with randomness, the fatigue reliability with hybrid parameters can be obtained. The presented hybrid model can adequately consider all uncertainties affecting the fatigue reliability with hybrid uncertain parameters. A comparison among the presented hybrid model, the non-probabilistic set-theoretic model and the conventional random model is made through two typical numerical examples. The results show that the presented hybrid model, which can ensure structural security, is effective and practical.
Methods of modelling relative growth rate
Arne Pommerening; Anders Muszta
2015-01-01
Background: Analysing and modelling plant growth is an important interdisciplinary field of plant science. The use of relative growth rates, involving the analysis of plant growth relative to plant size, has more or less independently emerged in different research groups and at different times and has provided powerful tools for assessing the growth performance and growth efficiency of plants and plant populations. In this paper, we explore how these isolated methods can be combined to form a consistent methodology for modelling relative growth rates. Methods: We review and combine existing methods of analysing and modelling relative growth rates and apply a combination of methods to Sitka spruce (Picea sitchensis (Bong.) Carr.) stem-analysis data from North Wales (UK) and British Douglas fir (Pseudotsuga menziesii (Mirb.) Franco) yield table data. Results: The results indicate that, by combining the approaches of different plant-growth analysis laboratories and using them simultaneously, we can advance and standardise the concept of relative plant growth. Particularly the growth multiplier plays an important role in modelling relative growth rates. Another useful technique has been the recent introduction of size-standardised relative growth rates. Conclusions: Modelling relative growth rates mainly serves two purposes: 1) an improved analysis of growth performance and efficiency and 2) the prediction of future or past growth rates. This makes the concept of relative growth ideally suited to growth reconstruction as required in dendrochronology, climate change and forest decline research and for interdisciplinary research projects beyond the realm of plant science.
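The core quantities here, the mean relative growth rate over an interval and the associated growth multiplier, reduce to two one-line formulas: RGR = (ln w2 - ln w1) / (t2 - t1) and M = w2 / w1 = exp(RGR * dt). A minimal sketch; the stem-volume numbers are invented:

```python
import math

def relative_growth_rate(w1, w2, t1, t2):
    """Mean relative growth rate over [t1, t2]: (ln w2 - ln w1) / (t2 - t1)."""
    return (math.log(w2) - math.log(w1)) / (t2 - t1)

def growth_multiplier(rgr, dt):
    """Growth multiplier over an interval: w2 / w1 = exp(RGR * dt)."""
    return math.exp(rgr * dt)

# A stem that grows from 12.0 to 15.0 cm^3 over 5 years
rgr = relative_growth_rate(12.0, 15.0, 0, 5)
print(round(rgr, 4))                         # per-year relative growth rate
print(round(growth_multiplier(rgr, 5), 3))   # recovers 15/12 = 1.25
```

The log-difference form is what makes RGR comparisons size-independent, which is the property the size-standardised variants in the paper build on.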
Hamlin, Teri L.
2011-01-01
It is important to the Space Shuttle Program (SSP), as well as future manned spaceflight programs, to understand the early mission risk and progression of risk as the program gains insights into the integrated vehicle through flight. The risk progression is important to the SSP as part of the documentation of lessons learned. The risk progression is important to future programs to understand reliability growth and the first flight risk. This analysis uses the knowledge gained from 30 years of operational flights and the current Shuttle PRA to calculate the risk of Loss of Crew and Vehicle (LOCV) at significant milestones beginning with the first flight. Key flights were evaluated based upon historical events and significant re-designs. The results indicated that the Shuttle risk tends to follow a step function as opposed to following a traditional reliability growth pattern where risk exponentially improves with each flight. In addition, it shows that risk can increase due to trading safety margin for increased performance or due to external events. Because the risk drivers were not being addressed, the risk did not improve appreciably during the first 25 flights. It was only after significant events occurred, such as Challenger and Columbia, where the risk drivers became apparent, that risk was significantly improved. In addition, this paper will show that the SSP has reduced the risk of LOCV by almost an order of magnitude. It is easy to look back after 30 years and point to risks that are now obvious; however, the key is to use this knowledge to benefit other programs which are in their infancy stages. One lesson learned from the SSP is that understanding risk drivers is essential in order to considerably reduce risk. This will enable the new program to focus time and resources on identifying and reducing the significant risks. A comprehensive PRA, similar to that of the Shuttle PRA, is an effective tool for quantifying risk drivers if support from all of the stakeholders is
A particle swarm model for estimating reliability and scheduling system maintenance
Puzis, Rami; Shirtz, Dov; Elovici, Yuval
2016-05-01
Modifying data and information system components may introduce new errors and deteriorate the reliability of the system. Reliability can be efficiently regained with reliability-centred maintenance, which requires reliability estimation for maintenance scheduling. A variant of the particle swarm model is used to estimate the reliability of systems implemented according to the model-view-controller paradigm. Simulations based on data collected from an online system of a large financial institute are used to compare three component-level maintenance policies. Results show that appropriately scheduled component-level maintenance greatly reduces the cost of upholding an acceptable level of reliability by reducing the need for system-wide maintenance.
Dejun Yang
Simulations for root growth, crop growth, and N uptake in agro-hydrological models are of significant concern to researchers. SWMS_2D is one of the most widely used physically based, hydrology-related models. This model solves the equations that govern soil-water movement by the finite element method, and has a publicly accessible source code. Incorporating key agricultural components into the SWMS_2D model is of practical importance, especially for modeling critical cereal crops such as winter wheat. We added root growth, crop growth, and N uptake modules into SWMS_2D. The root growth model had two sub-models, one for root penetration and the other for root length distribution. The crop growth model used was adapted from EU-ROTATE_N, linked to the N uptake model. Soil-water limitation, nitrogen limitation, and temperature effects were all considered in dry-weight modeling. Field experiments on winter wheat in Bouwing, the Netherlands, in 1983-1984 were selected for validation. Good agreement was achieved between simulations and measurements, including soil water content at different depths, normalized root length distribution, dry weight and nitrogen uptake. This indicates that the proposed new modules used in the SWMS_2D model are robust and reliable. In the future, more rigorous validation should be carried out, ideally under 2D situations, and attention should be paid to improving some modules, including the module simulating soil N mineralization.
Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard
2015-01-01
Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation … uncertainties can be implemented in probabilistic reliability assessments.
Coordination polyhedron growth mechanism model and growth habit of crystals
[No author listed]
2001-01-01
A new growth mechanism model, the coordination polyhedron growth mechanism model, is introduced from the angle of the coordination of anions and cations to each other at the interface. It is pointed out that the force driving the growth unit to enter the crystal lattice is the electrostatic attraction between ions, whose relative size can be approximately measured by the electrostatic bond strength (EBS) that reaches a nearest-neighbor anion (or cation) in the parent phase from a cation (or anion) at the interface. The growth habits of NaCl, ZnS, CaF2 and CsI crystals are discussed, and a new growth habit rule is proposed as follows. When the growth rate of a crystal is determined by the step generation rate, the growth habit of this crystal is related to the coordination number of the ion with the smallest coordination rate at the interface of the various crystal faces. The smaller the coordination number of the ion at the interface, the faster the growth rate of the corresponding crystal face. When the growth of a crystal depends on the step movement rate, the growth habit of this crystal is related to the density of the ion with the smallest coordination rate at the interface of the various crystal faces. The smaller the density of the ion at the interface, the faster the growth rate of the corresponding crystal face will be.
2017-01-01
Current evidence on the reliability of growth indicators in the identification of the pubertal growth spurt, and on the efficiency of functional treatment for skeletal Class II malocclusion whose timing relies on such indicators, is highly controversial. Among growth indicators, the hand-and-wrist (including the sole middle phalanx of the third finger) maturation method and standing height recording appear to be the most reliable. Other methods are subject to controversy or were shown to be unreliable. Main sources of controversy include the use of single stages instead of ossification events and diagnostic reliability conjecturally based on correlation analyses. Regarding evidence on the efficiency of functional treatment, a more favorable response is seen in skeletal Class II patients treated during the pubertal growth spurt, even though large individual variation in responsiveness remains. Main sources of controversy include the design of clinical trials, the definition of Class II malocclusion, and the lack of inclusion of skeletal maturity among the prognostic factors. While no growth indicator may be considered fully diagnostically reliable in identifying the pubertal growth spurt, their use may still be recommended to increase the efficiency of functional treatment for skeletal Class II malocclusion. PMID:28168195
Reliability block diagrams to model the management of colorectal cancer.
Sonnenberg, A; Inadomi, J M
1999-02-01
The present study aims to show how various medical and nonmedical components contribute to success and failure in the management of colorectal cancer. The first encounter, subsequent diagnosis, and surgical therapy of a patient with Dukes B sigmoid cancer is modeled as a reliability block diagram with a serial and parallel arrangement of various components. The overall probability of a patient with new-onset colorectal cancer to visit a physician, be correctly diagnosed, and undergo successful therapy is 69%. The reduction in the overall success, despite the fact that the majority of components are assumed to function with failure rates of 5% or less, is a reflection of the multitude of serial subsystems involved in the management of the patient. In contrast, the parallel arrangement of subsystems results in a relative insensitivity of the overall system to failure, a greater stability, and an improved performance. Since no medical system functions perfectly, redundancy associated with parallel subsystems assures a better overall outcome. System analysis of health care provides a means to improve its performance.
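The serial/parallel arithmetic behind a reliability block diagram is simple: series blocks multiply reliabilities, while parallel (redundant) blocks multiply failure probabilities. A sketch with hypothetical component reliabilities, not the failure rates assumed in the paper:

```python
def series(*rs):
    """A series system works only if every block works."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """A parallel (redundant) system fails only if every block fails."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

# Hypothetical care pathway: visit -> diagnosis -> therapy in series, with
# two redundant diagnostic routes in parallel (illustrative numbers only)
visit = 0.95
diagnosis = parallel(0.90, 0.80)  # two independent routes to a correct diagnosis
therapy = 0.95
print(round(series(visit, diagnosis, therapy), 3))
```

Even with highly reliable individual steps, a long series chain erodes the overall success probability, which is exactly the effect the abstract attributes to the multitude of serial subsystems; the parallel diagnosis block shows how redundancy buys that reliability back.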
Rovinelli, Andrea; Guilhem, Yoann; Proudhon, Henry; Lebensohn, Ricardo A.; Ludwig, Wolfgang; Sangid, Michael D.
2017-06-01
Microstructurally small cracks exhibit large variability in their fatigue crack growth rate. It is accepted that the inherent variability in microstructural features is related to the uncertainty in the growth rate. However, due to (i) the lack of cycle-by-cycle experimental data, (ii) the complexity of the short crack growth phenomenon, and (iii) the incomplete physics of constitutive relationships, only empirical damage metrics have been postulated to describe the short crack driving force metric (SCDFM) at the mesoscale level. The identification of the SCDFM of polycrystalline engineering alloys is a critical need in order to achieve more reliable fatigue life prediction and improve material design. In this work, the first steps in the development of a general probabilistic framework are presented, which uses experimental results as an input, retrieves missing experimental data through crystal plasticity (CP) simulations, and extracts correlations utilizing machine learning and Bayesian networks (BNs). More precisely, experimental results representing cycle-by-cycle data of a short crack growing through a beta-metastable titanium alloy, VST-55531, have been acquired via phase and diffraction contrast tomography. These results serve as an input for FFT-based CP simulations, which provide the micromechanical fields influenced by the presence of the crack, complementing the information available from the experiment. In order to assess the correlation between postulated SCDFMs and experimental observations, the data are mined and analyzed utilizing BNs. Results show the ability of the framework to autonomously capture relevant correlations and the equivalence in the prediction capability of different postulated SCDFMs for the high cycle fatigue regime.
Reliability Modeling of Microelectromechanical Systems Using Neural Networks
Perera, J. Sebastian
2000-01-01
Microelectromechanical systems (MEMS) are a broad and rapidly expanding field that is currently receiving a great deal of attention because of the potential to significantly improve the ability to sense, analyze, and control a variety of processes, such as heating and ventilation systems, automobiles, medicine, aeronautical flight, military surveillance, weather forecasting, and space exploration. MEMS are very small and are a blend of electrical and mechanical components, with electrical and mechanical systems on one chip. This research establishes reliability estimation and prediction for MEMS devices at the conceptual design phase using neural networks. At the conceptual design phase, before devices are built and tested, traditional methods of quantifying reliability are inadequate because the device is not in existence and cannot be tested to establish the reliability distributions. A novel approach using neural networks is created to predict the overall reliability of a MEMS device based on its components and each component's attributes. The methodology begins with collecting attribute data (fabrication process, physical specifications, operating environment, property characteristics, packaging, etc.) and reliability data for many types of microengines. The data are partitioned into training data (the majority) and validation data (the remainder). A neural network is applied to the training data (both attribute and reliability); the attributes become the system inputs and the reliability data (cycles to failure), the system output. After the neural network is trained with sufficient data, the validation data are used to verify that the neural network provides accurate reliability estimates. The reliability of a new proposed MEMS device can then be estimated by using the appropriate trained neural networks developed in this work.
Hierarchical modeling for reliability analysis using Markov models. B.S./M.S. Thesis - MIT
Fagundo, Arturo
1994-01-01
Markov models represent an extremely attractive tool for the reliability analysis of many systems. However, Markov model state space grows exponentially with the number of components in a given system. Thus, for very large systems Markov modeling techniques alone become intractable in both memory and CPU time. Often a particular subsystem can be found within some larger system where the dependence of the larger system on the subsystem is of a particularly simple form. This simple dependence can be used to decompose such a system into one or more subsystems. A hierarchical technique is presented which can be used to evaluate these subsystems in such a way that their reliabilities can be combined to obtain the reliability for the full system. This hierarchical approach is unique in that it allows the subsystem model to pass multiple aggregate state information to the higher level model, allowing more general systems to be evaluated. Guidelines are developed to assist in the system decomposition. An appropriate method for determining subsystem reliability is also developed. This method gives rise to some interesting numerical issues. Numerical error due to roundoff and integration are discussed at length. Once a decomposition is chosen, the remaining analysis is straightforward but tedious. However, an approach is developed for simplifying the recombination of subsystem reliabilities. Finally, a real world system is used to illustrate the use of this technique in a more practical context.
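As a toy illustration of the kind of transient Markov reliability computation such hierarchical methods decompose, the sketch below integrates the Kolmogorov forward equations for a small absorbing chain by forward Euler (the two-component parallel system, its failure rate, and the no-repair assumption are invented for the example, not taken from the thesis):

```python
def markov_reliability(q, p0, t, steps=10000):
    """Transient solution of dp/dt = p*Q by forward Euler, where Q is the
    generator matrix (rows sum to zero).  Reliability at time t is the
    probability mass outside the absorbing 'failed' state (last index)."""
    n = len(q)
    p = list(p0)
    dt = t / steps
    for _ in range(steps):
        p = [p[j] + dt * sum(p[i] * q[i][j] for i in range(n))
             for j in range(n)]
    return 1.0 - p[-1]

# Two-component parallel system, constant failure rate lam per component,
# no repair: states = (both up, one up, both failed).
lam = 0.001
Q = [[-2 * lam, 2 * lam, 0.0],
     [0.0,      -lam,    lam],
     [0.0,       0.0,    0.0]]
rel = markov_reliability(Q, [1.0, 0.0, 0.0], t=1000.0)
# Analytically R(t) = 2*exp(-lam*t) - exp(-2*lam*t), so rel is ~0.60 here.
```

For this small chain the analytic answer is available, which makes it a convenient check on the numerical integration that the thesis's error analysis discusses at length.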
Mathematical modeling of microbial growth in milk
Jhony Tiago Teleken
2011-12-01
A mathematical model to predict microbial growth in milk was developed and analyzed. The model consists of a system of two differential equations of first order. The equations are based on physical hypotheses of population growth. The model was applied to five different sets of data of microbial growth in dairy products selected from Combase, which is the most important database in the area with thousands of datasets from around the world, and the results showed a good fit. In addition, the model provides equations for the evaluation of the maximum specific growth rate and the duration of the lag phase which may provide useful information about microbial growth.
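The abstract does not reproduce the paper's specific pair of equations; as a stand-in, the sketch below integrates a Baranyi-type first-order two-ODE system, a common formulation with an explicit lag phase (all parameter values are invented for illustration):

```python
def simulate_growth(mu_max=0.8, n0=1e3, n_max=1e9, q0=0.1, dt=0.01, t_end=30.0):
    """Forward-Euler integration of a Baranyi-type two-ODE growth model.
    q tracks physiological readiness of the cells (it drives the lag phase);
    n is the population density, capped logistically at n_max."""
    n, q = n0, q0
    t = 0.0
    out = [(t, n)]
    while t < t_end:
        alpha = q / (1.0 + q)                      # ~0 during lag, -> 1 later
        dn = mu_max * alpha * (1.0 - n / n_max) * n
        dq = mu_max * q
        n += dn * dt
        q += dq * dt
        t += dt
        out.append((t, n))
    return out

traj = simulate_growth()   # lag, then near-exponential rise, then plateau
```

The lag duration falls out of the readiness variable (roughly ln((1+q0)/q0)/mu_max), which is the kind of derived quantity the paper reports equations for.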
Reliability Modeling and Optimization Using Fuzzy Logic and Chaos Theory
Alexander Rotshtein
2012-01-01
Fuzzy set membership functions integrated with the logistic map as the chaos generator were used to create reliability bifurcation diagrams of a system with component redundancy. This paper shows that increasing the number of redundant components postpones the moment of the first bifurcation, which contributes most to the loss of reliability. Increasing redundancy also shrinks the oscillation orbit of the level of the system's membership in the reliable state. The paper includes the problem statement of redundancy optimization under conditions of chaotic behavior of influencing parameters and a genetic algorithm for solving this problem. The paper shows the possibility of designing chaos-tolerant systems with the required level of reliability.
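The orbit-shrinkage claim can be illustrated with a logistic map feeding a parallel-redundancy membership aggregation (the map parameter, the 1-(1-mu)^k aggregation rule, and all constants here are assumptions for illustration, not the authors' exact construction):

```python
def logistic_orbit(r=3.6, x0=0.4, n_transient=500, n_keep=200):
    """Iterate the chaotic logistic map x -> r*x*(1-x), discarding a
    transient, and return a sample of the attractor orbit."""
    x = x0
    for _ in range(n_transient):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(n_keep):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

def redundant_membership(orbit, k):
    """Membership of a system with k parallel redundant components in the
    'reliable' state: 1 - (1 - mu)^k (assumed aggregation rule)."""
    return [1.0 - (1.0 - x) ** k for x in orbit]

base = logistic_orbit()
amp1 = max(redundant_membership(base, 1)) - min(redundant_membership(base, 1))
amp3 = max(redundant_membership(base, 3)) - min(redundant_membership(base, 3))
# amp3 < amp1: redundancy compresses the oscillation orbit of the
# system's reliability membership, as the paper describes.
```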
Reliability Analysis of Wireless Sensor Networks Using Markovian Model
Jin Zhu
2012-01-01
This paper investigates reliability analysis of wireless sensor networks whose topology switches among possible connections governed by a Markov chain. We give quantitative relations between network topology, data acquisition rate, nodes' calculation ability, and network reliability. By applying the Lyapunov method, sufficient conditions for network reliability are proposed for such topology-switching networks with constant or varying data acquisition rate. With the conditions satisfied, the quantity of data transported through a wireless network node will not exceed node capacity, such that reliability is ensured. Our theoretical work helps to provide a deeper understanding of real-world wireless sensor networks, which may find its application in the fields of network design and topology control.
Mathematical models to characterize early epidemic growth: A review
Chowell, Gerardo; Sattenspiel, Lisa; Bansal, Shweta; Viboud, Cécile
2016-09-01
There is a long tradition of using mathematical models to generate insights into the transmission dynamics of infectious diseases and assess the potential impact of different intervention strategies. The increasing use of mathematical models for epidemic forecasting has highlighted the importance of designing reliable models that capture the baseline transmission characteristics of specific pathogens and social contexts. More refined models are needed however, in particular to account for variation in the early growth dynamics of real epidemics and to gain a better understanding of the mechanisms at play. Here, we review recent progress on modeling and characterizing early epidemic growth patterns from infectious disease outbreak data, and survey the types of mathematical formulations that are most useful for capturing a diversity of early epidemic growth profiles, ranging from sub-exponential to exponential growth dynamics. Specifically, we review mathematical models that incorporate spatial details or realistic population mixing structures, including meta-population models, individual-based network models, and simple SIR-type models that incorporate the effects of reactive behavior changes or inhomogeneous mixing. In this process, we also analyze simulation data stemming from detailed large-scale agent-based models previously designed and calibrated to study how realistic social networks and disease transmission characteristics shape early epidemic growth patterns, general transmission dynamics, and control of international disease emergencies such as the 2009 A/H1N1 influenza pandemic and the 2014-2015 Ebola epidemic in West Africa.
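The sub-exponential vs. exponential distinction reviewed here is commonly expressed through the generalized-growth model dC/dt = r*C(t)^p, where p = 1 recovers exponential growth and 0 <= p < 1 gives early polynomial growth. A closed-form sketch (parameter values are arbitrary):

```python
import math

def ggm_cases(t, r=0.5, p=1.0, c0=1.0):
    """Cumulative case count under the generalized-growth model
    dC/dt = r * C**p.  p = 1 is exponential growth; 0 <= p < 1 is
    sub-exponential (p = 0.5 gives growth quadratic in time)."""
    if p == 1.0:
        return c0 * math.exp(r * t)
    # Closed-form solution of the separable ODE for p != 1:
    return (c0 ** (1.0 - p) + r * (1.0 - p) * t) ** (1.0 / (1.0 - p))

# After 30 days, exponential growth dwarfs the sub-exponential profile:
exp_cases = ggm_cases(30.0, r=0.2, p=1.0)   # e^6, about 403 cases
sub_cases = ggm_cases(30.0, r=0.2, p=0.5)   # (1 + 3)^2 = 16 cases
```

Fitting p from early outbreak data is one simple way such reviews distinguish spatially constrained or clustered transmission (p < 1) from homogeneous mixing (p near 1).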
Semi-Markov Models for Degradation-Based Reliability
2010-01-01
aircraft, marine systems, and machinery (Jardine and Anderson, 1985; Jardine et al., 1987, 1989; Zhan et al., 2003). An excellent review of PHMs...distributions in a time-varying environment. IEEE Transactions on Reliability, 57, 539–550. Jardine, A.K.S. and Anderson, M. (1985) Use of...concomitant variables for reliability estimation. Maintenance Management International, 5, 135–140. Jardine, A.K.S., Anderson, P.M. and Mann, D.S. (1987
Michael, Joseph Richard; Grant, Richard P.; Rodriguez, Mark Andrew; Pillars, Jamin; Susan, Donald Francis; McKenzie, Bonnie Beth; Yelton, William Graham
2012-01-01
Tin (Sn) whiskers are conductive Sn filaments that grow from Sn-plated surfaces, such as surface finishes on electronic packages. The phenomenon of Sn whiskering has become a concern in recent years due to requirements for lead (Pb)-free soldering and surface finishes in commercial electronics. Pure Sn finishes are more prone to whisker growth than their Sn-Pb counterparts and high profile failures due to whisker formation (causing short circuits) in space applications have been documented. At Sandia, Sn whiskers are of interest due to increased use of Pb-free commercial off-the-shelf (COTS) parts and possible future requirements for Pb-free solders and surface finishes in high-reliability microelectronics. Lead-free solders and surface finishes are currently being used or considered for several Sandia applications. Despite the long history of Sn whisker research and the recently renewed interest in this topic, a comprehensive understanding of whisker growth remains elusive. This report describes recent research on characterization of Sn whiskers with the aim of understanding the underlying whisker growth mechanism(s). The report is divided into four sections and an Appendix. In Section 1, the Sn plating process is summarized. Specifically, the Sn plating parameters that were successful in producing samples with whiskers will be reviewed. In Section 2, the scanning electron microscopy (SEM) of Sn whiskers and time-lapse SEM studies of whisker growth will be discussed. This discussion includes the characterization of straight as well as kinked whiskers. In Section 3, a detailed discussion is given of SEM/EBSD (electron backscatter diffraction) techniques developed to determine the crystallography of Sn whiskers. In Section 4, these SEM/EBSD methods are employed to determine the crystallography of Sn whiskers, with a statistically significant number of whiskers analyzed. This is the largest study of Sn whisker crystallography ever reported. This section includes a
Theoretical model of ``fuzz'' growth
Krasheninnikov, Sergei; Smirnov, Roman
2012-10-01
Recent, more detailed experiments on tungsten irradiation with low-energy helium plasma, relevant to the near-wall plasma conditions in a magnetic fusion reactor like ITER, demonstrated (e.g., see Ref. 1) a very dramatic change in both the surface morphology and the near-surface material structure of the samples. In particular, it was shown that long (mm-scale) and thin (nm-scale) fiber-like structures filled with nano-bubbles, so-called "fuzz," start to grow. In this work a theoretical model of "fuzz" growth [2] describing the main features observed in experiments is presented. The model is based on the assumption of enhanced creep of tungsten containing a significant fraction of helium atoms and clusters. The results of MD simulations [3] support this idea and demonstrate a strong reduction of the yield strength over the whole temperature range. They also show that the "flow" of tungsten strongly facilitates coagulation of helium clusters and the formation of nano-bubbles. [1] M. J. Baldwin, et al., J. Nucl. Mater. 390-391 (2009) 885; [2] S. I. Krasheninnikov, Physica Scripta T145 (2011) 014040; [3] R. D. Smirnov and S. I. Krasheninnikov, submitted to J. Nucl. Materials.
Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel
2008-01-01
In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed.
Reliability and validity of measurements on digital study models and plaster models.
Reuschl, Ralph Philip; Heuer, Wieland; Stiesch, Meike; Wenzel, Daniela; Dittmer, Marc Philipp
2016-02-01
To compare manual plaster cast and digitized model analysis for accuracy and efficiency, nineteen plaster models of orthodontic patients in permanent dentition were analyzed by two calibrated examiners. Analyses were performed with a diagnostic calliper and with computer-assisted analysis after digitization of the plaster models. The reliability and efficiency of the different examiners and methods were compared statistically using a mixed model. Statistically significant differences were found for comparisons of all 28 teeth. Digitized model analysis appears to be an adequate, reliable, and time-saving alternative to analogue model analysis using a calliper.
Modular System Modeling for Quantitative Reliability Evaluation of Technical Systems
Stephan Neumann
2016-01-01
In modern times, it is necessary to offer reliable products to match the statutory directives concerning product liability and the high expectations of customers for durable devices. Furthermore, to maintain high competitiveness, engineers need to know as accurately as possible how long their product will last and how to influence the life expectancy without expensive and time-consuming testing. As the components of a system are responsible for the system reliability, this paper introduces and evaluates calculation methods for life expectancy of common machine elements in technical systems. Subsequently, a method for the quantitative evaluation of the reliability of technical systems is proposed and applied to a heavy-duty power shift transmission.
Iskandar, Ismed; Satria Gondokaryono, Yudi
2016-02-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is one of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience in the form of an a priori distribution with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing-risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change of the true value of one parameter relative to another changes the standard deviation in the opposite direction. For perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity over shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
Testing mechanistic models of growth in insects.
Maino, James L; Kearney, Michael R
2015-11-22
Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory to many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg(-1)) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes.
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-18
Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, in situations where the evidence highly conflicts, it may produce a counterintuitive result. To address the issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis due to the fact that the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
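Dempster's rule of combination, which the paper modifies by reliability-weighting the evidence before fusion, can be sketched as follows (the two-fault frame and the mass values are invented for illustration):

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping frozenset
    focal elements to masses) with Dempster's rule, normalizing out the
    conflict mass K assigned to empty intersections."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

A, B = frozenset({"A"}), frozenset({"B"})
m1 = {A: 0.9, B: 0.1}      # sensor 1: strong belief in fault A
m2 = {A: 0.8, B: 0.2}      # sensor 2: agrees
fused = dempster_combine(m1, m2)   # belief in A is reinforced above 0.97
```

When the two sensors instead strongly disagree, the conflict mass K approaches 1 and the normalization produces the counterintuitive results the paper's reliability weighting is designed to suppress.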
Fault maintenance trees: reliability centered maintenance via statistical model checking
Ruijters, Enno; Guck, Dennis; Drolenga, Peter; Stoelinga, Mariëlle
2016-01-01
The current trend in infrastructural asset management is towards risk-based (a.k.a. reliability centered) maintenance, promising better performance at lower cost. By maintaining crucial components more intensively than less important ones, dependability increases while costs decrease. This requires
Continuously Optimized Reliable Energy (CORE) Microgrid: Models & Tools (Fact Sheet)
2013-07-01
This brochure describes Continuously Optimized Reliable Energy (CORE), a trademarked process NREL employs to produce conceptual microgrid designs. This systems-based process enables designs to be optimized for economic value, energy surety, and sustainability. Capabilities NREL offers in support of microgrid design are explained.
Novel Software Reliability Estimation Model for Altering Paradigms of Software Engineering
Ritika Wason
2012-05-01
A number of different software engineering paradigms like Component-Based Software Engineering (CBSE), Autonomic Computing, Service-Oriented Computing (SOC), Fault-Tolerant Computing and many others are being researched currently. These paradigms denote a shift from the currently mainstream object-oriented paradigm and are altering the way we view, design, develop and exercise software. Though these paradigms indicate a major shift in the way we design and code software, we still rely on traditional reliability models for estimating the reliability of any of the above systems. This paper analyzes the underlying characteristics of these paradigms and proposes a novel finite-automata-based reliability model as a suitable model for estimating the reliability of modern, complex, distributed and critical software applications. We further outline the basic framework for an intelligent, automata-based reliability model that can be used for accurate estimation of system reliability of software systems at any point in the software life cycle.
Value Concept and Economic Growth Model
Truong Hong Trinh
2014-12-01
Full Text Available This paper approaches the value added method for Gross Domestic Product (GDP measurement that explains the interrelationship between the expenditure approach and the income approach. The economic growth model is also proposed with three key elements of capital accumulation, technological innovation, and institutional reform. Although capital accumulation and technological innovation are two integrated elements in driving economic growth, institutional reforms play a key role in creating incentives that effect the transitional and steady state growth rate in the real world economy. The paper provides a theoretical insight on economic growth to understand incentives and driving forces in economic growth model.
A continuous growth model for plant tissue
Bozorg, Behruz; Krupinski, Pawel; Jönsson, Henrik
2016-12-01
Morphogenesis in plants and animals involves large irreversible deformations. In plants, the response of the cell wall material to internal and external forces is determined by its mechanical properties. An appropriate model for plant tissue growth must include key features such as anisotropic and heterogeneous elasticity and cell dependent evaluation of mechanical variables such as turgor pressure, stress and strain. In addition, a growth model needs to cope with cell divisions as a necessary part of the growth process. Here we develop such a growth model, which is capable of employing not only mechanical signals but also morphogen signals for regulating growth. The model is based on a continuous equation for updating the resting configuration of the tissue. Simultaneously, material properties can be updated at a different time scale. We test the stability of our model by measuring convergence of growth results for a tissue under the same mechanical and material conditions but with different spatial discretization. The model is able to maintain a strain field in the tissue during re-meshing, which is of particular importance for modeling cell division. We confirm the accuracy of our estimations in two and three-dimensional simulations, and show that residual stresses are less prominent if strain or stress is included as input signal to growth. The approach results in a model implementation that can be used to compare different growth hypotheses, while keeping residual stresses and other mechanical variables updated and available for feeding back to the growth and material properties.
Qiuying Li; Haifeng Li; Minyan Lu
2015-01-01
Testing-effort (TE) and imperfect debugging (ID) in the reliability modeling process may further improve the fitting and prediction results of software reliability growth models (SRGMs). To describe the S-shaped varying trend of the TE increasing rate more accurately, first, two S-shaped testing-effort functions (TEFs), i.e., the delayed S-shaped TEF (DS-TEF) and the inflected S-shaped TEF (IS-TEF), are proposed. Then these two TEFs are incorporated into various types (exponential-type, delayed S-shaped and inflected S-shaped) of non-homogeneous Poisson process (NHPP) SRGMs with two forms of ID respectively, obtaining a series of new NHPP SRGMs which consider S-shaped TEFs as well as ID. Finally, these new SRGMs and several comparison NHPP SRGMs are applied to four real failure data-sets respectively to investigate the fitting and prediction power of the new SRGMs. The experimental results show that: (i) the proposed IS-TEF is more suitable and flexible for describing the consumption of TE than the previous TEFs; (ii) incorporating TEFs into the inflected S-shaped NHPP SRGM may be more effective and appropriate compared with the exponential-type and the delayed S-shaped NHPP SRGMs; (iii) the inflected S-shaped NHPP SRGM considering both the IS-TEF and ID yields more accurate fitting and prediction results than the other comparison NHPP SRGMs.
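For concreteness, a delayed S-shaped NHPP mean value function composed with an S-shaped cumulative testing-effort function can be sketched as below. This is the generic textbook form only; the paper's exact TEF parameterizations and imperfect-debugging terms are not reproduced, and all constants are invented:

```python
import math

def delayed_s_mvf(w, a=100.0, b=0.05):
    """Delayed S-shaped NHPP mean value function
    m(W) = a * (1 - (1 + b*W) * exp(-b*W)),
    evaluated in consumed testing effort W rather than calendar time."""
    return a * (1.0 - (1.0 + b * w) * math.exp(-b * w))

def logistic_tef(t, w_max=500.0, beta=0.1, t_mid=50.0):
    """Inflected S-shaped cumulative testing effort (logistic form, assumed)."""
    return w_max / (1.0 + math.exp(-beta * (t - t_mid)))

def expected_faults(t):
    """Compose the two: expected cumulative faults detected by time t."""
    return delayed_s_mvf(logistic_tef(t))

# As testing effort saturates at w_max, expected faults approach the
# total fault content a, reproducing the S-shaped detection curve.
```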
Trajectories and models of individual growth
Arseniy Karkach
2006-11-01
It has long been recognized that the patterns of growth play an important role in the evolution of age trajectories of fertility and mortality (Williams, 1957). Life history studies would benefit from a better understanding of strategies and mechanisms of growth, but still no comparative research on individual growth strategies has been conducted. Growth patterns and methods have been shaped by evolution, and a great variety of them are observed. Two distinct patterns, determinate and indeterminate growth, are of special interest for these studies since they present qualitatively different outcomes of evolution. We attempt to draw together studies covering growth in plant and animal species across a wide range of phyla, focusing primarily on the noted qualitative features. We also review mathematical descriptions of growth, namely empirical growth curves and growth models, and discuss the directions of future research.
SIERRA - A 3-D device simulator for reliability modeling
Chern, Jue-Hsien; Arledge, Lawrence A., Jr.; Yang, Ping; Maeda, John T.
1989-05-01
SIERRA is a three-dimensional general-purpose semiconductor-device simulation program which serves as a foundation for investigating integrated-circuit (IC) device and reliability issues. This program solves the Poisson and continuity equations in silicon under dc, transient, and small-signal conditions. Executing on a vector/parallel minisupercomputer, SIERRA utilizes a matrix solver which uses an incomplete LU (ILU) preconditioned conjugate gradient square (CGS, BCG) method. The ILU-CGS method provides a good compromise between memory size and convergence rate. The authors have observed a 5x to 7x speedup over standard direct methods in simulations of transient problems containing highly coupled Poisson and continuity equations such as those found in reliability-oriented simulations. The application of SIERRA to parasitic CMOS latchup and dynamic random-access memory single-event-upset studies is described.
Modelling Reliability-adaptive Multi-system Operation
Uwe K. Rakowsky
2006-01-01
This contribution applies the concept of Reliability-Adaptive Systems (RAS) to multi-system operation. A fleet of independently operating systems and a single maintenance unit are considered. The objective in this paper is to increase overall performance, or workload respectively, by avoiding delay due to busy maintenance units. This is achieved by concerted and coordinated derating of individual system performance, which increases reliability. Quantification is carried out by way of a convolution-based approach. The approach is tailored to fleets of ships, aeroplanes, spacecraft, and vehicles (trains, trams, buses, cars, trucks, etc.). Finally, the effectiveness of derating is validated using different criteria. The RAS concept makes sense if the average system output loss due to a lowered performance level (yielding longer time to failure) is smaller than the average loss due to waiting for maintenance in a non-adaptive case.
Mathematical models for Isoptera (Insecta) mound growth
MLT. Buschini
In this research we propose two mathematical models for Isoptera (Insecta) mound growth derived from the Von Bertalanffy growth curve, one appropriate for Nasutitermes coxipoensis and a more general formulation. The mean height and the mean diameter of ten small colonies were measured each month for twelve months, from April 1995 to April 1996. From these data, the monthly volumes were calculated for each colony. Then the growth in height and in volume was estimated and the models proposed.
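The Von Bertalanffy curve underlying both models has the simple closed form sketched below (the asymptote and rate constants are illustrative, not the fitted N. coxipoensis values):

```python
import math

def von_bertalanffy(t, h_inf=1.5, k=0.3, t0=0.0):
    """Von Bertalanffy growth curve: size approaches the asymptote h_inf
    at rate k, starting from zero at t0:
    h(t) = h_inf * (1 - exp(-k * (t - t0)))."""
    return h_inf * (1.0 - math.exp(-k * (t - t0)))

# Hypothetical mound height (m) over twelve monthly measurements:
heights = [von_bertalanffy(month) for month in range(13)]
```

Fitting (h_inf, k, t0) to the monthly height series, and analogously to the computed volumes, is the kind of estimation the study performs for each colony.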
Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der
2002-01-01
The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals are considered to be Gaussian. Conventional FORM analysis yields the linearization point of the idealized limit-state surface. A model correction factor is then introduced to push the idealized limit-state surface onto the actual limit-state surface. A few iterations yield a good approximation of the reliability. Keywords: First-order reliability method; Model correction factor method; Nataf field integration; Non-Gaussian random field; Random field integration; Structural reliability; Pile foundation reliability.
Iwankiewicz, R.; Nielsen, Søren R. K.; Skjærbæk, P. S.
The subject of the paper is the investigation of the sensitivity of structural reliability estimation by a reduced hysteretic model for a reinforced concrete frame under an earthquake excitation.
Green, Samuel B.; Yang, Yanyun
2009-01-01
A method is presented for estimating reliability using structural equation modeling (SEM) that allows for nonlinearity between factors and item scores. Assuming the focus is on consistency of summed item scores, this method for estimating reliability is preferred to those based on linear SEM models and to the most commonly reported estimate of…
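For the linear one-factor case that such nonlinear SEM methods generalize, the reliability of a summed item score is McDonald's omega, computable directly from the estimated loadings and error variances. A minimal sketch (the loading and error-variance values are invented):

```python
def coefficient_omega(loadings, error_variances):
    """McDonald's omega for a congeneric one-factor model:
    omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error
    variances).  This is the linear-SEM reliability of the summed score."""
    s = sum(loadings)
    return s * s / (s * s + sum(error_variances))

# Four standardized items each loading 0.7 on the factor, so each has
# error variance 1 - 0.49 = 0.51:
omega = coefficient_omega([0.7] * 4, [0.51] * 4)   # ~0.79
```

The nonlinear method in the paper replaces the linear factor-to-item regression implicit in this formula, which matters when item scores are ordinal or otherwise not linearly related to the factor.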
A Cumulative Damage Reliability Model on the Basis of Contact Fatigue of the Rolling Bearing
HUANG Li
2006-01-01
A cumulative damage reliability model of contact fatigue of the rolling bearing corresponds more closely to actual operating conditions. It is put forward on the basis that the contact-fatigue life of the rolling bearing follows a Weibull distribution, and it rests on the Miner cumulative damage theory. Finally, a case is given to predict the reliability of a bearing roller by using these models.
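A minimal sketch of the two ingredients, a Weibull life distribution and Miner's linear damage accumulation, is below. The shape/scale values and the way damage is mapped back into the Weibull reliability are illustrative assumptions, not the paper's fitted model:

```python
import math

def weibull_reliability(t, beta=1.5, eta=1e7):
    """Two-parameter Weibull reliability for contact-fatigue life in cycles:
    R(t) = exp(-(t/eta)^beta), shape beta and characteristic life eta."""
    return math.exp(-((t / eta) ** beta))

def miner_damage(blocks):
    """Miner's rule: blocks is a list of (cycles_run, cycles_to_failure)
    pairs per load level; failure is predicted when the sum reaches 1."""
    return sum(n / n_f for n, n_f in blocks)

def residual_reliability(blocks, beta=1.5, eta=1e7):
    """Assumed composition: treat accumulated damage D as consuming the
    fraction D of the characteristic life, then evaluate the Weibull."""
    d = miner_damage(blocks)
    return weibull_reliability(d * eta, beta=beta, eta=eta)

# Half a million cycles at a level with a one-million-cycle life, plus a
# quarter million at the same level, gives damage D = 0.75:
rel = residual_reliability([(5e5, 1e6), (2.5e5, 1e6)])
```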
Reliability Modeling and Optimization Strategy for Manufacturing System Based on RQR Chain
Yihai He
2015-01-01
Accurate and dynamic reliability modeling of a running manufacturing system is the prerequisite for implementing preventive maintenance. However, existing studies cannot output the reliability value in real time because they discard the quality inspection data generated during the operation of the manufacturing system. Therefore, this paper presents an approach to model manufacturing system reliability dynamically based on operational data of process quality and output data of product reliability. Firstly, after explaining the importance of quality variations in the manufacturing process as the linkage between manufacturing system reliability and inherent product reliability, the RQR chain representing the relationships between them is put forward, and the product qualified probability is proposed to further quantify the impact of quality variation in the manufacturing process on the reliability of the manufacturing system. Secondly, the impact of qualified probability on inherent product reliability is expounded, and a modeling approach for manufacturing system reliability based on the qualified probability is presented. Thirdly, a preventive maintenance optimization strategy for the manufacturing system, driven by the loss from manufacturing quality variation, is proposed. Finally, the validity of the proposed approach is verified by a reliability analysis and optimization example for an engine cover manufacturing system.
Modeling Tissue Growth Within Nonwoven Scaffolds Pores
Church, Jeffrey S.; Alexander, David L.J.; Russell, Stephen J.; Ingham, Eileen; Ramshaw, John A.M.; Werkmeister, Jerome A.
2011-01-01
In this study we present a novel approach for predicting tissue growth within the pores of fibrous tissue engineering scaffolds. Thin nonwoven polyethylene terephthalate scaffolds were prepared to characterize tissue growth within scaffold pores, by mouse NR6 fibroblast cells. On the basis of measurements of tissue lengths at fiber crossovers and along fiber segments, mathematical models were determined during the proliferative phase of cell growth. Tissue growth at fiber crossovers decreased with increasing interfiber angle, with exponential relationships determined on day 6 and 10 of culture. Analysis of tissue growth along fiber segments determined two growth profiles, one with enhanced growth as a result of increased tissue lengths near the fiber crossover, achieved in the latter stage of culture. Derived mathematical models were used in the development of a software program to visualize predicted tissue growth within a pore. This study identifies key pore parameters that contribute toward tissue growth, and suggests models for predicting this growth, based on fibroblast cells. Such models may be used in aiding scaffold design, for optimum pore infiltration during the tissue engineering process. PMID:20687775
Modeling and Simulation of Sensor-to-Sink Data Transport Reliability in WSNs
Faisal Karim Shaikh
2012-01-01
The fundamental function of a WSN (Wireless Sensor Network) is to transport data from sensor nodes to the sink. To increase fault tolerance, the inherent sensor node redundancy in a WSN can be exploited, but reliability guarantees are not thereby ensured. The data transport process in a WSN is devised as a set of operations on raw data generated in response to user requirements. The different operations filter the raw data to rationalize reliable transport. Accordingly, we provide reliability models for various data transport semantics. In this paper we argue for the effectiveness of the proposed reliability models by comparing them analytically and via simulations in TOSSIM.
An Identifiable State Model To Describe Light Intensity Influence on Microalgae Growth.
Bernardi, A; Perin, G; Sforza, E; Galvanin, F; Morosinotto, T; Bezzo, F
2014-04-23
Despite its high potential as a feedstock for the production of fuels and chemicals, the industrial cultivation of microalgae still exhibits many issues. Yield in microalgae cultivation systems is limited by the solar energy that can be harvested. The availability of reliable models representing key phenomena affecting algae growth may help in designing and optimizing effective production systems at an industrial level. In this work the complex influence of different light regimes on the growth of the seawater alga Nannochloropsis salina is represented by first-principles models. Experimental data such as in vivo fluorescence measurements are employed to develop the model. The proposed model allows description of all growth curves and fluorescence data in a reliable way. The model structure is assessed and modified in order to guarantee the model's identifiability and the estimation of its parametric set in a robust and reliable way.
Probabilistic Gompertz model of irreversible growth.
Bardos, D C
2005-05-01
Characterizing organism growth within populations requires the application of well-studied individual size-at-age models, such as the deterministic Gompertz model, to populations of individuals whose characteristics, corresponding to model parameters, may be highly variable. A natural approach is to assign probability distributions to one or more model parameters. In some contexts, size-at-age data may be absent due to difficulties in ageing individuals, but size-increment data may instead be available (e.g., from tag-recapture experiments). A preliminary transformation to a size-increment model is then required. Gompertz models developed along the above lines have recently been applied to strongly heterogeneous abalone tag-recapture data. Although useful in modelling the early growth stages, these models yield size-increment distributions that allow negative growth, which is inappropriate in the case of mollusc shells and other accumulated biological structures (e.g., vertebrae) where growth is irreversible. Here we develop probabilistic Gompertz models where this difficulty is resolved by conditioning parameter distributions on size, allowing application to irreversible growth data. In the case of abalone growth, introduction of a growth-limiting biological length scale is then shown to yield realistic length-increment distributions.
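The deterministic Gompertz core underlying this abstract, and the size-increment transform needed for tag-recapture data, can be sketched as below. The parameter names are generic; the paper's probabilistic treatment (parameter distributions conditioned on size) is not reproduced here.

```python
import math

def gompertz_size(t, L_inf, b, k):
    """Deterministic Gompertz size-at-age: L(t) = L_inf * exp(-b * exp(-k*t))."""
    return L_inf * math.exp(-b * math.exp(-k * t))

def gompertz_increment(L0, dt, L_inf, k):
    """Size increment over an interval dt for an individual of current size L0.
    Eliminating age from the Gompertz curve gives
        L(t + dt) = L_inf * (L0 / L_inf) ** exp(-k * dt),
    so for fixed parameters the increment is non-negative whenever
    L0 <= L_inf, i.e. deterministic growth is irreversible."""
    L1 = L_inf * (L0 / L_inf) ** math.exp(-k * dt)
    return L1 - L0
```

The abstract's point is that once the parameters are randomized across individuals, increment distributions derived this way can put mass on negative growth unless the parameter distributions are conditioned on size.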
Development of an Environment for Software Reliability Model Selection
1992-09-01
A new approach to real-time reliability analysis of transmission system using fuzzy Markov model
Tanrioven, M.; Kocatepe, C. [University of Yildiz Technical, Istanbul (Turkey). Dept. of Electrical Engineering; Wu, Q.H.; Turner, D.R.; Wang, J. [Liverpool Univ. (United Kingdom). Dept. of Electrical Engineering and Economics
2004-12-01
To date the studies of power system reliability over a specified time period have used average values of the system transition rates in Markov techniques. [Singh C, Billinton R. System reliability modeling and evaluation. London: Hutchison Educational; 1977]. However, the level of power systems reliability varies from time to time due to weather conditions, power demand and random faults [Billinton R, Wojczynski E. Distributional variation of distribution system reliability indices. IEEE Trans Power Apparatus Systems 1985; PAS-104(11):3152-60]. It is essential to obtain an estimate of system reliability under all environmental and operating conditions. In this paper, fuzzy logic is used in the Markov model to describe both transition rates and temperature-based seasonal variations, which identifies multiple weather conditions such as normal, less stormy, very stormy, etc. A three-bus power system model is considered to determine the variation of system reliability in real-time, using this newly developed fuzzy Markov model (FMM). The results cover different aspects such as daily and monthly reliability changes during January and August. The reliability of the power transmission system is derived as a function of augmentation in peak load level. Finally the variation of the system reliability with weather conditions is determined. (author)
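The baseline that this paper generalizes, a two-state Markov component whose transition rates vary with weather, can be sketched as follows. The rate values and the crisp weather categories are placeholders for the paper's fuzzy membership-weighted seasonal rates.

```python
def steady_state_availability(failure_rate, repair_rate):
    """Two-state (up/down) Markov component with constant transition rates:
    steady-state availability A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

# Hypothetical weather-dependent failure rates (per hour); in the paper these
# would be produced by fuzzy logic over temperature-based seasonal variation.
weather_failure_rates = {"normal": 1e-4, "less stormy": 5e-4, "very stormy": 2e-3}
repair_rate = 0.1  # repairs per hour, assumed

availability_by_weather = {
    weather: steady_state_availability(lam, repair_rate)
    for weather, lam in weather_failure_rates.items()
}
```

As expected, availability degrades monotonically as the weather condition worsens, which is the real-time variation the fuzzy Markov model is built to capture.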
Using PoF models to predict system reliability considering failure collaboration
Zhiguo Zeng
2016-10-01
Existing Physics-of-Failure-based (PoF-based) system reliability prediction methods are grounded on an independence assumption, which overlooks dependency among the components. In this paper, a new type of dependency, referred to as failure collaboration, is introduced and considered in reliability predictions. A PoF-based model is developed to describe the failure behavior of systems subject to failure collaboration. Based on the developed model, the Bisection-based Reliability Analysis Method (BRAM) is exploited to calculate the system reliability. The developed methods are applied to predicting the reliability of a Hydraulic Servo Actuator (HSA). The results demonstrate that the developed methods outperform traditional PoF-based reliability prediction methods when applied to systems subject to failure collaboration.
Stochastic modeling for reliability shocks, burn-in and heterogeneous populations
Finkelstein, Maxim
2013-01-01
Focusing on shocks modeling, burn-in and heterogeneous populations, Stochastic Modeling for Reliability naturally combines these three topics in the unified stochastic framework and presents numerous practical examples that illustrate recent theoretical findings of the authors. The populations of manufactured items in industry are usually heterogeneous. However, the conventional reliability analysis is performed under the implicit assumption of homogeneity, which can result in distortion of the corresponding reliability indices and various misconceptions. Stochastic Modeling for Reliability fills this gap and presents the basics and further developments of reliability theory for heterogeneous populations. Specifically, the authors consider burn-in as a method of elimination of ‘weak’ items from heterogeneous populations. The real life objects are operating in a changing environment. One of the ways to model an impact of this environment is via the external shocks occurring in accordance with some stocha...
Using Model Replication to Improve the Reliability of Agent-Based Models
Zhong, Wei; Kim, Yushim
The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. By illustrating the replication in NetLogo, by a different author, of an ABM representing fraudulent behavior in a public service delivery system originally developed in the Java-based MASON toolkit, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, replication helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.
Analysis of whisker-toughened CMC structural components using an interactive reliability model
Duffy, Stephen F.; Palko, Joseph L.
1992-01-01
Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic William-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.
Growth curve models and statistical diagnostics
Pan, Jian-Xin
2002-01-01
Growth-curve models are generalized multivariate analysis-of-variance models. These models are especially useful for investigating growth problems over short time periods in economics, biology, medical research, and epidemiology. This book systematically introduces the theory of growth-curve models (GCM) with particular emphasis on their multivariate statistical diagnostics, which are based mainly on recent developments made by the authors and their collaborators. The authors provide complete proofs of theorems as well as practical data sets and MATLAB code.
Modeling Performance of Plant Growth Regulators
W. C. Kreuser
2017-03-01
Growing degree day (GDD) models can predict the performance of plant growth regulators (PGRs) applied to creeping bentgrass. The goal of this letter is to describe experimental design strategies and modeling approaches to create PGR models for different PGRs, application rates, and turf species. Results from testing the models indicate that clipping yield should be measured until the growth response has diminished. This is in contrast to reapplication of a PGR at preselected intervals. During modeling, inclusion of an amplitude-dampening coefficient in the sinewave model allows the PGR effect to dissipate with time.
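A sinewave response with an amplitude-dampening coefficient, as described in this letter, could take a form like the one below. The exact functional form, parameter names, and values are assumptions for illustration; the letter's fitted model may differ.

```python
import math

def pgr_response(gdd, amplitude, dampening, period):
    """Relative clipping yield after a PGR application as a function of
    growing degree days (GDD): a sinewave (growth suppression followed by
    rebound) whose amplitude decays exponentially, so the PGR effect
    dissipates with accumulated GDD."""
    return 1.0 + amplitude * math.exp(-dampening * gdd) * math.sin(2.0 * math.pi * gdd / period)
```

With, say, amplitude 0.3, dampening 0.002 per GDD, and a 400-GDD period, the deviation from untreated yield (1.0) at one and a quarter periods is smaller than at a quarter period, which is the dissipation behavior the dampening coefficient encodes.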
Analysis and Application of Mechanical System Reliability Model Based on Copula Function
An Hai
2016-10-01
There are complicated correlations in mechanical systems. Using the advantages of copula functions for handling such correlations, this paper proposes a mechanical system reliability model based on copula functions, carries out a detailed study of the serial and parallel mechanical system models, and derives their respective reliability functions. Finally, applied research is carried out on the serial mechanical system reliability model to prove its validity by example. Using copula theory for mechanical system reliability modeling, and studying the distributions of the random variables (the marginal distributions of the mechanical product's life) and the associated dependence structure of the variables separately, can reduce the difficulty of multivariate probabilistic modeling and analysis and make the modeling and analysis process clearer.
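The serial/parallel construction in this abstract can be sketched with one concrete copula family. The choice of the Gumbel copula, and the convention of coupling survival probabilities for the series case and failure probabilities for the parallel case, are illustrative assumptions, not necessarily the paper's.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v) with theta >= 1; theta = 1 gives independence
    (C(u, v) = u * v), larger theta gives stronger positive dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def series_reliability(R1, R2, theta):
    """A series system works iff both components work; model the joint
    survival probability by applying the copula to the marginal reliabilities."""
    return gumbel_copula(R1, R2, theta)

def parallel_reliability(R1, R2, theta):
    """A parallel system fails iff both components fail; couple the
    failure probabilities with the same copula (a modeling assumption)."""
    return 1.0 - gumbel_copula(1.0 - R1, 1.0 - R2, theta)
```

At theta = 1 these reduce to the familiar independent-component formulas R1*R2 and 1 - (1-R1)(1-R2); increasing theta raises series-system reliability, since positively dependent components tend to survive together.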
Nonlinear Mixed-Effects Models for Repairable Systems Reliability
TAN Fu-rong; JIANG Zhi-bin; KUO Way; Suk Joo BAE
2007-01-01
Mixed-effects models, also called random-effects models, are a regression type of analysis which enables the analyst not only to describe the trend over time within each subject, but also to describe the variation among different subjects. Nonlinear mixed-effects models provide a powerful and flexible tool for handling unbalanced count data. In this paper, nonlinear mixed-effects models are used to analyze failure data from a repairable system with multiple copies. By using this type of model, statistical inferences about the population and all copies can be made while accounting for copy-to-copy variance. Results of fitting nonlinear mixed-effects models to nine failure-data sets show that nonlinear mixed-effects models provide a useful tool for analyzing failure data from multi-copy repairable systems.
Young Children's Selective Learning of Rule Games from Reliable and Unreliable Models
Rakoczy, Hannes; Warneken, Felix; Tomasello, Michael
2009-01-01
We investigated preschoolers' selective learning from models that had previously appeared to be reliable or unreliable. Replicating previous research, children from 4 years selectively learned novel words from reliable over unreliable speakers. Extending previous research, children also selectively learned other kinds of acts--novel games--from…
Kovar K; Leijnse A; Uffink G; Pastoors MJH; Mulschlegel JHC; Zaadnoordijk WJ; LDL; IMD; TNO/NITG; Haskoning
2005-01-01
A modelling approach was developed, incorporated in the finite-element method based program LGMLUC, making it possible to determine the reliability of travel times of groundwater flowing to groundwater abstraction sites. The reliability is seen here as a band (zone) around the expected travel-time i
Reliability Based Optimal Design of Vertical Breakwaters Modelled as a Series System Failure
Christiani, E.; Burcharth, H. F.; Sørensen, John Dalsgaard
1996-01-01
Reliability based design of monolithic vertical breakwaters is considered. Probabilistic models of important failure modes such as sliding and rupture failure in the rubble mound and the subsoil are described. Characterisation of the relevant stochastic parameters is presented, relevant design variables are identified, and an optimal system reliability formulation is presented. An illustrative example is given.
Zongshuai Yan
2015-01-01
The two-terminal reliability calculation for wireless sensor networks (WSNs) is a #P-hard problem. Reliability calculation for WSNs under the multicast model suffers an even worse combinatorial explosion of node states than under the unicast model, yet many real WSNs require the multicast model to deliver information. This research first provides a formal definition of the WSN multicast model. Next, a symbolic OBDD_Multicast algorithm is proposed to evaluate the reliability of WSNs under the multicast model. Furthermore, our OBDD_Multicast construction avoids the problem of invalid expansion, reducing the number of subnetworks by identifying the redundant paths of two adjacent nodes and s-t unconnected paths. Experiments show that OBDD_Multicast both reduces the complexity of the WSN reliability analysis and has a lower running time than Xing's OBDD- (ordered binary decision diagram-) based algorithm.
Wind Farm Reliability Modelling Using Bayesian Networks and Semi-Markov Processes
Robert Adam Sobolewski
2015-09-01
Technical reliability plays an important role among the factors affecting the power output of a wind farm. The reliability is determined by the internal collection grid topology and the reliability of its electrical components, e.g. generators, transformers, cables, switch breakers, protective relays, and busbars. A quantitative measure of wind farm reliability can be the probability distribution of combinations of operating and failed states of the farm's wind turbines. The operating state of a wind turbine is its ability to generate power and to transfer it to an external power grid, which means the availability of the wind turbine and other equipment necessary for the power transfer to the external grid. This measure can be used for quantitative analysis of the impact of various wind farm topologies and the reliability of individual farm components on the farm reliability, and for determining the expected farm output power with consideration of the reliability. This knowledge may be useful in an analysis of power generation reliability in power systems. The paper presents probabilistic models that quantify wind farm reliability taking into account the above-mentioned technical factors. To formulate the reliability models, Bayesian networks and semi-Markov processes were used. Bayesian networks were used to map the wind farm's structural reliability, together with quantitative characteristics describing equipment reliability; semi-Markov processes were used to determine those characteristics. The paper presents example calculations of: (i) the probability distribution of the combinations of operating and failed states of four wind turbines included in the wind farm, and (ii) the expected wind farm output power with consideration of its reliability.
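The two example calculations named in this abstract, the distribution over turbine up/down combinations and the expected farm output, can be sketched as below. Independence between turbines and the availability and rated-power values are simplifying assumptions; in the paper these probabilities come from Bayesian networks and semi-Markov processes instead.

```python
from itertools import product

def state_distribution(availability):
    """Probability of each up(1)/down(0) combination of the farm's turbines,
    assuming the turbines fail independently. `availability` maps turbine
    name -> probability of being in the operating state."""
    names = list(availability)
    dist = {}
    for states in product((1, 0), repeat=len(names)):
        p = 1.0
        for name, s in zip(names, states):
            p *= availability[name] if s else 1.0 - availability[name]
        dist[states] = p
    return dist

def expected_power(availability, rated_kw):
    """Expected farm output: sum over states of P(state) times the number
    of operating turbines times the (common, assumed) rated power."""
    return sum(p * sum(states) * rated_kw
               for states, p in state_distribution(availability).items())

# Four turbines with assumed availabilities and a 2 MW rating each.
farm = {"WT1": 0.95, "WT2": 0.95, "WT3": 0.90, "WT4": 0.90}
```

Under independence the expected output collapses to rated power times the sum of availabilities; the value of the full state distribution is that it also supports dependent models and topology effects, which is what the paper's Bayesian networks capture.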
AN IMPROVED FUZZY MODEL TO PREDICT SOFTWARE RELIABILITY
Deepika Chawla
2012-09-01
Software faults are one of the major criteria for estimating software quality or software reliability. A number of metrics are defined that use software faults to estimate software quality. But for a large software system with thousands of class modules, it is not easy to apply software metrics to each module of the system. The present work addresses this problem. In this work, software quality is estimated by applying a rejection method to software faults. The rejection method is applied on the basis of fuzzy logic in a software system. To perform the analysis effectively, a weightage approach is used on the software faults: different weightages are assigned to software faults to categorize them by fault criticality and frequency. Once the faults are categorized, the proposed method is applied to the software faults to identify the accepted and rejected modules in the software system. The obtained results give a better visualization of software quality in the case of software fault analysis.
Modeling urban growth in Kigali city Rwanda
kagoyire
Keywords: urban growth, GIS, remote sensing, logistic regression modeling, Kigali city, Rwanda. Cellular Automata (CA) is noted as a method with a great capability to handle spatial decisions.
Powering stochastic reliability models by discrete event simulation
Kozine, Igor; Wang, Xiaoyun
2012-01-01
…it difficult to find a solution to the problem. The power of modern computers and recent developments in discrete-event simulation (DES) software make it possible to diminish some of the drawbacks of stochastic models. In this paper we describe the insights we have gained based on using both Markov and DES models…
Fatigue Reliability and Effective Turbulence Models in Wind Farms
Sørensen, John Dalsgaard; Frandsen, S.; Tarp-Johansen, N.J.
2007-01-01
behind wind turbines can imply a significant reduction in the fatigue lifetime of wind turbines placed in wakes. In this paper the design code model in the wind turbine code IEC 61400-1 (2005) is evaluated from a probabilistic point of view, including the importance of modeling the SN-curve by linear...
1983-09-01
…designed to evaluate the reliability functions that result from the application of reliability analysis to the fatigue of aircraft structures, in particular…
Finite State Machine Based Evaluation Model for Web Service Reliability Analysis
M, Thirumaran; Abarna, S; P, Lakshmi
2011-01-01
Nowadays there is great interest in making changes within ever shorter times, since required reaction times are constantly decreasing. The Business Logic Evaluation Model (BLEM) is the proposed solution, targeting business logic automation and facilitating business experts in writing sophisticated business rules and complex calculations without costly custom programming. BLEM is powerful enough to handle service manageability issues by analyzing and evaluating the computability, traceability, and other criteria of modified business logic at run time. Web services and QoS grow expensive depending on the reliability of the service. Hence today's service providers consider reliability the major factor, and any problem in the reliability of the service should be overcome promptly in order to achieve the expected level of reliability. In our paper we propose a business logic evaluation model for web service reliability analysis using a Finite State Machine (FSM), where the FSM will be extended to analy…
Stochastic Gompertz model of tumour cell growth.
Lo, C F
2007-09-21
In this communication, based upon the deterministic Gompertz law of cell growth, a stochastic model of tumour growth is proposed. This model takes into account both cell fission and mortality. The corresponding density function of the size of the tumour cells obeys a functional Fokker-Planck equation which can be solved analytically. It is found that the density function exhibits an interesting "multi-peak" structure generated by cell fission as time evolves. Within this framework the action of therapy is also examined by simply incorporating a therapy term into the deterministic cell growth term.
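A stochastic Gompertz growth process of the kind this abstract describes can be simulated numerically. The drift/diffusion form below (dX = X(a - b ln X) dt + sigma X dW) is one common Gompertz diffusion, not necessarily the paper's exact model, and the parameter values are illustrative; the paper itself works with the analytical Fokker-Planck solution rather than simulation.

```python
import math
import random

def simulate_gompertz_sde(x0, a, b, sigma, t_end, dt, seed=0):
    """Euler-Maruyama sample path of a Gompertz-type diffusion
        dX = X * (a - b * ln X) dt + sigma * X dW.
    With sigma = 0 this reduces to the deterministic Gompertz law, whose
    stable equilibrium is X = exp(a / b)."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    path = [x0]
    while t < t_end:
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        x = x + x * (a - b * math.log(x)) * dt + sigma * x * dw
        x = max(x, 1e-12)  # guard against numerical underflow of the log
        path.append(x)
        t += dt
    return path

# Deterministic check: with sigma = 0 the path relaxes toward exp(a/b).
det_path = simulate_gompertz_sde(1.0, a=1.0, b=0.5, sigma=0.0, t_end=5.0, dt=0.01)
```

For a = 1, b = 0.5 the deterministic trajectory at t = 5 is exp(2 * (1 - e^(-2.5))) of about 6.27, approaching the equilibrium e^2 from below; Monte Carlo over many noisy paths would approximate the density whose multi-peak structure the paper derives analytically.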
Digital Avionics Information System (DAIS): Reliability and Maintainability Model. Final Report.
Czuchry, Andrew J.; And Others
The reliability and maintainability (R&M) model described in this report represents an important portion of a larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. The R&M model is the first of three models that comprise a modeling system for use in LCC analysis of avionics systems. The total…
The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...
Reliability analysis of diesel engine crankshaft based on 2D stress strength interference model
Anonymous
2006-01-01
A 2D stress strength interference model (2D-SSIM), which considers that the fatigue reliability of engineering structural components is closely related to the load asymmetry ratio and, to some extent, its variability, is put forward. The principle, geometric schematic, and limit state equation of this model are presented. A reliability evaluation for a diesel engine crankshaft was made based on this theory, in which a multi-axial loading fatigue criterion was employed. Because more of the important factors, i.e. the stress asymmetry ratio and its variability, are considered, the model can theoretically make a more accurate evaluation of structural component reliability than the traditional interference model. Correspondingly, a Monte-Carlo simulation solution is also given. The computation suggests that this model can yield a satisfactory reliability evaluation.
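The basic stress-strength interference idea behind this paper, and its Monte Carlo solution, can be sketched generically as below. This is the classical 1D interference R = P(strength > stress) with assumed normal distributions, not the paper's 2D multi-axial formulation with asymmetry-ratio variability.

```python
import random

def mc_stress_strength(n, strength_sampler, stress_sampler, seed=42):
    """Monte Carlo estimate of the interference reliability
    R = P(strength > stress), given one-draw samplers for each quantity."""
    rng = random.Random(seed)
    survived = sum(1 for _ in range(n)
                   if strength_sampler(rng) > stress_sampler(rng))
    return survived / n

# Illustrative normal stress and strength (MPa); means and spreads are
# assumptions, not crankshaft data from the paper.
R = mc_stress_strength(
    100_000,
    strength_sampler=lambda rng: rng.gauss(600.0, 40.0),
    stress_sampler=lambda rng: rng.gauss(450.0, 50.0),
)
```

For independent normals the exact answer is Phi((600-450)/sqrt(40^2+50^2)), about 0.99, so the estimate should land close to that; the paper's 2D model replaces the scalar stress with a description that also tracks the load asymmetry ratio.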
Liu, Donhang
2014-01-01
This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.
Onboard Robust Visual Tracking for UAVs Using a Reliable Global-Local Object Model
Fu, Changhong; Duan, Ran; Kircali, Dogan; Kayacan, Erdal
2016-01-01
In this paper, we present a novel onboard robust visual algorithm for long-term arbitrary 2D and 3D object tracking using a reliable global-local object model for unmanned aerial vehicle (UAV) applications, e.g...
Rademaker, Corné J.; Omedo Bebe, Bockline; Lee, van der Jan; Kilelu, Catherine; Tonui, Charles
2016-01-01
This report provides an overview of how the Kenyan dairy sector performs in three analytical domains: the robustness of the supply chains, the reliability of institutional governance and the resilience of the innovation system. Analysis is by literature review, stakeholder interviews and a validatio
Reinhold H. Dauskardt
2005-08-01
Final report of our DOE-funded research program. The aim of the research program was to provide a fundamental basis from which the mechanical reliability of layered structures may be understood, and to provide guidelines for the development of technologically relevant layered material structures with optimum resistance to fracture and subcritical debonding. Progress in the program toward these goals is described.
Modelling asymmetric growth in crowded plant communities
Damgaard, Christian
2010-01-01
A class of models that may be used to quantify the effect of size-asymmetric competition in crowded plant communities by estimating a community-specific degree of size-asymmetric growth for each species in the community is suggested. The model consists of two parts: an individual size-asymmetric…
Reliability-cost models for the power switching devices of wind power converters
Ma, Ke; Blaabjerg, Frede
2012-01-01
In order to satisfy the growing reliability requirements for wind power converters with a more cost-effective solution, the target of this paper is to establish a new reliability-cost model which can connect the relationship between reliability performance and the corresponding semiconductor cost for power switching devices. First the conduction loss, switching loss, as well as thermal impedance models of power switching devices (IGBT modules) are related to the semiconductor chip number information respectively. Afterwards simplified analytical solutions, which can directly extract the junction…
Microstructural Modeling of Brittle Materials for Enhanced Performance and Reliability.
Teague, Melissa Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Rodgers, Theron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Grutzik, Scott Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Meserole, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]
2017-08-01
Brittle failure is often influenced by difficult to measure and variable microstructure-scale stresses. Recent advances in photoluminescence spectroscopy (PLS), including improved confocal laser measurement and rapid spectroscopic data collection, have established the potential to map stresses with microscale spatial resolution (<2 microns). Advanced PLS was successfully used to investigate both residual and externally applied stresses in polycrystalline alumina at the microstructure scale. The measured average stresses matched those estimated from beam theory to within one standard deviation, validating the technique. Modeling the residual stresses within the microstructure produced general agreement in comparison with the experimentally measured results. Microstructure scale modeling is primed to take advantage of advanced PLS to enable its refinement and validation, eventually enabling microstructure modeling to become a predictive tool for brittle materials.
Nikulin, M; Mesbah, M; Limnios, N
2004-01-01
Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.
Lu, Yi
2016-01-01
To model students' math growth trajectories, three conventional growth curve models and three growth mixture models are applied to the Early Childhood Longitudinal Study Kindergarten-Fifth grade (ECLS K-5) dataset in this study. The results of the conventional growth curve models show gender differences on math IRT scores. When holding socio-economic…
Electromagnetic Model Reliably Predicts Radar Scattering Characteristics of Airborne Organisms
Mirkovic, Djordje; Stepanian, Phillip M.; Kelly, Jeffrey F.; Chilson, Phillip B.
2016-10-01
The radar scattering characteristics of aerial animals are typically obtained from controlled laboratory measurements of a freshly harvested specimen. These measurements are tedious to perform, difficult to replicate, and typically yield only a small subset of the full azimuthal, elevational, and polarimetric radio scattering data. As an alternative, biological applications of radar often assume that the radar cross sections of flying animals are isotropic, since sophisticated computer models are required to estimate the 3D scattering properties of objects having complex shapes. Using the method of moments implemented in the WIPL-D software package, we show for the first time that such electromagnetic modeling techniques (typically applied to man-made objects) can accurately predict organismal radio scattering characteristics from an anatomical model: here the Brazilian free-tailed bat (Tadarida brasiliensis). The simulated scattering properties of the bat agree with controlled measurements and radar observations made during a field study of bats in flight. This numerical technique can produce the full angular set of quantitative polarimetric scattering characteristics, while eliminating many practical difficulties associated with physical measurements. Such a modeling framework can be applied for bird, bat, and insect species, and will help drive a shift in radar biology from a largely qualitative and phenomenological science toward quantitative estimation of animal densities and taxonomic identification.
On new cautious structural reliability models in the framework of imprecise probabilities
Utkin, Lev; Kozine, Igor
2010-01-01
New imprecise structural reliability models are described in this paper. They are developed based on imprecise Bayesian inference and are the imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models. The models are applied to computing cautious structural reliability measures when the number of events of interest or observations is very small. The main feature of the models is that prior ignorance is not modelled by a fixed single prior distribution, but by a class of priors which is defined by upper and lower probabilities that can converge as statistical data...
A d dimensional nucleation and growth model
Cerf, Raphael
2010-01-01
We analyze the relaxation time of a ferromagnetic d dimensional growth model on the lattice. The model is characterized by d parameters which represent the activation energies of a site, depending on the number of occupied nearest neighbours. This model is a natural generalisation of the model studied by Dehghanpour and Schonmann [DS97a], where the activation energy of a site with more than two occupied neighbours is zero.
A Structural Reliability Business Process Modelling with System Dynamics Simulation
Lam, C.Y.; Chan, S.L.; Ip, W.H.
2010-01-01
Business activity flow analysis enables organizations to manage structured business processes, and can thus help them to improve performance. The six types of business activities identified here (i.e., SOA, SEA, MEA, SPA, MSA and FIA) are correlated and interact with one another, and the decisions from any business activity form feedback loops with previous and succeeding activities, thus allowing the business process to be modelled and simulated. For instance, for any company that is eager t...
Testing the stability and reliability of starspot modelling.
Kovari, Zs.; Bartus, J.
1997-07-01
Since the mid-70s, different starspot modelling techniques have been used to describe the observed spot variability on active stars. Spot positions and temperatures are calculated by application of surface integration techniques or solution of analytic equations on observed photometric data. Artificial spotted light curves were generated, by use of the analytic expressions of Budding (1977Ap&SS..48..207B), to test how different constraints, such as the intrinsic scatter of the observed data or the angle of inclination, affect the spot solutions. Interactions between the different parameters, such as inclination, latitude and spot size, were also investigated. The results of re-modelling the generated data were scrutinized statistically. It was found that (1) 0.002-0.005mag of photometric accuracy is required to recover geometrical spot parameters within an acceptable error box; (2) even a 0.03-0.05mag error in unspotted brightness substantially affects the recovery of the original spot distribution; (3) especially at low inclination, under- or overestimation of inclination by 10° leads to an important systematic error in spot latitude and size; (4) when the angle of inclination i<~20°, photometric spot modelling is unable to provide satisfactory information on spot location and size.
An Imprecise Probability Model for Structural Reliability Based on Evidence and Gray Theory
Bin Suo
2013-01-01
To avoid the shortcomings and limitations of probabilistic and non-probabilistic reliability models for structural reliability analysis in the case of limited samples for basic variables, a new imprecise probability model is proposed. A confidence interval with a given confidence is calculated on the basis of small samples by gray theory, which does not depend on the distribution pattern of the variable. Then basic probability assignments and focal elements are constructed, and approximation methods of structural reliability based on belief and plausibility functions are proposed for the cases in which the structural limit state function is monotonic and non-monotonic, respectively. The numerical examples show that the new reliability model utilizes all the information included in small samples and considers both aleatory and epistemic uncertainties in them; thus it can rationally measure the safety of the structure, and the measurement becomes more and more accurate as the sample size increases.
Internal stresses in ultrathin oxide films: influence on growth and reliability
Van Overmeere, Quentin; 224th Meeting of the Electrochemical Society, Symposium on Oxide Films
2013-01-01
Oxide films grown by the anodic oxidation of valve metals are used in several technological applications, including the corrosion protection of aluminum and its alloys. The corrosion protection properties are provided by the ability to grow a very thick porous oxide layer under selected experimental conditions. The sequence of the growth of a porous anodic oxide layer can generally be divided into three successive stages: the growth of a dense oxide layer, the initiation of pores in the dense l...
Chen, D.J.
1988-01-01
The literature is abundant with combinatorial reliability analyses of communication networks and fault-tolerant computer systems. However, it is very difficult to formulate reliability indexes using combinatorial methods. These limitations have led to the development of time-dependent reliability analysis using stochastic processes. In this research, time-dependent reliability-analysis techniques using Dataflow Graphs (DFG) are developed. The chief advantages of DFG models over other models are their compactness, structural correspondence with the systems, and general amenability to direct interpretation. This makes it possible to verify the correspondence of the data-flow graph representation to the actual system. Several DFG models are developed and used to analyze the reliability of communication networks and computer systems. Specifically, Stochastic Dataflow Graphs (SDFG), in both discrete-time and continuous-time variants, are developed and used to compute the time-dependent reliability of communication networks and computer systems. The repair and coverage phenomena of communication networks are also analyzed using SDFG models.
Microbial growth modelling with artificial neural networks.
Jeyamkonda, S; Jaya, D S; Holle, R A
2001-03-20
There is a growing interest in modelling microbial growth as an alternative to time-consuming, traditional, microbiological enumeration techniques. Several statistical models have been reported to describe the growth of different microorganisms, but there are accuracy problems. An alternative technique, 'artificial neural networks' (ANN), for modelling microbial growth is explained and evaluated. Published data were used to build separate general regression neural network (GRNN) structures for modelling growth of Aeromonas hydrophila, Shigella flexneri, and Brochothrix thermosphacta. Both GRNN and published statistical model predictions were compared against the experimental data using six statistical indices. For training data sets, the GRNN predictions were far superior to the statistical model predictions, whereas the GRNN predictions were similar to or slightly worse than statistical model predictions for the test data sets for all three data sets. GRNN predictions can be considered good, considering their performance for unseen data. Graphical plots, mean relative percentage residual, mean absolute relative residual, and root mean squared residual were identified as suitable indices for comparing competing models. ANN can now become a vehicle whereby predictive microbiology can be applied in food product development and food safety risk assessment.
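The comparison indices named in this abstract can be computed directly. Below is a minimal sketch using common textbook definitions of these residual indices (the paper's exact formulas may differ) and purely illustrative growth-curve numbers:

```python
import math

# Hedged sketch of the model-comparison indices the abstract names:
# mean relative percentage residual, mean absolute relative residual,
# and root mean squared residual. Definitions follow common usage.
def indices(observed, predicted):
    resid = [o - p for o, p in zip(observed, predicted)]
    mrpr = 100.0 * sum(r / o for r, o in zip(resid, observed)) / len(resid)
    marr = sum(abs(r) / o for r, o in zip(resid, observed)) / len(resid)
    rmsr = math.sqrt(sum(r * r for r in resid) / len(resid))
    return mrpr, marr, rmsr

# Illustrative data only (e.g. log CFU/ml at four time points).
obs = [2.1, 3.4, 4.8, 6.0]
pred = [2.0, 3.5, 4.6, 6.1]
mrpr, marr, rmsr = indices(obs, pred)
print(mrpr, marr, rmsr)
```

A signed index such as MRPR reveals systematic over- or under-prediction, while MARR and RMSR quantify the magnitude of error, which is why the study uses several indices together.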
Reliable modeling of the electronic spectra of realistic uranium complexes
Tecmer, Paweł; Govind, Niranjan; Kowalski, Karol; de Jong, Wibe A.; Visscher, Lucas
2013-07-01
We present an EOMCCSD (equation of motion coupled cluster with singles and doubles) study of excited states of the small [UO2]2+ and [UO2]+ model systems as well as the larger UVIO2(saldien) complex. In addition, the triples contribution within the EOMCCSDT and CR-EOMCCSD(T) (completely renormalized EOMCCSD with non-iterative triples) approaches for the [UO2]2+ and [UO2]+ systems as well as the active-space variant of the CR-EOMCCSD(T) method—CR-EOMCCSd(t)—for the UVIO2(saldien) molecule are investigated. The coupled cluster data were employed as benchmark to choose the "best" appropriate exchange-correlation functional for subsequent time-dependent density functional (TD-DFT) studies on the transition energies for closed-shell species. Furthermore, the influence of the saldien ligands on the electronic structure and excitation energies of the [UO2]+ molecule is discussed. The electronic excitations as well as their oscillator dipole strengths modeled with TD-DFT approach using the CAM-B3LYP exchange-correlation functional for the [UVO2(saldien)]- with explicit inclusion of two dimethyl sulfoxide molecules are in good agreement with the experimental data of Takao et al. [Inorg. Chem. 49, 2349 (2010), 10.1021/ic902225f].
Activist Model of Political Party Growth
Jeffs, Rebecca A; Roach, Paul A; Wyburn, John
2015-01-01
The membership of British political parties has a direct influence on their political effectiveness. This paper applies the mathematics of epidemiology to the analysis of the growth and decline of such memberships. The party members are divided into activists and inactive members, where all activists influence the quality of party recruitment, but only a subset of activists recruit and thus govern numerical growth. The activists recruit for only a limited period, which acts as a restriction on further party growth. This Limited Activist model is applied to post-war and recent memberships of the Labour, Scottish National and Conservative parties. The model reproduces data trends, and relates realistically to historical narratives. It is concluded that the political parties analysed are not in danger of extinction but experience repeated periods of growth and decline in membership, albeit at lower numbers than in the past.
Reliability of linear measurements on a virtual bilateral cleft lip and palate model
Oosterkamp, B.C.M.; van der Meer, W.J.; Rutenfrans, M.; Dijkstra, P.U.
2006-01-01
Objective: To assess the reliability and validity of measurements performed on three-dimensional virtual models of neonatal bilateral cleft lip and palate patients, compared with measurements performed on plaster cast models. Materials and Methods: Ten high-quality plaster cast models of bilateral c
Modeling Manufacturing Impacts on Aging and Reliability of Polyurethane Foams
Rao, Rekha R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Christine Cardinal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mondy, Lisa Ann [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Soehnel, Melissa Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lorenzo, Henry T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2016-09-25
Polyurethane is a complex multiphase material that evolves from a viscous liquid to a system of percolating bubbles, which are created via a CO2 generating reaction. The continuous phase polymerizes to a solid during the foaming process generating heat. Foams introduced into a mold increase their volume up to tenfold, and the dynamics of the expansion process may lead to voids and will produce gradients in density and degree of polymerization. These inhomogeneities can lead to structural stability issues upon aging. For instance, structural components in weapon systems have been shown to change shape as they age depending on their molding history, which can threaten critical tolerances. The purpose of this project is to develop a Cradle-to-Grave multiphysics model, which allows us to predict the material properties of foam from its birth through aging in the stockpile, where its dimensional stability is important.
A chaotic agricultural machines production growth model
Jablanović, Vesna D.
2011-01-01
Chaos theory, as a set of ideas, explains the structure in aperiodic, unpredictable dynamic systems. The basic aim of this paper is to provide a relatively simple agricultural machines production growth model that is capable of generating stable equilibrium, cycles, or chaos. A key hypothesis of this work is based on the idea that the coefficient π = 1 + α plays a crucial role in explaining local stability of the agricultural machines production, where α is an autonomous growth rate of the ag...
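The abstract does not fully specify the growth model, but the behaviour it describes, a single coefficient deciding between stable equilibrium, cycles, or chaos, can be illustrated with a logistic-type map as a stand-in. The map and parameter values below are assumptions for illustration, not the paper's equations:

```python
# Hedged sketch: a logistic-type map whose coefficient determines
# whether production settles to equilibrium, cycles, or wanders
# chaotically, analogous to the role of pi = 1 + alpha in the abstract.
def iterate(pi_coef, x0=0.3, n=500):
    x = x0
    for _ in range(n):
        x = pi_coef * x * (1.0 - x)  # logistic map as a stand-in
    return x

x_eq = iterate(2.8)  # converges to the fixed point 1 - 1/2.8
x_ch = iterate(3.9)  # stays bounded in (0, 1) but never settles
print(x_eq, x_ch)
```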
Thermal models pertaining to continental growth
Morgan, Paul; Ashwal, Lew
1988-01-01
Thermal models are important to understanding continental growth, as the genesis, stabilization, and possible recycling of continental crust are closely related to the tectonic processes of the earth, which are driven primarily by heat. The thermal energy budget of the earth has been slowly decreasing since core formation, and thus the energy driving the terrestrial tectonic engine is decreasing. This fundamental observation was used to develop a logic tree defining the options for continental growth throughout earth history.
Fourcaud, Thierry; Zhang, Xiaopeng; Stokes, Alexia; Lambers, Hans; Körner, Christian
2008-01-01
Background: Modelling plant growth allows us to test hypotheses and carry out virtual experiments concerning plant growth processes that could otherwise take years in field conditions. The visualization of growth simulations allows us to see directly and vividly the outcome of a given model and provides us with an instructive tool useful for agronomists and foresters, as well as for teaching. Functional–structural (FS) plant growth models are nowadays particularly important for integrating biological processes with environmental conditions in 3-D virtual plants, and provide the basis for more advanced research in plant sciences. Scope: In this viewpoint paper, we ask the following questions. Are we modelling the correct processes that drive plant growth, and is growth driven mostly by sink or source activity? In current models, is the importance of soil resources (nutrients, water, temperature and their interaction with meristematic activity) considered adequately? Do classic models account for architectural adjustment as well as integrating the fundamental principles of development? Whilst answering these questions with the available data in the literature, we put forward the opinion that plant architecture and sink activity must be pushed to the centre of plant growth models. In natural conditions, sinks will more often drive growth than source activity, because sink activity is often controlled by finite soil resources or developmental constraints. This viewpoint paper also serves as an introduction to this Special Issue devoted to plant growth modelling, which includes new research covering areas stretching from cell growth to biomechanics. All papers were presented at the Second International Symposium on Plant Growth Modeling, Simulation, Visualization and Applications (PMA06), held in Beijing, China, from 13–17 November, 2006. Although a large number of papers are devoted to FS models of agricultural and forest crop species, physiological and genetic
Maintenance overtime policies in reliability theory models with random working cycles
Nakagawa, Toshio
2015-01-01
This book introduces a new concept of replacement in maintenance and reliability theory. Replacement overtime, where replacement occurs at the first completion of a working cycle over a planned time, is a new research topic in maintenance theory and also serves to provide a fresh optimization technique in reliability engineering. In comparing replacement overtime with standard and random replacement techniques theoretically and numerically, 'Maintenance Overtime Policies in Reliability Theory' highlights the key benefits to be gained by adopting this new approach and shows how they can be applied to inspection policies, parallel systems and cumulative damage models. Utilizing the latest research in replacement overtime by internationally recognized experts, readers are introduced to new topics and methods, and learn how to practically apply this knowledge to actual reliability models. This book will serve as an essential guide to a new subject of study for graduate students and researchers and also provides a...
Ganiyu A. Ajenikoko
2014-01-01
Reliability indices are parametric quantities used to assess the performance levels of electrical power distribution systems. In this work, a generalized quadratic model is developed for electrical power distribution system contributions to system reliability indices, using Ikeja, Port-Harcourt, Kaduna and Kano distribution system feeders as case studies. The mean System Average Interruption Duration Index (SAIDI), System Average Interruption Frequency Index (SAIFI) and Customer Average Interruption Duration Index (CAIDI) contributions to system reliability indices for the Ikeja, Port-Harcourt, Kaduna and Kano distribution systems were 0.0033, 0.0026, 0.0033 and 0.0018 respectively. A prolonged period of interruptions was recorded on most of the feeders attached to the Port-Harcourt and Kano distribution systems, making them less reliable than the Ikeja and Kaduna distribution systems. The generalized quadratic model forms a basis for good design, planning and maintenance of distribution systems at large.
Guohua Chen
2013-01-01
To diagnose the key factors that cause supply chain failure, a diagnostic model of the reliability of a supply chain with common cause failure was established, taking a three-tier supply chain centred on the manufacturer as the object. Then, considering unreliability and key importance as quantitative indices, a diagnostic algorithm for the key factors of supply chain reliability with common cause failure was studied using Monte Carlo simulation. The algorithm can be used to evaluate the reliability of the supply chain and determine the key factors that cause its failure, providing a new method for diagnosing the reliability of supply chains subject to common cause failure. Finally, an example is presented to prove the feasibility and validity of the model and method.
Reliability analysis and prediction of mixed mode load using Markov Chain Model
Nikabdullah, N.; Singh, S. S. K.; Alebrahim, R.; Azizi, M. A.; K, Elwaleed A.; Noorani, M. S. M.
2014-06-01
The aim of this paper is to present the reliability analysis and prediction of mixed mode loading by using a simple two-state Markov Chain Model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analyzing and measuring failure, in order to increase the design life and eliminate or reduce the likelihood of failures and safety risk. The mechanical failures of the crankshaft are due to high bending and torsion stress concentration from high-cycle, low rotating bending and torsional stress. The Markov Chain was used to model the two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is much more severe than torsional stress; therefore the probability criteria for the bending state would be higher compared to the torsion state. A statistical comparison between the developed Markov Chain Model and field data was done to observe the percentage of error. The reliability analysis and prediction derived from the Markov Chain Model are illustrated in the Weibull probability and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov Chain Model has the ability to generate data closely matching field data with a minimal percentage of error, and for practical application the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed mode loading.
Reliability analysis and prediction of mixed mode load using Markov Chain Model
Nikabdullah, N. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia and Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Singh, S. S. K.; Alebrahim, R.; Azizi, M. A. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); K, Elwaleed A. [Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Noorani, M. S. M. [School of Mathematical Sciences, Faculty of Science and Technology, Universiti Kebangsaan Malaysia (Malaysia)
2014-06-19
The aim of this paper is to present the reliability analysis and prediction of mixed mode loading by using a simple two-state Markov Chain Model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analyzing and measuring failure, in order to increase the design life and eliminate or reduce the likelihood of failures and safety risk. The mechanical failures of the crankshaft are due to high bending and torsion stress concentration from high-cycle, low rotating bending and torsional stress. The Markov Chain was used to model the two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is much more severe than torsional stress; therefore the probability criteria for the bending state would be higher compared to the torsion state. A statistical comparison between the developed Markov Chain Model and field data was done to observe the percentage of error. The reliability analysis and prediction derived from the Markov Chain Model are illustrated in the Weibull probability and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov Chain Model has the ability to generate data closely matching field data with a minimal percentage of error, and for practical application the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed mode loading.
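The two-state Markov chain approach described in this record can be sketched as a small simulation: the chain alternates between a bending-dominated and a torsion-dominated state, each with its own per-cycle failure probability. All transition and failure probabilities below are illustrative assumptions, not values from the paper:

```python
import random

# Hedged sketch of a two-state Markov chain for mixed mode loading.
# States are the dominant stress mode; bending is assumed both more
# frequent and more severe, as the abstract suggests. Numbers are
# purely illustrative.
P = {
    "bending": {"bending": 0.7, "torsion": 0.3},
    "torsion": {"bending": 0.6, "torsion": 0.4},
}
P_FAIL = {"bending": 0.002, "torsion": 0.0005}  # per-cycle failure prob.

def cycles_to_failure(rng, start="bending"):
    """Simulate load cycles until failure; return the cycle count."""
    state, n = start, 0
    while True:
        n += 1
        if rng.random() < P_FAIL[state]:
            return n
        state = "bending" if rng.random() < P[state]["bending"] else "torsion"

rng = random.Random(42)
lives = [cycles_to_failure(rng) for _ in range(2000)]
mttf = sum(lives) / len(lives)
print(f"simulated mean cycles to failure: {mttf:.0f}")
```

Fitting a Weibull distribution to the simulated lives would then yield the probability density, hazard rate, and reliability functions the paper reports.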
A New Software Reliability Framework——An Extended Cleanroom Model
Anonymous
2001-01-01
Cleanroom software engineering has been proven effective in improving software development quality while at the same time increasing reliability. To adapt it to large software system development, this paper presents an Extended Cleanroom Model (ECM), which integrates an object-oriented method based on stimulus history, reverse engineering ideas, automatic testing, and reliability assessment into software development. The paper discusses the architecture and implementation technology of the ECM.
Assessment of MARMOT Grain Growth Model
Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Fuel Modeling and Simulation Dept.; Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Fuel Modeling and Simulation Dept.; Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Fuel Modeling and Simulation Dept.; Brown, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pokharel, R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-12-01
This report assesses the MARMOT grain growth model by comparing modeling predictions with experimental results from thermal annealing. The purpose here is threefold: (1) to demonstrate the validation approach of using thermal annealing experiments with non-destructive characterization, (2) to test the reconstruction capability and computation efficiency in MOOSE, and (3) to validate the grain growth model and the associated parameters that are implemented in MARMOT for UO2. To assure a rigorous comparison, the 2D and 3D initial experimental microstructures of UO2 samples were characterized using non-destructive synchrotron x-ray. The same samples were then annealed at 2273K for grain growth, and their initial microstructures were used as initial conditions for simulated annealing at the same temperature using MARMOT. After annealing, the final experimental microstructures were characterized again to compare with the results from simulations. So far, comparison between modeling and experiments has been done for 2D microstructures, and 3D comparison is underway. The preliminary results demonstrated the usefulness of the non-destructive characterization method for MARMOT grain growth model validation. A detailed analysis of the 3D microstructures is in progress to fully validate the current model in MARMOT.
Markov Chain Modelling of Reliability Analysis and Prediction under Mixed Mode Loading
SINGH Salvinder; ABDULLAH Shahrum; NIK MOHAMED Nik Abdullah; MOHD NOORANI Mohd Salmi
2015-01-01
The reliability assessment for an automobile crankshaft provides an important understanding in dealing with the design life of the component in order to eliminate or reduce the likelihood of failure and safety risks. The failures of crankshafts are considered catastrophic failures that lead to severe failure of the engine block and its other connecting subcomponents. The reliability of an automotive crankshaft under mixed mode loading using the Markov Chain Model is studied. The Markov Chain is modelled using a two-state condition to represent the bending and torsion loads that occur on the crankshaft. The automotive crankshaft represents a good case study of a component under mixed mode loading due to the rotating bending and torsion stresses. An estimate of the Weibull shape parameter is used to obtain the probability density function, cumulative distribution function, hazard and reliability rate functions, the bathtub curve and the mean time to failure. The use of various properties of the shape parameter to model the failure characteristics through the bathtub curve is shown. Likewise, an understanding of the hazard rate patterns exhibited by the component can be used to improve the design and increase the life cycle based on the reliability and dependability of the component. The proposed reliability assessment provides an accurate, efficient, fast and cost-effective reliability analysis in contrast to costly and lengthy experimental techniques.
Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods
Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin
2013-01-01
Model-based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data is extracted from the production trait evaluation of Nordic Red dairy cattle. Genotyped bulls with daughters are used as training animals, and genotyped bulls and producing cows as candidate animals. For simplicity, the size of the data is chosen so that the full inverses of the mixed model equation coefficient matrices can be calculated. Model reliabilities by the single-step and bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was...
Statistical Degradation Models for Reliability Analysis in Non-Destructive Testing
Chetvertakova, E. S.; Chimitova, E. V.
2017-04-01
In this paper, we consider the application of statistical degradation models for reliability analysis in non-destructive testing. Such models make it possible to estimate the reliability function (the dependence of non-failure probability on time) for a fixed critical level using information from the degradation paths of tested items. The most widely used models are the gamma and Wiener degradation models, in which the gamma or normal distribution, respectively, is assumed for the degradation increments. Using computer simulation, we have analysed the accuracy of the reliability estimates obtained for the considered models. The number of increments can be enlarged by increasing the sample size (the number of tested items) or by increasing the frequency of measuring degradation. It has been shown that the sample size has a greater influence on the accuracy of the reliability estimates than the measuring frequency. Moreover, another important factor influencing the accuracy of reliability estimation is the duration of observing the degradation process.
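A Wiener degradation model of the kind described above can be sketched directly: degradation follows X(t) = μt + σB(t), and an item fails when X(t) first crosses a critical level D, so the reliability function is the standard inverse-Gaussian first-passage result. Parameter values below are illustrative assumptions:

```python
import math
import random

# Hedged sketch of a Wiener degradation model: drift MU, diffusion SIGMA,
# critical degradation level D. All values are illustrative.
MU, SIGMA, D = 1.0, 0.5, 10.0

def path_fails_by(t, steps, rng):
    """Monte Carlo: does one simulated degradation path cross D before t?"""
    dt = t / steps
    x = 0.0
    for _ in range(steps):
        x += MU * dt + SIGMA * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if x >= D:
            return True
    return False

def reliability_mc(t, n=4000, steps=200, seed=0):
    """Estimate R(t) = P(no failure by t) from n simulated paths."""
    rng = random.Random(seed)
    failures = sum(path_fails_by(t, steps, rng) for _ in range(n))
    return 1.0 - failures / n

def reliability_ig(t):
    """Closed-form first-passage reliability (inverse Gaussian)."""
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    a = (D - MU * t) / (SIGMA * math.sqrt(t))
    b = (-D - MU * t) / (SIGMA * math.sqrt(t))
    return Phi(a) - math.exp(2.0 * MU * D / SIGMA**2) * Phi(b)

r_mc = reliability_mc(8.0)
r_ig = reliability_ig(8.0)
print(r_mc, r_ig)
```

The Monte Carlo estimate and the closed form should agree closely; in the paper's setting the simulated paths would be replaced by measured degradation increments from tested items.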
Endogenous Fertility in Models of Growth
Allan Drazen
1993-03-01
Most theories of economic growth ignore determinants of growth in population. The common assumption of constant population growth is strikingly inconsistent with the data, which reveal a logistic pattern of population growth, the acceleration often coinciding with industrialization. After surveying existing theories of endogenous population, we propose a model in which the family replaces the market in a "traditional" sector. Children are both the primary source of labor and the sole means of saving in this sector, with output divided between generations via bargaining. Industrialization improves the opportunities of children outside the rural sector. It thus leads not only to higher outmigration but also, by increasing children's bargaining power and hence their share of output, lowers the incentive to bear children. The model can thus explain observed changes in both overall population growth and in its sectoral composition.
An adaptive neuro fuzzy model for estimating the reliability of component-based software systems
Kirti Tyagi
2014-01-01
Although many algorithms and techniques have been developed for estimating the reliability of component-based software systems (CBSSs), much more research is needed. Accurate estimation of the reliability of a CBSS is difficult because it depends on two factors: component reliability and glue code reliability. Moreover, reliability is a real-world phenomenon with many associated real-time problems. Soft computing techniques can help to solve problems whose solutions are uncertain or unpredictable. A number of soft computing approaches for estimating CBSS reliability have been proposed. These techniques learn from the past and capture existing patterns in data. The two basic elements of soft computing are neural networks and fuzzy logic. In this paper, we propose a model for estimating CBSS reliability, known as an adaptive neuro fuzzy inference system (ANFIS), that is based on these two basic elements of soft computing, and we compare its performance with that of a plain FIS (fuzzy inference system) for different data sets.
TWO-PROCEDURE OF MODEL RELIABILITY-BASED OPTIMIZATION FOR WATER DISTRIBUTION SYSTEMS
Anonymous
2000-01-01
Recently, considerable emphasis has been placed on reliability-based optimization models for water distribution systems. But considerable computational effort is needed to determine the reliability-based optimal design even of mid-sized networks, let alone large ones. In this paper, a new methodology is presented for reliability analysis of water distribution systems. The methodology consists of two procedures. In the first, the optimal design is constrained only by the pressure heads at demand nodes and is carried out in GRG2. Because the reliability constraints are removed from the optimization problem, a large number of simulations need not be conducted, so the computing time is greatly decreased. The second procedure is a linear optimal search, in which the optimal results obtained by GRG2 are adjusted to satisfy the reliability constraints. The result is a set of commercial pipe diameters for which the constraints on pressure heads and reliability at the nodes are satisfied. The computational burden is thus significantly decreased, making reliability-based optimization more practical.
Van Norman, Ethan R
2016-09-01
Curriculum-based measurement of oral reading (CBM-R) progress monitoring data is used to measure student response to instruction. Federal legislation permits educators to use CBM-R progress monitoring data as a basis for determining the presence of specific learning disabilities. However, the decision-making frameworks originally developed for CBM-R progress monitoring data were not intended for such high-stakes assessments. Numerous documented issues with trend line estimation undermine the validity of using slope estimates to infer progress. One proposed recommendation is to use confidence interval overlap as a means of judging reliable growth. This project explored the degree to which confidence interval overlap was related to true growth magnitude using simulation methodology. True and observed CBM-R scores were generated across 7 durations of data collection (6-18 weeks), 3 levels of dataset quality or residual variance (5, 10, and 15 words read correct per minute), and 2 types of data collection schedules. Descriptive and inferential analyses were conducted to explore interactions between overlap status, progress monitoring scenarios, and true growth magnitude. A small but statistically significant interaction was observed between overlap status, duration, and dataset quality, b = -0.004, t(20992) = -7.96, p < .001. In general, confidence interval overlap does not appear to meaningfully account for variance in true growth across many progress monitoring conditions. Implications for research and practice are discussed. Limitations and directions for future research are addressed.
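The overlap criterion discussed in this abstract reduces to a simple computation: fit an OLS slope to the weekly scores, form a confidence interval, and check whether two intervals intersect. A minimal sketch follows; the series, residual SD, and the t-multiplier of 2 are illustrative assumptions, not the study's actual simulation design.

```python
import numpy as np

rng = np.random.default_rng(1)

def slope_ci(weeks, scores, t_mult=2.0):
    """OLS slope with an approximate 95% CI (t-multiplier ~ 2)
    for a progress-monitoring series."""
    x = weeks - weeks.mean()
    slope = (x @ (scores - scores.mean())) / (x @ x)
    resid = scores - (scores.mean() + slope * x)
    se = np.sqrt((resid @ resid) / (len(weeks) - 2) / (x @ x))
    return slope, (slope - t_mult * se, slope + t_mult * se)

def intervals_overlap(a, b):
    """True if closed intervals a and b intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

weeks = np.arange(12, dtype=float)
# True growth 1.5 WRC/min per week, residual SD 10 (a noisy dataset)
scores = 50 + 1.5 * weeks + rng.normal(0, 10, size=12)
slope, ci = slope_ci(weeks, scores)
print(slope, ci, intervals_overlap(ci, (0.0, 0.0)))  # overlap with "no growth"?
```

With a residual SD of 10 over only 12 weeks, the slope CI is wide, which illustrates the abstract's finding that overlap status conveys little about true growth magnitude under noisy, short-duration monitoring.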
Reliability Analysis of a Composite Blade Structure Using the Model Correction Factor Method
Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian
2010-01-01
This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternative to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which, in a probabilistic sense, is model corrected so that, close to the design point, it represents the same structural behaviour as a realistic FE model. This approach leads to considerable improvement of computational efficiency over classical response surface methods, because the numerically "cheap" idealised model is used as the response surface, while the time-consuming detailed model is called only a few times, until the simplified model is calibrated to the detailed model.
Brady, Michael P.; Heiser, Lawrence A.; McCormick, Jazarae K.; Forgan, James
2016-01-01
High-stakes standardized student assessments are increasingly used in value-added evaluation models to connect teacher performance to P-12 student learning. These assessments are also being used to evaluate teacher preparation programs, despite validity and reliability threats. A more rational model linking student performance to candidates who…
Reliability-economics analysis models for photovoltaic power systems. Volume 1
Stember, L.H.; Huss, W.R.; Bridgman, M.S.
1982-11-01
This report describes the development of modeling techniques to characterize the reliability, availability, and maintenance costs of photovoltaic power systems. The developed models can be used by designers of PV systems in making design decisions and trade-offs to minimize life-cycle energy costs.
R. Li
2015-11-01
Reproducibility and reliability are fundamental principles of scientific research. A compiling setup that includes a specific compiler version and compiler flags provides essential technical support for Earth system modeling. With the fast development of computer software and hardware, the compiling setup has to be updated frequently, which challenges the reproducibility and reliability of Earth system modeling. The existing results of a simulation using an original compiling setup may be irreproducible under a newer compiling setup, because trivial round-off errors introduced by the change of compiling setup can potentially trigger significant changes in simulation results. Regarding reliability, a compiler with millions of lines of code may have bugs that are easily overlooked due to the uncertainties or unknowns in Earth system modeling. To address these challenges, this study shows that different compiling setups can achieve exactly the same (bitwise identical) results in Earth system modeling, and that a set of bitwise identical compiling setups of a model can be used across different compiler versions and different compiler flags. As a result, the original results can be more easily reproduced; for example, original results obtained with an older compiler version can be reproduced exactly with a newer one. Moreover, this study shows that new test cases can be generated based on the differences of bitwise identical compiling setups between different models, which can help detect software bugs or risks in the codes of models and compilers and finally improve the reliability of Earth system modeling.
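Bitwise identity between two runs can be checked by hashing their output files, which is how such comparisons are commonly automated. A small sketch follows; the file names are hypothetical stand-ins for real model output.

```python
import hashlib
import tempfile
from pathlib import Path

def digest(path, chunk=1 << 20):
    """SHA-256 of a binary model-output file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def bitwise_identical(path_a, path_b):
    """True iff two runs (e.g. old vs. new compiler version) wrote identical bytes."""
    return digest(path_a) == digest(path_b)

# Demo with stand-in "output files"; real use would point at model restarts.
tmp = Path(tempfile.mkdtemp())
(tmp / "run_compiler_old.bin").write_bytes(b"\x00\x01\x02" * 1000)
(tmp / "run_compiler_new.bin").write_bytes(b"\x00\x01\x02" * 1000)
(tmp / "run_other_flags.bin").write_bytes(b"\x00\x01\x03" * 1000)
print(bitwise_identical(tmp / "run_compiler_old.bin", tmp / "run_compiler_new.bin"))
print(bitwise_identical(tmp / "run_compiler_old.bin", tmp / "run_other_flags.bin"))
```

Hash comparison detects any single-bit divergence, so it captures the round-off-level differences the abstract warns about, not just gross errors.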
Reliable dual tensor model estimation in single and crossing fibers based on Jeffreys prior
J. Yang (Jianfei); D.H.J. Poot; M.W.A. Caan (Matthan); Su, T. (Tanja); C.B. Majoie (Charles); L.J. van Vliet (Lucas); F. Vos (Frans)
2016-01-01
Purpose: This paper presents and studies a framework for reliable modeling of diffusion MRI using a data-acquisition adaptive prior. Methods: Automated relevance determination estimates the mean of the posterior distribution of a rank-2 dual tensor model exploiting Jeffreys prior (JARD).
Thorndike, Alan S.
1992-01-01
My purpose here is to present a simplified treatment of the growth of sea ice. By ignoring many details, it is possible to obtain several results that help to clarify the ways in which the sea ice cover will respond to climate change. Three models are discussed. The first deals with the growth of sea ice during the cold season. The second describes the cycle of growth and melting for perennial ice. The third model extends the second to account for the possibility that the ice melts away entirely in the summer. In each case, the objective is to understand what physical processes are most important, what ice properties determine the ice behavior, and to which climate variables the system is most sensitive.
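The cold-season growth model alluded to above is classically of Stefan type: latent heat released at the ice bottom must be conducted through the ice, giving square-root growth in time. A minimal sketch under that assumption follows; the constants are standard textbook values and the 20 K temperature difference is illustrative.

```python
import numpy as np

# Stefan-type sea ice growth: conduction-limited freezing gives
# h(t) = sqrt(h0^2 + 2*k*dT*t / (rho*L)).
K_ICE = 2.1      # W/(m K), thermal conductivity of sea ice
RHO = 900.0      # kg/m^3, ice density
L_FUS = 3.34e5   # J/kg, latent heat of fusion

def ice_thickness(t_seconds, delta_t, h0=0.0):
    """Ice thickness (m) after t_seconds of growth with a fixed
    top-to-bottom temperature difference delta_t (K)."""
    return np.sqrt(h0**2 + 2.0 * K_ICE * delta_t * t_seconds / (RHO * L_FUS))

# 60 days of growth with a 20 K difference: roughly 1.2 m of ice
print(ice_thickness(60 * 86400.0, 20.0))
```

The square-root law captures the key sensitivity noted in the abstract: thin ice grows fast and thickens ever more slowly, so the ice cover's response to climate forcing is strongest where ice is thin.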
Reliability modeling of digital RPS with consideration of undetected software faults
Khalaquzzaman, M.; Lee, Seung Jun; Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Man Cheol [Chung Ang Univ., Seoul (Korea, Republic of)
2013-10-15
This paper provides an overview of different software reliability methodologies and proposes a technique for estimating the reliability of an RPS with consideration of undetected software faults. Software reliability analysis of safety-critical software has remained challenging despite a huge effort spent on developing a large number of software reliability models, and no consensus has yet been reached on an appropriate modeling methodology. However, it is realized that the combined application of a BBN-based SDLC fault prediction method and random black-box testing of software would provide better grounds for reliability estimation of safety-critical software. Digitalization of the reactor protection system of nuclear power plants was initiated several decades ago, and full digitalization has now been adopted in the new generation of NPPs around the world because digital I and C systems have many better technical features, such as easier configurability and maintainability, than analog I and C systems. Digital I and C systems are also drift-free, and incorporation of new features is much easier. Rules and regulations for the safe operation of NPPs are established and have been practiced by the operators as well as the regulators of NPPs to ensure safety. The failure mechanisms of hardware and analog systems are well understood, and the risk analysis methods for these components and systems are well established. However, digitalization of the I and C system in an NPP introduces uncertainty into the reliability analysis methods for the digital systems and components because software failure mechanisms are still unclear.
Reliability Measure Model for Assistive Care Loop Framework Using Wireless Sensor Networks
Venki Balasubramanian
2010-01-01
Body area wireless sensor networks (BAWSNs) are time-critical systems that rely on the collective data of a group of sensor nodes. Reliable data received at the sink is based on the collective data provided by all the source sensor nodes, not on individual data. Unlike in conventional reliability, the notion of retransmission is inapplicable in a BAWSN and would only lead to delayed data arrival, which is not acceptable for a time-critical application. Time-driven applications require high data reliability to maintain detection and responses. Hence, transmission reliability for a BAWSN should be based on the critical time. In this paper, we develop a theoretical model to measure a BAWSN's transmission reliability based on the critical time. The proposed model is evaluated through simulation and then compared with experimental results obtained in our existing Active Care Loop Framework (ACLF). We further show the effect of the sink buffer on transmission reliability after a detailed study of various other co-existing parameters.
Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia
2012-09-25
The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.
Reliability Stress-Strength Models for Dependent Observations with Applications in Clinical Trials
Kushary, Debashis; Kulkarni, Pandurang M.
1995-01-01
We consider the applications of stress-strength models in studies involving clinical trials. When studying the effects and side effects of certain procedures (treatments), it is often the case that observations are correlated due to subject effect, repeated measurements and observing many characteristics simultaneously. We develop maximum likelihood estimator (MLE) and uniform minimum variance unbiased estimator (UMVUE) of the reliability which in clinical trial studies could be considered as the chances of increased side effects due to a particular procedure compared to another. The results developed apply to both univariate and multivariate situations. Also, for the univariate situations we develop simple to use lower confidence bounds for the reliability. Further, we consider the cases when both stress and strength constitute time dependent processes. We define the future reliability and obtain methods of constructing lower confidence bounds for this reliability. Finally, we conduct simulation studies to evaluate all the procedures developed and also to compare the MLE and the UMVUE.
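For the common normal-theory case, the stress-strength reliability R = P(strength > stress) has a simple MLE obtained by plugging sample moments into the closed form. A sketch follows; the clinical framing (two treatments' side-effect scores) and all parameter values are illustrative assumptions, and the dependent-observation case treated in the paper is not reproduced here.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

def reliability_mle_normal(stress, strength):
    """MLE of R = P(strength > stress) for independent normal samples:
    R = Phi((mean_y - mean_x) / sqrt(var_x + var_y))."""
    d = strength.mean() - stress.mean()
    s = sqrt(stress.var() + strength.var())
    return 0.5 * (1.0 + erf(d / (s * sqrt(2.0))))

stress = rng.normal(50, 5, 200)     # hypothetical side-effect score, procedure A
strength = rng.normal(60, 5, 200)   # hypothetical side-effect score, procedure B
print(reliability_mle_normal(stress, strength))
```

With the chosen parameters the true value is Phi(10 / sqrt(50)) ≈ 0.92, so the plug-in estimate lands close to it; the paper's UMVUE and lower confidence bounds refine exactly this kind of point estimate.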
Yin Luo
2012-01-01
Traditional pump scheduling models neglect operational reliability, which is directly related to the unscheduled maintenance cost and the wear cost during operation. On the assumption that vibration is directly related to operational reliability and the degree of wear, operational reliability can be expressed as the normalized vibration level. The characteristics of vibration as a function of operating point were studied, and it can be concluded that the idealized flow-versus-vibration plot has a distinct bathtub shape. There is a narrow sweet spot (80 to 100 percent of BEP) in which low vibration levels are obtained, and vibration also follows a law similar to the square of the rotation speed in the absence of resonance phenomena. Operational reliability can then be modeled as a function of the capacity and rotation speed of the pump, and this function is added to the traditional model to form the new one. In contrast with the traditional method, the results show that the new model corrects the results produced by the traditional one and makes the pump operate with low vibration, so that operational reliability increases and maintenance cost decreases.
Modeling Fish Growth in Low Dissolved Oxygen
Neilan, Rachael Miller
2013-01-01
This article describes a computational project designed for undergraduate students as an introduction to mathematical modeling. Students use an ordinary differential equation to describe fish weight and assume the instantaneous growth rate depends on the concentration of dissolved oxygen. Published laboratory experiments suggest that continuous…
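A project of the kind described can be sketched in a few lines: an ODE for fish weight whose growth rate saturates with dissolved oxygen. The Michaelis-Menten form of the rate and all parameter values below are illustrative assumptions, not the article's published model.

```python
def fish_weight(days, w0=10.0, r_max=0.02, do_conc=6.0, do_half=2.0, dt=0.1):
    """Euler integration of dW/dt = r(DO) * W, where the instantaneous
    growth rate saturates with dissolved oxygen (Michaelis-Menten form).
    Parameter values are illustrative, not from the article."""
    r = r_max * do_conc / (do_half + do_conc)  # oxygen-limited growth rate
    w = w0
    for _ in range(int(days / dt)):
        w += dt * r * w
    return w

print(fish_weight(30, do_conc=6.0))  # well-oxygenated water
print(fish_weight(30, do_conc=1.0))  # hypoxic water: slower growth
```

The saturating rate term is the pedagogical point: growth is nearly oxygen-independent in well-aerated water but drops sharply under hypoxia, which students can explore by varying `do_conc`.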
The von Bertalanffy growth model for horticulture
Tijskens, L.M.M.; Schouten, R.E.; Unuk, T.; Šumak, D.
2017-01-01
Traditionally, crop load and fruit yield from previous seasons are used as indicators for prediction of fruit size. Disregarding the inevitable biological variation between fruit, von Bertalanffy (1938) described the growth, expressed as length, of virtually any living organism. The model is here
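The von Bertalanffy (1938) curve referenced above has the standard closed form L(t) = L_inf * (1 - exp(-k * (t - t0))). A minimal sketch follows; the asymptotic size, rate constant, and the fruit-size framing are illustrative, not the paper's fitted values.

```python
import math

def von_bertalanffy_length(t, l_inf=80.0, k=0.05, t0=0.0):
    """Classic von Bertalanffy growth curve: size approaches the
    asymptote l_inf at rate k. Units here are illustrative
    (e.g. fruit diameter in mm over days after full bloom)."""
    return l_inf * (1.0 - math.exp(-k * (t - t0)))

for day in (10, 50, 100):
    print(day, von_bertalanffy_length(day))
```

Because the curve is monotone and bounded by `l_inf`, a fit early in the season already constrains the final size, which is what makes it attractive for fruit-size prediction compared to crop-load proxies.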
Ruminant models of prenatal growth restriction.
Anthony, R V; Scheaffer, A N; Wright, C D; Regnault, T R H
2003-01-01
Intrauterine growth restriction (IUGR) is a significant health issue that not only affects infant mortality and morbidity, but may also predispose individuals to coronary heart disease, diabetes, hypertension and stroke as adults. The majority of IUGR pregnancies in humans are characterized by asymmetric fetal growth, resulting from inadequate nutrient transfer to the fetus. Furthermore, most of these pregnancies involve functional placental insufficiency, and may also show altered umbilical velocimetry. As the severity of IUGR increases, the fetus becomes increasingly hypoxic, hypoglycaemic and acidotic. In addition, placental transfer or utilization of some amino acids is known to be altered in IUGR pregnancies. Although a great deal has been learned from clinical studies of human IUGR, appropriate animal models are required to define completely the mechanisms involved in the development of IUGR. The pregnant sheep is a long-standing model for placental-fetal interactions, and fetal growth restriction can be induced in pregnant sheep by maternal nutrient restriction, maternal nutrient excess, administration of glucocorticoid, utero-placental embolization, carunclectomy and maternal hyperthermia. Although all of these sheep models are capable of inducing fetal growth restriction, the degree of restriction is variable. This review compares these sheep models of IUGR with the characteristics of human IUGR.
A STOCHASTIC GROWTH MODEL WITH ENVIRONMENTAL POLLUTION
Xueqing ZHANG; Shigeng HU; Haijun WANG
2006-01-01
Pollution is introduced into the utility function and the production function in this paper. Under appropriate macroeconomic equilibrium conditions, the paper proves that the equilibrium levels of the main economic indexes are uniquely determined by the model parameters. The paper establishes the following alternative theorem: some factors affect economic growth and welfare in opposite ways.
Potential Negative Impact of DG on Reliability Index: A Study Based on Time-Domain Modeling
Ran, Xuanchang
This thesis presents an original insight into the negative impact of distributed generation (DG) on reliability indices based on dynamic time-domain modeling. Models for essential power system components, such as protective devices and synchronous generators, were developed and tested. A 4 kV distribution loop which carries a relatively high power demand was chosen for the analysis. The characteristic curves of all protective devices were extracted from a utility database and applied to the time-domain relay model. The performance of each device was investigated in detail. The negative effect on reliability is due to fuse opening caused by installation of DG at the wrong location with an inappropriate relay setup. Over 50% of the possible DG locations can produce an undesirable impact. The study's conclusion is that there exists a significant potential for the installation of DG to negatively affect the reliability of power systems.
Hea-Jung Kim
2017-06-01
This paper develops Bayesian inference on the reliability of a class of scale mixtures of log-normal failure time (SMLNFT) models with stochastic (or uncertain) constraints on their reliability measures. The class is comprehensive and includes existing failure time (FT) models (such as log-normal, log-Cauchy, and log-logistic FT models) as well as new models that are robust to heavy-tailed FT observations. Since classical frequentist approaches to reliability analysis based on the SMLNFT model with a stochastic constraint are intractable, the Bayesian method is pursued using a Markov chain Monte Carlo (MCMC) sampling-based approach. This paper introduces a two-stage maximum entropy (MaxEnt) prior, which elicits the a priori uncertain constraint, and develops a Bayesian hierarchical SMLNFT model using this prior. The paper also proposes an MCMC method for Bayesian inference on SMLNFT model reliability and calls attention to properties of the MaxEnt prior that are useful for method development. Finally, two data sets are used to illustrate how the proposed methodology works.
Applicability of bacterial growth models in spreadable processed cheese
Dorota Weiss
2015-09-01
Background. Food spoilage is a process in which quality parameters decrease and products are no longer edible. This is a cumulative effect of bacterial growth and metabolite production, which is a factor limiting shelf life. The aim of the study was therefore to evaluate whether microbiological growth models for total viable count (TVC) and Clostridium strain bacteria are reliable tools for predicting microbiological changes in spreadable processed cheese. Material and methods. Investigations were conducted for two types of bacteria, TVC and Clostridium, at the following temperatures: 8°C, 20°C and 30°C. The total number of aerobic bacteria was determined based on standard PN-EN ISO 4833:2004, and Clostridium was detected using the microbiological procedure for sulphite-reducing anaerobic spore bacteria with a selective nourishment. Nonlinear regression and the Baranyi and Roberts primary model were used in the analysis. Results. For temperatures of 20°C and 30°C, the Baranyi and Roberts model for total viable count showed a determination coefficient of 70%. The models prepared for Clostridium at these temperatures showed much lower R2, 25% and 30% respectively. At these temperatures the product shelf life was also much shorter, amounting to 70 days at 20°C and 7 days at 30°C. For both types of bacteria incubated at 8°C, the numbers of bacteria decreased until the expiration of product shelf life. Conclusions. The models used in the analyses, Baranyi and Roberts and nonlinear regression, matched the experimental data poorly, hence they are not reliable tools. Nevertheless, they gave information about the dynamics of microbiological changes in spreadable processed cheese.
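The Baranyi and Roberts primary model used in this study has an explicit form: an adjustment function A(t) handles the lag phase, and a logarithmic term caps growth at the maximum density. A sketch follows; the parameter values (initial and maximum log counts, maximum specific growth rate, lag parameter h0) are illustrative, not the study's fitted values.

```python
import numpy as np

def baranyi_roberts(t, y0, y_max, mu_max, h0):
    """Baranyi and Roberts primary growth model (log counts vs. time).
    A(t) delays growth during the lag phase; the final log term
    enforces the stationary-phase ceiling y_max."""
    a = t + (1.0 / mu_max) * np.log(
        np.exp(-mu_max * t) + np.exp(-h0) - np.exp(-mu_max * t - h0))
    return y0 + mu_max * a - np.log(
        1.0 + (np.exp(mu_max * a) - 1.0) / np.exp(y_max - y0))

t = np.linspace(0, 70, 8)
print(baranyi_roberts(t, y0=3.0, y_max=9.0, mu_max=0.3, h0=2.0))
```

Fitting this curve to count data by nonlinear least squares yields the R2 values the study compares across temperatures; the poor fits reported for Clostridium suggest the sigmoidal shape simply was not present in those series.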
Jiang Zhibin; He Junming
2003-01-01
Object-oriented Petri nets (OPNs) are extended into stochastic object-oriented Petri nets (SOPNs) by associating the OPN of an object with stochastic transitions and introducing stochastic places. The stochastic transitions of the SOPN of a production resource can be used to model its reliability, while the SOPN of a production resource can describe its performance with reliability considered. The SOPN model of a case production system is built to illustrate the relationship between the system's performance and the failures of individual production resources.
Rachna Aggarwal
2014-12-01
This paper presents a Reliability Based Design Optimization (RBDO) model to deal with uncertainties involved in the concrete mix design process. The optimization problem is formulated in such a way that probabilistic concrete mix input parameters showing random characteristics are determined by minimizing the cost of concrete subject to a concrete compressive strength constraint for a given target reliability. Linear and quadratic models based on Ordinary Least Square Regression (OLSR), Traditional Ridge Regression (TRR), and Generalized Ridge Regression (GRR) techniques have been explored to select the best model to explicitly represent the compressive strength of concrete. The RBDO model is solved by the Sequential Optimization and Reliability Assessment (SORA) method using the fully quadratic GRR model. Optimization results for a wide range of target compressive strengths and reliability levels of 0.90, 0.95 and 0.99 are reported. Safety factor based Deterministic Design Optimization (DDO) designs for each case are also obtained. It has been observed that deterministic optimal designs are cost effective, but the proposed RBDO model gives improved design performance.
Development of Markov model of emergency diesel generator for dynamic reliability analysis
Jin, Young Ho; Choi, Sun Yeong; Yang, Joon Eon [Korea Atomic Energy Research Institute, Taejon (Korea)
1999-02-01
The EDG (Emergency Diesel Generator) of a nuclear power plant is one of the most important pieces of equipment in mitigating accidents. The FT (Fault Tree) method is widely used to assess the reliability of safety systems like an EDG in a nuclear power plant. This method, however, has limitations in exactly modeling the dynamic features of safety systems. We have therefore developed a Markov model to represent the stochastic process of dynamic systems whose states change over time. The Markov model enables us to develop a dynamic reliability model of the EDG. This model can represent all possible states of the EDG, in contrast to the FRANTIC code developed by the U.S. NRC for the reliability analysis of standby systems. To assess the regulation policy for the test interval, we performed two simulations using the developed model, based on generic data and on plant-specific data of YGN 3, respectively. We also estimate the effects of various repair rates and of the fraction of starting failures due to demand shock on the reliability of the EDG. Finally, the aging effect is analyzed. (author). 23 refs., 19 figs., 9 tabs.
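The dynamic advantage of a Markov model over a static fault tree is that state probabilities evolve in time under dp/dt = pQ. A minimal two-state sketch (available/failed) follows; the failure and repair rates are illustrative, and the paper's full model has more states than this.

```python
import numpy as np

# Two-state Markov model of a standby EDG: state 0 = available, 1 = failed.
LAM = 1e-3   # failures per hour (illustrative)
MU = 5e-2    # repairs per hour (illustrative)

# Generator matrix Q: rows sum to zero.
Q = np.array([[-LAM, LAM],
              [MU, -MU]])

def availability(t_hours, dt=0.1):
    """Euler integration of dp/dt = p Q from p(0) = [1, 0];
    returns P(EDG available at time t)."""
    p = np.array([1.0, 0.0])
    for _ in range(int(t_hours / dt)):
        p = p + dt * (p @ Q)
    return p[0]

# Approaches the steady-state availability mu / (lam + mu) ~ 0.9804
print(availability(2000.0))
```

Varying `MU` (the repair rate) or adding a failed-to-start state directly mirrors the sensitivity studies described in the abstract.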
Time Dependent Dielectric Breakdown in Copper Low-k Interconnects: Mechanisms and Reliability Models
Terence K.S. Wong
2012-09-01
The time dependent dielectric breakdown phenomenon in copper low-k damascene interconnects for ultra-large-scale integration is reviewed. The loss of insulation between neighboring interconnects represents an emerging back-end-of-line reliability issue that is not fully understood. After describing the main dielectric leakage mechanisms in low-k materials (Poole-Frenkel and Schottky emission), the major dielectric reliability models that have appeared in the literature are discussed, namely: the Lloyd model, the 1/E model, the thermochemical E model, E1/2 models, the E2 model and the Haase model. These models can be broadly categorized into those that consider only intrinsic breakdown (Lloyd, 1/E, E and Haase) and those that take into account copper migration in low-k materials (E1/2, E2). For each model, the physical assumptions and the proposed breakdown mechanism are discussed, together with the quantitative relationship predicting the time to breakdown and supporting experimental data. Experimental attempts at validating dielectric reliability models using data obtained from low-field stressing are briefly discussed. The phenomenon of soft breakdown, which often precedes hard breakdown in porous ultra-low-k materials, is highlighted for future research.
Stochastic Modelling of Gompertzian Tumor Growth
O'Rourke, S. F. C.; Behera, A.
2009-08-01
We study the effect of correlated noise in the Gompertzian tumor growth model for non-zero correlation time. The steady state probability distributions and average population of tumor cells are analyzed within the Fokker-Planck formalism to investigate the importance of additive and multiplicative noise. We find that the correlation strength and correlation time have opposite effects on the steady state probability distributions. It is observed that the non-bistable Gompertzian model, driven by correlated noise exhibits a stochastic resonance and phase transition. This behaviour of the Gompertz model is unaffected with the change of correlation time and occurs as a result of multiplicative noise.
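The stochastic Gompertz dynamics described above can be simulated by Euler-Maruyama. The sketch below uses white multiplicative noise for simplicity; the correlated (colored) noise and the Fokker-Planck analysis of the paper are not reproduced, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def gompertz_paths(a=1.0, b=0.5, sigma=0.2, x0=0.1, t_end=10.0, dt=1e-3, n=200):
    """Euler-Maruyama for the stochastic Gompertz model
    dX = X (a - b ln X) dt + sigma X dW.
    Returns the terminal tumor-cell population of n sample paths."""
    steps = int(t_end / dt)
    x = np.full(n, x0)
    for _ in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n)
        x += x * (a - b * np.log(x)) * dt + sigma * x * dw
        x = np.clip(x, 1e-12, None)  # keep the log well-defined
    return x

x_end = gompertz_paths()
print(x_end.mean())  # near the deterministic fixed point exp(a/b) ~ 7.39
```

Substituting y = ln X turns this into an Ornstein-Uhlenbeck process, which is why the terminal population clusters log-normally around exp(a/b); studying how that stationary distribution deforms under correlated noise is the subject of the paper.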
Shouri, P.V.; Sreejith, P.S. [Division of Mechanical Engineering, School of Engineering, Cochin University of Science and Technology (CUSAT), Cochin 682 022, Kerala (India)
2008-06-15
In the present scenario of energy demand overtaking energy supply, top priority is given to energy conservation programs and policies. As a result, most existing systems are redesigned or modified with a view to improving energy efficiency. Often these modifications can have an impact on process system configuration, thereby affecting process system reliability. The paper presents a model for valuation of process systems incorporating reliability that can be used to determine the change in process system value resulting from system modification. The model also determines the break-even system availability and presents an algorithm for allocation of component reliabilities of the modified system based on the break-even system availability. The developed equations are applied to a steam power plant to study the effect of various operating parameters on system value. (author)
Reliable gain-scheduled control of discrete-time systems and its application to CSTR model
Sakthivel, R.; Selvi, S.; Mathiyalagan, K.; Shi, Y.
2016-10-01
This paper focuses on reliable gain-scheduled controller design for a class of discrete-time systems with randomly occurring nonlinearities and actuator faults. The nonlinearity in the system model is assumed to occur randomly according to a Bernoulli distribution with a measurable time-varying probability in real time. The main purpose of this paper is to design a gain-scheduled controller by implementing a probability-dependent Lyapunov function and a linear matrix inequality (LMI) approach such that the closed-loop discrete-time system is stochastically stable for all admissible randomly occurring nonlinearities. The existence conditions for the reliable controller are formulated in terms of LMI constraints. Finally, the proposed reliable gain-scheduled control scheme is applied to a continuously stirred tank reactor model to demonstrate the effectiveness and applicability of the proposed design technique.
Woessner, J.
2012-07-14
Static stress transfer is one physical mechanism to explain triggered seismicity. Coseismic stress-change calculations strongly depend on the parameterization of the causative finite-fault source model. These models are uncertain due to uncertainties in input data, model assumptions, and modeling procedures. However, fault model uncertainties have usually been ignored in stress-triggering studies and have not been propagated to assess the reliability of Coulomb failure stress change (ΔCFS) calculations. We show how these uncertainties can be used to provide confidence intervals for coseismic ΔCFS values. We demonstrate this for the MW = 5.9 June 2000 Kleifarvatn earthquake in southwest Iceland and systematically map these uncertainties. A set of 2500 candidate source models from the full posterior fault-parameter distribution was used to compute 2500 ΔCFS maps. We assess the reliability of the ΔCFS values from the coefficient of variation (CV) and deem ΔCFS values to be reliable where they are at least twice as large as the standard deviation (CV ≤ 0.5). Unreliable ΔCFS values are found near the causative fault and between lobes of positive and negative stress change, where a small change in fault strike causes ΔCFS values to change sign. The most reliable ΔCFS values are found away from the source fault in the middle of positive and negative ΔCFS lobes, a likely general pattern. Using the reliability criterion, our results support the static stress-triggering hypothesis. Nevertheless, our analysis also suggests that results from previous stress-triggering studies not considering source model uncertainties may have led to a biased interpretation of the importance of static stress triggering.
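The CV ≤ 0.5 reliability criterion above is a per-cell computation over the ensemble of ΔCFS maps. A minimal sketch follows; the toy two-cell ensemble is a hypothetical stand-in for the 2500 maps.

```python
import numpy as np

rng = np.random.default_rng(4)

def reliable_mask(dcfs_ensemble, cv_max=0.5):
    """Given an ensemble of Coulomb stress-change maps, shape
    (n_models, n_cells), deem a cell reliable where the ensemble mean
    is at least twice the ensemble std, i.e. CV = std/|mean| <= 0.5."""
    mean = dcfs_ensemble.mean(axis=0)
    std = dcfs_ensemble.std(axis=0)
    cv = np.divide(std, np.abs(mean),
                   out=np.full_like(mean, np.inf),
                   where=np.abs(mean) > 0)
    return cv <= cv_max

# Toy ensemble: cell 0 is robustly positive; cell 1 flips sign across models
ens = np.stack([rng.normal(1.0, 0.2, 100),
                rng.normal(0.05, 0.5, 100)], axis=1)
print(reliable_mask(ens))  # cell 0 reliable, cell 1 not
```

The sign-flipping cell mimics the zones between positive and negative lobes that the study flags as unreliable: the ensemble mean sits near zero while the spread stays large.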
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction.
Reliability and validation of a behavioral model of clinical behavioral formulation
Amanda M Muñoz-Martínez
2011-05-01
The aim of this study was to determine the reliability and the content and predictive validity of a clinical case formulation developed from a behavioral perspective. A mixed design integrating levels of descriptive analysis and an A-B case study with follow-up was used. The study established the reliability of the following descriptive and explanatory categories: (a) problem description, (b) predisposing factors, (c) precipitating factors, (d) acquisition, and (e) inferred (maintenance) mechanism. The analysis was performed on cases from 2005 to 2008 formulated with the model derived from the current study. With regard to validity, expert judges considered that the model had content validity. The predictive validity was established through application of the model to three case studies. The discussion shows the importance of extending the investigation of the model to other populations and of establishing its clinical and concurrent validity.
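Inter-judge reliability of descriptive categories like these is commonly quantified with an agreement statistic. The following minimal sketch uses Cohen's kappa as an illustrative choice (the study's exact statistic is not stated in this abstract), with hypothetical category codings from two raters.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    cats = set(rater_a) | set(rater_b)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pe = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Hypothetical codings of five case segments by two judges.
a = ["problem", "predisposing", "precipitating", "problem", "acquisition"]
b = ["problem", "predisposing", "acquisition", "problem", "acquisition"]
print(round(cohens_kappa(a, b), 3))
```

Values above roughly 0.6 are conventionally read as substantial agreement, though thresholds vary by field.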
Reliability reallocation models as support tools in traffic safety analysis
Bačkalić, Svetlana; Jovanović, Dragan; Bačkalić, Todor
2014-04-01
One of the essential questions placed before a road authority is where to act first, i.e. which road sections should be treated in order to achieve the desired level of reliability of a particular road; this question is the subject of this research. The paper shows how reliability reallocation theory can be applied in the safety analysis of a road consisting of sections. The model has been successfully tested using two apportionment techniques: ARINC and the minimum effort algorithm. The given methods were applied in the traffic safety analysis as a basic step towards achieving a higher level of reliability. The previous methods used for selecting hazardous locations do not provide precise values for the required frequency of accidents, i.e. the time period between the occurrences of two accidents. In other words, they do not allow a connection to be established between a precise demand for increased reliability (expressed as a percentage) and the selection of particular road sections for further analysis. The paper shows that reallocation models can also be applied in road safety analysis, or more precisely, as part of the measures for increasing the level of safety. A tool has been developed for selecting road sections for treatment on the basis of a precisely defined increase in the level of reliability of a particular road, i.e. the mean time between the occurrences of two accidents.
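The ARINC apportionment step can be sketched as a proportional reallocation of a required system failure rate among series units: each unit receives a share proportional to its present failure rate. The numbers below are illustrative, not the paper's road data.

```python
def arinc_apportionment(unit_rates, required_system_rate):
    """ARINC method for a series system: allocate the required system
    failure rate among units in proportion to their current failure
    rates. unit_rates are the present rates λ_i; the returned allocated
    rates λ_i* sum to the required system rate."""
    total = sum(unit_rates)
    weights = [r / total for r in unit_rates]
    return [w * required_system_rate for w in weights]

# Hypothetical series system of three road sections with current
# "failure" (accident) rates; the target system rate reflects a
# demanded reliability increase (illustrative numbers).
current = [0.004, 0.002, 0.004]
target_system_rate = 0.008
print(arinc_apportionment(current, target_system_rate))
```

In the reallocation view, each section's allocated rate translates back into a required mean time between accidents for that section.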
Vargas, Susana; Millán-Chiu, Blanca E; Arvizu-Medrano, Sofía M; Loske, Achim M; Rodríguez, Rogelio
2017-04-09
A comparison between plate counting (PC) and dynamic light scattering (DLS) is reported. PC is the standard technique to determine bacterial population as a function of time; however, this method has drawbacks, such as the cumbersome preparation and handling of samples, as well as the long time required to obtain results. Alternative methods based on optical density are faster, but do not distinguish viable from non-viable cells. These inconveniences are overcome by using DLS. Two different bacterial strains were considered: Escherichia coli and Staphylococcus aureus. DLS was performed under two different illuminating conditions: continuous and intermittent. From the increase in particle size as a function of time, it was possible to observe cell division and the formation of aggregates containing very few bacteria. The scattered intensity profiles showed the lag phase and the transition to the exponential phase of growth, providing a quantity proportional to viable bacteria concentration. The results revealed a clear linear correlation, in both the lag and the exponential phase, between the Log10(colony-forming units/mL) from PC and the Log10 of the scattered intensity Is from DLS. These correlations provide good support for using DLS as an alternative technique to determine bacterial population.
A Chaotic Model for Software Reliability
Zou Fengzhong; Li Chuanxiang
2001-01-01
An analysis of software failure mechanisms suggests that some software failure behaviors are chaotic, so chaotic methods can be applied to software reliability inference. Before applying a chaotic method, however, system identification must be performed; only after the system is confirmed to be chaotic can embedding-space techniques be applied to reconstruct the system's phase space and attractor from the software failure time series, and the chaotic properties revealed by the attractor then be used to estimate software reliability. The paper presents an empirical analysis on three standard data sets; the results show that two of the data sets originate from chaotic mechanisms, that their attractors have low fractional limit dimensions, and that the predicted reliability agrees well with the actual reliability. Notably, the proposed chaotic approach breaks through the limitation of the purely stochastic analysis traditionally used in software reliability. Computers affect almost every aspect of human life. As the dependency of human beings on computer systems grows, so does the need for computer-system reliability technology. In contrast to computer hardware, software is far more complicated; the key to improving the overall reliability of a system is therefore to improve the reliability of its software. Although scientists have, in the past few decades, proposed many reliability models for software, which greatly enhanced the reliability and productivity of software products, these models are far from satisfactory; building models of high accuracy and improving the existing models is therefore of practical significance. Conventional software reliability theory assumes that the failure processes of software are completely random, whereas the authors of this paper, on the basis of a careful investigation of the physical mechanics of software failures, suggest that some software failure dynamics have chaotic features. The reliability of such systems can thus be addressed with chaotic approaches. Before applying chaotic methodology to estimate the reliability of the software under consideration, the first step is system identification, which uses certain criteria to distinguish chaotic dynamics
A continuous-time Bayesian network reliability modeling and analysis framework
Boudali, H.; Dugan, J.B.
2006-01-01
We present a continuous-time Bayesian network (CTBN) framework for dynamic systems reliability modeling and analysis. Dynamic systems exhibit complex behaviors and interactions between their components, where not only the combination of failure events matters, but so does the sequence ordering of their occurrences.
A fast, reliable algorithm for computing frequency responses of state space models
Wette, Matt
1991-01-01
Computation of frequency responses for large order systems described by time invariant state space systems often provides a bottleneck in control system analysis. It is shown that banding the A-matrix in the state space model can effectively reduce the computation time for such systems while maintaining reliability in the results produced.
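As a minimal sketch of the computation this abstract concerns, the frequency response G(jω) = C(jωI − A)⁻¹B + D of a small state-space model can be evaluated directly; the example system below is hypothetical, and the banding speed-up itself is not shown.

```python
def solve(A, b):
    """Solve A x = b for a small complex square system by Gaussian
    elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0j] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

def freq_response(A, B, C, D, omega):
    """SISO frequency response G(jw) = C (jwI - A)^{-1} B + D."""
    n = len(A)
    jw_minus_A = [[(1j * omega if i == k else 0) - A[i][k] for k in range(n)]
                  for i in range(n)]
    x = solve(jw_minus_A, [B[i][0] for i in range(n)])  # (jwI-A)^{-1} B
    return sum(C[0][i] * x[i] for i in range(n)) + D

# Lightly damped oscillator: x' = A x + B u, y = C x (illustrative).
A = [[0.0, 1.0], [-1.0, -0.5]]
B = [[0.0], [1.0]]
C = [[1.0, 0.0]]
D = 0.0
g = freq_response(A, B, C, D, 1.0)
print(abs(g))  # gain |G(j1)| = 2.0 for this system
```

For large n, repeating this solve at many frequencies is the bottleneck the paper addresses; a banded A reduces each solve from O(n³) to roughly O(n·bandwidth²).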
Reviewing progress in PJM's capacity market structure via the new reliability pricing model
Sener, Adil Caner; Kimball, Stefan
2007-12-15
The Reliability Pricing Model introduces significant changes to the capacity market structure of PJM. The main feature of the RPM design is a downward-sloping demand curve, which replaces the highly volatile vertical demand curve. The authors review the latest RPM structure, results of the auctions, and the future course of the implementation process. (author)
Kan Yingnan; Yang Zhaojun; Li Guofa; He Jialong; Wang Yankun; Li Hongzhou
2016-01-01
A new problem that classical statistical methods are incapable of solving is reliability modeling and assessment when multiple numerical control machine tools (NCMTs) reveal zero failures after a reliability test. Thus, the zero-failure data form and corresponding Bayesian model are developed to solve the zero-failure problem of NCMTs, for which no previous suitable statistical model has been developed. An expert-judgment process that incorporates prior information is presented to solve the difficulty in obtaining reliable prior distributions of Weibull parameters. The equations for the posterior distribution of the parameter vector and the Markov chain Monte Carlo (MCMC) algorithm are derived to solve the difficulty of calculating high-dimensional integration and to obtain parameter estimators. The proposed method is applied to a real case; a corresponding programming code and trick are developed to implement an MCMC simulation in WinBUGS, and a mean time between failures (MTBF) of 1057.9 h is obtained. Given its ability to combine expert judgment, prior information, and data, the proposed reliability modeling and assessment method under the zero failure of NCMTs is validated.
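The Bayesian zero-failure setup can be sketched in a few lines. The following pure-Python Metropolis sampler is an illustrative stand-in for the paper's WinBUGS/MCMC implementation; the log-normal priors and all numbers are assumptions, not the paper's expert-elicited values.

```python
import math
import random

def log_post(beta, eta, censor_times):
    """Log posterior for Weibull(beta, eta) under zero failures: every
    unit survived its test time t_i, so the likelihood is the product
    of the survival probabilities exp(-(t_i/eta)^beta). The log-normal
    priors below are illustrative stand-ins for expert judgment."""
    if beta <= 0.0 or eta <= 0.0:
        return -math.inf
    loglik = -sum((t / eta) ** beta for t in censor_times)
    logprior = (-(math.log(beta) - math.log(1.5)) ** 2 / (2 * 0.3 ** 2)
                - (math.log(eta) - math.log(2000.0)) ** 2 / (2 * 1.0 ** 2)
                - math.log(beta) - math.log(eta))
    return loglik + logprior

def posterior_mtbf(censor_times, n_iter=20000, seed=1):
    """Random-walk Metropolis on (beta, eta); returns the posterior
    mean of MTBF = eta * Gamma(1 + 1/beta)."""
    random.seed(seed)
    beta, eta = 1.5, 2000.0
    lp = log_post(beta, eta, censor_times)
    draws = []
    for i in range(n_iter):
        b_new = beta * math.exp(random.gauss(0.0, 0.05))
        e_new = eta * math.exp(random.gauss(0.0, 0.10))
        lp_new = log_post(b_new, e_new, censor_times)
        # Jacobian term corrects for the multiplicative proposal.
        log_acc = lp_new - lp + math.log((b_new * e_new) / (beta * eta))
        if math.log(random.random()) < log_acc:
            beta, eta, lp = b_new, e_new, lp_new
        if i >= n_iter // 2:  # discard the first half as burn-in
            draws.append(eta * math.gamma(1.0 + 1.0 / beta))
    return sum(draws) / len(draws)

# Five machine tools, each surviving a 500 h reliability test failure-free.
print(round(posterior_mtbf([500.0] * 5), 1))
```

With zero failures the data only bound the parameters from one side, so the posterior MTBF is driven largely by the priors; that is exactly why the paper's expert-judgment step matters.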
2011-05-18
... COMMISSION NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection... issued for public comment a document entitled: NUREG/CR-XXXX, ``Development of Quantitative Software... development of regulatory guidance for using risk information related to digital systems in the...
M. O. Kostin
2010-09-01
The paper proposes a probabilistic model of the parametric reliability of power electromagnetic valve contactors of rolling stock. The model helps to evaluate the probability of failure of the contactor switching condition (the tractive force should be greater than the resulting counteracting force during the whole process of operation).
Mathematical Model of Equipment Unit Reliability for Determination of Optimum Overhaul Periods
M. A. Pasiouk
2009-01-01
The paper proposes a mathematical model of equipment unit reliability that takes into account the effect of the operational mode and the main influencing factors. Its application contributes to the reduction of operating costs, the optimization of overhaul periods, the prolongation of service life, and the rational usage of fleet resources.
Modelling subtle growth of linguistic networks
Kulig, Andrzej; Kwapien, Jaroslaw; Oswiecimka, Pawel
2014-01-01
We investigate properties of evolving linguistic networks defined by the word-adjacency relation. Such networks belong to the category of networks with accelerated growth, but their shortest path length appears to reveal a network-size dependence of a different functional form than the ones known so far. We thus compare the networks created from literary texts with their artificial substitutes based on different variants of the Dorogovtsev-Mendes model and observe that none of them is able to properly simulate the novel asymptotics of the shortest path length. Then, we identify grammar-induced local chain-like linear growth as a missing element in this model and extend it by incorporating such effects. It is in this way that a satisfactory agreement with the empirical result is obtained.
Growth models for tree stems and vines
Bressan, Alberto; Palladino, Michele; Shen, Wen
2017-08-01
The paper introduces a PDE model for the growth of a tree stem or a vine. The equations describe the elongation due to cell growth, and the response to gravity and to external obstacles. An additional term accounts for the tendency of a vine to curl around branches of other plants. When obstacles are present, the model takes the form of a differential inclusion with state constraints. At each time t, a cone of admissible reactions is determined by the minimization of an elastic deformation energy. The main theorem shows that local solutions exist and can be prolonged globally in time, except when a specific "breakdown configuration" is reached. Approximate solutions are constructed by an operator-splitting technique. Some numerical simulations are provided at the end of the paper.
Growth Model of Local Government Websites
HO Sho; IIJIMA Junichi
2004-01-01
To clarify the conceptual framework for assessing the evolution of web-based information systems (WIS) from an information perspective instead of the usual systems perspective, and to seek an in-depth understanding of the maturing patterns of WISs based on this framework, several central concepts related to the information aspect of WIS are first discussed; then a growth model of local government websites, based on a survey study, is proposed.
Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard
2014-01-01
Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, it is shown how the estimated uncertainties can be applied in probabilistic reliability assessments.
Dendritic growth model of multilevel marketing
Pang, James Christopher S.; Monterola, Christopher P.
2017-02-01
Biologically inspired dendritic network growth is utilized to model the evolving connections of a multilevel marketing (MLM) enterprise. Starting from agents at random spatial locations, a network is formed by minimizing a distance cost function controlled by a parameter, termed the balancing factor bf, that weighs the wiring and the path length costs of connection. The paradigm is compared to actual MLM membership data and is shown to be successful in statistically capturing the membership distribution, better than the previously reported agent-based preferential attachment or analytic branching process models. Moreover, it recovers the known empirical statistics of previously studied MLMs, specifically: (i) a membership distribution characterized by the existence of peak levels indicating limited growth, and (ii) an income distribution obeying the 80-20 Pareto principle. Extensive types of income distributions, from uniform to Pareto to a "winner-take-all" kind, are also modeled by varying bf. Finally, the robustness of our dendritic growth paradigm to random agent removals is explored and its implications for MLM income distributions are discussed.
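A minimal sketch of the cost-minimizing growth rule follows; it is a simplification of the dendritic model, using a Euclidean wiring cost weighted by bf plus a hop-count path cost, with hypothetical agents on a unit square.

```python
import math
import random

def grow_network(n_agents, bf, seed=0):
    """Each new agent at a random location attaches to the existing
    node that minimizes bf * wiring_cost + path_cost, where
    wiring_cost is the Euclidean distance to the candidate node and
    path_cost is the candidate's hop distance to the root.
    Returns (parent links, node depths)."""
    random.seed(seed)
    pos = [(random.random(), random.random()) for _ in range(n_agents)]
    parent = {0: None}   # agent 0 is the root (enterprise founder)
    depth = {0: 0}
    for i in range(1, n_agents):
        best = min(parent,
                   key=lambda n: bf * math.dist(pos[i], pos[n]) + depth[n])
        parent[i] = best
        depth[i] = depth[best] + 1
    return parent, depth

# Small bf: path cost dominates, everyone joins the root (a star).
# Large bf: wiring cost dominates, deeper chain-like trees form.
_, depth_star = grow_network(200, bf=0.01)
_, depth_chain = grow_network(200, bf=100.0)
print(max(depth_star.values()), max(depth_chain.values()))
```

The depth distribution of the resulting tree is the analogue of the MLM membership-per-level distribution studied in the paper.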
Meeting Human Reliability Requirements through Human Factors Design, Testing, and Modeling
R. L. Boring
2007-06-01
In the design of novel systems, it is important for the human factors engineer to work in parallel with the human reliability analyst to arrive at the safest achievable design that meets design team safety goals and certification or regulatory requirements. This paper introduces the System Development Safety Triptych, a checklist of considerations for the interplay of human factors and human reliability through design, testing, and modeling in product development. This paper also explores three phases of safe system development, corresponding to the conception, design, and implementation of a system.
On New Cautious Structural Reliability Models in the Framework of imprecise Probabilities
Utkin, Lev V.; Kozine, Igor
2010-01-01
The main advantages of these models are the possibility of representing both aleatory (stochastic) and epistemic uncertainty and the flexibility with which information can be represented. The authors' previous research on generalizing structural reliability models to imprecise statistical measures is summarized in Utkin & Kozine (2002) and Utkin (2004). In many practical cases the above-mentioned inputs do not exist and the analyst has only some judgments or measurements (observations) of the values of stress and strength. The questions addressed are how to utilize this available information for computing the structural reliability, and what to do if the number of judgments or measurements is very small.
Zemenkova, M. Yu; Shipovalov, A. N.; Zemenkov, Yu D.
2016-04-01
The main technological equipment of hydrocarbon pipeline transport consists of hydraulic machines. Oil transportation mainly uses centrifugal pumps, designed to work in the "pumping station-pipeline" system. A standard pumping station consists of several pumps and complex hydraulic piping. The authors have developed a set of models and algorithms for calculating the system reliability of pumps, based on reliability theory. As an example, an estimation method applying graph theory is considered.
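A reliability-block-diagram sketch of a pumping station illustrates the kind of system-reliability calculation described; the station layout and all reliability values are hypothetical, and the paper's graph-theoretic method is more general.

```python
from math import comb

def series(rs):
    """Reliability of components in series: all must work."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(rs):
    """Reliability of redundant (parallel) components: at least one works."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

def k_out_of_n(k, n, r):
    """Reliability of a k-out-of-n group of identical pumps."""
    return sum(comb(n, j) * r**j * (1 - r)**(n - j) for j in range(k, n + 1))

# Hypothetical station: three identical pumps of which two must run
# (2-out-of-3), in series with piping and a control system.
pumps = k_out_of_n(2, 3, 0.95)
station = series([pumps, 0.999, 0.99])  # pumps, piping, control
print(pumps, station)
```

Graph-based methods generalize this: the station is a reliability graph, and system reliability is the probability that a working path connects source to sink.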
Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology
Knezevic, J.; Odoom, E.R
2001-07-01
A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising the fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory, because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow efficient simultaneous generation of minimal cut and path sets.
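The crisp (non-fuzzy) two-input Lambda-Tau gate expressions underlying such a methodology can be sketched as follows; the fuzzy extension replaces the scalar λ (failure rate) and τ (repair time) values with fuzzy numbers. The component data below are illustrative assumptions.

```python
def and_gate(l1, t1, l2, t2):
    """Lambda-tau expressions for a two-input AND gate (the output
    fails only when both inputs are failed): equivalent failure
    rate and repair time."""
    lam = l1 * l2 * (t1 + t2)
    tau = (t1 * t2) / (t1 + t2)
    return lam, tau

def or_gate(l1, t1, l2, t2):
    """Lambda-tau expressions for a two-input OR gate (either
    input failure fails the output)."""
    lam = l1 + l2
    tau = (l1 * t1 + l2 * t2) / (l1 + l2)
    return lam, tau

# Illustrative repairable system: a duty pump with a redundant standby
# (AND gate), in series (OR gate) with a shared valve.
lam_p, tau_p = and_gate(2e-4, 10.0, 2e-4, 10.0)   # λ per hour, τ in hours
lam_s, tau_s = or_gate(lam_p, tau_p, 5e-5, 4.0)
print(lam_s, tau_s)
```

In the fuzzy version each arithmetic operation above is carried out on α-cut intervals of triangular fuzzy numbers, which is how expert imprecision propagates to the system indices.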
Gearbox Reliability Collaborative Phase 1 and 2: Testing and Modeling Results; Preprint
Keller, J.; Guo, Y.; LaCava, W.; Link, H.; McNiff, B.
2012-05-01
The Gearbox Reliability Collaborative (GRC) investigates root causes of wind turbine gearbox premature failures and validates design assumptions that affect gearbox reliability using a combined testing and modeling approach. Knowledge gained from the testing and modeling of the GRC gearboxes builds an understanding of how the selected loads and events translate into internal responses of three-point mounted gearboxes. This paper presents some testing and modeling results of the GRC research during Phase 1 and 2. Non-torque loads from the rotor including shaft bending and thrust, traditionally assumed to be uncoupled with gearbox, affect gear and bearing loads and resulting gearbox responses. Bearing clearance increases bearing loads and causes cyclic loading, which could contribute to a reduced bearing life. Including flexibilities of key drivetrain subcomponents is important in order to reproduce the measured gearbox response during the tests using modeling approaches.
Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.
2004-01-01
This paper describes the development of a methodology for estimating reliability and maintainability distribution parameters for a reusable launch vehicle. A disciplinary analysis code and experimental designs are used to construct approximation models for performance characteristics. These models are then used in a simulation study to estimate performance characteristic distributions efficiently. The effectiveness and limitations of the developed methodology for launch vehicle operations simulations are also discussed.
An Analysis of Starting Points for Setting Up a Model of a More Reliable Ship Propulsion
Martinović, Dragan; Tudor, Mato; Bernečić, Dean
2011-01-01
This paper considers the important requirement for ship propulsion necessary for its immaculate operation, since any failure can endanger the ship and render it useless. Particular attention is given to the failure of auxiliary engines that can also seriously jeopardise the safety of the ship. Therefore the paper presents preliminary investigations for setting up models of reliable ship propulsion accounting for the failure of auxiliary engines. Models of most frequent implementations of e...
System principles, mathematical models and methods to ensure high reliability of safety systems
Zaslavskyi, V.
2017-04-01
Modern safety and security systems are composed of a large number of various components designed for detection, localization, tracking, collecting, and processing of information from the systems of monitoring, telemetry, control, etc. They are required to be highly reliable in order to correctly perform data aggregation, processing and analysis for subsequent decision-making support. During the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Different component types perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of task performance and eliminates common-cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used for solving problems of optimal redundancy on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao
2017-03-15
As oil transport increases in the Texas bays, greater risks of ship collisions will become a challenge, yielding oil spill accidents as a consequence. To minimize the ecological damage and optimize rapid response, emergency managers need to be informed of how fast and where oil will spread as soon as possible after a spill. The state-of-the-art operational oil spill forecast modeling system improves the oil spill response to a new stage. However, uncertainty due to predicted data inputs often compromises the reliability of the forecast result, leading to misdirection in contingency planning. Understanding the forecast uncertainty and reliability thus becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters to generate forecast probability maps. The oil spill forecast uncertainty is then quantified by comparing the forecast probability map and the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, helping emergency managers improve the capability of real-time operational oil spill response and impact assessment.
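The Monte Carlo probability-map idea can be sketched with a toy advection model: perturb the uncertain forcing across runs, track where the oil ends up, and report the per-cell hit fraction. All drift parameters and the unit-square domain are hypothetical stand-ins for the HyosPy ensemble.

```python
import random

def forecast_probability_map(n_runs, n_steps, grid, seed=42):
    """Toy Monte Carlo forecast: a surface particle is advected by an
    uncertain drift (perturbed each run); the fraction of runs in which
    the particle ends in each cell approximates the forecast
    probability map used to judge reliability."""
    random.seed(seed)
    counts = [[0] * grid for _ in range(grid)]
    for _ in range(n_runs):
        x = y = 0.5  # release point in a unit-square bay, illustrative
        u = random.gauss(0.004, 0.001)  # uncertain eastward drift/step
        v = random.gauss(0.002, 0.001)  # uncertain northward drift/step
        for _ in range(n_steps):
            x = min(max(x + u + random.gauss(0, 0.0005), 0.0), 0.999)
            y = min(max(y + v + random.gauss(0, 0.0005), 0.0), 0.999)
        counts[int(y * grid)][int(x * grid)] += 1
    return [[c / n_runs for c in row] for row in counts]

prob = forecast_probability_map(n_runs=500, n_steps=50, grid=4)
print(prob)  # per-cell probabilities; total probability is 1
```

Comparing such a map against the hindcast trajectory is what yields the belief-degree reliability measure described in the abstract.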
Data Applicability of Heritage and New Hardware for Launch Vehicle System Reliability Models
Al Hassan Mohammad; Novack, Steven
2015-01-01
Many launch vehicle systems are designed and developed using heritage and new hardware. In most cases, the heritage hardware undergoes modifications to fit new functional system requirements, impacting the failure rates and, ultimately, the reliability data. New hardware, which lacks historical data, is often compared to like systems when estimating failure rates. Some qualification of applicability for the data source to the current system should be made. Accurately characterizing the reliability data applicability and quality under these circumstances is crucial to developing model estimations that support confident decisions on design changes and trade studies. This presentation will demonstrate a data-source classification method that ranks reliability data according to applicability and quality criteria to a new launch vehicle. This method accounts for similarities/dissimilarities in source and applicability, as well as operating environments like vibrations, acoustic regime, and shock. This classification approach will be followed by uncertainty-importance routines to assess the need for additional data to reduce uncertainty.
Teufel, Linda; Schuster, K Christian; Merschak, Petra; Bechtold, Thomas; Redl, Bernhard
2008-01-01
There is a lack of relevant methods to assess the colonization of textiles by skin bacteria, because present methods are mainly culture-based procedures. Therefore, the goal of this study was to develop a fast and sensitive culture-independent procedure for the quantification of microbial colonization and growth on textiles. We have established a suitable protocol to use DNA quantification as a reliable method for in vitro and in vivo investigations of textiles. For DNA extraction, a two-step procedure comprising treatment of the textile with a solution containing Triton X-100 and lysozyme for 1 h and a successive treatment with SDS and proteinase K for 2 h turned out to be most efficient. DNA extracted from textiles and fabrics was then quantified with the highly sensitive PicoGreen fluorescent dye. In vitro challenge tests demonstrated a strong correlation between the numbers of bacteria on textiles and the amount of DNA extracted from the textiles. Therefore, this method was used to compare different materials after in vivo trials to assess their susceptibility to microbial colonization and growth.
Practical applications of age-dependent reliability models and analysis of operational data
Lannoy, A.; Nitoi, M.; Backstrom, O.; Burgazzi, L.; Couallier, V.; Nikulin, M.; Derode, A.; Rodionov, A.; Atwood, C.; Fradet, F.; Antonov, A.; Berezhnoy, A.; Choi, S.Y.; Starr, F.; Dawson, J.; Palmen, H.; Clerjaud, L
2005-07-01
The purpose of the workshop was to present the experience of practical application of time-dependent reliability models. The program of the workshop comprises the following sessions: -) aging management and aging PSA (Probabilistic Safety Assessment), -) modeling, -) operation experience, and -) accelerating aging tests. In order to introduce time aging effect of particular component to the PSA model, it has been proposed to use the constant unavailability values on the short period of time (one year for example) calculated on the basis of age-dependent reliability models. As for modeling, it appears that the problem of too detailed statistical models for application is the lack of data for required parameters. As for operating experience, several methods of operating experience analysis have been presented (algorithms for reliability data elaboration and statistical identification of aging trend). As for accelerated aging tests, it is demonstrated that a combination of operating experience analysis with the results of accelerated aging tests of naturally aged equipment could provide a good basis for continuous operation of instrumentation and control systems.
A Reliability Model for Ni-BaTiO3-Based (BME) Ceramic Capacitors
Liu, Donhang
2014-01-01
The evaluation of multilayer ceramic capacitors (MLCCs) with base-metal electrodes (BMEs) for potential NASA space project applications requires an in-depth understanding of their reliability. The reliability of an MLCC is defined as the ability of the dielectric material to retain its insulating properties under stated environmental and operational conditions for a specified period of time t. In this presentation, a general mathematical expression of a reliability model for a BME MLCC is developed and discussed. The reliability model consists of three parts: (1) a statistical distribution that describes the individual variation of properties in a test group of samples (Weibull, log normal, normal, etc.); (2) an acceleration function that describes how a capacitor's reliability responds to external stresses such as applied voltage and temperature (all units in the test group should follow the same acceleration function if they share the same failure mode, independent of individual units); and (3) the effect and contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size S. In general, a two-parameter Weibull statistical distribution model is used in the description of a BME capacitor's reliability as a function of time. The acceleration function that relates a capacitor's reliability to external stresses depends on the failure mode. Two failure modes have been identified in BME MLCCs: catastrophic and slow degradation. A catastrophic failure is characterized by a time-accelerating increase in leakage current that is mainly due to existing processing defects (voids, cracks, delamination, etc.), i.e. extrinsic defects. A slow degradation failure is characterized by a near-linear increase in leakage current against the stress time; this is caused by the electromigration of oxygen vacancies (intrinsic defects).
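As a sketch of how the first two parts of such a model combine, the following pairs a two-parameter Weibull reliability function with a Prokopowicz-Vaskas-style voltage/temperature acceleration factor, a form commonly used for ceramic capacitors. The voltage exponent, activation energy, and all numeric values are illustrative assumptions, not values from the presentation.

```python
import math

K_BOLTZ_EV = 8.617e-5  # Boltzmann constant, eV/K

def weibull_reliability(t, eta, beta):
    """Two-parameter Weibull reliability R(t) = exp(-(t/eta)^beta)."""
    return math.exp(-((t / eta) ** beta))

def acceleration_factor(v_test, v_use, t_test, t_use, n=3.0, ea=1.1):
    """Prokopowicz-Vaskas-style acceleration between test and use
    conditions: a power law in voltage ratio times an Arrhenius term
    in absolute temperature (n and ea in eV are assumptions)."""
    volt = (v_test / v_use) ** n
    temp = math.exp((ea / K_BOLTZ_EV) * (1.0 / t_use - 1.0 / t_test))
    return volt * temp

# Scale a test-condition characteristic life back to use conditions.
eta_test = 2.0e3  # hours at 2x rated voltage and 125 C (illustrative)
af = acceleration_factor(v_test=2.0, v_use=1.0, t_test=398.0, t_use=358.0)
eta_use = eta_test * af
print(weibull_reliability(10000.0, eta_use, beta=2.5))
```

The third part of the model would further scale η with the construction parameters (N, d, r, S); that dependence is specific to the presentation and is not reproduced here.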
Modeling and control of greenhouse crop growth
Rodríguez, Francisco; Guzmán, José Luis; Ramírez-Arias, Armando
2015-01-01
A discussion of challenges related to the modeling and control of greenhouse crop growth, this book presents state-of-the-art answers to those challenges. The authors model the subsystems involved in successful greenhouse control using different techniques and show how the models obtained can be exploited for simulation or control design; they suggest ideas for the development of physical and/or black-box models for this purpose. Strategies for the control of climate- and irrigation-related variables are brought forward. The uses of PID control and feedforward compensators, both widely used in commercial tools, are summarized. Advanced control techniques (event-based, robust, and predictive control, for example) are used to improve on the performance of those basic methods. A hierarchical control architecture is developed, governed by a high-level multiobjective optimization approach rather than traditional constrained optimization and artificial intelligence techniques. Reference trajector...
Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist
2010-01-01
The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters fails because the data are not informative enough (the sensitivities of 16 parameters were insignificant). This indicates that the NREL model has severe parameter uncertainty, which is likely to be the case for other hydrolysis models as well, since similar kinetic expressions are used. To overcome this impasse, we have used the Monte Carlo procedure...
Testing R&D-Based Endogenous Growth Models
Kruse-Andersen, Peter Kjær
2017-01-01
R&D-based growth models are tested using US data for the period 1953-2014. A general growth model is developed which nests the model varieties of interest. The model implies a cointegrating relationship between multifactor productivity, research intensity, and employment. This relationship is estimated... The long-run growth rate of GDP per worker converges to between zero and 1.1 pct.
Basu, Asit P; Basu, Sujit K
1998-01-01
This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibull...
Huibing Hao
2015-01-01
Light emitting diode (LED) lamps have attracted increasing interest in the field of lighting systems due to their low energy consumption and long lifetime. For different functions (i.e., illumination and color), a lamp may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes a challenging problem. In this paper, we assume that the system has two performance characteristics, each governed by a random-effects Gamma process, where the random effects capture unit-to-unit differences. The dependency of the performance characteristics is described by a Frank copula function. Via the copula function, a reliability assessment model is proposed. Because the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data is given to demonstrate the usefulness and validity of the proposed model and method.
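The Frank-copula step of the model above is easy to illustrate in isolation: given the marginal reliabilities of the two performance characteristics at some time t (which in the paper come from the random-effects Gamma processes, and here are just assumed numbers), the joint probability that both remain within spec is the copula evaluated at the two marginals. This is a sketch of that combination only, not the full MCMC estimation.

```python
import math

def frank_copula(u, v, theta):
    """Frank copula C(u, v; theta) for theta != 0; theta > 0 gives positive
    dependence, theta -> 0 recovers independence C(u, v) = u*v."""
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    den = math.exp(-theta) - 1.0
    return -math.log(1.0 + num / den) / theta

def system_reliability(r1, r2, theta):
    """Joint probability that both performance characteristics are within
    spec, given marginal reliabilities r1, r2 and Frank dependence theta."""
    return frank_copula(r1, r2, theta)

# Assumed marginals at some time t: 0.95 (lumen maintenance), 0.90 (color).
print(system_reliability(0.95, 0.90, theta=5.0))
```

With positive dependence (theta = 5) the joint reliability lands between the independence product 0.855 and the smaller marginal 0.90, which is the qualitative point of modeling the dependency at all.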
Reliability and Maintainability model (RAM) user and maintenance manual. Part 2
Ebeling, Charles E.
1995-01-01
This report documents the procedures for utilizing and maintaining the Reliability and Maintainability Model (RAM) developed by the University of Dayton for the NASA Langley Research Center (LaRC). The RAM model predicts reliability and maintainability (R&M) parameters for conceptual space vehicles using parametric relationships between vehicle design and performance characteristics and subsystem mean time between maintenance actions (MTBM) and manhours per maintenance action (MH/MA). These parametric relationships were developed using aircraft R&M data from over thirty different military aircraft of all types. This report describes the general methodology used within the model, the execution and computational sequence, the input screens and data, the output displays and reports, and study analyses and procedures. A source listing is provided.
Luo, Yangjun; Wu, Xiaoxiang; Zhou, Mingdong
2015-01-01
Both structural sizes and dimensional tolerances strongly influence the manufacturing cost and the functional performance of a practical product. This paper presents an optimization method to simultaneously find the optimal combination of structural sizes and dimensional tolerances. Based on a probability-interval mixed reliability model, the imprecision of design parameters is modeled as interval uncertainties fluctuating within allowable tolerance bounds. The optimization model is defined as minimizing the total manufacturing cost under mixed reliability index constraints, which are further transformed into their equivalent formulations by using the performance measure approach. The optimization problem is then solved with sequential approximate programming. Meanwhile, a numerically stable algorithm based on the trust region method is proposed to efficiently update the target performance...
Reliability and efficiency of generalized rumor spreading model on complex social networks
Naimi, Yaghoob
2013-01-01
We introduce a generalized rumor spreading model and investigate some of its properties on different complex social networks. Unlike previous rumor models, in which both the spreader-spreader ($SS$) and spreader-stifler ($SR$) interactions have the same rate $\alpha$, we define $\alpha^{(1)}$ and $\alpha^{(2)}$ for the $SS$ and $SR$ interactions, respectively. The effect of varying $\alpha^{(1)}$ and $\alpha^{(2)}$ on the final density of stiflers is investigated. Furthermore, the influence of the topological structure of the network on rumor spreading is studied by analyzing the behavior of several global parameters such as reliability and efficiency. Our results show that while networks with homogeneous connectivity patterns reach a higher reliability, scale-free topologies need less time to reach a steady state with respect to the rumor.
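The two-rate generalization above can be sketched with a mean-field (homogeneously mixed) ignorant/spreader/stifler model, which is a simplification of the network simulations in the paper: ignorants become spreaders on contact with a spreader at rate lam, while spreaders become stiflers via spreader-spreader contacts at rate a1 and spreader-stifler contacts at rate a2. The parameter values and the plain Euler integration are illustrative assumptions.

```python
def rumor_final_stiflers(lam=1.0, a1=0.5, a2=0.5, k=6.0, dt=1e-3, steps=200000):
    """Mean-field densities of ignorants (i), spreaders (s), stiflers (r) on a
    network of mean degree k, with distinct SS (a1) and SR (a2) stifling rates.
    Returns the final density of stiflers."""
    i, s, r = 0.999, 0.001, 0.0
    for _ in range(steps):
        di = -lam * k * i * s                       # ignorants hear the rumor
        ds = lam * k * i * s - a1 * k * s * s - a2 * k * s * r
        dr = a1 * k * s * s + a2 * k * s * r        # spreaders turn stifler
        i, s, r = i + di * dt, s + ds * dt, r + dr * dt
    return r

print(rumor_final_stiflers())
```

Raising the spreading rate lam relative to the stifling rates increases the final stifler density, i.e., the fraction of the network the rumor ultimately reaches.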
Reliability and Efficiency of Generalized Rumor Spreading Model on Complex Social Networks
Yaghoob Naimi; Mohammad Naimi
2013-01-01
We introduce a generalized rumor spreading model and investigate some of its properties on different complex social networks. Unlike previous rumor models, in which both the spreader-spreader (SS) and spreader-stifler (SR) interactions have the same rate α, we define α(1) and α(2) for the SS and SR interactions, respectively. The effect of varying α(1) and α(2) on the final density of stiflers is investigated. Furthermore, the influence of the topological structure of the network on rumor spreading is studied by analyzing the behavior of several global parameters such as reliability and efficiency. Our results show that while networks with homogeneous connectivity patterns reach a higher reliability, scale-free topologies need less time to reach a steady state with respect to the rumor.
Specification and Design of a Fault Recovery Model for the Reliable Multicast Protocol
Montgomery, Todd; Callahan, John R.; Whetten, Brian
1996-01-01
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages to other group members in a distributed environment, over an underlying IP Multicast medium, even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
A competing risk model for the reliability of cylinder liners in marine Diesel engines
Bocchetti, D. [Grimaldi Group, Naples (Italy); Giorgio, M. [Department of Aerospace and Mechanical Engineering, Second University of Naples, Aversa (Italy); Guida, M. [Department of Information Engineering and Electrical Engineering, University of Salerno, Fisciano (Italy); Pulcini, G. [Istituto Motori, National Research Council-CNR, Naples (Italy)], E-mail: g.pulcini@im.cnr.it
2009-08-15
In this paper, a competing risk model is proposed to describe the reliability of the cylinder liners of a marine Diesel engine. Cylinder liners present two dominant failure modes: wear degradation and thermal cracking. The wear process is described through a stochastic process, whereas the failure time due to thermal cracking is described by the Weibull distribution. The proposed model allows goodness-of-fit tests and parameter estimation to be performed on the basis of both wear and failure data. Moreover, it enables reliability estimates of the state of the liners to be obtained, and the hierarchy of the failure mechanisms to be determined, for any given age and wear level of the liner. The model has been applied to a real data set: 33 cylinder liners of Sulzer RTA 58 engines, which equip twin ships of the Grimaldi Group. Estimates of the liner reliability and of other quantities of interest under the competing risk model are obtained, as well as the conditional failure probability and mean residual lifetime given the survival age and the accumulated wear. Furthermore, the model has been used to estimate the probability that a liner fails due to one of the failure modes when both modes act.
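Treating the two failure modes above as independent competing risks, the liner survives to time t only if the thermal-cracking time exceeds t and the accumulated wear is still below its limit. The sketch below uses a Weibull survivor for cracking, as in the paper, but approximates the stochastic wear process with a Gaussian whose mean grows linearly in time; that approximation and every parameter value are illustrative assumptions, not the authors' fitted model.

```python
import math

def weibull_surv(t, eta, beta):
    """Survival probability of the thermal-cracking failure time."""
    return math.exp(-((t / eta) ** beta))

def wear_surv(t, rate, sd_rate, limit):
    """P(accumulated wear at time t < limit), with wear approximated as
    Gaussian: mean rate*t, std dev sd_rate*sqrt(t) (illustrative stand-in
    for the paper's stochastic wear process)."""
    mu, sd = rate * t, sd_rate * math.sqrt(t)
    return 0.5 * (1.0 + math.erf((limit - mu) / (sd * math.sqrt(2.0))))

def liner_reliability(t, eta, beta, rate, sd_rate, limit):
    """Independent competing risks: survive both cracking and wear."""
    return weibull_surv(t, eta, beta) * wear_surv(t, rate, sd_rate, limit)

# Assumed numbers: t in operating hours, wear in mm, 4 mm wear limit.
print(liner_reliability(30000.0, eta=60000.0, beta=1.5,
                        rate=1e-4, sd_rate=5e-3, limit=4.0))
```

The same decomposition gives the mode hierarchy: at a given age, comparing the two survivor factors shows which mechanism dominates the failure probability.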
Bayesian Reliability Modeling and Assessment Solution for NC Machine Tools under Small-sample Data
YANG Zhaojun; KAN Yingnan; CHEN Fei; XU Binbin; CHEN Chuanhai; YANG Chuangui
2015-01-01
Although Markov chain Monte Carlo (MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control (NC) machine tools is proposed on the basis of Bayes theory. An expert-judgment process for fusing multi-source prior information is developed to obtain the Weibull parameters' prior distributions and reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to the two-parameter Weibull distribution to derive formulas for the parameters' posterior distributions and overcome the computational difficulty of high-dimensional integration. The method is then applied to real data from a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures (MTBF). The relative error of the proposed method is 5.8020×10⁻⁴ compared with the MTBF obtained by the MCMC algorithm. This result indicates that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is easy, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.
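The grid approximation idea above can be sketched in a few lines: evaluate the Weibull likelihood on a discrete (beta, eta) grid, weight by a prior, normalize, and read off the posterior-mean MTBF as eta * Gamma(1 + 1/beta). For simplicity this sketch uses a flat prior, whereas the paper fuses expert-judgment priors; the failure times and grid ranges are invented for illustration.

```python
import math

def weibull_loglik(data, beta, eta):
    """Log-likelihood of complete (uncensored) failure times under
    Weibull(shape=beta, scale=eta)."""
    return sum(
        math.log(beta / eta) + (beta - 1.0) * math.log(t / eta) - (t / eta) ** beta
        for t in data
    )

def grid_posterior_mtbf(data, betas, etas):
    """Grid approximation of the posterior (flat prior assumed here) and
    the resulting posterior-mean MTBF."""
    weights, mtbfs = [], []
    for b in betas:
        for e in etas:
            weights.append(math.exp(weibull_loglik(data, b, e)))
            mtbfs.append(e * math.gamma(1.0 + 1.0 / b))  # MTBF = eta*Gamma(1+1/beta)
    z = sum(weights)  # normalizing constant replaces the integral
    return sum(w * m for w, m in zip(weights, mtbfs)) / z

failures = [120.0, 350.0, 410.0, 560.0, 700.0, 830.0]   # hypothetical hours
betas = [0.5 + 0.05 * i for i in range(61)]             # shape grid 0.5..3.5
etas = [100.0 + 20.0 * j for j in range(91)]            # scale grid 100..1900
print(grid_posterior_mtbf(failures, betas, etas))
```

The appeal for small samples is visible here: the whole posterior is a finite weighted sum, with no high-dimensional integration or MCMC convergence diagnostics required.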
王鹭; 张利; 王学芝
2015-01-01
As the central component of rotating machinery, bearings make performance reliability assessment and remaining useful lifetime prediction crucially important in condition-based maintenance, to reduce maintenance cost and improve reliability. A prognostic algorithm to assess the reliability and forecast the remaining useful lifetime (RUL) of bearings was proposed, consisting of three phases. Online vibration and temperature signals of bearings in the normal state were measured during the manufacturing process, and the most useful time-dependent features of the vibration signals were extracted based on correlation analysis (feature selection step). Time series analysis based on a neural network, used as an identification model, predicted the features of the bearing vibration signals at any horizon (feature prediction step). Furthermore, a degradation factor was defined from the features. A proportional hazards model was generated to estimate the survival function and forecast the RUL of the bearing (RUL prediction step). The positive results demonstrate the plausibility and effectiveness of the proposed approach, which can facilitate bearing reliability estimation and RUL prediction.
Wang Shaoping; Cui Xiaoyu; Shi Jian; Mileta M. Tomovic; Jiao Zongxia
2016-01-01
The actuation system is a vital system in an aircraft, providing the force necessary to move flight control surfaces. The system has a significant influence on the overall aircraft performance and its safety. In order to further increase already high reliability and safety, Airbus has implemented a dissimilar redundancy actuation system (DRAS) in its aircraft. The DRAS consists of a hydraulic actuation system (HAS) and an electro-hydrostatic actuation system (EHAS), in which the HAS utilizes a hydraulic source (HS) to move the control surface and the EHAS utilizes an electrical supply (ES) to provide the motion force. This paper focuses on the performance degradation processes and fault monitoring strategies of the DRAS, establishes its reliability model based on generalized stochastic Petri nets (GSPN), and carries out a reliability assessment considering the fault monitoring coverage rate and the false alarm rate. The results indicate that the proposed reliability model of the DRAS, considering the fault monitoring, can express its fault logical relations and redundancy degradation process and identify potential safety hazards.
Hao Zhang
2016-01-01
Under an increasingly uncertain economic environment, research on the reliability of urban distribution systems has great practical significance for the integration of logistics and supply chain resources. This paper summarizes the factors that affect a city logistics distribution system. Starting from the factors that influence the reliability of a city distribution system, a reliability influence model for the city distribution system is constructed based on Bayesian networks. The complex problem is simplified by using sub-Bayesian networks, and an example is analyzed. In the calculation process, we combined the traditional Bayesian algorithm with the Expectation Maximization (EM) algorithm, which gave the Bayesian model a more accurate foundation. The results show that the Bayesian network can accurately reflect the dynamic relationships among the factors affecting the reliability of an urban distribution system. Moreover, by changing the prior probability of a cause node, the degree of correlation between the variables that affect successful distribution can be calculated. The results have practical significance for improving the quality and level of distribution and the efficiency of enterprises.
Ahmad Alferidi
2017-02-01
The contribution of solar power in electric power systems has been increasing rapidly due to its environmentally friendly nature. Photovoltaic (PV) systems contain solar cell panels, power electronic converters, high-power switching and often transformers. These components collectively play an important role in shaping the reliability of PV systems. Moreover, the power output of PV systems is variable, so it cannot be controlled as easily as conventional generation, due to the unpredictable nature of weather conditions. Therefore, solar power has a different influence on generating system reliability compared to conventional power sources. Recently, different PV system designs have been constructed to maximize the output power of PV systems. These designs are commonly adopted based on the scale of a PV system. Large-scale grid-connected PV systems are generally connected in a centralized or a string structure. Central and string PV schemes differ in how the inverter is connected to the PV arrays. Micro-inverter systems are recognized as a third PV system topology. It is therefore important to evaluate the reliability contribution of PV systems under these topologies. This work utilizes a probabilistic technique to develop a power output model for a PV generation system. A reliability model is then developed for a PV-integrated power system in order to assess the reliability and energy contribution of the solar system to meeting overall system demand. The developed model is applied to a small isolated power unit to evaluate system adequacy and the capacity level of a PV system considering the three topologies.
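Generating-system adequacy assessments of the kind described above are commonly built on a capacity outage probability table (COPT): unit outage distributions are convolved together, and the loss-of-load probability (LOLP) is the probability that available capacity falls below demand. The sketch below follows that standard approach; the unit sizes, forced outage rates (including a deliberately high effective rate for the PV block, standing in for resource variability), and the demand level are all assumptions, not the paper's data.

```python
def build_copt(units):
    """Capacity outage probability table. units = [(capacity_mw, for_), ...]
    where for_ is the unit's forced outage rate. Returns {outage_mw: prob}."""
    table = {0.0: 1.0}
    for cap, forced in units:
        new = {}
        for out, p in table.items():
            new[out] = new.get(out, 0.0) + p * (1.0 - forced)      # unit in service
            new[out + cap] = new.get(out + cap, 0.0) + p * forced  # unit on outage
        table = new
    return table

def lolp(units, demand_mw):
    """Loss-of-load probability: available capacity below demand."""
    total = sum(c for c, _ in units)
    return sum(p for out, p in build_copt(units).items() if total - out < demand_mw)

# Hypothetical small isolated system: two 20 MW conventional units plus a
# 10 MW PV block with a high effective outage rate (resource variability).
units = [(20.0, 0.05), (20.0, 0.05), (10.0, 0.60)]
print(lolp(units, demand_mw=35.0))  # -> 0.0975
```

Swapping in alternative PV topologies would change the PV block's capacity states and effective outage rate, which is how the three designs would be compared under this model.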
Whittaker, Tiffany A.; Khojasteh, Jam
2017-01-01
Latent growth modeling (LGM) is a popular and flexible technique that may be used when data are collected across several different measurement occasions. Modeling the appropriate growth trajectory has important implications with respect to the accurate interpretation of parameter estimates of interest in a latent growth model that may impact…
Nettleship, Ian (University of Pittsburgh, Pittsburgh, PA); Hinklin, Thomas; Holcomb, David Joseph; Tandon, Rajan; Arguello, Jose Guadalupe, Jr. (,; .); Dempsey, James Franklin; Ewsuk, Kevin Gregory; Neilsen, Michael K.; Lanagan, Michael (Pennsylvania State University, University Park, PA)
2007-07-01
An interdisciplinary team of scientists and engineers having broad expertise in materials processing and properties, materials characterization, and computational mechanics was assembled to develop science-based modeling/simulation technology to design and reproducibly manufacture high-performance, reliable, complex microelectronics and microsystems. The team's efforts focused on defining and developing a science-based infrastructure to enable predictive compaction, sintering, stress, and thermomechanical modeling in "real systems", including: (1) developing techniques to determine the materials properties and constitutive behavior required for modeling; (2) developing new and improved/updated models and modeling capabilities; (3) ensuring that models are representative of the physical phenomena being simulated; and (4) assessing existing modeling capabilities to identify the advances necessary to facilitate the practical application of Sandia's predictive modeling technology.
Using multi-model averaging to improve the reliability of catchment scale nitrogen predictions
Exbrayat, J.-F.; Viney, N. R.; Frede, H.-G.; Breuer, L.
2013-01-01
Hydro-biogeochemical models are used to foresee the impact of mitigation measures on water quality. Usually, scenario-based studies rely on single model applications, in spite of the widely acknowledged advantage of ensemble approaches in coping with structural model uncertainty. As an attempt to demonstrate the reliability of such multi-model efforts in the hydro-biogeochemical context, this methodological contribution proposes an adaptation of the reliability ensemble averaging (REA) philosophy to nitrogen loss predictions. A total of 4 models are used to predict the total nitrogen (TN) losses from the well-monitored Ellen Brook catchment in Western Australia. Simulations include re-predictions of current conditions and a set of straightforward management changes targeting fertilisation scenarios. Results show that, in spite of good calibration metrics, one of the models provides a very different response to management changes. This behaviour leads the simple average of the ensemble members to also predict reductions in TN export that are not in agreement with the other models. However, considering the convergence of model predictions, the more sophisticated REA approach assigns more weight to previously less well-calibrated models that are more in agreement with each other. This method also avoids having to disqualify any of the ensemble members.
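The convergence-weighting idea behind REA can be sketched as a fixed-point iteration: starting from the simple ensemble average, each member is weighted by min(1, eps/|distance to the weighted mean|) and the mean is recomputed, so an outlying member is progressively down-weighted instead of being discarded. This is only the model-convergence half of the full REA criterion (which also weights by performance against observations); the four TN predictions and the natural-variability scale eps_nat below are invented for illustration.

```python
def rea_average(predictions, eps_nat=1.0, iters=25):
    """Reliability ensemble averaging, convergence criterion only:
    weight R_i = min(1, eps_nat / |p_i - mean|), iterated from the
    simple average to a fixed point."""
    mean = sum(predictions) / len(predictions)
    weights = [1.0] * len(predictions)
    for _ in range(iters):
        weights = [1.0 if p == mean else min(1.0, eps_nat / abs(p - mean))
                   for p in predictions]
        mean = sum(w * p for w, p in zip(weights, predictions)) / sum(weights)
    total = sum(weights)
    return mean, [w / total for w in weights]

# Four hypothetical TN-export predictions (t/yr); the last member is an
# outlier that a simple average would let dominate the ensemble response.
preds = [12.1, 11.8, 12.4, 25.0]
rea_mean, norm_weights = rea_average(preds)
print(rea_mean, norm_weights)
```

The outlier keeps a small nonzero weight, matching the paper's point that REA avoids disqualifying any ensemble member outright.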
Nakagawa, Toshio
2013-01-01
In honor of the work of Professor Shunji Osaki, Stochastic Reliability and Maintenance Modeling provides a comprehensive study of the legacy of, and ongoing research in, stochastic reliability and maintenance modeling. Covering associated application areas such as dependable computing, performance evaluation, software engineering, and communication engineering, distinguished researchers review and build on the contributions made over the last four decades by Professor Shunji Osaki. Fundamental yet significant research results are presented and discussed clearly, alongside new ideas and topics in stochastic reliability and maintenance modeling, to inspire future research. Across 15 chapters, readers gain the knowledge and understanding to apply reliability and maintenance theory to computer and communication systems. Stochastic Reliability and Maintenance Modeling is ideal for graduate students and researchers in reliability engineering, and workers, managers and engineers engaged in computer, maintenance and management wo...
Liu, Sheng
2011-01-01
Although there is increasing need for modeling and simulation in the IC package design phase, most assembly processes and various reliability tests are still based on the time-consuming "test and try out" method to obtain the best solution. Modeling and simulation can easily enable virtual Design of Experiments (DoE) to achieve the optimal solution. This has greatly reduced cost and production time, especially for new product development. Using modeling and simulation will become increasingly necessary for future advances in 3D package development. In this book, Liu and Liu allow people
Ebeling, Charles E.
1996-01-01
This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. This Manual updates and supersedes the 1995 RAM User and Maintenance Manual. Changes and enhancements from the 1995 version of the model are primarily a result of the addition of more recent aircraft and shuttle R&M data.
Development of thermal hydraulic models for the reliable regulatory auditing code
Chung, B. D.; Song, C. H.; Lee, Y. J.; Kwon, T. S. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
2003-04-15
The objective of this project is to develop thermal hydraulic models for use in improving the reliability of the regulatory auditing codes. The current year falls under the first step of the 3-year project, and the main research was focused on identifying candidate thermal hydraulic models for improvement and developing prototypical models. During the current year, the verification calculations submitted for the APR 1400 design certification were reviewed, the experimental data from the MIDAS DVI experiment facility at KAERI were analyzed and evaluated, candidate thermal hydraulic models for improvement were identified, prototypical models for the improved thermal hydraulic models were developed, items for experiments in connection with the model development were identified, and a preliminary design of the experiment was carried out.
Helicopter Reliability Growth Evaluation
1976-04-01
...The Fibonacci sequence is the sequence whereby each number is the sum of the previous two numbers in the sequence... For the monitoring programs, find the upper bound on A(T0) for minimum allowable MTBFs of 6 and 10 hours. From the Fibonacci sequence, the... improvement, since their size, nature, and complexity govern each individual contribution to the total failure rate. Table 16 compares each subsystem's percentage of
2014-04-01
In a reliability-based design optimization (RBDO) process, surrogate models are frequently used to reduce the number of simulations, because analysis of a... the RBDO problem and thus mitigate the curse of dimensionality. Therefore, it is desirable to develop an efficient and effective variable screening method for reducing the dimension of the RBDO problem. In this paper, requirements of the variable screening method for deterministic design
A model for reliability analysis and calculation applied in an example from chemical industry
Pejović Branko B.
2010-01-01
The subject of this paper is reliability design in polymerization processes that occur in reactors in the chemical industry. The designed model is used to determine the characteristics and indicators of reliability, which enables the determination of the basic factors that result in poor process performance. This would reduce the anticipated losses through the ability to control them, as well as enable improvement of production quality, which is the major goal of the paper. The reliability analysis and calculation use a deductive method based on a fault tree analysis scheme for the system, built on inductive conclusions. It involves the use of standard logical symbols and the rules of Boolean algebra and mathematical logic. The paper finally gives the results of the work in the form of a quantitative and qualitative reliability analysis of the observed process, which served to obtain complete information on the probability of the top event in the process, as well as objective decision making and alternative solutions.
Arasinah Kamis
2013-12-01
The Clothing Fashion Design (CFaD) assessment instrument was used to measure the level of competence among instructors in Skills Training Institutes (STI). This study was conducted to select items that are valid, fair, and of high quality. The CFaD instrument consists of 97 Likert-scale items in six constructs: designing, pattern drafting, computer, sewing, creative, and trade/entrepreneurship. The instrument was administered in a first stage of testing to 95 STI instructors who teach in the field of fashion and clothing. The Rasch measurement model was used to obtain the reliability, validity, person-item relevance, and unidimensionality of the items; Winsteps software version 3.72.3 was used to analyze the data. The findings showed that the items in the six skill-competency constructs have high reliability, from 0.63 to 0.96 for the Likert-scale items, while the reliability of the respondents was estimated at between 0.93 and 0.98. The analysis also indicates that 11 of the 97 items were misfit, while 32 items needed repair before deciding whether to drop some of them due to lack of unidimensionality and differing levels of difficulty. Decisions to remove or repair items were made so that the instrument is fairer and more equitable to all respondents, and more reliable.
Growth units model of anion coordination-polyhedra and its application to crystal growth
ZHANG Xuehua; LUO Haosu; ZHONG Weizhuo
2004-01-01
The growth units model of anion coordination polyhedra (the ACP model) emphasizes the influence of the intrinsic structure of a crystal on crystal growth and the importance of the external conditions under which crystals grow. The ACP model is used to analyze several problems in crystal growth, such as the formation of dendrites in the crystal structure, the growth habit of polar crystals, and the formation of allomerism and polymorphism.
Modeling surface growth of Escherichia coli on agar plates.
Fujikawa, Hiroshi; Morozumi, Satoshi
2005-12-01
Surface growth of Escherichia coli cells on a membrane filter placed on a nutrient agar plate under various conditions was studied with a mathematical model. The surface growth of bacterial cells showed a sigmoidal curve with time on a semilogarithmic plot. To describe it, a new logistic model that we presented earlier (H. Fujikawa et al., Food Microbiol. 21:501-509, 2004) was modified. Growth curves at various constant temperatures (10 to 34 degrees C) were successfully described with the modified model (model III). Model III gave better predictions of the rate constant of growth and the lag period than a modified Gompertz model and the Baranyi model. Using the parameter values of model III at the constant temperatures, surface growth at various temperatures was successfully predicted. Surface growth curves at various initial cell numbers were also sigmoidal and converged to the same maximum cell numbers at the stationary phase. Surface growth curves at various nutrient levels were also sigmoidal. The maximum cell number and the rate of growth were lower as the nutrient level decreased. The surface growth curve was the same as that in a liquid, except for the large curvature at the deceleration period. These curves were also well described with model III. The pattern of increase in the ATP content of cells grown on a surface was sigmoidal, similar to that for cell growth. We discovered several characteristics of the surface growth of bacterial cells under various growth conditions and examined the applicability of our model to describe these growth curves.
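The modified logistic family referred to above ("model III") adds a suppression term that holds growth near zero while the population is close to the inoculum level, producing the lag phase, and a standard logistic term for the stationary phase. The sketch below integrates one such Fujikawa-style form with a simple Euler scheme; the exact functional form and all parameter values (rate constant, maximum density, suppression exponent) are illustrative assumptions, not fitted values from the paper.

```python
import math

def grow(n0, r, n_max, m, hours, dt=0.01):
    """Euler integration of a Fujikawa-style logistic model with low-count
    suppression: dN/dt = r*N*(1 - N/n_max)*(1 - (n_min/N)^m), where n_min
    sits just below the inoculum N0. Returns log10(N) at hourly intervals."""
    n = n0
    n_min = 0.999 * n0          # suppression floor just under the inoculum
    per_hour = round(1.0 / dt)  # Euler steps per simulated hour
    curve = []
    for k in range(int(hours / dt) + 1):
        if k % per_hour == 0:
            curve.append(math.log10(n))
        dn = r * n * (1.0 - n / n_max) * (1.0 - (n_min / n) ** m)
        n += dn * dt

    return curve

# Illustrative run: 10^3 CFU inoculum, r = 2.0 per hour, 10^9 CFU ceiling.
curve = grow(n0=1e3, r=2.0, n_max=1e9, m=10.0, hours=24)
print(curve[0], curve[-1])
```

On a semilogarithmic plot the result is the sigmoid described in the abstract: a lag near log10(N) = 3, exponential growth, and saturation near log10(N) = 9.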
An Assessment of the VHTR Safety Distance Using the Reliability Physics Model
Lee, Joeun; Kim, Jintae; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)
2015-10-15
In Korea, plans to produce hydrogen using high-temperature heat from nuclear power are in progress. Producing hydrogen from nuclear plants requires a supply temperature above 800 °C; therefore, the Very High Temperature Reactor (VHTR), which can provide about 950 °C, is suitable. Under high-temperature, corrosive conditions in which hydrogen may easily be released, a hydrogen production facility coupled to a VHTR carries a danger of explosion. Moreover, an explosion would adversely affect not only the facility itself but also the VHTR, creating unsafe situations that cause serious damage. From a thermal-hydraulic point of view, however, a long separation distance results in low efficiency. Thus, in this study, a methodology for assessing the safety distance between the hydrogen production facilities and the VHTR is developed with a reliability physics model. Based on the standard safety criterion of 1 × 10⁻⁶, the safety distance between the hydrogen production facilities and the VHTR is calculated with the reliability physics model to be 60-100 m. In the future, a more detailed assessment of the characteristics of the VHTR, of its capacity to resist the pressure from an outside hydrogen explosion, and of the overpressure for a large detonation volume is expected to identify a more precise safety distance using this reliability physics model.
Dimitrov, Nikolay Krasimiroy; Friis-Hansen, Peter; Berggreen, Christian
2009-01-01
This paper presents a reliability analysis of a composite blade profile. The so-called model correction factor technique is applied as an effective alternative to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which...
A competition model for wormhole growth
Cabeza Diaz de Cerio, Yoar; Carrera, Jesus; Hidalgo, Juan J.
2016-04-01
Flow preferential pathways generated by dissolution are commonly known as wormholes. Wormhole generation and evolution are topics of interest not only for karst aquifer studies but also for fields such as CO2 storage and the oil industry, among others. The objective of this work is to show that given an initial perturbation, the development of the dissolution pattern can be considered deterministic. This means that the evolution of the effective hydraulic conductivity can be predicted. To this end we use a wormhole growth model in which wormholes compete for the available water. In the competition model the wormholes grow proportionally to the flow rate through them. The wormhole flow rate is a function of the wormholes' lengths and the distances between them. We derive empirical expressions for the flow rates from steady-state flow synthetic models with different geometries. Finally, we perform a series of simulations using this competition model, applying random initial perturbations and different numbers of wormholes for each set of simulations, and we study the evolution of the dissolution pattern. We find that the resulting wormhole patterns are in good agreement with others generated with much more complex models.
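The competition mechanism described, wormholes growing in proportion to the flow they capture, can be sketched as a toy simulation. The quadratic flow-sharing rule and all parameter values below are illustrative assumptions standing in for the paper's geometry-derived flow expressions:

```python
import random

def simulate_wormholes(n=10, steps=500, p=2.0, rate=0.01, seed=1):
    """Toy wormhole competition: each wormhole captures a flow share
    proportional to length**p (p > 1 favours longer wormholes, mimicking
    flow focusing) and grows in proportion to that flow."""
    random.seed(seed)
    # Small random initial perturbation of otherwise identical wormholes.
    lengths = [1.0 + 0.01 * random.random() for _ in range(n)]
    for _ in range(steps):
        weights = [length ** p for length in lengths]
        total = sum(weights)
        flows = [w / total for w in weights]            # normalised flow shares
        lengths = [length + rate * n * q                # growth proportional to flow
                   for length, q in zip(lengths, flows)]
    return lengths

final = simulate_wormholes()
```

With p > 1 the initially longest wormhole captures an ever larger flow share, so the emerging pattern is fixed by the initial perturbation, consistent with the deterministic evolution the abstract describes.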
Rius-Vilarrasa, E; Strandberg, E; Fikse, W F
2012-01-01
Using a combined multi-breed reference population, this study explored the influence of model specification and the effect of including a polygenic effect on the reliability of genomic breeding values (DGV and GEBV). The combined reference population consisted of 2986 Swedish Red Breed (SRB...... effects. The influence of the inclusion of a polygenic effect on the reliability of DGV varied across traits and model specifications. The average correlation between DGV and the Mendelian sampling term, across traits, was highest (R = 0.25) for the GBLUP model and decreased with an increasing proportion...... of markers with large effects. Reliabilities increased when DGV and parent average information were combined in an index. The GBLUP model with the largest gain across traits in the reliability of the index achieved the highest DGV mean reliability. However, the polygenic models were shown to be less biased...
Kezirian, Michael T.; Phoenix, S. Leigh; Eldridge, Jeffrey I.
2009-01-01
Composite Overwrapped Pressure Vessels (COPVs) are frequently used for storing pressurized gases aboard spacecraft and aircraft when weight saving is desirable compared to all-metal versions. Failure mechanisms in fibrous COPVs and variability in lifetime can be very different from their metallic counterparts; in the former, catastrophic stress-rupture can occur with virtually no warning, whereas in the latter, a leak-before-burst design philosophy can be implemented. Qualification and certification typically require only one burst test on a production sample (possibly after several pressure cycles), and the vessel need only meet a design burst strength (the maximum operating pressure divided by a knockdown factor). Typically there is no requirement to assess variability in burst strength or lifetime, much less determine production and materials processing parameters important to control of such variability. Characterizing such variability and its source is crucial to models for calculating required reliability over a given lifetime (e.g. R = 0.9999 for 15 years). In this paper we present a case study of how lack of control of certain process parameters in COPV manufacturing can result in variations among vessels and between production runs that can greatly increase uncertainty and reduce reliability. The vessels considered are 40-inch (29,500 in3) spherical COPVs with a 0.74 in. thick Kevlar49/epoxy overwrap and a titanium liner, of which 34 were originally produced. Two burst tests were eventually performed that unexpectedly differed by almost 5%, and were 10% lower than anticipated from burst tests on 26-inch sister vessels similar in every detail. A major observation from measurements made during proof testing (autofrettage) of the 40-inch vessels was that permanent volume growth from liner yielding varied by a factor of more than two (150 in3 to 360 in3), which suggests large differences in the residual
Mathematical modeling and reliability analysis of a 3D Li-ion battery
RICHARD HONG PENG LIANG
2014-02-01
Full Text Available The three-dimensional (3D Li-ion battery presents an effective solution to issues affecting its two-dimensional counterparts, as it is able to attain high energy capacities for the same areal footprint without sacrificing power density. A 3D battery has key structural features extending in and fully utilizing 3D space, allowing it to achieve greater reliability and longevity. This study applies an electrochemical-thermal coupled model to a checkerboard array of alternating positive and negative electrodes in a 3D architecture with either square or circular electrodes. The mathematical model comprises the transient conservation of charge, species, and energy together with electroneutrality, constitutive relations and relevant initial and boundary conditions. A reliability analysis carried out to simulate malfunctioning of either a positive or negative electrode reveals that although there are deviations in electrochemical and thermal behavior for electrodes adjacent to the malfunctioning electrode as compared to that in a fully-functioning array, there is little effect on electrodes further away, demonstrating the redundancy that a 3D electrode array provides. The results demonstrate that implementation of 3D batteries allow it to reliably and safely deliver power even if a component malfunctions, a strong advantage over conventional 2D batteries.
Modelling competitive coadsorption in electrochemical growth processes
Reis, Fabio D. A. Aarao; Pauporte, Thierry; Lincot, Daniel
2006-01-01
We present models of electrodeposition of ZnO films with organic additives, with a focus on the growth of hybrid films with eosin Y. First we propose a rate equation model which assumes that the additives form branches with an exposed part above the ZnO deposit, growing at a larger rate than the pure film, and that the rate of production of ZnO near those branches is proportional to the height exposed to the solution. This accounts for the production of OH- ions near the branches and the reactions with Zn++ ions. The steady-state solution shows both species growing at the rate of the branches, and qualitatively explains their catalytic effect. Subsequently, we propose a more realistic statistical model for the formation of the hybrid deposits from Zn++ ions, a hydroxide precursor and eosin in solution. Simple probabilistic rules are used for the reactions of eosin and oxygen, taking into account diffusion from solution along the same lines as diffusion-limited aggregation models. The catalytic effect is repre...
Flower Power: Sunflowers as a Model for Logistic Growth
Fernandez, Eileen; Geist, Kristi A.
2011-01-01
Logistic growth displays an interesting pattern: It starts fast, exhibiting the rapid growth characteristic of exponential models. As time passes, it slows in response to constraints such as limited resources or reallocation of energy. The growth continues to slow until it reaches a limit, called capacity. When the growth describes a population,…
Modeling and Optimization for Epitaxial Growth: Transport and Growth Studies
1999-01-01
Epsilon-1 microprocessor and controlled automatically in situ. For example, PID controllers and MFCs regulate the thermocouple temperatures and inlet flow...thermocouples are regulated by PID controllers. The set-up of the reactor apparatus may partially explain the smaller variation in actual growth rates. Recall
Reliable and efficient solution of genome-scale models of Metabolism and macromolecular Expression
Ma, Ding; Yang, Laurence; Fleming, Ronan M. T.; Thiele, Ines; Palsson, Bernhard O.; Saunders, Michael A.
2017-01-01
Constraint-Based Reconstruction and Analysis (COBRA) is currently the only methodology that permits integrated modeling of Metabolism and macromolecular Expression (ME) at genome-scale. Linear optimization computes steady-state flux solutions to ME models, but flux values are spread over many orders of magnitude. Data values also have greatly varying magnitudes. Standard double-precision solvers may return inaccurate solutions or report that no solution exists. Exact simplex solvers based on rational arithmetic require a near-optimal warm start to be practical on large problems (current ME models have 70,000 constraints and variables and will grow larger). We have developed a quadruple-precision version of our linear and nonlinear optimizer MINOS, and a solution procedure (DQQ) involving Double and Quad MINOS that achieves reliability and efficiency for ME models and other challenging problems tested here. DQQ will enable extensive use of large linear and nonlinear models in systems biology and other applications involving multiscale data.
Development of thermal hydraulic models for the reliable regulatory auditing code
Chung, B. D.; Song, C. H.; Lee, Y. J.; Kwon, T. S.; Lee, S. W. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]
2004-02-15
The objective of this project is to develop thermal hydraulic models for use in improving the reliability of the regulatory auditing codes. The current year falls under the second step of the 3-year project, and the main research was focused on the development of the downcomer boiling model. During the current year, the bubble stream model of the downcomer has been developed and installed in the auditing code. A model sensitivity analysis has been performed for the APR1400 LBLOCA scenario using the modified code. A preliminary calculation has been performed for the experimental test facility using the FLUENT and MARS codes. The facility for the air bubble experiment has been installed. The thermal hydraulic phenomena for the VHTR and the supercritical reactor have been identified for future application and model development.
Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)
2001-01-01
This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in
Modeling Parameters of Reliability of Technological Processes of Hydrocarbon Pipeline Transportation
Shalay Viktor
2016-01-01
On the basis of methods of system analysis and parametric reliability theory, mathematical modeling of the operation of oil and gas equipment was conducted for reliability monitoring according to dispatching data. To check the goodness of fit of the empirical distributions, an algorithm and mathematical methods of analysis were worked out for on-line use under changing operating conditions. An analysis of the physical cause-and-effect mechanism between the key factors and the changing parameters of the technical systems of oil and gas facilities is made, and the basic types of distribution of the technical parameters are defined. Evaluation of the adequacy of the distribution type for the analyzed parameters is provided by using the Kolmogorov criterion, as the most universal, accurate and adequate test for the distributions of continuous processes in complex technical systems. Calculation methods are provided for supervision by independent bodies performing risk assessment and facility safety evaluation.
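The Kolmogorov criterion mentioned in the abstract can be illustrated with a small self-contained sketch; the exponential life distribution and the failure rate below are hypothetical stand-ins for real dispatching data:

```python
import math
import random

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov statistic D_n = sup_x |F_n(x) - F(x)|."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # Empirical CDF jumps from i/n to (i+1)/n at each sorted point.
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

random.seed(0)
rate = 0.5  # hypothetical failure rate of a pipeline component
sample = [random.expovariate(rate) for _ in range(200)]
d = ks_statistic(sample, lambda t: 1.0 - math.exp(-rate * t))
# Kolmogorov criterion at roughly the 5% level: compare D_n with 1.36/sqrt(n).
fits = d < 1.36 / math.sqrt(len(sample))
```

A poorly matching hypothesis (e.g. a wrong failure rate) yields a much larger statistic and is rejected by the same threshold.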
Modeling and Implementation of a Reliable Ternary Arithmetic and Logic Unit Design Using VHDL
Meruva Kumar Raja
2014-06-01
Multivalued logic is a reliable method for defining, analyzing, testing and implementing basic combinational circuitry with a VHDL simulator. It offers better utilization of transmission channels, because more information can be carried at higher speed, and it gives more efficient performance. One of the main advantages of MVL (ternary) logic is that it reduces the number of required computation steps and brings simplicity and energy efficiency to digital logic design. This paper presents a reliable method for implementing basic combinational, sequential and TALU (Ternary Arithmetic and Logic Unit) circuitry with a minimum number of ternary switching circuits (multiplexers). It demonstrates the potential of VHDL modelling and simulation applied to ternary switching circuits to verify their functionality and timing specifications. The intention is to show how the proposed simulator can be used to simulate MVL circuits and to evaluate system performance.
Jacobsen, Stine Lindahl
The paper will present a PhD study concerning the reliability and validity of the music therapy assessment model "Assessment of Parenting Competences" (APC) in the area of families with emotionally neglected children. The study had a multiple-strategy design with a philosophical base of critical realism...... and pragmatism. The fixed design for this study was a between- and within-groups design testing the APC's reliability and validity. The two groups were parents with neglected children and parents with non-neglected children. The flexible design had a multiple case study strategy specifically...... of the theoretical understanding of the client group. Furthermore, a short description of the specific assessment protocol and analysis procedures of the APC will be part of the presentation. The PhD study sought to explore how to develop measures of parenting competences by looking at autonomy relationship......
Grady, Matthew W.; Beretvas, S. Natasha
2010-01-01
Multiple membership random effects models (MMREMs) have been developed for use in situations where individuals are members of multiple higher level organizational units. Despite their availability and the frequency with which multiple membership structures are encountered, no studies have extended the MMREM approach to hierarchical growth curve…
Agriculture and economic growth in Ethiopia: growth multipliers from a four-sector simulation model
1999-01-01
Agriculture accounts for over half of Ethiopian GDP, yet the case for agriculture as a focus of economic growth strategies must rely on identifying a set of intersectoral linkages through which agricultural growth contributes to the growth of nonagriculture in the Ethiopian economy. This article develops a four-sector numerical simulation model of economic growth in Ethiopia which permits the calculation of macroeconomic growth multipliers resulting from income shocks to agriculture, services...
Skill and reliability of climate model ensembles at the Last Glacial Maximum and mid-Holocene
J. C. Hargreaves
2013-03-01
Paleoclimate simulations provide us with an opportunity to critically confront and evaluate the performance of climate models in simulating the response of the climate system to changes in radiative forcing and other boundary conditions. Hargreaves et al. (2011) analysed the reliability of the Paleoclimate Modelling Intercomparison Project (PMIP2) model ensemble with respect to the MARGO sea surface temperature data synthesis (MARGO Project Members, 2009) for the Last Glacial Maximum (LGM, 21 ka BP). Here we extend that work to include a new comprehensive collection of land surface data (Bartlein et al., 2011), and introduce a novel analysis of the predictive skill of the models. We include output from the PMIP3 experiments, from the two models for which suitable data are currently available. We also perform the same analyses for the PMIP2 mid-Holocene (6 ka BP) ensembles and available proxy data sets. Our results are predominantly positive for the LGM, suggesting that as well as the global mean change, the models can reproduce the observed pattern of change on the broadest scales, such as the overall land-sea contrast and polar amplification, although the more detailed sub-continental scale patterns of change remain elusive. In contrast, our results for the mid-Holocene are substantially negative, with the models failing to reproduce the observed changes with any degree of skill. One cause of this problem could be that the globally and annually averaged forcing anomaly is very weak at the mid-Holocene, and so the results are dominated by the more localised regional patterns in the parts of the globe for which data are available. The root cause of the model-data mismatch at these scales is unclear. If the proxy calibration is itself reliable, then representativity error in the data-model comparison and missing climate feedbacks in the models are other possible sources of error.
Liu, Zengkai; Liu, Yonghong; Wu, Xinlei; Yang, Dongwei; Cai, Baoping; Zheng, Chao
2016-09-01
Bayesian network (BN) is a widely used formalism for representing uncertainty in probabilistic systems, and it has become a popular tool in reliability engineering. The GO-FLOW method is a success-oriented system analysis technique capable of evaluating system reliability and risk. To overcome the limitations of the GO-FLOW method and to add a new method for BN model development, this paper presents a novel approach to constructing a BN from a GO-FLOW model. A GO-FLOW model involves several discrete time points, and some signals change at different time points; but at any one time point it is a static system, which can be described with a BN. Therefore, the BN developed with the proposed method is equivalent to the GO-FLOW model at one time point. The equivalent BNs of the fourteen basic operators in the GO-FLOW methodology are developed. The existing GO-FLOW models can then be mapped into equivalent BNs on the basis of the developed BNs of the operators. A case study of the auxiliary feedwater system of a pressurized water reactor is used to illustrate the method. The results demonstrate that the GO-FLOW chart can be successfully mapped into equivalent BNs.
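The idea of mapping an operator to an equivalent BN can be illustrated on the simplest possible case, a deterministic two-input AND node (a series signal path); the component probabilities below are illustrative, not from the paper's case study:

```python
from itertools import product

# Component working probabilities (illustrative values only).
p_work = {"A": 0.95, "B": 0.90}

def p_output_good():
    """Probability that the output signal is good, by BN enumeration:
    the AND node has a deterministic CPT (good iff both parents work)."""
    total = 0.0
    for a, b in product([True, False], repeat=2):
        pa = p_work["A"] if a else 1.0 - p_work["A"]
        pb = p_work["B"] if b else 1.0 - p_work["B"]
        if a and b:  # deterministic AND-operator CPT
            total += pa * pb
    return total

reliability = p_output_good()
```

Each GO-FLOW operator would contribute one such node with its own conditional probability table, and the full chart becomes a network of them evaluated at a fixed time point.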
A Solvable Symbiosis-Driven Growth Model
KE Jian-Hong; LIN Zhen-Quan; CHEN Xiao-Shuang
2006-01-01
We introduce a two-species symbiosis-driven growth model, in which the two species mutually benefit each other's monomer birth while the self-death of each species simultaneously occurs. By means of the generalized rate equation, we investigate the dynamic evolution of the system under the monodisperse initial condition. It is found that the kinetic behaviour of the system depends crucially on the details of the rate kernels as well as the initial concentration distributions. The cluster size distribution of either species cannot be scaled in most cases; while in some special cases, they both consistently take the universal scaling form. Moreover, in some cases the system may undergo a gelation transition, and the pre-gelation behaviour of the cluster size distributions satisfies the scaling form in the vicinity of the gelation point. On the other hand, the two species always live and die together.
Reliability of a Novel Model for Drug Release from 2D HPMC-Matrices
Rumiana Blagoeva
2010-04-01
A novel model of drug release from 2D HPMC matrices is considered. A detailed mathematical description of matrix swelling and the effect of the initial drug loading is introduced. A numerical approach to the solution of the posed nonlinear 2D problem is used, on the basis of finite element domain approximation and a time difference method. The reliability of the model is investigated in two steps: numerical evaluation of the water uptake parameters, and evaluation of the drug release parameters against available experimental data. The proposed numerical procedure for fitting the model is validated by performing different numerical examples of drug release in two cases (with and without taking into account the initial drug loading). The goodness of fit, evaluated by the coefficient of determination, is shown to be very good with few exceptions. The obtained results show better model fitting when accounting for the effect of initial drug loading (especially for larger values).
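The coefficient of determination used as the goodness-of-fit measure above can be computed as follows; the release data here are hypothetical, for illustration only:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Hypothetical cumulative drug release (%) vs. model predictions.
observed = [5.0, 12.0, 22.0, 35.0, 48.0, 60.0]
predicted = [4.5, 13.0, 21.0, 36.0, 47.0, 61.0]
fit = r_squared(observed, predicted)
```

Values close to 1 indicate that the fitted release model explains nearly all of the variation in the measured data.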
Patterson-Hine, F. A.; Davis, Gloria J.; Pedar, A.
1991-01-01
The complexity of computer systems currently being designed for critical applications in the scientific, commercial, and military arenas requires the development of new techniques for utilizing models of system behavior in order to assure 'ultra-dependability'. The complexity of these systems, such as Space Station Freedom and the Air Traffic Control System, stems from their highly integrated designs containing both hardware and software as critical components. Reliability graph models, such as fault trees and digraphs, are used frequently to model hardware systems. Their applicability for software systems has also been demonstrated for software safety analysis and the analysis of software fault tolerance. This paper discusses further uses of graph models in the design and implementation of fault management systems for safety critical applications.
Modeling the reliability and maintenance costs of wind turbines using Weibull analysis
Vachon, W.A. [W.A. Vachon & Associates, Inc., Manchester, MA (United States)
1996-12-31
A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
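The basic Weibull mathematics the abstract refers to can be sketched directly; the shape and scale parameters below are illustrative, not drawn from the paper's field data:

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)**beta): probability of surviving to time t."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Instantaneous failure rate; increasing in t when beta > 1 (wear-out)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

def mean_life(beta, eta):
    """Mean time to failure: eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)

# Illustrative gearbox-like parameters: wear-out shape beta = 2,
# characteristic life eta = 40,000 operating hours.
beta, eta = 2.0, 40000.0
r_10k = weibull_reliability(10000.0, beta, eta)
```

The shape parameter beta distinguishes infant mortality (beta < 1), random failures (beta = 1, constant hazard), and wear-out (beta > 1), which is what makes the model useful for deciding maintenance practice per subsystem.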
4D Shape-Preserving Modelling of Bone Growth
Andresen, Per Rønsholt; Nielsen, Mads; Kreiborg, Sven
1998-01-01
From a set of temporally separated scannings of the same anatomical structure we wish to identify and analyze the growth in terms of a metamorphosis. That is, we study the temporal change of shape, which may provide an understanding of the biological processes which govern the growth process. We...... subdivide the growth analysis into growth simulation, growth modelling, and finally the growth analysis. In this paper, we present results of growth simulation of the mandible from 3 scannings of the same patient at the ages of 9 months, 21 months, and 7 years. We also present the first growth models...... and growth analyses. The ultimate goal is to predict/simulate human growth, which would be extremely useful in many surgical procedures....
Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.
2016-12-01
The growing interest in research on technicians' workloads is probably associated with the recent surge in competition, prompted by unprecedented technological development that triggers changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this worldwide intense competition in industry has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model that considers technicians' reliability and complements the factory information obtained. The information used emerged from technicians' productivity and earned values, using a multi-objective modelling approach. Since technicians are expected to carry out both routine and stochastic maintenance work, we treat these workloads as constraints. The influence of training, fatigue and experiential knowledge of technicians on workload management was considered. These workloads were combined with maintenance policy in optimising reliability, productivity and earned values using the goal programming approach. Practical datasets were utilised in studying the applicability of the proposed model in practice. It was observed that the model was able to generate information that practicing maintenance engineers can apply in making more informed decisions on technicians' management.
Aubry, Keith B; Raley, Catherine M; McKelvey, Kevin S
2017-01-01
The availability of spatially referenced environmental data and species occurrence records in online databases enable practitioners to easily generate species distribution models (SDMs) for a broad array of taxa. Such databases often include occurrence records of unknown reliability, yet little information is available on the influence of data quality on SDMs generated for rare, elusive, and cryptic species that are prone to misidentification in the field. We investigated this question for the fisher (Pekania pennanti), a forest carnivore of conservation concern in the Pacific States that is often confused with the more common Pacific marten (Martes caurina). Fisher occurrence records supported by physical evidence (verifiable records) were available from a limited area, whereas occurrence records of unknown quality (unscreened records) were available from throughout the fisher's historical range. We reserved 20% of the verifiable records to use as a test sample for both models and generated SDMs with each dataset using Maxent. The verifiable model performed substantially better than the unscreened model based on multiple metrics including AUCtest values (0.78 and 0.62, respectively), evaluation of training and test gains, and statistical tests of how well each model predicted test localities. In addition, the verifiable model was consistent with our knowledge of the fisher's habitat relations and potential distribution, whereas the unscreened model indicated a much broader area of high-quality habitat (indices > 0.5) that included large expanses of high-elevation habitat that fishers do not occupy. Because Pacific martens remain relatively common in upper elevation habitats in the Cascade Range and Sierra Nevada, the SDM based on unscreened records likely reflects primarily a conflation of marten and fisher habitat. Consequently, accurate identifications are far more important than the spatial extent of occurrence records for generating reliable SDMs for the
Sharifah Nadiyah Razali
2016-12-01
This study aims to generate empirical evidence on the validity and reliability of the Perception of Online Collaborative Learning Questionnaire (POCLQ) using the Rasch model. The questionnaire was distributed to 32 (N = 32) Diploma in Hotel Catering students from Politeknik Ibrahim Sultan, Johor (PIS). The data obtained were analysed using WINSTEP version 3.68 software. The findings showed that POCLQ had high reliability with five categories of item difficulty. It can therefore be concluded that POCLQ is reliable and strongly accepted. Meanwhile, the analysis of item fit showed that six items were not in the specified range and, based on the standardised residual correlation measurement values, five items were found to overlap and should be dropped. All the items flagged for dropping by the analysis were refined and retained for the purpose of the study, based on experts' views; therefore, all items remained after the Rasch analysis. It is hoped that this study will emphasise to other researchers the importance of analysing items to ensure the quality of an instrument being developed.
钟绍鹏; 邓卫
2015-01-01
A reliability-based stochastic system optimum congestion pricing (SSOCP) model with endogenous market penetration and compliance rate in an advanced traveler information systems (ATIS) environment was proposed. All travelers were divided into two classes: guided travelers, i.e. equipped travelers who follow ATIS advice, and unguided travelers, i.e. unequipped travelers together with equipped travelers who do not follow the ATIS advice (also referred to as non-complying travelers). Travelers were assumed to take travel time, congestion pricing, and travel time reliability into account when making route choice decisions. In order to arrive on time, travelers needed to allow a safety margin for their trip. The market penetration of ATIS was determined by a continuous increasing function of the information benefit, and the ATIS compliance rate of equipped travelers was given as the probability that the travel costs actually experienced by guided travelers are less than or equal to those of unguided travelers. The analysis results could enhance our understanding of the effect of the travel demand level and the travel time reliability confidence level on the ATIS market penetration and compliance rate, and of the effect of the travel time perception variation of guided and unguided travelers on the mean travel cost savings (MTCS) of the equipped travelers, the ATIS market penetration, the compliance rate, and the total network effective travel time (TNETT).
A model of urban rational growth based on grey prediction
Xiao, Wenjing
2017-04-01
Smart growth focuses on building sustainable cities, using compact development to prevent urban sprawl. This paper establishes a series of models to implement smart growth theories in city design, and two specific city design cases are shown. First, we establish a Smart Growth Measure Model to measure the success of smart growth of a city, using the Full Permutation Polygon Synthetic Indicator Method to calculate the Comprehensive Indicator (CI) by which success is measured. Second, the paper uses the principles of smart growth to develop a new growth plan for two cities: we establish an optimization model to maximize the CI value and solve it with the Particle Swarm Optimization (PSO) algorithm. Combining the calculation results with the specific circumstances of the cities, we make a smart growth plan for each.
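A minimal sketch of how a PSO algorithm of the kind the abstract mentions maximizes an objective; the one-dimensional toy CI function below is a hypothetical stand-in for the paper's actual indicator model:

```python
import random

def pso_maximize(f, lo, hi, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal one-dimensional particle swarm optimizer."""
    random.seed(seed)
    xs = [random.uniform(lo, hi) for _ in range(n)]  # particle positions
    vs = [0.0] * n                                   # particle velocities
    pbest = xs[:]                                    # per-particle best
    gbest = max(xs, key=f)                           # swarm-wide best
    for _ in range(iters):
        for i in range(n):
            r1, r2 = random.random(), random.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])  # pull toward own best
                     + c2 * r2 * (gbest - xs[i]))    # pull toward swarm best
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if f(xs[i]) > f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) > f(gbest):
                gbest = xs[i]
    return gbest

# Toy stand-in for the Comprehensive Indicator, peaking at x = 3.
def toy_ci(x):
    return -(x - 3.0) ** 2

best = pso_maximize(toy_ci, 0.0, 10.0)
```

A real application would replace `toy_ci` with the multi-dimensional CI evaluation and widen the position vectors accordingly.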
3D Modeling and Simulation of Dendritic Growth during Solidification
Zuojian LIANG; Qingyan XU; Baicheng LIU
2003-01-01
A mathematical model for the three-dimensional simulation of free dendritic growth and microstructure evolution was developed based on the growth mechanism of crystal grains and basic transfer equations such as the heat, mass and momentum transfer equations. …
Kinetic models of cell growth, substrate utilization and bio ...
STORAGESEVER
2008-05-02
A simple model was proposed using the logistic equation for growth and the Luedeking-Piret equation … (melanoidin), which may create many problems and also … where the constant µ is defined as the specific growth rate (Equation 1) …
Statistical Ensemble Theory of Gompertz Growth Model
Takuya Yamano
2009-11-01
An ensemble formulation for the Gompertz growth function within the framework of statistical mechanics is presented, where the two growth parameters are assumed to be statistically distributed. The growth can be viewed as a self-referential process, which enables us to use the Bose-Einstein statistics picture. The analytical entropy expression pertaining to the law can be obtained in terms of the growth velocity distribution as well as the Gompertz function itself for the whole process.
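The Gompertz function this record builds on can be sketched in a few lines. This is a minimal illustration of the curve itself, not the paper's ensemble formulation; the parameter values are made up:

```python
import math

def gompertz(t, n_inf, b, c):
    """Gompertz growth curve N(t) = N_inf * exp(-b * exp(-c * t)).

    n_inf is the asymptotic size, b sets the initial displacement
    from the asymptote, and c is the growth-rate parameter.
    """
    return n_inf * math.exp(-b * math.exp(-c * t))

# The curve rises sigmoidally from N_inf * exp(-b) toward N_inf.
curve = [gompertz(t, n_inf=100.0, b=5.0, c=0.5) for t in range(30)]
```

In the ensemble treatment described in the abstract, b and c would themselves be draws from a distribution rather than fixed numbers.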
A model for grain growth based on the novel description of dendrite shape
O. Wodo
2007-12-01
We use a novel description of dendritic shape in a micro-scale solid phase growth model. The model describes the evolution of both the primary solid solution dendrite and the eutectic that forms between arms and grains in the last stage of solidification. The obtained results show that our approach can be used in a grain growth model to determine a more reliable eutectic distribution. In this paper no kinetics connected with the eutectic transformation is taken into account; however, this does not affect the eutectic distribution, because at the beginning of the eutectic reaction all liquid phase was assumed to transform fully into eutectic. Results for a solid phase growth model based on this description are presented. The obtained eutectic distribution results are especially important in the case of hypoeutectic alloy solidification, where the eutectic grains grow between the formed solid solution grains; thus, the distribution of solid solution grains becomes crucial due to its influence on the delay in the solid fraction increase of eutectic grains.
Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance
Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra
2017-06-01
In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third, the hazardous risk coefficient, covers anticipated hazards which may occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.
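The "random number simulation" step, obtaining one distinctive value per coefficient and then adding the coefficients to rank items, can be illustrated with a toy sketch. The item names, the coefficient ranges, and the simple additive combination below are hypothetical stand-ins, not the authors' calibrated coefficients:

```python
import random

def distinctive_value(low, high, draws=10000, rng=random):
    """Monte Carlo estimate of one representative value for a risk
    coefficient whose test results vary within [low, high]."""
    return sum(rng.uniform(low, high) for _ in range(draws)) / draws

random.seed(42)

# Hypothetical (likelihood, abstract, hazardous) coefficient ranges per item.
items = {
    "bearing":  [(0.2, 0.4), (0.1, 0.3), (0.5, 0.7)],
    "gearbox":  [(0.6, 0.8), (0.3, 0.5), (0.4, 0.6)],
    "coupling": [(0.1, 0.2), (0.1, 0.2), (0.1, 0.3)],
}

# Statistically add the three coefficients, then rank the critical items.
final = {name: sum(distinctive_value(lo, hi) for lo, hi in coeffs)
         for name, coeffs in items.items()}
ranking = sorted(final, key=final.get, reverse=True)
```

The item with the largest combined coefficient heads the maintenance-priority ranking.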
REFERENCE MODELS OF ENDOGENOUS ECONOMIC GROWTH
GEAMĂNU MARINELA
2012-05-01
The new endogenous growth theories are a very important research area for shaping the most effective policies and long-term sustainable development strategies. Endogenous growth theory emerged as a reaction to the imperfections of neoclassical theory, holding that economic growth is the endogenous product of an economic system.
Melnyk, Bernadette Mazurek
2012-01-01
High-reliability health care organizations are those that provide care that is safe and one that minimizes errors while achieving exceptional performance in quality and safety. This article presents major concepts and characteristics of a patient safety culture and a high-reliability health care organization and explains how building a culture of evidence-based practice can assist organizations in achieving high reliability. The ARCC (Advancing Research and Clinical practice through close Collaboration) model for systemwide implementation and sustainability of evidence-based practice is highlighted as a key strategy in achieving high reliability in health care organizations.
Reproducibility, reliability and validity of measurements obtained from Cecile3 digital models
Gustavo Adolfo Watanabe-Kanno
2009-09-01
The aim of this study was to determine the reproducibility, reliability and validity of measurements made on digital models compared to plaster models. Fifteen pairs of plaster models were obtained from orthodontic patients with permanent dentition before treatment. These were digitized to be evaluated with the program Cécile3 v2.554.2 beta. Two examiners measured, three times each, the mesiodistal width of all teeth present, the intercanine, interpremolar and intermolar distances, and the overjet and overbite. The plaster models were measured using a digital vernier caliper. Student's t-test for paired samples and the intraclass correlation coefficient (ICC) were used for statistical analysis. The ICCs of the digital models were 0.84 ± 0.15 (intra-examiner) and 0.80 ± 0.19 (inter-examiner). The average mean differences of the digital models were 0.23 ± 0.14 and 0.24 ± 0.11 for each examiner, respectively. When the two types of measurements were compared, the values obtained from the digital models were lower than those obtained from the plaster models (p < 0.05), although the differences were considered clinically insignificant (differences < 0.1 mm). The Cécile digital models are a clinically acceptable alternative for use in orthodontics.
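The agreement statistic used here can be sketched with the one-way random-effects ICC(1,1) estimator; the sample widths below are made-up numbers, not the study's data, and the study may have used a different ICC variant:

```python
def icc_1_1(ratings):
    """One-way random-effects ICC(1,1) for a ratings table:
    one row per subject, one column per rater or session.
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n = len(ratings)
    k = len(ratings[0])
    row_means = [sum(row) / k for row in ratings]
    grand = sum(row_means) / n
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2 for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical mesiodistal widths (mm), each tooth measured twice.
widths = [[8.1, 8.2], [7.4, 7.4], [9.0, 8.9], [6.8, 6.9], [7.7, 7.6]]
agreement = icc_1_1(widths)
```

Identical measurement columns give an ICC of exactly 1.0, so the statistic directly expresses how close repeated measurements come to perfect reproducibility.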
English, Sinéad; Bateman, Andrew W; Clutton-Brock, Tim H
2012-05-01
Lifetime records of changes in individual size or mass in wild animals are scarce and, as such, few studies have attempted to model variation in these traits across the lifespan or to assess the factors that affect them. However, quantifying lifetime growth is essential for understanding trade-offs between growth and other life history parameters, such as reproductive performance or survival. Here, we used model selection based on information theory to measure changes in body mass over the lifespan of wild meerkats, and compared the relative fits of several standard growth models (monomolecular, von Bertalanffy, Gompertz, logistic and Richards). We found that meerkats exhibit monomolecular growth, with the best model incorporating separate growth rates before and after nutritional independence, as well as effects of season and total rainfall in the previous nine months. Our study demonstrates how simple growth curves may be improved by considering life history and environmental factors, which may be particularly relevant when quantifying growth patterns in wild populations.
Reactive burn models and ignition & growth concept
Menikoff, Ralph S [Los Alamos National Laboratory]; Shaw, Milton S [Los Alamos National Laboratory]
2010-01-01
Plastic-bonded explosives are heterogeneous materials. Experimentally, shock initiation is sensitive to small amounts of porosity, due to the formation of hot spots (small localized regions of high temperature). This leads to the Ignition and Growth concept, introduced by Lee and Tarver in 1980, as the basis for reactive burn models. A homogenized burn rate needs to account for three mesoscale physical effects: (i) the density of burnt hot spots, which depends on the lead shock strength; (ii) the growth of the burn fronts triggered by hot spots, which depends on the local deflagration speed; (iii) a geometric factor that accounts for the overlap of deflagration wavelets from adjacent hot spots. These effects can be combined and the burn model defined by specifying the reaction progress variable λ(t) as a function of a dimensionless reaction length τ_hs(t)/ℓ_hs, rather than by specifying an explicit burn rate. The length scale ℓ_hs is the average distance between hot spots, which is proportional to [N_hs(P_s)]^(-1/3), where N_hs is the number density of hot spots activated by the lead shock. The reaction length τ_hs(t) = ∫_0^t D(P(t')) dt' is the distance the burn front propagates from a single hot spot, where D is the deflagration speed and t is the time since the shock arrival. A key implementation issue is how to determine the lead shock strength in conjunction with a shock capturing scheme. The authors have developed a robust algorithm for this purpose based on the Hugoniot jump condition for the energy. The algorithm utilizes the time dependence of density, pressure and energy within each cell. The method is independent of the numerical dissipation used for shock capturing. It is local and can be used in one or more space dimensions. The burn model has a small number of parameters which can be calibrated to fit velocity gauge data from shock initiation experiments.
Reactive burn models and ignition & growth concept
Shaw M.S.
2011-01-01
Plastic-bonded explosives are heterogeneous materials. Experimentally, shock initiation is sensitive to small amounts of porosity, due to the formation of hot spots (small localized regions of high temperature). This leads to the Ignition & Growth concept, introduced by Lee and Tarver in 1980, as the basis for reactive burn models. A homogenized burn rate needs to account for three meso-scale physical effects: (i) the density of active hot spots or burn centers; (ii) the growth of the burn fronts triggered by the burn centers; (iii) a geometric factor that accounts for the overlap of deflagration wavelets from adjacent burn centers. These effects can be combined and the burn model defined by specifying the reaction progress variable λ = g(s) as a function of a dimensionless reaction length s(t) = r_bc/ℓ_bc, rather than by specifying an explicit burn rate. The length scale ℓ_bc(P_s) = [N_bc(P_s)]^(-1/3) is the average distance between burn centers, where N_bc is the number density of burn centers activated by the lead shock. The reaction length r_bc(t) = ∫_0^t D(P(t')) dt' is the distance the burn front propagates from a single burn center, where D(P) is the deflagration speed as a function of the local pressure and t is the time since the shock arrival. A key implementation issue is how to determine the lead shock strength in conjunction with a shock capturing scheme. We have developed a robust algorithm for this purpose based on the Hugoniot jump condition for the energy. The algorithm utilizes the time dependence of density, pressure and energy within each cell. The method is independent of the numerical dissipation used for shock capturing. It is local and can be used in one or more space dimensions. The burn model has a small number of parameters which can be calibrated to fit velocity gauge data from shock initiation experiments.
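The reaction-length formulation can be sketched numerically. The ramped pressure history, the power-law deflagration speed, and the Avrami-style overlap function g(s) = 1 - exp(-s**3) below are all illustrative assumptions, not the calibrated model of either record:

```python
import math

def burn_progress(pressures, dt, ell_bc, deflagration_speed):
    """Integrate s(t) = (1/ell_bc) * integral of D(P(t')) dt' and map it
    to the reaction progress lambda = g(s), using an assumed overlap
    function g(s) = 1 - exp(-s**3) for merging deflagration wavelets."""
    s = 0.0
    progress = []
    for p in pressures:
        s += deflagration_speed(p) * dt / ell_bc
        progress.append(1.0 - math.exp(-s ** 3))
    return progress

# Assumed power-law deflagration speed (mm/us) vs pressure (GPa).
speed = lambda p: 0.1 * p ** 1.2
history = [5.0 + 0.5 * i for i in range(40)]   # ramping shock pressure
lam = burn_progress(history, dt=0.01, ell_bc=0.05, deflagration_speed=speed)
```

The progress variable rises monotonically from 0 toward 1 as burn fronts from neighboring centers grow and overlap, which is the qualitative behavior the g(s) parameterization is meant to capture.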
Stochastic modeling of thermal fatigue crack growth
Radu, Vasile
2015-01-01
The book describes a systematic stochastic modeling approach for assessing thermal fatigue crack growth in mixing tees, based on the power spectral density of the temperature fluctuation at the inner pipe surface. It shows the development of a frequency-temperature response function in the framework of the single-input, single-output (SISO) methodology from random noise/signal theory under sinusoidal input. The frequency response of the stress intensity factor (SIF) is obtained by a polynomial fitting procedure applied to thermal stress profiles at various instants of time. The method, which takes into account the variability of material properties and has been implemented in a real-world application, estimates the probabilities of failure by considering a limit state function and Monte Carlo analysis based on the proposed stochastic model. Written in a comprehensive and accessible style, the book presents a new and effective method for assessing thermal fatigue crack growth, and it is intended as a concise and practice-or…
Numeric Modeling of Granular Asteroid Growth
Beaumont, Benjamin; Lazzati, D.
2014-01-01
It is believed that planetesimals and asteroids are created by the constructive collisions of smaller objects, loosely bound under the effect of self-gravity and/or contact forces. However, the internal dynamics of these collisions, and whether they trigger growth or fragmentation, are poorly understood. Prior research on the topic has established regimes for the outcomes of constructive collisions of particles under contact forces, but neglects gravity (a critical component once particles are no longer touching) and force chains (an uneven distribution of force inherent to granular materials). We run simulations of binary collisions of clusters of particles modeled as hard spheres. Our simulations take into account self-gravity, dissipation of energy, and friction, and use a potential function for overlapping particles to study force chains. We present here the collision outcomes for clusters with variable masses, particle counts, velocities, and impact parameters. We compare our results to other models and simulations, and find that the collisions remain constructive at higher energies than classically predicted.
Validated Loads Prediction Models for Offshore Wind Turbines for Enhanced Component Reliability
Koukoura, Christina
To improve the reliability of offshore wind turbines, accurate prediction of their response is required; therefore, validation of models against site measurements is imperative. In the present thesis a 3.6 MW pitch-regulated, variable-speed offshore wind turbine on a monopile foundation is built … response of a boat impact. The first and second modal damping of the system during normal operation, both from measurements and simulations, are identified through the Enhanced Frequency Domain Decomposition technique. The effect of damping on the side-side fatigue of the support structure …
Mishnaevsky, Leon
2014-01-01
… with modified, hybrid or nanomodified structures. In this project, we seek to explore the potential of hybrid (carbon/glass), nanoreinforced and hierarchical composites (with secondary CNT, graphene or nanoclay reinforcement) as future materials for highly reliable large wind turbines. Using 3D multiscale … computational models of the composites, we study the effect of hybrid structure and of nanomodifications on the strength, lifetime and service properties of the materials (see Figure 1). As a result, a series of recommendations toward the improvement of composites for structural applications under long term …
Probabilistic modelling of combined sewer overflow using the First Order Reliability Method
Thorndahl, Søren; Schaarup-Jensen, Kjeld; Jensen, Jacob Birk
2007-01-01
This paper presents a new and alternative method (in the context of urban drainage) for probabilistic hydrodynamic analysis of drainage systems in general, and especially for prediction of combined sewer overflow. Using a probabilistic shell it is possible to implement both input and parameter … uncertainties in an application of the commercial urban drainage model MOUSE combined with the probabilistic First Order Reliability Method (FORM). Applying statistical characteristics of several years of rainfall, it is possible to derive a parameterization of the rainfall input and the failure probability …
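For a linear limit state with independent normal variables, the FORM machinery referenced in this record reduces to a closed form. A minimal sketch follows; the capacity/load numbers are illustrative and are not taken from the paper, which couples FORM to a full hydrodynamic model:

```python
import math

def form_linear(mu_r, sigma_r, mu_s, sigma_s):
    """First Order Reliability Method for the limit state g = R - S with
    independent normal capacity R and load S: returns the reliability
    index beta and the failure probability Pf = Phi(-beta)."""
    beta = (mu_r - mu_s) / math.hypot(sigma_r, sigma_s)
    pf = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
    return beta, pf

# Illustrative: pipe capacity ~N(10, 1) m3/s vs runoff load ~N(6, 2) m3/s.
beta, pf = form_linear(mu_r=10.0, sigma_r=1.0, mu_s=6.0, sigma_s=2.0)
```

For nonlinear limit states or non-normal inputs, FORM instead searches for the design point in standard normal space, but beta keeps the same interpretation.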
Del Giudice, Dario; Löwe, Roland; Madsen, Henrik
2015-01-01
In urban rainfall-runoff, commonly applied statistical techniques for uncertainty quantification mostly ignore systematic output errors originating from simplified models and erroneous inputs. Consequently, the resulting predictive uncertainty is often unreliable. Our objective is to present two … provide probabilistic predictions of wastewater discharge in a similarly reliable way, both for periods ranging from a few hours up to more than 1 week ahead of time. The EBD produces more accurate predictions on long horizons but relies on computationally heavy MCMC routines for parameter inferences …
A new lifetime estimation model for a quicker LED reliability prediction
Hamon, B. H.; Mendizabal, L.; Feuillet, G.; Gasse, A.; Bataillou, B.
2014-09-01
LED reliability and lifetime prediction is a key point for solid state lighting adoption. For this purpose, one hundred and fifty LEDs were aged for a reliability analysis. The LEDs were grouped under nine current-temperature stress conditions, with the stress driving current fixed between 350 mA and 1 A and the ambient temperature between 85°C and 120°C. Using integrating-sphere and I(V) measurements, a cross study of the evolution of electrical and optical characteristics was carried out. Results show two main failure mechanisms regarding lumen maintenance: the first is the typically observed lumen depreciation; the second is a much quicker depreciation related to an increase of the leakage and non-radiative currents. Models of the typical lumen depreciation and of the leakage resistance depreciation were built using electrical and optical measurements taken during the aging tests. The combination of these models enables a new method for quicker LED lifetime prediction, and the two models have been used to predict LED lifetimes.
Liu, Jingwei
2011-01-01
A function-based nonlinear least squares estimation (FNLSE) method is proposed and investigated for parameter estimation of the Jelinski-Moranda software reliability model. FNLSE extends the potential fitting functions of traditional least squares estimation (LSE) and includes the logarithm-transformed nonlinear least squares estimation (LogLSE) as a special case. A novel power-transformation-based nonlinear least squares estimation (powLSE) is proposed and applied to the parameter estimation of the Jelinski-Moranda model. Solved with the Newton-Raphson method, both LogLSE and powLSE of the Jelinski-Moranda model are applied to mean-time-between-failures (MTBF) predictions on six standard software failure-time data sets. The experimental results demonstrate the effectiveness of powLSE with an optimal power index, compared to classical least squares estimation (LSE), maximum likelihood estimation (MLE), and LogLSE, in terms of the recursive relative error (RE) index and the Braun statistic.
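The power-transformed least squares idea can be sketched on the Jelinski-Moranda model. A coarse grid search stands in for the paper's Newton-Raphson solver, and the synthetic data, grids, and power index p are illustrative assumptions:

```python
def jm_expected_intervals(N, phi, n):
    """Jelinski-Moranda expected inter-failure times
    E[X_i] = 1 / (phi * (N - i + 1)), i = 1..n,
    for N initial faults and per-fault hazard phi."""
    return [1.0 / (phi * (N - i + 1)) for i in range(1, n + 1)]

def pow_lse_fit(times, p=0.5, n_grid=range(11, 41), phi_grid=None):
    """Power-transformed least squares: minimize
    sum((t_i**p - E[X_i]**p)**2) over a grid of (N, phi) candidates."""
    if phi_grid is None:
        phi_grid = [0.01 * j for j in range(1, 101)]
    n = len(times)
    best = None
    for N in n_grid:
        for phi in phi_grid:
            e = jm_expected_intervals(N, phi, n)
            loss = sum((t ** p - m ** p) ** 2 for t, m in zip(times, e))
            if best is None or loss < best[0]:
                best = (loss, N, phi)
    return best[1], best[2]

# Synthetic data generated at the expected values for N=25, phi=0.1.
data = jm_expected_intervals(25, 0.1, 10)
N_hat, phi_hat = pow_lse_fit(data)
```

On noise-free data the transformed loss is zero at the true parameters, so the grid search recovers them exactly; with real failure times, the choice of p trades off how strongly long intervals dominate the fit.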
Impact of Loss Synchronization on Reliable High Speed Networks: A Model Based Simulation
Suman Kumar
2014-01-01
The contemporary nature of network evolution demands simulation models which are flexible, scalable, and easily implementable. In this paper, we propose a fluid-based model for performance analysis of reliable high-speed networks. In particular, this paper aims to study the dynamic relationship between congestion control algorithms and queue management schemes, in order to develop a better understanding of the causal linkages between the two. We propose a loss synchronization module which is user-configurable. We validate our model through simulations under controlled settings. Also, we present a performance analysis to provide insights into two important issues concerning 10 Gbps high-speed networks: (i) the impact of bottleneck buffer size on the performance of a 10 Gbps high-speed network, and (ii) the impact of the level of loss synchronization on link utilization-fairness tradeoffs. The practical impact of the proposed work is to provide design guidelines, along with a powerful simulation tool, to protocol designers and network developers.
RELY: A reliability modeling system for analysis of sodium-sulfur battery configurations
Hostick, C.J.; Huber, H.D.; Doggett, W.H.; Dirks, J.A.; Dovey, J.F.; Grinde, R.B.; Littlefield, J.S.; Cuta, F.M.
1987-06-01
In support of the Office of Energy Storage and Distribution of the US Department of Energy (DOE), Pacific Northwest Laboratory has produced a microcomputer-based software package, called RELY, to assess the impact of sodium-sulfur cell reliability on constant current discharge battery performance. The Fortran-based software operates on IBM microcomputers and IBM-compatibles that have a minimum of 512K of internal memory. The software package has three models that provide the following: (1) a description of the failure distribution parameters used to model cell failure, (2) a Monte Carlo simulation of battery life, and (3) a detailed discharge model for a user-specified battery discharge cycle. 6 refs., 31 figs., 4 tabs.
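The Monte Carlo battery-life idea in RELY's second model can be illustrated with a toy sketch. The Weibull cell-failure parameters and the criterion "the battery fails when fewer than m cells survive" are assumptions for illustration, not RELY's actual failure distributions or discharge logic:

```python
import random

def battery_life(n_cells, m_required, shape, scale, rng):
    """Draw Weibull failure times for each cell; the battery 'fails'
    when surviving cells drop below m_required, i.e. at the
    (n_cells - m_required + 1)-th cell failure."""
    times = sorted(rng.weibullvariate(scale, shape) for _ in range(n_cells))
    return times[n_cells - m_required]

rng = random.Random(7)
lives = [battery_life(n_cells=20, m_required=16, shape=2.0,
                      scale=1000.0, rng=rng)
         for _ in range(2000)]
mean_life = sum(lives) / len(lives)
```

Repeating the simulation for different cell reliability parameters shows directly how cell-level failure behavior propagates to battery-level life, which is the kind of question the report's software is built to answer.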
Reliability of Current Biokinetic and Dosimetric Models for Radionuclides: A Pilot Study
Leggett, Richard Wayne [ORNL]; Eckerman, Keith F [ORNL]; Meck, Robert A. [U.S. Nuclear Regulatory Commission]
2008-10-01
This report describes the results of a pilot study of the reliability of the biokinetic and dosimetric models currently used by the U.S. Nuclear Regulatory Commission (NRC) as predictors of dose per unit internal or external exposure to radionuclides. The study examines the feasibility of critically evaluating the accuracy of these models for a comprehensive set of radionuclides of concern to the NRC. Each critical evaluation would include: identification of discrepancies between the models and current databases; characterization of uncertainties in model predictions of dose per unit intake or unit external exposure; characterization of variability in dose per unit intake or unit external exposure; and evaluation of prospects for development of more accurate models. Uncertainty refers here to the level of knowledge of a central value for a population, and variability refers to quantitative differences between different members of a population. This pilot study provides a critical assessment of models for selected radionuclides representing different levels of knowledge of dose per unit exposure. The main conclusions of this study are as follows: (1) To optimize the use of available NRC resources, the full study should focus on radionuclides most frequently encountered in the workplace or environment. A list of 50 radionuclides is proposed. (2) The reliability of a dose coefficient for inhalation or ingestion of a radionuclide (i.e., an estimate of dose per unit intake) may depend strongly on the specific application. Multiple characterizations of the uncertainty in a dose coefficient for inhalation or ingestion of a radionuclide may be needed for different forms of the radionuclide and different levels of information of that form available to the dose analyst. (3) A meaningful characterization of variability in dose per unit intake of a radionuclide requires detailed information on the biokinetics of the radionuclide and hence is not feasible for many infrequently …
Using time-varying covariates in multilevel growth models
D. Betsy McCoach
2010-06-01
This article provides an illustration of growth curve modeling within a multilevel framework. Specifically, we demonstrate coding schemes that allow the researcher to model discontinuous longitudinal data using a linear growth model in conjunction with time-varying covariates. Our focus is on developing a level-1 model that accurately reflects the shape of the growth trajectory. We demonstrate the importance of adequately modeling the shape of the level-1 growth trajectory in order to make inferences about the importance of both level-1 and level-2 predictors.
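The discontinuity coding schemes described here can be illustrated with a small design-matrix builder. The variable names and the event timing are hypothetical, not the article's example:

```python
def code_discontinuity(waves, event_wave):
    """Level-1 coding for a discontinuous linear growth model:
    `time` carries overall linear growth, `post` is a time-varying
    indicator that shifts the intercept at the event, and
    `time_since` allows a change in slope after the event."""
    rows = []
    for t in waves:
        post = 1 if t >= event_wave else 0
        rows.append({"time": t,
                     "post": post,
                     "time_since": (t - event_wave) if post else 0})
    return rows

# Six measurement waves with a hypothetical intervention at wave 3.
design = code_discontinuity(waves=range(6), event_wave=3)
```

With this coding, the coefficient on `post` captures the immediate jump at the discontinuity and the coefficient on `time_since` captures the change in growth rate afterward, while `time` retains its pre-event slope interpretation.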
Ishola, Kehinde S; Nawawi, Mohd Nm; Abdullah, Khiruddin; Sabri, Ali Idriss Aboubakar; Adiat, Kola Abdulnafiu
2014-01-01
This study attempts to combine the results of geophysical images obtained from three commonly used electrode configurations using an image processing technique, in order to assess their capability to reproduce two-dimensional (2-D) resistivity models. All the inverse resistivity models were processed using the PCI Geomatica software package, commonly used for remote sensing data sets. Preprocessing of the 2-D inverse models was carried out to facilitate further processing and statistical analyses. Four raster layers were created: three were used for the input images, and the fourth served as the output of the combined images. The data sets were merged using a basic statistical approach. Interpreted results show that all images resolved and reconstructed the essential features of the models. An assessment of the accuracy of the images for the four geologic models was performed using four criteria: the mean absolute error, the mean percentage absolute error, the resistivity values of the reconstructed blocks, and their displacements from the true models. Generally, the blocks of the images from the maximum approach give the smallest estimated errors; the displacement of the reconstructed blocks from the true blocks is also the smallest, and the reconstructed resistivities of the blocks are closer to the true blocks than for any other combination used. It is thus corroborated that combining inverse resistivity models yields more reliable and detailed information about the geologic models than using the individual data sets.
Microbial Growth Modeling and Simulation Based on Cellular Automata
Hong Men
2013-07-01
In order to simulate the micro-evolutionary process of microbial growth, [Methods] in this study we adopt a two-dimensional cellular automaton as the growth space. Based on the evolutionary mechanisms of microbes and on cell-cell interactions, we adopt the Moore neighborhood, define the transition rules, and thereby construct the microbial growth model. [Results] The model can describe the relationships among cell growth, division and death, and can effectively reflect the spatial inhibition effect and the substrate limitation effect. [Conclusions] The simulation results show that the CA model is not only consistent with the classic microbial kinetic models, but is also able to simulate microbial growth and evolution.
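A minimal version of such a cellular automaton can be sketched as follows. The crowding threshold is a made-up stand-in for the paper's transition rules, and cell death is omitted for brevity:

```python
def step(grid):
    """One CA update on a 2D lattice (1 = cell, 0 = empty): a cell
    divides into one empty Moore neighbor only if it has fewer than
    4 occupied neighbors (a crude spatial-inhibition /
    substrate-limitation rule)."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] != 1:
                continue
            nbrs = [(i + di, j + dj)
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di or dj) and 0 <= i + di < n and 0 <= j + dj < n]
            occupied = sum(grid[x][y] for x, y in nbrs)
            if occupied < 4:
                for x, y in nbrs:
                    if new[x][y] == 0:
                        new[x][y] = 1
                        break
    return new

# Seed a single cell in the middle of a 9x9 lattice and grow.
grid = [[0] * 9 for _ in range(9)]
grid[4][4] = 1
for _ in range(5):
    grid = step(grid)
population = sum(map(sum, grid))
```

Because division is blocked in crowded neighborhoods, the colony's interior saturates while its edge keeps expanding, which is the qualitative behavior the abstract attributes to spatial inhibition.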
Wang, Min; Zhou, Jianxin; Yin, Yajun; Nan, Hai; Zhang, Dongqiao; Tu, Zhixin
2017-10-01
A 3D cellular automata model is used to simulate normal austenitic grain growth in this study. The proposed model considers both the curvature- and thermodynamics-driven mechanisms of growth. The 3D grain growth kinetics shows good agreement with the Beck equation. Moreover, the growth exponent and grain size distribution calculated by the proposed model coincide well with experimental and simulation results from other researchers. A linear relationship is found between the average relative grain size and the grain face number; more specifically, for average relative grain sizes exceeding 0.5, the number of faces increases linearly with relative grain size. For average relative grain sizes … meaning by adjusting the actual temperature, space, and time for austenitic grain growth. The calibrated results are found to be in agreement with simulation results from other research as well as with experimental results. By means of calibration of the proposed model, we can reliably predict the grain size in actual grain growth.
Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers
Kenny, Sean (Technical Monitor); Wertz, Julie
2002-01-01
As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement alone to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can be used to aid the designer in making high-level architecture decisions. Once these key components have been identified, two main approaches to improving a system through them exist: adding redundancy or improving the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two, applied in varying degrees to each component. Therefore, this research tries to answer the question of how to divide funds between adding redundancy and improving the reliability of components so as to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.
Predicting growth conditions from internal metabolic fluxes in an in-silico model of E. coli.
Viswanadham Sridhara
A widely studied problem in systems biology is to predict bacterial phenotype from growth conditions, using mechanistic models such as flux balance analysis (FBA). However, the inverse prediction of growth conditions from phenotype is rarely considered. Here we develop a computational framework to carry out this inverse prediction on a computational model of bacterial metabolism. We use FBA to calculate bacterial phenotypes from growth conditions in E. coli, and then we assess how accurately we can predict the original growth conditions from the phenotypes. Prediction is carried out via regularized multinomial regression. Our analysis provides several important physiological and statistical insights. First, we show that by analyzing metabolic end products we can consistently predict growth conditions. Second, prediction is reliable even in the presence of small amounts of impurities. Third, flux through a relatively small number of reactions per growth source (∼10) is sufficient for accurate prediction. Fourth, combining the predictions from two separate models, one trained only on carbon sources and one only on nitrogen sources, performs better than models trained to perform joint prediction. Finally, the fact that separate predictions perform better than a more sophisticated joint prediction scheme suggests that carbon and nitrogen utilization pathways, despite jointly affecting cellular growth, may be fairly decoupled in terms of their dependence on specific assortments of molecular precursors.
A literature review on growth models and strategies: The missing link in entrepreneurial growth
Syed Fida Hussain Shah
2013-08-01
This study focuses on the importance of growth models, growth strategies, and the role of knowledge management systems in formulating effective strategies for enterprises pursuing growth. Choice of an appropriate growth strategy is at the heart of any successful entrepreneurial venture. A strategy may be effective for one entrepreneur while it is not for another. Choice of growth strategy depends on various factors, organisational context and environment, which may vary from enterprise to enterprise. The resource-based view is a very important consideration for entrepreneurs on the path of growth. Evaluation of all kinds of resources helps them to grow their enterprises successfully. Selection of an appropriate growth strategy allows entrepreneurs to overcome growth challenges and avoid growth reversals and setbacks.
Bhowmick, Amiya Ranjan; Bhattacharya, Sabyasachi
2014-08-01
Growth of living organisms is a fundamental biological process. It depicts the physiological development of a species in relation to its environment. Mathematical development of growth curve models has a long history. We propose a mathematical model to describe the evolution of the relative growth rate as a function of time, based on a real-life experiment on a major Indian carp, Cirrhinus mrigala. We establish that the proposed model describes the fish growth dynamics of our experimental data more accurately than some existing models, e.g. logistic, Gompertz, exponential. Approximate expressions for the points of inflection and the time of achieving the maximum relative growth rate are derived. We study, in detail, the existence of a nonlinear least squares estimator of the model parameters and its consistency properties. Test statistics are developed to study the equality of points of inflection and the equality of the amount of time necessary to achieve the maximum relative growth rate for a species at two different locations. Using the theory of variance stabilizing transformations, we propose a new test statistic to test the effect of the decay parameter for the proposed growth law. The testing procedure is found to be more sensitive than the test based on nonlinear least squares estimates. Our proposed model provides a general framework to model growth in other disciplines as well.
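The abstract above compares its relative-growth-rate law against logistic and Gompertz models. As background, the sketch below shows the relative growth rate, (1/x)(dx/dt), implied by those two classical laws; the parameter values are illustrative, not the paper's fish-growth estimates.

```python
import math

def rgr_logistic(x, r, K):
    # Logistic law: dx/dt = r x (1 - x/K)  =>  RGR = r (1 - x/K),
    # which declines linearly in the current size x.
    return r * (1.0 - x / K)

def rgr_gompertz(x, r, K):
    # Gompertz law: dx/dt = r x ln(K/x)  =>  RGR = r ln(K/x),
    # which declines linearly in log-size instead.
    return r * math.log(K / x)

# Toy comparison at 10% of the asymptotic size K (illustrative numbers):
r, K = 0.5, 100.0
print(round(rgr_logistic(10.0, r, K), 3))  # 0.45
print(round(rgr_gompertz(10.0, r, K), 3))  # 1.151, i.e. r * ln(10)
```

At small sizes the Gompertz RGR is much larger, which is one reason different growth laws can fit the same data set very differently early in development.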
Nassar, Ahmed; Liu, Qiang; Farias, Kevin; D'Amico, Giuseppe; Tom, Cynthia; Grady, Patrick; Bennett, Ana; Diago Uso, Teresa; Eghtesad, Bijan; Kelly, Dympna; Fung, John; Abu-Elmagd, Kareem; Miller, Charles; Quintini, Cristiano
2015-02-01
Normothermic machine perfusion (NMP) is an emerging preservation modality that holds the potential to prevent the injury associated with low temperature and to promote the organ repair that follows ischemic cell damage. While several animal studies have shown its superiority over cold storage (CS), few studies in the literature have focused on the safety, feasibility, and reliability of this technology, which represent key factors in its implementation into clinical practice. The aim of the present study is to report safety and performance data on NMP of DCD porcine livers. After 60 minutes of warm ischemia time, 20 pig livers were preserved using either NMP (n = 15; physiologic perfusion temperature) or CS (n = 5) for a preservation time of 10 hours. Livers were then tested on a transplant simulation model for 24 hours. Machine safety was assessed by measuring system failure events, the ability to monitor perfusion parameters, sterility, and vessel integrity. The ability of the machine to preserve injured organs was assessed by liver function tests, hemodynamic parameters, and histology. No system failures were recorded. Target hemodynamic parameters were easily achieved and vascular complications were not encountered. Liver function parameters as well as histology showed significant differences between the 2 groups, with NMP livers showing preserved liver function and histological architecture, while CS livers presented postreperfusion parameters consistent with unrecoverable cell injury. Our study shows that NMP is safe, reliable, and provides superior graft preservation compared to CS in our DCD porcine model. © The Author(s) 2014.
Chang-bae Moon
2011-01-01
Although there has been much research on mobile robot localization, it is still difficult to obtain reliable localization performance in a real environment co-existing with humans. Reliability of localization is highly dependent upon the developer's experience because uncertainty is caused by a variety of reasons. We have developed a range-sensor-based integrated localization scheme for various indoor service robots. Through this experience, we found that there are several significant experimental issues. In this paper, we provide useful solutions for the following questions, which are frequently faced in practical applications: (1) How to design an observation likelihood model? (2) How to detect localization failure? (3) How to recover from localization failure? We present design guidelines for the observation likelihood model. Localization failure detection and recovery schemes are presented by focusing on abrupt wheel slippage. Experiments were carried out in a typical office building environment. The proposed scheme to identify the localizer status is useful in practical environments. Moreover, the semi-global localization is a computationally efficient recovery scheme from localization failure. The results of experiments and analysis clearly present the usefulness of the proposed solutions.
DeSousa, Diogo Araújo; Zibetti, Murilo Ricardo; Trentini, Clarissa Marceli; Koller, Silvia Helena; Manfro, Gisele Gus; Salum, Giovanni Abrahão
2014-12-01
The aim of this study was to investigate the utility of creating and scoring subscales for the self-report version of the Screen for Child Anxiety Related Emotional Disorders (SCARED) by examining whether subscale scores provide reliable information after accounting for a general anxiety factor in a bifactor model analysis. A total of 2420 children aged 9-18 answered the SCARED in their schools. Results suggested adequate fit of the bifactor model. The SCARED score variance was hardly influenced by the specific domains after controlling for the common variance in the general factor. The explained common variance (ECV) for the general factor was large (63.96%). After accounting for the general total score (ωh=.83), subscale scores provided very little reliable information (ωh ranged from .005 to .04). Practitioners that use the SCARED should be careful when scoring and interpreting the instrument subscales since there is more common variance to them than specific variance. Copyright © 2014 Elsevier Ltd. All rights reserved.
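The abstract above reports explained common variance (ECV) and omega-hierarchical (ωh) for a bifactor model. The sketch below computes both indices from standardized loadings under orthogonal factors; the six-item loading pattern is entirely hypothetical, not the SCARED's actual estimates.

```python
def omega_h_ecv(lam_g, lam_s, groups):
    """omega-hierarchical and explained common variance (ECV) for a
    bifactor model with standardized items and orthogonal factors.
    lam_g[i]: general-factor loading of item i; lam_s[i]: its specific
    loading; groups[i]: which specific factor item i belongs to."""
    ssq_g = sum(l * l for l in lam_g)
    ssq_s = sum(l * l for l in lam_s)
    ecv = ssq_g / (ssq_g + ssq_s)        # share of common variance due to g
    # Variance of the unit-weighted total score: squared loading sums per
    # factor, plus item uniquenesses (items standardized to variance 1).
    sum_g = sum(lam_g)
    sums_s = {}
    for g, l in zip(groups, lam_s):
        sums_s[g] = sums_s.get(g, 0.0) + l
    uniq = sum(1.0 - a * a - b * b for a, b in zip(lam_g, lam_s))
    var_total = sum_g ** 2 + sum(v * v for v in sums_s.values()) + uniq
    return sum_g ** 2 / var_total, ecv   # omega_h, ECV

# Hypothetical loadings: 6 items, two specific factors of 3 items each.
omega_h, ecv = omega_h_ecv([0.6] * 6, [0.3] * 6, ["A"] * 3 + ["B"] * 3)
print(round(ecv, 2), round(omega_h, 2))  # 0.8 0.72
```

Even with modest specific loadings, ECV and ωh are dominated by the general factor, which mirrors the abstract's finding that subscale scores add little reliable information beyond the total score.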
Parameter Estimates in Differential Equation Models for Population Growth
Winkel, Brian J.
2011-01-01
We estimate the parameters present in several differential equation models of population growth, specifically logistic growth models and two-species competition models. We discuss student-evolved strategies and offer "Mathematica" code for a gradient search approach. We use historical (1930s) data from microbial studies of the Russian biologist,…
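The article above offers Mathematica code for a gradient search; here is a hedged Python analogue of the same idea, fitting the closed-form logistic solution to synthetic data by a coarse grid search (the data, parameter values, and grid are illustrative, and the initial size is held fixed for brevity).

```python
import math

def logistic(t, K, r, x0):
    # Closed-form solution of dx/dt = r x (1 - x/K) with x(0) = x0.
    A = (K - x0) / x0
    return K / (1.0 + A * math.exp(-r * t))

# Synthetic "observations" generated from known parameters K=660, r=0.55:
ts = list(range(0, 16))
data = [logistic(t, 660.0, 0.55, 10.0) for t in ts]

def sse(K, r):
    # Sum of squared errors between the model and the observations.
    return sum((logistic(t, K, r, 10.0) - y) ** 2 for t, y in zip(ts, data))

# Coarse grid search over (K, r), a simple stand-in for the gradient
# search discussed in the article.
best = min(((K, r) for K in range(600, 721, 10)
                   for r in [0.40 + 0.05 * i for i in range(7)]),
           key=lambda p: sse(*p))
print(best)  # recovers K = 660 and r ≈ 0.55
```

With real (noisy) data one would refine the grid or hand the residual function to a proper optimizer, but the objective-function structure is the same.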
Phase field modeling of dendrite growth
Yutuo ZHANG; Chengzhi WANG; Dianzhong LI; Yiyi LI
2009-01-01
Single dendrite and multi-dendrite growth for Al-2 mol pct Si alloy during isothermal solidification are simulated by the phase field method. In the case of single equiaxed dendrite growth, the secondary arms and the necking phenomenon can be observed. For multi-dendrite growth, there exists competitive growth among the dendrites during solidification. As solidification proceeds, growing and coarsening of the primary arms occurs, together with branching and coarsening of the secondary arms. When the diffusion fields of dendrite tips come into contact with those of the branches growing from the neighboring dendrites, the dendrites stop growing and begin to ripen and thicken.
Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Toloczko, Mychailo B.; Johnson, Kenneth I.; Sanborn, Scott E.
2011-07-01
This is a working report drafted under the Risk-Informed Safety Margin Characterization pathway of the Light Water Reactor Sustainability Program, describing statistical models of passives component reliabilities.
Berg, Melanie; Label, Kenneth; Campola, Michael; Xapsos, Michael
2017-01-01
We propose a method for the application of single event upset (SEU) data towards the analysis of complex systems using transformed reliability models (from the time domain to the particle fluence domain) and space environment data.
Halo Models of Large Scale Structure and Reliability of Cosmological N-Body Simulations
Gaite, Jose
2013-01-01
Halo models of the large scale structure of the Universe are critically examined, focusing on the definition of halos as smooth distributions of cold dark matter. This definition is essentially based on the results of cosmological N-body simulations. By a careful analysis of the standard assumptions of halo models and N-body simulations and by taking into account previous studies of self-similarity of the cosmic web structure, we conclude that N-body cosmological simulations are not fully reliable in the range of scales where halos appear. Therefore, to have a consistent definition of halos, it is necessary either to define them as entities of arbitrary size with a grainy rather than smooth structure or to define their size in terms of small-scale baryonic physics.
Popov, A.; Zolotarev, V.; Bychkov, S.
2016-11-01
This paper examines the results of experimental studies of a previously presented combined algorithm designed to increase the reliability of information systems. Data illustrating the organization and conduct of the studies are provided. As part of the study, the experimental data from simulation modeling were compared with data from the functioning of a real information system. The hypothesis of the homogeneity of the logical structure of information systems was formulated, making it possible to reconfigure the presented algorithm, more specifically, to transform it into a model for the analysis and prediction of arbitrary information systems. The results presented can be used for further research in this direction. The demonstrated ability to predict the functioning of information systems can be used for strategic and economic planning. The algorithm can also be used as a means of providing information security.
An autocatalytic kinetic model for describing microbial growth during fermentation.
Ibarz, Albert; Augusto, Pedro E D
2015-01-01
The mathematical modelling of microbial growth behaviour is widely desired in order to control, predict and design food and bioproduct processing, stability and safety. This work develops and proposes a new semi-empirical mathematical model, based on autocatalytic kinetics, to describe microbial growth through its biomass concentration. The proposed model was successfully validated using 15 microbial growth patterns, covering the three most important types of microorganisms in food and biotechnological processing (bacteria, yeasts and moulds). Its main advantages and limitations are discussed, as well as the interpretation of its parameters. It is shown that the new model can be used to describe the behaviour of microbial growth.
Bifactor Modeling and the Estimation of Model-Based Reliability in the WAIS-IV
Gignac, Gilles E.; Watkins, Marley W.
2013-01-01
Previous confirmatory factor analytic research that has examined the factor structure of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) has endorsed either higher order models or oblique factor models that tend to amalgamate both general factor and index factor sources of systematic variance. An alternative model that has not yet…
Progress in modeling of fluid flows in crystal growth processes
Qisheng Chen; Yanni Jiang; Junyi Yan; Ming Qin
2008-01-01
Modeling of fluid flows in crystal growth processes has become an important research area in theoretical and applied mechanics. Most crystal growth processes involve fluid flows, such as flows in the melt, solution or vapor. Theoretical modeling has played an important role in developing technologies used for growing semiconductor crystals for high performance electronic and optoelectronic devices. The application of devices requires large diameter crystals with a high degree of crystallographic perfection, low defect density and uniform dopant distribution. In this article, the flow models developed in modeling of crystal growth processes such as the Czochralski, ammonothermal and physical vapor transport methods are reviewed. In the Czochralski growth modeling, the flow models for thermocapillary flow, turbulent flow and MHD flow have been developed. In the ammonothermal growth modeling, the buoyancy and porous media flow models have been developed based on a single-domain and continuum approach for the composite fluid-porous layer systems. In the physical vapor transport growth modeling, the Stefan flow model has been proposed based on the flow-kinetics theory for the vapor growth. In addition, perspectives for future studies on crystal growth modeling are proposed.
Yun, Lifen; Wang, Xifu; Fan, Hongqiang; Li, Xiaopeng
2017-01-01
This paper proposes a reliable facility location design model under imperfect information with site-dependent disruptions; i.e., each facility is subject to a unique disruption probability that varies across the space. In the imperfect information contexts, customers adopt a realistic "trial-and-error" strategy to visit facilities; i.e., they visit a number of pre-assigned facilities sequentially until they arrive at the first operational facility or give up looking for the service. This proposed model aims to balance initial facility investment and expected long-term operational cost by finding the optimal facility locations. A nonlinear integer programming model is proposed to describe this problem. We apply a linearization technique to reduce the difficulty of solving the proposed model. A number of problem instances are studied to illustrate the performance of the proposed model. The results indicate that our proposed model can reveal a number of interesting insights into the facility location design with site-dependent disruptions, including the benefit of backup facilities and system robustness against variation of the loss-of-service penalty.
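The "trial-and-error" customer strategy above has a simple expected-cost structure: a customer is served at the k-th pre-assigned facility only if all earlier ones are disrupted, and pays the loss-of-service penalty if every facility fails. The sketch below computes that expectation under independent, site-dependent disruption probabilities; the simplified cost accounting (one travel cost per serving facility) and all numbers are illustrative assumptions, not the paper's formulation.

```python
def expected_service_cost(route, penalty):
    """Expected cost for a customer visiting pre-assigned facilities in
    order until one is operational ('trial-and-error' strategy).
    route: list of (travel_cost, disruption_prob) in visiting order;
    penalty: cost incurred if all assigned facilities are disrupted.
    Simplification: the customer pays only the serving facility's
    travel cost, and disruptions are independent across sites."""
    prob_all_failed = 1.0
    cost = 0.0
    for travel, p in route:
        cost += prob_all_failed * (1.0 - p) * travel  # served here
        prob_all_failed *= p                          # keep trying
    return cost + prob_all_failed * penalty

# Two pre-assigned facilities versus one: the backup cuts expected cost.
print(round(expected_service_cost([(10.0, 0.1), (25.0, 0.2)], 100.0), 6))  # 13.0
print(round(expected_service_cost([(10.0, 0.1)], 100.0), 6))               # 19.0
```

This is the quantity the location model trades off against facility investment: adding the farther backup facility lowers the expected long-term cost from 19.0 to 13.0 in this toy instance, which echoes the paper's observation about the benefit of backup facilities.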
Keith Zvoch
2006-01-01
Mathematics achievement data from three longitudinally matched student cohorts were analyzed with multilevel growth models to investigate the viability of using status- and growth-based indices of student achievement to examine the multi-year performance of schools. Elementary schools in a large southwestern school district were evaluated in terms of the mean achievement status and growth of students across cohorts as well as changes in the achievement status and growth of students between student cohorts. Results indicated that the cross- and between-cohort performance of schools differed depending on whether the mean achievement status or growth of students was considered. Results also indicated that the cross-cohort indicators of school performance were more reliably estimated than their between-cohort counterparts. Further examination of the performance indices revealed that cross-cohort achievement status estimates were closely related to student demographics, while between-cohort estimates were associated with cohort enrollment size and cohort initial performance status. Of the four school performance indices studied, only student growth in achievement (averaged across cohorts) provided a relatively reliable and unbiased indication of school performance. Implications for the No Child Left Behind school accountability framework are discussed.
Stamp, Jason E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jensen, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2016-01-01
Microgrids are a focus of localized energy production that support resiliency, security, local control, and increased access to renewable resources (among other potential benefits). The Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) Joint Capability Technology Demonstration (JCTD) program between the Department of Defense (DOD), Department of Energy (DOE), and Department of Homeland Security (DHS) resulted in the preliminary design and deployment of three microgrids at military installations. This paper is focused on the analysis process and supporting software used to determine optimal designs for energy surety microgrids (ESMs) in the SPIDERS project. There are two key pieces of software, an existing software application developed by Sandia National Laboratories (SNL) called Technology Management Optimization (TMO) and a new simulation developed for SPIDERS called the performance reliability model (PRM). TMO is a decision support tool that performs multi-objective optimization over a mixed discrete/continuous search space for which the performance measures are unrestricted in form. The PRM is able to statistically quantify the performance and reliability of a microgrid operating in islanded mode (disconnected from any utility power source). Together, these two software applications were used as part of the ESM process to generate the preliminary designs presented by the SNL-led DOE team to the DOD. Acknowledgements: Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: * Mike Hightower, who has been the key driving force for Energy Surety Microgrids * Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations * Merrill Smith, U.S. Department of Energy SPIDERS Program Manager * Ross Roley and Rich Trundy from U.S. Pacific Command * Bill Waugaman and Bill Beary from U.S. Northern Command * Tarek Abdallah, Melanie
Bönecke, Eric; Franko, Uwe
2015-04-01
Soil organic matter (SOM) and soil organic carbon (SOC) might be the most important components describing the fertility of agriculturally used soils. SOC is sensitive to temporal and spatial changes due to varying weather conditions, uneven crops and soil management practices, and reliable delineation of its spatial variability remains difficult. Soil organic carbon, furthermore, is an essential initial parameter for dynamic modelling, e.g. for understanding carbon and nitrogen processes. However, obtaining and using this information requires cost- and time-intensive field and laboratory work. The objective of this study is to assess an approach that reduces the effort of laboratory and field analyses by using a method to find stable initial soil organic carbon values for further soil process modelling and regionalization at field scale. Strategies, techniques and tools that improve high-resolution soil organic carbon maps while reducing cost constraints are hence still receiving increasing attention in scientific research. Although combining effective sampling schemes with geophysical sensing techniques is nowadays a widely used practice for describing within-field variability of soil organic carbon, large uncertainties remain, even at field scale, in both science and agriculture. An analytical and modelling approach might therefore facilitate and improve this strategy at small and large field scales. This study will show a method for finding reliable steady-state values of soil organic carbon at particular points, using the approved soil process model CANDY (Franko et al. 1995). It focuses on an iterative algorithm adjusting the key driving components: soil physical properties, meteorological data and management information, for which we quantified the inputs and losses of soil carbon (manure, crop residues, other organic inputs, decomposition, leaching). Furthermore, this approach can be combined with geophysical
A dynamic model of tomato fruit growth integrating cell division, cell growth and endoreduplication
Fanwoua, J.; Visser, de P.H.B.; Heuvelink, E.; Yin, X.; Struik, P.C.; Marcelis, L.F.M.
2013-01-01
In this study, we developed a model of tomato (Solanum lycopersicum L.) fruit growth integrating cell division, cell growth and endoreduplication. The fruit was considered as a population of cells grouped in cell classes differing in their initial cell age and cell mass. The model describes fruit gr
Aubry, Keith B.; Raley, Catherine M.; McKelvey, Kevin S.
2017-01-01
The availability of spatially referenced environmental data and species occurrence records in online databases enable practitioners to easily generate species distribution models (SDMs) for a broad array of taxa. Such databases often include occurrence records of unknown reliability, yet little information is available on the influence of data quality on SDMs generated for rare, elusive, and cryptic species that are prone to misidentification in the field. We investigated this question for the fisher (Pekania pennanti), a forest carnivore of conservation concern in the Pacific States that is often confused with the more common Pacific marten (Martes caurina). Fisher occurrence records supported by physical evidence (verifiable records) were available from a limited area, whereas occurrence records of unknown quality (unscreened records) were available from throughout the fisher’s historical range. We reserved 20% of the verifiable records to use as a test sample for both models and generated SDMs with each dataset using Maxent. The verifiable model performed substantially better than the unscreened model based on multiple metrics including AUC test values (0.78 and 0.62, respectively), evaluation of training and test gains, and statistical tests of how well each model predicted test localities. In addition, the verifiable model was consistent with our knowledge of the fisher’s habitat relations and potential distribution, whereas the unscreened model indicated a much broader area of high-quality habitat (indices > 0.5) that included large expanses of high-elevation habitat that fishers do not occupy. Because Pacific martens remain relatively common in upper elevation habitats in the Cascade Range and Sierra Nevada, the SDM based on unscreened records likely reflects primarily a conflation of marten and fisher habitat. Consequently, accurate identifications are far more important than the spatial extent of occurrence records for generating reliable SDMs for the
Modeling Urban Spatial Growth in Mountainous Regions of Western China
Guoping Huang
2017-08-01
The scale and speed of urbanization in the mountainous regions of western China have received little attention from researchers. These cities are facing rapid population growth and severe environmental degradation. This study analyzed historical urban growth trends in this mountainous region to better understand the interaction between the spatial growth pattern and the mountainous topography. Three major factors—slope, accessibility, and land use type—were studied in light of their relationships with urban spatial growth. With the analysis of historical data as the basis, a conceptual urban spatial growth model was devised. In this model, slope, accessibility, and land use type together create resistance to urban growth, while accessibility controls the sequence of urban development. The model was tested and evaluated using historical data. It serves as a potential tool for planners to envision and assess future urban growth scenarios and their potential environmental impacts to make informed decisions.
FTEC: a coalescent simulator for modeling faster than exponential growth
Reppell, Mark; Boehnke, Michael; Zöllner, Sebastian
2012-01-01
Summary: Recent genetic studies as well as recorded history point to massive growth in human population sizes during the recent past. To model and understand this growth accurately we introduce FTEC, an easy-to-use coalescent simulation program capable of simulating haplotype samples drawn from a population that has undergone faster than exponential growth. Samples drawn from a population that has undergone faster than exponential growth show an excess of very rare variation and more rapid LD...
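To make "faster than exponential" concrete: under exponential growth the per-capita rate is constant, while under a faster-than-exponential law it increases over time. The sketch below contrasts the two; the accelerating law here (per-capita rate linear in time) is one simple illustration and is not claimed to be FTEC's actual parameterization.

```python
import math

def n_exponential(t, n0, r):
    # Constant per-capita growth rate r: N(t) = N0 * exp(r t).
    return n0 * math.exp(r * t)

def n_super_exponential(t, n0, r, a):
    # Illustrative faster-than-exponential law in which the per-capita
    # rate itself grows linearly in time, r(t) = r + a t:
    # dN/dt = (r + a t) N  =>  N(t) = N0 * exp(r t + a t^2 / 2).
    return n0 * math.exp(r * t + 0.5 * a * t * t)

# With a > 0 the accelerating trajectory pulls ahead of the exponential
# one sharing the same initial rate r; with a = 0 they coincide.
for t in (0.0, 10.0, 20.0):
    print(t, round(n_exponential(t, 1000.0, 0.02), 1),
             round(n_super_exponential(t, 1000.0, 0.02, 0.02), 1))
```

It is this kind of acceleration in recent human population history that produces the excess of very rare variants the abstract mentions.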
Reliability and Stability of VLBI-Derived Sub-Daily EOP Models
Artz, Thomas; Boeckmann, Sarah; Jensen, Laura; Nothnagel, Axel; Steigenberger, Peter
2010-01-01
Recent investigations have shown significant shortcomings in the model which is proposed by the IERS to account for the variations in the Earth's rotation with periods around one day and less. To overcome this, an empirical model can be estimated more or less directly from the observations of space geodetic techniques. The aim of this paper is to evaluate the quality and reliability of such a model based on VLBI observations. Therefore, the impact of the estimation method and the analysis options as well as the temporal stability are investigated. It turned out that, in order to provide a realistic accuracy measure of the model coefficients, the formal errors should be inflated by a factor of three. This coincides with the noise floor and the repeatability of the model coefficients and it captures almost all of the differences that are caused by different estimation techniques. The impact of analysis options is small but significant when changing troposphere parameterization or including harmonic station position variations.
A consistent modelling methodology for secondary settling tanks: a reliable numerical method.
Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena
2013-01-01
The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
Rahman, P. A.; Bobkova, E. Yu
2017-01-01
This paper deals with a reliability model of a restorable non-stop computing system with triple-modular redundancy based on independent computing nodes, taking into consideration the finite time for node activation and different node failure rates in the active and passive states. The generalized reliability model obtained by the authors, and the calculation formulas for the reliability indices of a system based on identical and independent computing nodes with a given threshold on the number of active nodes at which the system is considered operable, are also discussed. Finally, the application of the generalized model to the particular case of a non-stop restorable computing system with triple-modular redundancy based on independent nodes, together with calculation examples for the reliability indices, is provided.
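For orientation, the textbook baseline behind triple-modular redundancy is the 2-of-3 majority-voting reliability of identical, independent, non-repairable nodes. The sketch below computes it; this is a deliberate simplification of the paper's model, which additionally covers repair, finite activation time, and state-dependent failure rates.

```python
import math

def r_node(t, lam):
    # Reliability of one non-repairable node with constant failure rate lam.
    return math.exp(-lam * t)

def r_tmr(t, lam):
    # 2-of-3 majority voting: the system survives if at least two of the
    # three independent nodes survive:
    # R_TMR = 3 R^2 (1 - R) + R^3 = 3 R^2 - 2 R^3.
    r = r_node(t, lam)
    return 3 * r ** 2 - 2 * r ** 3

# TMR beats a single node only while R > 0.5 (early mission life):
print(r_tmr(100.0, 1e-3) > r_node(100.0, 1e-3))  # True (R ≈ 0.905 here)
```

This crossover at R = 0.5 is why repair (restorability), as in the paper's model, matters: it keeps node reliability in the regime where voting redundancy actually helps.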
Differential model of macroeconomic growth with endogenic cyclicity
Mikhail I. Geraskin
2017-09-01
Objective: to elaborate a mathematical model of economic growth taking into account the cyclical nature of macroeconomic dynamics, with the model parameters based on Russian economic statistics. Methods: economic and mathematical modeling, system analysis, regression factor analysis, econometric time series analysis. Results: the article states that under unstable economic growth in Russia, forecasting the strategic prospects of the Russian economy is one of the topical directions of scientific study. Furthermore, construction of predictive models should be based on multiple factors, taking into account such basic concepts as the neo-Keynesian Harrod-Domar model, the Ramsey-Cass-Koopmans model, S. V. Dubovskiy's concept, as well as the neoclassical growth model by R. Solow. They served as the basis for developing a multifactor differential economic growth model, which is a modification of the neoclassical growth model by R. Solow taking into account the labor-saving and capital-saving forms of scientific-technical progress and the Keynesian concept of investment. The model parameters are determined based on the dynamics of actual GDP, employment, fixed assets and investments in fixed assets for 1965-2016 in Russia, on the basis of official statistics. The generalized model showed the presence of long-wave fluctuations that are not detected when modeling individual periods. A cyclical character of macroeconomic dynamics with a period of 54 years was found, which corresponds to the parameters of long waves by N. D. Kondratiev. Based on the model, a macroeconomic growth forecast was generated, which shows that after 2020 the increase of scientific-technical progress will be negative. Scientific novelty: a model is proposed of the scientific-technical progress indicator showing the growth rate of the capital productivity ratio to the saving rate; a differential model of macroeconomic growth is obtained which endogenously takes cyclicity into account
Time-Dependent Reliability Modeling and Analysis Method for Mechanics Based on Convex Process
Lei Wang
2015-01-01
The objective of the present study is to evaluate the time-dependent reliability of dynamic mechanics with insufficient time-varying uncertainty information. In this paper, the nonprobabilistic convex process model, which contains autocorrelation and cross-correlation, is first employed for the quantitative assessment of the time-variant uncertainty in structural performance characteristics. By combining the set-theory method and a regularization treatment, the time-varying properties of the structural limit state are determined, and a standard convex process with autocorrelation for describing the limit state is formulated. By virtue of the classical first-passage method in random process theory, a new nonprobabilistic measure index of time-dependent reliability is proposed and its solution strategy is mathematically derived. Furthermore, the Monte Carlo simulation method is also discussed to illustrate the feasibility and accuracy of the developed approach. Three engineering cases clearly demonstrate that the proposed method may provide a more reasonable and more efficient way to estimate structural safety than Monte Carlo simulations throughout a product life-cycle.
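The Monte Carlo comparison mentioned at the end of the abstract can be sketched as a first-passage simulation: sample the limit state as a discretized autocorrelated process and record a failure whenever it dips below zero anywhere in the interval. This is a generic illustration of the benchmark method, not the authors' nonprobabilistic convex model; all parameters are invented.

```python
import math
import random

# First-passage Monte Carlo sketch: the limit state g(t) = mean + sigma * x(t),
# where x(t) is a standard AR(1) (autocorrelated) process; failure occurs if
# g(t) < 0 at any of the discrete time steps.
def first_passage_prob(mean=3.0, sigma=1.0, rho=0.9, steps=50, trials=20000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        x = rng.gauss(0.0, 1.0)
        failed = mean + sigma * x < 0
        for _ in range(steps - 1):
            # AR(1) update preserves unit variance and gives autocorrelation rho
            x = rho * x + math.sqrt(1 - rho**2) * rng.gauss(0.0, 1.0)
            if mean + sigma * x < 0:
                failed = True
        failures += failed
    return failures / trials

pf = first_passage_prob()
```

Because the failure probability is small, many trials are needed for a stable estimate, which is exactly the cost that motivates analytical first-passage measures like the one the paper proposes.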
Overview of RELCOMP, the reliability and cost model for electrical generation planning
Buehring, W.A.; Hub, K.A.; VanKuiken, J.C.
1979-11-01
RELCOMP is a system-planning tool that can be used to assess the reliability and economic performance of alternative expansion patterns of electric-utility generating systems. Given input information such as capacity, forced outage rate, number of weeks of annual scheduled maintenance, and economic data for individual units, along with the expected utility load characteristics, the non-optimizing model calculates a system maintenance schedule, the loss-of-load probability, unserved demand for energy, mean time between system failures to meet the load, required reserve to meet a specified system-failure rate, expected energy generation from each unit, and system energy cost. Emergency interties and firm purchases can be included in the analysis. The calculation can be broken down into five distinct categories: maintenance scheduling, system reliability, capacity requirement, energy allocation, and energy cost. This brief description of the program is intended to serve as preliminary documentation for RELCOMP until more complete documentation is prepared. In addition to this documentation, a sample problem and a detailed input description are available from the authors.
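The loss-of-load probability (LOLP) that RELCOMP reports is conventionally computed from a capacity outage probability table built by convolving two-state generating units. The sketch below shows that textbook construction with invented unit data; it is an illustration of the general LOLP idea, not RELCOMP's actual algorithm.

```python
# Capacity outage probability table + LOLP sketch (unit data are invented).
def outage_table(units):
    """units: list of (capacity_mw, forced_outage_rate).
    Returns {capacity_on_outage: probability}, built by convolving
    independent two-state (up/down) units."""
    table = {0: 1.0}
    for cap, forced_or in units:
        new = {}
        for out, p in table.items():
            new[out] = new.get(out, 0.0) + p * (1 - forced_or)        # unit up
            new[out + cap] = new.get(out + cap, 0.0) + p * forced_or  # unit down
        table = new
    return table

def lolp(units, loads):
    """Probability that available capacity falls short of load,
    averaged over the sampled points of the load duration curve."""
    total = sum(cap for cap, _ in units)
    table = outage_table(units)
    return sum(
        sum(p for out, p in table.items() if total - out < load)
        for load in loads
    ) / len(loads)

units = [(100, 0.05), (150, 0.08), (200, 0.10)]
risk = lolp(units, loads=[180, 250, 300])
```

The table doubles in size with each unit in the worst case, so production codes round capacities to a fixed step; the principle is unchanged.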
Vrba, E S
1998-02-07
New models for multiphasic growth are presented. They are illustrated by analysis of brain growth in humans and chimpanzees, and the results are used to test the hypothesis of evolution by proportional growth prolongation: that all descendant growth phases are extended by the same factor while each remains at the ancestral growth rate. The results are consistent with the hypothesis and imply that gross brain weight increase towards humans required change in only one growth parameter: prolongation of the nonlinear ancestral growth phases. The restricted and orderly nature of the developmental changes hints at a basis in few genetic changes. Proportional growth prolongation is of general evolutionary importance because it can reorganize body proportions.
Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian
2013-01-01
Reliability analysis of fiber-reinforced composite structures is a relatively unexplored field, and it is therefore expected that engineers and researchers trying to apply such an approach will meet certain challenges until more knowledge is accumulated. While doing the analyses included in the present paper, the authors have experienced some of the possible pitfalls on the way to completing a precise and robust reliability analysis for layered composites. Results showed that in order to obtain accurate reliability estimates it is necessary to account for the various failure modes described by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...
Li, Yan; Chen, Jianjun; Liu, Jipeng; Zhang, Lei; Wang, Weiguo; Zhang, Shaofeng
2013-09-01
The reliability of all-ceramic crowns is of concern to both patients and doctors. This study introduces a new methodology for quantifying the reliability of all-ceramic crowns based on stress-strength interference theory and finite element models. The variables selected for the reliability analysis include the magnitude of the occlusal contact area, the occlusal load, and the residual thermal stress. The calculated reliabilities of crowns under different loading conditions showed that too small an occlusal contact area, or too great a difference in the thermal coefficient between the veneer and core layers, led to high failure probabilities. These results were consistent with many previous reports. The methodology is therefore shown to be a valuable method for analyzing the reliability of restorations in the complicated oral environment.
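Stress-strength interference defines reliability as the probability that strength exceeds the applied stress. A minimal Monte Carlo version of that definition is sketched below; the normal distributions and their parameters are assumptions for illustration, not the crown data from the study.

```python
import random

# Stress-strength interference sketch: reliability = P(strength > stress).
# mu_s/sd_s describe the strength distribution, mu_l/sd_l the load (stress).
def interference_reliability(mu_s=500.0, sd_s=50.0, mu_l=300.0, sd_l=60.0,
                             trials=100000, seed=7):
    rng = random.Random(seed)
    wins = sum(rng.gauss(mu_s, sd_s) > rng.gauss(mu_l, sd_l)
               for _ in range(trials))
    return wins / trials

rel = interference_reliability()
```

For independent normals the same quantity has the closed form Phi((mu_s - mu_l) / sqrt(sd_s**2 + sd_l**2)); the simulation approach generalizes to the finite-element-derived stress distributions the paper uses.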
Graciele Viccini
2003-01-01
A mathematical model is developed for converting between the two measurement bases commonly used in the construction of growth profiles in solid-state fermentation, namely the absolute mass ratio m(dry biomass)/m(initial dry matter) and the relative mass ratio m(dry biomass)/m(dry matter). These are not equivalent, due to the loss of dry matter as CO2 during the fermentation. The model is equally applicable to any biomass component used in indirect measurements of growth, such as protein. Use of the model to convert the absolute mass ratio biomass profiles for the growth of Rhizopus oligosporus to a relative basis gave profiles that agreed well with the experimentally determined relative biomass profiles. This agreement was obtained for three different fermentations using the same set of parameter values in the model, namely a yield coefficient of m(protein)/m(dry substrate) = 0.2 g/g and a maintenance coefficient of zero, giving confidence in the reliability of the model. The model was then used to show that the measurement basis used can affect the form of the curve and therefore can also affect the conclusion drawn about the type of kinetics shown by the organism, with the extent of this effect depending on the length of time that growth occurs and the values of the yield and maintenance coefficients. This work shows that great care must be taken in drawing conclusions about growth kinetics in solid-state fermentation.
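The basis conversion can be sketched with a simple mass balance. Assuming, as a simplification of the article's idea, a maintenance coefficient of zero and that all substrate not incorporated into biomass leaves as CO2, the dry matter at time t is the initial dry matter minus that CO2 loss. The form below is a hedged sketch, not the paper's exact model.

```python
# Convert an absolute biomass ratio a = X / D0 into the relative ratio
# r = X / D(t), where D(t) = D0 - CO2 loss. With yield Y (biomass formed per
# substrate consumed), substrate consumed = X / Y, so
#   CO2 loss = X / Y - X  and  r = X / (D0 - X * (1/Y - 1)).
# Valid only while the denominator stays positive.
def relative_from_absolute(a, yield_y=0.2):
    co2_loss_per_biomass = 1.0 / yield_y - 1.0
    return a / (1.0 - a * co2_loss_per_biomass)

r = relative_from_absolute(0.1, yield_y=0.2)
```

With these assumptions, an absolute ratio of 0.1 becomes a relative ratio of 1/6, which illustrates the article's point: the two bases diverge as fermentation proceeds, so growth curves plotted on different bases can suggest different kinetics.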
Computational modeling of hypertensive growth in the human carotid artery
Sáez, Pablo; Peña, Estefania; Martínez, Miguel Angel; Kuhl, Ellen
2014-06-01
Arterial hypertension is a chronic medical condition associated with an elevated blood pressure. Chronic arterial hypertension initiates a series of events, which are known to collectively initiate arterial wall thickening. However, the correlation between macrostructural mechanical loading, microstructural cellular changes, and macrostructural adaptation remains unclear. Here, we present a microstructurally motivated computational model for chronic arterial hypertension through smooth muscle cell growth. To model growth, we adopt a classical concept based on the multiplicative decomposition of the deformation gradient into an elastic part and a growth part. Motivated by clinical observations, we assume that the driving force for growth is the stretch sensed by the smooth muscle cells. We embed our model into a finite element framework, where growth is stored locally as an internal variable. First, to demonstrate the features of our model, we investigate the effects of hypertensive growth in a real human carotid artery. Our results agree nicely with experimental data reported in the literature both qualitatively and quantitatively.
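The kinematics this abstract refers to can be written compactly. The transversely isotropic growth tensor and the stretch-driven evolution law below are a generic sketch of this family of models (the symbols lambda_0 for the smooth muscle direction and k, lambda^crit for the growth kinetics are assumptions, not necessarily the authors' exact notation):

```latex
F = F^{e}\,F^{g}, \qquad
F^{g} = I + (\vartheta - 1)\,\lambda_0 \otimes \lambda_0, \qquad
\dot{\vartheta} = k(\vartheta)\,\big\langle \lambda^{e} - \lambda^{\mathrm{crit}} \big\rangle ,
```

where the scalar growth multiplier \(\vartheta\) is stored locally as an internal variable at each integration point, and growth is activated only when the elastic stretch \(\lambda^{e}\) sensed by the smooth muscle cells exceeds a critical value.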
Centrality Fingerprints for Power Grid Network Growth Models
Gurfinkel, Aleks Jacob; Rikvold, Per Arne
2015-01-01
In our previous work, we have shown that many of the properties of the Florida power grid are reproduced by deterministic network growth models based on the minimization of energy dissipation $E_\mathrm{diss}$. As there is no a priori best $E_\mathrm{diss}$-minimizing growth model, we here present a tool, called the "centrality fingerprint," for probing the behavior of different growth models. The centrality fingerprints are comparisons of the current flow into/out of the network with the values of various centrality measures calculated at every step of the growth process. Finally, we discuss applications to the Maryland power grid.
Shelby D. Hunt
2012-12-01
Foss (2012) provides an informed and informative comment on my article “Trust, Personal Moral Codes, and the Resource-Advantage Theory of Competition: Explaining Productivity, Economic Growth, and Wealth Creation” (Hunt, 2012). In general, his comment is highly supportive of both the theory and the arguments developed in my article. He does, however, raise certain issues that need to be addressed. These issues relate to the concept of total factor productivity, the role of institutions in promoting economic growth, and the importance of understanding how transaction costs impact entrepreneurship and economic growth. This reply focuses on his discussion of growth economics and endogenous economic growth models.
Development of a model selection method based on the reliability of a soft sensor model
Takeshi Okada
2012-04-01
Soft sensors are widely used to realize highly efficient operation in chemical processes, because not every important variable, such as product quality, is measured online. By using soft sensors, such a difficult-to-measure variable y can be estimated from other process variables which are measured online. In order to estimate values of y without degradation of a soft sensor model, a time difference (TD) model was proposed previously. Though a TD model has high predictive ability, the model does not function well when process conditions have never been observed. To cope with this problem, a soft sensor model can be updated with the newest data, but updating a model requires time and effort from plant operators. We therefore developed an online monitoring system to judge whether a TD model can predict values of y accurately or an updating model should be used, both to reduce maintenance cost and to improve the predictive accuracy of soft sensors. The monitoring system is based on a support vector machine, or on the standard deviation of y-values estimated from various intervals of time difference. We confirmed that the proposed system functioned successfully through the analysis of real industrial data from a distillation process.
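The time-difference idea itself is simple: rather than modeling y from x directly, the model relates the change in y to the change in x over some interval, which makes it robust to slow sensor and process drift. A one-variable linear sketch (the coefficient and values are invented, and the paper's TD models are multivariate):

```python
# Time-difference (TD) soft-sensor sketch: predict y(t) from its past value
# plus a model of the *difference*, y(t) - y(t-i) ~ slope * (x(t) - x(t-i)).
# Because constant offsets cancel in the differences, slow drift in x or y
# does not bias the prediction.
def td_predict(x_now, x_past, y_past, slope=2.0):
    return y_past + slope * (x_now - x_past)

y_hat = td_predict(x_now=5.0, x_past=4.0, y_past=10.0)
```

The monitoring question the paper addresses is then: when the spread of such predictions across different difference intervals grows large, the TD model is leaving its validity region and an updated model should take over.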
Balance of payments constrained growth models: history and overview
Anthony P. Thirlwall
2011-12-01
Thirlwall’s 1979 balance of payments constrained growth model predicts that a country’s long-run growth of GDP can be approximated by the ratio of the growth of real exports to the income elasticity of demand for imports, assuming negligible effects from real exchange rate movements. The paper surveys developments of the model since then, allowing for capital flows, interest payments on debt, terms-of-trade movements, and disaggregation of the model by commodities and trading partners. Various tests of the model are discussed, and an extensive list of papers that have examined the model is presented.
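The rule the abstract states reduces to a one-line calculation, often written g = x / pi, with x the growth of real exports and pi the income elasticity of import demand. The numbers below are hypothetical, purely to show the arithmetic:

```python
# Thirlwall's rule as stated in the abstract: long-run GDP growth is
# approximated by export growth divided by the income elasticity of imports.
def bop_constrained_growth(export_growth, import_income_elasticity):
    return export_growth / import_income_elasticity

# e.g. 6% real export growth and an import income elasticity of 1.5
g = bop_constrained_growth(0.06, 1.5)
```

So a country whose exports grow at 6% a year but whose imports rise 1.5% for every 1% of income growth can sustain roughly 4% GDP growth before the balance of payments binds.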
Stochastic growth logistic model with aftereffect for batch fermentation process
Rosli, Norhayati; Ayoubi, Tawfiqullah [Faculty of Industrial Sciences and Technology, Universiti Malaysia Pahang, Lebuhraya Tun Razak, 26300 Gambang, Pahang (Malaysia); Bahar, Arifah; Rahman, Haliza Abdul [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia); Salleh, Madihah Md [Department of Biotechnology Industry, Faculty of Biosciences and Bioengineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia)
2014-06-19
In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262, together with Luedeking-Piret equations for solvent production in a batch fermentation system, is introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method for non-linear least squares. We apply the Milstein scheme to solve the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results with the experimental data for microbial growth and solvent production in the batch system. Low values of the Root Mean-Square Error (RMSE) for the stochastic models with aftereffect indicate good fits.
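The Milstein scheme named in the abstract can be illustrated on a plain stochastic logistic SDE, dX = rX(1 - X/K)dt + sigma X dW. This is a generic sketch of the scheme: the paper's model additionally carries an aftereffect (delay) term, which is omitted here, and all parameter values are invented.

```python
import math
import random

# Milstein discretization of dX = r X (1 - X/K) dt + sigma X dW.
# For diffusion g(x) = sigma * x, g'(x) = sigma, so the Milstein correction
# term is 0.5 * g(x) * g'(x) * (dW**2 - dt).
def milstein_logistic(x0=0.1, r=0.8, K=10.0, sigma=0.1, dt=0.01, steps=2000, seed=3):
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        drift = r * x * (1 - x / K)
        diff = sigma * x
        x = x + drift * dt + diff * dw + 0.5 * diff * sigma * (dw * dw - dt)
        path.append(x)
    return path

path = milstein_logistic()
```

With sigma set to zero the scheme collapses to Euler on the deterministic logistic equation and the path settles at the carrying capacity K; with noise it fluctuates around it, which is the qualitative behavior the RMSE comparison in the paper quantifies.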
Estimation of growth parameters using a nonlinear mixed Gompertz model.
Wang, Z; Zuidhof, M J
2004-06-01
In order to maximize the utility of simulation models for decision making, accurate estimation of growth parameters and associated variances is crucial. A mixed Gompertz growth model was used to account for between-bird variation and heterogeneous variance. The mixed model had several advantages over the fixed effects model. The mixed model partitioned BW variation into between- and within-bird variation, and the covariance structure assumed with the random effect accounted for part of the BW correlation across ages in the same individual. The amount of residual variance decreased by over 55% with the mixed model. The mixed model reduced estimation biases that resulted from selective sampling. For analysis of longitudinal growth data, the mixed effects growth model is recommended.
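The fixed-effects backbone of the model the abstract describes is the Gompertz curve. The sketch below evaluates a common three-parameter form with invented broiler-like values; the article's mixed model adds bird-specific random effects on top of this, which are not reproduced here.

```python
import math

# Gompertz growth curve, W(t) = Wm * exp(-exp(-b * (t - t_inf))), where
# Wm is mature weight (g), b the rate constant (per day), and t_inf the age
# at the inflection point (days). Parameter values are illustrative only.
def gompertz(t, Wm=4000.0, b=0.04, t_inf=40.0):
    return Wm * math.exp(-math.exp(-b * (t - t_inf)))

w_inflection = gompertz(40.0)   # at the inflection point, W = Wm / e
w_mature = gompertz(1000.0)     # asymptotically approaches Wm
```

In a mixed formulation, Wm (and possibly b) would be split into a population mean plus a per-bird random deviation, which is what lets the model separate between-bird from within-bird variance as the abstract reports.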
Liquefaction of Tangier soils by using physically based reliability analysis modelling
Dubujet P.
2012-07-01
Approaches that are widely used to characterize the propensity of soils to liquefaction are mainly of the empirical type. The potential for liquefaction is assessed by using correlation formulas that are based on field tests such as the standard and cone penetration tests. These correlations depend, however, on the site where they were derived. In order to adapt them to other sites where seismic case histories are not available, further investigation is required. In this work, a rigorous one-dimensional modelling of the soil dynamics yielding the liquefaction phenomenon is considered. Field tests consisting of core sampling and cone penetration testing were performed. They provided the necessary data for numerical simulations performed using the DeepSoil software package. Using reliability analysis, the probability of liquefaction was estimated, and the obtained results were used to adapt the Juang method to the particular case of sandy soils located in Tangier.
Modelling a reliability system governed by discrete phase-type distributions
Ruiz-Castro, Juan Eloy [Departamento de Estadistica e Investigacion Operativa, Universidad de Granada, 18071 Granada (Spain)], E-mail: jeloy@ugr.es; Perez-Ocon, Rafael [Departamento de Estadistica e Investigacion Operativa, Universidad de Granada, 18071 Granada (Spain)], E-mail: rperezo@ugr.es; Fernandez-Villodre, Gemma [Departamento de Estadistica e Investigacion Operativa, Universidad de Granada, 18071 Granada (Spain)
2008-11-15
We present an n-unit system with one online unit and the others in cold standby, attended by a single repairman. When the online unit fails it goes to repair, and a standby unit instantaneously becomes the online one. The operational and repair times follow discrete phase-type distributions. Given that any discrete distribution defined on the positive integers is a discrete phase-type distribution, the system can be considered a general one. A model with an unlimited number of units is considered for approximating a system with a great number of units. We show that the process that governs the system is a quasi-birth-and-death process. For this system, performance and reliability measures, the up and down periods, and the involved costs are calculated in matrix and algorithmic form. We show that the discrete case is not a trivial case of the continuous one. The results given in this paper have been implemented computationally with Matlab.
A multi-objective reliable programming model for disruption in supply chain
Emran Mohammadi
2013-05-01
One of the primary concerns in supply chain management is to handle risk components properly. There are various sources of risk in a supply chain, such as natural disasters and unexpected incidents. When a series of facilities are built and deployed, one or more of them could fail at any time due to bad weather conditions, labor strikes, economic crises, sabotage or terrorist attacks, or changes in ownership of the system. The objective of risk management is to reduce these effects to an acceptable level. To address this risk, we propose a reliable capacitated supply chain network design (RSCND) model that considers random disruption risk in both distribution centers and suppliers. The proposed study considers three objective functions, and the implementation is verified using some instances.
From field to globe: upscaling of crop growth modelling
Bussel, van L.G.J.
2011-01-01
Recently, the scale of interest for application of crop growth models has extended to the region or even the globe, with time frames of 50-100 years. The application at larger scales of a crop growth model originally developed for a small scale, without any adaptation, might lead to errors and inaccuracies.
Villar, Oscar Armando Esparza-Del; Montañez-Alvarado, Priscila; Gutiérrez-Vega, Marisela; Carrillo-Saucedo, Irene Concepción; Gurrola-Peña, Gloria Margarita; Ruvalcaba-Romero, Norma Alicia; García-Sánchez, María Dolores; Ochoa-Alcaraz, Sergio Gabriel
2017-03-01
Mexico is one of the countries with the highest rates of overweight and obesity in the world, with 68.8% of men and 73% of women affected. This is a public health problem, since there are several health-related consequences of not exercising, such as cardiovascular disease or some types of cancer. These problems can be prevented by promoting exercise, so it is important to evaluate models of health behaviors to achieve this goal. Among several models, the Health Belief Model (HBM) is one of the most studied for promoting health-related behaviors. This study validates the first exercise scale based on the HBM in Mexicans, with the objective of studying and analyzing this model in Mexico. Items for the scale, called the Exercise Health Belief Model Scale (EHBMS), were developed by a health research team, and the items were then applied to a sample of 746 participants, male and female, from five cities in Mexico. The factor structure of the items was analyzed with an exploratory factor analysis and the internal reliability with Cronbach's alpha. The exploratory factor analysis reported the expected factor structure based on the HBM. The KMO index (0.92) and Bartlett's sphericity test supported the factorability of the data; items had acceptable factor loadings, ranging from 0.31 to 0.92, and the internal consistencies of the factors were also acceptable, with alpha values ranging from 0.67 to 0.91. The EHBMS is a validated scale that can be used to measure exercise based on the HBM in Mexican populations.
WelFur-mink: inter-observer reliability of on-farm welfare assessment in the growth season
Møller, Steen Henrik; Rousing, Tine; Hansen, Steffen W
2012-01-01
A welfare assessment system should be "high" in validity, robustness and feasibility, the latter both as regards time and costs. Therefore, observers must be able to perform the on-farm assessment with acceptable validity after some training. Based on empirical data, this paper evaluates the consequences of operating with several observers. Animal-based measures on 9 Danish mink farms were taken in November 2011. Eight observers individually, but in pairs at herd level, carried out data collection on the measures involving subjective grading, e.g. mink "activity", "injuries" and "fur-chewing", on approximately 120 cages with mink per farm. The assessments of the two observers gave similar frequencies of welfare problems and thus similar welfare assessments. The individual problems observed were, however, not the same, leading to poor or fair, but rarely good, inter-observer reliability. Despite the skilled assessors, the short training was not sufficient to get highly reliable results. No overall difference was found between the inter-observer reliability of cages with ≤2 or ≥3 mink in a cage. More training and better training material and, for some measures, observation procedures are needed in order...
Reliability of some ageing nuclear power plant system: a simple stochastic model
Suarez-Antola, Roberto [Catholic University of Uruguay, Montevideo (Uruguay). School of Engineering and Technologies; Ministerio de Industria, Energia y Mineria, Montevideo (Uruguay). Direccion Nacional de Energia y Tecnologia Nuclear; E-mail: rsuarez@ucu.edu.uy
2007-07-01
The random number of failure-related events in certain repairable ageing systems, like certain nuclear power plant components, during a given time interval may often be modelled by a compound Poisson distribution. One of these is the Polya-Aeppli distribution. The derivation of a stationary Polya-Aeppli distribution as a limiting distribution of rare events for stationary Bernoulli trials with first-order Markov dependence is considered. But if the parameters of the Polya-Aeppli distribution are suitable functions of time, we could expect the resulting distribution to allow us to take into account the distribution of failure-related events in an ageing system. Assuming that a critical number of damages produces an emergent failure, the above-mentioned results can be applied in a reliability analysis. It is natural to ask under what conditions a Polya-Aeppli distribution could be a limiting distribution for non-homogeneous Bernoulli trials with first-order Markov dependence. In this paper this problem is analyzed, and possible applications of the obtained results to ageing or deteriorating nuclear power plant components are considered. The two traditional ways of modelling repairable systems in reliability theory, the 'as bad as old' concept (which assumes that the replaced component is in exactly the same condition as the aged component before failure) and the 'as good as new' concept (which assumes that the new component is in the same condition as the replaced component when it was new), are briefly discussed in relation to the findings of the present work. (author)
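The Polya-Aeppli law mentioned in the abstract is a compound Poisson distribution with geometric cluster sizes: a Poisson number of failure clusters, each contributing a geometrically distributed number (at least one) of events. The sampler below is a generic sketch of that construction; the parameters lam and p are illustrative, not taken from the paper.

```python
import math
import random

def poisson(lam, rng):
    """Poisson(lam) draw by CDF inversion (fine for small lam)."""
    u = rng.random()
    n, prob, cdf = 0, math.exp(-lam), math.exp(-lam)
    while u > cdf:
        n += 1
        prob *= lam / n
        cdf += prob
    return n

def polya_aeppli_sample(lam, p, rng):
    """Compound Poisson with Geometric cluster sizes on {1, 2, ...}:
    P(cluster size = k) = (1 - p) * p**(k - 1)."""
    total = 0
    for _ in range(poisson(lam, rng)):
        k = 1
        while rng.random() < p:
            k += 1
        total += k
    return total

rng = random.Random(11)
samples = [polya_aeppli_sample(2.0, 0.4, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)   # theoretical mean: lam / (1 - p)
```

Making lam and p time-dependent, as the paper proposes, turns this into a counting model whose event rate and clustering both change as the component ages.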
Limiting Shapes for Deterministic Centrally Seeded Growth Models
Fey-den Boer, Anne; Redig, Frank
2007-01-01
We study the rotor router model and two deterministic sandpile models. For the rotor router model in ℤ^d, Levine and Peres proved that the limiting shape of the growth cluster is a sphere. For the other two models, only bounds in dimension 2 are known. A unified approach for these models with a...
Non-rigid image registration using bone growth model
Bro-Nielsen, Morten; Gramkow, Claus; Kreiborg, Sven
1997-01-01
Non-rigid registration has traditionally used physical models like elasticity and fluids. These models are very seldom valid models of the difference between the registered images. This paper presents a non-rigid registration algorithm, which uses a model of bone growth as a model of the change b...
Modeling of dendritic growth in the presence of convection
ZHU; Mingfang; DAI; Ting; LEE; Sungyoon; HONG; Chunpyo
2005-01-01
A two-dimensional coupled modified cellular automaton (MCA)-transport model has been employed to investigate asymmetrical dendritic growth behavior in a flowing melt. In the present model, the cellular automaton method for crystal growth is coupled with a transport model for numerical calculation of the fluid flow and of mass transport by both convection and diffusion. The MCA takes into account the effects of the thermal, constitutional and curvature undercoolings on dendritic growth. It also considers the preferred growth orientation of the crystal and solute redistribution during solidification. In the transport model, the SIMPLE scheme and a fully implicit finite volume method are employed to solve the governing equations of momentum and species transfer. The present model was applied to simulating the evolution of a single dendrite and of multiple dendrites of an Al-3 mass% Cu alloy in a forced flow. The simulated results show that dendritic growth morphology is strongly influenced by melt convection.
Reliable groundwater levels: failures and lessons learned from modeling and monitoring studies
Van Lanen, Henny A. J.
2017-04-01
Adequate management of groundwater resources requires an a priori assessment of the impacts of intended groundwater abstractions. Usually, groundwater flow modeling is used to simulate the influence of the planned abstraction on groundwater levels, and model performance is tested against observed groundwater levels. Where a multi-aquifer system occurs, groundwater levels in the different aquifers have to be monitored through observation wells with filters at different depths, i.e. above the impermeable clay layer (phreatic water level) and beneath it (artesian aquifer level). A reliable artesian level can only be measured if the space between the outer wall of the borehole (a narrow vertical shaft) and the observation well is refilled with impermeable material at the correct depth in the post-drilling phase, to prevent a vertical hydraulic connection between the artesian and phreatic aquifers. We encountered improper refilling, which made it impossible to monitor reliable artesian aquifer levels: at the location of the artesian observation well, a freely overflowing spring was seen, which implied that water leaking from the artesian aquifer affected the artesian groundwater level. Careful checking of the monitoring sites in a study area is therefore a prerequisite for using observations in model performance assessment. After model testing, the groundwater model is forced with the proposed groundwater abstractions (sites, extraction rates). The abstracted groundwater volume is compensated by a reduction of groundwater flow to the drainage network, and the model simulates the associated groundwater tables. The drawdown of the groundwater level is calculated by comparing the simulated groundwater level with and without groundwater abstraction. In lowland areas, such as vast areas of the Netherlands, the groundwater model has to consider a variable drainage network, which means that small streams only carry water during the wet winter season and run dry during the summer. The main streams drain groundwater
A Correlated Model for Evaluating Performance and Energy of Cloud System Given System Reliability
Hongli Zhang
2015-01-01
The serious issue of energy consumption in high performance computing systems has attracted much attention, and performance and energy saving have become important measures of a computing system. In the cloud computing environment, systems usually allocate various resources (such as CPU, memory, and storage) to multiple virtual machines (VMs) for executing tasks. Therefore, the problem of resource allocation for running VMs has a significant influence on both system performance and energy consumption. For different processor utilizations assigned to a VM, there exists a tradeoff between energy consumption and task completion time when a given task is executed by the VMs. Moreover, hardware failure, software failure and restoration characteristics also have obvious influences on overall performance and energy. In this paper, a correlated model is built to analyze both performance and energy in the VM execution environment under a reliability restriction, and an optimization model is presented to derive the most effective choice of processor utilization for the VM. Then, the tradeoff between energy saving and task completion time is studied and balanced when the VMs execute given tasks. Numerical examples are illustrated to build the performance-energy correlated model and to evaluate the expected values of task completion time and consumed energy.
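The utilization tradeoff the abstract analyzes can be made concrete with a toy model: completion time falls with utilization while power grows superlinearly, so energy (power times time) has an interior optimum. The functional forms and constants below are assumptions for illustration, not the paper's correlated model, and reliability effects are omitted.

```python
# Toy energy-vs-utilization tradeoff: E(u) = P(u) * T(u).
def energy(u, work=100.0, p_idle=10.0, k=50.0):
    time = work / u             # completion time shrinks with utilization u
    power = p_idle + k * u**2   # dynamic power grows superlinearly with u
    return power * time

# grid search over utilizations 0.01 .. 1.00
best_u = min((u / 100 for u in range(1, 101)), key=energy)
```

Here E(u) = 1000/u + 5000u, minimized near u = sqrt(0.2) ≈ 0.447: running flat out wastes energy on power, while running too slowly wastes it on idle power over a long completion time. Adding failure and restoration terms, as the paper does, shifts this optimum.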
Jeffrey C. Joe; Ronald L. Boring
2014-06-01
Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are (1) distinct from individual human errors, and (2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research with the objective of modeling and quantifying team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.
Ken Hashimoto
2014-10-01
In this study, we developed a highly reliable CAE analysis model of the mechanisms that cause loosening of bolt fasteners, which has been a bottleneck in automobile development and design, using a technical element model for highly accurate CAE that we had previously developed, and we verified its validity. Specifically, drawing on knowledge gained from our clarification of the mechanisms that cause loosening of bolt fasteners in actual machine tests, we conducted an accelerated bench test consisting of a three-dimensional vibration load test of the loosening of bolt fasteners used in mounts and rear suspension arms, where interviews with personnel at an automaker indicated loosening was most pronounced, and we reproduced the actual machine tests with CAE analysis based on a technical element model for highly accurate CAE analysis. Based on these results, we were able to reproduce the dynamic behavior in which larger screw pitches (lead angles) lead to greater non-uniformity of surface pressure, particularly around the nut seating surface, causing loosening to occur in the areas with the lowest surface pressure. Furthermore, we implemented highly accurate CAE analysis with no error (gap) compared to actual machine tests.
First Versus Second Order Latent Growth Curve Models: Some Insights From Latent State-Trait Theory.
Geiser, Christian; Keller, Brian; Lockhart, Ginger
2013-07-01
First order latent growth curve models (FGMs) estimate change based on a single observed variable and are widely used in longitudinal research. Despite significant advantages, second order latent growth curve models (SGMs), which use multiple indicators, are rarely used in practice, and not all aspects of these models are widely understood. In this article, our goal is to contribute to a deeper understanding of theoretical and practical differences between FGMs and SGMs. We define the latent variables in FGMs and SGMs explicitly on the basis of latent state-trait (LST) theory and discuss insights that arise from this approach. We show that FGMs imply a strict trait-like conception of the construct under study, whereas SGMs allow for both trait and state components. Based on a simulation study and empirical applications to the CES-D depression scale (Radloff, 1977) we illustrate that, as an important practical consequence, FGMs yield biased reliability estimates whenever constructs contain state components, whereas reliability estimates based on SGMs were found to be accurate. Implications of the state-trait distinction for the measurement of change via latent growth curve models are discussed.
Power electronics reliability.
Kaplar, Robert James; Brock, Reinhard C.; Marinella, Matthew; King, Michael Patrick; Stanley, James K.; Smith, Mark A.; Atcitty, Stanley
2010-10-01
The project's goals are: (1) use experiments and modeling to investigate and characterize stress-related failure modes of post-silicon power electronic (PE) devices such as silicon carbide (SiC) and gallium nitride (GaN) switches; and (2) seek opportunities for condition monitoring (CM) and prognostics and health management (PHM) to further enhance the reliability of power electronics devices and equipment. CM - detect anomalies and diagnose problems that require maintenance. PHM - track damage growth, predict time to failure, and manage subsequent maintenance and operations in such a way to optimize overall system utility against cost. The benefits of CM/PHM are: (1) operate power conversion systems in ways that will preclude predicted failures; (2) reduce unscheduled downtime and thereby reduce costs; and (3) pioneering reliability in SiC and GaN.
The Role of Human Capital and Population Growth in R&D-Based Models of Economic Growth
Holger Strulik
2001-01-01
Human capital accumulation is introduced in a growth model with R&D-driven expansion in the variety and quality of intermediate goods and knowledge spillovers from both research activities. Economic growth is no longer uniquely tied to population growth, as previous growth models without scale effects suggest. The model predicts that economic growth depends positively on the rate of human capital accumulation and positively or negatively on population growth, and is therefore supported by empirical...
A suction blister model reliably assesses skin barrier restoration and immune response.
Smith, Tracey J; Wilson, Marques A; Young, Andrew J; Montain, Scott J
2015-02-01
Skin wound healing models can be used to detect changes in immune function in response to interventions. This study used a test-retest format to assess the reliability of a skin suction blister procedure for quantitatively evaluating human immune function in repeated-measures studies. Up to eight suction blisters (~30 mm²) were induced via suction on each participant's left and right forearm (randomized order; blister sessions 1 and 2), separated by approximately one week. Fluid was sampled from each blister, and the top layer of each blister was removed to reveal up to eight skin wounds. Fluid from each wound was collected 4, 7, and 24 h after blisters were induced, and proinflammatory cytokines were measured. Transepidermal water loss (TEWL), to assess skin barrier recovery, was measured daily at each wound site until values were within 90% of baseline values (i.e., unbroken skin). Sleep, stress, and inflammation (i.e., factors that affect wound healing and immune function) preceding the blister induction were assessed via activity monitors (Actical, Philips Respironics, Murrysville, Pennsylvania), the Perceived Stress Scale (PSS), and C-reactive protein (CRP), respectively. Area-under-the-curve and TEWL between blister sessions 1 and 2 were compared using Pearson correlations and partial correlations (controlling for average nightly sleep, PSS scores, and CRP). The suction blister method was considered reliable for assessing immune response and skin barrier recovery if correlation coefficients reached 0.7. Volunteers (n=16; 12 M, 4 F) were 23 ± 5 years old [mean ± SD]. Time to skin barrier restoration was 4.9 ± 0.8 and 4.8 ± 0.9 days for sessions 1 and 2, respectively. Correlation coefficients for skin barrier restoration, IL-6, IL-8, and MIP-1α were 0.9. The suction blister method is sufficiently reliable for assessing skin barrier restoration and immune responsiveness. These data can be used to determine sample sizes for cross-sectional or repeated-measures types of
A von Bertalanffy growth model with a seasonally varying coefficient
Cloern, James E.; Nichols, Frederic H.
1978-01-01
The von Bertalanffy model of body growth is inappropriate for organisms whose growth is restricted to a seasonal period because it assumes that growth rate is invariant with time. Incorporation of a time-varying coefficient significantly improves the capability of the von Bertalanffy equation to describe the changing body size of both the bivalve mollusc Macoma balthica in San Francisco Bay and the flathead sole, Hippoglossoides elassodon, in Washington state. This simple modification of the von Bertalanffy model should offer improved predictions of body growth for a variety of other aquatic animals.
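A seasonally varying growth coefficient is commonly introduced by modulating the von Bertalanffy exponent with a sinusoid (the Somers-type form sketched below). This is a minimal illustration only; the abstract does not give the paper's exact equation, and all parameter names (`L_inf`, `K`, `t0`, `C`, `ts`) are assumptions.

```python
import math

def vb_length(t, L_inf, K, t0, C=0.0, ts=0.0):
    """Seasonally modified von Bertalanffy length-at-age (sketch).

    L_inf : asymptotic length
    K     : growth coefficient (per unit time)
    t0    : theoretical age at zero length
    C     : amplitude of the seasonal oscillation (C=0 gives the
            standard, time-invariant von Bertalanffy curve)
    ts    : phase (time of year when growth is fastest)
    """
    # Seasonal modulation term integrated into the exponent
    def S(x):
        return (C * K / (2 * math.pi)) * math.sin(2 * math.pi * (x - ts))
    return L_inf * (1.0 - math.exp(-K * (t - t0) - S(t) + S(t0)))
```

With `C=0` the function reduces exactly to the classic model, which makes the seasonal effect easy to isolate when fitting.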
Probabilistic Approach to System Reliability of Mechanism with Correlated Failure Models
Xianzhen Huang
2012-01-01
In this paper, based on kinematic accuracy theory and a matrix-based system reliability analysis method, a practical method for system reliability analysis of the kinematic performance of planar linkages with correlated failure modes is proposed. The Taylor series expansion is utilized to derive a general expression for the kinematic performance errors caused by random variables. A proper limit state function (performance function) for reliability analysis of the kinematic performance of planar linkages is established. Through reliability theory and the linear programming method, the upper and lower bounds of the system reliability of planar linkages are provided. In the course of the system reliability analysis, the correlation of different failure modes is considered. Finally, the practicality, efficiency, and accuracy of the proposed method are shown by a numerical example.
THE EXISTENCE THEOREM OF OPTIMAL GROWTH MODEL
Gong Liutang; Peng Xianze
2005-01-01
This paper proves a general existence theorem of optimal growth theory. The theorem is neither restricted to the case of constant technological progress, nor stated in terms of mathematical conditions that have no direct economic interpretation and, moreover, are difficult to apply.
Modeling growth curves to track growing obesity
Our purpose was to examine the relationship between total physical activity (PA) and PA at various intensity levels with insulin resistance at increasing waist circumference and skinfold thickness levels. Being able to describe growth appropriately and succinctly is important in many nutrition and p...
Correlated noise in a logistic growth model
Ai, Bao-Quan; Wang, Xian-Ju; Liu, Guo-Tao; Liu, Liang-Gang
2003-02-01
The logistic differential equation is used to analyze the cancer cell population in the presence of correlated Gaussian white noise. We study the steady-state properties of tumor cell growth and discuss the effects of the correlated noise. It is found that the degree of correlation of the noise can cause tumor cell extinction.
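The kind of noisy logistic dynamics studied above can be sketched with an Euler-Maruyama simulation. This is an illustrative assumption, not the paper's formulation: it uses simple multiplicative white noise rather than the paper's correlated-noise structure, and all parameter values are arbitrary.

```python
import math
import random

def simulate_logistic_sde(x0=10.0, r=1.0, K=100.0, sigma=0.2,
                          dt=0.01, steps=5000, seed=42):
    """Euler-Maruyama sketch of a noisy logistic growth model:
        dx = r * x * (1 - x/K) dt + sigma * x dW
    Returns the simulated sample path as a list of population sizes."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        x += r * x * (1 - x / K) * dt + sigma * x * dW
        x = max(x, 0.0)  # a population cannot go negative
        path.append(x)
    return path
```

Setting `sigma=0` recovers the deterministic logistic curve, so the stochastic path can be checked against the known fixed point at `K`.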
Modeling and optimization of algae growth
Thornton, Anthony Richard; Weinhart, Thomas; Bokhove, Onno; Zhang, Bowen; van der Sar, Dick M.; Kumar, Kundan; Pisarenco, Maxim; Rudnaya, Maria; Savceno, Valeriu; Rademacher, Jens; Zijlstra, Julia; Szabelska, Alicja; Zyprych, Joanna; van der Schans, Martin; Timperio, Vincent; Veerman, Frits; Frank, J.; van der Mei, R.; den Boer, A.; Bosman, J.; Bouman, N.; van Dam, S.; Verhoef, C.
2010-01-01
The wastewater from greenhouses has a high amount of mineral contamination, and an environmentally-friendly method of removal is to use algae to clean this runoff water. The algae consume the minerals as part of their growth process. In addition to cleaning the water, the created algal bio-mass has a
Fanwoua, J.
2012-01-01
Keywords: cell division, cell growth, cell endoreduplication, fruit growth, genotype, G×E interaction, model, tomato. Fruit size is a major component of fruit yield and quality of many crops. Variations in fruit size can be tremendous due to genotypic and environmental factors. The mechanisms
Fracture Mechanical Markov Chain Crack Growth Model
Gansted, L.; Brincker, Rune; Hansen, Lars Pilegaard
1991-01-01
On the basis of the B-model developed in [J. L. Bogdanoff and F. Kozin, Probabilistic Models of Cumulative Damage. John Wiley, New York (1985)] a new numerical model incorporating the physical knowledge of fatigue crack propagation is developed. The model is based on the assumption that the crack...
A Model to Partly but Reliably Distinguish DDOS Flood Traffic from Aggregated One
Ming Li
2012-01-01
Reliably distinguishing DDOS flood traffic from aggregated traffic is desperately desired for the reliable prevention of DDOS attacks. By reliable distinguishing, we mean that flood traffic can be distinguished from aggregated traffic for a predetermined probability. The basis for reliably distinguishing flood traffic from aggregated traffic is reliable detection of the signs of DDOS flood attacks. As is known, reliably distinguishing DDOS flood traffic from aggregated traffic becomes a tough task mainly due to the effects of flash-crowd traffic. For this reason, this paper studies reliable detection in an underlying DiffServ network that uses static-priority schedulers. In this network environment, we present a method for reliable detection of the signs of DDOS flood attacks for a given class with a given priority. Two assumptions are introduced in this study. One is that flash-crowd traffic does not have all priorities but only some. The other is that attack traffic has all priorities in all classes; otherwise an attacker cannot completely achieve its DDOS goal. Further, we suppose that the protected site is equipped with a sensor that has a signature library of the legitimate traffic with the priorities that flash-crowd traffic does not have. Based on these assumptions, we are able to reliably distinguish attack traffic from aggregated traffic with the priorities that flash-crowd traffic does not have, according to a given detection probability.
Model for the growth of the World Airline Network
Verma, T; Nagler, J; Andrade, J S; Herrmann, H J
2016-01-01
We propose a probabilistic growth model for transport networks which employs a balance between popularity of nodes and the physical distance between nodes. By comparing the degree of each node in the model network and the WAN, we observe that the difference between the two is minimized for $\\alpha\\approx 2$. Interestingly, this is the value obtained for the node-node correlation function in the WAN. This suggests that our model explains quite well the growth of airline networks.
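A growth rule balancing node popularity against physical distance can be sketched as a toy network model. The attachment kernel `degree_i / d(i, new)**alpha` below is an assumption inferred from the abstract, not necessarily the paper's exact model, and all parameter names are illustrative.

```python
import math
import random

def grow_network(n_nodes=50, alpha=2.0, seed=7):
    """Sketch of distance-weighted preferential attachment:
    each new node, placed at a random position in the unit square,
    attaches to an existing node i with probability proportional to
    degree_i / d(i, new)**alpha."""
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n_nodes)]
    degree = [0] * n_nodes
    edges = [(0, 1)]          # seed the network with a single link
    degree[0] = degree[1] = 1
    for new in range(2, n_nodes):
        weights = []
        for i in range(new):
            d = math.dist(pos[i], pos[new]) + 1e-9  # avoid division by zero
            weights.append(degree[i] / d ** alpha)
        target = rng.choices(range(new), weights=weights)[0]
        edges.append((target, new))
        degree[target] += 1
        degree[new] += 1
    return edges, degree
```

Large `alpha` favors short links (geography dominates); `alpha=0` reduces the rule to pure preferential attachment (popularity dominates).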
Detection of unobserved heterogeneity with growth mixture models
Jost Reinecke; Luca Mariotti
2009-01-01
Latent growth curve models as structural equation models are extensively discussed in various research fields (Duncan et al., 2006). Recent methodological and statistical extensions have focused on the consideration of unobserved heterogeneity in empirical data. Muthén extended the classical structural equation approach by mixture components, i.e., categorical latent classes (Muthén 2002, 2004, 2007). The paper will discuss applications of growth mixture models with data from one of the first panel...
Czuchry, Andrew J.; And Others
This report provides a complete guide to the stand alone mode operation of the reliability and maintenance (R&M) model, which was developed to facilitate the performance of design versus cost trade-offs within the digital avionics information system (DAIS) acquisition process. The features and structure of the model, its input data…
Model Testing and Reliability Evaluation of the New Deepwater Breakwater at La Coruña, Spain
Burcharth, Hans Falk; Maciñeira, Enrique; Canalejo, Pedro
2003-01-01
tankers are arranged along the inner side of the breakwater. The paper presents the design criteria, the design procedure, the main results from model testing, and the subsequent reliability evaluation and optimisation of the cross section design for the most exposed part of the breakwater. Model test...
Marco Innamorati
2014-01-01
Objectives. The aim of the study was to investigate the construct validity of the ARSQ. Methods. The ARSQ and self-report measures of depression, anxiety, and hopelessness were administered to 774 Italian adults, aged 18 to 64 years. Results. Structural equation modeling indicated that the factor structure of the ARSQ can be represented by a bifactor model: a general rejection sensitivity factor and two group factors, expectancy of rejection and rejection anxiety. Reliability of observed scores was not satisfactory: only 44% of the variance in observed total scores was due to the common factors. The analyses also indicated different correlates for the general factor and the group factors. Limitations. We administered an Italian version of the ARSQ to a nonclinical sample of adults, so studies that use clinical populations or the original version of the ARSQ could obtain different results from those presented here. Conclusion. Our results suggest that the construct validity of the ARSQ is disputable and that rejection anxiety and expectancy could bias individuals to readily perceive and strongly react to cues of rejection in different ways.
Fuaad, Norain Farhana Ahmad; Nopiah, Zulkifli Mohd; Tawil, Norgainy Mohd; Othman, Haliza; Asshaari, Izamarlina; Osman, Mohd Hanif; Ismail, Nur Arzilah
2014-06-01
In engineering studies and research, mathematics is one of the main elements used to express physical, chemical, and engineering laws. Therefore, it is essential for engineering students to have strong knowledge of the fundamentals of mathematics in order to apply that knowledge to real-life issues. However, the previous results of the Mathematics Pre-Test show that engineering students lack fundamental knowledge of certain topics in mathematics. Due to this, apart from making improvements in the methods of teaching and learning, studies on the construction of questions (items) should also be emphasized. The purpose of this study is to assist lecturers in the process of item development, to monitor the separation of items based on Bloom's Taxonomy, and to measure the reliability of the items themselves using the Rasch Measurement Model as a tool. Using the Rasch Measurement Model, the final exam questions of Engineering Mathematics II (Linear Algebra) for semester 2, session 2012/2013, were analysed, and the results provide details on the extent to which the content of the items provides useful information about students' ability. This study reveals that the items used in the Engineering Mathematics II (Linear Algebra) final exam are well constructed, but the separation of the items raises concern and needs further attention, as there is a big gap between items at several levels of Bloom's cognitive skill.
NONLINEAR MODELS FOR DESCRIPTION OF CACAO FRUIT GROWTH WITH ASSUMPTION VIOLATIONS
JOEL AUGUSTO MUNIZ
2017-01-01
Cacao (Theobroma cacao L.) is an important fruit in the Brazilian economy, mainly cultivated in the southern part of the State of Bahia. The optimal stage for harvesting is a major factor in fruit quality, and knowledge of its growth curves can help, especially in identifying the ideal maturation stage for harvesting. Nonlinear regression models have been widely used for the description of growth curves. However, several studies on this subject do not consider the residual analysis, the existence of a possible dependence between longitudinal observations, or sample variance heterogeneity, compromising the quality of the modeling. The objective of this work was to compare the fit of nonlinear regression models, considering residual analysis and assumption violations, in describing the growth of cacao (clone Sial-105) fruit. The data evaluated were extracted from Brito and Silva (1983), who conducted the experiment at the Cacao Research Center, Ilhéus, State of Bahia. The variables fruit length, diameter, and volume as a function of fruit age were studied. The use of weighting and the incorporation of residual dependencies were efficient, since the modeling became more consistent, improving the model fit. Considering a first-order autoregressive structure, when needed, leads to a significant reduction in the residual standard deviation, making the estimates more reliable. The Logistic model was the most efficient for describing cacao fruit growth.
Stochastic Gompertzian model for breast cancer growth process
Mazlan, Mazma Syahidatul Ayuni Binti; Rosli, Norhayati
2017-05-01
In this paper, a stochastic Gompertzian model is developed to describe the growth process of breast cancer by incorporating noisy behavior into a deterministic Gompertzian model. The prediction quality of the stochastic Gompertzian model is measured by comparing the simulated result with clinical data of breast cancer growth. The kinetic parameters of the model are estimated via a maximum likelihood procedure. A 4-stage stochastic Runge-Kutta (SRK4) scheme is used to simulate the sample path of the model. Low values of the mean-square error (MSE) of the stochastic model indicate good fits. It is shown that the stochastic Gompertzian model is adequate in explaining the breast cancer growth process compared to its deterministic model counterpart.
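A standard way to write a stochastic Gompertzian model is dX = r·X·ln(K/X) dt + σ·X dW, and it can be simulated as sketched below. Note the assumptions: the abstract's authors use a 4-stage stochastic Runge-Kutta (SRK4) scheme, whereas this sketch uses the simpler Euler-Maruyama method, and the parameter names and values are illustrative, not estimates from the paper.

```python
import math
import random

def gompertz_sde_path(x0=1.0, r=0.3, K=1000.0, sigma=0.1,
                      dt=0.01, steps=2000, seed=1):
    """Euler-Maruyama sketch of a stochastic Gompertzian model:
        dX = r * X * ln(K / X) dt + sigma * X dW
    (a 4-stage stochastic Runge-Kutta scheme, as in the cited work,
    would give higher-order accuracy; Euler-Maruyama is used here
    only for brevity)."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        x += r * x * math.log(K / x) * dt + sigma * x * dW
        x = max(x, 1e-9)  # keep the state positive for the logarithm
        path.append(x)
    return path
```

With `sigma=0` the path follows the deterministic Gompertz curve, which saturates at the carrying capacity `K`.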
Lambe, N R; Navajas, E A; Simm, G; Bünger, L
2006-10-01
This study compared the use of various models to describe growth in lambs of 2 contrasting breeds from birth to slaughter. Live BW records (n = 7,559) from 240 Texel and 231 Scottish Blackface (SBF) lambs weighed at 2-wk intervals were modeled. Biologically relevant variables were estimated for each lamb from modified versions of the logistic, Gompertz, Richards, and exponential models, and from linear regression. In both breeds, all nonlinear models fitted the data well, with an average coefficient of determination (R2) of > 0.98. The linear model had a lower average R2 than any of the nonlinear models for describing growth patterns, but its Akaike's information criterion value (which weighs log-likelihood by the number of parameters estimated) was similar to that of the Gompertz model. Variables A, B, C, and D were moderately to highly heritable in Texel lambs (h2 = 0.33 to 0.87), and genetic correlations between variables within a model ranged from -0.80 to 0.89, suggesting some flexibility to change the shape of the growth curve when selecting for different variables. In SBF lambs, only variables from the logistic and Gompertz models had moderate heritabilities (0.17 to 0.56), but with high genetic correlations between variables within each model (0.92). Selection on growth variables seems promising (in Texel more than SBF), but high genetic correlations between variables may restrict the possibilities to change the growth curve shape. A random regression model was also fitted to the data to allow predictions of growth rates at relevant time points. Heritabilities for growth rates differed markedly at various stages of growth and between the 2 breeds (Texel: 0.14 to 0.74; SBF: 0.07 to 0.34), with negative correlations between growth rate at 60 d of age and growth rate at finishing. Following these results, future studies should investigate genetic relationships between relevant growth curve variables and other important production traits, such as carcass composition and meat
Modelling the Growth of Swine Flu
Thomson, Ian
2010-01-01
The spread of swine flu has been a cause of great concern globally. With no vaccine yet developed (at the time of writing, July 2009), and given the fact that modern-day humans can travel speedily across the world, there are fears that this disease may spread out of control. The worst-case scenario would be one of unfettered exponential growth.…
[Models of economic theory of population growth].
Von Zameck, W
1987-01-01
"The economic theory of population growth applies the opportunity cost approach to the fertility decision. Variations and differentials in fertility are caused by the available resources and relative prices or by the relative production costs of child services. Pure changes in real income raise the demand for children or the total amount spent on children. If relative prices or production costs and real income are affected together, the effect on fertility requires separate consideration."
A Nonlinear Viscous Model for Sn-Whisker Growth
Yang, Fuqian
2016-12-01
Based on the mechanism of the grain boundary fluid flow, a nonlinear viscous model for the growth of Sn-whiskers is proposed. This model consists of two units, one with a stress exponent of one and one with a stress exponent of n -1. By letting one of the constants be zero in the model, the constitutive relationship reduces to a linear flow relation or a power-law relation, representing the flow behavior of various metals. Closed-form solutions for the growth behavior of a whisker are derived, which can be used to predict the whisker growth and the stress evolution.
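The two-unit constitutive relation described above, with one unit of stress exponent one and one of stress exponent n − 1, can be sketched as a simple flow law. The coefficient names below are hypothetical; the abstract does not give the paper's symbols or closed-form solutions.

```python
def flow_rate(stress, c1, c2, n):
    """Two-unit nonlinear viscous flow law (sketch):
        rate = c1 * stress + c2 * stress**(n - 1)
    Setting c2 = 0 recovers a linear (Newtonian) flow relation;
    setting c1 = 0 gives a pure power-law relation, representing
    the flow behavior of various metals."""
    return c1 * stress + c2 * stress ** (n - 1)
```

The limiting cases mirror the abstract's remark that letting one constant vanish reduces the model to either a linear or a power-law relation.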
Growth Model for Pulsed-Laser Deposited Perovskite Oxide Films
WANG Xu; FEI Yi-Yan; ZHU Xiang-Dong; Lu Hui-Bin; YANG Guo-Zhen
2008-01-01
We present a multi-level growth model that yields some of the key features of perovskite oxide film growth as observed in reflection high-energy electron diffraction (RHEED) and ellipsometry studies. The model describes the effects of deposition, temperature, intra-layer transport, inter-layer transport, and Ostwald ripening on the morphology of a growth surface in terms of the distribution of terraces and step edges during and after deposition. The numerical results of the model coincide well with the experimental observations.
Plant Growth Models Using Artificial Neural Networks
Bubenheim, David
1997-01-01
In this paper, we describe our motivation and approach to developing models and the neural network architecture. Initial use of the artificial neural network for modeling the single plant process of transpiration is presented.
The simulation of cutoff lows in a regional climate model: reliability and future trends
Grose, Michael R. [University of Tasmania, Antarctic Climate and Ecosystems Cooperative Research Centre (ACE CRC), Private Bag 80, Hobart, TAS (Australia); Pook, Michael J.; McIntosh, Peter C.; Risbey, James S. [CSIRO Marine and Atmospheric Research, Centre for Australian Weather and Climate Research (CAWCR), Hobart, TAS (Australia); Bindoff, Nathaniel L. [University of Tasmania, Antarctic Climate and Ecosystems Cooperative Research Centre (ACE CRC), Private Bag 80, Hobart, TAS (Australia); CSIRO Marine and Atmospheric Research, Centre for Australian Weather and Climate Research (CAWCR), Hobart, TAS (Australia); University of Tasmania, Institute of Marine and Antarctic Studies (IMAS), Private Bag 129, Hobart, TAS (Australia)
2012-07-15
Cutoff lows are an important source of rainfall in the mid-latitudes that climate models need to simulate accurately to give confidence in climate projections for rainfall. Coarse-scale general circulation models (GCMs) used for climate studies show some notable biases and deficiencies in the simulation of cutoff lows in the Australian region and in important aspects of the broader circulation, such as atmospheric blocking and the split jet structure observed over Australia. The regional climate model CCAM (conformal cubic atmospheric model) gives an improvement in some aspects of the simulation of cutoffs in the Australian region, including a reduction in the underestimate of the frequency of cutoff days by more than 15% compared to a typical GCM. This improvement is due, at least in part, to substantially higher resolution. However, biases in the simulation of the broader circulation, blocking, and the split jet structure are still present. In particular, a northward bias in the central latitude of cutoff lows creates a substantial underestimate of the associated rainfall over Tasmania in April to October. Also, the regional climate model produces a significant north-south distortion of the vertical profile of cutoff lows, with the largest distortion occurring in the cooler months, which was not apparent in GCM simulations. The remaining biases and the presence of new biases demonstrate that increased horizontal resolution is not the only requirement for the reliable simulation of cutoff lows in climate models. Notwithstanding the biases in their simulation, the regional climate model projections show some responses to climate warming that are noteworthy. The projections indicate a marked closing of the split jet in winter. This change is associated with changes to atmospheric blocking in the Tasman Sea, which decreases in June to November (by up to 7.9 m/s) and increases in December to May. The projections also show a reduction in the number of annual cutoff days by 67
Balanced growth path solutions of a Boltzmann mean field game model for knowledge growth
Burger, Martin
2016-11-18
In this paper we study balanced growth path solutions of a Boltzmann mean field game model proposed by Lucas and Moll [15] to model knowledge growth in an economy. Agents can either increase their knowledge level by exchanging ideas in learning events or by producing goods with the knowledge they already have. The existence of balanced growth path solutions implies exponential growth of the overall production in time. We prove existence of balanced growth path solutions if the initial distribution of individuals with respect to their knowledge level satisfies a Pareto-tail condition. Furthermore we give first insights into the existence of such solutions if in addition to production and knowledge exchange the knowledge level evolves by geometric Brownian motion.
Reliability estimation for single dichotomous items based on Mokken's IRT model
Meijer, R R; Sijtsma, K; Molenaar, Ivo W
1995-01-01
Item reliability is of special interest for Mokken's nonparametric item response theory, and is useful for the evaluation of item quality in nonparametric test construction research. It is also of interest for nonparametric person-fit analysis. Three methods for the estimation of the reliability of
Generalized exponential function and discrete growth models
Souto Martinez, Alexandre; Silva González, Rodrigo; Lauri Espíndola, Aquino
2009-07-01
Here we show that a particular one-parameter generalization of the exponential function is suitable to unify most of the popular one-species discrete population dynamic models into a simple formula. A physical interpretation is given to this newly introduced parameter in the context of the continuous Richards model, and this interpretation remains valid for the discrete case. From the discretization of the continuous Richards model (a generalization of the Gompertz and Verhulst models), one obtains a generalized logistic map, and we briefly study its properties. Next, we generalize the (scramble competition) θ-Ricker discrete model and analytically calculate the fixed points as well as their stabilities. In contrast to previous generalizations, from the generalized θ-Ricker model one is able to retrieve either scramble or contest models.
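A widely used one-parameter generalization of the exponential is the q-exponential, exp_q(x) = [1 + (1−q)x]^{1/(1−q)}, which recovers the ordinary exponential as q → 1; a Ricker-type map built from it is sketched below. This is an assumption of the specific generalization meant in the abstract, and the `ricker_q` update rule is illustrative rather than the paper's exact generalized map.

```python
import math

def exp_q(x, q):
    """q-exponential: a one-parameter generalization of exp(x).
    Recovers math.exp(x) in the limit q -> 1; returns 0 when the
    base 1 + (1-q)*x is non-positive (the usual cutoff convention)."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def ricker_q(x, r, q):
    """Generalized Ricker-type discrete update built on the
    q-exponential: x_{n+1} = x_n * exp_q(r * (1 - x_n)).
    x = 1 is a fixed point for any r and q, since exp_q(0) = 1."""
    return x * exp_q(r * (1.0 - x), q)
```

Varying `q` smoothly interpolates between familiar special cases (e.g., `q = 2` turns exp_q into the rational form 1/(1 − x) associated with Verhulst-type growth).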
Reliable experimental model of hepatic veno-occlusive disease caused by monocrotaline
Miao-Yan Chen; Jian-Ting Cai; Qin Du; Liang-Jing Wang; Jia-Min Chen; Li-Ming Shao
2008-01-01
BACKGROUND: Hepatic veno-occlusive disease (HVOD) is a severe complication of chemotherapy before hematopoietic stem cell transplantation and of dietary ingestion of pyrrolizidine alkaloids. Many experimental models have been established to study its mechanisms or therapy, but few are ideal. This work aimed at evaluating a rat model of HVOD induced by monocrotaline to help advance research into this disease. METHODS: Thirty-two male rats were randomly classified into 5 groups, and PBS or monocrotaline was administered (100 mg/kg or 160 mg/kg). They were sacrificed on day 7 (groups A, B and D) or day 10 (groups C and E). Blood samples were collected to determine liver enzyme concentrations. The weight of the liver and body and the amount of ascites were measured. Histopathological changes of liver tissue on light microscopy were assessed by a modified Deleve scoring system. The positivity of proliferating cell nuclear antigen (PCNA) was estimated. RESULTS: The rats that were treated with 160 mg/kg monocrotaline presented with severe clinical symptoms (including two deaths) and the histopathological picture of HVOD. On the other hand, the rats that were fed 100 mg/kg monocrotaline had milder and reversible manifestations. Comparison of the rats sacrificed on day 10 with those sacrificed on day 7 showed that the positivity of PCNA increased, especially that of hepatocytes. CONCLUSIONS: Monocrotaline induces acute, dose-dependent HVOD in rats. The model is potentially reversible with a low dose, but reliable and irreversible with a higher dose. The modified scoring system seems to be more accurate than the traditional one in reflecting the histopathology of HVOD. The enhancement of PCNA positivity may be associated with hepatic tissue undergoing recovery.
Rajesh Karki
2013-02-01
Adverse environmental impacts of carbon emissions are causing increasing public concern throughout the world. Electric energy generation from conventional energy sources is considered a major contributor to these harmful emissions, so high emphasis is being placed on green alternatives such as wind and solar. Wind energy is perceived as a promising alternative, and wind technology and its applications have undergone significant research and development over the past decade. As a result, many modern power systems include a significant portion of power generation from wind energy sources. The impact of wind generation on overall system performance increases substantially as wind penetration in power systems rises to relatively high levels. It therefore becomes increasingly important to accurately model wind behavior, the interaction between wind sources and conventional sources, and the characteristics of the energy demand in order to carry out a realistic evaluation of system reliability. Power systems with high wind penetration are often connected to multiple wind farms at different geographic locations. Wind speed correlations between the different wind farms largely determine the total wind power generation characteristics of such systems, and should therefore be an important parameter in the wind modeling process. This paper evaluates the effect of the correlation between multiple wind farms on the adequacy indices of wind-integrated systems. It also proposes a simple and appropriate probabilistic analytical model that incorporates wind correlations and can be used for adequacy evaluation of systems with multiple wind farms.
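The paper's analytical model is not reproduced in the abstract; as a hedged illustration of the underlying idea, correlated wind speeds at two farms can be sampled with a Gaussian copula over Weibull marginals (a common assumption in adequacy studies; the correlation rho and the Weibull shape k and scale c below are hypothetical values, not from the paper):

```python
import math
import numpy as np

# Illustrative sketch: draw correlated wind-speed samples for two farms.
rng = np.random.default_rng(42)
rho = 0.8                                     # assumed inter-farm correlation
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=50_000)

# Gaussian copula: map correlated normals to uniforms via the normal CDF,
# then to Weibull-distributed wind speeds via the inverse Weibull CDF.
u = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
k, c = 2.0, 8.0                               # Weibull shape and scale (m/s)
speeds = c * (-np.log(1.0 - u)) ** (1.0 / k)

# The induced wind-speed correlation stays close to the assumed rho.
print(np.corrcoef(speeds[:, 0], speeds[:, 1])[0, 1])
```

Feeding such correlated speed samples through each farm's power curve is one simple way to quantify how inter-farm correlation inflates the variance of total wind generation, which is what drives the adequacy indices the paper evaluates.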
Evaluation of MCF10A as a Reliable Model for Normal Human Mammary Epithelial Cells.
Ying Qu
Breast cancer is the most common cancer in women and a leading cause of cancer-related deaths for women worldwide. Various cell models have been developed to study breast cancer tumorigenesis, metastasis, and drug sensitivity. The MCF10A human mammary epithelial cell line is a widely used in vitro model for studying normal breast cell function and transformation. However, there is limited knowledge about whether MCF10A cells reliably represent normal human mammary cells. MCF10A cells were grown in monolayer, suspension (mammosphere) culture, three-dimensional (3D) "on-top" Matrigel, 3D "cell-embedded" Matrigel, or mixed Matrigel/collagen I gel. Suspension culture was performed with the MammoCult medium and low-attachment culture plates. Cells grown in 3D culture were fixed and subjected either to immunofluorescence staining or to embedding and sectioning followed by immunohistochemistry and immunofluorescence staining. Cells or slides were stained for protein markers commonly used to identify mammary progenitor and epithelial cells. MCF10A cells expressed markers representing luminal, basal, and progenitor phenotypes in two-dimensional (2D) culture. When grown in suspension culture, MCF10A cells showed low mammosphere-forming ability. Cells in mammospheres and 3D culture expressed both luminal and basal markers. Surprisingly, the acinar structures formed by MCF10A cells in 3D culture were positive for both basal markers and the milk proteins β-casein and α-lactalbumin. MCF10A cells exhibit a unique differentiated phenotype in 3D culture which may not exist, or may be rare, in normal human breast tissue. Our results raise the question of whether the commonly used MCF10A cell line is a suitable model for human mammary cell studies.
Knowledge Growth: Applied Models of General and Individual Knowledge Evolution
Silkina, Galina Iu.; Bakanova, Svetlana A.
2016-01-01
The article considers mathematical models of the growth and accumulation of scientific and applied knowledge, which is seen as the main potential and key competence of modern companies. The problem is examined on two levels: the growth and evolution of objective knowledge, and the knowledge evolution of a particular individual. Both processes are…