Oswald, James; Raine, Mike; Ashraf-Ball, Hezlin
There has been much academic debate on the ability of wind to provide a reliable electricity supply. The model presented here calculates the hourly power delivery of 25 GW of wind turbines distributed across Britain's grid, and assesses power delivery volatility and the implications for individual generators on the system. Met Office hourly wind speed data are used to determine power output and are calibrated using Ofgem's published wind output records. There are two main results. First, the model suggests that power swings of 70% within 12 h are to be expected in winter, and will require individual generators to go on or off line frequently, thereby reducing the utilisation and reliability of large centralised plants. These reductions will lead to increases in the cost of electricity and reductions in potential carbon savings. Secondly, it is shown that electricity demand in Britain can reach its annual peak while wind power in Britain and neighbouring countries simultaneously falls to very low levels. This significantly undermines the case for connecting the UK transmission grid to neighbouring grids. Recommendations are made for improving 'cost of wind' calculations. The authors are grateful for the sponsorship provided by The Renewable Energy Foundation.
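The core step of such a model, converting hourly wind speeds into turbine power output, is typically done with a piecewise power curve. A minimal sketch follows; the cut-in, rated and cut-out speeds and the cubic ramp are illustrative assumptions, not the paper's calibrated values.

```python
def turbine_power(v, rated_kw=2000.0, v_cut_in=4.0, v_rated=13.0, v_cut_out=25.0):
    """Idealised turbine power curve: zero below cut-in and above cut-out,
    a cubic ramp between cut-in and rated speed, flat at rated power above."""
    if v < v_cut_in or v >= v_cut_out:
        return 0.0
    if v >= v_rated:
        return rated_kw
    # Power in the ramp region scales with the cube of wind speed.
    frac = (v**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3)
    return rated_kw * frac
```

Applying such a curve to each site's hourly speeds and summing over the fleet yields the hourly power delivery series whose swings the abstract describes.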
The article presents Bring Your Own Device (BYOD) as a network model that provides the user reliable access to network resources. BYOD is a dynamically developing model that can be applied in many areas. A research network was set up in order to carry out tests, in which the Work Folders service was used as a service of the BYOD model. This service allows the user to synchronize files between the device and the server. Access to the network is provided over wireless communication using the 802.11n standard. The results obtained are presented and analyzed in this article.
People are the most valuable asset of an organization, and the results of a company mostly depend on them. The human factor can also be a weak link in the company and a cause of high risk for many of its processes. The reliability of the human factor in a manufacturing process depends on many factors. The authors include aspects of human error, safety culture, knowledge, communication skills, teamwork and the role of leadership in the developed model of human-resource reliability in the management of the production process. Based on a case study and the results of research and observation, the authors present risk areas defined in a specific manufacturing process and the results of evaluating the reliability of human resources in that process.
Mahmood, Oria; Dagnæs, Julia; Bube, Sarah
…was chosen as it is a simple procedural skill that is crucial to master in a urology residency program. RESULTS: The internal consistency of assessments was high: Cronbach's α = 0.93 and 0.95 for nonspecialist and specialist raters, respectively. Pearson's correlation was significant: 0.77 for the nonspecialists and 0.75 for the specialists. The test-retest reliability showed the biggest difference between the 2 groups: 0.59 and 0.38 for the nonspecialist raters and the specialist raters, respectively.
In 1995, an improvement project was completed on the 244 klystron modulators in the linear accelerator. The modulator system has been described previously. This article offers project details and their resulting effect on modulator and component reliability. Prior to the project, the authors had collected four operating cycles (1991 through 1995) of MTTF data. In this discussion, the '91 data will be excluded, since the modulators then operated at 60 Hz. The five periods following the '91 run were reviewed due to their common repetition rate of 120 Hz.
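Pooling MTTF data across run cycles amounts to dividing total operating time by the total failure count. A small sketch of how such per-cycle records might be combined; the data layout is an assumption, not taken from the article.

```python
def pooled_mttf(periods):
    """Pooled mean time to failure across run cycles.

    periods: iterable of (operating_hours, failure_count) tuples, one per cycle.
    """
    periods = list(periods)  # allow generators; we iterate twice
    hours = sum(h for h, _ in periods)
    failures = sum(f for _, f in periods)
    if failures == 0:
        raise ValueError("no failures observed; MTTF estimate is undefined")
    return hours / failures
```

Restricting the input list to the 120 Hz cycles, as the article does, is then just a matter of which periods are included.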
The report contains a brief summary of aspects of the Maximus reliability point and interval estimation technique as it has been applied to the reliability of a device whose surveillance tests contain...
Comer, M.K.; Donovan, M.D.; Gaddy, C.D.
The US Nuclear Regulatory Commission (NRC), Sandia National Laboratories (SNL), and General Physics Corporation are conducting a research program to determine the practicality, acceptability, and usefulness of a Human Reliability Data Bank for nuclear power industry probabilistic risk assessment (PRA). As part of this program, a survey was conducted of existing human reliability data banks from other industries, and a detailed concept of a Data Bank for the nuclear industry was developed. Subsequently, a detailed specification for implementing the Data Bank was developed. An evaluation of this specification was conducted and is described in this report. The evaluation tested data treatment, storage, and retrieval using the Data Bank structure, as modified from NUREG/CR-2744, and detailed procedures for data processing and retrieval, developed prior to this evaluation and documented in the test specification. The evaluation consisted of an Operability Demonstration and Evaluation of the data processing procedures, a Data Retrieval Demonstration and Evaluation, a Retrospective Analysis that included a survey of organizations currently operating data banks for the nuclear power industry, and an Internal Analysis of the current Data Bank System
Tapia, Carlos; Dies, Javier; Abal, Javier; Ibarra, Angel; Arroyo, Jose M.
There is an uncertainty issue about the applicability of industrial databases to new designs, such as the International Fusion Materials Irradiation Facility (IFMIF), as they usually contain elements for which no historical statistics exist. The exploration of common-component reliability data in the Accelerator Driven Systems (ADS) and Liquid Metal Technologies (LMT) frameworks is the milestone for analysing the data used in IFMIF reliability reports and for future studies. The comparison between the accelerator reliability results given in the former IFMIF reports and the databases explored has been made by means of a new accelerator Reliability, Availability, Maintainability (RAM) analysis. The reliability database used in this analysis is traceable.
A wide variety of research programs to produce cost-effective and reliable inspection techniques have arisen following the discovery of stress corrosion cracks in the control rod drive mechanism of pressurized water reactors, notably in France, Belgium and Spain. This article describes the research program results from a cooperative partnership between Comex Nucleaire, Westinghouse Electric and AEA Technology. The package developed offers techniques to provide complete capability in virtually all the design configurations used world-wide. After extensive acceptance trials in France and the United States the techniques are now being used on site at Bugey 3. (UK)
Condon, David; Revelle, William
Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.
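One of the classic internal-consistency estimates such assessments rely on is Cronbach's α. A minimal sketch using only the standard library; the specific open-source tooling the abstract refers to is not named here, so this is an illustration of the statistic itself.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents x items score table.

    scores: list of respondents, each a list of item scores (equal length).
    """
    k = len(scores[0])
    items = list(zip(*scores))                         # one tuple per item
    item_vars = sum(pvariance(col) for col in items)   # sum of item variances
    total_var = pvariance([sum(row) for row in scores])  # variance of totals
    return k / (k - 1) * (1 - item_vars / total_var)
```

Perfectly correlated items drive α toward 1; uncorrelated items drive it toward 0, which is the sense in which low reliability attenuates observed relationships.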
Edens, John F; Penson, Brittany N; Ruchensky, Jared R; Cox, Jennifer; Smith, Shannon Toney
Published research suggests that most violence risk assessment tools have relatively high levels of interrater reliability, but recent evidence of inconsistent scores among forensic examiners in adversarial settings raises concerns about the "field reliability" of such measures. This study specifically examined the reliability of Violence Risk Appraisal Guide (VRAG) scores in Canadian criminal cases identified in the legal database LexisNexis. Over 250 reported cases were located that made mention of the VRAG, with 42 of these cases containing 2 or more scores that could be submitted to interrater reliability analyses. Overall, scores were skewed toward higher risk categories. The intraclass correlation (ICC(A,1)) was .66, with pairs of forensic examiners placing defendants into the same VRAG risk "bin" in 68% of the cases. For categorical risk statements (i.e., low, moderate, high), examiners provided converging assessment results in most instances (86%). In terms of potential predictors of rater disagreement, there was no evidence for adversarial allegiance in our sample. Rater disagreement in the scoring of 1 VRAG item (Psychopathy Checklist-Revised; Hare, 2003), however, strongly predicted rater disagreement in the scoring of the VRAG (r = .58).
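The intraclass correlation underlying such field-reliability work can be illustrated with the simpler one-way variant, ICC(1,1), computed from ANOVA sums of squares. The study itself reports ICC(A,1), so this is a sketch of the idea rather than the published computation.

```python
from statistics import mean

def icc_oneway(ratings):
    """One-way random, single-rater ICC(1,1) for an n-targets x k-raters table."""
    n, k = len(ratings), len(ratings[0])
    grand = mean(x for row in ratings for x in row)
    row_means = [mean(row) for row in ratings]
    # Between-target and within-target mean squares.
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfect agreement between raters yields 1; systematic disagreement can push the estimate below zero.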
Uythoven, J; Carlier, E; Castronuovo, F; Ducimetière, L; Gallet, E; Goddard, B; Magnin, N; Verhagen, H
The LHC Beam Dumping System is one of the vital elements of the LHC Machine Protection System and has to operate reliably every time a beam dump request is made. Detailed dependability calculations have been made, resulting in expected rates for the different system failure modes. A 'reliability run' of the whole system, installed in its final configuration in the LHC, has been made to discover infant mortality problems and to compare the occurrence of the measured failure modes with their calculations.
The report describes aims, rules and results of the Systems Reliability Benchmark Exercise, which has been performed in order to assess methods and procedures for reliability analysis of complex systems and involved a large number of European organizations active in NPP safety evaluation. The exercise included both qualitative and quantitative methods and was structured in such a way that separation of the effects of uncertainties in modelling and in data on the overall spread was made possible. Part I describes the way in which RBE has been performed, its main results and conclusions
Egan, T.; Turk, E.
The reliability of the North American electricity grid was discussed. Government initiatives designed to control carbon dioxide (CO2) and other emissions in some regions of Canada may lead to electricity supply constraints in other regions. A lack of investment in transmission infrastructure has resulted in constraints within the North American transmission grid, and the growth of smaller projects is now raising concerns about transmission capacity. Labour supply shortages in the electricity industry are also creating concerns about the long-term security of the electricity market. Measures to address constraints must be considered in the current context of the North American electricity system. The extensive transmission interconnects and integration between the United States and Canada will provide a framework for greater trade and market opportunities between the 2 countries. Coordinated actions and increased integration will enable Canada and the United States to increase the reliability of electricity supply. However, both countries must work cooperatively to increase generation supply using both mature and emerging technologies. The cross-border transmission grid must be enhanced by increasing transmission capacity as well as by implementing new reliability rules, building new infrastructure, and ensuring infrastructure protection. Barriers to cross-border electricity trade must be identified and avoided. Demand-side and energy efficiency measures must also be implemented. It was concluded that both countries must focus on developing strategies for addressing the environmental concerns related to electricity production. 6 figs
Basalamah, Anas; Sato, Takuro
For wireless multicast applications like multimedia conferencing, voice over IP and video/audio streaming, reliable transmission of packets within a short delivery delay is needed. Moreover, reliability is crucial to the performance of error-intolerant applications like file transfer, distributed computing, chat and whiteboard sharing. Forward Error Correction (FEC) is frequently used in wireless multicast to enhance Packet Error Rate (PER) performance, but cannot assure full reliability unless coupled with Automatic Repeat Request, forming what is known as Hybrid-ARQ. While reliable FEC can be deployed at different levels of the protocol stack, it cannot be deployed on the MAC layer of the unreliable IEEE802.11 WLAN due to its inability to exchange ACKs with multiple recipients. In this paper, we propose a Multicast MAC protocol that enhances WLAN reliability by using Adaptive FEC and study its performance through mathematical analysis and simulation. Our results show that our protocol can deliver high reliability and throughput performance.
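The packet-level FEC idea can be illustrated with the simplest erasure code, a single XOR parity packet, which lets a receiver rebuild any one lost packet without an ARQ round trip. This is a toy sketch of the principle, not the adaptive scheme proposed in the paper.

```python
def xor_parity(packets):
    """Parity packet: byte-wise XOR of equal-length data packets."""
    out = bytearray(len(packets[0]))
    for p in packets:
        for i, b in enumerate(p):
            out[i] ^= b
    return bytes(out)

def recover(received, parity):
    """Rebuild a single lost packet from the survivors plus the parity packet.

    XORing all surviving packets with the parity cancels them out,
    leaving exactly the missing packet.
    """
    return xor_parity(list(received) + [parity])
```

Real multicast FEC schemes use stronger codes (e.g. Reed-Solomon) so that several losses per block can be repaired, with the code rate adapted to the observed PER.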
Cloud service consumers demand reliable methods for choosing an appropriate cloud service provider for their requirements. The number of cloud consumers is increasing day by day, and so is the number of cloud providers; hence the requirement for a common platform for interaction between cloud provider and cloud consumer is also on the rise. This paper introduces the Cloud Providers Market Platform Dashboard. It will act not only as a means of cloud provider discoverability but will also provide timely reports to consumers on cloud ser...
Mahboubi-Moghaddam, Esmaeil; Nayeripour, Majid; Aghaei, Jamshid
Highlights: • The operation of Energy Service Providers (ESPs) in electricity markets is modeled. • Demand response as the cost-effective solution is used for energy service provider. • The market price uncertainty is modeled using the robust optimization technique. • The reliability of the distribution network is embedded into the framework. • The simulation results demonstrate the benefits of robust framework for ESPs. - Abstract: Demand response (DR) programs are becoming a critical concept for the efficiency of current electric power industries. Therefore, its various capabilities and barriers have to be investigated. In this paper, an effective decision model is presented for the strategic behavior of energy service providers (ESPs) to demonstrate how to participate in the day-ahead electricity market and how to allocate demand in the smart distribution network. Since market price affects DR and vice versa, a new two-step sequential framework is proposed, in which unit commitment problem (UC) is solved to forecast the expected locational marginal prices (LMPs), and successively DR program is applied to optimize the total cost of providing energy for the distribution network customers. This total cost includes the cost of purchased power from the market and distributed generation (DG) units, incentive cost paid to the customers, and compensation cost of power interruptions. To obtain compensation cost, the reliability evaluation of the distribution network is embedded into the framework using some innovative constraints. Furthermore, to consider the unexpected behaviors of the other market participants, the LMP prices are modeled as the uncertainty parameters using the robust optimization technique, which is more practical compared to the conventional stochastic approach. The simulation results demonstrate the significant benefits of the presented framework for the strategic performance of ESPs.
Armstrong, Kirk J.; Jarriel, Amanda J.
Context: Providing students reliable objective feedback regarding their clinical performance is of great value for ongoing clinical skill assessment. Since a standardized patient (SP) is trained to consistently portray the case, students can be assessed and receive immediate feedback within the same clinical encounter; however, no research, to our…
Zia, Jasmine; Chung, Chia-Fang; Xu, Kaiyuan; Dong, Yi; Schenk, Jeanette M; Cain, Kevin; Munson, Sean; Heitkemper, Margaret M
There are currently no standardized methods for identifying trigger food(s) from irritable bowel syndrome (IBS) food and symptom journals. The primary aim of this study was to assess the inter-rater reliability of providers' interpretations of IBS journals. A second aim was to describe whether these interpretations varied for each patient. Eight providers reviewed 17 IBS journals and rated how likely key food groups (fermentable oligo-di-monosaccharides and polyols, high-calorie, gluten, caffeine, high-fiber) were to trigger IBS symptoms for each patient. Agreement of trigger food ratings was calculated using Krippendorff's α-reliability estimate. Providers were also asked to write down recommendations they would give to each patient. Estimates of agreement of trigger food likelihood ratings were poor (average α = 0.07). Most providers gave similar trigger food likelihood ratings for over half the food groups. Four providers gave the exact same written recommendation(s) (range 3-7) to over half the patients. Inter-rater reliability of provider interpretations of IBS food and symptom journals was poor. Providers favored certain trigger food likelihood ratings and written recommendations. This supports the need for a more standardized method for interpreting these journals and/or more rigorous techniques to accurately identify personalized IBS food triggers.
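Krippendorff's α generalizes inter-rater agreement to any number of raters by comparing observed to expected disagreement. A minimal sketch for complete nominal ratings; the study scored ordinal likelihood ratings, so this illustrates the estimator rather than reproducing the analysis.

```python
from collections import Counter

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for complete nominal data.

    units: one list of ratings per unit (e.g. per journal), each with
    at least 2 ratings and no missing values.
    """
    coinc = Counter()  # coincidence matrix o(c, k)
    n = 0              # total number of pairable values
    for ratings in units:
        m = len(ratings)
        n += m
        counts = Counter(ratings)
        for c, nc in counts.items():
            for k, nk in counts.items():
                pairs = nc * (nc - 1) if c == k else nc * nk
                coinc[(c, k)] += pairs / (m - 1)
    totals = Counter()
    for (c, _k), v in coinc.items():
        totals[c] += v     # marginals equal the raw category counts
    do = sum(v for (c, k), v in coinc.items() if c != k) / n
    de = sum(totals[c] * totals[k]
             for c in totals for k in totals if c != k) / (n * (n - 1))
    return 1.0 - do / de
```

Values near 1 indicate reliable coding; values near 0, like the α = 0.07 reported here, mean agreement is barely better than chance.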
Kaplani, E.; Kaplanis, S.
Highlights: ► Solar radiation data for European cities follow the Extreme Value or Weibull distribution. ► Simulation model for the sizing of SAPV systems based on energy balance and stochastic analysis. ► Simulation of PV Generator-Loads-Battery Storage System performance for all months. ► Minimum peak power and battery capacity required for reliable SAPV sizing for various European cities. ► Peak power and battery capacity reduced by more than 30% for operation at 95% success rate. -- Abstract: The large fluctuations observed in the daily solar radiation profiles affect highly the reliability of the PV system sizing. Increasing the reliability of the PV system requires higher installed peak power (Pm) and larger battery storage capacity (CL). This leads to increased costs, and makes PV technology less competitive. This research paper presents a new stochastic simulation model for stand-alone PV systems, developed to determine the minimum installed Pm and CL for the PV system to be energy independent. The stochastic simulation model developed makes use of knowledge acquired from an in-depth statistical analysis of the solar radiation data for the site, and simulates the energy delivered, the excess energy burnt, the load profiles and the state of charge of the battery system for the month the sizing is applied, and the PV system performance for the entire year. The simulation model provides the user with values for the autonomy factor d, simulating PV performance in order to determine the minimum Pm and CL depending on the requirements of the application, i.e. operation with critical or non-critical loads. The model makes use of NASA's Surface meteorology and Solar Energy database for the years 1990–2004 for various cities in Europe with a different climate. The results obtained with this new methodology indicate a substantial reduction in installed peak power and battery capacity, both for critical and non-critical operation, when compared to
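The core of such a sizing model is an hour-by-hour energy balance on the battery. A minimal deterministic sketch; the loss-free battery and simple capacity clamp are simplifying assumptions, and the paper's stochastic treatment of solar input is omitted.

```python
def simulate_soc(pv_kwh, load_kwh, capacity_kwh, soc0=None):
    """Hourly energy balance of a stand-alone PV-battery system.

    Returns the final state of charge and the fraction of hours
    in which the load was fully served.
    """
    soc = capacity_kwh if soc0 is None else soc0
    met = 0
    for pv, load in zip(pv_kwh, load_kwh):
        soc += pv - load
        if soc >= 0:
            met += 1
        else:
            soc = 0.0                    # deficit hour: load not fully served
        soc = min(soc, capacity_kwh)     # surplus beyond capacity is burnt
    return soc, met / len(pv_kwh)
```

Sizing then amounts to searching for the smallest peak power and capacity for which the served fraction meets the required success rate (e.g. 95%).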
Andrea Cukusic Kalajzic
Structural and functional analysis of telomeres is very important for understanding basic biological functions such as genome stability, cell growth control, senescence and aging. Recently, serious concerns have been raised regarding the reliability of current telomere measurement methods such as Southern blot and quantitative polymerase chain reaction. Since telomere length is associated with age related pathologies, including cardiovascular disease and cancer, both at the individual and population level, accurate interpretation of measured results is a necessity. The telomere Q-PNA-FISH technique has been widely used in these studies as well as in commercial analysis for the general population. A hallmark of telomere Q-PNA-FISH is the wide variation among telomere signals which has a major impact on obtained results. In the present study we introduce a specific mathematical and statistical analysis of sister telomere signals during cell culture senescence which enabled us to identify high regularity in their variations. This phenomenon explains the reproducibility of results observed in numerous telomere studies when the Q-PNA-FISH technique is used. In addition, we discuss the molecular mechanisms which probably underlie the observed telomere behavior.
Frank, Guido K W; Favaro, Angela; Marsh, Rachel; Ehrlich, Stefan; Lawson, Elizabeth A
Human brain imaging can help improve our understanding of mechanisms underlying brain function and how they drive behavior in health and disease. Such knowledge may eventually help us to devise better treatments for psychiatric disorders. However, the brain imaging literature in psychiatry and especially eating disorders has been inconsistent, and studies are often difficult to replicate. The extent or severity of extremes of eating and state of illness, which are often associated with differences in, for instance hormonal status, comorbidity, and medication use, commonly differ between studies and likely add to variation across study results. Those effects are in addition to the well-described problems arising from differences in task designs, data quality control procedures, image data preprocessing and analysis or statistical thresholds applied across studies. Which of those factors are most relevant to improve reproducibility is still a question for debate and further research. Here we propose guidelines for brain imaging research in eating disorders to acquire valid results that are more reliable and clinically useful.
The Event Sequence Reliability Benchmark Exercise is the fourth in a series of benchmark exercises on reliability and risk assessment, with specific reference to nuclear power plant applications, and is the logical continuation of the previous benchmark exercises on System Analysis, Common Cause Failure and Human Factors. The reference plant is the nuclear power plant at Grohnde, Federal Republic of Germany, a 1300 MW PWR plant of KWU design. The specific objective of the Exercise is to model, quantify and analyze those event sequences, initiated by the occurrence of a loss of offsite power, that involve the steam generator feed. The general aim is to develop a segment of a risk assessment which ought to include all the specific aspects and models of quantification, such as common cause failure, human factors and system analysis, developed in the previous reliability benchmark exercises, with the addition of the specific topics of dependences between homologous components belonging to different systems featuring in a given event sequence and of uncertainty quantification, to end up with an overall assessment of the state of the art in risk assessment and of the relative influence of quantification problems in a general risk assessment framework. The Exercise has been carried out in two phases, both requiring modelling and quantification, with the second phase adopting more restrictive rules and fixing certain common data, as emerged necessary from the first phase. Fourteen teams participated in the Exercise, mostly from EEC countries, with one from Sweden and one from the USA. (author)
Pearson, Amy CS; Moman, Rajat N; Moeschler, Susan M; Eldrige, Jason S; Hooten, W Michael
Introduction: Many providers report a lack of confidence in managing patients with chronic pain. Thus, the primary aim of this study was to investigate the associations of provider confidence in managing chronic pain with their practice behaviors and demographics. Materials and methods: The primary outcome measure was the results of the Opioid Therapy Provider Survey, which was administered to clinicians attending a pain-focused continuing medical education conference. Nonparametric correlations were assessed using Spearman's rho. Results: Of the respondents, 55.0% were women, 92.8% were white, and 56.5% were physicians. Primary care providers accounted for 56.5% of the total respondents. The majority of respondents (60.8%) did not feel confident managing patients with chronic pain. Provider confidence in managing chronic pain was positively correlated with 1) following an opioid therapy protocol (P=0.001), 2) the perceived ability to identify patients at risk for opioid misuse (P=0.006), and 3) using a consistent practice-based approach to improve their comfort level with prescribing opioids, and was negatively correlated with the perception that treating pain patients was a "problem in my practice" (P=0.005). Conclusion: In this study, the majority of providers did not feel confident managing chronic pain. However, provider confidence was associated with a protocolized and consistent practice-based approach toward managing opioids and the perceived ability to identify patients at risk for opioid misuse. Future studies should investigate whether provider confidence is associated with measurable competence in managing chronic pain and explore approaches to enhance appropriate levels of confidence in caring for patients with chronic pain. PMID: 28652805
Lacourpaille, Lilian; Hug, François; Bouillard, Killian; Nordez, Antoine; Hogrel, Jean-Yves
The aim of the present study was to assess the reliability of shear elastic modulus measurements performed using supersonic shear imaging (SSI) in nine resting muscles (i.e. gastrocnemius medialis, tibialis anterior, vastus lateralis, rectus femoris, triceps brachii, biceps brachii, brachioradialis, adductor pollicis obliquus and abductor digiti minimi) of different architectures and typologies. Thirty healthy subjects were randomly assigned to the intra-session reliability (n = 20), inter-day reliability (n = 21) and inter-observer reliability (n = 16) experiments. Muscle shear elastic modulus ranged from 2.99 kPa (gastrocnemius medialis) to 4.50 kPa (adductor digiti minimi and tibialis anterior). On the whole, very good reliability was observed, with a coefficient of variation (CV) ranging from 4.6% to 8%, except for the inter-operator reliability of adductor pollicis obliquus (CV = 11.5%). The intraclass correlation coefficients were good (0.871 ± 0.045 for intra-session reliability, 0.815 ± 0.065 for inter-day reliability and 0.709 ± 0.141 for inter-observer reliability). Both the reliability and the ease of use of SSI make it a potentially interesting technique that would be of benefit to fundamental, applied and clinical research projects that need an accurate assessment of muscle mechanical properties.
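The coefficient of variation reported here is simply the between-trial spread relative to the mean, expressed as a percentage. A one-line sketch:

```python
from statistics import mean, stdev

def cv_percent(measurements):
    """Coefficient of variation (%) across repeated measurements of one muscle."""
    return 100.0 * stdev(measurements) / mean(measurements)
```

A CV under roughly 10%, as observed for most muscles in the study, is conventionally read as good measurement repeatability.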
Green, L.; Weinstock, E.V.; Karlin, E.W.
The development and production of safeguards equipment is a complex process containing many potential pitfalls between the conceptual design and its implementation in the field. The conditions for equipment use are especially demanding. At the same time, the consequences of failure may be serious. Repeated failure may result in the loss of credibility of safeguards. Expensive backup measures such as re-verification of inventories may be required. Inspectors may come to distrust the equipment. Finally, the expense of maintaining the equipment may be excessive. It is therefore essential that the process for bringing equipment from the conceptual stage to actual routine use minimizes the risk of producing equipment that is unsuitable for the job. Fortunately, approaches for accomplishing this have already been developed in both the industrial and commercial sectors. One such approach, the Low Risk Transition Plan (LRTP), is described to show how it can be applied to the production of reliable safeguards equipment.
Sandau, Martin; Koblauch, Henrik; Moeslund, Thomas B.
Estimating 3D joint rotations in the lower extremities accurately and reliably remains unresolved in markerless motion capture, despite extensive studies in the past decades. The main problems have been ascribed to the limited accuracy of the 3D reconstructions. Accordingly, the purpose of the pr… … subjects in whom the hip, knee and ankle joints were analysed. Flexion/extension angles as well as hip abduction/adduction closely resembled those obtained from the marker-based system. However, the internal/external rotations, knee abduction/adduction and ankle inversion/eversion were less reliable.
van Ast, J. F.; Talmon, J. L.; Renier, W. O.; Hasman, A.
We are developing seizure descriptions as a basis for decision support. Based on an existing dataset, we used the Spearman-Brown prophecy formula to estimate how many neurologists/epileptologists are needed to obtain reliable seizure descriptions (rho = 0.9). By extending the number of participants to…
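The Spearman-Brown prophecy formula predicts the reliability of an average over k raters, rho_k = k*rho / (1 + (k-1)*rho), and inverting it gives the number of raters needed for a target reliability. A sketch; the dataset's actual single-rater rho is not given in the abstract, so the values below are illustrative.

```python
import math

def spearman_brown(rho_single, k):
    """Predicted reliability when averaging the ratings of k raters."""
    return k * rho_single / (1 + (k - 1) * rho_single)

def raters_needed(rho_single, rho_target):
    """Smallest number of raters whose average reaches the target reliability."""
    k = (rho_target * (1 - rho_single)) / (rho_single * (1 - rho_target))
    return math.ceil(k - 1e-9)  # tolerate float rounding at exact integers
```

For example, with a single-rater reliability of 0.5, nine raters would be needed to reach the rho = 0.9 criterion the abstract mentions.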
Pidano, Anne E.; Honigfeld, Lisa; Bar-Halpern, Miri; Vivian, James E.
Background: As many as 20 % of children have diagnosable mental health conditions and nearly all of them receive pediatric primary health care. However, most children with serious mental health concerns do not receive mental health services. This study tested hypotheses that pediatric primary care providers (PPCPs) in relationships with mental…
B. V. Savchinskiy
On the basis of an analysis of existing diagnostic methods and ways of regenerating the reinforced-concrete structures of bridges, recommendations are offered on introducing new, modern technologies for the renewal of reinforced-concrete bridge structures to provide their operating reliability and longevity.
This paper discusses issues regarding the validity and reliability of psychoeducational assessments provided to Disability Services Offices at Canadian Universities. Several vignettes illustrate some current issues and the potential consequences when university students are given less than thorough disability evaluations and ascribed diagnoses.…
Biryuk, V. V.; Tsapkova, A. B.; Larin, E. A.; Livshiz, M. Y.; Sheludko, L. P.
A set of mathematical models for calculating the reliability indexes (RI) of structurally complex multifunctional combined installations in heat and power supply systems was developed. Reliability of energy supply is considered a required condition for the creation and operation of heat and power supply systems. The optimal value of the power supply system coefficient F is based on an economic assessment of the consumers' losses caused by the under-supply of electric power and the additional system expenses for the creation and operation of an emergency capacity reserve. Standard RI values for industrial heat supply are based on the concept of a technological safety margin of the production processes. Standard RI values for the heat supply of communal consumers are defined from the air temperature level inside the heated premises. The complex allows solving a number of practical tasks for ensuring the reliability of heat supply for consumers. A probabilistic model is developed for calculating the reliability indexes of combined multipurpose heat and power plants in heat-and-power supply systems. The complex of models and calculation programs can be used to solve a wide range of specific tasks of optimizing the schemes and parameters of combined heat and power plants and systems, as well as determining the efficiency of various redundancy methods to ensure the specified reliability of power supply.
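A basic building block of such reliability-index calculations is the steady-state availability of series and redundant (parallel) component groups. A minimal sketch for independent components; this illustrates the redundancy comparison, not the paper's full probabilistic model.

```python
def series_availability(avails):
    """A series group works only if every component works."""
    a = 1.0
    for x in avails:
        a *= x
    return a

def parallel_availability(avails):
    """A redundant group fails only if every component fails."""
    u = 1.0
    for x in avails:
        u *= (1.0 - x)
    return 1.0 - u
```

Comparing the two shows why an emergency capacity reserve pays off: two 90%-available units in parallel yield 99% availability, while the same units in series yield only 81%.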
Sheleshey, Tanya Viktorivna
Erosion and corrosion damage of the elements of electric equipment is one of the main reasons for disturbance of normal operation, and sometimes for outages, of thermal power equipment. The metal loss in the elements of power plants due to erosion and corrosion wear over their entire service life exceeds 8 % of the original mass, which results in reduced efficiency. In addition, erosion and corrosion products cause blockage of the water-steam circuit, which negatively affects a r...
File systems and applications implement their own update protocols to guarantee data consistency, which is one of the most crucial aspects of computing systems. However, we found that storage devices are substantially under-utilized when preserving data consistency, because doing so generates massive storage write traffic with many disk cache flush operations and force-unit-access (FUA) commands. In this paper, we present DJFS (Delta-Journaling File System), which provides both a high level of performance and data consistency for different applications. We made three technical contributions to achieve our goal. First, to remove all storage accesses with disk cache flush operations and FUA commands, DJFS uses a small NVRAM for the file system journal. Second, to reduce the access latency and space requirements of NVRAM, DJFS journals compressed deltas of the modified blocks. Finally, to relieve explicit checkpointing overhead, DJFS aggressively reflects checkpoint transactions to the file system area in units of a specified region. Our evaluation on the TPC-C SQLite benchmark shows that, using our novel optimization schemes, DJFS outperforms Ext4 by up to 64.2 times with only 128 MB of NVRAM.
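The delta-journaling idea (journal a compressed difference of a modified block rather than the full block) can be sketched as follows. This is a minimal illustration, not DJFS code; the XOR delta, the block size and the function name are assumptions:

```python
import zlib

BLOCK_SIZE = 4096

def journal_record(old_block: bytes, new_block: bytes) -> bytes:
    # Byte-wise XOR delta: zero wherever the block is unchanged,
    # so small modifications compress extremely well.
    delta = bytes(a ^ b for a, b in zip(old_block, new_block))
    compressed = zlib.compress(delta)
    # Fall back to journaling the raw block if compression does not pay off.
    return compressed if len(compressed) < len(new_block) else new_block

old = bytes(BLOCK_SIZE)                  # original 4 KiB block (all zeros here)
new = bytearray(old)
new[100:108] = b"newdata!"               # modify 8 bytes out of 4096
record = journal_record(old, bytes(new))
print(f"journal record: {len(record)} bytes instead of {BLOCK_SIZE}")
```

The journal entry for an 8-byte modification shrinks from a full 4 KiB block to a few dozen bytes, which is the space saving the paper exploits to fit the journal into a small NVRAM.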
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin
Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.
Michael L. Mann
Unreliable electricity supplies are common in developing countries and impose large socio-economic costs, yet precise information on electricity reliability is typically unavailable. This paper presents preliminary results from a machine-learning approach that uses satellite imagery of nighttime lights to estimate electricity reliability for western India at a finer spatial scale. We use data from the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Suomi National Polar Partnership (SNPP) satellite together with newly available data from networked household voltage meters. Our results point to the possibilities of this approach as well as areas for refinement. With currently available training data, we find a limited ability to detect individual outages identified by household-level measurements of electricity voltage, likely because of the relatively small number of individual outages observed in our preliminary data. However, we find that the approach can estimate electricity reliability rates for individual locations fairly well, with the predicted-versus-actual regression yielding an R² > 0.5. We also find that, despite the after-midnight overpass time of the SNPP satellite, the derived reliability estimates are representative of daytime reliability.
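The predicted-versus-actual comparison reported above boils down to a coefficient of determination. A minimal sketch, with purely illustrative reliability rates rather than the study's data:

```python
def r_squared(actual, predicted):
    """Coefficient of determination of predicted vs. actual values."""
    mean_a = sum(actual) / len(actual)
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    return 1.0 - ss_res / ss_tot

# Illustrative reliability rates (fraction of nights with power) per location.
actual    = [0.92, 0.85, 0.78, 0.95, 0.60, 0.88, 0.70, 0.82]
predicted = [0.90, 0.80, 0.75, 0.97, 0.66, 0.85, 0.74, 0.78]
print(f"R2 = {r_squared(actual, predicted):.2f}")
```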
Homke, P.; Kutsch, W.; Lindauer, E.
This paper gives a survey of reliability data assessment in the FRG. The activities carried out for the German Risk Assessment Study are presented together with selected results. A systematic data collection in a nuclear power plant is described, and the experiences gained in this project are discussed.
Waste packages for a US nuclear waste repository are required to provide reasonable assurance of maintaining substantially complete containment of radionuclides for 300 to 1000 years after closure. The waiting time to failure for complex failure processes affecting engineered or manufactured systems is often found to be an exponentially-distributed random variable. Assuming that this simple distribution can be used to describe the behavior of a hypothetical single barrier waste package, calculations presented in this paper show that the mean time to failure (the only parameter needed to completely specify an exponential distribution) would have to be more than 10^7 years in order to provide reasonable assurance of meeting this requirement. With two independent barriers, each would need to have a mean time to failure of only 10^5 years to provide the same reliability. Other examples illustrate how multiple barriers can provide a strategy for not only achieving but demonstrating regulatory compliance
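The barrier arithmetic above follows directly from the exponential survival model. A sketch assuming an illustrative acceptable failure probability of 10⁻⁴ over a 1000-year containment period (the paper's exact assurance level is not restated here):

```python
import math

t = 1000.0        # required containment period, years
p_target = 1e-4   # assumed acceptable failure probability (illustrative)

# Single barrier: P(fail by t) = 1 - exp(-t/mttf), so mttf = -t / ln(1 - p).
mttf_single = -t / math.log1p(-p_target)

# Two independent barriers: both must fail, P = (1 - exp(-t/m))**2,
# so each barrier only needs m = -t / ln(1 - sqrt(p)).
mttf_double = -t / math.log1p(-math.sqrt(p_target))

print(f"single barrier MTTF: {mttf_single:.2e} years")   # on the order of 1e7
print(f"each of two barriers: {mttf_double:.2e} years")  # on the order of 1e5
```

Requiring both barriers to fail turns a probability p into roughly p², which is why each barrier's mean time to failure can drop by two orders of magnitude.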
For the investigation of the risk of nuclear power plants loss-of-coolant accidents and transients have to be analyzed. The different functions of the engineered safety features installed to cope with transients are explained. The event tree analysis is carried out for the important transient 'loss of normal onsite power'. Preliminary results of the reliability analyses performed for quantitative evaluation of this event tree are shown. (orig.) [de
Delucchi, Mark A.; Jacobson, Mark Z.
This is Part II of two papers evaluating the feasibility of providing all energy for all purposes (electric power, transportation, and heating/cooling), everywhere in the world, from wind, water, and the sun (WWS). In Part I, we described the prominent renewable energy plans that have been proposed and discussed the characteristics of WWS energy systems, the global demand for and availability of WWS energy, quantities and areas required for WWS infrastructure, and supplies of critical materials. Here, we discuss methods of addressing the variability of WWS energy to ensure that power supply reliably matches demand (including interconnecting geographically dispersed resources, using hydroelectricity, using demand-response management, storing electric power on site, over-sizing peak generation capacity and producing hydrogen with the excess, storing electric power in vehicle batteries, and forecasting weather to project energy supplies), the economics of WWS generation and transmission, the economics of WWS use in transportation, and policy measures needed to enhance the viability of a WWS system. We find that the cost of energy in a 100% WWS system will be similar to the cost today. We conclude that barriers to a 100% conversion to WWS power worldwide are primarily social and political, not technological or even economic. - Research highlights: → We evaluate the feasibility of global energy supply from wind, water, and solar energy. → WWS energy can be supplied reliably and economically to all energy-use sectors. → The social cost of WWS energy generally is less than the cost of fossil-fuel energy. → Barriers to 100% WWS power worldwide are socio-political, not techno-economic.
Troncoso, M.; Opazo, C.; Quilodran, C.; Lizama, V.
Aim: Our goal was to implement the radioisotopic method for measuring the nasal mucociliary velocity of transport (NMVT) in a feasible way, making it easily available, and to validate the accuracy of the results. Such a method is needed when primary ciliary dyskinesia (PCD) is suspected, a disorder characterized by low NMVT and non-specific chronic respiratory symptoms, which must be confirmed by electron-microscopic cilia biopsy. Methods: We performed one hundred studies from February 2000 until February 2002. Patients were aged 2 months to 39 years (mean 9 years). All of them were referred from the Respiratory Disease Department. Ninety had upper or lower respiratory symptoms; ten were healthy controls. The procedure, performed by the Nuclear Medicine Technologist, consists of placing a 20 μl drop of 99mTc-MAA (0.1 mCi, 4 MBq) behind the head of the inferior turbinate in one nostril using a frontal light, a nasal speculum and a teflon catheter attached to a tuberculin syringe. The drop movement was acquired on a gamma camera-computer system and the velocity was expressed in mm/min. Because the patient must not move during the procedure, sedation has to be used in non-cooperative children. Cases with abnormal NMVT values were referred for nasal biopsy. Patients were classified in three groups: normal controls (NC), PCD confirmed by biopsy (PCDB) and cases with respiratory symptoms without biopsy (RSNB). In all patients with NMVT less than 2.4 mm/min, PCD was confirmed by biopsy. There was a clear-cut separation between normal and abnormal values and, interestingly, even the highest NMVT in PCDB cases was lower than the lowest NMVT in NC. The procedure is not as easy as generally described in the literature, because the operator has to acquire some skill and sedation is needed in some cases. Conclusion: The procedure gives reliable, reproducible and objective results. It is safe, inexpensive and quick in cooperative patients. Although, sometimes
Starke, Michael R [ORNL; Kirby, Brendan J [ORNL; Kueck, John D [ORNL; Todd, Duane [Alcoa; Caulfield, Michael [Alcoa; Helms, Brian [Alcoa
Demand response is the largest underutilized reliability resource in North America. Historic demand response programs have focused on reducing overall electricity consumption (increasing efficiency) and shaving peaks, but have not typically been used for immediate reliability response. Many of these programs have been successful, but demand response remains a limited resource. The Federal Energy Regulatory Commission (FERC) report, 'Assessment of Demand Response and Advanced Metering' (FERC 2006), found that only five percent of customers are on some form of demand response program. Collectively they represent an estimated 37,000 MW of response potential. These programs reduce overall energy consumption, lower greenhouse gas emissions by allowing fossil fuel generators to operate at increased efficiency, and reduce stress on the power system during periods of peak loading. As the country continues to restructure energy markets with sophisticated marginal cost models that attempt to minimize total energy costs, the ability of demand response to create meaningful shifts in the supply and demand equations is critical to creating a sustainable and balanced economic response to energy issues. Restructured energy market prices are set by the cost of the next incremental unit of energy, so that as additional generation is brought into the market, the cost for the entire market increases. The benefit of demand response is that it reduces overall demand and shifts the entire market to a lower pricing level. This can be very effective in mitigating price volatility or scarcity pricing as the power system responds to changing demand schedules, loss of large generators, or loss of transmission. As a global producer of alumina, primary aluminum, and fabricated aluminum products, Alcoa Inc. has the capability to provide demand response services through its manufacturing facilities and uniquely through its aluminum smelting facilities. For a typical aluminum smelter
Bulte, J P; Wauters, C A P; Duijm, L E M; de Wilt, J H W; Strobbe, L J A
Fine Needle Aspiration Biopsy (FNAB), Core Needle Biopsy (CNB) and hybrid techniques including Core Wash Cytology (CWC) are available for same-day diagnosis in breast lesions. In CWC, a washing of the biopsy core is processed for a provisional cytological diagnosis, after which the core is processed like a regular CNB. This study focuses on the reliability of CWC in daily practice. All consecutive CWC procedures performed in a referral breast centre between May 2009 and May 2012 were reviewed, correlating CWC results with the CNB result, the definitive diagnosis after surgical resection and/or follow-up. Symptomatic as well as screen-detected lesions undergoing CNB were included. 1253 CWC procedures were performed. Definitive histology showed 849 (68%) malignant and 404 (32%) benign lesions. 80% of CWC procedures yielded a conclusive diagnosis; this percentage was higher amongst malignant lesions and lower for benign lesions (89% and 62%, respectively). Sensitivity and specificity of a conclusive CWC result were 98.3% and 90.4%, respectively. The eventual incidence of malignancy in the cytological 'atypical' group (5%) was similar to that in the cytological 'benign' group (6%). CWC can be used to make a reliable provisional diagnosis of breast lesions within the hour. The high probability of conclusive results in malignant lesions makes CWC well suited for high-risk populations. Copyright © 2016 Elsevier Ltd, BASO ~ the Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
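Sensitivity and specificity as reported above follow from a standard confusion-matrix calculation. The counts below are hypothetical values chosen to be consistent with the reported rates, not the study's raw data:

```python
def sensitivity(tp, fn):
    """Fraction of truly malignant lesions called malignant."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of truly benign lesions called benign."""
    return tn / (tn + fp)

# Hypothetical counts among conclusive CWC results (illustrative only).
tp, fn = 740, 13    # malignant lesions: detected vs. missed
tn, fp = 226, 24    # benign lesions: correctly called vs. false alarms

print(f"sensitivity = {sensitivity(tp, fn):.1%}")
print(f"specificity = {specificity(tn, fp):.1%}")
```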
Van Oost, S; Dubois, M; Bekaert, G [Societe Anonyme Belge de Construction Aeronautique - SABCA (Belgium)
The paper presents the results of various SABCA activities in the field of two-phase heat transport systems. These results are based on a critical review and analysis of existing two-phase loops and of future loop needs in space applications. The research and development of a high-capillary wick (capillary pressure up to 38 000 Pa) are described. These activities have led to the development of a reliable high-performance capillary loop concept (HPCPL), which is discussed in detail. Several loop configurations (mono- and multi-evaporator) have been ground tested. The presented results of various tests clearly show the viability of this concept for future applications. Proposed flight demonstrations as well as potential applications conclude this paper. (authors) 7 refs.
Utility load management programs--including direct load control and interruptible load programs--were employed by utilities in the past as system reliability resources. With electricity industry restructuring, the context for these programs has changed; the market that was once controlled by vertically integrated utilities has become competitive, raising the question: can existing load management programs be modified so that they can effectively participate in competitive energy markets? In the short run, modified and/or improved operation of load management programs may be the most effective form of demand-side response available to the electricity system today. However, in light of recent technological advances in metering, communication, and load control, utility load management programs must be carefully reviewed in order to determine appropriate investments to support this transition. This report investigates the feasibility of and options for modifying an existing utility load management system so that it might provide reliability services (i.e. ancillary services) in the competitive markets that have resulted from electricity industry restructuring. The report is a case study of Southern California Edison's (SCE) load management programs. SCE was chosen because it operates one of the largest load management programs in the country and it operates them within a competitive wholesale electricity market. The report describes a wide range of existing and soon-to-be-available communication, control, and metering technologies that could be used to facilitate the evolution of SCE's load management programs and systems to provision of reliability services. The fundamental finding of this report is that, with modifications, SCE's load management infrastructure could be transitioned to provide critical ancillary services in competitive electricity markets, employing currently or soon-to-be available load control technologies.
García-Ramos, Amador; Feriche, Belén; Pérez-Castilla, Alejandro; Padial, Paulino; Jaric, Slobodan
This study aimed to explore the strength of the force-velocity (F-V) relationship of lower limb muscles and the reliability of its parameters (maximum force [F0], slope [a], maximum velocity [V0], and maximum power [P0]). Twenty-three men were tested in two different jump types (squat and countermovement jump: SJ and CMJ), performed under two different loading conditions (free weight and Smith machine: Free and Smith) with 0, 17, 30, 45, 60, and 75 kg loads. The maximum and averaged values of F and V were obtained for modelling the F-V relationship. All F-V relationships were strong and linear, whether obtained from data averaged across participants (r ≥ 0.98) or from individual data (r = 0.94-0.98), while their parameters were generally highly reliable (F0 [CV: 4.85%, ICC: 0.87], V0 [CV: 6.10%, ICC: 0.82], a [CV: 10.5%, ICC: 0.81], and P0 [CV: 3.5%, ICC: 0.93]). Both the strength of the F-V relationships and the reliability of their parameters were significantly higher for (1) the CMJ over the SJ, (2) the Free over the Smith loading type, and (3) the maximum over the averaged F and V variables. In conclusion, although the F-V relationships obtained from all the jumps tested were linear and generally highly reliable, the least appropriate choice for testing the F-V relationship would be the averaged F and V data obtained from the SJ, performed either with free weights or in a Smith machine. Insubstantial differences exist among the other combinations tested.
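The F-V parameters named above follow from a simple linear fit of force against velocity. A sketch with invented, noise-free data; the sign convention for the slope and the relation P0 = F0·V0/4 are standard for a linear F-V model, but the numbers are illustrative:

```python
def least_squares(x, y):
    """Ordinary least-squares intercept and slope."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Illustrative (invented, noise-free) loaded-jump data:
# mean velocity (m/s) and mean force (N) across six loads.
v = [2.8, 2.4, 2.1, 1.7, 1.4, 1.1]
f = [600, 800, 950, 1150, 1300, 1450]

F0, slope = least_squares(v, f)   # force-axis intercept = maximum force F0
a = -slope                        # F-V slope, reported as a positive value
V0 = F0 / a                       # velocity-axis intercept = maximum velocity
P0 = F0 * V0 / 4                  # apex of the parabolic power-velocity curve
print(F0, a, V0, P0)
```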
Good reliability of the results of regular milk composition analyses could improve the health monitoring of dairy cows and herd management. The aim of this study was to analyse the measurement abilities and properties of the RT (real-time) system AfiLab = AfiMilk (a NIR (near-infrared spectroscopy) measurement unit plus electrical conductivity (C) of milk by conductometry, with AfiFarm calibration and interpretation software) for the analysis of individual milk samples (IMSs). There were 2 × 30 IMSs in the experiment. The reference values (RVs) of milk components and properties (fat (F), proteins (P), lactose (L), C and the somatic cell count (SCC)) were determined by conventional direct and indirect methods: conductometry (C); infrared spectroscopy with filter technology (1) and with Fourier transformation (2) (F, P, L); and cell counting by fluoro-opto-electronic counting in a film on a rotating disc (1) and by flow cytometry (2) (SCC). The AfiLab (alternative) method showed less close relationships to the RVs than the relationships between the reference methods, as expected. However, these relationships (r) were mostly significant: F from .597 to .738 (P ≤ 0.01 and ≤ 0.001); P from .284 to .787 (P > 0.05 and P ≤ 0.001); C .773 (P ≤ 0.001). Correlations (r) were not significant (P > 0.05) for L, from −.013 to .194, and SCC, from −.148 to −.133. Variability of the RVs explained the following percentages of variability in the AfiLab results: F up to 54.4 %; P up to 61.9 %; L only 3.8 %; C up to 59.7 %. The explanatory power (reliability) of AfiLab results for the animal increases with the regularity of their measurement (the principle of real-time application). Correlation values r (x minus 1.64 × sd, for a one-sided confidence interval at a level of 95 %) can be used for an alternative method in assessing calibration quality. These limits are F 0.564, P 0.784 and C 0.715, and can be essential for the further implementation of this advanced technology of dairy herd management.
Aly, Sharif S; Zhao, Jianyang; Li, Ben; Jiang, Jiming
The Intraclass Correlation Coefficient (ICC) is commonly used to estimate the similarity between quantitative measures obtained from different sources. Overdispersed data are traditionally transformed so that a linear mixed model (LMM) based ICC can be estimated; a common transformation is the natural logarithm. The reliability of environmental sampling of fecal slurry on freestall pens has been estimated for Mycobacterium avium subsp. paratuberculosis using natural-logarithm-transformed culture results. Recently, the negative binomial ICC was defined based on a generalized linear mixed model for negative binomial distributed data. The current study reports on the negative binomial ICC estimate, which includes fixed effects, using culture results of environmental samples. Simulations using a wide variety of inputs and negative binomial distribution parameters (r; p) showed better performance of the new negative binomial ICC compared to the LMM-based ICC, even when the negative binomial data were logarithm- or square-root-transformed. A second comparison that targeted a wider range of ICC values showed that the mean of the estimated ICC closely approximated the true ICC.
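For orientation, the classical one-way random-effects ICC (the ANOVA/LMM-style quantity that the negative binomial ICC generalizes) can be computed from mean squares. This sketch uses hypothetical duplicate counts, not the study's data or its negative binomial estimator:

```python
def icc_oneway(groups):
    """One-way random-effects ICC from ANOVA mean squares (equal group sizes)."""
    k = len(groups[0])                       # measurements per group
    n = len(groups)                          # number of groups
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical duplicate culture counts per pen (illustrative only).
pens = [[3, 4], [10, 12], [25, 21], [1, 2], [15, 17]]
print(f"ICC = {icc_oneway(pens):.2f}")
```

An ICC near 1 indicates that most variation lies between pens rather than between duplicate samples of the same pen, i.e. high sampling reliability.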
Bräutigam, Klaus-Rainer; Jörissen, Juliane; Priefer, Carmen
The reduction of food waste is seen as an important societal issue with considerable ethical, ecological and economic implications. The European Commission aims to cut food waste by one-half by 2020. However, implementing effective prevention measures requires knowledge of the reasons for, and the scale of, food waste generation along the food supply chain. The available data base for Europe is very heterogeneous and doubts about its reliability are legitimate. This mini-review gives an overview of available data on food waste generation in the EU-27 and discusses their reliability against the results of our own model calculations. These calculations are based on a methodology developed on behalf of the Food and Agriculture Organization of the United Nations and provide data on food waste generation for each of the EU-27 member states, broken down to the individual stages of the food chain and differentiated by product groups. The analysis shows that the results differ significantly, depending on the data sources chosen and the assumptions made. Further research is much needed in order to improve the data stock, which builds the basis for the monitoring and management of food waste. © The Author(s) 2014.
Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro
The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes, with studies showing different performance, has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG is under tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the analytical performance of the test on the clinical classification of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by a reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices. Copyright © 2017 The Canadian
Davis, Charles C; Willis, Charles G; Connolly, Bryan; Kelly, Courtland; Ellison, Aaron M
Climate change has resulted in major changes in the phenology of some species but not others. Long-term field observational records provide the best assessment of these changes, but geographic and taxonomic biases limit their utility. Plant specimens in herbaria have been hypothesized to provide a wealth of additional data for studying phenological responses to climatic change. However, no study to our knowledge has comprehensively addressed whether herbarium data are accurate measures of phenological response and thus applicable to addressing such questions. We compared flowering phenology determined from field observations (years 1852-1858, 1875, 1878-1908, 2003-2006, 2011-2013) and herbarium records (1852-2013) of 20 species from New England, United States. Earliest flowering date estimated from herbarium records faithfully reflected field observations of first flowering date and substantially increased the sampling range across climatic conditions. Additionally, although most species demonstrated a response to interannual temperature variation, long-term temporal changes in phenological response were not detectable. Our findings support the use of herbarium records for understanding plant phenological responses to changes in temperature, and also importantly establish a new use of herbarium collections: inferring primary phenological cueing mechanisms of individual species (e.g., temperature, winter chilling, photoperiod). These latter data are lacking from most investigations of phenological change, but are vital for understanding differential responses of individual species to ongoing climate change. © 2015 Botanical Society of America.
As a contribution towards identifying problem areas and for assessing probabilistic safety assessment (PSA) methods and procedures of analysis, JRC has organized a wide-range Benchmark Exercise on systems reliability. This has been executed by ten different teams involving seventeen organizations from nine European countries. The exercise has been based on a real case (Auxiliary Feedwater System of EDF Paluel PWR 1300 MWe Unit), starting from analysis of technical specifications, logical and topological layout and operational procedures. Terms of references included both qualitative and quantitative analyses. The subdivision of the exercise into different phases and the rules adopted allowed assessment of the different components of the spread of the overall results. It appeared that modelling uncertainties may overwhelm data uncertainties and major efforts must be spent in order to improve consistency and completeness of qualitative analysis. After successful completion of the first exercise, CEC-JRC program has planned separate exercises on analysis of dependent failures and human factors before approaching the evaluation of a complete accident sequence
Haas, Matthias; Hamm, Bernd; Niehues, Stefan M
Today, lung volumes can be easily calculated from chest computed tomography (CT) scans. Modern postprocessing workstations allow automated volume measurement of the data sets acquired. However, there are challenges in the use of lung volume as an indicator of pulmonary disease when it is obtained from routine CT: intra-individual variation and methodologic aspects have to be considered. Our goal was to assess the reliability of volumetric measurements in routine CT lung scans. Forty adult cancer patients whose lungs were unaffected by the disease underwent routine chest CT scans at 3-month intervals, resulting in a total of 302 chest CT scans. Lung volume was calculated by automatic volumetry software. On average, 7.2 CT scans were successfully evaluable per patient (range 2-15). Intra-individual changes were assessed. In the set of patients investigated, lung volume was approximately normally distributed, with a mean of 5283 cm(3) (standard deviation = 947 cm(3), skewness = -0.34, and kurtosis = 0.16). Between different scans in one and the same patient, the median intra-individual standard deviation in lung volume was 853 cm(3) (16% of the mean lung volume). Automatic lung segmentation of routine chest CT scans allows a technically stable estimation of lung volume. However, substantial intra-individual variations have to be considered: a median intra-individual deviation of 16% in lung volume between different routine scans was found. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
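The intra-individual variation statistic used above (median of per-patient standard deviations, expressed as a percentage of the mean volume) can be sketched as follows; the volumes are hypothetical:

```python
import statistics

# Hypothetical lung volumes (cm^3) from repeated routine scans per patient.
scans = {
    "pt01": [5100, 5800, 4700, 5400],
    "pt02": [6200, 5300, 6000],
    "pt03": [4300, 5100, 4600, 4900, 4200],
}

per_patient_sd = [statistics.stdev(v) for v in scans.values()]
median_sd = statistics.median(per_patient_sd)
grand_mean = statistics.mean(x for v in scans.values() for x in v)
print(f"median intra-individual SD = {median_sd:.0f} cm^3 "
      f"({median_sd / grand_mean:.0%} of mean volume)")
```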
Walser, Sarah A; Werner-Lin, Allison; Mueller, Rebecca; Miller, Victoria A; Biswas, Sawona; Bernhardt, Barbara A
This study provides preliminary data on the process and content of returning results from exome sequencing offered to children through one of the Clinical Sequencing Exploratory Research (CSER) projects. We recorded 25 sessions where providers returned diagnostic and secondary sequencing results to families. Data interpretation utilized inductive thematic analysis. Typically, providers followed a results report and discussed diagnostic findings using technical genomic and sequencing concepts. We identified four provider processes for returning results: teaching genetic concepts; assessing family response; personalizing findings; and strengthening patient-provider relationships. Sessions should reflect family interest in medical management and next steps, and minimize detailed genomic concepts. As the scope and complexity of sequencing increase, the traditional information-laden counseling model requires revision.
Omahen, G.; Zdesar, U.
Laboratories involved in protection against radiation, and therefore in the measurement of radioactivity, dose rate and contamination, have always been tied to the quality of their measurements, particularly those performing measurements for nuclear power plants. In these laboratories, however, it was long considered more important that people were professional, engaged in scientific work and able to interpret the results than that measuring instruments and quality records were reviewed. Yet the customer requires measurement results that can be trusted. This is the purpose of the standard SIST EN ISO/IEC 17025, which standardises the requirements for testing and calibration laboratories. The standard has been in force since 1999. In some countries, accreditation of testing laboratories according to SIST EN ISO/IEC 17025 is even required by regulation; this is the case, for example, in the Croatian and Slovenian regulations for laboratories measuring radioactivity, dose rate and contamination, or checking X-ray apparatus. Several laboratories have been accredited for several years. From that experience we can conclude that the customer gets reliable results from accredited laboratories at relatively low cost. On the other side, an accredited laboratory has introduced order into its work: there are rules for equipment, personnel and training, all of which eventually enhance measurement expertise. With accreditation it is also much easier to compensate for the loss of workers due to retirement or departure from the laboratory, because at every moment there must be at least two people in the laboratory who know how to work on each method. Accreditation does not improve radiation protection or reduce becquerels in the air. But at least we know how accurate an mSv or Bq is, and how small an mSv or Bq can be measured. (author) [sr
Roelofs-Thijssen, M.A.; Schreuder, M.F.; Hogeveen, M.; Herwaarden, A.E. van
OBJECTIVES: While urine sampling is necessary in the diagnosis of urinary tract infection and electrolyte disturbances, the collection of urine in neonates and non-toilet-trained children is often difficult. A universal urine collection method providing representative urinalysis results is needed.
Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L.
This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed, and it found a statistically significant factor-of-two bias on average.
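The bias test described in the abstract above can be sketched numerically: compare the mean predicted HEP with the observed failure fraction under a binomial model. The task counts (45 failures out of 4071 critical tasks) come from the abstract; the mean ASEP HEP of 0.022 is a purely hypothetical placeholder, not a value from the report.

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), summed exactly term by term."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n_tasks, n_failed = 4071, 45            # counts reported in the abstract
observed_rate = n_failed / n_tasks      # ~0.011 observed failure rate

mean_asep_hep = 0.022                   # hypothetical mean predicted HEP

# Ratio of predicted to observed rate: a "factor-of-two bias" would show up
# as a value near 2.
bias_factor = mean_asep_hep / observed_rate

# If ASEP were unbiased, failures would follow Binomial(n, mean_asep_hep);
# a tiny P(X <= 45) indicates the predictions are conservatively high.
p_low = binom_cdf(n_failed, n_tasks, mean_asep_hep)
print(f"observed rate = {observed_rate:.4f}")
print(f"bias factor   = {bias_factor:.1f}x")
print(f"P(X <= {n_failed} | HEP = {mean_asep_hep}) = {p_low:.3g}")
```

With these illustrative inputs the predicted mean sits about a factor of two above the observed rate, and the binomial tail probability quantifies how unlikely the observed failure count would be if the predictions were unbiased.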
Bhandari, Vaibhav; Naushad, Hafiz S; Gupta, Radhey S
The analyses of genome sequences have led to the proposal that lateral gene transfers (LGTs) among prokaryotes are so widespread that they disguise the interrelationships among these organisms. This has led to questioning of whether the Darwinian model of evolution is applicable to prokaryotic organisms. In this review, we discuss the usefulness of taxon-specific molecular markers such as conserved signature indels (CSIs) and conserved signature proteins (CSPs) for understanding the evolutionary relationships among prokaryotes and to assess the influence of LGTs on prokaryotic evolution. The analyses of genomic sequences have identified large numbers of CSIs and CSPs that are unique properties of different groups of prokaryotes ranging from phylum to genus levels. The species distribution patterns of these molecular signatures strongly support a tree-like vertical inheritance of the genes containing these molecular signatures that is consistent with phylogenetic trees. Recent detailed studies in this regard on the Thermotogae and Archaea, which are reviewed here, have identified large numbers of CSIs and CSPs that are specific for the species from these two taxa and a number of their major clades. The genetic changes responsible for these CSIs (and CSPs) initially likely occurred in the common ancestors of these taxa and then vertically transferred to various descendants. Although some CSIs and CSPs in unrelated groups of prokaryotes were identified, their small numbers and random occurrence has no apparent influence on the consistent tree-like branching pattern emerging from other markers. These results provide evidence that although LGT is an important evolutionary force, it does not mask the tree-like branching pattern of prokaryotes or understanding of their evolutionary relationships. The identified CSIs and CSPs also provide novel and highly specific means for identification of different groups of microbes and for taxonomical and biochemical studies.
Düking, Peter; Fuss, Franz Konstantin; Holmberg, Hans-Christer; Sperlich, Billy
Although it is becoming increasingly popular to monitor parameters related to training, recovery, and health with wearable sensor technology (wearables), scientific evaluation of the reliability, sensitivity, and validity of such data is limited and, where available, has involved a wide variety of approaches. To improve the trustworthiness of data collected by wearables and facilitate comparisons, we have outlined recommendations for standardized evaluation. We discuss the wearable devices themselves, as well as experimental and statistical considerations. Adherence to these recommendations should be beneficial not only for the individual, but also for regulatory organizations and insurance companies. ©Peter Düking, Franz Konstantin Fuss, Hans-Christer Holmberg, Billy Sperlich. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 30.04.2018.
Amin, Faisal Mohammad; West, Anders Sode; Jørgensen, Carina Sleiborg; Simonsen, Sofie Amalie; Lindberg, Ulrich; Tranum-Jensen, Jørgen; Hougaard, Anders
Several studies have indicated that the population in general perceives doctors as reliable. In the present study, perceptions of reliability and kindness attributed to another socially significant archetype, Santa Claus, were examined in comparison with the doctor. In all, 52 randomly chosen participants were shown a film in which a narrator dressed either as Santa Claus or as a doctor tells an identical story. Structured interviews were then used to assess the subjects' perceptions of reliability and kindness in relation to the narrator's appearance. We found a strong trend toward Santa Claus being perceived as friendlier than the doctor (p = 0.053). However, there was no significant difference in perceived reliability between Santa Claus and the doctor (p = 0.524). The positive associations attributed to Santa Claus probably explain why he is perceived as friendlier than the doctor, who may be associated with more serious and unpleasant memories of illness and suffering. Surprisingly, and despite his being an imaginary person, Santa Claus was assessed as being as reliable as the doctor.
Bahuguna, Rajeev Nayan; Joshi, Rohit; Shukla, Alok; Pandey, Mayank; Kumar, J
A novel pathogen defense strategy by thiamine priming was evaluated for its efficacy against the sheath blight pathogen of rice, Rhizoctonia solani AG-1A, and compared with that of the systemic fungicide carbendazim (BCM). Seeds of the semidwarf, high-yielding basmati rice variety Vasumati were treated with thiamine (50 mM) and BCM (4 mM). The pot-cultured plants were challenge-inoculated with R. solani 40 days after sowing, and the effect of thiamine and BCM on rice growth and yield traits was examined. Higher hydrogen peroxide content, total phenolics accumulation, phenylalanine ammonia lyase (PAL) activity and superoxide dismutase (SOD) activity under thiamine treatment indicated an elevated level of systemic resistance, which was further augmented under challenging pathogen infection. High transcript levels of PAL and manganese superoxide dismutase (MnSOD) validated the mode of thiamine-primed defense. Though minimum disease severity was observed under BCM treatment, thiamine produced comparable results, with 18.12 per cent lower efficacy. Along with fortifying defense components, and with only minor influence on photosynthetic pigments and nitrate reductase (NR) activity, thiamine treatment significantly reduced the pathogen-induced loss in photosynthesis, stomatal conductance, chlorophyll fluorescence, NR activity and NR transcript level. Physiological traits affected under pathogen infection were found to be indicative of the plant's response to disease and were detectable at an early stage of infection. These findings provide a novel paradigm for developing alternative, environmentally safe strategies to control plant diseases. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Kersken, M.; Saglietti, F.
GRS work in software safety was mainly concerned with the qualitative assessment of software reliability and quality. As a supplement to these activities the work within the REQUEST project emphasized the quantitative determination of the respective parameters. The three-level quality model COQUAMO serves for the computation - and partly for the prediction - of quality factors during the software life cycle. PERFIDE controls the application of software reliability models during the test phase and in early operational life. Specific attention was paid to the assessment of fault-tolerant diverse software systems. (orig.) [de
Brian F. Geiger
The Alabama Lifespan Respite Resource Network™ enhances respite services for family caregivers. University evaluators conducted a statewide assessment of respite providers using multiple formats. The purpose was to determine met and unmet needs for respite training among providers serving family caregivers of individuals with disabilities and chronic illnesses. A total of 317 respite providers attempted and 191 completed survey items, revealing respite experience, disabilities and chronic illnesses, areas of difficulty, prior training and confidence, training needs and preferences. Results will be used by a state Network to match content and delivery of training to providers’ needs. Respite providers have important roles to play, sharing information about respite services and providers, advocating for caregiver eligibility to receive services, and participating in training paid and volunteer providers.
Peyrol, Mark; Rubin, Richard R.; Lauritzen, Torsten
…the relationships between outcomes and both country and respondent characteristics, and the interaction between these two factors. Results: Providers rated chronic-care systems and remuneration for chronic care as mediocre. Patients reported that ease of access to care was high, but not without financial barriers. Patients reported moderate levels of collaboration among providers, and providers indicated that several specialist disciplines were not readily available to them. Patients reported high levels of collaboration with providers in their own care. Provider endorsement of primary prevention strategies for type 2 diabetes was high. Patients with fewer socio-economic resources and more diabetes complications had lower access (and/or higher barriers) to care and lower quality of patient–provider collaboration. Countries differed significantly for all outcomes, and the relationships between respondent…
Cadwallader, J; Asirwa, C; Li, X; Kesterson, J; Tierney, W M; Were, M C
Small numbers of tests with pending results are documented in hospital discharge summaries, leading to breakdowns in communication and to medical errors due to inadequate follow-up. We evaluated the effect of using a computerized provider order entry (CPOE) system to enforce documentation of tests with pending results in hospital discharge summaries. We assessed the percentage of all tests with pending results, and of those with actionable results, that were documented before (n = 182 discharges) and after (n = 203 discharges) implementing the CPOE enforcement tool. We also surveyed providers (n = 52) about the enforcement functionality. Documentation of all tests with pending results improved from 12% (87/701 tests) before to 22% (178/812 tests) after implementation (p = 0.02). Documentation of tests with eventual actionable results increased from 0% (0/24) to 50% (14/28). Enforcing documentation of tests with pending results in discharge summaries thus significantly increased documentation rates, especially for actionable tests. However, gaps in documentation still exist.
McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna
Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.
Buche, D. L.
This report describes Northern Indiana Public Service Co.'s efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series of reports detailing this effort.
Vitkovskij, I.V.; Kirillov, I.R.; Chajka, P.Yu.; Kryuchkov, E.A.; Poplavskij, V.M.; Nosov, Yu.V.; Oshkanov, N.N.
Main factors, determining the service life of induction electromagnetic pumps (IEP), are analyzed. It is shown that the IEP serviceability depends mainly on the winding reliability. The main damaging factors, acting on the windings, are noted. The expressions for calculation of the failure intensity for the coil and case insulations are obtained [ru
He, Qingping; Opposs, Dennis
National tests, public examinations, and vocational qualifications in England are used for a variety of purposes, including the certification of individual learners in different subject areas and the accountability of individual professionals and institutions. However, there has been ongoing debate about the reliability and validity of their…
Little, S. [Suncor Energy, Calgary, AB (Canada)
Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.
Rackwitz, R.; Schall, G.
The method of outcrossings has been shown to be efficient when calculating the failure probability of metallic structural components under ergodic Gaussian loading. Using Paris/Erdogan's crack growth law, it is possible to develop a semi-analytical calculation model for both the reliability and the risk function. For numerical studies, an approximate method of asymptotic nature is proposed. The same methodology also enables the incorporation of inspection observations. (orig.) [de
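The crack growth law underlying the model above, da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa), can be integrated numerically to estimate fatigue life. A minimal sketch follows; all material constants, the geometry factor, and the stress range are illustrative assumptions, not values from the paper.

```python
import math

# Paris/Erdogan law: da/dN = C * (dK)^m, with dK = Y * dsigma * sqrt(pi * a).
# All numbers below are invented, steel-like illustrative values.
C, m = 1e-11, 3.0        # Paris constants (mm/cycle, MPa*sqrt(mm) units)
Y = 1.12                 # geometry factor for an edge crack (assumed constant)
dsigma = 100.0           # stress range per cycle, MPa

def cycles_to_grow(a0, af, steps=100_000):
    """Integrate dN = da / (C * dK^m) from initial to final crack length (mm)."""
    da = (af - a0) / steps
    n, a = 0.0, a0
    for _ in range(steps):
        dK = Y * dsigma * math.sqrt(math.pi * a)
        n += da / (C * dK**m)
        a += da
    return n

life = cycles_to_grow(a0=1.0, af=10.0)   # crack growth from 1 mm to 10 mm
print(f"estimated life: {life:,.0f} cycles")
```

For m ≠ 2 the integral also has a closed form, which makes a useful cross-check on the numerical result; the reliability model in the paper builds its risk function on top of exactly this kind of cycle count.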
Wynn, Julia; Lewis, Katie; Amendola, Laura M; Bernhardt, Barbara A; Biswas, Sawona; Joshi, Manasi; McMullen, Carmit; Scollon, Sarah
Current medical practice includes the application of genomic sequencing (GS) in clinical and research settings. Despite expanded use of this technology, the process of disclosure of genomic results to patients and research participants has not been thoroughly examined and there are no established best practices. We conducted semi-structured interviews with 21 genetic and non-genetic clinicians returning results of GS as part of the NIH funded Clinical Sequencing Exploratory Research (CSER) Consortium projects. Interviews focused on the logistics of sessions, participant/patient reactions and factors influencing them, how the sessions changed with experience, and resources and training recommended to return genomic results. The length of preparation and disclosure sessions varied depending on the type and number of results and their implications. Internal and external databases, online resources and result review meetings were used to prepare. Respondents reported that participants' reactions were variable and ranged from enthusiasm and relief to confusion and disappointment. Factors influencing reactions were types of results, expectations and health status. A recurrent challenge was managing inflated expectations about GS. Other challenges included returning multiple, unanticipated and/or uncertain results and navigating a rare diagnosis. Methods to address these challenges included traditional genetic counseling techniques and modifying practice over time in order to provide anticipatory guidance and modulate expectations. Respondents made recommendations to improve access to genomic resources and genetic referrals to prepare future providers as the uptake of GS increases in both genetic and non-genetic settings. These findings indicate that returning genomic results is similar to return of results in traditional genetic testing but is magnified by the additional complexity and potential uncertainty of the results. Managing patient expectations, initially
Chien, Tsair-Wei; Chou, Ming-Ting; Wang, Wen-Chung; Tsai, Li-Shu; Lin, Weir-Sen
Few studies discuss the indicators used to assess the effect on cost containment in healthcare across hospitals in a single-payer national healthcare system with constrained medical resources. We present the intraclass correlation coefficient (ICC) to assess how well Taiwan constrained hospital-provided medical services in such a system. A custom Excel-VBA routine to record the distances of standard deviations (SDs) from the central line (the mean over the previous 12 months) of a control chart was used to construct and scale annual medical expenditures sequentially from 2000 to 2009 for 421 hospitals in Taiwan to generate the ICC. The ICC was then used to evaluate Taiwan's year-based convergent power to remain unchanged in hospital-provided constrained medical services. A bubble chart of SDs for a specific month was generated to present the effects of using control charts in a national healthcare system. ICCs were generated for Taiwan's year-based convergent power to constrain its medical services from 2000 to 2009. All hospital groups showed a gradually well-controlled supply of services that decreased from 0.772 to 0.415. The bubble chart identified outlier hospitals that required investigation of possible excessive reimbursements in a specific time period. We recommend using the ICC to annually assess a nation's year-based convergent power to constrain medical services across hospitals. Using sequential control charts to regularly monitor hospital reimbursements is required to achieve financial control in a single-payer nationwide healthcare system.
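The ICC used above can be illustrated with a generic one-way random-effects ICC(1,1) on a hospitals-by-years table. The paper's Excel-VBA routine and its exact ICC formulation are not specified here, so the formula choice and the synthetic data below are assumptions for illustration only.

```python
def icc_oneway(rows):
    """One-way random-effects ICC(1,1) for a groups-by-measurements table:
    (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n = len(rows)            # groups (hospitals)
    k = len(rows[0])         # repeated measurements (years)
    grand = sum(sum(r) for r in rows) / (n * k)
    row_means = [sum(r) / k for r in rows]
    # Between-group and within-group mean squares.
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2 for r, m in zip(rows, row_means) for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Synthetic example: 4 hospitals, 3 yearly scaled-expenditure values each.
data = [
    [1.0, 1.1, 0.9],
    [2.0, 2.2, 1.9],
    [3.1, 3.0, 2.9],
    [4.0, 4.1, 4.2],
]
print(f"ICC(1,1) = {icc_oneway(data):.3f}")
```

Values near 1 indicate that hospitals differ consistently from one another while each hospital's yearly values stay stable, which is how the study reads "convergent power" from the ICC.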
Walker, Nancy; Michaud, Kaleb; Wolfe, Frederick
To describe workplace limitations and the validity and reliability of the Work Limitations Questionnaire (WLQ) in persons with rheumatoid arthritis (RA). A total of 836 employed persons with RA reported clinical and work related measures and completed the WLQ, a 25 item questionnaire that assesses the impact of chronic health conditions on job performance and productivity. Limitations are categorized into 4 domains: physical demands (PDS), mental demands (MDS), time management demands (TMS), and output demands (ODS), which are then used to calculate the WLQ index. Of the 836 completed WLQ, about 10% (85) could not be scored, as more than half the items in each domain were not applicable to the patient's job. Demographic and clinical variables were associated with missing WLQ scores including older age (OR 1.7, 95% CI 1.3-2.1), male sex (OR 1.9, 95% CI 1.2-3.0), and Health Assessment Questionnaire (HAQ) scores (OR 1.4, 95% CI 1.0-2.0). Work limitations were present in all work domains: PDS (27.5%), MDS (15.7%), ODS (19.4%), and TMS (28.6%), resulting in a mean WLQ index of 5.9 (SD 5.6), which corresponds to a 4.9% decrease in productivity and a 5.1% increase in work hours to compensate for productivity loss. The WLQ index was inversely associated with Medical Outcomes Study Short Form 36 (SF-36) Mental Component Score (MCS; r = -0.60) and Physical Component Score (PCS; r = -0.49). Fatigue (0.5), pain (0.46), and HAQ (0.56) were also significantly associated with the WLQ index. Weaker associations were seen with days unable to perform (0.29), days activities cut down (0.38), and annual income (-0.10). The WLQ is a reliable tool for assessing work productivity. However, persons with RA tend to select jobs that they can do with their RA limitations, with the result that the WLQ does not detect functional limitations as well as the HAQ and SF-36. The WLQ provides special information that is not available using conventional measures of assessment, and can provide helpful
Ishida, Shoji; Nagase, Shinichiro; Ino, Masanori
According to the peculiar situation around nuclear power plants in Japan, many evacuation-simulation results, in which public buses and family cars were used together, have been obtained on a supercomputer. These comprise the time dependence of the number of residents and vehicles at the exit and starting points, and traffic-jam data at each intersection on the evacuation roads. The exposure dose for each group of residents was also calculated for the case of Xe and I-131 release. A retrieval system was applied to select the indispensable data from the many results, and a graphic system was provided so that the data could be viewed on a display screen. (author)
Del Valle, José C; Gallardo-López, Antonio; Buide, Mª Luisa; Whittall, Justen B; Narbona, Eduardo
Anthocyanin pigments have become a model trait for evolutionary ecology as they often provide adaptive benefits for plants. Anthocyanins have been traditionally quantified biochemically or more recently using spectral reflectance. However, both methods require destructive sampling and can be labor intensive and challenging with small samples. Recent advances in digital photography and image processing make it the method of choice for measuring color in the wild. Here, we use digital images as a quick, noninvasive method to estimate relative anthocyanin concentrations in species exhibiting color variation. Using a consumer-level digital camera and a free image processing toolbox, we extracted RGB values from digital images to generate color indices. We tested petals, stems, pedicels, and calyces of six species, which contain different types of anthocyanin pigments and exhibit different pigmentation patterns. Color indices were assessed by their correlation to biochemically determined anthocyanin concentrations. For comparison, we also calculated color indices from spectral reflectance and tested the correlation with anthocyanin concentration. Indices perform differently depending on the nature of the color variation. For both digital images and spectral reflectance, the most accurate estimates of anthocyanin concentration emerge from anthocyanin content-chroma ratio, anthocyanin content-chroma basic, and strength of green indices. Color indices derived from both digital images and spectral reflectance strongly correlate with biochemically determined anthocyanin concentration; however, the estimates from digital images performed better than spectral reflectance in terms of r 2 and normalized root-mean-square error. This was particularly noticeable in a species with striped petals, but in the case of striped calyces, both methods showed a comparable relationship with anthocyanin concentration. Using digital images brings new opportunities to accurately quantify the
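The RGB-based color indices described above can be sketched with two simple forms: green's share of total reflectance (anthocyanins absorb green light, so lower values suggest more pigment) and a red-to-green ratio. These are common illustrative formulas, not necessarily the exact indices defined in the paper, and the RGB values are invented.

```python
def strength_of_green(r, g, b):
    """Green's share of total reflectance; lower values suggest
    higher anthocyanin content (anthocyanins absorb green light)."""
    total = r + g + b
    return g / total if total else 0.0

def red_green_ratio(r, g, b):
    """A simple red-to-green ratio index, one common stand-in for an
    anthocyanin content index (the paper's definition may differ)."""
    return r / g if g else float("inf")

# Mean RGB values sampled from two hypothetical petal images (0-255 scale).
white_petal = (230, 225, 228)
pink_petal = (210, 120, 150)

for name, rgb in [("white", white_petal), ("pink", pink_petal)]:
    print(name, round(strength_of_green(*rgb), 3), round(red_green_ratio(*rgb), 3))
```

In practice such indices are computed per pixel or per region of interest and then regressed against biochemically determined anthocyanin concentrations, as the study does.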
Smith, Mark A.; Atcitty, Stanley
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
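The fault-tree approach described above amounts to combining component reliabilities through series and redundant (parallel) structures. A minimal sketch with invented component values, echoing the report's fictitious-device example:

```python
def series(*rs):
    """All components must work (failures combine through an OR gate
    in the fault tree)."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """System works if any redundant component works (failures combine
    through an AND gate in the fault tree)."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

# Hypothetical power-electronics unit: a controller in series with a
# redundant pair of inverter legs and a cooling fan (values invented).
r_controller, r_inverter_leg, r_fan = 0.999, 0.98, 0.95
r_system = series(r_controller, parallel(r_inverter_leg, r_inverter_leg), r_fan)
print(f"system reliability = {r_system:.4f}")
```

A baseline model of this kind makes the optimization step concrete: recomputing r_system under candidate component upgrades shows which change buys the most reliability per dollar.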
Watkins, John C.; Steele, Robert Jr.; DeWall, Kevin G.; Weidenhamer, G.H.; Rothberg, O.O.
This paper presents the results of testing sponsored by the NRC to assess valve and motor operator performance under varying pressure and fluid conditions. This effort included an examination of the methods used by the industry to predict the required stem force of a valve, and research to provide guidelines for the extrapolation of in situ test results to design basis conditions. Years ago, when most of these valves were originally installed, the industry used a set of equations to determine analytically that the valves' motor operators were large enough and the control switches were set high enough to close the valves at their design basis conditions. Our research has identified several inconsistencies in the industry's existing gate valve stem force equation and has challenged the overly simplistic assumptions inherent in its use. This paper discusses the development of the INEL correlation, which serves as the basis for a method to bound the stem force necessary to close flexwedge gate valves whose operational characteristics have been shown to be predictable. As utilities undertake to provide assurance of their valves' operability, this ability to predict analytically the required stem force is especially important for valves that cannot be tested at design basis conditions. For such valves, the results of tests conducted at less severe conditions can be used with the INEL correlation to make the necessary prediction. (orig.)
Background: This study investigates the role of an incremental change in organizational process in creating radical performance results in a service provider company. The role of Kaizen is well established in manufacturing but nascent in service applications. This study examines the impact of introducing Kaizen as an ODI tool: how it is applied, how it works, and whether participants believe it helps service groups form more effective working relationships that result in significant performance improvements. Methods: Exploring the evolving role of Kaizen in service contexts, this study examines a variety of facets of human communication in the context of continuous improvement and inter-organizational teamwork. The paper consists of an archival study and an action research case study. A pre-intervention study consisting of observations, interviews, and questionnaires submitted to employees of a manufacturing and air-sea freight firm was conducted. A Kaizen intervention followed, and a post-intervention study was then conducted. Results: Radical improvements in both companies, such as 30% financial growth and 81% productivity improvement, are demonstrated in this paper. Conclusions: The findings offer unique insights into the effects of Kaizen in creating radical performance improvements in a service company and its customer. Both qualitative and quantitative results for business, satisfaction, and productivity suggest that time invested in introducing Kaizen into a service organization helps the companies improve relationships and improve the bottom line dramatically.
Oravakangas, Rami; Leppilahti, Juhana; Laine, Vesa; Niinimäki, Tuukka
Hallux valgus is one of the most common foot deformities. Proximal opening wedge osteotomy is used for the treatment of moderate and severe hallux valgus with metatarsus primus varus. However, hypermobility of the first tarsometatarsal joint can compromise the results of the operation, and a paucity of midterm results are available regarding proximal open wedge osteotomy surgery. The aim of the present study was to assess the midterm results of proximal open wedge osteotomy in a consecutive series of patients with severe hallux valgus. Thirty-one consecutive adult patients (35 feet) with severe hallux valgus underwent proximal open wedge osteotomy. Twenty patients (35.5%) and 23 feet (34.3%) were available for the final follow-up examination. The mean follow-up duration was 5.8 (range 4.6 to 7.0) years. The radiologic measurements and American Orthopaedic Foot and Ankle Society hallux-metatarsophalangeal-interphalangeal scores were recorded pre- and postoperatively, and subjective questionnaires were completed and foot scan analyses performed at the end of the follow-up period. The mean hallux valgus angle decreased from 38° to 23°, and the mean intermetatarsal angle correction decreased from 17° to 10°. The mean improvement in the American Orthopaedic Foot and Ankle Society hallux metatarsophalangeal-interphalangeal score increased from 52 to 84. Two feet (5.7%) required repeat surgery because of recurrent hallux valgus. No nonunions were identified. Proximal open wedge osteotomy provided satisfactory midterm results in the treatment of severe hallux valgus, with a low complication rate. The potential instability of the first tarsometatarsal joint does not seem to jeopardize the midterm results of the operation. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
.... One such system is Provider Perspectives. This study shows that Provider Perspectives significantly decreased Emergency Room utilization and subsequently increased the usage of primary care clinics at Martin Army Community Hospital and Winn...
Mowafi, Hani; Hariri, Mahmoud; Alnahhas, Houssam; Ludwig, Elizabeth; Allodami, Tammam; Mahameed, Bahaa; Koly, Jamal Kaby; Aldbis, Ahmed; Saqqur, Maher; Zhang, Baobao; Al-Kassem, Anas
The Syrian civil war has resulted in large-scale devastation of Syria's health infrastructure along with widespread injuries and death from trauma. The capacity of Syrian trauma hospitals is not well characterized. Data are needed to allocate resources for trauma care to the population remaining in Syria. To identify the number of trauma hospitals operating in Syria and to delineate their capacities. From February 1 to March 31, 2015, a nationwide survey of 94 trauma hospitals was conducted inside Syria, representing a coverage rate of 69% to 93% of reported hospitals in nongovernment controlled areas. Identification and geocoding of trauma and essential surgical services in Syria. Although 86 hospitals (91%) reported capacity to perform emergency surgery, 1 in 6 hospitals (16%) reported having no inpatient ward for patients after surgery. Sixty-three hospitals (70%) could transfuse whole blood but only 7 (7.4%) could separate and bank blood products. Seventy-one hospitals (76%) had any pharmacy services. Only 10 (11%) could provide renal replacement therapy, and only 18 (20%) provided any form of rehabilitative services. Syrian hospitals are isolated, with 24 (26%) relying on smuggling routes to refer patients to other hospitals and 47 hospitals (50%) reporting domestic supply lines that were never open or open less than daily. There were 538 surgeons, 378 physicians, and 1444 nurses identified in this survey, yielding a nurse to physician ratio of 1.8:1. Only 74 hospitals (79%) reported any salary support for staff, and 84 (89%) reported material support. There is an unmet need for biomedical engineering support in Syrian trauma hospitals, with 12 fixed x-ray machines (23%), 11 portable x-ray machines (13%), 13 computed tomographic scanners (22%), 21 adult (21%) and 5 pediatric (19%) ventilators, 14 anesthesia machines (10%), and 116 oxygen cylinders (15%) not functional. No functioning computed tomographic scanners remain in Aleppo, and 95 oxygen cylinders (42
Spaeth, Michael; Bennett, Robert M; Benson, Beverly A; Wang, Y Grace; Lai, Chinglin; Choy, Ernest H
Background Fibromyalgia is characterised by chronic musculoskeletal pain and multiple symptoms including fatigue, multidimensional function impairment, sleep disturbance and tenderness. Along with pain and fatigue, non-restorative sleep is a core symptom of fibromyalgia. Sodium oxybate (SXB) is thought to reduce non-restorative sleep abnormalities. This study evaluated effects of SXB on fibromyalgia-related pain and other symptoms. Methods 573 patients with fibromyalgia according to 1990 American College of Rheumatology criteria were enrolled at 108 centres in eight countries. Subjects were randomly assigned to placebo, SXB 4.5 g/night or SXB 6 g/night. The primary efficacy endpoint was the proportion of subjects with ≥30% reduction in pain visual analogue scale from baseline to treatment end. Other efficacy assessments included function, sleep quality, effect of sleep on function, fatigue, tenderness, health-related quality of life and subject's impression of change in overall wellbeing. Results Significant improvements in pain, sleep and other symptoms associated with fibromyalgia were seen in SXB-treated subjects compared with placebo. The proportion of subjects with ≥30% pain reduction was 42.0% for SXB 4.5 g/night (p=0.002) and 51.4% for SXB 6 g/night (p<0.001). Quality of sleep (Jenkins sleep scale) improved by 20% for SXB 4.5 g/night (p≤0.001) and 25% for SXB 6 g/night (p≤0.001) versus 0.5% for placebo. Adverse events with an incidence ≥5% and twice placebo were nausea, dizziness, vomiting, insomnia, anxiety, somnolence, fatigue, muscle spasms and peripheral oedema. Conclusion These results, combined with findings from previous phase 2 and 3 studies, provide supportive evidence that SXB therapy affords important benefits across multiple symptoms in subjects with fibromyalgia. PMID:22294641
Kim, Kyung Cho; Kim, Jin Gyum; Kang, Sung Sik; Jhung, Myung Jo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
The Korea Institute of Nuclear Safety, as a representative organization of Korea, joined the international Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), initiated by the U.S. Nuclear Regulatory Commission, in February 2012. The goal of PARENT is to investigate the performance of emerging and prospective novel nondestructive techniques in finding flaws in nickel-alloy welds and base materials. In this article, Korean round-robin test results are evaluated with respect to the test blocks and the various nondestructive examination techniques. The test blocks were prepared to simulate large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds in nuclear power plants. Lessons learned from the Korean round-robin test are also summarized and discussed.
Chowdhury, A.A.; Mielnik, T.C. [Electric System Planning, MidAmerican Energy Company, Davenport, Iowa (United States); Lawton, L.E.; Sullivan, M.J.; Katz, A. [Population Research Systems, San Francisco, CA (United States)
This paper presents the overall results of a residential customer survey conducted in service areas of MidAmerican Energy Company, a Midwest utility. A similar survey was conducted concurrently in the industrial, commercial and institutional sectors and the survey results are presented in a companion paper. The results of this study are compared with the results of other studies performed in the high cost areas of the US east and west coasts. This is the first ever study of this nature performed for the residential customers in the US Midwest region. Methodological differences in the study design compared to coastal surveys are discussed. Customer survey costing techniques can be categorized into three main groups: contingent valuation techniques, direct costing techniques and indirect costing techniques. Most customer surveys conducted by different organizations in the last two decades used a combination of all three techniques. The selection of a technique is mainly dependent on the type of customer being surveyed. In this MidAmerican study, contingent valuation techniques and an indirect costing technique have been used, as most consequences of power outages to residential users are related to inconvenience or disruption of housekeeping and leisure activities that are intangible in nature. The major contribution of this paper is that particulars of Midwest residential customers compared to residential customers of coastal utilities are noted and customer responses on power quality issues that are important to customers are summarized. (author)
Naredo, E.; Møller, I.; Moragues, C.
METHODS: The shoulder, wrist/hand, ankle/foot, or knee of 24 patients with rheumatic diseases were evaluated by 23 musculoskeletal ultrasound experts from different European countries randomly assigned to six groups. The participants did not reach consensus on scanning method or diagnostic criteria before the investigation. They were unaware of the patients' clinical and imaging data. The experts from each group undertook a blinded ultrasound examination of the four anatomical regions. The ultrasound investigation included the presence/absence of joint effusion/synovitis, bony cortex abnormalities, tenosynovitis, tendon lesions, bursitis, and power Doppler signal. Afterwards they compared the ultrasound findings and re-examined the patients together while discussing their results. RESULTS: Overall agreements were 91% for joint effusion/synovitis and tendon lesions, 87% for cortical abnormalities, 84…
Rosenberg, Jason; Fabi, Alain; Candido, Kenneth; Knezevic, Nick; Creamer, Michael; Carayannopoulos, Alexios; Ghodsi, Abdi; Nelson, Christopher; Bennett, Matthew
The EMP3OWER™ study evaluated spinal cord stimulation (SCS) safety and efficacy and the associated changes in psychosocial and functional outcomes. Upon informed consent and IRB approval, 620 eligible subjects were enrolled prior to SCS trial evaluation and were assessed at baseline, 3, 6 and 12 months post-implant. Patient-reported pain relief (PRP), numerical rating scale (NRS), satisfaction, quality of life (QOL), and pain disability index (PDI) were assessed at all follow-up visits while the pain catastrophizing scale (PCS), short form-36 (SF-36), short form-McGill pain questionnaire version 2 (SF-MPQ-2), and the state-trait anxiety inventory (STAI) were assessed at the 6- and 12-month follow-up visits. Device and/or procedure-related adverse events were also recorded and reported. Subjects reporting a PRP ≥ 50% were considered responders. Repeated measures analysis of variance (RMANOVA) examined the changes across time for all continuous measures. A total of 401 (71%) subjects received a permanent implant. Mean (±SD) patient-reported pain relief was 59.3% (±26.2), 59.2% (±28.9), and 58.2% (±32.0) at 3, 6, and 12 months, respectively. A majority of enrolled subjects were responders at 3 (75.5%), 6 (74.7%), and 12 months (69.7%). RMANOVA revealed a statistically significant change for NRS, PCS, PDI, SF-36, SF-MPQ-2, and STAI scores. At 3 months, the majority of subjects (85.7%) were either very satisfied or satisfied with their device, with similar results at 6 and 12 months. At 3 months, the majority of subjects (73.3%) reported greatly improved or improved QOL with similar results at 6 and 12 months. Spinal cord stimulation provided pain relief and significant improvement of patient psychological and functional outcome measures. © 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: firstname.lastname@example.org.
Black, James; Gerdtz, Marie; Nicholson, Pat; Crellin, Dianne; Browning, Laura; Simpson, Julie; Bell, Lauren; Santamaria, Nick
applications found. This study provides evidence that applications running on simple phones can be used to count respiratory rates in children. The Once-per-Breath methods are the most reliable, outperforming the 60-second count. For children with raised respiratory rates the 20-breath version of the Once-per-Breath method is faster, so it is a more suitable option where health workers are under time pressure. Copyright © 2015 Elsevier Ltd. All rights reserved.
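The Once-per-Breath counting described above is not spelled out in the abstract; the usual idea is that the health worker taps the phone once per breath and the rate is derived from the span of the tap timestamps. A minimal sketch of that arithmetic (the function name and timestamp handling are assumptions, not the apps' actual code):

```python
def rate_from_taps(tap_times_s, breaths=20):
    """Breaths per minute from once-per-breath tap timestamps (in seconds).

    The first `breaths` taps span (breaths - 1) full breath intervals,
    so the rate is 60 * (breaths - 1) / elapsed_seconds.
    """
    taps = tap_times_s[:breaths]
    elapsed = taps[-1] - taps[0]
    return 60.0 * (len(taps) - 1) / elapsed

# one tap every 1.5 s corresponds to 40 breaths/min
taps = [i * 1.5 for i in range(20)]
rate = rate_from_taps(taps)  # → 40.0
```

This also shows why the 20-breath variant is faster for raised rates: at 40 breaths/min, 20 taps take under 30 s rather than a full minute.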
Teutsch, Friedrich; Gugglberger, Lisa; Dür, Wolfgang
Implementation is critical to the success of health promotion (HP) in schools, but little is known about how schools can best be assisted during this process. This article focuses on Austrian HP providers and aspects their roles incorporate. To investigate the providers' role in the practice of HP implementation and how it differs from its official description. On the basis of these findings, implications are suggested. The data were gathered within the framework of an explorative case study of complex HP interventions. We draw on four interviews with HP organisation staff, five documents from the providers' organisations and seven interviews with school staff from three schools. In practice, providers took up different responsibilities, e.g., acting as emotional support to school staff and supporting the documentation of projects, guided more by the schools' needs than by the programmes they are helping to implement. Providers focused mostly on the implementation of single activities and did little to emphasize the necessity of organisational change. Our findings suggest that providers' background in health should be complemented by a deeper understanding of the importance of organisational change to further support HP implementation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
ASTM is an international society for managing the development of standards on the characteristics and performance of materials, products, systems and services and for the promotion of related knowledge. This paper provides an overview of ASTM, emphasizing its contribution to nuclear systems reliability. In so doing, the author, from his perspective as chairman of ASTM Committee E10 on "Nuclear Applications and the Measurement of Radiation Effects" and of the Committee on Standards, illustrates ASTM contributions to the understanding and control of radiation embrittlement of light-water reactor pressure vessels. Four major related tasks are summarized and pertinent standards identified. These include: (1) surveillance practice (5 standards), (2) neutron dosimetry (8 standards), (3) specifications for steels for nuclear service (7 standards) and (4) basic guidelines for thermal annealing to correct radiation embrittlement (1 standard). This illustration, a specific accomplishment using ASTM standards, is cited within the context of the broader nuclear-related activities of ASTM. (author)
Veeck, Ann; O'Reilly, Kelley; MacMillan, Amy; Yu, Hongyan
Midterm student evaluations have been shown to be beneficial for providing formative feedback for course improvement. With the purpose of improving instruction in marketing courses, this research introduces and evaluates a novel form of midterm student evaluation of teaching: the online collaborative evaluation. Working in small teams, students…
Ross, Lone; Petersen, Morten Aagaard; Johnsen, Anna Thit; Lundstrøm, Louise Hyldborg; Groenvold, Mogens
To validate five items (CPWQ-inf) regarding satisfaction with information provided to cancer patients from health care staff, assess the prevalence of dissatisfaction with this information, and identify factors predicting dissatisfaction. The questionnaire was validated by patient-observer agreement and cognitive interviews. The prevalence of dissatisfaction was assessed in a cross-sectional sample of all cancer patients in contact with hospitals during the past year in three Danish counties. The validation showed that the CPWQ performed well. Between 3 and 23% of the 1490 participating patients were dissatisfied with each of the measured aspects of information. The highest level of dissatisfaction was reported regarding the guidance, support and help provided when the diagnosis was given. Younger patients were consistently more dissatisfied than older patients. The brief CPWQ performs well for survey purposes. The survey depicts the heterogeneous patient population encountered by hospital staff and showed that younger patients probably had higher expectations or a higher need for information and that those with more severe diagnoses/prognoses require extra care in providing information. Four brief questions can efficiently assess information needs. With increasing demands for information, a wide range of innovative initiatives is needed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
CERN Press Office. Geneva
At a seminar at CERN on 10 May the NA48 collaboration announced its final result on one of nature's best-kept secrets : direct Charge Parity (CP)-violation. This subtle effect explains nature's preference for matter over antimatter.
Thein, Si Thu; Sudhinaraset, May; Khin, Hnin Su Su; McFarland, Willi; Aung, Tin
Artemisinin-based combination therapy (ACT) is a key strategy for global malaria elimination efforts. However, the development of artemisinin-resistant malaria parasites threatens progress, and continued usage of oral artemisinin monotherapies (AMT) predisposes to the selection of drug-resistant strains. This is particularly a problem along the Myanmar/Thailand border. The artemisinin monotherapy replacement programme (AMTR) was established in 2012 to remove oral AMT from stocks in Myanmar, specifically by replacing oral AMT with quality-assured ACT and conducting behavioural change communication activities directed at the outlets dispensing anti-malarial medications. This study attempts to quantify the characteristics of outlet providers who continue to stock oral AMT despite these concerted efforts. A cross-sectional survey of all types of private sector outlets that were stocking anti-malarial drugs in 13 townships of Eastern Myanmar was implemented from July to August 2014. A total of 573 outlets were included. Bivariate and multivariable logistic regressions were conducted to assess outlet and provider-level characteristics associated with stocking oral AMT. In total, 2939 outlets in Eastern Myanmar were screened for presence of any anti-malarial drugs in August 2014. The study found that 573 (19.5 %) had some kind of oral anti-malarial drug in stock at the time of survey and among them, 96 (16.8 %) stocked oral AMT. In bivariate analyses, compared to health care facilities, itinerant drug vendors, retailers and health workers were less likely to stock oral AMT (33.3 vs 12.9, 10.0, 8.1 %, OR = 0.30, 0.22, 0.18, respectively). Providers who cut blister packs or sold partial courses (40.6 vs 11.7 %, OR 5.18, CI 3.18-8.44) and those who based their stock decision on consumer demand (32.8 vs 12.1 %, OR 3.54, CI 2.21-5.63) were more likely to stock oral AMT. Multivariate logistic regressions produced similar significant associations. Private healthcare facilities and drug
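The odds ratios above come from logistic regression, but the unadjusted ones can be roughly reproduced from the reported proportions, e.g. 40.6% vs 11.7% of providers stocking oral AMT. A sketch of that arithmetic (illustrative only; the published ORs are model-based):

```python
def odds_ratio(p_exposed, p_unexposed):
    """Unadjusted odds ratio computed from two group proportions."""
    odds_exposed = p_exposed / (1.0 - p_exposed)
    odds_unexposed = p_unexposed / (1.0 - p_unexposed)
    return odds_exposed / odds_unexposed

# providers who cut blister packs: 40.6% stocked oral AMT vs 11.7% of others
or_blister = odds_ratio(0.406, 0.117)   # ≈ 5.16, close to the reported OR of 5.18
```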
CERN Press Office. Geneva
At a seminar at CERN on 18 June Pascal Debu, spokesman of the Laboratory's NA48 experiment, announced its preliminary result, after analysis of 10% of the expected data, on one of nature's best-kept secrets. Direct CP-violation, as it is called, is a subtle effect that betrays nature's preference for matter over antimatter, the reason why we are here.
Adsul, Prajakta; Wray, Ricardo; Gautam, Kanak; Jupka, Keri; Weaver, Nancy; Wilson, Kristin
Background Integrating health literacy into primary care institutional policy and practice is critical to effective, patient-centered health care. While attributes of health literate organizations have been proposed, approaches for strengthening them in healthcare systems with limited resources have not been fully detailed. Methods We conducted key informant interviews with individuals from 11 low-resourced health care organizations serving uninsured, underinsured, and government-insured patients across Missouri. The qualitative inquiry explored concepts of impetus to transform, leadership commitment, engaging staff, alignment to organization-wide goals, and integration of health literacy with current practices. Findings Several health care organizations reported carrying out health literacy-related activities, including implementing patient portals, selecting easy-to-read patient materials, offering community education and outreach programs, and improving discharge and medication distribution processes. The need for change presented itself through data or anecdotal staff experience. For any change to be undertaken, administrators and medical directors had to be supportive; most often a champion facilitated these changes in the organization. Staff and providers were often resistant to change and worried they would be saddled with additional work. Lack of time and funding were the most common barriers reported for integration and sustainability. To overcome these barriers, managers supported changes by working one-on-one with staff, seeking external funding, utilizing existing resources, planning for stepwise implementation, including members from all staff levels, and communicating clearly. Conclusion Even though barriers exist, resource-scarce clinical settings can successfully plan, implement, and sustain organizational changes to support health literacy.
In this study, airborne laser scanning-based and traditional field-based survey methods for tree height estimation are assessed using one hundred felled trees as a reference dataset. Comparisons between remote sensing and field-based methods were applied to four circular permanent plots located in the western Italian Alps and established within the Alpine Space project NewFor. Remote sensing (airborne laser scanning, ALS), traditional field-based (indirect measurement, IND), and direct measurement of felled trees (DIR) methods were compared using summary statistics, linear regression models, and variation partitioning. Our results show that tree height estimates by airborne laser scanning (ALS) approximated the real heights (DIR) of felled trees. Considering the species separately, Larix decidua showed the smallest mean absolute difference (0.95 m) between remote sensing (ALS) and direct field (DIR) data, followed by Pinus sylvestris (1.04 m) and Picea abies (1.13 m). Our results cannot be generalized to ALS surveys with low pulse density (<5 pulses/m²) or with view angles far from zero (nadir). We observed that tree height estimation by laser scanning is closer to actual tree heights (DIR) than the traditional field-based survey, particularly for tall trees with conical crowns.
Gallais-During, A., E-mail: email@example.com [CEA, DEN, DEC, F-13108 Saint-Paul-lez-Durance (France); Bonnin, J.; Malgouyres, P.-P. [CEA, DEN, DEC, F-13108 Saint-Paul-lez-Durance (France); Morin, S. [IRSN, F-13108 Saint-Paul-lez-Durance (France); Bernard, S.; Gleizes, B.; Pontillon, Y.; Hanus, E.; Ducros, G. [CEA, DEN, DEC, F-13108 Saint-Paul-lez-Durance (France)
Highlights: • A new facility to perform experimental LWR severe accidents sequences is evaluated. • In the furnace a fuel sample is heated up to 2600 °C under a controlled gas atmosphere. • Innovative thermal gradient tubes are used to study fission product transport. • The new VERDON facility shows an excellent consistency with results from VERCORS. • Fission product re-vapourization results confirm the correct functioning of the gradient tubes. - Abstract: One of the most important areas of research concerning a hypothetical severe accident in a light water reactor (LWR) is determining the source term, i.e. quantifying the nature, release kinetics and global released fraction of the fission products (FPs) and other radioactive materials. In line with the former VERCORS programme to improve source term estimates, the new VERDON laboratory has recently been implemented at the CEA Cadarache Centre in the LECA-STAR facility. The present paper deals with the evaluation of the experimental equipment of this new VERDON laboratory (furnace, release and transport loops) and demonstrates its capability to perform experimental sequences representative of LWR severe accidents and to supply the databases necessary for source term assessments and FP behaviour modelling.
Rizzoli-Córdoba, Antonio; Ortega-Ríosvelasco, Fernando; Villasís-Keever, Miguel Ángel; Pizarro-Castellanos, Mariel; Buenrostro-Márquez, Guillermo; Aceves-Villagrán, Daniel; O'Shea-Cuevas, Gabriel; Muñoz-Hernández, Onofre
The Child Development Evaluation (CDE) is a screening tool designed and validated in Mexico for detecting developmental problems. The result is expressed through a semaphore. In the CDE test, both yellow and red results are considered positive, although a different intervention is proposed for each. The aim of this work was to evaluate the reliability of the CDE test to discriminate between children with a yellow versus a red result, based on the developmental domain quotient (DDQ) obtained through the Battelle Developmental Inventory, 2nd edition, in Spanish (BDI-2). The data for this study were obtained from the validation study. Children with a normal (green) result in the CDE were excluded. Two different cut-off points of the DDQ (BDI-2) were used: social, 20.1% vs. 28.9%; and adaptive, 6.9% vs. 20.4%. The yellow/red semaphore result allows identifying different magnitudes of delay in developmental domains or subdomains, supporting the recommendation of a different intervention for each one. Copyright © 2014 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.
van Iterson, Loretta; Augustijn, Paul B.; de Jong, Peter F.; van der Leij, Aryan
The goal of this study was to investigate reliable cognitive change in epilepsy by developing computational procedures to determine reliable change index scores (RCIs) for the Dutch Wechsler Intelligence Scales for Children. First, RCIs were calculated based on stability coefficients from a reference sample. Then, these RCIs were applied to a…
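The abstract does not give the RCI formula; a common formulation (Jacobson-Truax) derives it from the reference sample's standard deviation and the stability (test-retest) coefficient. A minimal sketch under that assumption:

```python
import math

def reliable_change_index(score1, score2, sd, stability_r):
    """Jacobson-Truax style RCI for a test-retest score change.

    sd: standard deviation of the measure in the reference sample
    stability_r: test-retest (stability) coefficient
    """
    sem = sd * math.sqrt(1.0 - stability_r)   # standard error of measurement
    se_diff = math.sqrt(2.0) * sem            # standard error of the difference
    return (score2 - score1) / se_diff

# IQ-style scale: sd = 15, stability 0.90; |RCI| > 1.96 flags reliable change
rci = reliable_change_index(100, 110, sd=15, stability_r=0.90)
```

Here a 10-point gain yields an RCI of about 1.49, below the conventional 1.96 cut-off, so it would not count as reliable change under these assumed parameters.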
Russell, K.D.; Skinner, N.L.
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The primary function of MAR-D, the Models and Results Database, is to create a data repository for completed PRAs and Individual Plant Examinations (IPEs) by providing input, conversion, and output capabilities for data used by the IRRAS, SARA, SETS, and FRANTIC software. As probabilistic risk assessments and individual plant examinations are submitted to the NRC for review, MAR-D can be used to convert the models and results from each study for use with IRRAS and SARA. These data can then be easily accessed by future studies in a form that enhances the analysis process. This reference manual provides an overview of the functions available within MAR-D and step-by-step operating instructions.
Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong
This book begins with the question of what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability; the reliability function and failure rate; probability distributions used in reliability; estimation of MTBF; down time, maintainability and availability; breakdown maintenance and preventive maintenance; design for reliability; reliability prediction and statistics; reliability testing; reliability data; and the design and management of reliability.
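The reliability function, failure rate, and MTBF mentioned above have standard textbook definitions (this example is illustrative, not taken from the book): for a constant failure rate λ, R(t) = exp(-λt) and MTBF = 1/λ.

```python
import math

def reliability(t, failure_rate):
    """R(t) = exp(-lambda * t): survival probability under a constant failure rate."""
    return math.exp(-failure_rate * t)

def mtbf(failure_rate):
    """Mean time between failures for the exponential model: 1 / lambda."""
    return 1.0 / failure_rate

lam = 1e-4                       # illustrative: 1 failure per 10,000 hours
r_1000 = reliability(1000, lam)  # ≈ 0.905 survival probability over 1000 h
```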
Basu, Asit P; Basu, Sujit K
This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibull…
Hoertner, H.; Nieckau, E.; Spindler, H.
This report gives a comprehensive presentation of the detailed reliability investigation carried out for the engineered safety features installed to cope with the design basis accident 'Large LOCA' of a German nuclear power plant with pressurized water reactor. The investigation is based on the engineered safety features of the Biblis Nuclear Power Plant, Unit A. The reliability investigation is carried out by means of a fault tree analysis. The influence of common-mode failures is assessed. (orig.) [de
Saher Ebrahim Taman
Conclusion: A negative D-dimer test is a reliable diagnostic modality to rule out the need for CT angiography in patients at high risk of PE. However, a positive test result cannot confirm the diagnosis, and further testing is warranted.
Fourlaris, G.; Ellwood, R.; Jones, T.B.
The use of high strength steels (HSS) in automotive components is steadily increasing as automotive designers use modern steel grades to improve structural performance, reduce vehicle weight and enhance crash performance. Weight reduction can be achieved by substituting mild steel with a thinner gauge HSS, however, it must be ensured that no deterioration in performance including fatigue capability occurs. In this study, tests have been carried out to determine the effects that gauge and material strength have on the fatigue performance of a fusion welded automotive suspension arm. Current finite element (FE) modelling and fatigue prediction techniques have been evaluated to determine their reliability when used for thin strip steels. Results have shown the fatigue performance of welded components to be independent of the strength of the parent material for the steel grades studied, with material thickness and joining process the key features determining the fatigue performance. The correlation between the fatigue performance of simple welded samples under uniaxial, constant amplitude loading and complex components under biaxial in service road load data, has been shown to be unreliable. This study also indicates that with the application of modern technologies, such as tailor-welded blanks (TWB), significant weight savings can be achieved. This is demonstrated by a 19% weight reduction with no detrimental effect on the fatigue performance
Junoy Montolio, Francisco G; Wesselink, Christiaan; Gordijn, Marijke; Jansonius, Nomdo M
To determine the influence of several factors on standard automated perimetry test results in glaucoma. Longitudinal Humphrey field analyzer 30-2 Swedish interactive threshold algorithm data from 160 eyes of 160 glaucoma patients were used. The influence of technician experience, time of day, and season on the mean deviation (MD) was determined by performing linear regression analysis of MD against time on a series of visual fields and subsequently performing a multiple linear regression analysis with the MD residuals as the dependent variable and the factors mentioned above as independent variables. Analyses were performed with and without adjustment for test reliability (fixation losses and false-positive and false-negative answers) and with and without stratification according to disease stage (baseline MD). Mean follow-up was 9.4 years, with on average 10.8 tests per patient. Technician experience, time of day, and season were associated with the MD; approximately 0.2 dB lower MD values were found for inexperienced technicians. Technician experience, time of day, season, and the percentage of false-positive answers have a significant influence on the MD of standard automated perimetry.
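The two-stage analysis described above (a per-eye linear regression of MD on time, then modelling the residuals) can be sketched as follows; the data values and helper name are illustrative assumptions, not the study's code:

```python
def md_residuals(times, md_values):
    """Residuals of an ordinary least-squares fit of MD (dB) against time (years)."""
    n = len(times)
    mean_t = sum(times) / n
    mean_md = sum(md_values) / n
    sxx = sum((t - mean_t) ** 2 for t in times)
    sxy = sum((t - mean_t) * (m - mean_md) for t, m in zip(times, md_values))
    slope = sxy / sxx
    intercept = mean_md - slope * mean_t
    return [m - (slope * t + intercept) for t, m in zip(times, md_values)]

# hypothetical visual-field series for one eye
t = [0.0, 1.0, 2.1, 3.0, 4.2]
md = [-3.0, -3.4, -3.5, -4.1, -4.2]
resid = md_residuals(t, md)
# these residuals would then be regressed on technician experience,
# time of day, and season in the second-stage model
```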
The IEA crisis management system has so far not been required to demonstrate its full capacity and power. In spite of a number of crises in the Middle East, oil has always been sufficiently available in the Western world. Even with the Gulf at war, oil supplies have been abundant and world-market petroleum prices have been slackening again for several years. This situation has inevitably affected opinions on the reliability of supplies. While the West German petroleum industry is being suffocated by accumulating idle refinery stocks and therefore protests against excessive provision-making imposed by restrictive policies, the mining industry still enjoys political support for a policy of stockpiling in the interest of reliable energy supplies. In view of the inflationary use of the concepts of reliability and safety in political discussion, the publication abstracted here attempts to clarify their actual meaning, the development of the supply situation in the Federal Republic of Germany since 1973, and the applicability and costs of possible precautionary strategies that would take effect if energy imports were suspended.
The reliability of milk analysis results is important for quality assurance along the foodstuff chain. There are direct and indirect methods for measuring milk composition (fat (F), protein (P), lactose (L), and solids non fat (SNF) content). The goal was to evaluate several reference and routine milk analytical procedures on the basis of their results. The direct reference analyses were: F, fat content (Röse–Gottlieb method); P, crude protein content (Kjeldahl method); L, lactose (monohydrate; polarimetric method); SNF, solids non fat (gravimetric method). F, P, L and SNF were also determined by various indirect methods: MIR (infrared (IR) technology with optical filters; 7 instruments in 4 labs); MIR–FT (IR spectroscopy with Fourier transformations; 10 in 6); the ultrasonic method (UM; 3 in 1); and analysis by the blue and red box (BRB; 1 in 1). Ten reference milk samples were used. The coefficient of determination (R²), correlation coefficient (r) and standard deviation of the mean of individual differences (MDsd, for n) were evaluated. All correlations (r; for all indirect and alternative methods and all milk components) were significant (P ≤ 0.001). MIR and MIR–FT (conventional methods) explained a considerably higher proportion of the variability in the reference results than the UM and BRB methods (alternative). All r average values (x̄ minus 1.64 × sd, for a 95% confidence interval) can be used as standards for calibration quality evaluation (MIR, MIR–FT, UM and BRB respectively): for F 0.997, 0.997, 0.99 and 0.995; for P 0.986, 0.981, 0.828 and 0.864; for L 0.968, 0.871, 0.705 and 0.761; for SNF 0.992, 0.993, 0.911 and 0.872. Similarly for MDsd (x̄ plus 1.64 × sd): for F 0.071, 0.068, 0.132 and 0.101%; for P 0.051, 0.054, 0.202 and 0.14%; for L 0.037, 0.074, 0.113 and 0.11%; for SNF 0.052, 0.068, 0.141 and 0.204%.
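The statistics compared above (r, R², and the standard deviation of individual differences between a reference and a routine method) can be illustrated with a minimal sketch; the paired fat-content values below are invented, not taken from the study.

```python
import numpy as np

# Invented paired fat-content results (%) for 10 reference milk samples:
# direct reference method vs. an indirect instrument reading.
ref = np.array([3.1, 3.5, 3.8, 4.0, 4.2, 4.5, 4.8, 5.0, 5.3, 5.6])
ind = ref + np.array([0.02, -0.03, 0.05, -0.01, 0.00,
                      0.04, -0.02, 0.03, -0.04, 0.01])

r = np.corrcoef(ref, ind)[0, 1]    # correlation coefficient
r2 = r ** 2                        # coefficient of determination (R²)
md_sd = (ind - ref).std(ddof=1)    # SD of individual differences (MDsd)

print(round(r, 3), round(r2, 3), round(md_sd, 3))
```

A routine method passing a calibration-quality standard such as r ≥ 0.997 for fat would show an r and MDsd of roughly this magnitude.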
N A Kovyazina
The goal of the study was to demonstrate a multilevel laboratory quality management system and to point out methods for estimating the reliability and increasing the information content of laboratory results (using a laboratory case as an example). Results. The article examines the stages of laboratory quality management that have helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented with their expanded uncertainty and an evaluation of the dynamics. Conclusion. Compliance with the mandatory measures of a laboratory quality management system enables laboratories to obtain reliable results and to calculate parameters that increase the information content of laboratory tests in clinical decision making.
Tamargo, Christina L; Quinn, Gwendolyn P; Sanchez, Julian A; Schabath, Matthew B
Despite growing social acceptance, the LGBTQ population continues to face barriers to healthcare, including fear of stigmatization by healthcare providers and providers' lack of knowledge about LGBTQ-specific health issues. This analysis focuses on the assessment of quantitative and qualitative responses from a subset of providers who identified as specialists treating one or more of the seven cancers that may be disproportionate in LGBTQ patients. A 32-item web-based survey was emailed to 388 oncology providers at a single institution. The survey assessed demographics, knowledge, attitudes, and practice behaviors. Oncology providers specializing in the seven cancer types had poor knowledge of LGBTQ-specific health needs, with fewer than half of the surveyed providers (49.5%) correctly answering knowledge questions. Most providers had overall positive attitudes toward LGBTQ patients, with 91.7% agreeing they would be comfortable treating this population and would support education and/or training on LGBTQ-related cancer health issues. Results suggest that despite generally positive attitudes toward the LGBTQ population, oncology providers who treat the cancer types most prevalent among this population lack knowledge of its unique health issues. Knowledge and practice behaviors may improve with enhanced education and training on this population's specific needs.
Jones-Webb, Rhonda; Toomey, Traci L; Lenk, Kathleen M; Nelson, Toben F; Erickson, Darin J
We investigated what local enforcement agencies are doing to target adults who provide alcohol to underage youth, what types of enforcement activities are being conducted to target adult providers, and what factors encourage enforcement activities that target adult providers. We surveyed 1,056 local law enforcement agencies in the US and measured whether or not each agency conducted enforcement activities targeting adults who provide alcohol to underage youth. We also measured whether certain agency and jurisdiction characteristics were associated with such enforcement activities. Less than half (42%) of local enforcement agencies conducted enforcement efforts targeting adults who provide alcohol to underage youth. Agencies that conducted these activities were significantly more likely to have a full-time officer specific to alcohol enforcement, a division specific to alcohol enforcement, and a social host law, and to perceive underage drinking as very common. Results suggest that targeting social providers (i.e., adults over 21 years of age) will require greater law enforcement resources, implementation of underage drinking laws (e.g., social host policies), and changing perceptions among law enforcement regarding underage drinking. Future studies are needed to identify the most effective enforcement efforts and to examine how enforcement efforts are prospectively linked to alcohol consumption.
LaCommare, Kristina [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Larsen, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Eto, Joseph [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
Policymakers and regulatory agencies are expressing renewed interest in the reliability and resilience of the U.S. electric power system in large part due to growing recognition of the challenges posed by climate change, extreme weather events, and other emerging threats. Unfortunately, there has been little or no consolidated information in the public domain describing how public utility/service commission (PUC) staff evaluate the economics of proposed investments in the resilience of the power system. Having more consolidated information would give policymakers a better understanding of how different state regulatory entities across the U.S. make economic decisions pertaining to reliability/resiliency. To help address this, Lawrence Berkeley National Laboratory (LBNL) was tasked by the U.S. Department of Energy Office of Energy Policy and Systems Analysis (EPSA) to conduct an initial set of interviews with PUC staff to learn more about how proposed utility investments in reliability/resilience are being evaluated from an economics perspective. LBNL conducted structured interviews in late May-early June 2016 with staff from the following PUCs: Washington D.C. (DCPSC), Florida (FPSC), and California (CPUC).
Lee, Sang Yong
This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; the failure rate and the failure probability density function and their types; the constant failure rate (CFR) and the exponential distribution; the increasing failure rate (IFR) and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and failure analysis by fault tree analysis (FTA).
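The link between CFR and the exponential distribution, and the way the Weibull distribution generalizes it, can be sketched in a few lines; the parameter values below are illustrative, not taken from the book.

```python
import math

def reliability_exponential(t, lam):
    """CFR model: R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

def reliability_weibull(t, beta, eta):
    """Weibull model: R(t) = exp(-(t / eta) ** beta).
    beta < 1: infant mortality; beta = 1: CFR; beta > 1: wear-out."""
    return math.exp(-((t / eta) ** beta))

# With beta = 1 the Weibull model reduces to the exponential (CFR) model.
print(reliability_weibull(100, 1.0, 1000))     # exp(-0.1) ≈ 0.9048
print(reliability_exponential(100, 1 / 1000))  # same value
```

The shape parameter beta is what lets the Weibull model describe all three regions of the bathtub curve with one functional form.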
Michael E. Bowen
Background: Although elevated glucose values are strongly associated with undiagnosed diabetes, they are frequently overlooked. Patient, provider, and system factors associated with failure to follow up elevated glucose values in electronic medical records (EMRs) are not well described. Methods: We conducted a chart review in a comprehensive EMR with a patient portal and results management features. Established primary care patients with no known diagnosis of diabetes and ≥1 glucose value >125 mg/dL were included. Follow-up failure was defined as (1) no documented comment on the glucose value or result communication to the patient within 30 days or (2) no hemoglobin A1c (HbA1c) ordered within 30 days or resulted within 12 months. Associations were examined using Wilcoxon and χ² tests. Results: Of 150 charts reviewed, 97 met inclusion criteria. The median glucose was 133 mg/dL, and 20% of patients had multiple values >125 mg/dL. Only 36% of elevated glucose values were followed up. No associations were observed between patient characteristics, diabetes risk factors, or provider characteristics and follow-up failures. Automated flagging of glucose values ≥140 mg/dL by highlighting them in red in the EMR was not associated with improved follow-up (46% vs 32%; P = .19). Even when follow-up occurred (n = 35), only 31% completed gold-standard diabetes testing (HbA1c) within 12 months. Of the resulted HbA1c tests (n = 11), 55% were in the prediabetes range (5.7%-6.4%). Conclusions: Two-thirds of elevated glucose values were not followed up, despite EMR features facilitating results management. Greater understanding of the results management process and improved EMR functionality to support results management are needed.
Solomon David J
Background: Rating scales are an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods: The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server to calculate the reliability and other statistics describing the ratings. Results: When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion: This simple Web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to obtain complete rating data. I would welcome other researchers revising and enhancing the program.
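The Spearman-Brown prophecy formula mentioned in the results is simple to sketch; the single-rating reliability of 0.40 below is an invented example, not a value from the paper.

```python
def spearman_brown(r_single, k):
    """Reliability of the mean of k ratings, given the reliability of one."""
    return k * r_single / (1 + (k - 1) * r_single)

# Invented example: single-judge reliability 0.40, averaged over 4 judges.
print(round(spearman_brown(0.40, 4), 3))  # → 0.727
```

This is why averaging over more judges raises the reliability of the averaged rating even when each individual judge is noisy.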
This book resulted from the activity of Task Force 4.2, 'Human Reliability'. This group was established on 27 February 1986 at the plenary meeting of the Technical Reliability Committee of the VDI, within the framework of the VDI joint committee on industrial systems technology (GIS). It is composed of representatives of industry, research institutes, technical control boards and universities, whose job it is to study how people fit into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the branch of ergonomics dealing with human reliability in the use of technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work has been systematized and compiled in this book. (orig.) [de]
Al-Khathaami, Ali M; Alshahrani, Saeed M; Kojan, Suleiman M; Al-Jumah, Mohammed A; Alamry, Ahmed A; El-Metwally, Ashraf A
To determine the degree of satisfaction and acceptance of stroke patients, their relatives, and healthcare providers toward using telestroke technology in Saudi Arabia. A cross-sectional study was conducted between October and December 2012 at King Abdulaziz Medical City, Ministry of National Guard Affairs, Riyadh, Saudi Arabia. The Remote Presence Robot (RPR), the RP-7i (FDA-cleared), provided by InTouch Health, was used in the study. Patients and their relatives were informed that the physician would appear through a screen on top of a robotic device as part of their clinical care. Stroke patients admitted through the emergency department, their relatives, and healthcare providers completed a self-administered satisfaction questionnaire following the telestroke consultation sessions. Fifty participants completed the questionnaire. Most subjects agreed that the remote consultant interview was useful and that the audiovisual component of the intervention was of high quality; 98% agreed that they did not feel shy or embarrassed during the remote interview, were able to understand the instructions of the consultant, and recommended its use in stroke management. Furthermore, 92% agreed or strongly agreed that the use of this technology can efficiently replace the physical presence of a neurologist. Results suggest that the use of telestroke medicine is culturally acceptable among stroke patients and their families in Saudi Arabia and favorably received by healthcare providers.
Newman, Constance; Kimeu, Anastasiah; Shamblin, Leigh; Penders, Christopher; McQuide, Pamela A; Bwonya, Judith
IntraHealth International's USAID-funded Capacity Kenya project conducted a performance needs assessment of the Kenya health provider education system in 2010. Various stakeholders shared their understandings of the role played by gender and identified opportunities to improve gender equality in health provider education. Findings suggest that occupational segregation, sexual harassment and discrimination based on pregnancy and family responsibilities present problems, especially for female students and faculty. To grow and sustain its workforce over the long term, Kenyan human resource leaders and managers must act to eliminate gender-based obstacles by implementing existing non-discrimination and equal opportunity policies and laws to increase the entry, retention and productivity of students and faculty. Families and communities must support girls' schooling and defer early marriage. All this will result in a fuller pool of students, faculty and matriculated health workers and, ultimately, a more robust health workforce to meet Kenya's health challenges.
Vallentin, Daniel; Viebahn, Peter
Several energy scenario studies consider concentrated solar power (CSP) plants an important technology option for reducing the world's CO₂ emissions to a level required to keep the global average temperature from exceeding a threshold of 2-2.4 °C. A global ramp-up of CSP technologies offers great economic opportunities for technology providers, as CSP plants include highly specialised components. This paper analyses possible value creation effects resulting from a global deployment of CSP until 2050, as projected in scenarios of the International Energy Agency (IEA) and Greenpeace International. The analysis focuses on the economic opportunities of German technology providers, since companies such as Schott Solar, Flabeg and Solar Millennium are among the leading suppliers of CSP technologies on the global market.
Angela Golden,1 Yvonne D'Arcy,2 Elizabeth T Masters,3 Andrew Clair3 (1NP from Home, LLC, Munds Park, AZ; 2Pain Management and Palliative Care, Suburban Hospital-Johns Hopkins Medicine, Bethesda, MD; 3Pfizer, New York, NY, USA) Abstract: Fibromyalgia (FM) is a chronic disorder characterized by widespread pain, which can limit patients' physical function and daily activities. FM can be challenging to treat, and the treatment approach could benefit from a greater understanding of patients' perspectives on their condition and their care. Patients with FM participated in an online survey conducted in the USA that sought to identify the symptoms that had the greatest impact on patients' daily lives. The purpose of the survey was to facilitate efforts toward improving care of patients by nurse practitioners, primary care providers, and specialists, in addition to contributing to the development of new outcome measures in both clinical trials and general practice. A total of 1,228 patients with FM completed the survey, responding to specific questions pertaining to symptoms, impact of symptoms, management of FM, and the relationship with health care providers. Chronic pain was identified as the key FM symptom, affecting personal and professional relationships, and restricting physical activity, work, and social commitments. Patients felt that the severity of their condition was underestimated by family, friends, and health care providers. The results of this survey highlight the need for nurse practitioners, primary care providers, and specialists to provide understanding and support to patients as they work together to enable effective diagnosis and management of FM. Keywords: fibromyalgia, pain, survey, impact, support
The reliability of the information provided by individuals about their habitual tongue position
Ana Fernanda Rodrigues Cardoso
two phases, with a minimum difference of seven and a maximum of twenty-one days. Initially we observed the usual position of the tongue. Then the subjects were questioned about its habitual position. After the response, the tongue was stimulated with a wooden spatula in order to enhance perception. The subjects were then guided to observe where their tongue was usually positioned in the oral cavity until the second assessment, at which point they were again asked about the habitual position of the tongue. Data were analyzed using the Kappa statistic. RESULTS: it was not possible to observe the usual position of the tongue in 100% of the sample. The general reliability of the responses was classified as between slight and fair. The children's responses were inconsistent and very diverse. Among the adults, some gave correct answers to the first question, while others gave reliable answers only after intra-oral perception stimulation. CONCLUSIONS: the reliability of the information provided by the individuals in the sample on the usual position of the tongue ranges from slight to fair, and is therefore low, both in children and in adults. A possible strategy for clinical speech therapy practice is to question patients about their tongue position after a period of observation.
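The Kappa statistic used in the analysis measures agreement corrected for chance agreement. A minimal sketch of Cohen's kappa for two categorical ratings follows; the test-retest answers below are hypothetical, not the study's data.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Agreement between two categorical rating series, corrected for chance."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n           # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in ca) / n ** 2         # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical test-retest answers about habitual tongue position:
t1 = ["low", "low", "high", "low", "high", "low"]
t2 = ["low", "high", "high", "low", "low", "low"]
print(round(cohens_kappa(t1, t2), 2))  # → 0.25
```

A value of 0.25 falls in the "fair" band of the usual Landis-Koch interpretation, which is the kind of low agreement the study reports.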
Trees provide numerous benefits for urban residents, including reduced energy usage, improved air quality, stormwater management, carbon sequestration, and increased property values. Quantifying these benefits can help justify the costs of planting trees. In this paper, we use i-Tree Streets to quantify the benefits of street trees planted by nonprofits in three U.S. cities (Detroit, Michigan; Indianapolis, Indiana; and Philadelphia, Pennsylvania) from 2009 to 2011. We also use both measured and modeled survival and growth rates to "grow" the tree populations 5 and 10 years into the future to project the future benefits of the trees under different survival and growth scenarios. The 4059 re-inventoried trees (2864 of which are living) currently provide almost $40,000 (USD) in estimated annual benefits ($9–$20/tree depending on the city), the majority (75%) of which are increased property values. The trees can be expected to provide increasing annual benefits during the 10 years after planting if the annual survival rate is higher than the 93% annual survival measured during the establishment period. However, our projections show that with continued 93% or lower annual survival, the increase in annual benefits from tree growth will not be able to make up for the loss of benefits as trees die. This means that estimated total annual benefits from a cohort of planted trees will decrease between the 5-year projection and the 10-year projection. The results of this study indicate that without early intervention to ensure survival of planted street trees, tree mortality may be significantly undercutting the ability of tree-planting programs to provide benefits to neighborhood residents.
Cerveri, P; Lopomo, N; Pedotti, A; Ferrigno, G
In the field of 3D reconstruction of human motion from video, model-based techniques have been proposed to increase the estimation accuracy and the degree of automation. The feasibility of this approach is strictly connected with the adopted biomechanical model. In particular, the representation of the kinematic chain and the assessment of the corresponding parameters play a relevant role in the success of the motion assessment. In this paper, the focus is on the determination of the kinematic parameters of a general hand skeleton model using surface measurements. A novel method that integrates nonrigid sphere fitting and evolutionary optimization is proposed to estimate the centers and the functional axes of rotation of the skeletal joints. The reliability of the technique is tested using real movement data and simulated motions with known ground truth, 3D measurement noise, and different ranges of motion (RoM). With respect to standard nonrigid sphere fitting techniques, the proposed method performs 10-50% better in the best condition (very low noise and wide RoM) and over 100% better with physiological artifacts and RoM. Repeatability in the range of a couple of millimeters for the localization of the centers of rotation, and in the range of one degree for the axis directions, is obtained from real-data experiments.
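A standard building block of such joint-parameter methods is a least-squares sphere fit to a surface-marker trajectory, giving a candidate center of rotation. The sketch below uses a plain algebraic fit on a synthetic trajectory, not the paper's nonrigid/evolutionary method; the joint center, radius, noise level, and RoM are all invented.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit; returns (center, radius).

    ||p - c||^2 = r^2 is linearized as 2 p·c + (r^2 - |c|^2) = |p|^2.
    """
    A = np.column_stack([2 * points, np.ones(len(points))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = float(np.sqrt(sol[3] + center @ center))
    return center, radius

# Synthetic marker trajectory around a joint center at (1, 2, 3), radius 5,
# over a limited range of motion, with additive measurement noise.
rng = np.random.default_rng(1)
theta = rng.uniform(0.2, 1.2, 200)
phi = rng.uniform(0.0, 1.5, 200)
pts = np.column_stack([
    1 + 5 * np.cos(theta) * np.cos(phi),
    2 + 5 * np.cos(theta) * np.sin(phi),
    3 + 5 * np.sin(theta),
])
pts += rng.normal(0, 0.01, pts.shape)

center, radius = fit_sphere(pts)
```

With a narrow RoM and skin-motion artifacts this simple fit degrades quickly, which is the situation where the paper's method reportedly gains the most.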
Støre-Valen, Jakob; Ryum, Truls; Pedersen, Geir A F; Pripp, Are H; Jose, Paul E; Karterud, Sigmund
The Global Assessment of Functioning (GAF) Scale is used in routine clinical practice and research to estimate symptom and functional severity and longitudinal change. Concerns about poor interrater reliability have been raised, and the present study evaluated the effect of a Web-based GAF training program designed to improve interrater reliability in routine clinical practice. Clinicians rated up to 20 vignettes online, and received deviation scores as immediate feedback (i.e., own scores compared with expert raters) after each rating. Growth curves of absolute SD scores across the vignettes were modeled. A linear mixed effects model, using the clinician's deviation scores from expert raters as the dependent variable, indicated an improvement in reliability during training. Moderation by content of scale (symptoms; functioning), scale range (average; extreme), previous experience with GAF rating, profession, and postgraduate training were assessed. Training reduced deviation scores for inexperienced GAF raters, for individuals in clinical professions other than nursing and medicine, and for individuals with no postgraduate specialization. In addition, training was most beneficial for cases with average severity of symptoms compared with cases with extreme severity. The results support the use of Web-based training with feedback routines as a means to improve the reliability of GAF ratings performed by clinicians in mental health practice. These results especially pertain to clinicians in mental health practice who do not have a masters or doctoral degree. (c) 2015 APA, all rights reserved.
Joshua M Pevnick
Personal fitness trackers (PFTs) have substantial potential to improve healthcare. To quantify and characterize the early adopters who shared their PFT data with providers, we used bivariate statistics and logistic regression to compare patients who shared any PFT data with patients who did not. A patient portal was used to invite 79,953 registered portal users to share their data. Of 66,105 users included in our analysis, 499 (0.8%) uploaded data during an initial 37-day study period. Bivariate and regression analyses showed that early adopters were more likely than non-adopters to be younger, male, white, health system employees, and to have higher BMIs. Neither comorbidities nor utilization predicted adoption. Our results demonstrate that patients had little intrinsic desire to share PFT data with their providers, and suggest that the patients most at risk for poor health outcomes are the least likely to share PFT data. Marketing, incentives, and/or cultural change may be needed to induce such data-sharing.
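Bivariate comparisons of adopters and non-adopters like those above often reduce to an odds ratio from a 2x2 table. A minimal sketch follows; the counts are invented for illustration, not the study's data.

```python
import math

# Invented 2x2 counts (not the study's data): PFT-data sharers vs.
# non-sharers, split by sex.
male = (310, 28000)    # (sharers, non-sharers)
female = (189, 37606)

odds_male = male[0] / male[1]
odds_female = female[0] / female[1]
or_ = odds_male / odds_female          # odds ratio for male vs. female

# Wald 95% confidence interval on the log odds ratio.
se = math.sqrt(sum(1 / x for x in male + female))
ci = (math.exp(math.log(or_) - 1.96 * se),
      math.exp(math.log(or_) + 1.96 * se))
print(round(or_, 2), tuple(round(x, 2) for x in ci))
```

A confidence interval that excludes 1, as here, corresponds to a statistically significant bivariate association; logistic regression then adjusts such estimates for the other covariates simultaneously.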
Cassardo, C.; Loglisci, N.
In recent years, there has been significant growth in the recognition of the importance of soil moisture in large-scale hydrology and climate modelling. Soil moisture is a lower boundary condition which rules the partitioning of energy into sensible and latent heat flux. Wrong estimations of soil moisture lead to wrong simulation of the surface-layer evolution, and hence precipitation and cloud cover forecasts can be adversely affected. This is true for large-scale medium-range weather forecasts as well as for local-scale short-range weather forecasts, particularly in situations in which local convection is well developed. Unfortunately, despite the importance of this physical parameter, there are only a few soil moisture data sets, sparse in time and in space, around the world. Owing to this scarcity of soil moisture observations, we developed an alternative method to provide soil moisture datasets with which to verify numerical weather prediction models. This paper presents the preliminary results of an attempt to verify soil moisture fields predicted by a mesoscale model. The data for the comparison were provided by simulations of the diagnostic land surface scheme LSPM (Land Surface Process Model), widely used at the Piedmont Regional Weather Service for agro-meteorological purposes. To this end, LSPM was initialized and driven by Synop observations, while the surface (vegetation and soil) parameter values were initialized from the ECOCLIMAP global dataset at 1 km² resolution.
Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.
Jacobson, M. Z.; Delucchi, M. A.; Cameron, M. A.; Frew, B. A.
The greatest concern facing the large-scale integration of wind, water, and solar (WWS) into a power grid is the high cost of avoiding load loss caused by WWS variability and uncertainty. This talk discusses the recent development of a new grid integration model to address this issue. The model finds low-cost, no-load-loss, non-unique solutions to this problem upon electrification of all U.S. energy sectors (electricity, transportation, heating/cooling, and industry) while accounting for wind and solar time-series data from a 3-D global weather model that simulates extreme events and competition among wind turbines for available kinetic energy. Solutions are obtained by prioritizing storage for heat (in soil and water); cold (in ice and water); and electricity (in phase-change materials, pumped hydro, hydropower, and hydrogen); and using demand response. No natural gas, biofuels, or stationary batteries are needed. The resulting 2050-2055 U.S. electricity social cost for a full system is much less than for fossil fuels. These results hold for many conditions, suggesting that low-cost, stable 100% WWS systems should work many places worldwide. The paper this talk is based on was published in PNAS, 112, 15,060-15,065, 2015, doi:10.1073/pnas.1510028112.
Lee, Kang Hee; Yoon, Kyung Hee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)]
The reliability assurance with respect to the test procedure and results of the out-of-pile mechanical performance test for a nuclear fuel assembly is an essential task, both to assure the test quality and to obtain permission for fuel loading into a commercial reactor core. In the case of a vibration test, proper management and appropriate calibration of the instruments and devices used in the test, and various efforts to minimize possible error during the test and the signal acquisition process, are needed. Additionally, a deep understanding both of the theoretical assumptions and simplifications behind the signal processing/modal analysis and of the functions of the devices used in the test is required. In this study, the overall procedure and results of the lateral vibration test for the assembly's mechanical characterization are briefly introduced, and a series of measures to assure and improve the reliability of the vibration test are discussed.
A comparative analysis of methods for determining the required number of experiments and the accuracy and reliability of the results of studies of the physical-mechanical properties of rock has been conducted. The advantages and disadvantages of the existing specialized method for determining the compressive strength of samples are discussed. On the basis of this investigation, an optimal approach is proposed for solving a wide range of problems associated with the parameters of rock properties.
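A common way to determine the required number of experiments for a target accuracy is the normal-approximation sample-size formula; the sketch below uses invented values for the strength scatter and the target accuracy, not figures from the study.

```python
import math

def required_samples(sigma, eps, z=1.96):
    """Smallest n so the sample mean lies within ±eps of the true mean with
    ~95% confidence (normal approximation): n >= (z * sigma / eps)**2."""
    return math.ceil((z * sigma / eps) ** 2)

# Invented example: compressive-strength scatter sigma = 12 MPa,
# target accuracy ±5 MPa.
print(required_samples(12, 5))  # → 23
```

Tightening the accuracy requirement by a factor of two quadruples the required number of samples, which is why such methods trade off testing cost against result reliability.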
Stern, Lisa F; Simons, Hannah R; Kohn, Julia E; Debevec, Elie J; Morfesis, Johanna M; Patel, Ashlesha A
To describe contraceptive use among U.S. female family planning providers and to compare their contraceptive choices to those of the general population. We surveyed a convenience sample of female family planning providers ages 25-44 years, including physicians and advanced practice clinicians, via an internet-based survey from April to May 2013. Family planning providers were compared to female respondents ages 25-44 years from the 2011-2013 National Survey of Family Growth. A total of 488 responses were eligible for analysis; 331 respondents (67.8%) were using a contraceptive method. Providers' contraceptive use differed markedly from that of the general population, with providers significantly more likely to use intrauterine contraception, an implant, and the vaginal ring. Providers were significantly less likely to use female sterilization and condoms. There were no significant differences between providers and the general population in use of partner vasectomy or the pill. Long-acting reversible contraception (LARC) use was significantly higher among providers than in the general population (41.7% vs. 12.1%). Overall, contraceptive use among family planning providers differed significantly from that of the general population. These findings have implications for clinical practice, patient education, and health policy. Family planning providers report higher use of LARC than the general population. This may reflect differences in preferences and access. Providers might consider sharing these findings with patients, while maintaining patient choice and autonomy. Copyright © 2015 Elsevier Inc. All rights reserved.
Lüdeke, Andreas; Giachino, R
High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum. This contribution describes the forum and advertises its use in the community.
O'Brien, J.N.; Spettell, C.M.
This report is the first in a series which documents research aimed at improving the usefulness of Probabilistic Risk Assessment (PRA) results in addressing human risk issues. This first report describes the results of an assessment of how well currently available PRA data addresses human risk issues of current concern to NRC. Findings indicate that PRA data could be far more useful in addressing human risk issues with modification of the development process and documentation structure of PRAs. In addition, information from non-PRA sources could be integrated with PRA data to address many other issues. 12 tabs
Montolio, Francisco G. Junoy; Wesselink, Christiaan; Gordijn, Marijke; Jansonius, Nomdo M.
PURPOSE. To determine the influence of several factors on standard automated perimetry test results in glaucoma. METHODS. Longitudinal Humphrey field analyzer 30-2 Swedish interactive threshold algorithm data from 160 eyes of 160 glaucoma patients were used. The influence of technician experience,
Lathi, Ruth B; Gustin, Stephanie L F; Keller, Jennifer; Maisenbacher, Melissa K; Sigurjonsson, Styrmir; Tao, Rosina; Demko, Zach
To examine the rate of maternal contamination in miscarriage specimens. Retrospective review of 1,222 miscarriage specimens submitted for chromosome testing with detection of maternal cell contamination (MCC). Referral centers requesting genetic testing of miscarriage specimens at a single reference laboratory. Women with pregnancy loss who desire complete chromosome analysis of the pregnancy tissue. Analysis of miscarriage specimens using single-nucleotide polymorphism (SNP) microarray technology with bioinformatics program to detect maternal cell contamination. Chromosome content of miscarriages and incidence of 46,XX results due to MCC. Of the 1,222 samples analyzed, 592 had numeric chromosomal abnormalities, and 630 were normal 46,XX or 46,XY (456 and 187, respectively). In 269 of the 46,XX specimens, MCC with no embryonic component was found. With the exclusion of maternal 46,XX results, the chromosomal abnormality rate increased from 48% to 62%, and the ratio for XX to XY results dropped from 2.6 to 1.0. Over half of the normal 46,XX results in miscarriage specimens were due to MCC. The use of SNPs in MCC testing allows for precise identification of chromosomal abnormalities in miscarriage as well as MCC, improving the accuracy of products of conception testing. Copyright © 2014 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Full Text Available Milk recording (MR) is an essential breeding measure; its results are important for inheritance checks. The occurrence of errors in the data may compromise the efficiency of breeding of dairy cows. The aim was to explore the possibility of reducing the incidence of errors in the MR database. Analyses of the frequency distribution of deviations in MR data from different sources, and estimations of the limits of acceptable differences in milk recording, were performed. The results from MR control days of the flowmeter in the parlour (DMY) were paired with the AVG7 results (average over 7 days from the same flowmeter; n = 16,247 original recordings of complete lactations). The individual differences in milk yield indicators were calculated between successive MR control days (DMY – R, monthly interval), where the reference value (R) is the previous DMY in the MR data file. A statistically significant correlation coefficient between AVG7 and DMY of 0.935 (P < 0.001) was found, higher than in a previous assessment under AMS (automatic milking system) conditions (0.898; P < 0.001). This means that 87.3% of the variability in the milk yield values for MR (DMY) can be explained by variations in the AVG7 values, and vice versa. Difference tests confirmed significant differences (P < 0.001; 0.76 and 0.55 kg) between DMY (in MR) and AVG7 for the original and refined data files, respectively. These differences, although statistically significant, correspond to only 2.96% and 2.15%, respectively. The use of a multi-day milk yield average from the electronic flowmeter is an equivalent alternative to the use of a record from one MR control day. The results are used in MR practice.
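The abstract's step from the correlation coefficient r = 0.935 to "87.3% of the variability" is the coefficient of determination, r². A minimal Python sketch using the reported figure (the abstract's 87.3% presumably reflects the unrounded correlation, since 0.935² rounds to 87.4%):

```python
# Coefficient of determination: share of variance in DMY explained by AVG7
# (and vice versa), computed from the correlation reported in the abstract.
r = 0.935
r_squared = r ** 2
print(f"r = {r}, r^2 = {r_squared:.4f} ({r_squared * 100:.1f}% of variance)")
```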
Kurz, Jochen H.; Dugan, Sandra; Juengert, Anne
Reliable assessment procedures are an important aspect of maintenance concepts, and non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanics assessments require knowledge of flaw dimensions, loads and material parameters, and NDT methods are able to acquire information in all of these areas. However, it has to be considered that the level of detail of this information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts, e.g. the damage tolerance approach in aerospace, already include reliability criteria for the NDT methods applied in maintenance plans. NDT is also an essential part of the construction and maintenance of nuclear power plants. In Germany, the type and extent of inspection are specified in the Safety Standards of the Nuclear Safety Standards Commission (KTA), and only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in the form of performance demonstrations by the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities for certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. The probability of detection (POD) curves of specific flaws under specific testing conditions are often not known. This paper shows the results of a research project designed for POD determination of ultrasound phased array inspections of real and artificial cracks. A further objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning are discussed, and results of the ultrasound inspections are presented.
Voskresenskij, L.A.; Zajtsev, A.I.; Nelyubov, V.I.; Fedorov, M.A.
Results of calculations and experimental studies on ensuring the reliability of a complex for plastic tube welding are presented. The choice of reliability indices and standards is substantiated. Reliability levels of components are determined, and the most loaded parts are analysed; it is shown that they meet the requirements for strength and reliability. Service life tests confirmed the correct choice of springs. Recommendations for improving reliability are given, and directions for further development are indicated. 8 refs.; 2 figs.; 1 tab
Sarmiento, Kelly; Donnell, Zoe; Hoffman, Rosanne; Tennant, Bethany
To explore healthcare providers' experiences managing mTBI and to better understand their use of mTBI assessment tools and guidelines. Design: cross-sectional. Methods: a random sample of 1,760 healthcare providers responded to the web-based DocStyles survey between June 18 and 30, 2014. The sample included family/general practitioners, internists, pediatricians, and nurse practitioners who reported seeing pediatric patients. We examined their experiences with mTBI to identify opportunities to increase preparedness and improve management of mTBI. Fifty-nine percent of healthcare providers reported that they had diagnosed or managed pediatric patients with mTBI within the last 12 months. Of those, 44.4% felt 'very prepared' to make decisions about when pediatric patients can safely return to activities, such as school and sports, after a mTBI. When asked how often they use screening or assessment tools to assess pediatric patients with mTBI, almost half reported that they 'seldom' or 'never' use those resources (24.6% and 22.0%, respectively). Most healthcare providers reported seeing pediatric patients with mTBI, yet most feel only somewhat prepared to manage this injury in their practice. Broader use of screening tools and guidelines that include clinical decision support tools may be useful for healthcare providers who care for pediatric patients with mTBI.
Martignoni, K.; Elsasser, U.
Carcinomas occurring in the thyroid gland as a result of radiation generally affect the papillary and, to a slightly lesser extent, follicular parts of this organ, while the available body of evidence hardly gives any indications of anaplastic and medullary neoplasms. Radiation has, however, mostly been associated with multicentric tumours. Among the survivors of the nuclear assaults on Hiroshima and Nagasaki, there are no known cases of anaplastic carcinomas of the thyroid. The papillary carcinoma, which is the prevailing type of neoplasm after radiation exposure, has less malignant potential than the follicular one and is encountered in all age groups. Malignant carcinomas of the thyroid are predominantly found in the middle and high age groups. It was calculated that high Gy doses and dose efficiencies are associated in children with a risk coefficient of 2.5 per 10⁴ person-years. This rate is only half as high for adults. Studies performed on relevant cohorts point to latency periods of at least five years. Individuals exposed to radiation are believed to be at a forty-year or even life-long risk of developing cancer. The cancer risk can best be described on the basis of a linear dose-effect relationship. The mortality rate calculated for cancer of the thyroid amounts to approx. 10% of the morbidity rate. The carcinogenic potential of iodine-131 in the thyroid is only one-third as great as that associated with external radiation of high dose efficiency. (orig./MG) [de
Full Text Available A Review of: Eldredge, J. D., Hall, L. J., McElfresh, K. R., Warner, T. D., Stromberg, T. L., Trost, J. T., & Jelinek, D. A. (2016). Rural providers' access to online resources: A randomized controlled trial. Journal of the Medical Library Association, 104(1), 33-41. http://dx.doi.org/10.3163/1536-5050.104.1.005 Objective – To determine whether free access to the point of care (PoC) resource Dynamed or the electronic book collection AccessMedicine was more useful to rural health care providers in answering clinical questions, in terms of usage and satisfaction. Design – Randomized controlled trial. Setting – Rural New Mexico. Subjects – Twenty-eight health care providers (physicians, nurses, physician assistants, and pharmacists) with no reported access to PoC resources (specifically Dynamed and AccessMedicine) or electronic textbook collections prior to enrollment.
Schmied, Virginia; Fowler, Cathrine; Rossiter, Chris; Homer, Caroline; Kruske, Sue
Australia has a system of universal child and family health (CFH) nursing services providing primary health services from birth to school entry. Herein, we report on the findings of the first national survey of CFH nurses, including the ages and circumstances of children and families seen by CFH nurses and the nature and frequency of the services provided by these nurses across Australia. A national survey of CFH nurses was conducted. In all, 1098 CFH nurses responded to the survey. Over 60% were engaged in delivering primary prevention services from a universal platform. Overall, 82.8% reported that their service made first contact with families within 2 weeks of birth, usually in the home (80.7%). The proportion of respondents providing regular support to families decreased as the child aged. Services were primarily health centre based, although 25% reported providing services in other locations (parks, preschools). The timing and location of first contact, the frequency of ongoing services and the composition of families seen by nurses varied across Australian jurisdictions. Nurses identified time constraints as the key barrier to the delivery of comprehensive services. CFH nurses play an important role in supporting families across Australia. The impact of differences in the CFH nursing provision across Australia requires further investigation. What is known about the topic? Countries that offer universal well child health services demonstrate better child health and developmental outcomes than countries that do not. Australian jurisdictions offer free, universal child and family health (CFH) nursing services from birth to school entry. What does this paper add? This paper provides nation-wide data on the nature of work undertaken by CFH nurses offering universal care. Across Australia, there are differences in the timing and location of first contact, the frequency of ongoing services and the range of families seen by nurses. What are the implications for
Taegtmeyer, Miriam; Davies, Alun; Mwangome, Mary; van der Elst, Elisabeth M; Graham, Susan M; Price, Matt A; Sanders, Eduard J
The role of men who have sex with men (MSM) in the African HIV epidemic is gaining recognition yet capacity to address the HIV prevention needs of this group is limited. HIV testing and counselling is not only a critical entry point for biomedical HIV prevention interventions, such as pre-exposure prophylaxis, rectal microbicides and early treatment initiation, but is also an opportunity for focused risk reduction counselling that can support individuals living in difficult circumstances. For prevention efforts to succeed, however, MSM need to access services and they will only do so if these are non-judgmental, informative, focused on their needs, and of clear benefit. This study aimed to understand Kenyan providers' attitudes towards and experiences with counselling MSM in a research clinic targeting this group for HIV prevention. We used in-depth interviews to explore values, attitudes and cognitive and social constructs of 13 counsellors and 3 clinicians providing services to MSM at this clinic. Service providers felt that despite their growing experience, more targeted training would have been helpful to improve their effectiveness in MSM-specific risk reduction counselling. They wanted greater familiarity with MSM in Kenya to better understand the root causes of MSM risk-taking (e.g., poverty, sex work, substance abuse, misconceptions about transmission, stigma, and sexual desire) and felt frustrated at the perceived intractability of some of their clients' issues. In addition, they identified training needs on how to question men about specific risk behaviours, improved strategies for negotiating risk reduction with counselling clients, and improved support supervision from senior counsellors. This paper describes the themes arising from these interviews and makes practical recommendations on training and support supervision systems for nascent MSM HIV prevention programmes in Africa.
Deans, N.D.; Miller, A.J.; Mann, D.P.
The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)
Argenter-Giralt, Miquel; Barba-Albós, Genoveva; Román-Martínez, Anna
The health information system in Catalonia has experienced an important evolution, but obtaining integrated data to evaluate the health services is still difficult. At the end of 2008, the basis of the information system of the "Center of Results" and a first set of indicators were approved by the health system stakeholders. The "Center of Results" is assigned to the Catalan Health Service. It has a Direction Board and a Technical Committee to regulate its operation. The "Center of Results" has the mission to measure, evaluate and disseminate the results obtained in health care by the members of the public health services, to facilitate decision making with shared responsibility at the service of the quality of the health care given to the citizens of Catalonia. The "Center of Results" is based on performance principles that determine its operation: to share and to coordinate the existing information, to stimulate the participation and the co-responsibility of the implied agents, continuous improvement of the health information, promotion of good practices in the use of information and its responsible use, and efficient use of technologies and analytical capacity to transform data into information. A participative process was carried out to select and prioritize indicators, reaching consensus on a set of indicators. These indicators must contribute to assessing the impact of the interventions of the health system on the level of the population's health and how results are obtained with an efficient use of the resources. 2010 Elsevier España S.L. All rights reserved.
Vu, Alexander; Wirtz, Andrea; Pham, Kiemanh; Singh, Sonal; Rubenstein, Leonard; Glass, Nancy; Perrin, Nancy
Refugees and internally displaced persons (IDPs) who are affected by armed conflict are at increased vulnerability to some forms of sexual violence and other types of gender-based violence (GBV). A validated, brief and easy-to-administer screening tool would help service providers identify GBV survivors and refer them to appropriate GBV services. To date, no such GBV screening tool exists. We developed the 7-item ASIST-GBV screening tool from qualitative research that included individual interviews and focus groups with GBV refugee and IDP survivors. This study presents the psychometric properties of the ASIST-GBV with female refugees living in Ethiopia and IDPs in Colombia. Several strategies were used to validate the ASIST-GBV, including a 3-month implementation to validate the brief screening tool with women/girls aged ≥15 years seeking health services in Ethiopia (N = 487) and female IDPs aged ≥18 years in Colombia (N = 511). High proportions of women screened positive for past-year GBV according to the ASIST-GBV: 50.6% in Ethiopia and 63.4% in Colombia. The factor analysis identified a single dimension, meaning that all items loaded on the single factor; Cronbach's α = 0.77. A 2-parameter logistic IRT model was used for estimating the precision and discriminating power of each item. Item difficulty varied across the continuum of GBV experiences in the following order (lowest to highest): threats of violence (0.690), physical violence (1.28), forced sex (2.49), coercive sex for survival (2.25), forced marriage (3.51), and forced pregnancy (6.33). Discrimination results showed that forced pregnancy was the item with the strongest ability to discriminate between different levels of GBV. Physical violence and forced sex also have higher levels of discrimination, with threats of violence discriminating among women at the low end of the GBV continuum and coercive sex for survival among women at the mid-range of the continuum. The findings demonstrate that
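The 2-parameter logistic (2PL) IRT model mentioned above gives the probability that a respondent at latent trait level θ endorses an item with discrimination a and difficulty b: P(θ) = 1 / (1 + e^(−a(θ − b))). A minimal Python sketch using the difficulty values reported in the abstract; the per-item discrimination parameters are not given, so a = 1.0 below is purely illustrative:

```python
import math

def p_endorse(theta, a, b):
    """2PL IRT: probability of endorsing an item, given latent trait theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Difficulty values (b) reported in the abstract; discrimination (a) is set
# to 1.0 for illustration only -- the fitted values are not reported per item.
items = {
    "threats of violence": 0.690,
    "physical violence": 1.28,
    "forced sex": 2.49,
    "coercive sex for survival": 2.25,
    "forced marriage": 3.51,
    "forced pregnancy": 6.33,
}
theta = 1.0  # a hypothetical respondent one unit above the mean trait level
for name, b in items.items():
    print(f"{name:26s} P(endorse) = {p_endorse(theta, 1.0, b):.3f}")
```

Higher-difficulty items (e.g. forced pregnancy) are endorsed only at the extreme end of the GBV-experience continuum, which is exactly the ordering the abstract reports.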
Lee, Jung Hwan; Lee, Sang-Ho
Epidural steroid injection (ESI) is known to be an effective treatment for lower back or radicular pain due to herniated intervertebral disc (HIVD) and spinal stenosis (SS). Although repeat ESI has generally been indicated to provide more pain relief in partial responders after a single ESI, there has been little evidence supporting the usefulness of repeat injections in cumulative clinical pain reduction. The purpose of this study was to determine whether repeat ESI at a prescribed interval of 2 to 3 weeks after the first injection would provide greater clinical benefit in patients with partial pain reduction than that provided by intermittent injection performed only when pain was aggravated. An Institutional Review Board (IRB)-approved retrospective chart review. Spine hospital. Two hundred and four patients who had undergone transforaminal ESI (TFESI) for treatment of lower back and radicular pain due to HIVD or SS and could be followed up for one year were enrolled. We divided the patients into 2 groups. Group A (N = 108) comprised partial responders (NRS = 3 after first injection) who underwent repeat injection at a prescribed interval of 2 to 3 weeks after the first injection. Group B (N = 96) comprised partial responders who did not receive a repeat injection at the prescribed interval, but received repeat injections only for aggravation of pain. Various clinical data, including the total number of injections during one year and the duration of NRS group A, or after first injection in group B (time to reinjection), were assessed. These data were compared between groups A and B in terms of the total population, HIVD, and SS. In the whole population, the mean time to reinjection was 6.09 ± 3.02 months in group A and 3.69 ± 2.07 months in group B. The NRS groups A and B, respectively. In HIVD patients, the mean time to reinjection was 5.82 ± 3.23 months in group A and 3.84 ± 2.34 months in group B, and NRS groups A and B, respectively. In SS patients, the mean time to
Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.
Ventura-Ríos, Lucio; Hernández-Díaz, Cristina; Ferrusquia-Toríz, Diana; Cruz-Arenas, Esteban; Rodríguez-Henríquez, Pedro; Alvarez Del Castillo, Ana Laura; Campaña-Parra, Alfredo; Canul, Efrén; Guerrero Yeo, Gerardo; Mendoza-Ruiz, Juan Jorge; Pérez Cristóbal, Mario; Sicsik, Sandra; Silva Luna, Karina
This study aimed to test the reliability of ultrasound for grading synovitis in static and video images, evaluating grayscale and power Doppler (PD) separately and combined. Thirteen trained rheumatologist ultrasonographers participated in two separate rounds, reading 42 images, 15 static and 27 videos, of the 7-joint count [wrist, 2nd and 3rd metacarpophalangeal (MCP), 2nd and 3rd interphalangeal (IPP), 2nd and 5th metatarsophalangeal (MTP) joints]. The images were from six patients with rheumatoid arthritis and were acquired by one ultrasonographer. Synovitis was defined according to OMERACT. The scoring systems, grayscale and PD separately, and combined (GLOESS, Global OMERACT-EULAR Score System), were reviewed before the exercise. Intra- and inter-reading reliability was calculated with Cohen's weighted kappa, interpreted according to Landis and Koch. Kappa values for inter-reading were good to excellent; the lowest kappa was for GLOESS in static images and the highest for the same scoring in videos (k 0.59 and 0.85, respectively). Excellent values were obtained for static PD in the 5th MTP joint and for PD video in the 2nd MTP joint. Results for GLOESS in general were good to moderate. Poor agreement was observed in the 3rd MCP and 3rd IPP in all kinds of images. Intra-reading agreement was greater in grayscale and GLOESS for static images than for videos (k 0.86 vs. 0.77 and k 0.86 vs. 0.71, respectively), but PD agreement was greater in videos than in static images (k 1.0 vs. 0.79). The reliability of synovitis scoring through static images and videos is in general good to moderate when using grayscale and PD separately or combined.
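Cohen's weighted kappa, the agreement statistic used in this study, penalizes disagreements in proportion to their distance on the ordinal grading scale. A self-contained sketch with hypothetical synovitis grades (0-3); the function and example data are illustrative, not the study's:

```python
from collections import Counter

def weighted_kappa(rater1, rater2, categories, weights="linear"):
    """Cohen's weighted kappa for ordinal scores: 1 - (sum w*O) / (sum w*E),
    where O and E are observed and chance-expected proportion matrices and
    w penalizes disagreement by category distance (linear or quadratic)."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    obs = [[0.0] * k for _ in range(k)]       # observed joint proportions
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1.0 / n
    p1, p2 = Counter(rater1), Counter(rater2)  # marginal counts
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            d = abs(i - j)
            w = d if weights == "linear" else d * d
            exp = (p1[categories[i]] / n) * (p2[categories[j]] / n)
            num += w * obs[i][j]
            den += w * exp
    return 1.0 - num / den if den else 1.0

# Hypothetical grades (0-3) from two readings of the same images:
r1 = [0, 1, 2, 3, 2, 1, 0, 2, 3, 1]
r2 = [0, 1, 2, 2, 2, 1, 1, 2, 3, 1]
print(f"linear-weighted kappa = {weighted_kappa(r1, r2, [0, 1, 2, 3]):.2f}")
```

Perfect agreement yields kappa = 1.0; values around 0.6-0.8 correspond to the "good" band of the Landis and Koch interpretation cited in the abstract.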
Agache, I; Doros, I C; Leru, P M; Bucur, I; Poenaru, M; Sarafoleanu, C
Allergic Rhinitis and its Impact on Asthma (ARIA) and the European Union (EU) recommend a shift in guiding allergic rhinitis (AR) treatment decisions from symptom severity to disease control, using a simple visual analogue scale (VAS). Using this VAS we assessed, in a real-life study in Romania, the effectiveness of MP-AzeFlu nasal spray. In this multi-centre, prospective, non-interventional study, 253 patients (over 11 years old) with moderate-to-severe AR were prescribed MP-AzeFlu and assessed their symptoms on a VAS (from 0 mm (not at all bothersome) to 100 mm (very bothersome)) on Days 0, 1, 3, 7 and 14. The proportion of patients who achieved a defined VAS score cut-off for well-controlled (38 mm) AR was also calculated. Patients' perception of disease control was assessed on Day 3. MP-AzeFlu use was associated with a mean (standard deviation) VAS score reduction from 78.4 (15.1) mm at baseline to 14.7 (15.1) mm on the last day. Effectiveness was consistent irrespective of disease severity, phenotype or patient age. 83.4% of patients achieved the below-39 mm well-controlled VAS score cut-off by the last day, and 95.2% considered their symptoms to be well or partly controlled at Day 3. MP-AzeFlu provided rapid, effective and sustained AR symptom control in a real-life setting in Romania, irrespective of severity, phenotype or patient age, aligning with ARIA and EU recommendations and supporting the position of MP-AzeFlu as a drug of choice for the treatment of moderate-to-severe AR.
Kurfirst, Vojtech; Mokrácek, Aleš; Canádyová, Júlia; Frána, Radim; Zeman, Petr
Occlusion of the left atrial appendage (LAA) has become an integral and important part of the surgical treatment of atrial fibrillation. Different methods of surgical occlusion of the LAA have been associated with varying levels of short- and long-term success for closure. The purpose of this study was to evaluate long-term results of epicardial placement and endocardial occlusion in patients undergoing cardiac operative procedures. A total of 101 patients (average age 65.7 years) undergoing cardiac operative procedures with the epicardial AtriClip Exclusion System of the LAA were enrolled in the study. The AtriClip was placed via a sternotomy or a thoracotomy or from a thoracoscopic approach. Postoperative variables, such as thromboembolic events, clip stability and endocardial leakage around the device, were examined by transoesophageal echocardiography (TEE) and/or computed tomography. Perioperative clip implantation was achieved in 98% of patients. TEE and/or computed tomography conducted during the follow-up period, comprising 1873 patient-months with a mean duration of 18 ± 11 months, revealed no clip migration, no leakage around the device and no clot formation near the remnant cul-de-sac. During the follow-up period, 4 of the cardiac patients experienced transitory ischaemic attacks, whereas no patient experienced a cerebrovascular attack. The Epicardial AtriClip Exclusion System of the LAA appears to be a feasible and safe operative method with a high success rate. Long-term follow-up confirmed clip stability, complete occlusion of the LAA and absence of any atrial fibrillation-related thromboembolic events. These results need to be confirmed by a larger, multicentre study. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
Moreira-Gonçalves, Daniel; Henriques-Coelho, Tiago; Fonseca, Hélder; Ferreira, Rita; Padrão, Ana Isabel; Santa, Cátia; Vieira, Sara; Silva, Ana Filipa; Amado, Francisco; Leite-Moreira, Adelino; Duarte, José Alberto
The present study aimed to test whether a chronic intermittent workload could induce an adaptive cardiac phenotype. Chronic intermittent workload induced features of adaptive hypertrophy. This was paralleled by protection against acute pressure overload insult. The heart may adapt favourably to balanced demands, regardless of the nature of the stimuli. The present study aimed to test whether submitting the healthy heart to intermittent and tolerable amounts of workload, independently of its nature, could result in an adaptive cardiac phenotype. Male Wistar rats were subjected to treadmill running (Ex) (n = 20), intermittent cardiac overload with dobutamine (ITO) (2 mg kg(-1), s.c.; n = 20) or placebo administration (Cont) (n = 20) for 5 days week(-1) for 8 weeks. Animals were then killed for histological and biochemical analysis or subjected to left ventricular haemodynamic evaluation under baseline conditions, in response to isovolumetric contractions and to sustained LV acute pressure overload (35% increase in peak systolic pressure maintained for 2 h). Baseline cardiac function was enhanced only in Ex, whereas the response to isovolumetric heartbeats was improved in both ITO and Ex. By contrast to the Cont group, in which rats developed diastolic dysfunction with sustained acute pressure overload, ITO and Ex showed increased tolerance to this stress test. Both ITO and Ex developed cardiomyocyte hypertrophy without fibrosis, no overexpression of osteopontin-1 or β-myosin heavy chain, and increased expression of sarcoplasmic reticulum Ca(2+) protein. Regarding hypertrophic pathways, ITO and Ex showed activation of the protein kinase B/mammalian target of rapamycin pathway but not calcineurin. Mitochondrial complex IV and V activities were also increased in ITO and Ex. Chronic submission to controlled intermittent cardiac overload, independently of its nature, results in an adaptive cardiac phenotype. Features of the cardiac overload, such as the duration and
Chung, Jae Eun
Increasing numbers of people have turned to the Internet for health information. Little has been done beyond speculation to empirically investigate patients' discussion of online health information with health care professionals (HCPs) and patients' perception of HCPs' reactions to such discussion. The author analyzed data from the 2007 Health Information National Trends Survey (HINTS) to identify the characteristics of patients (a) who search for health information on the Internet, (b) who discuss the information found on the Internet with HCPs, and (c) who positively assess HCPs' reaction to the online information. Findings show that men were more likely than women to have a conversation about online information with HCPs. It is unfortunate that patients who had trouble understanding or trusting online health information were no more likely to ask questions of or seek guidance from HCPs. Reactions of HCPs to online information were perceived as particularly negative by certain groups of patients, such as those who experienced poor health and those who had more concerns about the quality of the information they found. Results are discussed for their implications for patient empowerment and patient-HCP relationships.
Shershakov, V.M.; Zukov, G.P.; Kosykh, V.S.
Full text: Radar support to systems of automated radiation monitoring requires dealing with determination of geometric characteristics of air release of radionuclides. For doing this, an air release can be labeled by chaff propagating in the air similarly to particles of radioactive substance. Then, a chaff suspension can be treated as a spatially distributed radar target and thus be detected by a radar. For a number of years the Science and Production Association 'Typhoon' of Roshydromet, Obninsk has been developing a radar tracer system (RTS) for meteorological support of modeling hazardous technological releases. In September–December 2002 experiments were conducted to test the RTS in the field. This presentation contains preliminary results of testing this system. A total of 9 experiments pursuing different goals were carried out. Of them 6 experiments were conducted approximately 6 km south-west of Obninsk in the vicinity of the village of Potresovo. The first three experiments were aimed at working out interaction between the MR and LDU and assessing the chaff cloud observation distance. In doing this, radar information was not transmitted from the MR to the CCS. In the last three experiments radar information was transmitted to the CCS by cell communication lines using telephones Siemens S35 with built-in modems. The CCS was deployed in building 4/25 of SPA 'Typhoon'. All information received in the CCS was put on a map. Three experiments were conducted in the area of the Kursk NPP as part of preparations for training exercises near the village of Makarovka about 7 km north-west of the city of Kurchatov. In the first two experiments radar information from the MR was passed by cell communication channels to the CCS deployed in the laboratory of external radiation monitoring of the Kursk nuclear power plant. Experiment 3 was a demonstration and arranged during the emergency response exercises at the Kursk NPP. The MR was based on the site of the external
Cao, Yu; Wirth, Gilson
This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective in enhancing the reliability of various circuit units. The authors provide readers with techniques for state-of-the-art and future technologies, ranging from technology modeling, fault detection and analysis, and circuit hardening, to reliability management. Provides a comprehensive review of various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes a thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.
Hu, Yueqin; Nesselroade, John R.; Erbacher, Monica K.; Boker, Steven M.; Burt, S. Alexandra; Keel, Pamela K.; Neale, Michael C.; Sisk, Cheryl L.; Klump, Kelly
Reliability has a long history as one of the key psychometric properties of a test. However, a given test might not measure people equally reliably. Test scores from some individuals may have considerably greater error than others. This study proposed two approaches using intraindividual variation to estimate test reliability for each person. A simulation study suggested that the parallel tests approach and the structural equation modeling approach recovered the simulated reliability coefficients. Then in an empirical study, where forty-five females were measured daily on the Positive and Negative Affect Schedule (PANAS) for 45 consecutive days, separate estimates of reliability were generated for each person. Results showed that reliability estimates of the PANAS varied substantially from person to person. The methods provided in this article apply to tests measuring changeable attributes and require repeated measures across time on each individual. This article also provides a set of parallel forms of PANAS. PMID:28936107
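The parallel-tests idea described in this abstract can be sketched numerically. The sketch below is illustrative only (it uses simulated scores, not PANAS data): it correlates two hypothetical parallel forms, which share each day's true score but carry independent errors, across one person's repeated measurements, so a noisier responder yields a lower per-person reliability estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def person_reliability(true_scores, error_sd):
    """Parallel-tests estimate of one person's reliability:
    correlate two parallel forms (same true scores, independent
    errors) across that person's repeated measurements."""
    form_a = true_scores + rng.normal(0, error_sd, true_scores.size)
    form_b = true_scores + rng.normal(0, error_sd, true_scores.size)
    return np.corrcoef(form_a, form_b)[0, 1]

# 45 simulated daily "true" affect scores for one person,
# measured once with small and once with large response error
days = rng.normal(25, 5, 45)
precise_responder = person_reliability(days, error_sd=2.0)
noisy_responder = person_reliability(days, error_sd=8.0)
print(precise_responder, noisy_responder)
```

With a true between-day variance of 25, the theoretical reliabilities are roughly 0.86 and 0.28, so the two estimates should differ clearly even with only 45 days.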
Mcinroy, John E.; Saridis, George N.
Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.
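The core computation implied here — reliability as the probability that a Gaussian task error stays within a tolerance — can be sketched as follows. The tolerances and error spreads are hypothetical, not values from the paper:

```python
from math import erf, sqrt

def reliability(tolerance, sigma):
    """P(|error| < tolerance) for a zero-mean Gaussian error
    with standard deviation sigma."""
    return erf(tolerance / (sigma * sqrt(2)))

# two hypothetical sensing strategies with different error spreads
coarse = reliability(tolerance=1.0, sigma=0.8)
fine = reliability(tolerance=1.0, sigma=0.3)
print(coarse, fine)
```

Comparing such numbers across candidate control/sensing subsets is one way to pick the cheapest strategy that still meets a required success probability.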
Sherif S. AbdelSalam
The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using the EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulas. Among the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.
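A reliability-based resistance factor of the kind computed in such studies is often obtained from a closed-form FOSM (first-order second-moment) expression for lognormal resistance and load. The sketch below uses that common form; all bias factors and coefficients of variation are illustrative placeholders, not statistics from the EGYPT database:

```python
import math

def fosm_resistance_factor(lam_r, cov_r, beta_t, dl_ratio,
                           gam_d=1.25, gam_l=1.75,
                           lam_qd=1.05, cov_qd=0.1,
                           lam_ql=1.15, cov_ql=0.2):
    """Closed-form FOSM resistance factor for lognormal resistance
    and load; lam_* are bias factors, cov_* coefficients of
    variation, beta_t the target reliability index, dl_ratio the
    dead-to-live load ratio (illustrative values throughout)."""
    cov_q2 = cov_qd ** 2 + cov_ql ** 2
    num = lam_r * (gam_d * dl_ratio + gam_l) * math.sqrt(
        (1 + cov_q2) / (1 + cov_r ** 2))
    den = (lam_qd * dl_ratio + lam_ql) * math.exp(
        beta_t * math.sqrt(math.log((1 + cov_r ** 2) * (1 + cov_q2))))
    return num / den

phi = fosm_resistance_factor(lam_r=1.1, cov_r=0.4, beta_t=2.33, dl_ratio=2.0)
print(round(phi, 2))
```

A lower scatter in the capacity prediction (smaller cov_r), as reported for Davisson's criterion, directly raises the resistance factor this formula returns.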
Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. To this end, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)
Petersen, Kurt Erling
Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
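The variance-reduction idea referred to here — importance sampling to speed up Monte Carlo reliability estimates — can be illustrated with a minimal sketch for a rare failure event. The threshold and shift are made-up numbers; plain Monte Carlo would need millions of samples to see this event at all:

```python
import math
import random

random.seed(2)

def failure_prob_is(threshold, shift, trials=50000):
    """Importance-sampling estimate of P(Z > threshold) for
    Z ~ N(0,1): draw from the shifted density N(shift,1) and
    reweight each hit by the likelihood ratio N(0,1)/N(shift,1)."""
    total = 0.0
    for _ in range(trials):
        z = random.gauss(shift, 1.0)
        if z > threshold:
            total += math.exp(-z * z / 2) / math.exp(-(z - shift) ** 2 / 2)
    return total / trials

# shifting the sampling density onto the failure region makes
# roughly half the draws "hit", instead of ~3 in 100000
est = failure_prob_is(threshold=4.0, shift=4.0, trials=50000)
print(est)  # analytic value P(Z > 4) ≈ 3.17e-5
```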
Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.
Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006, it is expected that tsunamis are likely to occur in the near future as well, due to increased tectonic tensions leading to abrupt vertical seafloor alterations after a century of relative tectonic quiescence. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. For a tsunami impact, the hazard assessment is mostly covered by numerical modelling, because the model results normally offer the most precise database for a hazard analysis, as they include spatially distributed data and their influence on the hydraulic dynamics. Generally a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is mostly chosen by a worst-case approach. Hence the location and magnitude which are likely to occur and which are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable because many tsunami sources can cause an impact at the coast and must be considered. Thus a multi-scenario tsunami modelling approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian Early Warning System many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) at different probable tsunami sources and with different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of
The human factor reliability program was introduced at Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Initiatives of Excellent Performance in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to 3 major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program includes: - Tools to prevent human error; - Managerial observation and coaching; - Human factor analysis; - Quick information about events involving a human factor; - Human reliability timeline and performance indicators; - Basic, periodic and extraordinary training in human factor reliability. (authors)
Dougherty, E.M.; Fragola, J.R.
The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis, and includes examples of the application of the systems approach.
Fosgerau, Mogens; Karlström, Anders
We derive the value of reliability in the scheduling of an activity of random duration, such as travel under congested conditions. Using a simple formulation of scheduling utility, we show that the maximal expected utility is linear in the mean and standard deviation of trip duration, regardless of the form of the standardised distribution of trip durations. This insight provides a unification of the scheduling model and models that include the standard deviation of trip duration directly as an argument in the cost or utility function. The results generalise approximately to the case where the mean...
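The linearity result described in this abstract can be stated compactly. The sketch below is a hedged paraphrase with assumed notation (U, d, α, β, γ are not taken from the paper):

```latex
% Trip duration T = \mu + \sigma X, where X is a standardised random
% variable whose distribution does not depend on \mu or \sigma.
% Maximising expected scheduling utility over the departure time d
% then yields a value that is linear in \mu and \sigma:
\max_{d}\; \mathbb{E}\!\left[U(d,\,\mu + \sigma X)\right]
  \;=\; \alpha\,\mu \;+\; \beta\,\sigma \;+\; \gamma ,
% where \beta depends only on the standardised distribution of X.
% The standard deviation thus enters the (dis)utility of travel
% directly, unifying scheduling and mean-dispersion models.
```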
Bloch, Heinz P
This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.
Helen Nothnagel,1,2,* Christian Puta,1,3,* Thomas Lehmann,4 Philipp Baumbach,5 Martha B Menard,6,7 Brunhild Gabriel,1 Holger H W Gabriel,1 Thomas Weiss,8 Frauke Musial2 1Department of Sports Medicine and Health Promotion, Friedrich Schiller University, Jena, Germany; 2Department of Community Medicine, National Research Center in Complementary and Alternative Medicine, UiT, The Arctic University of Norway, Tromsø, Norway; 3Center for Interdisciplinary Prevention of Diseases Related to Professional Activities, 4Department of Medical Statistics, Computer Sciences and Documentation, Friedrich Schiller University, 5Department of Anesthesiology and Intensive Care Medicine, University Hospital Jena, Germany; 6Crocker Institute, Kiawah Island, SC, 7School of Integrative Medicine and Health Sciences, Saybrook University, Oakland, CA, USA; 8Department of Biological and Clinical Psychology, Friedrich Schiller University, Jena, Germany *These authors contributed equally to this work Background: Quantitative sensory testing (QST) is a diagnostic tool for the assessment of the somatosensory system. To establish QST as an outcome measure for clinical trials, the question of how similar the measurements are over time is crucial. Therefore, long-term reliability and limits of agreement of the standardized QST protocol of the German Research Network on Neuropathic Pain were tested. Methods: QST on the lower back and hand dorsum (dominant hand) was assessed twice in 22 healthy volunteers (10 males and 12 females; mean age: 46.6±13.0 years), with sessions separated by 10.0±2.9 weeks. All measurements were performed by one investigator. To investigate long-term reliability and agreement of QST, differences between the two measurements, correlation coefficients, intraclass correlation coefficients (ICCs), Bland–Altman plots (limits of agreement), and standard error of measurement were used. Results: Most parameters of the QST were reliable over 10 weeks in
This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be
Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.
Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.
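A minimal instance of the basic reliability theory mentioned above — the constant-failure-rate (exponential) law applied to a series system — might look like this; the component failure rates are made-up illustrative values:

```python
import math

def series_reliability(lambdas, hours):
    """Reliability of a series system of independent components with
    constant failure rates (exponential law): the system survives
    only if every component does, so R(t) = exp(-sum(lambda_i) * t)."""
    return math.exp(-sum(lambdas) * hours)

# three hypothetical components, failure rates in failures per hour
rates = [2e-6, 5e-6, 1e-6]
print(round(series_reliability(rates, hours=10_000), 4))
```

The same exponential model underlies failure-rate budgets: summing the component rates gives the system rate, and controlling or eliminating the dominant contributors is what raises reliability.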
Roca, Jose L.
Reliability techniques have been developed over time in response to the needs of the various engineering disciplines; nevertheless, not a few believe that much work was done on reliability before the word was used in its current sense. The military, space and nuclear industries were the first to become involved in this topic, yet this small great revolution in the improvement of product reliability figures has not been confined to those environments, but has extended to industry as a whole. Mass production, characteristic of modern industry, led four decades ago to a fall in the reliability of its products: on the one hand because of mass production itself, and on the other because of recently introduced and not yet stabilized industrial techniques. Industry had to change in response to those two new requirements, creating products of medium complexity and assuring a reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. With this philosophy in view, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, providing a unifying scientific basis for the entire subject. It consists of eight chapters plus numerous statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience, as a textbook for academic as well as industrial courses. (author)
Kottner, Jan; Audigé, Laurent; Brorson, Stig
Results of reliability and agreement studies are intended to provide information about the amount of error inherent in any diagnosis, score, or measurement. The level of reliability and agreement among users of scales, instruments, or classifications is widely unknown. Therefore, there is a need ......, standards, or guidelines for reporting reliability and agreement in the health care and medical field are lacking. The objective was to develop guidelines for reporting reliability and agreement studies....
Mohanta, Dusmanta Kumar; Sadhu, Pradip Kumar; Chakrabarti, R.
This paper presents a comparison of results for optimization of captive power plant maintenance scheduling using genetic algorithm (GA) as well as hybrid GA/simulated annealing (SA) techniques. As utilities catered for by captive power plants are very sensitive to power failure, both deterministic and stochastic reliability objective functions have been considered to incorporate statutory safety regulations for maintenance of boilers, turbines and generators. A significant contribution of this paper is to incorporate the stochastic features of generating units and of load using the levelized risk method. Another significant contribution is to evaluate a confidence interval for the loss of load probability (LOLP), because some variations from the optimum schedule are anticipated while executing maintenance schedules due to different real-life unforeseen exigencies. Such exigencies are incorporated in terms of near-optimum schedules obtained from the hybrid GA/SA technique during the final stages of convergence. Case studies corroborate that the same optimum schedules are obtained using GA and hybrid GA/SA for the respective deterministic and stochastic formulations. The comparison of results in terms of the confidence interval for LOLP indicates that the levelized risk method adequately incorporates the stochastic nature of the power system as compared with the levelized reserve method. Also, the confidence interval for LOLP denotes the possible risk in a quantified manner, and it is of immense use from the perspective of captive power plants intended for quality power
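The idea of attaching a confidence interval to LOLP can be sketched with a toy Monte Carlo model. Everything here is hypothetical (identical units, a fixed load, a simple binomial confidence interval); the paper's levelized risk method is not reproduced:

```python
import random

random.seed(1)

def lolp_estimate(n_units, cap, forced_outage_rate, load, trials=20_000):
    """Monte Carlo loss-of-load probability for n identical units,
    each independently unavailable with the given forced outage rate,
    plus a normal-approximation 95% confidence interval."""
    losses = 0
    for _ in range(trials):
        available = sum(cap for _ in range(n_units)
                        if random.random() > forced_outage_rate)
        if available < load:
            losses += 1
    p = losses / trials
    half = 1.96 * (p * (1 - p) / trials) ** 0.5
    return p, (p - half, p + half)

# 10 units of 50 MW, 5% forced outage rate, 420 MW load:
# load is lost whenever two or more units are down
p, (lo, hi) = lolp_estimate(n_units=10, cap=50,
                            forced_outage_rate=0.05, load=420)
print(p, lo, hi)
```

The interval width shows directly how much residual risk remains quantifiable when a schedule deviates from the optimum.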
The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large scale structures. Structural loading variations over the lifetime of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions which enter this problem are considered. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)
Bøyesen, Pernille; McQueen, Fiona M; Gandjbakhch, Frédérique
The aim of this multireader exercise was to assess the reliability and sensitivity to change of the psoriatic arthritis magnetic resonance imaging score (PsAMRIS) in PsA patients followed for 1 year.
Loeffler, H.; Cester, F.; Sonnenkalb, M.; Klein-Hessling, W.; Voggenberger, T.
Decision support systems such as RODOS aim to support the responsible authorities by providing estimates of the possible radiological consequences in case of an event in a nuclear plant. The prognosis of quantity, composition and time of occurrence of a release from the plant ("source term") in the so-called pre-release phase is one of the foundations with high relevance for this purpose. Within previous projects source term prognosis tools have been developed and applied exemplarily for a PWR. At the end of 2005 GRS finalized a PSA level 2 for a plant of the SWR-69 type. On this basis improved versions of the source term prognosis tools QPRO (probabilistic) and ASTRID (deterministic) have been created for a BWR and tested in an emergency exercise in a BWR. The further development of QPRO has been related in particular to the structure of the probabilistic network and the precalculated source terms. The activities for the adaptation of ASTRID focus on the creation of the dataset for the BWR coolant loop and the containment. In the emergency exercise the manageability of QPRO and also of ASTRID has been proven. Further, the first phases of the accident progression have been well identified. However, the exercise scenario developed into a very unlikely sequence with partial core melt, and the reactor building ventilation was shut off just at a critical moment. Therefore the source term prognoses deviate from the exercise scenario. Starting from these experiences with the development and application of QPRO and ASTRID, recommendations are given for the further improvement of the reliability of the source term prognosis for RODOS. In general it can be stated that the development status of QPRO and ASTRID is definitely advanced compared to the presently still prevailing source term prognosis methods. Therefore it is recommended to develop plant specific versions of these codes and to apply them.
[Extraction fragments from a report: a figure list ("…inverters connected in a chain", "Figure 3: Typical graph showing frequency versus square root of…") and body text describing an experimental reliability-estimating methodology intended to illuminate the lifetime reliability of advanced devices and circuits, i.e. an accurate estimate of device lifetime and hence its FIT rate.]
Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl
The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...
Soares, Fabio L.; Campelo, Divanilson R.; Yan, Ying
This paper provides an overview of in-vehicle communication networks and addresses the challenges of providing reliability in automotive Ethernet in particular.
Ertl-Wagner, Birgit B.; Blume, Jeffrey D.; Herman, Benjamin; Peck, Donald; Udupa, Jayaram K.; Levering, Anthony; Schmalfuss, Ilona M.
Reliable assessment of tumor growth in malignant glioma poses a common problem both clinically and when studying novel therapeutic agents. We aimed to evaluate two software systems in their ability to estimate volume change of tumor and/or edema on magnetic resonance (MR) images of malignant gliomas. Twenty patients with malignant glioma were included from different sites. Serial post-operative MR images were assessed with two software systems representative of the two fundamental segmentation methods, single-image fuzzy analysis (3DVIEWNIX-TV) and multi-spectral-image analysis (Eigentool), and with a manual method by 16 independent readers (eight MR-certified technologists, four neuroradiology fellows, four neuroradiologists). Enhancing tumor volume and tumor volume plus edema were assessed independently by each reader. Intraclass correlation coefficients (ICCs), variance components, and prediction intervals were estimated. There were no significant differences in the average tumor volume change over time between the software systems (p > 0.05). Both software systems were much more reliable and yielded smaller prediction intervals than manual measurements. No significant differences were observed between the volume changes determined by fellows/neuroradiologists or technologists. Semi-automated software systems are reliable tools to serve as outcome parameters in clinical studies and the basis for therapeutic decision-making for malignant gliomas, whereas manual measurements are less reliable and should not be the basis for clinical or research outcome studies. (orig.)
Reliability assessment of AOSpine thoracolumbar spine injury classification system and Thoracolumbar Injury Classification and Severity Score (TLICS) for thoracolumbar spine injuries: results of a multicentre study.
Kaul, Rahul; Chhabra, Harvinder Singh; Vaccaro, Alexander R; Abel, Rainer; Tuli, Sagun; Shetty, Ajoy Prasad; Das, Kali Dutta; Mohapatra, Bibhudendu; Nanda, Ankur; Sangondimath, Gururaj M; Bansal, Murari Lal; Patel, Nishit
The aim of this multicentre study was to determine whether the recently introduced AOSpine Classification and Injury Severity System has better interrater and intrarater reliability than the already existing Thoracolumbar Injury Classification and Severity Score (TLICS) for thoracolumbar spine injuries. Clinical and radiological data of 50 consecutive patients admitted at a single centre with a diagnosis of an acute traumatic thoracolumbar spine injury were distributed to eleven attending spine surgeons from six different institutions in the form of a PowerPoint presentation, who classified them according to both classifications. After a time span of 6 weeks, the cases were randomly rearranged and sent again to the same surgeons for re-classification. Interobserver and intraobserver reliability for each component of TLICS and the new AOSpine classification were evaluated using the Fleiss kappa coefficient (k value) and Spearman rank order correlation. Moderate interrater and intrarater reliability was seen for grading fracture type and integrity of the posterior ligamentous complex (fracture type: k = 0.43 ± 0.01 and 0.59 ± 0.16, respectively; PLC: k = 0.47 ± 0.01 and 0.55 ± 0.15, respectively), and fair to moderate reliability (k = 0.29 ± 0.01 interobserver and 0.44 ± 0.10 intraobserver, respectively) for the total score according to TLICS. Moderate interrater (k = 0.59 ± 0.01) and substantial intrarater reliability (k = 0.68 ± 0.13) was seen for grading fracture type regardless of subtype according to the AOSpine classification. Near perfect interrater and intrarater agreement was seen concerning neurological status for both classification systems. The recently proposed AOSpine classification has better reliability for identifying fracture morphology than the existing TLICS. Additional studies are clearly necessary concerning the application of these classification systems across multiple physicians at different levels of training and trauma centers to evaluate not
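Fleiss' kappa, the interobserver statistic used above, can be computed from a subjects-by-categories table of rater counts. A self-contained sketch with a toy table (not the study's data):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a table where counts[i][j] is the number of
    raters assigning subject i to category j, with the same number
    of raters for every subject."""
    n_sub = len(counts)
    n_rat = sum(counts[0])
    # observed per-subject agreement: pairs of raters that agree
    p_i = [(sum(c * c for c in row) - n_rat) / (n_rat * (n_rat - 1))
           for row in counts]
    p_bar = sum(p_i) / n_sub
    # chance agreement from the category marginals
    p_j = [sum(row[j] for row in counts) / (n_sub * n_rat)
           for j in range(len(counts[0]))]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

# toy example: 4 subjects, 3 raters, 2 categories
table = [[3, 0], [0, 3], [2, 1], [3, 0]]
print(round(fleiss_kappa(table), 3))  # → 0.625
```

On the usual verbal scale, values near 0.4-0.6 read as "moderate" and 0.61-0.80 as "substantial", which is how the k values in the abstract are interpreted.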
RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report not only states the facts of the Significant System Events (ESS), but also underlines the main elements bearing on the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)
Introduction: The paper focuses on selected aspects of the cooperation between logistics service providers and their customers, and presents the results of a comparative analysis of the importance assigned to the variables determining: the scope and nature of that cooperation, the quality of the providers' sales offer, and changes in their customer service policy. Methods: To analyse the underlying problem, direct research was conducted: a questionnaire-based survey among 50 logistics service providers and 50 shippers. The sample was purposive. In the statistical analysis, the chi-square independence test, the Mann-Whitney U test, and Cramer's V and Spearman's rho correlation coefficients were used. Results: Significant statistical differences were observed between the analysed groups in how the cooperation is perceived. The most vital discrepancies relate to the customers' degree of satisfaction and the assessment of the influence of the providers' prices and competencies on the cooperation. For customers declaring a higher degree of satisfaction with the cooperation, service quality was the most important factor; for the service providers, price was the most important one. Moreover, some differences in the answers related to changes in the service were observed, mainly with reference to: logistics capacity, out-of-loss shipments and communication. Conclusions: The group of customers revealed themselves to be little demanding about logistics service. They tended to order mainly routine services that did not demand special skills from the service providers. This is the most probable reason why customers/providers preferred cooperation with a greater number of entities. The customers, unlike the service providers, also did not feel the need to develop more advanced forms of cooperation. Moreover, the observed differences related to the importance hierarchy of the cooperation determinants as well as service standards
Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.
The main objective of the report is to improve failure data for reliability calculations performed as part of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as information provided by the operation and maintenance staff of each plant. The report presents charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)
Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas.
Macaulay Ann C
Abstract Background Aboriginal peoples globally, and First Nations peoples in Canada particularly, suffer from high rates of type 2 diabetes and related complications compared with the general population. Research into the unique barriers faced by healthcare providers working in on-reserve First Nations communities is essential for developing effective quality improvement strategies. Methods In Phase I of this two-phase study, semi-structured interviews and focus groups were held with 24 healthcare providers in the Sioux Lookout Zone in north-western Ontario. A follow-up survey was conducted in Phase II as part of a larger project, the Canadian First Nations Diabetes Clinical Management and Epidemiologic (CIRCLE) study. The survey was completed with 244 healthcare providers in 19 First Nations communities in 7 Canadian provinces, representing three isolation levels (isolated, semi-isolated, non-isolated). Interviews, focus groups and survey questions all related to barriers to providing optimal diabetes care in First Nations communities. Results The key factors emerging from interviews and focus-group discussions were at the patient, provider, and systemic levels. Survey results indicated that, across the three isolation levels, healthcare providers perceived patient factors as having the largest impact on diabetes care. However, physicians and nurses were more likely than community health representatives (CHRs) to rank patient factors as having a large impact on care, and physicians were significantly less likely than CHRs to rank patient-provider communication as having a large impact. Conclusions Addressing patient factors was considered the highest-impact strategy for improving diabetes care. While this may reflect "patient blaming," it also suggests that self-management strategies may be well suited to this context. Program planning should focus on training programs for CHRs, who provide a unique link between patients and clinical services.
Autenrieth, Daniel A; Brazile, William J; Gilkey, David P; Reynolds, Stephen J; June, Cathy; Sandfort, Del
The Occupational Safety and Health Administration (OSHA) On-Site Consultation Service provides assistance in establishing occupational health and safety management systems (OHSMS) to small businesses. The Safety and Health Program Assessment Worksheet (Revised OSHA Form 33) is the instrument used by consultants to assess an organization's OHSMS and provide feedback on how to improve a system. A survey was developed to determine the usefulness of the Revised OSHA Form 33 from the perspective of Colorado OSHA consultation clients. One hundred and seven clients who had received consultation services within a six-year period responded to the survey. The vast majority of respondents indicated that the Revised OSHA Form 33 accurately reflected their OHSMS and that information provided on the Revised OSHA Form 33 was helpful for improving their systems. Specific outcomes reported by the respondents included increased safety awareness, reduced injuries, and improved morale. The results indicate that the OHSMS assistance provided by OSHA consultation is beneficial for clients and that the Revised OSHA Form 33 can be an effective tool for assessing and communicating OHSMS results to business management. Detailed comments and suggestions provided on the Revised OSHA Form 33 are helpful for clients to improve their OHSMS.
The HADES experiment at GSI is the only high-precision experiment probing nuclear matter in the beam energy range of a few AGeV. Pion, proton and ion beams are used to study rare dielectron and strangeness probes to diagnose the properties of strongly interacting matter in this energy regime. Selected results from p + A and A + A collisions are presented and discussed.
Vadaparampil, S. T.; Malo, T.; Cruz, C. D. L.; Christie, J.
BRCA genetic test results provide important information to manage cancer risk for patients and their families. Little is known about the communication of genetic test results, by mutation status, to family members and physicians in the oncology care setting. As part of a longitudinal study evaluating the impact of genetic counseling and testing among recently diagnosed breast cancer patients, we collected patients' self-reported patterns of disclosure. Descriptive statistics characterized the sample and determined the prevalence of disclosure of BRCA test results to family members and physicians. Of 100 patients who completed the baseline and the 6-month follow-up survey, 77 reported pursuing testing. The majority shared test results with female first-degree relatives; fewer shared them with male relatives. Participants were more likely to share results with oncologists than with surgeons, primary care physicians, or other specialty physicians. These findings suggest that while breast cancer patients may communicate results to at-risk female family members and their medical oncologist, they may need education and support to facilitate communication to other first-degree relatives and providers.
Many results in mechanical design are obtained from a model of physical reality and from a numerical solution leading to the evaluation of needs and resources. The goal of reliability analysis is to evaluate the confidence that can be placed in the chosen design through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS).
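The probability of failure for a retained scenario is often estimated by Monte Carlo simulation over the needs/resources (load/resistance) model. A minimal sketch with assumed normal distributions and illustrative parameters, not a model from the source:

```python
# Minimal Monte Carlo sketch of a structural reliability calculation:
# failure occurs when the load effect S exceeds the resistance R.
# Distributions and parameters are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
R = rng.normal(loc=300.0, scale=30.0, size=n)   # resistance (e.g. MPa)
S = rng.normal(loc=200.0, scale=25.0, size=n)   # load effect (e.g. MPa)

pf = np.mean(S > R)                             # estimated failure probability
print(f"estimated P(failure) ~ {pf:.4f}")
```

With these parameters the safety margin R − S is normal with mean 100 and standard deviation √(30² + 25²), so the estimate should land near Φ(−2.56) ≈ 0.005.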
Mod Ali, N
As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated an overall comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. Through these mechanisms, the reliability of the IMS can be assured, promoting safe behaviour at all levels of the workforce utilising ionising radiation facilities. Upgrading the reporting program to a web-based system, e-SSDL, marks a major improvement in the overall reliability of Nuclear Malaysia's IMS. The system is a vital step in providing a user-friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring by the IMS and thus enhances the status of the radiation protection framework of the country.
Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang
Reliability is an important issue affecting each stage of the life cycle, from the birth to the death of a product or system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation; these data may take the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis: it provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds through the various stages of the product life cycle and the associated reliability activities. Reliability data for Systems, Structures and Components (SSCs) in nuclear power plants is a key input to probabilistic safety assessment (PSA), reliability-centred maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis, and industrial engineering, for example to represent manufacturing and delivery times; it is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyse the reliability data of SSCs in nuclear power plants, and an example is given to present the result of the new method. The Weibull distribution fits reliability data for mechanical equipment in nuclear power plants very well and is a widely used mathematical model for reliability analysis. The methods in common use are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it can reflect the reliability characteristics of the equipment and is closer to the actual situation. (author)
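As a rough illustration of the two- versus three-parameter comparison, a Weibull fit can be sketched with SciPy. The failure times below are synthetic, with an assumed minimum-life offset; this is not the paper's data or its improved method:

```python
# Sketch: fitting two- and three-parameter Weibull distributions to
# failure-time data. Data are synthetic (assumed shape, scale, and a
# minimum-life offset), purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical times-to-failure (hours) with a 500 h minimum-life offset
data = 500.0 + rng.weibull(1.8, size=200) * 2000.0

# Two-parameter fit: location (minimum life) fixed at zero
c2, _, scale2 = stats.weibull_min.fit(data, floc=0)

# Three-parameter fit: location also estimated (initial guess supplied)
c3, loc3, scale3 = stats.weibull_min.fit(data, loc=400.0)

for name, (c, loc, scale) in [("2-param", (c2, 0.0, scale2)),
                              ("3-param", (c3, loc3, scale3))]:
    ll = np.sum(stats.weibull_min.logpdf(data, c, loc, scale))
    print(f"{name}: shape={c:.2f} loc={loc:.0f} scale={scale:.0f} "
          f"log-lik={ll:.1f}")
```

The three-parameter form can only improve the likelihood here, since the two-parameter model is the special case loc = 0.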
Woywodt, Alexander; Vythelingum, Kervina; Rayner, Scott; Anderton, John; Ahmed, Aimun
Renal PatientView (RPV) is a novel, web-based system in the UK that provides patients with access to their laboratory results, in conjunction with patient information. To study how renal patients within our centre access and use RPV, we sent out questionnaires in December 2011 to all 651 RPV users under our care. We collected information on aspects such as the frequency and timing of RPV usage, the parameters viewed by users, and the impact of RPV on their care. A total of 295 (45 %) questionnaires were returned. The predominant users of RPV were transplant patients (42 %) followed by pre-dialysis chronic kidney disease patients (37 %). Forty-two percent of RPV users accessed their results after their clinic appointments, 38 % prior to visiting the clinic. The majority of patients (76 %) had used the system to discuss treatment with their renal physician, while 20 % of patients gave permission to other members of their family to use RPV to monitor results on their behalf. Most users (78 %) reported accessing RPV on average 1-5 times/month. Most patients used RPV to monitor their kidney function: 81 % to check creatinine levels, 57 % to check potassium results. Ninety-two percent of patients found RPV easy to use and 93 % felt that overall the system helps them in taking care of their condition; 53 % of patients reported high satisfaction with RPV. Our results provide interesting insight into the use of a system that gives patients web-based access to laboratory results. The fact that 20 % of patients delegate access to relatives also warrants further study. We propose that online access to laboratory results should be offered to all renal patients, although clinicians need to be mindful of the 'digital divide', i.e. the part of the population that is not amenable to IT-based strategies for patient empowerment.
Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen
A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
Dummer, Geoffrey W A; Hiller, N
Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea
Pitigoi, A. E.; Fernandez Ramos, P.
Improving reliability has recently become a very important objective in the field of particle accelerators. The particle accelerators in operation are constantly undergoing modifications, and improvements are implemented using new technologies, more reliable components or redundant schemes (to obtain more reliability, strength, more power, etc.). A reliability model of the SNS (Spallation Neutron Source) linac has been developed and an analysis of the accelerator systems' reliability has been performed within the MAX project, using the Risk Spectrum reliability analysis software. The analysis results have been evaluated by comparison with the SNS operational data. Results and conclusions are presented in this paper, aimed at identifying design weaknesses and providing recommendations for improving the reliability of the MYRRHA linear accelerator. The SNS reliability model developed for the MAX preliminary design phase indicates possible avenues for further investigation that could be needed to improve the reliability of high-power accelerators, in view of the future reliability targets of ADS accelerators.
Park, Kyoung Su
This book introduces reliability, covering: the definition of reliability; reliability requirements; the system life cycle and reliability; reliability and failure rate (overview, reliability characteristics, chance failure, time-varying failure rates, failure modes, and replacement); reliability in engineering design; reliability testing under failure-rate assumptions and the plotting of reliability data; prediction of system reliability; system conservation; and failure analysis, including failure relay and system safety analysis.
Drawing on the author's many years of work experience, this paper first discusses the concepts of reliability testing for electrical automation control equipment, then analyses the test methods used in such testing, and finally, on this basis, discusses how to determine the reliability test method for electrical automation control equipment. The results of this study provide a useful reference for electrical automation control equipment reliab...
Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected consistently from examination to examination, or reproducibility of results is very poor. On the other hand, a defect can be detected on each subsequent examination for higher reliability and still have poor reproducibility of results.
Damásio, Bruno F; Valentini, Felipe; Núñes-Rodriguez, Susana I; Kliem, Soeren; Koller, Sílvia H; Hinz, Andreas; Brähler, Elmar; Finck, Carolyn; Zenger, Markus
This study evaluated cross-cultural measurement invariance of the General Self-efficacy Scale (GSES) in a large Brazilian (N = 2,394) and representative German (N = 2,046) and Colombian (N = 1,500) samples. Initially, multiple-indicators multiple-causes (MIMIC) analyses showed that sex and age were biasing item responses in the total sample (2 and 10 items, respectively). After controlling for these two covariates, a multigroup confirmatory factor analysis (MGCFA) was employed. Configural invariance was attested. However, metric invariance was not supported for five items out of 10, and scalar invariance was not supported for any item. We also evaluated the differences between the latent scores estimated by two models: MIMIC, and MGCFA with the non-equivalent parameters unconstrained across countries. The average difference in the estimated latent scores was |.07|, and 22.8% of the scores were biased by at least .10 standardized points. Bias effects were above the mean for the German group, for which the average difference was |.09| and 33.7% of the scores were biased by at least .10. In summary, the GSES did not provide evidence of measurement invariance sufficient for this cross-cultural study. More than that, our results showed that even when controlling for sex and age effects, the absence of control over item parameters in the MGCFA analyses across countries would result in biased latent-score estimation, with a stronger effect for the German population.
Ahmed, Saeed; Schwarz, Monica; Flick, Robert J; Rees, Chris A; Harawa, Mwelura; Simon, Katie; Robison, Jeff A; Kazembe, Peter N; Kim, Maria H
To assess implementation of provider-initiated testing and counselling (PITC) for HIV in Malawi. A review of PITC practices within 118 departments in 12 Ministry of Health (MoH) facilities across Malawi was conducted. Information on PITC practices was collected via a health facility survey. Data describing patient visits and HIV tests were abstracted from routinely collected programme data. Reported PITC practices were highly variable. Most providers practiced symptom-based PITC. Antenatal clinics and maternity wards reported widespread use of routine opt-out PITC. In 2014, there was approximately 1 HIV test for every 15 clinic visits. HIV status was ascertained in 94.3% (5293/5615) of patients at tuberculosis clinics, 92.6% (30,675/33,142) of patients at antenatal clinics and 49.4% (6871/13,914) of patients at sexually transmitted infection clinics. Reported challenges to delivering PITC included test kit shortages (71/71 providers), insufficient physical space (58/71) and inadequate number of HIV counsellors (32/71) while providers from inpatient units cited the inability to test on weekends. Various models of PITC currently exist at MoH facilities in Malawi. Only antenatal and maternity clinics demonstrated high rates of routine opt-out PITC. The low ratio of facility visits to HIV tests suggests missed opportunities for HIV testing. However, the high proportion of patients at TB and antenatal clinics with known HIV status suggests that routine PITC is feasible. These results underscore the need to develop clear, standardised PITC policy and protocols, and to address obstacles of limited health commodities, infrastructure and human resources. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.
Progress in Methodologies for the Assessment of Passive Safety System Reliability in Advanced Reactors. Results from the Coordinated Research Project on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors
The objective of the CRP was to determine a common method for reliability assessment of passive safety system performance. Such a method would facilitate the application of risk-informed approaches in design optimization and safety qualification of future advanced reactors, thereby contributing to their enhanced safety levels and improved economics. Five Member States participated, representing seven research institutes and organizations in Argentina, France, India, Italy and the Russian Federation. This publication is the outcome of the different tasks performed and the extensive discussions held in the technical meetings, and summarizes the information provided by the technical experts within the CRP over the four-year period.
Morland, E.; Sherry, A.H.
A series of six large-scale experiments have been carried out at AEA Technology using the Spinning Cylinder test facility. Results from two of those experiments (SC-I and SC-II) have been provided to Project FALSIRE and are reviewed in this paper. The Spinning Cylinder tests were carried out using hollow cylinders of 1.4 m outer diameter, 0.2 m wall thickness and 1.3 m length, containing full-length axial defects and fabricated from a modified A508 Class 3 steel. The first Spinning Cylinder test (SC-I) was an investigation of stable ductile growth induced via mechanical (primary) loading under conditions of contained yielding. Mechanical loading was provided in the hoop direction by rotating the cylinder about its major axis within an enclosed oven. The second test (SC-II) investigated stable ductile growth under severe thermal shock (secondary) loading, again under conditions of contained yielding. In this case thermal shock was produced by spraying cold water on the inside surface of the heated cylinder whilst it was rotating. For each experiment, results are presented in terms of a number of variables, e.g. crack growth, temperature, stress, strain and applied K and J. In addition, an overview of the analyses of the FALSIRE Phase-1 report is presented with respect to tests SC-I and SC-II. 4 refs., 14 figs., 13 tabs.
The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology for how to accomplish this, and techniques for deriving the expected product-level reliability of commercial memory products, are provided. Competing-mechanism theory and the multiple failure mechanism model are applied to the experimental results for scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level to several scaled memory products to assess performance degradation and product reliability, and acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft-error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (β) = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm² for several scaled SDRAM generations is
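For temperature-accelerated stress testing of the kind described, a standard Arrhenius acceleration factor is commonly used to relate stress-condition failure rates to use conditions. The sketch below uses an assumed activation energy and temperatures, not parameters derived in the study:

```python
# Sketch of the standard Arrhenius temperature-acceleration factor used
# in accelerated life testing. Activation energy and temperatures are
# assumed for illustration.
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Acceleration factor between use and stress temperatures (Celsius)."""
    t_use = t_use_c + 273.15      # convert to kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# e.g. assumed Ea = 0.7 eV, 55 C use condition vs 125 C stress condition
print(f"AF = {arrhenius_af(0.7, 55.0, 125.0):.1f}")
```

An AF of this size means a week of stress testing represents roughly a year and a half of field operation for that mechanism, which is why acceleration-model parameters matter so much when extrapolating product reliability.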
Jerng, Dong Wook; Ju, Tae Young
There are two methods to determine the reliability criteria for maintenance effectiveness monitoring: using the failure probability, or using importance measures from PRA. Comparison of the results from these two methods provides insight into the relevancy of setting reliability criteria to improve maintenance effectiveness. (author)
Perceptions of methicillin-resistant Staphylococcus aureus and hand hygiene provider training and patient education: results of a mixed method study of health care providers in Department of Veterans Affairs spinal cord injury and disorder units.
Hill, Jennifer N; Hogan, Timothy P; Cameron, Kenzie A; Guihan, Marylou; Goldstein, Barry; Evans, Martin E; Evans, Charlesnika T
The goal of this study was to assess current practices for training of spinal cord injury and disorder (SCI/D) health care workers and education of veterans with SCI/D in Department of Veterans Affairs (VA) spinal cord injury (SCI) centers on methicillin-resistant Staphylococcus aureus (MRSA) prevention. Mixed methods. A Web-based survey was distributed to 673 VA SCI/D providers across 24 SCI centers; 21 acute care facilities and 1 long-term care facility participated. A total of 295 providers responded; 228 had complete data and were included in this analysis. Semistructured interviews were conducted with 30 SCI/D providers across 9 SCI centers. Nurses, physicians, and therapists represented most respondents (92.1%, n = 210); over half (56.6%, n = 129) were nurses. Of the providers, 75.9% (n = 173) reported receiving excellent or good training on how to educate patients about MRSA, and nurses were more likely to report having excellent or good training in this area (P = .005). Despite this, only 63.6% (n = 82) of nurses perceived the education they provide patients on how MRSA is transmitted as excellent or good. Although health care workers reported receiving excellent or good training on MRSA-related topics, this did not translate into excellent or good education for patients, suggesting that health care workers need additional training in educating patients. Population-specific MRSA prevention educational materials may also assist providers in educating patients about MRSA prevention for individuals with SCI/D. Published by Mosby, Inc.
Garrick, B.J.; Kaplan, S.
This paper reviews some of the history and status of nuclear reliability and the evolution of this subject from art towards science. It shows that probability theory is the appropriate and essential mathematical language of this subject. The authors emphasize that it is more useful to view probability not as a 'frequency', i.e., not as the result of a statistical experiment, but rather as a measure of a state of confidence or a state of knowledge. They also show that the probabilistic, quantitative approach has a considerable history of application in the electric power industry in the area of power system planning. Finally, the authors show that the decision-theory notion of utility provides a point of view from which risks, benefits, safety, and reliability can be viewed in a unified way, thus facilitating understanding, comparison, and communication. 29 refs
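Viewing probability as a state of knowledge leads naturally to Bayesian updating of reliability parameters as evidence accumulates. A sketch of a conjugate Beta-Binomial update for a component's failure-on-demand probability, with illustrative prior and evidence counts (not from the paper):

```python
# Sketch: probability as a state of knowledge. A Beta prior on a
# component's failure-on-demand probability is updated with observed
# evidence via the conjugate Beta-Binomial rule. All counts are
# hypothetical.
from scipy import stats

a0, b0 = 1.0, 19.0            # prior state of knowledge: mean 0.05
failures, demands = 2, 100    # hypothetical operating evidence

# Conjugate update: add failures to a, successes to b
a1, b1 = a0 + failures, b0 + (demands - failures)
posterior = stats.beta(a1, b1)

print(f"prior mean            = {a0 / (a0 + b0):.3f}")
print(f"posterior mean        = {posterior.mean():.3f}")
lo, hi = posterior.interval(0.90)
print(f"90% credible interval = ({lo:.3f}, {hi:.3f})")
```

The posterior is the new state of knowledge; no appeal to a long-run frequency experiment is needed, which is exactly the interpretation the authors advocate.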
Delionback, L. M.
The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs, in devising a suitable cost-effective policy.
Singh, A.; Spurgin, A.J.; Martin, T.; Welsch, J.; Hallam, J.W.
OPERAS is a personal-computer (PC) based software package to collect and process simulator data on control-room operators' responses during requalification training scenarios. The data collection scheme is based upon an approach developed earlier during the EPRI Operator Reliability Experiments project. The software allows automated data collection from the simulator, thus minimizing the simulator staff time and resources needed to collect, maintain, and process data that can be useful in monitoring, assessing, and enhancing the progress of crew reliability and effectiveness. The system is designed to provide the data and output information in the form of user-friendly charts, tables, and figures for use by plant staff. OPERAS prototype software has been implemented at the Diablo Canyon (PWR) and Millstone (BWR) plants and is currently being used to collect operator response data. Data collected from the simulator include plant-state variables such as reactor pressure and temperature, malfunctions, times at which annunciators are activated, operator actions, and observations of crew behavior by training staff. The data and systematic analytical results provided by the OPERAS system can help utility probabilistic risk analysis (PRA) and training staff monitor and assess the reliability of their crews more objectively
Callahan, John R.; Montgomery, Todd L.; Whetten, Brian
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages to other group members in a distributed environment, using an underlying IP Multicast (12, 5) medium, even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
Full Text Available Abstract Background Child and infant malnourishment is a significant and growing problem in the developing world. Malnourished children are at high risk for negative health outcomes over their lifespans. Philani, a paraprofessional home visiting program, was developed to improve childhood nourishment. The objective of this study is to evaluate whether the Philani program can rehabilitate malnourished children in a timely manner. Methods Mentor Mothers were trained to conduct home visits. Mentor Mothers went from house to house in assigned neighborhoods, weighed children age 5 and younger, and recruited mother-child dyads where there was an underweight child. Participating dyads were assigned in a 2:1 random sequence to the Philani intervention condition (n = 536) or a control condition (n = 252). Mentor Mothers visited dyads in the intervention condition for one year, supporting mothers' problem-solving around nutrition. All children were weighed by Mentor Mothers at baseline and at three-, six-, nine- and twelve-month follow-ups. Results By three months, children in the intervention condition were five times more likely to rehabilitate (reach a healthy weight for their ages) than children in the control condition. Throughout the course of the study, 43% (n = 233 of 536) of children in the intervention condition were rehabilitated while 31% (n = 78 of 252) of children in the control condition were rehabilitated. Conclusions Paraprofessional Mentor Mothers are an effective strategy for delivering home visiting programs by providing the knowledge and support necessary to change the behavior of families at risk.
Blick, Kenneth E
To develop a fully automated core laboratory, handling samples on a "first in, first out" real-time basis with Lean/Six Sigma management tools. Our primary goal was to provide services to critical care areas, eliminating turnaround time outlier percentage (TAT-OP) as a factor in patient length of stay (LOS). A secondary goal was to achieve a better laboratory return on investment. In 2011, we reached our primary goal when we calculated the TAT-OP distribution and found we had achieved a Six Sigma level of performance, ensuring that our laboratory service can be essentially eliminated as a factor in emergency department patient LOS. We also measured return on investment, showing a productivity improvement of 35%, keeping pace with our increased testing volume. As a result of our Lean process improvements and Six Sigma initiatives, in part through (1) strategic deployment of point-of-care testing and (2) core laboratory total automation with robotics, middleware, and expert system technology, physicians and nurses at the Oklahoma University Medical Center can more effectively deliver lifesaving health care using evidence-based protocols that depend heavily on "on time, every time" laboratory services.
U.S. Department of Health & Human Services — The Hospice Utilization and Payment Public Use File provides information on services provided to Medicare beneficiaries by hospice providers. The Hospice PUF...
This report investigates, through several examples from the field, the reliability of electronic units in a broader sense. That is, it treats not just random parts failure, but also inadequate reliability design and (externally and internally) induced failures. The report is not meant to be merely an indication of the state of the art of the reliability prediction methods we know, but also a contribution to the investigation of man-machine interplay in the operation and repair of electronic equipment. The report firmly links electronics reliability to safety and risk analysis approaches, with a broader, system-oriented view of reliability prediction and with post-failure stress analysis. It is intended to reveal, in a qualitative manner, the existence of symptom and cause patterns. It provides a background for further investigations to identify the detailed mechanisms of the faults and the remedial actions and precautions for achieving cost-effective reliability. (author)
Poucet, A.; Guagnini, E.
This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved
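The logic-model evaluation that such a tool automates can be sketched in a few lines. This is a minimal illustration, not the STARS implementation: it evaluates AND/OR fault-tree gates under an assumption of independent basic events, with hypothetical failure probabilities.

```python
def gate_prob(gate, inputs):
    """Top-event probability for a simple fault-tree gate, assuming
    independent basic events with the given failure probabilities."""
    if gate == "AND":
        prob = 1.0
        for p in inputs:
            prob *= p          # all inputs must fail
        return prob
    if gate == "OR":
        prob = 1.0
        for p in inputs:
            prob *= 1.0 - p    # complement: no input fails
        return 1.0 - prob
    raise ValueError(f"unknown gate type: {gate}")

# Hypothetical tree: TOP = OR(pump_fails, AND(valve_a_fails, valve_b_fails))
top = gate_prob("OR", [0.01, gate_prob("AND", [0.05, 0.05])])
# top = 1 - (1 - 0.01) * (1 - 0.0025) = 0.012475
```

A real tool adds the deductive construction step (deriving the tree from a plant model) on top of this purely numerical evaluation.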
Zhou, Yan; Fu, Liya; Zhang, Jun; Hui, Yongchang
To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic for the complex system and derive the limit distribution of the test statistic. Therefore, we can obtain a confidence interval for reliability and make statistical inferences. Simulation studies also confirm the theoretical results.
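The minimal-path structure underlying this kind of analysis can be sketched numerically. The following is an illustrative computation only, assuming independent subsystems with *known* working probabilities (the paper's contribution is precisely the harder case where the distributions are unknown); it uses inclusion-exclusion over unions of minimal path sets.

```python
from itertools import combinations

def system_reliability(path_sets, p):
    """Reliability of a system given its minimal path sets, assuming
    independent components with working probabilities p[i].
    Computed by inclusion-exclusion over unions of path sets."""
    total = 0.0
    for k in range(1, len(path_sets) + 1):
        for combo in combinations(path_sets, k):
            union = set().union(*combo)
            prob = 1.0
            for i in union:
                prob *= p[i]
            total += (-1) ** (k + 1) * prob
    return total

# Two parallel minimal paths: components {0, 1} in series, or component {2}
p = {0: 0.9, 1: 0.9, 2: 0.8}
r = system_reliability([{0, 1}, {2}], p)
# r = 0.81 + 0.8 - 0.81 * 0.8 = 0.962
```

The inclusion-exclusion sum grows exponentially in the number of path sets, which is one reason statistical approaches such as the empirical likelihood method are attractive for larger systems.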
Doctor, S.R.; Deffenbaugh, J.D.; Good, M.S.; Green, E.R.; Heasler, P.G.; Hutton, P.H.; Reid, L.D.; Simonen, F.A.; Spanner, J.C.; Vo, T.V.
This paper reports on progress for three programs: (1) evaluation and improvement of nondestructive examination (NDE) reliability for inservice inspection of light water reactors (LWR) (NDE Reliability Program); (2) field validation, acceptance, and training for advanced NDE technology; and (3) evaluation of computer-based NDE techniques and regional support of inspection activities. The NDE Reliability Program objectives are to quantify the reliability of inservice inspection techniques for LWR primary system components through independent research and to establish means for obtaining improvements in the reliability of inservice inspections. Significant progress is described concerning ASME Code activities, re-analysis of the PISC-II data, the equipment interaction matrix study, new inspection criteria, and PISC-III. The objectives of the second program are to develop field procedures for the AE and SAFT-UT techniques, perform field validation testing of these techniques, provide training in the techniques for NRC headquarters and regional staff, and work with the ASME Code for the use of these advanced technologies. The final program's objective is to evaluate the reliability and accuracy of interpretation of results from computer-based ultrasonic inservice inspection systems, and to develop guidelines for NRC staff to monitor and evaluate the effectiveness of inservice inspections conducted on nuclear power reactors. This program started in the last quarter of FY89, and its initial work was to prepare a work plan for presentation to and approval by a technical advisory group of NRC staff
Nanhoe, Anita C; Visser, Maartje; Omlo, Jurriaan J; Watzeels, Anita J C M; van den Broek, Ingrid V; Götz, Hannelore M
Chlamydia prevalence in the Netherlands remains high despite targeted efforts. Effective Partner Notification (PN) and Partner Treatment (PT) can interrupt transmission and prevent re-infections. Patient Initiated Partner Treatment (PIPT) may strengthen chlamydia control. This study explores the current practice of PN and PT, and the benefits of, and barriers and facilitators for, PIPT among professionals in sexual health care in the Netherlands. A qualitative study was performed among GPs, GP assistants (GPAs), physicians and nurses working at Sexual Health Clinics (SHC), and key informants on ethnic diversity, using topic lists in focus groups (N = 40) and semi-structured questionnaires in individual interviews (N = 9). Topics included current practices regarding PN and PT, attitudes regarding PIPT, and perceived barriers and facilitators for PIPT. Interviews were taped, transcribed verbatim, and coded using ATLAS.ti. A quantitative online questionnaire on the same topics was sent to all physicians and nurses employed at Dutch SHC (complete response rate 26% (84/321)). The qualitative study showed that all professionals support the need for more attention to PN, and that they saw advantages in PIPT. Mentioned barriers included unwillingness to engage in PN, Dutch legislation, several medical considerations, and inadequate skills of GPs. Also, concerns about limited knowledge of cultural sensitivity around PN and PT were raised. Mentioned facilitators of PIPT were reliable home-based test kits, phone contact between professionals and notified partners, more consultation time for GPs or GPAs, and additional training. The online questionnaire showed that SHC employees agreed that partners should be treated as soon as possible, but also that they were reluctant towards PIPT without counselling and testing. Professionals saw advantages in PIPT, but they also identified barriers hampering the potential introduction of PIPT. Improving PN and counselling skills with specific
Lim, Tae Jin
This book covers reliability engineering, including: quality and reliability; reliability data; the importance of reliability engineering; reliability measures; the Poisson process (goodness-of-fit tests and the Poisson arrival model); reliability estimation (e.g., for the exponential distribution); reliability of systems; availability; preventive maintenance (replacement policies, minimal repair policies, shock models, spares, group maintenance, and periodic inspection); analysis of common cause failures; and models for analyzing repair effects.
Alipour, R.; Nejad, Farokhi A.; Izman, S. [Universiti Teknologi Malaysia, Johor Bahru (Malaysia)]
The application of dual-phase steels (DPS) such as DP600 in thin-walled automotive components is continuously increasing, as vehicle designers use modern steel grades and lightweight structures to improve structural performance, reduce vehicle weight, and enhance crash performance. The cost of broad experimental investigation in this area can be reduced by using finite element analysis (FEA) in structural analysis to replace many of the experiments. Nevertheless, it must be verified that the selected method, including element type and solution methodology, is capable of predicting real conditions. In this paper, numerical and experimental studies are performed to determine the effect of element type selection and solution methodology on the results of finite element analysis, in order to investigate the energy absorption behavior of a DP600 thin-walled structure with three different geometries under low-impact loading. The outcomes indicate that the combination of the implicit method and solid elements is in better agreement with the experiments. In addition, using a combination of shell element types with the implicit method reduces the simulation time remarkably, although the error relative to the experiments increases to some extent.
Sitzman, Thomas J; Allori, Alexander C; Matic, Damir B; Beals, Stephen P; Fisher, David M; Samson, Thomas D; Marcus, Jeffrey R; Tse, Raymond W
Objective Oronasal fistula is an important complication of cleft palate repair that is frequently used to evaluate surgical quality, yet the reliability of fistula classification has never been examined. The objective of this study was to determine the reliability of oronasal fistula classification both within individual surgeons and between multiple surgeons. Design Using intraoral photographs of children with repaired cleft palate, surgeons rated the location of palatal fistulae using the Pittsburgh Fistula Classification System. Intrarater and interrater reliability scores were calculated for each region of the palate. Participants Eight cleft surgeons rated photographs obtained from 29 children. Results Within individual surgeons, reliability for each region of the Pittsburgh classification ranged from moderate to almost perfect (κ = .60-.96). By contrast, reliability between surgeons was lower, ranging from fair to substantial (κ = .23-.70). Between-surgeon reliability was lowest for the junction of the soft and hard palates (κ = .23). Within-surgeon and between-surgeon reliability were almost perfect for the more general classification of fistula in the secondary palate (κ = .95 and κ = .83, respectively). Conclusions This is the first reliability study of fistula classification. We show that the Pittsburgh Fistula Classification System is reliable when used by an individual surgeon, but less reliable when used among multiple surgeons. Comparisons of fistula occurrence among surgeons may be subject to less bias if they use the more general classification of "presence or absence of fistula of the secondary palate" rather than the full Pittsburgh Fistula Classification System.
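The κ values quoted above are chance-corrected agreement coefficients. As a minimal sketch of how a two-rater Cohen's kappa is computed (the study uses its own rating data; the labels and ratings below are hypothetical):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters classifying the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(r1) == len(r2) and len(r1) > 0
    n = len(r1)
    categories = set(r1) | set(r2)
    # Observed proportion of agreement
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Expected agreement by chance, from each rater's marginal frequencies
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (po - pe) / (1 - pe)

# Hypothetical ratings of six photographs by two surgeons
rater1 = ["fistula", "none", "none", "fistula", "none", "none"]
rater2 = ["fistula", "none", "fistula", "fistula", "none", "none"]
kappa = cohens_kappa(rater1, rater2)  # (5/6 - 1/2) / (1 - 1/2) = 2/3
```

A multi-rater study like the one described typically uses a generalization such as Fleiss' kappa, but the chance-correction idea is the same.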
Son, Young Kap
Reliability of an engineering system depends on two reliability metrics: the mechanical reliability, considering component failures, that a functional system topology is maintained, and the performance reliability of adequate system performance in each functional configuration. Component degradation explains not only the component aging processes leading to failure of function, but also system performance change over time. Multiple competing failure modes for systems with degrading components, in terms of both system functionality and system performance, are considered in this paper, under the assumption that system functionality is not independent of system performance. To reduce errors in system reliability prediction, this paper extends system performance reliability prediction methods in the open literature by combining system mechanical reliability, derived from component reliabilities, with system performance reliability. The extended reliability prediction method provides a useful way to compare designs as well as to determine an effective maintenance policy for efficient reliability growth. Application of the method to an electro-mechanical system, as an illustrative example, is explained in detail, and the prediction results are discussed. Both mechanical reliability and performance reliability are compared to total system reliability in terms of reliability prediction errors
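The degradation-based view of performance reliability can be illustrated with a toy Monte Carlo model. This is not the paper's method; it assumes a hypothetical component whose performance degrades linearly at a random rate, and estimates the probability that performance still exceeds a threshold at mission time t.

```python
import random

def performance_reliability(t, threshold, n_trials=100_000, seed=1):
    """Monte Carlo estimate of P(performance(t) > threshold) for a
    component degrading linearly at a random rate.
    Illustrative model: perf(t) = 100 - rate * t, rate ~ N(1.0, 0.2)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_trials):
        rate = rng.gauss(1.0, 0.2)
        if 100.0 - rate * t > threshold:
            ok += 1
    return ok / n_trials

r50 = performance_reliability(t=50.0, threshold=40.0)
r70 = performance_reliability(t=70.0, threshold=40.0)
# Reliability decreases as the mission time grows: r70 < r50
```

In the paper's setting, such a performance reliability curve would then be combined with the mechanical reliability of the system topology, with care taken because the two are not assumed independent.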
Lavkovsky, S.; Kvasha, N.; Kobzev, V.; Sadovnikov, V.; Goltsev, V.
The practice of radioactive waste treatment in the former USSR was that prior to at-sea dumping of objects with spent nuclear fuel (SNF) a set of design and technological measures was undertaken with a view to form packings with additional barriers to prevent radionuclide release in the environment. Based upon the results of most conservative evaluations of the protective barrier corrosion resistance it was concluded, that till Year 2300 there will be no grounds to worry about a possibility of the loss of tightness of the majority of packings. However, should unfavourable external natural factors combine, the loss of sealing of the packing with the screening assembly of the nuclear icebreaker 'Lenin' can occur at any moment. (author)
Etzel, K.T.; Howard, W.W.; Zgliczynski, J.B.
This document discusses thermal performance envelopes which are used to specify steady-state design requirements for the systems of the Modular High Temperature Gas-Cooled Reactor to maximize plant performance reliability with optimized design. The thermal performance envelopes are constructed around the expected operating point accounting for uncertainties in actual plant as-built parameters and plant operation. The components are then designed to perform successfully at all points within the envelope. As a result, plant reliability is maximized by accounting for component thermal performance variation in the design. The design is optimized by providing a means to determine required margins in a disciplined and visible fashion
At present, in oil refineries in Japan, the term of continued operation of oil refining facilities is shorter than in Europe and America because of the regulations on the open inspection period for boilers and hazardous material storage tanks. As a result, the refining cost is comparatively higher than in Europe and America due to the increased inspection/repair cost and decreased operational rate. Therefore, it is becoming important to supply petroleum products effectively by keeping the oil refining facilities of the whole of Japan stable and by prolonging the term of continued operation of oil refining facilities. In this R and D, the technical development needed for the long-term continued operation of oil refining facilities is conducted. The items for the R and D are as follows: assessment technology for the reliability of oil refining high-temperature system facilities; assessment technology for the reliability of piping/storage facilities in oil refineries; assessment technology for the reliability of oil refining power system facilities; and technology for a management support system for oil refining facilities. In this fiscal year, technical surveys, data collection, and construction of the basic concept of the developmental technology were mostly conducted. Also conducted were trial manufacture of various probes for nondestructive inspection, oscillators, etc., basic design of inspection equipment, and trial manufacture of parts of it. The acquired data were analyzed. (NEDO)
Swendeman, Dallas; Farmer, Shu; Mindry, Deborah; Lee, Sung-Jae; Medich, Melissa
In-depth qualitative interviews were conducted with healthcare providers (HCPs) from five HIV medical care coordination teams in a large Los Angeles County HIV clinic, including physicians, nurses, and psychosocial services providers. HCPs reported on the potential utility, acceptability, and barriers for patient self-monitoring and notifications via mobile phones, and web-based dashboards for HCPs. Potential benefits included: 1) enhancing patient engagement, motivation, adherence, and self-management; and 2) improving provider-patient relationships and HCP care coordination. Newly diagnosed and patients with co-morbidities were highest priorities for mobile application support. Facilitators included universal mobile phone ownership and use of smartphones or text messaging. Patient-level barriers included concerns about low motivation and financial instability for consistent use by some patients. Organizational barriers, cited primarily by physicians, included concerns about privacy protections, easy dashboard access, non-integrated electronic records, and competing burdens in limited appointment times. Psychosocial services providers were most supportive of the proposed mobile tools.
Full Text Available Cogeneration and trigeneration plants are widely recognized as promising technologies for increasing energy efficiency in buildings. However, their overall potential is scarcely exploited, due to the difficulties in achieving economic viability and the risk of investment related to uncertainties in future energy loads and prices. Several stochastic optimization models have been proposed in the literature to account for uncertainties, but these instruments share a common reliance on user-defined probability functions for each stochastic parameter. As such functions are hard to predict, this paper proposes an analysis of the influence of erroneous estimation of the uncertain energy loads and prices on the optimal plant design and operation. With reference to a hotel building, a number of realistic scenarios are developed, exploring the most frequent errors occurring in the estimation of energy loads and prices. Then, profit-oriented optimizations are performed for the examined scenarios by means of a deterministic mixed integer linear programming algorithm. From a comparison of the results, it emerges that: (i) plant profitability is prevalently influenced by the average “spark spread” (i.e., the ratio between electricity and fuel prices) and, secondarily, by the shape of the daily price profiles; (ii) the “optimal sizes” of the main components are scarcely influenced by the daily load profiles, while they are more strictly related to the average “power to heat” and “power to cooling” ratios of the building.
Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.
This report describes a methodology for reliability and risk allocation in nuclear power plants. The work investigates the technical feasibility of allocating reliability and risk, which are expressed in a set of global safety criteria and which may not necessarily be rigid, to various reactor systems, subsystems, components, operations, and structures in a consistent manner. The report also provides a general discussion of the problem of reliability and risk allocation. The problem is formulated as a multiattribute decision analysis paradigm. The work mainly addresses the first two steps of a typical decision analysis, i.e., (1) identifying alternatives, and (2) generating information on outcomes of the alternatives, by performing a multiobjective optimization on a PRA model and reliability cost functions. The multiobjective optimization serves as the guiding principle for reliability and risk allocation. The concept of "noninferiority" is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The final step of decision analysis, i.e., assessment of the decision maker's preferences, can then be performed more easily on the noninferior solution set. Some results of applying the methodology to a nontrivial risk model are provided, and several outstanding issues such as generic allocation, preference assessment, and uncertainty are discussed. 29 refs., 44 figs., 39 tabs
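The notion of a noninferior (Pareto-optimal) solution set can be shown in a few lines. As a minimal sketch, with hypothetical (cost, risk) pairs where lower is better in both coordinates, a point is noninferior if no other point is at least as good in both objectives:

```python
def noninferior(points):
    """Return the noninferior (Pareto-optimal) subset of (cost, risk)
    pairs, where lower is better in both coordinates."""
    result = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for q in points
        )
        if not dominated:
            result.append(p)
    return result

# Hypothetical allocation alternatives: (cost, residual risk)
alternatives = [(10, 0.5), (12, 0.3), (15, 0.1), (14, 0.4), (11, 0.6)]
frontier = noninferior(alternatives)
# (14, 0.4) is dominated by (12, 0.3); (11, 0.6) by (10, 0.5)
```

Preference assessment, the final decision-analysis step the report mentions, then only needs to choose among the points on this frontier rather than among all alternatives.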
Didi, Jennifer; Lemée, Ludovic; Gibert, Laure; Pons, Jean-Louis; Pestel-Caron, Martine
Staphylococcus lugdunensis is an emergent virulent coagulase-negative staphylococcus responsible for severe infections similar to those caused by Staphylococcus aureus. To understand its potentially pathogenic capacity and gain further detailed knowledge of the molecular traits of this organism, 93 isolates from various geographic origins were analyzed by multi-virulence-locus sequence typing (MVLST), targeting seven known or putative virulence-associated loci (atlLR2, atlLR3, hlb, isdJ, SLUG_09050, SLUG_16930, and vwbl). The polymorphisms of the putative virulence-associated loci were moderate and comparable to those of the housekeeping genes analyzed by multilocus sequence typing (MLST). However, the MVLST scheme generated 43 virulence types (VTs) compared to 20 sequence types (STs) based on MLST, indicating that MVLST was significantly more discriminating (Simpson's index [D], 0.943). No hypervirulent lineage or cluster specific to carriage strains was defined. The results of multilocus sequence analysis of known and putative virulence-associated loci are consistent with a clonal population structure for S. lugdunensis, suggesting a coevolution of these genes with housekeeping genes. Indeed, the nonsynonymous-to-synonymous substitution (dN/dS) ratio, Tajima's D test, and single-likelihood ancestor counting (SLAC) analysis suggest that all virulence-associated loci were under negative selection, even atlLR2 (AtlL protein) and SLUG_16930 (FbpA homologue), for which the dN/dS ratios were higher. In addition, this analysis of virulence-associated loci allowed us to propose a trilocus sequence typing scheme based on the intragenic regions of atlLR3, isdJ, and SLUG_16930, which is more discriminating than MLST for studying short-term epidemiology and further characterizing the lineages of the rare but highly pathogenic S. lugdunensis. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
This book shows how to build in, evaluate, and demonstrate the reliability and availability of components, equipment, and systems. It presents the state-of-the-art of reliability engineering, both in theory and practice, and is based on the author's more than 30 years of experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The structure of the book allows rapid access to practical results. This final edition extends and replaces all previous editions. New are, in particular, a strategy to mitigate incomplete coverage, a comprehensive introduction to human reliability with design guidelines and new models, and refinements of reliability allocation, design guidelines for maintainability, and concepts related to regenerative stochastic processes. The set of problems for homework has been extended. Methods and tools are given in a way that allows them to be tailored to cover different reliability requirement levels and to be used for safety analysis. Because of the Appendice...
Every year, RTE produces a reliability report for the past year. This report includes a number of results from previous years so that year-to-year comparisons can be drawn and long-term trends analysed. The 2015 report underlines the major factors that have impacted on the reliability of the electrical power system, without focusing exclusively on Significant System Events (ESS). It describes various factors which contribute to present and future reliability and the numerous actions implemented by RTE to ensure reliability today and in the future, as well as the ways in which the various parties involved in the electrical power system interact across the whole European interconnected network
In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability testing began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been highly appreciated internationally. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, considering the period of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome earlier defects. The significance of the reliability tests is to secure functioning until the age limit is reached, to confirm the correct forecast of deterioration processes, to confirm the effectiveness of remedies for defects, and to confirm the accuracy of predicting the behavior of facilities. The reliability of nuclear valves, fuel assemblies, the heat-affected zones in welding, reactor cooling pumps, and electric instruments has been tested or is being tested. (Kako, I.)
Pescatore, C.; Sastre, C.
Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table
Presti, Giovambattista; Cau, Silvia; Oppo, Annalisa; Moderato, Paolo
To increase classroom consumption of home-provided fruits (F) and vegetables (V) in obese, overweight, and normal-weight children. Consumption was evaluated within and across the baseline phase and at the end of the intervention and maintenance phases. Three Italian primary schools. The study involved 672 children (321 male and 329 female) aged 5-11 years. Body mass index measures were available for 461 children. Intervention schools received the Food Dudes (FD) program: 16 days of repeated taste exposure (40 g of F and 40 g of V), video modeling, and rewards-based techniques. The comparison school was only repeatedly exposed to FV. Grams of FV brought from home and eaten. Chi-square, independent t test, repeated-measures ANOVA, and generalized estimating equation model. Intervention schools showed a significant increase in home-provided F (P < .001) and V (P < .001) consumption in both overweight and non-overweight children. Approximately half of the children in the intervention schools ate at least 1 portion of FV at the end of the intervention and maintenance phases. The increase in home-provided FV intake was similar in overweight and non-overweight children in the FD intervention schools compared with the comparison school. The effect of the FD program was higher at the end of the intervention phase than at the end of the maintenance phase. Copyright © 2015 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
McKinnon, Wendy; Naud, Shelly; Ashikaga, Taka; Colletti, Rose; Wood, Marie
Providing medical management updates and long-term support to families with hereditary cancer syndromes in rural areas is a challenge. To address this, we designed a one-day retreat for BRCA1/2 carriers in our region. The retreat included educational updates about medical management, genetic privacy and discrimination, and addressed psychological and family issues. Evaluations completed at the conclusion of the retreat were overwhelmingly positive, with requests for a similar event in the future. The impact of this retreat on a variety of health behaviors was assessed. Eligible participants completed questionnaires before and 6 months after the retreat. Questionnaires focused on lifestyle, cancer screening and prevention practices, psychological history and distress, decision-making regarding genetic testing, and family communication issues. For individuals who completed both the pre- and post-retreat questionnaires, one-half made lifestyle changes and nearly two-thirds increased cancer screening, initiated chemoprevention, or completed or planned to complete preventive surgery in the future. We conclude that this type of forum provides a valuable opportunity for BRCA carriers and their families to receive updated medical information, share personal experiences, provide and receive support, as well as change health behaviors.
BACKGROUND: Turnaround time (TAT) is an important indicator of laboratory performance. It is often difficult to achieve fast TAT for blood tests conducted at clinics in developing countries. This is because clinics where the patient is treated are often far away from the laboratory, and transporting blood samples and test results between the two locations creates significant delay. Recent efforts have sought to mitigate this problem by using the Short Message Service (SMS) to reduce TAT. Studies reporting the impact of this technique have not been published in the scientific literature, however. In this paper we present a study of LabPush, a system developed to test whether SMS delivery of HIV-related laboratory results to clinics could shorten TAT significantly. METHOD: LabPush was implemented in six clinics of the Kingdom of Swaziland. SMS results were sent out from the laboratory as a supplement to normal transport of paper results. Each clinic was equipped with a mobile phone to receive SMS results. The laboratory that processes the blood tests was equipped with a system for digital input of results and transmission of results via SMS to the clinics. RESULTS: Laboratory results were received for 1041 different clinical cases. The total number of SMS records received (1032) was higher than that of paper records (965), indicating a higher loss rate for paper records. A statistical comparison of TAT for SMS and paper reports indicates a statistically significant improvement for SMS. Results were more positive for more rural clinics, and for an urban clinic with a high workload. CONCLUSION: SMS can be used to reduce TAT for blood tests taken at clinics in developing countries. Benefits are likely to be greater at clinics that are further away from laboratories, due to the difficulties this imposes on transport of paper records.
Hatcher, Peter; Shaikh, Shiraz; Fazli, Hassan; Zaidi, Shehla; Riaz, Atif
There is dearth of evidence on provider cost of contracted out services particularly for Maternal and Newborn Health (MNH). The evidence base is weak for policy makers to estimate resources required for scaling up contracting. This paper ascertains provider unit costs and expenditure distribution at contracted out government primary health centers to inform the development of optimal resource envelopes for contracting out MNH services. This is a case study of provider costs of MNH services at two government Rural Health Centers (RHCs) contracted out to a non-governmental organization in Pakistan. It reports on four selected Basic Emergency Obstetrical and Newborn Care (BEmONC) services provided in one RHC and six Comprehensive Emergency Obstetrical and Newborn Care (CEmONC) services in the other. Data were collected using staff interviews and record review to compile resource inputs and service volumes, and analyzed using the CORE Plus tool. Unit costs are based on actual costs of MNH services and are calculated for actual volumes in 2011 and for volumes projected to meet need with optimal resource inputs. The unit costs per service for actual 2011 volumes at the BEmONC RHC were antenatal care (ANC) visit USD$ 18.78, normal delivery US$ 84.61, newborn care US$ 16.86 and a postnatal care (PNC) visit US$ 13.86; and at the CEmONC RHC were ANC visit US$ 45.50, Normal Delivery US$ 148.43, assisted delivery US$ 167.43, C-section US$ 183.34, Newborn Care US$ 41.07, and PNC visit US$ 27.34. The unit costs for the projected volumes needed were lower due to optimal utilization of resources. The percentage distribution of expenditures at both RHCs was largest for salaries of technical staff, followed by salaries of administrative staff, and then operating costs, medicines, medical and diagnostic supplies. The unit costs of MNH services at the two contracted out government rural facilities remain higher than is optimal, primarily due to underutilization. Provider cost analysis
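As a rough illustration of the unit-costing arithmetic behind tools such as CORE Plus, the sketch below divides a service's direct costs plus an allocated share of facility overheads by annual service volume. All figures and the simple time-share allocation rule are hypothetical, not the study's data:

```python
# Minimal unit-costing sketch: allocate facility-level shared costs to a
# service in proportion to its share of staff time, then divide the total
# by annual service volume. All numbers below are hypothetical.
def unit_cost(direct_cost: float, shared_costs: float,
              time_share: float, volume: int) -> float:
    """Unit cost = (direct cost + allocated share of overheads) / volume."""
    return (direct_cost + shared_costs * time_share) / volume

# Hypothetical ANC service: $5,000 direct costs, a 10% share of $80,000
# facility overheads, and 700 visits per year.
print(round(unit_cost(5000, 80000, 0.10, 700), 2))  # → 18.57
```

The paper's observation that projected unit costs fall at higher volumes is visible directly in this formula: the numerator is largely fixed, so cost per service drops as the denominator grows.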
... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.
Lee, Jung Hwan; Lee, Sang-Ho
Epidural steroid injection (ESI) is known to be an effective treatment for neck or radicular pain due to herniated intervertebral disc (HIVD) and spinal stenosis (SS). Although repeat ESI has generally been indicated to provide more pain relief in partial responders after a single ESI, there has been little evidence supporting the usefulness of this procedure. The purpose of this study, therefore, was to determine whether repeat ESI at a prescribed interval of 2 to 3 weeks after the first injection would provide greater clinical benefit in patients with partial pain reduction than intermittent ESI performed only when pain was aggravated. One hundred eighty-four patients who underwent transforaminal ESI (TFESI) for treatment of axial neck and radicular arm pain due to HIVD or SS and could be followed up for 1 year were enrolled. We divided the patients into 2 groups. Group A (N = 108) comprised partial responders (numeric rating scale (NRS) ≥ 3 after the first injection) who underwent repeat injection at a prescribed interval of 2 to 3 weeks after the first injection. Group B (N = 76) comprised partial responders who did not receive repeat injection at the prescribed interval, but received intermittent injections only for aggravation of pain. Various clinical data were assessed, including the total number of injections during 1 year and the interval to reinjection after the repeat injection in Group A or after the first injection in Group B (time to reinjection). Groups A and B were compared in terms of the total population, the HIVD subgroup, and the SS subgroup. In the whole population, HIVD subgroup, and SS subgroup, patients in Group A required significantly fewer injections to obtain satisfactory pain relief during the 1-year follow-up period. Group A showed a significantly longer time to reinjection than Group B did. Repeat TFESI conducted at 2- to 3-week intervals after the first injection in partial responders contributed to greater clinical benefit compared with intermittent TFESI performed only upon pain aggravation.
Experiences and meanings of integration of TCAM (Traditional, Complementary and Alternative Medical) providers in three Indian states: results from a cross-sectional, qualitative implementation research study.
Nambiar, D; Narayan, V V; Josyula, L K; Porter, J D H; Sathyanarayana, T N; Sheikh, K
Efforts to engage Traditional, Complementary and Alternative Medical (TCAM) practitioners in the public health workforce have growing relevance for India's path to universal health coverage. We used an action-centred framework to understand how policy prescriptions related to integration were being implemented in three distinct Indian states. Health departments and district-level primary care facilities in the states of Kerala, Meghalaya and Delhi. In each state, two or three districts were chosen that represented a variation in accessibility and distribution across TCAM providers (eg, small or large proportions of local health practitioners, Homoeopaths, Ayurvedic and/or Unani practitioners). Per district, two blocks or geographical units were selected. TCAM and allopathic practitioners, administrators and representatives of the community at the district and state levels were chosen based on publicly available records from state and municipal authorities. A total of 196 interviews were carried out: 74 in Kerala, and 61 each in Delhi and Meghalaya. We sought to understand experiences and meanings associated with integration across stakeholders, as well as barriers and facilitators to implementing policies related to integration of Traditional, Complementary and Alternative (TCA) providers at the systems level. We found that individual and interpersonal attributes tended to facilitate integration, while system features and processes tended to hinder it. Collegiality, recognition of stature, as well as exercise of individual personal initiative among TCA practitioners and of personal experience of TCAM among allopaths enabled integration. The system, on the other hand, was characterised by the fragmentation of jurisdiction and facilities, intersystem isolation, lack of trust in and awareness of TCA systems, and inadequate infrastructure and resources for TCA service delivery. State-tailored strategies that routinise interaction, reward individual and system
Rebecca M. Skhosana
The objective of this study was to explore and describe the experiences of health care providers managing sexual assault victims in the emergency unit of a community hospital in the Nkangala district in the Mpumalanga Province. A qualitative, phenomenological design was applied. Purposeful sampling was used to select participants from health care providers who were working in the emergency unit and had managed more than four sexual assault victims. Data were collected by means of individual interviews and analysed according to the Tesch method of data analysis by the researcher and an independent co-coder. Main categories, subcategories and themes were identified. Participants expressed their emotions, challenges and police attitudes and behaviours, as well as inconsistencies in guidelines and needs identification. It was recommended that members of the multidisciplinary team engage in community activities and that the community participate in matters pertaining to sexual assault. Government should develop clear guidelines that are applicable to rural and urban South Africa. Health care sciences should aim to train more forensic nurses. All relevant departments should work together to alleviate the complications caused by sexual assault incidents.
Marchand, Kirsten; Palis, Heather; Oviedo-Joekes, Eugenia
Using data from a nationally representative survey, the Canadian Community Health Survey-Mental Health, this secondary analysis aimed to determine the prevalence of perceived prejudice by health care providers (HCPs) and its relationship with mental disorders. Respondents accessing HCPs in the prior year were asked if they experienced HCP prejudice. A hypothesis-driven multivariable logistic regression analysis was conducted to determine the relationship between type of mental disorder and HCP prejudice. Among the 3006 respondents, 10.9 % perceived HCP prejudice, 62.4 % of whom reported a mental disorder. The adjusted odds of prejudice were highest for respondents with anxiety (OR 3.12; 95 % CI 1.60, 6.07), concurrent mood or anxiety and substance disorders (OR 3.08; 95 % CI 1.59, 5.95) and co-occurring mood and anxiety disorders (OR 2.89; 95 % CI 1.68, 4.97), compared to respondents without any mental disorder. These findings are timely for informing discussions regarding policies to address HCP prejudice towards people with mental disorders.
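The adjusted odds ratios above come from a multivariable logistic model; the crude, unadjusted version of the same effect measure can be sketched from a 2x2 table. All counts below are hypothetical, not from the survey:

```python
import math

# Sketch of the effect measure behind the survey analysis: a crude odds
# ratio with a 95% confidence interval from a 2x2 table. A multivariable
# logistic model adjusts this for covariates; the hypothetical counts here
# are illustrative only.
def odds_ratio_ci(a, b, c, d):
    """a/b: cases/non-cases among exposed; c/d: among unexposed."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se)
    return or_, lo, hi

# Hypothetical: 40 of 200 respondents with anxiety perceived prejudice,
# versus 120 of 1920 respondents without a mental disorder.
or_, lo, hi = odds_ratio_ci(40, 160, 120, 1800)
print(f"OR {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

A confidence interval that excludes 1.0, as in the survey's reported estimates, is what supports the claim of an association.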
Every year, RTE produces a reliability report for the past year. This document lays out the main factors that affected the electrical power system's operational reliability in 2016 and the initiatives currently under way intended to ensure its reliability in the future. Within a context of the energy transition, changes to the European interconnected network mean that RTE has to adapt on an on-going basis. These changes include the increase in the share of renewables injecting an intermittent power supply into networks, resulting in a need for flexibility, and a diversification in the numbers of stakeholders operating in the energy sector and changes in the ways in which they behave. These changes are dramatically changing the structure of the power system of tomorrow and the way in which it will operate - particularly the way in which voltage and frequency are controlled, as well as the distribution of flows, the power system's stability, the level of reserves needed to ensure supply-demand balance, network studies, assets' operating and control rules, the tools used and the expertise of operators. The results obtained in 2016 are evidence of a globally satisfactory level of reliability for RTE's operations in somewhat demanding circumstances: more complex supply-demand balance management, cross-border schedules at interconnections indicating operation that is closer to its limits and - most noteworthy - having to manage a cold spell just as several nuclear power plants had been shut down. In a drive to keep pace with the changes expected to occur in these circumstances, RTE implemented numerous initiatives to ensure high levels of reliability: - maintaining investment levels of euro 1.5 billion per year; - increasing cross-zonal capacity at borders with our neighbouring countries, thus bolstering the security of our electricity supply; - implementing new mechanisms (demand response, capacity mechanism, interruptibility, etc.); - involvement in tests or projects
Sargusingh, Miriam J.; Nelson, Jason
Reliability has been highlighted by NASA as critical to future human space exploration particularly in the area of environmental controls and life support systems. The Advanced Exploration Systems (AES) projects have been encouraged to pursue higher reliability components and systems as part of technology development plans. However, there is no consensus on what is meant by improving on reliability; nor on how to assess reliability within the AES projects. This became apparent when trying to assess reliability as one of several figures of merit for a regenerable water architecture trade study. In the Spring of 2013, the AES Water Recovery Project (WRP) hosted a series of events at the NASA Johnson Space Center (JSC) with the intended goal of establishing a common language and understanding of our reliability goals and equipping the projects with acceptable means of assessing our respective systems. This campaign included an educational series in which experts from across the agency and academia provided information on terminology, tools and techniques associated with evaluating and designing for system reliability. The campaign culminated in a workshop at JSC with members of the ECLSS and AES communities with the goal of developing a consensus on what reliability means to AES and identifying methods for assessing our low to mid-technology readiness level (TRL) technologies for reliability. This paper details the results of the workshop.
Capuano, Nicola; Logoluso, Nicola; Gallazzi, Enrico; Drago, Lorenzo; Romanò, Carlo Luca
The aim of this study was to verify the hypothesis that a one-stage exchange procedure, performed with an antibiotic-loaded, fast-resorbable hydrogel coating, provides a similar infection recurrence rate to a two-stage procedure without the coating, in patients affected by peri-prosthetic joint infection (PJI). In this two-center case-control study, 22 patients treated with a one-stage procedure, using implants coated with an antibiotic-loaded hydrogel [defensive antibacterial coating (DAC)], were compared with 22 retrospectively matched controls treated with a two-stage revision procedure without the coating. At a mean follow-up of 29.3 ± 5.0 months, two patients (9.1%) in the DAC group showed an infection recurrence, compared to three patients (13.6%) in the two-stage group. Clinical scores were similar between groups, while average hospital stay and antibiotic treatment duration were significantly reduced after one-stage compared to two-stage revision (18.9 ± 2.9 versus 35.8 ± 3.4 and 23.5 ± 3.3 versus 53.7 ± 5.6 days, respectively). Although from a relatively limited series of patients, our data show a similar infection recurrence rate after one-stage exchange with DAC-coated implants compared to two-stage revision without the coating, with reduced overall hospitalization time and antibiotic treatment duration. These findings warrant further studies on the possible applications of antibacterial coating technologies to treat implant-related infections. Level of Evidence: III.
Frank C Mng'ong'o
Sustained malaria control is underway using a combination of vector control, prompt diagnosis and treatment of malaria cases. Progress is excellent, but for long-term control, low-cost, sustainable tools that supplement existing control programs are needed. Conventional vector control tools such as indoor residual spraying and house screening are highly effective, but difficult to deliver in rural areas. Therefore, an additional means of reducing mosquito house entry was evaluated: the screening of mosquito house entry points by planting the tall and densely foliated repellent plant Lantana camara L. around houses. A pilot efficacy study was performed in Kagera Region, Tanzania, in an area of high seasonal malaria transmission, where consenting families within the study village planted L. camara (Lantana) around their homes and were responsible for maintaining the plants. Questionnaire data on house design, socioeconomic status, and malaria prevention knowledge, attitudes and practices were collected from 231 houses with Lantana planted around them and 90 houses without repellent plants. Mosquitoes were collected using CDC Light Traps between September 2008 and July 2009. Data were analysed with generalised negative binomial regression, controlling for the effect of sampling period. Indoor catches of mosquitoes in houses with Lantana were compared using the Incidence Rate Ratio (IRR) relative to houses without plants in an adjusted analysis. There were 56% fewer Anopheles gambiae s.s. (IRR 0.44, 95% CI 0.28-0.68, p<0.0001); 83% fewer Anopheles funestus s.s. (IRR 0.17, 95% CI 0.09-0.32, p<0.0001); and 50% fewer mosquitoes of any kind (IRR 0.50, 95% CI 0.38-0.67, p<0.0001) in houses with Lantana relative to controls. House screening using Lantana reduced indoor densities of malaria vectors and nuisance mosquitoes with broad community acceptance. Providing sufficient plants for one home costs US $1.50 including maintenance and labour costs (30 cents per person). L
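The incidence rate ratios above come from a generalised negative binomial regression; the crude version of the same measure is simply a ratio of catch rates per trap-night. The counts below are hypothetical, chosen only to reproduce the direction and magnitude of the reported An. gambiae point estimate:

```python
# Crude incidence-rate-ratio sketch: mosquito catch rates per trap-night in
# houses with and without the repellent plant. The study fits a negative
# binomial model with covariates; the counts here are hypothetical.
def incidence_rate_ratio(cases_exposed, time_exposed,
                         cases_control, time_control):
    """Ratio of event rates: (cases/time) exposed over (cases/time) control."""
    return (cases_exposed / time_exposed) / (cases_control / time_control)

# Hypothetical: 220 An. gambiae over 1000 trap-nights in Lantana houses,
# versus 200 over 400 trap-nights in control houses.
irr = incidence_rate_ratio(220, 1000, 200, 400)
print(round(irr, 2))  # 0.22 / 0.50 = 0.44
```

An IRR below 1 indicates fewer mosquitoes per trap-night in the exposed (Lantana) houses, which is how the reported 56% reduction (IRR 0.44) reads.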
Acharya, G.D.; Trivedi, S.A.R.; Pai, K.B.
The high performance of the time of flight diffraction technique (TOFD) with regard to the detection of weld defects such as cracks, slag and lack of fusion has led to a rapidly increasing acceptance of the technique as a pre-service inspection tool. Since the early 1990s TOFD has been applied to several projects, where it replaced the commonly used radiographic testing. The use of TOFD led to major time savings during new-build and replacement projects. At the same time the TOFD technique was used as a baseline inspection, which enables future monitoring of critical welds, but also provides documented evidence over the lifetime. The TOFD technique has the ability to detect and simultaneously size flaws of nearly any orientation within the weld and heat-affected zone. TOFD is recognized as a reliable, proven technique for the detection and sizing of defects and has proven to be a time saver, resulting in shorter shutdown periods and construction project times. Thus even in cases where the inspection price of TOFD per weld is higher, in the end it will result in significantly lower overall costs and improved quality. This paper deals with reliability, economy, acceptance criteria and field experience. It also covers a comparative study of the radiography technique versus TOFD. (Author)
Schiøler, Henrik; Ravn, Anders Peter; Izadi-Zamanabadi, Roozbeh
Reliability issues for various technical systems are discussed, and focus is directed towards distributed systems, where communication facilities are vital to maintaining system functionality. Reliability in communication subsystems is considered as a resource to be shared among a number of logical connections, and a reliability management framework is suggested. We suggest a network-layer reliability management protocol, RRSVP (Reliability Resource Reservation Protocol), as a counterpart of RSVP for bandwidth and time resource management. Active and passive standby redundancy is provided by background applications residing on alternative routes. Details are provided for the operation of RRSVP based on reliability slack calculus. Conclusions summarize the considerations and give directions for future research.
Ataca, Dalya; Caikovski, Marian; Piersigilli, Alessandra; Moulin, Alexandre; Benarafa, Charaf; Earp, Sarah E; Guri, Yakir; Kostic, Corinne; Arsenijevic, Yvan; Soininen, Raija; Apte, Suneel S; Brisken, Cathrin
The ADAMTS family comprises 19 secreted metalloproteinases that cleave extracellular matrix components and have diverse functions in numerous disease and physiological contexts. A number of them remain 'orphan' proteases and among them is ADAMTS18, which has been implicated in developmental eye disorders, platelet function and various malignancies. To assess in vivo function of ADAMTS18, we generated a mouse strain with inactivated Adamts18 alleles. In the C57Bl6/Ola background, Adamts18-deficient mice are born in a normal Mendelian ratio, and are viable but show a transient growth delay. Histological examination revealed a 100% penetrant eye defect resulting from leakage of lens material through the lens capsule occurring at embryonic day (E)13.5, when the lens grows rapidly. Adamts18-deficient lungs showed altered bronchiolar branching. Fifty percent of mutant females are infertile because of vaginal obstruction due to either a dorsoventral vaginal septum or imperforate vagina. The incidence of ovarian rete is increased in the mutant mouse strain. Thus, Adamts18 is essential in the development of distinct tissues and the new mouse strain is likely to be useful for investigating ADAMTS18 function in human disease, particularly in the contexts of infertility and carcinogenesis. © 2016. Published by The Company of Biologists Ltd.
Reith, Florence C M; Van den Brande, Ruben; Synnot, Anneliese; Gruen, Russell; Maas, Andrew I R
The Glasgow Coma Scale (GCS) provides a structured method for assessment of the level of consciousness. Its derived sum score is applied in research and adopted in intensive care unit scoring systems. Controversy exists on the reliability of the GCS. The aim of this systematic review was to summarize evidence on the reliability of the GCS. A literature search was undertaken in MEDLINE, EMBASE and CINAHL. Observational studies that assessed the reliability of the GCS, expressed by a statistical measure, were included. Methodological quality was evaluated with the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, and its influence on results was considered. Reliability estimates were synthesized narratively. We identified 52 relevant studies that showed significant heterogeneity in the type of reliability estimates used, patients studied, setting and characteristics of observers. Methodological quality was good (n = 7), fair (n = 18) or poor (n = 27). In good quality studies, kappa values were ≥0.6 in 85%, and all intraclass correlation coefficients indicated excellent reliability. Poor quality studies showed lower reliability estimates. Reliability for the GCS components was higher than for the sum score. Factors that may influence reliability include education and training, the level of consciousness and the type of stimuli used. Only 13% of studies were of good quality, and inconsistency in reported reliability estimates was found. Although the reliability was adequate in good quality studies, further improvement is desirable. From a methodological perspective, the quality of reliability studies needs to be improved. From a clinical perspective, a renewed focus on training/education and standardization of assessment is required.
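The kappa statistics summarized in the review measure chance-corrected agreement between observers. A minimal Cohen's kappa computation, with hypothetical ratings from two observers scoring the same patients, might look like:

```python
from collections import Counter

# Cohen's kappa: observed agreement corrected for the agreement expected
# by chance given each rater's marginal score distribution.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Two observers assigning GCS motor scores to ten patients (hypothetical).
a = [6, 5, 6, 4, 6, 3, 5, 6, 2, 6]
b = [6, 5, 5, 4, 6, 3, 5, 6, 2, 5]
print(round(cohens_kappa(a, b), 2))  # → 0.73
```

By the common rule of thumb cited in such reviews, a kappa of 0.6 or above is taken as substantial agreement, which is the threshold the good-quality studies mostly met.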
Introduction: One of the major causes of loosening of cementless acetabular cup implants is insufficient initial stability. A technical proposal to decrease the risk of suboptimal primary stability is a circumferential finned design of the cup. This design aims to improve periacetabular bone contact and prevent rotational micromotion of the cup when an optimal press-fit cannot be obtained. Materials and Methods: We retrospectively reviewed a group of 712 consecutive patients who underwent total hip arthroplasty from June 2006 to June 2014. In all patients, a titanium cup, characterized by three anti-rotational circumferential fins at the superior pole, was implanted. Results: Five hundred and ninety-two patients, for a total of 685 hips, were evaluated at a mean follow-up of 58 months (range 12-96 months). At 1-year follow-up, the average score increased to 82.90 (range 70-100) and at the final follow-up (58 months, range 12-96 months) it was 80.12 (range 66-100). In 22 cases (3%), screws were used to obtain secure primary stability of the cup. Nineteen complications (2.6%) needing revision surgery were observed. Survivorship at 10 years was 98.7% (95% confidence interval [CI], 98.7-99.7%) with revision for aseptic cup loosening as an endpoint and 96.7% (95% CI, 95.1-98.3%) with revision for all causes as the second endpoint. Discussion: In our group of patients, we did not observe any case of very early cup loosening. The only two cup revisions, due to loosening from osteolysis, were observed 26 and 32 months after surgery. Conclusion: Our very low rate of additional screws represents an indirect sign of the finned cup's primary stability. The three-finned cup design was clinically confirmed to improve initial cup stability.
Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne
When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. A physician leader who is interested in catalyzing performance improvement
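Reliability adjustment of this kind is often implemented as shrinkage toward the provider-group mean, with the weight given to a physician's observed rate determined by the measure's reliability (between-physician signal variance over total variance). A minimal sketch, with hypothetical variance components and rates:

```python
# Shrinkage sketch: blend a physician's observed rate with the group mean,
# weighting by reliability = signal variance / (signal + noise) variance.
# Variance components and rates below are hypothetical, not a validated
# specialty-specific model.
def shrunken_estimate(observed_rate, group_mean, between_var, within_var, n):
    """Reliability-weighted blend of observed rate and group mean."""
    reliability = between_var / (between_var + within_var / n)
    return reliability * observed_rate + (1 - reliability) * group_mean

# A low-volume physician (n=20) is pulled strongly toward the group mean
# of 0.20; a high-volume physician (n=400) keeps most of the observed 0.30.
print(round(shrunken_estimate(0.30, 0.20, 0.002, 0.16, 20), 3))   # → 0.22
print(round(shrunken_estimate(0.30, 0.20, 0.002, 0.16, 400), 3))  # → 0.283
```

This is the mechanism by which small-sample departures from expected performance are discounted: low case volumes yield low reliability, so the estimate is dominated by the group average rather than noisy individual data.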
Munoz, Gisela; Toon, T.; Toon, J.; Conner, A.; Adams, T.; Miranda, D.
This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
A criterion was established concerning the protection that nuclear power plant (NPP) safety systems should afford. An estimate of the necessary or adequate reliability of the total complex of safety systems was derived. The acceptable unreliability of auxiliary safety systems is given, provided the reliability built into the specific NPP safety systems (ECCS, Containment) is to be fully utilized. A criterion for the acceptable unreliability of safety (sub)systems which occur in minimum cut sets having three or more components of the analysed fault tree was proposed. A set of input MTBF or MTTF values which fulfil all the set criteria and attain the appropriate overall reliability was derived. The sensitivity of results to input reliability data values was estimated. Numerical reliability evaluations were performed with the programs POTI, KOMBI and, in particular, URSULA, the last being based on Vesely's kinetic fault tree theory. (author)
Milligan, M.; Artig, R.
Generating capacity that is available during the utility peak period is worth more than off-peak capacity. Wind power from a single location might not be available during enough of the peak period to provide sufficient value. However, if the wind power plant is developed over geographically disperse locations, the timing and availability of wind power from these multiple sources could provide a better match with the utility's peak load than a single site. There are other issues that arise when considering disperse wind plant development. Singular development can result in economies of scale and might reduce the costs of obtaining multiple permits and multiple interconnections. However, disperse development can result in cost efficiencies if interconnection can be accomplished at lower voltages or at locations closer to load centers. Several wind plants are in various stages of planning or development in the US. Although some of these are small-scale demonstration projects, significant wind capacity has been developed in Minnesota, with additional developments planned in Wyoming, Iowa and Texas. As these and other projects are planned and developed, there is a need to perform analysis of the value of geographically disperse sites on the reliability of the overall wind plant. This paper uses a production-cost/reliability model to analyze the reliability of several wind sites in the state of Minnesota. The analysis finds that the use of a model with traditional reliability measures does not produce consistent, robust results. An approach based on fuzzy set theory is applied in this paper, with improved results. Using such a model, the authors find that system reliability can be optimized with a mix of disperse wind sites
Faber, M.H.; Sørensen, John Dalsgaard
The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values....... Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate...... and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes....
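The link between a FORM reliability index and a failure probability mentioned above is a one-line formula, P_f = Phi(-beta). A minimal sketch (the beta = 4.2 target below is an illustrative value, not one prescribed by the JCSS):

```python
import math

def failure_probability(beta):
    """P_f = Phi(-beta), the standard normal tail at the
    FORM reliability index beta."""
    return 0.5 * math.erfc(beta / math.sqrt(2))

# An annual reliability index of 4.2 corresponds to a failure
# probability on the order of 1e-5 per year.
pf = failure_probability(4.2)
```

The inverse mapping is what code calibration exercises: pick an acceptable P_f, solve for the target beta, then tune partial safety factors so designs at the code limit sit near that beta.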
Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo
The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application for RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluation and Hessian calculation of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
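The symmetric rank-one (SR1) Hessian update mentioned in the abstract has a compact closed form, B <- B + (y - Bs)(y - Bs)^T / ((y - Bs)^T s), which avoids recomputing second derivatives. A small pure-Python sketch (the quadratic test problem is invented for illustration):

```python
def sr1_update(B, s, y, tol=1e-8):
    """Symmetric rank-one update of a Hessian approximation B.

    B: current n x n approximation (list of lists)
    s: step vector x_new - x_old
    y: gradient difference g_new - g_old
    Skips the update when the denominator is too small,
    the standard SR1 safeguard against blow-up.
    """
    n = len(s)
    Bs = [sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]
    r = [y[i] - Bs[i] for i in range(n)]           # y - B s
    denom = sum(r[i] * s[i] for i in range(n))     # (y - B s)^T s
    if abs(denom) < tol:
        return B
    return [[B[i][j] + r[i] * r[j] / denom for j in range(n)]
            for i in range(n)]

# For a quadratic with Hessian [[2, 0], [0, 4]], one SR1 step along
# s = (1, 0) reproduces the curvature in that direction exactly.
B0 = [[1.0, 0.0], [0.0, 1.0]]
s = [1.0, 0.0]
y = [2.0, 0.0]            # exact Hessian times s
B1 = sr1_update(B0, s, y)
```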
Using Economic Experiments to Test the Effect of Reliability Pricing and Self-Sizing on the Private Provision of a Public Good Results: The Case of Constructing Water Conveyance Infrastructure to Mitigate Water Quantity and Quality Concerns in the Sacramento-San Joaquin Delta
Kaplan, J.; Howitt, R. E.; Kroll, S.
Public financing of public projects is becoming more difficult with growing political and financial pressure to reduce the size and scope of government action. Private provision is possible but is often doomed by under-provision. If, however, market-like mechanisms could be incorporated into the solicitation of funds to finance the provision of the good, because, for example, the good is supplied stochastically and is divisible, then we would expect fewer incentives to free ride and greater efficiency in providing the public good. In a controlled computer-based economic experiment, we evaluate two market-like conditions (reliability pricing allocation and self-sizing of the good) that are designed to reduce under-provision. The results suggest that financing an infrastructure project when the delivery is allocated based on reliability pricing rather than historical allocation results in significantly greater price formation efficiency and less free riding, whether the project is of a fixed size determined by external policy makers or determined endogenously by the sum of private contributions. When the reliability pricing and self-sizing (endogenous) mechanisms are used in combination, free-riding is reduced the most among the tested treatments. Furthermore, and as expected, self-sizing combined with historical allocations results in the worst level of free-riding. The setting for this treatment creates an incentive to undervalue willingness to pay, since very low contributions still return positive earnings as long as enough contributions are raised for a single unit. If everyone perceives that everyone else is undervaluing their contribution, the incentive grows stronger and we see the greatest degree of free riding among the treatments. Lastly, the results from the analysis suggested that the rebate rule may have encouraged those with willingness to pay values less than the cost of the project to feel confident when contributing more than their willingness to pay and
Pahlevani, Peyman; Paramanathan, Achuthan; Hundebøll, Martin
The advantages of network coding have been extensively studied in the field of wireless networks. Integrating network coding with the existing IEEE 802.11 MAC layer is a challenging problem. The IEEE 802.11 MAC does not provide any reliability mechanisms for overheard packets. This paper addresses...... this problem and suggests different mechanisms to support reliability as part of the MAC protocol. Analytical expressions to this problem are given to qualify the performance of the modified network coding. These expressions are confirmed by numerical results. While the suggested reliability mechanisms...
Rocco, Claudio M.; Moreno, Jose Ali
This paper deals with the feasibility of using support vector machine (SVM) to build empirical models for use in reliability evaluation. The approach takes advantage of the speed of SVM in the numerous model calculations typically required to perform a Monte Carlo reliability evaluation. The main idea is to develop an estimation algorithm, by training a model on a restricted data set, and replace system performance evaluation by a simpler calculation, which provides reasonably accurate model outputs. The proposed approach is illustrated by several examples. Excellent system reliability results are obtained by training a SVM with a small amount of information
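The core idea, replacing an expensive system-performance evaluation with a cheap trained model inside a Monte Carlo loop, can be sketched as follows. The stand-in surrogate here is an exact closed-form classifier for an invented limit state (x + y > 1 on the unit square, so the true reliability is 0.5); in the paper's approach a trained SVM would play that role.

```python
import random

def monte_carlo_reliability(surrogate, sampler, n=20000, seed=1):
    """Estimate reliability as the fraction of sampled system
    states that the (cheap) surrogate classifies as 'working'."""
    rng = random.Random(seed)
    ok = sum(1 for _ in range(n) if surrogate(sampler(rng)))
    return ok / n

# Closed-form stand-in for a trained classifier: the system works
# when x + y > 1, and states are uniform on the unit square.
surrogate = lambda xy: xy[0] + xy[1] > 1.0
sampler = lambda rng: (rng.random(), rng.random())
p = monte_carlo_reliability(surrogate, sampler)
```

The payoff is exactly the one the abstract claims: once the surrogate is trained on a restricted data set, each of the thousands of Monte Carlo evaluations costs a classifier call rather than a full system simulation.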
... either: Provide little protection for Bulk-Power System reliability or are redundant with other aspects... for retirement either: (1) Provide little protection for Bulk-Power System reliability or (2) are... to assure reliability of the Bulk-Power System and should be withdrawn. We have identified 41...
The Digital Micromirror Device (DMD) developed by Texas Instruments (TI) has made tremendous progress in both performance and reliability since it was first invented in 1987. From the first working concept of a bistable mirror, the DMD is now providing high-brightness, high-contrast, and high-reliability in over 1,500,000 projectors using Digital Light Processing technology. In early 2000, TI introduced the first DMD chip with a smaller mirror (14-micron pitch versus 17-micron pitch). This allowed a greater number of high-resolution DMD chips per wafer, thus providing an increased output capacity as well as the flexibility to use existing package designs. By using existing package designs, subsequent DMDs cost less as well as met our customers' demand for faster time to market. In recent years, the DMD achieved the status of being a commercially successful MEMS device. It reached this status by the efforts of hundreds of individuals working toward a common goal over many years. Neither textbooks nor design guidelines existed at the time. There was little infrastructure in place to support such a large endeavor. The knowledge we gained through our characterization and testing was all we had available to us through the first few years of development. Reliability was only a goal in 1992 when production development activity started; a goal that many throughout the industry and even within Texas Instruments doubted the DMD could achieve. The results presented in this paper demonstrate that we succeeded by exceeding the reliability goals.
Cannon, A.G.; Bendell, A.
Following an introductory chapter on reliability (what it is, why it is needed, how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. The following chapter mentions achievements due to the development of data banks in different industries. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters ask 'Reliability data banks: friend, foe or a waste of time?' and look at future developments. (UK)
Pitigoi, A.; Fernandez, P.
A reliability model of the SNS LINAC (Spallation Neutron Source at Oak Ridge National Laboratory) has been developed using risk spectrum reliability analysis software, and the analysis of the accelerator system's reliability has been performed. The analysis results have been evaluated by comparing them with the SNS operational data. This paper presents the main results and conclusions, focusing on the definition of design weaknesses, and provides recommendations to improve reliability of the MYRRHA linear accelerator. The reliability results show that the most affected SNS LINAC parts/systems are: 1) SCL (superconducting linac), front-end systems: IS, LEBT (low-energy beam transport line), MEBT (medium-energy beam transport line), diagnostics and controls; 2) RF systems (especially the SCL RF system); 3) power supplies and PS controllers. These results are in line with the records in the SNS logbook. The reliability issue that needs to be enforced in the linac design is the redundancy of the systems, subsystems and components most affected by failures. For compensation purposes, there is a need for intelligent fail-over redundancy implementation in controllers. Sufficient diagnostics have to be implemented to allow reliable functioning of the redundant solutions and to ensure the compensation function
Morzinski, Jerome [Los Alamos National Laboratory]; Anderson-Cook, Christine M [Los Alamos National Laboratory]; Klamann, Richard M [Los Alamos National Laboratory]
SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
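For a series system of the kind SRFYDO targets, system reliability is the product of the component reliabilities, so the system is never more reliable than its weakest part. A toy sketch (the constant-hazard ageing model and the rates below are assumptions for illustration, not SRFYDO's actual covariate model):

```python
import math

def series_reliability(component_rels):
    """A series system works only if every component works."""
    r = 1.0
    for c in component_rels:
        r *= c
    return r

def component_reliability(hazard_per_year, age_years):
    """Toy ageing model: constant hazard, R(t) = exp(-lambda * t)."""
    return math.exp(-hazard_per_year * age_years)

# Three components with assumed hazards, evaluated at age 5 years.
rels = [component_reliability(h, age_years=5.0)
        for h in (0.001, 0.004, 0.002)]
system_r = series_reliability(rels)
```

With constant hazards the product collapses to exp(-(sum of hazards) * t), which is why hazard rates of series components simply add.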
Takekuro, I. [Tokyo Electric Power Company, Tokyo (Japan)
Japanese electric power companies are now positioning themselves to gain a stronger position in the liberalised electricity market. Nuclear power in particular plays an important role in satisfying a large part of domestic electricity demand and its performance has continued to improve as a result of enhanced safety operation and tough maintenance programmes. Although the criticality accident which occurred in 1999 shocked not only the public but also the nuclear industry itself, the accident provided an opportunity for the industry and the regulators to learn lessons and look again at safety issues. Japanese electric power companies are now eager to be seen as front-runners in the safe, reliable, and efficient generation of nuclear power for the twenty-first century. (author)
Evaluation of the quality of results obtained in institutions participating in interlaboratory experiments and of the reliability characteristics of the analytical methods used on the basis of certification of standard soil samples
Parshin, A.K.; Obol'yaninova, V.G.; Sul'dina, N.P.
Rapid monitoring of the level of pollution of the environment and, especially, of soils necessitates preparation of standard samples (SS) close in properties and material composition to the objects to be analyzed. During 1978-1982 four sets (three types of samples in each) of State Standard Samples of different soils were developed: soddy-podzolic sandy-loamy, typical chernozem, krasnozem, and calcareous sierozem. The certification studies of the SS of the soils were carried out in accordance with the classical scheme of interlab experiment (ILE). More than 100 institutions were involved in the ILE and the total number of independent analytical results was of the order of 10⁴. With such a volume of analytical information at their disposal they were able to find some general characteristics intrinsic to certification studies, to assess the quality of work of the ILE participants with due regard for their specialization, and the reliability characteristics of the analytical methods used.
Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.
The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to show that integration between probabilistic and psychological approaches in human reliability should be attempted. This is achieved first by adopting methods that adequately reflect the essential features of the process control activity, and secondly by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context promote the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints of activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis
This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems
This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.
Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP)
Rash, James; Criscuolo, Ed; Hogie, Keith; Parise, Ron; Hennessy, Joseph F. (Technical Monitor)
This paper presents work being done at NASA/GSFC by the Operating Missions as Nodes on the Internet (OMNI) project to demonstrate the application of the Multicast Dissemination Protocol (MDP) to space missions to reliably transfer files. This work builds on previous work by the OMNI project to apply Internet communication technologies to space communication. The goal of this effort is to provide an inexpensive, reliable, standard, and interoperable mechanism for transferring files in the space communication environment. Limited bandwidth, noise, delay, intermittent connectivity, link asymmetry, and one-way links are all possible issues for space missions. Although these are link-layer issues, they can have a profound effect on the performance of transport and application level protocols. MDP, a UDP-based reliable file transfer protocol, was designed for multicast environments which have to address these same issues, and it has done so successfully. Developed by the Naval Research Lab in the mid-1990s, MDP is now in daily use by both the US Post Office and the DoD. This paper describes the use of MDP to provide automated end-to-end data flow for space missions. It examines the results of a parametric study of MDP in a simulated space link environment and discusses the results in terms of their implications for space missions. Lessons learned are addressed, which suggest minor enhancements to the MDP user interface to add specific features for space mission requirements, such as dynamic control of data rate, and a checkpoint/resume capability. These are features that are provided for in the protocol, but are not implemented in the sample MDP application that was provided. A brief look is also taken at the status of standardization. A version of MDP known as NORM (NACK-Oriented Reliable Multicast) is in the process of becoming an IETF standard.
Smallest detectable change and test-retest reliability of a self-reported outcome measure: Results of the Center for Epidemiologic Studies Depression Scale, General Self-Efficacy Scale, and 12-item General Health Questionnaire.
Ohno, Shotaro; Takahashi, Kana; Inoue, Aimi; Takada, Koki; Ishihara, Yoshiaki; Tanigawa, Masaru; Hirao, Kazuki
This study aims to examine the smallest detectable change (SDC) and test-retest reliability of the Center for Epidemiologic Studies Depression Scale (CES-D), General Self-Efficacy Scale (GSES), and 12-item General Health Questionnaire (GHQ-12). We tested 154 young adults at baseline and 2 weeks later. We calculated the intra-class correlation coefficients (ICCs) for test-retest reliability with a two-way random effects model for agreement. We then calculated the standard error of measurement (SEM) for agreement using the ICC formula. The SEM for agreement was used to calculate SDC values at the individual level (SDC_ind) and group level (SDC_group). The study participants included 137 young adults. The ICCs for all self-reported outcome measurement scales exceeded 0.70. The SEM of CES-D was 3.64, leading to an SDC_ind of 10.10 points and SDC_group of 0.86 points. The SEM of GSES was 1.56, leading to an SDC_ind of 4.33 points and SDC_group of 0.37 points. The SEM of GHQ-12 with bimodal scoring was 1.47, leading to an SDC_ind of 4.06 points and SDC_group of 0.35 points. The SEM of GHQ-12 with Likert scoring was 2.44, leading to an SDC_ind of 6.76 points and SDC_group of 0.58 points. To confirm that the change was not a result of measurement error, a score of self-reported outcome measurement scales would need to change by an amount greater than these SDC values. This has important implications for clinicians and epidemiologists when assessing outcomes. © 2017 John Wiley & Sons, Ltd.
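The reported figures follow from the standard formulas SEM = SD * sqrt(1 - ICC), SDC_ind = 1.96 * sqrt(2) * SEM, and SDC_group = SDC_ind / sqrt(n). A quick check against the CES-D figures above (SEM = 3.64, n = 137):

```python
import math

def sdc_individual(sem):
    """Smallest detectable change for one person (95% confidence):
    1.96 * sqrt(2) * SEM, the sqrt(2) reflecting two measurements."""
    return 1.96 * math.sqrt(2) * sem

def sdc_group(sdc_ind, n):
    """Smallest detectable change at group level."""
    return sdc_ind / math.sqrt(n)

# CES-D: SEM = 3.64 and n = 137 reproduce the abstract's
# SDC_ind of about 10.1 points and SDC_group of about 0.86 points.
ind = sdc_individual(3.64)
grp = sdc_group(ind, 137)
```

The same two lines reproduce the GSES and GHQ-12 values from their SEMs, which is a convenient sanity check when reading SDC tables.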
Berg, Melanie; LaBel, Kenneth A.
This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?
exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, O.C. curves, Bayesian analysis. ... an introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. ... includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future
Thoft-Christensen, Palle; Nowak, Andrzej S.
The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated......, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown....
The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)
A brief description is provided of a three-year effort undertaken by the Lawrence Livermore National Laboratory for the piping reliability project. The ultimate goal of this project is to provide guidance for nuclear piping design so that high-reliability piping systems can be built. Based on the results studied so far, it is concluded that the reliability approach can undoubtedly help in understanding not only how to assess and improve the safety of the piping systems but also how to design more reliable piping systems
Trudnowski, Daniel [Montana Tech of the Univ. of Montana, Butte, MT (United States)
This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech April 2006. Follow-on DOE awards and expansions to the project scope occurred August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also consisted of matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies; and, power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and, the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”
Sørensen, John Dalsgaard
Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how...... of reliability based code calibration of LRFD based design codes....
Wear, L L; Pinkert, J R
In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.
Kumar, C. Senthil; John Arul, A.; Pal Singh, Om; Suryaprakasa Rao, K.
This paper presents the results of reliability analysis of the Shutdown System (SDS) of the Indian Prototype Fast Breeder Reactor. Reliability analysis carried out using Fault Tree Analysis predicts a value of 3.5 × 10⁻⁸/de for failure of the shutdown function in case of global faults and 4.4 × 10⁻⁸/de for local faults. Based on 20 de/y, the frequency of shutdown function failure is 0.7 × 10⁻⁶/ry, which meets the reliability target set by the Indian Atomic Energy Regulatory Board. The reliability is limited by Common Cause Failure (CCF) of the actuation part of the SDS and, to a lesser extent, CCF of electronic components. The failure frequency of individual systems is of the order of 10⁻³/ry, which also meets the safety criteria. Uncertainty analysis indicates a maximum error factor of 5 for the top event unavailability
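The conversion from a per-demand failure probability to a per-reactor-year frequency quoted above is simple arithmetic: multiply by the assumed number of demands per year.

```python
# Figures from the abstract: global-fault failure probability per
# demand, and 20 demands per year.
failure_per_demand = 3.5e-8
demands_per_year = 20

# 3.5e-8 per demand * 20 demands/year = 7e-7 per reactor-year,
# i.e. the 0.7e-6/ry figure quoted against the regulatory target.
frequency_per_ry = failure_per_demand * demands_per_year
```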
Leach, Sonia; Gabow, Aaron; Hunter, Lawrence; Goldberg, Debra S.
Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
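One common way to combine per-source reliabilities into a single confidence for an interaction is the noisy-OR rule: the interaction is believed unless every reporting source is independently wrong. This is an illustration of the general idea, not necessarily the combination method the paper proposes.

```python
def combined_reliability(source_rels):
    """Noisy-OR combination of independent evidence sources:
    1 - product over sources of (1 - reliability)."""
    p_all_wrong = 1.0
    for r in source_rels:
        p_all_wrong *= (1.0 - r)
    return 1.0 - p_all_wrong

# Two mediocre sources agreeing beat either one alone:
# 1 - (0.4 * 0.5) = 0.8 versus 0.6 for the single source.
single = combined_reliability([0.6])
both = combined_reliability([0.6, 0.5])
```

Because the rule assumes independence between sources, correlated databases (e.g. one mirroring another) inflate the combined score, which is one reason combination schemes robust to noisy reliability assignments matter.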
Use of the data of nuclear-physical methods of sampling and logging makes it possible to improve the reliability of evaluation of the radiometric enriching ability of ores, and also to evaluate this reliability quantitatively. This problem may be solved by using some concepts of geostatistics. The presented results support the conclusion that the data of nuclear-physical methods of sampling and logging can provide high reliability of evaluation of the radiometric enriching ability of non-ferrous ores and their geometrization by technological types
Full Text Available Background: Today it is virtually impossible to operate alone on the international level in the logistics business. This promotes the establishment and development of new integrated business entities - logistic operators. However, such cooperation within a supply chain also creates many problems related to supply chain reliability and the optimization of supply planning. The aim of this paper was to develop and formulate a mathematical model and algorithms to find the optimum plan of supplies, using an economic criterion and a model for evaluating the probability of non-failure operation of the supply chain. Methods: The mathematical model and algorithms were developed and formulated using an economic criterion and a model for evaluating the probability of non-failure operation of the supply chain. Results and conclusions: The problem of ensuring failure-free performance of the goods supply channel analyzed in the paper is characteristic of distributed network systems that make active use of business process outsourcing technologies. The complex planning problem occurring in such systems, which requires taking into account the consumer's requirements for failure-free performance in terms of supply volumes and correctness, can be reduced to a relatively simple linear programming problem through logical analysis of the structures. The paper also presents the sequence of operations that should be taken into account when planning supplies subject to the supplier's functional reliability.
Jones, Harry W.
A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and
Malone, E.A.; Ayres, D.J.; Lear, R.C. van
Responding to increasingly stringent regulatory requirements, Babcock and Wilcox has teamed up with three specialist companies to provide services for nuclear utilities aiming to improve the performance of their valves and actuators. The services, which are outlined here, include inspection, repair, overhaul and valve and actuator reliability programmes. (author)
Jones, Harry W.
Recycling life support systems can achieve ultra reliability by using spares to replace failed components. The added mass for spares is approximately equal to the original system mass, provided the original system reliability is not very low. Acceptable reliability can be achieved for the space shuttle and space station by preventive maintenance and by replacing failed units. However, this maintenance and repair depends on a logistics supply chain that provides the needed spares. The Mars mission must take all the needed spares at launch. The Mars mission also must achieve ultra reliability, a very low failure rate per hour, since it requires years rather than weeks and cannot be cut short if a failure occurs. Also, the Mars mission has a much higher launch cost per kilogram of mass than shuttle or station. Achieving ultra reliable space life support with acceptable mass will require a well-planned and extensive development effort. Analysis must define the reliability requirement and allocate it to subsystems and components. Technologies, components, and materials must be designed and selected for high reliability. Extensive testing is needed to ascertain very low failure rates. Systems design should segregate the failure causes in the smallest, most easily replaceable parts. The systems must be designed, produced, integrated, and tested without impairing system reliability. Maintenance and failed unit replacement should not introduce any additional probability of failure. The overall system must be tested sufficiently to identify any design errors. A program to develop ultra reliable space life support systems with acceptable mass must start soon if it is to produce timely results for the moon and Mars.
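The spares-sizing problem the abstract describes can be sketched with a standard Poisson model (a generic illustration, not the paper's own analysis): if a component fails at a constant rate, the probability that N spares cover every failure during a mission of length t is the Poisson cumulative probability. The failure rate and mission length below are assumed numbers for illustration only.

```python
from math import exp, factorial

def p_spares_sufficient(lam: float, t_hours: float, n_spares: int) -> float:
    """Probability that n_spares cover all failures in (0, t), assuming
    failures follow a Poisson process with constant rate lam (per hour)."""
    m = lam * t_hours  # expected number of failures over the mission
    return sum(exp(-m) * m**k / factorial(k) for k in range(n_spares + 1))

def spares_needed(lam: float, t_hours: float, target: float) -> int:
    """Smallest spare count whose coverage probability meets the target."""
    n = 0
    while p_spares_sufficient(lam, t_hours, n) < target:
        n += 1
    return n
```

With an assumed failure rate of 1e-4 per hour over a roughly 2.5-year (21,900 h) mission, six spares give better than 99% coverage; doubling the assumed rate raises the requirement sharply, which is exactly the abstract's point about an underestimated failure rate leaving the system short of spares.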
This paper describes the experimental tests performed in order to prove the reliability parameters for certain equipment manufactured in INR Pitesti for NPP Cernavoda. The tests were prescribed by the Technical Specifications and test procedures. A comparison of the reliability parameters of Canadian equipment and INR-manufactured equipment is also given. The results of the tests and conclusions are presented. (author)
Adams, S A; De Bont, A A
This article analyzes the efforts of three organizations to provide a standard that guides Internet users to reliable health care sites. Comparison of health Internet sites, interviews and document studies. In comparing these approaches, three different constructions of reliability are identified. The resulting possibilities and restrictions of these constructions for users that are searching for health information on the Internet are revealed.
The recently published Washington International Energy Group's 1993 Electric Utility Outlook states that nearly one-third (31 percent) of U.S. utility executives expect reliability to decrease in the near future. Electric power system stability is crucial to reliability. Stability analysis determines whether a system will stay intact under normal operating conditions, during minor disturbances such as load fluctuations, and during major disturbances when one or more parts of the system fails. All system elements contribute to reliability or the lack of it. However, this report centers on the transmission segment of the electric system. The North American Electric Reliability Council (NERC) says the transmission systems as planned will be adequate over the next 10 years. However, delays in building new lines and increasing demands for transmission services are serious concerns. Reliability concerns exist in the Mid-Continent Area Power Pool and the Mid-America Interconnected Network regions where transmission facilities have not been allowed to be constructed as planned. Portions of the transmission systems in other regions are loaded at or near their limits. NERC further states that utilities must be allowed to complete planned generation and transmission as scheduled. A reliable supply of electricity also depends on adhering to established operating criteria. Factors that could complicate operations include: more interchange schedules resulting from increased transmission services; increased line loadings in portions of the transmission systems; and the proliferation of non-utility generators.
Siahpush, A.S.; Hills, S.W.; Pham, H.; Majumdar, D.
A study was performed to determine the common methods and tools that are available to calculate or predict a system's reliability. A literature review and software survey are included. The desired product of this developmental work is a tool for the system designer to use in the early design phase so that the final design will achieve the desired system reliability without lengthy testing and rework. Three computer programs were written which provide the first attempt at fulfilling this need. The programs are described and a case study is presented for each one. This is a continuing effort which will be furthered in FY-1992. 10 refs
Parkinson, D.B.; Oestergaard, C.
1 - Description of problem or function: Calculation of the reliability index given the failure boundary. A linearization point (design point) is found on the failure boundary for a stationary reliability index (min) and a stationary failure probability density function along the failure boundary, provided that the basic variables are normally distributed. 2 - Method of solution: Iteration along the failure boundary which must be specified - together with its partial derivatives with respect to the basic variables - by the user in a subroutine FSUR. 3 - Restrictions on the complexity of the problem: No distribution information included (first-order-second-moment-method). 20 basic variables (could be extended)
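The iteration along the failure boundary that this program description refers to can be sketched with the standard Hasofer-Lind/Rackwitz-Fiessler algorithm for normally distributed basic variables (a generic first-order reconstruction, not the FSUR code itself; the example limit state is hypothetical):

```python
import numpy as np

def hlrf_beta(g, grad_g, n_vars, tol=1e-10, max_iter=100):
    """Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space.
    g is the failure boundary function, grad_g its gradient (both supplied
    by the user, as with subroutine FSUR). Returns the reliability index
    beta = distance from the origin to the design point."""
    x = np.zeros(n_vars)  # start the iteration at the mean point
    for _ in range(max_iter):
        grad = grad_g(x)
        # project onto the linearized failure surface g(x) + grad.(x_new - x) = 0
        x_new = grad * (grad @ x - g(x)) / (grad @ grad)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return float(np.linalg.norm(x))
```

For a linear failure boundary g(x) = 3 - x1 - x2 in standard normal variables, the exact index is 3/sqrt(2) ≈ 2.121, and the iteration reproduces it in one step; nonlinear boundaries require several iterations, which is what the routine's loop is for.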
A. V. Radkevich
Full Text Available Purpose. Development of scientific and methodical bases for the design of rational management of material flows in building supply, taking into account intersystem connections with the enterprises of the building industry. Methodology. The analysis of recent years of building-industry operation in Ukraine reveals a number of problems that negatively influence the steady development of building as a component of the state's economic system. Research into the existing methods of organizing the supply of building sites with material resources is therefore essential. In this connection, the article justifies the use of the analytic hierarchy process (Saaty method) for finding the optimal assignment of building-industry enterprises to building sites. Findings. The results enable the management of a building organization to evaluate and choose advantageous suppliers among building-industry enterprises and to rank them with respect to their basic characteristics, such as quality, price, reliability of deliveries, specialization, and financial status. Originality. On the basis of the Saaty method, methodologies are improved for the organization, planning and management of a reliable system for providing building projects with the material resources required by the technological demands of building and installation works. Practical value. A contribution to the solution of many complex organizational problems that accompany the development of building, achieved through the organization of a reliable system for the purchase of material resources.
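The Saaty method mentioned above derives supplier priorities from a pairwise comparison matrix via its principal eigenvector. A minimal sketch with a hypothetical three-supplier matrix follows (the comparison values are illustrative, not taken from the study):

```python
import numpy as np

# Hypothetical pairwise comparisons of three suppliers on one criterion,
# using Saaty's 1-9 scale; A[i, j] says how strongly supplier i beats j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

def ahp_priorities(A, iters=100):
    """Principal-eigenvector priorities plus Saaty's consistency ratio."""
    n = A.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):           # power iteration for the principal eigenvector
        w = A @ w
        w /= w.sum()
    lam_max = (A @ w / w).mean()     # principal eigenvalue estimate
    ci = (lam_max - n) / (n - 1)     # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return w, ci / ri                # priorities, consistency ratio

w, cr = ahp_priorities(A)
```

A consistency ratio below 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent, so the resulting priority vector can be used to rank the suppliers.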
Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng
Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and difficult to evaluate quantitatively. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along the longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution; K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases as the residual life ratio T decreases, and the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
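The stress-strength interference idea behind such a reliability model can be sketched for the classical Gaussian case (a textbook formula with illustrative numbers, not the paper's K_vs data): reliability is the probability that a normally distributed strength exceeds a normally distributed stress.

```python
from math import erf, sqrt

def phi(z: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """P(strength > stress) for independent Gaussian strength and stress:
    the classical stress-strength interference result
    R = Phi((mu_S - mu_s) / sqrt(sd_S^2 + sd_s^2))."""
    z = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    return phi(z)
```

For example, an assumed strength of 500 +/- 40 against a stress of 380 +/- 30 (same units) gives z = 2.4 and R ≈ 0.992; as the margin between strength and stress shrinks with damage accumulation, R falls toward 0.5.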
Kim, Man Cheol
This paper provides an overview of the ongoing research activities on a reliability analysis of digital instrumentation and control (I and C) systems of nuclear power plants (NPPs) performed by the Korea Atomic Energy Research Institute (KAERI). The research activities include the development of a new safety-critical software reliability analysis method by integrating the advantages of existing software reliability analysis methods, a fault coverage estimation method based on fault injection experiments, and a new human reliability analysis method for computer-based main control rooms (MCRs) based on human performance data from the APR-1400 full-scope simulator. The research results are expected to be used to address various issues such as the licensing issues related to digital I and C probabilistic safety assessment (PSA) for advanced digital-based NPPs. (author)
Huang, Ting-Cheng; Zhang, Yong-Jun
Incentive-based demand response (IBDR) can guide customers to adjust their electricity consumption behaviour and actively curtail load. Meanwhile, distributed generation (DG) and energy storage systems (ESS) can provide time for the implementation of IBDR. The paper focuses on the reliability evaluation of a microgrid considering IBDR. Firstly, the mechanism of IBDR and its impact on power supply reliability are analysed. Secondly, the IBDR dispatch model considering the customer's comprehensive assessment and the customer response model are developed. Thirdly, a reliability evaluation method considering IBDR, based on Monte Carlo simulation, is proposed. Finally, the validity of the above models and method is studied through numerical tests on the modified RBTS Bus6 test system. Simulation results demonstrate that IBDR can improve the reliability of the microgrid.
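The Monte Carlo core of such an evaluation can be illustrated in miniature for a single repairable unit with exponential failure and repair times (a generic sketch, not the paper's microgrid model with IBDR; the rates are assumed values):

```python
import random

def mc_availability(lam, mu, horizon, runs=2000, seed=1):
    """Crude sequential Monte Carlo estimate of the availability of one
    repairable unit: exponential failure rate lam and repair rate mu
    (both per hour), simulated over `horizon` hours, `runs` times."""
    rng = random.Random(seed)
    up_time = 0.0
    for _ in range(runs):
        t, up = 0.0, True          # each history starts in the working state
        while t < horizon:
            dwell = rng.expovariate(lam if up else mu)
            dwell = min(dwell, horizon - t)
            if up:
                up_time += dwell
            t += dwell
            up = not up            # alternate between up and down states
    return up_time / (runs * horizon)
```

The estimate should approach the analytic steady-state availability mu/(lam + mu); a full microgrid study samples every component's state plus the dispatch and demand-response rules on top of this same sampling loop.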
Bott, T.F.; Haas, P.M.; Manning, J.J.
The Centralized Reliability Data Organization (CREDO) has been established at Oak Ridge National Laboratory (ORNL) by the Department of Energy to provide a national center for collection, evaluation and dissemination of reliability data for advanced reactors. While the system is being developed and continuous data collection at the two U.S. reactor sites (EBR-II and FFTF) is being established, data on advanced reactor components which have been in use at U.S. test loops and experimental reactors have been collected and analyzed. Engineering, operating and event data on sodium valves, pumps, flow meters, rupture discs, heat exchangers and cold traps have been collected from more than a dozen sites. The results of analyses of the data performed to date are presented
Verma, Ajit Kumar; Karanki, Durga Rao
Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...
Sembiring, N.; Ginting, E.; Darnello, T.
Problems appear in a company producing refined sugar: critical machines on the production floor have not reached the required availability level because they frequently suffer damage (breakdowns). This results in sudden losses of production time and production opportunities. The problem can be addressed with Reliability Engineering methods, in which a statistical approach to historical damage data is used to identify the distribution pattern. The method provides values for the reliability, damage rate, and availability of a machine over the scheduled maintenance interval. Distribution tests on the time-between-failures data (MTTF) show that the flexible hose component follows a lognormal distribution, while the teflon cone lifting component follows a Weibull distribution. Distribution tests on the mean time to repair (MTTR) show that the flexible hose component follows an exponential distribution, while the teflon cone lifting component follows a Weibull distribution. For the flexible hose component on a replacement schedule of every 720 hours, the actual reliability is 0.2451 and the availability 0.9960; for the critical teflon cone lifting component on a replacement schedule of every 1944 hours, the actual reliability is 0.4083 and the availability 0.9927.
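The quantities such a study reports follow from standard life-distribution formulas; a minimal sketch with illustrative parameters (not the fitted values from the paper):

```python
from math import exp, gamma

def weibull_reliability(t: float, beta: float, eta: float) -> float:
    """R(t) = exp(-(t/eta)^beta): probability of surviving to time t
    under a Weibull life distribution with shape beta and scale eta."""
    return exp(-((t / eta) ** beta))

def weibull_mttf(beta: float, eta: float) -> float:
    """Mean time to failure of the Weibull distribution."""
    return eta * gamma(1.0 + 1.0 / beta)

def availability(mttf: float, mttr: float) -> float:
    """Steady-state availability from mean time to failure and mean
    time to repair."""
    return mttf / (mttf + mttr)
```

With beta = 1 the Weibull reduces to the exponential case, so weibull_mttf(1, eta) is just eta and R(eta) = exp(-1); evaluating R at a candidate replacement interval and combining MTTF with MTTR gives exactly the reliability/availability pairs the abstract quotes for its two components.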
Full Text Available We have found the errors in the throughput formulae presented in our paper "Connectivity-based reliable multicast MAC protocol for IEEE 802.11 wireless LANs". We provide the corrected formulae and numerical results.
Oestereich, André L.; Galvão, Ernesto F.
An operational approach to the study of computation based on correlations considers black boxes with one-bit inputs and outputs, controlled by a limited classical computer capable only of performing sums modulo two. In this setting, it was shown that noncontextual correlations do not provide any extra computational power, while contextual correlations were found to be necessary for the deterministic evaluation of nonlinear Boolean functions. Here we investigate the requirements for reliable computation in this setting; that is, the evaluation of any Boolean function with success probability bounded away from 1/2. We show that bipartite CHSH quantum correlations suffice for reliable computation. We also prove that an arbitrarily small violation of a multipartite Greenberger-Horne-Zeilinger noncontextuality inequality also suffices for reliable computation.
Hall, R.E.; Boccio, J.L.
Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime at the plant
Hammons, Thomas J.; Voropai, Nikolai I.
This paper addresses the problems of power supply reliability in a market environment. The specific features of economic interrelations between the power supply organization and consumers in terms of reliability assurance are examined and the principles of providing power supply reliability are formulated. The economic mechanisms of coordinating the interests of power supply organization and consumers to provide power supply reliability are discussed. Reliability of restructuring China's powe...
... consumers value overall network reliability and quality in selecting mobile wireless service providers, they...-125] Improving the Resiliency of Mobile Wireless Communications Networks; Reliability and Continuity... (Reliability NOI) in 2011 to ``initiate a comprehensive examination of issues regarding the reliability...
Fullwood, R.; Lofaro, R.; Samanta, P.
The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics
Costa Monteiro, E.; Leon, L. F.
The prominent development of health technologies in the 20th century triggered demands for the metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of biomeasurement results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with an analysis of their contributions to guaranteeing the compliance of innovative health technologies with the main ethical pillars of Bioethics.
Jiang, John N.
Understanding the value of reliability to electricity customers is important for market-based reliability management. This paper proposes a novel approach to evaluating reliability for electricity customers using an indifference curve between economic compensation for power interruptions and the service reliability of the electricity supply. The indifference curve is formed by calculating different network-expansion planning schemes for different customer reliability requirements, which reveals the economic value of different reliability levels to electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.
Nordlund, Joakim [Cellkraft AB, Stockholm (Sweden)
For fuel cells to be a viable alternative for backup power in applications where reliability is a critical factor, the reliability of fuel cells has to be high and documented. Based on the intrinsic properties of fuel cells, it is safe to argue that they can be made highly reliable, but to unleash their full reliability potential, considerable engineering work has to be performed. Cellkraft has for many years been addressing this issue, and this project is an important piece of the puzzle. The project included both a large number of laboratory tests of fuel cells and long experiments in a field environment to verify the results from the laboratory work. The development work performed within this project is a solid base for the continuing work to fulfil Cellkraft's own tough technical reliability targets. The following project targets were achieved: 1. The fuel cell starts with 100% reliability. 2. The fuel cell provides nominal power within 30 seconds in 100% of the cases. 3. The fuel cell keeps providing nominal power as long as there is a demand in 100% of the cases. 4. No cell in the fuel cell deviates from the mean cell potential by more than 0.1 V at full power.
Dogliani, M.; Østergaard, C.; Parmentier, G.
This paper deals with the development of different methods that allow the reliability-based design of ship structures to be transferred from the area of research to systematic application in current design. It summarises the achievements of a three-year collaborative research project dealing with developments of models of load effects and of structural collapse adopted in reliability formulations which aim at calibrating partial safety factors for ship structural design. New probabilistic models of still-water load effects are developed both for tankers and for containerships. New results are presented ... structure of several tankers and containerships. The results of the reliability analysis were the basis for the definition of a target safety level, which was used to assess the partial safety factors suitable for a new design-rules format to be adopted in modern ship structural design. Finally...
Cost reductions for offshore wind turbines are a substantial requirement in order to make offshore wind energy more competitive compared to other energy supply methods. During the 20-25 years of a wind turbine's useful life, Operation & Maintenance costs are typically estimated to be a quarter to one third of the total cost of energy. Reduction of Operation & Maintenance costs will result in significant cost savings and cheaper electricity production. Operation & Maintenance processes mainly involve actions related to replacements or repair. Identifying the right times when ... for Operation & Maintenance planning. Concentrating efforts on the development of such models, this research is focused on reliability modeling of wind turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied ...
Full Text Available This paper describes the phenomenon of reliability of power plants. It gives an explanation of the terms connected with this topic as their proper understanding is important for understanding the relations and equations which model the possible real situations. The reliability phenomenon is analysed using both the exponential distribution and the Weibull distribution. The results of our analysis are specific equations giving information about the characteristics of the power plants, the mean time of operations and the probability of failure-free operation. Equations solved for the Weibull distribution respect the failures as well as the actual operating hours. Thanks to our results, we are able to create a model of dynamic reliability for prediction of future states. It can be useful for improving the current situation of the unit as well as for creating the optimal plan of maintenance and thus have an impact on the overall economics of the operation of these power plants.
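The exponential and Weibull formulations the paper refers to can be sketched as follows (generic formulas with assumed numbers, not the paper's fitted plant data): the exponential failure rate is estimated from failures per accumulated operating hour, and the Weibull hazard captures how that rate changes with unit age.

```python
from math import exp

def exp_lambda_hat(n_failures: int, operating_hours: float) -> float:
    """Maximum-likelihood failure rate under the exponential model:
    total failures divided by total accumulated operating hours."""
    return n_failures / operating_hours

def failure_free_prob(lam: float, t: float) -> float:
    """Probability of failure-free operation over a further t hours,
    under a constant failure rate lam."""
    return exp(-lam * t)

def weibull_hazard(t: float, beta: float, eta: float) -> float:
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1);
    beta > 1 models ageing, beta < 1 infant mortality, and beta = 1
    recovers the constant exponential hazard."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)
```

For instance, 4 failures over 20,000 accumulated operating hours give an estimated rate of 2e-4 per hour, hence a probability of about exp(-0.2) ≈ 0.82 of running 1,000 more hours failure-free; feeding refreshed estimates into these formulas is the kind of dynamic reliability prediction the abstract describes.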
Lofgren, E.V.; DeMoss, G.M.; Fragola, J.R.; Appignani, P.L.; Delarche, G.; Boccio, J.
The purpose of this report is to provide technical guidelines for NRC staff use in the development of positions for evaluating emergency diesel generator (EDG) reliability programs. Such reviews will likely result following resolution of USI A-44 and GSI B-56. The diesel generator reliability program is a management system for achieving and maintaining a selected (or target) level of reliability. This can be achieved by: (1) understanding the factors that control the EDG reliability and (2) then applying reliability and maintenance techniques in the proper proportion to achieve selected performance goals. The concepts and guidelines discussed in this report are concepts and approaches that have been successful in applications where high levels of reliability must be maintained. Both an EDG reliability program process and a set of review items for NRC use are provided. The review items represent a checklist for reviewing EDG reliability programs. They do not, in themselves, constitute a reliability program. Rather, the review items are those distinctive features of a reliability program that must be present for the program to be effective
The presentation is organized around three themes: (1) The decrease of reception equipment costs allows non-Remote Sensing organizations to access a technology until recently reserved for a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment conflicts with the traditional low-volume data use of most applications. Constant access to data sources supposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low-cost equipment is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.
Halpin, Glennelle; Halpin, Gerald
Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…
... operators of the Bulk-Power System, and other interested parties for improvement of the Electric Reliability... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Electric Reliability..., Reliability Standards that provide for an adequate level of reliability of the Bulk-Power System, and (2) Has...
The lecture provides an overview of considerations relevant for achieving highly reliable operation of accelerator based user facilities. The article starts with an overview of statistical reliability formalism which is followed by high reliability design considerations with examples. The article closes with operational aspects of high reliability such as preventive maintenance and spares inventory.
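The statistical reliability formalism such an overview typically starts from is the series/parallel combination of subsystem reliabilities; a minimal sketch with illustrative numbers (not actual facility data):

```python
def series(reliabilities):
    """A series system works only if every element works:
    R = product of the element reliabilities."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel(reliabilities):
    """A redundant (parallel) system fails only if every element fails:
    R = 1 - product of the element unreliabilities."""
    q = 1.0
    for ri in reliabilities:
        q *= (1.0 - ri)
    return 1.0 - q
```

A chain of ten 99%-reliable subsystems is only about 90% reliable overall, while duplicating a 90% unit raises it to 99%: the quantitative argument behind the lecture's emphasis on redundancy, preventive maintenance, and spares inventory for high-availability operation.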
Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.
This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific
Gintautas, Tomas; Sørensen, John Dalsgaard
Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...
Kim, Man Cheol; Smidts, Carol
The software reliability of the TELLERFAST ATM software is analyzed using two metric-based software reliability analysis methods: a state transition diagram-based method and a test coverage-based method. The procedures for the software reliability analysis using the two methods, together with the analysis results, are provided in this report. The two methods are found to be complementary, and further research on combining them, so that the benefit of this complementary effect is reflected in the software reliability analysis, is recommended
Buden, D.; Hunt, R.N.M.
Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig
Sander, P.; Badoux, R.
The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
Full Text Available The paper is devoted to the creation of an effective system of mapping at all levels of tourist-excursion activity that will boost the promotion of tourist products in the domestic and foreign tourist markets. The State Scientific-Production Enterprise «Kartographia» actively participates in cartographic provision for tourism by producing travel maps, survey, large-scale and route maps, atlases, travel guides, and city plans. It produces maps with varied content covering the territory of Ukraine, its individual regions, and cities of interest for tourist excursions. The list and scope of cartographic products prepared for publication and released over the last five years are given. The development of new types of tourism encourages publishers to create various cartographic products for the needs of tourists, guaranteeing high accuracy, reliability of information, and ease of use. The variety of scientific and practical problems in tourism and excursion activities that are solved using maps and plans makes it difficult to determine the criteria for assessing their reliability. The author proposes to introduce the concept of «relevance», understood as the suitability of a map for solving specific problems. The basis of the peer review is the suitability of maps for obtaining objective results, judged by the following criteria: appropriateness of the map to the target tasks (area, theme, destination); accuracy of the given parameters (projection, scale, height interval); year of the survey or mapping of the location; selection methods, methods of measurement, and the processing algorithm for the results; and availability of assistive devices (instrumentation, computer technology, simulation devices). These criteria make the reliability and accuracy of the result as acceptable to consumers as possible. The author proposes a set of measures aimed at improving the content, quality and reliability of cartographic production.
Bistouni, Fathollah; Jahanshahi, Mohsen
Supercomputers and multi-processor systems are comprised of thousands of processors that need to communicate in an efficient way. One reasonable solution is the utilization of multistage interconnection networks (MINs), where the challenge is to analyze the reliability of such networks. One of the methods to increase the reliability and fault-tolerance of MINs is the use of additional switching stages. Therefore, the reliability of one of the most common MINs, the shuffle-exchange network (SEN), has recently been evaluated by investigating the impact of increasing the number of switching stages. That work concluded that the reliability of SEN with one additional stage (SEN+) is better than that of SEN or SEN with two additional stages (SEN+2), and that the reliability of SEN is, in turn, better than that of SEN+2. Here we re-evaluate the reliability of these networks; the results of the terminal, broadcast, and network reliability analyses demonstrate that SEN+ and SEN+2 consistently outperform SEN and are very alike in terms of reliability. - Highlights: • The impact of increasing the number of stages on the reliability of MINs is investigated. • The RBD method, as an accurate method, is used for the reliability analysis of MINs. • Complex series–parallel RBDs are used to determine the reliability of the MINs. • All measures of reliability (i.e. terminal, broadcast, and network reliability) are analyzed. • All reliability equations are calculated for different sizes N×N
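The series-parallel RBD evaluation mentioned in the highlights can be sketched in a few lines. This is an illustrative toy, not the paper's actual SEN model: the per-switch reliability, the three-stage path, and the two-path redundancy are all invented for demonstration.

```python
# Hedged sketch: terminal reliability of simple series-parallel
# reliability block diagrams (RBDs), the kind of structure used for
# multistage interconnection networks. All numbers are illustrative.

def series(rels):
    """Reliability of blocks in series: all must work."""
    r = 1.0
    for x in rels:
        r *= x
    return r

def parallel(rels):
    """Reliability of redundant blocks in parallel: at least one works."""
    q = 1.0
    for x in rels:
        q *= (1.0 - x)
    return 1.0 - q

r_switch = 0.99                             # assumed per-switch reliability
sen_path = series([r_switch] * 3)           # a single path through 3 stages
sen_plus = parallel([sen_path, sen_path])   # two redundant paths (SEN+-like)

print(round(sen_path, 4))   # 0.9703
print(round(sen_plus, 4))   # 0.9991
```

The redundant structure dominates the single path, which is the qualitative effect the abstract reports for SEN+ versus SEN.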
Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate the conservatism in the estimations of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom, otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.
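Of the metrics reviewed above, the interval-analysis-based one is the simplest to illustrate. The sketch below is a generic example of that idea under invented bounds; it is not taken from the paper's formulations.

```python
# Illustrative interval-analysis-based reliability check. The variable
# bounds and the limit state g = resistance - load are assumptions made
# for this example only.

def interval_sub(a, b):
    """Interval subtraction: [a1,a2] - [b1,b2] = [a1-b2, a2-b1]."""
    return (a[0] - b[1], a[1] - b[0])

resistance = (90.0, 110.0)   # assumed epistemic bounds on resistance
load       = (60.0, 80.0)    # assumed epistemic bounds on load

g = interval_sub(resistance, load)   # bounds on the limit state
# Non-probabilistic verdict:
#   g_min > 0  -> guaranteed safe under the stated bounds
#   g_max < 0  -> guaranteed failed
#   otherwise  -> indeterminate (epistemic uncertainty too wide)
verdict = "safe" if g[0] > 0 else ("failed" if g[1] < 0 else "indeterminate")
print(g, verdict)   # (10.0, 50.0) safe
```

The "indeterminate" outcome is where such metrics differ most, and is exactly where the conservatism and duality properties discussed in the review matter.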
Pham, H.; Pham, M.
This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
Nannapaneni, Saideep; Mahadevan, Sankaran
This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
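The single-loop idea described above can be sketched with a minimal Monte Carlo example: an uncertain distribution parameter (epistemic) is sampled jointly with the random variables (aleatory) in one loop, instead of a nested double loop. The distributions, the limit state, and every number below are illustrative assumptions, not the paper's engineering examples.

```python
# Minimal single-loop Monte Carlo sketch combining aleatory and
# epistemic uncertainty. All distributions and numbers are invented.
import random

random.seed(0)

def failure_probability(n=100_000):
    failures = 0
    for _ in range(n):
        # Epistemic layer: the load's mean is uncertain (e.g. sparse
        # data), represented here by a uniform distribution.
        mu_load = random.uniform(60.0, 80.0)
        # Aleatory layer: load and resistance sampled given that mean.
        load = random.gauss(mu_load, 10.0)
        resistance = random.gauss(100.0, 10.0)
        if resistance - load < 0:   # limit state g = R - L; failure if g < 0
            failures += 1
    return failures / n

pf = failure_probability()
print(pf)   # typically a few percent for these assumed numbers
```

A double-loop implementation would re-run the inner aleatory simulation for every epistemic sample; collapsing both into one loop is what makes the approach in the abstract efficient.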
The need for an emergency diesel generator (EDG) reliability program has been established by 10 CFR Part 50, Section 50.63, Loss of All Alternating Current Power, which requires that utilities assess their station blackout duration and recovery capability. EDGs are the principal emergency ac power sources for coping with a station blackout. Regulatory Guide 1.155, Station Blackout, identifies a need for (1) an EDG reliability equal to or greater than 0.95, and (2) an EDG reliability program to monitor and maintain the required levels. The resolution of Generic Safety Issue (GSI) B-56 embodies the identification of a suitable EDG reliability program structure, revision of pertinent regulatory guides and Tech Specs, and development of an Inspection Module. Resolution of B-56 is coupled to the resolution of Unresolved Safety Issue (USI) A-44, Station Blackout, which resulted in the station blackout rule, 10 CFR 50.63 and Regulatory Guide 1.155, Station Blackout. This paper discusses the principal elements of an EDG reliability program developed for resolving GSI B-56 and related matters
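The 0.95 reliability target from Regulatory Guide 1.155 implies routine bookkeeping of per-demand EDG performance. A hedged illustration follows; the demand and failure counts are invented, and the simple success/demand ratio is only one of several estimators a real program might use.

```python
# Hypothetical sketch: estimating EDG per-demand start reliability from
# surveillance records and comparing it with the RG 1.155 target.
# The counts below are invented for illustration.

def per_demand_reliability(successes, demands):
    """Point estimate of reliability per demand."""
    if demands == 0:
        raise ValueError("no demands recorded")
    return successes / demands

TARGET = 0.95   # minimum EDG reliability cited in RG 1.155

demands, failures = 120, 3   # hypothetical record: 120 starts, 3 failures
rel = per_demand_reliability(demands - failures, demands)
print(round(rel, 3), rel >= TARGET)   # 0.975 True
```

A monitoring program would track this estimate over a rolling window and trigger corrective action when it approaches the target.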
The state of the art in safety and reliability assessment of the software of industrial computer systems is reviewed and likely progress over the next few years is identified and compared with the perceived needs of the user. Some of the current projects contributing to the development of new techniques for assessing software reliability are described. One is the software test and evaluation method which looked at the faults within and between two manufacturers specifications, faults in the codes and inconsistencies between the codes and specifications. The results are given. (author)
Skaler, F.; Djetelic, N.
Operation that is safe, reliable, effective and acceptable to the public is the common message in the mission statements of commercial nuclear power plants (NPPs). To fulfill these goals, the nuclear industry, among other areas, has to focus on: (1) Human Performance (HU) and (2) Equipment Reliability (EQ). The performance objective of HU is as follows: the behaviors of all personnel result in safe and reliable station operation. While unwanted human behaviors in operations mostly result directly in events, behavior flaws in the areas of maintenance or engineering usually cause decreased equipment reliability. Unsatisfactory human performance leads even the best designed power plants into significant operating events, well-known examples of which can be found in the nuclear industry. Equipment reliability is today recognized as the key to success. While human performance at most NPPs has been improving since the start of WANO / INPO / IAEA evaluations, the open energy market has forced nuclear plants to reduce production costs and operate more reliably and effectively. The balance between these two (opposing) goals has made equipment reliability even more important for safe, reliable and efficient production. Nowadays, in a well-developed safety culture and human performance environment, the cost of insisting on on-line operation while ignoring some principles of safety could exceed the cost of electricity losses. In the last decade the leading USA nuclear companies have put a lot of effort into improving equipment reliability, primarily based on the INPO Equipment Reliability Program AP-913 at their NPP stations. The Equipment Reliability Program is the key program not only for safe and reliable operation, but also for Life Cycle Management and Aging Management on the way to nuclear power plant life extension. The purpose of the Equipment Reliability process is to identify, organize, integrate and coordinate equipment reliability activities (preventive and predictive maintenance, maintenance
Brumleve, T D [Plowshare Systems Research Division, Sandia Laboratories, Livermore, CA (United States)
Based on the premise that there will always be a finite chance of a Plowshare project failure, the implications of such a failure are examined. It is suggested that the optimum reliability level will not necessarily be the highest attainable, but rather that which results in minimum average project cost. The type of performance guarantee that the U. S. should provide for nuclear explosive services, the determination of nuclear yield, courses of action to take in the event of failure, and methods to offset remedial costs are discussed. (author)
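The trade-off argued above, that the optimum reliability level minimizes average project cost rather than maximizing reliability, can be sketched numerically. The cost model and every number below are invented for illustration only.

```python
# Illustrative cost-optimal reliability: engineering cost grows as
# reliability approaches 1, while expected remedial cost shrinks.
# The model and parameters are assumptions for this sketch.

def expected_cost(r, base=10.0, effort=5.0, failure_cost=50.0):
    """Average project cost at reliability level r (0 < r < 1)."""
    return base + effort * r / (1.0 - r) + (1.0 - r) * failure_cost

# Grid search for the cost-minimizing reliability level.
levels = [i / 1000 for i in range(500, 1000)]
best = min(levels, key=expected_cost)
print(best)   # the optimum is well below 1.0
```

The optimum sits where the marginal cost of further reliability improvement equals the marginal reduction in expected remedial cost, which is the abstract's point that the highest attainable reliability is not necessarily optimal.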
Engelen, S.; Gill, E.K.A.; Verhoeven, C.J.M.
Satellite swarms, consisting of a large number of identical, miniaturized and simple satellites, are claimed to provide an implementation for specific space missions which require high reliability. However, a consistent model of how reliability and availability on mission level is linked to cost-
Delgadillo, Lucy M.; Bushman, Brittani S.
Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey was administered to young adults at a western state university, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
A holistic approach to service reliability and availability of cloud computing. Reliability and Availability of Cloud Computing provides IS/IT system and solution architects, developers, and engineers with the knowledge needed to assess the impact of virtualization and cloud computing on service reliability and availability. It reveals how to select the most appropriate design for reliability diligence to assure that user expectations are met. Organized in three parts (basics, risk analysis, and recommendations), this resource is accessible to readers of diverse backgrounds and experience le
Reliability of Large and Complex Systems, previously titled Reliability of Large Systems, is an innovative guide to the current state and reliability of large and complex systems. In addition to revised and updated content on the complexity and safety of large and complex mechanisms, this new edition looks at the reliability of nanosystems, a key research topic in nanotechnology science. The author discusses the importance of safety investigation of critical infrastructures that have aged or have been exposed to varying operational conditions. This reference provides an asympt
Full Text Available Performance (availability and yield) and reliability of wind turbines can make the difference between success and failure of wind farm projects, and these factors are vital to decreasing the cost of energy. During the last years, several initiatives have started to gather data on the performance and reliability of wind turbines on- and offshore and have published findings in different journals and conferences. Even though the scopes of the different initiatives are similar, every initiative follows a different approach, and results are therefore difficult to compare. The present paper addresses this issue, collects results of different initiatives and harmonizes them. A short description and assessment of every considered data source is provided. To enable this comparison, the existing reliability characteristics are mapped to a system structure according to the Reference Designation System for Power Plants (RDS-PP®). The review shows a wide variation in the performance and reliability metrics of the individual initiatives. Especially the comparison of onshore wind turbines reveals significant differences between the results. Only a few publications are available on offshore wind turbines, and the results show increasing performance and reliability of offshore wind turbines since the first offshore wind farms were erected and monitored.
Kim, Man Cheol
For the purpose of making system reliability analysis easier and more intuitive, the RBDGG (Reliability Block Diagram with General Gates) methodology was introduced as an extension of the conventional reliability block diagram. The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system, and therefore the modeling of a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the development of the RGGG (Reliability Graph with General Gates) methodology, which is an extension of a conventional reliability graph. The newly proposed methodology is now implemented in a software tool, RBDGG Solver. RBDGG Solver was developed as a WIN32 console application. RBDGG Solver receives information on the failure modes and failure probabilities of each component in the system, along with the connection structure and connection logics among the components in the system. Based on the received information, RBDGG Solver automatically generates a system reliability analysis model for the system, and then provides the analysis results. In this paper, the application of RBDGG Solver to the reliability analysis of an example system, and verification of the calculation results, are provided for the purpose of demonstrating how RBDGG Solver is used for system reliability analysis
Lee, Sang Yong; Jung, Jae Hyun; Kim, Seong Hun
The automatic control systems used in nuclear power plants (NPPs) consist of numerous control modules that can be considered a network of components interconnected in various complex ways. The control modules require relatively high reliability compared to industrial electronic products. Reliability prediction provides a rational basis for system designs and also establishes the safety significance of system operations. The aim of this paper is to minimize the deficiencies of the traditional reliability prediction method by using available field return data; this makes a more realistic reliability assessment possible. SAMCHANG Enterprise Company (SEC) has established a database containing high-quality data at the module and component level, drawn from module maintenance in NPPs. On this basis, this paper compares the results of adding failure records (field data) to the Telcordia SR-332 reliability prediction model with MIL-HDBK-217F prediction results
Cloud, R.L.; Anderson, P.H.; Leung, J.S.M.
The Seismic Stops methodology has been developed to provide a reliable alternative for providing seismic support to nuclear power plant piping. The concept is based on using rigid passive supports with large clearances. These gaps permit unrestrained thermal expansion while limiting excessive seismic displacements. This type of restraint has performed successfully in fossil fueled power plants. A simplified production analysis tool has been developed which evaluates the nonlinear piping response including the effect of the gapped supports. The methodology utilizes the response spectrum approach and has been incorporated into a piping analysis computer program RLCA-GAP. Full scale shake table tests of piping specimens were performed to provide test correlation with the developed methodology. Analyses using RLCA-GAP were in good agreement with test results. A sample piping system was evaluated using the Seismic Stops methodology to replace the existing snubbers with passive gapped supports. To provide further correlation data, the sample system was also evaluated using nonlinear time history analysis. The correlation comparisons showed RLCA-GAP to be a viable methodology and a reliable alternative for snubber optimization and elimination. (orig.)
Wang, Huai; Ma, Ke; Blaabjerg, Frede
Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more efforts are devoted to better power electronic systems in terms of reliability to ensure high … A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis are presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical components, IGBTs. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities to achieve more reliable power electronic systems are addressed.
Solid State Lighting Reliability: Components to Systems begins with an explanation of the major benefits of solid state lighting (SSL) when compared to conventional lighting systems including but not limited to long useful lifetimes of 50,000 (or more) hours and high efficacy. When designing effective devices that take advantage of SSL capabilities the reliability of internal components (optics, drive electronics, controls, thermal design) take on critical importance. As such a detailed discussion of reliability from performance at the device level to sub components is included as well as the integrated systems of SSL modules, lamps and luminaires including various failure modes, reliability testing and reliability performance. This book also: Covers the essential reliability theories and practices for current and future development of Solid State Lighting components and systems Provides a systematic overview for not only the state-of-the-art, but also future roadmap and perspectives of Solid State Lighting r...
Baranowsky, P. W. [U.S. Nuclear Regulatory Commission, Washington, DC (United States)
The reliability of emergency AC power systems has been under study at the U.S. Nuclear Regulatory Commission and by its contractors for several years. This paper provides the results of work recently performed to evaluate past U.S. nuclear power plant emergency AC power system reliability performance using system-level data. Operating experience involving multiple diesel generator failures, unavailabilities, and simultaneous occurrences of failures and out-of-service diesel generators was used to evaluate reliability performance at individual nuclear power plants covering a 9-year period from 1976 through 1984. The number and nature of failures and distributions of reliability evaluation results are provided. The results show that plant-specific performance varied considerably during the period, with a large number of plants achieving high reliability performance and a smaller number accounting for lower levels of reliability performance. (author)
One can also speak of reliability with respect to materials. While for the reliability of components the MTBF (mean time between failures) is regarded as the main criterion, with regard to materials this is replaced by possible failure mechanisms such as physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of the system. The main tasks of the reliability analysis of materials are therefore the prediction of the various failure causes, the identification of interactions, and the development of nondestructive testing methods. (RW)
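For components, the MTBF criterion mentioned above connects to a survival probability under the standard constant-failure-rate assumption: R(t) = exp(-t / MTBF). The sketch below illustrates that textbook relation with invented numbers.

```python
# Component reliability from MTBF under an exponential (constant
# failure rate) model. The hours below are illustrative assumptions.
import math

def reliability(t_hours, mtbf_hours):
    """Survival probability at time t: R(t) = exp(-t / MTBF)."""
    return math.exp(-t_hours / mtbf_hours)

# E.g. a part with an assumed 10,000 h MTBF, operated for 1,000 h:
print(round(reliability(1000, 10000), 4))   # 0.9048
```

Materials reliability, as the abstract notes, resists this single-number treatment because the dominant failure mechanisms are physical or chemical processes rather than random failures at a constant rate.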
Ditlevsen, Ove Dalager; Madsen, H. O.
The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text.
Michael A. Guthrie
Full Text Available An energy-based limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two degree of freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
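For readers unfamiliar with the Hasofer-Lind index mentioned above, it has a closed form in the simplest setting: a linear limit state g = R - S with independent normal capacity R and demand S. The sketch below illustrates that classical case with invented numbers; it is not the paper's energy-based formulation.

```python
# Hasofer-Lind reliability index for the textbook linear case
# g = R - S with independent normal R and S. Numbers are illustrative.
import math

def hasofer_lind_linear(mu_r, sig_r, mu_s, sig_s):
    """beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)."""
    return (mu_r - mu_s) / math.sqrt(sig_r**2 + sig_s**2)

# Assumed capacity N(100, 10) and demand N(60, 10):
beta = hasofer_lind_linear(mu_r=100.0, sig_r=10.0, mu_s=60.0, sig_s=10.0)
print(round(beta, 3))   # 2.828
```

Larger beta means the mean safety margin is more standard deviations away from failure; the paper's contribution is computing this index when g is built from modal strain energies.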
In this paper a methodology for reliability analysis of mechanical systems with latent failures is described. Reliability analysis of such systems must include appropriate usage of check intervals for latent failure detection. The methodology suggests that, based on system logic, the analyst decides at the beginning whether a system can fail actively or latently and propagates this approach through all system levels. All inspections are assumed to be perfect (all failures are detected and repaired, and no new failures are introduced as a result of the maintenance). Additional assumptions are that the mission time is much smaller than the check intervals and that all components have constant failure rates. Analytical expressions for reliability calculations are provided, based on fault tree and Markov modeling techniques (for two- and three-redundant systems with inspection intervals). The proposed methodology yields more accurate results than are obtained by not using check intervals or by using half check interval times. The conventional analysis, which assumes that at the beginning of each mission the system is as new, gives an optimistic prediction of system reliability. Some examples of reliability calculations for mechanical systems with latent failures and of establishing optimum check intervals are provided
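Under the abstract's assumptions (constant failure rates, perfect periodic inspections), a standard building block is the interval-average unavailability of a latently failing component, q_avg = 1 - (1 - exp(-λT)) / (λT), often approximated by λT/2 for small λT. The sketch below illustrates that standard result; the rate and interval are invented, and this is not the paper's full fault-tree/Markov treatment.

```python
# Interval-average latent unavailability under constant failure rate
# and perfect periodic inspection. Rate and interval are illustrative.
import math

def avg_unavailability(lam, t_check):
    """Average probability of being latently failed over one inspection
    interval of length t_check: 1 - (1 - exp(-lam*T)) / (lam*T)."""
    x = lam * t_check
    return 1.0 - (1.0 - math.exp(-x)) / x

lam = 1e-4        # assumed failure rate per hour
t_check = 720.0   # assumed monthly check interval, hours
q = avg_unavailability(lam, t_check)
print(round(q, 4), round(lam * t_check / 2, 4))   # exact vs lam*T/2
```

Shortening the check interval lowers q roughly in proportion, which is the lever behind the optimum-check-interval examples the abstract mentions.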
Kleinhammer, R. K.; Kahn, J. C.
Modern business and technical decisions are based on the results of analyses. When considering assessments using "reliability data", the concern is how long a system will continue to operate as designed. Generally, the results are only as good as the data used. Ideally, a large set of pass/fail tests or observations to estimate the probability of failure of the item under test would produce the best data. However, this is a costly endeavor if used for every analysis and design. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, we attempt to develop the "best" or composite analog data to support our assessments. One method used incorporates processes for reviewing existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component and can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. Data that is more representative of reality and more project specific would provide more accurate analysis, and hopefully a better final decision.
Akhavein, A.; Fotuhi Firuzabad, M.
Reliability of energy supply is one of the most important issues of service quality. On one hand, customers usually have different expectations for service reliability and price. On the other hand, providing different levels of reliability at load points is a challenge for system operators. In order to take reasonable decisions and obviate reliability implementation difficulties, market players need to know the impacts of their assets on system and load-point reliabilities. One tool to specify the reliability impacts of assets is the criticality or reliability importance measure, by which system components can be ranked based on their effect on reliability. Conventional methods for determination of reliability importance are essentially based on risk sensitivity analysis and hence impose a prohibitive calculation burden in large power systems. An approach is proposed in this paper to determine the reliability importance of energy producers from the perspective of consumers or distribution companies in a composite generation and transmission system. In the presented method, while avoiding immense computational burden, the energy producers are ranked based on their rating, unavailability and impact on power flows in the lines connected to the considered load points. Study results on the IEEE reliability test system show successful application of the proposed method. - Research highlights: → Required reliability level at load points is a concern in modern power systems. → It is important to assess reliability importance of energy producers or generators. → Generators can be ranked based on their impacts on power flow to a selected area. → Ranking of generators is an efficient tool to assess their reliability importance.
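For contrast with the conventional risk-sensitivity approach the abstract mentions, the classical Birnbaum importance measure can be sketched in a few lines. The system topology and component reliabilities below are illustrative assumptions, not the paper's IEEE test system.

```python
# Conventional Birnbaum importance for a small illustrative system:
# components A and B in parallel, in series with C.
# System reliability: R = (1 - (1-rA)(1-rB)) * rC

def system_reliability(rA, rB, rC):
    return (1 - (1 - rA) * (1 - rB)) * rC

def birnbaum_importance(r, component):
    # I_B(i) = R(system | i works) - R(system | i fails)
    up = dict(r); up[component] = 1.0
    down = dict(r); down[component] = 0.0
    return (system_reliability(up['A'], up['B'], up['C'])
            - system_reliability(down['A'], down['B'], down['C']))

r = {'A': 0.9, 'B': 0.8, 'C': 0.95}
ranking = sorted(r, key=lambda c: birnbaum_importance(r, c), reverse=True)
# The series component C dominates: its failure always fails the system.
```

Ranking every component this way requires re-evaluating the system model per component, which is exactly the computational burden the paper's method is designed to avoid in large networks.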
Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A
The CMS experiment has adopted a computing system where resources are distributed worldwide in more than 50 sites. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and its capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with the description of monitoring tools and their inclusion into the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and the impact in improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites to conduct workflows, in order to maximize workflow efficiencies. The performance of the sites against these tests during the first years of LHC running is also reviewed.
Full Text Available Abstract Background Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubt on their reliability. Results We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy is relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis testing framework that for the first time provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
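The abstract's framework provides FDR control for signature discovery; as a point of reference, the standard Benjamini-Hochberg procedure (not the paper's specific test) can be sketched on hypothetical p-values:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    # Return indices of hypotheses rejected at FDR level alpha:
    # find the largest rank k with p_(k) <= k * alpha / m, then
    # reject the k smallest p-values.
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

# Hypothetical p-values for eight candidate signature genes.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.4, 0.9]
rejected = benjamini_hochberg(pvals, alpha=0.05)
```

Note that with the step-up thresholds 0.00625·rank, only the two smallest p-values survive here, even though several others are below 0.05 individually.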
The primary purposes of the information in these reports are the following: to provide operating statistics of safety-related systems within a unit which may be used to compare and evaluate reliability performance, and to provide failure mode and failure rate statistics on components which may be used in failure mode effects analysis, fault hazard analysis, probabilistic reliability analysis, and so forth.
Volkanovski, A.; Cepin, M.; Mavko, B.
The power system reliability analysis method is developed from the aspect of reliable delivery of electrical energy to customers. The method is developed based on fault tree analysis, which is widely applied in Probabilistic Safety Assessment (PSA), and is adapted for power system reliability analysis. The method is developed in such a way that only the basic reliability parameters of the analysed power system are necessary as input for the calculation of the reliability indices of the system. The modeling and analysis were performed on an example power system consisting of eight substations. The results include the level of reliability of the current power system configuration, the combinations of component failures resulting in a failed power delivery to loads, and the importance factors for components and subsystems. (author)
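The fault-tree quantification the abstract describes typically reduces to evaluating minimal cut sets, the combinations of component failures that fail power delivery. A minimal sketch with illustrative unavailabilities and cut sets (not the paper's eight-substation system):

```python
# Rare-event (upper-bound) approximation of system failure probability
# from minimal cut sets, as in fault-tree quantification.
# Component unavailabilities and cut sets are illustrative only.

q = {'T1': 1e-3, 'T2': 1e-3, 'L1': 5e-4, 'L2': 5e-4, 'B': 1e-5}

# Each minimal cut set is a set of components whose joint failure
# fails power delivery to the load.
cut_sets = [{'B'}, {'T1', 'T2'}, {'L1', 'L2'}, {'T1', 'L2'}]

def cut_set_prob(cs):
    p = 1.0
    for comp in cs:
        p *= q[comp]
    return p

# Rare-event approximation: sum the cut-set probabilities.
q_system = sum(cut_set_prob(cs) for cs in cut_sets)
```

The single-component cut set (the bus 'B' here) dominates the sum, which is why cut sets of order one are flagged first in such analyses.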
Harkness, H.H.; Belytschko, T.; Liu, W.K.
Fatigue reliability is addressed by the first-order reliability method combined with a finite element method. Two-dimensional finite element models of components with cracks in mode I are considered with crack growth treated by the Paris law. Probability density functions of the variables affecting fatigue are proposed to reflect a setting where nondestructive evaluation is used, and the Rosenblatt transformation is employed to treat non-Gaussian random variables. Comparisons of the first-order reliability results and Monte Carlo simulations suggest that the accuracy of the first-order reliability method is quite good in this setting. Results show that the upper portion of the initial crack length probability density function is crucial to reliability, which suggests that if nondestructive evaluation is used, the probability of detection curve plays a key role in reliability. (orig.)
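The finding that the upper tail of the initial crack length distribution drives reliability can be illustrated with a crude Monte Carlo companion to the first-order reliability method. All numbers below (Paris constants, stress range, crack-size distribution) are illustrative assumptions, not the paper's data.

```python
import math
import random

# Paris law constants: da/dN in m/cycle, dK in MPa*sqrt(m) (assumed units).
C, m = 1e-12, 3.0
d_sigma, Y = 100.0, 1.0      # stress range [MPa], geometry factor
a_crit = 0.02                # critical crack length [m]
N_design = 2e6               # design life [cycles]

def cycles_to_failure(a0):
    # Closed-form integration of da/dN = C*(dK)^m with dK = Y*d_sigma*sqrt(pi*a),
    # growing the crack from a0 to a_crit (valid for m != 2).
    e = 1.0 - m / 2.0
    return (a_crit**e - a0**e) / (C * (Y * d_sigma * math.sqrt(math.pi))**m * e)

random.seed(0)
trials = 20000
fails = sum(
    1 for _ in range(trials)
    if cycles_to_failure(random.lognormvariate(math.log(1e-3), 1.0)) < N_design
)
pf = fails / trials  # dominated by the upper tail of the initial-crack sizes
```

Only the rare large initial cracks fail within the design life, which mirrors the paper's point that the probability-of-detection curve of nondestructive evaluation (which truncates that upper tail) plays the key role.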
Green, A.E.; Bourne, A.J.
Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which have been outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on the understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions.
Hwang, Seok Won; Lee, Chang Ju; Sung, Key Yong
Probabilistic safety assessment (PSA) is a systematic technique which estimates the degree of risk impact to the public due to an accident scenario. Estimating the occurrence frequencies and consequences of potential scenarios requires a thorough analysis of the accident details and all fundamental parameters. The robustness of PSA in checking weaknesses in a design and operation will allow a better informed and balanced decision to be reached. The fundamental parameters for PSA, such as the component failure rates, should be estimated under the condition of steady collection of evidence throughout the operational period. However, since data from any single plant are not sufficient to provide an adequate PSA result, in practice the operating data of all plants are commonly used to estimate the reliability parameters for the same type of components. The reliability data of any component type consist of two categories: the generic, which is based on the operating experience of all plants, and the plant-specific, which is based on the operation of the specific plant of interest. Generic data are highly essential for new or recently-built nuclear power plants (NPPs). Generally, the reliability database may be categorized into component reliability, initiating event frequencies, human performance, and so on. Among these data, component reliability seems a key element because it has the most abundant population. Therefore, component reliability data are essential for the quantification of accident sequences because they become inputs to the various basic events which constitute the fault tree.
Chavaillaz, Alain; Sauer, Juergen
This experiment examined how operators coped with a change in system reliability between training and testing. Forty participants were trained for 3 h on a complex process control simulation modelling six levels of automation (LOA). In training, participants either experienced a high- (100%) or low-reliability system (50%). The impact of training experience on operator behaviour was examined during a 2.5 h testing session, in which participants either experienced a high- (100%) or low-reliability system (60%). The results showed that most operators did not often switch between LOA. Most chose an LOA that relieved them of most tasks but maintained their decision authority. Training experience did not have a strong impact on the outcome measures (e.g. performance, complacency). Low system reliability led to decreased performance and self-confidence. Furthermore, complacency was observed under high system reliability. Overall, the findings suggest benefits of adaptable automation because it accommodates different operator preferences for LOA. Practitioner Summary: The present research shows that operators can adapt to changes in system reliability between training and testing sessions. Furthermore, it provides evidence that each operator has his/her preferred automation level. Since this preference varies strongly between operators, adaptable automation seems to be suitable to accommodate these large differences.
Brandhorst, Henry W., Jr.; Rodiek, Julie A.
Providing reliable power over the anticipated mission life is critical to all satellites; therefore solar arrays are one of the most vital links to satellite mission success. Furthermore, solar arrays are exposed to the harshest environment of virtually any satellite component. In the past 10 years 117 satellite solar array anomalies have been recorded with 12 resulting in total satellite failure. Through an in-depth analysis of satellite anomalies listed in the Airclaim's Ascend SpaceTrak database, it is clear that solar array reliability is a serious, industry-wide issue. Solar array reliability directly affects the cost of future satellites through increased insurance premiums and a lack of confidence by investors. Recommendations for improving reliability through careful ground testing, standardization of testing procedures such as the emerging AIAA standards, and data sharing across the industry will be discussed. The benefits of creating a certified module and array testing facility that would certify in-space reliability will also be briefly examined. Solar array reliability is an issue that must be addressed to both reduce costs and ensure continued viability of the commercial and government assets on orbit.
This book explains reliability techniques with examples from electronics design for the benefit of engineers. It presents the application of de-rating, FMEA, overstress analyses and reliability improvement tests for designing reliable electronic equipment. Adequate information is provided for designing a computerized reliability database system to support the application of the techniques by designers. Pedantic terms and the associated mathematics of the reliability engineering discipline are excluded for ease of comprehension and practical application. This book offers excellent support
For an exact evaluation of the reliability of a structure, it appears necessary to determine the distribution densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long activity period. Where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP)
Several communications in this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by Diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by the means of functional analysis; operation characteristic analysis for a power industry plant park, as a function of influence parameters
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.
Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro
Calibration of partial safety factors is considered in general, including classes of structures where no code exists beforehand. The partial safety factors are determined such that the difference between the reliability for the different structures in the class considered and a target reliability level is minimized. Code calibration on a decision-theoretical basis is also considered, and it is shown how target reliability indices can be calibrated. Results from code calibration for rubble mound breakwater designs are shown.
Harms, E. Jr.
This paper reports on the reliability of the Fermilab Antiproton source since it began operation in 1985. Reliability of the complex as a whole as well as subsystem performance is summarized. Also discussed is the trending done to determine causes of significant machine downtime and actions taken to reduce the incidence of failure. Finally, results of a study to detect previously unidentified reliability limitations are presented
Normand, J.; Charon, M.
Concern for obtaining high-quality products which will function properly when required to do so is nothing new - it is one manifestation of a conscientious attitude to work. However, the complexity and cost of equipment and the consequences of even temporary immobilization are such that it has become necessary to make special arrangements for obtaining high-quality products and examining what one has obtained. Each unit within an enterprise must examine its own work or arrange for it to be examined; a unit whose specific task is quality assurance is responsible for overall checking, but does not relieve other units of their responsibility. Quality assurance is a form of mutual assistance within an enterprise, designed to remove the causes of faults as far as possible. It begins very early in a project and continues through the ordering stage, construction, start-up trials and operation. Quality and hence reliability are the direct result of what is done at all stages of a project. They depend on constant attention to detail, for even a minor piece of poor workmanship can, in the case of an essential item of equipment, give rise to serious operational difficulties
Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju
This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; reliability and failure-rate functions; life distributions and reliability assumptions; reliability of non-repairable systems; reliability of repairable systems; reliability sampling tests; failure analysis, such as analysis by FMEA and FTA, with cases; accelerated life testing, including basic concepts, acceleration and acceleration factors, and the analysis of accelerated life testing data; and maintenance policies concerning replacement and inspection.
Jung, Hoan Sung; Seong, Poong Hyun
It has been a critical issue to predict safety-critical software reliability in the nuclear engineering area. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability from the failure data collected during testing, assuming that the test environment well represents the operational profile. The user's interest, however, is in the operational reliability rather than the test reliability. Experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment from testing to operation, testing environment factors comprising an aging factor and a coverage factor are developed in this paper and used to predict the ultimate operational reliability from the failure data of the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig
Vaniachine, A; The ATLAS collaboration; Karpenko, D
During three years of LHC data taking, the ATLAS collaboration completed three petascale data reprocessing campaigns on the Grid, with up to 2 PB of data being reprocessed every year. In reprocessing on the Grid, failures can occur for a variety of reasons, while Grid heterogeneity makes failures hard to diagnose and repair quickly. As a result, Big Data processing on the Grid must tolerate a continuous stream of failures, errors and faults. While ATLAS fault-tolerance mechanisms improve the reliability of Big Data processing in the Grid, their benefits come at costs and result in delays making the performance prediction difficult. Reliability Engineering provides a framework for fundamental understanding of the Big Data processing on the Grid, which is not a desirable enhancement but a necessary requirement. In ATLAS, cost monitoring and performance prediction became critical for the success of the reprocessing campaigns conducted in preparation for the major physics conferences. In addition, our Reliability...
Sang, Z.F.; Zhu, Y.Z.; Widera, G.E.O.
The main purpose of this paper is to provide an applicable method to establish reliability factors for expanded tube-to-tubesheet joints. The paper also reports on the results of a preliminary study to validate experimentally the reliability efficiencies listed in Table A-2 of Appendix A of Section VIII, Division 1, of the ASME Boiler and Pressure Vessel Code. A comparison between the actual reliability factors f_r, determined from testing the damage strength of the joint and calculated according to Appendix A-4 of the ASME Code, and those of Table A-2 is carried out. The results are discussed in light of the restrictions inherent in Table A-2. It is confirmed that some existing values of f_r are conservative while others are less so. (orig.)
Mehta, Atul C.; Bodie, Charles C.
Due to requirements for reduced size and weight, use of grid array packages in space applications has become commonplace. To meet the requirement of high reliability and a high number of I/Os, ceramic column grid array (CCGA) packages were selected for major electronic components used in the next MARS Rover mission (specifically, high-density Field Programmable Gate Arrays). The probability of removal and replacement of these devices on the actual flight printed wiring board assemblies is deemed to be very high, because last-minute discoveries in final test will dictate changes in the firmware. The questions and challenges presented to the manufacturing organizations engaged in the production of high-reliability electronic assemblies are: Is the reliability of the PWBA adversely affected by rework (removal and replacement) of the CGA package? And how many times can the same board be reworked without destroying a pad or degrading the lifetime of the assembly? To answer these questions, the most complex printed wiring board assembly used by the project was chosen as the test vehicle; the PWB was modified to provide a daisy-chain pattern, and a number of bare PWBs were acquired to this modified design. Non-functional 624-pin CGA packages with internal daisy chains matching the pattern on the PWB were procured. The combination of the modified PWB and the daisy-chained packages enables continuity measurements of every soldered contact during subsequent testing and thermal cycling. Several test vehicle boards were assembled, reworked and then thermally cycled to assess the reliability of the solder joints and board material, including pads and traces near the CGA. The details of the rework process and the results of thermal cycling are presented in this paper.
Antonovsky, A.; Pollock, C.; Straker, L.
The aim of this research was to understand the relationship between maintenance staff perceptions of organisational effectiveness and operational reliability in petroleum operations. Engineering measures exist that assess the effectiveness of maintenance and reliability of equipment. These measures are typically retrospective and may not provide insight into what impedes system reliability. Perceptions of organisational effectiveness by the workforce may provide a predictive measure that could improve our understanding of the human factors that influence system reliability. Maintenance personnel (n=133) from nine petroleum production facilities completed a survey as part of a study of human factors and maintenance reliability. 69 respondents (51.9%) provided comments to an open-ended question in the survey, and these data were analysed using Interpretive Phenomenological Analysis to extract themes. Four super-ordinate themes were identified from the analysis: 1) Communication and access to information, 2) Efficiency of current work systems, 3) Need for better workgroup support, and 4) Management impacts on the workplace. We found a significant relationship between the frequency of the four super-ordinate themes and the facility reliability level as measured by 'Mean Time Between Failures': χ²(6, N=158) = 16.2, p=.013. These results demonstrated that operational effectiveness might be differentiated on the basis of survey-derived perceptions of maintenance personnel. - Highlights: • Thematic analysis of survey comments provided insights into workplace reliability • Worker's comments on reliability related to technical data on time between failures • Management decision-making was the main theme in the lower reliability workplaces • Improving efficiency was the main theme in the higher reliability workplaces • Communication and better workgroup support were themes at all reliability levels
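The form of the reported test, a chi-square of theme frequencies against reliability levels, can be sketched without any statistics library. The contingency table below is entirely hypothetical (the study's raw counts are not given in the abstract); the critical value 12.59 is the standard chi-square threshold for df=6 at α=.05.

```python
# Hypothetical 4x3 contingency table: counts of the four themes across
# low/medium/high reliability facilities (illustrative, not study data).
table = [
    [12, 9, 5],   # Communication and access to information
    [6, 10, 15],  # Efficiency of current work systems
    [8, 7, 6],    # Need for better workgroup support
    [24, 8, 6],   # Management impacts on the workplace
]

rows = [sum(r) for r in table]
cols = [sum(c) for c in zip(*table)]
n = sum(rows)

# Pearson chi-square statistic: sum of (observed - expected)^2 / expected.
chi2 = 0.0
for i, row in enumerate(table):
    for j, obs in enumerate(row):
        exp = rows[i] * cols[j] / n
        chi2 += (obs - exp) ** 2 / exp

df = (len(table) - 1) * (len(table[0]) - 1)      # (4-1)*(3-1) = 6
significant = chi2 > 12.59                        # critical value, df=6, alpha=.05
```

In these made-up counts, "Management impacts" clusters in the low-reliability column and "Efficiency" in the high-reliability column, echoing the pattern in the study's highlights.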
Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo
Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical
Liu, Zhunga; Pan, Quan; Dezert, Jean; Han, Jun-Wei; He, You
Classifier fusion is an efficient strategy to improve the classification performance for complex pattern recognition problems. In practice, the multiple classifiers to combine can have different reliabilities, and proper reliability evaluation plays an important role in the fusion process for getting the best classification performance. We propose a new method for classifier fusion with contextual reliability evaluation (CF-CRE) based on inner reliability and relative reliability concepts. The inner reliability, represented by a matrix, characterizes the probability of the object belonging to one class when it is classified to another class. The elements of this matrix are estimated from the K-nearest neighbors of the object. A cautious discounting rule is developed under the belief functions framework to revise the classification result according to the inner reliability. The relative reliability is evaluated based on a new incompatibility measure which makes it possible to reduce the level of conflict between the classifiers by applying the classical evidence discounting rule to each classifier before their combination. The inner reliability and relative reliability capture different aspects of the classification reliability. The discounted classification results are combined with Dempster-Shafer's rule for the final class decision making support. The performance of CF-CRE has been evaluated and compared with that of the main classical fusion methods using real data sets. The experimental results show that CF-CRE can produce substantially higher accuracy than other fusion methods in general. Moreover, CF-CRE is robust to the changes of the number of nearest neighbors chosen for estimating the reliability matrix, which is appealing for the applications.
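The classical building blocks the abstract refers to, evidence discounting and Dempster's combination rule, can be sketched for a two-class frame. The reliability factors and classifier outputs below are illustrative assumptions, not CF-CRE's contextually learned values.

```python
# Two-class frame {a, b}; masses are indexed by 'a', 'b' and 'ab' (ignorance).

def discount(m, alpha):
    # Shafer's classical discounting: scale focal masses by the reliability
    # factor alpha and move the remainder to total ignorance 'ab'.
    out = {k: alpha * v for k, v in m.items() if k != 'ab'}
    out['ab'] = 1.0 - alpha * (1.0 - m.get('ab', 0.0))
    return out

def dempster(m1, m2):
    # Dempster's rule on the 2-class frame: conjunctive combination
    # normalized by the non-conflicting mass k.
    conflict = m1['a'] * m2['b'] + m1['b'] * m2['a']
    k = 1.0 - conflict
    fused = {
        'a': (m1['a'] * m2['a'] + m1['a'] * m2['ab'] + m1['ab'] * m2['a']) / k,
        'b': (m1['b'] * m2['b'] + m1['b'] * m2['ab'] + m1['ab'] * m2['b']) / k,
    }
    fused['ab'] = 1.0 - fused['a'] - fused['b']
    return fused

c1 = discount({'a': 0.9, 'b': 0.1, 'ab': 0.0}, alpha=0.8)  # reliable classifier
c2 = discount({'a': 0.2, 'b': 0.8, 'ab': 0.3 * 0}, alpha=0.3)  # unreliable one
fused = dempster(c1, c2)
decision = max(('a', 'b'), key=lambda cls: fused[cls])
```

Because the unreliable classifier is heavily discounted toward ignorance, its vote for class b barely dents the reliable classifier's support for class a, which is the basic mechanism CF-CRE refines with context-dependent reliability.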
Ayers, Mark L
"Increasing system complexity requires new, more sophisticated tools for system modeling and metric calculation. Bringing the field up to date, this book provides telecommunications engineers with practical tools for analyzing, calculating, and reporting availability, reliability, and maintainability metrics. It gives the background in system reliability theory and covers in-depth applications in fiber optic networks, microwave networks, satellite networks, power systems, and facilities management. Computer programming tools for simulating the approaches presented, using the Matlab software suite, are also provided"
Ellingwood, B.; Bhattacharya, B.; Zheng, R.
Steel containments and liners in nuclear power plants may be exposed to aggressive environments that may cause their strength and stiffness to decrease during the plant service life. Among the factors recognized as having the potential to cause structural deterioration are uniform, pitting or crevice corrosion; fatigue, including crack initiation and propagation to fracture; elevated temperature; and irradiation. The evaluation of steel containments and liners for continued service must provide assurance that they are able to withstand future extreme loads during the service period with a level of reliability that is sufficient for public safety. Rational methodologies to provide such assurances can be developed using modern structural reliability analysis principles that take uncertainties in loading, strength, and degradation resulting from environmental factors into account. The research described in this report is in support of the Steel Containments and Liners Program being conducted for the US Nuclear Regulatory Commission by the Oak Ridge National Laboratory. The research demonstrates the feasibility of using reliability analysis as a tool for performing condition assessments and service life predictions of steel containments and liners. Mathematical models that describe time-dependent changes in steel due to aggressive environmental factors are identified, and statistical data supporting the use of these models in time-dependent reliability analysis are summarized. The analysis of steel containment fragility is described, and simple illustrations of the impact on reliability of structural degradation are provided. The role of nondestructive evaluation in time-dependent reliability analysis, both in terms of defect detection and sizing, is examined. A Markov model provides a tool for accounting for time-dependent changes in damage condition of a structural component or system. 151 refs
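The Markov model mentioned above tracks a component through discrete damage conditions. A minimal sketch, with an illustrative state chain and made-up one-year transition probabilities (not the report's statistical data):

```python
# Damage-state Markov chain: intact -> pitted -> cracked -> failed,
# with illustrative one-year transition probabilities.
P = [
    [0.95, 0.05, 0.00, 0.00],  # intact
    [0.00, 0.90, 0.10, 0.00],  # pitted
    [0.00, 0.00, 0.85, 0.15],  # cracked
    [0.00, 0.00, 0.00, 1.00],  # failed (absorbing)
]

def step(state_probs, P):
    # One year of evolution: multiply the state vector by the matrix.
    n = len(P)
    return [sum(state_probs[i] * P[i][j] for i in range(n)) for j in range(n)]

probs = [1.0, 0.0, 0.0, 0.0]  # start fully intact
for year in range(40):
    probs = step(probs, P)

p_failed_40yr = probs[3]  # probability of reaching the failed state in 40 years
```

Conditioning such transition probabilities on inspection outcomes is where nondestructive evaluation (defect detection and sizing) enters the time-dependent reliability analysis.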
Zheng Liangang; Lu Yongbo
This paper performs a reliability analysis of a reactor pressure vessel (RPV) with ANSYS. The analysis methods include the direct Monte Carlo simulation method, Latin Hypercube Sampling, central composite design and Box-Behnken matrix design. The RPV integrity reliability under the given input conditions is obtained. The results show that the factors affecting the RPV base material reliability are the internal pressure, the allowable basic stress and the elasticity modulus of the base material, in descending order, and the factors affecting the bolt reliability are the allowable basic stress of the bolt material, the bolt preload and the internal pressure, in descending order. (authors)
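Latin Hypercube Sampling, one of the methods listed, stratifies each input distribution so that every equal-probability interval is sampled once. A minimal sketch for a generic stress-strength limit state, with illustrative distributions rather than the paper's RPV model:

```python
import random
from statistics import NormalDist

# LHS estimate of failure probability for the limit state g = R - S < 0
# (strength minus load effect; all numbers are illustrative).

def lhs_uniform(n, rng):
    # One random point in each of n equal-probability strata, shuffled
    # so the two variables are paired randomly.
    pts = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(pts)
    return pts

rng = random.Random(42)
n = 2000
strength = NormalDist(mu=450.0, sigma=40.0)  # resistance R [MPa]
stress = NormalDist(mu=350.0, sigma=30.0)    # load effect S [MPa]

u_r = lhs_uniform(n, rng)
u_s = lhs_uniform(n, rng)
failures = sum(
    1 for ur, us in zip(u_r, u_s)
    if strength.inv_cdf(ur) - stress.inv_cdf(us) < 0.0
)
pf = failures / n
```

Compared with plain Monte Carlo at the same sample size, the stratified marginals reduce the variance of the estimate, which is why LHS is attractive when each sample requires a finite element solve.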
Kapp, K; Daum, R [Karlsruhe Univ. (TH) (Germany, F.R.). Lehrstuhl fuer Angewandte Informatik, Transport- und Verkehrssysteme
Automated technical systems have to meet very high requirements concerning safety, security and reliability. Today, modern computers, especially microcomputers, are used as integral parts of those systems. In consequence, computer programs must work in a safe and reliable manner. Methods are discussed which allow the construction of safe and reliable software for automatic systems such as reactor protection systems, and which prove that the safety requirements are met. As a result it is shown that only the method of total software diversification can satisfy all safety requirements at tolerable cost. In order to achieve a high degree of reliability, structured and modular programming in conjunction with high-level programming languages are recommended.
Kalyani, B. J. D.; Rao, Kolasani Ramchand H.
Business organizations nowadays function with more than one cloud provider. Spreading cloud deployment across multiple service providers creates space for competitive prices that minimize the burden on an enterprise's spending budget. To assess the software reliability of a multi-cloud application, a layered software reliability assessment paradigm is considered with three levels of abstraction: the application layer, the virtualization layer, and the server layer. The reliability of each layer is assessed separately and combined to obtain the reliability of the multi-cloud computing application. In this paper, we focus on how to assess the reliability of the server layer with the required algorithms, and explore the steps in the assessment of server reliability.
Freight transportation provides a significant contribution to our nation's economy. A reliable and accessible freight network enables businesses in the Twin Cities to be more competitive in the Upper Midwest region. Accurate and reliable freight data...
Shang Yanlong; Cai Qi; Zhao Xinwen; Chen Ling
By taking into account the effect of degradation due to internal vibration and external shocks, and based on the service environment and degradation mechanism of the nuclear power plant coolant pump, a multi-state reliability model of the coolant pump was proposed for a system that involves a competitive failure process between shocks and degradation. Using this model, the degradation state probability and system reliability were obtained under the consideration of internal vibration and external shocks for the degraded coolant pump. It provides an effective method for the reliability analysis of coolant pumps in nuclear power plants based on the operating environment. The results can provide a decision-making basis for design changes and maintenance optimization. (authors)
A methodology is presented in this paper to evaluate the time-dependent system reliability of a pipeline segment that contains multiple active corrosion defects and is subjected to stochastic internal pressure loading. The pipeline segment is modeled as a series system with three distinctive failure modes due to corrosion, namely small leak, large leak and rupture. The internal pressure is characterized as a simple discrete stochastic process that consists of a sequence of independent and identically distributed random variables each acting over a period of one year. The magnitude of a given sequence follows the annual maximum pressure distribution. The methodology is illustrated through a hypothetical example. Furthermore, the impact of the spatial variability of the pressure loading and pipe resistances associated with different defects on the system reliability is investigated. The analysis results suggest that the spatial variability of pipe properties has a negligible impact on the system reliability. On the other hand, the spatial variability of the internal pressure, initial defect sizes and defect growth rates can have a significant impact on the system reliability.
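The ingredients of such a time-dependent analysis, a sequence of independent annual maximum pressures acting on a set of growing defects in a series system, can be sketched by crude Monte Carlo. Every distribution and constant below is an illustrative assumption, not the paper's model, and the three failure modes are collapsed into a single exceedance check:

```python
import math
import random

def simulate_pipeline(years=20, n_defects=3, trials=20_000, seed=1):
    """Toy Monte Carlo for a corroding pipeline segment as a series system.

    Each year an independent annual-maximum pressure (Gumbel-distributed,
    hypothetical parameters) acts on every defect; defect depth grows
    linearly; the segment fails when any defect's remaining resistance is
    exceeded. Returns the estimated probability of surviving all years.
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        depths = [rng.uniform(0.1, 0.3) for _ in range(n_defects)]    # initial depth fraction
        rates = [rng.uniform(0.005, 0.02) for _ in range(n_defects)]  # growth per year
        ok = True
        for year in range(1, years + 1):
            # Inverse-CDF sample of a Gumbel annual maximum pressure.
            p_max = 8.0 + 1.0 * -math.log(-math.log(rng.random()))
            for d0, rate in zip(depths, rates):
                resistance = 15.0 * (1.0 - (d0 + rate * year))  # toy burst model
                if p_max >= resistance:
                    ok = False
                    break
            if not ok:
                break
        survived += ok
    return survived / trials
```

Spatial correlation effects like those studied in the paper would enter here through shared versus independent draws of the pressure and defect variables.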
Zhang Li; Hu Hong; Li Pengcheng; Jiang Jianjun; Yi Cannan; Chen Qingqing
In order to build a quantitative model to analyze operators' monitoring behavior reliability in the digital main control room of nuclear power plants, based on an analysis of the design characteristics of the digital main control room of a nuclear power plant and of operators' monitoring behavior, and combining this with the operators' monitoring process, monitoring behavior reliability was divided into three parts: information transfer reliability among screens, inside-screen information sampling reliability, and information detection reliability. A quantitative calculation model of information transfer reliability among screens was established based on Senders's monitoring theory; the inside-screen information sampling reliability model was established based on the allocation theory of attention resources; and, considering performance shaping factor causality, a fuzzy Bayesian method was presented to quantify information detection reliability, with an example of its application given. The results show that the established model of monitoring behavior reliability gives an objective description of the monitoring process, which can quantify monitoring reliability and overcome the shortcomings of traditional methods. Therefore, it provides theoretical support for operators' monitoring behavior reliability analysis in the digital main control room of nuclear power plants and improves the precision of human reliability analysis. (authors)
Nuclear power plants and, in particular, reactor pressure boundary components have unique reliability requirements, in that usually no significant redundancy is possible, and a single failure can give rise to possible widespread core damage and fission product release. Reliability may be required for availability or safety reasons, but in the case of the pressure boundary and certain other systems safety may dominate. Possible Safety and Reliability (S and R) criteria are proposed which would produce acceptable reactor design. Without some S and R requirement the designer has no way of knowing how far he must go in analysing his system or component, or whether his proposed solution is likely to gain acceptance. The paper shows how reliability targets for given components and systems can be individually considered against the derived S and R criteria at the design and construction stage. Since in the case of nuclear pressure boundary components there is often very little direct experience on which to base reliability studies, relevant non-nuclear experience is examined. (author)
Jensen, Carsten Lynge; Hansen, Lars Gårn; Fjordbak, Troels
Experimental evidence of the effect of providing households with cheap energy saving technology is sparse. We present results from a field experiment in which autopoweroff plugs were provided free of charge to randomly selected households. We use propensity score matching to find treatment effects...
U.S. Department of Health & Human Services — The MAX Provider Characteristics (PC) File Implementation Report describes the design, implementation, and results of the MAXPC prototype, which was based on three...
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
Many models have been developed for predicting software reliability. These reliability models are restricted to particular types of methodologies and a restricted number of parameters. There are a number of techniques and methodologies that may be used for reliability prediction, so there is a need to focus on parameter selection when estimating reliability. The reliability of a system may increase or decrease depending on the parameters used; thus there is a need to identify the factors that most heavily affect the reliability of the system. Nowadays, reusability is widely used in various areas of research. Reusability is the basis of Component-Based Systems (CBS): cost, time and human skill can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities are available for applying soft computing techniques to medicine-related problems: the clinical science of medicine uses fuzzy logic and neural network methodology significantly, while the basic science of medicine uses neural-network genetic algorithms most frequently and preferably. There is considerable interest among medical scientists in using the various soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with a saving of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing
Kammerer, Catherine C.
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
Woods, D.D.; Hitchler, M.J.; Rumancik, J.A.
This chapter examines some problems in current methods used to assess reactor operator reliability at cognitive tasks and discusses new approaches to solve these problems. The two types of human failure are errors in the execution of an intention and errors in the formation/selection of an intention. Topics considered include the types of description, error correction, cognitive performance and response time, the speed-accuracy tradeoff function, function-based task analysis, and cognitive task analysis. One problem of human reliability analysis (HRA) techniques in general is the question of what are the units of behavior whose reliability is to be determined. A second problem for HRA is that people often detect and correct their own errors. The use of function-based analysis, which maps the problem space for plant control, is recommended
Ribeiro, A.A.T.; Muniz, A.A.
Results of the analysis of factors influencing the reliability of international nuclear power plants of the PWR type are presented. The reliability factor is estimated and the probability of its having lower values than a certain specified value is discussed. (Author) [pt
Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.
Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on the several classes of errors encountered during the pre-clinical release of the PACS during the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitor routines of critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.
"Reliability of GaAs Injection Lasers", De Loach, B. C., Jr., 1973 IEEE/OSA Conference on Laser Engineering and Applications; IEEE Transactions on Reliability, Vol. R-23, No. 4, 226-30, October 1974. ... operation at ... deg C, mounted on a 4-inch square, 0.250-inch thick alloy aluminum panel. This mounting technique should be taken into consideration
Barranco, Manuel; Proenza, Julián; Almeida, Luís
Fieldbuses targeted at highly dependable distributed embedded systems are shifting from bus to star topologies. Surprisingly, despite the efforts in this direction, engineers lack analyses that quantitatively characterize the system reliability achievable by buses and stars. Thus, to guide engineers in developing adequate bus and star fieldbuses, this work models, quantifies and compares the system reliability provided by simplex buses and stars for the case of the Controller Area Network (CAN). It clarifies how relevant dependability-related aspects affect reliability, refuting some intuitive ideas and revealing some previously unknown bus and star benefits. - Highlights: • SAN models that quantify the reliability of simplex buses/stars in fieldbuses. • Models cover relevant system dependability-related features abstracted away in the literature. • Results refute intuitive ideas about buses and stars and show some unexpected effects. • Models and results can guide the design of reliable simplex bus/star fieldbuses
Jung, Hoan Sung; Seong, Poong Hyun
For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability from the failure data collected during testing, assuming that the test environments well represent the operational profile. Experience shows that the operational reliability is higher than the test reliability. The user's interest, however, is in the operational reliability rather than the test reliability. Under the assumption that the difference in reliability results from the change of environment, testing environment factors comprising an aging factor and a coverage factor are defined in this study to predict the ultimate operational reliability from the failure data. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results are close to the actual data
Shimizu, Tomoya; Saiki, Akira; Hirai, Kenji; Jota, Masayoshi; Fujii, Mikiya
Although CRT processors have been employed by the main control board to reduce the operator's workload during monitoring, the control systems are still operated by hardware switches. For further advancement, direct controller operation through a display device is expected. A CRT processor providing direct controller operation must be as reliable as the hardware switches are. The authors are developing a new type of highly reliable CRT processor that enables direct controller operations. In this paper, we discuss the design principles behind a highly reliable CRT processor. The principles are defined by studies of software reliability and of the functional reliability of the monitoring and operation systems. The functional configuration of an advanced CRT processor is also addressed. (author)
This book presents the state-of-the-art in quality and reliability engineering from a product life cycle standpoint. Topics in reliability include reliability models, life data analysis and modeling, design for reliability and accelerated life testing, while topics in quality include design for quality, acceptance sampling and supplier selection, statistical process control, production tests such as screening and burn-in, warranty and maintenance. The book provides comprehensive insights into two closely related subjects, and includes a wealth of examples and problems to enhance reader comprehension and link theory and practice. All numerical examples can be easily solved using Microsoft Excel. The book is intended for senior undergraduate and post-graduate students in related engineering and management programs such as mechanical engineering, manufacturing engineering, industrial engineering and engineering management programs, as well as for researchers and engineers in the quality and reliability fields. D...
Ehsani, A.; Ranjbar, A. M.; Fotuhi Firuzabad, M.; Ehsani, M.
Recently, in many countries, the electric utility industry has been undergoing considerable changes with regard to its structure and regulation. It can be clearly seen that the thrust towards privatization and deregulation or re-regulation of the electric utility industry will introduce numerous reliability problems that will require new criteria and analytical tools recognizing the residual uncertainties in the new environment. In this paper, the different risks and uncertainties in competitive electricity markets are briefly introduced; the approaches of customers, operators, planners, generation bodies and network providers to the reliability of the deregulated system are studied; the impact of dispersed generation on system reliability is evaluated; and finally, the reliability cost/reliability worth issues in the new competitive environment are considered
In Japan, which lacks energy resources, it is our basic energy policy to accelerate the development program of nuclear power, thereby reducing our dependence. As referred to in the foregoing, every effort has been exerted on our part to improve the PWR system reliability by dint of the so-called 'homemade' TQC activities, which are our brain-child, the result of applying to the energy industry the quality control philosophy developed in the field of manufacturing industry
Johnson, G.; Lawrence, D.; Yu, H.
The objective of this project is to develop a method to predict the potential reliability of software to be used in a digital system instrumentation and control system. The reliability prediction is to make use of existing measures of software reliability such as those described in IEEE Std 982 and 982.2. This prediction must be of sufficient accuracy to provide a value for uncertainty that could be used in a nuclear power plant probabilistic risk assessment (PRA). For the purposes of the project, reliability was defined to be the probability that the digital system will successfully perform its intended safety function (for the distribution of conditions under which it is expected to respond) upon demand with no unintended functions that might affect system safety. The ultimate objective is to use the identified measures to develop a method for predicting the potential quantitative reliability of a digital system. The reliability prediction models proposed in this report are conceptual in nature. That is, possible prediction techniques are proposed and trial models are built, but in order to become a useful tool for predicting reliability, the models must be tested, modified according to the results, and validated. Using methods outlined by this project, models could be constructed to develop reliability estimates for elements of software systems. This would require careful review and refinement of the models, development of model parameters from actual experience data or expert elicitation, and careful validation. By combining these reliability estimates (generated from the validated models for the constituent parts) in structural software models, the reliability of the software system could then be predicted. Modeling digital system reliability will also require that methods be developed for combining reliability estimates for hardware and software. System structural models must also be developed in order to predict system reliability based upon the reliability
Quiton, Raimi L; Keaser, Michael L; Zhuo, Jiachen; Gullapalli, Rao P; Greenspan, Joel D
As the practice of conducting longitudinal fMRI studies to assess mechanisms of pain-reducing interventions becomes more common, there is a great need to assess the test-retest reliability of the pain-related BOLD fMRI signal across repeated sessions. This study quantitatively evaluated the reliability of heat pain-related BOLD fMRI brain responses in healthy volunteers across 3 sessions conducted on separate days using two measures: (1) intraclass correlation coefficients (ICC) calculated based on signal amplitude and (2) spatial overlap. The ICC analysis of pain-related BOLD fMRI responses showed fair-to-moderate intersession reliability in brain areas regarded as part of the cortical pain network. Areas with the highest intersession reliability based on the ICC analysis included the anterior midcingulate cortex, anterior insula, and second somatosensory cortex. Areas with the lowest intersession reliability based on the ICC analysis also showed low spatial reliability; these regions included pregenual anterior cingulate cortex, primary somatosensory cortex, and posterior insula. Thus, this study found regional differences in pain-related BOLD fMRI response reliability, which may provide useful information to guide longitudinal pain studies. A simple motor task (finger-thumb opposition) was performed by the same subjects in the same sessions as the painful heat stimuli were delivered. Intersession reliability of fMRI activation in cortical motor areas was comparable to previously published findings for both spatial overlap and ICC measures, providing support for the validity of the analytical approach used to assess intersession reliability of pain-related fMRI activation. A secondary finding of this study is that the use of standard ICC alone as a measure of reliability may not be sufficient, as the underlying variance structure of an fMRI dataset can result in inappropriately high ICC values; a method to eliminate these false positive results was used in this
The overall objective of this research project is to develop a technical basis for flexible piping designs which will improve piping reliability and minimize the use of pipe supports, snubbers, and pipe whip restraints. The current study was conducted to establish the necessary groundwork based on the piping reliability analysis. A confirmatory piping reliability assessment indicated that removing rigid supports and snubbers tends to either improve or affect very little the piping reliability. The authors then investigated a couple of changes to be implemented in Regulatory Guide (RG) 1.61 and RG 1.122 aimed at more flexible piping design. They concluded that these changes substantially reduce calculated piping responses and allow piping redesigns with significant reduction in number of supports and snubbers without violating ASME code requirements. Furthermore, the more flexible piping redesigns are capable of exhibiting reliability levels equal to or higher than the original stiffer design. An investigation of the malfunction of pipe whip restraints confirmed that the malfunction introduced higher thermal stresses and tended to reduce the overall piping reliability. Finally, support and component reliabilities were evaluated based on available fragility data. Results indicated that the support reliability usually exhibits a moderate decrease as the piping flexibility increases. Most on-line pumps and valves showed an insignificant reduction in reliability for a more flexible piping design
This paper introduces natural language expressions and experts' subjectivity to system reliability analysis. To this end, it defines a subjective measure of reliability and presents a method of system reliability analysis using that measure. The subjective measure of reliability corresponds to natural language expressions of reliability estimation, represented by a fuzzy set defined on [0,1]. The presented method deals with the dependence among subsystems and employs parametrized operations on subjective measures of reliability which can reflect an expert's subjectivity towards the analyzed system. The analysis results are also expressed in linguistic terms. Finally, the paper gives an example of system reliability analysis by the presented method
When a user and a supplier argue to determine whether a failure is, or is not, to be ascribed to the equipment, some disputable cases are difficult to settle ... combat action, or tampering by Government personnel, provided there is clear and convincing evidence of such cause. In addition, the contractor...
Longhurst, F.; Wessels, H.
Analyses carried out to ensure Columbus reliability, availability, and maintainability, and operational and design safety are summarized. Failure modes/effects/criticality is the main qualitative tool used. The main aspects studied are fault tolerance, hazard consequence control, risk minimization, human error effects, restorability, and safe-life design.
Schijndel, van A.
Problem description: Electrical power grids serve to transport and distribute electrical power with high reliability and availability at acceptable costs and risks. These grids play a crucial though preferably invisible role in supplying sufficient power in a convenient form. Today's society has
For the next generation of high performance, high average luminosity colliders, the "factories", reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis
Kasperski, M.; Geurts, C.P.W.
The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the
In the paper it is shown how upper and lower bounds for the reliability of plastic slabs can be determined. For the fundamental case it is shown that optimal bounds of a deterministic and a stochastic analysis are obtained on the basis of the same failure mechanisms and the same stress fields....
According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A
This report includes three papers as follows: 1. Guo F., Rakha H., and Park S. (2010), "A Multi-state Travel Time Reliability Model," Transportation Research Record: Journal of the Transportation Research Board, n 2188, pp. 46-54. 2. Park S.,...
Stanley, Leanne M.; Edwards, Michael C.
The purpose of this article is to highlight the distinction between the reliability of test scores and the fit of psychometric measurement models, reminding readers why it is important to consider both when evaluating whether test scores are valid for a proposed interpretation and/or use. It is often the case that an investigator judges both the…
Holt, James P.
The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass-such as computer housings, pump casings, and the silicon board of PCBs-typically are the most reliable. Meanwhile components that tend to fail the earliest-such as seals or gaskets-typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.
Nowadays, the requirements on power electronic equipment are demanding with respect to performance, quality and reliability. On the other hand, costs have to be reduced in order to satisfy market rules. To provide low cost, reliability and performance, many standard mass-produced components have been developed. The construction of specific products must then be considered from two different angles: on one hand you can produce specific components, with delays, over-cost problems and possibly quality and reliability problems; on the other hand you can use standard components in adapted topologies. The CEA of Pierrelatte has adopted the latter technique of power electronic design for the development of its high-voltage pulsed power converters. The technique consists in using standard components and associating them in series and in parallel. The matrix constitutes a high-voltage macro-switch where electrical parameters are distributed between the synchronized components. This study deals with the reliability of these structures. It brings out the high-reliability aspect of MOSFET matrix associations. Thanks to several homemade test facilities, we obtained a large amount of data concerning the components we use. The understanding of defect propagation mechanisms in matrix structures has allowed us to put forward the necessity of a robust drive system, adapted clamping voltage protection, and careful geometrical construction. All these reliability considerations in matrix associations have notably allowed the construction of a new matrix structure regrouping all the solutions ensuring reliability. Reliable and robust, this product has already reached the industrial stage. (author) [fr
Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability ... Keywords: compressor system, reliability, reliability block diagram, RBD ... the same structure has been kept with the three subsystems: air flow, oil flow and ... "... and Safety in Engineering Design", Springer, 2009. P. O'Connor ...
The development of Korea's new long-term care service infrastructure and its results: focusing on the market-friendly policy used for expansion of the numbers of service providers and personal care workers.
One of the main reasons for reforming long-term care systems is a deficient existing service infrastructure for the elderly. This article provides an overview of why and how the Korean government expanded long-term care infrastructure through the introduction of a new compulsory insurance system, with a particular focus on the market-friendly policies used to expand the infrastructure. Then, the positive results of the expansion of the long-term care infrastructure and the challenges that have emerged are examined. Finally, it is argued that the Korean government should actively implement a range of practical policies and interventions within the new system.
Bulut, Okan; Davison, Mark L; Rodriguez, Michael C
Subscores are of increasing interest in educational and psychological testing due to their diagnostic function for evaluating examinees' strengths and weaknesses within particular domains of knowledge. Previous studies about the utility of subscores have mostly focused on the overall reliability of individual subscores and ignored the fact that subscores should be distinct and have added value over the total score. This study introduces a profile reliability approach that partitions the overall subscore reliability into within-person and between-person subscore reliability. The estimation of between-person reliability and within-person reliability coefficients is demonstrated using subscores from number-correct scoring, unidimensional and multidimensional item response theory scoring, and augmented scoring approaches via a simulation study and a real data study. The effects of various testing conditions, such as subtest length, correlations among subscores, and the number of subtests, are examined. Results indicate that there is a substantial trade-off between within-person and between-person reliability of subscores. Profile reliability coefficients can be useful in determining the extent to which subscores provide distinct and reliable information under various testing conditions.
Wu, Xiaoyue; Hillston, Jane
Mission reliability of a system depends on the specific criteria for mission success. To evaluate the mission reliability of mission systems that do not need to work normally for the whole mission time, two types of mission reliability are studied. The first type corresponds to the requirement that the system remain operational continuously for a minimum time within the given mission time interval, while the second corresponds to the requirement that the total operational time of the system within the mission time window be greater than a given value. Based on Markov renewal properties, matrix integral equations are derived for semi-Markov systems. Numerical algorithms and a simulation procedure are provided for both types of mission reliability. Two examples are used for illustration: a one-unit repairable Markov system, and a cold standby semi-Markov system consisting of two components. With the proposed approaches, the mission reliability of systems with time redundancy can be estimated more precisely, avoiding unnecessary redundancy of system resources. Highlights: • Two types of mission reliability under generalized requirements are defined. • Equations for both types of reliability are derived for semi-Markov systems. • Numerical methods are given for solving both types of reliability. • A simulation procedure is given for estimating both types of reliability. • The numerical methods are verified against simulation results.
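The two mission-success criteria above can be sketched with a small Monte Carlo simulation of a one-unit repairable Markov system (exponential up and down times). The rates, mission window, and thresholds below are illustrative assumptions, not values from the paper.

```python
import random

# Monte Carlo estimate of the two mission-reliability types:
#   type 1: at least one continuous operational run >= tau within [0, T]
#   type 2: cumulative operational time within [0, T] >= c
# One-unit repairable system with failure rate lam and repair rate mu.

def mission_reliability(lam=0.5, mu=2.0, T=10.0, tau=4.0, c=8.0,
                        n=20000, seed=1):
    rng = random.Random(seed)
    hit1 = hit2 = 0
    for _ in range(n):
        t, up, longest, total = 0.0, True, 0.0, 0.0
        while t < T:
            dur = rng.expovariate(lam if up else mu)
            dur = min(dur, T - t)          # clip at mission end
            if up:
                longest = max(longest, dur)
                total += dur
            t += dur
            up = not up
        if longest >= tau:                  # type 1 success
            hit1 += 1
        if total >= c:                      # type 2 success
            hit2 += 1
    return hit1 / n, hit2 / n

r1, r2 = mission_reliability()
```

When the cumulative threshold c equals the continuous threshold tau, every type-1 success is also a type-2 success, so the type-1 estimate can never exceed the type-2 one; that nesting is a cheap sanity check on the simulation.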
Riemann, Bryan L; Lininger, Monica R
To describe the concepts of measurement reliability and minimal important change. All measurements have some magnitude of error. Because clinical practice involves measurement, clinicians need to understand measurement reliability. The reliability of an instrument is integral in determining if a change in patient status is meaningful. Measurement reliability is the extent to which a test result is consistent and free of error. Three perspectives of reliability-relative reliability, systematic bias, and absolute reliability-are often reported. However, absolute reliability statistics, such as the minimal detectable difference, are most relevant to clinicians because they provide an expected error estimate. The minimal important difference is the smallest change in a treatment outcome that the patient would identify as important. Clinicians should use absolute reliability characteristics, preferably the minimal detectable difference, to determine the extent of error around a patient's measurement. The minimal detectable difference, coupled with an appropriately estimated minimal important difference, can assist the practitioner in identifying clinically meaningful changes in patients.
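The absolute-reliability statistics discussed above follow from two standard formulas: the standard error of measurement, SEM = SD·√(1 − ICC), and the 95% minimal detectable difference, MDD95 = 1.96·√2·SEM. A minimal sketch, with illustrative ICC and SD values:

```python
import math

# Standard error of measurement and 95% minimal detectable difference.
# sd: between-subject standard deviation; icc: relative reliability
# coefficient. Example values are assumptions for illustration.

def sem(sd, icc):
    return sd * math.sqrt(1.0 - icc)

def mdd95(sd, icc):
    # 1.96 for 95% confidence; sqrt(2) because a change score
    # involves measurement error from two occasions.
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

change_needed = mdd95(sd=5.0, icc=0.91)   # ≈ 4.16 units
```

A clinician would then treat an observed change smaller than `change_needed` as indistinguishable from measurement error, and compare larger changes against the minimal important difference.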
This document provides a comprehensive review to evaluate the reliability of indicator species toxicity test results in predicting aquatic ecosystem impacts, also called the ecological relevance of laboratory single species toxicity tests.
This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs
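The quantitative fault-tree evaluation the course builds on can be illustrated with a minimal recursive evaluator for AND/OR gates over independent basic events. The gate structure and probabilities below are illustrative assumptions.

```python
# Minimal fault-tree evaluator assuming independent basic events.
# A node is ("basic", p), ("and", child, ...) or ("or", child, ...).

def evaluate(node):
    kind = node[0]
    if kind == "basic":
        return node[1]
    probs = [evaluate(child) for child in node[1:]]
    if kind == "and":                 # all inputs must fail
        p = 1.0
        for q in probs:
            p *= q
        return p
    if kind == "or":                  # any failing input suffices
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError(kind)

# Top event: (pump A fails AND pump B fails) OR control fault
tree = ("or", ("and", ("basic", 0.1), ("basic", 0.1)), ("basic", 0.01))
top = evaluate(tree)   # 1 - (1 - 0.01) * (1 - 0.01) = 0.0199
```

Real system studies add common-cause terms and unavailability models on top of this skeleton, which is where the generic and plant-specific data requirements mentioned above enter.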
Madsen, Henrik; Burtschy, Bernard; Albeanu, G.
This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of the grid and cloud paradigms with those of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.
Baker, Nick; Liserre, Marco; Dupont, Laurent
Power electronic systems play an increasingly important role in providing high-efficiency power conversion for adjustable-speed drives, power-quality correction, renewable-energy systems, energy-storage systems, and electric vehicles. However, they often face demanding operating environments that challenge the reliability of power electronic techniques. For example, increasingly thermally stressful conditions are seen in applications such as electric vehicles, where ambient temperatures under the hood exceed 150 °C, while some wind turbine applications can place large …
Vaniachine, A; Golubkov, D; Karpenko, D
During three years of LHC data taking, the ATLAS collaboration completed three petascale data reprocessing campaigns on the Grid, with up to 2 PB of data being reprocessed every year. In reprocessing on the Grid, failures can occur for a variety of reasons, while Grid heterogeneity makes failures hard to diagnose and repair quickly. As a result, Big Data processing on the Grid must tolerate a continuous stream of failures, errors and faults. While ATLAS fault-tolerance mechanisms improve the reliability of Big Data processing on the Grid, their benefits come at a cost and result in delays, making performance prediction difficult. Reliability Engineering provides a framework for a fundamental understanding of Big Data processing on the Grid, which is not a desirable enhancement but a necessary requirement. In ATLAS, cost monitoring and performance prediction became critical for the success of the reprocessing campaigns conducted in preparation for the major physics conferences. In addition, our Reliability Engineering approach supported continuous improvements in data reprocessing throughput during LHC data taking: throughput doubled in 2011 vs. 2010 reprocessing, then quadrupled in 2012 vs. 2011. We present a Reliability Engineering analysis of ATLAS data reprocessing campaigns, providing the foundation needed to scale Big Data processing technologies beyond the petascale.
Field, Richard V., Jr.; Grigoriu, Mircea
A method is developed for reliability analysis of dynamic systems under limited information. The available information includes one or more samples of the system output; any known information on features of the output can be used if available. The method is based on the theory of non-Gaussian translation processes and is shown to be particularly suitable for problems of practical interest. For illustration, we apply the proposed method to a series of simple example problems and compare with results given by traditional statistical estimators in order to establish the accuracy of the method. It is demonstrated that the method delivers accurate results for the case of linear and nonlinear dynamic systems, and can be applied to analyze experimental data and/or mathematical model outputs. Two complex applications of direct interest to Sandia are also considered. First, we apply the proposed method to assess design reliability of a MEMS inertial switch. Second, we consider re-entry body (RB) component vibration response during normal re-entry, where the objective is to estimate the time-dependent probability of component failure. This last application is directly relevant to re-entry random vibration analysis at Sandia, and may provide insights on test-based and/or model-based qualification of weapon components for random vibration environments.
Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng
Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique usable for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, cokriging, an extension of kriging, is proposed for calculating structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on several numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve the accuracy and efficiency of structural reliability analysis, and is a viable alternative to kriging.
Physically unclonable functions (PUFs) have been touted for their inherent resistance to invasive attacks and low cost in providing a hardware root of trust for various security applications. SRAM PUFs in particular are popular in industry for key/ID generation. Due to intrinsic process variations, SRAM cells tend to have repeatable start-up behavior, which SRAM PUFs exploit. Unfortunately, not all SRAM cells exhibit reliable start-up behavior, due to noise susceptibility; hence, design enhancements are needed to improve reliability. Enhancements proposed in the literature include fuzzy extraction, error-correcting codes and voting mechanisms. All enhancements involve a trade-off between area/power/performance overhead and PUF reliability. This paper presents a design enhancement technique for reliability that improves upon previous solutions. We present simulation results to quantify the improvement in SRAM PUF reliability and efficiency. The proposed technique is shown to generate a 128-bit key in ≤0.2 μs at an area estimate of 4538 μm² with an error rate as low as 10⁻⁶ for an intrinsic error probability of 15%.
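The reliability gain from one of the enhancement classes mentioned above, voting mechanisms, can be quantified with the binomial distribution: an odd number of reads of a cell is wrong only when a majority of reads flip. This is a generic sketch of majority voting, not the paper's specific technique; the error probability and vote count are illustrative.

```python
from math import comb

# Probability that a majority of n independent reads of one SRAM cell
# are wrong, given per-read error probability p (n odd).

def majority_error(p, n):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

raw = 0.15                        # intrinsic per-read error probability
voted = majority_error(raw, 5)    # ≈ 0.0266, well below the raw 15%
```

Each additional vote pair drives the per-bit error rate down further, which is the overhead-versus-reliability trade-off the abstract refers to: more reads cost time and power but shrink the burden placed on error-correcting codes.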
Solanki, R.B.; Krishnamurthy, P.R.; Singh, Suneet; Varde, P.V.; Verma, A.K.
The increasing use of passive systems in innovative nuclear reactors increases the demand for reliability assessment of these systems. Passive systems operate on driving forces such as natural circulation, gravity and internal stored energy, which are moderately weaker than those of active components. Hence, phenomenological failures (virtual components) are as important as equipment failures (real components) in the evaluation of passive system reliability. The contribution of the mechanical components to passive system reliability can be evaluated in a classical way using the available component reliability databases and well-known methods. On the other hand, different methods are required to evaluate the reliability of processes such as thermohydraulics, owing to the lack of adequate failure data. Research on the reliability assessment of passive systems and their integration into PSA is ongoing worldwide, but consensus has not been reached. Two of the most widely used methods are Reliability Evaluation of Passive Systems (REPAS) and Assessment of Passive System Reliability (APSRA). Both methods characterize the uncertainties in the design and process parameters governing the function of the passive system, but they differ in the quantification of passive system reliability. Intercomparison among the available methods provides useful insight into their strengths and weaknesses. This paper highlights the results of a thermal-hydraulic analysis of a typical passive isolation condenser system, carried out using the RELAP5/MOD3.2 computer code and applying the REPAS and APSRA methodologies. The failure surface is established for the passive system under consideration, and system reliability has been evaluated using these methods. Challenges in passive system reliability are identified, which require further attention in order to overcome the shortcomings of these methods.
Lakner, A.A.; Anderson, R.T.
This book is written for the reliability instructor, program manager, system engineer, design engineer, reliability engineer, nuclear regulator, probability risk assessment (PRA) analyst, general manager and others who are involved in system hardware acquisition, design and operation and are concerned with plant safety and operational cost-effectiveness. It provides criteria, guidelines and comprehensive engineering data affecting reliability; it covers the key aspects of system reliability as it relates to conceptual planning, cost tradeoff decisions, specification, contractor selection, design, test and plant acceptance and operation. It treats reliability as an integrated methodology, explicitly describing life cycle management techniques as well as the basic elements of a total hardware development program, including: reliability parameters and design improvement attributes, reliability testing, reliability engineering and control. It describes how these elements can be defined during procurement, and implemented during design and development to yield reliable equipment. (author)
Military networks, contrary to commercial ones, require standards which provide the highest levels of security and reliability. The process of assuring redundancy of the main connections by applying various protocols and transmission media causes problems with the time needed to re-establish virtual tunnels between different locations in case of a damaged link. This article compares the reliability of different IP (Internet Protocol) tunnels, which were implemented on military network devices.
Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou
SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the function division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task leads to the system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in the matrix. With this method, we develop a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path task reliability are also implemented. Using this tool, we analyze several cases on typical architectures. The analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, the dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
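The conclusion above, that a redundant architecture outperforms a basic one, can be sketched with textbook series/parallel reliability algebra, assuming independent link failures. The link reliabilities below are illustrative, not values from the paper.

```python
# Reliability of a serial path versus a dual-redundant pair of
# identical paths, assuming independent link failures.

def serial(path):
    r = 1.0
    for link in path:
        r *= link            # every link on the path must work
    return r

def dual_redundant(path):
    r = serial(path)
    return 1.0 - (1.0 - r) ** 2   # either of two identical paths survives

links = [0.99, 0.98, 0.99]         # per-link reliabilities (assumed)
r_basic = serial(links)            # ≈ 0.9605
r_dual = dual_redundant(links)     # ≈ 0.9984
```

This is the simplest case of the matrix-based method in the abstract: a per-task path reliability that would be one entry of the system reliability matrix.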
Hardy, L; Duru, Ph; Koch, J M; Revol, J L; Van Vaerenbergh, P; Volpe, A M; Clugnet, K; Dely, A; Goodhew, D
About 80 experts attended this workshop, which brought together all accelerator communities: accelerator-driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number-one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned with reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.
Landers, John; Rogers, Erin; Gerke, Gretchen
A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.
Gutscher, W.D.; Johnson, K.J.
Most of the failures in Scyllac can be related to crowbar trigger cable faults. A new cable has been designed and procured, and is currently undergoing evaluation. When the new cable has been proven, it will be worked into the system as quickly as possible without causing too much additional down time. The cable-tip problem may not be easy, or even desirable, to solve: a tightly fastened permanent connection that maximizes contact area would be more reliable than the plug-in type of connection in use now, but it would make system changes and repairs much more difficult. The remaining failures have such a low occurrence rate that they do not cause much down time, and no major effort is under way to eliminate them. Even though Scyllac was built as an experimental system and has many thousands of components, its reliability is very good. Because of this the experiment has been able to progress at a reasonable pace.
Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon
The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the average junction temperature of the high- and low-voltage sides of a half-bridge IGBT separately in every fundamental period; the voltage is measured in a wind power converter at a low fundamental frequency. The test method as well as the performance of the measurement circuit are also presented. This measurement is also useful for indicating failure mechanisms such as bond wire lift-off and solder layer degradation.
This report contains the papers delivered at the course on safety and reliability assessment held at the CSIR Conference Centre, Scientia, Pretoria. The following topics were discussed: safety standards; licensing; biological effects of radiation; what is a PWR; safety principles in the design of a nuclear reactor; radio-release analysis; quality assurance; the staffing, organisation and training for a nuclear power plant project; event trees, fault trees and probability; Automatic Protective Systems; sources of failure-rate data; interpretation of failure data; synthesis and reliability; quantification of human error in man-machine systems; dispersion of noxious substances through the atmosphere; criticality aspects of enrichment and recovery plants; and risk and hazard analysis. Extensive examples are given as well as case studies
Interpretation of Transmission Planning Reliability Standard, March 18, 2010. AGENCY: Federal Energy Regulatory Commission. … Transmission planning Reliability Standard TPL-002-0 provides that planning authorities and transmission … Reliability Standard TPL-002-0. In this order, the Commission proposes to reject NERC's proposed …
We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs: the Interpersonal Adjective List (IAL), the Interpersonal Adjective Scales (revised; IAS-R), the Inventory of Interpersonal Problems (IIP), the Impact Messages Inventory (IMI), the Circumplex Scales of Interpersonal Values (CSIV), the Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), the Team Role Circle (TRC), the Competing Values Leadership Instrument (CV-LI), Love Styles, the Organizational Culture Assessment Instrument (OCAI), the Customer Orientation Circle (COC), and the System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG), in 17 German-speaking samples (29 subsamples, grouped by self-report, other-report, and metaperception assessments). The general factor accounted for 1% to 48% of the item variance, the axes component for 2% to 30%, and scale-specificity for 1% to 28%. Reliability estimates varied considerably, from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large scale-specificity but otherwise worked effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.
In this article the restructuring process under way in the US power industry is revisited from the point of view of transmission system provision and reliability. While this cost was historically rolled into the average cost of electricity to all, it is not so obvious how it is managed in the new industry. A new MIT approach to transmission pricing is suggested here as a possible solution.
Ethiraj, Sendil; Yi, Sangyoon
Much of the ensuing work examined only corollary implications of this observation. We treat the observation as a research question and ask: when and why are reliable organizations favored by evolutionary forces? Using a simple theoretical model, we direct attention at a minimal set of variables that are implicated … shocks, reliable organizations can in fact outperform their less reliable counterparts if they can take advantage of the knowledge resident in their historical choices. While these results are counter-intuitive, the caveat is that our results are only an existence proof for our theory rather than …
Hoppa, Mary Ann; Wilson, Larry W.
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
Murthy, D.N.P.; Rausand, M.; Virtanen, S.
Product reliability is of great importance to both manufacturers and customers. Building reliability into a new product is costly, but the consequences of inadequate product reliability can be costlier. This implies that manufacturers need to decide on the optimal investment in new product reliability by achieving a suitable trade-off between the two costs. This paper develops a framework and proposes an approach to help manufacturers decide on the investment in new product reliability.
Camp, R.A.; Bergin, W.
Since the expected life of the Tokamak Fusion Test Reactor (TFTR) has been extended into the early 1990s, the issues of equipment wear-out, when to refurbish or replace, and the costs associated with these decisions must be faced. The management of the maintenance of the TFTR Central Instrumentation, Control and Data Acquisition System (CICADA) power supplies within the CAMAC network is a case study of a set of systems for monitoring repairable-system reliability, costs, and the results of action. The CAMAC network is composed of approximately 500 racks, each with its own power supply. By using a simple reliability estimator on a coarse time interval, in conjunction with determining the root cause of individual failures, a cost-effective repair and maintenance program has been realized. This paper describes the estimator, some of the specific causes of recurring failures and their correction, and the subsequent effects on the reliability estimator. By extension of this program the authors can assess the continued viability of CAMAC power supplies into the future, predicting wear-out and developing cost-effective refurbishment/replacement policies. 4 refs., 3 figs., 1 tab
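A coarse-interval reliability estimator of the kind described above can be sketched as a per-interval MTBF computed from fleet operating hours and failure counts. The counts and hours below are illustrative assumptions, not CICADA data.

```python
# Per-interval MTBF estimator for a fleet of repairable units, used
# to flag wear-out trends: a falling MTBF across successive intervals
# suggests degradation. All numbers are made-up illustrations.

def interval_mtbf(operating_hours, failures):
    return [h / f if f else float("inf")
            for h, f in zip(operating_hours, failures)]

hours = [360000, 360000, 360000]      # e.g. 500 racks * ~720 h per interval
fails = [12, 15, 24]                  # failures observed per interval
mtbf = interval_mtbf(hours, fails)    # [30000.0, 24000.0, 15000.0]
degrading = mtbf[-1] < mtbf[0]        # rising failure rate hints at wear-out
```

The point of the coarse interval is statistical: pooling a whole interval's failures gives each MTBF point enough events to be meaningful, while root-cause analysis of individual failures handles the fine grain.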
This book shows how to build in and assess reliability, availability, maintainability, and safety (RAMS) of components, equipment, and systems. It presents the state of the art of reliability (RAMS) engineering, in theory and practice, and is based on over 30 years of the author's experience in this field, half in industry and half as Professor of Reliability Engineering at the ETH, Zurich. The book's structure allows rapid access to practical results. Methods and tools are given in a way that allows them to be tailored to cover different RAMS requirement levels. Thanks to Appendices A6 - A8 the book is mathematically self-contained, and can be used as a textbook or as a desktop reference, with a large number of tables (60), figures (210), and examples/exercises. Continued strong demand (more than 10,000 per year since 2013) was the motivation for this final edition, the 13th since 1985, including German editions. Extended and carefully reviewed to improve accuracy, it represents a continuous effort to satisfy readers' needs and confidence.
The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (TPM) procedures: with the aim of assessing the probability of test induced failures, the probability of failures to remain unrevealed and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient: with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses these contributions on a comparative basis. The aim of this analysis was to compare the procedures, modelling techniques and quantification methods used, to obtain insight in the causes and magnitude of the variability observed in the results, to try to identify preferred human reliability assessment approaches and to get an understanding of the current state of the art in the field identifying the limitations that are still inherent to the different approaches
The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although safety functions also include other components, the main emphasis of the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. Some recommendations and conclusions are drawn from the study: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)
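One classical quantitative model of the kind surveyed above is the Jelinski-Moranda model, which assumes N initial faults, each contributing an equal hazard phi, so the hazard after the i-th repair is phi·(N − i). A minimal sketch of fitting it by profile likelihood follows; the interfailure times are made-up example data, and the grid search over N is a simplification, not the report's method.

```python
import math

# Jelinski-Moranda fit by profile likelihood: for each candidate fault
# count N, the MLE of the per-fault hazard phi is closed-form; we scan
# N >= n (n = observed failures) and keep the best log-likelihood.

def jelinski_moranda(times, max_extra=100):
    n = len(times)
    best = (-math.inf, None, None)
    for N in range(n, n + max_extra + 1):
        # hazard before the (i+1)-th failure is phi * (N - i), i = 0..n-1
        denom = sum((N - i) * t for i, t in enumerate(times))
        phi = n / denom
        ll = sum(math.log(phi * (N - i)) - phi * (N - i) * t
                 for i, t in enumerate(times))
        if ll > best[0]:
            best = (ll, N, phi)
    return best[1], best[2]   # estimated total faults, hazard per fault

times = [3, 5, 7, 8, 12, 15, 20, 28, 35, 50]   # hours between failures
N_hat, phi_hat = jelinski_moranda(times)
```

The abstract's caveat applies directly here: N_hat and phi_hat describe debugging-time failure behavior, and turning them into the per-demand failure probabilities a PSA needs is a separate, unresolved step.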
To ensure long-term safe and reliable plant operation, equipment operability and availability must be ensured through a group of processes established within the nuclear power plant. The equipment reliability process represents the integration and coordination of the important equipment reliability activities into one process, which enables equipment performance and condition monitoring, the development, implementation and optimization of preventive maintenance activities, continuous improvement of the processes, and long-term planning. The initiative for introducing a systematic approach to assuring equipment reliability came from the US nuclear industry, guided by INPO (Institute of Nuclear Power Operations) with the participation of several US nuclear utilities. As a result, the first edition of INPO document AP-913, 'Equipment Reliability Process Description', was issued, and it became a basic document for the implementation of the equipment reliability process across the nuclear industry. The scope of the equipment reliability process in Krsko NPP consists of the following programs: equipment criticality classification, the preventive maintenance program, the corrective action program, system health reports, and the long-term investment plan. By implementing, supervising and continuously improving those programs, guided by more than thirty years of operating experience, Krsko NPP will continue on a track of safe and reliable operation until the end of its extended lifetime. (author).
Park, Moon Soo; Moon, Joo Hyun; Kang, Chang Sun [Seoul National Univ., Seoul (Korea, Republic of)
Korea Institute of Nuclear Safety (KINS) carried out a questionnaire survey on the public's understanding of nuclear safety and regulation in order to gauge public acceptance of nuclear energy. The survey was planned to help analyze public opinion on nuclear energy and to provide basic data for advertising strategy and policy development. In this study, based on the results of the survey, the reliability of the survey was evaluated for each nuclear site.
Wang, Huai; Zhou, Dao; Blaabjerg, Frede
Reliability is a crucial performance indicator of power electronic systems in terms of availability, mission accomplishment and life-cycle cost. A paradigm shift is underway in research on the reliability of power electronics, from simple handbook-based calculations (e.g. models in MIL-HDBK-217F h...... and reliability prediction models are provided. A case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical IGBT modules....
Davis, J Lynn; Mills, Karmann; Lamvik, Michael; Yaga, Robert; Shepherd, Sarah D; Bittle, James; Baldasaro, Nick; Solano, Eric; Bobashev, Georgiy; Johnson, Cortina; Evans, Amy
Results from accelerated life tests (ALT) on mass-produced, commercially available 6” downlights are reported, along with results from commercial LEDs. The luminaires capture many of the design features found in modern luminaires. In general, a systems perspective is required to understand the reliability of these devices, since LED failure is rare: components such as drivers, lenses, and reflectors are more likely to impact luminaire reliability than the LEDs.
Oliveira, L.F.S. de; Soto, J.B.; Maciel, C.C.; Gibelli, S.M.O.; Fleming, P.V.; Arrieta, L.A.
An extensive reliability analysis of some safety systems of Angra I is presented. The fault tree technique, which has been successfully used in most reliability studies of nuclear safety systems performed to date, is employed. Results of a quantitative determination of the unavailability of the accumulator and the containment spray injection systems are presented. These results are also compared to those reported in WASH-1400. (E.G.) [pt
Drost, Ellen A.
In this paper, the author aims to provide novice researchers with an understanding of the general problem of validity in social science research and to acquaint them with approaches to developing strong support for the validity of their research. She provides insight into these two important concepts, namely (1) validity; and (2) reliability, and…
Brissaud, Florent, E-mail: firstname.lastname@example.org [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Charpentier, Dominique [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France)
The reliability analysis of new technology-based transmitters has to deal with specific issues: various interactions between both material elements and functions, undefined behaviours under faulty conditions, several transmitted data, and little reliability feedback. To handle these particularities, a '3-step' model is proposed, based on goal tree-success tree (GTST) approaches to represent both the functional and material aspects, and includes the faults and failures as a third part for supporting reliability analyses. The behavioural aspects are provided by relationship matrices, also denoted master logic diagrams (MLD), with stochastic values which represent direct relationships between system elements. Relationship analyses are then proposed to assess the effect of any fault or failure on any material element or function. Taking these relationships into account, the probabilities of malfunction and failure modes are evaluated according to time. Furthermore, uncertainty analyses tend to show that even if the input data and system behaviour are not well known, these previous results can be obtained in a relatively precise way. An illustration is provided by a case study on an infrared gas transmitter. These properties make the proposed model and corresponding reliability analyses especially suitable for intelligent transmitters (or 'smart sensors').
Eto, Joe; Eto, Joe; Lesieutre, Bernard; Lewis, Nancy Jo; Parashar, Manu
The increased need to manage California's electricity grid in real time is a result of the ongoing transition from a system operated by vertically-integrated utilities serving native loads to one operated by an independent system operator supporting competitive energy markets. During this transition period, the traditional approach to reliability management -- construction of new transmission lines -- has not been pursued due to unresolved issues related to the financing and recovery of transmission project costs. In the absence of investments in new transmission infrastructure, the best strategy for managing reliability is to equip system operators with better real-time information about actual operating margins so that they can better understand and manage the risk of operating closer to the edge. A companion strategy is to address known deficiencies in offline modeling tools that are needed to ground the use of improved real-time tools. This project: (1) developed and conducted first-ever demonstrations of two prototype real-time software tools for voltage security assessment and phasor monitoring; and (2) prepared a scoping study on improving load and generator response models. Additional funding through two separate subsequent work authorizations has already been provided to build upon the work initiated in this project.
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
In the present climate of global competition in every branch of engineering and manufacture, extensive customer surveys have shown that, above every other attribute, reliability stands as the most desired feature in a finished product. In this relentless fight for survival, any organisation that neglects the pursuit of excellence in reliability will do so at serious cost. Reliability in Automotive and Mechanical Engineering draws together a wide spectrum of diverse and relevant applications and analyses in reliability engineering, distilled into an attractive and well-documented volume; practising engineers are challenged with the formidable task of simultaneously improving reliability and reducing the costs and downtime due to maintenance. The volume brings together eleven chapters to highlight the importance of the interrelated reliability and maintenance disciplines. They represent the development trends and progress resulting in making this book ess...
Choi, S. Y.; Han, S. H.; Kim, S. H.
Reliability data for Korean NPPs that reflect plant-specific characteristics are necessary for PSA and Risk-Informed Applications. We have performed a project to develop a component reliability DB and to calculate component reliability parameters such as failure rate and unavailability. We collected component operation data and failure/repair data of Korean standard NPPs, and analyzed the failure data using a data analysis method developed to suit the domestic data situation. We then compared the reliability results with generic data for foreign NPPs
Full Text Available As the penetration of wind power into the power system increases, the ability to assess the reliability impact of such interaction becomes more important. Composite reliability evaluations involving wind energy provide ample opportunities for assessing the benefits of different wind farm connection points. A connection to a weak area of the transmission network will require network reinforcement to absorb the additional wind energy. Traditionally, reinforcements are performed by constructing new transmission corridors. However, a new state-of-the-art technology such as the dynamic thermal rating (DTR) system provides a new reinforcement strategy, and this requires a new reliability assessment method. This paper demonstrates a methodology for assessing the cost and the reliability of network reinforcement strategies that consider DTR systems when large-scale wind farms are connected to the existing power network. Sequential Monte Carlo simulations were performed, and all DTRs and wind speeds were simulated using the auto-regressive moving average (ARMA) model. Various reinforcement strategies were assessed from their cost and reliability aspects, with practical industrial standards used as guidelines when assessing costs. As a result, the proposed methodology is able to determine the optimal reinforcement strategies when both cost and reliability requirements are considered.
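The ARMA simulation of wind speed used in such sequential Monte Carlo studies can be sketched as follows; the coefficients and series length here are hypothetical placeholders, not values from the paper:

```python
import random

def simulate_arma(phi, theta, sigma, n, seed=1):
    """Simulate an ARMA(p, q) series:
    x[t] = sum_i phi[i]*x[t-1-i] + sum_j theta[j]*eps[t-1-j] + eps[t]."""
    random.seed(seed)
    p, q = len(phi), len(theta)
    x = [0.0] * n
    eps = [random.gauss(0.0, sigma) for _ in range(n)]
    for t in range(n):
        ar = sum(phi[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        ma = sum(theta[j] * eps[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        x[t] = ar + ma + eps[t]
    return x

# Hypothetical ARMA(1,1) wind-speed deviation series for one Monte Carlo
# draw; the site's mean wind speed would be added back afterwards.
series = simulate_arma(phi=[0.8], theta=[0.2], sigma=1.0, n=100)
```

In a full study, per-site ARMA orders and coefficients would be fitted to historical wind records rather than chosen by hand.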
Jesteadt, Walt; Joshi, Suyash Narendra
In this study, 16 normally-hearing listeners judged the loudness of 1000-Hz sinusoids using magnitude estimation (ME), magnitude production (MP), and categorical loudness scaling (CLS). Listeners in each of four groups completed the loudness scaling tasks in a different sequence on the first visit...... (ME, MP, CLS; MP, ME, CLS; CLS, ME, MP; CLS, MP, ME), and the order was reversed on the second visit. This design made it possible to compare the reliability of estimates of the slope of the loudness function across procedures in the same listeners. The ME data were well fitted by an inflected...... results were the most reproducible, they do not provide direct information about the slope of the loudness function because the numbers assigned to CLS categories are arbitrary. This problem can be corrected by using data from the other procedures to assign numbers that are proportional to loudness...
Vinod, Gopika; Saraf, R.K.; Babar, A.K.; Sanyasi Rao, V.V.S.; Tharani, Rajiv
Component failure, repair and maintenance data are a very important element of any Probabilistic Safety Assessment study. The credibility of the results of such a study is enhanced if the data used are generated from the operating experience of similar power plants. Towards this objective, a computerised database was designed with fields such as date and time of failure, component name, failure mode, failure cause, means of failure detection, reactor operating power status, repair times, down time, etc. This leads to the evaluation of plant-specific failure rates and on-demand failure probability/unavailability for all components. Systematic data updating can provide real-time component reliability parameter statistics and trend analysis, which helps in planning maintenance strategies. A software package, RELDATA, has been developed which incorporates the database management and data analysis methods. This report describes the software features and the underlying methodology in detail. (author)
Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
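The distinction drawn above, that non-exponential holding times call for a semi-Markov rather than a Markov model, can be illustrated with a minimal simulation sketch; the two-state chain and its deterministic holding times are invented for illustration, not taken from the IBM 3081 data:

```python
import random

def simulate_semi_markov(P, hold, start, steps, seed=0):
    """Simulate a semi-Markov process: state transitions follow matrix P,
    but holding times come from arbitrary state-specific samplers
    (they need not be exponential, unlike a continuous-time Markov chain)."""
    random.seed(seed)
    state, t = start, 0.0
    path = [(state, t)]
    for _ in range(steps):
        t += hold[state]()          # draw a holding time for the current state
        r, cum = random.random(), 0.0
        for nxt, p in enumerate(P[state]):
            cum += p
            if r < cum:
                state = nxt
                break
        path.append((state, t))
    return path

# Toy model: state 0 (operational) holds exactly 1.0 h, state 1 (error
# recovery) holds exactly 2.0 h; transitions alternate deterministically.
path = simulate_semi_markov([[0.0, 1.0], [1.0, 0.0]],
                            [lambda: 1.0, lambda: 2.0], start=0, steps=4)
```

Replacing the `lambda` samplers with fitted holding-time distributions per state is what distinguishes the semi-Markov fit from a plain Markov one.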
Pelto, P.J.; Ames, K.R.; Gallucci, R.H.
This report summarizes the results of the Reliability Analysis of Containment Isolation System Project. Work was performed in five basic areas: design review, operating experience review, related research review, generic analysis and plant specific analysis. Licensee Event Reports (LERs) and Integrated Leak Rate Test (ILRT) reports provided the major sources of containment performance information used in this study. Data extracted from LERs were assembled into a computer data base. Qualitative and quantitative information developed for containment performance under normal operating conditions and design basis accidents indicate that there is room for improvement. A rough estimate of overall containment unavailability for relatively small leaks which violate plant technical specifications is 0.3. An estimate of containment unavailability due to large leakage events is in the range of 0.001 to 0.01. These estimates are dependent on several assumptions (particularly on event duration times) which are documented in the report
This thesis covers two research topics concerning optical solutions for networks e.g. avionic systems. One is to identify the applications for silicon photonic devices for cost-effective solutions in short-range optical networks. The other one is to realise advanced functionalities in order...... to increase the availability of highly reliable optical networks. A cost-effective transmitter based on a directly modulated laser (DML) using a silicon micro-ring resonator (MRR) to enhance its modulation speed is proposed, analysed and experimentally demonstrated. A modulation speed enhancement from 10 Gbit...... interconnects and network-on-chips. A novel concept of all-optical protection switching scheme is proposed, where fault detection and protection trigger are all implemented in the optical domain. This scheme can provide ultra-fast establishment of the protection path resulting in a minimum loss of data...
Kwik, R.J.; Polizzi, L.M.; Sticco, S.; Gerrard, P.B.; Yeater, M.L.; Hockenbury, R.W.; Phillips, M.A.
An integrated reliability analysis program combining graphic representation of fault trees, automated data base loadings and reference, and automated construction of reliability code input files was developed. The functional specifications for CADRIGS, the computer aided design reliability interactive graphics system, are presented. Previously developed fault tree segments used in auxiliary feedwater system safety analysis were constructed on CADRIGS and, when combined, yielded results identical to those resulting from manual input to the same reliability codes
If fewer forced outages are a sign of improved safety, nuclear power plants have become safer and more productive over time. There has been a significant improvement in nuclear power plant performance, due largely to a decline in the forced outage rate and a dramatic drop in the average number of forced outages per fuel cycle. To encourage further increases in performance, regulatory incentive schemes should reward reactor operators for improved reliability and safety, as well as for improved performance
Stieglitz, Lennart Henning
Neuronavigation plays a central role in modern neurosurgery. It allows instruments and three-dimensional image data to be visualized intraoperatively and supports spatial orientation, thus helping to reduce surgical risks and speed up complex surgical procedures. The growing availability and importance of neuronavigation makes clear how relevant it is to know about its reliability and accuracy. Different factors may influence the accuracy during surgery unnoticed, misleading the surgeon. Besides the best possible optimization of the systems themselves, a good knowledge of their weaknesses is mandatory for every neurosurgeon.
Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel
This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time-consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.
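Once a BN over component states is in hand, the reliability estimate reduces to marginalizing over component configurations. A minimal enumeration sketch for a toy two-component series system follows; the failure probabilities and series logic are hypothetical stand-ins, not the K2-learned network of the paper:

```python
from itertools import product

# Hypothetical root-component failure probabilities.
p_fail = {"A": 0.05, "B": 0.10}

def system_reliability(p_fail):
    """Enumerate all component states (1 = working, 0 = failed) and sum the
    probability of states in which the system works; series logic here
    means the system works only if every component works."""
    comps = list(p_fail)
    rel = 0.0
    for states in product([0, 1], repeat=len(comps)):
        prob = 1.0
        for c, s in zip(comps, states):
            prob *= (1 - p_fail[c]) if s else p_fail[c]
        if all(states):          # series system: all components must work
            rel += prob
    return rel

rel = system_reliability(p_fail)  # 0.95 * 0.90 = 0.855 for these inputs
```

For a learned BN, the independent products above would be replaced by the network's conditional probability tables, and exact enumeration would give way to standard BN inference for larger systems.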
Leicht, R.; Wingender, H.J.
Activities in the fields of reliability and risk analyses have led to the development of particular software tools which now are combined in the PC-based integrated CARARA system. The options available in this system cover a wide range of reliability-oriented tasks, like organizing raw failure data in the component/event data bank FDB, performing statistical analysis of those data with the program FDA, managing the resulting parameters in the reliability data bank RDB, and performing fault tree analysis with the fault tree code FTL or evaluating the risk of toxic or radioactive material release with the STAR code. (orig.)
In honor of the work of Professor Shunji Osaki, Stochastic Reliability and Maintenance Modeling provides a comprehensive study of the legacy of and ongoing research in stochastic reliability and maintenance modeling. Including associated application areas such as dependable computing, performance evaluation, software engineering, communication engineering, distinguished researchers review and build on the contributions over the last four decades by Professor Shunji Osaki. Fundamental yet significant research results are presented and discussed clearly alongside new ideas and topics on stochastic reliability and maintenance modeling to inspire future research. Across 15 chapters readers gain the knowledge and understanding to apply reliability and maintenance theory to computer and communication systems. Stochastic Reliability and Maintenance Modeling is ideal for graduate students and researchers in reliability engineering, and workers, managers and engineers engaged in computer, maintenance and management wo...
Minner, Daphne Diane
The intention of this research project was to bridge the gap between social science research and application to the environmental domain through the development of a theoretically derived instrument designed to give educators a template by which to evaluate environmental education curricula. The theoretical base for instrument development was provided by several developmental theories such as Piaget's theory of cognitive development, Developmental Systems Theory, Life-span Perspective, as well as curriculum research within the area of environmental education. This theoretical base fueled the generation of a list of components which were then translated into a questionnaire with specific questions relevant to the environmental education domain. The specific research question for this project is: Can a valid assessment instrument based largely on human development and education theory be developed that reliably discriminates high, moderate, and low quality in environmental education curricula? The types of analyses conducted to answer this question were interrater reliability (percent agreement, Cohen's Kappa coefficient, Pearson's Product-Moment correlation coefficient), test-retest reliability (percent agreement, correlation), and criterion-related validity (correlation). Face validity and content validity were also assessed through thorough reviews. Overall results indicate that 29% of the questions on the questionnaire demonstrated a high level of interrater reliability and 43% of the questions demonstrated a moderate level of interrater reliability. Seventy-one percent of the questions demonstrated a high test-retest reliability and 5% a moderate level. Fifty-five percent of the questions on the questionnaire were reliable (high or moderate) both across time and raters. Only eight questions (8%) did not show either interrater or test-retest reliability. The global overall rating of high, medium, or low quality was reliable across both coders and time, indicating
Friedman, Lee; Stern, Hal; Brown, Gregory G; Mathalon, Daniel H; Turner, Jessica; Glover, Gary H; Gollub, Randy L; Lauriello, John; Lim, Kelvin O; Cannon, Tyrone; Greve, Douglas N; Bockholt, Henry Jeremy; Belger, Aysenil; Mueller, Bryon; Doty, Michael J; He, Jianchun; Wells, William; Smyth, Padhraic; Pieper, Steve; Kim, Seyoung; Kubicki, Marek; Vangel, Mark; Potkin, Steven G
In the present report, estimates of test-retest and between-site reliability of fMRI assessments were produced in the context of a multicenter fMRI reliability study (FBIRN Phase 1, www.nbirn.net). Five subjects were scanned on 10 MRI scanners on two occasions. The fMRI task was a simple block-design sensorimotor task. The impulse response functions to the stimulation block were derived using an FIR-deconvolution analysis with FMRISTAT. Six functionally-derived ROIs covering the visual, auditory and motor cortices, created from a prior analysis, were used. Two dependent variables were compared: percent signal change and contrast-to-noise ratio. Reliability was assessed with intraclass correlation coefficients derived from a variance components analysis. Test-retest reliability was high, but initially, between-site reliability was low, indicating a strong contribution from site and site-by-subject variance. However, a number of factors that can markedly improve between-site reliability were uncovered, including increasing the size of the ROIs, adjusting for smoothness differences, and inclusion of additional runs. By employing multiple steps, between-site reliability for 3T scanners was increased by 123%. Dropping one site at a time and assessing reliability can be a useful method of assessing the sensitivity of the results to particular sites. These findings should provide guidance to others on the best practices for future multicenter studies.
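An intraclass correlation of the kind used above can be computed from a one-way variance-components decomposition; this is a generic ICC(1) sketch with made-up numbers, not the study's mixed-effects model:

```python
def icc_oneway(data):
    """One-way random-effects ICC(1).
    data[i] is the list of k repeated measurements for subject i.
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), from the one-way ANOVA
    between-subject (MSB) and within-subject (MSW) mean squares."""
    n = len(data)
    k = len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    means = [sum(row) / k for row in data]
    ssb = k * sum((m - grand) ** 2 for m in means)
    ssw = sum((x - means[i]) ** 2 for i, row in enumerate(data) for x in row)
    msb = ssb / (n - 1)
    msw = ssw / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Illustrative data: three subjects, two sessions each.
icc = icc_oneway([[8.0, 7.0], [5.0, 6.0], [9.0, 9.0]])
```

The multicenter design in the study adds site and site-by-subject components, which require a fuller variance-components model, but the principle of forming ratios of variance components is the same.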
Mizelle, J C; Oparah, Alexis; Wheaton, Lewis A
The integration of vision and somatosensation is required to allow for accurate motor behavior. While both sensory systems contribute to an understanding of the state of the body through continuous updating and estimation, how the brain processes unreliable sensory information remains to be fully understood in the context of complex action. Using functional brain imaging, we sought to understand the role of the cerebellum in weighting visual and somatosensory feedback by selectively reducing the reliability of each sense individually during a tool use task. We broadly hypothesized upregulated activation of the sensorimotor and cerebellar areas during movement with reduced visual reliability, and upregulated activation of occipital brain areas during movement with reduced somatosensory reliability. As specifically compared to reduced somatosensory reliability, we expected greater activations of ipsilateral sensorimotor cerebellum for intact visual and somatosensory reliability. Further, we expected that ipsilateral posterior cognitive cerebellum would be affected with reduced visual reliability. We observed that reduced visual reliability results in a trend towards the relative consolidation of sensorimotor activation and an expansion of cerebellar activation. In contrast, reduced somatosensory reliability was characterized by the absence of cerebellar activations and a trend towards the increase of right frontal, left parietofrontal activation, and temporo-occipital areas. Our findings highlight the role of the cerebellum for specific aspects of skillful motor performance. This has relevance to understanding basic aspects of brain functions underlying sensorimotor integration, and provides a greater understanding of cerebellar function in tool use motor control.
Allan, R.N.; Whitehead, A.M.
The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc Random Access Files is described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)
Wright, R I
Microprocessor-based technology has had an impact in nearly every area of industrial electronics and many applications have important safety implications. Microprocessors are being used for the monitoring and control of hazardous processes in the chemical, oil and power generation industries, for the control and instrumentation of aircraft and other transport systems and for the control of industrial machinery. Even in the field of nuclear reactor protection, where designers are particularly conservative, microprocessors are used to implement certain safety functions and may play increasingly important roles in protection systems in the future. Where microprocessors are simply replacing conventional hard-wired control and instrumentation systems no new hazards are created by their use. In the field of robotics, however, the microprocessor has opened up a totally new technology and with it has created possible new and as yet unknown hazards. The paper discusses some of the design and manufacturing techniques which may be used to enhance the reliability of microprocessor-based systems and examines the available reliability data on LSI/VLSI microcircuits. 12 references.
Chavaillaz, Alain; Wastell, David; Sauer, Jürgen
The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.
Full Text Available Background: Logistic Service Providers' main concern has been to ensure reliability at a low price (Christopher, 2005). Dutch Logistic Service Providers still have these two aspects at the top of their list, but now also have to take a new aspect into account: sustainability. 88% of the investigated Logistic Service Providers have included sustainability in the company's goals, and they have developed different strategies to achieve a higher level of sustainability. This paper presents the results of a study into what Logistic Service Providers say they are doing, or intend to do, to improve the sustainability of their transport services. In this way insight is given into the attitude of Dutch Logistic Service Providers towards sustainability and how they intend to translate this into business practice: internal solutions or new methods incorporating external partners. Methods: Various methods of investigation were used, among which the analysis of statements about sustainability on the websites of various companies as well as an Internet questionnaire. The research covered the 50 largest logistics companies operating in the Netherlands and 60 companies that competed for the "Lean and Green" award advertised in the Netherlands. In addition, the Internet survey was answered by 41 companies that belong to the network of our university. Results: The investigation has shown that sustainability is handled by the logistics companies as an integral part of corporate strategy. In contrast, shippers base their choice of logistics services primarily on such classical aspects as reliability or price, and sustainability plays a minor role. Conclusions: In trying to find methods to improve sustainability, Dutch logistics service providers look in the first place for solutions that increase efficiency and therefore offer cost-reduction potential. Solutions which require the involvement of clients were less often
Vacha-Haase, Tammi; Kogan, Lori R.; Tani, Crystal R.; Woodall, Renee A.
Used reliability generalization to explore the variance of scores on 10 Minnesota Multiphasic Personality Inventory (MMPI) clinical scales drawing on 1,972 articles in the literature on the MMPI. Results highlight the premise that scores, not tests, are reliable or unreliable, and they show that study characteristics do influence scores on the…
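The premise that scores, not tests, are reliable means a coefficient such as Cronbach's alpha must be re-estimated for each sample of scores rather than treated as a fixed property of the instrument. As a minimal illustration of the statistic that reliability generalization studies typically aggregate (the function and data below are illustrative assumptions, not drawn from the MMPI study itself):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    so the estimate depends on the particular sample of scores.
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Running the same function on two different samples of the same scale generally yields two different alphas, which is exactly the sample-dependence that reliability generalization quantifies.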
Feizabadi, Mohammad; Jahromi, Abdolhamid Eshraghniaye
In discussions of reliability optimization using redundancy allocation, the series-parallel structure is one of the configurations that has attracted the attention of many researchers. Models previously presented for reliability optimization of series-parallel systems carry a restrictive assumption that all components of a subsystem must be homogeneous. This constraint limits system designers in selecting components and prevents higher levels of reliability from being achieved. In this paper, a new model is proposed for reliability optimization of series-parallel systems which makes possible the use of non-homogeneous components in each subsystem. As a result of this flexibility, the process of supplying system components becomes easier. To solve the proposed model, since the redundancy allocation problem (RAP) belongs to the NP-hard class of optimization problems, a genetic algorithm (GA) is developed. The computational results of the designed GA indicate the high performance of the proposed model in increasing system reliability and decreasing costs. - Highlights: • A new model is proposed for reliability optimization of series-parallel systems. • Previous models carry a restricting assumption that all components of a subsystem must be homogeneous. • The presented model allows a subsystem's components to be non-homogeneous where required. • The computational results demonstrate the high performance of the proposed model in improving reliability and reducing costs.
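The reliability arithmetic underlying such models is compact: a parallel subsystem survives if at least one of its redundant components survives, and the series system survives only if every subsystem does. The sketch below (an illustrative assumption, not the paper's model or its GA) shows how non-homogeneous component reliabilities within a subsystem are accommodated naturally by this evaluation:

```python
from math import prod

def series_parallel_reliability(subsystems: list[list[float]]) -> float:
    """System reliability for a series arrangement of parallel subsystems.

    Each inner list holds the reliabilities of one subsystem's redundant
    components; mixing different values in a list models non-homogeneous
    components. A subsystem fails only if all its components fail, and the
    system fails if any subsystem fails.
    """
    return prod(1 - prod(1 - r for r in comps) for comps in subsystems)
```

For example, a single subsystem with components of reliability 0.9 and 0.8 gives 1 - (0.1)(0.2) = 0.98, already above what duplicating either component type alone would be evaluated at by the homogeneous models.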
Ruiu, D.; Ye, C.; Billinton, R.; Lakhanpal, D.
A study was conducted on the selection of a generating system reliability criterion that ensures a reasonable continuity of supply while minimizing the total cost to utility customers. The study used the Institute of Electrical and Electronics Engineers (IEEE) reliability test system as the study system. The study inputs and results for load forecast data, new supply resource data, demand-side management resource data, the resource planning criterion, criterion value selection, supply-side development, integrated resource development, and best criterion values are tabulated and discussed. Preliminary conclusions are drawn as follows. In the case of integrated resource planning, the selection of the best value for a given type of reliability criterion can be done using methods similar to those used for supply-side planning. The reliability criterion values previously used for supply-side planning may not be economically justified when integrated resource planning is used. Utilities may have to revise and adopt new, and perhaps lower, supply reliability criteria for integrated resource planning. More complex reliability criteria, such as energy-related indices, which take into account the magnitude, frequency and duration of the expected interruptions, are better suited than simpler capacity-based reliability criteria such as loss of load expectation. 7 refs., 5 figs., 10 tabs
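Loss of load expectation, the capacity-based criterion mentioned above, can be sketched by enumerating generating-unit outage states against daily peak loads. The code below is an illustrative toy (the function name, unit data, and full state enumeration are assumptions for clarity, not the study's method, which would use a capacity outage probability table for realistic unit counts):

```python
from itertools import product

def lole(units, daily_peaks):
    """Loss-of-load expectation (days per period) by enumerating outages.

    units: list of (capacity_MW, forced_outage_rate) pairs
    daily_peaks: list of daily peak loads in MW
    """
    # Probability of each available-capacity state (1 = unit on outage).
    states = {}
    for outcome in product([0, 1], repeat=len(units)):
        cap = sum(c for (c, _), out in zip(units, outcome) if not out)
        p = 1.0
        for (_, q), out in zip(units, outcome):
            p *= q if out else (1 - q)
        states[cap] = states.get(cap, 0.0) + p
    # LOLE = sum over days of P(available capacity < peak load).
    return sum(p for peak in daily_peaks
                 for cap, p in states.items() if cap < peak)
```

The criterion counts only how often capacity falls short; it ignores how large or how long each shortfall is, which is the limitation the energy-related indices in the study address.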
This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.
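Among the model families listed, the NHPP models are the most compact to illustrate. A minimal sketch, assuming the classic Goel-Okumoto mean value function m(t) = a(1 - exp(-b t)); this particular parameterization is a standard NHPP example, not one taken from the book's own case studies:

```python
import math

def goel_okumoto_mean(t: float, a: float, b: float) -> float:
    """Expected cumulative faults detected by time t under the
    Goel-Okumoto NHPP model: m(t) = a * (1 - exp(-b * t)).

    a: expected total number of faults; b: per-fault detection rate.
    """
    return a * (1 - math.exp(-b * t))

def reliability(t: float, x: float, a: float, b: float) -> float:
    """Probability of no failure in (t, t + x], given the NHPP property
    that failure counts in disjoint intervals are independent Poisson:
    R(x | t) = exp(-(m(t + x) - m(t)))."""
    return math.exp(-(goel_okumoto_mean(t + x, a, b)
                      - goel_okumoto_mean(t, a, b)))
```

Fitting a and b to an OSS project's bug-tracker timestamps (e.g. by maximum likelihood) then yields the quantitative reliability assessments the book describes; the hazard rate and stochastic differential equation models it also covers refine the same basic counting-process idea.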