WorldWideScience

Sample records for failure assessment curve

  1. The elastic-plastic failure assessment diagram of surface cracked structure

    International Nuclear Information System (INIS)

    Ning, J.; Gao, Q.

    1987-01-01

    The simplified NLSM can calculate the EPFM parameters and the failure assessment curve for a surface cracked structure correctly and conveniently. The elastic-plastic failure assessment curve of a surface crack depends on the crack geometry, the loading form and the material deformation behaviour. It is therefore necessary to construct the EPFM failure assessment curve of the surface crack when assessing the failure of surface cracked structures. (orig./HP)

  2. Failure assessment diagrams for circular hollow section X- and K-joints

    International Nuclear Information System (INIS)

    Qian, Xudong

    2013-01-01

    This paper reports the failure assessment curves for semi-elliptical surface cracks located at hot-spot positions in the circular hollow section X- and K-joints. The failure assessment curves derive from the square root of the ratio between the linear–elastic and the elastic–plastic energy release rates, computed from the domain-integral approach. This study examines both the material and geometric dependence of the failure assessment curves. The area reduction factor, used in defining the strength of the cracked joints, imposes a significant effect on the computed failure assessment curve. The failure assessment curves indicate negligible variations with respect to the crack-front locations and the material yield strength. The crack depth ratio exerts a stronger effect on the computed failure assessment curve than does the crack aspect ratio. This study proposes a parametric expression for the failure assessment curves based on the geometric parameters for surface cracks in circular hollow section X- and K-joints. -- Highlights: ► This study proposes geometric dependent expressions of FADs for tubular joints. ► We examine the geometric and material dependence of the FADs for X- and K-joints. ► The proposed FAD is independent of yield strength and is a lower-bound for typical hardening

  3. Analysis of leak and break behavior in a failure assessment diagram for carbon steel pipes

    International Nuclear Information System (INIS)

    Kanno, Satoshi; Hasegawa, Kunio; Shimizu, Tasuku; Saitoh, Takashi; Gotoh, Nobuho

    1992-01-01

    The leak and break behavior of a cracked coolant pipe subjected to an internal pressure and a bending moment was analyzed with a failure assessment diagram using the R6 approach. This paper examines the conditions for detectable coolant leakage without breakage. A leakage assessment curve, a locus of assessment points for detectable coolant leakage, was defined in the failure assessment diagram. The region between the leak assessment and failure assessment curves satisfies the condition of detectable leakage without breakage. In this region, a crack can be safely detected by a coolant leak detector. (orig.)
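
A minimal sketch of the R6-style assessment referenced in this record: a point (Lr, Kr) is acceptable if it lies inside the region bounded by the failure assessment curve. The curve below is the well-known general-purpose R6 Option 1 expression; the cut-off value and the sample points are illustrative assumptions, not taken from the paper.

```python
import math

def r6_option1(Lr):
    """General-purpose R6 Option 1 failure assessment curve f(Lr)."""
    return (1.0 - 0.14 * Lr**2) * (0.3 + 0.7 * math.exp(-0.65 * Lr**6))

def is_acceptable(Lr, Kr, Lr_max=1.2):
    """An assessment point (Lr, Kr) is acceptable if it falls below the
    curve and inside the plastic-collapse cut-off Lr_max (assumed here)."""
    return Lr <= Lr_max and Kr < r6_option1(Lr)
```

A leak-before-break check of the kind described would plot both the leakage and failure assessment points on this diagram and test which region they fall in.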

  4. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
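
The sample-reuse idea in this record can be sketched as follows: estimate the POF by Monte Carlo with the inspection modeled by a POD curve, then reuse the same samples to get the first-order effect of an additive shift of the POD over a size region. Everything numeric here (lognormal crack sizes, logistic POD, `a50`, `a_crit`) is an invented placeholder, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-inspection fatigue model (illustrative numbers only):
n = 200_000
a0 = rng.lognormal(-1.0, 0.4, size=n)         # crack size at inspection, mm
a_final = a0 * rng.uniform(2.0, 4.0, size=n)  # crack size at end of life, mm
a_crit = 1.5                                  # critical crack size, mm

def pod(a, a50=0.8, b=0.15):
    """Assumed logistic POD curve; a50 is the 50%-detection size."""
    return 1.0 / (1.0 + np.exp(-(a - a50) / b))

# POF: failure requires the crack to grow critical AND the inspection
# to have missed it.
grows_critical = a_final > a_crit
pof = np.mean((1.0 - pod(a0)) * grows_critical)

def region_sensitivity(lo, hi):
    """First-order dPOF/d(delta) for an additive shift delta of the POD
    curve over sizes [lo, hi), estimated by reusing the existing Monte
    Carlo samples -- i.e. at no extra sampling cost."""
    in_region = (a0 >= lo) & (a0 < hi)
    return -np.mean(in_region & grows_critical)
```

The sensitivity is negative, as expected: raising the POD anywhere lowers the POF, and the region contributing the largest magnitude identifies the important part of the POD curve.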

  5. The Component And System Reliability Analysis Of Multipurpose Reactor G.A. Siwabessy Based On The Failure Rate Curve

    International Nuclear Information System (INIS)

    Sriyono; Ismu Wahyono, Puradwi; Mulyanto, Dwijo; Kusmono, Slamet

    2001-01-01

    The main components of the Multipurpose Reactor G.A. Siwabessy have been analyzed using their failure rate curves. The components analyzed were the pumps of the ''Fuel Storage Pool Purification System'' (AK-AP), ''Primary Cooling System'' (JE01-AP), ''Primary Pool Purification System'' (KBE01-AP), ''Warm Layer System'' (KBE02-AP), ''Cooling Tower'' (PA/D-AH), ''Secondary Cooling System'', and the Diesel (BRV). The failure rate curve was built from a component database taken from the operation log book of RSG-GAS; the curve covers a total of 2500 operating hours. From the curve it was concluded that the failure rates of the components follow a bathtub curve, with maintenance activities accounting for the anomalies in the curve.
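
The bathtub shape mentioned in this record is commonly reproduced as the superposition of Weibull hazard rates with shape parameters below, at, and above one. A generic sketch (the shape and scale parameters are illustrative, not fitted to the RSG-GAS data):

```python
def weibull_hazard(t, shape, scale):
    """Weibull hazard rate h(t) = (k/lambda) * (t/lambda)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def bathtub_hazard(t):
    """Illustrative bathtub curve: the sum of a decreasing (infant
    mortality), a constant (useful life), and an increasing (wear-out)
    Weibull hazard."""
    return (weibull_hazard(t, 0.5, 500.0)      # early failures, shape < 1
            + weibull_hazard(t, 1.0, 2000.0)   # random failures, shape = 1
            + weibull_hazard(t, 4.0, 3000.0))  # wear-out, shape > 1
```

The hazard is high early, flattens in mid-life, and rises again near wear-out, which is the pattern the abstract reports for the component failure rates.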

  6. Variations of fracture toughness and stress-strain curve of cold worked stainless steel and their influence on failure strength of cracked pipe

    International Nuclear Information System (INIS)

    Kamaya, Masayuki

    2016-01-01

    In order to assess failure probability of cracked components, it is important to know the variations of the material properties and their influence on the failure load assessment. In this study, variations of the fracture toughness and stress-strain curve were investigated for cold worked stainless steel. The variations of the 0.2% proof and ultimate strengths obtained using 8 specimens of 20% cold worked stainless steel (CW20) were 77 MPa and 81 MPa, respectively. The respective variations were decreased to 13 and 21 MPa for 40% cold worked material (CW40). Namely, the variation in the tensile strength was decreased by hardening. The COVs (coefficients of variation) of fracture toughness were 7.3% and 16.7% for CW20 and CW40, respectively. Namely, the variation in the fracture toughness was increased by hardening. Then, in order to investigate the influence of the variations in the material properties on failure load of a cracked pipe, flaw assessments were performed for a cracked pipe subjected to a global bending load. Using the obtained material properties led to variation in the failure load. The variation in the failure load of the cracked pipe caused by the variation in the stress-strain curve was less than 1.5% for the COV. The variation in the failure load caused by fracture toughness variation was relatively large for CW40, although it was less than 2.0% for the maximum case. It was concluded that the hardening induced by cold working does not cause significant variation in the failure load of cracked stainless steel pipe. (author)
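
The COV figures quoted in this record are simply the sample standard deviation divided by the sample mean, expressed in percent. A one-liner for computing them from specimen data (the example values in the test are made up, not the paper's measurements):

```python
import statistics

def cov_percent(samples):
    """Coefficient of variation: sample standard deviation over the
    sample mean, in percent."""
    return 100.0 * statistics.stdev(samples) / statistics.mean(samples)
```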

  7. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  8. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  9. A simplified early-warning system for imminent landslide prediction based on failure index fragility curves developed through numerical analysis

    Directory of Open Access Journals (Sweden)

    Ugur Ozturk

    2016-07-01

    Full Text Available Early-warning systems (EWSs) are crucial to reducing landslide risk, especially where structural measures cannot fully prevent the devastating impact of such an event. Furthermore, designing and successfully implementing a complete landslide EWS is a highly complex task. The main technical challenges are linked to the definition of heterogeneous material properties (geotechnical and geomechanical parameters) as well as the variety of triggering factors. In addition, real-time data processing creates significant complexity, since data collection and numerical modelling for risk assessment are time-consuming tasks. Uncertainties in the physical properties of a landslide, together with data management, therefore represent two crucial deficiencies in an efficient landslide EWS. This study explores the application of the concept of fragility curves to landslides; fragility curves are widely used to simulate system response to natural hazards such as floods or earthquakes. The application of fragility curves to landslide risk assessment is believed to simplify emergency risk assessment, even though it cannot substitute for detailed analysis during peace-time. A simplified risk assessment technique can remove some of the unclear features and decrease data processing time. The method is based on synthetic samples which are used to define approximate failure thresholds for landslides, taking into account the materials and the piezometric levels. The results are presented in charts. The method presented in this paper, called the failure index fragility curve (FIFC), allows assessment of the actual real-time risk in a case study based on the most appropriate FIFC. The application of an FIFC to a real case is presented as an example. This method of assessing landslide risk is another step towards a more integrated dynamic approach to a potential landslide prevention system. Even if it does not define
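
Fragility curves of the kind discussed here are often parameterized as a lognormal CDF of the demand variable. A minimal sketch (the lognormal form, median capacity, and dispersion are generic assumptions commonly used for earthquake and flood fragility, not the paper's FIFC formulation):

```python
import math

def fragility(x, median, beta):
    """Lognormal fragility curve: conditional probability of failure
    given a demand level x (e.g. a piezometric level), where `median`
    is the median capacity and `beta` the logarithmic dispersion."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))
```

An early-warning check would then compare the real-time demand reading against such a precomputed curve instead of rerunning a full numerical model.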

  10. Failure prediction of low-carbon steel pressure vessel and cylindrical models

    International Nuclear Information System (INIS)

    Zhang, K.D.; Wang, W.

    1987-01-01

    The failure loads predicted by failure assessment methods (namely the net-section stress criterion; the EPRI engineering approach for elastic-plastic analysis; the CEGB failure assessment route; the modified R6 curve by Milne for strain hardening; and the failure assessment curve based on J estimation by Ainsworth) have been compared with burst test results on externally, axially sharp-notched pressure vessel and open-ended cylinder models made from typical low-carbon steel St45 seamless tube, which has a transverse true stress-strain curve of straight-line-and-parabola type and a high ratio of ultimate to yield strength. It was concluded from the comparison that whilst the net-section stress criterion and the CEGB route did not give conservative predictions, Milne's modified curve gave a conservative and good prediction; Ainsworth's curve gave a fairly conservative prediction; and the EPRI solutions could also conditionally give a good prediction, but the conditions are still somewhat uncertain. It is suggested that Milne's modified R6 curve be used in the failure assessment of low-carbon steel pressure vessels. (author)

  11. Experience Curves: A Tool for Energy Policy Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Neij, Lena; Helby, Peter [Lund Univ. (Sweden). Environmental and Energy Systems Studies; Dannemand Andersen, Per; Morthorst, Poul Erik [Riso National Laboratory, Roskilde (Denmark); Durstewitz, Michael; Hoppe-Kilpper, Martin [Inst. fuer Solare Energieversorgungstechnik e.V., Kassel (DE); and others

    2003-07-01

    The objective of the project, Experience curves: a tool for energy policy assessment (EXTOOL), was to analyse the experience curve as a tool for the assessment of energy policy measures. This is of special interest, since the use of experience curves for the assessment of energy policy measures requires the development of the established experience curve methodology. This development raises several questions, which have been addressed and analysed in this project. The analysis is based on case studies of wind power, an area with considerable experience in technology development, deployment and policy measures. A case study based on wind power therefore provides a good opportunity to study the usefulness of experience curves as a tool for the assessment of energy policy measures. However, the results are discussed in terms of using experience curves for the assessment of any energy technology. The project shows that experience curves can be used to assess the effect of combined policy measures in terms of cost reductions. Moreover, the results of the project show that experience curves can be used to analyse international 'learning systems', i.e. cost reductions brought about by the development of wind power and policy measures used in other countries. Nevertheless, the use of experience curves for the assessment of policy programmes has several limitations. First, the analysis and assessment of policy programmes cannot be achieved unless relevant experience curves based on good data can be developed. The authors are of the opinion that only studies that provide evidence of the validity, reliability and relevance of experience curves should be taken into account in policy making. Second, experience curves provide an aggregated picture of the situation; more detailed analysis of the various sources of cost reduction, and of cost reductions resulting from individual policy measures, requires additional data and analysis tools. Third, we do not recommend the use of
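
The experience-curve methodology referenced in this record models unit cost as a power law of cumulative capacity, so cost falls by a fixed "progress ratio" with every doubling of cumulative production. A sketch with illustrative numbers (a progress ratio of 0.9 corresponds to a 10% learning rate):

```python
import math

def experience_cost(cum_capacity, c0, progress_ratio):
    """Unit cost after cumulative production `cum_capacity` (in units of
    the first unit), starting from first-unit cost c0, with cost falling
    to `progress_ratio` of its value per doubling of cumulative capacity."""
    b = math.log(progress_ratio) / math.log(2.0)  # experience-curve exponent
    return c0 * cum_capacity ** b
```

Fitting such a curve to observed cost and deployment data is what allows the aggregate effect of policy measures to be expressed as a cost reduction, as the abstract describes.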

  12. Flood damage curves for consistent global risk assessments

    Science.gov (United States)

    de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek

    2016-04-01

    Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves which are based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to different methodologies employed for various damage models in different countries, damage assessments cannot be directly compared with each other, obstructing also supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth as well as maximum damage values for a variety of assets and land use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra
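
A depth-damage assessment of the kind described combines a relative damage curve with a country-level maximum damage value. A sketch with a made-up concave curve (the breakpoints and the maximum damage value are illustrative placeholders, not the dataset's values):

```python
import numpy as np

# Hypothetical depth-damage curve: damage fraction vs. water depth (m),
# concave as in the dataset described above.
depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])
fractions = np.array([0.0, 0.25, 0.40, 0.60, 0.85, 1.00])

def flood_damage(depth_m, max_damage):
    """Direct damage = interpolated damage fraction times the
    country-level maximum damage value for the asset class."""
    frac = np.interp(depth_m, depths, fractions)  # clamps outside the range
    return frac * max_damage
```

Differentiating only the `max_damage` value by country, while sharing the relative curve per continent, is what makes the resulting assessments comparable across borders.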

  13. Statistical assessment of the learning curves of health technologies.

    Science.gov (United States)

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) To systematically identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers, such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; and generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second

  14. Sequential Organ Failure Assessment Score for Evaluating Organ Failure and Outcome of Severe Maternal Morbidity in Obstetric Intensive Care

    Directory of Open Access Journals (Sweden)

    Antonio Oliveira-Neto

    2012-01-01

    Full Text Available Objective. To evaluate the performance of the Sequential Organ Failure Assessment (SOFA) score in cases of severe maternal morbidity (SMM). Design. Retrospective study of diagnostic validation. Setting. An obstetric intensive care unit (ICU) in Brazil. Population. 673 women with SMM. Main Outcome Measures. Mortality and SOFA score. Methods. Organ failure was evaluated according to the maximum score for each of the six components. The total maximum SOFA score was calculated using the poorest result of each component, reflecting the maximum degree of alteration in systemic organ function. Results. The highest total maximum SOFA score was associated with mortality: 12.06 ± 5.47 for women who died and 1.87 ± 2.56 for survivors. There was also a significant correlation between the number of failing organs and maternal mortality, ranging from 0.2% (no failure) to 85.7% (≥3 organs). Analysis of the area under the receiver operating characteristic (ROC) curve (AUC) confirmed the excellent performance of the total maximum SOFA score for cases of SMM (AUC = 0.958). Conclusions. The total maximum SOFA score proved to be an effective tool for evaluating severity and estimating prognosis in cases of SMM. The maximum SOFA score may be used to conceptually define and stratify the degree of severity in cases of SMM.
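
The total maximum SOFA score described in this record takes the worst daily score of each of the six organ components and sums them. A sketch of just that aggregation step (component names are abbreviated; the clinical scoring thresholds for each component are not reproduced here):

```python
def total_maximum_sofa(daily_scores):
    """Total maximum SOFA: for each of the six organ components, take
    the worst (maximum) daily score over the ICU stay, then sum the six
    maxima. `daily_scores` is a list of {component: score 0-4} dicts,
    one per day; missing components count as 0."""
    organs = ("respiration", "coagulation", "liver",
              "cardiovascular", "cns", "renal")
    return sum(max((day.get(o, 0) for day in daily_scores), default=0)
               for o in organs)
```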

  15. Evaluation of flaws in ferritic piping: ASME Code Appendix J, Deformation Plasticity Failure Assessment Diagram (DPFAD)

    International Nuclear Information System (INIS)

    Bloom, J.M.

    1991-08-01

    This report summarizes the methods and bases used by an ASME Code procedure for the evaluation of flaws in ferritic piping. The procedure is currently under consideration by the ASME Boiler and Pressure Vessel Code Committee for Section XI. The procedure was initially proposed in 1985 for the evaluation of the acceptability of flaws detected in piping during in-service inspection for certain materials, identified in Article IWB-3640 of the ASME Boiler and Pressure Vessel Code Section XI, ''Rules for In-service Inspection of Nuclear Power Plant Components,'' for which the fracture toughness is not sufficiently high to justify acceptance based solely on the plastic limit load evaluation methodology of Appendix C and IWB-3641. The procedure, referred to as Appendix J, originally included two approaches: a J-integral based tearing instability (J-T) analysis and the deformation plasticity failure assessment diagram (DPFAD) methodology. In Appendix J, the general DPFAD approach was simplified for application to part-through wall flaws in ferritic piping through the use of a single DPFAD curve for circumferential flaws. Axial flaws are handled using two DPFAD curves, where the ratio of flaw depth to wall thickness is used to determine the appropriate DPFAD curve. Flaws are evaluated in Appendix J by comparing the actual pipe applied stress with the allowable stress, with the appropriate safety factors, for the flaw size at the end of the evaluation period. Assessment points for circumferential and axial flaws are plotted on the appropriate failure assessment diagram. In addition, this report summarizes the experimental test predictions of the results of the Battelle Columbus Laboratory experiments, the Eiber experiments, and the JAERI tests using the Appendix J DPFAD methodology. Lastly, this report also provides guidelines for handling residual stresses in the evaluation procedure. 22 refs., 13 figs., 5 tabs

  16. Comparing risk of failure models in water supply networks using ROC curves

    International Nuclear Information System (INIS)

    Debon, A.; Carrion, A.; Cabrera, E.; Solano, H.

    2010-01-01

    The problem of predicting the failure of water mains has been considered from different perspectives and using several methodologies in engineering literature. Nowadays, it is important to be able to accurately calculate the failure probabilities of pipes over time, since water company profits and service quality for citizens depend on pipe survival; forecasting pipe failures could have important economic and social implications. Quantitative tools (such as managerial or statistical indicators and reliable databases) are required in order to assess the current and future state of networks. Companies managing these networks are trying to establish models for evaluating the risk of failure in order to develop a proactive approach to the renewal process, instead of using traditional reactive pipe substitution schemes. The main objective of this paper is to compare models for evaluating the risk of failure in water supply networks. Using real data from a water supply company, this study has identified which network characteristics affect the risk of failure and which models better fit data to predict service breakdown. The comparison using the receiver operating characteristics (ROC) graph leads us to the conclusion that the best model is a generalized linear model. Also, we propose a procedure that can be applied to a pipe failure database, allowing the most appropriate decision rule to be chosen.
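
The ROC comparison described here reduces, in its simplest summary, to the AUC: the probability that a randomly chosen failed pipe receives a higher predicted risk than a randomly chosen non-failed one (ties counting one half). A brute-force sketch of that statistic, which can be computed for each candidate model from its predicted risks:

```python
def roc_auc(scores_failed, scores_ok):
    """AUC by direct pairwise comparison: fraction of (failed, ok) pairs
    in which the failed pipe gets the higher risk score; ties count 1/2.
    Quadratic in sample size, so suitable only as an illustration."""
    wins = sum((f > o) + 0.5 * (f == o)
               for f in scores_failed for o in scores_ok)
    return wins / (len(scores_failed) * len(scores_ok))
```

The model with the larger AUC discriminates failures better; the paper's comparison of candidate models on the ROC graph is a refinement of this single-number summary.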

  17. Comparing risk of failure models in water supply networks using ROC curves

    Energy Technology Data Exchange (ETDEWEB)

    Debon, A., E-mail: andeau@eio.upv.e [Centro de Gestion de la Calidad y del Cambio, Dpt. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica de Valencia, E-46022 Valencia (Spain); Carrion, A. [Centro de Gestion de la Calidad y del Cambio, Dpt. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica de Valencia, E-46022 Valencia (Spain); Cabrera, E. [Dpto. De Ingenieria Hidraulica Y Medio Ambiente, Instituto Tecnologico del Agua, Universidad Politecnica de Valencia, E-46022 Valencia (Spain); Solano, H. [Universidad Diego Portales, Santiago (Chile)

    2010-01-15

    The problem of predicting the failure of water mains has been considered from different perspectives and using several methodologies in engineering literature. Nowadays, it is important to be able to accurately calculate the failure probabilities of pipes over time, since water company profits and service quality for citizens depend on pipe survival; forecasting pipe failures could have important economic and social implications. Quantitative tools (such as managerial or statistical indicators and reliable databases) are required in order to assess the current and future state of networks. Companies managing these networks are trying to establish models for evaluating the risk of failure in order to develop a proactive approach to the renewal process, instead of using traditional reactive pipe substitution schemes. The main objective of this paper is to compare models for evaluating the risk of failure in water supply networks. Using real data from a water supply company, this study has identified which network characteristics affect the risk of failure and which models better fit data to predict service breakdown. The comparison using the receiver operating characteristics (ROC) graph leads us to the conclusion that the best model is a generalized linear model. Also, we propose a procedure that can be applied to a pipe failure database, allowing the most appropriate decision rule to be chosen.

  18. Towards a whole-network risk assessment for railway bridge failures caused by scour during flood events

    Directory of Open Access Journals (Sweden)

    Lamb Rob

    2016-01-01

    Full Text Available Localised erosion (scour) during flood flow conditions can lead to costly damage or catastrophic failure of bridges, and in some cases loss of life or significant disruption to transport networks. Here, we take a broad scale view to assess risk associated with bridge scour during flood events over an entire infrastructure network, illustrating the analysis with data from the British railways. There have been 54 recorded events since 1846 in which scour led to the failure of railway bridges in Britain. These events tended to occur during periods of extremely high river flow, although there is uncertainty about the precise conditions under which failures occur, which motivates a probabilistic analysis of the failure events. We show how data from the historical bridge failures, combined with hydrological analysis, have been used to construct fragility curves that quantify the conditional probability of bridge failure as a function of river flow, accompanied by estimates of the associated uncertainty. The new fragility analysis is tested using flood events simulated from a national, spatial joint probability model for extremes in river flows. The combined models appear robust in comparison with historical observations of the expected number of bridge failures in a flood event, and provide an empirical basis for further broad-scale network risk analysis.

  19. Improving FMEA risk assessment through reprioritization of failures

    Science.gov (United States)

    Ungureanu, A. L.; Stan, G.

    2016-08-01

    Most current methods used to assess failures and identify defects in industrial equipment are based on the determination of a Risk Priority Number (RPN). Although the conventional RPN calculation is easy to understand and use, the methodology has some limitations, such as the large number of duplicate rankings and the difficulty of assessing the RPN indices. In order to eliminate these shortcomings, this paper puts forward an easy and efficient computing method, called Failure Developing Mode and Criticality Analysis (FDMCA), which takes into account failures and the evolution of defects in time, from the first appearance of a failure to a breakdown.
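
The conventional RPN mentioned here is the product of the severity, occurrence, and detection ratings, and the duplicate-ranking limitation the paper addresses is easy to reproduce (the failure modes and ratings below are invented for illustration):

```python
def rpn(severity, occurrence, detection):
    """Classical FMEA Risk Priority Number; each index is rated 1-10."""
    return severity * occurrence * detection

# Hypothetical failure modes: (severity, occurrence, detection).
modes = {
    "seal leak":    (7, 5, 2),   # RPN 70
    "bearing wear": (5, 7, 2),   # RPN 70 -- duplicate of the above
    "shaft crack":  (9, 2, 4),   # RPN 72
}
ranked = sorted(modes, key=lambda m: rpn(*modes[m]), reverse=True)
```

Note the tie at RPN 70 despite very different severity profiles; resolving such ties is one motivation for reprioritization schemes like FDMCA.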

  20. FIB/FESEM experimental and analytical assessment of R-curve behavior of WC–Co cemented carbides

    Energy Technology Data Exchange (ETDEWEB)

    Tarragó, J.M., E-mail: jose.maria.tarrago@upc.edu [CIEFMA, Departament de Ciència dels Materials i Enginyeria Metallúrgica, ETSEIB, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); CRnE, Centre de Recerca en Nanoenginyeria, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); Jiménez-Piqué, E. [CIEFMA, Departament de Ciència dels Materials i Enginyeria Metallúrgica, ETSEIB, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); CRnE, Centre de Recerca en Nanoenginyeria, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); Schneider, L. [Sandvik Hyperion, Coventry CV4 0XG (United Kingdom); Casellas, D. [Fundació CTM Centre Tecnològic, 08243 Manresa (Spain); Torres, Y. [Departamento de Ingeniería y Ciencia de los Materiales y del Transporte, ETSI, Universidad de Sevilla, 41092 Sevilla (Spain); Llanes, L. [CIEFMA, Departament de Ciència dels Materials i Enginyeria Metallúrgica, ETSEIB, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); CRnE, Centre de Recerca en Nanoenginyeria, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain)

    2015-10-01

Exceptional fracture toughness levels exhibited by WC–Co cemented carbides (hardmetals) are due mainly to toughening derived from plastic stretching of crack-bridging ductile enclaves. This takes place due to the development of a multiligament zone in the wake of cracks growing in a stable manner. As a result, hardmetals exhibit crack growth resistance (R-curve) behavior. In this work, the toughening mechanics and mechanisms of these materials are investigated by combining experimental and analytical approaches. Focused Ion Beam technique (FIB) and Field-Emission Scanning Electron Microscopy (FESEM) are implemented to obtain serial sectioning and imaging of crack–microstructure interaction in cracks arrested after stable extension under monotonic loading. The micrographs obtained provide experimental proof of the developing multiligament zone, including failure micromechanisms within individual bridging ligaments. Analytical assessment of the multiligament zone is then conducted on the basis of experimental information attained from FIB/FESEM images, and a model for the description of R-curve behavior of hardmetals is proposed. It was found that, due to the large stresses supported by the highly constrained and strongly bonded bridging ligaments, WC–Co cemented carbides exhibit quite steep but short R-curve behavior. Relevant strength and reliability attributes exhibited by hardmetals may then be rationalized on the basis of such a toughening scenario.

  1. Reliability assessment of a manual-based procedure towards learning curve modeling and FMEA analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

Separation procedures in drug Distribution Centers (DC) are manual activities prone to failures such as shipping the wrong, expired, or broken drugs to the customer. Two interventions seem promising for improving the reliability of the separation procedure: (i) selecting and allocating appropriate operators to the procedure, and (ii) analysing the potential failure modes incurred by the selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) with the aim of reducing the occurrence of failures in the manual separation of a drug DC. The LC parameters enable generating an index that identifies the operators recommended to perform the procedures. FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. The traditional FMEA severity index is also deployed into two sub-indexes, related to financial issues and to damage to the company's image, in order to characterize failure severity. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
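The abstract does not specify which learning-curve form the authors fit; a common choice is the power-law (Wright) curve, in which every doubling of repetitions multiplies the unit time by a fixed learning rate. A minimal sketch, assuming that form and invented numbers:

```python
import math

# Power-law (Wright) learning curve: t(n) = t1 * n**(-b), where b is
# recovered from the learning rate LR (doubling output multiplies unit
# time by LR), so b = -log2(LR).  Illustrative only -- the paper's actual
# LC model and parameters are not reproduced here.

def lc_time(t1: float, n: int, learning_rate: float) -> float:
    b = -math.log2(learning_rate)
    return t1 * n ** (-b)

# An operator needing 100 s on the first separation, with an 80% curve:
t1 = lc_time(100.0, 1, 0.80)   # 100.0 s
t2 = lc_time(100.0, 2, 0.80)   # 80.0 s (doubling -> 80% of previous)
t4 = lc_time(100.0, 4, 0.80)   # 64.0 s
```

Fitted parameters such as the asymptotic time and learning rate are what an operator-selection index like the one described could be built from.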

  2. Common-Cause Failure Analysis in Event Assessment

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Kelly, D.L.

    2008-01-01

    This paper reviews the basic concepts of modeling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group

  3. Frailty Assessment in Heart Failure: an Overview of the Multi-domain Approach.

    Science.gov (United States)

    McDonagh, Julee; Ferguson, Caleb; Newton, Phillip J

    2018-02-01

The study aims (1) to provide a contemporary description of frailty assessment in heart failure and (2) to provide an overview of multi-domain frailty assessment in heart failure. Frailty assessment is an important predictive measure for mortality and hospitalisation in individuals with heart failure. To date, there are no frailty assessment instruments validated for use in heart failure. This has resulted in significant heterogeneity between studies regarding the assessment of frailty. The most common frailty assessment instrument used in heart failure is the Frailty Phenotype, which focuses on five physical domains of frailty; the appropriateness of a purely physical measure of frailty in individuals with heart failure, who frequently experience decreased exercise tolerance and shortness of breath, is yet to be determined. A limited number of studies have approached frailty assessment using a multi-domain view, which may be more clinically relevant in heart failure. There remains a lack of consensus regarding frailty assessment and an absence of a validated instrument in heart failure. Despite this, frailty continues to be assessed frequently, primarily for research purposes, using predominantly physical frailty measures. A more multidimensional view of frailty assessment using a multi-domain approach will likely be more sensitive in identifying at-risk patients.

  4. Application of Master Curve Methodology for Structural Integrity Assessments of Nuclear Components

    Energy Technology Data Exchange (ETDEWEB)

    Sattari-Far, Iradj [Det Norske Veritas, Stockholm (Sweden); Wallin, Kim [VTT, Esbo (Finland)

    2005-10-15

The objective was to perform an in-depth investigation of the Master Curve methodology and, based on this method, to develop a procedure for fracture assessments of nuclear components. The project has sufficiently illustrated the capabilities of the Master Curve methodology for fracture assessments of nuclear components. Within the scope of this work, the theoretical background of the methodology and its validation on small and large specimens has been studied and presented to a sufficiently large extent, as well as the correlations between the Charpy-V data and the Master Curve T{sub 0} reference temperature in the evaluation of fracture toughness. The work gives a comprehensive report of the background theory and the different applications of the Master Curve methodology. The main results of the work have shown that cleavage fracture toughness is characterized by a large amount of statistical scatter in the transition region, that it is specimen-size dependent, and that it should be treated statistically rather than deterministically. The Master Curve methodology is able to make use of statistical data in a consistent way. Furthermore, the Master Curve methodology provides a more precise prediction of the fracture toughness of embrittled materials in comparison with the ASME K{sub IC} reference curve, which often gives over-conservative results. The suggested procedure in this study, concerning the application of the Master Curve method in fracture assessments of ferritic steels in the transition and lower-shelf regions, is valid for the temperature range T{sub 0}-50 ≤ T ≤ T{sub 0}+50 deg C. If only approximate information is required, the Master Curve may well be extrapolated outside this temperature range. The suggested procedure has also been illustrated with some examples.
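The median toughness expression underlying the Master Curve methodology has a standard form (the ASTM E1921 Master Curve for 1T-size specimens), which can be sketched as:

```python
import math

# ASTM E1921 Master Curve: median fracture toughness of a 1T specimen as
# a function of test temperature T relative to the reference temperature
# T0 (both in deg C), K_Jc in MPa*sqrt(m).  The validity window matches
# the abstract: roughly T0-50 <= T <= T0+50.

def kjc_median(t_celsius: float, t0_celsius: float) -> float:
    return 30.0 + 70.0 * math.exp(0.019 * (t_celsius - t0_celsius))

# At T = T0 the median toughness is 100 MPa*sqrt(m) by definition:
k_at_t0 = kjc_median(-50.0, t0_celsius=-50.0)   # 100.0
k_warmer = kjc_median(0.0, t0_celsius=-50.0)    # higher, deeper into transition
```

The statistical scatter the abstract emphasises enters separately, through a three-parameter Weibull distribution about this median and a size adjustment for non-1T specimens.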

  5. A Zebrafish Heart Failure Model for Assessing Therapeutic Agents.

    Science.gov (United States)

    Zhu, Xiao-Yu; Wu, Si-Qi; Guo, Sheng-Ya; Yang, Hua; Xia, Bo; Li, Ping; Li, Chun-Qi

    2018-03-20

Heart failure is a leading cause of death, and the development of effective and safe therapeutic agents for heart failure has proven challenging. In this study, taking advantage of larval zebrafish, we developed a zebrafish heart failure model for drug screening and efficacy assessment. Zebrafish at 2 dpf (days postfertilization) were treated with verapamil at a concentration of 200 μM for 30 min, which was determined as the optimum condition for model development. Tested drugs were administered into zebrafish either by direct soaking or by circulation microinjection. After treatment, zebrafish were randomly selected and subjected either to visual observation and image acquisition or to video recording under a Zebralab Blood Flow System. The therapeutic effects of drugs on zebrafish heart failure were quantified by calculating the efficiency of heart dilatation, venous congestion, cardiac output, and blood flow dynamics. All 8 human heart failure therapeutic drugs (LCZ696, digoxin, irbesartan, metoprolol, qiliqiangxin capsule, enalapril, shenmai injection, and hydrochlorothiazide) showed significant preventive and therapeutic effects on zebrafish heart failure. The zebrafish heart failure model developed and validated in this study could be used for in vivo heart failure studies and for rapid screening and efficacy assessment of preventive and therapeutic drugs.

  6. A Failure Criterion for Concrete

    DEFF Research Database (Denmark)

    Ottosen, N. S.

    1977-01-01

A four-parameter failure criterion containing all three stress invariants explicitly is proposed for short-time loading of concrete. It corresponds to a smooth convex failure surface with curved meridians, which open in the negative direction of the hydrostatic axis, and whose trace in the deviatoric plane changes from nearly triangular to nearly circular with increasing hydrostatic pressure.
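A sketch of the four-parameter criterion described above, in the invariant form f = a·J2/fc² + λ·√J2/fc + b·I1/fc − 1 with λ a function of the Lode angle. The parameter values below are the commonly quoted set for a tensile-to-compressive strength ratio of 0.10; treat both the values and the sign conventions as illustrative rather than authoritative:

```python
import math

# Ottosen-type four-parameter concrete failure criterion (illustrative
# parameter set for ft/fc = 0.10; tension positive, stresses in the same
# units as the uniaxial compressive strength fc).
A, B, K1, K2 = 1.2759, 3.1962, 11.7365, 0.9801

def ottosen(sig1: float, sig2: float, sig3: float, fc: float) -> float:
    """Return f(sigma): f < 0 means no failure, f = 0 is the surface."""
    i1 = sig1 + sig2 + sig3
    j2 = ((sig1 - sig2) ** 2 + (sig2 - sig3) ** 2 + (sig3 - sig1) ** 2) / 6.0
    s = [x - i1 / 3.0 for x in (sig1, sig2, sig3)]
    j3 = s[0] * s[1] * s[2]
    cos3t = 1.0 if j2 == 0 else 1.5 * math.sqrt(3.0) * j3 / j2 ** 1.5
    cos3t = max(-1.0, min(1.0, cos3t))          # guard rounding
    if cos3t >= 0:                               # lambda(Lode angle)
        lam = K1 * math.cos(math.acos(K2 * cos3t) / 3.0)
    else:
        lam = K1 * math.cos(math.pi / 3.0 - math.acos(-K2 * cos3t) / 3.0)
    return A * j2 / fc ** 2 + lam * math.sqrt(j2) / fc + B * i1 / fc - 1.0

# Uniaxial compression at -fc and uniaxial tension at 0.1*fc should both
# lie (almost) on the failure surface for this parameter set:
f_comp = ottosen(0.0, 0.0, -30.0, fc=30.0)
f_tens = ottosen(3.0, 0.0, 0.0, fc=30.0)
```

The λ(cos 3θ) term is what deforms the deviatoric trace from nearly triangular at low pressure toward circular at high pressure.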

  7. Failure assessment of pressure vessels under yielding conditions

    International Nuclear Information System (INIS)

    Harrison, R.P.; Darlaston, B.J.L.; Townley, C.H.A.

    1977-01-01

    The paper summarizes the work carried out to establish the behavior of structures containing defects and outlines a failure assessment route which can be used to assess the integrity of a structure containing a defect. The basis for this failure assessment route is the two-criteria approach of Dowling and Townley, which can be applied to structures containing defects irrespective of whether they are in the linear elastic fracture mechanics regime, the fully plastic regime, or in an intermediate regime. The extension of this concept to include crack growth by stable tearing is dealt with in the paper

  8. Design fatigue curve for Hastelloy-X

    International Nuclear Information System (INIS)

    Nishiguchi, Isoharu; Muto, Yasushi; Tsuji, Hirokazu

    1983-12-01

In the design of components intended for elevated-temperature service, such as in the experimental Very High-Temperature gas-cooled Reactor (VHTR), it is essential to prevent fatigue failure and creep-fatigue failure. An evaluation method using design fatigue curves is adopted in the design rules. This report discusses several aspects of these design fatigue curves for Hastelloy-X (-XR), which is considered for use as a heat-resistant alloy in the VHTR. Examination of fatigue data gathered by a literature search, including unpublished data, showed that Brinkman's equation is suitable for the design curve of Hastelloy-X (-XR), where the total strain range Δε{sub t} is used as the independent variable and the fatigue life N{sub f} is transformed into log(log N{sub f}). (author)
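The report fits the curve with fatigue life transformed to log(log Nf) against the total strain range. A minimal sketch of a least-squares fit in that transformed space, using synthetic data and an assumed linear model form (the report's actual Brinkman-equation coefficients are not reproduced here):

```python
import math

# Illustrative fit in the transformed space the report describes:
# log10(log10(Nf)) regressed against log10 of the total strain range.
# Data are synthetic and the linear form is an assumption for illustration.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope            # intercept, slope

strain = [0.004, 0.006, 0.010, 0.020]        # total strain range (-)
life = [2.0e5, 3.0e4, 5.0e3, 6.0e2]          # cycles to failure (synthetic)

x = [math.log10(e) for e in strain]
y = [math.log10(math.log10(n)) for n in life]
a, b = fit_line(x, y)                        # slope b < 0: life falls with strain
```

Design curves are then obtained from such a mean fit by applying the code's factors on life and strain.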

  9. The prehospital intravenous access assessment: a prospective study on intravenous access failure and access delay in prehospital emergency medicine.

    Science.gov (United States)

    Prottengeier, Johannes; Albermann, Matthias; Heinrich, Sebastian; Birkholz, Torsten; Gall, Christine; Schmidt, Joachim

    2016-12-01

Intravenous access in prehospital emergency care allows for early administration of medication and extended measures such as anaesthesia. Cannulation may, however, be difficult, and failure and the resulting delay in treatment and transport may have negative effects on the patient. Therefore, our study aims to provide a concise assessment of the difficulties of prehospital venous cannulation. We analysed 23 candidate predictor variables for peripheral venous cannulations in terms of cannulation failure and exceedance of a 2 min time threshold. Multivariate logistic regression models were fitted for variables of predictive value (area under the respective receiver operating characteristic curve >0.6). A total of 762 intravenous cannulations were enrolled. In all, 22% of punctures failed on the first attempt and 13% of punctures exceeded 2 min. Model selection yielded a three-factor model (vein visibility without tourniquet, vein palpability with tourniquet and insufficient ambient lighting) of fair accuracy for the prediction of puncture failure (AUC=0.76) and a structurally congruent model of four factors (the failure-model factors plus vein visibility with tourniquet) for exceedance of the 2 min threshold (AUC=0.80). Our study offers a simple assessment to identify cases of difficult intravenous access in prehospital emergency care. Of the numerous factors subjectively perceived as possibly influencing cannulation, only the universal factors of lighting, vein visibility and palpability, none exclusive to emergency care, proved to be valid predictors of cannulation failure and exceedance of the 2 min threshold.
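The AUC figures quoted above (0.76 and 0.80) have a direct rank-based interpretation: the probability that a randomly chosen difficult case is scored higher than a randomly chosen routine one. A minimal sketch with invented scores and labels:

```python
# Area under the ROC curve via its Mann-Whitney interpretation: the
# fraction of positive/negative pairs ranked correctly (ties count 0.5).
# Scores and labels below are synthetic, for illustration only.

def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 1 = difficult cannulation, 0 = routine; higher score = predicted harder.
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1]
a = auc(scores, labels)   # 11 of 12 pos-neg pairs rank correctly -> 11/12
```

An AUC of 0.5 is chance-level discrimination; values of 0.76 and 0.80, as reported, correspond to "fair" accuracy.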

  10. Dynamic loads during failure risk assessment of bridge crane structures

    Science.gov (United States)

    Gorynin, A. D.; Antsev, V. Yu; Shaforost, A. N.

    2018-03-01

The paper presents a method for failure risk assessment of a bridge crane metal structure at the design stage. It also justifies the necessity of taking dynamic loads over the operational cycle of a bridge crane into account when assessing the failure risk of its metal structure.

  11. Long-term effects as the cause of failure in electronic components

    International Nuclear Information System (INIS)

    Renz, H.; Kreichgauer, H.

    1989-01-01

After a brief presentation of the utilisation properties of electronic components, their failure rates are discussed with particular reference to the so-called bathtub curve. The main emphasis is on the construction and manufacture of integrated circuits and the possible types and causes of failure arising from the individual manufacturing stages (layout faults, internal corrosion, masking and etching errors, leakage currents, inadequate heat removal, etc.). A technical insurance assessment is then provided of the long-term failures associated with technological matters. (orig.) [de

  12. Renal function assessment in heart failure.

    Science.gov (United States)

    Pérez Calvo, J I; Josa Laorden, C; Giménez López, I

    Renal function is one of the most consistent prognostic determinants in heart failure. The prognostic information it provides is independent of the ejection fraction and functional status. This article reviews the various renal function assessment measures, with special emphasis on the fact that the patient's clinical situation and response to the heart failure treatment should be considered for the correct interpretation of the results. Finally, we review the literature on the performance of tubular damage biomarkers. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  13. Assessment of Nonorganic Failure To Thrive.

    Science.gov (United States)

    Wooster, Donna M.

    1999-01-01

    This article describes basic assessment considerations for infants and toddlers exhibiting nonorganic failure to thrive. The evaluation process must examine feeding, maternal-child interactions, child temperament, and environmental risks and behaviors. Early identification and intervention are necessary to minimize the long-term developmental…

  14. Clinical use of nuclear cardiology in the assessment of heart failure

    International Nuclear Information System (INIS)

    Han Lei; Shi Hongcheng

    2011-01-01

Nuclear cardiology is the most commonly performed non-invasive cardiac imaging test in patients with heart failure, and it plays an important role in their assessment and management. Quantitative gated single-photon emission computed tomography is used to assess quantitatively cardiac volume, left ventricular ejection fraction, stroke volume, and cardiac diastolic function. Resting and stress myocardial perfusion imaging can not only distinguish nonischemic heart failure from ischemic heart failure, but also demonstrate myocardial viability. Diastolic heart failure, also termed heart failure with a preserved left ventricular ejection fraction, is readily identified by nuclear cardiology techniques and can be accurately estimated by the peak filling rate and the time to peak filling rate. Newer techniques such as three-dimensional quantitative gated single-photon emission computed tomography can assess movement of the left ventricle, and wall-thickening evaluation aids its assessment. Myocardial perfusion imaging is also commonly used to identify candidates for implantable cardiac defibrillator and cardiac resynchronization therapies. Neurotransmitter imaging using {sup 123}I-metaiodobenzylguanidine offers prognostic information in patients with heart failure. Metabolism and function in the heart are closely related, and energy substrate metabolism is a potential target of medical therapies to improve cardiac function in patients with heart failure. Cardiac metabolic imaging using {sup 123}I-15-(p-iodophenyl)-3-R,S-methylpentadecanoic acid is a commonly used tracer in clinical studies of metabolic heart failure. Nuclear cardiology tests, including neurotransmitter imaging and metabolic imaging, are now easily performed with new tracers to improve heart failure diagnosis. Nuclear cardiology techniques contribute significantly to identifying patients with heart failure and to guiding their management decisions. (authors)

  15. Reliability-based failure cause assessment of collapsed bridge during construction

    International Nuclear Information System (INIS)

    Choi, Hyun-Ho; Lee, Sang-Yoon; Choi, Il-Yoon; Cho, Hyo-Nam; Mahadevan, Sankaran

    2006-01-01

Until now, the failure cause assessments in many forensic reports have been carried out using a deterministic approach. However, such a forensic investigation may lead to unreasonable results far from the real collapse scenario, because the deterministic approach does not systematically take into account the uncertainties involved in the failures of structures. A reliability-based failure cause assessment (reliability-based forensic engineering) methodology is developed that can incorporate the uncertainties involved in structural failures, and it is applied to a collapsed bridge in order to identify the most critical failure scenario and find the cause that triggered the collapse. Moreover, to save evaluation time and cost, an algorithm for automated event tree analysis (ETA) is proposed, making it possible to automatically calculate the failure probabilities of the failure events and the occurrence probabilities of the failure scenarios. For the reliability analysis, uncertainties are estimated more reasonably by using a Bayesian approach based on the experimental laboratory testing data in the forensic report. To demonstrate its applicability, the proposed approach is applied to the Hang-ju Grand Bridge, which collapsed during construction, and compared with the deterministic approach.
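The automated event tree analysis described above reduces, at its core, to enumerating the paths through the tree and multiplying branch probabilities along each path. A minimal sketch with invented events and probabilities (the paper's bridge-specific events are not reproduced):

```python
# Minimal event-tree evaluation: each scenario is a path through the tree,
# and its occurrence probability is the product of the branch probabilities
# along the path.  Event names and numbers are invented for illustration.

def scenario_probability(branch_probs):
    p = 1.0
    for q in branch_probs:
        p *= q
    return p

# Initiating event, then success/failure branches of two safety barriers:
p_init = 1e-2
scenarios = {
    "barrier1 holds":                 [p_init, 0.95],
    "barrier1 fails, barrier2 holds": [p_init, 0.05, 0.90],
    "both barriers fail (collapse)":  [p_init, 0.05, 0.10],
}
probs = {name: scenario_probability(path) for name, path in scenarios.items()}

# Branch probabilities at each node sum to 1, so the scenario probabilities
# must sum back to the initiating-event frequency -- a useful consistency check:
total = sum(probs.values())
```

In the reliability-based version, each branch probability would itself come from a structural reliability analysis with Bayesian-updated uncertainties rather than from a fixed point estimate.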

  16. Parametric and quantitative analysis of MR renographic curves for assessing the functional behaviour of the kidney

    Energy Technology Data Exchange (ETDEWEB)

    Michoux, N.; Montet, X.; Pechere, A.; Ivancevic, M.K.; Martin, P.-Y.; Keller, A.; Didier, D.; Terrier, F.; Vallee, J.-P

    2005-04-01

The aim of this study was to refine the description of renal function based on MR images, through transit-time curve analysis in a normal population and in a population with renal failure, using the quantitative up-slope model. Thirty patients referred for a kidney MR exam were divided into a first population with well-functioning kidneys and a second population with renal failure from ischaemic kidney disease. The perfusion sequence consisted of an intravenous injection of Gd-DTPA and a fast GRE T1-TFE sequence with 90 deg. magnetisation preparation (Intera 1.5 T MR System, Philips Medical System). To convert the signal intensity into 1/T1, which is proportional to the contrast-media concentration, a flow-corrected calibration procedure was used. Following segmentation of regions of interest in the cortex and medulla of the kidney and in the abdominal aorta, outflow curves were obtained and filtered to remove high-frequency fluctuations. The up-slope model was then applied. Significant reductions of the cortical perfusion (Q{sub c}=0.057{+-}0.030 ml/(s 100 g) to Q{sub c}=0.030{+-}0.017 ml/(s 100 g), P<0.013), of the medullary perfusion (Q{sub m}=0.023{+-}0.018 ml/(s 100 g) to Q{sub m}=0.011{+-}0.006 ml/(s 100 g), P<0.046) and of the accumulation of contrast media in the medulla (Q{sub a}=0.005{+-}0.003 ml/(s 100 g) to Q{sub a}=0.0009{+-}0.0008 ml/(s 100 g), P<0.001) were found in the presence of renal failure. High correlations were found between the creatinine level and the accumulation Q{sub a} in the medulla (r{sup 2}=0.72, P<0.05), and between the perfusion ratio Q{sub c}/Q{sub m} and the accumulation Q{sub a} in the medulla (r{sup 2}=0.81, P<0.05). No significant difference in times to peak was found between the two populations, despite a trend for T{sub a}, the time to the end of the increasing contrast-accumulation period in the medulla, to arrive later in renal failure. Advances in MR signal calibration with the building of
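The up-slope method referred to above estimates perfusion as the maximum slope of the tissue concentration curve divided by the peak of the arterial input curve. A minimal sketch with synthetic curves (arbitrary units; the study's calibrated data are not reproduced):

```python
# Up-slope perfusion estimate from dynamic contrast curves:
#   Q ~ max slope of tissue curve / peak of arterial input curve.
# Curves below are synthetic and in arbitrary units, for illustration.

def max_upslope(times, conc):
    return max((c2 - c1) / (t2 - t1)
               for (t1, c1), (t2, c2) in zip(zip(times, conc),
                                             zip(times[1:], conc[1:])))

t = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]           # seconds
aorta = [0.0, 2.0, 8.0, 10.0, 7.0, 5.0]      # arterial input (a.u.)
cortex = [0.0, 0.1, 0.5, 1.1, 1.5, 1.6]      # cortical tissue curve (a.u.)

q = max_upslope(t, cortex) / max(aorta)      # 0.6 / 10 = 0.06 (a.u. per s)
```

With the flow-corrected calibration the abstract describes, the arbitrary units become concentrations and q scales to ml/(s 100 g).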

  17. [Reference curves for assessing the physical growth of male Wistar rats].

    Science.gov (United States)

    Cossio-Bolaños, Marco; Gómez Campos, Rossana; Vargas Vitoria, Rodrigo; Hochmuller Fogaça, Rosalvo Tadeu; de Arruda, Miguel

    2013-11-01

Wistar rats are one of the most popular strains routinely used in laboratory research, and as an important research tool they require strict control of variables such as age, sex and body weight if results are to be extrapolated to the human model. The aim was to develop reference curves for assessing the physical growth of male Wistar rats according to chronological age and somatic maturation, using a non-invasive approach. A cross-sectional sample of 731 male Wistar rats was studied. We assessed age, body weight and body surface. The LMS method was used to construct percentile curves based on weight and somatic maturation. The proposed physical growth curves can be used to track physical growth and to diagnose the nutritional status of male Wistar rats. The proposed percentile cut-off points are P3, P10, P25, P50, P75, P90 and P97. The results suggest that scientists from different areas can use these references in order to extrapolate the somatic growth phases of the laboratory rat to the human model, as a non-invasive alternative for assessing growth and nutritional status. Copyright AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.
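The LMS method mentioned above summarises a growth distribution at each age by three fitted quantities, skewness (L), median (M) and coefficient of variation (S), from which any centile can be reconstructed. A minimal sketch with invented L, M, S values (not the paper's fits):

```python
import math

# LMS-method centile reconstruction: the value at z-score z is
#   M * (1 + L*S*z)**(1/L)      for L != 0,
#   M * exp(S*z)                for L == 0.
# L, M, S below are invented for illustration, not the paper's fitted values.

def lms_centile(L: float, M: float, S: float, z: float) -> float:
    if L == 0:
        return M * math.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# Hypothetical body weight at one age with L=1 (no skew), M=300 g, S=0.10:
p50 = lms_centile(1.0, 300.0, 0.10, 0.0)     # 300.0 g: the median, P50
p97 = lms_centile(1.0, 300.0, 0.10, 1.881)   # z ~ 1.881 corresponds to P97
```

Plotting such centiles against age for the cut-off points listed (P3 through P97) yields the reference chart.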

  18. Semiconductor failure threshold estimation problem in electromagnetic assessment

    International Nuclear Information System (INIS)

    Enlow, E.W.; Wunsch, D.C.

    1984-01-01

Present semiconductor failure models, which predict the one-microsecond square-wave failure power level for use in system electromagnetic (EM) assessments and hardening design, are incomplete. This is because, for a majority of device types, there are insufficient data readily available in a composite data source to quantify the model parameters, and the inaccuracy of the models complicates the definition of adequate hardness margins and the quantification of EM performance. This paper presents new semiconductor failure models using a generic approach that is an integration and simplification of many present models. This generic approach uses two categorical models: one for diodes and transistors, and one for integrated circuits. The models were constructed from a large database of semiconductor failure data. The approach used for constructing diode and transistor failure-level models is based on device rated power and is simple to use and universally applicable. The model predicts the value of the 1 μs failure power to be used in the power failure models P = Kt{sup -1/2} or P = K{sub 1}t{sup -1} + K{sub 2}t{sup -1/2} + K{sub 3}
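The single-term pulse-power scaling quoted above (P = K·t^−1/2, the Wunsch-Bell form) can be anchored at the 1 μs failure level the models predict. A minimal sketch, with an invented device rating for illustration:

```python
import math

# Wunsch-Bell style scaling: if P1 is the failure power for a 1-microsecond
# square pulse, then P(t) = K * t**-0.5 with K = P1 * sqrt(1e-6).
# The three-term model in the text adds K1/t and a constant K3 for very
# short and very long pulses, respectively.

def failure_power(p_1us: float, t_seconds: float) -> float:
    k = p_1us * math.sqrt(1e-6)
    return k * t_seconds ** -0.5

# A hypothetical device failing at 100 W for a 1 us pulse:
p_100us = failure_power(100.0, 100e-6)   # a 100x longer pulse fails at 1/10 the power
```

The inverse-square-root dependence is why a single calibrated point at 1 μs suffices to extrapolate across the intermediate pulse-width regime.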

  19. Hysteroscopic sterilization using a virtual reality simulator: assessment of learning curve.

    Science.gov (United States)

    Janse, Juliënne A; Goedegebuure, Ruben S A; Veersema, Sebastiaan; Broekmans, Frank J M; Schreuder, Henk W R

    2013-01-01

To assess the learning curve for hysteroscopic sterilization with the Essure method using a virtual reality simulator. Prospective multicenter study (Canadian Task Force classification II-2). University and teaching hospitals in the Netherlands. Thirty novices (medical students) and five experts (gynecologists who had performed >150 Essure sterilization procedures). All participants performed nine repetitions of bilateral Essure placement on the simulator. Novices returned after 2 weeks and performed a second series of five repetitions to assess retention of skills. Structured observations of performance using the Global Rating Scale and parameters derived from the simulator provided measurements for analysis. The learning curve is represented by improvement per procedure. Two-way repeated-measures analysis of variance was used to analyze the learning curves. Effect size (ES) was calculated to express the practical significance of the results (ES ≥ 0.50 indicates a large learning effect). For all parameters, significant improvements were found in novice performance within nine repetitions. Large learning effects were established for six of eight parameters. The learning curve established in this study endorses future implementation of the simulator in curricula on hysteroscopic skill acquisition for clinicians who are interested in learning this sterilization technique. Copyright © 2013 AAGL. Published by Elsevier Inc. All rights reserved.

  20. Experimental Assessment of Tensile Failure Characteristic for Advanced Composite Laminates

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Keon [Agency for Defense Development, Daejeon (Korea, Republic of); Lee, Jeong Won; Yoon, Dong Hyun; Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-10-15

In recent years, major airplane manufacturers have been using the laminate failure theory to estimate the strain of composite structures for airplanes. The laminate failure theory uses the failure strain of the laminate to analyze composite structures. This paper describes a procedure for the experimental assessment of laminate tensile failure characteristics. Regression analysis was used as the experimental assessment method. The regression analysis was performed with the response variable being the laminate failure strain and with the regressor variables being two-ply orientation (0° and ±45°) variables. The composite material in this study is a carbon/epoxy unidirectional (UD) tape that was cured as a pre-preg at 177°C (350°F). A total of 149 tension tests were conducted on specimens from 14 distinct laminates that were laid up at standard angle layers (0°, 45°, -45°, and 90°). The ASTM-D-3039 standard was used as the test method.

  2. Broadening failure rate distributions in PRA uncertainty analyses

    International Nuclear Information System (INIS)

    Martz, H.F.

    1984-01-01

Several recent nuclear power plant probabilistic risk assessments (PRAs) have utilized broadened Reactor Safety Study (RSS) component failure rate population variability curves to compensate for such things as expert overvaluation bias in the estimates upon which the curves are based. A simple two-component-of-variation empirical Bayes model is proposed for use in estimating the between-expert variability curve in the presence of such biases. Under certain conditions this curve is a population variability curve. Comparisons are made with the existing method. The popular procedure appears to be generally much more conservative than the empirical Bayes method in removing such biases. In one case the broadened curve based on the popular method is more than two orders of magnitude broader than the empirical Bayes curve. In another case it is found that the maximum justifiable degree of broadening of the RSS curve is to increase α from 5% to 12%, which is significantly less than the 20% value recommended in the popular approach. 15 references, 1 figure, 5 tables
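One way to read the two-components-of-variation idea above is as a variance decomposition on log failure rates: the observed spread across experts combines genuine between-expert variability with each expert's own assessment uncertainty. A minimal method-of-moments sketch (illustrative only, not the paper's exact empirical Bayes model):

```python
import math

# Two-components-of-variation sketch on log failure rates:
#   total sample variance across experts
#     ~ between-expert variance + mean within-expert variance,
# so a method-of-moments estimate of the between-expert component is the
# excess of the total over the mean within variance, floored at zero.
# Numbers are invented for illustration.

def between_expert_var(log_rates, within_vars):
    n = len(log_rates)
    mean = sum(log_rates) / n
    total = sum((x - mean) ** 2 for x in log_rates) / (n - 1)
    return max(0.0, total - sum(within_vars) / n)

experts = [math.log(1e-3), math.log(3e-3), math.log(5e-4)]  # rates per demand
within = [0.20, 0.20, 0.20]       # each expert's stated variance on the log scale
vb = between_expert_var(experts, within)
```

Broadening only to this estimated between-expert component, rather than inflating the whole curve, is what makes the empirical Bayes result less conservative than the popular procedure.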

  3. Multidisciplinary approach for in-deep assessment of joint prosthesis failure.

    Science.gov (United States)

    Tessarolo, F; Caola, I; Piccoli, F; Dorigotti, P; Demattè, E; Molinari, M; Malavolta, M; Barbareschi, M; Caciagli, P; Nollo, G

    2009-01-01

In spite of advances in biomaterials and biomechanics, in the development of new osteo-integrative materials and coatings, and in macro- and micro-component design, a non-negligible fraction of implanted prostheses fails before the expected lifetime. A prospective observational clinical study was conducted to define and apply a set of experimental techniques for the in-depth assessment of joint prosthesis failure. Microbiological, histological and micro-structural techniques were implemented to specifically address phenomena occurring at the tissue-implant interface. Results obtained from 27 cases of prosthetic joint failure are discussed in terms of sensitivity and specificity. A procedural flow-chart is finally proposed for the assessment of joint prosthesis failure.

  4. A unified approach to failure assessment of engineering structures

    International Nuclear Information System (INIS)

    Harrison, R.P.

    1977-01-01

A codified procedure for the failure assessment of engineering structures is presented which has as its basis the two-criteria approach of Dowling and Townley (Int. J. Press. Vessels and Piping; 3:77 (1975)) and the Bilby, Cottrell and Swinden (Proc. R. Soc.; A272:304 (1963)) and Dugdale (J. Mech. Phys. Sol.; 8:100 (1960)) model of yielding ahead of a crack tip. The procedure consists of independently assessing the risk of failure (a) under linear elastic conditions only and (b) under plastic collapse conditions only. These two limiting criteria are then plotted as a co-ordinate point on a Failure Assessment Diagram. From this a measure of the degree of safety of the structure can be obtained. As examples, several of the HSST vessel tests are used to indicate the simplicity and versatility of the procedure. It is shown how maximum allowable pressures or defect sizes can be obtained and how safety factors can be readily incorporated on any of the parameters used in the assessment. It is also demonstrated how helpful the procedure is in designing not only working structures, but also structures that are to be used for testing. (author)
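The failure assessment diagram described above interpolates between the two limiting criteria with a curve derived from the Dugdale/BCS strip-yield model: Kr = Sr·[(8/π²)·ln sec(πSr/2)]^(−1/2), where Kr is the ratio of applied stress intensity to fracture toughness and Sr the ratio of applied load to plastic collapse load. A minimal sketch of evaluating that curve and screening an assessment point:

```python
import math

# Strip-yield (original R6-type) failure assessment curve:
#   Kr = Sr * [(8/pi^2) * ln(sec(pi*Sr/2))]**-0.5
# A point (Sr, Kr) below the curve is assessed as safe.

def fad_curve(sr: float) -> float:
    if sr <= 0.0:
        return 1.0           # pure LEFM limit
    if sr >= 1.0:
        return 0.0           # plastic collapse limit
    return sr / math.sqrt(8.0 / math.pi ** 2
                          * math.log(1.0 / math.cos(math.pi * sr / 2.0)))

def is_safe(sr: float, kr: float) -> bool:
    return kr < fad_curve(sr)

ok = is_safe(0.5, 0.6)       # an assessment point well inside the curve
```

The distance of the assessment point from the curve along a load line is one way the "measure of the degree of safety" is quantified.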

  5. Enhancement of global flood damage assessments using building material based vulnerability curves

    Science.gov (United States)

    Englhardt, Johanna; de Ruiter, Marleen; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    This study discusses the development of an enhanced approach for flood damage and risk assessments using vulnerability curves that are based on building material information. The approach draws upon common practices in earthquake vulnerability assessment, and is an alternative to the land-use or building-occupancy approaches used in flood risk assessment models. The approach is of particular importance for studies with a large variation in building material, such as large-scale studies or studies in developing countries. A case study of Ethiopia is used to demonstrate the impact of the different methodological approaches on direct damage assessments due to flooding. Generally, flood damage assessments use damage curves for different land-use or occupancy types (i.e. urban or residential and commercial classes). However, these categories do not necessarily relate directly to vulnerability to damage by flood waters; the construction type and building material may matter more, as is recognized in earthquake risk assessments. For this study, we use building material classification data from the PAGER1 project to define new building-material-based vulnerability classes for flood damage. This approach is compared to the widely applied land-use-based vulnerability curves such as those used by De Moel et al. (2011). The case of Ethiopia demonstrates and compares the feasibility of this novel flood vulnerability method at the country level, which holds the potential to be scaled up to a global level. The study shows that flood vulnerability based on building material also allows for better differentiation between flood damage in urban and rural settings, opening doors to better links to poverty studies when such exposure data are available. Furthermore, this new approach paves the road to enhanced multi-risk assessments, as the method enables the comparison of vulnerability across different natural hazard types that also use material-based vulnerability curves.
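    The core mechanic — keying a depth-damage curve on building-material class rather than land use — can be illustrated with a minimal sketch. The material classes and curve points below are invented for illustration; they are not the PAGER-derived curves of the study.

```python
# Hypothetical depth-damage (vulnerability) curves keyed by building
# material class; each point is (water depth in m, damage factor 0..1).
CURVES = {
    "masonry": [(0.0, 0.0), (0.5, 0.15), (1.0, 0.30), (2.0, 0.55), (4.0, 0.90)],
    "wood":    [(0.0, 0.0), (0.5, 0.25), (1.0, 0.45), (2.0, 0.75), (4.0, 1.00)],
}

def damage_factor(material, depth_m):
    """Linear interpolation on the material's vulnerability curve."""
    pts = CURVES[material]
    if depth_m <= pts[0][0]:
        return pts[0][1]
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if depth_m <= d1:
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)
    return pts[-1][1]  # beyond the last point the damage factor saturates

def direct_damage(material, depth_m, exposed_value):
    """Direct flood damage = damage factor x exposed asset value."""
    return damage_factor(material, depth_m) * exposed_value
```

    A land-use-based model would index CURVES by "residential"/"commercial" instead; the study's point is that the material key tracks physical vulnerability more directly.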

  6. Failure rate data for fusion safety and risk assessment

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1993-01-01

    The Fusion Safety Program (FSP) at the Idaho National Engineering Laboratory (INEL) conducts safety research in materials, chemical reactions, safety analysis, risk assessment, and in component research and development to support existing magnetic fusion experiments and also to promote safety in the design of future experiments. One of the areas of safety research is applying probabilistic risk assessment (PRA) methods to fusion experiments. To apply PRA, we need a fusion-relevant radiological dose code and a component failure rate data base. This paper describes the FSP effort to develop a failure rate data base for fusion-specific components

  7. Assessing Risks of Mine Tailing Dam Failures

    Science.gov (United States)

    Concha Larrauri, P.; Lall, U.

    2017-12-01

    The consequences of tailings dam failures can be catastrophic for communities and ecosystems in the vicinity of the dams. The failure of the Fundão tailings dam at the Samarco mine in 2015 killed 19 people, with severe consequences for the environment. The financial and legal consequences of a tailings dam failure can also be significant for the mining companies: for the Fundão tailings dam, the company had to pay 6 billion dollars in fines and twenty-one executives were charged with qualified murder. There are tens of thousands of active, inactive, and abandoned tailings dams in the world, and there is a need to better understand the hazards these structures pose to downstream populations and ecosystems. A challenge in assessing the risks of tailings dams at large scale is that many of them are not registered in publicly available databases and there is little information about their current physical state. Additionally, hazard classifications of tailings dams - common in many countries - tend to be subjective, include vague parameter definitions, and are not always updated over time. Here we present a simple methodology to assess and rank the exposure to tailings dams using ArcGIS that removes subjective interpretations. The method uses basic information such as current dam height, storage volume, topography, population, land use, and hydrological data. A hazard risk rating was developed to compare the potential extent of the damage across dams. This assessment provides a general overview of what in the vicinity of a tailings dam could be affected in case of a failure, and a way to rank tailings dams that is directly linked to the exposure at any given time. One hundred tailings dams in Minas Gerais, Brazil were used for the test case. This ranking approach could inform the risk management strategy for the tailings dams within a company and, when disclosed, could enable shareholders and communities to make decisions on the risks they are taking.
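    The ranking idea — a comparable exposure score from basic dam attributes — might be sketched as below. The chosen attributes, min-max normalization and equal weights are assumptions for illustration; the paper's actual GIS-based rating is not reproduced here.

```python
def normalize(values):
    """Min-max normalize a list of numbers to [0, 1]."""
    lo, hi = min(values), max(values)
    return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

def rank_dams(dams):
    """dams: list of (name, height_m, volume_m3, downstream_population).
    Equal-weight score over normalized attributes; returns dam names
    sorted from highest to lowest exposure."""
    names = [d[0] for d in dams]
    cols = [normalize([d[i] for d in dams]) for i in (1, 2, 3)]
    scores = [sum(c[j] for c in cols) / 3 for j in range(len(dams))]
    return [n for _, n in sorted(zip(scores, names), reverse=True)]

# Hypothetical dams:
ranking = rank_dams([("A", 10, 1e5, 100),
                     ("B", 50, 5e6, 10000),
                     ("C", 30, 1e6, 500)])
```

    Because the score is a pure function of measurable inputs, it avoids the subjective judgments the abstract criticizes in conventional hazard classifications.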

  8. A Big Data Analysis Approach for Rail Failure Risk Assessment.

    Science.gov (United States)

    Jamshidi, Ali; Faghih-Roohi, Shahrzad; Hajizadeh, Siamak; Núñez, Alfredo; Babuska, Robert; Dollevoet, Rolf; Li, Zili; De Schutter, Bart

    2017-08-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by analyzing a type of rail surface defect called squats that are detected automatically among the huge number of records from video cameras. We propose an image processing approach for automatic detection of squats, especially severe types that are prone to rail breaks. We measure the visual length of the squats and use them to model the failure risk. For the assessment of the rail failure risk, we estimate the probability of rail failure based on the growth of squats. Moreover, we perform severity and crack growth analyses to consider the impact of rail traffic loads on defects in three different growth scenarios. The failure risk estimations are provided for several samples of squats with different crack growth lengths on a busy rail track of the Dutch railway network. The results illustrate the practicality and efficiency of the proposed approach. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
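    The risk estimate described — a failure probability driven by squat growth, combined with the consequence of a rail break — can be caricatured in a few lines. The logistic link and every parameter value below are invented placeholders, not the authors' fitted model.

```python
import math

def failure_probability(length_mm, a=-6.0, b=0.12):
    """Illustrative logistic link between a squat's visual length (mm)
    and rail failure probability; a and b are made-up parameters."""
    return 1.0 / (1.0 + math.exp(-(a + b * length_mm)))

def risk(length_mm, growth_per_period_mm, periods, consequence):
    """Risk over a horizon: grow the defect under an assumed scenario,
    then multiply failure probability by the consequence of a break."""
    final_len = length_mm + growth_per_period_mm * periods
    return failure_probability(final_len) * consequence
```

    Different traffic-load scenarios would simply use different growth rates, mirroring the three crack-growth scenarios the abstract mentions.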

  9. Customer system efficiency improvement assessment: Supply curves for transmission and distribution conservation options

    Energy Technology Data Exchange (ETDEWEB)

    Tepel, R.C.; Callaway, J.W.; De Steese, J.G.

    1987-11-01

    This report documents the results of Task 6 in the Customer System Efficiency Improvement (CSEI) Assessment Project. A principal objective of this project is to assess the potential for energy conservation in the transmission and distribution (T&D) systems of electric utilities in the BPA service area. The scope of this assessment covers BPA customers in the Pacific Northwest region and all non-federal T&D systems, including those that currently place no load on the BPA system. Supply curves were developed to describe the conservation resource potentially available from T&D-system efficiency improvements. These supply curves relate the levelized cost of upgrading existing equipment to the estimated amount of energy saved. Stated in this form, the resource represented by T&D loss reductions can be compared with other conservation options and regional electrical generation resources to determine the most cost-effective method of supplying power to the Pacific Northwest. The development of the supply curves required data acquisition and methodology development that are also described in this report. 11 refs., 11 figs., 16 tabs.
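    A conservation supply curve of the kind described is built by sorting measures by levelized cost and accumulating their savings. The option names and numbers below are hypothetical, not BPA data.

```python
# Hypothetical T&D efficiency-improvement options:
# (name, levelized cost in $/kWh saved, annual energy saved in GWh).
options = [
    ("reconductor feeders",      0.035, 120.0),
    ("upgrade transformers",     0.020, 80.0),
    ("VAR/power-factor support", 0.050, 40.0),
]

def supply_curve(opts):
    """Sort options by levelized cost and accumulate savings, yielding the
    step curve of cost vs. cumulative conservation resource."""
    curve, total = [], 0.0
    for name, cost, saved in sorted(opts, key=lambda o: o[1]):
        total += saved
        curve.append((name, cost, total))
    return curve

curve = supply_curve(options)
```

    Reading the curve left to right answers the planning question in the abstract: how much conservation resource is available below any given cost threshold, and hence where T&D upgrades beat generation or other conservation options.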

  10. A big data analysis approach for rail failure risk assessment

    NARCIS (Netherlands)

    Jamshidi, A.; Faghih Roohi, S.; Hajizadeh, S.; Nunez Vicencio, Alfredo; Babuska, R.; Dollevoet, R.P.B.J.; Li, Z.; De Schutter, B.H.K.

    2017-01-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by

  11. The shape of a strain-based failure assessment diagram

    International Nuclear Information System (INIS)

    Budden, P.J.; Ainsworth, R.A.

    2012-01-01

    There have been a number of recent developments of strain-based fracture assessment approaches, including proposals by Budden [Engng Frac Mech 2006;73:537–52] for a strain-based failure assessment diagram (FAD) related to the conventional stress-based FAD. However, recent comparisons with finite element (FE) data have shown that this proposed strain-based FAD can be non-conservative in some cases, particularly for deeper cracks and materials with little strain-hardening capacity. Therefore, this paper re-examines the shape of the strain-based FAD, guided by these FE analyses and some theoretical analysis. On this basis, modified proposals for the shape of the strain-based FAD are given, including simplified and more detailed options in line with the options available for stress-based FADs in existing fitness-for-service procedures. The proposals are then illustrated by a worked example and by comparison with FE data, which demonstrate that the new proposals are generally conservative. - Highlights: ► The strain-based failure assessment diagram approach to fracture is developed. ► The new approach modifies earlier proposals by Budden. ► A new generic Option 1 strain-based failure assessment diagram is proposed. ► Validation based on finite element J data for plates and cylinders is presented. ► The new approach is generally conservative compared with the finite element data.

  12. Preventing blood transfusion failures: FMEA, an effective assessment method.

    Science.gov (United States)

    Najafpour, Zhila; Hasoumi, Mojtaba; Behzadi, Faranak; Mohamadi, Efat; Jafary, Mohamadreza; Saeedi, Morteza

    2017-06-30

    Failure Mode and Effect Analysis (FMEA) is a method used to assess the risk of failures and harms to patients during the medical process and to identify the associated clinical issues. The aim of this study was to conduct an assessment of the blood transfusion process in a teaching general hospital, using FMEA as the method. A structured FMEA was conducted in 2014, and corrective actions were implemented and re-evaluated after 6 months. Sixteen 2-h sessions were held to perform the FMEA of the blood transfusion process, comprising five steps: establishing the context, selecting team members, analysis of the processes, hazard analysis, and developing a risk reduction protocol for blood transfusion. Failure modes with the highest risk priority numbers (RPNs) were identified. The overall RPN scores ranged from 5 to 100, among which four failure modes were associated with RPNs of 75 or above. The data analysis indicated that the failures with the highest RPNs were: labelling (RPN: 100), transfusion of blood or the component (RPN: 100), patient identification (RPN: 80) and sampling (RPN: 75). The results demonstrated that mis-transfusion of blood or a blood component is the most important error, which can lead to serious morbidity or mortality. Provision of training to the personnel on blood transfusion, raising knowledge of hazards and appropriate preventative measures, as well as developing standard safety guidelines are essential, and must be implemented during all steps of blood and blood component transfusion.
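    FMEA's Risk Priority Number is the product of severity, occurrence and detection ratings (each conventionally 1-10). The component ratings below are hypothetical factorizations chosen only to reproduce the RPNs reported in the abstract; the study's actual ratings are not given.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of three 1-10 ratings."""
    for r in (severity, occurrence, detection):
        if not 1 <= r <= 10:
            raise ValueError("ratings must be between 1 and 10")
    return severity * occurrence * detection

# Failure modes from the abstract; component ratings are hypothetical:
modes = {
    "labelling":                rpn(10, 2, 5),  # reported RPN 100
    "transfusion of component": rpn(10, 5, 2),  # reported RPN 100
    "patient identification":   rpn(8, 2, 5),   # reported RPN 80
    "sampling":                 rpn(5, 3, 5),   # reported RPN 75
}
high_risk = [m for m, score in modes.items() if score >= 75]
```

    Ranking by RPN is what directs corrective actions to labelling and transfusion first, as the study describes.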

  13. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice, in which the safety assessment is done separately per individual flood defence section. The main advantage of the new approach is that it will result in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities of the following elements: - Failure mechanisms: a flood defence system can fail due to different failure mechanisms. - Time periods: failure probabilities are first computed for relatively small time scales (assessment of flood defense systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
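    The core combination step — the system fails if any element fails — reduces, under an independence assumption, to a one-line formula. Hydra-Ring itself accounts for dependence between elements and mechanisms; the sketch below shows only the simplest independent case.

```python
from math import prod

def system_failure_probability(element_probs):
    """Series-system failure probability under independence:
    P(system fails) = 1 - prod(1 - p_i) over all elements."""
    if any(not 0.0 <= p <= 1.0 for p in element_probs):
        raise ValueError("probabilities must lie in [0, 1]")
    return 1.0 - prod(1.0 - p for p in element_probs)
```

    Even this simple form shows why per-section assessment understates system risk: three sections at 1%, 2% and 0.5% annual failure probability give a system probability of about 3.5%, larger than any single section's.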

  14. Validation of self assessment patient knowledge questionnaire for heart failure patients.

    Science.gov (United States)

    Lainscak, Mitja; Keber, Irena

    2005-12-01

    Several studies showed insufficient knowledge of, and poor compliance with, non-pharmacological management in heart failure patients. Only a limited number of validated tools are available to assess their knowledge. The aim of the study was to test our 10-item Patient knowledge questionnaire. The Patient knowledge questionnaire was administered to 42 heart failure patients from a Heart failure clinic and to 40 heart failure patients receiving usual care. Construct validity (Pearson correlation coefficient), internal consistency (Cronbach alpha), reproducibility (Wilcoxon signed rank test), and reliability (chi-square test and Student's t-test for independent samples) were assessed. The overall score of the Patient knowledge questionnaire had the strongest correlation with the question about regular weighing (r=0.69) and the weakest with the question about presence of heart disease (r=0.33). There was a strong correlation between the question about fluid retention and the questions assessing regular weighing (r=0.86), weight of one litre of water (r=0.86), and salt restriction (r=0.57). The Cronbach alpha was 0.74 and could be improved by exclusion of the questions about clear explanation (Cronbach alpha 0.75), importance of fruit, soup, and vegetables (Cronbach alpha 0.75), and self-adjustment of diuretic (Cronbach alpha 0.81). During reproducibility testing, 91% to 98% of questions were answered identically. Patients from the Heart failure clinic scored significantly better than patients receiving usual care (7.9 (1.3) vs. 5.7 (2.2), p<0.001). The Patient knowledge questionnaire is a valid and reliable tool to measure knowledge of heart failure patients.
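    Cronbach's alpha, the internal-consistency statistic reported above (0.74), is computed from item and total-score variances. A minimal sketch, using population variances and toy data:

```python
def _pvar(xs):
    """Population variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list of respondent
    scores per item): alpha = k/(k-1) * (1 - sum(item vars)/var(totals))."""
    k = len(items)
    n = len(items[0])
    item_var = sum(_pvar(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var / _pvar(totals))
```

    Perfectly parallel items give alpha = 1; an item carrying no covariance with the rest pulls alpha toward 0, which is why dropping weak items (as in the study) can raise alpha from 0.74 to 0.81.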

  15. Validation of BS7910:2005 failure assessment diagrams for cracked square hollow section T-, Y- and K-joints

    International Nuclear Information System (INIS)

    Lie, S.T.; Yang, Z.M.; Gho, W.M.

    2009-01-01

    This paper describes the usage of finite element (FE) analysis results to validate the standard BS7910 assessment procedure for the safe design of cracked square hollow section (SHS) T-, Y- and K-joints. In the study, the actual 3D surface cracks obtained from previous fatigue tests have been included in the FE models. An automatic mesh generation program is then developed and used to produce the failure assessment diagram (FAD) through the J-integral method. The ultimate strength of uncracked SHS joints with reduced load-bearing areas has been used as reference to derive the plastic collapse loads of cracked SHS joints for the development of the FAD. These loads have been validated against previous experimental results. In comparison with the existing standard BS7910 Level 2A/3A FAD curve and the proposed assessment procedure for circular hollow section joints, it is found that a plastic collapse load with a penalty factor of 1.05 will be sufficient for the safe assessment of cracked SHS T-, Y- and K-joints under brace end axial loading.

  16. Consequences assessment for fuel channel failure with consequential moderator drain

    International Nuclear Information System (INIS)

    Wahba, N.N.; Bayoumi, M.H.

    2002-01-01

    This paper documents the consequences of spontaneous pressure tube/consequential calandria tube rupture followed by the ejection of end fittings (as a result of guillotine failure of pressure tube) leading to the drain of the moderator. The event is postulated to occur in conjunction with an independent failure of Emergency Coolant Injection System (ECIS). The results of the detailed consequence assessments are used to propose a course of action to mitigate the consequences of such an event. A methodology based on a lumped-parameter model was developed to assess the consequences of the postulated event. (author)

  17. Assessment of electronic component failure rates on the basis of experimental data

    International Nuclear Information System (INIS)

    Nitsch, R.

    1991-01-01

    Assessment and prediction of failure rates of electronic systems are made using experimental data derived from laboratory-scale tests or from practice, for instance from component failure rate statistics or component repair statistics. Some problems and uncertainties encountered in an evaluation of such field data are discussed in the paper. In order to establish a sound basis for comparative assessment of data from various sources, the items of comparison and the procedure in case of doubt have to be defined. The paper explains two standard methods proposed for practical failure rate definition. (orig.)

  18. Assessing the impact of windfarms - the learning curve in Cornwall

    International Nuclear Information System (INIS)

    Hull, A.

    1998-01-01

    This paper uses windfarm application decisions in Cornwall between 1989 and 1995 to illustrate the learning curve of planners in assessing appropriate windfarm locations, and in particular how the process of knowledge construction is constantly reviewed and modified in the light of experience and circumstance. One of the accepted purposes of Environmental Impact Assessment is to predict the possible effects, both beneficial and adverse, of the development on the environment. In practice what is beneficial and what is adverse can be a matter of dispute. The paper draws out the role of the planning system in assessing what is problematic or benign, and the practical strategies and procedures used to assess and control the environmental impacts of wind energy schemes. (Author)

  19. Dynamic airway pressure-time curve profile (Stress Index): a systematic review.

    Science.gov (United States)

    Terragni, Pierpaolo; Bussone, Guido; Mascia, Luciana

    2016-01-01

    The assessment of respiratory mechanics at the bedside is necessary in order to identify the most protective ventilatory strategy. Indeed, in the last 20 years the adverse effects of positive-pressure ventilation on lung structures have led to a reappraisal of the objectives of mechanical ventilation. The ventilator setting requires repeated readjustment over the period of mechanical ventilation dependency, together with careful respiratory monitoring, to minimize the risks, prevent further injury, and permit the lungs and airways to heal. Among the different methods that have been proposed and validated, the analysis of the dynamic P-t curve (named Stress Index, SI) represents an adequate tool available at the bedside, repeatable and therefore able to identify the amount of overdistension occurring in daily clinical practice when modifying positive end-expiratory pressure. In this review we analyze the evidence that supports respiratory mechanics assessment at the bedside and the application of the dynamic P-t curve profile (SI) to optimize protective ventilation in patients with acute respiratory failure.

  20. Failure detection system risk reduction assessment

    Science.gov (United States)

    Aguilar, Robert B. (Inventor); Huang, Zhaofeng (Inventor)

    2012-01-01

    A process includes determining a probability of a failure mode of a system being analyzed reaching a failure limit as a function of time to failure limit, determining a probability of a mitigation of the failure mode as a function of a time to failure limit, and quantifying a risk reduction based on the probability of the failure mode reaching the failure limit and the probability of the mitigation.
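    One simple reading of the claimed quantification — assuming mitigation succeeds with a fixed probability, independently of the failure mode — is that residual risk is the failure probability scaled by the chance that mitigation fails, and the risk reduction is the difference. The probabilities below are illustrative, not from the patent.

```python
def risk_reduction(p_failure, p_mitigation):
    """Absolute risk reduction under a simple independence assumption:
    unmitigated risk is p_failure; mitigation succeeds with probability
    p_mitigation, so residual risk is p_failure * (1 - p_mitigation)."""
    for p in (p_failure, p_mitigation):
        if not 0.0 <= p <= 1.0:
            raise ValueError("probabilities must lie in [0, 1]")
    residual = p_failure * (1.0 - p_mitigation)
    return p_failure - residual

# e.g. at some given time-to-failure-limit (hypothetical values):
reduction = risk_reduction(p_failure=0.10, p_mitigation=0.80)
```

    In the patent's framing both probabilities are functions of time to the failure limit, so this calculation would be repeated across the time axis rather than at a single point.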

  1. Analysis of dependent failures in risk assessment and reliability evaluation

    International Nuclear Information System (INIS)

    Fleming, K.N.; Mosleh, A.; Kelley, A.P. Jr.; Gas-Cooled Reactors Associates, La Jolla, CA)

    1983-01-01

    The ability to estimate the risk of potential reactor accidents is largely determined by the ability to analyze statistically dependent multiple failures. The importance of dependent failures has been indicated in recent probabilistic risk assessment (PRA) studies as well as in reports of reactor operating experiences. This article highlights the importance of several different types of dependent failures from the perspective of the risk and reliability analyst and provides references to the methods and data available for their analysis. In addition to describing the current state of the art, some recent advances, pitfalls, misconceptions, and limitations of some approaches to dependent failure analysis are addressed. A summary is included of the discourse on this subject, which is presented in the Institute of Electrical and Electronics Engineers/American Nuclear Society PRA Procedures Guide

  2. Assessment of left ventricular function in patients with atrial fibrillation by left ventricular filling and function curves determined by ECG gated blood pool scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Inagaki, Suetsugu

    1986-06-01

    Accurate cardiac function in patients with atrial fibrillation (Af) is difficult to assess, since wide fluctuation of the cardiac cycle makes the ventricular hemodynamics variable. Although ECG gated blood pool scintigraphy (EGBPS) is useful to evaluate left ventricular (LV) function, conventional EGBPS can be problematic when applied to Af. Therefore, a new processing algorithm was devised to make multiple gated images discriminated by preceding R-R intervals (PRR), and LV filling and function curves were obtained in 62 patients with Af to evaluate LV function. The LV filling curve, obtained by plotting end-diastolic volume (EDV) against PRR, demonstrated that blood filling was impaired in mitral stenosis and constrictive pericarditis, but recovered after mitral commissurotomy. The LV function curve, obtained by plotting stroke volume (SV) against EDV, was quantitatively analysed by indices such as Slope and Position. Both indices were significantly reduced in heart failure. When compared among underlying diseases individually, the indices decreased in the following order: lone Af, hyperthyroidism, senile Af, hypertension, mitral valve disease, ischemic heart disease, dilated cardiomyopathy and aortic regurgitation. After treatment with digitalis and/or diuretics, a left and upward shift of the function curve was observed. The rise in heart rate by atropine infusion left Slope and Position unchanged, implying that the function curve was little influenced by heart rate per se. The rise in systolic blood pressure by angiotensin-II infusion shifted the function curve rightward and downward. The downward shift, mostly seen in patients with a gentler slope in the control state, may imply afterload mismatch due to a decrease in preload reserve. (J.P.N.).

  3. VALIDATING A COMPUTER-BASED TECHNIQUE FOR ASSESSING STABILITY TO FAILURE STRESS

    Directory of Open Access Journals (Sweden)

    I. F. Arshava

    2013-03-01

    Full Text Available An upsurge of interest in implicit personality assessment, currently observed both in personality psycho-diagnostics and in experimental studies of social attitudes and prejudices, signals the shifting of researchers' attention from defining between-person personality taxonomy to specifying comprehensive within-person processes, the dynamics of which can be captured at the level of an individual case. This research examines the possibility of implicit assessment of the individual's stability vs. susceptibility to failure stress by comparing the degrees of efficacy in the voluntary self-regulation of a computer-simulated information-processing activity under different conditions (patent of Ukraine № 91842, issued in 2010). By exposing two groups of participants (university undergraduates) to processing information whose scope exceeds human short-term memory capacity at one of the stages of the modeled activity, an unexpected and unavoidable failure is elicited. The participants who retain stability of their self-regulation behavior after having been exposed to failure, i.e. who keep processing information as effectively as they did prior to failure, are claimed to retain homeostasis and thus possess emotional stability. Those who lose homeostasis after failure and display lower standards of self-regulation behavior are considered to be susceptible to stress. The validity of the suggested type of implicit diagnostics was empirically tested by clustering (K-means algorithm) two samples of participants on the properties of their self-regulation behavior and testing between-cluster differences on a set of explicitly assessed variables: Action control efficacy (Kuhl, 2001), preferred strategies of Coping with Stressful Situations (Endler, Parker, 1990), Purpose-in-Life orientation (a Russian version of the test by Crumbaugh and Maholick, modified by D. Leontiev, 1992), and Psychological Well-being (Ryff, 1989).

  4. Assessment selection in human-automation interaction studies: The Failure-GAM2E and review of assessment methods for highly automated driving.

    Science.gov (United States)

    Grane, Camilla

    2018-01-01

    Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM2E model, intended to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM2E includes a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM2E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through feet movements and a proposed Optimal Risk Management Model. Failure-GAM2E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Heart failure: a weak link in CHA2DS2-VASc.

    Science.gov (United States)

    Friberg, Leif; Lund, Lars H

    2018-02-15

    In atrial fibrillation, stroke risk is assessed by the CHA2DS2-VASc score. Heart failure is included in CHA2DS2-VASc, but the rationale is uncertain. Our objective was to test whether heart failure is a risk factor for stroke, independent of the other risk factors in CHA2DS2-VASc. We studied 300 839 patients with atrial fibrillation in the Swedish Patient Register 2005-11. Three definitions of heart failure were used in order to assess the robustness of the results. In the main analysis, heart failure was defined by a hospital discharge diagnosis of heart failure as first or second diagnosis and a filled prescription of a diuretic within 3 months before index + 30 days. The second definition counted first or second discharge diagnoses, and the third any heart failure diagnosis in open or hospital care before index + 30 days. Associations with outcomes were assessed with multivariable Cox analyses. Patients with heart failure were older (80.5 vs. 74.0 years). The unadjusted stroke rate was higher in patients with heart failure than the 3.1% observed in those without. Adjustment for the cofactors in CHA2DS2-VASc eradicated the difference in stroke risk between patients with and without heart failure (hazard ratio 1.01 with 95% confidence interval 0.96-1.05). The area under the receiver operating characteristic curve for CHA2DS2-VASc was not improved by points for heart failure. A clinical diagnosis of heart failure was not an independent risk factor for stroke in patients with atrial fibrillation, which may have implications for anticoagulation management. © 2018 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.

  6. Risk assessment of tailings facility dam failure

    OpenAIRE

    Hadzi-Nikolova, Marija; Mirakovski, Dejan; Stefanova, Violeta

    2011-01-01

    This paper presents the consequences of tailings facility dam failure and hence the need for its risk assessment. Tailings are fine-grained wastes of the mining industry, output as slurries due to mixing with water during mineral processing. Tailings dams vary considerably, as they are affected by: tailings characteristics and mill output; site characteristics such as topography, hydrology, geology, groundwater and seismicity; available material; and disposal methods. The tailings which accumulat...

  7. [Predictive factors for failure of non-invasive positive pressure ventilation in immunosuppressed patients with acute respiratory failure].

    Science.gov (United States)

    Jia, Xiangli; Yan, Ci; Xu, Sicheng; Gu, Xingli; Wan, Qiufeng; Hu, Xinying; Li, Jingwen; Liu, Guangming; Caikai, Shareli; Guo, Zhijin

    2018-02-01

    To evaluate the predictive factors for failure of non-invasive positive pressure ventilation (NIPPV) in immunosuppressed patients with acute respiratory failure (ARF). The clinical data of 118 immunodeficient patients treated with NIPPV in the respiratory and intensive care unit (RICU) of the First Affiliated Hospital of Xinjiang Medical University from January 2012 to August 2017 were retrospectively analyzed. The patients were divided into a non-endotracheal intubation (ETI) group (n = 62) and an ETI group (n = 56) according to whether ETI was performed during the hospitalization period. Each observed indicator was analyzed by univariate analysis, and factors leading to failure of NIPPV were further analyzed by Logistic regression. A receiver operating characteristic (ROC) curve was plotted to evaluate the predictive value of risk factors for failure of NIPPV in immunosuppressed patients with ARF. The non-intubation rate for NIPPV in immunosuppressed patients was 50.8% (60/118). Compared with the non-ETI group, the body temperature and pH value in the ETI group were significantly increased, the partial pressure of arterial carbon dioxide (PaCO2) was significantly decreased, the ratio of oxygenation index (PaO2/FiO2) failure of NIPPV. ROC curve analysis showed that the APACHE II score ≥ 20 and PaO2/FiO2 failure of NIPPV; the area under ROC curve (AUC) of the APACHE II score ≥ 20 was 0.787, the sensitivity was 83.93%, the specificity was 69.35%, the positive predictive value (PPV) was 71.21%, the negative predictive value (NPV) was 82.69%, the positive likelihood ratio (PLR) was 2.74, the negative likelihood ratio (NLR) was 0.23, and the Youden index was 0.53; the AUC of PaO2/FiO2 failure of NIPPV in immunocompromised patients.
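    The reported test statistics (sensitivity 83.93%, specificity 69.35%, PPV 71.21%, NPV 82.69%, PLR 2.74, NLR 0.23, Youden 0.53) all follow from a single 2x2 table. The counts below are back-calculated to be consistent with those figures given the cohort sizes (56 ETI, 62 non-ETI); they are a reconstruction for illustration, not data taken from the paper.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2-table metrics for a binary predictor of NIPPV failure."""
    sens = tp / (tp + fn)           # sensitivity
    spec = tn / (tn + fp)           # specificity
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),      # positive predictive value
        "NPV": tn / (tn + fn),      # negative predictive value
        "PLR": sens / (1 - spec),   # positive likelihood ratio
        "NLR": (1 - sens) / spec,   # negative likelihood ratio
        "Youden": sens + spec - 1,  # Youden index
    }

# Hypothetical counts: APACHE II >= 20 flags 47 of the 56 NIPPV failures
# and 19 of the 62 successes.
m = diagnostic_metrics(tp=47, fp=19, fn=9, tn=43)
```

    With these counts the function reproduces every metric quoted in the abstract, which is a useful sanity check on internally consistent reporting.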

  8. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    International Nuclear Information System (INIS)

    Harry, T; Manger, R; Cervino, L; Pawlicki, T

    2016-01-01

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Modes and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Modes and Effect Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: An HFMEA risk assessment was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA as described by the Veterans Affairs provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth detail, but at the cost of elevated effort.
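
The two scoring schemes compared above can be contrasted in a small sketch; the rating scales follow the usual TG-100 and VA HFMEA conventions, and the example failure mode and its numbers are illustrative assumptions, not values from the study.

```python
# Sketch contrasting FMEA and HFMEA scoring on one hypothetical failure
# mode; all numbers are illustrative, not taken from the study.

def fmea_rpn(occurrence, severity, detectability):
    """TG-100-style FMEA Risk Priority Number: O x S x D, each rated 1-10."""
    return occurrence * severity * detectability

def hfmea_hazard_score(severity, probability):
    """VA HFMEA-style hazard score: severity x probability, each rated 1-4."""
    return severity * probability

# Hypothetical failure mode: wrong plan parameters transferred at treatment.
print(fmea_rpn(occurrence=4, severity=9, detectability=6))   # 216
print(hfmea_hazard_score(severity=4, probability=2))         # 8
```

The simpler two-factor HFMEA product is what the team above found more intuitive to score; the three-factor RPN carries more detail at the cost of a detectability rating.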

  9. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Harry, T [Oregon State University, Corvallis, OR (United States); University of California, San Diego, La Jolla, CA (United States); Manger, R; Cervino, L; Pawlicki, T [University of California, San Diego, La Jolla, CA (United States)

    2016-06-15

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Modes and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Modes and Effect Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: An HFMEA risk assessment was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA as described by the Veterans Affairs provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth detail, but at the cost of elevated effort.

  10. Probabilistic assessment of roadway departure risk in a curve

    Science.gov (United States)

    Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.

    2011-10-01

    Roadway departure while cornering accounts for a major share of car accidents and casualties in France. Even though a drastic policy on speeding has contributed to reducing accidents, other factors obviously exist. This article presents the construction of a probabilistic strategy for roadway departure risk assessment. A specific vehicle dynamics model is developed in which some parameters are modelled as random variables. These parameters are selected through a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Structural reliability methods are then employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement a warning system alerting the driver in case of a dangerous situation.

  11. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    Energy Technology Data Exchange (ETDEWEB)

    Yee, Eric [KEPCO International Nuclear Graduate School, Dept. of Nuclear Power Plant Engineering, Ulsan (Korea, Republic of)

    2017-03-15

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-founded assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. Integrating a probability distribution with potentially larger tails basically pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. The use of a more realistic distribution therefore results in an increase in the frequency calculations, suggesting rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.
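
The heavier-tail effect the abstract describes can be illustrated with a stdlib-only Monte Carlo sketch; the threshold and degrees of freedom below are arbitrary illustrative choices, not values from the paper.

```python
# Compare tail exceedance probabilities of a standard normal and a
# Student-t distribution (heavier tails) by Monte Carlo simulation.
import math
import random

random.seed(0)

def sample_normal():
    return random.gauss(0.0, 1.0)

def sample_t(df):
    # Student-t sample via Z / sqrt(chi2_df / df)
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

def tail_prob(sampler, threshold, n=200_000):
    """Estimate P(X > threshold) by simple Monte Carlo."""
    return sum(1 for _ in range(n) if sampler() > threshold) / n

p_norm = tail_prob(sample_normal, 3.0)
p_t = tail_prob(lambda: sample_t(4), 3.0)
print(p_norm, p_t)  # the t-distribution assigns noticeably more tail mass
```

The larger tail probability of the t-distribution is exactly what "pushes the hazard curves outward" for the rare, high-intensity events.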

  12. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    International Nuclear Information System (INIS)

    Yee, Eric

    2017-01-01

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-founded assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. Integrating a probability distribution with potentially larger tails basically pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. The use of a more realistic distribution therefore results in an increase in the frequency calculations, suggesting rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.

  13. Residual stress effects in LMFBR fracture assessment procedures

    International Nuclear Information System (INIS)

    Hooton, D.G.

    1984-01-01

    Two post-yield fracture mechanics methods, which have been developed into fully detailed failure assessment procedures for ferritic structures, have been reviewed with respect to the manner in which as-welded residual stress effects are incorporated; comparisons were then made with finite element and theoretical models of centre-cracked plates containing residual/thermal stresses, in the form of crack-driving-force curves. Applying the procedures to austenitic structures, comparisons are made in terms of failure assessment curves, and it is recommended that the preferred method for the prediction of critical crack sizes in LMFBR austenitic structures containing as-welded residual stresses is the CEGB-R6 procedure based on a flow stress defined at 3% strain in the parent plate. When the prediction of failure loads in such structures is required, it is suggested that the CEGB-R6 procedure be used with residual/thermal stresses factored to give a maximum total stress of flow-stress magnitude

  14. Predictive Performance of the Simplified Acute Physiology Score (SAPS) II and the Initial Sequential Organ Failure Assessment (SOFA) Score in Acutely Ill Intensive Care Patients

    DEFF Research Database (Denmark)

    Granholm, Anders; Møller, Morten Hylander; Kragh, Mette

    2016-01-01

    PURPOSE: Severity scores including the Simplified Acute Physiology Score (SAPS) II and the Sequential Organ Failure Assessment (SOFA) score are used in intensive care units (ICUs) to assess disease severity, predict mortality and in research. We aimed to assess the predictive performance of SAPS II...... compared the discrimination of SAPS II and initial SOFA scores, compared the discrimination of SAPS II in our cohort with the original cohort, assessed the calibration of SAPS II customised to our cohort, and compared the discrimination for 90-day mortality vs. in-hospital mortality for both scores....... Discrimination was evaluated using areas under the receiver operating characteristics curves (AUROC). Calibration was evaluated using Hosmer-Lemeshow's goodness-of-fit Ĉ-statistic. RESULTS: AUROC for in-hospital mortality was 0.80 (95% confidence interval (CI) 0.77-0.83) for SAPS II and 0.73 (95% CI 0...
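
The discrimination statistic used above (area under the receiver operating characteristic curve) can be computed directly from its Mann-Whitney interpretation; the score data below are made up for illustration and are not from the study.

```python
def auroc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen positive case scores higher than a
    randomly chosen negative case (ties count one half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Illustrative severity scores: non-survivors vs. survivors.
died = [52, 61, 45, 70]
survived = [30, 41, 45, 28, 33]
print(auroc(died, survived))  # 0.975
```

An AUROC of 0.5 corresponds to no discrimination; the reported 0.80 for SAPS II means a randomly chosen non-survivor out-scores a randomly chosen survivor 80% of the time.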

  15. Failure Modes Taxonomy for Reliability Assessment of Digital Instrumentation and Control Systems for Probabilistic Risk Analysis - Failure modes taxonomy for reliability assessment of digital I and C systems for PRA

    International Nuclear Information System (INIS)

    Amri, A.; Blundell, N.; ); Authen, S.; Betancourt, L.; Coyne, K.; Halverson, D.; Li, M.; Taylor, G.; Bjoerkman, K.; Brinkman, H.; Postma, W.; Bruneliere, H.; Chirila, M.; Gheorge, R.; Chu, L.; Yue, M.; Delache, J.; Georgescu, G.; Deleuze, G.; Quatrain, R.; Thuy, N.; Holmberg, J.-E.; Kim, M.C.; Kondo, K.; Mancini, F.; Piljugin, E.; Stiller, J.; Sedlak, J.; Smidts, C.; Sopira, V.

    2015-01-01

    Digital protection and control systems appear as upgrades in older nuclear power plants (NPP), and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital systems, there is a need to quantitatively assess the reliability of the digital systems in a justifiable manner. Due to the many unique attributes of digital systems (e.g., functions are implemented by software, units of the system interact in a communication network, faults can be identified and handled online), a number of modelling and data collection challenges exist, and international consensus on the reliability modelling has not yet been reached. The objective of the task group called DIGREL has been to develop a taxonomy of failure modes of digital components for the purposes of probabilistic risk analysis (PRA). An activity focused on the development of a common taxonomy of failure modes is seen as an important step towards standardised digital instrumentation and control (I and C) reliability assessment techniques for PRA. Needs from PRA have guided the work, meaning, e.g., that the I and C system and its failures are studied from the point of view of their functional significance. The taxonomy will be the basis of future modelling and quantification efforts. It will also help to define a structure for data collection and to review PRA studies. The proposed failure modes taxonomy has been developed by first collecting examples of taxonomies provided by the task group organisations. This material showed some variety in the handling of I and C hardware failure modes, depending on the context where the failure modes have been defined. Regarding the software part of I and C, failure modes defined in NPP PRAs have been simple - typically a software CCF failing identical processing units. The DIGREL task group has defined a new failure modes taxonomy based on a hierarchical definition of five levels of abstraction: 1. system level (complete

  16. Development of an Electronic Medical Record Based Alert for Risk of HIV Treatment Failure in a Low-Resource Setting

    Science.gov (United States)

    Puttkammer, Nancy; Zeliadt, Steven; Balan, Jean Gabriel; Baseman, Janet; Destiné, Rodney; Domerçant, Jean Wysler; France, Garilus; Hyppolite, Nathaelf; Pelletier, Valérie; Raphael, Nernst Atwood; Sherr, Kenneth; Yuhas, Krista; Barnhart, Scott

    2014-01-01

    Background The adoption of electronic medical record systems in resource-limited settings can help clinicians monitor patients' adherence to HIV antiretroviral therapy (ART) and identify patients at risk of future ART failure, allowing resources to be targeted to those most at risk. Methods Among adult patients enrolled on ART from 2005–2013 at two large, public-sector hospitals in Haiti, ART failure was assessed after 6–12 months on treatment, based on the World Health Organization's immunologic and clinical criteria. We identified models for predicting ART failure based on ART adherence measures and other patient characteristics. We assessed performance of candidate models using the area under the receiver operating characteristic curve, and validated results using a randomly-split data sample. The selected prediction model was used to generate a risk score, and its ability to differentiate ART failure risk over a 42-month follow-up period was tested using stratified Kaplan-Meier survival curves. Results Among 923 patients with CD4 results available during the period 6–12 months after ART initiation, 196 (21.2%) met ART failure criteria. The pharmacy-based proportion of days covered (PDC) measure performed best among five possible ART adherence measures at predicting ART failure. Average PDC during the first 6 months on ART was 79.0% among cases of ART failure and 88.6% among cases of non-failure (pART initiation were added to PDC, the risk score differentiated between those who did and did not meet failure criteria over 42 months following ART initiation. Conclusions Pharmacy data are most useful for new ART adherence alerts within iSanté. Such alerts offer potential to help clinicians identify patients at high risk of ART failure so that they can be targeted with adherence support interventions, before ART failure occurs. PMID:25390044
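
The proportion-of-days-covered measure highlighted above can be sketched directly from pharmacy fill records; the dates, days' supply, and observation window below are hypothetical, not data from the study.

```python
# Sketch of the pharmacy-based proportion of days covered (PDC):
# the fraction of days in an observation window with drug on hand.
from datetime import date

def proportion_of_days_covered(fills, start, end):
    """fills: list of (dispense_date, days_supply) from pharmacy records."""
    window = (end - start).days + 1
    covered = set()
    for dispensed, days_supply in fills:
        for offset in range(days_supply):
            day = dispensed.toordinal() + offset
            if start.toordinal() <= day <= end.toordinal():
                covered.add(day)  # overlapping fills count once
    return len(covered) / window

# Hypothetical 180-day window with one 15-day refill gap in April.
fills = [(date(2013, 1, 1), 90), (date(2013, 4, 16), 90)]
pdc = proportion_of_days_covered(fills, date(2013, 1, 1), date(2013, 6, 29))
print(round(pdc, 3))  # 0.917
```

A patient like this one, with PDC above the ~88.6% average seen among non-failures, would not trigger a low-adherence alert; larger gaps pull the PDC toward the failure group's average.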

  17. How to interpret safety critical failures in risk and reliability assessments

    International Nuclear Information System (INIS)

    Selvik, Jon Tømmerås; Signoret, Jean-Pierre

    2017-01-01

    Management of safety systems often receives high attention due to the potential for industrial accidents. In risk and reliability literature concerning such systems, and particularly concerning safety-instrumented systems, one frequently comes across the term ‘safety critical failure’. It is a term associated with the term ‘critical failure’, and it is often deduced that a safety critical failure refers to a failure occurring in a safety critical system. Although this is correct in some situations, it is not matching with for example the mathematical definition given in ISO/TR 12489:2013 on reliability modeling, where a clear distinction is made between ‘safe failures’ and ‘dangerous failures’. In this article, we show that different interpretations of the term ‘safety critical failure’ exist, and there is room for misinterpretations and misunderstandings regarding risk and reliability assessments where failure information linked to safety systems are used, and which could influence decision-making. The article gives some examples from the oil and gas industry, showing different possible interpretations of the term. In particular we discuss the link between criticality and failure. The article points in general to the importance of adequate risk communication when using the term, and gives some clarification on interpretation in risk and reliability assessments.

  18. An investigation on vulnerability assessment of steel structures with thin steel shear wall through development of fragility curves

    OpenAIRE

    Mohsen Gerami; Saeed Ghaffari; Amir Mahdi Heidari Tafreshi

    2017-01-01

    Fragility curves play an important role in the damage assessment of buildings. The probability of damage to a structure under seismic events can be investigated once the aforementioned curves have been generated. In the current research, 360 time-history analyses have been carried out on structures of 3, 10 and 20 stories, and fragility curves have subsequently been developed. The curves are developed based on two indices: inter-story drift and the axial strain of the equivalent strips of the shear wall. T...
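
A common analytic form for such fragility curves is the lognormal model, giving the probability of reaching a damage state as a function of an intensity measure; the median capacity and dispersion below are assumed for illustration and are not fitted to this study's data.

```python
# Lognormal fragility curve: P(damage | im) = Phi( ln(im/theta) / beta ),
# where theta is the median capacity and beta the logarithmic dispersion.
import math

def lognormal_fragility(im, theta, beta):
    x = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF

# Illustrative parameters (assumed): median capacity 0.4 g, dispersion 0.5.
theta, beta = 0.4, 0.5
for im in (0.1, 0.2, 0.4, 0.8):
    print(im, round(lognormal_fragility(im, theta, beta), 3))
```

By construction the curve passes through probability 0.5 at the median capacity, and the dispersion beta controls how steeply it rises.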

  19. Impact of Different Surgeons on Dental Implant Failure.

    Science.gov (United States)

    Chrcanovic, Bruno Ramos; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    To assess the influence of several factors on the prevalence of dental implant failure, with special consideration of the placement of implants by different dental surgeons. This retrospective study is based on 2,670 patients who received 10,096 implants at one specialist clinic. Only the data of patients and implants treated by surgeons who had inserted a minimum of 200 implants at the clinic were included. Kaplan-Meier curves were stratified with respect to the individual surgeon. A generalized estimating equation (GEE) method was used to account for the fact that repeated observations (several implants) were placed in a single patient. The factors bone quantity, bone quality, implant location, implant surface, and implant system were analyzed with descriptive statistics separately for each individual surgeon. A total of 10 surgeons were eligible. The differences between the survival curves of each individual were statistically significant. The multivariate GEE model showed the following variables to be statistically significant: surgeon, bruxism, intake of antidepressants, location, implant length, and implant system. The surgeon with the highest absolute number of failures was also the one who inserted the most implants in sites of poor bone and used turned implants in most cases, whereas the surgeon with the lowest absolute number of failures used mainly modern implants. Separate survival analyses of turned and modern implants stratified for the individual surgeon showed statistically significant differences in cumulative survival. Different levels of failure incidence could be observed between the surgeons, occasionally reaching significant levels. Although a direct causal relationship could not be ascertained, the results of the present study suggest that the surgeons' technique, skills, and/or judgment may negatively influence implant survival rates.
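
The surgeon-stratified survival analysis above rests on the Kaplan-Meier product-limit estimator; a minimal sketch with invented follow-up data (not the study's) might look like this.

```python
# Kaplan-Meier product-limit survival estimate.
def kaplan_meier(times, events):
    """times: follow-up (e.g. months); events: 1 = failure, 0 = censored.
    Returns [(t, S(t))] at each time with at least one failure."""
    data = sorted(zip(times, events))
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        deaths = sum(e for tt, e in data if tt == t)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            surv *= 1.0 - deaths / at_risk  # censored cases leave the risk set
            curve.append((t, surv))
    return curve

# Hypothetical implant follow-up in months; 0 marks a censored implant.
times  = [3, 5, 5, 8, 12, 12, 15]
events = [1, 1, 0, 1,  0,  1,  0]
print(kaplan_meier(times, events))
```

Stratifying simply means computing one such curve per surgeon and comparing them, e.g. with a log-rank test.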

  20. Cost development of future technologies for power generation-A study based on experience curves and complementary bottom-up assessments

    International Nuclear Information System (INIS)

    Neij, Lena

    2008-01-01

    Technology foresight studies have become an important tool in identifying realistic ways of reducing the impact of modern energy systems on the climate and the environment. Studies on the future cost development of advanced energy technologies are of special interest. One approach widely adopted for the analysis of future cost is the experience curve approach. The question is, however, how robust this approach is, and which experience curves should be used in energy foresight analysis. This paper presents an analytical framework for the analysis of future cost development of new energy technologies for electricity generation; the analytical framework is based on an assessment of available experience curves, complemented with bottom-up analysis of sources of cost reductions and, for some technologies, judgmental expert assessments of long-term development paths. The results of these three methods agree in most cases, i.e. the cost (price) reductions described by the experience curves match the incremental cost reduction described in the bottom-up analysis and the judgmental expert assessments. For some technologies, the bottom-up analysis confirms large uncertainties in future cost development not captured by the experience curves. Experience curves with a learning rate ranging from 0% to 20% are suggested for the analysis of future cost development
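
The experience curve approach above can be sketched numerically: cost falls by a fixed fraction (the learning rate) with each doubling of cumulative production. The numbers below are illustrative, chosen within the 0-20% learning-rate range the study suggests.

```python
# Experience (learning) curve: cost(q) = c0 * (q / q0) ** (-b),
# where the learning rate LR = 1 - 2**(-b) is the cost drop per doubling.
import math

def experience_curve_cost(c0, q0, q, learning_rate):
    b = -math.log2(1.0 - learning_rate)
    return c0 * (q / q0) ** (-b)

# Illustrative: unit cost 1000 at cumulative production 1, 15% learning
# rate, evaluated after three doublings of cumulative production.
c = experience_curve_cost(c0=1000.0, q0=1.0, q=8.0, learning_rate=0.15)
print(round(c, 1))  # 614.1, i.e. 1000 * 0.85 ** 3
```

A 0% learning rate leaves the cost flat, which is why the study's 0-20% range spans such different foresight outcomes.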

  1. Elastic-plastic fracture assessment using a J-R curve by direct method

    International Nuclear Information System (INIS)

    Asta, E.P.

    1996-01-01

    In elastic-plastic evaluation methods based on J-integral and tearing modulus procedures, an essential input is the material fracture resistance (J-R) curve. In order to simplify J-R determination, a direct method based on load versus load-point displacement records from single-specimen tests may be employed. This procedure has advantages such as avoiding the accuracy problems of crack growth measuring devices and reducing testing time. This paper presents a structural integrity assessment approach for ductile fracture using the J-R curve obtained by a direct method from small single-specimen fracture tests. The direct J-R method was implemented by means of a computational program developed from theoretical elastic-plastic expressions. A comparative evaluation has been made between the direct-method J resistance curves and those obtained by the standard testing methodology on typical pressure vessel steels. The J-R curves estimated by the direct method show acceptable agreement, and the approach proposed in this study is reliable for engineering determinations. (orig.)
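
J-R curves are commonly represented as a power law, J = C(Δa)^m, fitted in log-log space; a minimal least-squares sketch with synthetic data (the coefficients below are invented, not from the paper) might look like this.

```python
# Fit J = C * (delta_a)**m by linear least squares in log-log space.
import math

def fit_jr_curve(delta_a, j_values):
    xs = [math.log(a) for a in delta_a]
    ys = [math.log(j) for j in j_values]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    c = math.exp(ybar - m * xbar)
    return c, m

# Synthetic data generated from J = 300 * da**0.5 (e.g. kJ/m^2 vs. mm).
da = [0.2, 0.5, 1.0, 1.5, 2.0]
j = [300.0 * a ** 0.5 for a in da]
c, m = fit_jr_curve(da, j)
print(round(c, 1), round(m, 3))  # recovers 300.0 and 0.5
```

With real test records the (Δa, J) pairs would come from the direct load-displacement analysis rather than a formula, but the fitting step is the same.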

  2. Correlation of radiological assessment of congestive heart failure with left ventricular end-diastolic pressure

    International Nuclear Information System (INIS)

    Herman, P.G.; Kahn, A.; Kallman, C.E.; Rojas, K.A.; Bodenheimer, M.M.

    1988-01-01

    Left ventricular end-diastolic pressure (LVEDP) has been considered a reliable indicator of left ventricular function. The purpose of this study was to correlate the radiologic assessment of congestive heart failure with LVEDP. The population of the study consisted of 85 consecutive cases in four ranges of LVEDP ( 24). The PA chest radiographs obtained 1 day prior to cardiac catheterization were assessed for radiological evidence of congestive heart failure and were graded from normal to abnormal (0-3). The results will be summarized in the authors' presentation. The discordance of radiological assessment of congestive heart failure in patients with elevated LVEDP will be discussed in light of recent advances in the pathophysiologic understanding of left ventricular function and the impact of new classes of drugs in the management of these patients

  3. Assessment of congestive heart failure in chest radiographs

    International Nuclear Information System (INIS)

    Henriksson, L.; Sundin, A.; Smedby, Oe.; Albrektsson, P.

    1990-01-01

    The effect of observer variations and film-screen quality on the diagnosis of congestive heart failure based on chest radiographs was studied in 27 patients. For each patient, two films were exposed, one with the Kodak Lanex Medium system and one with the Agfa MR 400 system. The films were presented to three observers who assessed the presence of congestive heart failure on a three-graded scale. The results showed no significant difference between the two systems but large systematic differences between the observers. There were also differences between the two ratings by the same observer that could not be explained by the film-screen factor. It is concluded that the choice between these two systems is of little importance in view of the interobserver and intraobserver variability that can exist within the same department. (orig.)

  4. An experimental assessment of proposed universal yield curves for secondary electron emission

    International Nuclear Information System (INIS)

    Salehi, M.; Flinn, E.A.

    1980-01-01

    A variety of 'Universal Yield Curves' for the secondary emission process have been proposed. A series of precise measurements of the secondary emission properties of a range of related amorphous semiconducting materials, made under UHV on freshly vacuum-cleaved surfaces, and covering a wide range of primary energies, have recently made possible an accurate assessment of the validity of the various UYC's suggested. It is found that no truly universal curve exists; the atomic number of the target material plays an important part in determining the secondary emission properties. Agarwal's (Proc. Phys. Soc.; 71: 851 (1958)) semi-empirical expression, which takes account of the atomic number and weight, is found to give good agreement for all the materials studied. Further theoretical investigation is required. (author)

  5. Application of failure assessment diagram methods to cracked straight pipes and elbows

    International Nuclear Information System (INIS)

    Ainsworth, R.A.; Gintalas, M.; Sahu, M.K.; Chattopadhyay, J.; Dutta, B.K.

    2016-01-01

    This paper reports fracture assessments of large-scale straight pipes and elbows of various pipe diameters and crack sizes. The assessments estimate the load for ductile fracture initiation using the failure assessment diagram method. Recent solutions in the literature for stress intensity factor and limit load provide the analysis inputs. An assessment of constraint effects is also performed using recent solutions for elastic T-stress. It is found that predictions of initiation load are close to the experimental values for straight pipes under pure bending. For elbows, there is generally increased conservatism in the sense that the experimental loads are greater than those predicted. The effects of constraint are found not to be a major contributor to the initiation fracture assessments but may have some influence on the ductile crack extension. - Highlights: • This paper presents assessments of the loads for ductile fracture initiation in 21 large-scale piping tests. • Modern stress intensity factor and limit load solutions were used for standard failure assessment diagram methods. • This leads to generally accurate assessments of the loads for ductile crack initiation. • The effects of constraint are found not to be a major contributor to the initiation fracture assessments.
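
A failure assessment diagram check ultimately reduces to testing whether the assessment point (Lr, Kr) lies inside the curve. The sketch below uses a commonly quoted general-purpose (R6 Option 1 style) curve; the exact expression, the Lr cut-off, and the assessment point values are assumptions to be verified against the applicable code before any real use.

```python
# Failure assessment diagram (FAD) check for an assessment point (Lr, Kr).
import math

def fad_option1(lr):
    """A commonly quoted general-purpose FAD curve (R6 Option 1 form);
    confirm the expression and validity range against the applicable code."""
    return (1.0 + 0.5 * lr ** 2) ** -0.5 * \
           (0.3 + 0.7 * math.exp(-0.65 * lr ** 6))

def is_acceptable(kr, lr, lr_max=1.2):
    """Kr = K_I / K_mat, Lr = applied load / plastic limit load.
    The point is acceptable if it lies on or under the curve and Lr <= cut-off."""
    return lr <= lr_max and kr <= fad_option1(lr)

# Hypothetical assessment point for a cracked pipe (illustrative values).
print(is_acceptable(kr=0.45, lr=0.70))
```

For the piping tests above, the predicted initiation load is the load scaling at which the assessment point first reaches the curve.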

  6. Ambiguity assessment of small-angle scattering curves from monodisperse systems.

    Science.gov (United States)

    Petoukhov, Maxim V; Svergun, Dmitri I

    2015-05-01

    A novel approach is presented for an a priori assessment of the ambiguity associated with spherically averaged single-particle scattering. The approach is of broad interest to the structural biology community, allowing the rapid and model-independent assessment of the inherent non-uniqueness of three-dimensional shape reconstruction from scattering experiments on solutions of biological macromolecules. One-dimensional scattering curves recorded from monodisperse systems are nowadays routinely utilized to generate low-resolution particle shapes, but the potential ambiguity of such reconstructions remains a major issue. At present, the (non)uniqueness can only be assessed by a posteriori comparison and averaging of repetitive Monte Carlo-based shape-determination runs. The new a priori ambiguity measure is based on the number of distinct shape categories compatible with a given data set. For this purpose, a comprehensive library of over 14,000 shape topologies has been generated containing up to seven beads closely packed on a hexagonal grid. The computed scattering curves rescaled to keep only the shape topology rather than the overall size information provide a `scattering map' of this set of shapes. For a given scattering data set, one rapidly obtains the number of neighbours in the map and the associated shape topologies such that in addition to providing a quantitative ambiguity measure the algorithm may also serve as an alternative shape-analysis tool. The approach has been validated in model calculations on geometrical bodies and its usefulness is further demonstrated on a number of experimental X-ray scattering data sets from proteins in solution. A quantitative ambiguity score (a-score) is introduced to provide immediate and convenient guidance to the user on the uniqueness of the ab initio shape reconstruction from the given data set.

  7. Failure probability of PWR reactor coolant loop piping

    International Nuclear Information System (INIS)

    Lo, T.; Woo, H.H.; Holman, G.S.; Chou, C.K.

    1984-02-01

    This paper describes the results of assessments performed on the PWR coolant loop piping of Westinghouse and Combustion Engineering plants. For direct double-ended guillotine break (DEGB), consideration was given to crack existence probability, initial crack size distribution, hydrostatic proof test, preservice inspection, leak detection probability, crack growth characteristics, and failure criteria based on the net section stress failure and tearing modulus stability concept. For indirect DEGB, fragilities of major component supports were estimated. The system level fragility was then calculated based on the Boolean expression involving these fragilities. Indirect DEGB due to seismic effects was calculated by convolving the system level fragility and the seismic hazard curve. The results indicate that the probability of occurrence of both direct and indirect DEGB is extremely small, thus, postulation of DEGB in design should be eliminated and replaced by more realistic criteria
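
The final step above, convolving a system-level fragility with the seismic hazard curve, can be sketched numerically; both curves below (a power-law hazard and a lognormal fragility) are illustrative assumptions, not the study's data.

```python
# Annual failure frequency by convolving a fragility curve with a seismic
# hazard curve H(a) (annual exceedance frequency of acceleration a):
#   P_f = integral of P(fail | a) * (-dH/da) da
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def annual_failure_freq(hazard, fragility, a_lo, a_hi, n=20_000):
    total, da = 0.0, (a_hi - a_lo) / n
    for i in range(n):
        a = a_lo + (i + 0.5) * da
        slope = (hazard(a + da / 2) - hazard(a - da / 2)) / da  # dH/da < 0
        total += fragility(a) * (-slope) * da
    return total

hazard = lambda a: 1e-3 * (a / 0.1) ** -2                 # illustrative power law
fragility = lambda a: norm_cdf(math.log(a / 0.8) / 0.4)   # lognormal, median 0.8 g
pf = annual_failure_freq(hazard, fragility, 0.1, 5.0)
print(pf)
```

Because the fragility is bounded by one, the result can never exceed the hazard frequency at the lower integration limit, which is one quick sanity check on such calculations.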

  8. Sex- and Site-Specific Normative Data Curves for HR-pQCT.

    Science.gov (United States)

    Burt, Lauren A; Liang, Zhiying; Sajobi, Tolulope T; Hanley, David A; Boyd, Steven K

    2016-11-01

    The purpose of this study was to develop age-, site-, and sex-specific centile curves for common high-resolution peripheral quantitative computed tomography (HR-pQCT) and finite-element (FE) parameters for males and females older than 16 years. Participants (n = 866) from the Calgary cohort of the Canadian Multicentre Osteoporosis Study (CaMos) between the ages of 16 and 98 years were included in this study. Participants' nondominant radius and left tibia were scanned using HR-pQCT. Standard and automated segmentation methods were performed and FE analysis estimated apparent bone strength. Centile curves were generated for males and females at the tibia and radius using the generalized additive models for location, scale, and shape (GAMLSS) package in R. After GAMLSS analysis, age-, sex-, and site-specific centiles (10th, 25th, 50th, 75th, 90th) for total bone mineral density and trabecular number as well as failure load have been calculated. Clinicians and researchers can use these reference curves as a tool to assess bone health and changes in bone quality. © 2016 American Society for Bone and Mineral Research. © 2016 American Society for Bone and Mineral Research.

  9. Failure probability assessment of wall-thinned nuclear pipes using probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Lee, Sang-Min; Chang, Yoon-Suk; Choi, Jae-Boong; Kim, Young-Jin

    2006-01-01

    The integrity of a nuclear piping system has to be maintained during operation. In order to maintain the integrity, reliable assessment procedures including fracture mechanics analysis, etc., are required. Up to now, this has been performed using conventional deterministic approaches even though many uncertainties hinder a rational evaluation. In this respect, probabilistic approaches are considered an appropriate method for piping system evaluation. The objectives of this paper are to estimate the failure probabilities of wall-thinned pipes in nuclear secondary systems and to propose limited operating conditions under different types of loadings. To do this, a probabilistic assessment program using reliability index and simulation techniques was developed and applied to evaluate failure probabilities of wall-thinned pipes subjected to internal pressure, bending moment and their combined loading. The sensitivity analysis results as well as prototypal integrity assessment results showed the promising applicability of the probabilistic assessment program, the necessity of practical evaluation reflecting combined loading conditions, and operation under limited conditions
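
The simulation-based side of such a probabilistic assessment can be sketched with crude Monte Carlo on a simple limit state. The thin-wall burst model and all distributions below are assumptions for illustration, not the program or data of the record:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Hypothetical stochastic variables for a wall-thinned pipe (units: MPa, mm)
sigma_f = rng.normal(450.0, 30.0, n)    # flow stress
t = rng.normal(2.5, 0.5, n)             # remaining wall thickness after thinning
D = 114.3                                # outer diameter (deterministic)
p = rng.normal(12.0, 1.0, n)            # internal pressure demand

# Simple limit state: thin-wall burst pressure minus applied pressure
p_burst = 2.0 * sigma_f * t / D
g = p_burst - p
pf = np.mean(g < 0.0)
print(f"estimated failure probability = {pf:.2e}")
```

A reliability-index (FORM) evaluation of the same limit state would replace the sampling step with a search for the most probable failure point.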

  10. New method of safety assessment for pressure vessel of nuclear power plant--brief introduction of master curve approach

    International Nuclear Information System (INIS)

    Yang Wendou

    2011-01-01

    The new Master Curve method has been called a revolutionary advance in the assessment of reactor pressure vessel integrity in the USA. This paper explains the origin, basis and standardization of the Master Curve, starting from the reactor pressure-temperature limit curve that assures the safety of a nuclear power plant. Given that brittle fracture is strongly sensitive to microstructure, the theory and test method of the Master Curve, as well as its statistical law, which can be modeled by a Weibull distribution, are described in this paper. The meaning, advantages, application and importance of the Master Curve, as well as its relation to nuclear power safety, follow from the Weibull fit to the fracture toughness database. (author)
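
The standard Master Curve relations (as codified in ASTM E1921) can be sketched directly: the median toughness follows a fixed exponential in temperature, and scatter follows a three-parameter Weibull distribution with shape 4 and a 20 MPa·√m threshold. The reference temperature T0 below is a hypothetical value, not one from the paper:

```python
import numpy as np

def master_curve_median(T, T0):
    """Median fracture toughness (MPa*sqrt(m)) vs. temperature, ASTM E1921 form."""
    return 30.0 + 70.0 * np.exp(0.019 * (T - T0))

def failure_probability(K_Jc, T, T0):
    """Three-parameter Weibull (shape 4, K_min = 20 MPa*sqrt(m)) cumulative probability."""
    K0 = 31.0 + 77.0 * np.exp(0.019 * (T - T0))   # 63.2% scale parameter
    return 1.0 - np.exp(-(((K_Jc - 20.0) / (K0 - 20.0)) ** 4))

T0 = -60.0  # hypothetical reference temperature (deg C) for an RPV steel
for T in (-100.0, -60.0, -20.0):
    print(T, round(master_curve_median(T, T0), 1))
```

At T = T0 the median toughness is 100 MPa·√m by construction, and the Weibull CDF evaluated at the median is close to 0.5.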

  11. Failure analysis of parameter-induced simulation crashes in climate models

    Science.gov (United States)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
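
The SVM-plus-AUC workflow described above can be sketched with scikit-learn. The data here are synthetic stand-ins (18 features, ~8.5% positive class to mirror the crash rate), not the POP2 ensemble:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

# Stand-in data: 18 "parameters" and a binary crash/no-crash label
X, y = make_classification(n_samples=500, n_features=18,
                           weights=[0.915, 0.085], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# RBF-kernel SVM with probability outputs, scored on a held-out validation split
clf = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", probability=True, random_state=0))
clf.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"validation AUC = {auc:.3f}")
```

A "committee" as in the paper would train several such classifiers (e.g., on resampled ensembles) and average their predicted probabilities.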

  12. Development of an electronic medical record based alert for risk of HIV treatment failure in a low-resource setting.

    Directory of Open Access Journals (Sweden)

    Nancy Puttkammer

    The adoption of electronic medical record systems in resource-limited settings can help clinicians monitor patients' adherence to HIV antiretroviral therapy (ART) and identify patients at risk of future ART failure, allowing resources to be targeted to those most at risk. Among adult patients enrolled on ART from 2005-2013 at two large, public-sector hospitals in Haiti, ART failure was assessed after 6-12 months on treatment, based on the World Health Organization's immunologic and clinical criteria. We identified models for predicting ART failure based on ART adherence measures and other patient characteristics. We assessed performance of candidate models using area under the receiver operating characteristic curve, and validated results using a randomly-split data sample. The selected prediction model was used to generate a risk score, and its ability to differentiate ART failure risk over a 42-month follow-up period was tested using stratified Kaplan-Meier survival curves. Among 923 patients with CD4 results available during the period 6-12 months after ART initiation, 196 (21.2%) met ART failure criteria. The pharmacy-based proportion of days covered (PDC) measure performed best among five possible ART adherence measures at predicting ART failure. Average PDC during the first 6 months on ART was 79.0% among cases of ART failure and 88.6% among cases of non-failure (p<0.01). When additional information including sex, baseline CD4, and duration of enrollment in HIV care prior to ART initiation was added to PDC, the risk score differentiated between those who did and did not meet failure criteria over 42 months following ART initiation. Pharmacy data are most useful for new ART adherence alerts within iSanté. Such alerts offer potential to help clinicians identify patients at high risk of ART failure so that they can be targeted with adherence support interventions, before ART failure occurs.
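
As a sketch of the pharmacy-based adherence measure, the proportion of days covered (PDC) counts the fraction of days in an observation window covered by dispensed supply. The fill dates and supplies below are invented, and this simple version does not handle early-refill carryover:

```python
from datetime import date

def proportion_of_days_covered(fills, start, end):
    """PDC over [start, end): fraction of days covered by dispensed supply.

    `fills` is a list of (fill_date, days_supply) tuples from pharmacy records.
    """
    covered = set()
    for fill_date, days_supply in fills:
        for offset in range(days_supply):
            day = fill_date.toordinal() + offset
            if start.toordinal() <= day < end.toordinal():
                covered.add(day)
    return len(covered) / (end - start).days

fills = [(date(2023, 1, 1), 30), (date(2023, 2, 5), 30), (date(2023, 3, 20), 30)]
pdc = proportion_of_days_covered(fills, date(2023, 1, 1), date(2023, 7, 1))
print(f"PDC = {pdc:.1%}")
```

An alert system would compare such a score against a threshold learned from the prediction model.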

  13. Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.

    2017-10-01

    When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach), which departs from a definition of consistency commonly used in operational hydrology. A period is considered to be consistent if no consecutive and systematic deviations from a current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed in each observation by indicating the outermost data points for which the rating curve model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency and, if not resolved, could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each country, regional information is maximally used to estimate observational uncertainty. Based on this uncertainty, a BReach analysis is performed and, subsequently, results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear to be consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also to enhance applications based on these data (e.g., by informing hydrological and hydraulic model
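
BReach itself is more involved, but the underlying rating-curve model is the standard power law Q = a(h − h0)^b. A minimal least-squares fit to synthetic gaugings (all stage and discharge values invented) might look like:

```python
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(h, a, h0, b):
    """Power-law stage-discharge relation Q = a * (h - h0)**b."""
    return a * (h - h0) ** b

# Synthetic gaugings: stage h (m) and discharge Q (m^3/s) with 5% noise
rng = np.random.default_rng(1)
h = np.linspace(0.5, 3.0, 40)
Q_true = 12.0 * (h - 0.2) ** 1.7
Q_obs = Q_true * (1.0 + 0.05 * rng.standard_normal(h.size))

# Bounds keep h0 below the lowest gauged stage so the power stays real
popt, _ = curve_fit(rating_curve, h, Q_obs, p0=(10.0, 0.0, 1.5),
                    bounds=([0.1, -1.0, 0.5], [100.0, 0.45, 3.0]))
a_fit, h0_fit, b_fit = popt
print(f"a={a_fit:.2f}, h0={h0_fit:.3f}, b={b_fit:.2f}")
```

A consistency analysis in the BReach spirit would then ask, for each chronological subset of gaugings, whether one such fitted curve describes them within observational uncertainty.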

  14. Assessing neural activity related to decision-making through flexible odds ratio curves and their derivatives.

    Science.gov (United States)

    Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Pardo-Vazquez, Jose L; Leboran, Victor; Molenberghs, Geert; Faes, Christel; Acuña, Carlos

    2011-06-30

    It is well established that neural activity is stochastically modulated over time. Therefore, direct comparisons across experimental conditions and determination of change points or maximum firing rates are not straightforward. This study sought to compare temporal firing probability curves that may vary across groups defined by different experimental conditions. Odds-ratio (OR) curves were used as a measure of comparison, and the main goal was to provide a global test to detect significant differences of such curves through the study of their derivatives. An algorithm is proposed that enables ORs based on generalized additive models, including factor-by-curve-type interactions to be flexibly estimated. Bootstrap methods were used to draw inferences from the derivatives curves, and binning techniques were applied to speed up computation in the estimation and testing processes. A simulation study was conducted to assess the validity of these bootstrap-based tests. This methodology was applied to study premotor ventral cortex neural activity associated with decision-making. The proposed statistical procedures proved very useful in revealing the neural activity correlates of decision-making in a visual discrimination task. Copyright © 2011 John Wiley & Sons, Ltd.

  15. Advanced composites structural concepts and materials technologies for primary aircraft structures: Structural response and failure analysis

    Science.gov (United States)

    Dorris, William J.; Hairr, John W.; Huang, Jui-Tien; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    Non-linear analysis methods were adapted and incorporated in a finite element based DIAL code. These methods are necessary to evaluate the global response of a stiffened structure under combined in-plane and out-of-plane loading. These methods include the Arc Length method and target point analysis procedure. A new interface material model was implemented that can model elastic-plastic behavior of the bond adhesive. Direct application of this method is in skin/stiffener interface failure assessment. Addition of the AML (angle minus longitudinal or load) failure procedure and Hashin's failure criteria provides added capability in the failure predictions. Interactive Stiffened Panel Analysis modules were developed as interactive pre- and post-processors. Each module provides the means of performing self-initiated finite element based analysis of primary structures such as a flat or curved stiffened panel; a corrugated flat sandwich panel; and a curved geodesic fuselage panel. These modules bring finite element analysis into the design of composite structures without the requirement for the user to know much about the techniques and procedures needed to actually perform a finite element analysis from scratch. An interactive finite element code was developed to predict bolted joint strength considering material and geometrical non-linearity. The developed method conducts an ultimate strength failure analysis using a set of material degradation models.

  16. A curved beam test specimen for determining the interlaminar tensile strength of a laminated composite

    Science.gov (United States)

    Hiel, Clement C.; Sumich, Mark; Chappell, David P.

    1991-01-01

    A curved beam type of test specimen is evaluated for use in determining the through-the-thickness strength of laminated composites. Two variations of a curved beam specimen configuration (semicircular and elliptical) were tested to failure using static and fatigue loads. The static failure load for the semicircular specimens was found to be highly sensitive to flaw content, with the specimens falling into two distinct groups. This result supports the use of proof testing for structural validation. Static design allowables are derived based on the Weibull distribution. Fatigue data indicates no measured increase in specimen compliance prior to final fracture. All static and fatigue failures at room temperature dry conditions occurred catastrophically. The elliptical specimens demonstrated unusually high failure strengths indicating the presence of phenomena requiring further study. Results are also included for specimens exposed to a wet environment showing a matrix strength degradation due to moisture content. Further testing is underway to evaluate a fatigue methodology for matrix dominated failures based on residual static strength (wearout).

  17. Assessment of the French and US embrittlement trend curves applied to RPV materials irradiated in the BR2 materials test reactor

    International Nuclear Information System (INIS)

    Chaouadi, R.; Gerard, R.; Boagaerts, A.S.

    2011-01-01

    The irradiation embrittlement of reactor pressure vessels (RPVs) is monitored through the surveillance programs associated with predictive formulas, the so-called embrittlement trend curves. These formulas are generally empirically derived and contain the major embrittlement-inducing elements such as copper, nickel and phosphorus. There are a number of such trend curves used in regulatory guides in the US, France, Germany, Russia and Japan. These trend curves are often supported by surveillance data and regularly assessed in view of updated surveillance databases. With the recent worldwide move towards life extension of existing reactors beyond their initially-scheduled lifetime of 40 years, adequate and accurate modeling of irradiation embrittlement becomes a concern for long term operation. The aim of this work is to assess the performance of the embrittlement trend curves used in a regulatory perspective against the experimental data obtained in the BR2 reactor. The work presented here is limited to US and French trend curves because the reactor pressure vessels of the Belgian nuclear power plants are either of Westinghouse or Framatome design. The chemical composition of the Belgian RPVs being very close to that of the French 900 MW units, the French trend curves are used, except for the Doel 1-2 units, for which these curves are not applicable due to the higher copper content of the welds; in this case, the U.S. trend curves are used. In particular, the French (FIM, FIS) and the US (Reg. Guide 1.99 Rev. 2, ASTM E900-02, EWO and EONY) formulas are of prime interest. The results obtained clearly show that the French trend curves tend to over-estimate the actual irradiation hardening while the US curves under-estimate it. Within the long term operation perspective, both over- and under-estimation are undesirable and therefore the

  18. Physics of Failure as a Basis for Solder Elements Reliability Assessment in Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    …description of the reliability. A physics of failure approach is applied. A SnAg solder component used in power electronics is used as an example. Crack propagation in the SnAg solder is modeled, and a model to assess the accumulated plastic strain is proposed based on a physics of failure approach. Based on the proposed model, it is described how to find the accumulated linear damage and reliability levels for a given temperature loading profile. Using structural reliability methods, the reliability levels of the electrical components are assessed by introducing scale factors for stresses.

  19. Global experience curves for wind farms

    International Nuclear Information System (INIS)

    Junginger, M.; Faaij, A.; Turkenburg, W.C.

    2005-01-01

    In order to forecast the technological development and cost of wind turbines and the production costs of wind electricity, frequent use is made of the so-called experience curve concept. Experience curves of wind turbines are generally based on data describing the development of national markets, which cause a number of problems when applied for global assessments. To analyze global wind energy price development more adequately, we compose a global experience curve. First, underlying factors for past and potential future price reductions of wind turbines are analyzed. Also possible implications and pitfalls when applying the experience curve methodology are assessed. Second, we present and discuss a new approach of establishing a global experience curve and thus a global progress ratio for the investment cost of wind farms. Results show that global progress ratios for wind farms may lie between 77% and 85% (with an average of 81%), which is significantly more optimistic than progress ratios applied in most current scenario studies and integrated assessment models. While the findings are based on a limited amount of data, they may indicate faster price reduction opportunities than so far assumed. With this global experience curve we aim to improve the reliability of describing the speed with which global costs of wind power may decline
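
The experience-curve concept reduces to fitting cost C = C0·x^(−E) against cumulative capacity x and reporting the progress ratio PR = 2^(−E), i.e., the cost multiplier per doubling of capacity. The capacity and cost series below are invented (constructed around an 85% progress ratio), not the study's data:

```python
import numpy as np

# Hypothetical cumulative installed capacity (MW) and specific cost (EUR/kW)
cum_capacity = np.array([500, 1_000, 2_000, 4_000, 8_000, 16_000], dtype=float)
cost = np.array([1400, 1190, 1012, 860, 731, 621], dtype=float)

# Fit log(cost) = slope * log(capacity) + intercept; slope = -E, PR = 2**(-E)
slope, intercept = np.polyfit(np.log(cum_capacity), np.log(cost), 1)
progress_ratio = 2.0 ** slope
print(f"progress ratio = {progress_ratio:.1%}")  # cost after each capacity doubling
```

A PR of 81% as found in the study means each doubling of global installed wind-farm capacity was associated with a roughly 19% drop in specific investment cost.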

  20. A human reliability analysis (HRA) method for identifying and assessing the error of commission (EOC) from a diagnosis failure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Yun; Kang, Dae Il

    2005-01-01

    The study deals with a method for systematically identifying and assessing the EOC events that might be caused by a diagnosis failure or misdiagnosis of the expected events in accident scenarios of nuclear power plants. The method for EOC identification and assessment consists of three steps: analysis of the potential for a diagnosis failure (or misdiagnosis), identification of the EOC events from the diagnosis failure, and quantitative assessment of the identified EOC events. As a tool for analysing a diagnosis failure, the MisDiagnosis Tree Analysis (MDTA) technique is proposed with a taxonomy of misdiagnosis causes. Also, guidance on the identification of EOC events and the classification system and data are given for quantitative assessment. As an application of the proposed method, EOC identification and assessment for the Younggwang 3 and 4 plants and their impact on the plant risk were performed. As a result, six events or event sequences were considered for diagnosis failures and about 20 new Human Failure Events (HFEs) involving EOCs were identified. According to the assessment of the risk impact of the identified HFEs, they increase the CDF by 11.4% of the current CDF value, which corresponds to 10.2% of the new CDF. The small loss of coolant accident (SLOCA) turned out to be a major contributor to the increase of CDF, resulting in a 9.2% increase of the current CDF.
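
The two quoted percentages are consistent with each other, which a one-line check makes explicit: an 11.4% increase of the current CDF is 0.114/1.114 ≈ 10.2% of the new CDF.

```python
# Consistency check of the quoted figures (normalized current CDF = 1)
cdf_old = 1.0
delta = 0.114 * cdf_old            # contribution of the newly identified EOC HFEs
cdf_new = cdf_old + delta
share_of_new = delta / cdf_new
print(f"{share_of_new:.1%}")       # share of the new CDF due to the EOC HFEs
```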

  1. Green tax reform, marginal revenue of wage income taxes, and the wage curve. A brief note

    International Nuclear Information System (INIS)

    Ziesemer, T.

    2002-01-01

    It has been shown elsewhere (Schneider, 1997) that the success of a green tax reform depends crucially on a small slope of the wage curve of an efficiency wage model in which production occurs using a second factor E, energy or emissions. Also elsewhere (Scholz, 1998) it was revealed that there is a second necessary condition: that the marginal revenue of the wage income tax is negative. In this note we show that (1) these two conditions are not independent, but rather both depend on the slope of the wage curve; and (2) if Schneider's condition of a sufficiently flat wage curve is fulfilled, the marginal revenue of wage income taxes must be negative. By implication, both the green tax reform and the sign of the marginal revenue of wage income taxes depend on the slope of the wage curve, which allows us to distinguish three cases of a tax reform: (a) a double dividend for a very small slope of the wage curve (Schneider's case); (b) failure of unemployment reduction (Scholz's case) for a very steep wage curve; (c) failure of emission reduction for an intermediate wage-curve slope.

  2. Seismic Margin Assessment for Research Reactor using Fragility based Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kwag, Shinyoung; Oh, Jinho; Lee, Jong-Min; Ryu, Jeong-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    A research reactor is often subjected to external hazards during its design lifetime. In particular, a seismic event can pose a significant threat to the structural system of the research reactor, and such a structural failure can extend to direct core damage. For this purpose, a fault tree for structural system failure leading to core damage under an earthquake accident is developed. The failure probabilities of the basic events are evaluated as fragility curves with log-normal distributions. Finally, the plant-level seismic margin is investigated by fault tree analysis combined with the fragility data, and the critical path is identified. This plant-level probabilistic seismic margin assessment, using fragility-based fault tree analysis, quantifies the safety of the research reactor against a seismic hazard.
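
Combining log-normal fragilities of basic events through a fault-tree OR gate can be sketched as follows. The component names, median capacities, log-standard deviations, and the review-level PGA are all invented, and independence of basic events is an assumption:

```python
import numpy as np
from scipy.stats import norm

def fragility(a, Am, beta):
    """Lognormal fragility: P(failure | PGA = a) with median Am and log-std beta."""
    return norm.cdf(np.log(a / Am) / beta)

# Hypothetical basic events (median capacity in g, composite log-std)
components = {"pool structure": (1.8, 0.45),
              "reactor block": (2.2, 0.40),
              "support frame": (1.5, 0.50)}

a = 0.5  # assumed review-level earthquake PGA (g)
p_events = {name: fragility(a, Am, beta) for name, (Am, beta) in components.items()}

# Top event through an OR gate, assuming independent basic events
p_top = 1.0 - np.prod([1.0 - p for p in p_events.values()])
print(p_events, p_top)
```

The critical path then corresponds to the basic event dominating p_top, here the one with the lowest median capacity.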

  3. Patterns of failure for glioblastoma multiforme following concurrent radiation and temozolomide

    International Nuclear Information System (INIS)

    Dobelbower, Michael C.; Burnett, Omer L. III; Haytt, Mark D.; Fiveash, John B.; Nordal, Robert A.; Nabors, Louis B.; Markert, James M.

    2011-01-01

    To analyse patterns of failure in patients with glioblastoma multiforme treated with concurrent radiation and temozolomide. A retrospective review of patients treated with concurrent radiation and temozolomide was performed. Twenty patients treated at the University of Alabama at Birmingham, with biopsy-proven disease, documented disease progression after treatment, and adequate radiation dosimetry and imaging records were included in the study. Patients generally received 46 Gy to the primary tumour and surrounding oedema plus 1 cm, and 60 Gy to the enhancing tumour plus 1 cm. MRIs documenting failure after therapy were fused to the original treatment plans. Contours of post-treatment tumour volumes were generated from MRIs showing tumour failure and were overlaid onto the original isodose curves. The recurrent tumours were classified as in-field, marginal or regional. Recurrences were also evaluated for distant failure. Of the 20 documented failures, all patients had some component of failure at the primary site. Eighteen patients (90%) failed in-field, 2 patients (10%) had marginal failures, and no regional failures occurred. Four patients (20%) had a component of distant failure in which an independent satellite lesion was located completely outside of the 95% isodose curve. Radiation concurrent with temozolomide appears to be associated with a moderate risk of distant brain failure in addition to the high rate of local failure. The risk of distant failure was consistent with that observed with radiation alone, suggesting that temozolomide does not act to reduce distant brain failure but to improve local control.

  4. Exponential Decay Nonlinear Regression Analysis of Patient Survival Curves: Preliminary Assessment in Non-Small Cell Lung Cancer

    Science.gov (United States)

    Stewart, David J.; Behrens, Carmen; Roth, Jack; Wistuba, Ignacio I.

    2010-01-01

    Background For processes that follow first order kinetics, exponential decay nonlinear regression analysis (EDNRA) may delineate curve characteristics and suggest processes affecting curve shape. We conducted a preliminary feasibility assessment of EDNRA of patient survival curves. Methods EDNRA was performed on Kaplan-Meier overall survival (OS) and time-to-relapse (TTR) curves for 323 patients with resected NSCLC and on OS and progression-free survival (PFS) curves from selected publications. Results and Conclusions In our resected patients, TTR curves were triphasic with a “cured” fraction of 60.7% (half-life [t1/2] >100,000 months), a rapidly-relapsing group (7.4%, t1/2=5.9 months) and a slowly-relapsing group (31.9%, t1/2=23.6 months). OS was uniphasic (t1/2=74.3 months), suggesting an impact of co-morbidities; hence, tumor molecular characteristics would more likely predict TTR than OS. Of 172 published curves analyzed, 72 (42%) were uniphasic, 92 (53%) were biphasic, 8 (5%) were triphasic. With first-line chemotherapy in advanced NSCLC, 87.5% of curves from 2-3 drug regimens were uniphasic vs only 20% of those with best supportive care or 1 drug (p<0.001). 54% of curves from 2-3 drug regimens had convex rapid-decay phases vs 0% with fewer agents (p<0.001). Curve convexities suggest that discontinuing chemotherapy after 3-6 cycles “synchronizes” patient progression and death. With postoperative adjuvant chemotherapy, the PFS rapid-decay phase accounted for a smaller proportion of the population than in controls (p=0.02) with no significant difference in rapid-decay t1/2, suggesting adjuvant chemotherapy may move a subpopulation of patients with sensitive tumors from the relapsing group to the cured group, with minimal impact on time to relapse for a larger group of patients with resistant tumors. In untreated patients, the proportion of patients in the rapid-decay phase increased (p=0.04) while rapid-decay t1/2 decreased (p=0.0004) with increasing
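
The multiphasic decomposition described above can be reproduced with ordinary nonlinear least squares: a "cured" fraction (effectively infinite half-life) plus first-order decay terms for rapidly- and slowly-relapsing groups. The survival data below are synthetic, generated from invented parameters rather than the paper's cohort:

```python
import numpy as np
from scipy.optimize import curve_fit

def triphasic(t, f_cured, f_rapid, t_half_rapid, t_half_slow):
    """Surviving fraction: cured group plus two first-order decay groups."""
    lam_r = np.log(2) / t_half_rapid
    lam_s = np.log(2) / t_half_slow
    f_slow = 1.0 - f_cured - f_rapid
    return f_cured + f_rapid * np.exp(-lam_r * t) + f_slow * np.exp(-lam_s * t)

t = np.arange(0, 61, 3, dtype=float)               # months
rng = np.random.default_rng(0)
s_true = triphasic(t, 0.60, 0.08, 6.0, 24.0)
s_obs = np.clip(s_true + 0.01 * rng.standard_normal(t.size), 0.0, 1.0)

popt, _ = curve_fit(triphasic, t, s_obs, p0=(0.5, 0.1, 5.0, 30.0),
                    bounds=([0, 0, 1, 10], [1, 1, 10, 100]))
print("fitted cured fraction ~", round(popt[0], 2))
```

With longer follow-up the cured fraction and the slow half-life become easier to separate; with short follow-up they trade off against each other, which is worth checking via the parameter covariance.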

  5. Hypertonic Saline in Conjunction with High-Dose Furosemide Improves Dose-Response Curves in Worsening Refractory Congestive Heart Failure.

    Science.gov (United States)

    Paterna, Salvatore; Di Gaudio, Francesca; La Rocca, Vincenzo; Balistreri, Fabio; Greco, Massimiliano; Torres, Daniele; Lupo, Umberto; Rizzo, Giuseppina; di Pasquale, Pietro; Indelicato, Sergio; Cuttitta, Francesco; Butler, Javed; Parrinello, Gaspare

    2015-10-01

    Diuretic responsiveness in patients with chronic heart failure (CHF) is better assessed by urine production per unit diuretic dose than by the absolute urine output or diuretic dose. Diuretic resistance arises over time when the plateau rate of sodium and water excretion is reached prior to optimal fluid elimination and may be overcome when hypertonic saline solution (HSS) is added to high doses of furosemide. Forty-two consecutively hospitalized patients with refractory CHF were randomized in a 1:1:1 ratio to furosemide doses (125 mg, 250 mg, 500 mg) so that all patients received intravenous furosemide diluted in 150 ml of normal saline (0.9%) in the first step (0-24 h) and the same furosemide dose diluted in 150 ml of HSS (1.4%) in the next step (24-48 h) as to obtain 3 groups as follows: Fourteen patients receiving 125 mg (group 1), fourteen patients receiving 250 mg (group 2), and fourteen patients receiving 500 mg (group 3) of furosemide. Urine samples of all patients were collected at 30, 60, and 90 min, and 3, 4, 5, 6, 8, and 24 h after infusion. Diuresis, sodium excretion, osmolality, and furosemide concentration were evaluated for each urine sample. After randomization, 40 patients completed the study. Two patients, one in group 2 and one in group 3 dropped out. Patients in group 1 (125 mg furosemide) had a mean age of 77 ± 17 years, 43% were male, 6 (43%) had heart failure with a preserved ejection fraction (HFpEF), and 64% were in New York Heart Association (NYHA) class IV; the mean age of patients in group 2 (250 mg furosemide) was 80 ± 8.1 years, 15% were male, 5 (38%) had HFpEF, and 84% were in NYHA class IV; and the mean age of patients in group 3 (500 mg furosemide) was 73 ± 12 years, 54% were male, 6 (46%) had HFpEF, and 69% were in NYHA class IV. HSS added to furosemide increased total urine output, sodium excretion, urinary osmolality, and furosemide urine delivery in all patients and at all time points. The percentage increase was 18,14, and

  6. Reliability-based fatigue life estimation of shear riveted connections considering dependency of rivet hole failures

    Directory of Open Access Journals (Sweden)

    Leonetti, Davide

    2018-01-01

    Standards and guidelines for the fatigue design of riveted connections make use of a stress range-endurance (S-N) curve based on the net section stress range, regardless of the number and the position of the rivets. Almost all tests on which S-N curves are based are performed with a minimum number of rivets. However, the number of rivets in a row is expected to increase the fail-safe behaviour of the connection, whereas the number of rows is supposed to decrease the theoretical stress concentration at the critical locations, and hence these aspects are not considered in the S-N curves. This paper presents a numerical model predicting the fatigue life of riveted connections by performing a system reliability analysis on a double cover plated riveted butt joint. The connection is considered in three geometries, with different numbers of rivets in a row and different numbers of rows. The stress state in the connection is evaluated using a finite element model in which the friction coefficient and the clamping force in the rivets are considered in a deterministic manner. The probability of failure is evaluated for the main plate, and fatigue failure is assumed to originate at the sides of the rivet holes, the critical locations, or hot-spots. The notch stress approach is applied to assess the fatigue life, considered to be a stochastic quantity. Unlike other system reliability models available in the literature, the evaluation of the probability of failure takes into account the stochastic dependence between the failures at each critical location modelled as a parallel system, which means considering the change of the state of stress in the connection when a ligament between two rivets fails. A sensitivity study is performed to evaluate the effect of the pretension in the rivet and the friction coefficient on the fatigue life.

  7. Reliability Analysis of Fatigue Failure of Cast Components for Wind Turbines

    Directory of Open Access Journals (Sweden)

    Hesam Mirzaei Rafsanjani

    2015-04-01

    Fatigue failure is one of the main failure modes for wind turbine drivetrain components made of cast iron. The wind turbine drivetrain consists of a variety of heavily loaded components, like the main shaft, the main bearings, the gearbox and the generator. The failure of each component will lead to substantial economic losses such as cost of lost energy production and cost of repairs. During the design lifetime, the drivetrain components are exposed to variable loads from winds and waves and other sources of loads that are uncertain and have to be modeled as stochastic variables. The types of loads are different for offshore and onshore wind turbines. Moreover, uncertainties about the fatigue strength play an important role in modeling and assessment of the reliability of the components. In this paper, a generic stochastic model for fatigue failure of cast iron components based on fatigue test data and a limit state equation for fatigue failure based on the SN-curve approach and Miner's rule is presented. The statistical analysis of the fatigue data is performed using the Maximum Likelihood Method, which also gives an estimate of the statistical uncertainties. Finally, illustrative examples are presented with reliability analyses depending on various stochastic models and partial safety factors.
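
The deterministic core of the SN-curve/Miner's-rule limit state can be sketched in a few lines: with N = K·s^(−m), the accumulated damage is D = Σ n_i/N_i, and failure is predicted when D reaches 1. The SN-curve constants and the load spectrum below are invented, not the paper's fatigue test data:

```python
import numpy as np

def miner_damage(stress_ranges, cycles, K, m):
    """Accumulated linear damage D = sum(n_i / N_i) with N_i = K * s_i**(-m)."""
    N = K * np.asarray(stress_ranges, dtype=float) ** (-m)
    return float(np.sum(np.asarray(cycles, dtype=float) / N))

# Hypothetical SN-curve for a cast iron detail: N = K * s^(-m)
K, m = 1.0e12, 3.0

# Hypothetical annual load spectrum: (stress range in MPa, cycles per year)
spectrum = [(80.0, 2.0e5), (50.0, 1.0e6), (30.0, 5.0e6)]
s, n = zip(*spectrum)

D_year = miner_damage(s, n, K, m)
print(f"damage per year = {D_year:.3f}, deterministic life ~ {1.0 / D_year:.1f} years")
```

In the reliability formulation, K (and possibly a model uncertainty on Miner's sum) become stochastic variables, and the limit state g = Δ − D is evaluated with structural reliability methods instead of deterministically.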

  8. Nonorganic Failure to Thrive: Developmental Outcomes and Psychosocial Assessment and Intervention Issues.

    Science.gov (United States)

    Heffer, Robert W.; Kelley, Mary L.

    1994-01-01

    This review describes Nonorganic Failure to Thrive, presents developmental outcomes, and discusses psychosocial assessment and intervention issues relevant to this developmental disability of early childhood, focusing on child-specific variables, situational and family variables, parent-child interaction variables, and biopsychosocial formulation…

  9. The results of questionnaire on quantitative assessment of 123I-metaiodobenzylguanidine myocardial scintigraphy in heart failure

    International Nuclear Information System (INIS)

    Nishimura, Tsunehiko; Sugishita, Yasurou; Sasaki, Yasuhito.

    1997-01-01

    This study was done by a working group under the cooperation between the Japanese Society of Nuclear Medicine and the Japanese Circulation Society. We evaluated the usefulness of quantitative assessment of 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy in heart failure from the results of a questionnaire. Forty-nine (72.1%) of 68 selected institutions participated in this study. The incidence of MIBG myocardial scintigraphy used in heart failure was 41.1%. The imaging protocol mostly comprised both planar and SPECT imaging at 15 min and 3.6 hr after intravenous injection of 111 MBq of MIBG. The quantitative assessment was mostly done by heart/mediastinum (H/M) ratio and washout rate analysis based on planar imaging. The mean normal values of the H/M ratio were 2.34±0.36 and 2.49±0.40 on early and delayed images, respectively. The normal value of the washout rate was 27.74±5.34%. In heart failure, the H/M ratios were 1.87±0.27 and 1.75±0.24 on early and delayed images, respectively, and the washout rate was 42.30±6.75%. These parameters were very useful for the evaluation of heart failure. In conclusion, MIBG myocardial scintigraphy was widely used not only for early detection and severity assessment, but also for therapy indication and prognosis evaluation in heart failure patients. (author)
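    The two quantitative indices in this record can be sketched as follows. Definitions of the washout rate vary between institutions (with or without background subtraction and 123I decay correction), so this is only one common form:

```python
import math

I123_HALF_LIFE_H = 13.2   # physical half-life of 123-I, in hours

def hm_ratio(heart_counts_per_pixel, mediastinum_counts_per_pixel):
    """Heart/mediastinum (H/M) ratio from mean ROI counts on a planar image."""
    return heart_counts_per_pixel / mediastinum_counts_per_pixel

def washout_rate(heart_early, heart_delayed, hours_between=3.0,
                 decay_correct=False):
    """Percentage myocardial washout between early and delayed images.
    With decay_correct=True the delayed counts are first corrected for
    physical decay of 123-I over the interval."""
    if decay_correct:
        heart_delayed *= 2.0 ** (hours_between / I123_HALF_LIFE_H)
    return (heart_early - heart_delayed) / heart_early * 100.0
```

Decay correction lowers the apparent washout, since part of the count loss between images is physical decay rather than tracer clearance.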

  10. An investigation on vulnerability assessment of steel structures with thin steel shear wall through development of fragility curves

    Directory of Open Access Journals (Sweden)

    Mohsen Gerami

    2017-02-01

    Full Text Available Fragility curves play an important role in the damage assessment of buildings. The probability of damage to a structure under seismic events can be investigated once these curves have been generated. In the current research, 360 time-history analyses were carried out on structures of 3, 10 and 20 stories, and fragility curves were subsequently developed. The curves are based on two indices: inter-story drift and the axial strain of the equivalent strips of the shear wall. The time-history analyses were carried out in Perform-3D using 10 far-field and 10 near-field seismograms. Analysis of the low-rise structures revealed that they are more vulnerable at accelerations below 0.8 g in near-field earthquakes because of higher-mode effects. The generated fragility curves show that the mid- and high-rise structures have more acceptable performance and lower damage levels than the low-rise structures under both near- and far-field seismic hazards.
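    The lognormal form conventionally used for such fragility curves, together with a simple moment fit from the intensities at which the analyses reached a damage state, can be sketched as follows. The functional form is the standard one from the fragility literature; the record does not state which fitting method the authors used:

```python
import math

def fragility(im, theta, beta):
    """P(damage state reached | intensity measure im) for a lognormal
    fragility curve: median capacity theta, logarithmic dispersion beta."""
    return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))

def fit_fragility(observed_capacities):
    """Method-of-moments fit of (theta, beta) from the intensity levels at
    which each time-history analysis first reached the damage state."""
    logs = [math.log(c) for c in observed_capacities]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / (len(logs) - 1)
    return math.exp(mu), math.sqrt(var)
```

At the median capacity the curve passes through 0.5 by construction, and a larger dispersion flattens it, spreading the damage probability over a wider range of accelerations.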

  11. Modeling of Triangular Lattice Space Structures with Curved Battens

    Science.gov (United States)

    Chen, Tzikang; Wang, John T.

    2005-01-01

    Techniques for simulating an assembly process of lattice structures with curved battens were developed. The shape of the curved battens, the tension in the diagonals, and the compression in the battens were predicted for the assembled model. To be able to perform the assembly simulation, a cable-pulley element was implemented, and geometrically nonlinear finite element analyses were performed. Three types of finite element models were created from assembled lattice structures for studying the effects of design and modeling variations on the load carrying capability. Discrepancies in the predictions from these models were discussed. The effects of diagonal constraint failure were also studied.

  12. Nanowire failure: long = brittle and short = ductile.

    Science.gov (United States)

    Wu, Zhaoxuan; Zhang, Yong-Wei; Jhon, Mark H; Gao, Huajian; Srolovitz, David J

    2012-02-08

    Experimental studies of the tensile behavior of metallic nanowires show a wide range of failure modes, ranging from ductile necking to brittle/localized shear failure, often in wires of the same diameter. We performed large-scale molecular dynamics simulations of copper nanowires over a range of nanowire lengths and provide unequivocal evidence for a transition in nanowire failure mode with change in nanowire length. Short nanowires fail via a ductile mode with serrated stress-strain curves, while long wires exhibit extreme shear localization and abrupt failure. We developed a simple model for predicting the critical nanowire length for this failure mode transition and showed that it is in excellent agreement with both the simulation results and the extant experimental data. The present results provide a new paradigm for the design of nanoscale mechanical systems that demarcates graceful and catastrophic failure. © 2012 American Chemical Society

  13. Precision-Recall-Gain Curves: PR Analysis Done Right

    OpenAIRE

    Flach, Peter; Kull, Meelis

    2015-01-01

    Precision-Recall analysis abounds in applications of binary classification where true negatives do not add value and hence should not affect assessment of the classifier's performance. Perhaps inspired by the many advantages of receiver operating characteristic (ROC) curves and the area under such curves for accuracy-based performance assessment, many researchers have taken to report Precision-Recall (PR) curves and associated areas as performance metric. We demonstrate in this paper that thi...
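    The core transformation of the paper maps precision and recall into "gains" relative to the always-positive baseline classifier, whose precision equals the positive prevalence π and whose recall is 1. A sketch following the definitions in the paper:

```python
def precision_gain(precision, pi):
    """Precision gain relative to the baseline that predicts everything
    positive (that baseline has precision = prevalence pi)."""
    return (precision - pi) / ((1.0 - pi) * precision)

def recall_gain(recall, pi):
    """Recall gain relative to the same baseline (which has recall = 1)."""
    return (recall - pi) / ((1.0 - pi) * recall)
```

In this space the always-positive classifier lands at (recall gain 1, precision gain 0) and a perfect classifier at (1, 1), so linear interpolation between operating points and areas under the curve become meaningful, unlike in raw PR space.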

  14. Non-probabilistic defect assessment for structures with cracks based on interval model

    International Nuclear Information System (INIS)

    Dai, Qiao; Zhou, Changyu; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-01-01

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on a clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced into the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) varies within a certain range, which is defined as the IFAC, and the assessment point varies within a rectangular zone, which is defined as the assessment rectangle. Based on the interval model, the establishment of the IFAC and the determination of the assessment rectangle are presented. Then, according to the interval possibility degree method, the non-probabilistic reliability degree of the IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example of the assessment of a pipe with a crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve practical problems with interval variables.

  15. Non-probabilistic defect assessment for structures with cracks based on interval model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiao; Zhou, Changyu, E-mail: changyu_zhou@163.com; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-09-15

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on a clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced into the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) varies within a certain range, which is defined as the IFAC, and the assessment point varies within a rectangular zone, which is defined as the assessment rectangle. Based on the interval model, the establishment of the IFAC and the determination of the assessment rectangle are presented. Then, according to the interval possibility degree method, the non-probabilistic reliability degree of the IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example of the assessment of a pipe with a crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve practical problems with interval variables.
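    The assessment-rectangle logic described in these two records can be sketched as follows. The R6 Option 1 curve is used here only as a stand-in for the underlying failure assessment curve (and its Lr cut-off is omitted for brevity); the paper's own FAC and interval arithmetic are more general:

```python
import math

def option1_fad(lr):
    """R6 Option 1 failure assessment curve, assumed here as the FAC."""
    return (1.0 + 0.5 * lr ** 2) ** -0.5 * (0.3 + 0.7 * math.exp(-0.65 * lr ** 6))

def assess_rectangle(kr_lo, kr_hi, lr_lo, lr_hi):
    """Interval assessment: (Lr, Kr) is only known to lie in a rectangle.
    Since the FAC decreases monotonically, checking the two extreme corners
    suffices: safe if even the worst (upper-right) corner is below the curve,
    unsafe if even the best (lower-left) corner is above it."""
    if kr_hi <= option1_fad(lr_hi):
        return "safe"
    if kr_lo > option1_fad(lr_lo):
        return "unsafe"
    return "undetermined"
```

The "undetermined" band is where the papers' non-probabilistic reliability degree quantifies how much of the rectangle lies on the safe side of the IFAC.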

  16. [Implantable sensors for outpatient assessment of ventricular filling pressure in advanced heart failure: Which telemonitoring design is optimal?]

    Science.gov (United States)

    Herrmann, E; Fichtlscherer, S; Hohnloser, S H; Zeiher, A M; Aßmus, B

    2016-12-01

    Patients with advanced heart failure suffer frequent hospitalizations. Non-invasive hemodynamic telemonitoring for assessment of ventricular filling pressure has been shown to reduce hospitalizations. We report on the right ventricular pressure (RVP), pulmonary artery pressure (PAP) and left atrial pressure (LAP) sensors for non-invasive assessment of ventricular filling pressure. A literature search was performed on the available implantable pressure sensors for non-invasive hemodynamic telemonitoring in patients with advanced heart failure. To date, only implantation of the PAP sensor has been shown to reduce hospitalizations for cardiac decompensation and to improve quality of life. The right ventricular pressure sensor missed the primary endpoint of a significant reduction in hospitalizations; clinical data on the left atrial pressure sensor are still pending. Implantation of a pressure sensor for assessment of pulmonary artery filling pressure is suitable for reducing hospitalizations for heart failure and for improving quality of life in patients with advanced heart failure.

  17. [Assessment of medical management of heart failure at National Hospital Blaise COMPAORE].

    Science.gov (United States)

    Kambiré, Y; Konaté, L; Diallo, I; Millogo, G R C; Kologo, K J; Tougouma, J B; Samadoulougou, A K; Zabsonré, P

    2018-05-09

    The aim of this study was to assess the quality of medical management of heart failure at the National Hospital Blaise Compaoré against international guidelines. A retrospective study was performed including consecutive patients admitted for echocardiographically documented heart failure from October 2012 to March 2015 in the Medicine and Medical Specialties Department of the National Hospital Blaise Compaoré, with a minimum follow-up of six weeks. Data analysis was performed with SPSS 20.0. Eighty-four patients with a mean age of 57.61±18.24 years were included. Acute heart failure was present in 84.5% of patients, with impaired left ventricular systolic function in 77.4%. The prescription rates of the main drugs in heart failure of any type were 88.1% for loop diuretics, 77.1% for angiotensin-converting enzyme inhibitors/angiotensin receptor blockers and 65.5% for beta-blockers. Among patients with systolic dysfunction, 84.62% received angiotensin-converting enzyme inhibitors/angiotensin receptor blockers and 75.38% beta-blockers. Exercise rehabilitation was under way in 10.7% of patients. The death rate was 16.7% and the hospital readmission rate 16.7%. The prescription rate of the major heart failure drugs is satisfactory. Cardiac rehabilitation should be developed. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  18. Probabilistic evaluation of design S-N curve and reliability assessment of ASME code-based evaluation

    International Nuclear Information System (INIS)

    Zhao Yongxiang

    1999-01-01

    A probabilistic approach for evaluating the design S-N curve and for assessing the reliability of the ASME code-based evaluation is presented on the basis of Langer S-N model-based P-S-N curves. The P-S-N curves are estimated by a so-called general maximum likelihood method. This method can deal with virtual stress amplitude-crack initiation life data, which have the character of double random variables. Investigation of a set of virtual stress amplitude-crack initiation life (S-N) data for a 1Cr18Ni9Ti austenitic stainless steel welded joint reveals that the P-S-N curves give a good prediction of the scatter of the S-N data. The probabilistic evaluation of the design S-N curve with 0.9999 survival probability considers various uncertainties, besides the scatter of the S-N data, to an appropriate extent. The ASME code-based evaluation with a reduction factor of 20 on mean life is much more conservative than that with a reduction factor of 2 on stress amplitude. The latter evaluation, at a virtual stress amplitude of 666.61 MPa, is equivalent to a survival probability of 0.999522, and at a virtual stress amplitude of 2092.18 MPa to a survival probability of 0.9999999995. This means that the evaluation may be non-conservative at low loading levels and, in contrast, too conservative at high loading levels. The cause is that the reduction factors are constants and therefore cannot capture the general observation that the scatter of the life data increases as the loading level decreases. This indicates that it is necessary to apply the probabilistic approach to the evaluation of the design S-N curve.
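    The idea of a design S-N curve pinned to a survival probability can be sketched with a simplified stand-in: an ordinary log-log (Basquin) least-squares fit with normal scatter in log-life, rather than the paper's Langer-model maximum-likelihood estimation for double random variables:

```python
import math
from statistics import NormalDist

def design_sn_curve(stress_data, life_data, survival=0.9999):
    """Fit log10(N) = a + b*log10(S) by least squares, estimate the residual
    scatter of log-life, and return a function S -> design life at the given
    survival probability (a quantile offset below the mean curve)."""
    xs = [math.log10(s) for s in stress_data]
    ys = [math.log10(n) for n in life_data]
    k = len(xs)
    xbar, ybar = sum(xs) / k, sum(ys) / k
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    scatter = math.sqrt(sum(r * r for r in resid) / max(k - 2, 1))
    z = NormalDist().inv_cdf(1.0 - survival)   # negative for survival > 0.5
    return lambda s: 10.0 ** (a + b * math.log10(s) + z * scatter)
```

Because the quantile offset here is a constant in log-life, this sketch shares the limitation the abstract criticizes in constant reduction factors; the paper's point is precisely that the scatter grows as the loading level drops.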

  19. Sequential decision reliability concept and failure rate assessment

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1990-11-01

    Conventionally, reliability is considered both for each basic unit and for its integration in a complicated large-scale system such as a nuclear power plant (NPP). Since the plant's operational status is determined from information obtained by various sensors, the plant's reliability and risk assessment are closely related to the reliability of the sensory information and hence of the sensor components. However, considering the relevant information-processing systems, e.g. fault-detection processors, a further question arises about the reliability of such systems, specifically the reliability of their decision-based outcomes, by means of which further actions are performed. To this end, a general sequential decision reliability concept and a failure rate assessment methodology are introduced. The implications of the methodology are investigated, and the importance of the decision reliability concept in system operation is demonstrated by means of real-time sensory signals from the Borssele NPP in the Netherlands. (author). 21 refs.; 8 figs

  20. Assessment of heart rate, acidosis, consciousness, oxygenation, and respiratory rate to predict noninvasive ventilation failure in hypoxemic patients.

    Science.gov (United States)

    Duan, Jun; Han, Xiaoli; Bai, Linfu; Zhou, Lintong; Huang, Shicong

    2017-02-01

    To develop and validate a scale using variables easily obtained at the bedside for prediction of failure of noninvasive ventilation (NIV) in hypoxemic patients. The test cohort comprised 449 patients with hypoxemia who were receiving NIV. This cohort was used to develop a scale that considers heart rate, acidosis, consciousness, oxygenation, and respiratory rate (referred to as the HACOR scale) to predict NIV failure, defined as the need for intubation after NIV intervention. The highest possible score was 25 points. To validate the scale, a separate group of 358 hypoxemic patients was enrolled in the validation cohort. The failure rate of NIV was 47.8% and 39.4% in the test and validation cohorts, respectively. In the test cohort, patients with NIV failure had higher HACOR scores at initiation and after 1, 12, 24, and 48 h of NIV than those with successful NIV. At 1 h of NIV the area under the receiver operating characteristic curve was 0.88, showing good predictive power for NIV failure. Using 5 points as the cutoff value, the sensitivity, specificity, positive predictive value, negative predictive value, and diagnostic accuracy for NIV failure were 72.6%, 90.2%, 87.2%, 78.1%, and 81.8%, respectively. These results were confirmed in the validation cohort. Moreover, the diagnostic accuracy for NIV failure exceeded 80% in subgroups classified by diagnosis, age, or disease severity, and also at 1, 12, 24, and 48 h of NIV. Among patients with NIV failure and a HACOR score of >5 at 1 h of NIV, hospital mortality was lower in those intubated at ≤12 h of NIV than in those intubated later [58/88 (66%) vs. 138/175 (79%); p = 0.03]. The HACOR scale variables are easily obtained at the bedside. The scale appears to be an effective way of predicting NIV failure in hypoxemic patients. Early intubation in high-risk patients may reduce hospital mortality.
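    The structure of such a scale — banded points per vital sign, summed and compared with the 5-point cutoff quoted in the abstract — can be sketched generically. The banding table below is purely illustrative and invented for this sketch; the actual HACOR point assignments must be taken from the published paper:

```python
INF = float("inf")

# ILLUSTRATIVE bands only, NOT the published HACOR table.  Each entry is a
# list of (upper_bound, points); the first band with value <= upper_bound wins.
EXAMPLE_TABLE = {
    "heart_rate": [(120, 0), (INF, 1)],                        # beats/min
    "ph":         [(7.24, 4), (7.29, 3), (7.34, 2), (INF, 0)],
    "gcs":        [(10, 10), (12, 5), (14, 2), (INF, 0)],      # consciousness
    "pf_ratio":   [(150, 6), (175, 4), (200, 2), (INF, 0)],    # PaO2/FiO2
    "resp_rate":  [(30, 0), (35, 1), (40, 2), (45, 3), (INF, 4)],
}

def band_points(value, bands):
    for upper, points in bands:
        if value <= upper:
            return points
    raise ValueError("bands must end with an open upper bound")

def hacor_score(vitals, table=EXAMPLE_TABLE):
    """Sum the banded points over the five HACOR variables."""
    return sum(band_points(vitals[name], bands) for name, bands in table.items())

def predicts_niv_failure(score, cutoff=5):
    """The abstract reports a cutoff of 5 points at 1 h of NIV."""
    return score > cutoff
```

A table-driven scorer like this makes the bedside rule trivially auditable: every point in the total can be traced back to one vital sign and one band.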

  1. The results of questionnaire on quantitative assessment of 123I-metaiodobenzylguanidine myocardial scintigraphy in heart failure

    Energy Technology Data Exchange (ETDEWEB)

    Nishimura, Tsunehiko [Osaka Univ., Suita (Japan). Medical school; Sugishita, Yasurou; Sasaki, Yasuhito

    1997-12-01

    This study was done by a working group under the cooperation between the Japanese Society of Nuclear Medicine and the Japanese Circulation Society. We evaluated the usefulness of quantitative assessment of 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy in heart failure from the results of a questionnaire. Forty-nine (72.1%) of 68 selected institutions participated in this study. The incidence of MIBG myocardial scintigraphy used in heart failure was 41.1%. The imaging protocol mostly comprised both planar and SPECT imaging at 15 min and 3.6 hr after intravenous injection of 111 MBq of MIBG. The quantitative assessment was mostly done by heart/mediastinum (H/M) ratio and washout rate analysis based on planar imaging. The mean normal values of the H/M ratio were 2.34±0.36 and 2.49±0.40 on early and delayed images, respectively. The normal value of the washout rate was 27.74±5.34%. In heart failure, the H/M ratios were 1.87±0.27 and 1.75±0.24 on early and delayed images, respectively, and the washout rate was 42.30±6.75%. These parameters were very useful for the evaluation of heart failure. In conclusion, MIBG myocardial scintigraphy was widely used not only for early detection and severity assessment, but also for therapy indication and prognosis evaluation in heart failure patients. (author)

  2. Assessment of current structural design methodology for high-temperature reactors based on failure tests

    International Nuclear Information System (INIS)

    Corum, J.M.; Sartory, W.K.

    1985-01-01

    A mature design methodology, consisting of inelastic analysis methods provided in Department of Energy guidelines and failure criteria contained in ASME Code Case N-47, exists in the United States for high-temperature reactor components. The objective of this paper is to assess the adequacy of this overall methodology by comparing predicted inelastic deformations and lifetimes with observed results from structural failure tests and from an actual service failure. Comparisons are presented for three types of structural situations: (1) nozzle-to-spherical-shell specimens, where stresses at structural discontinuities lead to cracking, (2) welded structures, where metallurgical discontinuities play a key role in failures, and (3) thermal shock loadings of cylinders and pipes, where thermal discontinuities can lead to failure. The comparisons between predicted and measured inelastic responses are generally reasonably good; quantities are sometimes somewhat overpredicted and sometimes underpredicted. However, even seemingly small discrepancies can have a significant effect on structural life, and lifetimes are not always closely predicted. For a few cases, the lifetimes are substantially overpredicted, which raises questions regarding the adequacy of existing design margins.

  3. Disease assessment and prognosis of liver failure

    Directory of Open Access Journals (Sweden)

    ZHANG Jing

    2016-09-01

    Full Text Available Liver failure has a high fatality rate and greatly threatens human health. Liver transplantation can effectively reduce the fatality rate. However, problems such as donor shortage and allograft rejection limit the wide application of liver transplantation. An accurate early assessment helps to evaluate the patient's condition and optimize therapeutic strategies. At present, commonly used systems for prognostic evaluation include the King's College Hospital criteria, MELD, integrated MELD, Child-Pugh score, CLIF-SOFA, CLIF-C ACLFs, and D-MELD, and each system has its own advantages and disadvantages. Among these systems, the MELD scoring system is the most commonly used, and the D-MELD scoring system is the most innovative, as it can be used for patients on the waiting list for liver transplantation. This article elaborates on the characteristics and predictive value of each scoring system in clinical practice.
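    Of the scoring systems listed, MELD is simple enough to compute directly. A sketch of the classic (pre-sodium) MELD with the usual clamping conventions; D-MELD, also mentioned in the record, is then the donor's age multiplied by the recipient's MELD:

```python
import math

def meld_score(bilirubin_mg_dl, inr, creatinine_mg_dl, on_dialysis=False):
    """Classic MELD.  Lab values below 1.0 are floored at 1.0, creatinine is
    capped at 4.0 (or set to 4.0 with recent dialysis), following the usual
    allocation conventions; result is rounded to a whole number."""
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    cr = 4.0 if on_dialysis else min(max(creatinine_mg_dl, 1.0), 4.0)
    score = (3.78 * math.log(bili) + 11.2 * math.log(inr)
             + 9.57 * math.log(cr) + 6.43)
    return round(score)
```

With all labs clamped to 1.0 the logarithms vanish and the score bottoms out near 6, which is why 6 is the conventional floor of the scale.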

  4. Generic component failure data base

    International Nuclear Information System (INIS)

    Eide, S.A.; Calley, M.B.

    1992-01-01

    This report discusses a comprehensive generic component failure database that has been developed for light water reactor probabilistic risk assessments. The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) was used to generate component failure rates. Using this approach, most of the failure rates are based on actual plant data rather than existing estimates.

  5. Dynamic computed tomography (CT) in the rat kidney and application to acute renal failure models

    International Nuclear Information System (INIS)

    Ishikawa, Isao; Saito, Tadashi; Ishii, Hirofumi; Bansho, Junichi; Koyama, Yukinori; Tobita, Akira

    1995-01-01

    Renal dynamic CT scanning is suitable for determining the excretion of contrast medium in the cortex and medulla of the kidney, which is valuable for understanding the pathogenesis of disease processes in various conditions. This form of scanning would be convenient to use if a method of application to the rat kidney were available. Therefore, we developed a method for applying renal dynamic CT to rats and evaluated the cortical and medullary curves, e.g., the corticomedullary junction time, which is correlated with creatinine clearance, in various rat models of acute renal failure. The rat was placed in a 10° oblique position and a bilateral hilar slice was obtained before and 5, 10, 15, 20, 25, 30, 40, 50, 60, 80, 100, 120, 140, 160 and 180 sec after administering 0.5 ml of contrast medium using a Somatom DR. The slice width was 4 mm and the scan time was 3 sec. The corticomedullary junction time in normal rats was 23.0±10.5 sec, the peak value of the cortical curve was 286.3±76.7 Hounsfield units (HU), and the peak value of the medullary curve was 390.1±66.2 HU. The corticomedullary junction time after exposure of the kidney was prolonged compared with that of the unexposed kidney. In rats with acute renal failure, the excretion pattern of contrast medium was similar in the glycerol- and HgCl2-induced acute renal failure models. The peak values of the cortical curve were maintained three hours after a clamp had been placed at the hilar region of the kidney for one hour, and the peak values of the medullary curve were maintained during the administration of 10 μg/kg/min of angiotensin II. The dynamic CT curves in the acute renal failure models examined differed slightly from those in human acute renal failure. These results suggest that rats do not provide an ideal model for human acute renal failure. However, the application of dynamic CT to the rat kidney models was valuable for estimating the pathogenesis of various human kidney diseases. (author)
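    The corticomedullary junction time — the moment the medullary time-attenuation curve crosses above the cortical one — can be located by linear interpolation between the discrete scan times. A sketch, assuming sampled HU curves like those described in the record:

```python
def corticomedullary_junction_time(times, cortex_hu, medulla_hu):
    """First time at which the medullary enhancement curve crosses above the
    cortical one, found by linear interpolation between adjacent samples.
    Returns None if no crossing occurs within the scan."""
    for i in range(1, len(times)):
        d_prev = cortex_hu[i - 1] - medulla_hu[i - 1]
        d_curr = cortex_hu[i] - medulla_hu[i]
        if d_prev > 0 and d_curr <= 0:
            frac = d_prev / (d_prev - d_curr)   # where the difference hits zero
            return times[i - 1] + frac * (times[i] - times[i - 1])
    return None
```

With the 3-sec scans spaced 5 sec apart early in the protocol, interpolation gives a sub-interval estimate of the junction time rather than snapping to the nearest scan.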

  6. Chronic renal failure and sexual functioning: clinical status versus objectively assessed sexual response

    NARCIS (Netherlands)

    Toorians, A. W.; Janssen, E.; Laan, E.; Gooren, L. J.; Giltay, E. J.; Oe, P. L.; Donker, A. J.; Everaerd, W.

    1997-01-01

    BACKGROUND: Sexual dysfunctions are common among patients with chronic renal failure. The prevalence was assessed in a population of 281 patients (20-60 years), and it was attempted to determine whether their mode of treatment (haemodialysis, peritoneal dialysis, or kidney transplantation), or

  7. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program

  8. Defect assessments of pipelines based on the FAD approach incorporating constraint effects

    Energy Technology Data Exchange (ETDEWEB)

    Ruggieri, Claudio; Cravero, Sebastian [Sao Paulo Univ., SP (Brazil)

    2005-07-01

    This work presents a framework for including constraint effects in the failure assessment diagram (FAD) approach. The procedure builds upon the constraint-based Q methodology to correct toughness values measured on low-constraint fracture specimens, which modifies the shape of the FAD curve. The approach is applied to predict the failure (burst pressure) of high-pressure pipelines with planar defects of different geometries (i.e., crack depth and crack length). The FAD curves are corrected for constraint effects based on the Lr-Q trajectories for pin-loaded SE(T) specimens. The article shows that including constraint effects in the FAD approach provides better agreement between experimentally measured and predicted burst pressures for high-pressure pipelines with planar defects. (author)
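    A sketch of how constraint can enter such an assessment: the measured toughness is inflated for negative Q (lost crack-tip constraint) using an Ainsworth/O'Dowd-type correction, which is equivalent to raising the FAD curve. The α and k values are material fitting constants and purely illustrative here, and the R6 Option 1 curve stands in for the paper's FAC:

```python
import math

def fad_option1(lr):
    """R6 Option 1 failure assessment curve (the Lr cut-off is omitted
    for brevity)."""
    return (1.0 + 0.5 * lr ** 2) ** -0.5 * (0.3 + 0.7 * math.exp(-0.65 * lr ** 6))

def constraint_corrected_toughness(k_mat, q, alpha=1.0, k_exp=1.0):
    """Ainsworth/O'Dowd-type correction: effective toughness rises as
    crack-tip constraint is lost (Q < 0).  alpha, k_exp are illustrative."""
    return k_mat if q >= 0.0 else k_mat * (1.0 + alpha * (-q) ** k_exp)

def acceptable(k_applied, lr, k_mat, q=0.0):
    """Assessment point is acceptable if Kr = K_applied / Kmat_c lies on or
    below the failure assessment curve."""
    kr = k_applied / constraint_corrected_toughness(k_mat, q)
    return kr <= fad_option1(lr)
```

Dividing by a larger effective toughness pulls the assessment point down in Kr, which is the same effect as the constraint-modified FAD curves the paper derives from the Lr-Q trajectories.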

  9. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.

    1999-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound master curve corresponds to the KIR reference curve. (orig.)
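    The probability-indexed lower-bound curves discussed in these records follow directly from the master-curve Weibull model. A sketch for the standard 25 mm reference thickness (temperatures in °C, toughness in MPa·√m), using the usual shape parameter of 4 and threshold of 20:

```python
import math

def kjc_at_probability(temp_c, t0_c, p):
    """Master-curve fracture toughness at cumulative failure probability p:
    a three-parameter Weibull with shape 4 and Kmin = 20 MPa*sqrt(m), whose
    median follows 30 + 70*exp(0.019*(T - T0))."""
    k_med = 30.0 + 70.0 * math.exp(0.019 * (temp_c - t0_c))
    k0 = 20.0 + (k_med - 20.0) / math.log(2.0) ** 0.25   # Weibull scale
    return 20.0 + (-math.log(1.0 - p)) ** 0.25 * (k0 - 20.0)
```

Evaluating this at p = 0.05 or p = 0.01 over a range of T − T0 yields the 5% and 1% lower-bound curves that the analysis matches to the KIC and KIR reference curves, respectively.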

  10. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.

    1998-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound Master curve corresponds to the KIR reference curve. (orig.)

  11. Stereoscopic visualization in curved spacetime: seeing deep inside a black hole

    International Nuclear Information System (INIS)

    Hamilton, Andrew J S; Polhemus, Gavin

    2010-01-01

    Stereoscopic visualization adds an additional dimension to the viewer's experience, giving them a sense of distance. In a general relativistic visualization, distance can be measured in a variety of ways. We argue that the affine distance, which matches the usual notion of distance in flat spacetime, is a natural distance to use in curved spacetime. As an example, we apply affine distance to the visualization of the interior of a black hole. Affine distance is not the distance perceived with normal binocular vision in curved spacetime. However, the failure of binocular vision is simply a limitation of animals that have evolved in flat spacetime, not a fundamental obstacle to depth perception in curved spacetime. Trinocular vision would provide superior depth perception.

  12. Compressive Failure Mechanisms in Layered Materials

    DEFF Research Database (Denmark)

    Sørensen, Kim Dalsten

    Two important failure modes in fiber-reinforced composite materials including layers and laminates occur under loading conditions dominated by compression in the layer direction. These two distinctly different failure modes are (1) buckling-driven delamination and (2) failure by strain localization...... or on cylindrical substrates modeling the delamination as an interface fracture mechanical problem. Here attention is directed towards double-curved substrates, which introduces a new non-dimensional combination of geometric parameters. It is shown for a wide range of parameters that by choosing the two....... This has some impact on the convergence rate for decreasing mesh size in the load vs. end shortening response for a rectangular block of material. Especially in the immediate post-critical range the convergence rate may be slow. The capabilities of the model to deal with more complicated structural...

  13. Parathyroid hormone secretion in chronic renal failure

    DEFF Research Database (Denmark)

    Madsen, J C; Rasmussen, A Q; Ladefoged, S D

    1996-01-01

    The aim of the study was to introduce and evaluate a method for quantifying the parathyroid hormone (PTH) secretion during hemodialysis in secondary hyperparathyroidism due to end-stage renal failure. We developed a method suitable for inducing sequential hypocalcemia and hypercalcemia during....../ionized calcium curves were constructed, and a mean calcium set-point of 1.16 mmol/liter was estimated compared to the normal mean of about 1.13 mmol/liter. In conclusion, we demonstrate that it is important to use a standardized method to evaluate parathyroid hormone dynamics in chronic renal failure. By the use...

  14. Selected component failure rate values from fusion safety assessment tasks

    Energy Technology Data Exchange (ETDEWEB)

    Cadwallader, L.C.

    1998-09-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers.

  15. Selected Component Failure Rate Values from Fusion Safety Assessment Tasks

    Energy Technology Data Exchange (ETDEWEB)

    Cadwallader, Lee Charles

    1998-09-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers.

  16. Selected component failure rate values from fusion safety assessment tasks

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1998-01-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers

  17. Association between Functional Variables and Heart Failure after Myocardial Infarction in Rats

    Energy Technology Data Exchange (ETDEWEB)

    Polegato, Bertha F.; Minicucci, Marcos F.; Azevedo, Paula S.; Gonçalves, Andréa F.; Lima, Aline F.; Martinez, Paula F.; Okoshi, Marina P.; Okoshi, Katashi; Paiva, Sergio A. R.; Zornoff, Leonardo A. M., E-mail: lzornoff@fmb.unesp.br [Faculdade de Medicina de Botucatu - Universidade Estadual Paulista ' Júlio de mesquita Filho' - UNESP Botucatu, SP (Brazil)

    2016-02-15

    Heart failure prediction after acute myocardial infarction may have important clinical implications. To analyze the functional echocardiographic variables associated with heart failure in an infarction model in rats. The animals were divided into two groups: control and infarction. Subsequently, the infarcted animals were divided into groups: with and without heart failure. The predictive values were assessed by logistic regression. The cutoff values predictive of heart failure were determined using ROC curves. Six months after surgery, 88 infarcted animals and 43 control animals were included in the study. Myocardial infarction increased left cavity diameters and the mass and wall thickness of the left ventricle. Additionally, myocardial infarction resulted in systolic and diastolic dysfunction, characterized by lower area variation fraction values, posterior wall shortening velocity, E-wave deceleration time, associated with higher values of E / A ratio and isovolumic relaxation time adjusted by heart rate. Among the infarcted animals, 54 (61%) developed heart failure. Rats with heart failure have higher left cavity mass index and diameter, associated with worsening of functional variables. The area variation fraction, the E/A ratio, E-wave deceleration time and isovolumic relaxation time adjusted by heart rate were functional variables predictors of heart failure. The cutoff values of functional variables associated with heart failure were: area variation fraction < 31.18%; E / A > 3.077; E-wave deceleration time < 42.11 and isovolumic relaxation time adjusted by heart rate < 69.08. In rats followed for 6 months after myocardial infarction, the area variation fraction, E/A ratio, E-wave deceleration time and isovolumic relaxation time adjusted by heart rate are predictors of heart failure onset.

  18. Association between Functional Variables and Heart Failure after Myocardial Infarction in Rats

    International Nuclear Information System (INIS)

    Polegato, Bertha F.; Minicucci, Marcos F.; Azevedo, Paula S.; Gonçalves, Andréa F.; Lima, Aline F.; Martinez, Paula F.; Okoshi, Marina P.; Okoshi, Katashi; Paiva, Sergio A. R.; Zornoff, Leonardo A. M.

    2016-01-01

    Heart failure prediction after acute myocardial infarction may have important clinical implications. To analyze the functional echocardiographic variables associated with heart failure in an infarction model in rats. The animals were divided into two groups: control and infarction. Subsequently, the infarcted animals were divided into groups: with and without heart failure. The predictive values were assessed by logistic regression. The cutoff values predictive of heart failure were determined using ROC curves. Six months after surgery, 88 infarcted animals and 43 control animals were included in the study. Myocardial infarction increased left cavity diameters and the mass and wall thickness of the left ventricle. Additionally, myocardial infarction resulted in systolic and diastolic dysfunction, characterized by lower area variation fraction values, posterior wall shortening velocity, E-wave deceleration time, associated with higher values of E / A ratio and isovolumic relaxation time adjusted by heart rate. Among the infarcted animals, 54 (61%) developed heart failure. Rats with heart failure have higher left cavity mass index and diameter, associated with worsening of functional variables. The area variation fraction, the E/A ratio, E-wave deceleration time and isovolumic relaxation time adjusted by heart rate were functional variables predictors of heart failure. The cutoff values of functional variables associated with heart failure were: area variation fraction < 31.18%; E / A > 3.077; E-wave deceleration time < 42.11 and isovolumic relaxation time adjusted by heart rate < 69.08. In rats followed for 6 months after myocardial infarction, the area variation fraction, E/A ratio, E-wave deceleration time and isovolumic relaxation time adjusted by heart rate are predictors of heart failure onset
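    The cutoff determination used in the two records above (ROC curves discriminating heart failure from control) is commonly done by maximising Youden's J. A minimal sketch with invented toy data (the function name, data, and the assumption that lower values indicate failure, as for the area variation fraction, are ours):

```python
def best_cutoff(values, labels):
    """Cutoff maximising Youden's J = sensitivity + specificity - 1.
    labels: 1 = heart failure, 0 = control. Lower values are assumed
    to indicate failure (as for the area variation fraction)."""
    pos = [v for v, l in zip(values, labels) if l == 1]
    neg = [v for v, l in zip(values, labels) if l == 0]
    best_c, best_j = None, -1.0
    for c in sorted(set(values)):
        sens = sum(v < c for v in pos) / len(pos)    # failures below cutoff
        spec = sum(v >= c for v in neg) / len(neg)   # controls at/above cutoff
        if sens + spec - 1.0 > best_j:
            best_c, best_j = c, sens + spec - 1.0
    return best_c, best_j

# Toy data: area variation fraction (%) for failing vs. non-failing animals
values = [20.0, 25.0, 30.0, 35.0, 40.0, 45.0]
labels = [1, 1, 1, 0, 0, 0]
cutoff, j = best_cutoff(values, labels)   # perfect separation here: J = 1 at 35.0
```

    For a variable where higher values indicate failure (e.g. the E/A ratio), the two inequalities are simply reversed.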

  19. A note on families of fragility curves

    International Nuclear Information System (INIS)

    Kaplan, S.; Bier, V.M.; Bley, D.C.

    1989-01-01

    In the quantitative assessment of seismic risk, uncertainty in the fragility of a structural component is usually expressed by putting forth a family of fragility curves, with probability serving as the parameter of the family. Commonly, a lognormal shape is used both for the individual curves and for the expression of uncertainty over the family. A so-called composite single curve can also be drawn and used for purposes of approximation. This composite curve is often regarded as equivalent to the mean curve of the family. The equality seems intuitively reasonable, but according to the authors has never been proven. The paper presented proves this equivalence hypothesis mathematically. Moreover, the authors show that this equivalence hypothesis between fragility curves is itself equivalent to an identity property of the standard normal probability curve. Thus, in the course of proving the fragility curve hypothesis, the authors have also proved a rather obscure, but interesting and perhaps previously unrecognized, property of the standard normal curve
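    The equivalence proved in this record can be checked numerically: averaging a lognormal fragility family over the epistemic uncertainty (log-std beta_u) reproduces the single composite curve whose log-std is sqrt(beta_r^2 + beta_u^2). The parameter values below are illustrative, not from the paper:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mean_fragility(a, a_med, beta_r, beta_u, n=20001):
    """Mean of the lognormal fragility family, obtained by numerically
    averaging over the epistemic uncertainty variable z ~ N(0, 1)."""
    lo, hi = -8.0, 8.0
    dz = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        z = lo + i * dz
        w = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # N(0,1) density
        total += w * phi((math.log(a / a_med) - beta_u * z) / beta_r) * dz
    return total

def composite_fragility(a, a_med, beta_r, beta_u):
    """Single composite curve with combined log-std sqrt(br^2 + bu^2)."""
    return phi(math.log(a / a_med) / math.hypot(beta_r, beta_u))
```

    The two functions agree at every acceleration level, which is exactly the mean-curve/composite-curve identity the paper proves analytically.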

  20. A New Development in the Method of Measurement of Reciprocity-Law Failure and Its Application to Screen/Green-Sensitive X-Ray Film Systems

    Science.gov (United States)

    Fujita, Hiroshi; Uchida, Suguru

    1981-01-01

    Since it has been confirmed by experiment that the intensity of X-rays varies approximately as the focus-film distance (FFD) to the minus 2.12th power, the X-ray intensity can be changed by varying the FFD. It is shown in this paper that two types of reciprocity failure curve, density vs. exposure time for constant exposure and relative exposure vs. exposure time for constant density, can easily be obtained from several time-scale characteristic curves taken experimentally for several FFD’s in the rare-earth screen-film systems used. Only low-intensity reciprocity failure is present for exposure times of more than about 0.1 sec for one film, but both low-intensity and high-intensity reciprocity failures occur in the other one. The effects of reciprocity failure on the H & D curves can be seen in the shape of the curves and the relative speed.
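    The intensity scaling used in this record (intensity varying as FFD to the minus 2.12th power) is a one-line computation. The function names and the reference FFD of 100 are our illustrative choices:

```python
def relative_intensity(ffd, ffd_ref=100.0, power=-2.12):
    """X-ray intensity relative to a reference focus-film distance, using
    the measured FFD^-2.12 dependence (vs. -2 for an ideal inverse square)."""
    return (ffd / ffd_ref) ** power

def exposure_time_for_constant_exposure(ffd, t_ref, ffd_ref=100.0):
    """Exposure time scaling that holds exposure (intensity x time) constant,
    which is how the FFD change maps onto the time axis of the reciprocity plot."""
    return t_ref / relative_intensity(ffd, ffd_ref)
```

    Doubling the FFD cuts the intensity to about 23% and therefore requires roughly a 4.35x longer exposure for the same total exposure.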

  1. The effect of defects on structural failure: A two-criteria approach

    International Nuclear Information System (INIS)

    Dowling, A.R.; Townley, C.H.A.

    1976-01-01

    The two-criteria approach to the study of defects in structures assumes that failure occurs when the applied load reaches the lower of either a load to cause brittle failure in accordance with the theories of linear elastic fracture mechanics or a collapse load dependent on the ultimate stress of the material and the structural geometry. This simple approach is described and compared with previously published experimental results for various geometries and materials. The simplicity of this method of defect analysis lies in the fact that each criterion is sufficiently well understood to permit scaling and geometry changes to be accommodated readily. It becomes apparent that a sizeable transition region exists between the two criteria but this can be described in an expression relating the criteria. This expression adequately predicts the behaviour of cracked structures of both simple and complex geometry. A design curve for defect assessment is proposed for which it is unnecessary to consider the transition region. (author)
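    The interpolation between the brittle-fracture and plastic-collapse criteria can be sketched as a failure assessment curve. The strip-yield form below is the expression commonly associated with the original two-criteria method; treat the exact formula and the helper names as assumptions rather than the paper's own design curve:

```python
import math

def fad_kr(sr):
    """Strip-yield interpolation between the fracture criterion (Kr = 1)
    and the collapse criterion (Sr = 1):
    Kr = Sr * [(8/pi^2) * ln(sec(pi*Sr/2))]**(-1/2), for 0 < Sr < 1."""
    arg = (8.0 / math.pi ** 2) * math.log(1.0 / math.cos(math.pi * sr / 2.0))
    return sr / math.sqrt(arg)

def is_acceptable(sr, kr):
    """An assessment point (Sr, Kr) lying inside the curve is deemed safe."""
    return sr < 1.0 and kr < fad_kr(sr)
```

    The curve tends to Kr = 1 as Sr tends to 0 (pure fracture control) and to Kr = 0 as Sr tends to 1 (pure collapse control), which is the transition region the abstract describes.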

  2. An assessment of mode-coupling and falling-friction mechanisms in railway curve squeal through a simplified approach

    Science.gov (United States)

    Ding, Bo; Squicciarini, Giacomo; Thompson, David; Corradi, Roberto

    2018-06-01

    Curve squeal is one of the most annoying types of noise caused by the railway system. It usually occurs when a train or tram is running around tight curves. Although this phenomenon has been studied for many years, the generation mechanism is still the subject of controversy and not fully understood. A negative slope in the friction curve under full sliding has been considered to be the main cause of curve squeal for a long time but more recently mode coupling has been demonstrated to be another possible explanation. Mode coupling relies on the inclusion of both the lateral and vertical dynamics at the contact and an exchange of energy occurs between the normal and the axial directions. The purpose of this paper is to assess the role of the mode-coupling and falling-friction mechanisms in curve squeal through the use of a simple approach based on practical parameter values representative of an actual situation. A tramway wheel is adopted to study the effect of the adhesion coefficient, the lateral contact position, the contact angle and the damping ratio. Cases corresponding to both inner and outer wheels in the curve are considered and it is shown that there are situations in which both wheels can squeal due to mode coupling. Additionally, a negative slope is introduced in the friction curve while keeping active the vertical dynamics in order to analyse both mechanisms together. It is shown that, in the presence of mode coupling, the squealing frequency can differ from the natural frequency of either of the coupled wheel modes. Moreover, a phase difference between wheel vibration in the vertical and lateral directions is observed as a characteristic of mode coupling. For both these features a qualitative comparison is shown with field measurements which show the same behaviour.

  3. Sensitive Troponin I Assay in Patients with Chest Pain - Association with Significant Coronary Lesions with or Without Renal Failure.

    Science.gov (United States)

    Soeiro, Alexandre de Matos; Gualandro, Danielle Menosi; Bossa, Aline Siqueira; Zullino, Cindel Nogueira; Biselli, Bruno; Soeiro, Maria Carolina Feres de Almeida; Leal, Tatiana de Carvalho Andreucci Torres; Serrano, Carlos Vicente; Oliveira Junior, Mucio Tavares de

    2018-01-01

    Despite having higher sensitivity than conventional troponins, sensitive troponins have lower specificity, mainly in patients with renal failure. This study aimed to assess sensitive troponin I levels in patients with chest pain, and to relate them to the existence of significant coronary lesions. Retrospective, single-center, observational. This study included 991 patients divided into two groups: with (N = 681) and without (N = 310) significant coronary lesion. For subsequent analysis, the patients were divided into two other groups: with (N = 184) and without (N = 807) chronic renal failure. The commercial ADVIA Centaur® TnI-Ultra assay (Siemens Healthcare Diagnostics) was used. ROC curve analysis was performed to identify the sensitivity and specificity of the best cutoff point of troponin as a discriminator of the probability of significant coronary lesion. The associations were considered significant when p < 0.05. In patients without and with chronic renal failure, the areas under the ROC curve were 0.703 (95% CI: 0.66 - 0.74) and 0.608 (95% CI: 0.52 - 0.70), respectively. The best cutoff points to discriminate the presence of significant coronary lesion were: in the general population, 0.605 ng/dL (sensitivity, 63.4%; specificity, 67%); in patients without renal failure, 0.605 ng/dL (sensitivity, 62.7%; specificity, 71%); and in patients with chronic renal failure, 0.515 ng/dL (sensitivity, 80.6%; specificity, 42%). In patients with chest pain, sensitive troponin I showed a good correlation with significant coronary lesions when its level was greater than 0.605 ng/dL. In patients with chronic renal failure, a significant decrease in specificity was observed in the correlation of troponin levels and severe coronary lesions.

  4. [Surgical learning curve for creation of vascular accesses for haemodialysis: value of medico-radio-surgical collaboration].

    Science.gov (United States)

    Van Glabeke, Emmanuel; Belenfant, Xavier; Barrou, Benoît; Adhemar, Jean-Pierre; Laedrich, Joëlle; Mavel, Marie-Christine; Challier, Emmanuel

    2005-04-01

    Creation of a vascular access (VA) for haemodialysis is a surgical procedure which comprises a failure rate related to the quality of the vessels and the operator's experience. The authors report the first 2 years of a young urologist's experience with this procedure in a local hospital in collaboration with the nephrology team. Patients undergoing creation of VA were divided into 2 chronological groups. The patient's age and gender, the cause of renal failure, the presence of diabetes, clinical examination of the upper limb, preoperative assessment of upper limb vessels, the type of anaesthesia, the operating time and the start of dialysis after the operation, as well as the functional results of the VA at 6 months were studied. Results concerning the patients of the first period were discussed by the operator and the nephrology team. During the first 9 months, 28 patients were operated, corresponding to 36 operations including 32 direct fistulas. Over the following 15 months, 61 patients were operated, with the creation of 63 VAs, including 55 direct fistulas. The failure rate (thrombosis or non-functioning VA) decreased from 32.1% to 11.1% (p=0.07), while the 2 groups were globally comparable. Evaluation of a new surgical procedure shows a number of failures, as for all learning curves. However, it helps to improve the results. Collaboration with nephrologists must comprise a discussion allowing the acceptance of certain failures, as they reflect compliance with a strategy of preservation of the vascular capital and a rational attempt to avoid a non-essential proximal access or bypass graft. The support of a motivated radiology team (preoperative assessment and management of complications) and the assistance of a more experienced operator are essential.

  5. Predicting Time Series Outputs and Time-to-Failure for an Aircraft Controller Using Bayesian Modeling

    Science.gov (United States)

    He, Yuning

    2015-01-01

    Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.
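    The key device in this record, mapping variable-length output curves into a fixed-size space of basis coefficients, can be sketched with an ordinary polynomial basis and least squares. This is a simplification of the paper's approach (which uses Treed Gaussian Processes on top of the coefficients); the basis choice and function names are ours:

```python
def polyfit(ts, ys, deg=2):
    """Least-squares polynomial coefficients via the normal equations --
    maps a time series of ANY length to a fixed-size coefficient vector."""
    n = deg + 1
    A = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    b = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x   # [a0, a1, a2]: y ~ a0 + a1*t + a2*t^2
```

    Curves of different lengths (e.g. flights failing at different times) all land in the same three-dimensional coefficient space, so one regression model can predict them from the flight inputs.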

  6. Assessment of Ex-Vitro Anaerobic Digestion Kinetics of Crop Residues Through First Order Exponential Models: Effect of Lag Phase Period and Curve Factor

    Directory of Open Access Journals (Sweden)

    Abdul Razaque Sahito

    2013-04-01

    Kinetic studies of the AD (Anaerobic Digestion) process are useful to predict the performance of digesters and design appropriate digesters, and are also helpful in understanding inhibitory mechanisms of biodegradation. The aim of this study was to assess the anaerobic kinetics of crop residues digestion with buffalo dung. Seven crop residues, namely bagasse, banana plant waste, canola straw, cotton stalks, rice straw, sugarcane trash and wheat straw, were selected from the field and were analyzed for MC (Moisture Contents), TS (Total Solids) and VS (Volatile Solids) with standard methods. In the present study, three first order exponential models, namely the exponential model, the exponential lag phase model and the exponential curve factor model, were used to assess the kinetics of the AD process of crop residues, and the effect of lag phase and curve factor was analyzed based on statistical hypothesis testing and on information theory. The AD of crop residues and buffalo dung follows first order kinetics. Of the three models, the simple exponential model was the poorest, while the first order exponential curve factor model was the best fit. In addition to the statistical hypothesis testing, the exponential curve factor model has the least value of AIC (Akaike's Information Criterion) and can generate methane production data more accurately. Furthermore, there is an inverse linear relationship between the lag phase period and the curve factor.

  7. Assessment of ex-vitro anaerobic digestion kinetics of crop residues through first order exponential models: effect of lag phase period and curve factor

    International Nuclear Information System (INIS)

    Sahito, A.R.; Brohi, K.M.

    2013-01-01

    Kinetic studies of the AD (Anaerobic Digestion) process are useful to predict the performance of digesters and design appropriate digesters, and are also helpful in understanding inhibitory mechanisms of biodegradation. The aim of this study was to assess the anaerobic kinetics of crop residues digestion with buffalo dung. Seven crop residues, namely bagasse, banana plant waste, canola straw, cotton stalks, rice straw, sugarcane trash and wheat straw, were selected from the field and were analyzed for MC (Moisture Contents), TS (Total Solids) and VS (Volatile Solids) with standard methods. In the present study, three first order exponential models, namely the exponential model, the exponential lag phase model and the exponential curve factor model, were used to assess the kinetics of the AD process of crop residues, and the effect of lag phase and curve factor was analyzed based on statistical hypothesis testing and on information theory. The AD of crop residues and buffalo dung follows first order kinetics. Of the three models, the simple exponential model was the poorest, while the first order exponential curve factor model was the best fit. In addition to the statistical hypothesis testing, the exponential curve factor model has the least value of AIC (Akaike's Information Criterion) and can generate methane production data more accurately. Furthermore, there is an inverse linear relationship between the lag phase period and the curve factor. (author)
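    The model comparison in these two records can be sketched for the plain and lag-phase first-order models together with the least-squares form of AIC. The abstract does not give the algebraic form of the "curve factor" model, so it is omitted here; all parameter values below are invented for illustration:

```python
import math

def exp_model(t, b0, k):
    """Plain first-order model: cumulative methane B(t) = B0 * (1 - e^(-k t))."""
    return b0 * (1.0 - math.exp(-k * t))

def exp_lag_model(t, b0, k, lag):
    """First-order model with a lag phase of `lag` days before production starts."""
    return b0 * (1.0 - math.exp(-k * max(t - lag, 0.0)))

def aic_ls(rss, n_obs, n_params):
    """Akaike's Information Criterion for a least-squares fit:
    n * ln(RSS/n) + 2 * k. Lower is better; the extra lag parameter must
    buy enough RSS reduction to pay its 2-point penalty."""
    return n_obs * math.log(rss / n_obs) + 2.0 * n_params

# Synthetic yields generated with a 5-day lag (plus a constant 1 mL offset so
# neither model fits exactly and the RSS stays positive):
ts = list(range(0, 55, 5))
data = [exp_lag_model(t, 300.0, 0.1, 5.0) + 1.0 for t in ts]
rss_lag = sum((exp_lag_model(t, 300.0, 0.1, 5.0) - y) ** 2 for t, y in zip(ts, data))
rss_plain = sum((exp_model(t, 300.0, 0.1) - y) ** 2 for t, y in zip(ts, data))
```

    With lag-phase data, the lag model wins on AIC despite its extra parameter, which mirrors the papers' ranking of the models by information criterion.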

  8. An Abrupt Transition to an Intergranular Failure Mode in the Near-Threshold Fatigue Crack Growth Regime in Ni-Based Superalloys

    Science.gov (United States)

    Telesman, J.; Smith, T. M.; Gabb, T. P.; Ring, A. J.

    2018-06-01

    Cyclic near-threshold fatigue crack growth (FCG) behavior of two disk superalloys was evaluated and was shown to exhibit an unexpected sudden failure mode transition from a mostly transgranular failure mode at higher stress intensity factor ranges to an almost completely intergranular failure mode in the threshold regime. The change in failure modes was associated with a crossover of FCG resistance curves in which the conditions that produced higher FCG rates in the Paris regime resulted in lower FCG rates and increased ΔK_th values in the threshold region. High-resolution scanning and transmission electron microscopy were used to carefully characterize the crack tips at these near-threshold conditions. Formation of stable Al-oxide followed by Cr-oxide and Ti-oxides was found to occur at the crack tip prior to formation of unstable oxides. To contrast with the threshold failure mode regime, a quantitative assessment of the role that the intergranular failure mode has on cyclic FCG behavior in the Paris regime was also performed. It was demonstrated that even a very limited intergranular failure content dominates the FCG response under mixed mode failure conditions.
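    The Paris-regime growth law and one common way of forcing the rate to vanish at the threshold ΔK_th can be sketched as follows. The constants C and m are purely illustrative, and the threshold modification shown is a generic textbook form, not the correlation used in this paper:

```python
def paris_rate(delta_k, c=1e-11, m=3.0):
    """Paris-regime crack growth rate da/dN = C * (dK)^m.
    C and m here are illustrative placeholder values."""
    return c * delta_k ** m

def threshold_modified_rate(delta_k, dk_th, c=1e-11, m=3.0):
    """A common near-threshold modification: the rate falls to zero as the
    stress intensity factor range approaches the threshold dK_th."""
    if delta_k <= dk_th:
        return 0.0
    return c * (delta_k ** m - dk_th ** m)
```

    The crossover described in the abstract corresponds to two such curves with different C, m, and ΔK_th intersecting between the Paris and threshold regimes.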

  9. Parathyroid hormone secretion in chronic renal failure

    DEFF Research Database (Denmark)

    Madsen, J C; Rasmussen, A Q; Ladefoged, S D

    1996-01-01

    The aim of the study was to introduce and evaluate a method for quantifying the parathyroid hormone (PTH) secretion during hemodialysis in secondary hyperparathyroidism due to end-stage renal failure. We developed a method suitable for inducing sequential hypocalcemia and hypercalcemia during....../ionized calcium curves were constructed, and a mean calcium set-point of 1.16 mmol/liter was estimated compared to the normal mean of about 1.13 mmol/liter. In conclusion, we demonstrate that it is important to use a standardized method to evaluate parathyroid hormone dynamics in chronic renal failure. By the use...... of a standardized method we show that the calcium set-point is normal or slightly elevated, indicating normal parathyroid reactivity to calcium in chronic renal failure....

  10. Parathyroid hormone secretion in chronic renal failure

    DEFF Research Database (Denmark)

    Madsen, J C; Rasmussen, A Q; Ladefoged, S D

    1996-01-01

    The aim of the study was to introduce and evaluate a method for quantifying the parathyroid hormone (PTH) secretion during hemodialysis in secondary hyperparathyroidism due to end-stage renal failure. We developed a method suitable for inducing sequential hypocalcemia and hypercalcemia during....../ionized calcium curves were constructed, and a mean calcium set-point of 1.16 mmol/liter was estimated compared to the normal mean of about 1.13 mmol/liter. In conclusion, we demonstrate that it is important to use a standardized method to evaluate parathyroid hormone dynamics in chronic renal failure. By the use...... of a standardized method we show that the calcium set-point is normal or slightly elevated, indicating normal parathyroid reactivity to calcium in chronic renal failure....
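    The set-point estimation in these records (the ionized calcium at half-maximal PTH on a sigmoidal PTH/calcium curve) can be sketched with a four-parameter logistic and a bisection search. The parameter values are illustrative, chosen only so that the midpoint sits at the 1.16 mmol/liter reported above:

```python
def pth(ca, a=150.0, d=10.0, c=1.16, b=30.0):
    """Four-parameter sigmoid: PTH falls from a (maximal) to d (minimal)
    as ionized calcium rises; c is the midpoint (set-point), b the slope
    factor. All values are illustrative, not from the paper."""
    return d + (a - d) / (1.0 + (ca / c) ** b)

def set_point(curve, lo=0.8, hi=1.6, tol=1e-6):
    """Ionized calcium at half-maximal PTH, found by bisection on a
    monotonically decreasing PTH/calcium curve."""
    target = (curve(lo) + curve(hi)) / 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if curve(mid) > target:   # PTH still high -> set-point lies to the right
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

    In practice the curve would first be fitted to the hypo-/hypercalcemia measurements collected during dialysis; the standardization the authors argue for fixes how those measurements are induced and sampled.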

  11. Assessment of characteristic failure envelopes for intact rock using results from triaxial tests

    OpenAIRE

    Muralha, J.; Lamas, L.

    2014-01-01

    The paper presents contributions to the statistical study of the parameters of the Mohr-Coulomb and Hoek-Brown strength criteria, in order to assess the characteristic failure envelopes for intact rock, based on the results of several sets of triaxial tests performed by LNEC. 10p DBB/NMMR

  12. Anatomical curve identification

    Science.gov (United States)

    Bowman, Adrian W.; Katina, Stanislav; Smith, Joanna; Brown, Denise

    2015-01-01

    Methods for capturing images in three dimensions are now widely available, with stereo-photogrammetry and laser scanning being two common approaches. In anatomical studies, a number of landmarks are usually identified manually from each of these images and these form the basis of subsequent statistical analysis. However, landmarks express only a very small proportion of the information available from the images. Anatomically defined curves have the advantage of providing a much richer expression of shape. This is explored in the context of identifying the boundary of breasts from an image of the female torso and the boundary of the lips from a facial image. The curves of interest are characterised by ridges or valleys. Key issues in estimation are the ability to navigate across the anatomical surface in three-dimensions, the ability to recognise the relevant boundary and the need to assess the evidence for the presence of the surface feature of interest. The first issue is addressed by the use of principal curves, as an extension of principal components, the second by suitable assessment of curvature and the third by change-point detection. P-spline smoothing is used as an integral part of the methods but adaptations are made to the specific anatomical features of interest. After estimation of the boundary curves, the intermediate surfaces of the anatomical feature of interest can be characterised by surface interpolation. This allows shape variation to be explored using standard methods such as principal components. These tools are applied to a collection of images of women where one breast has been reconstructed after mastectomy and where interest lies in shape differences between the reconstructed and unreconstructed breasts. They are also applied to a collection of lip images where possible differences in shape between males and females are of interest. PMID:26041943

  13. Assessing the impact of heart failure specialist services on patient populations

    Directory of Open Access Journals (Sweden)

    Lyratzopoulos Georgios

    2004-05-01

    Background: The assessment of the impact of healthcare interventions may help commissioners of healthcare services to make optimal decisions. This can be particularly the case if the impact assessment relates to specific patient populations and uses timely local data. We examined the potential impact on readmissions and mortality of specialist heart failure services capable of delivering treatments such as b-blockers and Nurse-Led Educational Intervention (N-LEI). Methods: Statistical modelling of prevented or postponed events among previously hospitalised patients, using estimates of: treatment uptake and contraindications (based on local audit data); treatment effectiveness and intolerance (based on literature); and annual number of hospitalizations per patient and annual risk of death (based on routine data). Results: Optimal treatment uptake among eligible but untreated patients would over one year prevent or postpone 11% of all expected readmissions and 18% of all expected deaths for spironolactone, 13% of all expected readmissions and 22% of all expected deaths for b-blockers (carvedilol), and 20% of all expected readmissions and an uncertain number of deaths for N-LEI. Optimal combined treatment uptake for all three interventions during one year among all eligible but untreated patients would prevent or postpone 37% of all expected readmissions and a minimum of 36% of all expected deaths. Conclusion: In a population of previously hospitalised patients with low previous uptake of b-blockers and no uptake of N-LEI, optimal combined uptake of interventions through specialist heart failure services can potentially help prevent or postpone approximately four times as many readmissions and a minimum of twice as many deaths compared with simply optimising uptake of spironolactone (not necessarily requiring specialist services). Examination of the impact of different heart failure interventions can inform rational planning of relevant healthcare services.

  14. Assessing the impact of heart failure specialist services on patient populations.

    Science.gov (United States)

    Lyratzopoulos, Georgios; Cook, Gary A; McElduff, Patrick; Havely, Daniel; Edwards, Richard; Heller, Richard F

    2004-05-24

The assessment of the impact of healthcare interventions may help commissioners of healthcare services to make optimal decisions. This can be particularly the case if the impact assessment relates to specific patient populations and uses timely local data. We examined the potential impact on readmissions and mortality of specialist heart failure services capable of delivering treatments such as β-blockers and Nurse-Led Educational Intervention (N-LEI). Statistical modelling of prevented or postponed events among previously hospitalised patients, using estimates of: treatment uptake and contraindications (based on local audit data); treatment effectiveness and intolerance (based on literature); and annual number of hospitalizations per patient and annual risk of death (based on routine data). Optimal treatment uptake among eligible but untreated patients would over one year prevent or postpone 11% of all expected readmissions and 18% of all expected deaths for spironolactone, 13% of all expected readmissions and 22% of all expected deaths for β-blockers (carvedilol) and 20% of all expected readmissions and an uncertain number of deaths for N-LEI. Optimal combined treatment uptake for all three interventions during one year among all eligible but untreated patients would prevent or postpone 37% of all expected readmissions and a minimum of 36% of all expected deaths. In a population of previously hospitalised patients with low previous uptake of β-blockers and no uptake of N-LEI, optimal combined uptake of interventions through specialist heart failure services can potentially help prevent or postpone approximately four times as many readmissions and a minimum of twice as many deaths compared with simply optimising uptake of spironolactone (not necessarily requiring specialist services). Examination of the impact of different heart failure interventions can inform rational planning of relevant healthcare services.
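The prevented-or-postponed-events arithmetic described in this record can be sketched in a few lines. The function and all numbers below are hypothetical illustrations of the approach, not the study's audit-based estimates.

```python
# Sketch of the prevented-or-postponed-events model: the fraction of all
# expected events averted is the product of the eligible-but-untreated
# share, the uptake achieved, and the treatment's relative risk reduction.
# All inputs are invented placeholders.

def prevented_fraction(eligible_untreated_fraction, uptake,
                       relative_risk_reduction):
    """Fraction of all expected events (readmissions or deaths) prevented
    or postponed by extending an intervention to eligible patients."""
    return eligible_untreated_fraction * uptake * relative_risk_reduction

# e.g. 60% eligible and untreated, full uptake, 30% relative risk reduction
impact = prevented_fraction(0.60, 1.0, 0.30)
print(f"{impact:.0%} of expected events prevented or postponed")
```

Combining interventions in this framework requires care, since the risk reductions of co-administered treatments do not simply add.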

  15. Determination of the time to failure curve as a function of stress for a highly irradiated AISI 304 stainless steel after constant load tests in simulated PWR water environment

    International Nuclear Information System (INIS)

    Pokor, C.; Massoud, J.P.; Wintergerst, M.; Toivonen, A.; Ehrnsten, U.; Karlsen, W.

    2011-01-01

The structures of Reactor Pressure Vessel Internals are subjected to an intense neutron flux. Under these operating conditions, the microstructure and the mechanical properties of the austenitic stainless steel components change. In addition, these components are subjected to stresses either of manufacturing origin or generated during operation. Cases of baffle bolt cracking have occurred in CP0 Nuclear Power Plant units. The degradation mechanism of these bolts is Irradiation-Assisted Stress Corrosion Cracking (IASCC). In order to obtain a better understanding of this mechanism and its principal parameters of influence, a set of stress corrosion tests (mainly constant load tests) was launched within the framework of the EDF project 'PWR Internals' using materials from a Chooz A baffle corner (SA 304). These tests aim to quantify the influence on IASCC of the applied stress, temperature and environment (primary water, higher lithium concentration, inert environment) for an irradiation dose close to 30 dpa. A curve showing time to failure as a function of stress was determined. The shape of this curve is consistent with the few data available in the literature. A stress threshold of about 50% of the yield strength at the test temperature has been determined, below which cracking in that environment seems impossible. After irradiation this material is sensitive to intergranular fracture in a primary environment, but also in an inert environment (argon) at 340 °C. The tests also showed a negative effect of increased lithium concentration on the time to failure and on the stress threshold. (authors)

  16. Cone-beam computed tomography analysis of curved root canals after mechanical preparation with three nickel-titanium rotary instruments

    Science.gov (United States)

    Elsherief, Samia M.; Zayet, Mohamed K.; Hamouda, Ibrahim M.

    2013-01-01

Cone-beam computed tomography is a 3-dimensional, high-resolution imaging method. The purpose of this study was to compare the effects of 3 different NiTi rotary instruments used to prepare curved root canals on the final shape of the canals and the total amount of root canal transportation, by using cone-beam computed tomography. A total of 81 mesial root canals from 42 extracted human mandibular molars, with curvatures ranging from 15 to 45 degrees, were selected. Canals were randomly divided into 3 groups of 27 each. After preparation with ProTaper, Revo-S and Hero Shaper, the amount of transportation and the centering ability were assessed by using cone-beam computed tomography. Utilizing pre- and post-instrumentation radiographs, straightening of the canal curvatures was determined with a computer image analysis program. Canals were metrically assessed for changes (surface area, changes in curvature and transportation) during canal preparation by using the SimPlant software; instrument failures were also recorded. Mean total widths and outer and inner width measurements were determined on each central canal path and differences were statistically analyzed. The results showed that all instruments maintained the original canal curvature well, with no significant differences between the different files (P = 0.226). During preparation there was failure of only one file (in the ProTaper group). In conclusion, under the conditions of this study, all instruments maintained the original canal curvature well and were safe to use. Areas of uninstrumented root canal wall were left in all regions by the various systems. PMID:23885273

  17. Assessment of importance of elements for systems that condition depends on the sequence of elements failures

    International Nuclear Information System (INIS)

    Povyakalo, A.A.

    1996-01-01

This paper proposes new general formulas for calculating importance indices for elements of systems whose condition depends on the sequence of element failures. Such systems are called systems with memory of failures (M-systems). Existing techniques for assessing element importance are based on Boolean models of system reliability, which assume that at every moment the system state depends only on the combination of element states at that same moment. Such systems are called combinational systems (C-systems). The reliability of an M-system at any moment of operating time is a functional whose arguments are the distributions of element times to failure. Boolean models, and the importance-assessment methods based on them, are therefore not appropriate for M-systems. Pereguda and Povyakalo proposed a new technique for assessing element importance in PO-SS systems, which consist of a Protection Object (PO) and a Safety System (SS); a PO-SS system is an example of an M-system. That technique is used in this paper as the basis for a more general treatment. It is shown that the proposed technique for assessing element importance in M-systems has the well-known Birnbaum method as a particular case. A system with double protection is also considered as an example

  18. Characterization of KS-material by means of J-R-curves especially using the partial unloading technique

    International Nuclear Information System (INIS)

    Voss, B.; Blauel, J.G.; Schmitt, W.

    1983-01-01

Essential components of nuclear reactor systems are fabricated from materials of high toughness to exclude brittle failure. With increasing load, a crack tip will blunt, a plastic zone will be formed, and voids may nucleate and coalesce, thus initiating stable crack extension when the crack driving parameter, e.g. J, exceeds the initiation value J_i. Further stable crack growth will occur with further increasing J prior to complete failure of the structure. The specific material resistance against crack extension is characterized by J resistance curves J_R = J(Δa). ASTM provides a standard to determine the initiation toughness J_Ic from a J_R curve [1] and a tentative standard for determining the J_R curve by a single-specimen test [2]. To generate a J_R curve, values of the crack driving parameter J and the corresponding stable crack growth Δa have to be measured. Besides the multiple-specimen technique [1], the potential drop method and especially the partial unloading compliance method [2] are used to measure stable crack growth. Some special problems and some results for pressure vessel steels are discussed in this paper. (orig./RW)
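J_R curves of the kind described are often summarized by a power-law fit, J = C·(Δa)^m, from which J_Ic can then be read off. A minimal sketch, with C and m invented for illustration rather than taken from the tests above:

```python
# Power-law representation of a J resistance curve, J = C * (da)**m.
# This is a common engineering fit; C and m here are invented values,
# not results for the KS material characterized in the record above.

def j_resistance(da_mm, C=300.0, m=0.5):
    """Crack-growth resistance J (kJ/m^2) at stable crack extension da (mm)."""
    return C * da_mm ** m

# J after 0.25 mm of stable crack growth: 300 * 0.25**0.5
print(j_resistance(0.25))
```

In practice C and m are obtained by regression of measured (Δa, J) pairs, e.g. from unloading-compliance data, on log-log axes.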

  19. Lagrangian Curves on Spectral Curves of Monopoles

    International Nuclear Information System (INIS)

    Guilfoyle, Brendan; Khalid, Madeeha; Ramon Mari, Jose J.

    2010-01-01

We study Lagrangian points on smooth holomorphic curves in TP¹ equipped with a natural neutral Kähler structure, and prove that they must form real curves. By virtue of the identification of TP¹ with the space L(E³) of oriented affine lines in Euclidean 3-space, these Lagrangian curves give rise to ruled surfaces in E³, which we prove have zero Gauss curvature. Each ruled surface is shown to be the tangent lines to a curve in E³, called the edge of regression of the ruled surface. We give an alternative characterization of these curves as the points in E³ where the number of oriented lines in the complex curve Σ that pass through the point is less than the degree of Σ. We then apply these results to the spectral curves of certain monopoles and construct the ruled surfaces and edges of regression generated by the Lagrangian curves.

  20. Testing the validity of stock-recruitment curve fits

    International Nuclear Information System (INIS)

    Christensen, S.W.; Goodyear, C.P.

    1988-01-01

The utilities relied heavily on the Ricker stock-recruitment model as the basis for quantifying biological compensation in the Hudson River power case. They presented many fits of the Ricker model to data derived from striped bass catch and effort records compiled by the National Marine Fisheries Service. Based on this curve-fitting exercise, a value of 4 was chosen for the parameter alpha in the Ricker model, and this value was used to derive the utilities' estimates of the long-term impact of power plants on striped bass populations. A technique was developed and applied to address a single fundamental question: if the Ricker model were applicable to the Hudson River striped bass population, could the estimates of alpha from the curve-fitting exercise be considered reliable? The technique involved constructing a simulation model that incorporated the essential biological features of the population and simulated the characteristics of the available actual catch-per-unit-effort data through time. The ability or failure to retrieve the known parameter values underlying the simulation model via the curve-fitting exercise was a direct test of the reliability of the results of fitting stock-recruitment curves to the real data. The results demonstrated that estimates of alpha from the curve-fitting exercise were not reliable. The simulation-modeling technique provides an effective way to identify whether or not particular data are appropriate for use in fitting such models. 39 refs., 2 figs., 3 tabs
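A parameter-recovery check of the kind described can be sketched as follows. This is a generic illustration of the idea, with invented parameter values and noise levels, not a reconstruction of the Hudson River analysis.

```python
import numpy as np

# Ricker stock-recruitment model: R = alpha * S * exp(-beta * S).
# Simulate data with known parameters, refit, and see whether the fit
# recovers them. All numbers here are invented for illustration.
rng = np.random.default_rng(0)
alpha_true, beta_true = 4.0, 1e-3

stock = rng.uniform(200.0, 2000.0, size=40)
# Multiplicative lognormal noise, a common assumption for recruitment
recruits = alpha_true * stock * np.exp(-beta_true * stock) \
    * rng.lognormal(mean=0.0, sigma=0.3, size=stock.size)

# Fit by linearising: log(R / S) = log(alpha) - beta * S
slope, intercept = np.polyfit(stock, np.log(recruits / stock), 1)
alpha_hat, beta_hat = np.exp(intercept), -slope
print(f"alpha_hat = {alpha_hat:.2f}, beta_hat = {beta_hat:.2e}")
```

The abstract's point is that with realistic catch-per-unit-effort data (observation error, limited stock contrast), this recovery can fail badly even when the model family is correct.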

  1. Serum Albumin Is Independently Associated with Persistent Organ Failure in Acute Pancreatitis

    Directory of Open Access Journals (Sweden)

    Wandong Hong

    2017-01-01

Full Text Available Background and Aims. To investigate the association between serum albumin levels within 24 hrs of patient admission and the development of persistent organ failure in acute pancreatitis. Methods. A total of 700 patients with acute pancreatitis were enrolled. Multivariate logistic regression and subgroup analysis determined whether decreased albumin was independently associated with persistent organ failure and mortality. The diagnostic performance of serum albumin was evaluated by the area under receiver operating characteristic (ROC) curves. Results. As levels of serum albumin decrease, the risk of persistent organ failure significantly increases (P-trend < 0.001). The incidence of organ failure was 3.5%, 10.6%, and 41.6% in patients with normal albumin and mild and severe hypoalbuminaemia, respectively. Decreased albumin levels were also proportionally associated with prolonged hospital stay (P-trend < 0.001) and the risk of death (P-trend < 0.001). Multivariate analysis suggested that biliary etiology, chronic concomitant diseases, hematocrit, blood urea nitrogen, and the serum albumin level were independently associated with persistent organ failure. Blood urea nitrogen and the serum albumin level were also independently associated with mortality. The areas under the ROC curves of albumin for predicting organ failure and mortality were 0.78 and 0.87, respectively. Conclusion. A low serum albumin is independently associated with an increased risk of developing persistent organ failure and death in acute pancreatitis. It may also be useful for predicting the severity of acute pancreatitis.
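The AUC values reported above can be computed directly in the Mann-Whitney form (the fraction of case/control pairs ranked correctly). The albumin values below are invented illustrations, not patient data.

```python
# Minimal AUC computation in the Mann-Whitney form used to summarize ROC
# curves such as those in the study above.

def auc(scores_pos, scores_neg):
    """Area under the ROC curve: fraction of (positive, negative) score
    pairs ranked correctly, counting ties as half."""
    n_pairs = len(scores_pos) * len(scores_neg)
    wins = sum((p > q) + 0.5 * (p == q)
               for p in scores_pos for q in scores_neg)
    return wins / n_pairs

# Low albumin predicts organ failure, so negate albumin so that a higher
# score means higher risk. Values (g/L) are invented.
albumin_failure = [22, 25, 28, 30]      # patients with organ failure
albumin_no_failure = [35, 38, 27, 33]   # patients without
print(auc([-a for a in albumin_failure], [-a for a in albumin_no_failure]))
```

An AUC of 0.5 indicates no discrimination, 1.0 perfect separation; the study's 0.78 and 0.87 fall in the useful range.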

  2. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which lead to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects

  3. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which lead to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  4. Assessing patient preferences in heart failure using conjoint methodology

    Directory of Open Access Journals (Sweden)

    Pisa G

    2015-08-01

Full Text Available Giovanni Pisa,1 Florian Eichmann,1 Stephan Hupfer2; 1Kantar Health GmbH, Munich, Germany; 2Novartis Pharma GmbH, Nuernberg, Germany. Aim: The course of heart failure (HF) is characterized by frequent hospitalizations, a high mortality rate, as well as a severely impaired health-related quality of life (HRQoL). To optimize disease management, understanding of patient preferences is crucial. We aimed to assess patient preferences using conjoint methodology and HRQoL in patients with HF. Methods: Two modules were applied: an initial qualitative module, consisting of in-depth interviews with 12 HF patients, and the main quantitative module in 300 HF patients from across Germany. Patients were stratified according to the time of their last HF hospitalization. Each patient was presented with ten different scenarios during the conjoint exercise. Additionally, patients completed the generic HRQoL instrument, the EuroQol health questionnaire (EQ-5D™). Results: The attribute with the highest relative importance was dyspnea (44%), followed by physical capacity (18%). Of similar importance were exhaustion during mental activities (13%), fear due to HF (13%), and autonomy (12%). The most affected HRQoL dimensions according to the EQ-5D questionnaire were anxiety/depression (23% with severe problems), pain/discomfort (19%), and usual activities (15%). The overall average EQ-5D score was 0.39, with stable, chronic patients (never hospitalized) having a significantly better health state vs the rest of the cohort. Conclusion: This paper analyzed patient preference in HF using conjoint methodology. The preference weights resulting from the conjoint analysis could be used in the future to design HRQoL questionnaires that could better assess patient preferences in HF care. Keywords: heart failure, quality of life, conjoint analysis, utility, patient preference

  5. Variation in Cognitive Failures: An Individual Differences Investigation of Everyday Attention and Memory Failures

    Science.gov (United States)

    Unsworth, Nash; Brewer, Gene A.; Spillers, Gregory J.

    2012-01-01

    The present study examined individual differences in everyday cognitive failures assessed by diaries. A large sample of participants completed various cognitive ability measures in the laboratory. Furthermore, a subset of these participants also recorded everyday cognitive failures (attention, retrospective memory, and prospective memory failures)…

  6. Investigation of learning and experience curves

    Energy Technology Data Exchange (ETDEWEB)

    Krawiec, F.; Thornton, J.; Edesess, M.

    1980-04-01

    The applicability of learning and experience curves for predicting future costs of solar technologies is assessed, and the major test case is the production economics of heliostats. Alternative methods for estimating cost reductions in systems manufacture are discussed, and procedures for using learning and experience curves to predict costs are outlined. Because adequate production data often do not exist, production histories of analogous products/processes are analyzed and learning and aggregated cost curves for these surrogates estimated. If the surrogate learning curves apply, they can be used to estimate solar technology costs. The steps involved in generating these cost estimates are given. Second-generation glass-steel and inflated-bubble heliostat design concepts, developed by MDAC and GE, respectively, are described; a costing scenario for 25,000 units/yr is detailed; surrogates for cost analysis are chosen; learning and aggregate cost curves are estimated; and aggregate cost curves for the GE and MDAC designs are estimated. However, an approach that combines a neoclassical production function with a learning-by-doing hypothesis is needed to yield a cost relation compatible with the historical learning curve and the traditional cost function of economic theory.
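The experience-curve relation underlying this kind of cost projection is a power law in cumulative production. A minimal sketch with invented numbers (these are not the heliostat estimates discussed above):

```python
import math

# Experience curve: unit cost falls by a fixed percentage each time
# cumulative production doubles. A progress ratio of 0.8 (an "80% curve")
# means a 20% cost reduction per doubling. Values are illustrative only.

def unit_cost(first_unit_cost, cumulative_units, progress_ratio):
    b = math.log(progress_ratio) / math.log(2.0)  # learning exponent (< 0)
    return first_unit_cost * cumulative_units ** b

# Three doublings (1 -> 8 units) on an 80% curve: 1000 * 0.8**3
print(round(unit_cost(1000.0, 8, 0.8), 1))
```

The report's caveat applies directly: the exponent b must be estimated from production history (or a surrogate product's history), and extrapolating it to a new technology is an assumption, not a law.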

  7. Clinical usefulness of 123I-MIBG myocardial scintigraphy as a marker of the severity and prognosis of congestive heart failure

    International Nuclear Information System (INIS)

    Shiga, Koji

    1999-01-01

To evaluate the clinical usefulness of ¹²³I-MIBG myocardial scintigraphy in patients with congestive heart failure, myocardial dynamic imaging was performed immediately after ¹²³I-MIBG administration, at 1 frame/sec for 500 sec, in 52 patients with or without congestive heart failure. The %uptake/ROI, dynamic heart-to-mediastinum uptake ratio (H/M) and dynamic washout rate (WR) were calculated from the time-activity curves to assess the relationship between the NYHA functional class and these values. In 52 other patients with heart failure, initial and delayed MIBG anterior planar images were obtained, and the H/M in the delayed images and the WR between the initial and delayed images were measured. The patients were followed up for 31.8±16.8 months, and their survival rates were compared among three groups stratified by the delayed H/M ratio. ¹²³I-MIBG myocardial scintigraphy is very useful for assessing the severity and prognosis of patients with congestive heart failure. (K.H.)
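The two planar-image indices used above are simple ratios of region-of-interest counts. A sketch of their common definitions (simplified, without time-decay or background correction; count values are invented):

```python
# Common definitions of the MIBG planar-image indices mentioned above:
# heart-to-mediastinum uptake ratio (H/M) and washout rate (WR).
# Simplified sketch without decay correction; counts are invented.

def heart_to_mediastinum(heart_counts, mediastinum_counts):
    """H/M uptake ratio from mean ROI counts."""
    return heart_counts / mediastinum_counts

def washout_rate(initial_counts, delayed_counts):
    """Percent washout between initial and delayed acquisitions."""
    return 100.0 * (initial_counts - delayed_counts) / initial_counts

print(heart_to_mediastinum(180.0, 100.0))
print(round(washout_rate(180.0, 126.0), 1))
```

A low delayed H/M and a high WR generally indicate impaired cardiac sympathetic innervation, which is why both track severity and prognosis in heart failure.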

  8. Assessing rockfall susceptibility in steep and overhanging slopes using three-dimensional analysis of failure mechanisms

    Science.gov (United States)

    Matasci, Battista; Stock, Greg M.; Jaboyedoff, Michael; Carrea, Dario; Collins, Brian D.; Guérin, Antoine; Matasci, G.; Ravanel, L.

    2018-01-01

    Rockfalls strongly influence the evolution of steep rocky landscapes and represent a significant hazard in mountainous areas. Defining the most probable future rockfall source areas is of primary importance for both geomorphological investigations and hazard assessment. Thus, a need exists to understand which areas of a steep cliff are more likely to be affected by a rockfall. An important analytical gap exists between regional rockfall susceptibility studies and block-specific geomechanical calculations. Here we present methods for quantifying rockfall susceptibility at the cliff scale, which is suitable for sub-regional hazard assessment (hundreds to thousands of square meters). Our methods use three-dimensional point clouds acquired by terrestrial laser scanning to quantify the fracture patterns and compute failure mechanisms for planar, wedge, and toppling failures on vertical and overhanging rock walls. As a part of this work, we developed a rockfall susceptibility index for each type of failure mechanism according to the interaction between the discontinuities and the local cliff orientation. The susceptibility for slope parallel exfoliation-type failures, which are generally hard to identify, is partly captured by planar and toppling susceptibility indexes. We tested the methods for detecting the most susceptible rockfall source areas on two famously steep landscapes, Yosemite Valley (California, USA) and the Drus in the Mont-Blanc massif (France). Our rockfall susceptibility models show good correspondence with active rockfall sources. The methods offer new tools for investigating rockfall hazard and improving our understanding of rockfall processes.

  9. Regional Scale Sea Cliff Hazard Assessment at Sintra and Cascais Counties, Western Coast of Portugal

    Directory of Open Access Journals (Sweden)

    Fernando Marques

    2018-02-01

Full Text Available Mass movements of different types and sizes are the main processes of sea cliff evolution and a considerable natural hazard, the assessment of which is a relevant issue in terms of human loss prevention and land use regulations. To predict the occurrence of future failures affecting the cliff top in slowly retreating cliffs, a study was made using the logistic regression statistical method and a set of predisposing factors mainly related to geology (lithology, structure, faults), geomorphology (maximum, mean and standard deviation of slope angle, height, aspect, curvatures), toe protection and nearshore mean annual wave power, which were correlated with an aerial-photo-interpretation-based inventory of cliff failures that occurred in a 63-year period (1947–2010). The susceptibility model was validated against the inventory data using standard receiver operating characteristic (ROC) curves, which provided area under the curve (AUC) values higher than 0.8. In spite of the room for improvement of cliff failure inventories and of the predisposing factors to be used in these types of studies, namely those related to rock mass strength and nearshore wave power, the results obtained indicate that the proposed approach is an effective contribution to objective and quantitative hazard assessment.
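A logistic-regression susceptibility model of the kind fitted above maps a linear combination of predisposing factors to a failure probability through the logistic link. The coefficients, scalings and factor values below are all invented for illustration; they are not the paper's fitted model.

```python
import math

# Hypothetical sketch of a logistic-regression cliff-failure
# susceptibility score: probability from a linear combination of
# (scaled) predisposing factors. All numbers are invented.

def failure_susceptibility(factors, coefficients, intercept):
    z = intercept + sum(c * x for c, x in zip(coefficients, factors))
    return 1.0 / (1.0 + math.exp(-z))  # logistic link

# e.g. scaled [mean slope angle, cliff height, nearshore wave power]
p = failure_susceptibility([0.8, 0.5, 0.6], [2.0, 1.5, 1.0], intercept=-3.0)
print(round(p, 3))
```

Validation then proceeds exactly as in the abstract: score every terrain unit, compare the scores with the failure inventory, and summarize discrimination with the ROC curve's AUC.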

  10. Fuel failure assessments based on radiochemistry. Experience feedback and challenges

    International Nuclear Information System (INIS)

    Petit, C.; Ziabletsev, D.; Zeh, P.

    2015-01-01

Significant improvements have been observed in LWR nuclear fuel reliability over the past years. As a result, the number of fuel failures in PWRs and BWRs has recently decreased dramatically. Nevertheless, a few challenges remain. One of them is that the industry has recently started seeing a relatively new type of fuel failure, so-called 'weak leak failures', characterized by a very small release of gaseous fission products and essentially zero release of iodines or any other soluble fission products into the reactor coolant. Correspondingly, the behavior of these weak leakers does not follow the typical behavior of a conventional leaker, for which the amount of released Xe-133 is proportional to the failed rod power. Instead, for a weak leaker, the activity of Xe-133 correlates directly with the size of the cladding defect. The presence of an undetected weak leaker in the core may lead to carryover of the leaker into the subsequent cycle. Even if the presence of a weak leaker in the core is suspected, identifying it typically requires more effort, which can extend the duration of the outage and ultimately cause economic losses to the utility operating the reactor. To deal effectively with this issue, several changes have recently been introduced that differ from the methodology for dealing with conventional leakers. These changes include new assessment methods, improved sipping techniques to better identify low-release leakers, and correspondingly better equipment to locate the small clad defects associated with weak leakers, such as sensitive failed-rod localization devices, sensitive eddy-current coils, ultra-high-definition cameras for failed-rod examination, and experienced fuel reliability engineers performing cause-of-failure and root-cause research and analyses. Ultimately, the destructive

  11. Evaluation of the learning curve for external cephalic version using cumulative sum analysis.

    Science.gov (United States)

    Kim, So Yun; Han, Jung Yeol; Chang, Eun Hye; Kwak, Dong Wook; Ahn, Hyun Kyung; Ryu, Hyun Mi; Kim, Moon Young

    2017-07-01

We evaluated the learning curve for external cephalic version (ECV) using learning curve-cumulative sum (LC-CUSUM) analysis. This was a retrospective study involving 290 consecutive cases between October 2013 and March 2017. We evaluated the learning curve for ECV in the nullipara and para ≥1 groups using LC-CUSUM analysis, assuming target success rates of 50% and 70% for ECV procedures, by describing a quadratic trend line with reliable R² values. The overall success rate for ECV was 64.8% (188/290), while the success rates for the nullipara and para ≥1 groups were 56.2% (100/178) and 78.6% (88/112), respectively. The 'H' value, at which the actual failure rate does not differ from the acceptable failure rate, was -3.27 and -1.635 when considering ECV success rates of 50% and 70%, respectively. Consequently, in order to obtain a consistent 50% success rate, we would require 57 nullipara cases, and in order to obtain a consistent 70% success rate, we would require 130 nullipara cases. In contrast, 8 to 10 para ≥1 cases would be required for expected success rates of 50% and 70% in the para ≥1 group. Even a relatively inexperienced physician can experience success with multipara cases and, after accumulating experience, can manage nullipara cases. Further research is required on LC-CUSUM involving several practitioners instead of a single practitioner. This will lead to the gradual implementation of standard learning-curve guidelines for ECV.
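The mechanics of an LC-CUSUM-style competence signal can be sketched as follows. This is a deliberately simplified cumulative score (successes pull the statistic down by the acceptable failure rate, failures push it up, capped at zero), not the exact likelihood-ratio form of LC-CUSUM; the outcome sequences and limits are illustrative only.

```python
# Hedged LC-CUSUM-style sketch, NOT the exact likelihood-ratio statistic:
# the score accumulates (outcome - acceptable_failure_rate) per case,
# capped at zero from above; crossing the negative limit h signals that
# performance has reached the target. outcomes: 1 = failure, 0 = success.

def lc_cusum_signal(outcomes, acceptable_failure_rate, h):
    """Return the 1-based index of the first case at which the score
    crosses h (competence signal), or None if it never does."""
    s = 0.0
    for i, x in enumerate(outcomes, start=1):
        s = min(0.0, s + (x - acceptable_failure_rate))  # capped at zero
        if s <= h:
            return i
    return None

# With h = -1.635 and a 50% acceptable failure rate, an uninterrupted run
# of successes signals after 4 cases; an early failure delays the signal.
print(lc_cusum_signal([0, 0, 0, 0], 0.5, -1.635))
print(lc_cusum_signal([0, 0, 1, 0, 0, 0, 0, 0], 0.5, -1.635))
```

The number of cases to signal thus depends jointly on the acceptable failure rate, the limit h, and the trainee's actual success sequence, which is how the study arrives at different case counts for the nullipara and para ≥1 groups.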

  12. Reframing Success and Failure of Information Systems

    DEFF Research Database (Denmark)

    Cecez-Kecmanovic, Dubravka; Kautz, Karlheinz; Abrahall, Rebecca

    2014-01-01

The paper questions common assumptions in the dominant representational framings of information systems success and failure and proposes a performative perspective that conceives IS success and failure as relational effects performed by sociomaterial practices of IS project actor-networks of developers, managers, technologies, project documents, methodologies, and other actors. Drawing from a controversial case of a highly innovative information system in an insurance company, considered a success and a failure at the same time, the paper reveals the inherent indeterminacy of IS success and failure, with sociomaterial practices performing both different IS realities and competing IS assessments. The analysis shows that the IS project and the implemented system as objects of assessment are not given and fixed, but are performed by the agencies of assessment together with the assessment outcomes of success and failure. The paper

  13. Development of failure criterion for Kevlar-epoxy fabric laminates

    Science.gov (United States)

    Tennyson, R. C.; Elliott, W. G.

    1984-01-01

    The development of the tensor polynomial failure criterion for composite laminate analysis is discussed. In particular, emphasis is given to the fabrication and testing of Kevlar-49 fabric (Style 285)/Narmco 5208 Epoxy. The quadratic-failure criterion with F(12)=0 provides accurate estimates of failure stresses for the Kevlar/Epoxy investigated. The cubic failure criterion was re-cast into an operationally easier form, providing the engineer with design curves that can be applied to laminates fabricated from unidirectional prepregs. In the form presented no interaction strength tests are required, although recourse to the quadratic model and the principal strength parameters is necessary. However, insufficient test data exists at present to generalize this approach for all undirectional prepregs and its use must be restricted to the generic materials investigated to-date.

  14. Evaluation of crack growth behavior and probabilistic S–N characteristics of carburized Cr–Mn–Si steel with multiple failure modes

    International Nuclear Information System (INIS)

    Li, Wei; Sun, Zhenduo; Zhang, Zhenyu; Deng, Hailong; Sakai, Tatsuo

    2014-01-01

    Highlights: • Stepwise S–N characteristics were observed only for interior-induced failure. • The interior crack growth behavior, with threshold conditions in different stages, was clarified. • The distribution characteristics of the test data in the transition failure region were evaluated. • A model for evaluating the probabilistic S–N curve with multiple failure modes was developed. - Abstract: The unexpected failure of case-hardened steels in the long-life regime has been a critical issue in modern engineering design. In this study, the failure behavior of a carburized Cr–Mn–Si steel under very high cycle fatigue (VHCF) was investigated, and a model for evaluating the probabilistic S–N curve associated with multiple failure modes was developed. Results show that the carburized Cr–Mn–Si steel exhibits three failure modes: surface flaw-induced failure, interior inclusion-induced failure without a fine granular area (FGA), and interior inclusion-induced failure with an FGA. As the predominant failure mode in the VHCF regime, the interior failure process can be divided into four stages: (i) small crack growth around the inclusion, (ii) stable macroscopic crack growth outside the FGA, (iii) unstable crack growth outside the fish-eye and (iv) momentary fracture outside the final crack growth zone. The corresponding threshold values are evaluated to be 2.33 MPa·m^1/2, 4.13 MPa·m^1/2, 18.51 MPa·m^1/2 and 29.26 MPa·m^1/2, respectively. The distribution characteristics of the test data in the transition failure region can be well characterized by a mixed two-parameter Weibull distribution function. The developed probabilistic S–N curve model is in good agreement with the test data with multiple failure modes. Although the result is somewhat conservative in the VHCF regime, it is acceptable for safety considerations.
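    The mixed two-parameter Weibull distribution used for the transition failure region can be sketched as follows; the mixing weight and the scale/shape parameters below are invented for illustration, not the values fitted in the paper:

```python
import math

def mixed_weibull_cdf(n, p, eta1, beta1, eta2, beta2):
    """Failure probability at fatigue life n for a mixture of two
    two-parameter Weibull populations, e.g. a surface-induced and an
    interior-induced failure mode; p is the mixing weight."""
    f1 = 1.0 - math.exp(-((n / eta1) ** beta1))
    f2 = 1.0 - math.exp(-((n / eta2) ** beta2))
    return p * f1 + (1.0 - p) * f2

# illustrative parameters: a short-life mode and a long-life (VHCF) mode
cdf = lambda n: mixed_weibull_cdf(n, 0.4, 1e6, 1.5, 1e9, 1.2)
```

    Fitting p and the four Weibull parameters to lives observed in the transition region then yields the probabilistic S–N band at a given stress level.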

  15. Vitamin D and Heart Failure.

    Science.gov (United States)

    Marshall Brinkley, D; Ali, Omair M; Zalawadiya, Sandip K; Wang, Thomas J

    2017-10-01

    Vitamin D is principally known for its role in calcium homeostasis, but preclinical studies implicate multiple pathways through which vitamin D may affect cardiovascular function and influence risk for heart failure. Many adults with cardiovascular disease have low vitamin D status, making it a potential therapeutic target. We review the rationale and potential role of vitamin D supplementation in the prevention and treatment of chronic heart failure. Substantial observational evidence has associated low vitamin D status with the risk of heart failure, ventricular remodeling, and clinical outcomes in heart failure, including mortality. However, trials assessing the influence of vitamin D supplementation on surrogate markers and clinical outcomes in heart failure have generally been small and inconclusive. There are insufficient data to recommend routine assessment or supplementation of vitamin D for the prevention or treatment of chronic heart failure. Prospective trials powered for clinical outcomes are warranted.

  16. Curve of Spee and Its Relationship with Dentoskeletal Morphology

    Directory of Open Access Journals (Sweden)

    Prerna Raje Batham

    2013-01-01

    Conclusion: The curve of Spee is related to various dentoskeletal variables. Thus, the determination of this relationship is useful to assess the feasibility of leveling the curve of Spee by orthodontic treatment.

  17. Progressive senile scoliosis: Seven cases of increasing spinal curves in elderly patients

    Energy Technology Data Exchange (ETDEWEB)

    Gillespy, T. III; Gillespy, T. Jr.; Revak, C.S.

    1985-04-01

    An increasing scoliosis was documented in seven elderly women. The average curve at the most recent examination was 43° (range 26°-78°). Previous films, from 5 to 26 years before, demonstrated an average increase of 2.3°/year (range 1°-4.8°/year). There were three lumbar and four thoracolumbar curves. Three curves were to the right and four were to the left. Only one patient had osteoporotic vertebral body crush fractures. The common underlying mechanism in the progression of senile scoliosis appears to be asymmetric loading of the spine which can be caused by a previously established scoliosis, spondylolysis/spondylolisthesis, lumbosacral anomalies, or leg length discrepancy. Subsequently, factors that can cause a curve to increase include degenerative disc disease with lateral disc space narrowing, soft tissue failure, and osteoporosis. Since even minor scoliosis can potentially progress in the older adult, increased monitoring of scoliosis in patients over age 50 years may be warranted.

  18. Progressive senile scoliosis: Seven cases of increasing spinal curves in elderly patients

    International Nuclear Information System (INIS)

    Gillespy, T. III; Revak, C.S.

    1985-01-01

    An increasing scoliosis was documented in seven elderly women. The average curve at the most recent examination was 43° (range 26°-78°). Previous films, from 5 to 26 years before, demonstrated an average increase of 2.3°/year (range 1°-4.8°/year). There were three lumbar and four thoracolumbar curves. Three curves were to the right and four were to the left. Only one patient had osteoporotic vertebral body crush fractures. The common underlying mechanism in the progression of senile scoliosis appears to be asymmetric loading of the spine which can be caused by a previously established scoliosis, spondylolysis/spondylolisthesis, lumbosacral anomalies, or leg length discrepancy. Subsequently, factors that can cause a curve to increase include degenerative disc disease with lateral disc space narrowing, soft tissue failure, and osteoporosis. Since even minor scoliosis can potentially progress in the older adult, increased monitoring of scoliosis in patients over age 50 years may be warranted. (orig.)

  19. Implementation of the Master Curve method in ProSACC

    Energy Technology Data Exchange (ETDEWEB)

    Feilitzen, Carl von; Sattari-Far, Iradj [Inspecta Technology AB, Stockholm (Sweden)

    2012-03-15

    Cleavage fracture toughness data normally display a large amount of statistical scatter in the transition region. The cleavage toughness data in this region are specimen size-dependent and should be treated statistically rather than deterministically. The Master Curve methodology is a procedure for mechanical testing and statistical analysis of the fracture toughness of ferritic steels in the transition region. The methodology accounts for the temperature and size dependence of fracture toughness. Using the Master Curve methodology to evaluate fracture toughness in the transition region relieves the overconservatism that has been observed when using the ASME KIC curve. One main advantage of the Master Curve methodology is the possibility of using small Charpy-size specimens to determine fracture toughness. A detailed description of the Master Curve methodology is given by Sattari-Far and Wallin [2005]. ProSACC is a suitable program for structural integrity assessments of components containing crack-like defects and for defect tolerance analysis. The program makes it possible to conduct assessments on deterministic or probabilistic grounds. The method utilized in ProSACC is based on the R6 method developed at Nuclear Electric plc, Milne et al [1988]. The basic assumption in this method is that fracture in a cracked body can be described by two parameters, Kr and Lr. The parameter Kr is the ratio between the stress intensity factor and the fracture toughness of the material. The parameter Lr is the ratio between the applied load and the plastic limit load of the structure. The ProSACC assessment results are therefore highly dependent on the fracture toughness value applied in the assessment. In this work, the main options of the Master Curve methodology are implemented in the ProSACC program. Different options for evaluating Master Curve fracture toughness from standard fracture toughness testing data or impact testing data are considered.

  20. Implementation of the Master Curve method in ProSACC

    International Nuclear Information System (INIS)

    Feilitzen, Carl von; Sattari-Far, Iradj

    2012-03-01

    Cleavage fracture toughness data normally display a large amount of statistical scatter in the transition region. The cleavage toughness data in this region are specimen size-dependent and should be treated statistically rather than deterministically. The Master Curve methodology is a procedure for mechanical testing and statistical analysis of the fracture toughness of ferritic steels in the transition region. The methodology accounts for the temperature and size dependence of fracture toughness. Using the Master Curve methodology to evaluate fracture toughness in the transition region relieves the overconservatism that has been observed when using the ASME KIC curve. One main advantage of the Master Curve methodology is the possibility of using small Charpy-size specimens to determine fracture toughness. A detailed description of the Master Curve methodology is given by Sattari-Far and Wallin [2005]. ProSACC is a suitable program for structural integrity assessments of components containing crack-like defects and for defect tolerance analysis. The program makes it possible to conduct assessments on deterministic or probabilistic grounds. The method utilized in ProSACC is based on the R6 method developed at Nuclear Electric plc, Milne et al [1988]. The basic assumption in this method is that fracture in a cracked body can be described by two parameters, Kr and Lr. The parameter Kr is the ratio between the stress intensity factor and the fracture toughness of the material. The parameter Lr is the ratio between the applied load and the plastic limit load of the structure. The ProSACC assessment results are therefore highly dependent on the fracture toughness value applied in the assessment. In this work, the main options of the Master Curve methodology are implemented in the ProSACC program. Different options for evaluating Master Curve fracture toughness from standard fracture toughness testing data or impact testing data are considered.
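    The two-parameter (Kr, Lr) assessment described above can be sketched as follows. The curve used here is the widely quoted material-independent approximation of the R6 Option 1 form, and the cut-off lr_max is a placeholder that in practice depends on the material's flow properties:

```python
import math

def fad_curve(lr):
    """A material-independent failure assessment curve of the R6
    Option 1 form: the acceptable Kr as a function of Lr."""
    return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

def assess(k_i, k_mat, load, limit_load, lr_max=1.15):
    """Assessment point: Kr = K_I / K_mat, Lr = load / limit load.
    The flaw is acceptable if the point lies inside the curve and
    below the plastic-collapse cut-off lr_max (placeholder value)."""
    kr, lr = k_i / k_mat, load / limit_load
    return (lr <= lr_max) and (kr <= fad_curve(lr))
```

    Points near the Kr axis indicate fracture-dominated failure, points near the Lr cut-off indicate plastic collapse, and the curve interpolates between the two regimes.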

  1. Application of a few orthogonal polynomials to the assessment of the fracture failure probability of a spherical tank

    International Nuclear Information System (INIS)

    Cao Tianjie; Zhou Zegong

    1993-01-01

    This paper presents methods to assess the fracture failure probability of a spherical tank. These methods convert the assessment of the fracture failure probability into the calculation of the moments of cracks and a one-dimensional integral. We first derive series formulae to calculate the moments of cracks under crack fatigue growth and the moments of crack opening displacements according to the JWES-2805 code. We then use the first n moments of crack opening displacements and a few orthogonal polynomials to compose the probability density function of the crack opening displacement. Lastly, the fracture failure probability is obtained according to interference theory. An example shows that these methods are simpler, quicker and more accurate, while avoiding the disadvantages of Edgeworth's series method. (author)

  2. Trial application of the candidate root cause categorization scheme and preliminary assessment of selected data bases for the root causes of component failures program

    International Nuclear Information System (INIS)

    Bruske, S.Z.; Cadwallader, L.C.; Stepina, P.L.

    1985-04-01

    The objective of the Nuclear Regulatory Commission's (NRC) Root Causes of Component Failures Program is to develop and apply a categorization scheme for identifying root causes of failures for components that comprise safety and safety support systems of nuclear power plants. Results from this program will provide valuable input in the areas of probabilistic risk assessment, reliability assurance, and application of risk assessments in the inspection program. This report presents the trial application and assessment of the candidate root cause categorization scheme to three failure data bases: the In-Plant Reliability Data System (IPRDS), the Licensee Event Report (LER) data base, and the Nuclear Plant Reliability Data System (NPRDS). Results of the trial application/assessment show that significant root cause information can be obtained from these failure data bases

  3. Prognostic utility of the Seattle Heart Failure Score and amino terminal pro B-type natriuretic peptide in varying stages of systolic heart failure.

    Science.gov (United States)

    Adlbrecht, Christopher; Hülsmann, Martin; Neuhold, Stephanie; Strunk, Guido; Pacher, Richard

    2013-05-01

    Cardiac transplantation represents the best procedure to improve long-term clinical outcome in advanced chronic heart failure (CHF), if pre-selection criteria suffice to ensure that the risk of the failing heart outweighs the risk of transplantation. Although it is the cornerstone of success, risk assessment in heart transplant candidates is still under-investigated. Amino terminal pro B-type natriuretic peptide (NT-proBNP) is regarded as the best predictor of outcome in CHF, and the Seattle Heart Failure Score (SHFS), based on clinical markers, is widely used when NT-proBNP is unavailable. The present study assessed the predictive value of the SHFS for all-cause death in CHF patients and compared it with NT-proBNP in a multivariate model including established baseline parameters known to predict survival. A total of 429 patients receiving stable HF-specific pharmacotherapy were included and monitored for 53.4 ± 20.6 months. Of these, 133 patients (31%) died during follow-up. Several established predictors of death proved significant on univariate analysis for the total study cohort. Systolic pulmonary arterial pressure (hazard ratio [HR], 1.03; 95% confidence interval [CI], 1.02-1.05; p < 0.001, Wald 15.1), logNT-proBNP (HR, 1.51; 95% CI, 1.22-1.86; p < 0.001, Wald 14.9), and the SHFS (HR, 0.99; 95% CI, 0.99-1.00; p < 0.001, Wald 12.6) remained within the stepwise multivariate Cox regression model as independent predictors of all-cause death. Receiver operating characteristic curve analysis revealed an area under the curve of 0.802 for logNT-proBNP and 0.762 for the SHFS. NT-proBNP is the more potent marker for identifying patients at the highest risk. If an NT-proBNP measurement is unavailable, the SHFS may serve as an adequate clinical surrogate to predict all-cause death. Copyright © 2013 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
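    The area under the ROC curve reported above can be computed without any curve fitting via its rank-statistic interpretation; the marker values below are invented for illustration:

```python
def roc_auc(markers_died, markers_survived):
    """AUC as the Mann-Whitney probability that a patient who died has
    a higher marker value than one who survived; ties count one half."""
    pairs = [(d, s) for d in markers_died for s in markers_survived]
    wins = sum(1.0 if d > s else 0.5 if d == s else 0.0 for d, s in pairs)
    return wins / len(pairs)
```

    An AUC of 0.5 corresponds to a marker with no discrimination, 1.0 to perfect separation of the two outcome groups.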

  4. Electrical impedance tomography in the assessment of extravascular lung water in noncardiogenic acute respiratory failure

    NARCIS (Netherlands)

    Kunst, P. W.; Vonk Noordegraaf, A.; Raaijmakers, E.; Bakker, J.; Groeneveld, A. B.; Postmus, P. E.; de Vries, P. M.

    1999-01-01

    STUDY OBJECTIVES: To establish the value of electrical impedance tomography (EIT) in assessing pulmonary edema in noncardiogenic acute respiratory failure (ARF), as compared to the thermal dye double indicator dilution technique (TDD). DESIGN: Prospective clinical study. SETTING: ICU of a general

  5. Noninvasive radiographic assessment of cardiovascular function in acute and chronic respiratory failure

    International Nuclear Information System (INIS)

    Berger, H.J.; Matthay, R.A.

    1981-01-01

    Noninvasive radiographic techniques have provided a means of studying the natural history and pathogenesis of cardiovascular performance in acute and chronic respiratory failure. Chest radiography, radionuclide angiocardiography and thallium-201 imaging, and M mode and cross-sectional echocardiography have been employed. Each of these techniques has specific uses, attributes and limitations. For example, measurement of descending pulmonary arterial diameters on the plain chest radiograph allows determination of the presence or absence of pulmonary arterial hypertension. Right and left ventricular performance can be evaluated at rest and during exercise using radionuclide angiocardiography. The biventricular response to exercise and to therapeutic interventions also can be assessed with this approach. Evaluation of the pulmonary valve echogram and echocardiographic right ventricular dimensions have been shown to reflect right ventricular hemodynamics and size. Each of these noninvasive techniques has been applied to the study of patients with respiratory failure and has provided important physiologic data

  6. Two viewpoints for software failures and their relation in probabilistic safety assessment of digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2015-01-01

    As the use of digital systems in nuclear power plants increases, the reliability of the software becomes one of the important issues in probabilistic safety assessment. In this paper, two viewpoints for a software failure during the operation of a digital system or a statistical software test are identified, and the relation between them is provided. In conventional software reliability analysis, a failure is mainly viewed with respect to the system operation. A new viewpoint with respect to the system input is suggested. The failure probability density functions for the two viewpoints are defined, and the relation between the two failure probability density functions is derived. Each failure probability density function can be derived from the other failure probability density function by applying the derived relation between the two failure probability density functions. The usefulness of the derived relation is demonstrated by applying it to the failure data obtained from the software testing of a real system. The two viewpoints and their relation, as identified in this paper, are expected to help us extend our understanding of the reliability of safety-critical software. (author)

  7. Deformation and failure mechanism of slope in three dimensions

    Directory of Open Access Journals (Sweden)

    Yingfa Lu

    2015-04-01

    Full Text Available Understanding three-dimensional (3D) slope deformation and failure mechanisms and the corresponding stability analyses are crucially important issues in geotechnical engineering. In this paper, the mechanisms of progressive failure with thrust-type and pull-type landslides are described in detail. It is considered that the post-failure stress state and the pre-peak stress state may occur in different regions of a landslide body as deformation develops, and a critical stress state element (or soil slice block) exists between the post-failure and pre-peak stress state regions. In this regard, two failure modes are suggested for thrust-type and three for pull-type landslides, based on the characteristics of shear stress and strain (or tensile stress and strain). Accordingly, a new joint constitutive model (JCM) is proposed based on current stability analytical theories, and it can be used to describe the mechanical behaviors of geo-materials with softening properties. Five methods for slope stability calculation are proposed: CSRM (comprehensive sliding resistance method), MTM (main thrust method), CDM (comprehensive displacement method), SDM (surplus displacement method), and MPM (main pull method). The S-shaped curve of monitored displacement vs. time is presented for different points on the sliding surface during the progressive failure of a landslide, and the relationship between the displacement of different points on the sliding surface and the height of the landslide body is regarded as a parabolic curve. Comparisons between the predicted and observed load-displacement and displacement-time relations of points on the sliding surface are conducted. A classification of stable/unstable displacement-time curves is proposed. The definition of the main sliding direction of a landslide is also suggested, in such a way that the failure body of the landslide (simplified as a "collapse body") is only

  8. Diastolic effects of chronic digitalization in systolic heart failure.

    Science.gov (United States)

    Hassapoyannes, C A; Bergh, M E; Movahed, M R; Easterling, B M; Omoigui, N A

    1998-10-01

    The efficacy of short-term digitalization on exercise tolerance may, in part, reflect enhanced diastolic performance. However, cardiac glycosides can impair ventricular relaxation from cytosolic Ca++ overload. To detect any time-dependent adverse effect, we assessed the diastolic function after long-term use of digitalis in patients with mild to moderate systolic left ventricular failure. From a cohort of 80 patients who received long-term, randomized, double-blind treatment with digitalis versus placebo at the WJB Dorn Veterans Affairs Medical Center, 38 survivors were evaluated at the end of follow-up (mean 48.4 months) with evaluators blinded to treatment used. Each survivor underwent equilibrium scintigraphic and echocardiographic assessment of diastolic function. Peak and mean filling rates normalized with filling volume (FV), diastolic phase durations normalized with duration of diastole, and filling fractions were measured from the time-activity curve. The isovolumic relaxation period and ventricular dimensions were computed echocardiographically. By actual-treatment-received analysis, treated versus untreated patients manifested a trend toward longer isovolumic relaxation (80.76 ms vs 61.54 ms, P = .06) but a markedly lower peak rapid filling rate (6.39 FV/sec vs 10.56 FV/sec, P = .02) despite comparable loading conditions. In addition, treated patients exhibited a lower mean rate of rapid filling (2.75 FV/sec vs 3.78 FV/sec, P = .05) in the absence of a longer rapid filling duration. However, the end-diastolic ventricular dimension did not differ between the 2 groups. Similar results were obtained by intention-to-treat analysis. Importantly, the mortality rate from worsening heart failure in the inception cohort was lower in the digitalis group versus the placebo group (P = .05) with no difference in total cardiac or all-cause mortality. After long-term digitalization for systolic left ventricular failure, cross-sectional comparison with a control group

  9. Early biomarkers of acute kidney failure after heart angiography or heart surgery in patients with acute coronary syndrome or acute heart failure.

    Science.gov (United States)

    Torregrosa, Isidro; Montoliu, Carmina; Urios, Amparo; Elmlili, Nisrin; Puchades, María Jesús; Solís, Miguel Angel; Sanjuán, Rafael; Blasco, Maria Luisa; Ramos, Carmen; Tomás, Patricia; Ribes, José; Carratalá, Arturo; Juan, Isabel; Miguel, Alfonso

    2012-01-01

    Acute kidney injury (AKI) is a common complication in cardiac surgery and coronary angiography, which worsens patients' prognosis. The diagnosis is based on the increase in serum creatinine, which is delayed. It is necessary to identify and validate new biomarkers that allow for early and effective interventions. To assess the sensitivity and specificity of neutrophil gelatinase-associated lipocalin in urine (uNGAL), interleukin-18 (IL-18) in urine and cystatin C in serum for the early detection of AKI in patients with acute coronary syndrome or heart failure, and who underwent cardiac surgery or catheterization. The study included 135 patients admitted to the intensive care unit for acute coronary syndrome or heart failure due to coronary or valvular pathology and who underwent coronary angiography or cardiac bypass surgery or valvular replacement. The biomarkers were determined 12 hours after surgery and serum creatinine was monitored during the next six days for the diagnosis of AKI. The area under the ROC curve (AUC) for NGAL was 0.983, and for cystatin C and IL-18 the AUCs were 0.869 and 0.727, respectively. At a cut-off of 31.9 ng/ml for uNGAL the sensitivity was 100% and the specificity was 91%. uNGAL is an early marker of AKI in patients with acute coronary syndrome or heart failure and undergoing cardiac surgery and coronary angiography, with a higher predictive value than cystatin C or IL-18.

  10. Direct recording of cardiac output- and venous return-curves in the dog heart-lung preparation for a graphical analysis of the effects of cardioactive drugs.

    Science.gov (United States)

    Ishikawa, N; Taki, K; Hojo, Y; Hagino, Y; Shigei, T

    1978-09-01

    Dog heart-lung preparations were used. The "equilibrium point", defined as the point at which the cardiac output (CO) curve and the venous return (VR) curve cross when CO and VR are plotted against right atrial pressure, was recorded directly using an X-Y recorder. The CO-curve was obtained, as the locus of the equilibrium point, by raising and lowering the level of blood in the venous reservoir (competence test); the meaning of this procedure is to increase or decrease the mean systemic pressure, causing a corresponding parallel shift in the VR-curve. The VR-curve was obtained by changing myocardial contractility. When heart failure was induced by pentobarbital or by chloroform, the equilibrium point shifted downwards to the right, depicting the VR-curve. During development of the failure, the slopes of the CO-curves decreased gradually. The effects of cinobufagin and norepinephrine were also analyzed. Use of the X-Y recorder enabled us to establish uniform experimental conditions more easily and to follow the effects of drugs continuously on a diagram equating the CO- and VR-curves (Guyton's scheme).
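    The equilibrium point of such a diagram, where the cardiac-output and venous-return curves cross, can also be found numerically. The two curve shapes below are illustrative stand-ins (a saturating CO curve and a linear VR curve), not fitted dog data:

```python
def equilibrium_point(co, vr, lo=-4.0, hi=12.0, tol=1e-9):
    """Bisection for the right atrial pressure at which cardiac output
    equals venous return; assumes a single crossing on [lo, hi]."""
    f = lambda p: co(p) - vr(p)
    assert f(lo) * f(hi) <= 0.0  # a crossing must be bracketed
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# illustrative curves (pressure in mmHg, flow in L/min)
vr = lambda p: max(0.0, (7.0 - p) / 1.4)  # mean systemic pressure 7, venous resistance 1.4
co = lambda p: 10.0 * max(p + 4.0, 0.0) / (max(p + 4.0, 0.0) + 4.0)
```

    Raising the venous reservoir in the competence test corresponds to increasing the mean systemic pressure term (here 7.0), which shifts the VR line in parallel and moves the crossing point along the CO curve.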

  11. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)
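    The idea of distributing a failure probability over 'environments' can be sketched numerically; the environment weights and per-environment probabilities below are invented for illustration:

```python
def mean_failure_prob(envs):
    """envs: list of (weight, failure probability) pairs, with weights
    summing to one. Returns the environment-averaged probability."""
    assert abs(sum(w for w, _ in envs) - 1.0) < 1e-9
    return sum(w * p for w, p in envs)

def both_fail_prob(envs):
    """Probability that two identical components both fail when they
    share the same (unknown) environment: the average of p**2 rather
    than the square of the average. The excess over the independent
    estimate is how this approach captures dependent failure."""
    return sum(w * p * p for w, p in envs)

envs = [(0.8, 1e-4), (0.2, 5e-3)]  # e.g. good vs. poor maintenance regime
```

    With these placeholder numbers the pairwise failure probability exceeds the naive independent estimate by several-fold, which is the dependent-failure effect the DFP approach quantifies.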

  12. Failure Impact Assessment for Large-Scale Landslides Located Near Human Settlement: Case Study in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Ming-Chien Chung

    2018-05-01

    Full Text Available In 2009, Typhoon Morakot caused over 680 deaths and more than 20,000 landslides in Taiwan. From 2010 to 2015, the Central Geological Survey of the Ministry of Economic Affairs identified 1047 potential large-scale landslides in Taiwan, of which 103 may have affected human settlements. This paper presents an analytical procedure that can be applied to assess the possible impact of a landslide collapse on nearby settlements. In this paper, existing technologies, including interpretation of remote sensing images, hydrogeological investigation, and numerical analysis, are integrated to evaluate potential failure scenarios and the landslide scale of a specific case: the Xinzhuang landslide. GeoStudio and RAMMS analysis modes and hazard classification produced the following results: (1) evaluation of the failure mechanisms and the influence zones of large-scale landslides; (2) assessment of the migration and accumulation of the landslide mass after failure; and (3) a landslide hazard and evacuation map. The results of the case study show that this analytical procedure can quantitatively estimate potential threats to human settlements. Furthermore, it can be applied to other villages and used as a reference in disaster prevention and evacuation planning.

  13. Regional flow duration curves for ungauged sites in Sicily

    Directory of Open Access Journals (Sweden)

    F. Viola

    2011-01-01

    Full Text Available Flow duration curves are simple and powerful tools for dealing with many hydrological and environmental problems related to water quality assessment, water-use assessment and water allocation. Unfortunately, the scarcity of streamflow data restricts the use of these instruments to gauged basins. A regional model is developed here for estimating flow duration curves at ungauged basins in Sicily, Italy. Owing to the complex ephemeral behavior of the examined region, this study distinguishes dry periods, when flows are zero, from wet periods, using a three-parameter power law to describe the frequency distribution of flows. A large dataset of streamflows has been analyzed and the parameters of flow duration curves have been derived for about fifty basins. Regional regression equations have been developed to derive flow duration curves starting from morphological basin characteristics.
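    An empirical flow duration curve of the kind regionalized above can be built directly from a streamflow record. This sketch uses the Weibull plotting position and keeps zero flows, so that ephemeral (dry-period) behavior shows up as a curve that drops to zero at high exceedance probabilities:

```python
def flow_duration_curve(flows):
    """Return (exceedance probability, flow) pairs with flows sorted
    in descending order; exceedance uses the Weibull plotting
    position i / (n + 1)."""
    q = sorted(flows, reverse=True)
    n = len(q)
    return [(i / (n + 1.0), qi) for i, qi in enumerate(q, start=1)]
```

    A regional model then replaces the observed record with curve parameters predicted from basin characteristics.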

  14. A new development in the method of measurement of reciprocity-law failure and its application to screen/green-sensitive x-ray film systems

    International Nuclear Information System (INIS)

    Fujita, Hiroshi; Uchida, Suguru.

    1981-01-01

    Since it has been confirmed experimentally that the intensity of X-rays varies approximately as the focus-film distance (FFD) raised to the power -2.12, the X-ray intensity can be changed by varying the FFD. This paper shows that two types of reciprocity-failure curve, density vs. exposure time at constant exposure and relative exposure vs. exposure time at constant density, can easily be obtained from several time-scale characteristic curves taken experimentally at several FFDs in the rare-earth screen-film systems used. Only low-intensity reciprocity failure is present at exposure times longer than about 0.1 s for one film, but both low-intensity and high-intensity reciprocity failure occur in the other. The effects of reciprocity failure on the H-D curves can be seen in the shape of the relative speed. (author)
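    The empirical distance-intensity relation quoted above (exponent -2.12 rather than the ideal inverse-square -2) gives a direct way to trade focus-film distance against exposure time at constant exposure; the reference values below are arbitrary:

```python
def relative_intensity(ffd, ffd_ref=1.0):
    """X-ray intensity relative to the reference distance, using the
    empirical exponent -2.12 reported in the paper."""
    return (ffd / ffd_ref) ** -2.12

def time_for_constant_exposure(ffd, t_ref=1.0, ffd_ref=1.0):
    """Exposure time at distance ffd giving the same intensity x time
    product as t_ref at ffd_ref, ignoring reciprocity-law failure
    (which is exactly what the film measurements then reveal)."""
    return t_ref * (ffd / ffd_ref) ** 2.12
```

    Any density difference measured between such nominally equal exposures delivered at different intensities is then attributed to reciprocity-law failure.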

  15. Fuzzy logic prioritization of failures in a system failure mode, effects and criticality analysis

    International Nuclear Information System (INIS)

    Bowles, John B.; Pelaez, C.E.

    1995-01-01

    This paper describes a new technique, based on fuzzy logic, for prioritizing failures for corrective actions in a Failure Mode, Effects and Criticality Analysis (FMECA). As in a traditional criticality analysis, the assessment is based on the severity, frequency of occurrence, and detectability of an item failure. However, these parameters are here represented as members of a fuzzy set, combined by matching them against rules in a rule base, evaluated with min-max inferencing, and then defuzzified to assess the riskiness of the failure. This approach resolves some of the problems in traditional methods of evaluation and it has several advantages compared to strictly numerical methods: 1) it allows the analyst to evaluate the risk associated with item failure modes directly using the linguistic terms that are employed in making the criticality assessment; 2) ambiguous, qualitative, or imprecise information, as well as quantitative data, can be used in the assessment and they are handled in a consistent manner; and 3) it gives a more flexible structure for combining the severity, occurrence, and detectability parameters. Two fuzzy logic based approaches for assessing criticality are presented. The first is based on the numerical rankings used in a conventional Risk Priority Number (RPN) calculation and uses crisp inputs gathered from the user or extracted from a reliability analysis. The second, which can be used early in the design process when less detailed information is available, allows fuzzy inputs and also illustrates the direct use of the linguistic rankings defined for the RPN calculations.
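A toy version of the first approach (crisp 1-10 rankings in, fuzzified, min-max inferencing, defuzzified risk out) can be sketched as follows. The triangular membership functions, the two-rule base, and the singleton defuzzification are all invented for illustration and are far simpler than the rule base the paper describes:

```python
# Minimal fuzzy-FMECA sketch: severity (S), occurrence (O) and
# detectability (D) rankings on a 1-10 scale are fuzzified, matched
# against linguistic rules with min-max inferencing, and defuzzified
# into a risk score in roughly the same 1-10 range.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

LOW = lambda x: tri(x, 0, 1, 6)     # illustrative "low" ranking
HIGH = lambda x: tri(x, 4, 10, 11)  # illustrative "high" ranking

def fuzzy_risk(s, o, d):
    # Rule 1: IF S high AND O high AND D high THEN risk is high (singleton 9)
    w_high = min(HIGH(s), HIGH(o), HIGH(d))
    # Rule 2: IF S low OR (O low AND D low) THEN risk is low (singleton 2)
    w_low = max(LOW(s), min(LOW(o), LOW(d)))
    if w_high + w_low == 0:
        return 5.0  # no rule fires: neutral risk
    # Weighted-average (centroid of singletons) defuzzification:
    return (9 * w_high + 2 * w_low) / (w_high + w_low)

risky = fuzzy_risk(9, 8, 9)   # all rankings high -> risk near 9
benign = fuzzy_risk(2, 2, 3)  # all rankings low  -> risk near 2
```

A realistic rule base would cover every linguistic combination of S, O and D; the point of the sketch is only the mechanics of fuzzify, min-max inference, defuzzify.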

  16. Use of Master Curve technology for assessing shallow flaws in a reactor pressure vessel material

    International Nuclear Information System (INIS)

    Bass, Bennett Richard; Taylor, Nigel

    2006-01-01

    In the NESC-IV project an experimental/analytical program was performed to develop validated analysis methods for transferring fracture toughness data to shallow flaws in reactor pressure vessels subject to biaxial loading in the lower-transition temperature region. Within this scope an extensive range of fracture tests was performed on material removed from a production-quality reactor pressure vessel. The Master Curve analysis of these data is reported, together with its application to the assessment of the project's feature tests on large beam test pieces.

  17. Definition of containment failure

    International Nuclear Information System (INIS)

    Cybulskis, P.

    1982-01-01

    Core meltdown accidents of the types considered in probabilistic risk assessments (PRAs) have been predicted to lead to pressures that will challenge the integrity of containment structures. Review of a number of PRAs indicates considerable variation in the predicted probability of containment failure as a function of pressure. Since the results of PRAs are sensitive to the prediction of the occurrence and the timing of containment failure, better understanding of realistic containment capabilities and a more consistent approach to the definition of containment failure pressures are required. Additionally, since the size and location of the failure can also significantly influence the prediction of reactor accident risk, further understanding of likely failure modes is required. The thresholds and modes of containment failure may not be independent.
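One common way PRAs encode "probability of containment failure as a function of pressure" is a lognormal fragility curve. The sketch below uses illustrative capacity numbers, not values from any of the PRAs the abstract reviews:

```python
# Lognormal fragility model: P(containment failure | internal pressure p).
# median_capacity and beta are illustrative placeholders; in a real PRA
# they come from structural analysis of the specific containment.
import math

def fragility(p, median_capacity=1.0, beta=0.3):
    """Failure probability at pressure p (same units as median_capacity).

    beta is the logarithmic standard deviation of the capacity; the
    standard normal CDF is evaluated via math.erf.
    """
    if p <= 0:
        return 0.0
    z = math.log(p / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# By construction the failure probability is 50% at the median capacity:
p_fail = fragility(1.0)
```

The spread parameter beta is where the "considerable variation" between PRAs shows up: a small beta gives a near-step failure threshold, a large beta a gradual one.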

  18. Assessment of compressive failure process of cortical bone materials using damage-based model.

    Science.gov (United States)

    Ng, Theng Pin; R Koloor, S S; Djuansjah, J R P; Abdul Kadir, M R

    2017-02-01

    The main failure factors of cortical bone are aging or osteoporosis, accident and high-energy trauma, or physiological activities. However, the mechanism of damage evolution coupled with a yield criterion is considered one of the unclear subjects in failure analysis of cortical bone materials. Therefore, this study attempts to assess the structural response and progressive failure process of cortical bone using a brittle damaged plasticity model. For this reason, several compressive tests are performed on cortical bone specimens made of bovine femur, in order to obtain the structural response and mechanical properties of the material. A complementary finite element (FE) model of the sample and test is prepared to simulate the elastic-to-damage behavior of the cortical bone using the brittle damaged plasticity model. The FE model is validated comparatively using the predicted and measured structural response as load versus compressive displacement through simulation and experiment. FE results indicated that the compressive damage initiated and propagated at the central region where the maximum equivalent plastic strain is computed, which coincided with the degradation of structural compressive stiffness followed by a vast amount of strain energy dissipation. The parameter of compressive damage rate, which is a function of the damage parameter and the plastic strain, is examined for different rates. Results show that considering a rate similar to the initial slope of the damage parameter in the experiment gives a better prediction of compressive failure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Texas curve margin of safety.

    Science.gov (United States)

    2013-01-01

    This software can be used to assist with the assessment of margin of safety for a horizontal curve. It is intended for use by engineers and technicians responsible for safety analysis or management of rural highway pavement or traffic control devices...

  20. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    Cooper, S.E.; Lofgren, E.V.; Samanta, P.K.; Wong Seemeng

    1993-01-01

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  1. Relating oxygen partial pressure, saturation and content: the haemoglobin-oxygen dissociation curve.

    Science.gov (United States)

    Collins, Julie-Ann; Rudenski, Aram; Gibson, John; Howard, Luke; O'Driscoll, Ronan

    2015-09-01

    The delivery of oxygen by arterial blood to the tissues of the body has a number of critical determinants including blood oxygen concentration (content), saturation (SO2) and partial pressure, haemoglobin concentration and cardiac output, including its distribution. The haemoglobin-oxygen dissociation curve, a graphical representation of the relationship between oxygen saturation and oxygen partial pressure, helps us to understand some of the principles underpinning this process. Historically this curve was derived from very limited data based on blood samples from small numbers of healthy subjects which were manipulated in vitro and ultimately determined by equations such as those described by Severinghaus in 1979. In a study of 3524 clinical specimens, we found that this equation estimated the SO2 in blood from patients with normal pH and SO2 >70% with remarkable accuracy and, to our knowledge, this is the first large-scale validation of this equation using clinical samples. Oxygen saturation by pulse oximetry (SpO2) is nowadays the standard clinical method for assessing arterial oxygen saturation, providing a convenient, pain-free means of continuously assessing oxygenation, provided the interpreting clinician is aware of important limitations. The use of pulse oximetry reduces the need for arterial blood gas analysis (SaO2) as many patients who are not at risk of hypercapnic respiratory failure or metabolic acidosis and have acceptable SpO2 do not necessarily require blood gas analysis. While arterial sampling remains the gold-standard method of assessing ventilation and oxygenation, in those patients in whom blood gas analysis is indicated, arterialised capillary samples also have a valuable role in patient care. The clinical role of venous blood gases however remains less well defined.
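The Severinghaus (1979) relationship mentioned above can be sketched directly; the constant 23400 is from the published equation, and the check values are approximate:

```python
# Severinghaus (1979) equation relating oxygen partial pressure
# (PO2, mmHg) to haemoglobin saturation (SO2, %) at normal pH.

def severinghaus_so2(po2):
    """Estimated SO2 (%) for a given PO2 in mmHg, normal pH assumed."""
    return 100.0 / (23400.0 / (po2 ** 3 + 150.0 * po2) + 1.0)

# P50 (the PO2 at 50% saturation) is about 26.9 mmHg for normal adult
# blood, and arterial PO2 of 100 mmHg sits on the flat upper part of
# the sigmoid curve:
s_at_p50 = severinghaus_so2(26.9)   # close to 50%
s_at_100 = severinghaus_so2(100.0)  # close to 97.7%
```

The steep middle and flat top of the curve fall out of this single expression, which is why SpO2 is insensitive to PO2 changes above roughly 80 mmHg.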

  2. Relating oxygen partial pressure, saturation and content: the haemoglobin–oxygen dissociation curve

    Directory of Open Access Journals (Sweden)

    Julie-Ann Collins

    2015-09-01

    The delivery of oxygen by arterial blood to the tissues of the body has a number of critical determinants including blood oxygen concentration (content), saturation (SO2) and partial pressure, haemoglobin concentration and cardiac output, including its distribution. The haemoglobin–oxygen dissociation curve, a graphical representation of the relationship between oxygen saturation and oxygen partial pressure, helps us to understand some of the principles underpinning this process. Historically this curve was derived from very limited data based on blood samples from small numbers of healthy subjects which were manipulated in vitro and ultimately determined by equations such as those described by Severinghaus in 1979. In a study of 3524 clinical specimens, we found that this equation estimated the SO2 in blood from patients with normal pH and SO2 >70% with remarkable accuracy and, to our knowledge, this is the first large-scale validation of this equation using clinical samples. Oxygen saturation by pulse oximetry (SpO2) is nowadays the standard clinical method for assessing arterial oxygen saturation, providing a convenient, pain-free means of continuously assessing oxygenation, provided the interpreting clinician is aware of important limitations. The use of pulse oximetry reduces the need for arterial blood gas analysis (SaO2) as many patients who are not at risk of hypercapnic respiratory failure or metabolic acidosis and have acceptable SpO2 do not necessarily require blood gas analysis. While arterial sampling remains the gold-standard method of assessing ventilation and oxygenation, in those patients in whom blood gas analysis is indicated, arterialised capillary samples also have a valuable role in patient care. The clinical role of venous blood gases however remains less well defined.

  3. Assessment of modification level of hypoeutectic Al -Si alloys by pattern recognition of cooling curves

    Directory of Open Access Journals (Sweden)

    CHEN Xiang

    2005-11-01

    Full Text Available Most evaluations of modification level in Al-Si casting production are currently done qualitatively, according to a specific scale based on an American Foundry Society (AFS) standard wall chart. This method is quite dependent on human experience when comparing the microstructure with the standard chart, and the structures depicted in the AFS chart do not always resemble those seen in actual Al-Si castings. Therefore, this qualitative-analysis procedure is subjective and can introduce human-caused errors into comparative metallographic analyses. A quantitative parameter of the modification level was introduced by establishing the relationship between the mean area-weighted shape factor of the eutectic silicon phase and the modification level using image analysis technology. In order to evaluate the modification level, a new method called "intelligent evaluation of melt quality by pattern recognition of thermal analysis cooling curves" has also been introduced. The results show that the silicon modification level can be precisely assessed by comparing the cooling curve of the melt to be evaluated with the one most similar to it in a database.

  4. An analytical model for interactive failures

    International Nuclear Information System (INIS)

    Sun Yong; Ma Lin; Mathew, Joseph; Zhang Sheng

    2006-01-01

    In some systems, failures of certain components can interact with each other and accelerate the failure rates of these components. Such failures are defined as interactive failures. Interactive failure is a prevalent cause of failure in complex systems, particularly mechanical systems. The failure risk of an asset will be underestimated if the interactive effect is ignored, so interactive failures need to be considered when failure risk is assessed. However, the literature is silent on previous research work in this field. This paper introduces the concept of interactive failure, develops an analytical model to analyse this type of failure quantitatively, and verifies the model using case studies and experiments.

  5. Failure criterion of concrete type material and punching failure analysis of thick mortar plate

    International Nuclear Information System (INIS)

    Ohno, T.; Kuroiwa, M.; Irobe, M.

    1979-01-01

    In this paper a failure surface of concrete type material is proposed and its validity for structural analysis is examined. The study is an introductory part of the evaluation of ultimate strength of reinforced and prestressed concrete structures in reactor technology. The failure surface is expressed in a linear form in terms of octahedral normal and shear stresses. The coefficient of the latter stress is given by a trigonometric series in the threefold angle of similarity. Hence, its meridians are multilinear and the traces of its deviatoric sections are smooth curves with periodicity 2π/3 around the space diagonal in principal stress space. The mathematical expression of the surface has an arbitrary number of parameters so that material test results are well reflected. To confirm the effectiveness of the proposed failure criterion, experiment and numerical analysis by the finite element method on punching failure of a thick mortar plate in axial symmetry are compared. In the numerical procedure the yield surface of the material is assumed to exist mainly in the compression region, since a brittle cleavage or elastic fracture occurs in the concrete type material under stress states with tension, while a ductile or plastic fracture occurs under compressive stress states. (orig.)

  6. BILAM: a composite laminate failure-analysis code using bilinear stress-strain approximations

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, P.V. Jr.; Dasgupta, A.; Chun, Y.W.

    1980-10-01

    The BILAM code which uses constant strain laminate analysis to generate in-plane load/deformation or stress/strain history of composite laminates to the point of laminate failure is described. The program uses bilinear stress-strain curves to model layer stress-strain behavior. Composite laminates are used for flywheels. The use of this computer code will help to develop data on the behavior of fiber composite materials which can be used by flywheel designers. In this program the stress-strain curves are modelled by assuming linear response in axial tension while using bilinear approximations (2 linear segments) for stress-strain response to axial compressive, transverse tensile, transverse compressive and axial shear loadings. It should be noted that the program attempts to empirically simulate the effects of the phenomena which cause nonlinear stress-strain behavior, instead of mathematically modelling the micromechanics involved. This code, therefore, performs a bilinear laminate analysis, and, in conjunction with several user-defined failure interaction criteria, is designed to provide sequential information on all layer failures up to and including the first fiber failure. The modus operandi is described. Code BILAM can be used to: predict the load-deformation/stress-strain behavior of a composite laminate subjected to a given combination of in-plane loads, and make analytical predictions of laminate strength.

  7. Determination of sieve grading curves using an optical device

    OpenAIRE

    PHAM, AM; DESCANTES, Yannick; DE LARRARD, François

    2011-01-01

    The grading curve of an aggregate is a fundamental characteristic for mix design that can easily be modified to adjust several mix properties. While sieve analysis remains the reference method to determine this curve, optical devices are developing, allowing easier and faster assessment of aggregate grading. Unfortunately, optical grading results significantly differ from sieve grading curves. As a consequence, getting full acceptance of these new methods requires building bridges between the...

  8. Linear transform of the multi-target survival curve

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J V [Cambridge Univ. (UK). Dept. of Clinical Oncology and Radiotherapeutics

    1978-07-01

    A completely linear transform of the multi-target survival curve is presented. This enables all data, including those on the shoulder region of the curve, to be analysed. The necessity to make a subjective assessment about which data points to exclude for conventional methods of analysis is, therefore, removed. The analysis has also been adapted to include a 'Pike-Alper' method of assessing dose modification factors. For the data cited this predicts compatibility with the hypothesis of a true oxygen 'dose-modification' whereas the conventional Pike-Alper analysis does not.
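The multi-target model and one possible linearization can be sketched as follows. The generic transform below makes the whole curve, shoulder included, linear in dose, which is the property the abstract emphasizes; Watson's published transform may differ in detail:

```python
# Multi-target survival model S(D) = 1 - (1 - exp(-D/D0))**n and a
# linearization: solving for D gives
#   y = -ln(1 - (1 - S)**(1/n)) = D / D0,
# i.e. y is exactly linear in dose with slope 1/D0, including points
# on the shoulder, so no data need be excluded from the fit.
import math

def survival(dose, d0, n):
    """Multi-target surviving fraction at a given dose."""
    return 1.0 - (1.0 - math.exp(-dose / d0)) ** n

def linearized(s, n):
    """Transform a surviving fraction so it plots linearly against dose."""
    return -math.log(1.0 - (1.0 - s) ** (1.0 / n))

# Illustrative parameters: D0 = 1.5 Gy, extrapolation number n = 3.
d0, n = 1.5, 3
doses = [0.5, 1.0, 2.0, 4.0, 6.0]
ys = [linearized(survival(d, d0, n), n) for d in doses]
# Each transformed point recovers D/D0 exactly (noise-free data):
ratios = [y / (d / d0) for y, d in zip(ys, doses)]  # all ~1.0
```

With experimental data the same transform turns the fit into ordinary linear regression, and a dose-modification factor becomes a ratio of fitted slopes.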

  9. Failure Assessment for the High-Strength Pipelines with Constant-Depth Circumferential Surface Cracks

    OpenAIRE

    X. Liu; Z. X. Lu; Y. Chen; Y. L. Sui; L. H. Dai

    2018-01-01

    In the oil and gas transportation system over long distance, application of high-strength pipeline steels can efficiently reduce construction and operation cost by increasing operational pressure and reducing the pipe wall thickness. Failure assessment is an important issue in the design, construction, and maintenance of the pipelines. The small circumferential surface cracks with constant depth in the welded pipelines are of practical interest. This work provides an engineering estimation pr...

  10. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters a and b characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/QGM)^b], where QGM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
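The discharge-normalized parameterization has a convenient fitting property: once the regressor log(Q/QGM) is centred, log(â) is simply the mean of log(C). A sketch with synthetic, noise-free data (illustrative values only, not from the paper):

```python
# Discharge-normalized rating-curve fit, C = a_hat * (Q/Q_GM)**b,
# by ordinary least squares in log-log space. Because log(Q/Q_GM)
# has zero mean over the sample, log(a_hat) is the mean of log(C).
import math

def fit_normalized_rating_curve(q, c):
    """Return (a_hat, b) for C = a_hat * (Q/Q_GM)**b, Q_GM = geometric mean of Q."""
    log_q = [math.log(x) for x in q]
    log_c = [math.log(x) for x in c]
    mean_lq = sum(log_q) / len(log_q)        # log of the geometric mean Q_GM
    x = [lq - mean_lq for lq in log_q]       # zero-mean regressor
    mean_lc = sum(log_c) / len(log_c)
    b = sum(xi * (lc - mean_lc) for xi, lc in zip(x, log_c)) / \
        sum(xi * xi for xi in x)
    a_hat = math.exp(mean_lc)                # intercept evaluated at Q = Q_GM
    return a_hat, b

# Synthetic concentrations generated from a_hat = 50, b = 1.4 (no noise):
q = [2.0, 5.0, 10.0, 40.0, 120.0]
qgm = math.exp(sum(math.log(x) for x in q) / len(q))
c = [50.0 * (x / qgm) ** 1.4 for x in q]
a_hat, b = fit_normalized_rating_curve(q, c)  # recovers ~ (50, 1.4)
```

Unlike the raw offset a, the fitted â here is evaluated at the centre of the sampled discharges, which is why it tracks vertical shifts of the curve rather than co-varying with b.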

  11. Association between Platelet Counts before and during Pharmacological Therapy for Patent Ductus Arteriosus and Treatment Failure in Preterm Infants

    Directory of Open Access Journals (Sweden)

    Hannes Sallmon

    2018-03-01

    Full Text Available Background: The role of platelets in mediating closure of the ductus arteriosus in human preterm infants is controversial. In particular, the effect of low platelet counts on pharmacological treatment failure is still unclear. Methods: In this retrospective study of 471 preterm infants [<1,500 g birth weight (BW)], who were treated for a patent ductus arteriosus (PDA) with indomethacin or ibuprofen, we investigated whether platelet counts before or during pharmacological treatment had an impact on the successful closure of a hemodynamically significant PDA. The effects of other factors, such as sepsis, preeclampsia, gestational age, BW, and gender, were also evaluated. Results: Platelet counts before initiation of pharmacological PDA treatment did not differ between infants with later treatment success or failure. However, we found significant associations between low platelet counts during pharmacological PDA therapy and treatment failure (p < 0.05). Receiver operating characteristic (ROC) curve analysis showed that platelet counts after the first, and before and after the second, cyclooxygenase inhibitor (COXI) cycle were significantly associated with treatment failure (area under the curve >0.6). However, ROC curve analysis did not reveal a specific platelet cutoff value that could predict PDA treatment failure. Multivariate logistic regression analysis showed that lower platelet counts, a lower BW, and preeclampsia were independently associated with COXI treatment failure. Conclusion: We provide further evidence for an association between low platelet counts during pharmacological therapy for symptomatic PDA and treatment failure, while platelet counts before initiation of therapy did not affect treatment outcome.

  12. Readability Assessment of Online Patient Education Material on Congestive Heart Failure

    Science.gov (United States)

    2017-01-01

    Background: Online health information is being used more ubiquitously by the general population. However, this information typically favors only a small percentage of readers, which can result in suboptimal medical outcomes for patients. Objective: The readability of online patient education materials on congestive heart failure was assessed with six readability assessment tools. Methods: The search phrase "congestive heart failure" was entered into the search engine Google. Of the first 100 websites, only 70 complied with the selection and exclusion criteria and were included. These were then assessed with six readability assessment tools. Results: Only 5 out of 70 websites were within the limits of the recommended sixth-grade readability level. The mean readability scores were as follows: Flesch-Kincaid Grade Level (9.79), Gunning-Fog Score (11.95), Coleman-Liau Index (15.17), Simple Measure of Gobbledygook (SMOG) index (11.39), and Flesch Reading Ease (48.87). Conclusion: Most of the analyzed websites were found to be above the sixth-grade readability level recommendations. Efforts need to be made to better tailor online patient education materials to the general population. PMID:28656111
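Two of the scores reported above can be computed from first principles. The sketch below uses the published Flesch-Kincaid Grade Level and Flesch Reading Ease formulas with a deliberately naive vowel-group syllable counter, so it is only a rough approximation of what dedicated readability tools do:

```python
# Flesch-Kincaid Grade Level and Flesch Reading Ease, with a crude
# syllable estimator (count vowel groups, minimum one per word).
import re

def count_syllables(word):
    """Crude syllable estimate: number of vowel groups, minimum 1."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    words = re.findall(r"[A-Za-z']+", text)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    syllables = sum(count_syllables(w) for w in words)
    w, s = len(words), sentences
    fk_grade = 0.39 * (w / s) + 11.8 * (syllables / w) - 15.59
    reading_ease = 206.835 - 1.015 * (w / s) - 84.6 * (syllables / w)
    return fk_grade, reading_ease

simple = "The heart pumps blood. It can fail. See your doctor."
dense = ("Decompensated congestive cardiomyopathy necessitates "
         "individualized pharmacological optimization strategies.")
g1, e1 = readability(simple)
g2, e2 = readability(dense)
# The dense sentence scores a far higher grade level and lower ease.
```

The contrast between the two example sentences illustrates exactly the gap the study found between typical patient-education pages and the sixth-grade target.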

  13. Fibre failure assessment in carbon fibre reinforced polymers under fatigue loading by synchrotron X-ray computed tomography

    OpenAIRE

    Garcea, Serafina; Sinclair, Ian; Spearing, Simon

    2016-01-01

    In situ fatigue experiments using synchrotron X-ray computed tomography (SRCT) are used to assess the underpinning micromechanisms of fibre failure in double notch carbon/epoxy coupons. Observations showed fibre breaks along the 0º ply splits, associated with the presence and failure of bridging fibres, as well as fibres failed in the bulk composite within the 0º plies. A tendency for cluster formation, with multiple adjacent breaks in the bulk composite was observed when higher peak loads we...

  14. The South Carolina bridge-scour envelope curves

    Science.gov (United States)

    Benedict, Stephen T.; Feaster, Toby D.; Caldwell, Andral W.

    2016-09-30

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted a series of three field investigations to evaluate historical, riverine bridge scour in the Piedmont and Coastal Plain regions of South Carolina. These investigations included data collected at 231 riverine bridges, which led to the development of bridge-scour envelope curves for clear-water and live-bed components of scour. The application and limitations of the South Carolina bridge-scour envelope curves were documented in four reports, each report addressing selected components of bridge scour. The current investigation (2016) synthesizes the findings of these previous reports into a guidance manual providing an integrated procedure for applying the envelope curves. Additionally, the investigation provides limited verification for selected bridge-scour envelope curves by comparing them to field data collected outside of South Carolina from previously published sources. Although the bridge-scour envelope curves have limitations, they are useful supplementary tools for assessing the potential for scour at riverine bridges in South Carolina.

  15. Learning curves in health professions education.

    Science.gov (United States)

    Pusic, Martin V; Boutis, Kathy; Hatala, Rose; Cook, David A

    2015-08-01

    Learning curves, which graphically show the relationship between learning effort and achievement, are common in published education research but are not often used in day-to-day educational activities. The purpose of this article is to describe the generation and analysis of learning curves and their applicability to health professions education. The authors argue that the time is right for a closer look at using learning curves, given their desirable properties, to inform both self-directed instruction by individuals and education management by instructors. A typical learning curve is made up of a measure of learning (y-axis), a measure of effort (x-axis), and a mathematical linking function. At the individual level, learning curves make manifest a single person's progress towards competence including his/her rate of learning, the inflection point where learning becomes more effortful, and the remaining distance to mastery attainment. At the group level, overlaid learning curves show the full variation of a group of learners' paths through a given learning domain. Specifically, they make overt the difference between time-based and competency-based approaches to instruction. Additionally, instructors can use learning curve information to more accurately target educational resources to those who most require them. The learning curve approach requires a fine-grained collection of data that will not be possible in all educational settings; however, the increased use of an assessment paradigm that explicitly includes effort and its link to individual achievement could result in increased learner engagement and more effective instructional design.
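As one concrete "mathematical linking function", the power law of practice (time per task T(n) = T1 · n^(-alpha) over attempts n) can be fitted by log-log regression. The model choice and the data below are illustrative, not taken from the article:

```python
# Power-law-of-practice fit for a learning curve: effort on the x-axis
# (attempt number n), performance on the y-axis (time per task T).
# Log-log least squares recovers the learning rate alpha.
import math

def fit_power_law(attempts, times):
    """Fit T = T1 * n**(-alpha); returns (T1, alpha)."""
    lx = [math.log(n) for n in attempts]
    ly = [math.log(t) for t in times]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    slope = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
            sum((x - mx) ** 2 for x in lx)
    return math.exp(my - slope * mx), -slope

# Noiseless synthetic practice data: first attempt 120 s, alpha = 0.4.
attempts = [1, 2, 4, 8, 16, 32]
times = [120.0 * n ** -0.4 for n in attempts]
t1, alpha = fit_power_law(attempts, times)  # recovers ~ (120, 0.4)
```

A larger fitted alpha means faster learning per unit of practice; comparing alphas across learners is one way to target instructional resources as the article suggests.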

  16. Growth curves in Down syndrome with congenital heart disease

    Directory of Open Access Journals (Sweden)

    Caroline D’Azevedo Sica

    Full Text Available SUMMARY Introduction: To assess dietary habits, nutritional status and food frequency in children and adolescents with Down syndrome (DS) and congenital heart disease (CHD). Additionally, we attempted to compare body mass index (BMI) classifications according to the World Health Organization (WHO) curves and curves developed for individuals with DS. Method: Cross-sectional study including individuals with DS and CHD treated at a referral center for cardiology, aged 2 to 18 years. Weight, height, BMI, total energy and food frequency were measured. Nutritional status was assessed using BMI for age and gender, using curves developed for patients with DS and those set by the WHO. Results: 68 subjects with DS and CHD were evaluated. Atrioventricular septal defect (AVSD) was the most common heart disease (52.9%). There were differences in BMI classification between the curves proposed for patients with DS and those proposed by the WHO. There was an association between consumption of vitamin E and polyunsaturated fatty acids. Conclusion: Results showed that individuals with DS are mostly of normal weight for age when evaluated using DS-specific curves. Use of DS-specific curves is the recommended practice for health professionals, to avoid premature diagnosis of overweight and/or obesity in this population.

  17. Assessment of the causes of failures of roto-dynamic equipment in Cirus

    International Nuclear Information System (INIS)

    Rao, K.N.; Singh, S.; Ganeshan, P.

    1994-01-01

    As a part of Cirus reactor life extension program study, a service life evaluation of critical roto-dynamic equipment in Cirus such as primary coolant pumps, and their concrete foundation structures, pressurised water loop pumps, main air compressors and supply and exhaust fans, was performed. An assessment of the causes of failures of roto-dynamic equipment in Cirus was done. Based on assessment of the degradation mitigating features and comparison to similar roto-dynamic equipment and their concrete foundation structures, it was concluded that life extension of these roto-dynamic equipment and their structures is feasible. To support this conclusion a program involving: a) non-destructive testing, b) surveillance and monitoring and, c) preventive maintenance is recommended. (author). 4 refs

  18. The role of minimum supply and social vulnerability assessment for governing critical infrastructure failure: current gaps and future agenda

    Directory of Open Access Journals (Sweden)

    M. Garschagen

    2018-04-01

    Full Text Available Increased attention has lately been given to the resilience of critical infrastructure in the context of natural hazards and disasters. The major focus therein is on the sensitivity of critical infrastructure technologies and their management contingencies. However, strikingly little attention has been given to assessing and mitigating social vulnerabilities towards the failure of critical infrastructure and to the development, design and implementation of minimum supply standards in situations of major infrastructure failure. Addressing this gap and contributing to a more integrative perspective on critical infrastructure resilience is the objective of this paper. It asks which role social vulnerability assessments and minimum supply considerations can, should and do – or do not – play for the management and governance of critical infrastructure failure. In its first part, the paper provides a structured review on achievements and remaining gaps in the management of critical infrastructure and the understanding of social vulnerabilities towards disaster-related infrastructure failures. Special attention is given to the current state of minimum supply concepts with a regional focus on policies in Germany and the EU. In its second part, the paper then responds to the identified gaps by developing a heuristic model on the linkages of critical infrastructure management, social vulnerability and minimum supply. This framework helps to inform a vision of a future research agenda, which is presented in the paper's third part. Overall, the analysis suggests that the assessment of socially differentiated vulnerabilities towards critical infrastructure failure needs to be undertaken more stringently to inform the scientifically and politically difficult debate about minimum supply standards and the shared responsibilities for securing them.

  19. The role of minimum supply and social vulnerability assessment for governing critical infrastructure failure: current gaps and future agenda

    Science.gov (United States)

    Garschagen, Matthias; Sandholz, Simone

    2018-04-01

    Increased attention has lately been given to the resilience of critical infrastructure in the context of natural hazards and disasters. The major focus therein is on the sensitivity of critical infrastructure technologies and their management contingencies. However, strikingly little attention has been given to assessing and mitigating social vulnerabilities towards the failure of critical infrastructure and to the development, design and implementation of minimum supply standards in situations of major infrastructure failure. Addressing this gap and contributing to a more integrative perspective on critical infrastructure resilience is the objective of this paper. It asks which role social vulnerability assessments and minimum supply considerations can, should and do - or do not - play for the management and governance of critical infrastructure failure. In its first part, the paper provides a structured review on achievements and remaining gaps in the management of critical infrastructure and the understanding of social vulnerabilities towards disaster-related infrastructure failures. Special attention is given to the current state of minimum supply concepts with a regional focus on policies in Germany and the EU. In its second part, the paper then responds to the identified gaps by developing a heuristic model on the linkages of critical infrastructure management, social vulnerability and minimum supply. This framework helps to inform a vision of a future research agenda, which is presented in the paper's third part. Overall, the analysis suggests that the assessment of socially differentiated vulnerabilities towards critical infrastructure failure needs to be undertaken more stringently to inform the scientifically and politically difficult debate about minimum supply standards and the shared responsibilities for securing them.

  20. The constant failure rate model for fault tree evaluation as a tool for unit protection reliability assessment

    International Nuclear Information System (INIS)

    Vichev, S.; Bogdanov, D.

    2000-01-01

The purpose of this paper is to introduce the fault tree analysis method as a tool for unit protection reliability estimation. The constant failure rate model is applied for reliability assessment, and especially for availability assessment. For that purpose, an example of a unit primary-equipment structure and a fault tree for a simplified unit protection system are presented. (author)
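As a rough illustration of the constant failure rate model used in fault tree evaluation, the sketch below (with invented failure and repair rates, not values from the paper) computes a repairable component's steady-state availability and combines basic-event unavailabilities through AND/OR gates:

```python
# Sketch (hypothetical values): steady-state availability under the
# constant failure rate model, and unavailability of simple AND/OR
# gate combinations as used in fault tree evaluation.

def availability(failure_rate, repair_rate):
    # A = mu / (lambda + mu) for constant failure rate lambda, repair rate mu
    return repair_rate / (failure_rate + repair_rate)

def q_and(*qs):
    # AND gate: the gate fails only if all independent inputs fail
    q = 1.0
    for x in qs:
        q *= x
    return q

def q_or(*qs):
    # OR gate: the gate fails if any independent input fails
    p = 1.0
    for x in qs:
        p *= (1.0 - x)
    return 1.0 - p

# Example: a protection relay with lambda = 1e-5 /h, mu = 0.1 /h (assumed)
q_relay = 1.0 - availability(1e-5, 0.1)
q_redundant = q_and(q_relay, q_relay)   # two redundant relays in parallel
```

The numbers show the usual effect: redundancy drives the top-event unavailability down by roughly the square of the single-train unavailability, as long as the failures are truly independent.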

  1. Statistical re-evaluation of the ASME K{sub IC} and K{sub IR} fracture toughness reference curves

    Energy Technology Data Exchange (ETDEWEB)

    Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)

    1998-11-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the `Master curve`, has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the K{sub IC}-reference curve. Similarly, the 1% lower bound Master curve corresponds to the K{sub IR}-reference curve. (orig.)
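For context, the master curve concept discussed above is standardized in ASTM E1921; a minimal sketch of its commonly quoted functional form is below (the reference temperature T0 used in the example is assumed, not taken from the paper):

```python
import math

def master_curve_kjc(T, T0, p):
    """Fracture toughness K_Jc (MPa*sqrt(m)) at temperature T (deg C) and
    cumulative failure probability p, in the master curve form of ASTM
    E1921 (1T specimen size); T0 is the reference temperature."""
    return 20.0 + (math.log(1.0 / (1.0 - p))) ** 0.25 * \
        (11.0 + 77.0 * math.exp(0.019 * (T - T0)))

# The 5% and 1% lower-bound curves discussed in the abstract,
# evaluated at T = T0 (T0 = -50 C assumed for illustration):
k5 = master_curve_kjc(-50.0, -50.0, 0.05)
k1 = master_curve_kjc(-50.0, -50.0, 0.01)
```

At T = T0 the median (p = 0.5) toughness is about 100 MPa*sqrt(m) by construction, and lowering p shifts the whole curve downward, which is what makes the p = 0.05 and p = 0.01 curves candidates for the K_IC and K_IR roles described above.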

  2. Methodology for probability of failure assessment of offshore pipelines; Metodologia qualitativa de avaliacao da probabilidade de falha de dutos rigidos submarinos estaticos

    Energy Technology Data Exchange (ETDEWEB)

    Pezzi Filho, Mario [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2005-07-01

    In this study it is presented a methodology for assessing the likelihood of failure for every failure mechanism defined for carbon steel static offshore pipelines. This methodology is aimed to comply with the Integrity Management policy established by the Company. Decision trees are used for the development of the methodology and the evaluation of the extent and the significance of these failure mechanisms. Decision trees enable also the visualization of the logical structure of algorithms which eventually will be used in risk assessment software. The benefits of the proposed methodology are presented and it is recommended that it be tested on static offshore pipelines installed in different assets for validation. (author)

  3. Utility of CD4 cell counts for early prediction of virological failure during antiretroviral therapy in a resource-limited setting

    Directory of Open Access Journals (Sweden)

    Lawn Stephen D

    2008-07-01

    Full Text Available Abstract Background Viral load monitoring is not available for the vast majority of patients receiving antiretroviral therapy in resource-limited settings. However, the practical utility of CD4 cell count measurements as an alternative monitoring strategy has not been rigorously assessed. Methods In this study, we used a novel modelling approach that accounted for all CD4 cell count and VL values measured during follow-up from the first date that VL suppression was achieved. We determined the associations between CD4 counts (absolute values and changes during ART, VL measurements and risk of virological failure (VL > 1,000 copies/ml following initial VL suppression in 330 patients in South Africa. CD4 count changes were modelled both as the difference from baseline (ΔCD4 count and the difference between consecutive values (CD4 count slope using all 3-monthly CD4 count measurements during follow-up. Results During 7093.2 patient-months of observation 3756 paired CD4 count and VL measurements were made. In patients who developed virological failure (n = 179, VL correlated significantly with absolute CD4 counts (r = - 0.08, P = 0.003, ΔCD4 counts (r = - 0.11, P P P = 0.99, P = 0.92 and P = 0.75, respectively. Moreover, in a receiver operating characteristic (ROC curve, the association between a negative CD4 count slope and virological failure was poor (area under the curve = 0.59; sensitivity = 53.0%; specificity = 63.6%; positive predictive value = 10.9%. Conclusion CD4 count changes correlated significantly with VL at group level but had very limited utility in identifying virological failure in individual patients. CD4 count is an inadequate alternative to VL measurement for early detection of virological failure.
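The ROC analysis mentioned above can be reproduced without any statistics library via the rank-sum (Mann-Whitney) identity: the AUC equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. The sketch below uses invented CD4-slope data, not the study's:

```python
# Sketch: ROC AUC via the Mann-Whitney identity, scoring a marker
# (e.g. negated CD4 count slope) against a binary outcome
# (virological failure). All data below are hypothetical.

def roc_auc(scores_pos, scores_neg):
    # AUC = P(score_pos > score_neg) + 0.5 * P(tie)
    wins = ties = 0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1
            elif sp == sn:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# A more negative CD4 slope should predict failure, so score = -slope
failed     = [5.0, 3.0, 4.0, 1.0]   # -slope in patients with VL failure
suppressed = [2.0, 0.5, 1.5, 2.5]   # -slope in patients who stayed suppressed
auc = roc_auc(failed, suppressed)
```

An AUC near 0.59, as reported in the study, means the marker barely outperforms a coin flip at the individual-patient level, which is the abstract's central point.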

  4. Probability of failure prediction for step-stress fatigue under sine or random stress

    Science.gov (United States)

    Lambert, R. G.

    1979-01-01

    A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.
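The cumulative damage law referenced here generalizes the classical linear (Miner's rule) accumulation over an S-N curve. A minimal sketch of that baseline rule, with hypothetical Basquin-type S-N constants, is:

```python
# Sketch of linear cumulative fatigue damage (Miner's rule), the
# deterministic baseline that probabilistic laws of this kind extend.
# The S-N curve constants A and b below are assumed, not material data.

def cycles_to_failure(stress_amplitude, A=1e12, b=3.0):
    # Basquin-type S-N curve: N = A * S**(-b)
    return A * stress_amplitude ** (-b)

def miner_damage(blocks):
    # blocks: list of (stress_amplitude, applied_cycles);
    # failure is predicted when accumulated damage reaches 1.0
    return sum(n / cycles_to_failure(s) for s, n in blocks)

# Two-step stress history: 5e5 cycles at S=100, then 5e4 cycles at S=200
d = miner_damage([(100.0, 5e5), (200.0, 5e4)])
```

With these assumed constants the history accumulates a damage of 0.9, i.e. the structure is predicted to be near, but not yet at, failure; the probabilistic extension in the abstract replaces the single S-N curve with a scatterband of failure points.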

  5. Failure Mechanisms of Brittle Rocks under Uniaxial Compression

    Science.gov (United States)

    Liu, Taoying; Cao, Ping

    2017-09-01

The behaviour of a rock mass is determined not only by the properties of the rock matrix, but mostly by the presence and properties of discontinuities or fractures within the mass. Compression tests on rock-like specimens with two prefabricated transfixion fissures, made by pulling out embedded metal inserts during the pre-curing period, were carried out on a servo-controlled uniaxial loading tester. The influence of the geometry of pre-existing cracks on the cracking processes was analysed with reference to the experimental observation of crack initiation and propagation from pre-existing flaws. Based on rock fracture mechanics and the stress-strain curves, the failure evolution mechanism of the fissured body was also analysed by exploring the laws of compression-shear crack initiation, wing crack growth and rock bridge connection. Meanwhile, damage fracture mechanical models of a compression-shear rock mass are established for the cases in which rock bridge axial transfixion failure, combined tension-shear failure, or wing crack shear connection failure occurs in a specimen under axial compression. This research is of significance for studying the failure mechanism of fractured rock masses.

  6. Considerations for reference pump curves

    International Nuclear Information System (INIS)

    Stockton, N.B.

    1992-01-01

This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturer's pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least-squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least-squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code-required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point.
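One of the techniques mentioned, a polynomial least-squares fit over a limited operating range, can be sketched as follows (all pump data are invented for illustration, not from the paper):

```python
# Sketch: building a reference pump curve by polynomial least squares,
# then judging a later inservice test point against it. The flow /
# differential-pressure data below are hypothetical.
import numpy as np

flow = np.array([100.0, 150.0, 200.0, 250.0, 300.0])  # gpm (reference test)
dp   = np.array([95.0, 90.0, 82.0, 70.0, 55.0])       # psid

coeffs = np.polyfit(flow, dp, 2)   # quadratic fit over the tested range
ref    = np.poly1d(coeffs)         # callable reference curve

# A later IST measurement at a flow rate that was never tested directly:
measured_flow, measured_dp = 225.0, 74.0
deviation = measured_dp - ref(measured_flow)  # compare to acceptance band
```

The point of the paper's caution applies directly here: the fit is only trustworthy inside the range of the reference data, and the residual error of the fit itself must be folded into the acceptance criteria.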

  7. Impact of support system failure limitations on probabilistic safety assessment and in regulatory decision making

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1990-01-01

    When used as a tool for safety decision making, Probabilistic Safety Assessment (PSA) is as effective as it realistically characterizes the overall frequency and consequences of various types of system and component failures. If significant support system failure events are omitted from consideration, the PSA process omits the characterization of possible unique contributors to core damage risk, possibly underestimates the frequency of core damage, and reduces the future utility of the PSA as a decision making tool for the omitted support system. This paper is based on a review of several recent US PSA studies and the author's participation in several International Atomic Energy Agency (IAEA) sponsored peer reviews. 21 refs., 2 figs., 1 tab

  8. Use of the cumulative sum method (CUSUM) to assess the learning curves of ultrasound-guided continuous femoral nerve block.

    Science.gov (United States)

    Kollmann-Camaiora, A; Brogly, N; Alsina, E; Gilsanz, F

    2017-10-01

Although ultrasound is a basic competence for anaesthesia residents (AR), there are few data available on the learning process. This prospective observational study aims to assess the learning process of ultrasound-guided continuous femoral nerve block and to determine the number of procedures that a resident would need to perform in order to reach proficiency, using the cumulative sum (CUSUM) method. We recruited 19 AR without previous experience. Learning curves were constructed using the CUSUM method for ultrasound-guided continuous femoral nerve block considering 2 success criteria: a decrease in pain score > 2 on a [0-10] scale after 15 minutes, and the time required to perform the block. We analysed data from 17 AR for a total of 237 ultrasound-guided continuous femoral nerve blocks. 8/17 AR became proficient for pain relief; however, all the AR who did more than 12 blocks (8/8) became proficient. As for time of performance, 5/17 AR achieved the objective of 12 minutes; however, all the AR who did more than 20 blocks (4/4) achieved it. The number of procedures needed to achieve proficiency seems to be 12; however, it takes more procedures to reduce performance time. The CUSUM methodology could be useful in training programs to allow early interventions in case of repeated failures and to develop a competence-based curriculum. Copyright © 2017 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.
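The CUSUM construction used in studies like this one is simple to state: each success decrements the running score by the acceptable failure rate s, each failure increments it by 1 - s, and crossing a lower decision boundary signals proficiency. The parameters below are illustrative, not those of the study:

```python
# Sketch of a CUSUM learning curve. s = acceptable failure rate and the
# decision threshold are assumed values, not the study's parameters.

def cusum_curve(outcomes, acceptable_failure_rate=0.2):
    score, curve = 0.0, []
    for success in outcomes:
        if success:
            score -= acceptable_failure_rate        # success: score falls
        else:
            score += 1.0 - acceptable_failure_rate  # failure: score rises
        curve.append(score)
    return curve

# Hypothetical trainee: two early failures, then a run of successes
trace = cusum_curve([False, False, True, True, True, True, True, True])
proficient = trace[-1] < -0.5   # example lower decision boundary
```

Because early failures push the score up by 1 - s while each success only pulls it down by s, a trainee needs a sustained run of successes to cross the proficiency boundary, which is exactly why the abstract reports a minimum number of procedures rather than a raw success rate.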

  9. IDF-curves for precipitation In Belgium

    International Nuclear Information System (INIS)

    Mohymont, Bernard; Demarde, Gaston R.

    2004-01-01

The Intensity-Duration-Frequency (IDF) curves for precipitation constitute a relationship between the intensity, the duration and the frequency of rainfall amounts. The intensity of precipitation is expressed in mm/h, the duration or aggregation time is the length of the interval considered, while the frequency stands for the probability of occurrence of the event. IDF curves constitute a classical and useful tool that is primarily used to dimension hydraulic structures such as sewer systems, and which is consequently used to assess the risk of inundation. In this presentation, the IDF relation for precipitation is studied for different locations in Belgium. These locations correspond to two long-term, high-quality precipitation networks of the RMIB: (a) the daily precipitation depths of the climatological network (more than 200 stations, 1951-2001 baseline period); (b) the high-frequency 10-minute precipitation depths of the hydrometeorological network (more than 30 stations, 15 to 33 years baseline period). For the station of Uccle, an uninterrupted time series of more than one hundred years of 10-minute rainfall data is available. The proposed technique for assessing the curves is based on maximum annual values of precipitation. A new analytical formula for the IDF curves was developed such that these curves stay valid for aggregation times ranging from 10 minutes to 30 days (when fitted with appropriate data). Moreover, all parameters of this formula have physical dimensions. Finally, adequate spatial interpolation techniques are used to provide nationwide extreme-value precipitation depths for short- to long-term durations with a given return period. These values are estimated on the grid points of the Belgian ALADIN domain used in the operational weather forecasts at the RMIB. (author)
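A common analytical shape for an IDF relation (not the RMIB's own formula, whose coefficients are not given here) is i = a / (t + c)^b, where all parameters carry physical dimensions, as the abstract emphasizes. A sketch with invented coefficients:

```python
# Sketch of a generalized IDF relation. The coefficients a, b, c below
# are invented for illustration and do not reproduce the RMIB curves.

def idf_intensity(t_minutes, a=600.0, b=0.75, c=6.0):
    # rainfall intensity (mm/h) for aggregation time t (min)
    return a / (t_minutes + c) ** b

def idf_depth(t_minutes, **kw):
    # rainfall depth (mm) = intensity (mm/h) * duration (h)
    return idf_intensity(t_minutes, **kw) * t_minutes / 60.0

i10 = idf_intensity(10.0)   # short, intense event
i60 = idf_intensity(60.0)   # intensity falls as duration grows
```

The two evaluations illustrate the defining property of IDF curves: intensity decreases monotonically with duration while the accumulated depth still increases, and each return period gets its own (a, b, c) triple.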

  10. Residual stress measurement by X-ray diffraction with the Gaussian curve method and its automation

    International Nuclear Information System (INIS)

    Kurita, M.

    1987-01-01

An X-ray technique using the Gaussian curve method, and its automation, are described for rapid and nondestructive measurement of residual stress. A simplified equation for measuring the stress by the Gaussian curve method is derived, because in its previous form the method required laborious calculation. The residual stress can be measured in a few minutes, depending on the material, using an automated X-ray stress analyzer with a microcomputer which was developed in the laboratory. The residual stress distribution of a partially induction-hardened and tempered (at 280 °C) steel bar was measured with the Gaussian curve method. A sharp residual tensile stress peak of 182 MPa appeared just outside the hardened region, at which fatigue failure is liable to occur.
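One reason a Gaussian fit of a diffraction peak is computationally light: near its maximum a Gaussian's logarithm is a parabola, so the peak position follows from a linear least-squares fit. The sketch below uses synthetic data, not measured diffraction counts:

```python
# Sketch of the idea behind fitting a diffraction peak with a Gaussian:
# ln(intensity) is quadratic in the angle, so an ordinary polynomial
# least-squares fit recovers the peak centre. Data are synthetic.
import numpy as np

two_theta = np.linspace(155.0, 157.0, 11)   # diffraction angles (deg)
true_peak = 156.2
counts = 1000.0 * np.exp(-((two_theta - true_peak) ** 2) / (2 * 0.3 ** 2))

# ln I = a*x^2 + b*x + c; the parabola's vertex is the Gaussian centre
a, b, c = np.polyfit(two_theta, np.log(counts), 2)
peak = -b / (2.0 * a)
```

The shift of `peak` between stressed and unstressed states, converted through the diffraction geometry, is what yields the residual stress; the fit itself is just one `polyfit` call, which is why the method automates well on a microcomputer.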

  11. Disadvantages of using the area under the receiver operating characteristic curve to assess imaging tests: A discussion and proposal for an alternative approach

    International Nuclear Information System (INIS)

    Halligan, Steve; Altman, Douglas G.; Mallett, Susan

    2015-01-01

    The objectives are to describe the disadvantages of the area under the receiver operating characteristic curve (ROC AUC) to measure diagnostic test performance and to propose an alternative based on net benefit. We use a narrative review supplemented by data from a study of computer-assisted detection for CT colonography. We identified problems with ROC AUC. Confidence scoring by readers was highly non-normal, and score distribution was bimodal. Consequently, ROC curves were highly extrapolated with AUC mostly dependent on areas without patient data. AUC depended on the method used for curve fitting. ROC AUC does not account for prevalence or different misclassification costs arising from false-negative and false-positive diagnoses. Change in ROC AUC has little direct clinical meaning for clinicians. An alternative analysis based on net benefit is proposed, based on the change in sensitivity and specificity at clinically relevant thresholds. Net benefit incorporates estimates of prevalence and misclassification costs, and it is clinically interpretable since it reflects changes in correct and incorrect diagnoses when a new diagnostic test is introduced. ROC AUC is most useful in the early stages of test assessment whereas methods based on net benefit are more useful to assess radiological tests where the clinical context is known. Net benefit is more useful for assessing clinical impact. (orig.)

  12. Retrospective analysis of the learning curve associated with laparoscopic ovariectomy in dogs and associated perioperative complication rates.

    Science.gov (United States)

    Pope, Juliet Frances Anne; Knowles, Toby Grahame

    2014-08-01

    To assess the learning curve associated with laparoscopic ovariectomy (LOE) in 618 dogs and to report perioperative complication rates. Case series. Dogs (n = 618). Data retrieved from the medical records of bitches admitted for LOE over 42 months included date of surgery, breed, weight (kg), age (months), surgeon, suture material used, intraoperative complications and postoperative complications. Each LOE was defined as "successful" or "unsuccessful" by the absence or presence of an intraoperative complication and "failure" rate described using a CUSUM technique. Follow-up time ranged from 152 to 1,435 days (median, 737 days). Intraoperative complications occurred in 10 dogs (1.6%) and included: splenic laceration (6 dogs; 1%), urinary bladder perforation (3 dogs; 0.5%), and subcutaneous emphysema (1 dog; 0.2%). Postoperative complications occurred in 99 dogs (16%) and included: incisional inflammation treated with antibiotics (87 dogs [14%]; 96/1,854 incisions; 5.1%), incisional seroma (5 dogs [0.8%]; 5/1,854 incisions, 0.3%), incisional hernia (4 dogs [0.6%]; 4/1,854 incisions, 0.2%), and ovarian remnant syndrome (3 dogs; 0.5%). CUSUM charts indicated an initial "learning curve" of ∼80 LOE. LOE is a technique with an initial learning curve but once surgical proficiency is reached after ∼80 procedures then intraoperative complication rates associated with the procedure can be low. © Copyright 2014 by The American College of Veterinary Surgeons.

  13. The costs of failure: A preliminary assessment of major energy accidents, 1907-2007

    International Nuclear Information System (INIS)

    Sovacool, Benjamin K.

    2008-01-01

    A combination of technical complexity, tight coupling, speed, and human fallibility contribute to the unexpected failure of large-scale energy technologies. This study offers a preliminary assessment of the social and economic costs of major energy accidents from 1907 to 2007. It documents 279 incidents that have been responsible for $41 billion in property damage and 182,156 deaths. Such disasters highlight an often-ignored negative externality to energy production and use, and emphasize the need for further research

  14. Usage of the Failure Mode & Effect Analysis (FMEA) method for safety assessment in a drug manufacturer

    Directory of Open Access Journals (Sweden)

    Y Nazari

    2006-04-01

Full Text Available Background and Aims: This study was conducted with the purpose of recognizing and controlling workplace hazards in the production units of a drug manufacturer. Method: For hazard recognition and assessment, the FMEA method was used. FMEA systematically investigates the effects of equipment and system failures, often leading to equipment design improvements. First, the level of the study was defined as the system. Then, according to observations, accident statistics, and interviews with managers, supervisors, and workers, the high-risk systems were determined. The boundaries of each system were established, and information regarding the relevant components, their functions and their interactions was gathered. To prevent confusion between similar pieces of equipment, a unique system identifier was developed. After that, all failure modes and their causes for each piece of equipment or system were listed, and the immediate effects of each failure mode, as well as its interactive effects on other equipment or systems, were described. The risk priority number was determined according to global and local criteria. Results: Actions and solutions were proposed to reduce the likelihood and severity of failures and to raise their detectability. Conclusion: This study illustrated that although at first sight the drug manufacturer may seem safe, there are still many hazardous conditions that could cause serious accidents. The results suggest it is necessary to: (1) develop a comprehensive manual for the periodic and regular inspection of workplace instruments in order to recognize unknown failures and their causes; (2) develop a comprehensive program for system maintenance and repair; and (3) conduct worker training.

  15. Microvascular Anastomosis: Proposition of a Learning Curve.

    Science.gov (United States)

    Mokhtari, Pooneh; Tayebi Meybodi, Ali; Benet, Arnau; Lawton, Michael T

    2018-04-14

Learning to perform a microvascular anastomosis is one of the most difficult tasks in cerebrovascular surgery. Previous studies offer little regarding optimal protocols to maximize learning efficiency, a failure that stems mainly from lack of knowledge about the learning curve of this task. Our aim was to delineate this learning curve and provide information about its various features, including acquisition, improvement, consistency, stability, and recall. Five neurosurgeons with an average of 5 yr of surgical experience and no experience in bypass surgery performed microscopic anastomoses on progressively smaller-caliber silastic tubes (Biomet, Palm Beach Gardens, Florida) during 24 consecutive sessions. After 1-, 2-, and 8-wk retention intervals, they performed a recall test on 0.7-mm silastic tubes. The anastomoses were rated based on patency and the presence of any leaks. The improvement rate was faster during initial sessions compared to the final practice sessions. A performance decline was observed in the first session of working on a smaller-caliber tube; however, this rapidly improved during the following sessions of practice. Temporary plateaus were seen in certain segments of the curve. The retention interval between the acquisition and recall phases did not cause a regression to the prepractice performance level. Learning the fine motor task of microvascular anastomosis conforms to the basic rules of learning, such as the "power law of practice." Our results also support the improvement of performance during consecutive sessions of practice. The objective evidence provided may help in developing optimized learning protocols for microvascular anastomosis.

  16. Enhanced Schapery Theory Software Development for Modeling Failure of Fiber-Reinforced Laminates

    Science.gov (United States)

    Pineda, Evan J.; Waas, Anthony M.

    2013-01-01

Progressive damage and failure analysis (PDFA) tools are needed to predict the nonlinear response of advanced fiber-reinforced composite structures. Predictive tools should incorporate the underlying physics of the damage and failure mechanisms observed in the composite, and should utilize as few input parameters as possible. The purpose of the Enhanced Schapery Theory (EST) was to create a PDFA tool that operates in conjunction with a commercially available finite element (FE) code (Abaqus). The tool captures the physics of the damage and failure mechanisms that result in the nonlinear behavior of the material, and the failure methodology employed yields numerical results that are relatively insensitive to changes in the FE mesh. The EST code is written in Fortran and compiled into a static library that is linked to Abaqus. A Fortran Abaqus UMAT material subroutine is used to facilitate the communication between Abaqus and EST. A clear distinction between damage and failure is imposed. Damage mechanisms result in pre-peak nonlinearity in the stress-strain curve. Four internal state variables (ISVs) are utilized to control the damage and failure degradation. All damage is said to result from matrix microdamage, and a single ISV marks the microdamage evolution as it is used to degrade the transverse and shear moduli of the lamina using a set of experimentally obtainable matrix microdamage functions. Three separate failure ISVs are used to incorporate failure due to fiber breakage, mode I matrix cracking, and mode II matrix cracking. Failure initiation is determined using a failure criterion, and the evolution of these ISVs is controlled by a set of traction-separation laws. The traction-separation laws are postulated such that the area under the curves is equal to the fracture toughness of the material associated with the corresponding failure mechanism. A characteristic finite element length is used to transform the traction-separation laws into stress-strain laws.
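The key postulate, that the area under each traction-separation law equals the associated fracture toughness, can be checked numerically for the common bilinear (triangular) law. The sketch below is a generic cohesive-law illustration with assumed values, not the EST implementation:

```python
# Sketch: a bilinear traction-separation law whose enclosed area equals
# the fracture toughness G_c. sigma_max, G_c and the initiation
# separation are assumed values, not EST material data.

def bilinear_traction(delta, sigma_max, G_c):
    delta_f = 2.0 * G_c / sigma_max   # final separation: triangle area = G_c
    delta_0 = 0.01 * delta_f          # assumed initiation separation
    if delta <= delta_0:
        return sigma_max * delta / delta_0                       # rise
    if delta >= delta_f:
        return 0.0                                               # failed
    return sigma_max * (delta_f - delta) / (delta_f - delta_0)   # softening

# Verify that the dissipated energy recovers G_c (left Riemann sum)
sigma_max, G_c = 60.0, 0.5   # MPa, N/mm (hypothetical mode I values)
n = 20000
delta_f = 2.0 * G_c / sigma_max
area = sum(bilinear_traction(i * delta_f / n, sigma_max, G_c) * delta_f / n
           for i in range(n))
```

Scaling such a law by a characteristic element length, as the abstract describes, is what keeps the dissipated energy per crack independent of mesh refinement.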

  17. A new method for explicit modelling of single failure event within different common cause failure groups

    International Nuclear Information System (INIS)

    Kančev, Duško; Čepin, Marko

    2012-01-01

Redundancy and diversity are the main principles of the safety systems in the nuclear industry. Implementation of safety component redundancy has been acknowledged as an effective approach for assuring high levels of system reliability. The existence of redundant components, identical in most cases, implies a probability of their simultaneous failure due to a shared cause: a common cause failure. This paper presents a new method for explicit modelling of a single component failure event within multiple common cause failure groups simultaneously. The method is based on a modification of the frequently utilised Beta Factor parametric model. The motivation for developing this method lies in the fact that one of the most widespread software tools for fault tree and event tree modelling as part of probabilistic safety assessment does not offer the option of simultaneously assigning a single failure event to multiple common cause failure groups. In that sense, the proposed method can be seen as extending the explicit modelling of common cause failures. A standard standby safety system is selected as a case study for the application and study of the proposed methodology. The results and insights indicate improved, more transparent and more comprehensive models within probabilistic safety assessment.
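The Beta Factor splitting that the paper modifies can be sketched in a few lines: a component's total failure rate is divided into an independent part and a common cause part shared by the redundant group. The rates and mission time below are illustrative, not from the paper:

```python
# Sketch of the classical Beta Factor model (the baseline the paper
# modifies). All numerical values are hypothetical.
import math

def beta_split(lambda_total, beta):
    lam_ind = (1.0 - beta) * lambda_total   # independent failure rate
    lam_ccf = beta * lambda_total           # common cause failure rate
    return lam_ind, lam_ccf

def failure_prob_1oo2(lambda_total, beta, t):
    # probability that both trains of a redundant pair are failed at
    # time t: both fail independently, or the shared CCF event occurs
    lam_ind, lam_ccf = beta_split(lambda_total, beta)
    p_ind = (1.0 - math.exp(-lam_ind * t)) ** 2
    p_ccf = 1.0 - math.exp(-lam_ccf * t)
    return p_ind + p_ccf - p_ind * p_ccf

p = failure_prob_1oo2(1e-4, 0.1, 720.0)  # e.g. a monthly test interval
```

Even a modest beta dominates the pair's failure probability, because the CCF term is first-order in lambda*t while the independent term is second-order; this is why assigning one event to several CCF groups correctly, the paper's topic, matters for the overall result.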

  18. A method to assign failure rates for piping reliability assessments

    International Nuclear Information System (INIS)

    Gamble, R.M.; Tagart, S.W. Jr.

    1991-01-01

This paper reports on a simplified method that has been developed to assign failure rates for use in piping reliability and risk studies. The method can be applied on a line-by-line basis by identifying line- and location-specific attributes that can lead to piping unreliability through in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability was also performed. The data from this survey provide a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies.

  19. Assessment of performance measures and learning curves for use of a virtual-reality ultrasound simulator in transvaginal ultrasound examination

    DEFF Research Database (Denmark)

    Madsen, M E; Konge, L; Nørgaard, L N

    2014-01-01

    OBJECTIVE: To assess the validity and reliability of performance measures, develop credible performance standards and explore learning curves for a virtual-reality simulator designed for transvaginal gynecological ultrasound examination. METHODS: A group of 16 ultrasound novices, along with a group......-6), corresponding to an average of 219 min (range, 150-251 min) of training. The test/retest reliability was high, with an intraclass correlation coefficient of 0.93. CONCLUSIONS: Competence in the performance of gynecological ultrasound examination can be assessed in a valid and reliable way using virtual-reality...

  20. Assessment of two theoretical methods to estimate potentiometric titration curves of peptides: comparison with experiment.

    Science.gov (United States)

Makowska, Joanna; Bagińska, Katarzyna; Makowski, Mariusz; Jagielska, Anna; Liwo, Adam; Kasprzykowski, Franciszek; Chmurzyński, Lech; Scheraga, Harold A

    2006-03-09

We compared the ability of two theoretical methods of pH-dependent conformational calculations to reproduce experimental potentiometric titration curves of two models of peptides: Ac-K5-NHMe in 95% methanol (MeOH)/5% water mixture and Ac-XX(A)7OO-NH2 (XAO) (where X is diaminobutyric acid, A is alanine, and O is ornithine) in water, methanol (MeOH), and dimethyl sulfoxide (DMSO), respectively. The titration curve of the former was taken from the literature, and the curve of the latter was determined in this work. The first theoretical method involves a conformational search using the electrostatically driven Monte Carlo (EDMC) method with a low-cost energy function (ECEPP/3 plus the SRFOPT surface-solvation model, assuming that all titratable groups are uncharged) and subsequent reevaluation of the free energy at a given pH with the Poisson-Boltzmann equation, considering variable protonation states. In the second procedure, molecular dynamics (MD) simulations are run with the AMBER force field and the generalized Born model of electrostatic solvation, and the protonation states are sampled during constant-pH MD runs. In all three solvents, the first pKa of XAO is strongly downshifted compared to the value for the reference compounds (ethylamine and propylamine, respectively); the water and methanol curves have one, and the DMSO curve has two jumps characteristic of remarkable differences in the dissociation constants of acidic groups. The predicted titration curves of Ac-K5-NHMe are in good agreement with the experimental ones; better agreement is achieved with the MD-based method. The titration curves of XAO in methanol and DMSO, calculated using the MD-based approach, trace the shape of the experimental curves, reproducing the pH jump, while those calculated with the EDMC-based approach and the titration curve in water calculated using the MD-based approach have smooth shapes characteristic of the titration of weak multifunctional acids with small differences

  1. Analysis of non simultaneous common mode failures. Application to the reliability assessment of the decay heat removal of the RNR 1500 project

    International Nuclear Information System (INIS)

    Natta, M.; Bloch, M.

    1991-01-01

The experience with the LMFBR PHENIX has shown many cases of failures of identical and redundant components which were close in time but not simultaneous, and due to the same causes such as a design error, an inappropriate material, corrosion, ... Since the decay heat removal (DHR) must be assured for a long period after shutdown of the reactor, the overall reliability of the DHR system depends strongly on this type of successive failure by common mode causes, for which the usual β factor methods are not appropriate, since they imply that the several failures are simultaneous. In this communication, two methods are presented. The first one was used to assess the reliability of the DHR system of the RNR 1500 project. In this method, one models the occurrence of successive failures of n identical trains by a sudden jump of the failure rate from the value λ attributed to the first failure to the value λ' attributed to the (n-1) still available trains. This method leads to a quite natural quantification of the benefit of diversity for highly redundant systems. For the RNR 1500 project, where, in case of the loss of the normal DHR path through the steam generators, the decay heat is removed by four separate sodium loops of 26 MW unit capacity in forced convection, the probabilistic assessment shows that it is necessary to diversify the sodium-sodium heat exchanger in order to fulfil the upper limit of 10⁻⁷/year for the probability of failure of DHR. A separate assessment for the main sequence leading to DHR loss was performed using a different method, in which the successive failures are interpreted as a premature end of life, the lifetimes being used directly as random variables. This Monte Carlo type method, which can be applied to any type of lifetime distribution, leads to results consistent with those obtained with the first one
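The failure-rate-jump model described above can be sketched as a small Monte Carlo simulation. This is an illustrative reconstruction, not the authors' code: the function name and all parameter values (rates, mission time, number of trains) are assumptions.

```python
import random

def mission_unreliability(n_trains, lam, lam_prime, mission_time,
                          trials=100_000, seed=1):
    """Estimate the probability that all n redundant trains fail within the
    mission time, with the failure rate of the surviving trains jumping from
    lam to lam_prime after the first failure (successive common-cause model)."""
    random.seed(seed)
    failures = 0
    for _ in range(trials):
        # time to first failure among n trains, each at rate lam
        t = random.expovariate(n_trains * lam)
        if t >= mission_time:
            continue
        # remaining trains now fail at the elevated rate lam_prime
        survivors = n_trains - 1
        while survivors > 0:
            t += random.expovariate(survivors * lam_prime)
            if t >= mission_time:
                break
            survivors -= 1
        if survivors == 0:
            failures += 1
    return failures / trials
```

With a large jump (lam_prime >> lam) the estimated loss probability is orders of magnitude above the independent-failures case, which is the effect that motivates diversification of the heat exchangers.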

  2. Failure analysis of high strength pipeline with single and multiple corrosions

    International Nuclear Information System (INIS)

    Chen, Yanfei; Zhang, Hong; Zhang, Juan; Li, Xin; Zhou, Jing

    2015-01-01

Highlights: • We study failure of high strength pipelines with single corrosion. • We give regression equations for failure pressure prediction. • We propose assessment procedure for pipelines with multiple corrosions. - Abstract: Corrosion compromises the safe operation of oil and gas pipelines, so accurate determination of failure pressure is important for residual strength assessment and corrosion allowance design of onshore and offshore pipelines. This paper investigates the failure pressure of high strength pipelines with single and multiple corrosion defects using nonlinear finite element analysis. On the basis of the developed regression equations for failure pressure prediction of a high strength pipeline with a single corrosion defect, the paper proposes an assessment procedure for predicting the failure pressure of high strength pipelines with multiple corrosion defects. Furthermore, failure pressures predicted by the proposed solutions are compared with experimental results and various assessment methods available in the literature, demonstrating their accuracy and versatility
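The paper's own regression equations are not reproduced in this record. As a stand-in, the widely used Modified B31G criterion shows the general form such single-defect failure-pressure solutions take; the example pipe geometry in the test is an assumption for illustration only.

```python
import math

def modified_b31g_failure_pressure(D, t, L, d, smys):
    """Failure pressure (MPa) of a pipe with a single corrosion defect per the
    Modified B31G (0.85 dL) criterion.
    D: outside diameter, t: wall thickness, L: defect length, d: defect depth
    (all mm); smys: specified minimum yield strength (MPa)."""
    z = L * L / (D * t)
    if z <= 50.0:
        M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z * z)  # Folias bulging factor
    else:
        M = 0.032 * z + 3.3
    s_flow = smys + 68.95          # flow stress = SMYS + 68.95 MPa (10 ksi)
    ratio = d / t
    return (2.0 * s_flow * t / D) * (1.0 - 0.85 * ratio) / (1.0 - 0.85 * ratio / M)
```

An intact pipe (d = 0) recovers the plain-pipe pressure 2·s_flow·t/D, and the predicted failure pressure decreases monotonically as the defect deepens.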

  3. Factors Predicting Treatment Failure in Patients Treated with Iodine-131 for Graves’ Disease

    International Nuclear Information System (INIS)

    Manohar, Kuruva; Mittal, Bhagwant Rai; Bhoil, Amit; Bhattacharya, Anish; Dutta, Pinaki; Bhansali, Anil

    2013-01-01

Treatment of Graves' disease with iodine-131 (¹³¹I) is well-established; however, not all patients respond to a single dose of ¹³¹I, and some require higher and repeated doses. This study was carried out to identify the factors that can predict failure of a single dose of ¹³¹I treatment in these patients. Data of 150 patients with Graves' disease treated with 259-370 MBq of ¹³¹I and followed up for at least 1 year were retrospectively analyzed. Logistic regression analysis was used to identify factors that can predict treatment failure, such as age, sex, duration of disease, grade of goiter, duration of treatment with anti-thyroid drugs, mean dosage of anti-thyroid drugs used, ⁹⁹ᵐTc-pertechnetate (⁹⁹ᵐTcO₄⁻) uptake at 20 min, dose of ¹³¹I administered, and total triiodothyronine and thyroxine levels. Of the 150 patients, 25 required retreatment within 1 year of initial treatment with ¹³¹I. Logistic regression analysis revealed that male sex and ⁹⁹ᵐTcO₄⁻ uptake were associated with treatment failure. On receiver operating characteristic (ROC) curve analysis, the area under the curve (AUC) was significant for ⁹⁹ᵐTcO₄⁻ uptake predicting treatment failure (AUC = 0.623; P = 0.039). The optimum cutoff for ⁹⁹ᵐTcO₄⁻ uptake was 17.75%, with a sensitivity of 68% and specificity of 66% to predict treatment failure. Patients with >17.75% ⁹⁹ᵐTcO₄⁻ uptake had an odds ratio of 3.14 (P = 0.014) for treatment failure, and male patients had an odds ratio of 1.783 for treatment failure. Our results suggest that male patients and patients with high pre-treatment ⁹⁹ᵐTcO₄⁻ uptake are more likely to require repeated doses of ¹³¹I to achieve complete remission
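The ROC-based cutoff selection used in studies like this one can be sketched in a few lines of pure Python: AUC via the Mann-Whitney construction, and the optimum cutoff via the Youden index (sensitivity + specificity − 1). The function name and the toy data are hypothetical.

```python
def roc_auc_and_cutoff(scores, labels):
    """Empirical ROC analysis.
    scores: predictor values (higher = more likely failure); labels: 1 = failure.
    Returns (AUC, cutoff maximizing the Youden index)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    # AUC = P(score_pos > score_neg), ties counting one half
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    best_cut, best_j = None, -1.0
    for c in sorted(set(scores)):
        sens = sum(p >= c for p in pos) / len(pos)
        spec = sum(n < c for n in neg) / len(neg)
        if sens + spec - 1.0 > best_j:
            best_j, best_cut = sens + spec - 1.0, c
    return auc, best_cut
```

On perfectly separated data the AUC is 1.0 and the cutoff lands at the lowest positive score; real predictors such as the uptake measurement above give intermediate AUCs like the reported 0.623.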

  4. Canagliflozin and Heart Failure in Type 2 Diabetes Mellitus: Results From the CANVAS Program (Canagliflozin Cardiovascular Assessment Study).

    Science.gov (United States)

    Rådholm, Karin; Figtree, Gemma; Perkovic, Vlado; Solomon, Scott D; Mahaffey, Kenneth W; de Zeeuw, Dick; Fulcher, Greg; Barrett, Terrance D; Shaw, Wayne; Desai, Mehul; Matthews, David R; Neal, Bruce

    2018-03-11

BACKGROUND: Canagliflozin is a sodium glucose cotransporter 2 inhibitor that reduces the risk of cardiovascular events. We report the effects on heart failure and cardiovascular death overall, in those with and without a baseline history of heart failure, and in other participant subgroups. METHODS: The CANVAS Program (Canagliflozin Cardiovascular Assessment Study) enrolled 10 142 participants with type 2 diabetes mellitus and high cardiovascular risk. Participants were randomly assigned to canagliflozin or placebo and followed for a mean of 188 weeks. The primary end point for these analyses was adjudicated cardiovascular death or hospitalized heart failure. RESULTS: Participants with a history of heart failure at baseline (14.4%) were more frequently women, white, and hypertensive and had a history of prior cardiovascular disease (all P values significant). Cardiovascular death or hospitalized heart failure was reduced in those treated with canagliflozin compared with placebo (16.3 versus 20.8 per 1000 patient-years; hazard ratio [HR], 0.78; 95% confidence interval [CI], 0.67-0.91), as was fatal or hospitalized heart failure (HR, 0.70; 95% CI, 0.55-0.89) and hospitalized heart failure alone (HR, 0.67; 95% CI, 0.52-0.87). The benefit on cardiovascular death or hospitalized heart failure may be greater in patients with a prior history of heart failure (HR, 0.61; 95% CI, 0.46-0.80) compared with those without heart failure at baseline (HR, 0.87; 95% CI, 0.72-1.06; P interaction = 0.021). The effects of canagliflozin compared with placebo on other cardiovascular outcomes and key safety outcomes were similar in participants with and without heart failure at baseline (all interaction P values > 0.130), except for a possibly reduced absolute rate of events attributable to osmotic diuresis among those with a prior history of heart failure (P = 0.03). CONCLUSIONS: In patients with type 2 diabetes mellitus and an elevated risk of cardiovascular disease, canagliflozin reduced the risk of cardiovascular death or hospitalized heart failure

  5. Utility of the Instability Severity Index Score in Predicting Failure After Arthroscopic Anterior Stabilization of the Shoulder.

    Science.gov (United States)

    Phadnis, Joideep; Arnold, Christine; Elmorsy, Ahmed; Flannery, Mark

    2015-08-01

The redislocation rate after arthroscopic stabilization for anterior glenohumeral instability is up to 30%. The Instability Severity Index Score (ISIS) was developed to preoperatively rationalize the risk of failure, but it has not yet been validated by an independent group. To assess the utility of the ISIS in predicting failure of arthroscopic anterior shoulder stabilization and to identify other preoperative risk factors for failure. Case-control study; Level of evidence, 3. A case-control study was performed on 141 consecutive patients, comparing those who suffered failure of arthroscopic stabilization with those who had successful arthroscopic stabilization. The mean follow-up time was 47 months (range, 24-132 months). The ISIS was applied retrospectively, and an analysis was performed to establish independent risk factors for failure. A receiver operating characteristic (ROC) curve was constructed to set a threshold ISIS for considering alternative surgery. Of 141 patients, 19 (13.5%) suffered recurrent instability. The mean ISIS of the failed stabilization group was higher than that of the successful stabilization group (5.1 vs 1.7). Factors associated with failure were prior surgery (P < .001), age at first dislocation (P = .01), competitive-level participation in sports (P < .001), and participation in contact or overhead sports (P = .03). The presence of glenoid bone loss carried the highest risk of failure (70%). There was a 70% risk of failure if the ISIS was ≥4, as opposed to a 4% risk of failure if the ISIS was <4. This is the first completely independent study to confirm that the ISIS is a useful preoperative tool. It is recommended that surgeons consider alternative forms of stabilization if the ISIS is ≥4. © 2015 The Author(s).

  6. Development of methods for assessing the vulnerability of Australian residential building stock to severe wind

    International Nuclear Information System (INIS)

    Wehner, Martin; Sandland, Carl; Edwards, Mark; Ginger, John; Holmes, John

    2010-01-01

Knowledge of the degree of damage to residential structures expected from severe wind is used to study the benefits of adaptation strategies developed in response to expected changes in wind severity due to climate change. This study will inform government and the insurance industry, and will provide emergency services with estimates of expected damage. A series of heuristic wind vulnerability curves for Australian residential structures has been developed. In order to add rigor to the heuristic curves and to enable quantitative assessment of adaptation strategies, work has commenced on a simulation tool to quantitatively assess damage to buildings from severe wind. The simulation tool accounts for variability in wind profile, shielding, structural strength, pressure coefficients, building orientation, component self-weights, debris damage and water ingress via a Monte Carlo approach. The software takes a component-based approach to modelling building vulnerability. It is based on the premise that overall building damage is strongly related to the failure of key components (i.e. connections). If these failures can be ascertained, and the associated damage from debris and water penetration reliably estimated, scenarios of complete building damage can be assessed. This approach has been developed with varying degrees of rigor by researchers around the world and is best practice for the insurance industry.
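The component-based Monte Carlo premise can be illustrated with a heavily simplified sketch: sample the strength of each connection type, and count a house as failed when the peak wind pressure exceeds its weakest sampled capacity. All capacities, the coefficient of variation, and the unit pressure coefficient are invented illustrative numbers, not engineering data from the study.

```python
import random

# hypothetical mean capacities (Pa) of three connection types -- illustrative only
MEAN_CAPACITIES = (1800.0, 2400.0, 3000.0)
COV = 0.2          # assumed coefficient of variation of connection strength
RHO_AIR = 1.225    # air density, kg/m^3

def damage_probability(gust_speed, n_sims=20_000, seed=7):
    """Fraction of simulated houses whose weakest connection is overloaded by
    the peak dynamic pressure q = 0.5*rho*V^2 (pressure coefficient taken as 1)."""
    random.seed(seed)
    q = 0.5 * RHO_AIR * gust_speed ** 2
    failed = 0
    for _ in range(n_sims):
        weakest = min(random.gauss(m, COV * m) for m in MEAN_CAPACITIES)
        if weakest < q:
            failed += 1
    return failed / n_sims
```

Evaluating this over a range of gust speeds traces out exactly the kind of vulnerability curve (damage probability versus wind speed) the paper describes.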

  7. Acute kidney failure

    Science.gov (United States)

Alternative names: Renal failure - acute; ARF; Kidney injury - acute. Reference: Devarajan P. Biomarkers for assessment of renal function during acute kidney injury. In: Alpern RJ, Moe OW, Caplan M, ...

  8. Supply-cost curves for geographically distributed renewable-energy resources

    International Nuclear Information System (INIS)

    Izquierdo, Salvador; Dopazo, Cesar; Fueyo, Norberto

    2010-01-01

    The supply-cost curves of renewable-energy sources are an essential tool to synthesize and analyze large-scale energy-policy scenarios, both in the short and long terms. Here, we suggest and test a parametrization of such curves that allows their representation for modeling purposes with a minimal set of information. In essence, an economic potential is defined based on the mode of the marginal supply-cost curves; and, using this definition, a normalized log-normal distribution function is used to model these curves. The feasibility of this proposal is assessed with data from a GIS-based analysis of solar, wind and biomass technologies in Spain. The best agreement is achieved for solar energy.
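The proposed parametrization — a normalized log-normal distribution for the cumulative supply available at or below a given marginal cost — can be sketched directly from the standard log-normal CDF. The parameter values in the example are assumptions for illustration.

```python
import math

def lognormal_cdf(c, mu, sigma):
    """CDF of a log-normal distribution with log-mean mu and log-sd sigma."""
    return 0.5 * (1.0 + math.erf((math.log(c) - mu) / (sigma * math.sqrt(2.0))))

def supply_at_cost(c, q_total, mu, sigma):
    """Energy potential available at marginal cost <= c (units of q_total),
    modeling the supply-cost curve as a normalized log-normal CDF."""
    return q_total * lognormal_cdf(c, mu, sigma)
```

By construction, half of the total potential is available at the median cost exp(mu), and the curve rises monotonically with cost, which is the minimal information set the parametrization needs: total potential, a location parameter, and a spread.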

  9. Identifying the necessary and sufficient number of risk factors for predicting academic failure.

    Science.gov (United States)

    Lucio, Robert; Hunt, Elizabeth; Bornovalova, Marina

    2012-03-01

Identifying the point at which individuals become at risk for academic failure (low grade point average [GPA]) is critical for predicting academic success or failure. This study focused on 12 school-related factors. Using a thorough 5-step process, we identified which unique risk factors place one at risk for academic failure. Academic engagement, academic expectations, academic self-efficacy, homework completion, school relevance, school safety, teacher relationships (positive relationship), grade retention, school mobility, and school misbehaviors (negative relationship) were uniquely related to GPA even after controlling for all relevant covariates. Next, a receiver operating characteristic curve was used to determine a cutoff for the number of risk factors that predict academic failure, which provides a way for early identification of individuals who are at risk. Further implications of these findings are discussed. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  10. Application of failure mode and effects analysis (FMEA) to pretreatment phases in tomotherapy.

    Science.gov (United States)

    Broggi, Sara; Cantone, Marie Claire; Chiara, Anna; Di Muzio, Nadia; Longobardi, Barbara; Mangili, Paola; Veronese, Ivan

    2013-09-06

The aim of this paper was the application of the failure mode and effects analysis (FMEA) approach to assess the risks for patients undergoing radiotherapy treatments performed by means of a helical tomotherapy unit. FMEA was applied to the preplanning imaging, volume determination, and treatment planning stages of the tomotherapy process and consisted of three steps: 1) identification of the involved subprocesses; 2) identification and ranking of the potential failure modes, together with their causes and effects, using the risk priority number (RPN) scoring system; and 3) identification of additional safety measures to be proposed for process quality and safety improvement. The RPN upper threshold for little concern of risk was set at 125. A total of 74 failure modes were identified: 38 in the stage of preplanning imaging and volume determination, and 36 in the stage of planning. The threshold of 125 for RPN was exceeded in four cases: one case in the phase of preplanning imaging and volume determination, and three cases in the stage of planning. The most critical failures appeared related to (i) the wrong or missing definition and contouring of the overlapping regions, (ii) the wrong assignment of the overlap priority to each anatomical structure, (iii) the wrong choice of the computed tomography calibration curve for dose calculation, and (iv) the wrong (or not performed) choice of the number of fractions at the planning station. On the basis of these findings, in addition to the safety strategies already adopted in clinical practice, novel solutions have been proposed for mitigating the risk of these failures and increasing patient safety.
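The RPN scoring and the 125 threshold used above reduce to a product of three 1-10 scores and a comparison. A minimal sketch (function names and the example failure modes are hypothetical):

```python
def rpn(occurrence, severity, detectability):
    """Risk priority number: product of three scores, each on a 1-10 scale."""
    for score in (occurrence, severity, detectability):
        if not 1 <= score <= 10:
            raise ValueError("scores must be in 1..10")
    return occurrence * severity * detectability

def rank_failure_modes(modes, threshold=125):
    """modes: dict name -> (occurrence, severity, detectability).
    Returns (name, RPN, needs_extra_safeguards) tuples, highest RPN first."""
    scored = sorted(((name, rpn(*osd)) for name, osd in modes.items()),
                    key=lambda kv: kv[1], reverse=True)
    return [(name, value, value > threshold) for name, value in scored]
```

Failure modes flagged above the threshold are the ones for which additional safety measures are proposed, mirroring the four cases reported in the study.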

  11. Consideration of uncertainties in CCDF risk curves in safety oriented decision making processes

    International Nuclear Information System (INIS)

    Stern, E.; Tadmor, J.

    1988-01-01

In recent years, some of the results of Probabilistic Risk Assessment (i.e. the magnitudes of the various adverse health effects and other effects of potential accidents in nuclear power plants) have usually been presented as Complementary Cumulative Distribution Function curves, widely known as CCDF risk curves. CCDF curves are characteristic of probabilistic accident analyses and consequence calculations, although, in many cases, the codes producing the CCDF curves consist of a mixture of both probabilistic and deterministic calculations. One of the main difficulties in the process of PRA is the problem of uncertainties associated with the risk assessments. The uncertainties, as expressed in CCDF risk curves, can be classified into two main categories: (a) uncertainties expressed by the CCDF risk curve itself due to its probabilistic nature, and (b) the uncertainty band of CCDF risk curves. The band consists of a "family of CCDF curves" which represents the risks (e.g. early/late fatalities) evaluated at various levels of confidence for a specific Plant-Site Combination (PSC), i.e. a certain nuclear power plant located at a certain site. The reasons why a family of curves rather than a single curve represents the risk of a certain PSC have been discussed. Generally, the uncertainty band of CCDF curves is limited by the 95% ("conservative") and the 5% curves. In most cases the 50% (median, "best estimate") curve is also shown, because scientists tend to believe that it represents the "realistic" (or real) risk of the plant
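An empirical CCDF is simply P(X > x) evaluated over the observed (or simulated) consequence magnitudes. A minimal sketch (the function name and data are illustrative):

```python
def empirical_ccdf(samples):
    """Empirical complementary CDF: sorted (x, P(X > x)) pairs over the
    distinct sample values."""
    xs = sorted(set(samples))
    n = len(samples)
    return [(x, sum(s > x for s in samples) / n) for x in xs]
```

Plotting such a curve for each confidence level of the consequence model (5%, 50%, 95%) produces exactly the "family of CCDF curves" the abstract describes.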

  12. Possibilities And Influencing Parameters For The Early Detection Of Sheet Metal Failure In Press Shop Operations

    International Nuclear Information System (INIS)

    Gerlach, Joerg; Kessler, Lutz; Paul, Udo; Roesen, Hartwig

    2007-01-01

The concept of forming limit curves (FLC) is widely used in industrial practice. The required data should be delivered by the material suppliers for typical material properties (measured on coils with properties within +/- one standard deviation of the mean production values). In particular, it should be noted that the suppliers cannot provide forming limit curves covering the full scatter of the mechanical properties, so using measured curves to validate forming robustness is impossible. A forecast of the expected limit strains without expensive and time-consuming experiments is therefore necessary. In the paper, the quality of a regression analysis for determining forming limit curves from tensile test results is presented and discussed. Owing to the specific definition of limit strains with FLCs following linear strain paths, the significance of this failure definition is limited. To consider nonlinear strain path effects, different methods are given in the literature. One simple method is the concept of limit stresses. It should be noted that the determined value of the critical stress depends on the extrapolation of the tensile test curve. When the yield curve extrapolation is very similar to an exponential function, the definition of the critical stress value is very complicated due to the low slope of the hardening function at large strains. A new method to determine general failure behavior in sheet metal forming is the combined use and interpretation of three criteria: onset of material instability (comparable with the FLC concept), the value of critical shear fracture, and the value of ductile fracture. This method seems to be particularly successful for newly developed high strength steel grades in connection with more complex strain paths for some specific material elements. Nevertheless, identifying the different failure parameters or functions becomes more demanding, and the user has to learn to interpret the numerical results

  13. A Bayesian hierarchical model for demand curve analysis.

    Science.gov (United States)

    Ho, Yen-Yi; Nhu Vo, Tien; Chu, Haitao; Luo, Xianghua; Le, Chap T

    2018-07-01

    Drug self-administration experiments are a frequently used approach to assessing the abuse liability and reinforcing property of a compound. It has been used to assess the abuse liabilities of various substances such as psychomotor stimulants and hallucinogens, food, nicotine, and alcohol. The demand curve generated from a self-administration study describes how demand of a drug or non-drug reinforcer varies as a function of price. With the approval of the 2009 Family Smoking Prevention and Tobacco Control Act, demand curve analysis provides crucial evidence to inform the US Food and Drug Administration's policy on tobacco regulation, because it produces several important quantitative measurements to assess the reinforcing strength of nicotine. The conventional approach popularly used to analyze the demand curve data is individual-specific non-linear least square regression. The non-linear least square approach sets out to minimize the residual sum of squares for each subject in the dataset; however, this one-subject-at-a-time approach does not allow for the estimation of between- and within-subject variability in a unified model framework. In this paper, we review the existing approaches to analyze the demand curve data, non-linear least square regression, and the mixed effects regression and propose a new Bayesian hierarchical model. We conduct simulation analyses to compare the performance of these three approaches and illustrate the proposed approaches in a case study of nicotine self-administration in rats. We present simulation results and discuss the benefits of using the proposed approaches.
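The demand curve being fitted in such analyses is typically the Hursh-Silberberg exponential demand equation, log10 Q = log10 Q0 + k(e^(-alpha*Q0*C) - 1). As a dependency-free stand-in for the per-subject nonlinear least squares the paper reviews, the sketch below fits (Q0, alpha) by a crude grid search; the grid, the fixed k, and the data are illustrative assumptions, not the authors' procedure.

```python
import math

def demand_model(c, q0, alpha, k=2.0):
    """Hursh-Silberberg exponential demand: log10 consumption at unit price c.
    q0: intensity (consumption at zero price); alpha: elasticity parameter."""
    return math.log10(q0) + k * (math.exp(-alpha * q0 * c) - 1.0)

def fit_demand(prices, log_consumption, k=2.0):
    """Grid-search least-squares fit of (q0, alpha) -- a crude stand-in for
    per-subject nonlinear regression."""
    best = None
    for q0 in [x * 5.0 for x in range(1, 41)]:        # q0 from 5 to 200
        for a_exp in range(-5, -1):                   # alpha from 1e-5 to 5e-2
            for mant in (1.0, 2.0, 5.0):
                alpha = mant * 10.0 ** a_exp
                err = sum((demand_model(c, q0, alpha, k) - y) ** 2
                          for c, y in zip(prices, log_consumption))
                if best is None or err < best[0]:
                    best = (err, q0, alpha)
    return best[1], best[2]
```

The Bayesian hierarchical approach the paper proposes replaces this one-subject-at-a-time fit with shared priors across subjects, so between- and within-subject variability are estimated in one model.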

  14. Utilization of multimode Love wave dispersion curve inversion for geotechnical site investigation

    International Nuclear Information System (INIS)

    Hamimu, La; Nawawi, Mohd; Safani, Jamhir

    2011-01-01

Inversion codes based on a modified genetic algorithm (GA) have been developed to invert multimode Love wave dispersion curves. The multimode Love wave dispersion curves were synthesized from a profile representing a shear-wave velocity reversal, using the full SH (shear horizontal) waveform. In this study, we used a frequency-slowness transform to extract the dispersion curves from the full SH waveform. Dispersion curves overlain on dispersion images were picked manually. These curves were then inverted using the modified GA. To assess the accuracy of the inversion results, differences between the true and inverted shear-wave velocity profiles were quantified in terms of shear-wave velocity and thickness errors, E_S and E_H. Our numerical modeling showed that the inversion of multimode dispersion curves can provide a significantly better assessment of the shear-wave velocity structure, especially for velocity-reversal profiles typical of geotechnical site investigations. This approach has been applied to field data acquired at a site in Niigata prefecture, Japan. For these field data, our inversion results show good agreement between the calculated and experimental dispersion curves and accurately detect low-velocity layer targets

  15. [Application of decision curve on evaluation of MRI predictive model for early assessing pathological complete response to neoadjuvant therapy in breast cancer].

    Science.gov (United States)

    He, Y J; Li, X T; Fan, Z Q; Li, Y L; Cao, K; Sun, Y S; Ouyang, T

    2018-01-23

    Objective: To construct a dynamic enhanced MR based predictive model for early assessing pathological complete response (pCR) to neoadjuvant therapy in breast cancer, and to evaluate the clinical benefit of the model by using decision curve. Methods: From December 2005 to December 2007, 170 patients with breast cancer treated with neoadjuvant therapy were identified and their MR images before neoadjuvant therapy and at the end of the first cycle of neoadjuvant therapy were collected. Logistic regression model was used to detect independent factors for predicting pCR and construct the predictive model accordingly, then receiver operating characteristic (ROC) curve and decision curve were used to evaluate the predictive model. Results: ΔArea(max) and Δslope(max) were independent predictive factors for pCR, OR =0.942 (95% CI : 0.918-0.967) and 0.961 (95% CI : 0.940-0.987), respectively. The area under ROC curve (AUC) for the constructed model was 0.886 (95% CI : 0.820-0.951). Decision curve showed that in the range of the threshold probability above 0.4, the predictive model presented increased net benefit as the threshold probability increased. Conclusions: The constructed predictive model for pCR is of potential clinical value, with an AUC>0.85. Meanwhile, decision curve analysis indicates the constructed predictive model has net benefit from 3 to 8 percent in the likely range of probability threshold from 80% to 90%.
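The decision curve evaluation described above rests on the net benefit formula NB = TP/n − (FP/n)·p_t/(1 − p_t), evaluated at each threshold probability p_t and compared against treat-all and treat-none policies. A minimal sketch (function names and data are hypothetical):

```python
def net_benefit(y_true, y_prob, threshold):
    """Net benefit of 'treat if predicted risk >= threshold':
    NB = TP/n - (FP/n) * p_t / (1 - p_t)."""
    n = len(y_true)
    tp = sum(1 for y, p in zip(y_true, y_prob) if p >= threshold and y == 1)
    fp = sum(1 for y, p in zip(y_true, y_prob) if p >= threshold and y == 0)
    return tp / n - (fp / n) * threshold / (1.0 - threshold)

def net_benefit_treat_all(y_true, threshold):
    """Net benefit of treating everyone, the usual comparator curve."""
    prev = sum(y_true) / len(y_true)
    return prev - (1.0 - prev) * threshold / (1.0 - threshold)
```

A model shows clinical value at a threshold exactly when its net benefit exceeds both the treat-all comparator and zero (treat none), which is the comparison the decision curve in the study visualizes.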

  16. Test-to-Failure of Crystalline Silicon Modules: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hacke, P.; Terwilliger, K.; Glick, S.; Trudell, D.; Bosco, N.; Johnston, S.; Kurtz, S. R.

    2010-10-01

Accelerated lifetime testing of five crystalline silicon module designs was carried out according to the Terrestrial Photovoltaic Module Accelerated Test-to-Failure Protocol. This protocol compares the reliability of various module constructions on a quantitative basis. The modules under test are subdivided into three accelerated lifetime testing paths: 85°C/85% relative humidity with system bias, thermal cycling between -40°C and 85°C, and a path that alternates between damp heat and thermal cycling. The most severe stressor is damp heat with system bias applied to simulate the voltages that modules experience when connected in an array. Positive 600 V applied to the active layer with respect to the grounded module frame accelerates corrosion of the silver grid fingers and degrades the silicon nitride antireflective coating on the cells. Dark I-V curve fitting indicates increased series resistance and saturation current around the maximum power point; however, an improvement in junction recombination characteristics is obtained. Shunt paths and cell-metallization interface failures are seen developing in the silicon cells, as determined by electroluminescence, thermal imaging, and I-V curves, in the case of negative 600 V bias applied to the active layer. Ability to withstand electrolytic corrosion, moisture ingress, and ion drift under system voltage bias is differentiated.

  17. Modeling of Electrical Cable Failure in a Dynamic Assessment of Fire Risk

    Science.gov (United States)

    Bucknor, Matthew D.

complexity to existing cable failure techniques and tuned to empirical data can better approximate the temperature response of cables located in tightly packed cable bundles. The new models also provide a way to determine the conditions inside a cable bundle, which allows for separate treatment of cables on the interior of the bundle from cables on the exterior of the bundle. The results from the DET analysis show that the overall assessed probability of cable failure can be significantly reduced by more realistically accounting for the influence that the fire brigade has on a fire progression scenario. The shielding analysis results demonstrate a significant reduction in the temperature response of a shielded versus a non-shielded cable bundle; however, the computational cost of using a fire progression model that can capture these effects may be prohibitive for performing DET analyses with currently available computational fluid dynamics models and computational resources.

  18. Critical crack path assessments in failure investigations

    Directory of Open Access Journals (Sweden)

    Robert D. Caligiuri

    2015-10-01

Full Text Available This paper presents a case study in which identification of the controlling crack path was critical to identifying the root cause of the failure. The case involves the rupture of a 30-inch (0.76 m) natural gas pipeline in 2010 that tragically led to the destruction of a number of homes and the loss of life. The segment of the pipeline that ruptured was installed in 1956. The longitudinal seam of the segment that ruptured was supposed to have been fabricated by double submerged arc welding. Unfortunately, portions of the segment only received a single submerged arc weld on the outside, leaving unwelded areas on the inside diameter. Post-failure examination of the segment revealed that the rupture originated at one of these unwelded areas. Examination also revealed three additional crack paths or zones emanating from the unwelded area: a zone of ductile tearing, a zone of fatigue, and a zone of cleavage fracture, in that sequence. Initial investigators ignored the ductile tear, assumed the critical crack path was the fatigue component, and (incorrectly) concluded that the root cause of the incident was the failure of the operator to hydrotest the segment after it was installed in 1956. However, as discussed in this paper, the critical path or mechanism was the ductile tear. Furthermore, it was determined that the ductile tear was created during the hydrotest at installation by a mechanism known as pressure reversal. Thus the correct root cause of the rupture was the hydrotest the operator subjected the segment to at installation, a finding that helps increase the awareness of operators and regulators of the potential problems associated with hydrotesting.

  19. ESTABLISHMENT OF THE PERMISSIBLE TRAIN SPEED ON THE CURVED TURNOUTS

    Directory of Open Access Journals (Sweden)

    O. M. Patlasov

    2016-04-01

Full Text Available Purpose. Turnouts play a key role in the railway transportation process. One-sided and many-sided curved turnouts were laid over the last 20 years in difficult conditions (curved sections, yard necks). Unlike conventional one-sided turnouts, they have a number of geometric features. Today the normative documents prohibit laying such turnouts in curved track sections and only partially regulate the procedure for assessing their real condition. The question of establishing the permissible train speed on curved turnouts is still open. In this regard, the authors propose to set the train speed according to the driving comfort criterion, using the results of field measurements of ordinates from the baseline for the particular curved turnout. Methodology. The article considers the criteria by which the permissible speed on turnouts can be set, and defines the complexity of their application, their advantages and disadvantages. Findings. The work analyzes the speed distribution along the length of a real curved turnout for the forward and lateral directions. It establishes the values of the rate of change of unbalanced acceleration for the existing maintenance norms of curved track sections, according to the difference in adjacent bend versines, at speeds up to 160 km/h. Originality. A method for establishing the train speed limit on curved turnouts was developed. It takes into account the actual geometric position in plan of the forward and lateral turnout directions. This approach makes it possible to establish barrier places in plan on the turnouts that limit the train speed. Practical value. The proposed method makes it possible to objectively assess and set the permissible train speed on the basis of ordinate measurements of the forward and lateral directions of curved turnouts from the baseline, using the driving comfort criterion. The method was tested on real turnouts, which are located within the Pridneprovsk
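The comfort criterion behind such speed limits is the unbalanced lateral acceleration a = v²/R − g·h/s, where R is the curve radius, h the cant, and s the rail-center spacing. The sketch below inverts this for the permissible speed; the spacing of 1.6 m and the comfort limit of 0.7 m/s² are illustrative assumptions, not values from the paper.

```python
import math

G = 9.81    # gravitational acceleration, m/s^2
S = 1.6     # assumed effective rail-center spacing, m

def unbalanced_acceleration(v_kmh, radius_m, cant_mm):
    """Unbalanced lateral acceleration a = v^2/R - g*h/s, in m/s^2."""
    v = v_kmh / 3.6
    return v * v / radius_m - G * (cant_mm / 1000.0) / S

def permissible_speed(radius_m, cant_mm, a_limit=0.7):
    """Highest speed (km/h) at which the unbalanced acceleration stays
    within the assumed comfort limit a_limit (m/s^2)."""
    v = math.sqrt(radius_m * (a_limit + G * (cant_mm / 1000.0) / S))
    return 3.6 * v
```

In the paper's method, the local radius along the turnout is recovered from the measured ordinates (versines), so this evaluation can be repeated point by point to find the barrier places that govern the speed limit.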

  20. Corrosion induced failure analysis of subsea pipelines

    International Nuclear Information System (INIS)

    Yang, Yongsheng; Khan, Faisal; Thodi, Premkumar; Abbassi, Rouzbeh

    2017-01-01

    Pipeline corrosion is one of the main causes of subsea pipeline failure. It is necessary to monitor and analyze pipeline condition to effectively predict likely failure. This paper presents an approach to analyze the observed abnormal events to assess the condition of subsea pipelines. First, it focuses on establishing a systematic corrosion failure model by Bow-Tie (BT) analysis, and subsequently the BT model is mapped into a Bayesian Network (BN) model. The BN model facilitates the modelling of interdependency of identified corrosion causes, as well as the updating of failure probabilities depending on the arrival of new information. Furthermore, an Object-Oriented Bayesian Network (OOBN) has been developed to better structure the network and to provide an efficient updating algorithm. Based on this OOBN model, probability updating and probability adaptation are performed at regular intervals to estimate the failure probabilities due to corrosion and potential consequences. This results in an interval-based condition assessment of subsea pipeline subjected to corrosion. The estimated failure probabilities would help prioritize action to prevent and control failures. Practical application of the developed model is demonstrated using a case study. - Highlights: • A Bow-Tie (BT) based corrosion failure model linking causation with the potential losses. • A novel Object-Oriented Bayesian Network (OOBN) based corrosion failure risk model. • Probability of failure updating and adaptation with respect to time using OOBN model. • Application of the proposed model to develop and test strategies to minimize failure risk.
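The probability updating at the heart of the BN/OOBN model reduces, for a single observed abnormal event, to Bayes' rule. A minimal single-node sketch (the function name and the sensor likelihoods in the test are hypothetical; the paper's network links many such causes and consequences):

```python
def bayes_update(prior, likelihood_given_fail, likelihood_given_ok):
    """Posterior failure probability after observing an abnormal event E:
    P(F|E) = P(E|F)P(F) / (P(E|F)P(F) + P(E|~F)P(~F))."""
    num = likelihood_given_fail * prior
    den = num + likelihood_given_ok * (1.0 - prior)
    return num / den
```

Applying this at regular inspection intervals, with the posterior of one interval serving as the prior of the next, gives the interval-based condition assessment the abstract describes.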

  1. Failure Monitoring and Condition Assessment of Steel-Concrete Adhesive Connection Using Ultrasonic Waves

    Directory of Open Access Journals (Sweden)

    Magdalena Rucka

    2018-02-01

    Full Text Available Adhesive bonding is increasingly being incorporated into civil engineering applications. Recently, the use of structural adhesives in steel-concrete composite systems has been of particular interest. The aim of the study is an experimental investigation of the damage assessment of the connection between steel and concrete during mechanical degradation. Nine specimens, each consisting of a concrete cube and two adhesively bonded steel plates, were examined. The inspection was based on ultrasound monitoring during push-out tests. Ultrasonic waves were excited and registered by means of piezoelectric transducers every two seconds until specimen failure. To determine the slip between the steel and concrete, a photogrammetric method was applied. The procedure of damage evaluation is based on monitoring the changes in the amplitude and phase shift of signals measured during subsequent phases of degradation. To quantify discrepancies between the reference signal and other registered signals, the Sprague and Geers metric was applied. The results showed the possibilities and limitations of the proposed approach in the diagnostics of adhesive connections between steel and concrete, depending on the failure modes.
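    The Sprague and Geers metric mentioned above has a standard closed form: a magnitude error M, a phase error P, and a combined error C = sqrt(M² + P²), all built from integrals of the two signals. A sketch, with synthetic sinusoids standing in for the measured ultrasonic records:

    ```python
    import numpy as np

    def _trapz(y, t):
        """Trapezoidal integration (avoids version differences in NumPy's name)."""
        return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)

    def sprague_geers(reference, signal, t):
        """Sprague & Geers error metric: magnitude error M, phase error P,
        combined error C = sqrt(M^2 + P^2)."""
        s_rr = _trapz(reference * reference, t)
        s_ss = _trapz(signal * signal, t)
        s_rs = _trapz(reference * signal, t)
        M = np.sqrt(s_ss / s_rr) - 1.0
        P = np.arccos(np.clip(s_rs / np.sqrt(s_rr * s_ss), -1.0, 1.0)) / np.pi
        return M, P, float(np.hypot(M, P))

    # Synthetic stand-ins for the measured ultrasonic records:
    t = np.linspace(0.0, 1e-3, 4001)                 # 1 ms record
    ref = np.sin(2 * np.pi * 50e3 * t)               # 50 kHz reference burst
    deg = 0.8 * np.sin(2 * np.pi * 50e3 * t + 0.2)   # attenuated, phase-shifted
    M, P, C = sprague_geers(ref, deg, t)
    ```

    For this synthetic pair, the 20% amplitude drop appears as M near -0.2 and the 0.2 rad shift as P near 0.2/π; a growing C over successive load steps is what signals progressing degradation of the bond.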

  2. Biaxial failure criteria and stress-strain response for concrete of containment structure

    International Nuclear Information System (INIS)

    Lee, S. K.; Woo, S. K.; Song, Y. C.; Kweon, Y. K.; Cho, C. H.

    2001-01-01

    Biaxial failure criteria and stress-strain response for plain concrete of the containment structure of nuclear power plants are studied under uniaxial and biaxial stress (compression-compression, compression-tension, and tension-tension combined stress). Concrete specimens of a square plate type are used for uniaxial and biaxial loading. The experimental data indicate that the strength of concrete under equal biaxial compression, f2/f1 = -1/-1, is 17 percent larger than under uniaxial compression, and that the Poisson's ratio of the concrete is 0.1745. On the basis of these results, a biaxial failure envelope is provided for plain concrete whose uniaxial strength is 5660 psi, and the biaxial failure behaviors for the three biaxial loading regions are plotted respectively. In addition, reliable analytical equations are proposed to represent the biaxial failure criteria and stress-strain response curves of the concrete

  3. Structural integrity of stainless steel components exposed to neutron irradiation. Change in failure strength of cracked components due to cold working

    International Nuclear Information System (INIS)

    Kamaya, Masayuki; Hojo, Tomohiro; Mochizuki, Masahito

    2015-01-01

    The load carrying capacity of an austenitic stainless steel component is increased by the hardening caused by neutron irradiation if the component contains no crack. On the other hand, if a crack initiates in a reactor component, the hardening may decrease the load carrying capacity through the accompanying reduction in fracture toughness. In this paper, in order to develop a failure assessment procedure for irradiated cracked components, the characteristics of the change in failure strength of stainless steels due to cold working were investigated. It was experimentally shown that the proof and tensile strengths were increased by cold working, whereas the fracture toughness was decreased. The fracture strengths of a cylinder with a circumferential surface crack were analyzed using the obtained material properties. Although cold working altered the failure mode from plastic collapse to unstable ductile crack growth, it did not reduce the failure strength even when 50% cold working was applied. The increase in failure strength was caused not only by the increase in flow stress but also by the reduction in the J-integral value brought about by the change in the stress-strain curve. It was shown that the failure strength of hardened stainless steel components could be derived by the two-parameter method, in which the change in material properties can be reasonably considered. (author)
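    The two-parameter method referred to here is a failure assessment diagram (FAD) approach: an assessment point (Lr, Kr) is compared against a failure assessment curve. A minimal sketch using the R6 Option 1 curve as an assumed form (the paper's exact curve and cutoff may differ):

    ```python
    import math

    def fad_curve(Lr):
        """R6 Option 1 failure assessment curve Kr = f(Lr) (assumed form):
        f(Lr) = (1 - 0.14 Lr^2) * (0.3 + 0.7 exp(-0.65 Lr^6))."""
        return (1.0 - 0.14 * Lr ** 2) * (0.3 + 0.7 * math.exp(-0.65 * Lr ** 6))

    def is_acceptable(Kr, Lr, Lr_max=1.2):
        """An assessment point (Lr = load / limit load, Kr = applied K / fracture
        toughness) is acceptable if it lies inside the FAD. Lr_max depends on
        the material's flow-to-yield ratio; 1.2 is an illustrative cutoff."""
        return Lr <= Lr_max and Kr <= fad_curve(Lr)
    ```

    Hardening from cold working moves both coordinates: the higher flow stress lowers Lr for a given load, while the reduced toughness raises Kr, which is why the two effects must be assessed together rather than separately.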

  4. The multi-class binomial failure rate model for the treatment of common-cause failures

    International Nuclear Information System (INIS)

    Hauptmanns, U.

    1995-01-01

    The impact of common cause failures (CCF) on PSA results for NPPs is in sharp contrast with the limited quality which can be achieved in their assessment. This is due to the dearth of observations and cannot be remedied in the short run. Therefore the methods employed for calculating failure rates should be devised so as to make the best use of the few available observations on CCF. The Multi-Class Binomial Failure Rate (MCBFR) model achieves this by assigning observed failures to different classes according to their technical characteristics and applying the BFR formalism to each of these. The results are hence determined by a superposition of BFR-type expressions for each class, each with its own coupling factor. The model thus obtained flexibly reproduces the dependence of CCF rates on failure multiplicity suggested by the observed failure multiplicities. This is demonstrated by evaluating CCFs observed for combined impulse pilot valves in German NPPs. (orig.) [de]
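    The superposition of per-class BFR terms can be sketched directly. In the BFR formalism, a shock of rate ν fails each of m redundant components independently with coupling probability p, so the rate of failing exactly k components is binomial; the MCBFR rate is the sum of such terms over classes. The shock rates and coupling factors below are illustrative, not the German plant data.

    ```python
    from math import comb

    def bfr_rate(k, m, nu, p):
        """Binomial failure rate: rate of shocks that fail exactly k of m
        redundant components, given shock rate nu and coupling factor p."""
        return nu * comb(m, k) * p ** k * (1.0 - p) ** (m - k)

    def mcbfr_rate(k, m, classes):
        """Multi-class BFR: superpose one BFR term per failure class, each
        with its own shock rate and coupling factor."""
        return sum(bfr_rate(k, m, nu, p) for nu, p in classes)

    # Illustrative classes: frequent weakly-coupling shocks plus rare
    # strongly-coupling shocks, acting on a 4-component group
    classes = [(1e-3, 0.2), (5e-5, 0.9)]
    rates = {k: mcbfr_rate(k, 4, classes) for k in range(2, 5)}
    ```

    The strongly-coupling class dominates at high multiplicities even though its shock rate is much lower, which is exactly the flexibility in the multiplicity dependence that the abstract describes.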

  5. Effects of intravenous home dobutamine in palliative end-stage heart failure on quality of life, heart failure hospitalization, and cost expenditure.

    Science.gov (United States)

    Martens, Pieter; Vercammen, Jan; Ceyssens, Wendy; Jacobs, Linda; Luwel, Evert; Van Aerde, Herwig; Potargent, Peter; Renaers, Monique; Dupont, Matthias; Mullens, Wilfried

    2018-01-17

    In patients with palliative end-stage heart failure, interventions that could provide symptomatic relief and prevent hospital admissions are important. Ambulatory continuous intravenous inotropes have been advocated by guidelines for such a purpose. We sought to determine the effect of intravenous dobutamine on symptomatic status, hospital stay, mortality, and cost expenditure. All consecutive end-stage heart failure patients not amenable for advanced therapies and discharged with continuous intravenous home dobutamine from a single tertiary centre between April 2011 and January 2017 were retrospectively analysed. Dobutamine (fixed dose) was infused through a single-lumen central venous catheter with a small pump that was refilled by a nurse on a daily basis. Symptomatic status was longitudinally assessed as the change in New York Heart Association class and patient global assessment scale. Antecedent and incident heart failure hospitalizations were determined in a paired fashion, and cost impact was assessed. A total of 21 patients (age 77 ± 9 years) were followed up for 869 ± 647 days. At first follow-up (6 ± 1 weeks) after the initiation of dobutamine, patients had a significant improvement in New York Heart Association class (-1.29 ± 0.64; P heart failure hospitalizations assessed at 3, 6, and 12 months were significantly reduced (P heart failure hospitalizations over the same time period. Cost expenditure was significantly lower at 3 (P heart failure is feasible and associated with improved symptomatic status, heart failure hospitalizations, and health-care-related costs. Nevertheless, results should be interpreted in the context of the small and retrospective design. Larger studies are necessary to evaluate the effect of dobutamine in palliative end-stage heart failure. © 2018 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.

  6. Prognostic Value of Pulmonary Vascular Resistance by Magnetic Resonance in Systolic Heart Failure

    Energy Technology Data Exchange (ETDEWEB)

    Fabregat-Andrés, Óscar, E-mail: osfabregat@gmail.com [Departamento de Cardiologia - Hospital General Universitario de Valencia, Valencia (Spain); Fundación para la Investigación - Hospital General Universitario de Valencia, Valencia (Spain); Estornell-Erill, Jordi [Unidad de Imagen Cardiaca - ERESA - Hospital General Universitario de Valencia, Valencia (Spain); Ridocci-Soriano, Francisco [Departamento de Cardiologia - Hospital General Universitario de Valencia, Valencia (Spain); Departamento de Medicina. Universitat de Valencia, Valencia (Spain); Pérez-Boscá, José Leandro [Departamento de Cardiologia - Hospital General Universitario de Valencia, Valencia (Spain); García-González, Pilar [Unidad de Imagen Cardiaca - ERESA - Hospital General Universitario de Valencia, Valencia (Spain); Payá-Serrano, Rafael [Departamento de Cardiologia - Hospital General Universitario de Valencia, Valencia (Spain); Departamento de Medicina. Universitat de Valencia, Valencia (Spain); Morell, Salvador [Departamento de Cardiologia - Hospital General Universitario de Valencia, Valencia (Spain); Cortijo, Julio [Fundación para la Investigación - Hospital General Universitario de Valencia, Valencia (Spain); Departamento de Farmacologia. Universitat de Valencia, Valencia (Spain)

    2016-03-15

    Pulmonary hypertension is associated with poor prognosis in heart failure. However, non-invasive diagnosis is still challenging in clinical practice. We sought to assess the prognostic utility of non-invasive estimation of pulmonary vascular resistance (PVR) by cardiovascular magnetic resonance to predict adverse cardiovascular outcomes in heart failure with reduced ejection fraction (HFrEF). A prospective registry was compiled over three years of patients with left ventricular ejection fraction (LVEF) < 40% recently admitted for decompensated heart failure. PVR was calculated based on right ventricular ejection fraction and average velocity of the pulmonary artery estimated during cardiac magnetic resonance. Readmission for heart failure and all-cause mortality were considered as adverse events at follow-up. 105 patients (average LVEF 26.0 ± 7.7%, ischemic etiology 43%) were included. Patients with adverse events at long-term follow-up had higher values of PVR (6.93 ± 1.9 vs. 4.6 ± 1.7 estimated Wood units (eWu), p < 0.001). In multivariate Cox regression analysis, PVR ≥ 5 eWu (cutoff value according to ROC curve) was independently associated with increased risk of adverse events at 9 months follow-up (HR 2.98; 95% CI 1.12-7.88; p < 0.03). In patients with HFrEF, the presence of PVR ≥ 5.0 Wu is associated with significantly worse clinical outcome at follow-up. Non-invasive estimation of PVR by cardiac magnetic resonance might be useful for risk stratification in HFrEF, irrespective of etiology, presence of late gadolinium enhancement or LVEF.

  7. Prognostic Value of Pulmonary Vascular Resistance by Magnetic Resonance in Systolic Heart Failure

    International Nuclear Information System (INIS)

    Fabregat-Andrés, Óscar; Estornell-Erill, Jordi; Ridocci-Soriano, Francisco; Pérez-Boscá, José Leandro; García-González, Pilar; Payá-Serrano, Rafael; Morell, Salvador; Cortijo, Julio

    2016-01-01

    Pulmonary hypertension is associated with poor prognosis in heart failure. However, non-invasive diagnosis is still challenging in clinical practice. We sought to assess the prognostic utility of non-invasive estimation of pulmonary vascular resistance (PVR) by cardiovascular magnetic resonance to predict adverse cardiovascular outcomes in heart failure with reduced ejection fraction (HFrEF). A prospective registry was compiled over three years of patients with left ventricular ejection fraction (LVEF) < 40% recently admitted for decompensated heart failure. PVR was calculated based on right ventricular ejection fraction and average velocity of the pulmonary artery estimated during cardiac magnetic resonance. Readmission for heart failure and all-cause mortality were considered as adverse events at follow-up. 105 patients (average LVEF 26.0 ± 7.7%, ischemic etiology 43%) were included. Patients with adverse events at long-term follow-up had higher values of PVR (6.93 ± 1.9 vs. 4.6 ± 1.7 estimated Wood units (eWu), p < 0.001). In multivariate Cox regression analysis, PVR ≥ 5 eWu (cutoff value according to ROC curve) was independently associated with increased risk of adverse events at 9 months follow-up (HR 2.98; 95% CI 1.12-7.88; p < 0.03). In patients with HFrEF, the presence of PVR ≥ 5.0 Wu is associated with significantly worse clinical outcome at follow-up. Non-invasive estimation of PVR by cardiac magnetic resonance might be useful for risk stratification in HFrEF, irrespective of etiology, presence of late gadolinium enhancement or LVEF.

  8. Controller recovery from equipment failures in air traffic control: A framework for the quantitative assessment of the recovery context

    International Nuclear Information System (INIS)

    Subotic, Branka; Schuster, Wolfgang; Majumdar, Arnab; Ochieng, Washington

    2014-01-01

    Air Traffic Control (ATC) involves a complex interaction of human operators (primarily air traffic controllers), equipment and procedures. On the rare occasions when equipment malfunctions, controllers play a crucial role in the recovery process of the ATC system for continued safe operation. Research on human performance in other safety critical industries using human reliability assessment techniques has shown that the context in which recovery from failures takes place has a significant influence on the outcome of the process. This paper investigates the importance of context in which air traffic controller recovery from equipment failures takes place, defining it in terms of 20 Recovery Influencing Factors (RIFs). The RIFs are used to develop a novel approach for the quantitative assessment of the recovery context based on a metric referred to as the Recovery Context Indicator (RCI). The method is validated by a series of simulation exercises conducted at a specific ATC Centre. The proposed method is useful to assess recovery enhancement approaches within ATC centres

  9. Pathophysiological Characteristics Underlying Different Glucose Response Curves

    DEFF Research Database (Denmark)

    Hulman, Adam; Witte, Daniel R; Vistisen, Dorte

    2018-01-01

    … different glucose curve patterns and studied their stability and reproducibility over 3 years of follow-up. RESEARCH DESIGN AND METHODS: We analyzed data from participants without diabetes from the observational cohort of the European Group for the Study of Insulin Resistance: Relationship between Insulin Sensitivity and Cardiovascular Disease study; participants had a five-time point OGTT at baseline (n = 1,443) and after 3 years (n = 1,045). Measures of insulin sensitivity and secretion were assessed at baseline with a euglycemic-hyperinsulinemic clamp and intravenous glucose tolerance test. Heterogeneous … The glucose patterns identified at follow-up were similar to those at baseline, suggesting that the latent class method is robust. We integrated our classification model into an easy-to-use online application that facilitates the assessment of glucose curve patterns for other studies. CONCLUSIONS: …

  10. Modelling and assessment of urban flood hazards based on rainfall intensity-duration-frequency curves reformation

    OpenAIRE

    Ghazavi, Reza; Moafi Rabori, Ali; Ahadnejad Reveshty, Mohsen

    2016-01-01

    Estimating the design storm from rainfall intensity-duration-frequency (IDF) curves is an important step in the hydrologic planning of urban areas. The main aim of this study was to estimate rainfall intensities of the Zanjan city watershed based on the overall relationship of rainfall IDF curves and an appropriate model of hourly rainfall estimation (the Sherman method and the Ghahreman and Abkhezr method). The hydrologic and hydraulic impacts of changes in rainfall IDF curves on flood properties were evaluated via Stormw...
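    A Sherman-type IDF relation is commonly written i = a / (t + b)^c, with the coefficients fitted per return period from rainfall records. The sketch below uses invented coefficients, not those derived for Zanjan:

    ```python
    def sherman_intensity(duration_min, a, b, c):
        """Sherman-type IDF relation i = a / (t + b)^c, giving intensity in
        mm/h for a duration t in minutes; a, b, c are fitted per return
        period (the values used below are illustrative)."""
        return a / (duration_min + b) ** c

    # Illustrative coefficients for a single return period
    a, b, c = 1200.0, 12.0, 0.8
    i_15 = sherman_intensity(15.0, a, b, c)   # design intensity, 15-min storm
    i_60 = sherman_intensity(60.0, a, b, c)   # design intensity, 1-h storm
    ```

    Intensity decreases monotonically with duration, so shorter design storms always govern the peak intensity; refitting a, b, c is how a reformed IDF relationship propagates into the flood modelling.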

  11. Anorexia, functional capacity, and clinical outcome in patients with chronic heart failure: results from the Studies Investigating Co‐morbidities Aggravating Heart Failure (SICA‐HF)

    Science.gov (United States)

    Saitoh, Masakazu; dos Santos, Marcelo R.; Emami, Amir; Ishida, Junichi; Ebner, Nicole; Valentova, Miroslava; Bekfani, Tarek; Sandek, Anja; Lainscak, Mitja; Doehner, Wolfram; Anker, Stefan D.

    2017-01-01

    Abstract Aims We aimed to assess determinants of anorexia, that is loss of appetite in patients with heart failure (HF) and aimed to further elucidate the association between anorexia, functional capacity, and outcomes in affected patients. Methods and results We assessed anorexia status among 166 patients with HF (25 female, 66 ± 12 years) who participated in the Studies Investigating Co‐morbidities Aggravating HF. Anorexia was assessed by a 6‐point Likert scale (ranging from 0 to 5), wherein values ≥1 indicate anorexia. Functional capacity was assessed as peak oxygen uptake (peak VO2), 6 min walk test, and short physical performance battery test. A total of 57 patients (34%) reported any anorexia, and these patients showed lower values of peak VO2, 6 min walk distance, and short physical performance battery score (all P anorexia. A total of 22 patients (13%) died during a mean follow‐up of 22.5 ± 5.1 months. Kaplan‐Meier curves for cumulative survival showed that those patients with anorexia presented higher mortality (Log‐rank test P = 0.03). Conclusions Inflammation, use of loop diuretics, and cachexia are associated with an increased likelihood of anorexia in patients with HF, and patients with anorexia showed impaired functional capacity and poor outcomes. PMID:28960880

  12. Determination of Ductile Tearing Resistance Curve in Weld Joints

    International Nuclear Information System (INIS)

    Marie, S.; Gilles, P.; Ould, P.

    2010-01-01

    In the ductile domain, steels present a tearing resistance which increases with crack propagation up to failure. This ductile tearing resistance is in general characterised with curves giving the variation of a global parameter (crack-tip opening displacement δ, or the integral J) versus the crack extension Δa. These global approaches depend more or less on the specimen geometry and on the type of imposed loading. Local approaches based on the description of the ductile tearing mechanisms provide a reliable solution to the transferability problem (from the lab specimen to the component) but are complex and costly to use and are not codified. These problems get worse in the case of a weld joint, for which no standard is available for the measurement of ductile tearing resistance. Yet welded joints are often the weak point of a structure because of the greater risk of defects, the heterogeneity of the weld microstructure, and deformation along the interface between two materials with different yield stresses (mismatch).... After briefly recalling the problems of transferability of ductile tearing resistance curves obtained on lab specimens to the case of components, this article identifies the factors complicating the determination of toughness in welded joints and gives recommendations for the experimental determination of ductile tearing resistance curves of welded joints
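    A common global characterisation of the resistance curve is a power-law fit J = C(Δa)^m to specimen data, obtained by linear least squares in log-log space. A sketch with illustrative data points (not from the article):

    ```python
    import numpy as np

    # Power-law fit of a ductile tearing resistance curve, J = C * (da)^m.
    # The (da, J) pairs below are illustrative test data.
    da = np.array([0.2, 0.5, 1.0, 1.5, 2.0])           # crack extension, mm
    J = np.array([180.0, 310.0, 450.0, 560.0, 650.0])  # J-integral, kJ/m^2

    # np.polyfit returns [slope, intercept] for a degree-1 fit
    m, logC = np.polyfit(np.log(da), np.log(J), 1)
    C = float(np.exp(logC))

    def j_resistance(delta_a):
        """Fitted J-R curve evaluated at a crack extension delta_a (mm)."""
        return C * delta_a ** m
    ```

    For a weld joint the same fit can be made, but the fitted C and m become sensitive to where the crack sits relative to the fusion line and to the strength mismatch, which is the transferability difficulty the article addresses.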

  13. FN-curves: preliminary estimation of severe accident risks after Fukushima

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Soares, Wellington Antonio; Costa, Antonio Carlos Lopes da

    2015-01-01

    Doubts of whether the risks related to severe accidents in nuclear reactors are indeed very low were raised after the nuclear accident at Fukushima Daiichi in 2011. Risk estimations of severe accidents in nuclear power plants involve both probability and consequence assessment of such events. Among the ways to display risks, risk curves are tools that express the frequency of exceeding a certain magnitude of consequence. Societal risk is often represented graphically in an FN-curve, a type of risk curve, which displays the probability of having N or more fatalities per year, as a function of N, on a double logarithmic scale. The FN-curve, originally introduced for the assessment of the risks in the nuclear industry through the U.S.NRC Reactor Safety Study WASH-1400 (1975), is used in various countries to express and limit risks of hazardous activities. This first study estimated an expected rate of core damage equal to 5x10^-5 per reactor-year and suggested an upper bound of 3x10^-4 per reactor-year. A more recent report issued by the Electric Power Research Institute - EPRI (2008) estimates a figure of the order of 2x10^-5 per reactor-year. The Fukushima nuclear accident apparently implies that the observed core damage frequency is higher than that predicted by these probabilistic safety assessments. Therefore, this paper presents a preliminary analysis of the FN-curves related to severe nuclear reactor accidents, taking into account a combination of available data of past accidents, probability modelling to estimate frequencies, and expert judgments. (author)
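    Constructing an FN-curve from a set of postulated scenarios amounts to computing a complementary cumulative frequency: for each consequence level N, sum the frequencies of all scenarios with at least N fatalities. The scenario list below is illustrative:

    ```python
    # Building an FN-curve (annual frequency F of events with N or more
    # fatalities) from postulated accident scenarios. Data are illustrative.
    scenarios = [
        (1e-3, 1),     # (annual frequency, fatalities)
        (2e-4, 10),
        (5e-5, 100),
        (1e-6, 1000),
    ]

    def fn_curve(scenarios):
        """Return (N, F(N)) pairs with F(N) = total frequency of scenarios
        producing at least N fatalities; F is non-increasing in N."""
        ns = sorted({n for _, n in scenarios})
        return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

    curve = fn_curve(scenarios)
    ```

    Plotting the resulting pairs on double logarithmic axes gives the familiar downward-sloping FN-curve, against which criterion lines (such as risk-acceptance limits) can be drawn.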

  14. FN-curves: preliminary estimation of severe accident risks after Fukushima

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Soares, Wellington Antonio; Costa, Antonio Carlos Lopes da, E-mail: vasconv@cdtn.br, E-mail: soaresw@cdtn.br, E-mail: aclc@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    Doubts of whether the risks related to severe accidents in nuclear reactors are indeed very low were raised after the nuclear accident at Fukushima Daiichi in 2011. Risk estimations of severe accidents in nuclear power plants involve both probability and consequence assessment of such events. Among the ways to display risks, risk curves are tools that express the frequency of exceeding a certain magnitude of consequence. Societal risk is often represented graphically in an FN-curve, a type of risk curve, which displays the probability of having N or more fatalities per year, as a function of N, on a double logarithmic scale. The FN-curve, originally introduced for the assessment of the risks in the nuclear industry through the U.S.NRC Reactor Safety Study WASH-1400 (1975), is used in various countries to express and limit risks of hazardous activities. This first study estimated an expected rate of core damage equal to 5x10^-5 per reactor-year and suggested an upper bound of 3x10^-4 per reactor-year. A more recent report issued by the Electric Power Research Institute - EPRI (2008) estimates a figure of the order of 2x10^-5 per reactor-year. The Fukushima nuclear accident apparently implies that the observed core damage frequency is higher than that predicted by these probabilistic safety assessments. Therefore, this paper presents a preliminary analysis of the FN-curves related to severe nuclear reactor accidents, taking into account a combination of available data of past accidents, probability modelling to estimate frequencies, and expert judgments. (author)

  15. Worsening renal function definition is insufficient for evaluating acute renal failure in acute heart failure.

    Science.gov (United States)

    Shirakabe, Akihiro; Hata, Noritake; Kobayashi, Nobuaki; Okazaki, Hirotake; Matsushita, Masato; Shibata, Yusaku; Nishigoori, Suguru; Uchiyama, Saori; Asai, Kuniya; Shimizu, Wataru

    2018-06-01

    Whether or not the definition of a worsening renal function (WRF) is adequate for the evaluation of acute renal failure in patients with acute heart failure is unclear. One thousand and eighty-three patients with acute heart failure were analysed. A WRF, indicated by a change in serum creatinine ≥0.3 mg/dL during the first 5 days, occurred in 360 patients while no-WRF, indicated by a change failure; n = 98). The patients were assigned to another set of four groups: no-WRF/no-AKI (n = 512), no-WRF/AKI (n = 211), WRF/no-AKI (n = 239), and WRF/AKI (n = 121). A multivariate logistic regression model found that no-WRF/AKI and WRF/AKI were independently associated with 365 day mortality (hazard ratio: 1.916; 95% confidence interval: 1.234-2.974 and hazard ratio: 3.622; 95% confidence interval: 2.332-5.624). Kaplan-Meier survival curves showed that the rate of any-cause death during 1 year was significantly poorer in the no-WRF/AKI and WRF/AKI groups than in the WRF/no-AKI and no-WRF/no-AKI groups and in Class I and Class F than in Class R and the no-AKI group. The presence of AKI on admission, especially Class I and Class F status, is associated with a poor prognosis despite the lack of a WRF within the first 5 days. The prognostic ability of AKI on admission may be superior to WRF within the first 5 days. © 2018 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.

  16. Comparison of Sprint Fidelis and Riata defibrillator lead failure rates.

    Science.gov (United States)

    Fazal, Iftikhar A; Shepherd, Ewen J; Tynan, Margaret; Plummer, Christopher J; McComb, Janet M

    2013-09-30

    Sprint Fidelis and Riata defibrillator leads are prone to early failure. Few data exist on the comparative failure rates and mortality related to lead failure. The aims of this study were to determine the failure rate of Sprint Fidelis and Riata leads, and to compare failure rates and mortality rates in both groups. Patients implanted with Sprint Fidelis leads and Riata leads at a single centre were identified and in July 2012, records were reviewed to ascertain lead failures, deaths, and relationship to device/lead problems. 113 patients had Sprint Fidelis leads implanted between June 2005 and September 2007; Riata leads were implanted in 106 patients between January 2003 and February 2008. During 53.0 ± 22.3 months of follow-up there were 13 Sprint Fidelis lead failures (11.5%, 2.60% per year) and 25 deaths. Mean time to failure was 45.1 ± 15.5 months. In the Riata lead cohort there were 32 deaths, and 13 lead failures (11.3%, 2.71% per year) over 54.8 ± 26.3 months follow-up with a mean time to failure of 53.5 ± 24.5 months. There were no significant differences in the lead failure-free Kaplan-Meier survival curve (p=0.77), deaths overall (p=0.17), or deaths categorised as sudden/cause unknown (p=0.54). Sprint Fidelis and Riata leads have a significant but comparable failure rate at 2.60% per year and 2.71% per year of follow-up respectively. The number of deaths in both groups is similar and no deaths have been identified as being related to lead failure in either cohort. Copyright © 2012. Published by Elsevier Ireland Ltd.
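    The quoted per-year failure rates can be approximated as failures divided by cumulative exposure, taking patients × mean follow-up as the exposure time (an approximation; the study may have used exact patient-years). This reproduces the reported figures to within rounding:

    ```python
    def annual_failure_rate(n_failures, n_patients, mean_followup_months):
        """Approximate annualized lead failure rate (% per patient-year),
        using patients x mean follow-up as the cumulative exposure."""
        patient_years = n_patients * mean_followup_months / 12.0
        return 100.0 * n_failures / patient_years

    # Figures from the abstract: 13 failures in each cohort
    sprint_fidelis = annual_failure_rate(13, 113, 53.0)  # ~2.6 %/year
    riata = annual_failure_rate(13, 106, 54.8)           # ~2.7 %/year
    ```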

  17. Serviceability Assessment for Cascading Failures in Water Distribution Network under Seismic Scenario

    Directory of Open Access Journals (Sweden)

    Qing Shuang

    2016-01-01

    Full Text Available The stability of water service is a central concern in industrial production, public safety, and academic research. This paper establishes a service evaluation model for the water distribution network (WDN). Serviceability is measured in three aspects: (1) the functionality of structural components under a disaster environment; (2) the recognition of the cascading failure process; and (3) the calculation of system reliability. Node and edge failures in a WDN are interrelated under seismic excitations. The cascading failure process is modelled with the balance of water supply and demand. The matrix-based system reliability (MSR) method is used to represent the system events and calculate the non-failure probability. An example is used to illustrate the proposed method. The cascading failure processes with different node failures are simulated and the serviceability is analyzed, allowing the critical node to be identified. The results show that an aged network has a greater influence on system service under a seismic scenario, and that maintenance can improve the anti-disaster ability of the WDN. Priority should be given to controlling the time between the initial failure and the first secondary failure, since taking post-disaster emergency measures within this time period can largely cut down the spread of the cascade effect in the whole WDN.
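    The cascade mechanism itself (one failed component's load overloading the survivors, which then fail in turn) can be sketched abstractly. The flow and capacity values are illustrative, and the uniform redistribution rule is a crude simplification of the paper's supply-demand balance:

    ```python
    # Toy cascade step: components carry flow up to a capacity; when one fails
    # its flow is shared among survivors, possibly overloading them in turn.
    def cascade(flows, capacities, initial_failure):
        """Return the set of failed component indices once the cascade stops."""
        failed = {initial_failure}
        while True:
            survivors = [i for i in range(len(flows)) if i not in failed]
            if not survivors:
                return failed          # total collapse
            extra = sum(flows[i] for i in failed) / len(survivors)
            newly = {i for i in survivors if flows[i] + extra > capacities[i]}
            if not newly:
                return failed          # cascade arrested
            failed |= newly

    contained = cascade([4.0] * 4, [6.0] * 4, 0)  # spare capacity absorbs loss
    total = cascade([4.0] * 4, [5.0] * 4, 0)      # overload propagates to all
    ```

    The two runs illustrate the abstract's point about ageing: the same initial failure is contained when components retain spare capacity but propagates through the whole system when capacity margins have eroded.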

  18. Correlation model to analyze dependent failures for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Dezfuli, H.

    1985-01-01

    A methodology is formulated to study the dependent (correlated) failures of various abnormal events in nuclear power plants. This methodology uses correlation analysis as a means for predicting and quantifying dependent failures. Appropriate techniques are also developed to incorporate dependent failures in quantifying fault trees and accident sequences. The uncertainty associated with each estimation in all of the developed techniques is addressed and quantified. To identify the relative importance of the degree of dependency (correlation) among events and to incorporate these dependencies in the quantification phase of PRA, the interdependency between a pair of events is expressed with the aid of the correlation coefficient. For the purpose of demonstrating the methodology, the database used in the Accident Sequence Precursor (ASP) study was adopted and simulated to obtain distributions for the correlation coefficients. A computer program entitled Correlation Coefficient Generator (CCG) was developed to generate a distribution for each correlation coefficient. The bootstrap technique was employed in the CCG computer code to determine confidence limits of the estimated correlation coefficients. A second computer program designated CORRELATE was also developed to obtain probability intervals for both fault trees and accident sequences with statistically correlated failure data
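    The bootstrap step (resampling event pairs to obtain confidence limits for a correlation coefficient) can be sketched as follows, with synthetic data standing in for the ASP database:

    ```python
    import random
    import statistics

    def pearson(x, y):
        """Sample Pearson correlation coefficient."""
        mx, my = statistics.fmean(x), statistics.fmean(y)
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    def bootstrap_ci(x, y, n_boot=2000, alpha=0.05, seed=1):
        """Percentile-bootstrap confidence interval for the correlation
        coefficient: resample pairs with replacement, recompute r, and take
        the alpha/2 and 1-alpha/2 quantiles of the replicates."""
        rng = random.Random(seed)
        n = len(x)
        reps = []
        for _ in range(n_boot):
            idx = [rng.randrange(n) for _ in range(n)]
            reps.append(pearson([x[i] for i in idx], [y[i] for i in idx]))
        reps.sort()
        return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

    # Synthetic correlated data in place of the ASP database
    x = [float(i) for i in range(30)]
    y = [2.0 * v + (i * 7) % 5 for i, v in enumerate(x)]
    lo, hi = bootstrap_ci(x, y)
    ```

    Feeding each sampled correlation coefficient into the fault-tree and accident-sequence quantification, as the CORRELATE program does, is what turns these confidence limits into probability intervals on the final results.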

  19. Predicting device failure after percutaneous repair of functional mitral regurgitation in advanced heart failure: Implications for patient selection.

    Science.gov (United States)

    Stolfo, Davide; De Luca, Antonio; Morea, Gaetano; Merlo, Marco; Vitrella, Giancarlo; Caiffa, Thomas; Barbati, Giulia; Rakar, Serena; Korcova, Renata; Perkan, Andrea; Pinamonti, Bruno; Pappalardo, Aniello; Berardini, Alessandra; Biagini, Elena; Saia, Francesco; Grigioni, Francesco; Rapezzi, Claudio; Sinagra, Gianfranco

    2018-04-15

    Patients with heart failure (HF) and severe symptomatic functional mitral regurgitation (FMR) may benefit from MitraClip implantation. With increasing numbers of patients being treated, the success of the procedure becomes a key issue. We sought to investigate the pre-procedural predictors of device failure in patients with advanced HF treated with MitraClip. From April 2012 to November 2016, 76 patients with poor functional class (NYHA class III-IV) and severe left ventricular (LV) remodeling underwent MitraClip implantation at the University Hospitals of Trieste and Bologna (Italy). Device failure was assessed according to MVARC criteria. Patients were subsequently followed to additionally assess patient success after 12 months. Mean age was 67±12 years, the mean Log-EuroSCORE was 23.4±16.5%, and the mean LV end-diastolic volume index and ejection fraction (EF) were 112±33 ml/m2 and 30.6±8.9%, respectively. At short-term evaluation, device failure was observed in 22 (29%) patients. Univariate predictors of device failure were LVEF, LV and left atrial volumes, and anteroposterior mitral annulus diameter. Annulus dimension (OR 1.153, 95% CI 1.002-1.327, p=0.043) and LV end-diastolic volume (OR 1.024, 95% CI 1.000-1.049, p=0.049) were the only variables independently associated with the risk of device failure in the multivariate model. Pre-procedural anteroposterior mitral annulus diameter accurately predicted the risk of device failure after MitraClip in the setting of advanced HF. Its assessment might aid the selection of the best candidates for percutaneous correction of FMR. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Assessment of cardiac sympathetic nerve activity in children with chronic heart failure using quantitative iodine-123 metaiodobenzylguanidine imaging

    Energy Technology Data Exchange (ETDEWEB)

    Karasawa, Kensuke; Ayusawa, Mamoru; Noto, Nobutaka; Sumitomo, Naokata; Okada, Tomoo; Harada, Kensuke [Nihon Univ., Tokyo (Japan). School of Medicine

    2000-12-01

    Cardiac sympathetic nerve activity in children with chronic heart failure was examined by quantitative iodine-123 metaiodobenzylguanidine (MIBG) myocardial imaging in 33 patients aged 7.5±6.1 years (range 0-18 years), including 8 with cardiomyopathy, 15 with congenital heart disease, 3 with anthracycline cardiotoxicity, 3 with myocarditis, 3 with primary pulmonary hypertension and 1 with Pompe's disease. Anterior planar images were obtained 15 min and 3 hr after the injection of iodine-123 MIBG. The cardiac iodine-123 MIBG uptake was assessed as the heart to upper mediastinum uptake activity ratio of the delayed image (H/M) and the cardiac percentage washout rate (%WR). The severity of chronic heart failure was class I (no medication) in 8 patients, class II (no symptoms with medication) in 9, class III (symptoms even with medication) in 10 and class IV (late cardiac death) in 6. H/M was 2.33±0.22 in chronic heart failure class I, 2.50±0.34 in class II, 1.95±0.61 in class III, and 1.39±0.29 in class IV (p<0.05). %WR was 24.8±12.8% in chronic heart failure class I, 23.3±10.2% in class II, 49.2±24.5% in class III, and 66.3±26.5% in class IV (p<0.05). The low H/M and high %WR were proportionate to the severity of chronic heart failure. Cardiac iodine-123 MIBG showed cardiac adrenergic neuronal dysfunction in children with severe chronic heart failure. Quantitative iodine-123 MIBG myocardial imaging is clinically useful as a predictor of therapeutic outcome and mortality in children with chronic heart failure. (author)
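
    The two indices defined in this record are simple ratios of region-of-interest (ROI) counts. A minimal sketch of how they are typically computed (the ROI count values below are hypothetical, and some protocols additionally decay-correct or background-subtract the counts):

```python
def heart_to_mediastinum(heart_counts, mediastinum_counts):
    """H/M: mean counts in the cardiac ROI divided by mean counts
    in the upper-mediastinal ROI (here, on the delayed image)."""
    return heart_counts / mediastinum_counts

def washout_rate(early_heart, delayed_heart):
    """%WR: fractional loss of cardiac MIBG counts between the early
    (15 min) and delayed (3 hr) images, expressed as a percentage."""
    return (early_heart - delayed_heart) / early_heart * 100.0

# Hypothetical ROI counts for one patient
hm = heart_to_mediastinum(heart_counts=250.0, mediastinum_counts=100.0)
wr = washout_rate(early_heart=300.0, delayed_heart=200.0)
print(f"H/M = {hm:.2f}, %WR = {wr:.1f}%")
```

    Lower H/M and higher %WR correspond to the more severe heart failure classes reported above.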

  1. Revisiting single-point incremental forming and formability/failure diagrams by means of finite elements and experimentation

    DEFF Research Database (Denmark)

    Silva, M. B.; Skjødt, Martin; Bay, Niels

    2009-01-01

    framework accounts for the influence of major process parameters and their mutual interaction to be studied both qualitatively and quantitatively. It enables the conclusion to be drawn that the probable mode of material failure in SPIF is consistent with stretching, rather than shearing being the governing...... mode of deformation. The study of the morphology of the cracks combined with the experimentally observed suppression of neck formation enabled the authors to conclude that traditional forming limit curves are inapplicable for describing failure. Instead, fracture forming limit curves should be employed...... the forming limits determined by the analytical framework with experimental values. It is shown that agreement between analytical, finite element, and experimental results is good, implying that the previously proposed analytical framework can be utilized to explain the mechanics of deformation...

  2. Pertussis toxin treatment of whole blood. A novel approach to assess G protein function in congestive heart failure

    NARCIS (Netherlands)

    Maisel, A. S.; Michel, M. C.; Insel, P. A.; Ennis, C.; Ziegler, M. G.; Phillips, C.

    1990-01-01

    This study was designed to assess G protein function in mononuclear leukocytes (MNL) of patients with congestive heart failure (CHF). MNL membranes were ADP-ribosylated in vitro in the presence of pertussis or cholera toxin. The amount of pertussis toxin substrates did not differ significantly

  3. Field failure mechanisms for photovoltaic modules

    Science.gov (United States)

    Dumas, L. N.; Shumka, A.

    1981-01-01

    Beginning in 1976, Department of Energy field centers have installed and monitored a number of field tests and application experiments using current state-of-the-art photovoltaic modules. On-site observations of module physical and electrical degradation, together with in-depth laboratory analysis of failed modules, permit an overall assessment of the nature and causes of early field failures. Data on failure rates are presented, and key failure mechanisms are analyzed with respect to origin, effect, and prospects for correction. It is concluded that all failure modes identified to date are avoidable or controllable through sound design and production practices.

  4. Using impedance cardiography to assess left ventricular systolic function via postural change in patients with heart failure.

    Science.gov (United States)

    DeMarzo, Arthur P; Calvin, James E; Kelly, Russell F; Stamos, Thomas D

    2005-01-01

    For the diagnosis and management of heart failure, it would be useful to have a simple point-of-care test for assessing ventricular function that could be performed by a nurse. An impedance cardiography (ICG) parameter called systolic amplitude (SA) can serve as an indicator of left ventricular systolic function (LVSF). This study tested the hypothesis that patients with normal LVSF should have a significant increase in SA in response to an increase in end-diastolic volume caused by postural change from sitting upright to supine, while patients with depressed LVSF associated with heart failure should have a minimal increase or a decrease in SA from upright to supine. ICG data were obtained in 12 patients without heart disease and with normal LVSF and 18 patients with clinically diagnosed heart failure. Consistent with the hypothesis, patients with normal LVSF had a significant increase in SA from upright to supine, whereas heart failure patients had a minimal increase or a decrease in SA from upright to supine. This ICG procedure may be useful for monitoring the trend of patient response to titration of beta blockers and other medications. ICG potentially could be used to detect worsening LVSF and provide a means of measurement for adjusting treatment.

  5. The development of an expert system for finding fragility curves of building structural systems in the preliminary design stage

    International Nuclear Information System (INIS)

    Yee, L.Y.; Okrent, D.

    1987-01-01

    This research is a starting point for the development of an expert system for determining seismic fragility curves of structural systems in a nuclear power plant or conventional building at the preliminary design stage. The resulting system assists an engineer with moderate engineering background and limited reliability knowledge to analyze the failure functions of building structures. It simulates the performance of an expert in identifying the potential failure modes and their variabilities for a structure of interest. On reviewing the methodology of seismic fragility evaluation for existing building structures in the nuclear power plant industry, one finds that the investigation process starts with the identification of critical components or substructures, whose failures result in the functional failure of safety related equipment or the failure of structural integrity itself, and follows with complicated numerical analyses to estimate the capacity functions associated with the limit states of these components or substructures
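
    The capacity functions mentioned above are conventionally summarized as lognormal fragility curves, P_f(a) = Φ(ln(a/A_m)/β), giving the conditional probability of failure at ground-motion level a for a median capacity A_m and logarithmic standard deviation β. A minimal sketch with hypothetical parameter values:

```python
import math

def fragility(a, median_capacity, beta):
    """Lognormal fragility curve: probability of failure at ground
    acceleration `a` [g], given the median capacity A_m [g] and the
    composite logarithmic standard deviation beta.
    Uses the standard normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))."""
    x = math.log(a / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical component: median capacity 0.9 g, beta = 0.4
for a in (0.3, 0.9, 1.5):
    print(f"P_f({a} g) = {fragility(a, 0.9, 0.4):.3f}")
```

    By construction, the failure probability is exactly 0.5 at the median capacity and rises monotonically with the ground-motion level.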

  6. Assessment of p-y Curves from Numerical Methods for a non-Slender Monopile in Cohesionless Soil

    DEFF Research Database (Denmark)

    Ibsen, Lars Bo; Roesen, Hanne Ravn; Wolf, Torben K.

    2013-01-01

    In current design the stiff large diameter monopile is a widely used solution as foundation of offshore wind turbines. Winds and waves subject the monopile to considerable lateral loads. The current design guidelines apply the p-y curve method with formulations for the curves based on slender piles....... However, the behaviour of the stiff monopiles during lateral loading is not fully understood. In this paper a case study from Barrow Offshore Wind Farm is used in a 3D finite element model. The analysis forms a basis for extraction of p-y curves which are used in an evaluation of the traditional curves...

  8. Failure to Fail

    Directory of Open Access Journals (Sweden)

    Samuel Vriezen

    2013-07-01

    Full Text Available Between pessimism and optimism, Samuel Vriezen attempts to intuit a third way through an assessment of failure and negativity in the consonances and tensions between the prosody of Irish playwright Samuel Beckett and American poet Gertrude Stein.

  9. High mortality among heart failure patients treated with antidepressants

    DEFF Research Database (Denmark)

    Veien, Karsten Tang; Videbæk, Lars; Schou, Morten

    2011-01-01

    This study was designed to assess whether pharmacologically treated depression was associated with increased mortality risk in systolic heart failure (SHF) patients.

  10. Prognostic importance of pulmonary hypertension in patients with heart failure

    DEFF Research Database (Denmark)

    Kjaergaard, Jesper; Akkan, Dilek; Iversen, Kasper Karmark

    2007-01-01

    Pulmonary hypertension is a well-known complication in heart failure, but its prognostic importance is less well established. This study assessed the risk associated with pulmonary hypertension in patients with heart failure with preserved or reduced left ventricular (LV) ejection fractions. Patients with known or presumed heart failure (n = 388) underwent the echocardiographic assessment of pulmonary systolic pressure and LV ejection fraction and were followed for up to 5.5 years. Increased pulmonary pressure was associated with increased short- and long-term mortality (p ...) ... obstructive lung disease, heart failure, and impaired renal function. In conclusion, pulmonary hypertension is associated with increased short- and long-term mortality in patients with reduced LV ejection fractions and also in patients with preserved LV ejection fractions.

  11. Initiation of Failure for Masonry Subject to In-Plane Loads through Micromechanics

    Directory of Open Access Journals (Sweden)

    V. P. Berardi

    2016-01-01

    Full Text Available A micromechanical procedure is used in order to evaluate the initiation of damage and failure of masonry with in-plane loads. Masonry material is viewed as a composite with periodic microstructure and, therefore, a unit cell with suitable boundary conditions is assumed as a representative volume element of the masonry. The finite element method is used to determine the average stress on the unit cell corresponding to a given average strain prescribed on the unit cell. Finally, critical curves representing the initiation of damage and failure in both clay brick masonry and adobe masonry are provided.
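
    The homogenization step described here reduces to a volume average of the element stresses over the unit cell; a minimal sketch, with hypothetical element stresses and volumes standing in for the finite element output:

```python
def average_stress(element_stresses, element_volumes):
    """Volume-averaged stress over the unit cell:
    sigma_bar = sum(sigma_e * V_e) / sum(V_e),
    for one stress component sampled per element."""
    total_volume = sum(element_volumes)
    weighted = sum(s * v for s, v in zip(element_stresses, element_volumes))
    return weighted / total_volume

# Hypothetical unit cell: stiff brick elements carry more stress
# than the softer mortar elements
stresses = [12.0, 11.0, 3.0, 2.5]   # MPa, one stress component per element
volumes = [0.4, 0.4, 0.1, 0.1]      # relative element volumes
print(f"average stress = {average_stress(stresses, volumes):.2f} MPa")
```

    Repeating this average for a set of prescribed average strains yields the points on the critical curves for damage initiation.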

  12. Percentile Curves for Anthropometric Measures for Canadian Children and Youth

    Science.gov (United States)

    Kuhle, Stefan; Maguire, Bryan; Ata, Nicole; Hamilton, David

    2015-01-01

    Body mass index (BMI) is commonly used to assess a child's weight status but it does not provide information about the distribution of body fat. Since the disease risks associated with obesity are related to the amount and distribution of body fat, measures that assess visceral or subcutaneous fat, such as waist circumference (WC), waist-to-height ratio (WHtR), or skinfolds thickness may be more suitable. The objective of this study was to develop percentile curves for BMI, WC, WHtR, and sum of 5 skinfolds (SF5) in a representative sample of Canadian children and youth. The analysis used data from 4115 children and adolescents between 6 and 19 years of age that participated in the Canadian Health Measures Survey Cycles 1 (2007/2009) and 2 (2009/2011). BMI, WC, WHtR, and SF5 were measured using standardized procedures. Age- and sex-specific centiles were calculated using the LMS method and the percentiles that intersect the adult cutpoints for BMI, WC, and WHtR at age 18 years were determined. Percentile curves for all measures showed an upward shift compared to curves from the pre-obesity epidemic era. The adult cutoffs for overweight and obesity corresponded to the 72nd and 91st percentile, respectively, for both sexes. The current study has presented for the first time percentile curves for BMI, WC, WHtR, and SF5 in a representative sample of Canadian children and youth. The percentile curves presented are meant to be descriptive rather than prescriptive as associations with cardiovascular disease markers or outcomes were not assessed. PMID:26176769
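
    The LMS method used in this record summarizes each age- and sex-group by a skewness parameter (L), median (M), and coefficient of variation (S); Cole's transformation converts a measurement into a z-score, and the centile curves are recovered by inverting it. A minimal sketch (the L/M/S values below are hypothetical, not the study's):

```python
import math

def lms_zscore(x, L, M, S):
    """Cole's LMS z-score for measurement x; the L == 0 case
    reduces to a plain log transform."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

def lms_centile(z, L, M, S):
    """Inverse transform: the measurement lying at z-score z,
    i.e. the value of the centile curve at this age/sex group."""
    if L == 0:
        return M * math.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# Hypothetical LMS parameters for BMI at one age/sex group
L_, M_, S_ = -1.6, 16.0, 0.11
z = lms_zscore(18.5, L_, M_, S_)
# Round trip: inverting the z-score recovers the measurement
assert abs(lms_centile(z, L_, M_, S_) - 18.5) < 1e-9
print(f"z-score for BMI 18.5: {z:.2f}")
```

    Evaluating `lms_centile` at fixed z-scores across the smoothed age-specific L/M/S values is what produces the percentile curves reported above.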

  13. Clinical evaluation of new methods for the assessment of heart failure

    NARCIS (Netherlands)

    J.A.M. Wijbenga (Anke)

    1999-01-01

    Although every physician seems to know the term "heart failure", there is no general agreement on its definition. Due to the complex nature of heart failure and the changing insights into its pathophysiology over time, many different definitions exist. Some focus on clinical

  14. Assessment of a Business-to-Consumer (B2C) model for Telemonitoring patients with Chronic Heart Failure (CHF)

    NARCIS (Netherlands)

    A.S. Grustam (Andrija); Vrijhoef, H.J.M. (Hubertus J. M.); R. Koymans (Ron); Hukal, P. (Philipp); J.L. Severens (Hans)

    2017-01-01

    Background: The purpose of this study is to assess the Business-to-Consumer (B2C) model for telemonitoring patients with Chronic Heart Failure (CHF) by analysing the value it creates, both for organizations or ventures that provide telemonitoring services based on it, and for society.

  15. Assessing the Value-Added by the Environmental Testing Process with the Aide of Physics/Engineering of Failure Evaluations

    Science.gov (United States)

    Cornford, S.; Gibbel, M.

    1997-01-01

    NASA's Code QT Test Effectiveness Program is funding a series of applied research activities that apply the principles of physics and engineering of failure, together with those of engineering economics, to assess and improve the value added to organizations by the various validation and verification activities.

  16. Traditional and new composite endpoints in heart failure clinical trials : facilitating comprehensive efficacy assessments and improving trial efficiency

    NARCIS (Netherlands)

    Anker, Stefan D.; Schroeder, Stefan; Atar, Dan; Bax, Jeroen J.; Ceconi, Claudio; Cowie, Martin R.; Crisp, Adam; Dominjon, Fabienne; Ford, Ian; Ghofrani, Hossein-Ardeschir; Gropper, Savion; Hindricks, Gerhard; Hlatky, Mark A.; Holcomb, Richard; Honarpour, Narimon; Jukema, J. Wouter; Kim, Albert M.; Kunz, Michael; Lefkowitz, Martin; Le Floch, Chantal; Landmesser, Ulf; McDonagh, Theresa A.; McMurray, John J.; Merkely, Bela; Packer, Milton; Prasad, Krishna; Revkin, James; Rosano, Giuseppe M. C.; Somaratne, Ransi; Stough, Wendy Gattis; Voors, Adriaan A.; Ruschitzka, Frank

    Composite endpoints are commonly used as the primary measure of efficacy in heart failure clinical trials to assess the overall treatment effect and to increase the efficiency of trials. Clinical trials still must enrol large numbers of patients to accrue a sufficient number of outcome events and

  17. Assessment of p-y curves from numerical methods for a non-slender monopile in cohesionless soil

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, L. B.; Ravn Roesen, H. [Aalborg Univ. Dept. of Civil Engineering, Aalborg (Denmark); Hansen, Mette; Kirk Wolf, T. [COWI, Kgs. Lyngby (Denmark); Lange Rasmussen, K. [Niras, Aalborg (Denmark)

    2013-06-15

    In current design the monopile is a widely used solution as foundation of offshore wind turbines. Winds and waves subject the monopile to considerable lateral loads. The behaviour of monopiles under lateral loading is not fully understood, and the current design guidelines apply the p-y curve method in a Winkler model approach. The p-y curve method was originally developed for the slender jacket piles used in the oil and gas industry, which are much more slender than the monopile foundation. In recent years 3D finite element analysis has become a tool in the investigation of complex geotechnical situations, such as the laterally loaded monopile. In this paper a 3D FEA is conducted as the basis for extracting p-y curves and evaluating the traditional curves. Two different methods are applied to create the data points used for the p-y curves: A force producing a response similar to that seen in the ULS situation is applied stepwise, hereby creating the most realistic soil response. This method, however, does not generate sufficient data points around the rotation point of the pile. Therefore, a forced horizontal displacement of the entire pile is also applied, whereby displacements are created over the entire length of the pile. The response is extracted from the interface and the nearby soil elements respectively, to investigate the influence this has on the computed curves. p-y curves are obtained near the rotation point by evaluating the soil response during a prescribed displacement, but this response is not in clear agreement with the response during an applied load. Two different material models are applied. It is found that the applied material models have a significant influence on the stiffness of the evaluated p-y curves. The p-y curves evaluated by means of FEA are compared to the conventional p-y curve formulation, which provides a much stiffer response. It is found that the best response is computed by implementing the Hardening Soil model and
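
    For reference, the conventional design-guideline p-y curve for sand that such FE-extracted curves are compared against is a hyperbolic-tangent expression, p = A·p_u·tanh(k·z·y/(A·p_u)). A minimal sketch, with all soil parameters hypothetical:

```python
import math

def py_sand(y, A, p_u, k, depth):
    """Conventional p-y curve for sand (tanh formulation):
    soil resistance p [force/length] at lateral displacement y [m],
    with loading factor A, ultimate resistance p_u, initial modulus
    of subgrade reaction k, and depth z below the seabed."""
    return A * p_u * math.tanh(k * depth * y / (A * p_u))

# Hypothetical inputs: static factor A, ultimate resistance p_u [kN/m],
# initial modulus k [kN/m^3], depth [m]
A, p_u, k, depth = 0.9, 800.0, 22000.0, 10.0
for y in (0.001, 0.01, 0.1):
    print(f"y = {y:5} m -> p = {py_sand(y, A, p_u, k, depth):6.1f} kN/m")
```

    The initial slope of this curve (k·z) is what the paper finds to be much stiffer than the FE-derived response; the resistance saturates at A·p_u for large displacements.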

  18. Tension Behaviour on the Connection of the Cold-Formed Cut-Curved Steel Channel Section

    Science.gov (United States)

    Sani, Mohd Syahrul Hisyam Mohd; Muftah, Fadhluhartini; Fakri Muda, Mohd; Siang Tan, Cher

    2017-08-01

    Cold-formed steel (CFS) is utilised as a non-structural and structural element in construction, especially in residential houses and small-building roof truss systems. CFS offers many advantages but also some disadvantages, such as buckling, which must be prevented in roof truss production. CFS is used as the top chord of the roof truss system, where a slender section is particularly prone to buckling failure and instability of the structure. A curved section is therefore produced for the top chord to address the compression members of the roof truss. However, there is a lack of design and production information about the CFS curved channel section. In this study, the CFS is bent using a cut-curved method because of its ease of production, without the need for skilled labour or a high-cost machine. The tension behaviour of the strengthening method for the cut-curved section, which can be regarded as a connection of the cut-curved section, was tested and analysed. Seven types of connection were selected. From the testing and observation, the specimen with a full weld along the cut section plus a flange element plate with two self-drilling screws (F7A) was noted to have the highest ultimate load. Finally, three alternative methods of connection for the CFS cut-curved section are given that could serve as a reference for contractors and for further design.

  19. Can failure carefully observed become a springboard to success?

    Science.gov (United States)

    Adrian, Manuella

    2012-01-01

    Since its inception, the addictions field has had a history of failure: failures in conceptualizations, in treatment, in interventions, in policies, in process as well as outcome assessment. Certain actions and activities have had a less than stellar effect which may lead to feelings of personal failure among practitioners, the tagging of processes and programs as being failures when they are not so, as well as an identification of the person being intervened with, by self and others, as being a failure or loser. This paper discusses how to define success and failure and the need to identify both the short(er) and long(er) term, as well as temporary and permanent effects, including the implications of using binary (success or failure; success and failure) and nonbinary (and in addition) categories of assessment. The need to clarify expectations and to establish goals and measurable effects are noted. Being open to accepting results which may be personally disappointing, initially, but which offer opportunities for needed changes may lead to new developments in the field and the establishment of better interventions.

  20. Assessment of the coronary venous system in heart failure patients by blood pool agent enhanced whole-heart MRI

    Energy Technology Data Exchange (ETDEWEB)

    Manzke, Robert [University Hospital of Ulm, Department of Internal Medicine II, Ulm (Germany); Philips Research Europe, Clinical Sites Research, Hamburg (Germany); Binner, Ludwig; Bornstedt, Axel; Merkle, Nico; Lutz, Anja; Gradinger, Robert [University Hospital of Ulm, Department of Internal Medicine II, Ulm (Germany); Rasche, Volker [University Hospital of Ulm, Department of Internal Medicine II, Ulm (Germany); Experimental Cardiovascular Imaging, Internal Medicine II, Ulm (Germany)

    2011-04-15

    To investigate the feasibility of MRI for non-invasive assessment of the coronary sinus (CS) and the number and course of its major tributaries in heart failure patients. Fourteen non-ischaemic heart failure patients scheduled for cardiac resynchronisation therapy (CRT) underwent additional whole-heart coronary venography. MRI was performed 1 day before device implantation. The visibility, location and dimensions of the CS and its major tributaries were assessed and the number of potential implantation sites identified. The MRI results were validated by X-ray venography conventionally acquired during the device implantation procedure. The right atrium (RA), CS and mid-cardiac vein (MCV) could be visualised in all patients. 36% of the identified candidate branches were located posterolaterally, 48% laterally and 16% anterolaterally. The average diameter of the CS was quantified as 9.8 mm, the posterior interventricular vein (PIV) 4.6 mm, posterolateral segments 3.3 mm, lateral 2.9 mm and anterolateral 2.9 mm. Concordance with X-ray in terms of number and location of candidate branches was given in most cases. Contrast-enhanced MRI venography appears feasible for non-invasive pre-interventional assessment of the course of the CS and its major tributaries. (orig.)

  2. Study on real-time elevator brake failure predictive system

    Science.gov (United States)

    Guo, Jun; Fan, Jinwei

    2013-10-01

    This paper presents a real-time failure predictive system for the elevator brake. By inspecting the running state of the brake coil with a high-precision, long-range laser triangulation non-contact measurement sensor, the displacement curve of the coil is gathered without interfering with the original system. By analyzing the displacement data with the diagnostic algorithm, hidden dangers in the brake system can be discovered in time, thereby avoiding the corresponding accidents.

  3. Evaluation of left ventricular diastolic function by appreciating the shape of time activity curve

    International Nuclear Information System (INIS)

    Nishimura, Tohru; Taya, Makoto; Shimoyama, Katsuya; Sasaki, Akira; Mizuno, Haruyoshi; Tahara, Yorio; Ono, Akifumi; Ishikawa, Kyozo

    1993-01-01

    To determine left ventricular diastolic function (LVDF), the shapes of the time activity curve and its first derivative curve, as acquired by Tc-99m radionuclide angiography, were visually assessed. The study population consisted of 1647 patients with heart disease, such as hypertension, ischemic heart disease, cardiomyopathy and valvular disease. Fifty-six other patients served as controls. LVDF was divided into 4 grades: 0 = normal, I = slight disturbance, II = moderate disturbance, and III = severe disturbance. LVDF variables, including time to peak filling (TPF), TPF/time to end-systole, peak filling rate (PFR), PFR/t, 1/3 filling fraction (1/3 FR), and 1/3 FR/t, were calculated from the time activity curve. There was no definitive correlation between each variable and age or heart rate. Regarding these LVDF variables, except for 1/3 FR, there was no significant difference between group 0 of the heart disease patients and the control group. Among groups 0-III of the heart disease patients, there were significant differences in LVDF variables. Visual assessment concurred with left ventricular ejection fraction, PFR/end-diastolic curve, and filling rate/end-diastolic curve. Visual assessment using the time activity curve was considered useful in the semiquantitative determination of early diastolic function. (N.K.)
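
    The filling indices in this record derive directly from the left-ventricular time activity curve: the peak filling rate is the maximum of its time derivative during diastole, and the time to peak filling is measured from end-systole. A minimal sketch on a synthetic curve (the cosine-shaped waveform and its sampling are hypothetical, standing in for gated count data):

```python
import math

# Hypothetical LV time activity curve: one cardiac cycle sampled at 25 ms,
# modeled as a raised cosine (counts proportional to LV volume)
dt = 0.025
counts = [100.0 + 50.0 * math.cos(2 * math.pi * t * dt) for t in range(40)]

# End-systole = minimum of the curve; diastole is the segment after it
es = counts.index(min(counts))

# First-difference derivative over the diastolic segment [counts/s]
rates = [(counts[i + 1] - counts[i]) / dt for i in range(es, len(counts) - 1)]
pfr_counts = max(rates)                    # peak filling rate, counts/s
tpf = (rates.index(pfr_counts) + 1) * dt   # time to peak filling from end-systole, s

# Normalizing PFR by end-diastolic counts gives the usual units of EDV/s
pfr = pfr_counts / max(counts)
print(f"PFR = {pfr:.2f} EDV/s, TPF = {tpf * 1000:.0f} ms")
```

    In practice the curve would be smoothed (e.g. by Fourier fitting) before differentiation; the raw first difference shown here is only for illustration.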

  4. Assessment of SPACE code for multiple failure accident: 1% Cold Leg Break LOCA with HPSI failure at ATLAS Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Hyuk; Lee, Seung Wook; Kim, Kyung-Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Design extension conditions (DECs) have become a key issue since the Fukushima accident. From the viewpoint of reinforcing the defence-in-depth concept, high-risk multiple failure accidents should be reconsidered. The target scenario of the ATLAS A5.1 test was the LSTF (Large Scale Test Facility) SB-CL-32 test, a 1% SBLOCA with total failure of the high pressure safety injection (HPSI) system of the emergency core cooling system (ECCS) and secondary side depressurization as the accident management (AM) action, performed as a counterpart test. As the need to prepare for DEC accidents arising from multiple failures in operating NPPs is emphasized, the capability of the SPACE code, like that of other system analysis codes, must be extended to the DEC area. The objective of this study is therefore to validate the capability of the SPACE code for a DEC scenario representing a multiple failure accident, namely an SBLOCA with HPSI failure; the ATLAS A5.1 test scenario was chosen for this purpose. Through a sensitivity study on the discharge coefficient of the break flow, the best fit of the integrated break mass was found. Using this coefficient, the ATLAS A5.1 test was analyzed with the SPACE code. The major thermal-hydraulic parameters, such as system pressures and temperatures, were compared with the test data and showed good agreement. Through the simulation, it was concluded that the SPACE code can effectively simulate multiple failure accidents such as an SBLOCA with HPSI failure.

  5. Stenting for curved lesions using a novel curved balloon: Preliminary experimental study.

    Science.gov (United States)

    Tomita, Hideshi; Higaki, Takashi; Kobayashi, Toshiki; Fujii, Takanari; Fujimoto, Kazuto

    2015-08-01

    Stenting may be a compelling approach to dilating curved lesions in congenital heart disease. However, balloon-expandable stents, which are commonly used for congenital heart disease, are usually deployed in a straight orientation. In this study, we evaluated the effect of stenting with a novel curved balloon expected to provide better conformability to curved, angled lesions. In vitro experiments: A Palmaz Genesis(®) stent (Johnson & Johnson, Cordis Co, Bridgewater, NJ, USA) mounted on the Goku(®) curve (Tokai Medical Co, Nagoya, Japan) was dilated in vitro to directly observe the behavior of the stent and balloon assembly during expansion. Animal experiment: A short Express(®) Vascular SD stent (Boston Scientific Co, Marlborough, MA, USA) and a long Express(®) Vascular LD stent (Boston Scientific) mounted on the curved balloon were deployed in a curved vessel of a pig to observe the effect of stenting in vivo. In vitro experiments: Although the stent was dilated in a curved fashion, the stent and balloon assembly also rotated conjointly during expansion of its curved portion. In the primary stenting of the short stent, the stent was dilated with rotation of its curved portion, and the excised stent conformed to the curved vessel. As the long stent could not be negotiated across the mid-portion once the expanding balloon started curving, the mid-portion of the stent failed to expand fully. Furthermore, the balloon, which became entangled with the stent strut, could not be retrieved even after complete deflation. This novel curved balloon catheter might be used for implantation of a short stent in a curved lesion; however, it should not be used for primary stenting of a long stent. Post-dilation to conform the stent to the angled vessel would be safer than primary stenting, irrespective of stent length. Copyright © 2014 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  6. Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.

    Science.gov (United States)

    Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M

    2014-12-01

    In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.
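
    The data-depth ordering underlying the curve boxplot can be illustrated with the modified band depth of López-Pintado and Romo, one common depth notion for curves (the paper's exact depth definition may differ); a minimal sketch:

```python
import numpy as np
from itertools import combinations

def modified_band_depth(curves):
    """Modified band depth (j = 2): for each curve, the average fraction of the
    domain on which it lies inside the band spanned by a pair of curves."""
    n = len(curves)
    depth = np.zeros(n)
    for a, b in combinations(range(n), 2):
        lo = np.minimum(curves[a], curves[b])
        hi = np.maximum(curves[a], curves[b])
        depth += ((curves >= lo) & (curves <= hi)).mean(axis=1)
    return depth / (n * (n - 1) / 2)

# ordering an ensemble by depth: the deepest curve acts as the functional median
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
ensemble = np.sin(2 * np.pi * t) + rng.normal(0, 0.2, size=(20, 50))
ensemble[0] += 2.0                      # one clearly outlying ensemble member
depths = modified_band_depth(ensemble)
median_curve = ensemble[np.argmax(depths)]
```

    Rank statistics derived from such a depth ordering (median, central 50% band, outliers) are what generalize the whisker plot to ensembles of curves.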

  7. Assessment of cardiac sympathetic nerve activity in children with chronic heart failure using quantitative iodine-123 metaiodobenzylguanidine imaging

    International Nuclear Information System (INIS)

    Karasawa, Kensuke; Ayusawa, Mamoru; Noto, Nobutaka; Sumitomo, Naokata; Okada, Tomoo; Harada, Kensuke

    2000-01-01

    Cardiac sympathetic nerve activity in children with chronic heart failure was examined by quantitative iodine-123 metaiodobenzylguanidine (MIBG) myocardial imaging in 33 patients aged 7.5±6.1 years (range 0-18 years), including 8 with cardiomyopathy, 15 with congenital heart disease, 3 with anthracycline cardiotoxicity, 3 with myocarditis, 3 with primary pulmonary hypertension and 1 with Pompe's disease. Anterior planar images were obtained 15 min and 3 hr after the injection of iodine-123 MIBG. The cardiac iodine-123 MIBG uptake was assessed as the heart to upper mediastinum uptake activity ratio of the delayed image (H/M) and the cardiac percentage washout rate (%WR). The severity of chronic heart failure was class I (no medication) in 8 patients, class II (no symptom with medication) in 9, class III (symptom even with medication) in 10 and class IV (late cardiac death) in 6. H/M was 2.33±0.22 in chronic heart failure class I, 2.50±0.34 in class II, 1.95±0.61 in class III, and 1.39±0.29 in class IV (p<0.05). %WR was 24.8±12.8% in chronic heart failure class I, 23.3±10.2% in class II, 49.2±24.5% in class III, and 66.3±26.5% in class IV (p<0.05). The low H/M and high %WR were proportionate to the severity of chronic heart failure. Cardiac iodine-123 MIBG showed cardiac adrenergic neuronal dysfunction in children with severe chronic heart failure. Quantitative iodine-123 MIBG myocardial imaging is clinically useful as a predictor of therapeutic outcome and mortality in children with chronic heart failure. (author)
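
    The two MIBG indices can be computed directly from region-of-interest counts; a toy sketch with made-up count values, assuming the simple (non-decay-corrected) washout formula:

```python
# Illustrative MIBG index calculation. Count values are invented, and the
# uncorrected washout formula is an assumption; clinical protocols often apply
# background and decay corrections.
early_heart, delayed_heart = 1200.0, 900.0   # heart ROI counts at 15 min / 3 hr
delayed_mediastinum = 450.0                  # upper mediastinum ROI counts, 3 hr

h_to_m = delayed_heart / delayed_mediastinum                        # H/M, delayed image
washout_rate = (early_heart - delayed_heart) / early_heart * 100.0  # %WR
```

    A low H/M with a high %WR, as computed here, is the pattern the study associates with more severe heart failure.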

  8. An assessment of the BEST procedure to estimate the soil water retention curve

    Science.gov (United States)

    Castellini, Mirko; Di Prima, Simone; Iovino, Massimo

    2017-04-01

    The Beerkan Estimation of Soil Transfer parameters (BEST) procedure represents a very attractive method to accurately and quickly obtain a complete hydraulic characterization of the soil (Lassabatère et al., 2006). However, further investigations are needed to check the prediction reliability of the soil water retention curve (Castellini et al., 2016). Four soils with different physical properties (texture, bulk density, porosity and stoniness) were considered in this investigation. Sites of measurement were located at Palermo University (PAL site) and Villabate (VIL site) in Sicily, Arborea (ARB site) in Sardinia and Foggia (FOG site) in Apulia. For a given site, the BEST procedure was applied and the water retention curve was estimated using the available BEST algorithms (i.e., slope, intercept and steady), and the reference values of the infiltration constants (β=0.6 and γ=0.75) were considered. The water retention curves estimated by BEST were then compared with those obtained in the laboratory by the evaporation method (Wind, 1968). About ten experiments were carried out with both methods. A sensitivity analysis of the constants β and γ within their feasible range of variability (0.1-…) was also carried out. The analysis showed that S tended to increase for increasing β values and decreasing γ values for all the BEST algorithms and soils. On the other hand, Ks tended to decrease for increasing β and γ values. Our results also reveal that: i) the BEST-intercept and BEST-steady algorithms yield lower S and higher Ks values than BEST-slope; ii) these algorithms also yield more variable values. For the latter, a higher sensitivity of these two alternative algorithms to β than to γ was established. The decreasing sensitivity to γ may lead to a possible lack of correction of the simplified theoretical description of the parabolic two-dimensional and one-dimensional wetting front along the soil profile (Smettem et al., 1994). This likely resulted in lower S and higher Ks values.

  9. JUMPING THE CURVE

    Directory of Open Access Journals (Sweden)

    René Pellissier

    2012-01-01

    Full Text Available This paper explores the notion of jumping the curve, following from Handy's S-curve onto a new curve with new rules, policies and procedures. It claims that the curve does not generally lie in wait but has to be invented by leadership. The focus of this paper is the identification (mathematically and inferentially) of that point in time, known as the cusp in catastrophe theory, when it is time to change: pro-actively, pre-actively or reactively. These three scenarios are addressed separately and discussed in terms of the relevance of each.

  10. Demand curves for hypothetical cocaine in cocaine-dependent individuals.

    Science.gov (United States)

    Bruner, Natalie R; Johnson, Matthew W

    2014-03-01

    Drug purchasing tasks have been successfully used to examine demand for hypothetical consumption of abused drugs including heroin, nicotine, and alcohol. In these tasks, drug users make hypothetical choices whether to buy drugs, and if so, at what quantity, at various potential prices. These tasks allow for behavioral economic assessment of that drug's intensity of demand (preferred level of consumption at extremely low prices) and demand elasticity (sensitivity of consumption to price), among other metrics. However, a purchasing task for cocaine in cocaine-dependent individuals has not been investigated. This study examined a novel Cocaine Purchasing Task and the relation between resulting demand metrics and self-reported cocaine use data. Participants completed a questionnaire assessing hypothetical purchases of cocaine units at prices ranging from $0.01 to $1,000. Demand curves were generated from responses on the Cocaine Purchasing Task. Correlations compared metrics from the demand curve to measures of real-world cocaine use. Group and individual data were well modeled by a demand curve function. The validity of the Cocaine Purchasing Task was supported by a significant correlation between the demand curve metrics of demand intensity and Omax (determined from Cocaine Purchasing Task data) and self-reported measures of cocaine use. Partial correlations revealed that after controlling for demand intensity, demand elasticity and the related measure, Pmax, were significantly correlated with real-world cocaine use. Results indicate that the Cocaine Purchasing Task produces orderly demand curve data, and that these data relate to real-world measures of cocaine use.
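
    Purchase-task data of this kind are commonly modeled with the exponential demand equation of Hursh and Silberberg (2008); whether this study used that exact function is not stated, so the sketch below is illustrative, with assumed parameter values:

```python
import numpy as np

# Exponential demand equation of Hursh & Silberberg (2008):
#   log10 Q = log10 Q0 + k * (exp(-alpha * Q0 * C) - 1)
# Parameter values are illustrative assumptions, not estimates from the study.
Q0, k, alpha = 10.0, 2.0, 0.002   # intensity, range constant, elasticity rate

def consumption(price):
    return 10.0 ** (np.log10(Q0) + k * (np.exp(-alpha * Q0 * price) - 1.0))

prices = np.logspace(-2, 3, 400)             # $0.01 .. $1,000, as in the task
q = consumption(prices)
expenditure = prices * q                     # spending at each price
omax = float(expenditure.max())              # Omax: peak expenditure
pmax = float(prices[expenditure.argmax()])   # Pmax: price at peak expenditure
```

    Omax and Pmax, derived here from the fitted curve, are the same metrics the study correlates with self-reported cocaine use.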

  11. ASSESSMENT OF BUILDING FAILURES IN NIGERIA: LAGOS AND ...

    African Journals Online (AJOL)

    Common failures seen on buildings were wall cracking, wall spalling, foundation settlement, column buckling, etc. Proper assurance of competent professionals and strict enforcement of ethical standards by the Nigerian Society of Engineers, the Nigerian Institute of Building, and the Nigerian Institute of Architects would ...

  12. Dual Smarandache Curves of a Timelike Curve lying on Unit dual Lorentzian Sphere

    OpenAIRE

    Kahraman, Tanju; Hüseyin Ugurlu, Hasan

    2016-01-01

    In this paper, we give the Darboux approximation for dual Smarandache curves of a timelike curve on the unit dual Lorentzian sphere. Firstly, we define the four types of dual Smarandache curves of a timelike curve lying on the unit dual Lorentzian sphere.

  13. Variation of poorly ventilated lung units (silent spaces) measured by electrical impedance tomography to dynamically assess recruitment.

    Science.gov (United States)

    Spadaro, Savino; Mauri, Tommaso; Böhm, Stephan H; Scaramuzzo, Gaetano; Turrini, Cecilia; Waldmann, Andreas D; Ragazzi, Riccardo; Pesenti, Antonio; Volta, Carlo Alberto

    2018-01-31

    Assessing alveolar recruitment at different positive end-expiratory pressure (PEEP) levels is a major clinical and research interest because protective ventilation implies opening the lung without inducing overdistention. The pressure-volume (P-V) curve is a validated method of assessing recruitment but reflects global characteristics, and changes at the regional level may remain undetected. The aim of the present study was to compare, in intubated patients with acute hypoxemic respiratory failure (AHRF) and acute respiratory distress syndrome (ARDS), lung recruitment measured by P-V curve analysis with dynamic changes in poorly ventilated units of the dorsal lung (dependent silent spaces [DSSs]) assessed by electrical impedance tomography (EIT). We hypothesized that DSSs might represent a dynamic bedside measure of recruitment. We carried out a prospective interventional study of 14 patients with AHRF and ARDS admitted to the intensive care unit undergoing mechanical ventilation. Each patient underwent an incremental/decremental PEEP trial that included five consecutive phases: PEEP 5 and 10 cmH2O, recruitment maneuver + PEEP 15 cmH2O, then PEEP 10 and 5 cmH2O again. We measured, at the end of each phase, recruitment from the previous PEEP using the P-V curve method, while changes in DSS were continuously monitored by EIT. PEEP changes induced alveolar recruitment as assessed by the P-V curve method and changes in the amount of DSS (p < …). Recruited volume measured by the P-V curves significantly correlated with the change in DSS (rs = 0.734, p < …), suggesting that changes in DSS track recruitment measured using the P-V curve technique. EIT might provide useful information to titrate personalized PEEP. ClinicalTrials.gov, NCT02907840. Registered on 20 September 2016.

  14. Echocardiographic assessment of right ventricular function in routine practice: Which parameters are useful to predict one-year outcome in advanced heart failure patients with dilated cardiomyopathy?

    Science.gov (United States)

    Kawata, Takayuki; Daimon, Masao; Kimura, Koichi; Nakao, Tomoko; Lee, Seitetsu L; Hirokawa, Megumi; Kato, Tomoko S; Watanabe, Masafumi; Yatomi, Yutaka; Komuro, Issei

    2017-10-01

    Right ventricular (RV) function has recently gained attention as a prognostic predictor of outcome even in patients who have left-sided heart failure. Since several conventional echocardiographic parameters of RV systolic function have been proposed, our aim was to determine if any of these parameters (tricuspid annular plane systolic excursion: TAPSE, tissue Doppler derived systolic tricuspid annular motion velocity: S', fractional area change: FAC) are associated with outcome in advanced heart failure patients with dilated cardiomyopathy (DCM). We retrospectively enrolled 68 DCM patients, who were New York Heart Association (NYHA) Class III or IV and had a left ventricular (LV) ejection fraction < … . NYHA functional class IV, plasma brain natriuretic peptide concentration, intravenous inotrope use, left atrial volume index, and FAC were associated with outcome, whereas TAPSE and S' were not. Receiver-operating characteristic curve analysis showed that the optimal FAC cut-off value to identify patients with an event was … .

  15. A self-controlled case series to assess the effectiveness of beta blockers for heart failure in reducing hospitalisations in the elderly

    Directory of Open Access Journals (Sweden)

    Pratt Nicole L

    2011-07-01

    Full Text Available Abstract Background To determine the suitability of using the self-controlled case series design to assess improvements in health outcomes, using the effectiveness of beta blockers for heart failure in reducing hospitalisations as the example. Methods The Australian Government Department of Veterans' Affairs administrative claims database was used to undertake a self-controlled case series in elderly patients aged 65 years or over to compare the risk of a heart failure hospitalisation during periods of being exposed and unexposed to a beta blocker. Two studies, the first using a one-year period and the second using a four-year period, were undertaken to determine if the estimates varied due to changes in severity of heart failure over time. Results In the one-year period, 3,450 patients and in the four-year period, 12,682 patients had at least one hospitalisation for heart failure. The one-year period showed a non-significant decrease in hospitalisations for heart failure 4-8 months after starting beta blockers (RR, 0.76; 95% CI, 0.57-1.02) and a significant decrease in the 8-12 months post-initiation of a beta blocker for heart failure (RR, 0.62; 95% CI, 0.39-0.99). For the four-year study there was an increased risk of hospitalisation less than eight months post-initiation and a significant but smaller decrease in the 8-12 month window (RR, 0.90; 95% CI, 0.82-0.98). Conclusions The results of the one-year observation period are similar to those observed in randomised clinical trials, indicating that the self-controlled case series method can be successfully applied to assess health outcomes. However, the result appears sensitive to the study periods used, and further research to understand the appropriate applications of this method in pharmacoepidemiology is still required. The results also illustrate the benefits of extending beta blocker utilisation to the older age group of heart failure patients in which their use is common but the evidence is …

  16. Dynamic thresholds and a summary ROC curve: Assessing prognostic accuracy of longitudinal markers.

    Science.gov (United States)

    Saha-Chaudhuri, P; Heagerty, P J

    2018-04-19

    Cancer patients, chronic kidney disease patients, and subjects infected with HIV are routinely monitored over time using biomarkers that represent key health status indicators. Furthermore, biomarkers are frequently used to guide initiation of new treatments or to inform changes in intervention strategies. Since key medical decisions can be made on the basis of a longitudinal biomarker, it is important to evaluate the potential accuracy associated with longitudinal monitoring. To characterize the overall accuracy of a time-dependent marker, we introduce a summary ROC curve that displays the overall sensitivity associated with a time-dependent threshold that controls time-varying specificity. The proposed statistical methods are similar to concepts considered in disease screening, yet our methods are novel in choosing a potentially time-dependent threshold to define a positive test, and our methods allow time-specific control of the false-positive rate. The proposed summary ROC curve is a natural averaging of time-dependent incident/dynamic ROC curves and therefore provides a single summary of net error rates that can be achieved in the longitudinal setting. Copyright © 2018 John Wiley & Sons, Ltd.
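
    The core idea of a time-dependent threshold that controls time-varying specificity, averaged into a summary sensitivity, can be sketched as follows; the synthetic data and the empirical-quantile threshold are illustrative assumptions, not the authors' estimator:

```python
import numpy as np

# Sketch of a summary ROC for a longitudinal marker: at each visit time, choose
# the threshold that fixes the false-positive rate among controls, record the
# resulting sensitivity among cases, then average over times.
rng = np.random.default_rng(0)
n_times, n_controls, n_cases = 5, 200, 200
controls = rng.normal(0.0, 1.0, size=(n_times, n_controls))
cases = np.array([rng.normal(0.5 + 0.3 * t, 1.0, size=n_cases)
                  for t in range(n_times)])          # cases drift upward over time

def summary_sensitivity(fpr):
    sens = []
    for t in range(n_times):
        thr = np.quantile(controls[t], 1.0 - fpr)    # time-dependent threshold
        sens.append(np.mean(cases[t] > thr))         # time-specific sensitivity
    return float(np.mean(sens))

fprs = np.linspace(0.01, 0.99, 25)
curve = [summary_sensitivity(f) for f in fprs]       # summary ROC curve points
```

    Sweeping the target false-positive rate traces the summary ROC curve: a single display of the net error rates achievable in the longitudinal setting.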

  17. A REVIEW OF SOFTWARE-INDUCED FAILURE EXPERIENCE.

    Energy Technology Data Exchange (ETDEWEB)

    CHU, T.L.; MARTINEZ-GURIDI, G.; YUE, M.; LEHNER, J.

    2006-09-01

    We present a review of software-induced failures in commercial nuclear power plants (NPPs) and in several non-nuclear industries. We discuss the approach used for collecting operational events related to these failures and the insights gained from this review. In particular, we elaborate on insights that can be used to model this kind of failure in a probabilistic risk assessment (PRA) model. We present the conclusions reached in these areas.

  18. Preparation of severely curved simulated root canals using engine-driven rotary and conventional hand instruments.

    Science.gov (United States)

    Szep, S; Gerhardt, T; Leitzbach, C; Lüder, W; Heidemann, D

    2001-03-01

    This in vitro study evaluated the efficacy and safety of six different nickel-titanium engine-driven instruments used with a torque-controlled engine device and nickel-titanium hand and stainless steel hand instruments in preparation of curved canals. A total of 80 curved (36 degrees) simulated root canals were prepared. Images before and after were superimposed, and instrumentation areas were observed. Time of instrumentation, instrument failure, change in working length and weight loss were also recorded. Results show that stainless steel hand instruments cause significantly less transportation towards the inner wall of the canal than do nickel-titanium hand instruments. No instrument fracture occurred with hand instruments, but 30-60% breakage of instruments was recorded during instrumentation with the engine-driven devices. The working length was maintained by all types of instruments. Newly developed nickel-titanium rotary files were not able to prevent straightening of the severely curved canals when a torque-controlled engine-driven device was used.

  19. Failure mode effects and criticality analysis: innovative risk assessment to identify critical areas for improvement in emergency department sepsis resuscitation.

    Science.gov (United States)

    Powell, Emilie S; O'Connor, Lanty M; Nannicelli, Anna P; Barker, Lisa T; Khare, Rahul K; Seivert, Nicholas P; Holl, Jane L; Vozenilek, John A

    2014-06-01

    Sepsis is an increasing problem in the practice of emergency medicine as the prevalence is increasing and optimal care to reduce mortality requires significant resources and time. Evidence-based septic shock resuscitation strategies exist, and rely on appropriate recognition and diagnosis, but variation in adherence to the recommendations and therefore outcomes remains. Our objective was to perform a multi-institutional prospective risk-assessment, using failure mode effects and criticality analysis (FMECA), to identify high-risk failures in ED sepsis resuscitation. We conducted a FMECA, which prospectively identifies critical areas for improvement in systems and processes of care, across three diverse hospitals. A multidisciplinary group of participants described the process of emergency department (ED) sepsis resuscitation to then create a comprehensive map and table listing all process steps and identified process failures. High-risk failures in sepsis resuscitation from each of the institutions were compiled to identify common high-risk failures. Common high-risk failures included limited availability of equipment to place the central venous catheter and conduct invasive monitoring, and cognitive overload leading to errors in decision-making. Additionally, we identified great variability in care processes across institutions. Several common high-risk failures in sepsis care exist: a disparity in resources available across hospitals, a lack of adherence to the invasive components of care, and cognitive barriers that affect expert clinicians' decision-making capabilities. Future work may concentrate on dissemination of non-invasive alternatives and overcoming cognitive barriers in diagnosis and knowledge translation.
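
    FMECA-style prioritisation is often operationalised by scoring each failure mode and ranking by a risk priority number (RPN = severity × occurrence × detectability). The failure modes and scores below are illustrative assumptions, not the study's actual worksheet:

```python
# Minimal FMEA/FMECA risk-ranking sketch with classic 1-10 RPN scoring
# (assumption: the study's criticality scoring scheme may differ).
failure_modes = [
    # (description, severity, occurrence, detectability)
    ("Central line equipment unavailable", 9, 6, 4),
    ("Cognitive overload in decision-making", 8, 7, 7),
    ("Delayed lactate result", 6, 5, 3),
]

def rpn(mode):
    _, s, o, d = mode
    return s * o * d

# highest-RPN failure modes are prioritised for improvement efforts
ranked = sorted(failure_modes, key=rpn, reverse=True)
```

    The ranking surfaces the highest-risk process steps, which is how a multi-site FMECA converges on common high-risk failures like those the study reports.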

  20. Procedures for conducting common cause failure analysis in probabilistic safety assessment

    International Nuclear Information System (INIS)

    1992-05-01

    The principal objective of this report is to supplement the procedure developed in Mosleh et al. (1988, 1989) by providing more explicit guidance for a practical approach to common cause failures (CCF) analysis. The detailed CCF analysis following that procedure would be very labour intensive and time consuming. This document identifies a number of options for performing the more labour intensive parts of the analysis in an attempt to achieve a balance between the need for detail, the purpose of the analysis and the resources available. The document is intended to be compatible with the Agency's Procedures for Conducting Probabilistic Safety Assessments for Nuclear Power Plants (IAEA, 1992), but can be regarded as a stand-alone report to be used in conjunction with NUREG/CR-4780 (Mosleh et al., 1988, 1989) to provide additional detail, and discussion of key technical issues

  1. Diuretics for heart failure.

    Science.gov (United States)

    Faris, Rajaa F; Flather, Marcus; Purcell, Henry; Poole-Wilson, Philip A; Coats, Andrew J S

    2012-02-15

    Chronic heart failure is a major cause of morbidity and mortality worldwide. Diuretics are regarded as the first-line treatment for patients with congestive heart failure since they provide symptomatic relief. The effects of diuretics on disease progression and survival remain unclear. To assess the harms and benefits of diuretics for chronic heart failure. Updated searches were run in the Cochrane Central Register of Controlled Trials in The Cochrane Library (CENTRAL Issue 1 of 4, 2011), MEDLINE (1966 to 22 February 2011), EMBASE (1980 to 2011 Week 07) and the HERDIN database (1990 to February 2011). We hand searched pertinent journals, and reference lists of papers were inspected. We also contacted manufacturers and researchers in the field. No language restrictions were applied. Double-blinded randomised controlled trials of diuretic therapy comparing one diuretic with placebo, or one diuretic with another active agent (e.g. ACE inhibitors, digoxin), in patients with chronic heart failure were included. Two authors independently abstracted the data and assessed the eligibility and methodological quality of each trial. Extracted data were analysed by determining the odds ratio for dichotomous data, and difference in means for continuous data, of the treated group compared with controls. The likelihood of heterogeneity of the study population was assessed by the Chi-square test. If there was no evidence of statistical heterogeneity and pooling of results was clinically appropriate, a combined estimate was obtained using the fixed-effects model. This update has not identified any new studies for inclusion. The review includes 14 trials (525 participants); 7 were placebo-controlled, and 7 compared diuretics against other agents such as ACE inhibitors or digoxin. We analysed the data for mortality and for worsening heart failure. Mortality data were available in 3 of the placebo-controlled trials (202 participants). Mortality was lower for participants treated with diuretics than for …
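
    The pooling step described above (odds ratios for dichotomous outcomes combined under a fixed-effects model) can be sketched with inverse-variance weighting of log odds ratios; the trial counts are illustrative, not the review's data:

```python
import math

# Inverse-variance fixed-effects pooling of odds ratios. Each tuple holds
# (events_treated, n_treated, events_control, n_control) for one trial;
# the numbers are invented for illustration.
trials = [(4, 50, 9, 48), (2, 30, 6, 32), (5, 40, 10, 42)]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c                            # non-events in each arm
    log_ors.append(math.log((a * d) / (b * c)))      # log odds ratio
    weights.append(1.0 / (1/a + 1/b + 1/c + 1/d))    # 1 / Woolf variance

pooled = math.exp(sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights))
```

    A pooled odds ratio below 1 would favour the treated group, which is the direction the review reports for mortality under diuretics.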

  2. Modeling Patterns of Activities using Activity Curves.

    Science.gov (United States)

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen

    2016-06-01

    Pervasive computing offers an unprecedented opportunity to unobtrusively monitor behavior and use the large amount of collected data to perform analysis of activity-based behavioral patterns. In this paper, we introduce the notion of an activity curve , which represents an abstraction of an individual's normal daily routine based on automatically-recognized activities. We propose methods to detect changes in behavioral routines by comparing activity curves and use these changes to analyze the possibility of changes in cognitive or physical health. We demonstrate our model and evaluate our change detection approach using a longitudinal smart home sensor dataset collected from 18 smart homes with older adult residents. Finally, we demonstrate how big data-based pervasive analytics such as activity curve-based change detection can be used to perform functional health assessment. Our evaluation indicates that correlations do exist between behavior and health changes and that these changes can be automatically detected using smart homes, machine learning, and big data-based pervasive analytics.
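
    One simple way to quantify a change between two activity curves (an assumption for illustration, not the paper's exact algorithm) is to treat each curve as an hourly activity distribution and compare distributions with a symmetric Kullback-Leibler divergence:

```python
import numpy as np

# Activity-curve change detection sketch: summarise a day as a 24-bin activity
# distribution and flag routine changes when the divergence from the baseline
# exceeds a threshold. The data and threshold are illustrative assumptions.
def sym_kl(p, q, eps=1e-9):
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

baseline = np.array([0.2] * 6 + [1.0] * 12 + [0.4] * 6)  # hourly activity, normal routine
changed  = np.array([0.8] * 6 + [0.6] * 12 + [0.4] * 6)  # more night-time activity
threshold = 0.1                                          # alert threshold (assumed)
change_detected = sym_kl(baseline, changed) > threshold
```

    A sustained rise in this divergence over successive days is the kind of signal that could prompt a closer look at cognitive or physical health.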

  3. ECM using Edwards curves

    DEFF Research Database (Denmark)

    Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja

    2013-01-01

    The speedups at the elliptic-curve-arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large …
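
    Item (1) relies on the Edwards addition law for x² + y² = 1 + dx²y², which is complete when d is a non-square; a toy sketch over a small prime field (real ECM uses far larger parameters):

```python
# Toy Edwards-curve addition over F_p. With p = 3 (mod 4), d = -1 mod p is a
# non-square, so the single formula below works for every pair of curve points
# (no exceptional doubling case). Parameters are small illustrative assumptions.
p = 1019
d = p - 1

def edwards_add(P, Q):
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + y1 * x2) * pow(1 + t, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow(1 - t, -1, p) % p
    return (x3, y3)

def on_curve(P):
    x, y = P
    return (x * x + y * y) % p == (1 + d * x * x * y * y) % p

# find a non-trivial affine point by brute force, then double it
P = next((x, y) for x in range(p) for y in range(p)
         if on_curve((x, y)) and x != 0)
Q = edwards_add(P, P)   # doubling via the same unified addition law
```

    This completeness (one formula for addition and doubling, neutral element (0, 1)) is part of what makes Edwards coordinates attractive for ECM stage 1.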

  4. An assessment of consumers’ subconscious responses to frontline employees’ attractiveness in a service failure and recovery situation

    Directory of Open Access Journals (Sweden)

    Christo Boshoff

    2017-06-01

    Full Text Available Background: Initial analyses of the impact of physical attractiveness in a business context have supported the 'what is beautiful is good' contention. However, in circumstances characterised by negative emotions, duress and stress, very little is known about how human beings respond at the subconscious level to the attractiveness of frontline service providers. Aim: The purpose of this study was to assess whether consumers who complain to a frontline service provider about a service failure respond differently at the subconscious level when the service provider involved in the service encounter is attractive compared with one who is less attractive. Method: Forty respondents were exposed to a video clip of a service failure and service recovery situation. While viewing the hypothetical scenario, two neuro-physiological measurements were used to collect data at the subconscious level, namely galvanic skin response (GSR) and electroencephalography (EEG). Results: The results suggest that, at the subconscious level, customers respond differently to the service recovery efforts depending on the attractiveness of the frontline service provider who attempts to rectify the service failure. Conclusion: The results seem to suggest that the physical attractiveness of a frontline service provider moderates (or softens) the negative emotions that a complaining customer might experience during a service failure and complaint situation – consistent with the 'what is beautiful is good' contention.

  5. Evaluation of possible prognostic factors for the success, survival, and failure of dental implants.

    Science.gov (United States)

    Geckili, Onur; Bilhan, Hakan; Geckili, Esma; Cilingir, Altug; Mumcu, Emre; Bural, Canan

    2014-02-01

    To analyze the prognostic factors that are associated with the success, survival, and failure rates of dental implants. Data including implant sizes, insertion time, implant location, and prosthetic treatment of 1656 implants have been collected, and the association of these factors with success, survival, and failure of implants was analyzed. The success rate was lower for short and maxillary implants. The failure rate of maxillary implants exceeded that of mandibular implants, and the failure rate of implants that were placed in the maxillary anterior region was significantly higher than other regions. The failure rates of implants that were placed 5 years ago or more were higher than those that were placed later. Anterior maxilla is more critical for implant loss than other sites. Implants in the anterior mandible show better success compared with other locations, and longer implants show better success rates. The learning curve of the clinician influences survival and success rates of dental implants.

  6. Nuclear cardiology and heart failure

    International Nuclear Information System (INIS)

    Giubbini, Raffaele; Bertagna, Francesco; Milan, Elisa; Mut, Fernando; Dondi, Maurizio; Metra, Marco; Rodella, Carlo

    2009-01-01

    The prevalence of heart failure in the adult population is increasing. It varies between 1% and 2%, although it mainly affects elderly people (6-10% of people over the age of 65 years will develop heart failure). The syndrome of heart failure arises as a consequence of an abnormality in cardiac structure, function, rhythm, or conduction. Coronary artery disease is the leading cause of heart failure and it accounts for this disorder in 60-70% of all patients affected. Nuclear techniques provide unique information on left ventricular function and perfusion by gated-single photon emission tomography (SPECT). Myocardial viability can be assessed by both SPECT and PET imaging. Finally, autonomic dysfunction has been shown to increase the risk of death in patients with heart disease and this may be applicable to all patients with cardiac disease regardless of aetiology. MIBG scanning has a very promising prognostic value in patients with heart failure. (orig.)

  7. Nuclear cardiology and heart failure

    Energy Technology Data Exchange (ETDEWEB)

    Giubbini, Raffaele; Bertagna, Francesco [University of Brescia, Department of Nuclear Medicine, Brescia (Italy); Milan, Elisa [Ospedale Di Castelfranco Veneto, Nuclear Medicine Unit, Castelfranco Veneto (Italy); Mut, Fernando; Dondi, Maurizio [International Atomic Energy Agency, Nuclear Medicine Section, Division of Human Health, Vienna (Austria); Metra, Marco [University of Brescia, Department of Cardiology, Brescia (Italy); Rodella, Carlo [Health Physics Department, Spedali Civili di Brescia, Brescia (Italy)

    2009-12-15

    The prevalence of heart failure in the adult population is increasing. It varies between 1% and 2%, although it mainly affects elderly people (6-10% of people over the age of 65 years will develop heart failure). The syndrome of heart failure arises as a consequence of an abnormality in cardiac structure, function, rhythm, or conduction. Coronary artery disease is the leading cause of heart failure and it accounts for this disorder in 60-70% of all patients affected. Nuclear techniques provide unique information on left ventricular function and perfusion by gated-single photon emission tomography (SPECT). Myocardial viability can be assessed by both SPECT and PET imaging. Finally, autonomic dysfunction has been shown to increase the risk of death in patients with heart disease and this may be applicable to all patients with cardiac disease regardless of aetiology. MIBG scanning has a very promising prognostic value in patients with heart failure. (orig.)

  8. Effects of Diameter on Initial Stiffness of P-Y Curves for Large-Diameter Piles in Sand

    DEFF Research Database (Denmark)

    Sørensen, Søren Peder Hyldal; Ibsen, Lars Bo; Augustesen, Anders Hust

    2010-01-01

    The traditional p-y curve method is developed for slender piles with diameters up to approximately 2.0 m. Hence, the method is not validated for piles with diameters of 4-6 m. The aim of the paper is to extend the p-y curve method to large-diameter non-slender piles in sand by considering the effects of the pile diameter on the soil-pile interaction. Hence, a modified expression for the p-y curves for statically loaded piles in sand is proposed in which the initial slope of the p-y curves depends on the depth below the soil surface, the pile diameter and the internal angle of friction. The evaluation is based on three-dimensional numerical analyses by means of the commercial program FLAC3D incorporating a Mohr-Coulomb failure criterion. The numerical model is validated with laboratory tests in a pressure tank at Aalborg University.
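
    For reference, the traditional API-style p-y relation for sand is commonly given as p = A·pu·tanh(k·H·y/(A·pu)), whose initial stiffness k·H grows with depth but is independent of pile diameter, which is the dependence the paper revisits. A sketch with illustrative (assumed) parameter values:

```python
import math

# Traditional API-style p-y relation for sand (illustrative parameters):
#   p(y) = A * pu * tanh(k * H * y / (A * pu))
# Its initial stiffness dp/dy at y = 0 equals k * H: depth-dependent only.
k = 22e6     # initial modulus of subgrade reaction [N/m^3] (assumed)
H = 5.0      # depth below soil surface [m]
D = 4.0      # pile diameter [m]
pu = 2.5e6   # ultimate soil resistance per unit length [N/m] (assumed)
A = max(3.0 - 0.8 * H / D, 0.9)   # empirical factor for static loading

def p(y):
    return A * pu * math.tanh(k * H * y / (A * pu))

dy = 1e-8
stiffness = p(dy) / dy   # numerically approximates the initial slope k * H
```

    Because the initial slope k·H contains no diameter term, the formulation cannot distinguish a 2 m pile from a 6 m monopile at the same depth; the paper's modification adds that missing diameter dependence.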

  9. Plasma Glutamine Concentrations in Liver Failure.

    Directory of Open Access Journals (Sweden)

    Gunnel Helling

    Full Text Available Higher than normal plasma glutamine concentration at admission to an intensive care unit is associated with an unfavorable outcome. Very high plasma glutamine levels are sometimes seen in both acute and chronic liver failure. We aimed to systematically explore the relation between different types of liver failure and plasma glutamine concentrations. Four different groups of patients were studied: chronic liver failure (n = 40), acute on chronic liver failure (n = 20), acute fulminant liver failure (n = 20), and post-hepatectomy liver failure (n = 20). Child-Pugh and Model for End-stage Liver Disease (MELD) scores were assessed as indices of liver function. All groups except the chronic liver failure group were followed longitudinally during hospitalisation. Outcomes were recorded up to 48 months after study inclusion. All groups had individuals with very high plasma glutamine concentrations. In the total group of patients (n = 100), severity of liver failure correlated significantly with plasma glutamine concentration, but the correlation was not strong. Liver failure, regardless of severity and course of illness, may be associated with a high plasma glutamine concentration. Further studies are needed to understand whether high glutamine levels should be regarded as a biomarker or as a contributor to symptomatology in liver failure.

  10. TELECOMMUNICATIONS INFRASTRUCTURE AND GDP /JIPP CURVE/

    Directory of Open Access Journals (Sweden)

    Mariana Kaneva

    2016-07-01

    Full Text Available The relationship between telecommunications infrastructure and economic activity is under discussion in many scientific papers. Most of the authors use the Jipp curve for research and analysis. A lot of doubts about the correctness of the Jipp curve appear in terms of applying econometric models. The aim of this study is a review of the Jipp curve, refining the possibility of its application in modern conditions. The methodology used in the study is based on dynamic econometric models, including tests for nonstationarity and tests for causality. The focus of this study is directed to methodological problems in measuring the local density types of telecommunication networks. This study offers a specific methodology for assessing the Jipp law, through a VAR approach and Granger causality tests. It is proved that mechanical substitution of momentary aggregated variables (such as the number of subscribers of a telecommunication network at the end of the year) and periodically aggregated variables (such as GDP per capita) in the Jipp curve is methodologically wrong. Researchers have to reconsider the relationship set in the Jipp curve by including additional variables that characterize the telecommunications sector and the economic activity in a particular country within a specified time period. GDP per capita should not be regarded as a single factor for the local density of telecommunications infrastructure. New econometric models studying the relationship between the investments in telecommunications infrastructure and economic development may be not only linear regression models, but also other econometric models. New econometric models should be proposed after testing and validating with sound economic theory and econometric methodology.

  11. Technological change in energy systems. Learning curves, logistic curves and input-output coefficients

    International Nuclear Information System (INIS)

    Pan, Haoran; Koehler, Jonathan

    2007-01-01

    Learning curves have recently been widely adopted in climate-economy models to incorporate endogenous change of energy technologies, replacing the conventional assumption of an autonomous energy efficiency improvement. However, there has been little consideration of the credibility of the learning curve. The current trend that many important energy and climate change policy analyses rely on the learning curve means that it is of great importance to critically examine the basis for learning curves. Here, we analyse the use of learning curves in energy technology, usually implemented as a simple power function. We find that the learning curve cannot separate the effects of price and technological change, cannot reflect continuous and qualitative change of both conventional and emerging energy technologies, cannot help to determine the time paths of technological investment, and misses the central role of R and D activity in driving technological change. We argue that a logistic curve of improving performance modified to include R and D activity as a driving variable can better describe the cost reductions in energy technologies. Furthermore, we demonstrate that the top-down Leontief technology can incorporate the bottom-up technologies that improve along either the learning curve or the logistic curve, through changing input-output coefficients. An application to UK wind power illustrates that the logistic curve fits the observed data better and implies greater potential for cost reduction than the learning curve does. (author)
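As a concrete illustration of the simple power-function learning curve the authors critique, the sketch below fits C = C0 * x**(-b) to synthetic cost data by ordinary least squares in log-log space. The data and the exponent 0.32 are invented for illustration:

```python
import math

def fit_learning_curve(cum_capacity, unit_cost):
    """Fit the power-law learning curve C = C0 * x**(-b) by ordinary
    least squares in log-log space."""
    xs = [math.log(x) for x in cum_capacity]
    ys = [math.log(c) for c in unit_cost]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    b = -slope
    c0 = math.exp(ybar + b * xbar)
    learning_rate = 1.0 - 2.0 ** (-b)   # fractional cost reduction per doubling
    return c0, b, learning_rate

# Synthetic cost series generated with b = 0.32 (learning rate ~20%)
caps = [1, 2, 4, 8, 16, 32]
costs = [100.0 * x ** -0.32 for x in caps]
c0, b, lr = fit_learning_curve(caps, costs)
```

A logistic (S-shaped) performance curve with an R and D driver, as the authors propose, would replace this single-exponent power form.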

  12. Current Understanding of the Pathophysiology of Myocardial Fibrosis and Its Quantitative Assessment in Heart Failure

    Directory of Open Access Journals (Sweden)

    Tong Liu

    2017-04-01

    Full Text Available Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis.

  13. Using Curved Crystals to Study Terrace-Width Distributions.

    Science.gov (United States)

    Einstein, Theodore L.

    Recent experiments on curved crystals of noble and late transition metals (Ortega and Juurlink groups) have renewed interest in terrace width distributions (TWD) for vicinal surfaces. Thus, it is timely to discuss refinements of TWD analysis that are absent from the standard reviews. Rather than by Gaussians, TWDs are better described by the generalized Wigner surmise, with a power-law rise and a Gaussian decay, thereby including effects evident for weak step repulsion: skewness and peak shifts down from the mean spacing. Curved crystals allow analysis of several mean spacings with the same substrate, so that one can check the scaling with the mean width. This is important since such scaling confirms well-established theory. Failure to scale also can provide significant insights. Complicating factors can include step touching (local double-height steps), oscillatory step interactions mediated by metallic (but not topological) surface states, short-range corrections to the inverse-square step repulsion, and accounting for the offset between adjacent layers of almost all surfaces. We discuss how to deal with these issues. For in-plane misoriented steps there are formulas to describe the stiffness but not yet the strength of the elastic interstep repulsion. Supported in part by NSF-CHE 13-05892.
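The generalized Wigner surmise mentioned above has the closed form P(s) = a * s**rho * exp(-b * s**2), where s is the terrace width divided by its mean and the constants a and b are fixed by unit normalization and unit mean. A small sketch (these are the standard constants, not values specific to any one experiment):

```python
import math

def wigner_twd(s, rho):
    """Generalized Wigner surmise for the terrace-width distribution:
    P(s) = a * s**rho * exp(-b * s**2), normalized to unit area and unit
    mean. Larger rho corresponds to stronger step-step repulsion and a
    narrower, more symmetric distribution."""
    b = (math.gamma((rho + 2.0) / 2.0) / math.gamma((rho + 1.0) / 2.0)) ** 2
    a = 2.0 * b ** ((rho + 1.0) / 2.0) / math.gamma((rho + 1.0) / 2.0)
    return a * s ** rho * math.exp(-b * s * s)
```

The power-law rise s**rho and Gaussian decay exp(-b*s**2) are exactly the features contrasted with a plain Gaussian in the text: skewness and a peak shifted below the mean spacing for weak repulsion (small rho).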

  14. Limited sampling strategies drawn within 3 hours postdose poorly predict mycophenolic acid area-under-the-curve after enteric-coated mycophenolate sodium.

    NARCIS (Netherlands)

    Winter, B.C. de; Gelder, T. van; Mathôt, R.A.A.; Glander, P.; Tedesco-Silva, H.; Hilbrands, L.B.; Budde, K.; Hest, R.M. van

    2009-01-01

    Previous studies predicted that limited sampling strategies (LSS) for estimation of mycophenolic acid (MPA) area-under-the-curve (AUC(0-12)) after ingestion of enteric-coated mycophenolate sodium (EC-MPS) using a clinically feasible sampling scheme may have poor predictive performance. Failure of

  15. Sensitivity and Specificity of a Five-Minute Cognitive Screening Test in Patients With Heart Failure.

    Science.gov (United States)

    Cameron, Janette D; Gallagher, Robyn; Pressler, Susan J; McLennan, Skye N; Ski, Chantal F; Tofler, Geoffrey; Thompson, David R

    2016-02-01

    Cognitive impairment occurs in up to 80% of patients with heart failure (HF). The National Institute for Neurological Disorders and Stroke (NINDS) and the Canadian Stroke Network (CSN) recommend a 5-minute cognitive screening protocol that has yet to be psychometrically evaluated in HF populations. The aim of this study was to conduct a secondary analysis of the sensitivity and specificity of the NINDS-CSN brief cognitive screening protocol in HF patients. The Montreal Cognitive Assessment (MoCA) was administered to 221 HF patients. The NINDS-CSN screen comprises 3 MoCA items, with lower scores indicating poorer cognitive function. Receiver operator characteristic (ROC) curves were constructed, determining the sensitivity, specificity and appropriate cutoff scores of the NINDS-CSN screen. In an HF population aged 76 ± 12 years, 136 (62%) were characterized with cognitive impairment (MoCA area under the receiver operating characteristic curve indicated good accuracy in screening for cognitive impairment (0.88; P cognitive impairment in patients with HF. Future studies should include a neuropsychologic battery to more comprehensively examine the diagnostic accuracy of brief cognitive screening protocols. Copyright © 2016 Elsevier Inc. All rights reserved.
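For readers who want the mechanics behind such a ROC analysis, here is a minimal self-contained sketch (synthetic scores, not the study's data) computing sensitivity and specificity at a cutoff, plus the AUC via the rank-sum identity; as with the MoCA, lower scores denote poorer cognition:

```python
def roc_stats(scores, impaired, cutoff):
    """Sensitivity/specificity at a cutoff (score < cutoff flags impairment)
    and AUC via the Mann-Whitney rank-sum identity."""
    tp = sum(1 for s, y in zip(scores, impaired) if y and s < cutoff)
    fn = sum(1 for s, y in zip(scores, impaired) if y and s >= cutoff)
    tn = sum(1 for s, y in zip(scores, impaired) if not y and s >= cutoff)
    fp = sum(1 for s, y in zip(scores, impaired) if not y and s < cutoff)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    pos = [s for s, y in zip(scores, impaired) if y]
    neg = [s for s, y in zip(scores, impaired) if not y]
    # AUC = probability a random impaired subject scores below a random
    # unimpaired one (ties count half)
    wins = sum((p < n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    return sens, spec, auc
```

Scanning `cutoff` over all observed scores and plotting sensitivity against (1 - specificity) traces the ROC curve whose area the study reports.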

  16. Fuzzy-logic assessment of failure hazard in pipelines due to mining activity

    Directory of Open Access Journals (Sweden)

    A. A. Malinowska

    2015-11-01

    Full Text Available The present research is aimed at a critical analysis of a method presently used for evaluating failure hazard in linear objects in mining areas. A fuzzy model of the failure hazard of a linear object was created on the basis of the experience gathered so far. The rules of a Mamdani fuzzy model have been used in the analyses. Finally, the scaled model was integrated with a Geographic Information System (GIS), which was used to evaluate the failure hazard in a water pipeline in a mining area.
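A Mamdani fuzzy model of the kind used here can be sketched in a few lines. The membership functions and rules below are hypothetical stand-ins, not the paper's calibrated model: two inputs (ground strain from mining subsidence and pipe condition) feed min-implication rules, the clipped output sets are max-aggregated, and the hazard is defuzzified by centroid:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def hazard_mamdani(strain, condition):
    """Minimal Mamdani inference sketch (hypothetical memberships/rules).
    strain: ground strain [mm/m]; condition: 0 (good) .. 1 (poor).
    Returns a hazard index in [0, 1]."""
    lo_s, hi_s = tri(strain, -1, 0, 3), tri(strain, 1, 4, 6)
    good, poor = tri(condition, 0, 0.2, 0.6), tri(condition, 0.4, 0.8, 1.1)
    # Rule 1: strain high OR condition poor -> hazard high (max = fuzzy OR)
    fire_hi = max(hi_s, poor)
    # Rule 2: strain low AND condition good -> hazard low (min = fuzzy AND)
    fire_lo = min(lo_s, good)
    # Aggregate clipped output sets on a hazard universe [0, 1], centroid defuzzify
    xs = [i / 100.0 for i in range(101)]
    agg = [max(min(tri(x, -0.5, 0, 0.5), fire_lo),
               min(tri(x, 0.5, 1, 1.5), fire_hi)) for x in xs]
    return sum(x * m for x, m in zip(xs, agg)) / (sum(agg) or 1.0)
```

In the paper the resulting hazard surface is evaluated per pipeline segment inside the GIS rather than at single points as here.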

  17. An assessment of the linear damage summation method for creep-fatigue failure with reference to a cast of type 316 stainless steel tested at 570 deg. C

    International Nuclear Information System (INIS)

    Wareing, J.; Bretherton, I.

    This paper presents preliminary results from the programme of hold period tests on cast BQ of type 316 stainless steel at 570 deg. C. The results of tensile hold period tests on a relatively low ductility cast of type 316 stainless steel have indicated that the failure mechanism changes from a creep-fatigue interaction failure to a creep dominated failure at low strain levels. An assessment of the linear damage summation approach for failure prediction indicates that it is inappropriate for creep-fatigue interaction failures. For creep dominated fracture, failure occurs when the accumulated relaxation strain exhausts the material ductility, i.e. N_f ε_R = D, where N_f is the number of cycles to failure, ε_R the relaxation strain accumulated per cycle and D the material ductility. The failure criterion based on a creep summation in terms of time to fracture underestimates life.
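The linear damage summation rule being assessed combines a fatigue fraction (cycle ratio) with a creep fraction (time ratio), with failure predicted when the sum reaches unity. A minimal sketch with illustrative numbers, not the paper's cast-BQ data:

```python
def creep_fatigue_damage(cycles, Nf, hold_time_per_cycle, t_rupture):
    """Linear damage summation: fatigue fraction n/Nf plus creep fraction
    t/tr; failure is predicted when the sum reaches 1.0."""
    fatigue_frac = cycles / Nf
    creep_frac = cycles * hold_time_per_cycle / t_rupture
    return fatigue_frac + creep_frac

def cycles_to_failure(Nf, hold_time_per_cycle, t_rupture):
    """Number of cycles at which the summed damage reaches 1."""
    return 1.0 / (1.0 / Nf + hold_time_per_cycle / t_rupture)

# Illustration: Nf = 1000 cycles (pure fatigue), 1 h hold per cycle,
# creep rupture life 2000 h -> predicted life ~667 cycles
n_pred = cycles_to_failure(1000.0, 1.0, 2000.0)
```

The paper's point is that for creep dominated fracture a ductility-exhaustion criterion (N_f ε_R = D) matches the data better than this time-fraction summation, which underestimates life.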

  18. On changing points of mean residual life and failure rate function for some generalized Weibull distributions

    International Nuclear Information System (INIS)

    Xie, M.; Goh, T.N.; Tang, Y.

    2004-01-01

    The failure rate function and mean residual life function are two important characteristics in reliability analysis. Although many papers have studied distributions with bathtub-shaped failure rate and their properties, few have focused on the underlying associations between the mean residual life and failure rate function of these distributions, especially with respect to their changing points. It is known that the change point for the mean residual life can be much earlier than that of the failure rate function. In fact, the failure rate function should be flat for a long period of time for a distribution to be useful in practice. When the difference between the change points is large, the flat portion tends to be longer. This paper investigates the change points and focuses on the difference between the changing points. The exponentiated Weibull, a modified Weibull, and an extended Weibull distribution, all with bathtub-shaped failure rate functions, will be used. Some other issues related to the flatness of the bathtub curve are discussed.
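To make the change-point comparison concrete, consider a modified Weibull of the form F(t) = 1 - exp(-a t^b e^(λt)), which is bathtub-shaped for b < 1 and whose failure-rate change point has the closed form t* = (√b - b)/λ. The parameters below are illustrative only, not necessarily those studied in the paper; the mean-residual-life change point is located numerically and indeed falls much earlier:

```python
import math

A, B, LAM = 0.02, 0.5, 0.1   # illustrative parameters; bathtub since B < 1

def hazard(t):
    """Failure rate of the modified Weibull F(t) = 1 - exp(-A t^B e^(LAM t))."""
    return A * (B + LAM * t) * t ** (B - 1.0) * math.exp(LAM * t)

def survival(t):
    return math.exp(-A * t ** B * math.exp(LAM * t))

def mrl(t, dt=0.05, horizon=120.0):
    """Mean residual life m(t) = (1/S(t)) * integral_t^inf S(u) du,
    evaluated by trapezoidal integration."""
    total, u = 0.0, t
    while u < horizon:
        total += 0.5 * (survival(u) + survival(u + dt)) * dt
        u += dt
    return total / survival(t)

t_hazard = (math.sqrt(B) - B) / LAM                        # closed-form hazard minimum
t_mrl = max((i * 0.05 for i in range(1, 120)), key=mrl)    # numerical MRL maximum
```

With these parameters the MRL peaks well before the hazard reaches its minimum, illustrating the paper's observation that a large gap between the two change points goes with a long flat portion of the bathtub curve.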

  19. The interaction of NDE and failure analysis

    International Nuclear Information System (INIS)

    Nichols, R.W.

    1988-01-01

    This paper deals with the use of Non-Destructive Examination (NDE) and failure analysis for the assessment of the structural integrity. It appears that failure analysis enables to know whether NDE is required or not, and can help to direct NDE into the most useful directions by identifying the areas where it is most important that defects are absent. It also appears that failure analysis can help the operator to decide which NDE method is best suited to the component studied and provides detailed specifications for this NDE method. The interaction between failure analysis and NDE is then described. (TEC)

  20. The interaction of NDE and failure analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, R W

    1988-12-31

    This paper deals with the use of Non-Destructive Examination (NDE) and failure analysis for the assessment of the structural integrity. It appears that failure analysis enables to know whether NDE is required or not, and can help to direct NDE into the most useful directions by identifying the areas where it is most important that defects are absent. It also appears that failure analysis can help the operator to decide which NDE method is best suited to the component studied and provides detailed specifications for this NDE method. The interaction between failure analysis and NDE is then described. (TEC).

  1. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy

    International Nuclear Information System (INIS)

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-01-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events. (paper)

  2. Health information systems: failure, success and improvisation.

    Science.gov (United States)

    Heeks, Richard

    2006-02-01

    The generalised assumption of health information systems (HIS) success is questioned by a few commentators in the medical informatics field. They point to widespread HIS failure. The purpose of this paper was therefore to develop a better conceptual foundation for, and practical guidance on, health information systems failure (and success). Literature and case analysis plus pilot testing of developed model. Defining HIS failure and success is complex, and the current evidence base on HIS success and failure rates was found to be weak. Nonetheless, the best current estimate is that HIS failure is an important problem. The paper therefore derives and explains the "design-reality gap" conceptual model. This is shown to be robust in explaining multiple cases of HIS success and failure, yet provides a contingency that encompasses the differences which exist in different HIS contexts. The design-reality gap model is piloted to demonstrate its value as a tool for risk assessment and mitigation on HIS projects. It also throws into question traditional, structured development methodologies, highlighting the importance of emergent change and improvisation in HIS. The design-reality gap model can be used to address the problem of HIS failure, both as a post hoc evaluative tool and as a pre hoc risk assessment and mitigation tool. It also validates a set of methods, techniques, roles and competencies needed to support the dynamic improvisations that are found to underpin cases of HIS success.

  3. Skeletal muscle beta-receptors and isoproterenol-stimulated vasodilation in canine heart failure

    International Nuclear Information System (INIS)

    Frey, M.J.; Lanoce, V.; Molinoff, P.B.; Wilson, J.R.

    1989-01-01

    To investigate whether heart failure alters beta-adrenergic receptors on skeletal muscle and its associated vasculature, the density of beta-adrenergic receptors, isoproterenol-stimulated adenylate cyclase activity, and coupling of the guanine nucleotide-binding regulatory protein were compared in 18 control dogs and 16 dogs with heart failure induced by 5-8 wk of ventricular pacing at 260 beats/min. Hindlimb vascular responses to isoproterenol were compared in eight controls and eight of the dogs with heart failure. In dogs with heart failure, the density of beta-receptors on skeletal muscle was reduced in both gastrocnemius (control: 50 +/- 5; heart failure: 33 +/- 8 fmol/mg of protein) and semitendinosus muscle (control: 43 +/- 9; heart failure: 27 +/- 9 fmol/mg of protein, both P less than 0.05). Receptor coupling to the ternary complex, as determined by isoproterenol competition curves with and without guanosine 5'-triphosphate (GTP), was unchanged. Isoproterenol-stimulated adenylate cyclase activity was significantly decreased in semitendinosus muscle (control: 52.4 +/- 4.6; heart failure: 36.5 +/- 9.5 pmol.mg-1.min-1; P less than 0.05) and tended to be decreased in gastrocnemius muscle (control: 40.1 +/- 8.5; heart failure: 33.5 +/- 4.5 pmol.mg-1.min-1; P = NS). Isoproterenol-induced hindlimb vasodilation was not significantly different in controls and in dogs with heart failure. These findings suggest that heart failure causes downregulation of skeletal muscle beta-adrenergic receptors, probably due to receptor exposure to elevated catecholamine levels, but does not reduce beta-receptor-mediated vasodilation in muscle

  4. An assessment of BWR [boiling water reactor] Mark-II containment challenges, failure modes, and potential improvements in performance

    International Nuclear Information System (INIS)

    Kelly, D.L.; Jones, K.R.; Dallman, R.J.; Wagner, K.C.

    1990-07-01

    This report assesses challenges to BWR Mark II containment integrity that could potentially arise from severe accidents. Also assessed are some potential improvements that could prevent core damage or containment failure, or could mitigate the consequences of such failure by reducing the release of fission products to the environment. These challenges and improvements are analyzed via a limited quantitative risk/benefit analysis of a generic BWR/4 reactor with Mark II containment. Point estimate frequencies of the dominant core damage sequences are obtained and simple containment event trees are constructed to evaluate the response of the containment to these severe accident sequences. The resulting containment release modes are then binned into source term release categories, which provide inputs to the consequence analysis. The output of the consequence analysis is used to construct an overall base case risk profile. Potential improvements and sensitivities are evaluated by modifying the event tree split fractions, thus generating a revised risk profile. Several important sensitivity cases are examined to evaluate the impact of phenomenological uncertainties on the final results. 75 refs., 25 figs., 65 tabs

  5. Estimations of creep behavior and failure life for a circumferentially notched specimen

    International Nuclear Information System (INIS)

    Kobayashi, Ken-ichi; Yokobori, Toshimitsu; Kikuchi, Kenji.

    1997-01-01

    No method with which to characterize and/or illustrate the total creep behavior of specimens with notches, holes or cracks has been proposed. In this paper it is proposed that most creep curves can be drawn with a master curve for each creep test whenever test conditions and failure modes are similar to each other, and the lifetime ratio normalized by the rupture time is introduced. Using smooth and circumferentially notched specimens of 2.25 Cr-1 Mo steel, creep tests were performed at 600 deg. C for examination of this concept. Furthermore, a θ projection method was used to describe creep curves for notched specimens and to extrapolate longer creep lives. Then, the whole creep curve shape for notched specimens could be easily drawn, except for that in the vicinity of the rupture point. However, longer creep lives of notched specimens were underestimated in comparison with a simple extrapolation of the experimental data. This resulted from the negative dependence of the parameter θ3 on the applied stress. (author)
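The θ projection method referred to represents a full creep curve as a decaying primary term plus an accelerating tertiary term, ε(t) = θ1(1 - e^(-θ2 t)) + θ3(e^(θ4 t) - 1). A small sketch with invented parameter values, not the 2.25 Cr-1 Mo test data:

```python
import math

def theta_creep_strain(t, th1, th2, th3, th4):
    """Theta-projection creep curve: primary (saturating) plus tertiary
    (accelerating) components. th3 scales the tertiary term whose stress
    dependence the paper identifies as the source of the underestimate."""
    return th1 * (1.0 - math.exp(-th2 * t)) + th3 * (math.exp(th4 * t) - 1.0)

# Illustration: primary strain saturating at 1% plus a tertiary runaway term
ts = [0.0, 100.0, 500.0, 1000.0]
strains = [theta_creep_strain(t, 0.01, 0.02, 1e-4, 0.004) for t in ts]
```

Fitting (θ1..θ4) to the measured portion of a curve and extrapolating e^(θ4 t) forward is what yields the life estimates discussed above.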

  6. Failure is an option: Reactions to failure in elementary engineering design projects

    Science.gov (United States)

    Johnson, Matthew M.

    Recent reform documents in science education have called for teachers to use epistemic practices of science and engineering researchers to teach disciplinary content (NRC, 2007; NRC, 2012; NGSS Lead States, 2013). Although this creates challenges for classroom teachers unfamiliar with engineering, it has created a need for high quality research about how students and teachers engage in engineering activities to improve curriculum development and teaching pedagogy. While framers of the Next Generation Science Standards (NRC, 2012; NGSS Lead States 2013) focused on the similarities of the practices of science researchers and engineering designers, some have proposed that engineering has a unique set of epistemic practices, including improving from failure (Cunningham & Carlsen, 2014; Cunningham & Kelly, in review). While no one will deny failures occur in science, failure in engineering is thought of in fundamentally different ways. In the study presented here, video data from eight classes of elementary students engaged in one of two civil engineering units were analyzed using methods borrowed from psychology, anthropology, and sociolinguistics to investigate: 1) the nature of failure in elementary engineering design; 2) the ways in which teachers react to failure; and 3) how the collective actions of students and teachers support or constrain improvement in engineering design. I propose new ways of considering the types and causes of failure, and note three teacher reactions to failure: the manager, the cheerleader, and the strategic partner. Because the goal of iteration in engineering is improvement, I also studied improvement. Students only systematically improve when they have the opportunity, productive strategies, and fair comparisons between prototypes. I then investigate the use of student engineering journals to assess learning from the process of improvement after failure. 
After discussion, I consider implications from this work as well as future research.

  7. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.

    Science.gov (United States)

    Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G

    2012-05-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with its Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows complex stimulated luminescence curves to be resolved into their components and the associated luminescence parameters to be evaluated.

  8. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program

    International Nuclear Information System (INIS)

    Afouxenidis, D.; Polymeris, G. S.; Tsirliganis, N. C.; Kitis, G.

    2012-01-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with its Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the Glow Curve Analysis Intercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows complex stimulated luminescence curves to be resolved into their components and the associated luminescence parameters to be evaluated. (authors)
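The core of such a deconvolution is least-squares decomposition of a measured curve into overlapping component peaks, exactly what Excel's Solver iterates on. As a hedged stand-in for the physical TL peak shapes used in the papers, the sketch below separates a synthetic curve into two Gaussian components with known positions, solving for the amplitudes by linear least squares (normal equations):

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def deconvolve_two_peaks(xs, ys, mu1, s1, mu2, s2):
    """Amplitudes of two fixed-shape components by linear least squares:
    minimize sum (y - a1*g1 - a2*g2)^2 via the 2x2 normal equations."""
    g1 = [gauss(x, mu1, s1) for x in xs]
    g2 = [gauss(x, mu2, s2) for x in xs]
    m11 = sum(g * g for g in g1)
    m22 = sum(g * g for g in g2)
    m12 = sum(a * b for a, b in zip(g1, g2))
    v1 = sum(g * y for g, y in zip(g1, ys))
    v2 = sum(g * y for g, y in zip(g2, ys))
    det = m11 * m22 - m12 * m12
    a1 = (v1 * m22 - v2 * m12) / det
    a2 = (v2 * m11 - v1 * m12) / det
    return a1, a2

# Synthetic "glow curve": two overlapping peaks with amplitudes 100 and 60
xs = [i * 0.5 for i in range(200)]
ys = [100.0 * gauss(x, 30.0, 8.0) + 60.0 * gauss(x, 55.0, 10.0) for x in xs]
a1, a2 = deconvolve_two_peaks(xs, ys, 30.0, 8.0, 55.0, 10.0)
```

Real TL/OSL deconvolution additionally treats the peak positions, widths and kinetic parameters as unknowns, which turns this into the nonlinear optimization that Solver performs.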

  9. Sequential Oxygenation Index and Organ Dysfunction Assessment within the First 3 Days of Mechanical Ventilation Predict the Outcome of Adult Patients with Severe Acute Respiratory Failure

    Directory of Open Access Journals (Sweden)

    Hsu-Ching Kao

    2013-01-01

    Full Text Available Objective. To determine early predictors of outcomes of adult patients with severe acute respiratory failure. Method. 100 consecutive adult patients with severe acute respiratory failure were evaluated in this retrospective study. Data including comorbidities, Sequential Organ Failure Assessment (SOFA) score, Acute Physiological Assessment and Chronic Health Evaluation II (APACHE II) score, PaO2, FiO2, PaO2/FiO2, PEEP, mean airway pressure (mPaw), and oxygenation index (OI) on the 1st and the 3rd day of mechanical ventilation, and change in OI within 3 days were recorded. Primary outcome was hospital mortality; secondary outcome measure was ventilator weaning failure. Results. 38 out of 100 (38%) patients died within the study period. 48 patients (48%) failed to wean from the ventilator. Multivariate analysis showed day 3 OI and SOFA score were independent predictors of hospital mortality. Preexisting cerebrovascular accident (CVA) was the predictor of weaning failure. Results from the Kaplan-Meier method demonstrated that higher day 3 OI was associated with shorter survival time (log-rank test). Conclusion. Early OI (within 3 days) and SOFA score were predictors of mortality in severe acute respiratory failure. In the future, prospective studies measuring serial OIs in a larger study cohort are required to further consolidate our findings.
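The oxygenation index used as a predictor here is a simple bedside computation, OI = (FiO2 [%] x mean airway pressure [cmH2O]) / PaO2 [mmHg]; a sketch using the standard definition with illustrative numbers:

```python
def oxygenation_index(fio2_fraction, mean_airway_pressure, pao2):
    """OI = (FiO2 as a percentage * mPaw [cmH2O]) / PaO2 [mmHg].
    Higher values indicate worse oxygenation."""
    return fio2_fraction * 100.0 * mean_airway_pressure / pao2

def pf_ratio(pao2, fio2_fraction):
    """PaO2/FiO2 ratio [mmHg]; lower values indicate worse oxygenation."""
    return pao2 / fio2_fraction
```

Unlike the PaO2/FiO2 ratio, the OI incorporates mean airway pressure, so it also reflects how much ventilatory support is being spent to achieve a given oxygenation, which is why serial (day 1 vs day 3) values carry prognostic information.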

  10. Assessing responsiveness of generic and specific health related quality of life measures in heart failure

    Directory of Open Access Journals (Sweden)

    Johnson Jeffrey A

    2006-11-01

    Full Text Available Abstract Background Responsiveness, or sensitivity to clinical change, is an important consideration in the selection of a health-related quality of life (HRQL) measure for trials or clinical applications. Many approaches can be used to assess responsiveness, which may affect the interpretation of study results. We compared the relative responsiveness of generic and heart failure specific HRQL instruments, as measured both by common psychometric indices and by external clinical criteria. Methods We analyzed data collected at baseline and 6 weeks in 298 subjects with heart failure on the following HRQL measures: EQ-5D (US, UK, and VAS scoring), Kansas City Cardiomyopathy Questionnaire (KCCQ; Clinical and Overall Summary Scores), and RAND12 (Physical and Mental Component Summaries). Three external indicators of clinical change were used to classify subjects as improved, deteriorated, or unchanged: 6-minute walk test, New York Heart Association (NYHA) class, and physician global rating of change. Four responsiveness statistics (T-test, effect size, Guyatt's responsiveness statistic, and standardized response mean) were used to evaluate the responsiveness of the selected measures. The median rank of each HRQL measure across responsiveness indices and clinical criteria was then determined. Results The average age of subjects was 60 years; 75 percent were male and had moderate to severe heart failure symptoms. Overall, the KCCQ Summary Scores had the highest relative ranking, irrespective of the responsiveness index or external criterion used. Importantly, we observed that the relative ranking of responsiveness of the generic measures (i.e. EQ-5D, RAND12) was influenced by both the responsiveness indices and the external criterion used. Conclusion The disease specific KCCQ was the most responsive HRQL measure assessing change over a 6-week period, although generic measures provide information for which the KCCQ is not suitable.
The responsiveness of generic HRQL measures may
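Two of the four responsiveness statistics compared above can be computed directly from paired scores: the effect size divides the mean change by the standard deviation of the baseline scores, while the standardized response mean divides it by the standard deviation of the change scores. A minimal sketch on invented data:

```python
import statistics

def responsiveness_stats(baseline, followup):
    """Effect size (mean change / SD of baseline) and standardized
    response mean (mean change / SD of change) for paired HRQL scores."""
    change = [f - b for b, f in zip(baseline, followup)]
    mean_change = statistics.mean(change)
    es = mean_change / statistics.stdev(baseline)
    srm = mean_change / statistics.stdev(change)
    return es, srm
```

Because the two statistics use different denominators, they can rank the same set of instruments differently, which is exactly the index-dependence the study reports for the generic measures.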

  11. UQ and V&V techniques applied to experiments and simulations of heated pipes pressurized to failure

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dempsey, J. Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Antoun, Bonnie R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.

  12. Service reliability assessment using failure mode and effect analysis ...

    African Journals Online (AJOL)

    user


  13. Development of container failure models

    International Nuclear Information System (INIS)

    Garisto, N.C.

    1990-01-01

    In order to produce a complete performance assessment for a Canadian waste vault, some prediction of container failure times is required. Data are limited; however, the effects of various possible failure scenarios on the rest of the vault model can be tested. For titanium and copper, the two materials considered in the Canadian program, data are available on the frequency of failures due to manufacturing defects; there is also an estimate of the expected size of such defects. It can be shown that the consequences of such small defects in terms of the dose to humans are acceptable. It is not clear, from a modelling point of view, whether titanium or copper is preferable.

  14. Tensile and compressive failure modes of laminated composites loaded by fatigue with different mean stress

    Science.gov (United States)

    Rotem, Assa

    1990-01-01

Laminated composite materials tend to fail differently under tensile or compressive load. Under tension, the material accumulates cracks and fiber fractures, while under compression, the material delaminates and buckles. Tensile-compressive fatigue may cause either of these failure modes depending on the specific damage occurring in the laminate. This damage depends on the stress ratio of the fatigue loading. Analysis of the fatigue behavior of the composite laminate under tension-tension, compression-compression, and tension-compression has led to the development of a fatigue envelope presentation of the failure behavior. This envelope indicates the specific failure mode for any stress ratio and number of loading cycles. The construction of the fatigue envelope is based on the applied stress-cycles to failure (S-N) curves of both tensile-tensile and compressive-compressive fatigue. Test results are presented to verify the theoretical analysis.

  15. A standard curve based method for relative real time PCR data processing

    Directory of Open Access Journals (Sweden)

    Krause Andreas

    2005-03-01

Full Text Available Abstract Background Currently real time PCR is the most precise method by which to measure gene expression. The method generates a large amount of raw numerical data and processing may notably influence final results. The data processing is based either on standard curves or on PCR efficiency assessment. At the moment, the PCR efficiency approach is preferred in relative PCR whilst the standard curve is often used for absolute PCR. However, there are no barriers to employing standard curves for relative PCR. This article provides an implementation of the standard curve method and discusses its advantages and limitations in relative real time PCR. Results We designed a procedure for data processing in relative real time PCR. The procedure completely avoids PCR efficiency assessment, minimizes operator involvement and provides a statistical assessment of intra-assay variation. The procedure includes the following steps. (I) Noise is filtered from raw fluorescence readings by smoothing, baseline subtraction and amplitude normalization. (II) The optimal threshold is selected automatically from regression parameters of the standard curve. (III) Crossing points (CPs) are derived directly from the coordinates of the points where the threshold line crosses the fluorescence plots obtained after the noise filtering. (IV) The means and their variances are calculated for CPs in PCR replicates. (V) The final results are derived from the CPs' means. The CPs' variances are traced to the results by the law of error propagation. A detailed description and analysis of this data processing is provided. The limitations associated with the use of parametric statistical methods and amplitude normalization are specifically analyzed and found suitable for routine laboratory practice. Different options are discussed for aggregating data obtained from multiple reference genes. Conclusion A standard curve based procedure for PCR data processing has been compiled and validated.
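The core of the standard-curve workflow above (fit a regression of crossing point against log quantity, then invert it to quantify unknowns) can be sketched in a few lines. This is a minimal illustration with made-up dilution data, not the authors' implementation; the function names and the ten-fold series are assumptions.

```python
def fit_standard_curve(log_quantities, crossing_points):
    """Least-squares line CP = slope * log10(quantity) + intercept."""
    n = len(log_quantities)
    mx = sum(log_quantities) / n
    my = sum(crossing_points) / n
    sxx = sum((x - mx) ** 2 for x in log_quantities)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log_quantities, crossing_points))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(cp, slope, intercept):
    """Invert the standard curve: log10 quantity for an observed crossing point."""
    return (cp - intercept) / slope

def efficiency(slope):
    """Per-cycle amplification factor implied by the slope (2.0 = perfect doubling)."""
    return 10 ** (-1.0 / slope)

# Hypothetical ten-fold dilution series (log10 quantities 5..1) with an
# ideal slope of about -3.32 cycles per decade.
logs = [5, 4, 3, 2, 1]
cps = [15.0, 18.32, 21.64, 24.96, 28.28]
slope, intercept = fit_standard_curve(logs, cps)
```

A slope of -3.32 cycles per decade corresponds to an efficiency of roughly 2.0, i.e. perfect doubling each cycle.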

  16. Structures for common-cause failure analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1981-01-01

    Common-cause failure methodology and terminology have been reviewed and structured to provide a systematical basis for addressing and developing models and methods for quantification. The structure is based on (1) a specific set of definitions, (2) categories based on the way faults are attributable to a common cause, and (3) classes based on the time of entry and the time of elimination of the faults. The failure events are then characterized by their likelihood or frequency and the average residence time. The structure provides a basis for selecting computational models, collecting and evaluating data and assessing the importance of various failure types, and for developing effective defences against common-cause failure. The relationships of this and several other structures are described

  17. On the estimation of failure rates for living PSAs in the presence of model uncertainty

    International Nuclear Information System (INIS)

    Arsenis, S.P.

    1994-01-01

The estimation of failure rates of heterogeneous Poisson components from data on times operated to failures is reviewed. Particular emphasis is given to the lack of knowledge on the form of the mixing distribution or population variability curve. A new nonparametric empirical Bayes estimator is proposed which generalizes the estimator of Robbins to different observation times for the components. The behavior of the estimator is discussed by reference to two samples typically drawn from the CEDB, a component event database designed and operated by the Ispra JRC

  18. Analysis of dependent failures in the ORNL precursor study

    International Nuclear Information System (INIS)

    Ballard, G.M.

    1985-01-01

The study of dependent failures (or common cause/mode failures) in the safety assessment of potentially hazardous plant is one of the significant areas of uncertainty in performing probabilistic safety studies. One major reason for this uncertainty is that data on dependent failures are apparently not readily available in sufficient quantity to assist in the development and validation of models. The incident reports that were compiled for the ORNL study on Precursors to Severe Core Damage Accidents (NUREG/CR-2497) provide an opportunity to look at the importance of dependent failures in the most significant incidents of recent reactor operations, to look at the success of probabilistic risk assessment (PRA) methods in accounting for the contribution of dependent failures, and to look at the dependent failure incidents with the aim of identifying the most significant problem areas. In this paper an analysis has been made of the incidents compiled in NUREG/CR-2497, and events involving multiple failures which were not independent have been identified. From this analysis it is clear that dependent failures are a very significant contributor to the precursor incidents. The method of enumeration of accident frequency used in NUREG/CR-2497 can be shown to take account of dependent failures, and this may be a significant factor contributing to the apparent difference between the precursor accident frequency and typical PRA frequencies

  19. Learning curve for laparoscopic Heller myotomy and Dor fundoplication for achalasia.

    Science.gov (United States)

    Yano, Fumiaki; Omura, Nobuo; Tsuboi, Kazuto; Hoshino, Masato; Yamamoto, Seryung; Akimoto, Shunsuke; Masuda, Takahiro; Kashiwagi, Hideyuki; Yanaga, Katsuhiko

    2017-01-01

Although laparoscopic Heller myotomy and Dor fundoplication (LHD) is widely performed to address achalasia, little is known about the learning curve for this technique. We assessed the learning curve for performing LHD. Of the 514 cases with LHD performed between August 1994 and March 2016, the surgical outcomes of 463 cases were evaluated after excluding 50 cases with reduced port surgery and one case with the simultaneous performance of laparoscopic distal partial gastrectomy. A receiver operating characteristic (ROC) curve analysis was used to identify the cut-off value for the number of surgical experiences necessary to become proficient with LHD, which was defined as the completion of the learning curve. We defined the completion of the learning curve when the following 3 conditions were satisfied. 1) The operation time was less than 165 minutes. 2) There was no blood loss. 3) There was no intraoperative complication. In order to establish the appropriate number of surgical experiences required to complete the learning curve, the cut-off value was evaluated by using a ROC curve (AUC 0.717, p < 0.001). Finally, we identified the cut-off value as 16 surgical cases (sensitivity 0.706, specificity 0.646). The learning curve appears to be completed after performing 16 cases.
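The ROC cut-off analysis described here (choosing the case count that best separates operations meeting the proficiency conditions from those that do not) can be sketched as follows. The data are invented and the use of Youden's J as the optimality criterion is an assumption; the paper does not state exactly how its cut-off was selected from the ROC curve.

```python
def roc_points(scores, labels):
    """Sweep every observed score as a threshold; a case is called 'positive'
    when its score >= threshold. Returns (fpr, tpr, threshold) triples."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for thr in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= thr and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= thr and y == 0)
        pts.append((fp / neg, tp / pos, thr))
    return pts

def youden_cutoff(scores, labels):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1."""
    return max(roc_points(scores, labels), key=lambda p: p[1] - p[0])[2]

# Illustrative data: case number as the 'score'; 1 = the operation met all
# three proficiency conditions.
experience = [1, 2, 3, 4, 5, 6, 7, 8]
proficient = [0, 0, 0, 0, 1, 1, 1, 1]
cutoff = youden_cutoff(experience, proficient)
```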

  20. Learning curve for laparoscopic Heller myotomy and Dor fundoplication for achalasia.

    Directory of Open Access Journals (Sweden)

    Fumiaki Yano

Full Text Available Although laparoscopic Heller myotomy and Dor fundoplication (LHD) is widely performed to address achalasia, little is known about the learning curve for this technique. We assessed the learning curve for performing LHD. Of the 514 cases with LHD performed between August 1994 and March 2016, the surgical outcomes of 463 cases were evaluated after excluding 50 cases with reduced port surgery and one case with the simultaneous performance of laparoscopic distal partial gastrectomy. A receiver operating characteristic (ROC) curve analysis was used to identify the cut-off value for the number of surgical experiences necessary to become proficient with LHD, which was defined as the completion of the learning curve. We defined the completion of the learning curve when the following 3 conditions were satisfied: 1) the operation time was less than 165 minutes; 2) there was no blood loss; 3) there was no intraoperative complication. In order to establish the appropriate number of surgical experiences required to complete the learning curve, the cut-off value was evaluated by using a ROC curve (AUC 0.717, p < 0.001). Finally, we identified the cut-off value as 16 surgical cases (sensitivity 0.706, specificity 0.646). The learning curve appears to be completed after performing 16 cases.

  1. Intraoperative Transesophageal Echocardiography and Right Ventricular Failure After Left Ventricular Assist Device Implantation.

    Science.gov (United States)

    Silverton, Natalie A; Patel, Ravi; Zimmerman, Josh; Ma, Jianing; Stoddard, Greg; Selzman, Craig; Morrissey, Candice K

    2018-02-15

To determine whether intraoperative measures of right ventricular (RV) function using transesophageal echocardiography are associated with subsequent RV failure after left ventricular assist device (LVAD) implantation. Retrospective, nonrandomized, observational study. Single tertiary-level, university-affiliated hospital. The study comprised 100 patients with systolic heart failure undergoing elective LVAD implantation. Transesophageal echocardiographic images before and after cardiopulmonary bypass were analyzed to quantify RV function using tricuspid annular plane systolic excursion (TAPSE), tricuspid annular systolic velocity (S'), fractional area change (FAC), RV global longitudinal strain, and RV free wall strain. A chart review was performed to determine which patients subsequently developed RV failure (right ventricular assist device placement or prolonged inotrope requirement ≥14 days). Nineteen patients (19%) subsequently developed RV failure. Postbypass FAC was the only measure of RV function that distinguished between the RV failure and non-RV failure groups (21.2% v 26.5%; p = 0.04). The sensitivity, specificity, and area under the curve of an abnormal RV FAC for predicting RV failure after LVAD implantation were 84%, 20%, and 0.52, respectively. No other intraoperative measure of RV function was associated with subsequent RV failure. RV failure increased ventilator time, intensive care unit and hospital length of stay, and mortality. Intraoperative measures of RV function such as tricuspid annular plane systolic excursion, tricuspid annular systolic velocity, and RV strain were not associated with RV failure after LVAD implantation. Decreased postbypass FAC was significantly associated with RV failure but showed poor discrimination. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Crack resistance curves determination of tube cladding material

    Energy Technology Data Exchange (ETDEWEB)

    Bertsch, J. [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland)]. E-mail: johannes.bertsch@psi.ch; Hoffelner, W. [Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland)

    2006-06-30

Zirconium based alloys have been in use as fuel cladding material in light water reactors for many years. As claddings change their mechanical properties during service, it is essential for the assessment of mechanical integrity to provide parameters for potential rupture behaviour. Usually, fracture mechanics parameters like the fracture toughness K_IC or, for high plastic strains, the J-integral based elastic-plastic fracture toughness J_IC are employed. In claddings with a very small wall thickness the determination of toughness requires extending the J-concept beyond the limits of the standards. In the paper a new method based on the traditional J approach is presented. Crack resistance curves (J-R curves) were created for unirradiated thin-walled Zircaloy-4 and aluminium cladding tube pieces at room temperature using the single sample method. The procedure of creating sharp fatigue starter cracks with respect to optical recording was optimized. It is shown that the chosen test method is appropriate for the determination of complete J-R curves including the values J_0.2 (J at 0.2 mm crack length), J_m (J corresponding to the maximum load) and the slope of the curve.

  3. Assessment of surge arrester failure rate and application studies in Hellenic high voltage transmission lines

    Energy Technology Data Exchange (ETDEWEB)

    Christodoulou, C.A.; Fotis, G.P.; Gonos, I.F.; Stathopulos, I.A. [National Technical University of Athens, School of Electrical and Computer Engineering, High Voltage Laboratory, 9 Iroon Politechniou St., Zografou Campus, 157 80 Athens (Greece); Ekonomou, L. [A.S.PE.T.E. - School of Pedagogical and Technological Education, Department of Electrical Engineering Educators, N. Heraklion, 141 21 Athens (Greece)

    2010-02-15

The use of transmission line surge arresters to improve the lightning performance of transmission lines is becoming more common. Especially in areas with high soil resistivity and ground flash density, surge arresters constitute the most effective means of protection. In this paper a methodology for assessing the surge arrester failure rate based on the electrogeometrical model is presented. Critical currents that exceed the arresters' rated energy stress were estimated by the use of a simulation tool. The methodology is applied to operating Hellenic transmission lines of 150 kV. Several case studies are analyzed by installing surge arresters at different intervals, in relation to the region's tower footing resistance and the ground flash density. The obtained results are compared with real records of outage rate, showing the effectiveness of the surge arresters in reducing the recorded failure rate. The presented methodology can prove valuable to electric power system designers aiming at more effective lightning protection, reducing operational costs and providing continuity of service. (author)

  4. Nonlinear Dynamic of Curved Railway Tracks in Three-Dimensional Space

    Science.gov (United States)

    Liu, X.; Ngamkhanong, C.; Kaewunruen, S.

    2017-12-01

On curved tracks, high-pitch noise pollution can often be a considerable concern for rail asset owners, commuters, and people living or working along the rail corridor. Inevitably, the wheel/rail interface can cause a traveling source of sound and vibration, which spreads over a long distance of the rail network. The sound and vibration can take various forms and spectra. The undesirable sound and vibration on curves, often called 'noise', includes flanging and squealing noises. This paper focuses on the squeal noise phenomena on curved tracks located in urban environments. It highlights the effect of curve radii on lateral track dynamics. It is important to note that rail freight curve noises, especially curve squeals, can be observed almost everywhere and on every type of track structure. The most pressing noise appears on sharper curved tracks, where excessive lateral wheel/rail dynamics resonate with falling friction states, generating a tonal noise problem, the so-called 'squeal'. Many researchers have carried out measurements and simulations to understand the actual root causes of the squeal noise. Most researchers believe that wheel resonance over falling friction is the main cause, whilst a few others think that dynamic mode coupling of the wheel and rail may also cause the squeal. Therefore, this paper is devoted to a systems-thinking approach and dynamic assessment for resolving railway curve noise problems. Simulations of railway tracks with different curve radii are carried out to develop state-of-the-art understanding of lateral track dynamics, including rail dynamics, cant dynamics, gauge dynamics and overall track responses.

  5. Nutrition in Heart Failure

    Directory of Open Access Journals (Sweden)

    Reci Meseri

    2013-10-01

Full Text Available Heart failure is defined as a decreased ability of the heart due to various causes. Its prevalence is 2-3%, but increases sharply after the age of seventy. The objectives of nutrition therapy in heart failure are to prevent water retention and edema, to avoid hard-to-digest foods and to offer a balanced diet. In order to avoid fluid retention and edema, daily sodium and fluid intake must be monitored carefully. The main dilemma of heart failure patients is obesity versus cachexia. Since one of the main causes of heart failure is cardiovascular disease, in the first phase the patient may be obese; in the later phases, cachexia may show up. It has been shown that cachexia is associated with mortality. Within this period, patients should not be over-fed, and the patient should pass from the catabolic to the anabolic state slowly. If the gastrointestinal tract is functional, oral/enteral feeding must be preferred. Multivitamin and mineral supports may be beneficial, as they may replace the increased losses, increase the anti-inflammatory response and act as anti-oxidants. Large, controlled and well-designed studies must be conducted in order to evaluate the benefits of nutritional practices such as nutritional assessment, enteral feeding and nutrient support in heart failure patients.

  6. [Analgesic quality in a postoperative pain service: continuous assessment with the cumulative sum (cusum) method].

    Science.gov (United States)

    Baptista Macaroff, W M; Castroman Espasandín, P

    2007-01-01

The aim of this study was to assess the cumulative sum (cusum) method for evaluating the performance of our hospital's acute postoperative pain service. The period of analysis was 7 months. Analgesic failure was defined as a score of 3 points or more on a simple numerical scale. Acceptable failure (p0) was set at 20% of patients upon admission to the postanesthetic recovery unit and at 7% 24 hours after surgery. Unacceptable failure was set at double the p0 rate at each time (40% and 14%, respectively). The unit's patient records were used to generate a cusum graph for each evaluation. Nine hundred four records were included. The rate of failure was 31.6% upon admission to the unit and 12.1% at the 24-hour postoperative assessment. The curve rose rapidly to the value set for p0 at both evaluation times (n = 14 and n = 17, respectively), later leveled off, and began to fall after 721 and 521 cases, respectively. Our study shows the efficacy of the cusum method for monitoring a proposed quality standard. The graph also revealed periods of suboptimal performance that would not have been evident from analyzing the data en bloc. Thus the cusum method facilitates rapid detection of periods in which quality declines.
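A cusum graph of the kind described can be computed very simply: the path climbs while the observed failure rate exceeds the acceptable rate p0 and falls while it is below it. This is a minimal sketch with invented outcomes, not the authors' charting software, and it omits the decision limits a full cusum scheme would add.

```python
def cusum(outcomes, p0):
    """Clinical CUSUM path: start at 0, add (1 - p0) for each analgesic
    failure and subtract p0 for each success, so a series running exactly
    at the acceptable failure rate p0 drifts around zero."""
    path, s = [], 0.0
    for failed in outcomes:
        s += (1.0 - p0) if failed else -p0
        path.append(s)
    return path

# Illustrative series at exactly the acceptable 20% failure rate (p0 = 0.2):
# one failure per five cases keeps the curve level.
path = cusum([1, 0, 0, 0, 0] * 2, p0=0.2)
```

A sustained upward slope, as in the early part of the authors' graph, signals a failure rate above p0.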

  7. Comparing passive angle-torque curves recorded simultaneously with a load cell versus an isokinetic dynamometer during dorsiflexion stretch tolerance assessments.

    Science.gov (United States)

    Buckner, Samuel L; Jenkins, Nathaniel D M; Costa, Pablo B; Ryan, Eric D; Herda, Trent J; Cramer, Joel T

    2015-05-01

The purpose of the present study was to compare the passive angle-torque curves and the passive stiffness (PS, N·m·°⁻¹) values recorded simultaneously from a load cell versus an isokinetic dynamometer during dorsiflexion stretch tolerance assessments in vivo. Nine healthy men (mean ± SD age = 21.4 ± 1.6 years) completed stretch tolerance assessments on a custom-built apparatus where passive torque was measured simultaneously from an isokinetic dynamometer and a load cell. Passive torque values that corresponded with the last 10° of dorsiflexion, verified by surface electromyographic amplitude, were analyzed for each device (θ1, θ2, θ3, …, θ10). Passive torque values measured with the load cell were greater (p ≤ 0.05) than the dynamometer torque values for θ4 through θ10. There were more statistical differentiations among joint angles for passive torque measured by the load cell, and the load cell measured a greater (p ≤ 0.01) increase in passive torque and PS than the isokinetic dynamometer. These findings suggested that when examining the angle-torque curves from passive dorsiflexion stretch tolerance tests, a load cell placed under the distal end of the foot may be more sensitive than the torque recorded from an isokinetic dynamometer. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. Predicting water main failures using Bayesian model averaging and survival modelling approach

    International Nuclear Information System (INIS)

    Kabir, Golam; Tesfamariam, Solomon; Sadiq, Rehan

    2015-01-01

To develop an effective preventive or proactive repair and replacement action plan, water utilities often rely on water main failure prediction models. However, in predicting the failure of water mains, uncertainty is inherent regardless of the quality and quantity of data used in the model. To improve the understanding of water main failure, a Bayesian framework is developed for predicting the failure of water mains considering uncertainties. In this study, the Bayesian model averaging method (BMA) is presented to identify the influential pipe-dependent and time-dependent covariates considering model uncertainties, whereas the Bayesian Weibull Proportional Hazard Model (BWPHM) is applied to develop the survival curves and to predict the failure rates of water mains. To validate the proposed framework, it is implemented to predict the failure of cast iron (CI) and ductile iron (DI) pipes of the water distribution network of the City of Calgary, Alberta, Canada. Results indicate that the predicted 95% uncertainty bounds of the proposed BWPHMs effectively capture the observed breaks for both CI and DI water mains. Moreover, the performance of the proposed BWPHM is better compared with the Cox Proportional Hazard Model (Cox-PHM), owing to its use of a Weibull distribution for the baseline hazard function and its treatment of model uncertainties. - Highlights: • Prioritize rehabilitation and replacements (R/R) strategies of water mains. • Consider the uncertainties for the failure prediction. • Improve the prediction capability of the water mains failure models. • Identify the influential and appropriate covariates for different models. • Determine the effects of the covariates on failure
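The survival curves a Weibull proportional-hazards model produces have a simple closed form: the baseline cumulative hazard (t/scale)**shape is scaled by the exponential of the covariate linear predictor. A minimal sketch, with the parameter names and values being illustrative assumptions rather than the paper's fitted estimates:

```python
import math

def weibull_survival(t, scale, shape, linpred=0.0):
    """Weibull proportional-hazards survival curve: baseline cumulative
    hazard H0(t) = (t/scale)**shape, multiplied by exp(linear predictor)
    of the pipe-dependent and time-dependent covariates."""
    return math.exp(-((t / scale) ** shape) * math.exp(linpred))

# At t = scale with no covariate effect, survival is exp(-1) ~ 0.37;
# a positive linear predictor raises the hazard and lowers survival.
s_baseline = weibull_survival(50, scale=50, shape=1.5)
s_covariate = weibull_survival(50, scale=50, shape=1.5, linpred=0.5)
```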

  9. Approximation by planar elastic curves

    DEFF Research Database (Denmark)

    Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge

    2016-01-01

We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven optimization is then used to find the approximating elastic curve.

  10. Risk assessment of the emergency processes: Healthcare failure mode and effect analysis.

    Science.gov (United States)

    Taleghani, Yasamin Molavi; Rezaei, Fatemeh; Sheikhbardsiri, Hojat

    2016-01-01

Ensuring patient safety is the first vital step in improving the quality of care, and the emergency ward is known as a high-risk area of health care. The present study was conducted to evaluate selected high-risk processes of the emergency surgery department of the Qaem treatment-educational center in Mashhad using failure mode and effects analysis for health care. In this combined study (qualitative action research and quantitative cross-sectional), the failure modes and effects of 5 high-risk procedures of the emergency surgery department were identified and analyzed according to Healthcare Failure Mode and Effects Analysis (HFMEA). The "nursing errors in clinical management model (NECM)" was used to classify the failure modes, the "Eindhoven model" to classify the effective causes of error, and the "theory of inventive problem solving" to determine improvement strategies. Descriptive statistics (total points) were used to analyze the quantitative data, and content analysis and member agreement to analyze the qualitative data. In the 5 processes selected by voting with rating, 23 steps, 61 sub-processes and 217 potential failure modes were identified by HFMEA. 25 (11.5%) failure modes were detected as high-risk errors and transferred to the decision tree. The most and the least failure modes fell in the categories of care errors (54.7%) and knowledge and skill (9.5%), respectively. Also, 29.4% of preventive measures were in the category of human resource management strategy. "Revision and re-engineering of processes", "continuous monitoring of the works", "preparation and revision of operating procedures and policies", "developing the criteria for evaluating the performance of the personnel", "designing a suitable educational content for needs of employee", "training patients", "reducing the workload and power shortage", "improving team

  11. EVALUATION OF LIVER FAILURE STAGE IN CHILDREN

    Directory of Open Access Journals (Sweden)

    G. V. Volynets

    2013-01-01

Full Text Available Aim: to develop a system for evaluating the stage of liver failure in children based on the International Classification of Functioning, Disability and Health (ICF). Patients and methods: based on a retrospective analysis of 14 biochemical markers, characterizing the hepatic role in protein, lipid and carbohydrate metabolism, in 115 children without liver diseases, 15 children who died of liver failure and 220 patients with various hepatic disorders followed up in the SCCH of RAMS, a score system for evaluating the stage of liver failure in children was developed as an additional diagnostic tool. Each biochemical marker was assessed on a 5-point rating scale depending on the intensity of its changes. Results: the sum of points was considered a criterion of liver failure stage. According to the ICF recommendations, a decrease of points by 0–4% (54–56 points) corresponds with absence of liver failure; 5–24% (43–53 points) with mild dysfunction; 25–49% (29–42 points) with moderate; 50–95% (3–28 points) with severe; and 96–100% (0–2 points) with absolute failure. Conclusions: the score system for evaluating the stage of liver failure can be applied at any step of the diagnostics and treatment of children of any age, since the markers used are independent of age. It can be used to assess the severity of the disorder over time, to determine the prognosis, as a criterion of indications for liver transplantation, and during medico-social examination.
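The point-total-to-stage mapping quoted in the abstract is a straightforward lookup. A minimal sketch, using the cut-offs as given (54-56 none, 43-53 mild, 29-42 moderate, 3-28 severe, 0-2 absolute); the function name and stage labels are illustrative choices, not the authors' terminology.

```python
def liver_failure_stage(total_points):
    """Map the 0-56 biochemical point total to a liver failure stage
    using the cut-offs quoted in the abstract."""
    if total_points >= 54:
        return "none"
    if total_points >= 43:
        return "mild"
    if total_points >= 29:
        return "moderate"
    if total_points >= 3:
        return "severe"
    return "absolute"
```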

  12. A systematic methodology for creep master curve construction using the stepped isostress method (SSM): a numerical assessment

    Science.gov (United States)

    Miranda Guedes, Rui

    2018-02-01

    Long-term creep of viscoelastic materials is experimentally inferred through accelerating techniques based on the time-temperature superposition principle (TTSP) or on the time-stress superposition principle (TSSP). According to these principles, a given property measured for short times at a higher temperature or higher stress level remains the same as that obtained for longer times at a lower temperature or lower stress level, except that the curves are shifted parallel to the horizontal axis, matching a master curve. These procedures enable the construction of creep master curves with short-term experimental tests. The Stepped Isostress Method (SSM) is an evolution of the classical TSSP method. Higher reduction of the required number of test specimens to obtain the master curve is achieved by the SSM technique, since only one specimen is necessary. The classical approach, using creep tests, demands at least one specimen per each stress level to produce a set of creep curves upon which TSSP is applied to obtain the master curve. This work proposes an analytical method to process the SSM raw data. The method is validated using numerical simulations to reproduce the SSM tests based on two different viscoelastic models. One model represents the viscoelastic behavior of a graphite/epoxy laminate and the other represents an adhesive based on epoxy resin.
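The master-curve construction step common to TTSP/TSSP and the SSM (sliding each short-term segment along the log-time axis until it superposes on a reference) can be sketched as a grid search over shift factors. This is an illustrative toy, not the analytical SSM data-reduction method the paper proposes; the curve data and the mean-squared-error criterion are assumptions.

```python
def best_shift(ref, seg, shifts):
    """Grid-search the horizontal (log-time) shift that best superposes a
    segment onto the reference curve. Curves are lists of (log_time, value)
    pairs; mismatch is mean squared error over the overlapping region,
    using linear interpolation of the reference."""
    def interp(curve, x):
        for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        return None  # outside the reference's support

    def cost(s):
        errs = [(interp(ref, x + s) - y) ** 2
                for x, y in seg if interp(ref, x + s) is not None]
        return sum(errs) / len(errs) if errs else float("inf")

    return min(shifts, key=cost)

# Synthetic check: the segment equals the reference advanced by one decade,
# so the recovered shift factor should be 1.0.
ref = [(x, float(x)) for x in range(6)]
seg = [(x, x + 1.0) for x in range(3)]
```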

  13. Bragg Curve Spectroscopy

    International Nuclear Information System (INIS)

    Gruhn, C.R.

    1981-05-01

An alternative utilization of the gaseous ionization chamber in the detection of energetic heavy ions is presented, called Bragg Curve Spectroscopy (BCS). Conceptually, BCS uses the maximum information available from the Bragg curve of the stopping heavy ion (HI) to identify the particle and measure its energy. A detector has been designed that measures the Bragg curve with high precision. From the Bragg curve, the range is determined from the length of the track, the total energy from the integral of the specific ionization over the track, dE/dx from the specific ionization at the beginning of the track, and the Bragg peak from the maximum of the specific ionization of the HI. This last signal measures the atomic number, Z, of the HI unambiguously

  14. Damage and protection cost curves for coastal floods within the 600 largest European cities

    Science.gov (United States)

    Prahl, Boris F.; Boettle, Markus; Costa, Luís; Kropp, Jürgen P.; Rybski, Diego

    2018-01-01

    The economic assessment of the impacts of storm surges and sea-level rise in coastal cities requires high-level information on the damage and protection costs associated with varying flood heights. We provide a systematically and consistently calculated dataset of macroscale damage and protection cost curves for the 600 largest European coastal cities opening the perspective for a wide range of applications. Offering the first comprehensive dataset to include the costs of dike protection, we provide the underpinning information to run comparative assessments of costs and benefits of coastal adaptation. Aggregate cost curves for coastal flooding at the city-level are commonly regarded as by-products of impact assessments and are generally not published as a standalone dataset. Hence, our work also aims at initiating a more critical discussion on the availability and derivation of cost curves. PMID:29557944

  16. Assessment of diagnostic value of tumor markers for colorectal neoplasm by logistic regression and ROC curve

    International Nuclear Information System (INIS)

    Ping, G.

    2007-01-01

    Full text: Objective: To assess the diagnostic value of CEA, CA199 and CA50 for colorectal neoplasm by logistic regression and ROC curve. Methods: The subjects included 75 patients with colorectal cancer, 35 patients with benign intestinal disease and 49 healthy controls. CEA, CA199 and CA50 were measured by CLIA, ECLIA and IRMA, respectively. The areas under the curve (AUC) of CEA, CA199 and CA50 and the logistic regression results were compared. Results: In the cancer-benign group, the AUC of CA50 was larger than that of CA199. Compared with the AUC of the combination of CEA, CA199 and CA50 (0.604), the AUC of the combination of CEA and CA50 (0.875) was larger; it was also larger than the AUC of CEA, CA199 or CA50 alone. In the cancer-health group, the AUC of the combination of CEA, CA199 and CA50 was larger than the AUC of CEA, CA199 or CA50 alone. In both the cancer-benign and cancer-health groups, the AUC of CEA was larger than that of CA199 or CA50. Conclusion: CEA is useful in the diagnosis of colorectal cancer. In differential diagnosis, the combination of CEA and CA50 gives more information, while the combination of all three tumor markers does not perform well. Furthermore, as a statistical method, logistic regression can improve diagnostic sensitivity and specificity. (author)
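    The AUC comparisons above rely on a quantity that can be computed directly from ranks: the AUC equals the probability that a randomly chosen case scores higher than a randomly chosen control (the Mann-Whitney statistic). A minimal sketch with invented marker values, not the study's data:

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney statistic: the fraction of
    (case, control) pairs ranked correctly, counting ties as 0.5."""
    cases = [s for y, s in zip(labels, scores) if y == 1]
    controls = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if c > k else 0.5 if c == k else 0.0
               for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

# Hypothetical CEA-like values: 1 = cancer, 0 = control
labels = [1, 1, 1, 0, 0, 0]
scores = [9.1, 7.4, 2.0, 3.2, 1.1, 0.8]
print(auc(labels, scores))  # 8 / 9, about 0.889
```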

  17. Background does not significantly affect power-exponential fitting of gastric emptying curves

    International Nuclear Information System (INIS)

    Jonderko, K.

    1987-01-01

    Using a procedure enabling assessment of background radiation, the course of changes in background activity during gastric emptying measurements was investigated. Attention was focused on changes in the shape of power-exponential fitted gastric emptying curves after correction for background. The observed pattern of background counts made it possible to explain the shifts in the parameters characterizing power-exponential curves that accompany background correction. It was concluded that background has a negligible effect on the power-exponential fitting of gastric emptying curves. (author)
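    For context, the power-exponential model commonly fitted to gastric emptying data expresses fractional meal retention as R(t) = 1 − (1 − e^(−κt))^β. The sketch below evaluates it with illustrative parameters (κ and β are invented, not taken from this study):

```python
import math

def retention(t, kappa, beta):
    """Power-exponential gastric emptying model: fraction of the
    meal remaining at time t. kappa sets the emptying rate,
    beta the initial lag/shape of the curve."""
    return 1.0 - (1.0 - math.exp(-kappa * t)) ** beta

# Illustrative parameters only (t in minutes)
kappa, beta = 0.015, 1.8
for t in (0, 30, 60, 120):
    print(t, retention(t, kappa, beta))
```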

  18. Salivary cortisol day curves in assessing glucocorticoid replacement therapy in Addison's disease.

    Science.gov (United States)

    Smans, Lisanne; Lentjes, Eef; Hermus, Ad; Zelissen, Pierre

    2013-01-01

    Patients with Addison's disease require lifelong treatment with glucocorticoids. At present, no glucocorticoid replacement therapy (GRT) can exactly mimic normal physiology. As a consequence, under- and especially overtreatment can occur. Suboptimal GRT may lead to various side effects. The aim of this study was to investigate the use of salivary cortisol day curves (SCDC) in the individual adjustment of GRT in order to approach normal cortisol levels as closely as possible, reduce over- and underreplacement, and study the short-term effects on quality of life (QoL). Twenty patients with Addison's disease were included in this prospective study. A SCDC was obtained and compared to normal controls; general and disease-specific QoL questionnaires were completed. Based on SCDC assessment of over- and undertreatment (calculated as duration (h) × magnitude (nmol/L)) at different time points, glucocorticoid dose and regimen were adjusted. After 4 weeks, SCDC and QoL assessment were repeated and the effect of adjusting GRT was analysed. At baseline, underreplacement was present in 3 and overreplacement in 18 patients; total calculated overreplacement was 32.8 h·nmol/L. Overreplacement decreased significantly to 13.3 h·nmol/L (p = 0.005) after adjustment of GRT. Overreplacement was found particularly in the afternoon and evening. After reducing overreplacement in the evening, complaints about sleep disturbances significantly decreased. Individual adjustment of GRT based on SCDC to approach normal cortisol concentrations during the day can reduce overreplacement, especially in the evening. This can lead to a reduction of sleep disturbances and fatigue in patients with Addison's disease. A SCDC is a simple and patient-friendly tool for adjusting GRT and can be useful in the follow-up of patients with Addison's disease.
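    The "duration (h) × magnitude (nmol/L)" measure of overreplacement can be read as the area of the salivary cortisol day curve above a time-matched upper reference limit. A sketch under that interpretation, with invented sampling times and reference values:

```python
def excess_area(times_h, cortisol, upper_ref):
    """Trapezoidal area (h·nmol/L) of the cortisol day curve above
    a time-matched upper reference limit: a duration-times-magnitude
    measure of overreplacement."""
    excess = [max(0.0, c - u) for c, u in zip(cortisol, upper_ref)]
    area = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        area += 0.5 * (excess[i] + excess[i - 1]) * dt
    return area

# Hypothetical day curve (nmol/L) sampled from 08:00 to 22:00
times = [8, 12, 16, 20, 22]
patient = [18.0, 14.0, 12.0, 9.0, 7.0]
upper = [15.0, 10.0, 8.0, 5.0, 3.0]
print(excess_area(times, patient, upper))  # 54.0 h·nmol/L
```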

  19. Learning Curve? Which One?

    Directory of Open Access Journals (Sweden)

    Paulo Prochno

    2004-07-01

    Full Text Available Learning curves have been studied for a long time. These studies provided strong support for the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999 for a comprehensive review of learning curve studies). But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000), but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is then on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several ongoing individual learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.
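    The classic quantitative form behind such studies is the log-linear learning curve, in which unit cost falls by a fixed fraction with each doubling of cumulative output. A minimal sketch (the 80% progress ratio below is illustrative, not from the paper):

```python
import math

def unit_cost(n, first_unit_cost, b):
    """Log-linear learning curve: the cost of the n-th unit falls
    as cumulative output grows, at a decreasing rate."""
    return first_unit_cost * n ** (-b)

# An 80% progress ratio: costs fall 20% with every doubling of output
b = -math.log(0.8) / math.log(2)   # learning exponent, about 0.322
c1 = 100.0
print(unit_cost(1, c1, b), unit_cost(2, c1, b), unit_cost(4, c1, b))
```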

  20. Association between Platelet Counts before and during Pharmacological Therapy for Patent Ductus Arteriosus and Treatment Failure in Preterm Infants.

    Science.gov (United States)

    Sallmon, Hannes; Weber, Sven C; Dirks, Juliane; Schiffer, Tamara; Klippstein, Tamara; Stein, Anja; Felderhoff-Müser, Ursula; Metze, Boris; Hansmann, Georg; Bührer, Christoph; Cremer, Malte; Koehne, Petra

    2018-01-01

    The role of platelets in mediating closure of the ductus arteriosus in human preterm infants is controversial. In particular, the effect of low platelet counts on pharmacological treatment failure is still unclear. In this retrospective study of 471 preterm infants [0.6). However, ROC curve analysis did not reveal a specific platelet cutoff value that could predict PDA treatment failure. Multivariate logistic regression analysis showed that lower platelet counts, a lower BW, and preeclampsia were independently associated with COXI treatment failure. We provide further evidence for an association between low platelet counts during pharmacological therapy for symptomatic PDA and treatment failure, while platelet counts before initiation of therapy did not affect treatment outcome.

  1. Contractibility of curves

    Directory of Open Access Journals (Sweden)

    Janusz Charatonik

    1991-11-01

    Full Text Available Results concerning contractibility of curves (equivalently: of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.

  2. Roc curves for continuous data

    CERN Document Server

    Krzanowski, Wojtek J

    2009-01-01

    Since ROC curves have become ubiquitous in many application areas, the various advances have been scattered across disparate articles and texts. ROC Curves for Continuous Data is the first book solely devoted to the subject, bringing together all the relevant material to provide a clear understanding of how to analyze ROC curves. Covering the fundamental theory of ROC curves, the book first discusses the relationship between the ROC curve and numerous performance measures and then extends the theory into practice by describing how ROC curves are estimated. Further building on the theory, the authors prese...

  3. Failure assessment techniques to ensure shipping container integrity

    International Nuclear Information System (INIS)

    McConnell, P.

    1986-02-01

    This report discusses several methodologies which may be used to ensure the structural integrity of containment systems to be used for the transport and storage of high-level radioactive substances. For economic reasons, shipping containers constructed of ferritic materials are being considered for manufacture by vendors in the US and Europe. Ferritic steels show an inherent transition from a ductile, high-energy failure mode to a brittle, low-energy fracture mode with decreasing temperature. Therefore, formal consideration of means by which to avoid unstable brittle fracture is necessary prior to the licensing of ferritic casks. It is suggested that failure of a shipping container wall be defined as occurring when a flaw extends through the outer wall of the containment system. Crack initiation which may lead to unstable brittle crack growth should therefore be prevented. It is suggested that a fundamental linear elastic fracture mechanics (LEFM) approach be adopted on a case-by-case basis, applied perhaps by means of appropriate modifications to ASME Section III or Section XI. An LEFM analysis requires information concerning service temperatures, loading rates, flaw sizes, and applied stresses. Tentative judgments regarding these parameters for typical shipping containers have been made

  4. Synergistic failure of BWR internals

    International Nuclear Information System (INIS)

    Ware, A. G.; Chang, T.Y.

    1999-01-01

    Boiling Water Reactor (BWR) core shrouds and other reactor internals important to safety are experiencing intergranular stress corrosion cracking (IGSCC). The United States Nuclear Regulatory Commission has followed the problem, and as part of its investigations, contracted with the Idaho National Engineering and Environmental Laboratory to conduct a risk assessment. The overall project objective is to assess the potential consequences and risks associated with the failure of IGSCC-susceptible BWR vessel internals, with specific consideration given to potential cascading and common mode effects. An initial phase has been completed in which background material was gathered and evaluated, and potential accident sequences were identified. A second phase is underway to perform a simplified, quantitative probabilistic risk assessment on a representative high-power BWR/4. Results of the initial study conducted on the jet pumps show that any cascading failures would not result in a significant increase in the core damage frequency. The methodology is currently being extended to other major reactor internals components

  5. Hot Spot Temperature and Grey Target Theory-Based Dynamic Modelling for Reliability Assessment of Transformer Oil-Paper Insulation Systems: A Practical Case Study

    Directory of Open Access Journals (Sweden)

    Lefeng Cheng

    2018-01-01

    Full Text Available This paper develops a novel dynamic correction method for the reliability assessment of large oil-immersed power transformers. First, with the transformer oil-paper insulation system (TOPIS as the target of evaluation and the winding hot spot temperature (HST as the core point, an HST-based static ageing failure model is built according to the Weibull distribution and Arrhenius reaction law, in order to describe the transformer ageing process and calculate the winding HST for obtaining the failure rate and life expectancy of TOPIS. A grey target theory based dynamic correction model is then developed, combined with the data of Dissolved Gas Analysis (DGA in power transformer oil, in order to dynamically modify the life expectancy calculated by the built static model, such that the corresponding relationship between the state grade and life expectancy correction coefficient of TOPIS can be built. Furthermore, the life expectancy loss recovery factor is introduced to correct the life expectancy of TOPIS again. Lastly, a practical case study of an operating transformer has been undertaken, in which the failure rate curve after introducing dynamic corrections can be obtained for the reliability assessment of this transformer. The curve shows a better ability of tracking the actual reliability level of transformer, thus verifying the validity of the proposed method and providing a new way for transformer reliability assessment. This contribution presents a novel model for the reliability assessment of TOPIS, in which the DGA data, as a source of information for the dynamic correction, is processed based on the grey target theory, thus the internal faults of power transformer can be diagnosed accurately as well as its life expectancy updated in time, ensuring that the dynamic assessment values can commendably track and reflect the actual operation state of the power transformers.
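    The static ageing model described above combines a Weibull failure distribution with Arrhenius temperature dependence driven by the winding hot spot temperature. A generic sketch of those two ingredients, with all parameter values invented rather than taken from the paper:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_aaf(t_hot_k, t_ref_k, ea_ev=1.0):
    """Arrhenius ageing acceleration factor at hot spot temperature
    t_hot_k (K) relative to a reference temperature t_ref_k (K);
    ea_ev is an assumed activation energy in eV."""
    return math.exp((ea_ev / K_B) * (1.0 / t_ref_k - 1.0 / t_hot_k))

def weibull_hazard(t_years, eta, beta):
    """Weibull failure rate; beta > 1 gives wear-out behaviour."""
    return (beta / eta) * (t_years / eta) ** (beta - 1.0)

# Invented parameters: nominal 40-year scale life at a 98 °C hot spot
aaf = arrhenius_aaf(t_hot_k=110 + 273.15, t_ref_k=98 + 273.15)
eta_effective = 40.0 / aaf  # hotter winding shortens the scale life
print(aaf, weibull_hazard(20.0, eta_effective, beta=3.0))
```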

  6. Impaired Curve Negotiation in Drivers with Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Ergun Y Uç

    2009-03-01

    Full Text Available OBJECTIVE: To assess the ability to negotiate curves in drivers with Parkinson’s disease (PD. METHODS: Licensed active drivers with mild-moderate PD (n= 76; 65 male, 11 female and elderly controls (n= 51; 26 male, 25 female drove on a simulated 2-lane rural highway in a high-fidelity simulator scenario in which the drivers had to negotiate 6 curves during a 37-mile drive. The participants underwent motor, cognitive, and visual testing before the simulator drive. RESULTS: Compared to controls, the drivers with PD had less vehicle control and driving safety, both on curves and straight baseline segments, as measured by significantly higher standard deviation of lateral position (SDLP and lane violation counts. The PD group also scored lower on tests of motor, cognitive, and visual abilities. In the PD group, lower scores on tests of motion perception, visuospatial ability, executive function, postural instability, and general cognition, as well as a lower level of independence in daily activities predicted low vehicle control on curves. CONCLUSION: Drivers with PD had less vehicle control and driving safety on curves compared to controls, which was associated primarily with impairments in visual perception and cognition, rather than motor function

  7. Lower head failure analysis

    International Nuclear Information System (INIS)

    Rempe, J.L.; Thinnes, G.L.; Allison, C.M.; Cronenberg, A.W.

    1991-01-01

    The US Nuclear Regulatory Commission is sponsoring a lower vessel head research program to investigate plausible modes of reactor vessel failure in order to determine (a) which modes have the greatest likelihood of occurrence during a severe accident and (b) the range of core debris and accident conditions that lead to these failures. This paper presents the methodology and preliminary results of an investigation of reactor designs and thermodynamic conditions using analytic closed-form approximations to assess the important governing parameters in non-dimensional form. Preliminary results illustrate the importance of vessel and tube geometrical parameters, material properties, and external boundary conditions on predicting vessel failure. Thermal analyses indicate that steady-state temperature distributions will occur in the vessel within several hours, although the exact time is dependent upon vessel thickness. In-vessel tube failure is governed by the tube-to-debris mass ratio within the lower head, where most penetrations are predicted to fail if surrounded by molten debris. Melt penetration distance is dependent upon the effective flow diameter of the tube. Molten debris is predicted to penetrate through tubes with a larger effective flow diameter, such as a boiling water reactor (BWR) drain nozzle. Ex-vessel tube failure for depressurized reactor vessels is predicted to be more likely for a BWR drain nozzle penetration because of its larger effective diameter. At high pressures (between ∼0.1 MPa and ∼12 MPa) ex-vessel tube rupture becomes a dominant failure mechanism, although tube ejection dominates control rod guide tube failure at lower temperatures. However, tube ejection and tube rupture predictions are sensitive to the vessel and tube radial gap size and material coefficients of thermal expansion

  8. Increased crop failure due to climate change: assessing adaptation options using models and socio-economic data for wheat in China

    Energy Technology Data Exchange (ETDEWEB)

    Challinor, Andrew J [Institute for Climate and Atmospheric Science, School of Earth and Environment, University of Leeds, Leeds LS2 9JT (United Kingdom); Simelton, Elisabeth S; Fraser, Evan D G [Sustainability Research Institute, School of Earth and Environment, University of Leeds, Leeds LS2 9JT (United Kingdom); Hemming, Debbie; Collins, Mathew, E-mail: a.j.challinor@leeds.ac.uk [Met Office Hadley Centre, FitzRoy Road, Exeter EX1 3PB (United Kingdom)

    2010-07-15

    Tools for projecting crop productivity under a range of conditions, and assessing adaptation options, are an important part of the endeavour to prioritize investment in adaptation. We present ensemble projections of crop productivity that account for biophysical processes, inherent uncertainty and adaptation, using spring wheat in Northeast China as a case study. A parallel 'vulnerability index' approach uses quantitative socio-economic data to account for autonomous farmer adaptation. The simulations show crop failure rates increasing under climate change, due to increasing extremes of both heat and water stress. Crop failure rates increase with mean temperature, with increases in maximum failure rates being greater than those in median failure rates. The results suggest that significant adaptation is possible through either socio-economic measures such as greater investment, or biophysical measures such as drought or heat tolerance in crops. The results also show that adaptation becomes increasingly necessitated as mean temperature and the associated number of extremes rise. The results, and the limitations of this study, also suggest directions for research for linking climate and crop models, socio-economic analyses and crop variety trial data in order to prioritize options such as capacity building, plant breeding and biotechnology.

  9. Atlas of stress-strain curves

    CERN Document Server

    2002-01-01

    The Atlas of Stress-Strain Curves, Second Edition is substantially bigger in page dimensions, number of pages, and total number of curves than the previous edition. It contains over 1,400 curves, almost three times as many as in the 1987 edition. The curves are normalized in appearance to aid making comparisons among materials. All diagrams include metric (SI) units, and many also include U.S. customary units. All curves are captioned in a consistent format with valuable information including (as available) standard designation, the primary source of the curve, mechanical properties (including hardening exponent and strength coefficient), condition of sample, strain rate, test temperature, and alloy composition. Curve types include monotonic and cyclic stress-strain, isochronous stress-strain, and tangent modulus. Curves are logically arranged and indexed for fast retrieval of information. The book also includes an introduction that provides background information on methods of stress-strain determination, on...

  10. An application of failure mode and effect analysis (FMEA) to assess risks in the petrochemical industry in Iran

    Directory of Open Access Journals (Sweden)

    Mehdi Kangavari

    2015-06-01

    Full Text Available Petrochemical industries have a high rate of accidents. Failure mode and effect analysis (FMEA) is a systematic method and thus is capable of analyzing the risks of systems from the concept phase to system disposal, detecting failures at the design stage, and determining the control measures and corrective actions needed to reduce their impacts. The objectives of this research were to perform FMEA to identify risks in an Iranian petrochemical industry and to determine the decrease in the risk priority number (RPN) after implementation of intervention programs. This interventional study was performed at one petrochemical plant in Tehran, Iran in 2014. Relevant information about job categories and the plant process was gathered using brainstorming techniques, a fishbone diagram, and group decision making. The data were collected through interviews, observation, and document investigation and were recorded in FMEA worksheets. The necessary corrective measures were performed on the basis of the results of the initial FMEA. Forty-eight failures were identified in the welding unit by applying FMEA to assess risks. Welding processes, especially working at height, got the highest RPN. The RPN obtained for working at height before the corrective actions was 120, and the score was reduced to 96 after corrective measures were performed. The calculated RPN for all processes was significantly reduced (p ≤ 0.001) by implementing the corrective actions. RPN scores in all studied processes decreased effectively after corrective actions were performed. The FMEA method is a useful tool for identifying risk intervention priorities and effectiveness in the studied petrochemical industry.
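    The RPN used above is, in standard FMEA practice, the product of severity, occurrence, and detection ratings, each typically on a 1-10 scale. A sketch ranking hypothetical welding-unit failure modes (the ratings are invented for illustration):

```python
def rpn(severity, occurrence, detection):
    """FMEA risk priority number: each rating is typically 1-10."""
    return severity * occurrence * detection

# Hypothetical failure modes from a welding unit
modes = {
    "fall from height":   (6, 5, 4),
    "arc eye":            (4, 6, 3),
    "weld fume exposure": (5, 4, 3),
}
# Rank failure modes by RPN to prioritize corrective actions
ranked = sorted(modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, ratings in ranked:
    print(name, rpn(*ratings))
```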

  11. Wright meets Markowitz: How standard portfolio theory changes when assets are technologies following experience curves

    OpenAIRE

    Way, Rupert; Lafond, François; Farmer, J. Doyne; Lillo, Fabrizio; Panchenko, Valentyn

    2017-01-01

    This paper considers how to optimally allocate investments in a portfolio of competing technologies. We introduce a simple model representing the underlying trade-off - between investing enough effort in any one project to spur rapid progress, and diversifying effort over many projects simultaneously to hedge against failure. We use stochastic experience curves to model the idea that investing more in a technology reduces its unit costs, and we use a mean-variance objective function to unders...

  12. Relationship between Activities of Daily Living and Readmission within 90 Days in Hospitalized Elderly Patients with Heart Failure

    Directory of Open Access Journals (Sweden)

    Masahiro Kitamura

    2017-01-01

    Full Text Available Aims. To examine the relationship between activities of daily living (ADL) and readmission within 90 days, and to assess the cutoff value of ADL for predicting readmission in hospitalized elderly patients with heart failure (HF). Methods. This cohort study comprised 589 consecutive patients with HF aged ≥65 years who underwent cardiac rehabilitation from May 2012 to May 2016 and were discharged home. We investigated patients' characteristics, basic attributes, and ADL (motor and cognitive Functional Independence Measure [FIM]). We analyzed the data using the unpaired t-test, χ2 test, Cox proportional hazard model, receiver operating characteristic (ROC) curve, and Kaplan-Meier method. Results. Of 589 patients, 113 met the criteria, and they were divided into the nonreadmission (n=90) and readmission (n=23) groups. Age, body mass index, New York Heart Association class, hemoglobin level, and motor FIM score were significantly different between the two groups (p<0.05). The body mass index (hazard ratio [HR]: 0.87; p<0.05) and motor FIM score (HR: 0.94; p<0.01) remained statistically significant. The cutoff value for the motor FIM score determined by ROC curve analysis was 74.5 points (area under the curve = 0.78; p<0.001). Conclusion. The motor FIM score in elderly patients with HF was an independent predictor of rehospitalization within 90 days.
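    A cutoff such as the 74.5-point motor FIM threshold is commonly chosen from the ROC curve by maximizing Youden's index (sensitivity + specificity − 1); the abstract does not state the exact criterion, so the sketch below is a generic illustration with invented scores:

```python
def youden_cutoff(scores, outcome):
    """Pick the score threshold maximizing Youden's J
    (sensitivity + specificity - 1), treating LOW scores as
    predicting the outcome (1 = readmitted)."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, outcome) if s <= cut and y == 1)
        fn = sum(1 for s, y in zip(scores, outcome) if s > cut and y == 1)
        tn = sum(1 for s, y in zip(scores, outcome) if s > cut and y == 0)
        fp = sum(1 for s, y in zip(scores, outcome) if s <= cut and y == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Invented motor-FIM-like scores; 1 = readmitted within 90 days
scores     = [60, 65, 70, 72, 80, 85, 88, 90]
readmitted = [ 1,  1,  1,  0,  0,  0,  0,  0]
print(youden_cutoff(scores, readmitted))  # perfectly separated at 70
```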

  13. Hormonal and cardiovascular reflex assessment in a female patient with pure autonomic failure

    Directory of Open Access Journals (Sweden)

    Heno Ferreira Lopes

    2000-09-01

    Full Text Available We report the case of a 72-year-old female with pure autonomic failure, a rare entity, whose diagnosis of autonomic dysfunction was established with a series of complementary tests. For approximately 2 years, the patient had been experiencing dizziness and a tendency to fall, significant weight loss, generalized weakness, dysphagia, intestinal constipation, blurred vision, dry mouth, and changes in her voice. She underwent clinical assessment and laboratory tests (biochemical tests, chest X-ray, digestive endoscopy, colonoscopy, chest computed tomography, abdomen and pelvis computed tomography, abdominal ultrasound, and ambulatory blood pressure monitoring). Measurements of catecholamines and plasma renin activity were performed at rest and after physical exercise. Finally, the patient underwent physiological and pharmacological autonomic tests that better characterized the dysautonomia.

  14. Failures on stainless steel components

    International Nuclear Information System (INIS)

    Haenninen, H.

    1994-01-01

    Economic losses due to failure mainly by corrosion in process and nuclear industries are considered. In these industries the characteristics of different forms of corrosion and their economic effects are fairly well known and, especially, in nuclear industry the assessment of corrosion related costs has been comprehensive. In both industries the economic losses resulting from environmentally enhanced cracking of stainless steel components and the accompanying failures and outages have been considerable, owing as much to the frequency as the unpredictability of such occurrences. (orig.)

  15. An assessment of household electricity load curves and corresponding CO2 marginal abatement cost curves for Gujarat state, India

    International Nuclear Information System (INIS)

    Garg, Amit; Shukla, P.R.; Maheshwari, Jyoti; Upadhyay, Jigeesha

    2014-01-01

    Gujarat, a large industrialized state in India, consumed 67 TWh of electricity in 2009–10, besides experiencing a 4.5% demand–supply shortfall. The residential sector accounted for 15% of the total electricity consumption. We conducted a load research survey across 21 cities and towns of the state to estimate residential electricity load curves, the share of appliances by type, and usage patterns for all types of household appliances at the utility, geographic, appliance, income and end-use levels. The results indicate that a large scope exists for penetration of energy-efficient devices in the residential sector. Marginal Abatement Cost (MAC) curves for electricity and CO2 were generated to analyze the relative attractiveness of energy-efficient appliance options. Results indicate that up to 7.9 TWh of electricity can be saved per year, with 6.7 Mt-CO2 emissions mitigation, at negative or very low CO2 prices of US$ 10/t-CO2. Although such options exist, their penetration is not realized due to myriad barriers, whether financial, institutional or awareness-related, and they therefore cannot be taken as baseline options for CO2 emission mitigation regimes. - Highlights: • Residential sector provides focused mitigation opportunities. • Energy-efficient space cooling is the main technology transition required. • Almost 26% of residential load could be reduced by DSM measures. • Myriad barriers limit penetration of negative marginal cost efficient options
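    A marginal abatement cost curve of the kind generated above orders measures by cost per tonne of CO2 avoided and accumulates their abatement potential. A sketch with invented appliance measures, not the paper's data:

```python
def mac_curve(measures):
    """Sort measures by cost per tonne of CO2 avoided and return
    (name, cumulative abatement in Mt-CO2, cost in US$/t-CO2) steps,
    i.e. the staircase of a marginal abatement cost curve."""
    steps, cumulative = [], 0.0
    for name, abatement_mt, cost_per_t in sorted(measures, key=lambda m: m[2]):
        cumulative += abatement_mt
        steps.append((name, cumulative, cost_per_t))
    return steps

# Invented measures: (name, Mt-CO2/yr avoided, US$/t-CO2)
measures = [
    ("efficient ceiling fans",  1.2, -25.0),  # negative cost = net saving
    ("LED lighting",            2.0, -15.0),
    ("5-star air conditioners", 3.5,   8.0),
]
for step in mac_curve(measures):
    print(step)
```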

  16. Tornado-Shaped Curves

    Science.gov (United States)

    Martínez, Sol Sáez; de la Rosa, Félix Martínez; Rojas, Sergio

    2017-01-01

    In Advanced Calculus, our students wonder if it is possible to graphically represent a tornado by means of a three-dimensional curve. In this paper, we show it is possible by providing the parametric equations of such tornado-shaped curves.

  17. Quality and Health Literacy Demand of Online Heart Failure Information.

    Science.gov (United States)

    Cajita, Maan Isabella; Rodney, Tamar; Xu, Jingzhi; Hladek, Melissa; Han, Hae-Ra

    The ubiquity of the Internet is changing the way people obtain their health information. Although there is an abundance of heart failure information online, the quality and health literacy demand of this information are still unknown. The purpose of this study was to evaluate the quality and health literacy demand (readability, understandability, and actionability) of heart failure information found online. Google, Yahoo, Bing, Ask.com, and DuckDuckGo were searched for relevant heart failure Web sites. Two independent raters then assessed the quality and health literacy demand of the included Web sites. The quality of the heart failure information was assessed using the DISCERN instrument. Readability was assessed using 7 established readability tests. Finally, understandability and actionability were assessed using the Patient Education Materials Assessment Tool for Print Materials. A total of 46 Web sites were included in the analysis. The overall mean quality rating was 46.0 ± 8.9, and the mean readability score was at a 12.6 grade reading level. The overall mean understandability score was 56.3% ± 16.2%. Finally, the overall mean actionability score was 34.7% ± 28.7%. The heart failure information found online was of fair quality but demanded a relatively high health literacy level. Web content authors need to consider not just the quality but also the health literacy demand of the information on their Web sites. This is especially important considering that low health literacy is likely prevalent among the usual audience.

  18. A versatile curve-fit model for linear to deeply concave rank abundance curves

    NARCIS (Netherlands)

    Neuteboom, J.H.; Struik, P.C.

    2005-01-01

    A new, flexible curve-fit model for linear to concave rank abundance curves was conceptualized and validated using observational data. The model links the geometric-series model and the log-series model and can also fit deeply concave rank abundance curves. The model is based, in an unconventional way, ...

  19. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    Science.gov (United States)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    In traditional reliability evaluation of machine center components, the component reliability model exhibits deviation and the evaluation result is low because failure propagation is overlooked. To rectify these problems, a new reliability evaluation method based on cascading failure analysis and failure influence degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influence degrees of the system components are assessed using the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, and the results show the following: 1) The reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) The difference between the comprehensive and inherent reliability of a system component presents a positive correlation with its failure influence degree, which provides a theoretical basis for reliability allocation of the machine center system.
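    The PageRank step described above can be sketched with a plain power iteration over a directed failure propagation graph; the machine center graph below is hypothetical, not the paper's model:

```python
def pagerank(adjacency, damping=0.85, iters=100):
    """Power-iteration PageRank on a directed graph given as
    {node: [nodes it propagates failure to]}; a node's rank serves
    here as a simple proxy for its failure influence degree."""
    nodes = list(adjacency)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in adjacency.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:  # dangling node: spread its rank evenly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

# Hypothetical cascading-failure graph: spindle failures load the
# tool changer and feed axis, which both stress the controller.
graph = {"spindle": ["tool_changer", "feed_axis"],
         "tool_changer": ["controller"],
         "feed_axis": ["controller"],
         "controller": []}
print(pagerank(graph))
```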

  20. Master curve approach to monitor fracture toughness of reactor pressure vessels in nuclear power plants

    International Nuclear Information System (INIS)

    2009-10-01

    A series of coordinated research projects (CRPs) have been sponsored by the IAEA, starting in the early 1970s, focused on neutron radiation effects on reactor pressure vessel (RPV) steels. The purpose of the CRPs was to develop correlative comparisons to test the uniformity of results through coordinated international research studies and data sharing. The overall scope of the eighth CRP (CRP-8), Master Curve Approach to Monitor Fracture Toughness of Reactor Pressure Vessels in Nuclear Power Plants, has evolved from previous CRPs which have focused on fracture toughness related issues. The ultimate use of embrittlement understanding is application to assure structural integrity of the RPV under current and future operation and accident conditions. The Master Curve approach for assessing the fracture toughness of a sampled irradiated material has been gaining acceptance throughout the world. This direct measurement of fracture toughness is technically superior to the correlative and indirect methods used in the past to assess irradiated RPV integrity. Several elements have been identified as focal points for Master Curve use: (i) limits of applicability of the Master Curve at the upper range of the transition region, for loading rates from quasi-static to dynamic/impact; (ii) effects of non-homogeneous material or changes due to environmental conditions on the Master Curve, and how heterogeneity can be integrated into a more inclusive Master Curve methodology; (iii) how fracture mode differences and changes affect the Master Curve shape. The collected data in this report represent mostly results from non-irradiated testing, although some results from test reactor irradiations and plant surveillance programmes have been included as available. The results presented here should allow utility engineers and scientists to directly measure fracture toughness using small surveillance-size specimens and apply the results using the Master Curve approach
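    The Master Curve itself has a standard analytic form (ASTM E1921): the median fracture toughness of a 1T specimen is K_Jc(med) = 30 + 70·exp[0.019(T − T0)] in MPa·√m, where T0 is the material-specific reference temperature at which the median equals 100 MPa·√m. A small sketch with an invented T0:

```python
import math

def kjc_median(temp_c, t0_c):
    """ASTM E1921 Master Curve: median fracture toughness
    (MPa*sqrt(m)) of a 1T specimen at temperature temp_c (deg C),
    given the reference temperature t0_c."""
    return 30.0 + 70.0 * math.exp(0.019 * (temp_c - t0_c))

# Hypothetical RPV steel with T0 = -40 °C (value invented)
t0 = -40.0
for t in (-100.0, -40.0, 0.0):
    print(t, kjc_median(t, t0))
```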

  1. Failure Propagation Modeling and Analysis via System Interfaces

    Directory of Open Access Journals (Sweden)

    Lin Zhao

    2016-01-01

    Full Text Available Safety-critical systems must be shown to be acceptably safe to deploy and use in their operational environment. One of the key concerns in developing safety-critical systems is understanding how the system behaves in the presence of failures, regardless of whether a failure is triggered by the external environment or caused by internal errors. Safety assessment at the early stages of system development involves analysis of potential failures and their consequences. Increasingly, for complex systems, model-based safety assessment is becoming more widely used. In this paper we propose an approach for safety analysis based on system interface models. By extending interaction models at the system interface level with failure modes, as well as with relevant portions of the physical system to be controlled, automated support can be provided for much of the failure analysis. We focus on fault modeling and on how to compute minimal cut sets. In particular, we explore a state-space reconstruction strategy and a bounded searching technique to reduce the number of states that need to be analyzed, which markedly improves the efficiency of the cut-set search algorithm.
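    The minimal-cut-set computation mentioned above can be illustrated with a brute-force enumeration over component failure combinations; the paper's bounded-search optimizations are not reproduced here, and the two-channel system in the example is hypothetical.

```python
from itertools import combinations

def minimal_cut_sets(components, system_fails):
    """Enumerate minimal cut sets by brute force.

    A cut set is a set of failed components that fails the system; it is
    minimal if no proper subset is also a cut set. Enumerating in order of
    increasing size and skipping supersets of known cuts guarantees minimality.
    """
    cuts = []
    for r in range(1, len(components) + 1):
        for combo in combinations(components, r):
            failed = set(combo)
            if system_fails(failed) and not any(c <= failed for c in cuts):
                cuts.append(failed)
    return cuts

# Hypothetical system: fails if the power bus fails, or if both
# redundant sensors fail.
fails = lambda f: 'bus' in f or {'s1', 's2'} <= f
print(minimal_cut_sets(['bus', 's1', 's2'], fails))  # -> [{'bus'}, {'s1', 's2'}]
```

    Real fault trees make this exponential enumeration infeasible, which is precisely why the paper's state-space reduction and bounded searching matter.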

  2. Is human failure a stochastic process?

    International Nuclear Information System (INIS)

    Dougherty, Ed M.

    1997-01-01

    Human performance results in failure events that occur with a risk-significant frequency. System analysts have taken for granted the random (stochastic) nature of these events in engineering assessments such as risk assessment. However, cognitive scientists and error technologists, at least those with an interest in human reliability, have in recent years claimed that human error does not need this stochastic framework. Yet they still use language appropriate to stochastic processes. This paper examines the potential of the stochastic nature of human failure production as a basis for human reliability analysis. It distinguishes, and leaves to others, the epistemic uncertainties over the possible probability models for the real variability of human performance.

  3. Numerical Investigation on the Propagation Mechanism of Steady Cellular Detonations in Curved Channels

    International Nuclear Information System (INIS)

    Li Jian; Ning Jian-Guo; Zhao Hui; Wang Cheng; Hao Li

    2015-01-01

    The propagation mechanism of steady cellular detonations in curved channels is investigated numerically with a detailed chemical reaction mechanism. The numerical results demonstrate that as the radius of curvature decreases, detonation fails near the inner wall due to the strong expansion effect; as the radius of curvature increases, the detonation front near the inner wall can sustain an underdriven detonation. In the case where detonation fails, a transverse detonation forms downstream and re-initiates the quenched detonation as it propagates toward the inner wall. Two propagation modes exist as the detonation propagates in the curved channel. In one, the detonation fails first, then a following transverse detonation re-initiates the quenched detonation, and this process repeats itself. In the other, without detonation failure and re-initiation, a steady detonation exists that consists of an underdriven detonation front near the inner wall, subject to diffraction, and an overdriven detonation near the outer wall, subject to compression. (paper)

  4. Assessment of the impact of fueling machine failure on the safety of the CANDU-PHWR

    International Nuclear Information System (INIS)

    Al-Kusayer, T.A.

    1982-01-01

    A survey of possible LOCA (loss-of-coolant accident) initiating events that might take place in CANDU-PHWRs (Canadian Deuterium Uranium Pressurized Heavy Water Reactors) has been conducted, covering both direct and indirect initiators. Among the 22 initiating events surveyed in this study, four direct initiators were selected and analyzed briefly: a pump suction piping break, an isolation valve piping break, a bleed valve failure, and a fueling machine interface failure. These were selected as examples of failures that could take place on the inlet side, the outlet side, or at PHTS (primary heat transport system) interfaces. The Pickering NGS (Unit A) was used for this case study. A double failure (failure of the protective devices to operate when the process equipment fault occurs) and a triple failure (failure of the protective devices and the ECCS as well as the process equipment) were both found to be highly improbable.

  5. Expected dose for the early failure scenario classes in the 2008 performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Helton, J.C.; Hansen, C.W.; Sallaberry, C.J.

    2014-01-01

    Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. In support of this development and an associated license application to the U.S. Nuclear Regulatory Commission (NRC), the DOE completed an extensive performance assessment (PA) for the proposed YM repository in 2008. This presentation describes the determination of expected dose to the reasonably maximally exposed individual (RMEI) specified in the NRC regulations for the YM repository for the early waste package (WP) failure scenario class and the early drip shield (DS) failure scenario class in the 2008 YM PA. The following topics are addressed: (i) properties of the early failure scenario classes and the determination of dose and expected dose to the RMEI, (ii) expected dose and uncertainty in expected dose to the RMEI from the early WP failure scenario class, (iii) expected dose and uncertainty in expected dose to the RMEI from the early DS failure scenario class, (iv) expected dose and uncertainty in expected dose to the RMEI from the combined early WP and early DS failure scenario class with and without the inclusion of failures resulting from nominal processes, and (v) uncertainty in the occurrence of early failure scenario classes. The present article is part of a special issue of Reliability Engineering and System Safety devoted to the 2008 YM PA; additional articles in the issue describe other aspects of the 2008 YM PA. - Highlights: • Extensive work has been carried out by the U.S. DOE in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. • Properties of the early failure scenario classes (i.e., early waste package failure and early drip shield failure) in the 2008 YM performance assessment are described. • The determination of dose and expected dose to the RMEI is described.

  6. Predicting Failure in Early Acute Prosthetic Joint Infection Treated With Debridement, Antibiotics, and Implant Retention: External Validation of the KLIC Score.

    Science.gov (United States)

    Löwik, Claudia A M; Jutte, Paul C; Tornero, Eduard; Ploegmakers, Joris J W; Knobben, Bas A S; de Vries, Astrid J; Zijlstra, Wierd P; Dijkstra, Baukje; Soriano, Alex; Wouthuyzen-Bakker, Marjan

    2018-03-27

    Debridement, antibiotics, and implant retention (DAIR) is a widely used treatment modality for early acute prosthetic joint infection (PJI). A preoperative risk score was previously designed for predicting DAIR failure, consisting of chronic renal failure (K), liver cirrhosis (L), index surgery (I), cemented prosthesis (C), and C-reactive protein >115 mg/L (KLIC). The aim of this study was to validate the KLIC score in an external cohort. We retrospectively evaluated patients with early acute PJI treated with DAIR between 2006 and 2016 in 3 Dutch hospitals. Failure was defined as infection-related death within 60 days after debridement. A total of 386 patients were included. Failure occurred in 148 patients (38.3%). Patients with KLIC scores of ≤2, 2.5-3.5, 4-5, 5.5-6.5, and ≥7 had failure rates of 27.9%, 37.1%, 49.3%, 54.5%, and 85.7%, respectively (P < .001). The receiver-operating characteristic curve showed an area under the curve of 0.64 (95% confidence interval 0.59-0.69). A KLIC score higher than 6 points showed a specificity of 97.9%. The KLIC score is a relatively good preoperative risk score for DAIR failure in patients with early acute PJI and appears to be most useful in clinical practice for patients with low or high KLIC scores. Copyright © 2018 Elsevier Inc. All rights reserved.
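    The validation cohort's failure rates by KLIC band reduce to a simple lookup. The band boundaries and rates below come directly from the abstract; the function name is illustrative.

```python
def klic_failure_rate(score: float) -> float:
    """Observed DAIR failure rate (%) by KLIC score band in the external
    validation cohort reported above (n = 386). KLIC scores come in 0.5
    increments, so each band is closed by its upper bound."""
    bands = [
        (2.0, 27.9),   # score <= 2
        (3.5, 37.1),   # 2.5 - 3.5
        (5.0, 49.3),   # 4 - 5
        (6.5, 54.5),   # 5.5 - 6.5
    ]
    for upper, rate in bands:
        if score <= upper:
            return rate
    return 85.7        # score >= 7

print(klic_failure_rate(4.5))  # -> 49.3
```

    The monotone climb from 27.9% to 85.7% is what makes the score clinically useful at its extremes despite the modest overall AUC of 0.64.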

  7. Scoring system based on electrocardiogram features to predict the type of heart failure in patients with chronic heart failure

    Directory of Open Access Journals (Sweden)

    Hendry Purnasidha Bagaswoto

    2016-12-01

    Full Text Available ABSTRACT Heart failure is divided into heart failure with reduced ejection fraction (HFrEF) and heart failure with preserved ejection fraction (HFpEF). Additional studies are required to distinguish between these two types of HF. A previous study showed that HFrEF is less likely when ECG findings are normal. This study aims to create a scoring system based on ECG findings that predicts the type of HF. We performed a cross-sectional study analyzing ECG and echocardiographic data from 110 subjects. HFrEF was defined as an ejection fraction ≤40%. Fifty subjects were diagnosed with HFpEF and 60 with HFrEF. Multiple logistic regression analysis revealed ECG variables that were independent predictors of HFrEF, i.e., LAH, QRS duration >100 ms, RBBB, ST-T segment changes, and prolongation of the QT interval. Based on ROC curve analysis, we obtained a score for HFpEF of -1 to +3, while HFrEF had a score of +4 to +6, with 76% sensitivity, 96% specificity, a 95% positive predictive value, an 80% negative predictive value, and an accuracy of 86%. The scoring system derived from this study, including the presence or absence of LAH, QRS duration >100 ms, RBBB, ST-T segment changes, and prolongation of the QT interval, can be used to predict the type of HF with satisfactory sensitivity and specificity.
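    The reported cutoffs can be turned into a trivial classifier. Only the total-score ranges from the abstract are used, since the per-feature point weights are not given there.

```python
def classify_hf(score: int) -> str:
    """Classify heart failure type from the total ECG score, using the
    cutoffs reported above: -1..+3 -> HFpEF, +4..+6 -> HFrEF."""
    if -1 <= score <= 3:
        return 'HFpEF'
    if 4 <= score <= 6:
        return 'HFrEF'
    raise ValueError('score outside the range defined by the study')

print(classify_hf(5))  # -> HFrEF
```

    In effect the five ECG features vote toward HFrEF, and a total of 4 or more tips the classification; the reported 96% specificity applies to this single threshold.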

  8. In-Vehicle Dynamic Curve-Speed Warnings at High-Risk Rural Curves

    Science.gov (United States)

    2018-03-01

    Lane-departure crashes at horizontal curves represent a significant portion of fatal crashes on rural Minnesota roads. Because of this, solutions are needed to aid drivers in identifying upcoming curves and inform them of a safe speed at which they s...

  9. FRAC (failure rate analysis code): a computer program for analysis of variance of failure rates. An application user's guide

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.; McInteer, C.R.

    1982-03-01

    Probabilistic risk assessments (PRAs) require estimates of the failure rates of various components whose failure modes appear in the event and fault trees used to quantify accident sequences. Several reliability data bases have been designed for use in providing the necessary reliability data to be used in constructing these estimates. In the nuclear industry, the Nuclear Plant Reliability Data System (NPRDS) and the In-Plant Reliability Data System (IRPDS), among others, were designed for this purpose. An important characteristic of such data bases is the selection and identification of numerous factors used to classify each component that is reported and the subsequent failures of each component. However, the presence of such factors often complicates the analysis of reliability data in the sense that it is inappropriate to group (that is, pool) data for those combinations of factors that yield significantly different failure rate values. These types of data can be analyzed by analysis of variance. FRAC (Failure Rate Analysis Code) is a computer code that performs an analysis of variance of failure rates. In addition, FRAC provides failure rate estimates
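    The analysis of variance that FRAC performs can be sketched with a hand-rolled one-way ANOVA F statistic: a large F indicates that failure rates differ significantly across levels of a classification factor, so the data should not be pooled. The grouping factor and failure-rate samples below are hypothetical.

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square, for samples grouped by one factor."""
    grand = mean(x for g in groups for x in g)
    n = sum(len(g) for g in groups)
    k = len(groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical failure rates (failures per 1e6 h) for one component type
# grouped by a factor such as operating environment.
print(one_way_anova_f([[1.2, 1.4, 1.1], [2.8, 3.0, 2.7], [1.3, 1.2, 1.5]]))
```

    Here the second group's clearly higher rates drive a large F, signaling that pooling its data with the other groups would be inappropriate.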

  10. The learning curve for hip arthroscopy: a systematic review.

    Science.gov (United States)

    Hoppe, Daniel J; de Sa, Darren; Simunovic, Nicole; Bhandari, Mohit; Safran, Marc R; Larson, Christopher M; Ayeni, Olufemi R

    2014-03-01

    The learning curve for hip arthroscopy is consistently characterized as "steep." The purpose of this systematic review was to (1) identify the various learning curves reported in the literature, (2) examine the evidence supporting these curves, and (3) determine whether this evidence supports an accepted number of cases needed to achieve proficiency. The electronic databases Embase and Medline were screened for any clinical studies reporting learning curves in hip arthroscopy. Two reviewers conducted a full-text review of eligible studies and a hand search of conference proceedings and reference sections of the included articles. Inclusion/exclusion criteria were applied, and a quality assessment was completed for each included article. Descriptive statistics were compiled. We identified 6 studies with a total of 1,063 patients. Studies grouped surgical cases into "early" versus "late" in a surgeon's experience, with 30 cases being the most common cutoff used. Most of these studies used descriptive statistics and operative time and complication rates as measures of competence. Five of 6 studies showed improvement in these measures between early and late experience, but only one study proposed a bona fide curve. This review shows that when 30 cases was used as the cutoff point to differentiate between early and late cases in a surgeon's experience, there were significant reductions in operative time and complication rates. However, there was insufficient evidence to quantify the learning curve and validate 30, or any number of cases, as the point at which the learning curve plateaus. As a result, this number should be interpreted with caution. Level IV, systematic review of Level IV studies. Copyright © 2014 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  11. Calibration of the inertial consistency index to assess road safety on horizontal curves of two-lane rural roads.

    Science.gov (United States)

    Llopis-Castelló, David; Camacho-Torregrosa, Francisco Javier; García, Alfredo

    2018-05-26

    One of every four road fatalities occurs on horizontal curves of two-lane rural roads. In this regard, many studies have been undertaken to analyze crash risk on this road element. Most of them were based on the concept of geometric design consistency, which can be defined as how well drivers' expectancies and road behavior relate. However, none of these studies included a variable that represents and estimates drivers' expectancies. This research presents a new local consistency model based on the Inertial Consistency Index (ICI). This consistency parameter is defined as the difference between the inertial operating speed, which represents drivers' expectations, and the operating speed, which represents road behavior. The inertial operating speed was defined as the weighted average operating speed of the preceding road section. Different lengths, periods of time, and weighting distributions were studied to identify how the inertial operating speed should be calculated. As a result, drivers' expectancies should be estimated considering 15 s along the segment and a linear weighting distribution. This is consistent with the process by which drivers acquire expectancies, which is closely related to short-term memory. A safety performance function was proposed to predict the number of crashes on a horizontal curve, and consistency thresholds were defined based on the ICI; the crash rate increased as the ICI increased. Finally, the proposed consistency model was compared with previous models. As a conclusion, the new Inertial Consistency Index allowed a more accurate estimation of the number of crashes and a better assessment of the consistency level on horizontal curves. Highway engineers therefore have a new tool to identify where road crashes are more likely to occur during the design stage of both new two-lane rural roads and improvements to existing highways. Copyright © 2018 Elsevier Ltd. All rights reserved.
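    The ICI computation described above can be sketched as follows, assuming one operating-speed sample per second. The direction of the linear weighting (heavier on more recent observations) is an assumption, since the abstract does not specify it.

```python
def inertial_consistency_index(speeds, window_s=15):
    """ICI = inertial operating speed minus operating speed at the curve.

    The inertial operating speed (drivers' expectation) is the linearly
    weighted average of the operating speeds over the preceding `window_s`
    seconds, one sample per second; the last element of `speeds` is the
    operating speed on the curve itself (road behavior).
    Assumption: weights increase linearly toward the most recent sample.
    """
    recent = speeds[-(window_s + 1):-1]          # the preceding window
    weights = range(1, len(recent) + 1)          # oldest = 1 ... newest = n
    inertial = sum(w * v for w, v in zip(weights, recent)) / sum(weights)
    return inertial - speeds[-1]

# A constant 80 km/h approach followed by a 60 km/h curve yields ICI = 20,
# flagging a large gap between expectation and road behavior.
print(inertial_consistency_index([80.0] * 16 + [60.0]))  # -> 20.0
```

    A high ICI marks a curve where the road demands much less speed than the driver has come to expect, which is exactly where the study finds crash rates rising.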

  12. Assessment of Estimation Methods ForStage-Discharge Rating Curve in Rippled Bed Rivers

    Directory of Open Access Journals (Sweden)

    P. Maleki

    2016-02-01

    Full Text Available Introduction: The interaction between water flow characteristics and bed erodibility plays an important role in the sediment transport process. In order to reach stability, rivers undergoing deposition or bottom erosion form different bed forms in the riverbed. One way to identify the behavior of rivers is to study the structure and formation of the bed forms within them. Ripples are the smallest of the bed forms. The longitudinal cross-section of ripples is usually not symmetrical: the upstream face is long and has a gentle slope, and the downstream face is short and steep. The height of ripples is usually between 0.5 cm and 2 cm and does not exceed 5 cm; the wavelengths are usually in the range of 1 cm to 15 cm and normally do not exceed 30 cm. Their occurrence is the result of the unstable viscous layer near the boundary, and they can form in both shallow and deep water. With an increase in flow velocity, the plan form of the ripples gradually develops from straight lines to curves and then to a pattern like fish scales, symmetrical or unsymmetrical, as shown in Fig. 1 (the pattern development of the ripple). Raudkivi (1966) was the first to investigate the flow structure over ripples experimentally. By establishing several different conditions on a moving sand bed in a laboratory channel of rectangular cross-section with a base width of 70 cm, he was able to form a row of ripples. Jafari Mianaei and Keshavarzi (2008) studied the turbulent flow between two artificial ripples to investigate the changes in kinetic energy and shear stress over the ripples. The stage-discharge rating curve is one of the most important tools in hydraulic studies. In alluvial rivers, bed ripples form and significantly affect the stage-discharge rating curve. In this research, the effects of two different types of ripples (parallel and flake-shaped) on the hydraulic characteristics of flow were experimentally studied.

  13. In-situ failure test in the research tunnel at Olkiluoto

    Energy Technology Data Exchange (ETDEWEB)

    Autio, J.; Johansson, E.; Kirkkomaeki, T. [Saanio and Riekkola Consulting Engineers, Helsinki (Finland); Hakala, M. [Gridpoint Finland Oy (Finland); Heikkilae, E. [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Rock Engineering

    2000-05-01

    A failure test suitable for execution in the Research Tunnel at Olkiluoto has been planned to study the failure of rock in situ. The objectives of the in-situ failure test are to assess the applicability of numerical modelling codes and methods to the study of rock failure and associated crack propagation, and to develop a novel technique for determining the strength of rock in situ. The objective of this study was to make a preliminary design of the failure test, assess the technical feasibility of the test, and give input information for further numerical modelling of the test. The design of the failure test is reported and results of preliminary modelling are given. The input information for future modelling includes a study of rock properties, fracture propagation in rock, and in-situ stresses, and the development of techniques for using an expanding agent to produce an artificial stress field. The study showed that mechanical properties such as the strength of gneissic tonalite, the main rock type in the Research Tunnel, depend highly on the orientation of schistosity. The in-situ failure test was shown to be technically feasible, and a state of stress high enough to cause failure can be created artificially by using a proper expansive agent and design. (orig.)

  14. Evaluation of allograft perfusion by radionuclide first-pass study in renal failure following renal transplantation

    International Nuclear Information System (INIS)

    Baillet, G.; Ballarin, J.; Urdaneta, N.; Campos, H.; Vernejoul, P. de; Fermanian, J.; Kellershohn, C.; Kreis, H.

    1986-01-01

    To assess the diagnostic value of indices measured on a first-pass curve, we performed 72 radionuclide renal first-pass studies (RFP) in 21 patients during the early weeks following renal allograft transplantation. The diagnosis was based on standard clinical and biochemical data and on fine needle aspiration biopsy (FNAB) of the transplant. Aortic and renal first-pass curves were filtered using a true low-pass filter, and five different indices of renal perfusion were computed using formulae from the literature. Statistical analysis performed on the aortic and renal indices indicated excellent reproducibility of the isotopic study. Although the renal indices presented a rather large scatter, they all discriminated well between normal function and rejection. Three indices have a particularly good diagnostic value. In the discrimination between rejection and acute tubular necrosis (ATN), only one index gave satisfactory results. The indices indicate, however, that there are probably ATN episodes with an alteration of renal perfusion and rejection episodes where perfusion is almost intact. We conclude that the radionuclide first-pass study allows accurate and reproducible quantitation of renal allograft perfusion. The measured parameters are helpful in following the course of a post-transplantation renal failure episode and in gaining more insight into renal ischemia following transplantation. (orig.)

  15. Micromechanics Based Failure Analysis of Heterogeneous Materials

    Science.gov (United States)

    Sertse, Hamsasew M.

    are performed for both brittle failure/high cycle fatigue (HCF) for negligible plastic strain and ductile failure/low cycle fatigue (LCF) for large plastic strain. The proposed approach is incorporated in SwiftComp and used to predict the initial failure envelope, the stress-strain curve for various loading conditions, and the fatigue life of heterogeneous materials. The combined effects of strain hardening and progressive fatigue damage on the effective properties of heterogeneous materials are also studied. The capability of the current approach is validated using several representative examples of heterogeneous materials, including binary composites, continuous fiber-reinforced composites, particle-reinforced composites, discontinuous fiber-reinforced composites, and woven composites. The predictions of MSG are also compared with the predictions obtained using various micromechanics approaches such as the Generalized Method of Cells (GMC), Mori-Tanaka (MT), and Double Inclusion (DI) methods and Representative Volume Element (RVE) analysis (referred to as 3-dimensional finite element analysis (3D FEA) in this document). This study demonstrates that a micromechanics based failure analysis has great potential to rigorously and more accurately analyze the initiation and progression of damage in heterogeneous materials. However, this approach requires material properties specific to damage analysis, which need to be independently calibrated for each constituent.

  16. Insulin resistance and exercise tolerance in heart failure patients

    DEFF Research Database (Denmark)

    Snoer, Martin; Monk-Hansen, Tea; Olsen, Rasmus Huan

    2012-01-01

    Insulin resistance has been linked to exercise intolerance in heart failure patients. The aim of this study was to assess the potential role of coronary flow reserve (CFR), endothelial function and arterial stiffness in explaining this linkage.

  17. Assessing changes in failure probability of dams in a changing climate

    Science.gov (United States)

    Mallakpour, I.; AghaKouchak, A.; Moftakhari, H.; Ragno, E.

    2017-12-01

    Dams are crucial infrastructure and provide resilience against hydrometeorological extremes (e.g., droughts and floods). In 2017, California experienced a series of flooding events that terminated a 5-year drought and led to incidents such as the structural failure of Oroville Dam's spillway. Because of the large socioeconomic repercussions of such incidents, it is of paramount importance to evaluate dam failure risks associated with projected shifts in the streamflow regime. This becomes even more important as the current procedures for design of hydraulic structures (e.g., dams, bridges, spillways) are based on the so-called stationarity assumption. Yet changes in climate are anticipated to result in changes in the statistics of river flow (e.g., more extreme floods), possibly increasing the failure probability of already aging dams. Here, we examine changes in discharge under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. In this study, we used routed daily streamflow data from ten global climate models (GCMs) to investigate possible climate-induced changes in streamflow in northern California. Our results show that while the average flow does not show a significant change, extreme floods are projected to increase in the future. Using extreme value theory, we estimate changes in the return periods of 50-year and 100-year floods in the current and future climates. Finally, we use the historical and future return periods to quantify changes in the failure probability of dams in a warming climate.
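    The return-period estimation mentioned above can be sketched with a Gumbel fit to annual maximum discharges by the method of moments. The study itself does not state which extreme value distribution or fitting method it uses, and the annual maxima below are hypothetical.

```python
import math
from statistics import mean, stdev

def gumbel_return_level(annual_maxima, return_period_years):
    """Discharge exceeded on average once every T years, from a Gumbel fit
    to annual maxima by the method of moments:
        beta = s * sqrt(6) / pi,   mu = xbar - 0.5772 * beta
        x_T  = mu - beta * ln(-ln(1 - 1/T))
    """
    beta = stdev(annual_maxima) * math.sqrt(6) / math.pi
    mu = mean(annual_maxima) - 0.5772 * beta
    t = return_period_years
    return mu - beta * math.log(-math.log(1.0 - 1.0 / t))

# Hypothetical annual-maximum flows (m^3/s): the 100-yr level exceeds the 50-yr.
maxima = [310, 420, 275, 500, 390, 350, 610, 295, 455, 380]
print(gumbel_return_level(maxima, 50) < gumbel_return_level(maxima, 100))  # -> True
```

    Refitting the same relation to projected future annual maxima shows how the discharge labeled "100-year" today can correspond to a much shorter return period in a warmer climate, which is the mechanism behind the projected rise in dam failure probability.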

  18. Validation of prognostic scores to predict short-term mortality in patients with acute-on-chronic liver failure.

    Science.gov (United States)

    Song, Do Seon; Kim, Tae Yeob; Kim, Dong Joon; Kim, Hee Yeon; Sinn, Dong Hyun; Yoon, Eileen L; Kim, Chang Wook; Jung, Young Kul; Suk, Ki Tae; Lee, Sang Soo; Lee, Chang Hyeong; Kim, Tae Hun; Choe, Won Hyeok; Yim, Hyung Joon; Kim, Sung Eun; Baik, Soon Koo; Jang, Jae Young; Kim, Hyoung Su; Kim, Sang Gyune; Yang, Jin Mo; Sohn, Joo Hyun; Choi, Eun Hee; Cho, Hyun Chin; Jeong, Soung Won; Kim, Moon Young

    2018-04-01

    The aim of this study was to validate the chronic liver failure-sequential organ failure assessment score (CLIF-SOFAs), CLIF consortium organ failure score (CLIF-C OFs), CLIF-C acute-on-chronic liver failure score (CLIF-C ACLFs), and CLIF-C acute decompensation score in Korean chronic liver disease patients with acute deterioration. Acute-on-chronic liver failure was defined by either the Asian Pacific Association for the study of the Liver ACLF Research Consortium (AARC) or CLIF-C criteria. The diagnostic performances for short-term mortality were compared by the area under the receiver operating characteristic curve. Among a total of 1470 patients, 252 patients were diagnosed with ACLF according to the CLIF-C (197 patients) or AARC definition (95 patients). As the ACLF grades increased, the survival rates became significantly lower. The areas under the receiver operating characteristic of the CLIF-SOFAs, CLIF-C OFs, and CLIF-C ACLFs were significantly higher than those of the Child-Pugh, model for end-stage liver disease, and model for end-stage liver disease-Na scores in ACLF patients according to the CLIF-C definition (all P < 0.05), but there were no significant differences in patients without ACLF or in patients with ACLF according to the AARC definition. The CLIF-SOFAs, CLIF-C OFs, and CLIF-C ACLFs had higher specificities with a fixed sensitivity than liver specific scores in ACLF patients according to the CLIF-C definition, but not in ACLF patients according to the AARC definition. The CLIF-SOFAs, CLIF-C OFs, and CLIF-C ACLFs are useful scoring systems that provide accurate information on prognosis in patients with ACLF according to the CLIF-C definition, but not the AARC definition. © 2017 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
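    The AUROC comparisons above reduce to a simple rank statistic: the area under the ROC curve equals the probability that a randomly chosen positive case (e.g., a non-survivor) scores higher than a randomly chosen negative one. A minimal sketch with hypothetical scores:

```python
def auroc(scores_pos, scores_neg):
    """Area under the ROC curve, computed as the probability that a score
    from the positive group exceeds a score from the negative group,
    counting ties as 1/2 (the Mann-Whitney formulation)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical prognostic scores for non-survivors vs survivors.
print(auroc([62, 55, 70, 48], [40, 45, 50, 38]))  # -> 0.9375
```

    An AUROC of 0.5 means a score ranks patients no better than chance, which is the baseline against which the CLIF-based scores outperform the liver-specific scores in the CLIF-C-defined ACLF group.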

  19. A community based study of failure to thrive in Israel.

    Science.gov (United States)

    Wilensky, D S; Ginsberg, G; Altman, M; Tulchinsky, T H; Ben Yishay, F; Auerbach, J

    1996-01-01

    OBJECTIVE: To examine the characteristics of infants suffering from failure to thrive in a community based cohort in Israel and to ascertain the effect of failure to thrive on their cognitive development. METHODS: By review of records maintained at maternal and child health clinics in Jerusalem and the town of Beit Shemesh, epidemiological data were obtained at age 15 months on a cohort of all babies born in 1991. For each case of failure to thrive, a matched control was selected from the same maternal and child health clinic. At age 20 months, cognitive development was measured, and at 25 months a home visit was carried out to assess maternal psychiatric status by questionnaire, and the HOME assessment was performed to assess the home environment. RESULTS: 3.9% of infants were found to have fallen below the third centile in weight for at least three months during the first year of life. Infants with failure to thrive did not differ from the general population in terms of obstetric or neonatal complications, birth order, or parents' ethnic origin, age, or years of education. The infants with failure to thrive did have lower birthweights and marginally smaller head circumferences at birth. Developmental assessment at 20 months of age showed a DQ of 99.7 vs 107.2 in the matched controls, with 11.5% having a DQ below 80, as opposed to only 4.6% of the controls. No differences were found in maternal psychiatric problems as measured by a self report questionnaire. There were, however, significant differences in subscales of the HOME scale. CONCLUSIONS: (1) Infants who suffered from failure to thrive had some physiological predispositions that put them at risk; (2) failure to thrive may be an early marker of families providing suboptimal developmental stimulation. PMID:8869197

  20. Failure to thrive in childhood.

    Science.gov (United States)

    Nützenadel, Walter

    2011-09-01

    Failure to thrive impairs children's weight gain and growth, their defenses against infection, and their psychomotor and intellectual development. This paper is a review of pertinent articles that were published from 1995 to October 2010 and contained the terms "failure to thrive", "underweight", "malnutrition", "malabsorption", "maldigestion" and "refeeding syndrome". The articles were retrieved by a search in the PubMed and Cochrane Library databases. In developed countries, failure to thrive is usually due to an underlying disease. The degree of malnutrition is assessed with anthropometric techniques. For each patient, the underlying disease must be identified and the mechanism of failure to thrive understood, so that proper medical and nutritional treatment can be provided. Nutritional treatment involves either giving more food, or else raising the caloric density of the patient's food. Liquid formulas can be given as a supplement to normal meals or as balanced or unbalanced tube feeds; they can be given orally, through a nasogastric tube, or through a gastrostomy tube. Severely malnourished children with poor oral intake should be treated with parenteral nutrition. To avoid refeeding syndrome in severely malnourished children, food intake should be increased slowly at first, and phosphate, magnesium, and potassium supplements should be given. The proper treatment of failure to thrive in childhood consists of treatment of the underlying illness, combined with nutritional treatment that addresses the mechanism of the accompanying failure to thrive.

  1. The Predictive Value of Ultrasound Learning Curves Across Simulated and Clinical Settings

    DEFF Research Database (Denmark)

    Madsen, Mette E; Nørgaard, Lone N; Tabor, Ann

    2017-01-01

    OBJECTIVES: The aim of the study was to explore whether learning curves on a virtual-reality (VR) sonographic simulator can be used to predict subsequent learning curves on a physical mannequin and learning curves during clinical training. METHODS: Twenty midwives completed a simulation-based training program in transvaginal sonography. The training was conducted on a VR simulator as well as on a physical mannequin. A subgroup of 6 participants underwent subsequent clinical training. During each of the 3 steps, the participants' performance was assessed using instruments with established ... settings. RESULTS: A good correlation was found between time needed to achieve predefined performance levels on the VR simulator and the physical mannequin (Pearson correlation coefficient .78; P ...) ... VR simulator correlated well to the clinical performance scores (Pearson...

  2. Failure internal pressure of spherical steel containments

    International Nuclear Information System (INIS)

    Sanchez Sarmiento, G.

    1985-01-01

    An application of the British CEGB's R6 Failure Assessment Approach to the determination of failure internal pressure of nuclear power plant spherical steel containments is presented. The presence of hypothetical cracks both in the base metal and in the welding material of the containment, with geometrical idealizations according to the ASME Boiler and Pressure Vessel Code (Section XI), was taken into account in order to analyze the sensitivity of the failure assessment with the values of the material fracture properties. Calculations of the elastoplastic collapse load have been performed by means of the Finite Element System SAMCEF. The clean axisymmetric shell (neglecting the influence of nozzles and minor irregularities) and two major penetrations (personnel and emergency locks) have been taken separately into account. Large-strain elastoplastic behaviour of the material was considered in the Code, using lower bounds of true stress-true strain relations obtained by testing a collection of tensile specimens. Assuming the presence of cracks in non-perturbed regions, the reserve factor for test pressure and the failure internal pressure have been determined as a function of the flaw depth. (orig.)
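
    The R6 approach compares an assessment point (Kr, Lr) against a failure assessment curve. The abstract does not reproduce the curve it used; the sketch below assumes the widely published R6/BS 7910 "Option 1" material-independent form, purely for illustration:

    ```python
    import math

    def r6_option1_kr(lr, lr_max=1.2):
        """R6 Option 1 failure assessment curve: maximum admissible Kr as a
        function of the load ratio Lr, with a plastic-collapse cut-off."""
        if lr > lr_max:
            return 0.0
        return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

    def is_acceptable(kr, lr):
        """A flaw is acceptable if its assessment point lies inside the curve."""
        return kr <= r6_option1_kr(lr)

    print(is_acceptable(0.4, 0.5))   # a point well inside the diagram -> True
    print(is_acceptable(0.95, 1.0))  # a point outside the curve -> False
    ```

    The reserve factor in an R6 analysis is the scale factor that moves the assessment point onto the curve; the cut-off `lr_max` depends on the material's strain-hardening behaviour.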

  3. Symphysis-fundal height curve in the diagnosis of fetal growth deviations

    Directory of Open Access Journals (Sweden)

    Djacyr Magna Cabral Freire

    2010-12-01

    OBJECTIVE: To validate a new symphysis-fundal curve for screening fetal growth deviations and to compare its performance with the standard curve adopted by the Brazilian Ministry of Health. METHODS: Observational study including a total of 753 low-risk pregnant women with gestational age above 27 weeks between March and October 2006 in the city of João Pessoa, Northeastern Brazil. Symphysis-fundal height was measured using the standard technique recommended by the Brazilian Ministry of Health. Estimated fetal weight assessed through ultrasound using the Brazilian fetal weight chart for gestational age was the gold standard. In a subsample of 122 women, neonatal weight was measured up to seven days after the estimated fetal weight measurement, and the symphysis-fundal classification was compared with the Lubchenco growth reference curve as gold standard. Sensitivity, specificity, and positive and negative predictive values were calculated. The McNemar χ2 test was used for comparing the sensitivity of the two symphysis-fundal curves studied. RESULTS: The sensitivity of the new curve for detecting small for gestational age fetuses was 51.6%, while that of the Brazilian Ministry of Health reference curve was significantly lower (12.5%). In the subsample using neonatal weight as gold standard, the sensitivity of the new reference curve was 85.7%, while that of the Brazilian Ministry of Health curve was 42.9% for detecting small for gestational age. CONCLUSIONS: The diagnostic performance of the new curve for detecting small for gestational age fetuses was significantly higher than that of the Brazilian Ministry of Health reference curve.
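
    The screening comparison above rests on two standard calculations: sensitivity against a gold standard, and McNemar's test for paired sensitivities. A minimal sketch (the counts are hypothetical, chosen only to reproduce the reported 51.6% figure; they are not the study's raw data):

    ```python
    def sensitivity(tp, fn):
        """Fraction of truly positive cases the screening test flags."""
        return tp / (tp + fn)

    def mcnemar_chi2(b, c):
        """McNemar chi-square (continuity-corrected) for two paired binary
        tests; b and c count the discordant pairs (A+/B- and A-/B+)."""
        return (abs(b - c) - 1) ** 2 / (b + c)

    # Hypothetical: 33 of 64 small-for-gestational-age fetuses detected.
    print(round(sensitivity(33, 31), 3))  # -> 0.516
    ```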

  4. Parameter Deduction and Accuracy Analysis of Track Beam Curves in Straddle-type Monorail Systems

    Directory of Open Access Journals (Sweden)

    Xiaobo Zhao

    2015-12-01

    The accuracy of the bottom curve of a PC track beam is strongly related to the production quality of the entire beam. Many factors may affect the parameters of the bottom curve, such as the superelevation of the curve and the deformation of the PC track beam. At present, no effective method exists to determine the bottom curve of a PC track beam; therefore, a new technique is presented in this paper to deduce the parameters of such a curve and to control the accuracy of the computed results. First, the domain of the bottom curve of a PC track beam is assumed to be a spindle plane. Then, the corresponding supposed top curve domain is determined based on a geometrical relationship that is the opposite of that identified by the conventional method. Second, several optimal points are selected from the supposed top curve domain according to the dichotomy algorithm; the supposed top curve is generated by connecting these points. Finally, a rigorous criterion based on the fractal dimension is established to assess the accuracy of the supposed top curve deduced in the previous step. If this supposed curve coincides completely with the known top curve, then the assumed bottom curve corresponding to it is considered to be the real bottom curve. This technique of determining the bottom curve of a PC track beam is thus shown to be efficient and accurate.
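
    The "dichotomy algorithm" mentioned above is the classic bisection idea: repeatedly halve an interval and keep the half that still brackets the target. A generic root-finding sketch of that idea (not the paper's beam-specific implementation):

    ```python
    def bisect(f, lo, hi, tol=1e-9):
        """Dichotomy (bisection): halve [lo, hi] repeatedly, keeping the
        subinterval over which f changes sign."""
        assert f(lo) * f(hi) <= 0, "interval must bracket a root"
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    root = bisect(lambda x: x * x - 2.0, 0.0, 2.0)
    print(round(root, 6))  # -> 1.414214
    ```

    Each halving doubles the accuracy, so convergence to tolerance `tol` takes about log2((hi - lo) / tol) iterations.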

  5. Learning Curves of Virtual Mastoidectomy in Distributed and Massed Practice

    DEFF Research Database (Denmark)

    Andersen, Steven Arild Wuyts; Konge, Lars; Cayé-Thomasen, Per

    2015-01-01

    IMPORTANCE: Repeated and deliberate practice is crucial in surgical skills training, and virtual reality (VR) simulation can provide self-directed training of basic surgical skills to meet the individual needs of the trainee. Assessment of the learning curves of surgical procedures is pivotal...... in understanding skills acquisition and best-practice implementation and organization of training. OBJECTIVE: To explore the learning curves of VR simulation training of mastoidectomy and the effects of different practice sequences with the aim of proposing the optimal organization of training. DESIGN, SETTING...... plateaued on a score of 16.0 (15.3-16.7) at approximately the ninth repetition, but the individual learning curves were highly variable. CONCLUSIONS AND RELEVANCE: Novices can acquire basic mastoidectomy competencies with self-directed VR simulation training. Training should be organized with distributed...

  6. The curve shortening problem

    CERN Document Server

    Chou, Kai-Seng

    2001-01-01

    Although research in curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results. The authors present a complete treatment of the Gage-Hamilton theorem, a clear, detailed exposition of Grayson's convexity theorem, a systematic discussion of invariant solutions, applications to the existence of simple closed geodesics on a surface, and a new, almost convexity theorem for the generalized curve shortening problem. Many questions regarding curve shortening remain outstanding. With its careful exposition and complete guide to the literature, The Curve Shortening Problem provides not only an outstanding starting point for graduate students and new investigations, but a superb reference that presents intriguing new results for those already active in the field.

  7. NormaCurve: a SuperCurve-based method that simultaneously quantifies and normalizes reverse phase protein array data.

    Directory of Open Access Journals (Sweden)

    Sylvie Troncale

    MOTIVATION: Reverse phase protein array (RPPA) is a powerful dot-blot technology that allows studying protein expression levels as well as post-translational modifications in a large number of samples simultaneously. Yet, correct interpretation of RPPA data has remained a major challenge for its broad-scale application and its translation into clinical research. Satisfying quantification tools are available to assess a relative protein expression level from a serial dilution curve. However, appropriate tools allowing the normalization of the data for external sources of variation are currently missing. RESULTS: Here we propose a new method, called NormaCurve, that allows simultaneous quantification and normalization of RPPA data. For this, we modified the quantification method SuperCurve in order to include normalization for (i) background fluorescence, (ii) variation in the total amount of spotted protein and (iii) spatial bias on the arrays. Using a spike-in design with a purified protein, we test the capacity of different models to properly estimate normalized relative expression levels. The best performing model, NormaCurve, takes into account a negative control array without primary antibody, an array stained with a total protein stain and spatial covariates. We show that this normalization is reproducible and we discuss the number of serial dilutions and the number of replicates that are required to obtain robust data. We thus provide a ready-to-use method for reliable and reproducible normalization of RPPA data, which should facilitate the interpretation and the development of this promising technology. AVAILABILITY: The raw data, the scripts and the normacurve package are available at the following web site: http://microarrays.curie.fr.
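
    Two of the three corrections listed above can be sketched in a drastically simplified form: subtract the no-primary-antibody background, then scale each spot by its total-protein stain signal. This is only a toy illustration of the idea, not the NormaCurve model, which additionally fits a SuperCurve and spatial covariates:

    ```python
    def normalize_rppa(raw, background, total_stain):
        """Toy normalization: background-subtract each spot, then divide by
        the spot's total-protein stain signal (spatial bias omitted)."""
        return [(r - b) / t for r, b, t in zip(raw, background, total_stain)]

    # A spot with twice the spotted protein but the same specific signal per
    # unit protein normalizes to the same relative expression level.
    print(normalize_rppa([110.0, 210.0], [10.0, 10.0], [1.0, 2.0]))  # -> [100.0, 100.0]
    ```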

  8. Interim report on the state-of-the-art of solid-state motor controllers. Part 4. Failure-rate and failure-mode data

    International Nuclear Information System (INIS)

    Jaross, R.A.

    1983-09-01

    An assessment of the reliability of solid-state motor controllers for nuclear power plants is made. Available data on failure-rate and failure-mode data for solid-state motor controllers based on industrial operating experience is meager; the data are augmented by data on other solid-state power electronic devices that are shown to have components similar to those found in solid-state motor controllers. In addition to large nonnuclear solid-state adjustable-speed motor drives, the reliability of nuclear plant inverter systems and high-voltage solid-state dc transmission-line converters is assessed. Licensee Event Report analyses from several sources, the open literature, and personal communications are used to determine the reliability of solid-state devices typical of those expected to be used in nuclear power plants in terms of failures per hour

  9. Phonon transport across nano-scale curved thin films

    Energy Technology Data Exchange (ETDEWEB)

    Mansoor, Saad B.; Yilbas, Bekir S., E-mail: bsyilbas@kfupm.edu.sa

    2016-12-15

    Phonon transport across a curved thin silicon film due to temperature disturbance at the film edges is examined. The equation of radiative transport is formulated by incorporating the Boltzmann transport equation for the energy transfer. The effect of the thin film curvature on phonon transport characteristics is assessed. In the analysis, the film arc length along the film centerline is considered to be constant and the film arc angle is varied to obtain various film curvatures. Equivalent equilibrium temperature is introduced to assess the phonon intensity distribution inside the curved thin film. It is found that equivalent equilibrium temperature decay along the arc length is sharper than that in the radial direction, which is more pronounced in the region close to the film inner radius. Reducing the film arc angle increases the film curvature; in this case, phonon intensity decay becomes sharp in the close region of the high temperature edge. Equivalent equilibrium temperature demonstrates a non-symmetric distribution along the radial direction, which is more pronounced in the near region of the high temperature edge.

  10. Phonon transport across nano-scale curved thin films

    International Nuclear Information System (INIS)

    Mansoor, Saad B.; Yilbas, Bekir S.

    2016-01-01

    Phonon transport across a curved thin silicon film due to temperature disturbance at the film edges is examined. The equation of radiative transport is formulated by incorporating the Boltzmann transport equation for the energy transfer. The effect of the thin film curvature on phonon transport characteristics is assessed. In the analysis, the film arc length along the film centerline is considered to be constant and the film arc angle is varied to obtain various film curvatures. Equivalent equilibrium temperature is introduced to assess the phonon intensity distribution inside the curved thin film. It is found that equivalent equilibrium temperature decay along the arc length is sharper than that in the radial direction, which is more pronounced in the region close to the film inner radius. Reducing the film arc angle increases the film curvature; in this case, phonon intensity decay becomes sharp in the close region of the high temperature edge. Equivalent equilibrium temperature demonstrates a non-symmetric distribution along the radial direction, which is more pronounced in the near region of the high temperature edge.

  11. Worsening renal function definition is insufficient for evaluating acute renal failure in acute heart failure

    Science.gov (United States)

    Hata, Noritake; Kobayashi, Nobuaki; Okazaki, Hirotake; Matsushita, Masato; Shibata, Yusaku; Nishigoori, Suguru; Uchiyama, Saori; Asai, Kuniya; Shimizu, Wataru

    2018-01-01

    Aims: Whether or not the definition of a worsening renal function (WRF) is adequate for the evaluation of acute renal failure in patients with acute heart failure is unclear. Methods and results: One thousand and eighty-three patients with acute heart failure were analysed. A WRF, indicated by a change in serum creatinine ≥0.3 mg/dL during the first 5 days, occurred in 360 patients, while no-WRF, indicated by a change <0.3 mg/dL, occurred in 723 patients. Acute kidney injury (AKI) upon admission was defined based on the ratio of the serum creatinine value recorded on admission to the baseline creatinine value, and patients were placed into groups based on the degree of AKI: no-AKI (n = 751), Class R (risk; n = 193), Class I (injury; n = 41), or Class F (failure; n = 98). The patients were also assigned to another set of four groups: no-WRF/no-AKI (n = 512), no-WRF/AKI (n = 211), WRF/no-AKI (n = 239), and WRF/AKI (n = 121). A multivariate logistic regression model found that no-WRF/AKI and WRF/AKI were independently associated with 365 day mortality (hazard ratio: 1.916; 95% confidence interval: 1.234-2.974 and hazard ratio: 3.622; 95% confidence interval: 2.332-5.624). Kaplan-Meier survival curves showed that the rate of any-cause death during 1 year was significantly poorer in the no-WRF/AKI and WRF/AKI groups than in the WRF/no-AKI and no-WRF/no-AKI groups, and in Class I and Class F than in Class R and the no-AKI group. Conclusions: The presence of AKI on admission, especially Class I and Class F status, is associated with a poor prognosis despite the lack of a WRF within the first 5 days. The prognostic ability of AKI on admission may be superior to that of WRF within the first 5 days. PMID:29388735
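
    The four-way patient grouping above reduces to two threshold checks. A minimal sketch; the WRF cut-off (≥0.3 mg/dL) is from the abstract, while the admission/baseline creatinine ratio of 1.5 for AKI (Class R) is an assumed RIFLE-style threshold, since the abstract does not state the exact cut-offs:

    ```python
    def classify(admission_cr, baseline_cr, delta_cr_5d):
        """Assign a patient to one of the four WRF/AKI groups.
        WRF: creatinine rise >= 0.3 mg/dL within the first 5 days.
        AKI on admission (assumed RIFLE Class R threshold): admission/baseline
        creatinine ratio >= 1.5."""
        wrf = delta_cr_5d >= 0.3
        aki = admission_cr / baseline_cr >= 1.5
        return ("WRF" if wrf else "no-WRF") + "/" + ("AKI" if aki else "no-AKI")

    # AKI already present on admission, no further worsening over 5 days.
    print(classify(1.8, 1.0, 0.1))  # -> no-WRF/AKI
    ```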

  12. The five-point Likert scale for dyspnea can properly assess the degree of pulmonary congestion and predict adverse events in heart failure outpatients

    Directory of Open Access Journals (Sweden)

    Cristina K. Weber

    2014-01-01

    OBJECTIVES: Proper assessment of dyspnea is important in patients with heart failure. Our aim was to evaluate the use of the 5-point Likert scale for dyspnea to assess the degree of pulmonary congestion and to determine the prognostic value of this scale for predicting adverse events in heart failure outpatients. METHODS: We undertook a prospective study of outpatients with moderate to severe heart failure. The 5-point Likert scale was applied during regular outpatient visits, along with clinical assessments. Lung ultrasound with ≥15 B-lines and an amino-terminal portion of pro-B-type natriuretic peptide (NT-proBNP) level >1000 pg/mL were used as a reference for pulmonary congestion. The patients were then assessed every 30 days during follow-up to identify adverse clinical outcomes. RESULTS: We included 58 patients (65.5% male, age 43.5±11 years) with a mean left ventricular ejection fraction of 27±6%. In total, 29.3% of these patients had heart failure of ischemic etiology. Additionally, pulmonary congestion, as diagnosed by lung ultrasound, was present in 58% of patients. A higher degree of dyspnea (3 or 4 points on the 5-point Likert scale) was significantly correlated with a higher number of B-lines (p = 0.016). Patients stratified into Likert class 3-4 were at increased risk of admission compared with those in class 1-2 after adjusting for age, left ventricular ejection fraction, New York Heart Association functional class and levels of NT-proBNP >1000 pg/mL (HR = 4.9, 95% CI 1.33-18.64, p = 0.017). CONCLUSION: In our series, higher baseline scores on the 5-point Likert scale were related to pulmonary congestion and were independently associated with adverse events during follow-up. This simple clinical tool can help to identify patients who are more likely to decompensate and whose treatment should be intensified.

  13. Geometric and electromyographic assessments in the evaluation of curve progression in idiopathic scoliosis

    NARCIS (Netherlands)

    Cheung, J; Veldhuizen, AG; Halberts, JPK; Sluiter, WJ; Van Horn, [No Value

    2006-01-01

    Study Design. The natural history of patients with idiopathic scoliosis was analyzed radiographically and electromyographically in a prospective longitudinal study. Objectives. To identify changes in geometric variables and the sequence in which these changes occur during curve progression in the

  14. GENERIC COMPONENT FAILURE DATA BASE FOR LIGHT WATER AND LIQUID SODIUM REACTOR PRAs

    Energy Technology Data Exchange (ETDEWEB)

    S. A. Eide; S. V. Chmielewski; T. D. Swantz

    1990-02-01

    A comprehensive generic component failure data base has been developed for light water and liquid sodium reactor probabilistic risk assessments (PRAs). The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) and the Centralized Reliability Data Organization (CREDO) data bases were used to generate component failure rates. Using this approach, most of the failure rates are based on actual plant data rather than existing estimates.

  15. Aortic regurgitation after valve-sparing aortic root replacement: modes of failure.

    Science.gov (United States)

    Oka, Takanori; Okita, Yutaka; Matsumori, Masamichi; Okada, Kenji; Minami, Hitoshi; Munakata, Hiroshi; Inoue, Takeshi; Tanaka, Akiko; Sakamoto, Toshihito; Omura, Atsushi; Nomura, Takuo

    2011-11-01

    Despite the positive clinical results of valve-sparing aortic root replacement, little is known about the causes of reoperations and the modes of failure. From October 1999 to June 2010, 101 patients underwent valve-sparing aortic root replacement using the David reimplantation technique. The definition of aortic root repair failure included the following: (1) intraoperative conversion to the Bentall procedure; (2) reoperation performed because of aortic regurgitation; and (3) aortic regurgitation equal to or greater than a moderate degree at the follow-up. Sixteen patients were considered to have repair failure. Three patients required intraoperative conversion to valve replacement, 3 required reoperation within 3 months, and another 8 required reoperation during postoperative follow-up. At initial surgery 5 patients had moderate to severe aortic regurgitation, 6 patients had acute aortic dissections, 3 had Marfan syndrome, 2 had status post Ross operations, 3 had bicuspid aortic valves, and 1 had aortitis. Five patients had undergone cusp repair, including Arantius plication in 3 and plication at the commissure in 2. The causes of early failure in 6 patients included cusp perforation (3), cusp prolapse (3), and severe hemolysis (1). The causes of late failure in 10 patients included cusp prolapse (4), commissure dehiscence (3), torn cusp (2), and cusp retraction (1). Patients had valve replacements at a mean of 23 ± 20.9 months after reimplantation and survived. Causes of early failure after valve-sparing root replacement included technical failure, cusp lesions, and a steep learning curve. Late failure was caused by aortic root wall degeneration due to gelatin-resorcin-formalin glue, cusp degeneration, or progression of cusp prolapse. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Learning curve for laparoendoscopic single-site surgery for an experienced laparoscopic surgeon

    OpenAIRE

    Pao-Ling Torng; Kuan-Hung Lin; Jing-Shiang Hwang; Hui-Shan Liu; I-Hui Chen; Chi-Ling Chen; Su-Cheng Huang

    2013-01-01

    Objectives: To assess the learning curve and safety of laparoendoscopic single-site (LESS) surgery of gynecological surgeries. Materials and methods: Sixty-three women who underwent LESS surgery by a single experienced laparoscopic surgeon from February 2011 to August 2011 were included. Commercialized single-incision laparoscopic surgery homemade ports were used, along with conventional straight instruments. The learning curve has been defined as the additional surgical time with respect ...

  17. Recurrent IVF failure and hereditary thrombophilia.

    Science.gov (United States)

    Safdarian, Leila; Najmi, Zahra; Aleyasin, Ashraf; Aghahosseini, Marzieh; Rashidi, Mandana; Asadollah, Sara

    2014-07-01

    The largest percentage of failed in vitro fertilization (IVF) cycles are due to lack of implantation. As hereditary thrombophilia can result in placentation failure, it may have a role in recurrent IVF failure. The aim of this case-control study was to determine whether hereditary thrombophilia is more prevalent in women with recurrent IVF failures. The case group comprised 96 infertile women with a history of recurrent IVF failure. The control group comprised 95 healthy women with proven fertility who had conceived spontaneously. All participants were assessed for the presence of inherited thrombophilias including: factor V Leiden, methylenetetrahydrofolate reductase (MTHFR) mutation, prothrombin mutation, homocysteine level, protein S and C deficiency, antithrombin III (AT-III) deficiency and plasminogen activator inhibitor-1 (PAI-1) mutation. Presence of thrombophilia was compared between groups. Having at least one thrombophilia was a risk factor for recurrent IVF failure (95% CI=1.74-5.70, OR=3.15, p=0.00). Mutation of factor V Leiden (95% CI=1.26-10.27, OR=3.06, p=0.01) and the homozygote form of the MTHFR mutation (95% CI=1.55-97.86, OR=12.33, p=0.05) were also risk factors for recurrent IVF failure. However, no significant differences were found for the other inherited thrombophilias. Inherited thrombophilia is more prevalent in women with recurrent IVF failure compared with healthy women. Having at least one thrombophilia, mutation of factor V Leiden, and the homozygote form of the MTHFR mutation were risk factors for recurrent IVF failure.
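
    The odds ratios with confidence intervals reported above follow the standard 2x2-table calculation with a Wald interval on the log scale. A minimal sketch (the counts are illustrative placeholders, not the study's data):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio from a 2x2 table (exposed cases a, exposed controls b,
        unexposed cases c, unexposed controls d) with a Wald 95% CI."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
        return or_, lo, hi

    # Illustrative counts only; the abstract reports OR=3.15, CI 1.74-5.70.
    print(odds_ratio_ci(48, 23, 48, 72))
    ```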

  18. Experimental analysis of the power curve sensitivity test series at ROSA-III

    International Nuclear Information System (INIS)

    Koizumi, Y.; Iriko, M.; Yonomoto, T.; Tasaka, K.

    1985-01-01

    The rig of safety assessment (ROSA)-III facility is a volumetrically scaled (1/424) boiling water reactor (BWR/6) system with an electrically heated core designed for integral LOCA and ECCS tests. Seven recirculation pump suction line break LOCA experiments were conducted at the ROSA-III facility in order to examine the effect of the initial stored heat of a fuel rod on the peak cladding temperature (PCT). The break size was changed from 200% to 5% in the test series and a failure of a high pressure core spray (HPCS) diesel generator was assumed. Three power curves which represented conservative, realistic and zero initial stored heat, respectively, were used. In a large break LOCA such as 200% or 50% breaks, the initial stored heat in a fuel rod has a large effect on the cladding surface temperature because core uncovery occurs before all the initial stored heat is released, whereas in a small break LOCA such as a 5% break little effect is observed because core uncovery occurs after the initial stored heat is released. The maximum PCTs for the conservative initial stored heat case was 925 K, obtained in the 50% break experiment, and that for the realistic initial stored heat case was 835 K, obtained in the 5% break experiment. (orig./HP)

  19. Use of structure-activity landscape index curves and curve integrals to evaluate the performance of multiple machine learning prediction models.

    Science.gov (United States)

    Ledonne, Norman C; Rissolo, Kevin; Bulgarelli, James; Tini, Leonard

    2011-02-07

    Standard approaches to assessing the performance of predictive models use common statistical measurements over the entire data set; these provide an overview of the average performance of the models across the entire predictive space, but give little insight into the applicability of the model across the prediction space. Guha and Van Drie recently proposed the use of structure-activity landscape index (SALI) curves via the SALI curve integral (SCI) as a means to map the predictive power of computational models within the predictive space. This approach evaluates model performance by assessing the accuracy of pairwise predictions, comparing compound pairs in a manner similar to that done by medicinal chemists. The SALI approach was used to evaluate the performance of continuous prediction models for MDR1-MDCK in vitro efflux potential. Efflux models were built with ADMET Predictor neural net, support vector machine, kernel partial least squares, and multiple linear regression engines, as well as SIMCA-P+ partial least squares, and random forest from Pipeline Pilot as implemented by AstraZeneca, using molecular descriptors from SimulationsPlus and AstraZeneca. The results indicate that the choice of training sets used to build the prediction models is of great importance to the resulting model quality, and that the SCI values calculated for these models were very similar to their Kendall τ values, leading to our suggestion of an approach to use this SALI/SCI paradigm to evaluate predictive model performance that will allow more informed decisions regarding model utility. The use of SALI graphs and curves provides an additional level of quality assessment for predictive models.
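
    The pairwise-prediction idea behind the SALI/SCI evaluation, and its closeness to Kendall τ, can be sketched as a toy ordering check (this is not the authors' SCI implementation, which weights pairs by the activity landscape):

    ```python
    from itertools import combinations

    def pairwise_ordering_accuracy(observed, predicted):
        """Fraction of compound pairs whose predicted rank order matches the
        observed order -- the pairwise-comparison idea behind SALI/SCI.
        Without ties this equals (Kendall tau + 1) / 2."""
        pairs = list(combinations(range(len(observed)), 2))
        correct = sum(
            1 for i, j in pairs
            if (observed[i] - observed[j]) * (predicted[i] - predicted[j]) > 0
        )
        return correct / len(pairs)

    obs = [1.0, 2.0, 3.0, 4.0]
    pred = [1.1, 1.9, 4.2, 3.8]   # one swapped pair out of six
    print(pairwise_ordering_accuracy(obs, pred))  # -> 5/6 = 0.8333...
    ```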

  20. When Does Maluma/Takete Fail? Two Key Failures and a Meta-Analysis Suggest That Phonology and Phonotactics Matter

    OpenAIRE

    Styles, Suzy J.; Gawne, Lauren

    2017-01-01

    Eighty-seven years ago, Köhler reported that the majority of students picked the same answer in a quiz: which novel word form ("maluma" or "takete") went best with which abstract line drawing (one curved, one angular). Others have consistently shown the effect in a variety of contexts, with only one reported failure by Rogers and Ross. In the spirit of transparency, we report our own failure in the same journal. In our study, speakers of Syuba, from the Himalaya in Nepal, do not show a prefer...

  1. Probabilistic Capacity Assessment of Lattice Transmission Towers under Strong Wind

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2015-10-01

    Serving as one key component of the most important lifeline infrastructure system, transmission towers are vulnerable to multiple natural hazards, including strong wind, and could pose severe threats to power system security, with possible blackouts under extreme weather conditions such as hurricanes, derechos, or winter storms. For the security and resiliency of the power system, it is important to ensure structural safety with enough capacity for all possible failure modes, such as structural stability. This study develops a probabilistic capacity assessment approach for transmission towers under strong wind loads. Due to the complicated structural details of lattice transmission towers, wind tunnel experiments are carried out to understand the complex interactions of wind and the lattice sections of the transmission tower; drag coefficients and the dynamic amplification factor for different panels of the tower are obtained. The wind profile is generated and the wind time histories are simulated as a summation of time-varying mean and fluctuating components. The capacity curve for the transmission tower is obtained from the incremental dynamic analysis (IDA) method. To consider the stochastic nature of the wind field, probabilistic capacity curves are generated by implementing the IDA analysis for different wind yaw angles and different randomly generated wind speed time histories. After building the limit state functions based on the maximum allowable drift-to-height ratio, the probabilities of failure are obtained based on the meteorological data at a given site. As transmission towers serve as the key nodes of the power network, the probabilistic capacity curves can be incorporated into performance-based design of the power transmission network.
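
    Once capacity and demand distributions are in hand, the limit-state bookkeeping reduces to counting samples with g = C - D < 0. The IDA step itself needs structural simulation; the sketch below only illustrates the final Monte Carlo failure-probability estimate, with hypothetical lognormal capacity (drift-ratio limit) and wind demand:

    ```python
    import math
    import random

    def failure_probability(capacity_samples, demand_samples):
        """Empirical P(failure) = P(demand > capacity) over paired Monte
        Carlo samples, i.e. the fraction with limit state g = C - D < 0."""
        fails = sum(d > c for c, d in zip(capacity_samples, demand_samples))
        return fails / len(capacity_samples)

    random.seed(0)
    # Hypothetical lognormal drift-ratio capacity and site wind demand.
    cap = [math.exp(random.gauss(math.log(0.02), 0.2)) for _ in range(10_000)]
    dem = [math.exp(random.gauss(math.log(0.012), 0.35)) for _ in range(10_000)]
    pf = failure_probability(cap, dem)
    print(f"P(failure) ~ {pf:.3f}")
    ```

    For these lognormal assumptions the answer is also available in closed form (about 0.10), which makes a convenient sanity check on the sampling.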

  2. Uncertainty analysis with statistically correlated failure data

    International Nuclear Information System (INIS)

    Modarres, M.; Dezfuli, H.; Roush, M.L.

    1987-01-01

    Likelihood of occurrence of the top event of a fault tree or sequences of an event tree is estimated from the failure probability of components that constitute the events of the fault/event tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. At present most fault tree calculations are based on uncorrelated component failure data. This chapter describes a methodology for assessing the probability intervals for the top event failure probability of fault trees or frequency of occurrence of event tree sequences when event failure data are statistically correlated. To estimate mean and variance of the top event, a second-order system moment method is presented through Taylor series expansion, which provides an alternative to the normally used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. Moment matching technique is used to obtain the probability distribution function of the top event through fitting the Johnson S_B distribution. The computer program, CORRELATE, was developed to perform the calculations necessary for the implementation of the method developed. (author)
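
    The Taylor-expansion moment idea can be sketched for the simplest case, an OR gate of two correlated basic events with P(top) = p1 + p2 - p1*p2: the covariance enters the mean through the cross-derivative and the variance through the usual propagation formula. This is a toy illustration under those assumptions, not the chapter's CORRELATE code or its Johnson S_B fitting:

    ```python
    def or_gate_moments(m1, v1, m2, v2, cov12):
        """Mean and variance of P(top) = p1 + p2 - p1*p2 for two correlated
        basic events: second-order Taylor mean (only the cross-derivative,
        -1, survives) and first-order variance with the covariance term."""
        g1 = 1.0 - m2          # dP/dp1 at the means
        g2 = 1.0 - m1          # dP/dp2 at the means
        mean = m1 + m2 - m1 * m2 - cov12
        var = g1**2 * v1 + g2**2 * v2 + 2.0 * g1 * g2 * cov12
        return mean, var

    # Two components at 10% failure probability with strongly correlated data.
    mean, var = or_gate_moments(0.1, 1e-4, 0.1, 1e-4, 0.8e-4)
    print(mean, var)
    ```

    With independent data (cov12 = 0) the variance terms simply add; positive correlation widens the top-event uncertainty, which is the effect the chapter's method is designed to capture.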

  3. Creep test observation of viscoelastic failure of edible fats

    Energy Technology Data Exchange (ETDEWEB)

    Vithanage, C R; Grimson, M J; Wills, P R [Department of Physics, University of Auckland, Private Bag 92019 (New Zealand); Smith, B G, E-mail: cvit002@aucklanduni.ac.nz [Food Science Programmes, Department of Chemistry, University of Auckland, Private Bag 92019 (New Zealand)

    2011-03-01

    A rheological creep test was used to investigate the viscoelastic failure of five edible fats. Butter, spreadable blend and spread were selected as edible fats because they belong to three different groups according to the Codex Alimentarius. Creep curves were analysed according to the Burger model. Results were fitted to a Weibull distribution representing the strain-dependent lifetime of putative fibres in the material. The Weibull shape and scale (lifetime) parameters were estimated for each substance. A comparison of the rheometric measurements of edible fats demonstrated a clear difference between the three different groups. Taken together the results indicate that butter has a lower threshold for mechanical failure than spreadable blend and spread. The observed behaviour of edible fats can be interpreted using a model in which there are two types of bonds between fat crystals; primary bonds that are strong and break irreversibly, and secondary bonds, which are weaker but break and reform reversibly.
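    The Weibull fitting step can be illustrated with a pure-Python maximum-likelihood estimate of the shape and scale parameters on synthetic lifetimes. The shape and scale values below are assumed for the demo, not the paper's estimates for any of the fats.

```python
import math
import random

random.seed(7)

# Synthetic "fibre lifetime" data from an assumed Weibull distribution.
true_shape, true_scale = 2.0, 5.0
data = [random.weibullvariate(true_scale, true_shape) for _ in range(5000)]

def weibull_mle(x, lo=0.05, hi=20.0, tol=1e-10):
    """Maximum-likelihood Weibull (shape, scale) via bisection on the
    profile score equation for the shape parameter k (monotone in k)."""
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def score(k):
        xk = [v ** k for v in x]
        return (sum(p * l for p, l in zip(xk, logs)) / sum(xk)
                - 1.0 / k - mean_log)

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

shape, scale = weibull_mle(data)
print(shape, scale)
```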

  4. Part 5: Receiver Operating Characteristic Curve and Area under the Curve

    Directory of Open Access Journals (Sweden)

    Saeed Safari

    2016-04-01

    Full Text Available Multiple diagnostic tools are used by emergency physicians every day. In addition, new tools are evaluated to obtain more accurate methods and reduce the time or cost of conventional ones. In the previous parts of this educational series, we described diagnostic performance characteristics of diagnostic tests, including sensitivity, specificity, positive and negative predictive values, and likelihood ratios. The receiver operating characteristic (ROC) curve is a graphical presentation of screening characteristics. The ROC curve is used to determine the best cutoff point and to compare two or more tests or observers by measuring the area under the curve (AUC). In this part of our educational series, we explain the ROC curve and two methods to determine the best cutoff value.

  5. Percentile curves for skinfold thickness for Canadian children and youth.

    Science.gov (United States)

    Kuhle, Stefan; Ashley-Martin, Jillian; Maguire, Bryan; Hamilton, David C

    2016-01-01

    Background. Skinfold thickness (SFT) measurements are a reliable and feasible method for assessing body fat in children, but their use and interpretation are hindered by the scarcity of reference values in representative populations of children. The objective of the present study was to develop age- and sex-specific percentile curves for five SFT measures (biceps, triceps, subscapular, suprailiac, medial calf) in a representative population of Canadian children and youth. Methods. We analyzed data from 3,938 children and adolescents between 6 and 19 years of age who participated in the Canadian Health Measures Survey cycles 1 (2007/2009) and 2 (2009/2011). Standardized procedures were used to measure SFT. Age- and sex-specific centiles for SFT were calculated using the GAMLSS method. Results. Percentile curves were materially different in absolute value and shape for boys and girls. Percentile curves in girls steadily increased with age, whereas percentile curves in boys were characterized by a pubertal-centered peak. Conclusions. The current study has presented for the first time percentile curves for five SFT measures in a representative sample of Canadian children and youth.

  6. Study on Flexible Pavement Failures in Soft Soil Tropical Regions

    Science.gov (United States)

    Jayakumar, M.; Chee Soon, Lee

    2015-04-01

    The road network system has experienced rapid growth over the ages, and it began developing in Malaysia during British colonization owing to its significant impact on transportation. Flexible pavement, the major road network in Malaysia, has been deteriorating through various types of distresses, which degrade the serviceability of the pavement structure. This paper discusses the pavement condition assessment carried out in Sarawak and Sabah, Malaysia, to develop design solutions for flexible pavement failures. Field tests were conducted to examine the subgrade strength of existing roads in Sarawak at various failure locations and to assess the impact of subgrade strength on pavement failures. Research outcomes from the field condition assessment and subgrade testing showed that the critical causes of pavement failures are inadequate design and maintenance of the drainage system and shoulder cross fall, along with inadequate pavement thickness, which may have resulted from assuming a conservative value of soil strength at optimum moisture content, whereas the existing and expected subgrade strengths at equilibrium moisture content are far lower. Our further research shows that stabilizing existing recycled asphalt and base materials for use as a sub-base, along with a bitumen-stabilized open-graded base in the pavement composition, may be a viable solution for pavement failures.

  7. Failure of Grass Covered Flood Defences with Roads on Top Due to Wave Overtopping: A Probabilistic Assessment Method

    Directory of Open Access Journals (Sweden)

    Juan P. Aguilar-López

    2018-06-01

    Full Text Available Hard structures, e.g., roads, are commonly found over flood defences, such as dikes, in order to ensure access and connectivity between flood-protected areas. Several climate change future scenario studies have concluded that flood defences will be required to withstand more severe storms than the ones used for their original design. Therefore, this paper presents a probabilistic methodology to assess the effect of a road on top of a dike: it gives the failure probability of the grass cover due to wave overtopping over a wide range of design storms. The methodology was developed by building two different dike configurations in computational fluid dynamics Navier–Stokes solution software; one with a road on top and one without a road. Both models were validated with experimental data collected from field-scale experiments. Later, both models were used to produce data sets for training simpler and faster emulators. These emulators were coupled to a simplified erosion model, which allowed testing storm scenarios and yielded statistical failure probabilities conditioned on local scouring. From these results it was estimated that the dike with a road has higher probabilities of failure (5 × 10−5 < Pf < 1 × 10−4) than a dike without a road (Pf < 1 × 10−6) if realistic grass quality spatial distributions were assumed. The coupled emulator-erosion model was able to yield realistic probabilities given all the uncertainties in the modelling process, and it seems to be a promising tool for quantifying grass cover erosion failure.

  8. Signature Curves Statistics of DNA Supercoils

    OpenAIRE

    Shakiban, Cheri; Lloyd, Peter

    2004-01-01

    In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...

  9. A brief measure of social media self-control failure

    NARCIS (Netherlands)

    Du, Jie; van Koningsbruggen, Guido M.; Kerkhof, Peter

    People often fail in controlling their social media use when it conflicts with other goals and obligations. To facilitate research on understanding social media self-control failures, we constructed a brief social media self-control failure (SMSCF)-scale to assess how often social media users give

  10. Method of construction spatial transition curve

    Directory of Open Access Journals (Sweden)

    S.V. Didanov

    2013-04-01

    Full Text Available Purpose. The movement of rail transport (speed of rolling stock, traffic safety, etc. is largely dependent on the quality of the track. In this case, a special role belongs to the transition curve, which ensures a smooth transition from a linear to a circular section of road. The article deals with modeling of a spatial transition curve based on a parabolic distribution of the curvature and torsion. This is a continuation of research conducted by the authors regarding the spatial modeling of curved contours. Methodology. The spatial transition curve is constructed using numerical methods for solving nonlinear integral equations, where the initial data are the coordinates of the starting and ending points of the future curve, and the inclination of the tangent and the deviation of the curve from the tangent plane at these points. The solutions of the system for the numerical method are the partial derivatives of the equations with respect to the unknown parameters of the law of change of torsion and the length of the transition curve. Findings. The parametric equations of the spatial transition curve are calculated by finding the unknown coefficients of the parabolic distribution of the curvature and torsion, as well as the spatial length of the transition curve. Originality. A method for constructing the spatial transition curve is devised, and, based on it, software for geometric modeling of spatial transition curves of railway track with specified deviations of the curve from the tangent plane. Practical value. The resulting curve can be applied in any sector of the economy where it is necessary to ensure a smooth transition from a linear to a circular section of a curved spatial bypass. Examples include the transition curve in the construction of a railway line, road, pipe, profile, the flat section of the working blades of a turbine or compressor, a ship, plane, car, etc.
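    As a much simpler planar analogue of the spatial construction described above, the classic transition curve (a clothoid, whose curvature grows linearly with arc length rather than parabolically, and with zero torsion) can be integrated numerically. The length and end radius below are arbitrary illustrative values.

```python
import math

def clothoid(length, end_radius, n=10_000):
    """Integrate x'(s) = cos(theta), y'(s) = sin(theta) by the midpoint
    rule, with heading theta(s) = s^2 / (2*R*L) so that curvature rises
    linearly from 0 to 1/R over arc length L."""
    x = y = 0.0
    ds = length / n
    pts = [(0.0, 0.0)]
    for i in range(n):
        s = (i + 0.5) * ds
        theta = s * s / (2.0 * end_radius * length)
        x += math.cos(theta) * ds
        y += math.sin(theta) * ds
        pts.append((x, y))
    return pts

pts = clothoid(length=100.0, end_radius=300.0)  # illustrative values
x_end, y_end = pts[-1]
print(x_end, y_end)
```

    The spatial method in the article additionally solves for torsion and curve length from boundary conditions; this sketch only shows the numerical-integration core of such constructions.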

  11. Photoelectric BV Light Curves of Algol and the Interpretations of the Light Curves

    Directory of Open Access Journals (Sweden)

    Ho-Il Kim

    1985-06-01

    Full Text Available Standardized B and V photoelectric light curves of Algol are made with the observations obtained during 1982-84 with the 40-cm and the 61-cm reflectors of Yonsei University Observatory. These light curves show asymmetry between the ascending and descending shoulders. The ascending shoulder is 0.02 mag brighter than the descending shoulder in the V light curve and 0.03 mag in the B light curve. These asymmetric light curves are interpreted as the result of an inhomogeneous energy distribution on the surface of one star of the eclipsing pair rather than as the result of a gaseous stream flowing from the K0IV to the B8V star. The 180-year periodicity, the so-called great inequality, is most likely the result, as proposed by Kim et al. (1983), of abrupt and discrete mass losses of the cooler component causing this orbital change. The amount of mass loss deduced from these discrete period changes turned out to be of the order of 10^(-6) - 10^(-5) M_solar.

  12. A Journey Between Two Curves

    Directory of Open Access Journals (Sweden)

    Sergey A. Cherkis

    2007-03-01

    Full Text Available A typical solution of an integrable system is described in terms of a holomorphic curve and a line bundle over it. The curve provides the action variables while the time evolution is a linear flow on the curve's Jacobian. Even though the system of Nahm equations is closely related to the Hitchin system, the curves appearing in these two cases have very different nature. The former can be described in terms of some classical scattering problem while the latter provides a solution to some Seiberg-Witten gauge theory. This note identifies the setup in which one can formulate the question of relating the two curves.

  13. Technical basis for the extension of ASME Code Case N-494 for assessment of austenitic piping

    International Nuclear Information System (INIS)

    Bloom, J.M.

    1995-01-01

    In 1990, the ASME Boiler and Pressure Vessel Code for Nuclear Components approved Code Case N-494 as an alternative procedure for evaluating flaws in Light Water Reactor (LWR) ferritic piping. The approach is an alternative to Appendix H of the ASME Code and allows the user to remove some unnecessary conservatism in the existing procedure by allowing the use of pipe-specific material properties. The Code Case is an implementation of the methodology of the Deformation Plasticity Failure Assessment Diagram (DPFAD). The key ingredient in the application of DPFAD is that the material stress-strain curve must be in the format of a simple power-law hardening stress-strain curve such as the Ramberg-Osgood (R-O) model. Ferritic materials can be accurately fit by the R-O model and, therefore, it was natural to use the DPFAD methodology for the assessment of LWR ferritic piping. An extension of Code Case N-494 to austenitic piping required a modification of the existing DPFAD methodology. The Code Case N-494 approach was revised using the DPFAD procedure in the same manner as in the development of the original N-494 approach for ferritic materials. A lower-bound stress-strain curve was used to generate a DPFAD curve for the geometry of a part-through-wall circumferential flaw in a cylinder under tension. Earlier work demonstrated that a cylinder under axial tension with a 50% flaw depth, 90 degrees in circumference, and a radius-to-thickness ratio of 10 produced a lower-bound FAD curve. Validation of the newly proposed Code Case procedure for austenitic piping was performed using actual pipe test data. Using the lower-bound DPFAD curve, pipe test results were conservatively predicted. The resultant development of the DPFAD curve for austenitic piping led to a revision of Code Case N-494 to include a procedure for assessment of flaws in austenitic piping.
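    The mechanics of a failure assessment diagram check can be sketched with the widely published R6 Option 1 curve. Note this generic curve is not the material-specific DPFAD curve of the Code Case, and the assessment points and cutoff below are invented for illustration.

```python
import math

def fad_curve(lr):
    """Kr limit as a function of load ratio Lr (R6 Option 1 form):
    f(Lr) = (1 - 0.14*Lr^2) * (0.3 + 0.7*exp(-0.65*Lr^6))."""
    return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

def acceptable(kr, lr, lr_max=1.25):
    """A flaw is acceptable if its assessment point (Lr, Kr) lies inside
    the FAD envelope; lr_max (plastic-collapse cutoff) is assumed here."""
    return lr <= lr_max and kr <= fad_curve(lr)

# Assessment point: Kr = K_I / K_mat (fracture ratio),
#                   Lr = applied load / limit load (collapse ratio).
print(acceptable(kr=0.5, lr=0.8))   # inside the envelope
print(acceptable(kr=0.95, lr=1.0))  # outside the envelope
```

    A DPFAD analysis replaces this generic envelope with a curve derived from the pipe-specific Ramberg-Osgood fit, which is exactly the flexibility the Code Case provides.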

  14. An Expectation Maximization Algorithm to Model Failure Times by Continuous-Time Markov Chains

    Directory of Open Access Journals (Sweden)

    Qihong Duan

    2010-01-01

    Full Text Available In many applications, the failure rate function may present a bathtub-shaped curve. In this paper, an expectation maximization algorithm is proposed to construct a suitable continuous-time Markov chain which models the failure time data by the first time of reaching the absorbing state. Assume that a system is described by methods of supplementary variables, the device of stage, and so on. Given a data set, the maximum likelihood estimators of the initial distribution and the infinitesimal transition rates of the Markov chain can be obtained by our novel algorithm. Suppose that there are m transient states in the system and that there are n failure time data. The devised algorithm only needs to compute the exponential of m×m upper triangular matrices O(nm²) times in each iteration. Finally, the algorithm is applied to two real data sets, which indicates the practicality and efficiency of our algorithm.
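    The failure-time law underlying this model is phase-type: with m transient states, an upper-triangular sub-generator T, and initial distribution α, the failure-time CDF is F(t) = 1 − α·exp(Tt)·1. A minimal sketch follows; the two-state chain and its rates are invented for illustration, and a plain Taylor-series matrix exponential stands in for the paper's triangular-matrix computation.

```python
def expm(a, terms=60):
    """Taylor-series matrix exponential (adequate for small, well-scaled
    matrices like the upper triangular T*t used here)."""
    m = len(a)
    result = [[1.0 if i == j else 0.0 for j in range(m)] for i in range(m)]
    power = [row[:] for row in result]
    fact = 1.0
    for k in range(1, terms):
        power = [[sum(power[i][l] * a[l][j] for l in range(m))
                  for j in range(m)] for i in range(m)]
        fact *= k
        for i in range(m):
            for j in range(m):
                result[i][j] += power[i][j] / fact
    return result

def failure_cdf(alpha, T, t):
    """F(t) = 1 - alpha * expm(T*t) * 1 for the absorbing CTMC."""
    m = len(T)
    et = expm([[T[i][j] * t for j in range(m)] for i in range(m)])
    survive = sum(alpha[i] * et[i][j] for i in range(m) for j in range(m))
    return 1.0 - survive

# Two transient states: 1 -> 2 at rate 1.0, 2 -> absorbed at rate 0.5.
alpha = [1.0, 0.0]
T = [[-1.0, 1.0],
     [0.0, -0.5]]
print(failure_cdf(alpha, T, 2.0))
```

    The EM algorithm of the paper iterates estimates of α and T against observed failure times, evaluating exactly this kind of matrix exponential in each step.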

  15. An appraisal of the learning curve in robotic general surgery.

    Science.gov (United States)

    Pernar, Luise I M; Robertson, Faith C; Tavakkoli, Ali; Sheu, Eric G; Brooks, David C; Smink, Douglas S

    2017-11-01

    Robotic-assisted surgery is used with increasing frequency in general surgery for a variety of applications. In spite of this increase in usage, the learning curve is not yet defined. This study reviews the literature on the learning curve in robotic general surgery to inform adopters of the technology. PubMed and EMBASE searches yielded 3690 abstracts published between July 1986 and March 2016. The abstracts were evaluated based on the following inclusion criteria: written in English, reporting original work, focus on general surgery operations, and with explicit statistical methods. Twenty-six full-length articles were included in final analysis. The articles described the learning curves in colorectal (9 articles, 35%), foregut/bariatric (8, 31%), biliary (5, 19%), and solid organ (4, 15%) surgery. Eighteen of 26 (69%) articles report single-surgeon experiences. Time was used as a measure of the learning curve in all studies (100%); outcomes were examined in 10 (38%). In 12 studies (46%), the authors identified three phases of the learning curve. Numbers of cases needed to achieve plateau performance were wide-ranging but overlapping for different kinds of operations: 19-128 cases for colorectal, 8-95 for foregut/bariatric, 20-48 for biliary, and 10-80 for solid organ surgery. Although robotic surgery is increasingly utilized in general surgery, the literature provides few guidelines on the learning curve for adoption. In this heterogeneous sample of reviewed articles, the number of cases needed to achieve plateau performance varies by case type and the learning curve may have multiple phases as surgeons add more complex cases to their case mix with growing experience. Time is the most common determinant for the learning curve. The literature lacks a uniform assessment of outcomes and complications, which would arguably reflect expertise in a more meaningful way than time to perform the operation alone.

  16. Bond yield curve construction

    Directory of Open Access Journals (Sweden)

    Kožul Nataša

    2014-01-01

    Full Text Available In the broadest sense, the yield curve indicates the market's view of the evolution of interest rates over time. However, given that the cost of borrowing is closely linked to creditworthiness (ability to repay), different yield curves will apply to different currencies, market sectors, or even individual issuers. As government borrowing is indicative of interest rate levels available to other market players in a particular country, and considering that bond issuance still remains the dominant form of sovereign debt, this paper describes yield curve construction using bonds. The relationship between zero-coupon yield, par yield and yield to maturity is given, and their usage in determining curve discount factors is described. Their usage in deriving forward rates and pricing related derivative instruments is also discussed.
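    The par-yield-to-discount-factor step can be sketched by bootstrapping: an n-year par bond satisfies c·(df_1 + … + df_n) + df_n = 1, so each df_n follows from the earlier ones. The par rates below are invented, and annual compounding with annual coupons is assumed.

```python
par_yields = {1: 0.020, 2: 0.025, 3: 0.030}   # assumed annual par curve

def bootstrap(par):
    """Solve c*sum(df_1..df_n) + df_n = 1 maturity by maturity."""
    dfs = {}
    running = 0.0   # sum of discount factors already solved
    for n in sorted(par):
        c = par[n]
        dfs[n] = (1.0 - c * running) / (1.0 + c)
        running += dfs[n]
    return dfs

dfs = bootstrap(par_yields)

# Annually compounded zero rates and the implied 1y->2y forward rate.
zero_rates = {n: dfs[n] ** (-1.0 / n) - 1.0 for n in dfs}
forward_1y2y = dfs[1] / dfs[2] - 1.0
print(dfs, zero_rates, forward_1y2y)
```

    For an upward-sloping par curve the zero rates sit slightly above the par yields, and the forward rates sit above both, which is the relationship the paper walks through.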

  17. Failure Modes and Effects Analysis (FMEA): A Bibliography

    Science.gov (United States)

    2000-01-01

    Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.

  18. Management of Mechanical Ventilation in Decompensated Heart Failure

    Directory of Open Access Journals (Sweden)

    Brooks T. Kuhn

    2016-12-01

    Full Text Available Mechanical ventilation (MV is a life-saving intervention for respiratory failure, including decompensated congestive heart failure. MV can reduce ventricular preload and afterload, decrease extra-vascular lung water, and decrease the work of breathing in heart failure. The advantages of positive pressure ventilation must be balanced with potential harm from MV: volutrauma, hyperoxia-induced injury, and difficulty assessing readiness for liberation. In this review, we will focus on cardiac, pulmonary, and broader effects of MV on patients with decompensated HF, focusing on practical considerations for management and supporting evidence.

  19. Methodology for failure assessment of SMART SG tube with once-through helical-coiled type

    International Nuclear Information System (INIS)

    Kim, Young Jin; Choi, Shin Beom; Cho, Doo Ho; Chang, Yoon Suk

    2010-09-01

    In this research project, the existing integrity evaluation method for SMART steam generator tubes with crack-like flaws was reviewed to determine a subject analysis model and investigate the possibility of failure under crack closure behavior. Furthermore, a failure pressure estimation was proposed for SMART steam generator tubes containing wear-type defects. For each subject, the following issues are addressed: 1. Determination of a subject analysis model for SMART SG tubes containing crack-like flaws; 2. Applicability review of the existing integrity evaluation method and investigation of failure possibility for SMART SG tubes containing crack-like flaws; 3. Development of a failure pressure estimation model for SMART SG tubes with wear-type defects. It is anticipated that if the technologies developed in this study are applied, structural integrity can be estimated accurately.

  20. The Spectrum of Renal Allograft Failure.

    Directory of Open Access Journals (Sweden)

    Sourabh Chand

    Full Text Available Causes of "true" late kidney allograft failure remain unclear, as study selection bias and limited follow-up risk incomplete representation of the spectrum. We evaluated all unselected graft failures from 2008-2014 (n = 171; 0-36 years post-transplantation) by contemporary classification of indication biopsies "proximate" to failure, DSA assessment, and clinical and biochemical data. The spectrum of graft failure changed markedly depending on the timing of allograft failure. Failures within the first year were most commonly attributed to technical failure and acute rejection (with T-cell mediated rejection [TCMR] dominating antibody-mediated rejection [ABMR]). Failures beyond a year were increasingly dominated by ABMR and 'interstitial fibrosis with tubular atrophy' without rejection, infection or recurrent disease ("IFTA"). Cases of IFTA associated with inflammation in non-scarred areas (compared with no inflammation or inflammation solely within scarred regions) were more commonly associated with episodes of prior rejection, late rejection and nonadherence, pointing to an alloimmune aetiology. Nonadherence and late rejection were common in ABMR and TCMR, particularly Acute Active ABMR. Acute Active ABMR and nonadherence were associated with younger age, faster functional decline, and less hyalinosis on biopsy. Chronic and Chronic Active ABMR were more commonly associated with Class II DSA. C1q-binding DSA, detected in 33% of ABMR episodes, were associated with shorter time to graft failure. Most non-biopsied patients were DSA-negative (16/21; 76.1%). Finally, twelve losses to recurrent disease were seen (16%). These data from an unselected population identify IFTA alongside ABMR as a very important cause of true late graft failure, with nonadherence-associated TCMR as a phenomenon in some patients. They highlight clinical and immunological characteristics of ABMR subgroups, and should inform clinical practice and individualised patient care.

  1. Curve Digitizer – A software for multiple curves digitizing

    Directory of Open Access Journals (Sweden)

    Florentin ŞPERLEA

    2010-06-01

    Full Text Available The Curve Digitizer is software that extracts data from an image file representing a graphic and returns them as pairs of numbers which can then be used for further analysis and applications. Numbers can be read on a computer screen, stored in files, or copied on paper. The final result is a data set that can be used with other tools such as MS Excel. Curve Digitizer provides a useful tool for any researcher or engineer interested in quantifying the data displayed graphically. The image file can be obtained by scanning a document.

  2. TU-AB-BRD-02: Failure Modes and Effects Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huq, M. [University of Pittsburgh Medical Center (United States)

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process Learn how to
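    The ranking step common to FMEA can be sketched with a toy risk priority number calculation (RPN = severity × occurrence × detectability, each scored on an ordinal scale, which is the style of scoring TG-100 describes). The failure modes and scores below are invented examples, not TG-100 values.

```python
# Each failure mode: (description, severity S, occurrence O, detectability D),
# all scored 1-10 (invented illustrative entries).
failure_modes = [
    ("wrong patient plan loaded",   9, 2, 3),
    ("MLC leaf calibration drift",  6, 4, 4),
    ("dose prescription typo",      8, 3, 2),
]

# Rank modes by RPN = S * O * D, highest risk first.
ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes),
                reverse=True)
for rpn, name in ranked:
    print(rpn, name)
```

    The ranking drives where quality-management controls are placed first; note that a mode with moderate severity can still top the list if it occurs often and is hard to detect.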

  3. TU-AB-BRD-02: Failure Modes and Effects Analysis

    International Nuclear Information System (INIS)

    Huq, M.

    2015-01-01

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process Learn how to

  4. StAR: a simple tool for the statistical comparison of ROC curves

    Directory of Open Access Journals (Sweden)

    Melo Francisco

    2008-06-01

    Full Text Available Abstract Background As in many different areas of science and technology, most important problems in bioinformatics rely on the proper development and assessment of binary classifiers. A generalized assessment of the performance of binary classifiers is typically carried out through the analysis of their receiver operating characteristic (ROC) curves. The area under the ROC curve (AUC) constitutes a popular indicator of the performance of a binary classifier. However, the assessment of the statistical significance of the difference between any two classifiers based on this measure is not a straightforward task, since not many freely available tools exist. Most existing software is either not free, difficult to use or not easy to automate when a comparative assessment of the performance of many binary classifiers is intended. This constitutes the typical scenario for the optimization of parameters when developing new classifiers and also for their performance validation through the comparison to previous art. Results In this work we describe and release new software to assess the statistical significance of the observed difference between the AUCs of any two classifiers for a common task estimated from paired data or unpaired balanced data. The software is able to perform a pairwise comparison of many classifiers in a single run, without requiring any expert or advanced knowledge to use it. The software relies on a non-parametric test for the difference of the AUCs that accounts for the correlation of the ROC curves. The results are displayed graphically and can be easily customized by the user. A human-readable report is generated and the complete data resulting from the analysis are also available for download, which can be used for further analysis with other software. The software is released as a web server that can be used on any client platform and also as a standalone application for the Linux operating system.
Conclusion A new software for

  5. Fuel failure detection and location methods in CAGRs

    International Nuclear Information System (INIS)

    Harris, A.M.

    1982-06-01

    The release of fission products from AGR fuel failures and the way in which the signals from such failures must be detected against the background signal from uranium contamination of the fuel is considered. Theoretical assessments of failure detection are used to show the limitations of the existing Electrostatic Wire Precipitator Burst Can Detection system (BCD) and how its operating parameters can be optimised. Two promising alternative methods, the 'split count' technique and the use of iodine measurements, are described. The results of a detailed study of the mechanical and electronic performance of the present BCD trolleys are given. The limited experience of detection and location of two fuel failures in CAGR using conventional and alternative methods is reviewed. The larger failure was detected and located using the conventional BCD equipment with a high confidence level. It is shown that smaller failures may not be easy to detect and locate using the current BCD equipment, and the second smaller failure probably remained in the reactor for about a year before it was discharged. The split count technique used with modified BCD equipment was able to detect the smaller failure after careful inspection of the data. (author)

  6. Single shell tank sluicing history and failure frequency

    International Nuclear Information System (INIS)

    HERTZEL, J.S.

    1998-01-01

    This document assesses the potential for failure of the single-shell tanks (SSTs) that are presumably sound and helps to establish the retrieval priorities for these and the assumed leakers. Furthermore, this report examines probabilities of SST failure as a function of age and operational history, and provides a simple statistical summary of historical leak volumes, leak rates, and corrosion factor

  7. Estimating reaction rate constants: comparison between traditional curve fitting and curve resolution

    NARCIS (Netherlands)

    Bijlsma, S.; Boelens, H. F. M.; Hoefsloot, H. C. J.; Smilde, A. K.

    2000-01-01

A traditional curve fitting (TCF) algorithm is compared with a classical curve resolution (CCR) approach for estimating reaction rate constants from spectral data obtained over the course of a chemical reaction. In the TCF algorithm, reaction rate constants are estimated from the absorbance-versus-time data.
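
As a hedged illustration of the traditional curve-fitting idea (not the paper's actual TCF or CCR algorithms), the sketch below recovers a first-order rate constant from synthetic absorbance-versus-time data by linear least squares on ln A; all names and numbers are hypothetical.

```python
import math

def estimate_rate_constant(times, absorbances):
    """Estimate a first-order rate constant k from absorbance-vs-time data
    by linear least squares on ln(A) = ln(A0) - k*t.  This is a simple
    'traditional curve fitting' example, not the paper's algorithm."""
    ys = [math.log(a) for a in absorbances]
    n = len(times)
    xbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(times, ys))
             / sum((x - xbar) ** 2 for x in times))
    return -slope  # slope of ln(A) vs t is -k

# Synthetic noiseless data: A(t) = A0 * exp(-k*t) with A0 = 1.0, k = 0.30
k_true = 0.30
times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
absorbances = [1.0 * math.exp(-k_true * t) for t in times]
k_est = estimate_rate_constant(times, absorbances)
```

With noiseless data the regression recovers k exactly; with real spectra, nonlinear fitting of the full kinetic model (as in TCF/CCR) is preferred.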

  8. A catalog of special plane curves

    CERN Document Server

    Lawrence, J Dennis

    2014-01-01

Among the largest, finest collections available, illustrated not only once for each curve but also for various values of any parameters present. Covers general properties of curves and types of derived curves. Curves illustrated by a CalComp digital incremental plotter. 12 illustrations.

  9. Application of environmentally-corrected fatigue curves to nuclear power plant components

    International Nuclear Information System (INIS)

    Ware, A.G.; Morton, D.K.; Nitzel, M.E.

    1996-01-01

    Recent test data indicate that the effects of the light water reactor (LWR) environment could significantly reduce the fatigue resistance of materials used in the reactor coolant pressure boundary components of operating nuclear power plants. Argonne National Laboratory has developed interim fatigue curves based on test data simulating LWR conditions, and published them in NUREG/CR-5999. In order to assess the significance of these interim fatigue curves, fatigue evaluations of a sample of the components in the reactor coolant pressure boundary of LWRs were performed. The sample consists of components from facilities designed by each of the four US nuclear steam supply system vendors. For each facility, six locations were studied including two locations on the reactor pressure vessel. In addition, there are older vintage plants where components of the reactor coolant pressure boundary were designed to codes that did not require an explicit fatigue analysis of the components. In order to assess the fatigue resistance of the older vintage plants, an evaluation was also conducted on selected components of three of these plants. This paper discusses the insights gained from the application of the interim fatigue curves to components of seven operating nuclear power plants

  10. Diagnostic tests’ decision-making rules based upon analysis of ROC-curves

    Directory of Open Access Journals (Sweden)

    Л. В. Батюк

    2015-10-01

In this paper we propose a model that substantiates diagnostic decision making based on the analysis of receiver operating characteristic (ROC) curves and predicts optimal values of diagnostic indicators of biomedical information. To assess the quality of test-result prediction, the standard criteria of model sensitivity and specificity were used. Values of these criteria were calculated for four cases: when the sensitivity of the test was several times greater than its specificity, when the number of correct diagnoses was maximal, when the sensitivity of the test was equal to its specificity, and when the specificity of the test was several times greater than its sensitivity. To assess the significance of the factor characteristics and to compare the prognostic characteristics of the models, we used mathematical modelling and plotted ROC curves. The optimal value of the diagnostic indicator was found to be achieved when the sensitivity of the test equals its specificity. The model was adapted to handle the case when the sensitivity of the test is greater than its specificity.
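
A minimal sketch of the operating point the abstract identifies as optimal (the ROC threshold where sensitivity equals specificity), using hypothetical test scores; this is not the authors' model.

```python
def sens_spec(threshold, pos_scores, neg_scores):
    """Sensitivity and specificity for the rule 'score >= threshold -> positive'."""
    sens = sum(s >= threshold for s in pos_scores) / len(pos_scores)
    spec = sum(s < threshold for s in neg_scores) / len(neg_scores)
    return sens, spec

def equal_error_threshold(pos_scores, neg_scores):
    """Threshold on the empirical ROC curve where sensitivity is closest
    to specificity (the equal-error operating point)."""
    candidates = sorted(set(pos_scores) | set(neg_scores))
    return min(candidates,
               key=lambda t: abs(sens_spec(t, pos_scores, neg_scores)[0]
                                 - sens_spec(t, pos_scores, neg_scores)[1]))

# Hypothetical diagnostic scores: diseased patients tend to score higher
pos = [0.9, 0.8, 0.7, 0.6, 0.4]
neg = [0.5, 0.4, 0.3, 0.2, 0.1]
t = equal_error_threshold(pos, neg)
sens, spec = sens_spec(t, pos, neg)
```

At the chosen threshold the two error rates balance, which is the condition the paper derives for the optimal diagnostic indicator.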

  11. Intersection numbers of spectral curves

    CERN Document Server

    Eynard, B.

    2011-01-01

We compute the symplectic invariants of an arbitrary spectral curve with only one branch point in terms of integrals of characteristic classes in the moduli space of curves. Our formula associates to any spectral curve a characteristic class, which is determined by the Laplace transform of the spectral curve. This is a hint to the key role of the Laplace transform in mirror symmetry. When the spectral curve is y=\sqrt{x}, the formula gives the Kontsevich--Witten intersection numbers; when the spectral curve is chosen to be the Lambert function \exp{x}=y\exp{-y}, the formula gives the ELSV formula for Hurwitz numbers; and when one chooses the mirror of C^3 with framing f, i.e. \exp{-x}=\exp{-yf}(1-\exp{-y}), the formula gives the Marino-Vafa formula, i.e. the generating function of Gromov-Witten invariants of C^3. In some sense this formula generalizes the ELSV, Marino-Vafa, and Mumford formulas.

  12. Evaluation of the behavior of waterlogged fuel rod failures in LWRs

    International Nuclear Information System (INIS)

    Siegel, B.

    1977-11-01

    A summary of the available information on waterlogged fuel rod failures is presented. The information includes experimental results from waterlogging tests in research reactors, observations of waterlogging failures in commercial reactors, and reactor vendor assessments. It is concluded that (a) operating restrictions to reduce pellet/cladding interactions also reduce the potential for waterlogging failures during transients, (b) tests to simulate accident conditions produced the worst waterlogging failures, and (c) there is no apparent threat from waterlogging failures to the overall coolability of the core or to safe reactor shutdown

  13. The failure of earthquake failure models

    Science.gov (United States)

    Gomberg, J.

    2001-01-01

In this study I show that simple heuristic models and numerical calculations suggest that an entire class of commonly invoked models of earthquake failure processes cannot explain triggering of seismicity by transient or "dynamic" stress changes, such as stress changes associated with passing seismic waves. The models of this class have the common feature that the physical property characterizing failure increases at an accelerating rate when a fault is loaded (stressed) at a constant rate. Examples include models that invoke rate-state friction or subcritical crack growth, in which the properties characterizing failure are slip or crack length, respectively. Failure occurs when the rate at which these grow accelerates to values exceeding some critical threshold. These accelerating failure models do not predict the finite durations of dynamically triggered earthquake sequences (e.g., at aftershock or remote distances). Some of the failure models belonging to this class have been used to explain static stress triggering of aftershocks. This may imply that the physical processes underlying dynamic triggering differ, or that currently applied models of static triggering require modification. If the former is the case, we might appeal to physical mechanisms relying on oscillatory deformations, such as compaction of saturated fault gouge leading to pore pressure increase, or cyclic fatigue. However, if dynamic and static triggering mechanisms differ, one still needs to ask why static triggering models that neglect these dynamic mechanisms appear to explain many observations. If the static and dynamic triggering mechanisms are the same, perhaps assumptions about accelerating failure and/or that triggering advances the failure times of a population of inevitable earthquakes are incorrect.
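
The defining feature of this model class (a failure property that grows at an accelerating rate under constant loading, reaching a threshold in finite time) can be sketched numerically. The growth law and constants below are schematic, not any specific published friction or crack-growth model.

```python
def time_to_failure(l0, threshold, rate, exponent, dt=1e-4, t_max=20.0):
    """Euler-integrate dL/dt = rate * L**exponent until L crosses threshold.
    For exponent > 1 the growth self-accelerates, so failure arrives in
    finite time even though the load is applied at a constant rate."""
    L, t = l0, 0.0
    while L < threshold:
        if t >= t_max:
            return None  # no failure within the time window
        L += rate * L ** exponent * dt
        t += dt
    return t

# Schematic case: dL/dt = L**2 starting from L = 0.1.  The exact solution
# L(t) = 1 / (10 - t) reaches the threshold L = 1 at t = 9.
t_fail = time_to_failure(l0=0.1, threshold=1.0, rate=1.0, exponent=2.0)
```

Because almost all of the growth happens just before the threshold is crossed, a brief transient stress change early in the cycle barely advances the failure time, which is the intuition behind the paper's argument that such models cannot produce dynamically triggered sequences of finite duration.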