WorldWideScience

Sample records for failure assessment curve

  1. The elastic-plastic failure assessment diagram of surface cracked structure

    International Nuclear Information System (INIS)

    Ning, J.; Gao, Q.

    1987-01-01

    The simplified NLSM is able to calculate the EPFM parameters and the failure assessment curve for a surface-cracked structure correctly and conveniently. The elastic-plastic failure assessment curve of a surface crack depends on the crack geometry, the loading form and the material deformation behaviour. It is therefore necessary to construct the EPFM failure assessment curve of the surface crack for the failure assessment of surface-cracked structures. (orig./HP)

  2. Failure assessment diagrams for circular hollow section X- and K-joints

    International Nuclear Information System (INIS)

    Qian, Xudong

    2013-01-01

    This paper reports the failure assessment curves for semi-elliptical surface cracks located at hot-spot positions in circular hollow section X- and K-joints. The failure assessment curves are derived from the square root of the ratio between the linear-elastic and the elastic-plastic energy release rates, computed from the domain-integral approach. This study examines both the material and the geometric dependence of the failure assessment curves. The area reduction factor, used in defining the strength of the cracked joints, has a significant effect on the computed failure assessment curve. The failure assessment curves indicate negligible variations with respect to the crack-front locations and the material yield strength. The crack depth ratio exerts a stronger effect on the computed failure assessment curve than does the crack aspect ratio. This study proposes a parametric expression for the failure assessment curves based on the geometric parameters for surface cracks in circular hollow section X- and K-joints. -- Highlights: ► This study proposes geometry-dependent expressions of FADs for tubular joints. ► We examine the geometric and material dependence of the FADs for X- and K-joints. ► The proposed FAD is independent of yield strength and is a lower-bound for typical hardening
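
    As a rough illustration of the quantity this abstract describes, the sketch below computes failure assessment curve ordinates as the square root of the ratio between linear-elastic and elastic-plastic energy release rates. The numerical values are hypothetical stand-ins for finite-element output, not the paper's joint-specific results.

    ```python
    import numpy as np

    # Hypothetical paired results from linear-elastic and elastic-plastic FE runs
    # (domain-integral energy release rates G for increasing load levels).
    load_ratio = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])    # load / plastic-collapse load
    G_elastic = np.array([0.8, 3.2, 7.2, 12.8, 20.0, 28.8])  # kJ/m^2, scales with load^2
    G_elastic_plastic = np.array([0.8, 3.3, 7.9, 16.0, 33.0, 75.0])  # kJ/m^2, grows faster once yielding spreads

    # Failure assessment curve ordinate: Kr = sqrt(G_elastic / G_elastic_plastic)
    Kr = np.sqrt(G_elastic / G_elastic_plastic)

    for lr, kr in zip(load_ratio, Kr):
        print(f"Lr = {lr:.1f}  ->  Kr = {kr:.3f}")
    ```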

  3. Experience Curves: A Tool for Energy Policy Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Neij, Lena; Helby, Peter [Lund Univ. (Sweden). Environmental and Energy Systems Studies; Dannemand Andersen, Per; Morthorst, Poul Erik [Risø National Laboratory, Roskilde (Denmark); Durstewitz, Michael; Hoppe-Kilpper, Martin [Inst. fuer Solare Energieversorgungstechnik e.V., Kassel (DE); and others

    2003-07-01

    The objective of the project, Experience curves: a tool for energy policy assessment (EXTOOL), was to analyse the experience curve as a tool for the assessment of energy policy measures. This is of special interest, since the use of experience curves for the assessment of energy policy measures requires the development of the established experience curve methodology. This development raises several questions, which have been addressed and analysed in this project. The analysis is based on case studies of wind power, an area with considerable experience in technology development, deployment and policy measures. Therefore, a case study based on wind power provides a good opportunity to study the usefulness of experience curves as a tool for the assessment of energy policy measures. However, the results are discussed in terms of using experience curves for the assessment of any energy technology. The project shows that experience curves can be used to assess the effect of combined policy measures in terms of cost reductions. Moreover, the results of the project show that experience curves can be used to analyse international 'learning systems', i.e. cost reductions brought about by the development of wind power and policy measures used in other countries. Nevertheless, the use of experience curves for the assessment of policy programmes has several limitations. First, the analysis and assessment of policy programmes cannot be achieved unless relevant experience curves based on good data can be developed. The authors are of the opinion that only studies that provide evidence of the validity, reliability and relevance of experience curves should be taken into account in policy making. Second, experience curves provide an aggregated picture of the situation, and more detailed analysis of the various sources of cost reduction, and of cost reductions resulting from individual policy measures, requires additional data and analysis tools. Third, we do not recommend the use of

  4. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
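
    A minimal sketch of the sample-reuse idea (not the authors' code): with one inspection, the POF can be written as the sample mean of I(failure) x (1 - POD(a_insp; theta)), so its derivative with respect to a POD parameter theta reuses the same samples with the weight replaced by the derivative of the POD. The lognormal POD form and the crack-growth model below are purely illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 100_000

    # Illustrative random inputs: initial crack size (mm) and growth rate (mm/cycle)
    a0 = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=n)
    da = rng.lognormal(mean=np.log(2e-4), sigma=0.4, size=n)

    N_insp, N_life, a_crit = 20_000, 60_000, 10.0   # inspection cycle, service life, critical size
    a_insp = a0 + da * N_insp                        # crack size at inspection
    a_end = a0 + da * N_life                         # crack size at end of life

    mu, sigma = np.log(2.0), 0.5                     # assumed lognormal POD parameters
    pod = norm.cdf((np.log(a_insp) - mu) / sigma)    # probability the inspection finds the crack

    fail = (a_end > a_crit).astype(float)            # indicator of failure without repair
    pof = np.mean(fail * (1.0 - pod))                # failure requires the crack to be missed

    # Sensitivity: d(POF)/d(mu) reuses the same samples, only the POD weight is differentiated
    dpod_dmu = -norm.pdf((np.log(a_insp) - mu) / sigma) / sigma
    dpof_dmu = np.mean(fail * (-dpod_dmu))

    print(f"POF = {pof:.4e},  dPOF/dmu = {dpof_dmu:.4e}")
    ```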

  5. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  6. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ►The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  7. Analysis of leak and break behavior in a failure assessment diagram for carbon steel pipes

    International Nuclear Information System (INIS)

    Kanno, Satoshi; Hasegawa, Kunio; Shimizu, Tasuku; Saitoh, Takashi; Gotoh, Nobuho

    1992-01-01

    The leak and break behavior of a cracked coolant pipe subjected to an internal pressure and a bending moment was analyzed with a failure assessment diagram using the R6 approach. This paper examines the conditions for detectable coolant leakage without breakage. A leakage assessment curve, a locus of assessment points for detectable coolant leakage, was defined in the failure assessment diagram. The region between the leak assessment and failure assessment curves satisfies the condition of detectable leakage without breakage. In this region, a crack can be safely inspected by a coolant leak detector. (orig.)
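
    For orientation only, the sketch below evaluates an R6 Option 1-style failure assessment curve and checks whether an assessment point for a through-wall crack sized for detectable leakage lies inside the acceptable region. The inputs are generic illustrations, not the paper's pipe-specific solutions.

    ```python
    import numpy as np

    def r6_option1(Lr):
        """R6 Option 1 failure assessment curve (generic, material-independent form)."""
        return (1.0 - 0.14 * Lr**2) * (0.3 + 0.7 * np.exp(-0.65 * Lr**6))

    def acceptable(Kr, Lr, Lr_max=1.2):
        """True if the assessment point lies inside the failure assessment curve."""
        return (Lr <= Lr_max) and (Kr <= r6_option1(Lr))

    # Illustrative assessment point for a through-wall crack just long enough to give
    # a detectable leak; the fracture and limit-load quantities are assumed values.
    K_applied, K_mat = 55.0, 180.0      # MPa*sqrt(m): applied SIF and fracture toughness
    sigma_ref, sigma_y = 120.0, 250.0   # MPa: reference stress and yield strength

    Kr, Lr = K_applied / K_mat, sigma_ref / sigma_y
    print(f"Kr = {Kr:.2f}, Lr = {Lr:.2f}, inside FAD: {acceptable(Kr, Lr)}")
    ```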

  8. The Component And System Reliability Analysis Of Multipurpose Reactor G.A. Siwabessy Based On The Failure Rate Curve

    International Nuclear Information System (INIS)

    Sriyono; Ismu Wahyono, Puradwi; Mulyanto, Dwijo; Kusmono, Siamet

    2001-01-01

    The main components of the Multipurpose Reactor G.A. Siwabessy have been analyzed by means of their failure rate curves. The components analyzed are the pumps of the Fuel Storage Pool Purification System (AK-AP), the Primary Cooling System (JE01-AP), the Primary Pool Purification System (KBE01-AP), the Warm Layer System (KBE02-AP), the Cooling Tower (PA/D-AH) and the Secondary Cooling System, and the diesel (BRV). The failure rate curve is built from a component database taken from the operation log book of RSG-GAS. The total operation time covered by the curve is 2500 hours. From the curve it is concluded that the failure rate of the components follows a bathtub curve; anomalies in the curve are caused by the maintenance process.
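
    A small sketch of how a failure rate curve can be estimated from operating records by counting failures per interval of accumulated operating hours; the failure times below are invented, not the RSG-GAS log book data. A bathtub shape appears as high rates in the earliest and latest intervals.

    ```python
    import numpy as np

    # Hypothetical failure records: accumulated operating hours at which failures occurred,
    # observed over a 2500-hour window for a small population of identical pumps.
    failure_times = np.array([30, 80, 120, 900, 1400, 2250, 2350, 2420, 2480])
    n_units, horizon, bin_width = 10, 2500, 500

    edges = np.arange(0, horizon + bin_width, bin_width)
    counts, _ = np.histogram(failure_times, bins=edges)

    # Simple failure rate estimate per interval: failures / (unit-hours at risk),
    # assuming failed units are repaired and returned to service.
    rate = counts / (n_units * bin_width)
    for lo, hi, lam in zip(edges[:-1], edges[1:], rate):
        print(f"{lo:4d}-{hi:4d} h : lambda = {lam:.2e} per hour")
    ```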

  9. A simplified early-warning system for imminent landslide prediction based on failure index fragility curves developed through numerical analysis

    Directory of Open Access Journals (Sweden)

    Ugur Ozturk

    2016-07-01

    Early-warning systems (EWSs) are crucial to reduce landslide risk, especially where structural measures are not fully capable of preventing the devastating impact of such an event. Furthermore, designing and successfully implementing a complete landslide EWS is a highly complex task. The main technical challenges are linked to the definition of heterogeneous material properties (geotechnical and geomechanical parameters) as well as the variety of triggering factors. In addition, real-time data processing creates significant complexity, since data collection and numerical models for risk assessment are time-consuming tasks. Therefore, uncertainties in the physical properties of a landslide, together with the data management, represent the two crucial deficiencies in an efficient landslide EWS. Within this study, the application of the concept of fragility curves to landslides is explored; fragility curves are widely used to simulate system response to natural hazards, e.g. floods or earthquakes. The application of fragility curves to landslide risk assessment is believed to simplify emergency risk assessment, even though it cannot substitute detailed analysis during peace-time. A simplified risk assessment technique can remove some of the unclear features and decrease data processing time. The method is based on synthetic samples which are used to define approximate failure thresholds for landslides, taking into account the materials and the piezometric levels. The results are presented in charts. The method presented in this paper, called the failure index fragility curve (FIFC), allows assessment of the actual real-time risk in a case study based on the most appropriate FIFC. The application of an FIFC to a real case is presented as an example. This method to assess landslide risk is another step towards a more integrated dynamic approach to a potential landslide prevention system. Even if it does not define
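
    As a hedged illustration of the fragility-curve idea applied to a landslide trigger (not the paper's FIFC construction), the snippet below evaluates a lognormal fragility curve, P(failure | piezometric level), with parameters assumed to have been calibrated beforehand from synthetic numerical samples.

    ```python
    import numpy as np
    from scipy.stats import norm

    def fragility(level, median, beta):
        """Lognormal fragility curve: P(failure | piezometric level)."""
        return norm.cdf(np.log(level / median) / beta)

    # Assumed parameters (illustrative), e.g. calibrated from synthetic samples
    median_level = 4.5   # piezometric level (m) with 50 % failure probability
    beta = 0.25          # lognormal dispersion

    # Real-time use in an early-warning setting: map a measured level to a failure probability
    for measured in (3.0, 4.0, 4.5, 5.5):
        print(f"level = {measured:.1f} m -> P(failure) = {fragility(measured, median_level, beta):.2f}")
    ```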

  10. Comparing risk of failure models in water supply networks using ROC curves

    International Nuclear Information System (INIS)

    Debon, A.; Carrion, A.; Cabrera, E.; Solano, H.

    2010-01-01

    The problem of predicting the failure of water mains has been considered from different perspectives and using several methodologies in engineering literature. Nowadays, it is important to be able to accurately calculate the failure probabilities of pipes over time, since water company profits and service quality for citizens depend on pipe survival; forecasting pipe failures could have important economic and social implications. Quantitative tools (such as managerial or statistical indicators and reliable databases) are required in order to assess the current and future state of networks. Companies managing these networks are trying to establish models for evaluating the risk of failure in order to develop a proactive approach to the renewal process, instead of using traditional reactive pipe substitution schemes. The main objective of this paper is to compare models for evaluating the risk of failure in water supply networks. Using real data from a water supply company, this study has identified which network characteristics affect the risk of failure and which models better fit data to predict service breakdown. The comparison using the receiver operating characteristics (ROC) graph leads us to the conclusion that the best model is a generalized linear model. Also, we propose a procedure that can be applied to a pipe failure database, allowing the most appropriate decision rule to be chosen.
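
    A compact sketch of the comparison described (generic, synthetic features rather than the water company's data), assuming scikit-learn is available: fit a generalized linear model (logistic regression) for pipe failure and score it with the ROC AUC.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(1)
    n = 5000

    # Hypothetical network characteristics: age (years), length (m), diameter (mm), prior failures
    X = np.column_stack([
        rng.uniform(0, 80, n),
        rng.uniform(10, 500, n),
        rng.choice([60, 80, 100, 150, 200], n),
        rng.poisson(0.3, n),
    ])
    # Synthetic failure labels, with age and prior failures driving the risk
    logit = -4.0 + 0.03 * X[:, 0] + 0.8 * X[:, 3]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    glm = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

    p = glm.predict_proba(X_te)[:, 1]
    print("ROC AUC:", round(roc_auc_score(y_te, p), 3))

    # roc_curve also returns thresholds, from which a decision rule can be chosen
    fpr, tpr, thresholds = roc_curve(y_te, p)
    ```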

  11. Comparing risk of failure models in water supply networks using ROC curves

    Energy Technology Data Exchange (ETDEWEB)

    Debon, A., E-mail: andeau@eio.upv.e [Centro de Gestion de la Calidad y del Cambio, Dpt. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica de Valencia, E-46022 Valencia (Spain); Carrion, A. [Centro de Gestion de la Calidad y del Cambio, Dpt. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica de Valencia, E-46022 Valencia (Spain); Cabrera, E. [Dpto. De Ingenieria Hidraulica Y Medio Ambiente, Instituto Tecnologico del Agua, Universidad Politecnica de Valencia, E-46022 Valencia (Spain); Solano, H. [Universidad Diego Portales, Santiago (Chile)

    2010-01-15

    The problem of predicting the failure of water mains has been considered from different perspectives and using several methodologies in engineering literature. Nowadays, it is important to be able to accurately calculate the failure probabilities of pipes over time, since water company profits and service quality for citizens depend on pipe survival; forecasting pipe failures could have important economic and social implications. Quantitative tools (such as managerial or statistical indicators and reliable databases) are required in order to assess the current and future state of networks. Companies managing these networks are trying to establish models for evaluating the risk of failure in order to develop a proactive approach to the renewal process, instead of using traditional reactive pipe substitution schemes. The main objective of this paper is to compare models for evaluating the risk of failure in water supply networks. Using real data from a water supply company, this study has identified which network characteristics affect the risk of failure and which models better fit data to predict service breakdown. The comparison using the receiver operating characteristics (ROC) graph leads us to the conclusion that the best model is a generalized linear model. Also, we propose a procedure that can be applied to a pipe failure database, allowing the most appropriate decision rule to be chosen.

  12. Flood damage curves for consistent global risk assessments

    Science.gov (United States)

    de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek

    2016-04-01

    Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves which are based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to different methodologies employed for various damage models in different countries, damage assessments cannot be directly compared with each other, obstructing also supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth as well as maximum damage values for a variety of assets and land use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra
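
    A minimal sketch (with invented curve points, not the dataset's values) of how such a depth-damage curve is typically applied: interpolate the damage fraction at the flood depth and multiply by a country-specific maximum damage value and the exposed area.

    ```python
    import numpy as np

    # Hypothetical concave depth-damage curve for a residential class:
    # water depth (m) vs. fraction of maximum damage.
    depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 5.0])
    damage_frac = np.array([0.0, 0.25, 0.40, 0.60, 0.75, 0.95])

    max_damage_per_m2 = 600.0   # assumed country-specific maximum damage value (EUR/m2)

    def flood_damage(depth_m, exposed_area_m2):
        """Direct flood damage for one asset, by linear interpolation on the curve."""
        frac = np.interp(depth_m, depth_pts, damage_frac)
        return frac * max_damage_per_m2 * exposed_area_m2

    print(f"Damage at 1.5 m over 120 m2: {flood_damage(1.5, 120):,.0f} EUR")
    ```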

  13. Assessment of Nonorganic Failure To Thrive.

    Science.gov (United States)

    Wooster, Donna M.

    1999-01-01

    This article describes basic assessment considerations for infants and toddlers exhibiting nonorganic failure to thrive. The evaluation process must examine feeding, maternal-child interactions, child temperament, and environmental risks and behaviors. Early identification and intervention are necessary to minimize the long-term developmental…

  14. Renal function assessment in heart failure.

    Science.gov (United States)

    Pérez Calvo, J I; Josa Laorden, C; Giménez López, I

    Renal function is one of the most consistent prognostic determinants in heart failure. The prognostic information it provides is independent of the ejection fraction and functional status. This article reviews the various renal function assessment measures, with special emphasis on the fact that the patient's clinical situation and response to the heart failure treatment should be considered for the correct interpretation of the results. Finally, we review the literature on the performance of tubular damage biomarkers. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  15. Risk assessment of tailings facility dam failure

    OpenAIRE

    Hadzi-Nikolova, Marija; Mirakovski, Dejan; Stefanova, Violeta

    2011-01-01

    This paper presents the consequences of tailings facility dam failure and therefore the need for its risk assessment. Tailings are fine-grained wastes of the mining industry, output as slurries due to mixing with water during mineral processing. Tailings dams vary a lot, as they are affected by tailings characteristics and mill output, site characteristics such as topography, hydrology, geology, groundwater and seismicity, and available material and disposal methods. The tailings which accumulat...

  16. Assessing Risks of Mine Tailing Dam Failures

    Science.gov (United States)

    Concha Larrauri, P.; Lall, U.

    2017-12-01

    The consequences of tailings dam failures can be catastrophic for communities and ecosystems in the vicinity of the dams. The failure of the Fundão tailings dam at the Samarco mine in 2015 killed 19 people with severe consequences for the environment. The financial and legal consequences of a tailings dam failure can also be significant for the mining companies. For the Fundão tailings dam, the company had to pay 6 billion dollars in fines and twenty-one executives were charged with qualified murder. There are tens of thousands of active, inactive, and abandoned tailings dams in the world, and there is a need to better understand the hazards posed by these structures to downstream populations and ecosystems. A challenge to assessing the risks of tailings dams on a large scale is that many of them are not registered in publicly available databases and there is little information about their current physical state. Additionally, hazard classifications of tailings dams, common in many countries, tend to be subjective, include vague parameter definitions, and are not always updated over time. Here we present a simple methodology to assess and rank the exposure to tailings dams using ArcGIS that removes subjective interpretations. The method uses basic information such as current dam height, storage volume, topography, population, land use, and hydrological data. A hazard risk rating was developed to compare the potential extent of the damage across dams. This assessment provides a general overview of what in the vicinity of the tailings dams could be affected in case of a failure, and a way to rank tailings dams that is directly linked to the exposure at any given time. One hundred tailings dams in Minas Gerais, Brazil were used for the test case. This ranking approach could inform the risk management strategy of the tailings dams within a company, and, when disclosed, it could enable shareholders and the communities to make decisions on the risks they are taking.

  17. Failure detection system risk reduction assessment

    Science.gov (United States)

    Aguilar, Robert B. (Inventor); Huang, Zhaofeng (Inventor)

    2012-01-01

    A process includes determining a probability of a failure mode of a system being analyzed reaching a failure limit as a function of time to failure limit, determining a probability of a mitigation of the failure mode as a function of a time to failure limit, and quantifying a risk reduction based on the probability of the failure mode reaching the failure limit and the probability of the mitigation.
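
    Read literally, the claimed process combines two time-dependent probabilities. The toy sketch below (all numbers assumed, and the aggregation is only one plausible reading) quantifies a risk reduction as the portion of failure probability removed by successful mitigation.

    ```python
    # Illustrative sketch: probability the failure mode reaches its limit within the
    # available time, probability the mitigation acts in time, and the resulting
    # risk reduction. All values are assumed for demonstration only.
    p_reach_limit = 0.02   # P(failure mode reaches failure limit | time to limit)
    p_mitigation = 0.90    # P(mitigation succeeds | time to limit)

    risk_unmitigated = p_reach_limit
    risk_residual = p_reach_limit * (1.0 - p_mitigation)
    risk_reduction = risk_unmitigated - risk_residual

    print(f"residual risk = {risk_residual:.4f}, risk reduction = {risk_reduction:.4f}")
    ```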

  18. Statistical assessment of the learning curves of health technologies.

    Science.gov (United States)

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second
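
    As one concrete example of the "simple series data analysis" category (not taken from the report's data), the snippet below fits a power-law learning curve, time = a * case_number^(-b), to a single surgeon's consecutive operative times by linear least squares in log-log space.

    ```python
    import numpy as np

    # Hypothetical consecutive operative times (minutes) for one surgeon
    times = np.array([182, 175, 160, 158, 149, 150, 141, 138, 140, 132,
                      130, 128, 131, 125, 122, 124, 120, 118, 119, 117], dtype=float)
    case = np.arange(1, len(times) + 1)

    # Power-law learning curve t = a * n^(-b), fitted in log-log space
    slope, intercept = np.polyfit(np.log(case), np.log(times), 1)
    a, b = np.exp(intercept), -slope

    print(f"fitted curve: t = {a:.1f} * n^(-{b:.3f})")
    print(f"predicted time at case 40: {a * 40 ** (-b):.0f} min")
    ```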

  19. Variations of fracture toughness and stress-strain curve of cold worked stainless steel and their influence on failure strength of cracked pipe

    International Nuclear Information System (INIS)

    Kamaya, Masayuki

    2016-01-01

    In order to assess failure probability of cracked components, it is important to know the variations of the material properties and their influence on the failure load assessment. In this study, variations of the fracture toughness and stress-strain curve were investigated for cold worked stainless steel. The variations of the 0.2% proof and ultimate strengths obtained using 8 specimens of 20% cold worked stainless steel (CW20) were 77 MPa and 81 MPa, respectively. The respective variations were decreased to 13 and 21 MPa for 40% cold worked material (CW40). Namely, the variation in the tensile strength was decreased by hardening. The COVs (coefficients of variation) of fracture toughness were 7.3% and 16.7% for CW20 and CW40, respectively. Namely, the variation in the fracture toughness was increased by hardening. Then, in order to investigate the influence of the variations in the material properties on failure load of a cracked pipe, flaw assessments were performed for a cracked pipe subjected to a global bending load. Using the obtained material properties led to variation in the failure load. The variation in the failure load of the cracked pipe caused by the variation in the stress-strain curve was less than 1.5% for the COV. The variation in the failure load caused by fracture toughness variation was relatively large for CW40, although it was less than 2.0% for the maximum case. It was concluded that the hardening induced by cold working does not cause significant variation in the failure load of cracked stainless steel pipe. (author)
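
    A minimal numeric illustration of the coefficients of variation quoted above (the specimen values here are invented): COV is simply the sample standard deviation divided by the mean of the measured property.

    ```python
    import numpy as np

    # Hypothetical fracture toughness measurements for one cold-worked condition
    j_ic = np.array([410.0, 385.0, 442.0, 398.0, 420.0, 377.0, 405.0, 431.0])

    mean = j_ic.mean()
    std = j_ic.std(ddof=1)   # sample standard deviation
    cov = std / mean         # coefficient of variation

    print(f"mean = {mean:.0f}, std = {std:.0f}, COV = {100 * cov:.1f} %")
    ```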

  20. Assessing the impact of windfarms - the learning curve in Cornwall

    International Nuclear Information System (INIS)

    Hull, A.

    1998-01-01

    This paper uses windfarm application decisions in Cornwall between 1989 and 1995 to illustrate the learning curve of planners in assessing appropriate windfarm locations, and in particular how the process of knowledge construction is constantly reviewed and modified in the light of experience and circumstance. One of the accepted purposes of Environmental Impact Assessment is to predict the possible effects, both beneficial and adverse, of the development on the environment. In practice what is beneficial and what is adverse can be a matter of dispute. The paper draws out the role of the planning system in assessing what is problematic or benign, and the practical strategies and procedures used to assess and control the environmental impacts of wind energy schemes. (Author)

  1. Probabilistic assessment of roadway departure risk in a curve

    Science.gov (United States)

    Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.

    2011-10-01

    Roadway departure while cornering constitutes a major part of car accidents and casualties in France. Even though a drastic policy on overspeeding contributes to reducing accidents, other factors obviously exist. This article presents the construction of a probabilistic strategy for roadway departure risk assessment. A specific vehicle dynamics model is developed in which some parameters are modelled by random variables. These parameters are deduced from a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Then, structural reliability methods are employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement a warning system alerting the driver in case of a dangerous situation.

  2. Disease assessment and prognosis of liver failure

    Directory of Open Access Journals (Sweden)

    ZHANG Jing

    2016-09-01

    Liver failure has a high fatality rate and greatly threatens human health. Liver transplantation can effectively reduce the fatality rate. However, problems such as donor shortage and allograft rejection limit the wide application of liver transplantation. An accurate early assessment helps to evaluate patients' condition and optimize therapeutic strategies. At present, commonly used systems for prognostic evaluation include the King's College Hospital, MELD, integrated MELD, Child-Pugh score, CLIF-SOFA, CLIF-C ACLFS, and D-MELD, and each system has its own advantages and disadvantages. Among these systems, the MELD scoring system is the most commonly used one, and the D-MELD scoring system is the most innovative one, which can be used for patients on the waiting list for liver transplantation. This article elaborates on the characteristics and predictive value of each scoring system in clinical practice.

  3. Hypertonic Saline in Conjunction with High-Dose Furosemide Improves Dose-Response Curves in Worsening Refractory Congestive Heart Failure.

    Science.gov (United States)

    Paterna, Salvatore; Di Gaudio, Francesca; La Rocca, Vincenzo; Balistreri, Fabio; Greco, Massimiliano; Torres, Daniele; Lupo, Umberto; Rizzo, Giuseppina; di Pasquale, Pietro; Indelicato, Sergio; Cuttitta, Francesco; Butler, Javed; Parrinello, Gaspare

    2015-10-01

    Diuretic responsiveness in patients with chronic heart failure (CHF) is better assessed by urine production per unit diuretic dose than by the absolute urine output or diuretic dose. Diuretic resistance arises over time when the plateau rate of sodium and water excretion is reached prior to optimal fluid elimination, and may be overcome when hypertonic saline solution (HSS) is added to high doses of furosemide. Forty-two consecutively hospitalized patients with refractory CHF were randomized in a 1:1:1 ratio to furosemide doses (125 mg, 250 mg, 500 mg); all patients received intravenous furosemide diluted in 150 ml of normal saline (0.9%) in the first step (0-24 h) and the same furosemide dose diluted in 150 ml of HSS (1.4%) in the next step (24-48 h), so as to obtain three groups: fourteen patients receiving 125 mg (group 1), fourteen patients receiving 250 mg (group 2), and fourteen patients receiving 500 mg (group 3) of furosemide. Urine samples of all patients were collected at 30, 60, and 90 min, and 3, 4, 5, 6, 8, and 24 h after infusion. Diuresis, sodium excretion, osmolality, and furosemide concentration were evaluated for each urine sample. After randomization, 40 patients completed the study. Two patients, one in group 2 and one in group 3, dropped out. Patients in group 1 (125 mg furosemide) had a mean age of 77 ± 17 years, 43% were male, 6 (43%) had heart failure with a preserved ejection fraction (HFpEF), and 64% were in New York Heart Association (NYHA) class IV; the mean age of patients in group 2 (250 mg furosemide) was 80 ± 8.1 years, 15% were male, 5 (38%) had HFpEF, and 84% were in NYHA class IV; and the mean age of patients in group 3 (500 mg furosemide) was 73 ± 12 years, 54% were male, 6 (46%) had HFpEF, and 69% were in NYHA class IV. HSS added to furosemide increased total urine output, sodium excretion, urinary osmolality, and furosemide urine delivery in all patients and at all time points. The percentage increase was 18, 14, and

  4. Dynamic loads during failure risk assessment of bridge crane structures

    Science.gov (United States)

    Gorynin, A. D.; Antsev, V. Yu; Shaforost, A. N.

    2018-03-01

    The paper presents the method of failure risk assessment associated with a bridge crane metal structure at the design stage. It also justifies the necessity of taking into account dynamic loads with regard to the operational cycle of a bridge crane during failure risk assessment of its metal structure.

  5. Improving FMEA risk assessment through reprioritization of failures

    Science.gov (United States)

    Ungureanu, A. L.; Stan, G.

    2016-08-01

    Most of the current methods used to assess failures and identify industrial equipment defects are based on the determination of the Risk Priority Number (RPN). Although the conventional RPN calculation is easy to understand and use, the methodology presents some limitations, such as the large number of duplicates and the difficulty of assessing the RPN indices. In order to eliminate the afore-mentioned shortcomings, this paper puts forward an easy and efficient computing method, called Failure Developing Mode and Criticality Analysis (FDMCA), which takes into account failure and defect evolution in time, from failure appearance to breakdown.

  6. A big data analysis approach for rail failure risk assessment

    NARCIS (Netherlands)

    Jamshidi, A.; Faghih Roohi, S.; Hajizadeh, S.; Nunez Vicencio, Alfredo; Babuska, R.; Dollevoet, R.P.B.J.; Li, Z.; De Schutter, B.H.K.

    2017-01-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by

  7. Parametric and quantitative analysis of MR renographic curves for assessing the functional behaviour of the kidney

    Energy Technology Data Exchange (ETDEWEB)

    Michoux, N.; Montet, X.; Pechere, A.; Ivancevic, M.K.; Martin, P.-Y.; Keller, A.; Didier, D.; Terrier, F.; Vallee, J.-P

    2005-04-01

    The aim of this study was to refine the description of renal function based on MR images and through transit-time curve analysis on a normal population and on a population with renal failure, using the quantitative up-slope model. Thirty patients referred for a kidney MR exam were divided into a first population with well-functioning kidneys and a second population with renal failure from ischaemic kidney disease. The perfusion sequence consisted of an intravenous injection of Gd-DTPA and of a fast GRE sequence T1-TFE with 90° magnetisation preparation (Intera 1.5 T MR System, Philips Medical System). To convert the signal intensity into 1/T1, which is proportional to the contrast media concentration, a flow-corrected calibration procedure was used. Following segmentation of regions of interest in the cortex and medulla of the kidney and in the abdominal aorta, outflow curves were obtained and filtered to remove high-frequency fluctuations. The up-slope model was then applied. Significant reductions of the cortical perfusion (Qc = 0.057 ± 0.030 ml/(s 100 g) to Qc = 0.030 ± 0.017 ml/(s 100 g), P < 0.013), of the medullary perfusion (Qm = 0.023 ± 0.018 ml/(s 100 g) to Qm = 0.011 ± 0.006 ml/(s 100 g), P < 0.046) and of the accumulation of contrast media in the medulla (Qa = 0.005 ± 0.003 ml/(s 100 g) to Qa = 0.0009 ± 0.0008 ml/(s 100 g), P < 0.001) were found in the presence of renal failure. High correlations were found between the creatinine level and the accumulation Qa in the medulla (r² = 0.72, P < 0.05), and between the perfusion ratio Qc/Qm and the accumulation Qa in the medulla (r² = 0.81, P < 0.05). No significant difference was found in times to peak between the two populations, despite a trend showing Ta, the time to the end of the increasing contrast accumulation period in the medulla, arriving later for renal failure. Advances in MR signal calibration with the building of
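
    A minimal sketch of the up-slope quantification (with illustrative curves, not the study's data): tissue perfusion is estimated from the maximum initial slope of the tissue concentration-time curve divided by the peak of the arterial input curve.

    ```python
    import numpy as np

    t = np.arange(0, 60, 1.0)   # s, sample times
    # Illustrative concentration-time curves (arbitrary units proportional to 1/T1)
    aif = 5.0 * np.exp(-((t - 15) / 5.0) ** 2)                     # aortic input function
    cortex = 0.9 * (1 - np.exp(-np.clip(t - 15, 0, None) / 12.0))  # cortical uptake

    dt = t[1] - t[0]
    max_slope = np.max(np.gradient(cortex, dt))   # maximum up-slope of the tissue curve
    perfusion = max_slope / aif.max()             # up-slope model: Q ~ slope / peak AIF

    print(f"estimated cortical perfusion index = {perfusion:.4f} per second")
    ```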

  8. Common-Cause Failure Analysis in Event Assessment

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Kelly, D.L.

    2008-01-01

    This paper reviews the basic concepts of modeling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group
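
    The simplest common-cause parameterisation, shown here only to fix ideas (the paper itself concerns the more general treatment of CCF in event assessment), is the beta-factor split of a component's total failure probability into independent and common-cause parts. All numbers below are assumed.

    ```python
    # Beta-factor sketch for a two-train redundant system (illustrative values).
    q_total = 1.0e-3   # total failure probability of one component per demand
    beta = 0.05        # assumed fraction of failures that are common cause

    q_ccf = beta * q_total            # common-cause failure disabling both trains
    q_ind = (1.0 - beta) * q_total    # independent failure of a single train

    # System failure probability for two redundant trains (rare-event approximation)
    q_system = q_ind ** 2 + q_ccf
    print(f"independent-only estimate: {q_ind**2:.2e}, with CCF: {q_system:.2e}")
    ```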

  9. Critical crack path assessments in failure investigations

    Directory of Open Access Journals (Sweden)

    Robert D. Caligiuri

    2015-10-01

    This paper presents a case study in which identification of the controlling crack path was critical to identifying the root cause of the failure. The case involves the rupture of a 30-inch (0.76 m) natural gas pipeline in 2010 that tragically led to the destruction of a number of homes and the loss of life. The segment of the pipeline that ruptured was installed in 1956. The longitudinal seam of the segment that ruptured was supposed to have been fabricated by double submerged arc welding. Unfortunately, portions of the segment only received a single submerged arc weld on the outside, leaving unwelded areas on the inside diameter. Post-failure examination of the segment revealed that the rupture originated at one of these unwelded areas. Examination also revealed three additional crack paths or zones emanating from the unwelded area: a zone of ductile tearing, a zone of fatigue, and a zone of cleavage fracture, in that sequence. Initial investigators ignored the ductile tear, assumed the critical crack path was the fatigue component, and incorrectly concluded that the root cause of the incident was the failure of the operator to hydrotest the segment after it was installed in 1956. However, as discussed in this paper, the critical path or mechanism was the ductile tear. Furthermore, it was determined that the ductile tear was created during the hydrotest at installation by a mechanism known as pressure reversal. Thus the correct root cause of the rupture was the hydrotest the operator subjected the segment to at installation, helping to increase the awareness of operators and regulators about the potential problems associated with hydrotesting.

  10. A Zebrafish Heart Failure Model for Assessing Therapeutic Agents.

    Science.gov (United States)

    Zhu, Xiao-Yu; Wu, Si-Qi; Guo, Sheng-Ya; Yang, Hua; Xia, Bo; Li, Ping; Li, Chun-Qi

    2018-03-20

    Heart failure is a leading cause of death, and the development of effective and safe therapeutic agents for heart failure has proven challenging. In this study, taking advantage of larval zebrafish, we developed a zebrafish heart failure model for drug screening and efficacy assessment. Zebrafish at 2 dpf (days postfertilization) were treated with verapamil at a concentration of 200 μM for 30 min, which was determined as the optimum condition for model development. Tested drugs were administered into zebrafish either by direct soaking or by circulation microinjection. After treatment, zebrafish were randomly selected and subjected to either visual observation and image acquisition or video recording under a Zebralab Blood Flow System. The therapeutic effects of drugs on zebrafish heart failure were quantified by calculating the efficiency of heart dilatation, venous congestion, cardiac output, and blood flow dynamics. All 8 human heart failure therapeutic drugs (LCZ696, digoxin, irbesartan, metoprolol, qiliqiangxin capsule, enalapril, shenmai injection, and hydrochlorothiazide) showed significant preventive and therapeutic effects on zebrafish heart failure. The zebrafish heart failure model developed and validated in this study could be used for in vivo heart failure studies and for rapid screening and efficacy assessment of preventive and therapeutic drugs.

  11. Sequential Organ Failure Assessment Score for Evaluating Organ Failure and Outcome of Severe Maternal Morbidity in Obstetric Intensive Care

    Directory of Open Access Journals (Sweden)

    Antonio Oliveira-Neto

    2012-01-01

    Objective. To evaluate the performance of the Sequential Organ Failure Assessment (SOFA) score in cases of severe maternal morbidity (SMM). Design. Retrospective study of diagnostic validation. Setting. An obstetric intensive care unit (ICU) in Brazil. Population. 673 women with SMM. Main Outcome Measures. Mortality and SOFA score. Methods. Organ failure was evaluated according to the maximum score for each one of its six components. The total maximum SOFA score was calculated using the poorest result of each component, reflecting the maximum degree of alteration in systemic organ function. Results. The highest total maximum SOFA score was associated with mortality, 12.06 ± 5.47 for women who died and 1.87 ± 2.56 for survivors. There was also a significant correlation between the number of failing organs and maternal mortality, ranging from 0.2% (no failure) to 85.7% (≥3 organs). Analysis of the area under the receiver operating characteristic (ROC) curve (AUC) confirmed the excellent performance of the total maximum SOFA score for cases of SMM (AUC = 0.958). Conclusions. The total maximum SOFA score proved to be an effective tool for evaluating severity and estimating prognosis in cases of SMM. The maximum SOFA score may be used to conceptually define and stratify the degree of severity in cases of SMM.

  12. Failure assessment of pressure vessels under yielding conditions

    International Nuclear Information System (INIS)

    Harrison, R.P.; Darlaston, B.J.L.; Townley, C.H.A.

    1977-01-01

    The paper summarizes the work carried out to establish the behavior of structures containing defects and outlines a failure assessment route which can be used to assess the integrity of a structure containing a defect. The basis for this failure assessment route is the two-criteria approach of Dowling and Townley, which can be applied to structures containing defects irrespective of whether they are in the linear elastic fracture mechanics regime, the fully plastic regime, or in an intermediate regime. The extension of this concept to include crack growth by stable tearing is dealt with in the paper

  13. Assessing the Impacts of Multiple Breadbasket Failures

    Science.gov (United States)

    Casellas Connors, J. P.; Janetos, A.

    2016-12-01

    A relatively small area of the world accounts for a large proportion of total global cereal production, with most of that area devoted to the production of the world's three major cereal crops: rice, wheat and maize. An extensive literature on the sensitivity of agricultural productivity of these crops, and many others, has arisen over the past 25 years, with a general consensus that continued change in the physical climate system will very likely increase the difficulty of agricultural production in areas of the world that are already marginal with respect to production. What this research only rarely does, however, is assess the influence of extreme events in shocking agricultural production, and how the rest of the agricultural system reacts, in terms of prices, food insecurity, subsequent land-use change, and terrestrial carbon emissions, among many other possible responses. Because the agricultural system is interlinked with energy systems, food distribution and transportation systems, and economic systems, models that focus only on agricultural productivity can provide only a unidimensional view of the magnitude of potential impacts. We know such impacts can occur as a consequence of extreme climatic events, because they have: the impact of the severe regional drought and heat wave on the Russian and Ukrainian wheat harvests in 2010 had global consequences for food prices, to give just one example. In this paper, we use an Integrated Assessment Model, the Global Change Assessment Model (GCAM), to investigate the potential outcomes of both moderate and severe shocks to agricultural productivity in the major breadbaskets of the world, both singly and in combination. The results demonstrate clearly that there are likely to be multidimensional consequences from the kinds of shocks that are possible from a rapidly changing climate system, especially when combined with other demographic and economic trends in the coming decades. These results are only one aspect of

  14. Semiconductor failure threshold estimation problem in electromagnetic assessment

    International Nuclear Information System (INIS)

    Enlow, E.W.; Wunsch, D.C.

    1984-01-01

    Present semiconductor failure models, which predict the one-microsecond square-wave power failure level for use in system electromagnetic (EM) assessments and hardening design, are incomplete. This is because, for a majority of device types, there is insufficient data readily available in a composite data source to quantify the model parameters, and the inaccuracy of the models causes complications in the definition of adequate hardness margins and the quantification of EM performance. This paper presents new semiconductor failure models that use a generic approach and are an integration and simplification of many present models. This generic approach uses two categorical models: one for diodes and transistors, and one for integrated circuits. The models were constructed from a large database of semiconductor failure data. The approach used for constructing the diode and transistor failure level models is based on device rated power; it is simple to use and universally applicable. The model predicts the value of the 1 μs failure power to be used in the power failure models P = K·t^(-1/2) or P = K1·t^(-1) + K2·t^(-1/2) + K3.
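
    To make the quoted damage-power relation concrete, the snippet below evaluates P = K·t^(-1/2) at a few pulse widths. The constant K here is assumed for illustration; the paper derives it from device rated power, which is not reproduced here.

    ```python
    # Wunsch-Bell style damage model: failure power P (W) vs. pulse width t (s).
    # K is the failure constant; its value here is an assumption, not the paper's
    # rated-power correlation.
    K = 0.5   # W*s^0.5, i.e. 500 W failure power at t = 1 microsecond

    for t in (1e-7, 1e-6, 1e-5, 1e-4):
        p_fail = K * t ** -0.5
        print(f"pulse width {t:.0e} s -> failure power {p_fail:,.0f} W")
    ```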

  15. A Big Data Analysis Approach for Rail Failure Risk Assessment.

    Science.gov (United States)

    Jamshidi, Ali; Faghih-Roohi, Shahrzad; Hajizadeh, Siamak; Núñez, Alfredo; Babuska, Robert; Dollevoet, Rolf; Li, Zili; De Schutter, Bart

    2017-08-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by analyzing a type of rail surface defect called squats that are detected automatically among the huge number of records from video cameras. We propose an image processing approach for automatic detection of squats, especially severe types that are prone to rail breaks. We measure the visual length of the squats and use them to model the failure risk. For the assessment of the rail failure risk, we estimate the probability of rail failure based on the growth of squats. Moreover, we perform severity and crack growth analyses to consider the impact of rail traffic loads on defects in three different growth scenarios. The failure risk estimations are provided for several samples of squats with different crack growth lengths on a busy rail track of the Dutch railway network. The results illustrate the practicality and efficiency of the proposed approach. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  16. Analysis of dependent failures in risk assessment and reliability evaluation

    International Nuclear Information System (INIS)

    Fleming, K.N.; Mosleh, A.; Kelley, A.P. Jr.; Gas-Cooled Reactors Associates, La Jolla, CA)

    1983-01-01

    The ability to estimate the risk of potential reactor accidents is largely determined by the ability to analyze statistically dependent multiple failures. The importance of dependent failures has been indicated in recent probabilistic risk assessment (PRA) studies as well as in reports of reactor operating experiences. This article highlights the importance of several different types of dependent failures from the perspective of the risk and reliability analyst and provides references to the methods and data available for their analysis. In addition to describing the current state of the art, some recent advances, pitfalls, misconceptions, and limitations of some approaches to dependent failure analysis are addressed. A summary is included of the discourse on this subject, which is presented in the Institute of Electrical and Electronics Engineers/American Nuclear Society PRA Procedures Guide

  17. Failure rate data for fusion safety and risk assessment

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1993-01-01

    The Fusion Safety Program (FSP) at the Idaho National Engineering Laboratory (INEL) conducts safety research in materials, chemical reactions, safety analysis, risk assessment, and in component research and development to support existing magnetic fusion experiments and also to promote safety in the design of future experiments. One of the areas of safety research is applying probabilistic risk assessment (PRA) methods to fusion experiments. To apply PRA, we need a fusion-relevant radiological dose code and a component failure rate data base. This paper describes the FSP effort to develop a failure rate data base for fusion-specific components

  18. Consequences assessment for fuel channel failure with consequential moderator drain

    International Nuclear Information System (INIS)

    Wahba, N.N.; Bayoumi, M.H.

    2002-01-01

    This paper documents the consequences of spontaneous pressure tube/consequential calandria tube rupture followed by the ejection of end fittings (as a result of guillotine failure of pressure tube) leading to the drain of the moderator. The event is postulated to occur in conjunction with an independent failure of Emergency Coolant Injection System (ECIS). The results of the detailed consequence assessments are used to propose a course of action to mitigate the consequences of such an event. A methodology based on a lumped-parameter model was developed to assess the consequences of the postulated event. (author)

  19. Reliability of dune erosion assessment along curved coastlines

    NARCIS (Netherlands)

    Hoonhout, B.M.; Den Heijer, C.

    2010-01-01

    The dune assessment methods used to ensure the safety of the lower areas in The Netherlands are based on simple empirical relations that are, strictly speaking, only valid for infinitely long, uniform and straight coasts. The wide application of these relations is mainly justified due to intentional

  20. Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

    Separation procedures in drug Distribution Centers (DCs) are manual-based activities prone to failures such as shipping exchanged, expired or broken drugs to the customer. Two interventions seem promising in improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of potential failure modes incurred by the selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) aimed at reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable generating an index to identify the recommended operators to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. It also deploys the traditional FMEA severity index into two sub-indexes related to financial issues and damage to the company's image in order to characterize failure severity. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
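
    A loose sketch of the two ingredients this abstract combines; the index form, the weights and the severity aggregation are assumptions for illustration, not the paper's actual formulation: rank operators by a learning-curve-derived index, then compute an RPN whose severity is split into financial and image sub-indexes.

    ```python
    # --- Operator selection from learning-curve parameters (illustrative index) ---
    # Each operator: (plateau error rate per 1000 picks, picks needed to reach plateau)
    operators = {"op_A": (1.2, 800), "op_B": (0.8, 1500), "op_C": (2.0, 400)}

    def suitability(plateau_rate, picks_to_plateau, w_rate=0.7, w_speed=0.3):
        """Assumed composite index: lower is better (weights are arbitrary here)."""
        return w_rate * plateau_rate + w_speed * picks_to_plateau / 1000.0

    ranked = sorted(operators, key=lambda k: suitability(*operators[k]))
    print("recommended operators:", ranked)

    # --- FMEA with severity split into financial and image sub-indexes ---
    def rpn(occurrence, detection, sev_financial, sev_image):
        """RPN using the mean of the two severity sub-indexes (assumed aggregation)."""
        severity = 0.5 * (sev_financial + sev_image)
        return occurrence * detection * severity

    print("RPN, expired drug shipped:", rpn(occurrence=4, detection=5,
                                            sev_financial=6, sev_image=9))
    ```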

  1. Experimental Assessment of Tensile Failure Characteristic for Advanced Composite Laminates

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Keon [Agency for Defense Development, Daejeon (Korea, Republic of); Lee, Jeong Won; Yoon, Dong Hyun; Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-10-15

    In recent years, major airplane manufacturers have been using the laminate failure theory to estimate the strain of composite structures for airplanes. The laminate failure theory uses the failure strain of the laminate to analyze composite structures. This paper describes a procedure for the experimental assessment of laminate tensile failure characteristics. Regression analysis was used as the experimental assessment method. The regression analysis was performed with the response variable being the laminate failure strain and with the regressor variables being two ply-orientation (0° and ±45°) variables. The composite material in this study is a carbon/epoxy unidirectional (UD) tape that was cured as a pre-preg at 177 °C (350 °F). A total of 149 tension tests were conducted on specimens from 14 distinct laminates that were laid up at standard angle layers (0°, 45°, -45°, and 90°). The ASTM-D-3039 standard was used as the test method.
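
    A sketch of the kind of regression described (synthetic data and an assumed linear model in the 0° and ±45° ply fractions, not the actual 149-test dataset):

    ```python
    import numpy as np

    # Synthetic laminate data: fraction of 0-degree plies, fraction of +/-45-degree plies,
    # and measured tensile failure strain (microstrain); the 90-degree fraction is implied.
    frac0 = np.array([0.50, 0.25, 0.60, 0.40, 0.30, 0.10, 0.20, 0.45])
    frac45 = np.array([0.40, 0.50, 0.20, 0.40, 0.60, 0.80, 0.40, 0.35])
    strain = np.array([9800, 10900, 9300, 10100, 11200, 12500, 10800, 10000.0])

    # Least-squares fit: strain = b0 + b1*frac0 + b2*frac45
    X = np.column_stack([np.ones_like(frac0), frac0, frac45])
    coef, *_ = np.linalg.lstsq(X, strain, rcond=None)
    b0, b1, b2 = coef
    print(f"strain ~= {b0:.0f} + {b1:.0f}*frac0 + {b2:.0f}*frac45 (microstrain)")
    ```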

  2. Preventing blood transfusion failures: FMEA, an effective assessment method.

    Science.gov (United States)

    Najafpour, Zhila; Hasoumi, Mojtaba; Behzadi, Faranak; Mohamadi, Efat; Jafary, Mohamadreza; Saeedi, Morteza

    2017-06-30

    Failure Mode and Effect Analysis (FMEA) is a method used to assess the risk of failures and harms to patients during the medical process and to identify the associated clinical issues. The aim of this study was to conduct an assessment of the blood transfusion process in a teaching general hospital, using FMEA as the method. A structured FMEA was used in our study, performed in 2014, and corrective actions were implemented and re-evaluated after 6 months. Sixteen 2-h sessions were held to perform FMEA on the blood transfusion process, comprising five steps: establishing the context, selecting team members, analysing the processes, hazard analysis, and developing a risk reduction protocol for blood transfusion. Failure modes with the highest risk priority numbers (RPNs) were identified. The overall RPN scores ranged from 5 to 100, among which four failure modes were associated with RPNs over 75. The data analysis indicated that the failures with the highest RPNs were: labelling (RPN: 100), transfusion of blood or the component (RPN: 100), patient identification (RPN: 80) and sampling (RPN: 75). The results demonstrated that mis-transfusion of blood or a blood component is the most important error, which can lead to serious morbidity or mortality. Provision of training to the personnel on blood transfusion, raising knowledge of hazards and appropriate preventative measures, as well as developing standard safety guidelines are essential, and must be implemented during all steps of blood and blood component transfusion.

  3. Experimental Assessment of Tensile Failure Characteristic for Advanced Composite Laminates

    International Nuclear Information System (INIS)

    Lee, Myoung Keon; Lee, Jeong Won; Yoon, Dong Hyun; Kim, Jae Hoon

    2017-01-01

    In recent years, major airplane manufacturers have been using the laminate failure theory to estimate the strain of composite structures for airplanes. The laminate failure theory uses the failure strain of the laminate to analyze composite structures. This paper describes a procedure for the experimental assessment of laminate tensile failure characteristics. Regression analysis was used as the experimental assessment method. The regression analysis was performed with the response variable being the laminate failure strain and with the regressor variables being two-ply orientation (0° and ±45°) variables. The composite material in this study is a carbon/epoxy unidirectional (UD) tape that was cured as a pre-preg at 177 °C (350 °F). A total of 149 tension tests were conducted on specimens from 14 distinct laminates that were laid up at standard angle layers (0°, 45°, -45°, and 90°). The ASTM-D-3039 standard was used as the test method.

  4. Application of Master Curve Methodology for Structural Integrity Assessments of Nuclear Components

    Energy Technology Data Exchange (ETDEWEB)

    Sattari-Far, Iradj [Det Norske Veritas, Stockholm (Sweden); Wallin, Kim [VTT, Esbo (Finland)

    2005-10-15

    The objective was to perform an in-depth investigation of the Master Curve methodology and also, based on this method, to develop a procedure for fracture assessments of nuclear components. The project has sufficiently illustrated the capabilities of the Master Curve methodology for fracture assessments of nuclear components. Within the scope of this work, the theoretical background of the methodology and its validation on small and large specimens has been studied and presented to a sufficiently large extent, as well as the correlations between the Charpy-V data and the Master Curve T0 reference temperature in the evaluation of fracture toughness. The work gives a comprehensive report of the background theory and the different applications of the Master Curve methodology. The main results of the work have shown that the cleavage fracture toughness is characterized by a large amount of statistical scatter in the transition region, it is specimen size dependent and it should be treated statistically rather than deterministically. The Master Curve methodology is able to make use of statistical data in a consistent way. Furthermore, the Master Curve methodology provides a more precise prediction of the fracture toughness of embrittled materials in comparison with the ASME KIC reference curve, which often gives over-conservative results. The suggested procedure in this study, concerning the application of the Master Curve method in fracture assessments of ferritic steels in the transition and low-shelf regions, is valid for the temperature range T0-50 ≤ T ≤ T0+50 deg C. If only approximate information is required, the Master Curve may well be extrapolated outside this temperature range. The suggested procedure has also been illustrated for some examples.
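
    For readers unfamiliar with the methodology summarised above, the median-toughness form of the Master Curve standardised in ASTM E1921 can be evaluated directly; the short sketch below does so for an assumed reference temperature T0 (the value of T0 is illustrative, not from the report).

```python
import numpy as np

def master_curve_median(T, T0):
    """Median fracture toughness K_Jc (MPa*sqrt(m)) of a 25 mm (1T) specimen per the
    ASTM E1921 Master Curve: 30 + 70*exp(0.019*(T - T0)), temperatures in deg C."""
    return 30.0 + 70.0 * np.exp(0.019 * (np.asarray(T, dtype=float) - T0))

T0 = -60.0                                 # assumed reference temperature, deg C
T = np.linspace(T0 - 50.0, T0 + 50.0, 5)   # validity window quoted in the abstract
for Ti, K in zip(T, master_curve_median(T, T0)):
    print(f"T = {Ti:7.1f} deg C   K_Jc(median) = {K:6.1f} MPa*sqrt(m)")
```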

  5. A unified approach to failure assessment of engineering structures

    International Nuclear Information System (INIS)

    Harrison, R.P.

    1977-01-01

    A codified procedure for the failure assessment of engineering structures is presented which has as its basis the two criteria approach of Dowling and Townley (Int. J. Press. Vessels and Piping; 3:77 (1975)) and the Bilby, Cottrell and Swinden (Proc. R. Soc.; A272:304 (1963)) and Dugdale (J. Mech. Phys. Sol.; 8:100 (1960)) model of yielding ahead of a crack tip. The procedure consists of independently assessing the risk of failure (a) under linear elastic conditions only and (b) under plastic collapse conditions only. These two limiting criteria are then plotted as a co-ordinate point on a Failure Assessment Diagram. From this a measure of the degree of safety of the structure can be obtained. As examples, several of the HSST vessel tests are used to indicate the simplicity and versatility of the procedure. It is shown how maximum allowable pressures or defect sizes can be obtained and how safety factors can be readily incorporated on any of the parameters used in the assessment. It is also demonstrated how helpful the procedure is in designing not only working structures, but also structures that are to be used for testing. (author)
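
    The two-criteria diagram described above can be reproduced from the strip-yield (Dugdale/BCS) model on which it is based: the failure assessment curve relates the brittle-fracture ratio Kr to the plastic-collapse ratio Sr. The sketch below evaluates that curve and checks a hypothetical assessment point; the point values are invented and the Sr cut-off near collapse is ignored for brevity.

```python
import numpy as np

def fad_curve(Sr):
    """Strip-yield failure assessment curve:
    Kr = Sr * [(8/pi^2) * ln(sec(pi*Sr/2))]^(-1/2), defined for 0 < Sr < 1."""
    Sr = np.asarray(Sr, dtype=float)
    return Sr * ((8.0 / np.pi**2) * np.log(1.0 / np.cos(np.pi * Sr / 2.0))) ** -0.5

def point_is_safe(Kr_point, Sr_point):
    """A flaw is acceptable when its assessment point lies inside (below) the curve."""
    return Kr_point <= fad_curve(Sr_point)

print(np.round(fad_curve([0.2, 0.5, 0.8, 0.95]), 3))
# Hypothetical assessment point: Kr = K_I / K_Ic = 0.6, Sr = load / collapse load = 0.7
print("acceptable:", bool(point_is_safe(0.6, 0.7)))
```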

  6. The shape of a strain-based failure assessment diagram

    International Nuclear Information System (INIS)

    Budden, P.J.; Ainsworth, R.A.

    2012-01-01

    There have been a number of recent developments of strain-based fracture assessment approaches, including proposals by Budden [Engng Frac Mech 2006;73:537–52] for a strain-based failure assessment diagram (FAD) related to the conventional stress-based FAD. However, recent comparisons with finite element (FE) data have shown that this proposed strain-based FAD can be non-conservative in some cases, particularly for deeper cracks and materials with little strain-hardening capacity. Therefore, this paper re-examines the shape of the strain-based FAD, guided by these FE analyses and some theoretical analysis. On this basis, modified proposals for the shape of the strain-based FAD are given, including simplified and more detailed options in line with the options available for stress-based FADs in existing fitness-for-service procedures. The proposals are then illustrated by a worked example and by comparison with FE data, which demonstrate that the new proposals are generally conservative. - Highlights: ► The strain-based failure assessment diagram approach to fracture is developed. ► The new approach modifies earlier proposals by Budden. ► A new generic Option 1 strain-based failure assessment diagram is proposed. ► Validation based on finite element J data for plates and cylinders is presented. ► The new approach is generally conservative compared with the finite element data.

  7. FIB/FESEM experimental and analytical assessment of R-curve behavior of WC–Co cemented carbides

    Energy Technology Data Exchange (ETDEWEB)

    Tarragó, J.M., E-mail: jose.maria.tarrago@upc.edu [CIEFMA, Departament de Ciència dels Materials i Enginyeria Metallúrgica, ETSEIB, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); CRnE, Centre de Recerca en Nanoenginyeria, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); Jiménez-Piqué, E. [CIEFMA, Departament de Ciència dels Materials i Enginyeria Metallúrgica, ETSEIB, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); CRnE, Centre de Recerca en Nanoenginyeria, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); Schneider, L. [Sandvik Hyperion, Coventry CV4 0XG (United Kingdom); Casellas, D. [Fundació CTM Centre Tecnològic, 08243 Manresa (Spain); Torres, Y. [Departamento de Ingeniería y Ciencia de los Materiales y del Transporte, ETSI, Universidad de Sevilla, 41092 Sevilla (Spain); Llanes, L. [CIEFMA, Departament de Ciència dels Materials i Enginyeria Metallúrgica, ETSEIB, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain); CRnE, Centre de Recerca en Nanoenginyeria, Universitat Politècnica de Catalunya, 08028 Barcelona (Spain)

    2015-10-01

    Exceptional fracture toughness levels exhibited by WC–Co cemented carbides (hardmetals) are due mainly to toughening derived from plastic stretching of crack-bridging ductile enclaves. This takes place due to the development of a multiligament zone at the wake of cracks growing in a stable manner. As a result, hardmetals exhibit crack growth resistance (R-curve) behavior. In this work, the toughening mechanics and mechanisms of these materials are investigated by combining experimental and analytical approaches. Focused Ion Beam technique (FIB) and Field-Emission Scanning Electron Microscopy (FESEM) are implemented to obtain serial sectioning and imaging of crack–microstructure interaction in cracks arrested after stable extension under monotonic loading. The micrographs obtained provide experimental proof of the developing multiligament zone, including failure micromechanisms within individual bridging ligaments. Analytical assessment of the multiligament zone is then conducted on the basis of experimental information attained from FIB/FESEM images, and a model for the description of R-curve behavior of hardmetals is proposed. It was found that, due to the large stresses supported by the highly constrained and strongly bonded bridging ligaments, WC–Co cemented carbides exhibit quite steep but short R-curve behavior. Relevant strength and reliability attributes exhibited by hardmetals may then be rationalized on the basis of such toughening scenario.

  8. Evaluation of flaws in ferritic piping: ASME Code Appendix J, Deformation Plasticity Failure Assessment Diagram (DPFAD)

    International Nuclear Information System (INIS)

    Bloom, J.M.

    1991-08-01

    This report summarizes the methods and bases used by an ASME Code procedure for the evaluation of flaws in ferritic piping. The procedure is currently under consideration by the ASME Boiler and Pressure Vessel Code Committee for Section XI. The procedure was initially proposed in 1985 for the evaluation of the acceptability of flaws detected in piping during in-service inspection for certain materials, identified in Article IWB-3640 of the ASME Boiler and Pressure Vessel Code Section XI ''Rules for In-service Inspection of Nuclear Power Plant Components,'' for which the fracture toughness is not sufficiently high to justify acceptance based solely on the plastic limit load evaluation methodology of Appendix C and IWB-3641. The procedure, referred to as Appendix J, originally included two approaches: a J-integral based tearing instability (J-T) analysis and the deformation plasticity failure assessment diagram (DPFAD) methodology. In Appendix J, a general DPFAD approach was simplified for application to part-through-wall flaws in ferritic piping through the use of a single DPFAD curve for circumferential flaws. Axial flaws are handled using two DPFAD curves, where the ratio of flaw depth to wall thickness is used to determine the appropriate DPFAD curve. Flaws are evaluated in Appendix J by comparing the actual applied pipe stress with the allowable stress, with the appropriate safety factors, for the flaw size at the end of the evaluation period. Assessment points for circumferential and axial flaws are plotted on the appropriate failure assessment diagram. In addition, this report summarizes the experimental test predictions of the results of the Battelle Columbus Laboratory experiments, the Eiber experiments, and the JAERI tests using the Appendix J DPFAD methodology. Lastly, this report also provides guidelines for handling residual stresses in the evaluation procedure. 22 refs., 13 figs., 5 tabs

  9. Assessment of congestive heart failure in chest radiographs

    International Nuclear Information System (INIS)

    Henriksson, L.; Sundin, A.; Smedby, Oe.; Albrektsson, P.

    1990-01-01

    The effect of observer variations and film-screen quality on the diagnosis of congestive heart failure based on chest radiographs was studied in 27 patients. For each patient, two films were exposed, one with the Kodak Lanex Medium system and one with the Agfa MR 400 system. The films were presented to three observers who assessed the presence of congestive heart failure on a three-graded scale. The results showed no significant difference between the two systems but large systematic differences between the observers. There were also differences between the two ratings by the same observer that could not be explained by the film-screen factor. It is concluded that the choice between these two systems is of little importance in view of the interobserver and intraobserver variability that can exist within the same department. (orig.)

  10. Assessing neural activity related to decision-making through flexible odds ratio curves and their derivatives.

    Science.gov (United States)

    Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Pardo-Vazquez, Jose L; Leboran, Victor; Molenberghs, Geert; Faes, Christel; Acuña, Carlos

    2011-06-30

    It is well established that neural activity is stochastically modulated over time. Therefore, direct comparisons across experimental conditions and determination of change points or maximum firing rates are not straightforward. This study sought to compare temporal firing probability curves that may vary across groups defined by different experimental conditions. Odds-ratio (OR) curves were used as a measure of comparison, and the main goal was to provide a global test to detect significant differences between such curves through the study of their derivatives. An algorithm is proposed that enables ORs based on generalized additive models, including factor-by-curve-type interactions, to be estimated flexibly. Bootstrap methods were used to draw inferences from the derivative curves, and binning techniques were applied to speed up computation in the estimation and testing processes. A simulation study was conducted to assess the validity of these bootstrap-based tests. This methodology was applied to study premotor ventral cortex neural activity associated with decision-making. The proposed statistical procedures proved very useful in revealing the neural activity correlates of decision-making in a visual discrimination task. Copyright © 2011 John Wiley & Sons, Ltd.

  11. An investigation on vulnerability assessment of steel structures with thin steel shear wall through development of fragility curves

    OpenAIRE

    Mohsen Gerami; Saeed Ghaffari; Amir Mahdi Heidari Tafreshi

    2017-01-01

    Fragility curves play an important role in damage assessment of buildings. The probability of damage to a structure under seismic events can be investigated once the aforementioned curves have been generated. In the current research, 360 time-history analyses have been carried out on structures of 3, 10 and 20 stories, and fragility curves have subsequently been developed. The curves are developed based on two indices: inter-story drift and equivalent strip axial strain of the shear wall. T...

  12. Towards a whole-network risk assessment for railway bridge failures caused by scour during flood events

    Directory of Open Access Journals (Sweden)

    Lamb Rob

    2016-01-01

    Full Text Available Localised erosion (scour during flood flow conditions can lead to costly damage or catastrophic failure of bridges, and in some cases loss of life or significant disruption to transport networks. Here, we take a broad scale view to assess risk associated with bridge scour during flood events over an entire infrastructure network, illustrating the analysis with data from the British railways. There have been 54 recorded events since 1846 in which scour led to the failure of railway bridges in Britain. These events tended to occur during periods of extremely high river flow, although there is uncertainty about the precise conditions under which failures occur, which motivates a probabilistic analysis of the failure events. We show how data from the historical bridge failures, combined with hydrological analysis, have been used to construct fragility curves that quantify the conditional probability of bridge failure as a function of river flow, accompanied by estimates of the associated uncertainty. The new fragility analysis is tested using flood events simulated from a national, spatial joint probability model for extremes in river flows. The combined models appear robust in comparison with historical observations of the expected number of bridge failures in a flood event, and provide an empirical basis for further broad-scale network risk analysis.
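
    The fragility analysis described above boils down to estimating a conditional failure probability as a function of river flow. A common parametric way to do this (an assumption for illustration, not necessarily the authors' exact formulation) is to fit a lognormal fragility curve to binary failure/survival records by maximum likelihood, as sketched below with invented data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical records: normalised peak flow at each bridge and 1 = failure observed.
flow   = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.2, 2.6, 3.0, 3.5])
failed = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ])

def neg_log_lik(params):
    # Work with log-parameters so the optimiser cannot propose negative values.
    median, beta = np.exp(params)
    p = norm.cdf(np.log(flow / median) / beta)          # lognormal fragility
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(failed * np.log(p) + (1 - failed) * np.log(1 - p))

res = minimize(neg_log_lik, x0=np.log([1.5, 0.5]), method="Nelder-Mead")
median_hat, beta_hat = np.exp(res.x)
print(f"fitted median = {median_hat:.2f}, dispersion = {beta_hat:.2f}")
print("P(failure | flow = 2.0) =", norm.cdf(np.log(2.0 / median_hat) / beta_hat))
```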

  13. Selected component failure rate values from fusion safety assessment tasks

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1998-01-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers

  14. Selected component failure rate values from fusion safety assessment tasks

    Energy Technology Data Exchange (ETDEWEB)

    Cadwallader, L.C.

    1998-09-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers.

  15. Selected Component Failure Rate Values from Fusion Safety Assessment Tasks

    Energy Technology Data Exchange (ETDEWEB)

    Cadwallader, Lee Charles

    1998-09-01

    This report is a compilation of component failure rate and repair rate values that can be used in magnetic fusion safety assessment tasks. Several safety systems are examined, such as gas cleanup systems and plasma shutdown systems. Vacuum system component reliability values, including large vacuum chambers, have been reviewed. Values for water cooling system components have also been reported here. The report concludes with the examination of some equipment important to personnel safety, atmospheres, combustible gases, and airborne releases of radioactivity. These data should be useful to system designers to calculate scoping values for the availability and repair intervals for their systems, and for probabilistic safety or risk analysts to assess fusion systems for safety of the public and the workers.

  16. Hysteroscopic sterilization using a virtual reality simulator: assessment of learning curve.

    Science.gov (United States)

    Janse, Juliënne A; Goedegebuure, Ruben S A; Veersema, Sebastiaan; Broekmans, Frank J M; Schreuder, Henk W R

    2013-01-01

    To assess the learning curve using a virtual reality simulator for hysteroscopic sterilization with the Essure method. Prospective multicenter study (Canadian Task Force classification II-2). University and teaching hospital in the Netherlands. Thirty novices (medical students) and five experts (gynecologists who had performed >150 Essure sterilization procedures). All participants performed nine repetitions of bilateral Essure placement on the simulator. Novices returned after 2 weeks and performed a second series of five repetitions to assess retention of skills. Structured observations on performance using the Global Rating Scale and parameters derived from the simulator provided measurements for analysis. The learning curve is represented by improvement per procedure. Two-way repeated-measures analysis of variance was used to analyze learning curves. Effect size (ES) was calculated to express the practical significance of the results (ES ≥ 0.50 indicates a large learning effect). For all parameters, significant improvements were found in novice performance within nine repetitions. Large learning effects were established for six of eight parameters. The learning curve established in this study endorses future implementation of the simulator in curricula on hysteroscopic skill acquisition for clinicians who are interested in learning this sterilization technique. Copyright © 2013 AAGL. Published by Elsevier Inc. All rights reserved.

  17. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    Energy Technology Data Exchange (ETDEWEB)

    Yee, Eric [KEPCO International Nuclear Graduate School, Dept. of Nuclear Power Plant Engineering, Ulsan (Korea, Republic of)

    2017-03-15

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-founded assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. The integration of a probability distribution with potentially larger tails basically pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the frequency calculations, suggesting rare events are less rare than thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.
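
    The tail effect discussed above is easy to see numerically: for the same exceedance threshold, a Student-t model assigns noticeably more probability than a normal model. The comparison below is purely illustrative (5 degrees of freedom chosen arbitrarily) and is not taken from the paper.

```python
from scipy.stats import norm, t

# Probability that a standardised residual exceeds k, under a normal model
# and under a heavier-tailed Student-t model with 5 degrees of freedom.
for k in (1.0, 2.0, 3.0):
    print(f"P(eps > {k}):  normal = {norm.sf(k):.2e}   t(5 dof) = {t.sf(k, df=5):.2e}")
```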

  18. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    International Nuclear Information System (INIS)

    Yee, Eric

    2017-01-01

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-founded assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. The integration of a probability distribution with potentially larger tails basically pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the frequency calculations, suggesting rare events are less rare than thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.

  19. Fuel failure assessments based on radiochemistry. Experience feedback and challenges

    International Nuclear Information System (INIS)

    Petit, C.; Ziabletsev, D.; Zeh, P.

    2015-01-01

    Significant improvements have been observed in LWR nuclear fuel reliability over the past years. As a result, the number of fuel failures in PWRs and BWRs has recently decreased dramatically. Nevertheless, a few challenges remain. One of them is that the industry has recently started seeing a relatively new type of fuel failure, so-called 'weak leak failures', characterized by a very small release of gaseous fission products and essentially zero release of iodines or any other soluble fission products into the reactor coolant. Correspondingly, the behavior of these weak leakers does not follow the typical behavior of a conventional leaker, for which the amount of released Xe-133 is proportional to the power of the failed rod. Instead, for a weak leaker, the activity of Xe-133 is directly correlated with the size of the cladding defect. The presence of an undetected weak leaker in the core may lead to carryover of a leaker into the subsequent cycle. Even if the presence of a weak leaker in the core is suspected, it typically requires more effort to identify the leaker, which can result in an extended outage and ultimately in economic losses to the utility operating the reactor. To deal effectively with this issue, several changes have recently been introduced that differ from the methodology for dealing with a conventional leaker. These changes include new assessment methods, the need for improved sipping techniques to better identify low-release leakers, and correspondingly better equipment to locate small clad defects associated with weak leakers, such as sensitive devices for localizing failed rods, sensitive eddy-current coils for the failed rod, ultra-high-definition cameras for failed rod examination, and experienced fuel reliability engineers performing cause-of-failure and root-cause research and analyses. Ultimately, the destructive

  20. Customer system efficiency improvement assessment: Supply curves for transmission and distribution conservation options

    Energy Technology Data Exchange (ETDEWEB)

    Tepel, R.C.; Callaway, J.W.; De Steese, J.G.

    1987-11-01

    This report documents the results of Task 6 in the Customer System Efficiency Improvement (CSEI) Assessment Project. A principal objective of this project is to assess the potential for energy conservation in the transmission and distribution (T&D) systems of electric utilities in the BPA service area. The scope of this assessment covers BPA customers in the Pacific Northwest region and all non-federal T&D systems, including those that currently place no load on the BPA system. Supply curves were developed to describe the conservation resource potentially available from T&D-system efficiency improvements. These supply curves relate the levelized cost of upgrading existing equipment to the estimated amount of energy saved. Stated in this form, the resource represented by T&D loss reductions can be compared with other conservation options and regional electrical generation resources to determine the most cost-effective method of supplying power to the Pacific Northwest. The development of the supply curves required data acquisition and methodology development that are also described in this report. 11 refs., 11 figs., 16 tabs.
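
    A conservation supply curve of the kind described above is simply the set of options ordered by levelized cost, with their savings accumulated. The options, costs and savings in the sketch below are invented placeholders used only to show the construction.

```python
# Each option: (name, levelized cost in $/kWh saved, annual savings in GWh).
options = [
    ("reconductoring",       0.012, 35.0),
    ("transformer upgrades", 0.025, 60.0),
    ("voltage optimisation", 0.008, 20.0),
    ("capacitor placement",  0.040, 15.0),
]

# Sort by levelized cost and accumulate savings to trace out the supply curve.
cumulative = 0.0
for name, cost, savings in sorted(options, key=lambda o: o[1]):
    cumulative += savings
    print(f"{name:22s} {cost:6.3f} $/kWh   cumulative savings = {cumulative:5.1f} GWh")
```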

  1. [Reference curves for assessing the physical growth of male Wistar rats].

    Science.gov (United States)

    Cossio-Bolaños, Marco; Gómez Campos, Rossana; Vargas Vitoria, Rodrigo; Hochmuller Fogaça, Rosalvo Tadeu; de Arruda, Miguel

    2013-11-01

    Wistar rats are one of the most popular strains routinely used as a research tool in the laboratory; this requires strict control of variables such as age, sex and body weight so that results can be extrapolated to the human model. The objective was to develop reference curves for assessing the physical growth of male Wistar rats according to chronological age and somatic maturation, using a non-invasive approach. The subjects were 731 male Wistar rats studied cross-sectionally. We assessed age, body weight and body surface. The LMS method was used to construct percentile curves based on weight and somatic maturation. The proposed physical growth curves can be used to track the physical growth and to diagnose the nutritional status of male Wistar rats. The proposed cut-off percentiles are P3, P10, P25, P50, P75, P90 and P97. The results suggest that scientists from different areas can use these references to relate the somatic growth phases of the laboratory rat to the human model, as a non-invasive alternative for assessing growth and nutritional status. Copyright AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.
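
    The LMS method mentioned above summarises each age group by a Box-Cox power (L), a median (M) and a coefficient of variation (S); any percentile is then recovered as M(1 + L·S·z)^(1/L), where z is the standard-normal quantile. The parameter values in the sketch are placeholders, not the published reference values.

```python
import numpy as np
from scipy.stats import norm

def lms_percentile(L, M, S, centile):
    """Value of a given centile (e.g. 3, 50, 97) from LMS parameters."""
    z = norm.ppf(centile / 100.0)
    if abs(L) < 1e-8:                       # limiting case L -> 0
        return M * np.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# Hypothetical LMS parameters for body weight (g) at a single age point.
L, M, S = -0.5, 320.0, 0.12
for c in (3, 10, 25, 50, 75, 90, 97):
    print(f"P{c:<2d} = {lms_percentile(L, M, S, c):6.1f} g")
```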

  2. An assessment of household electricity load curves and corresponding CO2 marginal abatement cost curves for Gujarat state, India

    International Nuclear Information System (INIS)

    Garg, Amit; Shukla, P.R.; Maheshwari, Jyoti; Upadhyay, Jigeesha

    2014-01-01

    Gujarat, a large industrialized state in India, consumed 67 TWh of electricity in 2009–10, besides experiencing a 4.5% demand–supply shortfall. The residential sector accounted for 15% of the total electricity consumption. We conducted a load research survey across 21 cities and towns of the state to estimate residential electricity load curves, the share of appliances by type and usage patterns for all types of household appliances at utility, geographic, appliance, income and end-use levels. The results indicate that a large scope exists for penetration of energy efficient devices in the residential sector. Marginal Abatement Cost (MAC) curves for electricity and CO2 were generated to analyze the relative attractiveness of energy efficient appliance options. Results indicate that up to 7.9 TWh of electricity can be saved per year with 6.7 Mt-CO2 emissions mitigation at negative or very low CO2 prices of US$ 10/t-CO2. Despite such options existing, their penetration is not realized due to myriad barriers, such as financial, institutional or awareness barriers, and therefore they cannot be taken as baseline options for CO2 emission mitigation regimes. - Highlights: • Residential sector provides focused mitigation opportunities. • Energy efficient space cooling is the main technology transition required. • Almost 26% of residential load could be reduced by DSM measures. • Myriad barriers limit penetration of negative marginal cost efficient options

  3. Assessing patient preferences in heart failure using conjoint methodology

    Directory of Open Access Journals (Sweden)

    Pisa G

    2015-08-01

    Full Text Available Giovanni Pisa,1 Florian Eichmann,1 Stephan Hupfer2 (1Kantar Health GmbH, Munich, Germany; 2Novartis Pharma GmbH, Nuernberg, Germany). Aim: The course of heart failure (HF) is characterized by frequent hospitalizations, a high mortality rate, as well as a severely impaired health-related quality of life (HRQoL). To optimize disease management, understanding of patient preferences is crucial. We aimed to assess patient preferences using conjoint methodology and HRQoL in patients with HF. Methods: Two modules were applied: an initial qualitative module, consisting of in-depth interviews with 12 HF patients, and the main quantitative module in 300 HF patients from across Germany. Patients were stratified according to the time of their last HF hospitalization. Each patient was presented with ten different scenarios during the conjoint exercise. Additionally, patients completed the generic HRQoL instrument, the EuroQol health questionnaire (EQ-5D™). Results: The attribute with the highest relative importance was dyspnea (44%), followed by physical capacity (18%). Of similar importance were exhaustion during mental activities (13%), fear due to HF (13%), and autonomy (12%). The most affected HRQoL dimensions according to the EQ-5D questionnaire were anxiety/depression (23% with severe problems), pain/discomfort (19%), and usual activities (15%). The overall average EQ-5D score was 0.39, with stable, chronic patients (never hospitalized) having a significantly better health state vs the rest of the cohort. Conclusion: This paper analyzed patient preference in HF using a conjoint methodology. The preference weights resulting from the conjoint analysis could be used in future to design HRQoL questionnaires which could better assess patient preferences in HF care. Keywords: heart failure, quality of life, conjoint analysis, utility, patient preference

  4. Enhancement of global flood damage assessments using building material based vulnerability curves

    Science.gov (United States)

    Englhardt, Johanna; de Ruiter, Marleen; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    This study discusses the development of an enhanced approach for flood damage and risk assessments using vulnerability curves that are based on building material information. The approach draws upon common practices in earthquake vulnerability assessments and is an alternative to the land-use or building occupancy approach in flood risk assessment models. The approach is of particular importance for studies where there is a large variation in building material, such as large-scale studies or studies in developing countries. A case study of Ethiopia is used to demonstrate the impact of the different methodological approaches on direct damage assessments due to flooding. Generally, flood damage assessments use damage curves for different land-use or occupancy types (i.e. urban or residential and commercial classes). However, these categories do not necessarily relate directly to vulnerability to damage by flood waters. For this, the construction type and building material may be more important, as is the case in earthquake risk assessments. For this study, we use building material classification data of the PAGER project to define new building-material-based vulnerability classes for flood damage. This approach will be compared to the widely applied land-use-based vulnerability curves such as those used by De Moel et al. (2011). The case of Ethiopia demonstrates and compares the feasibility of this novel flood vulnerability method at the country level, which holds the potential to be scaled up to the global level. The study shows that flood vulnerability based on building material also allows for better differentiation between flood damage in urban and rural settings, opening doors to better links with poverty studies when such exposure data are available. Furthermore, this new approach paves the road to the enhancement of multi-risk assessments, as the method enables the comparison of vulnerability across different natural hazard types that also use material-based vulnerability curves.

  5. An experimental assessment of proposed universal yield curves for secondary electron emission

    International Nuclear Information System (INIS)

    Salehi, M.; Flinn, E.A.

    1980-01-01

    A variety of 'Universal Yield Curves' for the secondary emission process have been proposed. A series of precise measurements of the secondary emission properties of a range of related amorphous semiconducting materials, made under UHV on freshly vacuum-cleaved surfaces, and covering a wide range of primary energies, have recently made possible an accurate assessment of the validity of the various UYC's suggested. It is found that no truly universal curve exists; the atomic number of the target material plays an important part in determining the secondary emission properties. Agarwal's (Proc. Phys. Soc.; 71: 851 (1958)) semi-empirical expression, which takes account of the atomic number and weight, is found to give good agreement for all the materials studied. Further theoretical investigation is required. (author)

  6. The prehospital intravenous access assessment: a prospective study on intravenous access failure and access delay in prehospital emergency medicine.

    Science.gov (United States)

    Prottengeier, Johannes; Albermann, Matthias; Heinrich, Sebastian; Birkholz, Torsten; Gall, Christine; Schmidt, Joachim

    2016-12-01

    Intravenous access in prehospital emergency care allows for early administration of medication and extended measures such as anaesthesia. Cannulation may, however, be difficult, and failure and the resulting delay in treatment and transport may have negative effects on the patient. Therefore, our study aims to perform a concise assessment of the difficulties of prehospital venous cannulation. We analysed 23 candidate predictor variables on peripheral venous cannulations in terms of cannulation failure and exceedance of a 2 min time threshold. Multivariate logistic regression models were fitted for variables of predictive value (area under the curve > 0.6 of their respective receiver operating characteristic curve). A total of 762 intravenous cannulations were enrolled. In all, 22% of punctures failed on the first attempt and 13% of punctures exceeded 2 min. Model selection yielded a three-factor model (vein visibility without tourniquet, vein palpability with tourniquet and insufficient ambient lighting) of fair accuracy for the prediction of puncture failure (AUC=0.76) and a structurally congruent model of four factors (failure model factors plus vein visibility with tourniquet) for the exceedance of the 2 min threshold (AUC=0.80). Our study offers a simple assessment to identify cases of difficult intravenous access in prehospital emergency care. Of the numerous factors subjectively perceived as possibly exerting influences on cannulation, only the universal - not exclusive to emergency care - factors of lighting, vein visibility and palpability proved to be valid predictors of cannulation failure and exceedance of a 2 min threshold.

  7. Sequential decision reliability concept and failure rate assessment

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1990-11-01

    Conventionally, a reliability concept is considered both for each basic unit and for their integration in a complicated large-scale system such as a nuclear power plant (NPP). Basically, as the plant's operational status is determined by the information obtained from various sensors, the plant's reliability and the risk assessment are closely related to the reliability of the sensory information and hence the sensor components. However, considering the relevant information-processing systems, e.g. fault detection processors, there exists a further question about the reliability of such systems, specifically the reliability of the systems' decision-based outcomes by means of which further actions are performed. To this end, a general sequential decision reliability concept and a failure rate assessment methodology are introduced. The implications of the methodology are investigated and the importance of the decision reliability concept in system operation is demonstrated by means of sensory signals in real time from the Borssele NPP in the Netherlands. (author). 21 refs.; 8 figs.

  8. Elastic-plastic fracture assessment using a J-R curve by direct method

    International Nuclear Information System (INIS)

    Asta, E.P.

    1996-01-01

    In elastic-plastic evaluation methods based on J-integral and tearing modulus procedures, an essential input is the material fracture resistance (J-R) curve. In order to simplify J-R determination, a direct method based on load versus load-point displacement records from single-specimen tests may be employed. This procedure has advantages such as avoiding the accuracy problems of crack growth measuring devices and reducing testing time. This paper presents a structural integrity assessment approach for ductile fracture using the J-R curve obtained by a direct method from small single-specimen fracture tests. The J-R direct method was implemented by means of a computational program developed from theoretical elastic-plastic expressions. A comparative evaluation between the direct-method J resistance curves and those obtained by the standard testing methodology on typical pressure vessel steels has been made. The J-R curves estimated by the direct method show acceptable agreement with those from the standard methodology, and the approach proposed in this study is reliable enough for engineering determinations. (orig.)
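
    J-R data of the kind discussed above are commonly summarised by a power law J = C1·(Δa)^C2 (the form used in ASTM E1820); whether the direct method of this paper uses that exact parameterisation is not stated, so the fit below is a generic illustration on invented data.

```python
import numpy as np

# Hypothetical (crack extension in mm, J in kJ/m^2) pairs from a single-specimen test.
da = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 2.5])
J  = np.array([150.0, 260.0, 390.0, 480.0, 560.0, 620.0])

# Fit J = C1 * da**C2 by linear regression in log-log space.
C2, lnC1 = np.polyfit(np.log(da), np.log(J), 1)
C1 = np.exp(lnC1)
print(f"J-R curve fit: J = {C1:.0f} * da**{C2:.2f}")
print("J at 1.2 mm of stable crack extension:", round(C1 * 1.2 ** C2, 1))
```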

  9. Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.

    2017-10-01

    When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach), which departs from a definition of consistency commonly used in operational hydrology. A period is considered to be consistent if no consecutive and systematic deviations from a current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed at each observation by indicating the outermost data points for which the rating curve model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency and, if not resolved, could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each country, regional information is used as much as possible to estimate observational uncertainty. Based on this uncertainty, a BReach analysis is performed and, subsequently, results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear to be consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also to enhance applications based on these data (e.g., by informing hydrological and hydraulic model

  10. Use of Master Curve technology for assessing shallow flaws in a reactor pressure vessel material

    International Nuclear Information System (INIS)

    Bass, Bennett Richard; Taylor, Nigel

    2006-01-01

    In the NESC-IV project an experimental/analytical program was performed to develop validated analysis methods for transferring fracture toughness data to shallow flaws in reactor pressure vessels subject to biaxial loading in the lower-transition temperature region. Within this scope an extensive range of fracture tests was performed on material removed from a production-quality reactor pressure vessel. The Master Curve analysis of these data is reported, together with its application to the assessment of the project's feature tests on large beam test pieces.

  11. New method of safety assessment for pressure vessel of nuclear power plant--brief introduction of master curve approach

    International Nuclear Information System (INIS)

    Yang Wendou

    2011-01-01

    The new Master Curve method has been called a revolutionary advance in the assessment of reactor pressure vessel integrity in the USA. This paper explains the origin, basis and standardization of the Master Curve, starting from the reactor pressure-temperature limit curve that assures the safety of the nuclear power plant. Because brittle fracture is highly sensitive to the microstructure, the theory and test method of the Master Curve, as well as its statistical behaviour, which can be modeled using a Weibull distribution, are described in this paper. The meaning, advantages, application and importance of the Master Curve, as well as the relation between the Master Curve and nuclear power safety, are illustrated through the Weibull fitting formula for the fracture toughness database. (author)
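
    In the Master Curve framework the scatter referred to above is described by a three-parameter Weibull distribution with a fixed shape of 4 and a 20 MPa√m threshold (the ASTM E1921 form). The sketch below evaluates that cumulative failure probability for an assumed, purely illustrative scale parameter K0.

```python
import numpy as np

def p_fracture(K_Jc, K0, K_min=20.0, m=4.0):
    """Cumulative cleavage-fracture probability of the Master Curve Weibull model:
    P_f = 1 - exp(-((K_Jc - K_min)/(K0 - K_min))**m), toughness in MPa*sqrt(m)."""
    K_Jc = np.asarray(K_Jc, dtype=float)
    return 1.0 - np.exp(-((K_Jc - K_min) / (K0 - K_min)) ** m)

K0 = 110.0   # assumed scale parameter (the 63.2 % fracture-probability level)
for K in (40.0, 80.0, 110.0, 150.0):
    print(f"K_Jc = {K:5.1f} MPa*sqrt(m)  ->  P_f = {p_fracture(K, K0):.3f}")
```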

  12. Frailty Assessment in Heart Failure: an Overview of the Multi-domain Approach.

    Science.gov (United States)

    McDonagh, Julee; Ferguson, Caleb; Newton, Phillip J

    2018-02-01

    The study aims (1) to provide a contemporary description of frailty assessment in heart failure and (2) to provide an overview of multi-domain frailty assessment in heart failure. Frailty assessment is an important predictive measure for mortality and hospitalisation in individuals with heart failure. To date, there are no frailty assessment instruments validated for use in heart failure. This has resulted in significant heterogeneity between studies regarding the assessment of frailty. The most common frailty assessment instrument used in heart failure is the Frailty Phenotype, which focuses on five physical domains of frailty; the appropriateness of a purely physical measure of frailty in individuals with heart failure, who frequently experience decreased exercise tolerance and shortness of breath, is yet to be determined. A limited number of studies have approached frailty assessment using a multi-domain view, which may be more clinically relevant in heart failure. There remains a lack of consensus regarding frailty assessment and an absence of a validated instrument in heart failure. Despite this, frailty continues to be assessed frequently, primarily for research purposes, using predominantly physical frailty measures. A more multidimensional view of frailty assessment using a multi-domain approach will likely be more sensitive in identifying at-risk patients.

  13. A method to assign failure rates for piping reliability assessments

    International Nuclear Information System (INIS)

    Gamble, R.M.; Tagart, S.W. Jr.

    1991-01-01

    This paper reports on a simplified method that has been developed to assign failure rates that can be used in reliability and risk studies of piping. The method can be applied on a line-by-line basis by identifying line and location specific attributes that can lead to piping unreliability from in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability also was performed. The data from this survey provides a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies

  14. Service reliability assessment using failure mode and effect analysis ...

    African Journals Online (AJOL)

    Statistical Process Control Teng and Ho (1996) .... are still remaining left on modelling the interaction between impact of internal service failure and ..... Design error proofing: development of automated error-proofing information systems, Proceedings of.

  15. Assessment of left ventricular function in patients with atrial fibrillation by left ventricular filling and function curves determined by ECG gated blood pool scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Inagaki, Suetsugu

    1986-06-01

    Accurate cardiac function in patients with atrial fibrillation (Af) is difficult to assess, since the wide fluctuation of the cardiac cycle makes the ventricular hemodynamics variable. Although ECG gated blood pool scintigraphy (EGBPS) is useful for evaluating left ventricular (LV) function, conventional EGBPS can be problematic when applied to Af. Therefore, a new processing algorithm was devised to produce multiple gated images discriminated by preceding R-R intervals (PRR), and LV filling and function curves were obtained in 62 patients with Af to evaluate LV function. The LV filling curve, obtained by plotting end-diastolic volume (EDV) against PRR, demonstrated that blood filling was impaired in mitral stenosis and constrictive pericarditis, but recovered after mitral commissurotomy. The LV function curve, obtained by plotting stroke volume (SV) against EDV, was quantitatively analysed using indices such as Slope and Position. Both indices were significantly reduced in heart failure. When compared among underlying diseases individually, the indices decreased in the following order: lone Af, hyperthyroidism, senile Af, hypertension, mitral valve disease, ischemic heart disease, dilated cardiomyopathy and aortic regurgitation. After treatment with digitalis and/or diuretics, a left and upward shift of the function curve was observed. The rise in heart rate induced by atropine infusion left Slope and Position unchanged, which implies that the function curve is little influenced by heart rate per se. The rise in systolic blood pressure induced by angiotensin-II infusion caused shifts in the function curve to the right and downward. The downward shift, mostly seen in patients with a gentler slope in the control state, may imply afterload mismatch due to a decrease in preload reserve. (J.P.N.).

  16. Ambiguity assessment of small-angle scattering curves from monodisperse systems.

    Science.gov (United States)

    Petoukhov, Maxim V; Svergun, Dmitri I

    2015-05-01

    A novel approach is presented for an a priori assessment of the ambiguity associated with spherically averaged single-particle scattering. The approach is of broad interest to the structural biology community, allowing the rapid and model-independent assessment of the inherent non-uniqueness of three-dimensional shape reconstruction from scattering experiments on solutions of biological macromolecules. One-dimensional scattering curves recorded from monodisperse systems are nowadays routinely utilized to generate low-resolution particle shapes, but the potential ambiguity of such reconstructions remains a major issue. At present, the (non)uniqueness can only be assessed by a posteriori comparison and averaging of repetitive Monte Carlo-based shape-determination runs. The new a priori ambiguity measure is based on the number of distinct shape categories compatible with a given data set. For this purpose, a comprehensive library of over 14,000 shape topologies has been generated containing up to seven beads closely packed on a hexagonal grid. The computed scattering curves rescaled to keep only the shape topology rather than the overall size information provide a `scattering map' of this set of shapes. For a given scattering data set, one rapidly obtains the number of neighbours in the map and the associated shape topologies such that in addition to providing a quantitative ambiguity measure the algorithm may also serve as an alternative shape-analysis tool. The approach has been validated in model calculations on geometrical bodies and its usefulness is further demonstrated on a number of experimental X-ray scattering data sets from proteins in solution. A quantitative ambiguity score (a-score) is introduced to provide immediate and convenient guidance to the user on the uniqueness of the ab initio shape reconstruction from the given data set.

  17. Assessment of modification level of hypoeutectic Al -Si alloys by pattern recognition of cooling curves

    Directory of Open Access Journals (Sweden)

    CHEN Xiang

    2005-11-01

    Full Text Available Most evaluations of modification level in current Al-Si casting production are done as a qualitative analysis according to a specific scale based on an American Foundry Society (AFS) standard wall chart. This method is quite dependent on human experience when comparing the microstructure with the standard chart, and the structures depicted in the AFS chart do not always resemble those seen in actual Al-Si castings. Therefore, this qualitative-analysis procedure is subjective and can introduce human-caused errors into comparative metallographic analyses. A quantitative parameter of the modification level was introduced by establishing the relationship between the mean area-weighted shape factor of the eutectic silicon phase and the modification level, using image analysis technology. In order to evaluate the modification level, a new method called "intelligent evaluation of melt quality by pattern recognition of thermal analysis cooling curves" has also been introduced. The results show that the silicon modification level can be precisely assessed by comparing the cooling curve of the melt to be evaluated with the one most similar to it in a database.

  18. Assessment of diagnostic value of tumor markers for colorectal neoplasm by logistic regression and ROC curve

    International Nuclear Information System (INIS)

    Ping, G.

    2007-01-01

    Full text: Objective: To assess the diagnostic value of CEA, CA199 and CA50 for colorectal neoplasm by logistic regression and ROC curves. Methods: The subjects included 75 patients with colorectal cancer, 35 patients with benign intestinal disease and 49 healthy controls. CEA, CA199 and CA50 were measured by CLIA, ECLIA and IRMA, respectively. The areas under the curve (AUC) of CEA, CA199 and CA50 and of the logistic regression combinations were compared. Results: In the cancer-benign group, the AUC of CA50 was larger than that of CA199. Compared with the AUC of the combination of CEA, CA199 and CA50 (0.604), the AUC of the combination of CEA and CA50 (0.875) was larger, and it was also larger than the AUC of CEA, CA199 or CA50 alone. In the cancer-health group, the AUC of the combination of CEA, CA199 and CA50 was larger than the AUC of CEA, CA199 or CA50 alone. In both the cancer-benign and the cancer-health groups, the AUC of CEA was larger than that of CA199 or CA50. Conclusion: CEA is useful in the diagnosis of colorectal cancer. In differential diagnosis, the combination of CEA and CA50 can give more information, while the combination of the three tumor markers does not perform well. Furthermore, as a statistical method, logistic regression can improve the diagnostic sensitivity and specificity. (author)
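
    The comparison carried out above (AUC of each marker alone versus a logistic-regression combination) can be reproduced on any data set with a few lines of scikit-learn. The values below are synthetic and the resulting numbers have no relation to the study's results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)                  # 1 = cancer, 0 = control (synthetic labels)
cea  = rng.normal(2.0 + 3.0 * y, 2.0, n)   # synthetic marker values
ca50 = rng.normal(1.0 + 2.0 * y, 2.0, n)

print("AUC of CEA alone :", round(roc_auc_score(y, cea), 3))
print("AUC of CA50 alone:", round(roc_auc_score(y, ca50), 3))

# Logistic combination of the two markers, scored by its predicted probability.
X = np.column_stack([cea, ca50])
combined = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
print("AUC of CEA + CA50:", round(roc_auc_score(y, combined), 3))
```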

  19. Tourism and solid waste generation in Europe: A panel data assessment of the Environmental Kuznets Curve.

    Science.gov (United States)

    Arbulú, Italo; Lozano, Javier; Rey-Maquieira, Javier

    2015-12-01

    The relationship between tourism growth and municipal solid waste (MSW) generation has been, until now, the subject of little research. This is puzzling since the tourism sector is an important MSW generator and, at the same time, is willing to avoid negative impacts from MSW mismanagement. This paper aims to provide tools for tourism and MSW management by assessing the effects of tourism volume, tourism quality and tourism specialization on MSW generation in the EU. This is done using the Environmental Kuznets Curve (EKC) framework. The study considers panel data for 32 European economies over the 1997-2010 period. Empirical results support the EKC hypothesis for MSW and show that northern countries tend to have lower income elasticity than less developed countries; furthermore, the results confirm a non-linear and significant effect of tourism arrivals, expenditure per tourist and tourism specialization on MSW generation. Copyright © 2015 Elsevier Ltd. All rights reserved.
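
    An EKC test of the kind described above amounts to regressing (log) waste generation on (log) income and its square and checking for an inverted U. The stripped-down pooled OLS sketch below ignores the panel structure (fixed effects, tourism covariates) the paper would use and runs on synthetic data only.

```python
import numpy as np

rng = np.random.default_rng(1)
ln_gdp = rng.uniform(8.5, 11.0, 300)                 # synthetic log income per capita
# Synthetic inverted-U relation with noise (turning point near ln(GDP) = 10.4).
ln_msw = -40.0 + 8.3 * ln_gdp - 0.4 * ln_gdp**2 + rng.normal(0.0, 0.05, 300)

X = np.column_stack([np.ones_like(ln_gdp), ln_gdp, ln_gdp**2])
(b0, b1, b2), *_ = np.linalg.lstsq(X, ln_msw, rcond=None)

print("EKC (inverted U) supported:", b1 > 0 and b2 < 0)
print("estimated turning point, income per capita:", round(np.exp(-b1 / (2.0 * b2))))
```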

  20. An investigation on vulnerability assessment of steel structures with thin steel shear wall through development of fragility curves

    Directory of Open Access Journals (Sweden)

    Mohsen Gerami

    2017-02-01

    Full Text Available Fragility curves play an important role in damage assessment of buildings. The probability of damage to a structure under seismic events can be investigated once the aforementioned curves have been generated. In the current research, 360 time-history analyses have been carried out on structures of 3, 10 and 20 stories, and fragility curves have subsequently been developed. The curves are developed based on two indices: inter-story drift and equivalent strip axial strain of the shear wall. Time-history analysis is carried out in Perform 3D considering 10 far-field and 10 near-field seismograms. Analysis of low-rise structures revealed that they are more vulnerable at accelerations lower than 0.8 g in near-field earthquakes because of higher-mode effects. From the generated fragility curves it was observed that mid-rise and high-rise structures have more acceptable performance and lower damage levels than low-rise structures under both near-field and far-field seismic hazards.

  1. ASSESSMENT OF BUILDING FAILURES IN NIGERIA: LAGOS AND ...

    African Journals Online (AJOL)

    Common failures seen on buildings were wall cracking, wall spalling, foundation settlement, column buckling, etc. Proper assurance of competent professionals and strict enforcement of ethical standards by the Nigerian Society of Engineers, the Nigerian Institute of Building, and the Nigerian Institute of Architects would ...

  2. Correlation model to analyze dependent failures for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Dezfuli, H.

    1985-01-01

    A methodology is formulated to study the dependent (correlated) failures of various abnormal events in nuclear power plants. This methodology uses correlation analysis as a means of predicting and quantifying dependent failures. Appropriate techniques are also developed to incorporate dependent failures in the quantification of fault trees and accident sequences. The uncertainty associated with each estimate in all of the developed techniques is addressed and quantified. To identify the relative importance of the degree of dependency (correlation) among events and to incorporate these dependencies in the quantification phase of PRA, the interdependency between a pair of events is expressed with the aid of the correlation coefficient. For the purpose of demonstrating the methodology, the database used in the Accident Sequence Precursor (ASP) study was adopted and simulated to obtain distributions for the correlation coefficients. A computer program entitled Correlation Coefficient Generator (CCG) was developed to generate a distribution for each correlation coefficient. The bootstrap technique was employed in the CCG computer code to determine confidence limits of the estimated correlation coefficients. A second computer program designated CORRELATE was also developed to obtain probability intervals for both fault trees and accident sequences with statistically correlated failure data.
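
    The bootstrap step mentioned in this record can be sketched in a few lines: resample the paired observations with replacement, recompute the correlation coefficient for each replicate, and take percentiles of the replicates as confidence limits. The data below are synthetic placeholders, not the ASP database, and the percentile bootstrap is only one of several interval constructions the CCG code might have used.

      import numpy as np

      rng = np.random.default_rng(1)
      x = rng.normal(size=200)                       # synthetic paired observations
      y = 0.6 * x + rng.normal(scale=0.8, size=200)

      def bootstrap_corr_ci(x, y, n_boot=5000, alpha=0.05):
          """Percentile-bootstrap confidence interval for Pearson's correlation coefficient."""
          n = len(x)
          reps = np.empty(n_boot)
          for b in range(n_boot):
              idx = rng.integers(0, n, size=n)       # resample index pairs with replacement
              reps[b] = np.corrcoef(x[idx], y[idx])[0, 1]
          lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
          return np.corrcoef(x, y)[0, 1], lo, hi

      r, lo, hi = bootstrap_corr_ci(x, y)
      print(f"r = {r:.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")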

  3. An assessment of the BEST procedure to estimate the soil water retention curve

    Science.gov (United States)

    Castellini, Mirko; Di Prima, Simone; Iovino, Massimo

    2017-04-01

    The Beerkan Estimation of Soil Transfer parameters (BEST) procedure is a very attractive method to obtain a complete hydraulic characterization of the soil accurately and quickly (Lassabatère et al., 2006). However, further investigations are needed to check the prediction reliability of the soil water retention curve (Castellini et al., 2016). Four soils with different physical properties (texture, bulk density, porosity and stoniness) were considered in this investigation. Sites of measurement were located at Palermo University (PAL site) and Villabate (VIL site) in Sicily, Arborea (ARB site) in Sardinia, and Foggia (FOG site) in Apulia. For a given site, the BEST procedure was applied and the water retention curve was estimated using the available BEST algorithms (i.e., slope, intercept and steady), with the reference values of the infiltration constants (β=0.6 and γ=0.75). The water retention curves estimated by BEST were then compared with those obtained in the laboratory by the evaporation method (Wind, 1968). About ten experiments were carried out with both methods. A sensitivity analysis of the constants β and γ within their feasible ranges of variability was also carried out; it showed that S tended to increase for increasing β values and decreasing γ values for all the BEST algorithms and soils. On the other hand, Ks tended to decrease for increasing β and γ values. Our results also reveal that: i) the BEST-intercept and BEST-steady algorithms yield lower S and higher Ks values than BEST-slope; ii) these algorithms also yield more variable values. For the latter, a higher sensitivity of these two alternative algorithms to β than to γ was established. The lower sensitivity to γ may imply an insufficient correction of the simplified theoretical description of the parabolic two-dimensional and one-dimensional wetting front along the soil profile (Smettem et al., 1994). This likely resulted in lower S and higher Ks values.

  4. Modelling and assessment of urban flood hazards based on rainfall intensity-duration-frequency curves reformation

    OpenAIRE

    Ghazavi, Reza; Moafi Rabori, Ali; Ahadnejad Reveshty, Mohsen

    2016-01-01

    Estimating the design storm from rainfall intensity-duration-frequency (IDF) curves is an important step in the hydrologic planning of urban areas. The main aim of this study was to estimate rainfall intensities for the Zanjan city watershed based on the overall relationship of the rainfall IDF curves and appropriate models of hourly rainfall estimation (the Sherman method and the Ghahreman and Abkhezr method). The hydrologic and hydraulic impacts of changes in the rainfall IDF curves on flood properties were evaluated via Stormw...

  5. Use of the cumulative sum method (CUSUM) to assess the learning curves of ultrasound-guided continuous femoral nerve block.

    Science.gov (United States)

    Kollmann-Camaiora, A; Brogly, N; Alsina, E; Gilsanz, F

    2017-10-01

    Although ultrasound is a basic competence for anaesthesia residents (AR), there are few data available on the learning process. This prospective observational study aims to assess the learning process of ultrasound-guided continuous femoral nerve block and to determine the number of procedures that a resident would need to perform in order to reach proficiency, using the cumulative sum (CUSUM) method. We recruited 19 AR without previous experience. Learning curves were constructed using the CUSUM method for ultrasound-guided continuous femoral nerve block considering 2 success criteria: a decrease in pain score of >2 on a [0-10] scale after 15 minutes, and the time required to perform the block. We analysed data from 17 AR for a total of 237 ultrasound-guided continuous femoral nerve blocks. 8/17 AR became proficient for pain relief; however, all the AR who did more than 12 blocks (8/8) became proficient. As for performance time, 5/17 AR achieved the objective of 12 minutes; however, all the AR who did more than 20 blocks (4/4) achieved it. The number of procedures needed to achieve proficiency seems to be 12; however, it takes more procedures to reduce performance time. The CUSUM methodology could be useful in training programs to allow early interventions in case of repeated failures and to develop a competence-based curriculum. Copyright © 2017 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.
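
    A simple form of the CUSUM learning curve plots, procedure by procedure, the cumulative sum of observed failures minus an acceptable failure rate: an upward trend flags performance worse than the target, while a flattening or downward trend suggests proficiency. The sketch below uses made-up outcomes and an assumed acceptable failure rate of 20%; the study's exact CUSUM formulation and decision boundaries may differ.

      import numpy as np
      import matplotlib.pyplot as plt

      outcomes = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0])  # 1 = failed block (synthetic)
      p0 = 0.20                                    # assumed acceptable failure rate

      cusum = np.cumsum(outcomes - p0)             # rises while failing more often than p0

      plt.step(np.arange(1, len(cusum) + 1), cusum, where="mid")
      plt.axhline(0.0, color="grey", linewidth=0.8)
      plt.xlabel("procedure number")
      plt.ylabel("CUSUM")
      plt.show()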

  6. Assessment of pedotransfer functions for estimating soil water retention curves for the amazon region

    Directory of Open Access Journals (Sweden)

    João Carlos Medeiros

    2014-06-01

    Full Text Available Knowledge of the soil water retention curve (SWRC) is essential for understanding and modeling hydraulic processes in the soil. However, direct determination of the SWRC is time consuming and costly. In addition, it requires a large number of samples, due to the high spatial and temporal variability of soil hydraulic properties. An alternative is the use of models, called pedotransfer functions (PTFs), which estimate the SWRC from easy-to-measure properties. The aim of this paper was to test the accuracy of 16 point or parametric PTFs reported in the literature on different soils from the south and southeast of the State of Pará, Brazil. The PTFs tested were proposed by Pidgeon (1972), Lal (1979), Aina & Periaswamy (1985), Arruda et al. (1987), Dijkerman (1988), Vereecken et al. (1989), Batjes (1996), van den Berg et al. (1997), Tomasella et al. (2000), Hodnett & Tomasella (2002), Oliveira et al. (2002), and Barros (2010). We used a database that includes soil texture (sand, silt, and clay), bulk density, soil organic carbon, soil pH, cation exchange capacity, and the SWRC. Most of the PTFs tested did not show good performance in estimating the SWRC. The parametric PTFs, however, performed better than the point PTFs in assessing the SWRC in the tested region. Among the parametric PTFs, those proposed by Tomasella et al. (2000) achieved the best accuracy in estimating the empirical parameters of the van Genuchten (1980) model, especially when tested in the topsoil layer.
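
    The parametric PTFs referred to above target the van Genuchten (1980) retention model, θ(h) = θr + (θs - θr) / [1 + (αh)^n]^m with m = 1 - 1/n. When measured retention pairs are available, the same model can be fitted directly, as in the sketch below; the suction heads, water contents and parameter bounds are invented for illustration only.

      import numpy as np
      from scipy.optimize import curve_fit

      def van_genuchten(h, theta_r, theta_s, alpha, n):
          """van Genuchten (1980) retention model with m = 1 - 1/n; h is suction head in cm."""
          m = 1.0 - 1.0 / n
          return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

      h_obs = np.array([1, 10, 30, 100, 330, 1000, 5000, 15000], dtype=float)  # illustrative data
      theta_obs = np.array([0.45, 0.43, 0.40, 0.33, 0.27, 0.21, 0.15, 0.11])

      popt, _ = curve_fit(van_genuchten, h_obs, theta_obs,
                          p0=[0.05, 0.45, 0.02, 1.5],
                          bounds=([0, 0.2, 1e-4, 1.05], [0.2, 0.6, 1.0, 4.0]))
      print(dict(zip(["theta_r", "theta_s", "alpha", "n"], np.round(popt, 4))))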

  7. Failure assessment techniques to ensure shipping container integrity

    International Nuclear Information System (INIS)

    McConnell, P.

    1986-02-01

    This report discusses several methodologies which may be used to ensure the structural integrity of containment systems to be used for the transport and storage of high-level radioactive substances. For economic reasons, shipping containers constructed of ferritic materials are being considered for manufacture by vendors in the US and Europe. Ferritic steels show an inherent transition from a ductile, high-energy failure mode to a brittle, low-energy fracture mode with decreasing temperature. Therefore, formal consideration of means by which to avoid unstable brittle fracture is necessary prior to the licensing of ferritic casks. It is suggested that failure of a shipping container wall be defined as occurring when a flaw extends through the outer wall of the containment system. Crack initiation which may lead to unstable brittle crack growth should therefore be prevented. It is suggested that a fundamental linear elastic fracture mechanics (LEFM) approach be adopted on a case-by-case basis, applied perhaps by means of appropriate modifications to ASME Section III or Section XI. An LEFM analysis requires information concerning service temperatures, loading rates, flaw sizes, and applied stresses. Tentative judgments regarding these parameters for typical shipping containers have been made.

  8. Reliability assessment of slender concrete columns at the stability failure

    Science.gov (United States)

    Valašík, Adrián; Benko, Vladimír; Strauss, Alfred; Täubling, Benjamin

    2018-01-01

    The European standard for designing concrete columns using non-linear methods shows deficiencies in terms of global reliability in cases where the columns fail by loss of stability. Buckling failure is a brittle failure which occurs without warning, and the probability of its occurrence depends on the column slenderness. Experiments with slender concrete columns were carried out in cooperation with STRABAG Bratislava LTD in the Central Laboratory of the Faculty of Civil Engineering, SUT in Bratislava. The following article aims to compare the global reliability of slender concrete columns with slenderness of 90 and higher. The columns were designed according to methods offered by EN 1992-1-1 [1]. The experiments were used as the basis for deterministic non-linear modelling of the columns and the subsequent probabilistic evaluation of the variability of the structural response. The final results may be utilized as loading thresholds for the produced structural elements, and they aim to present probabilistic design as less conservative than the classic partial-safety-factor-based design and the alternative ECOV method.

  9. Clinical use of nuclear cardiology in the assessment of heart failure

    International Nuclear Information System (INIS)

    Han Lei; Shi Hongcheng

    2011-01-01

    Nuclear cardiology is the most commonly performed non-invasive cardiac imaging test in patients with heart failure, and it plays an important role in their assessment and management. Quantitative gated single-photon emission computed tomography is used to quantitatively assess cardiac volume, left ventricular ejection fraction, stroke volume, and cardiac diastolic function. Resting and stress myocardial perfusion imaging can not only distinguish nonischemic from ischemic heart failure, but also demonstrate myocardial viability. Diastolic heart failure, also termed heart failure with a preserved left ventricular ejection fraction, is readily identified by nuclear cardiology techniques and can be accurately estimated by the peak filling rate and time to peak filling rate. Newer techniques such as three-dimensional quantitative gated single-photon emission computed tomography can assess the movement of the left ventricle, and wall-thickening evaluation aids its assessment. Myocardial perfusion imaging is also commonly used to identify candidates for implantable cardiac defibrillator and cardiac resynchronization therapies. Neurotransmitter imaging using 123I-metaiodobenzylguanidine offers prognostic information in patients with heart failure. Metabolism and function in the heart are closely related, and energy substrate metabolism is a potential target of medical therapies to improve cardiac function in patients with heart failure. Cardiac metabolic imaging using 123I-15-(p-iodophenyl)-3-R,S-methylpentadecanoic acid is commonly used in clinical studies to diagnose metabolic abnormalities in heart failure. Nuclear cardiology tests, including neurotransmitter imaging and metabolic imaging, are now easily performed with new tracers to improve heart failure diagnosis. Nuclear cardiology techniques contribute significantly to identifying patients with heart failure and to guiding their management decisions. (authors)

  10. Salivary cortisol day curves in assessing glucocorticoid replacement therapy in Addison's disease.

    Science.gov (United States)

    Smans, Lisanne; Lentjes, Eef; Hermus, Ad; Zelissen, Pierre

    2013-01-01

    Patients with Addison's disease require lifelong treatment with glucocorticoids. At present, no glucocorticoid replacement therapy (GRT) can exactly mimic normal physiology. As a consequence, under- and especially overtreatment can occur. Suboptimal GRT may lead to various side effects. The aim of this study was to investigate the use of salivary cortisol day curves (SCDC) in the individual adjustment of GRT in order to approach normal cortisol levels as closely as possible, reduce over- and underreplacement, and study the short-term effects on quality of life (QoL). Twenty patients with Addison's disease were included in this prospective study. A SCDC was obtained and compared to normal controls; general and disease-specific QoL questionnaires were completed. Based on the SCDC assessment of over- and undertreatment (calculated as duration (h) × magnitude (nmol/L)) at different time points, the glucocorticoid dose and regimen were adjusted. After 4 weeks the SCDC and QoL assessment were repeated and the effect of adjusting GRT was analysed. At baseline, underreplacement was present in 3 and overreplacement in 18 patients; total calculated overreplacement was 32.8 h·nmol/L. Overreplacement decreased significantly to 13.3 h·nmol/L (p = 0.005) after adjustment of GRT. Overreplacement was found particularly in the afternoon and evening. After reducing overreplacement in the evening, complaints about sleep disturbances decreased significantly. Individual adjustment of GRT based on SCDC to approach normal cortisol concentrations during the day can reduce overreplacement, especially in the evening. This can lead to a reduction of sleep disturbances and fatigue in patients with Addison's disease. A SCDC is a simple and patient-friendly tool for adjusting GRT and can be useful in the follow-up of patients with Addison's disease.

  11. Assessment of electronic component failure rates on the basis of experimental data

    International Nuclear Information System (INIS)

    Nitsch, R.

    1991-01-01

    Assessment and prediction of failure rates of electronic systems are made using experimental data derived from laboratory-scale tests or from field experience, for instance from component failure rate statistics or component repair statistics. Some problems and uncertainties encountered in the evaluation of such field data are discussed in the paper. In order to establish a sound basis for the comparative assessment of data from various sources, the items of comparison and the procedure in case of doubt have to be defined. The paper explains two standard methods proposed for practical failure rate definition. (orig.)

  12. Assessment of two theoretical methods to estimate potentiometric titration curves of peptides: comparison with experiment.

    Science.gov (United States)

    Makowska, Joanna; Bagińska, Katarzyna; Makowski, Mariusz; Jagielska, Anna; Liwo, Adam; Kasprzykowski, Franciszek; Chmurzyński, Lech; Scheraga, Harold A

    2006-03-09

    We compared the ability of two theoretical methods of pH-dependent conformational calculations to reproduce experimental potentiometric titration curves of two model peptides: Ac-K5-NHMe in a 95% methanol (MeOH)/5% water mixture and Ac-XX(A)7OO-NH2 (XAO) (where X is diaminobutyric acid, A is alanine, and O is ornithine) in water, methanol (MeOH), and dimethyl sulfoxide (DMSO), respectively. The titration curve of the former was taken from the literature, and the curve of the latter was determined in this work. The first theoretical method involves a conformational search using the electrostatically driven Monte Carlo (EDMC) method with a low-cost energy function (ECEPP/3 plus the SRFOPT surface-solvation model, assuming that all titratable groups are uncharged) and subsequent reevaluation of the free energy at a given pH with the Poisson-Boltzmann equation, considering variable protonation states. In the second procedure, molecular dynamics (MD) simulations are run with the AMBER force field and the generalized Born model of electrostatic solvation, and the protonation states are sampled during constant-pH MD runs. In all three solvents, the first pKa of XAO is strongly downshifted compared to the value for the reference compounds (ethylamine and propylamine, respectively); the water and methanol curves have one, and the DMSO curve has two jumps characteristic of remarkable differences in the dissociation constants of the acidic groups. The predicted titration curves of Ac-K5-NHMe are in good agreement with the experimental ones; better agreement is achieved with the MD-based method. The titration curves of XAO in methanol and DMSO, calculated using the MD-based approach, trace the shape of the experimental curves, reproducing the pH jump, while those calculated with the EDMC-based approach and the titration curve in water calculated using the MD-based approach have smooth shapes characteristic of the titration of weak multifunctional acids with small differences

  13. Reliability-based failure cause assessment of collapsed bridge during construction

    International Nuclear Information System (INIS)

    Choi, Hyun-Ho; Lee, Sang-Yoon; Choi, Il-Yoon; Cho, Hyo-Nam; Mahadevan, Sankaran

    2006-01-01

    Until now, in many forensic reports, failure cause assessments have usually been carried out using a deterministic approach. However, such a forensic investigation may lead to unreasonable results far from the real collapse scenario, because the deterministic approach does not systematically take into account the uncertainties involved in the failures of structures. A reliability-based failure cause assessment (reliability-based forensic engineering) methodology is developed which can incorporate the uncertainties involved in structures and structural failures, and it is applied to a collapsed bridge in order to identify the most critical failure scenario and find the cause that triggered the collapse. Moreover, to save the time and cost of evaluation, an automated event tree analysis (ETA) algorithm is proposed which makes it possible to automatically calculate the failure probabilities of the failure events and the occurrence probabilities of the failure scenarios. Also, for the reliability analysis, uncertainties are estimated more reasonably by using a Bayesian approach based on the experimental laboratory testing data in the forensic report. To demonstrate its applicability, the proposed approach is applied to the Hang-ju Grand Bridge, which collapsed during construction, and compared with the deterministic approach.

  14. Regional Curve Development and Use in Stream Restoration and Hydrologic Assessment in High Gradient Headwater Streams

    Science.gov (United States)

    Introduction to regional curves, including: regressions relating bankfull channel characteristics to drainage area, providing estimates of bankfull discharge and channel geometry, and validating the selection of the bankfull channel as determined in the field.
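
    Regional curves of this kind are usually power-law regressions of bankfull quantities against drainage area, fitted as straight lines in log-log space. The sketch below fits Q_bf = a * A^b to made-up survey data; the values and the single-predictor form are illustrative assumptions only.

      import numpy as np

      area = np.array([2.1, 4.8, 9.5, 15.0, 31.0, 52.0, 88.0])   # drainage areas, km2 (synthetic)
      q_bf = np.array([0.9, 1.8, 3.1, 4.5, 8.2, 12.5, 19.0])     # bankfull discharges, m3/s (synthetic)

      # Fit the power law Q_bf = a * A^b as a straight line in log-log space.
      b, log_a = np.polyfit(np.log(area), np.log(q_bf), 1)
      a = np.exp(log_a)
      print(f"Q_bf ~ {a:.2f} * A^{b:.2f}")
      print("predicted bankfull discharge for a 20 km2 basin:", round(a * 20 ** b, 1), "m3/s")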

  15. Geometric and electromyographic assessments in the evaluation of curve progression in idiopathic scoliosis

    NARCIS (Netherlands)

    Cheung, J; Veldhuizen, AG; Halberts, JPK; Sluiter, WJ; Van Horn, [No Value

    2006-01-01

    Study Design. The natural history of patients with idiopathic scoliosis was analyzed radiographically and electromyographically in a prospective longitudinal study. Objectives. To identify changes in geometric variables and the sequence in which these changes occur during curve progression in the

  16. Exponential Decay Nonlinear Regression Analysis of Patient Survival Curves: Preliminary Assessment in Non-Small Cell Lung Cancer

    Science.gov (United States)

    Stewart, David J.; Behrens, Carmen; Roth, Jack; Wistuba, Ignacio I.

    2010-01-01

    Background For processes that follow first order kinetics, exponential decay nonlinear regression analysis (EDNRA) may delineate curve characteristics and suggest processes affecting curve shape. We conducted a preliminary feasibility assessment of EDNRA of patient survival curves. Methods EDNRA was performed on Kaplan-Meier overall survival (OS) and time-to-relapse (TTR) curves for 323 patients with resected NSCLC and on OS and progression-free survival (PFS) curves from selected publications. Results and Conclusions In our resected patients, TTR curves were triphasic with a “cured” fraction of 60.7% (half-life [t1/2] >100,000 months), a rapidly-relapsing group (7.4%, t1/2=5.9 months) and a slowly-relapsing group (31.9%, t1/2=23.6 months). OS was uniphasic (t1/2=74.3 months), suggesting an impact of co-morbidities; hence, tumor molecular characteristics would more likely predict TTR than OS. Of 172 published curves analyzed, 72 (42%) were uniphasic, 92 (53%) were biphasic, 8 (5%) were triphasic. With first-line chemotherapy in advanced NSCLC, 87.5% of curves from 2-3 drug regimens were uniphasic vs only 20% of those with best supportive care or 1 drug (p<0.001). 54% of curves from 2-3 drug regimens had convex rapid-decay phases vs 0% with fewer agents (p<0.001). Curve convexities suggest that discontinuing chemotherapy after 3-6 cycles “synchronizes” patient progression and death. With postoperative adjuvant chemotherapy, the PFS rapid-decay phase accounted for a smaller proportion of the population than in controls (p=0.02) with no significant difference in rapid-decay t1/2, suggesting adjuvant chemotherapy may move a subpopulation of patients with sensitive tumors from the relapsing group to the cured group, with minimal impact on time to relapse for a larger group of patients with resistant tumors. In untreated patients, the proportion of patients in the rapid-decay phase increased (p=0.04) while rapid-decay t1/2 decreased (p=0.0004) with increasing
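
    The decomposition described in this abstract amounts to fitting a sum of exponential decays, plus a non-relapsing ("cured") fraction, to points read off a survival curve. A minimal sketch of such a fit is given below; the survival proportions, starting values and bounds are invented for illustration, and the published analysis may have used a different parameterization or weighting.

      import numpy as np
      from scipy.optimize import curve_fit

      def two_phase_decay(t, cured, f_fast, t_half_fast, t_half_slow):
          """Cured fraction plus two first-order decay components (half-lives in months)."""
          k_fast = np.log(2) / t_half_fast
          k_slow = np.log(2) / t_half_slow
          f_slow = 1.0 - cured - f_fast
          return cured + f_fast * np.exp(-k_fast * t) + f_slow * np.exp(-k_slow * t)

      t_months = np.array([0, 6, 12, 18, 24, 36, 48, 60, 84, 120], dtype=float)   # illustrative
      surv = np.array([1.0, 0.90, 0.82, 0.76, 0.72, 0.67, 0.64, 0.62, 0.61, 0.60])

      popt, _ = curve_fit(two_phase_decay, t_months, surv,
                          p0=[0.5, 0.1, 6.0, 24.0],
                          bounds=([0, 0, 1, 6], [1, 1, 24, 240]))
      print(dict(zip(["cured", "f_fast", "t1/2_fast", "t1/2_slow"], np.round(popt, 3))))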

  17. Assessing apical transportation in curved canals: comparison between cross-sections and micro-computed tomography

    Directory of Open Access Journals (Sweden)

    Laila Gonzales Freire

    2012-06-01

    Full Text Available The aim of this study was to compare two methods of assessing apical transportation in curved canals after rotary instrumentation, namely cross-sections and micro-computed tomography (µCT). Thirty mandibular molars were divided into two groups and prepared according to the requirements of each method. In G1 (cross-sections), teeth were embedded in resin blocks and sectioned at 2.0, 3.5, and 5.0 mm from the anatomic apex. Pre- and postoperative sections were photographed and analyzed. In G2 (µCT), teeth were embedded in a rubber-base impression material and scanned before and after instrumentation. Mesiobuccal canals were instrumented with the Twisted File (TF) system (SybronEndo, Orange, USA), and mesiolingual canals with the EndoSequence (ES) system (Brasseler, Savannah, USA). Images were reconstructed, and sections corresponding to distances of 2.0, 3.5, and 5.0 mm from the anatomic apex were selected for comparison. Data were analyzed using the Mann-Whitney test at a 5% significance level. The TF and ES instruments produced little deviation from the root canal center, with no statistical difference between them (P > 0.05). The canal transportation results were significantly lower in G2 (0.056 mm) than in G1 (0.089 mm) (p = 0.0012). The µCT method was superior to the cross-section method, especially in view of its ability to preserve specimens and provide results that are more closely related to clinical situations.

  18. Omnibus risk assessment via accelerated failure time kernel machine modeling.

    Science.gov (United States)

    Sinnott, Jennifer A; Cai, Tianxi

    2013-12-01

    Integrating genomic information with traditional clinical risk factors to improve the prediction of disease outcomes could profoundly change the practice of medicine. However, the large number of potential markers and possible complexity of the relationship between markers and disease make it difficult to construct accurate risk prediction models. Standard approaches for identifying important markers often rely on marginal associations or linearity assumptions and may not capture non-linear or interactive effects. In recent years, much work has been done to group genes into pathways and networks. Integrating such biological knowledge into statistical learning could potentially improve model interpretability and reliability. One effective approach is to employ a kernel machine (KM) framework, which can capture nonlinear effects if nonlinear kernels are used (Scholkopf and Smola, 2002; Liu et al., 2007, 2008). For survival outcomes, KM regression modeling and testing procedures have been derived under a proportional hazards (PH) assumption (Li and Luan, 2003; Cai, Tonini, and Lin, 2011). In this article, we derive testing and prediction methods for KM regression under the accelerated failure time (AFT) model, a useful alternative to the PH model. We approximate the null distribution of our test statistic using resampling procedures. When multiple kernels are of potential interest, it may be unclear in advance which kernel to use for testing and estimation. We propose a robust Omnibus Test that combines information across kernels, and an approach for selecting the best kernel for estimation. The methods are illustrated with an application in breast cancer. © 2013, The International Biometric Society.

  19. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    International Nuclear Information System (INIS)

    Harry, T; Manger, R; Cervino, L; Pawlicki, T

    2016-01-01

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Mode and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Mode and Effect Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: HFMEA risk assessment analysis was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA as described by the Veterans Affairs provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth detail, but at the cost of elevated effort.

  20. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Harry, T [Oregon State University, Corvallis, OR (United States); University of California, San Diego, La Jolla, CA (United States); Manger, R; Cervino, L; Pawlicki, T [University of California, San Diego, La Jolla, CA (United States)

    2016-06-15

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Mode and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Mode and Effect Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: HFMEA risk assessment analysis was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA as described by the Veterans Affairs provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth detail, but at the cost of elevated effort.

  1. An assessment of mode-coupling and falling-friction mechanisms in railway curve squeal through a simplified approach

    Science.gov (United States)

    Ding, Bo; Squicciarini, Giacomo; Thompson, David; Corradi, Roberto

    2018-06-01

    Curve squeal is one of the most annoying types of noise caused by the railway system. It usually occurs when a train or tram is running around tight curves. Although this phenomenon has been studied for many years, the generation mechanism is still the subject of controversy and not fully understood. A negative slope in the friction curve under full sliding has been considered to be the main cause of curve squeal for a long time but more recently mode coupling has been demonstrated to be another possible explanation. Mode coupling relies on the inclusion of both the lateral and vertical dynamics at the contact and an exchange of energy occurs between the normal and the axial directions. The purpose of this paper is to assess the role of the mode-coupling and falling-friction mechanisms in curve squeal through the use of a simple approach based on practical parameter values representative of an actual situation. A tramway wheel is adopted to study the effect of the adhesion coefficient, the lateral contact position, the contact angle and the damping ratio. Cases corresponding to both inner and outer wheels in the curve are considered and it is shown that there are situations in which both wheels can squeal due to mode coupling. Additionally, a negative slope is introduced in the friction curve while keeping active the vertical dynamics in order to analyse both mechanisms together. It is shown that, in the presence of mode coupling, the squealing frequency can differ from the natural frequency of either of the coupled wheel modes. Moreover, a phase difference between wheel vibration in the vertical and lateral directions is observed as a characteristic of mode coupling. For both these features a qualitative comparison is shown with field measurements which show the same behaviour.

  2. The Impact of Grading on a Curve: Assessing the Results of Kulick and Wright's Simulation Analysis

    Science.gov (United States)

    Bailey, Gary L.; Steed, Ronald C.

    2012-01-01

    Kulick and Wright concluded, based on theoretical mathematical simulations of hypothetical student exam scores, that assigning exam grades to students based on the relative position of their exam performance scores within a normal curve may be unfair, given the role that randomness plays in any given student's performance on any given exam.…

  3. Correlation of radiological assessment of congestive heart failure with left ventricular end-diastolic pressure

    International Nuclear Information System (INIS)

    Herman, P.G.; Kahn, A.; Kallman, C.E.; Rojas, K.A.; Bodenheimer, M.M.

    1988-01-01

    Left ventricular end-diastolic pressure (LVEDP) has been considered a reliable indicator of left ventricular function. The purpose of this study was to correlate the radiologic assessment of congestive heart failure with LVEDP. The study population consisted of 85 consecutive cases grouped into four ranges of LVEDP. The PA chest radiographs obtained 1 day prior to cardiac catheterization were assessed for radiological evidence of congestive heart failure and were graded from normal to abnormal (0-3). The results will be summarized in the authors' presentation. The discordance of the radiological assessment of congestive heart failure in patients with elevated LVEDP will be discussed in light of recent advances in the pathophysiologic understanding of left ventricular function and the impact of new classes of drugs in the management of these patients.

  4. Probabilistic evaluation of design S-N curve and reliability assessment of ASME code-based evaluation

    International Nuclear Information System (INIS)

    Zhao Yongxiang

    1999-01-01

    A probabilistic approach to evaluating the design S-N curve and a reliability assessment of the ASME code-based evaluation are presented on the basis of Langer S-N model-based P-S-N curves. The P-S-N curves are estimated by a so-called general maximum likelihood method. This method can be applied to virtual stress amplitude-crack initiation life data, which have the characteristic of two random variables. Investigation of a set of virtual stress amplitude-crack initiation life (S-N) data for 1Cr18Ni9Ti austenitic stainless steel welded joints reveals that the P-S-N curves can give a good prediction of the scatter of the S-N data. The probabilistic evaluation of the design S-N curve with 0.9999 survival probability considers various uncertainties, besides the scatter of the S-N data, to an appropriate extent. The ASME code-based evaluation with a reduction factor of 20 on mean life is much more conservative than that with a reduction factor of 2 on stress amplitude. Evaluation of the latter at a virtual stress amplitude of 666.61 MPa is equivalent to a survival probability of 0.999522, and at 2092.18 MPa to a survival probability of 0.9999999995. This means that the evaluation may be non-conservative at low loading levels and, in contrast, too conservative at high loading levels. The cause is that the reduction factors are constants and cannot take into account the general observation that the scatter of the N data increases as the loading level decreases. This indicates that it is necessary to apply the probabilistic approach to the evaluation of the design S-N curve.

  5. Physics of Failure as a Basis for Solder Elements Reliability Assessment in Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    ... description of the reliability. A physics of failure approach is applied. A SnAg solder component used in power electronics is used as an example. Crack propagation in the SnAg solder is modeled, and a model to assess the accumulated plastic strain is proposed based on a physics of failure approach. Based on the proposed model, it is described how to find the accumulated linear damage and reliability levels for a given temperature loading profile. Using structural reliability methods, the reliability levels of the electrical components are assessed by introducing scale factors for stresses.
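
    The "accumulated linear damage" mentioned here is normally computed with Miner's rule: the damage contributed by each load level is the ratio of applied cycles to the cycles-to-failure at that level, and failure is predicted when the sum reaches one. The cycle counts below are placeholders, not solder test data, and the model in the record also involves a crack-propagation and plastic-strain analysis that this sketch omits.

      # Miner (linear) damage accumulation for an assumed temperature-cycling profile.
      loading = [
          # (applied cycles at this load level, cycles-to-failure at this load level)
          (2.0e4, 1.0e5),
          (5.0e3, 2.0e4),
          (1.0e3, 5.0e3),
      ]

      damage = sum(n_i / N_i for n_i, N_i in loading)
      print(f"accumulated damage D = {damage:.2f}")
      print("failure predicted" if damage >= 1.0 else f"remaining life fraction = {1.0 - damage:.2f}")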

  6. Application of Master Curve fracture toughness for reactor pressure vessel integrity assessment in the USA

    International Nuclear Information System (INIS)

    Server, William; Rosinski, Stan; Lott, Randy; Kim, Charles; Weakland, Dennis

    2002-01-01

    The Master Curve fracture toughness approach has been used in the USA to better define the transition-temperature fracture toughness of irradiated reactor pressure vessel (RPV) steels for end-of-life (EOL) and EOL extension (EOLE) time periods. The first application was for the Kewaunee plant, in which the life-limiting material was a circumferential weld metal. Fracture toughness testing of this weld metal corresponding to EOL and beyond EOLE was used to reassess the PTS screening value, RT_PTS, and to develop new operating pressure-temperature curves. The NRC approved this application using a shift-based methodology and higher safety margins than those proposed by the utility and its contractors. Beaver Valley Unit 1, a First Energy nuclear plant, has performed similar fracture toughness testing, but none of the testing has been conducted at EOL or EOLE at this time. Therefore, extrapolation of the life-limiting plate data to higher fluences is necessary, and the projections will be checked in the next decade by Master Curve fracture toughness testing of all of the Beaver Valley Unit 1 beltline materials (three plates and three welds) at fluences near or greater than EOLE. A supplemental surveillance capsule has been installed in the sister plant, Beaver Valley Unit 2, which has the capability of achieving a higher lead factor while operating under essentially the same environment. The Beaver Valley Unit 1 evaluation has been submitted to the NRC. This paper reviews the shift-based approach taken for the Beaver Valley Unit 1 RPV and presents the use of the RT_T0 methodology (which evolved out of Master Curve testing and was endorsed through two ASME Code Cases). The applied margin accounts for uncertainties in the various material parameters. A discussion of the direct-measurement RT_T0 approach, as originally submitted for the Kewaunee case, is also presented.
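
    For orientation, the Master Curve referred to here describes the median fracture toughness of ferritic steels as a fixed function of the test temperature relative to the reference temperature T0 (ASTM E1921), commonly written K_Jc(med) = 30 + 70*exp[0.019*(T - T0)] in MPa*sqrt(m) for 1T-size specimens. The sketch below evaluates that median curve for an assumed, purely illustrative T0; it does not reproduce the plant-specific margins or shift-based adjustments discussed in the record.

      import numpy as np

      def master_curve_median(T, T0):
          """Median 1T fracture toughness, MPa*sqrt(m); temperatures in deg C (Master Curve form)."""
          return 30.0 + 70.0 * np.exp(0.019 * (np.asarray(T, dtype=float) - T0))

      T0 = -60.0                            # illustrative reference temperature, not plant data
      for T in (-100, -60, -20, 20):
          print(f"T = {T:6.1f} C  ->  K_Jc(med) ~ {master_curve_median(T, T0):6.1f} MPa*sqrt(m)")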

  7. Assessment of CHF enhancement mechanisms in a curved, rectangular channel subjected to concave heating

    International Nuclear Information System (INIS)

    Sturgis, J.C.; Mudawar, I.

    1999-01-01

    An experimental study was undertaken to examine the enhancement in critical heat flux (CHF) provided by streamwise curvature. Curved and straight rectangular flow channels were fabricated with identical 5.0 x 2.5 mm cross sections and heated lengths of 101.6 mm in which the heat was applied to only one wall: the concave wall (32.3 mm radius) in the curved channel and a side wall in the straight channel. Tests were conducted using FC-72 liquid with mean inlet velocity and outlet subcooling of 0.25 to 10 m/s and 3 to 29 °C, respectively. Centripetal acceleration for curved flow reached 315 times earth's gravitational acceleration. Critical heat flux was enhanced due to flow curvature at all conditions, but the enhancement decreased with increasing subcooling. For near-saturated conditions the enhancement was approximately 60%, while for highly subcooled flow it was only 20%. The causes of the enhancement were identified as (1) increased pressure on the liquid-vapor interface at wetting fronts, (2) buoyancy forces, and (3) increased subcooling at the concave wall. Flow visualization tests were conducted in transparent channels to explore the role of buoyancy forces in enhancing the critical heat flux. These forces were observed to remove vapor from the concave wall and distribute it throughout the cross section. Vapor removal was only effective at near-saturated conditions, yielding the observed substantial enhancement in CHF relative to the straight channel.

  8. Dynamic thresholds and a summary ROC curve: Assessing prognostic accuracy of longitudinal markers.

    Science.gov (United States)

    Saha-Chaudhuri, P; Heagerty, P J

    2018-04-19

    Cancer patients, chronic kidney disease patients, and subjects infected with HIV are routinely monitored over time using biomarkers that represent key health status indicators. Furthermore, biomarkers are frequently used to guide initiation of new treatments or to inform changes in intervention strategies. Since key medical decisions can be made on the basis of a longitudinal biomarker, it is important to evaluate the potential accuracy associated with longitudinal monitoring. To characterize the overall accuracy of a time-dependent marker, we introduce a summary ROC curve that displays the overall sensitivity associated with a time-dependent threshold that controls time-varying specificity. The proposed statistical methods are similar to concepts considered in disease screening, yet our methods are novel in choosing a potentially time-dependent threshold to define a positive test, and our methods allow time-specific control of the false-positive rate. The proposed summary ROC curve is a natural averaging of time-dependent incident/dynamic ROC curves and therefore provides a single summary of net error rates that can be achieved in the longitudinal setting. Copyright © 2018 John Wiley & Sons, Ltd.

  9. Validation of BS7910:2005 failure assessment diagrams for cracked square hollow section T-, Y- and K-joints

    International Nuclear Information System (INIS)

    Lie, S.T.; Yang, Z.M.; Gho, W.M.

    2009-01-01

    This paper describes the use of finite element (FE) analysis results to validate the standard BS7910 assessment procedure for the safe design of cracked square hollow section (SHS) T-, Y- and K-joints. In the study, actual 3D surface cracks obtained from previous fatigue tests have been included in the FE models. An automatic mesh generation program is then developed and used to produce the failure assessment diagram (FAD) through the J-integral method. The ultimate strength of uncracked SHS joints with reduced load-bearing areas has been used as a reference to derive the plastic collapse loads of cracked SHS joints for the development of the FAD. These loads have been validated against previous experimental results. In comparison with the existing standard BS7910 Level 2A/3A FAD curve and the proposed assessment procedure for circular hollow section joints, it is found that a plastic collapse load with a penalty factor of 1.05 is sufficient for the safe assessment of cracked SHS T-, Y- and K-joints under brace end axial loading.
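
    The Level 2A diagram used for this comparison is a material-independent curve in the (Lr, Kr) plane; in BS 7910 (and the equivalent R6 Option 1 curve) it is usually written Kr = (1 - 0.14*Lr^2)*(0.3 + 0.7*exp(-0.65*Lr^6)) up to a material-dependent cut-off Lr,max. The sketch below evaluates that curve and checks an assessment point; the cut-off value and the assessment point are illustrative assumptions, and the penalty factor on the collapse load discussed in the record is not included.

      import numpy as np

      def fad_level_2a(Lr, Lr_max=1.2):
          """Level 2A / R6 Option 1 style FAD curve; Lr_max = 1.2 is only a placeholder."""
          Lr = np.asarray(Lr, dtype=float)
          kr = (1.0 - 0.14 * Lr ** 2) * (0.3 + 0.7 * np.exp(-0.65 * Lr ** 6))
          return np.where(Lr <= Lr_max, kr, 0.0)

      # Illustrative assessment point for a cracked joint (made-up values).
      Lr_point, Kr_point = 0.8, 0.55
      acceptable = Kr_point <= fad_level_2a(Lr_point)
      print("assessment point is", "inside (acceptable)" if acceptable else "outside (unacceptable)")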

  10. Multidisciplinary approach for in-deep assessment of joint prosthesis failure.

    Science.gov (United States)

    Tessarolo, F; Caola, I; Piccoli, F; Dorigotti, P; Demattè, E; Molinari, M; Malavolta, M; Barbareschi, M; Caciagli, P; Nollo, G

    2009-01-01

    In spite of advancements in biomaterials and biomechanics, in the development of new osteo-integrative materials and coatings, and in macro- and micro-component design, a non-negligible fraction of implanted prostheses fail before the expected lifetime. A prospective observational clinical study was conducted to define and apply a set of experimental techniques for the in-depth assessment of joint prosthesis failure. Microbiological, histological and micro-structural techniques were implemented to specifically address phenomena occurring at the tissue-implant interface. Results obtained from 27 cases of prosthetic joint failure are discussed in terms of sensitivity and specificity. A procedural flow-chart is finally proposed for the assessment of joint prosthesis failure.

  11. Assessment of performance measures and learning curves for use of a virtual-reality ultrasound simulator in transvaginal ultrasound examination

    DEFF Research Database (Denmark)

    Madsen, M E; Konge, L; Nørgaard, L N

    2014-01-01

    OBJECTIVE: To assess the validity and reliability of performance measures, develop credible performance standards and explore learning curves for a virtual-reality simulator designed for transvaginal gynecological ultrasound examination. METHODS: A group of 16 ultrasound novices, along with a group......-6), corresponding to an average of 219 min (range, 150-251 min) of training. The test/retest reliability was high, with an intraclass correlation coefficient of 0.93. CONCLUSIONS: Competence in the performance of gynecological ultrasound examination can be assessed in a valid and reliable way using virtual-reality...

  12. Assessment of Estimation Methods ForStage-Discharge Rating Curve in Rippled Bed Rivers

    Directory of Open Access Journals (Sweden)

    P. Maleki

    2016-02-01

    Full Text Available Introduction: The interaction between water flow characteristics and bed erodibility plays an important role in the sediment transport process. In order to reach stability, rivers undergoing deposition or bottom erosion develop different bed forms in the riverbed. One way to identify the behaviour of rivers is to study the structure and formation of the bed forms within them. Ripples are the smallest of the bed forms. The longitudinal cross-section of a ripple is usually not symmetrical: the upstream face is long and has a gentle slope, and the downstream face is short and steep. The height of ripples is usually between 0.5 cm and 2 cm and does not exceed 5 cm. The wavelengths normally do not exceed 30 cm and are usually within the range of 1 cm to 15 cm. Their occurrence is the result of the unstable viscous layer near the boundary, and they can form in both shallow and deep water. With increasing flow velocity, the plan form of the ripples gradually develops from straight lines to curves and then to a pattern like fish scales, symmetrical or unsymmetrical, as shown in Fig. 1 (Figure 1 - The pattern development of the ripple). Raudkivi (1966) was the first to investigate the flow structure over ripples experimentally; he established several different conditions on a moving sand bed in a laboratory channel with a rectangular cross-section of 70 cm base width and was able to form a row of ripples. Jafari Mianaei and Keshavarzi (2008) studied the turbulent flow between two artificial ripples to investigate the change of kinetic energy and shear stress over ripples. The stage-discharge rating curve is one of the most important tools in hydraulic studies. In alluvial rivers, bed ripples form and significantly affect the stage-discharge rating curve. In this research, the effects of two different types of ripples (parallel and flake-shaped) on the hydraulic characteristics of the flow were studied experimentally.
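
    Stage-discharge rating curves of the kind discussed above are conventionally fitted as a power law, Q = a*(h - h0)^b, where h0 is the stage of zero flow. A minimal fitting sketch is shown below; the stage-discharge pairs and parameter bounds are invented for illustration and do not come from the reported experiments.

      import numpy as np
      from scipy.optimize import curve_fit

      def rating_curve(h, a, h0, b):
          """Power-law stage-discharge rating curve Q = a*(h - h0)**b."""
          return a * np.clip(h - h0, 1e-6, None) ** b

      stage = np.array([0.12, 0.15, 0.18, 0.22, 0.27, 0.33, 0.40])      # m (synthetic)
      discharge = np.array([4.0, 7.5, 12.0, 19.5, 31.0, 48.0, 72.0])    # L/s (synthetic)

      popt, _ = curve_fit(rating_curve, stage, discharge, p0=[500.0, 0.05, 1.8],
                          bounds=([1.0, 0.0, 1.0], [1e5, 0.10, 3.0]))
      a, h0, b = popt
      print(f"Q ~ {a:.0f} * (h - {h0:.3f})^{b:.2f}")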

  13. On 'lower bound crack resistance curves' as material law for the safety assessment of components

    International Nuclear Information System (INIS)

    Roos, E.; Silcher, H.; Eisele, U.

    1991-01-01

    Experimental fracture-mechanics investigations were carried out on large-scale specimens. The specimen geometries and the crack depth ratio, a/W, were varied over a parameter field. Three materials of different toughness were chosen for the specimens. Their load-deformation behaviour, crack resistance curves, stretched zone widths Δa_i and crack initiation values J_i were determined and compared with the results from CT25 specimens. Numerical finite-element calculations were made to determine the state of stress in the specimens and the size of the plastic zones. (orig.)

  14. Chronic renal failure and sexual functioning: clinical status versus objectively assessed sexual response

    NARCIS (Netherlands)

    Toorians, A. W.; Janssen, E.; Laan, E.; Gooren, L. J.; Giltay, E. J.; Oe, P. L.; Donker, A. J.; Everaerd, W.

    1997-01-01

    BACKGROUND: Sexual dysfunctions are common among patients with chronic renal failure. The prevalence was assessed in a population of 281 patients (20-60 years), and it was attempted to determine whether their mode of treatment (haemodialysis, peritoneal dialysis, or kidney transplantation), or

  15. Nonorganic Failure to Thrive: Developmental Outcomes and Psychosocial Assessment and Intervention Issues.

    Science.gov (United States)

    Heffer, Robert W.; Kelley, Mary L.

    1994-01-01

    This review describes Nonorganic Failure to Thrive, presents developmental outcomes, and discusses psychosocial assessment and intervention issues relevant to this developmental disability of early childhood, focusing on child-specific variables, situational and family variables, parent-child interaction variables, and biopsychosocial formulation…

  16. Assessment of characteristic failure envelopes for intact rock using results from triaxial tests

    OpenAIRE

    Muralha, J.; Lamas, L.

    2014-01-01

    The paper presents contributions to the statistical study of the parameters of the Mohr-Coulomb and Hoek-Brown strength criteria, in order to assess the characteristic failure envelopes for intact rock, based on the results of several sets of triaxial tests performed by LNEC. 10p DBB/NMMR

  17. Electrical impedance tomography in the assessment of extravascular lung water in noncardiogenic acute respiratory failure

    NARCIS (Netherlands)

    Kunst, P. W.; Vonk Noordegraaf, A.; Raaijmakers, E.; Bakker, J.; Groeneveld, A. B.; Postmus, P. E.; de Vries, P. M.

    1999-01-01

    STUDY OBJECTIVES: To establish the value of electrical impedance tomography (EIT) in assessing pulmonary edema in noncardiogenic acute respiratory failure (ARF), as compared to the thermal dye double indicator dilution technique (TDD). DESIGN: Prospective clinical study. SETTING: ICU of a general

  18. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program

  19. Application of the Learning Curve Analysis to the LHC Main Dipole Production First Assessment

    CERN Document Server

    Fessia, P; Rossi, L

    2006-01-01

    About two thirds of the LHC main dipoles have been delivered by the three suppliers charged with the production. The training of the staff, mostly hired just for this manufacture, and the natural improvement of the procedures with acquired experience naturally decrease the time necessary to assemble a unit. The aim of this paper is to apply methodologies such as cost-based and time-based learning curves to the LHC main dipole, comparing the estimated learning percentage to those observed in other industries. This type of analysis, still in a preliminary phase and here applied to about 40% of the total production of the LHC magnets that will end by 2006, shows that this production has a relatively high learning percentage, similar to aerospace and complex machine tools for new models. Therefore, with the LHC project, accelerator magnets seem to have reached industrial maturity, and this production can be used as a benchmark for other large scientific projects implying s...
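
    The time-based learning curve mentioned above is usually taken in Wright's form: the time for unit N is t_N = a*N^(-b), and the learning percentage is 2^(-b), i.e. the ratio by which the unit time falls each time cumulative output doubles. The sketch below fits this model to invented assembly times; the numbers are not LHC production data.

      import numpy as np

      unit = np.arange(1, 11)                                                               # unit number
      hours = np.array([1000, 890, 830, 790, 760, 738, 720, 705, 692, 681], dtype=float)    # synthetic

      # Wright's model t_N = a * N**(-b), fitted as a straight line in log-log space.
      slope, log_a = np.polyfit(np.log(unit), np.log(hours), 1)
      b = -slope
      learning_percentage = 2.0 ** (-b)       # unit-time ratio per doubling of cumulative output
      print(f"t_N ~ {np.exp(log_a):.0f} * N^-{b:.3f}")
      print(f"learning percentage ~ {100 * learning_percentage:.1f}%")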

  20. Validation of self assessment patient knowledge questionnaire for heart failure patients.

    Science.gov (United States)

    Lainscak, Mitja; Keber, Irena

    2005-12-01

    Several studies showed insufficient knowledge and poor compliance with non-pharmacological management in heart failure patients. Only a limited number of validated tools are available to assess their knowledge. The aim of the study was to test our 10-item Patient knowledge questionnaire. The Patient knowledge questionnaire was administered to 42 heart failure patients from a heart failure clinic and to 40 heart failure patients receiving usual care. Construct validity (Pearson correlation coefficient), internal consistency (Cronbach alpha), reproducibility (Wilcoxon signed rank test), and reliability (chi-square test and Student's t-test for independent samples) were assessed. The overall score of the Patient knowledge questionnaire had the strongest correlation with the question about regular weighing (r=0.69) and the weakest with the question about the presence of heart disease (r=0.33). There was a strong correlation between the question about fluid retention and the questions assessing regular weighing (r=0.86), the weight of one litre of water (r=0.86), and salt restriction (r=0.57). The Cronbach alpha was 0.74 and could be improved by exclusion of the questions about clear explanation (Cronbach alpha 0.75), the importance of fruit, soup, and vegetables (Cronbach alpha 0.75), and self-adjustment of the diuretic (Cronbach alpha 0.81). During reproducibility testing, 91% to 98% of questions were answered identically. Patients from the heart failure clinic scored significantly better than patients receiving usual care (7.9 (1.3) vs. 5.7 (2.2), p<0.001). The Patient knowledge questionnaire is a valid and reliable tool to measure the knowledge of heart failure patients.
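
    The internal-consistency statistic reported throughout this record, Cronbach's alpha, is alpha = k/(k-1) * (1 - sum of item variances / variance of the total score) for k items. The sketch below computes it for a made-up 0/1 answer matrix; the data are illustrative and are not the study's 10-item questionnaire responses.

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

      scores = np.array([          # synthetic 0/1 answers of 8 respondents to 5 items
          [1, 1, 1, 0, 1],
          [1, 0, 1, 1, 1],
          [0, 0, 1, 0, 0],
          [1, 1, 1, 1, 1],
          [0, 1, 0, 0, 1],
          [1, 1, 1, 1, 0],
          [0, 0, 0, 0, 0],
          [1, 1, 0, 1, 1],
      ])
      print(f"Cronbach's alpha ~ {cronbach_alpha(scores):.2f}")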

  1. No-threshold dose-response curves for nongenotoxic chemicals: Findings and applications for risk assessment

    International Nuclear Information System (INIS)

    Sheehan, Daniel M.

    2006-01-01

    We tested the hypothesis that no threshold exists when estradiol acts through the same mechanism as an active endogenous estrogen. A Michaelis-Menten (MM) equation accounting for response saturation, background effects, and endogenous estrogen level fit a turtle sex-reversal data set with no threshold and estimated the endogenous dose. Additionally, 31 diverse literature dose-response data sets were analyzed by adding a term for nonhormonal background; good fits were obtained, but endogenous dose estimations were not significant due to low resolving power. No thresholds were observed. Data sets were plotted using a normalized MM equation; all 178 data points were accommodated on a single graph. Response rates from ∼1% to >95% were well fit. The findings contradict the threshold assumption and low-dose safety. For non-thresholded curves, calculating risk, and assuming additivity of effects from multiple chemicals acting through the same mechanism, is more appropriate than assuming a safe dose.
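
    The saturating model described above can be written, in one simple form, as R(d) = background + (Rmax - background)*(d + d_endo)/(K + d + d_endo), where d_endo is the effective endogenous dose. The sketch below fits such a curve to invented dose-response fractions; the functional form, data and bounds are illustrative assumptions, not the equations or data of the cited analysis.

      import numpy as np
      from scipy.optimize import curve_fit

      def mm_response(dose, background, r_max, K, d_endo):
          """Michaelis-Menten type dose-response with nonhormonal background and endogenous dose."""
          d_eff = dose + d_endo
          return background + (r_max - background) * d_eff / (K + d_eff)

      dose = np.array([0.0, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])     # administered dose (synthetic)
      resp = np.array([0.05, 0.08, 0.14, 0.30, 0.52, 0.74, 0.88, 0.95]) # response fractions (synthetic)

      popt, _ = curve_fit(mm_response, dose, resp, p0=[0.05, 1.0, 5.0, 0.2],
                          bounds=([0, 0.5, 0.1, 0.0], [0.2, 1.0, 100.0, 5.0]))
      print(dict(zip(["background", "r_max", "K", "d_endo"], np.round(popt, 3))))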

  2. A Dual Assessment of the Environmental Kuznets Curve: The Case of Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Ankarhem, Mattias (e-mail: mattias.ankarhem@econ.umu.se)

    2005-04-15

    In this paper, we calculate time series of shadow prices for Swedish emissions of CO{sub 2}, SO{sub 2}, and VOC for the period 1918 - 1994. Newly constructed historical emission time series enable studying a single country's emission paths through increasing levels of economic activity. The shadow prices are, in the next step, related to income to explain the environmental Kuznets curves (EKC) previously found in Swedish data for these three emissions. A directional distance function approach is used to estimate the production process for Swedish industry thus enabling the opportunity costs of a reduction in these emissions to be calculated. We attribute the annual changes in the shadow prices to the main causal factors by decomposing them into a technological effect and a substitution effect. We conclude that the time series of the shadow prices show support for EKCs for Swedish industry.

  3. Assessment of current structural design methodology for high-temperature reactors based on failure tests

    International Nuclear Information System (INIS)

    Corum, J.M.; Sartory, W.K.

    1985-01-01

    A mature design methodology, consisting of inelastic analysis methods, provided in Department of Energy guidelines, and failure criteria, contained in ASME Code Case N-47, exists in the United States for high-temperature reactor components. The objective of this paper is to assess the adequacy of this overall methodology by comparing predicted inelastic deformations and lifetimes with observed results from structural failure tests and from an actual service failure. Comparisons are presented for three types of structural situations: (1) nozzle-to-spherical shell specimens, where stresses at structural discontinuities lead to cracking, (2) welded structures, where metallurgical discontinuities play a key role in failures, and (3) thermal shock loadings of cylinders and pipes, where thermal discontinuities can lead to failure. The comparisons between predicted and measured inelastic responses are generally reasonably good; quantities are sometimes somewhat overpredicted and sometimes underpredicted. However, even seemingly small discrepancies can have a significant effect on structural life, and lifetimes are not always as closely predicted. For a few cases, the lifetimes are substantially overpredicted, which raises questions regarding the adequacy of existing design margins

  4. [Assessment of medical management of heart failure at National Hospital Blaise COMPAORE].

    Science.gov (United States)

    Kambiré, Y; Konaté, L; Diallo, I; Millogo, G R C; Kologo, K J; Tougouma, J B; Samadoulougou, A K; Zabsonré, P

    2018-05-09

    The aim of this study was to assess the quality of the medical management of heart failure at the National Hospital Blaise Compaoré according to international guidelines. A retrospective study was performed including consecutive patients admitted for heart failure documented by echocardiography from October 2012 to March 2015 in the Medicine and Medical Specialties Department of the National Hospital Blaise Compaoré, with a minimum follow-up of six weeks. Data analysis was performed with the SPSS 20.0 software. Eighty-four patients, with a mean age of 57.61±18.24 years, were included. Heart failure was acute in 84.5% of patients, and left ventricular systolic function was impaired in 77.4%. For heart failure of any type, the prescription rates were 88.1% for loop diuretics, 77.1% for angiotensin-converting enzyme inhibitors/angiotensin receptor blockers, and 65.5% for beta-blockers. Among patients with systolic dysfunction, 84.62% received angiotensin-converting enzyme inhibitors/angiotensin receptor blockers and 75.38% beta-blockers. Exercise rehabilitation was undertaken in 10.7% of patients. The death rate was 16.7% and the hospital readmission rate 16.7%. The prescription rate of the major heart failure drugs is satisfactory. Cardiac rehabilitation should be developed. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  5. Application of failure assessment diagram methods to cracked straight pipes and elbows

    International Nuclear Information System (INIS)

    Ainsworth, R.A.; Gintalas, M.; Sahu, M.K.; Chattopadhyay, J.; Dutta, B.K.

    2016-01-01

    This paper reports fracture assessments of large-scale straight pipes and elbows of various pipe diameters and crack sizes. The assessments estimate the load for ductile fracture initiation using the failure assessment diagram method. Recent solutions in the literature for stress intensity factor and limit load provide the analysis inputs. An assessment of constraint effects is also performed using recent solutions for elastic T-stress. It is found that predictions of initiation load are close to the experimental values for straight pipes under pure bending. For elbows, there is generally increased conservatism in the sense that the experimental loads are greater than those predicted. The effects of constraint are found not to be a major contributor to the initiation fracture assessments but may have some influence on the ductile crack extension. - Highlights: • This paper presents assessments of the loads for ductile fracture initiation in 21 large-scale piping tests. • Modern stress intensity factor and limit load solutions were used for standard failure assessment diagram methods. • This leads to generally accurate assessments of the loads for ductile crack initiation. • The effects of constraint are found not to be a major contributor to the initiation fracture assessments.
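
    A failure assessment diagram evaluation of the kind reported above reduces to locating an assessment point (Lr, Kr) relative to a failure assessment curve. A minimal sketch using the widely published R6/BS 7910 Option 1 curve; the numerical inputs are hypothetical, and in practice the stress intensity factor and limit load would come from solutions such as those cited in the paper:

        import numpy as np

        def option1_curve(Lr):
            """R6 / BS 7910 Option 1 failure assessment curve Kr = f(Lr)."""
            return (1 - 0.14 * Lr**2) * (0.3 + 0.7 * np.exp(-0.65 * Lr**6))

        def assessment_point(K_I, K_mat, load, limit_load):
            """Assessment point: Kr = K_I / K_mat, Lr = applied load / limit load."""
            return load / limit_load, K_I / K_mat

        # Hypothetical inputs for a cracked pipe (illustration only):
        Lr, Kr = assessment_point(K_I=45.0, K_mat=120.0, load=300.0, limit_load=520.0)
        acceptable = Kr <= option1_curve(Lr)
        print(f"Lr = {Lr:.2f}, Kr = {Kr:.2f}, inside FAD: {acceptable}")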

  6. Predictive Performance of the Simplified Acute Physiology Score (SAPS) II and the Initial Sequential Organ Failure Assessment (SOFA) Score in Acutely Ill Intensive Care Patients

    DEFF Research Database (Denmark)

    Granholm, Anders; Møller, Morten Hylander; Kragh, Mette

    2016-01-01

    PURPOSE: Severity scores including the Simplified Acute Physiology Score (SAPS) II and the Sequential Organ Failure Assessment (SOFA) score are used in intensive care units (ICUs) to assess disease severity, predict mortality and in research. We aimed to assess the predictive performance of SAPS II...... compared the discrimination of SAPS II and initial SOFA scores, compared the discrimination of SAPS II in our cohort with the original cohort, assessed the calibration of SAPS II customised to our cohort, and compared the discrimination for 90-day mortality vs. in-hospital mortality for both scores....... Discrimination was evaluated using areas under the receiver operating characteristics curves (AUROC). Calibration was evaluated using Hosmer-Lemeshow's goodness-of-fit Ĉ-statistic. RESULTS: AUROC for in-hospital mortality was 0.80 (95% confidence interval (CI) 0.77-0.83) for SAPS II and 0.73 (95% CI 0...
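
    Discrimination as measured by the AUROC above can be reproduced with standard tooling. A minimal sketch, assuming scikit-learn is available and using small made-up severity scores and outcomes rather than the study's cohort:

        from sklearn.metrics import roc_auc_score

        # Hypothetical severity scores and in-hospital mortality (1 = died), illustration only:
        saps_ii_scores = [28, 35, 52, 41, 60, 22, 58, 55, 33, 70]
        died_in_hospital = [0, 0, 1, 0, 1, 0, 0, 1, 0, 1]

        auroc = roc_auc_score(died_in_hospital, saps_ii_scores)
        print(f"AUROC for in-hospital mortality: {auroc:.2f}")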

  7. How to interpret safety critical failures in risk and reliability assessments

    International Nuclear Information System (INIS)

    Selvik, Jon Tømmerås; Signoret, Jean-Pierre

    2017-01-01

    Management of safety systems often receives high attention due to the potential for industrial accidents. In the risk and reliability literature concerning such systems, and particularly concerning safety-instrumented systems, one frequently comes across the term ‘safety critical failure’. It is associated with the term ‘critical failure’, and it is often deduced that a safety critical failure refers to a failure occurring in a safety critical system. Although this is correct in some situations, it does not match, for example, the mathematical definition given in ISO/TR 12489:2013 on reliability modeling, where a clear distinction is made between ‘safe failures’ and ‘dangerous failures’. In this article, we show that different interpretations of the term ‘safety critical failure’ exist, and that there is room for misinterpretations and misunderstandings in risk and reliability assessments where failure information linked to safety systems is used, which could influence decision-making. The article gives some examples from the oil and gas industry, showing different possible interpretations of the term. In particular we discuss the link between criticality and failure. The article points in general to the importance of adequate risk communication when using the term, and gives some clarification on its interpretation in risk and reliability assessments.

  8. Assessing rockfall susceptibility in steep and overhanging slopes using three-dimensional analysis of failure mechanisms

    Science.gov (United States)

    Matasci, Battista; Stock, Greg M.; Jaboyedoff, Michael; Carrea, Dario; Collins, Brian D.; Guérin, Antoine; Matasci, G.; Ravanel, L.

    2018-01-01

    Rockfalls strongly influence the evolution of steep rocky landscapes and represent a significant hazard in mountainous areas. Defining the most probable future rockfall source areas is of primary importance for both geomorphological investigations and hazard assessment. Thus, a need exists to understand which areas of a steep cliff are more likely to be affected by a rockfall. An important analytical gap exists between regional rockfall susceptibility studies and block-specific geomechanical calculations. Here we present methods for quantifying rockfall susceptibility at the cliff scale, which is suitable for sub-regional hazard assessment (hundreds to thousands of square meters). Our methods use three-dimensional point clouds acquired by terrestrial laser scanning to quantify the fracture patterns and compute failure mechanisms for planar, wedge, and toppling failures on vertical and overhanging rock walls. As a part of this work, we developed a rockfall susceptibility index for each type of failure mechanism according to the interaction between the discontinuities and the local cliff orientation. The susceptibility for slope parallel exfoliation-type failures, which are generally hard to identify, is partly captured by planar and toppling susceptibility indexes. We tested the methods for detecting the most susceptible rockfall source areas on two famously steep landscapes, Yosemite Valley (California, USA) and the Drus in the Mont-Blanc massif (France). Our rockfall susceptibility models show good correspondence with active rockfall sources. The methods offer new tools for investigating rockfall hazard and improving our understanding of rockfall processes.

  9. Assessing Trust and Effectiveness in Virtual Teams: Latent Growth Curve and Latent Change Score Models

    Directory of Open Access Journals (Sweden)

    Michael D. Coovert

    2017-08-01

    Trust plays a central role in the effectiveness of work groups and teams. This is the case for both face-to-face and virtual teams. Yet little is known about the development of trust in virtual teams. We examined cognitive and affective trust and their relationship to team effectiveness as reflected through satisfaction with one's team and task performance. Latent growth curve analysis reveals that both trust types start at a significant level, with individual differences in that initial level. Cognitive trust follows a linear growth pattern while affective trust is overall non-linear, but becomes linear once established. Latent change score models are utilized to examine change in trust and also its relationship with satisfaction with the team and team performance. When examining only change in trust and its relationship to satisfaction, there appears to be a straightforward influence of trust on satisfaction and of satisfaction on trust. However, when incorporated into a bivariate coupling latent change model, the dynamics of the relationship are revealed. A similar pattern holds for trust and task performance; however, in the bivariate coupling change model a more parsimonious representation is preferred.

  10. Rapid learning curve assessment in an ex vivo training system for microincisional glaucoma surgery.

    Science.gov (United States)

    Dang, Yalong; Waxman, Susannah; Wang, Chao; Parikh, Hardik A; Bussel, Igor I; Loewen, Ralitsa T; Xia, Xiaobo; Lathrop, Kira L; Bilonick, Richard A; Loewen, Nils A

    2017-05-09

    Increasing prevalence and cost of glaucoma have increased the demand for surgeons well trained in newer, microincisional surgery. These procedures occur in a highly confined space, making them difficult to learn by observation or assistance alone as is currently done. We hypothesized that our ex vivo outflow model is sensitive enough to allow computing individual learning curves to quantify progress and refine techniques. Seven trainees performed nine trabectome-mediated ab interno trabeculectomies in pig eyes (n = 63). An expert surgeon rated the procedure using an Operating Room Score (ORS). The extent of outflow beds accessed was measured with canalograms. Data was fitted using mixed effect models. ORS reached a half-maximum on an asymptote after only 2.5 eyes. Surgical time decreased by 1.4 minutes per eye in a linear fashion. The ablation arc followed an asymptotic function with a half-maximum inflection point after 5.3 eyes. Canalograms revealed that this progress did not correlate well with improvement in outflow, suggesting instead that about 30 eyes are needed for true mastery. This inexpensive pig eye model provides a safe and effective microsurgical training model and allows objective quantification of outcomes for the first time.

  11. Bridge Expansion Joint in Road Transition Curve: Effects Assessment on Heavy Vehicles

    Directory of Open Access Journals (Sweden)

    Paola Di Mascio

    2017-06-01

    Properly-designed road surfaces provide a durable surface on which traffic can pass smoothly and safely. In fact, the main causes of the structural decay of the pavement and its parts are traffic loads. These repeated actions can create undesirable unevenness on the road surface, which induces vertical accelerations in vehicles, even to the point of breaking contact between pavement and tire, with dangerous consequences for traffic safety. The dynamic actions transmitted by the vehicles depend on these irregularities: often, a bridge expansion joint (BEJ), introducing a necessary discontinuity between different materials, creates from the outset a geometric irregularity in the running surface. Besides, some structural conditions can emphasize the problem (e.g., local cracking due to the settlement of the subgrade near the abutment, or the discontinuity of stiffness due to the presence of different materials). When the BEJ is located in a transition curve, an inevitable vertical irregularity between road and joint can reach values of some centimeters, with serious consequences for road safety. This paper presents the analysis of a case study of a BEJ. Several test surveys were performed in order to fully characterize the effects on both vehicles and pavement. A three-dimensional representation of the pavement surface and acceleration measurements on a heavy test vehicle were used to analyze the joint behavior under traffic. Finally, a finite element model was implemented to evaluate the stress contribution on vehicle components induced by the vertical irregularities.

  12. Assessment of importance of elements for systems that condition depends on the sequence of elements failures

    International Nuclear Information System (INIS)

    Povyakalo, A.A.

    1996-01-01

    This paper proposes new general formulas for calculating importance indices of elements for systems whose condition depends on the sequence of element failures. These systems are called systems with memory of failures (M-systems). Existing techniques for assessing the importance of elements are based on Boolean models of system reliability, which assume that at every moment of time the system state depends only on the combination of element states at that same moment. Such systems are called combinational systems (C-systems). The reliability of an M-system at any moment of operating time is a functional having the distributions of the elements' times to failure as its arguments. Boolean models, and the methods of assessing element importance based on them, are not appropriate for these systems. Pereguda and Povyakalo proposed a new technique for assessing element importance for PO-SS systems, which include a Protection Object (PO) and a Safety System (SS). The PO-SS system is an example of an M-system. That technique is used in this paper as the basis for a more general treatment. It is shown that the technique proposed for assessing element importance in M-systems includes the well-known Birnbaum method as a particular case. A system with double protection is also considered as an example
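
    For the combinational (Boolean) systems mentioned above, the Birnbaum importance of element i is the partial derivative of system reliability with respect to that element's reliability, equivalently the difference between system reliability with element i working and with it failed. A minimal sketch for a hypothetical two-out-of-three system (illustration only, not the paper's M-system formulas):

        def birnbaum_importance(system_reliability, p, i):
            """Birnbaum importance: R_sys(p_i = 1) - R_sys(p_i = 0)."""
            p_up, p_down = list(p), list(p)
            p_up[i], p_down[i] = 1.0, 0.0
            return system_reliability(p_up) - system_reliability(p_down)

        def two_out_of_three(p):
            """Reliability of a 2-out-of-3 system with independent components."""
            a, b, c = p
            return a * b + a * c + b * c - 2 * a * b * c

        # Hypothetical component reliabilities, illustration only:
        p = [0.95, 0.90, 0.80]
        for i in range(3):
            print(f"Component {i}: Birnbaum importance = {birnbaum_importance(two_out_of_three, p, i):.3f}")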

  13. Assessment of SPACE code for multiple failure accident: 1% Cold Leg Break LOCA with HPSI failure at ATLAS Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Hyuk; Lee, Seung Wook; Kim, Kyung-Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Design extension conditions (DECs) have been a key issue since the Fukushima accident. From the viewpoint of reinforcing the defense-in-depth concept, high-risk multiple-failure accidents should be reconsidered. The target scenario of the ATLAS A5.1 test was the LSTF (Large Scale Test Facility) SB-CL-32 test, run as a counterpart test: a 1% SBLOCA with total failure of the high pressure safety injection (HPSI) system of the emergency core cooling system (ECCS) and secondary side depressurization as the accident management (AM) action. As the need to prepare for DEC accidents caused by multiple failures in present NPPs is emphasized, the capability of the SPACE code, like that of other system analysis codes, must be extended to the DEC area. The objective of this study is to validate the capability of the SPACE code for a DEC scenario representing a multiple-failure accident, namely a SBLOCA with HPSI failure; therefore, the ATLAS A5.1 test scenario was chosen. The target scenario, the ATLAS A5.1 test, is a 1% SBLOCA with total failure of the HPSI system of the ECCS and secondary side depressurization. Through a sensitivity study on the discharge coefficient of the break flow, the best fit of the integrated mass was found. Using this coefficient, the ATLAS A5.1 test was analyzed with the SPACE code. The major thermal-hydraulic parameters, such as system pressures and temperatures, were compared with the test data and showed good agreement. Through the simulation, it was concluded that the SPACE code can effectively simulate multiple-failure accidents such as a SBLOCA with HPSI failure.

  14. Failure Assessment for the High-Strength Pipelines with Constant-Depth Circumferential Surface Cracks

    OpenAIRE

    X. Liu; Z. X. Lu; Y. Chen; Y. L. Sui; L. H. Dai

    2018-01-01

    In the oil and gas transportation system over long distance, application of high-strength pipeline steels can efficiently reduce construction and operation cost by increasing operational pressure and reducing the pipe wall thickness. Failure assessment is an important issue in the design, construction, and maintenance of the pipelines. The small circumferential surface cracks with constant depth in the welded pipelines are of practical interest. This work provides an engineering estimation pr...

  15. The costs of failure: A preliminary assessment of major energy accidents, 1907-2007

    International Nuclear Information System (INIS)

    Sovacool, Benjamin K.

    2008-01-01

    A combination of technical complexity, tight coupling, speed, and human fallibility contribute to the unexpected failure of large-scale energy technologies. This study offers a preliminary assessment of the social and economic costs of major energy accidents from 1907 to 2007. It documents 279 incidents that have been responsible for $41 billion in property damage and 182,156 deaths. Such disasters highlight an often-ignored negative externality to energy production and use, and emphasize the need for further research

  16. Current Understanding of the Pathophysiology of Myocardial Fibrosis and Its Quantitative Assessment in Heart Failure

    Directory of Open Access Journals (Sweden)

    Tong Liu

    2017-04-01

    Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis.

  17. Usage of Failure Mode & Effect Analysis Method (FMEA) for safety assessment in a drug manufacture

    Directory of Open Access Journals (Sweden)

    Y Nazari

    2006-04-01

    Background and Aims: This study was conducted with the purpose of recognizing and controlling workplace hazards in the production units of a drug manufacturing plant. Method: For the recognition and assessment of hazards, the FMEA method was used. FMEA systematically investigates the effects of equipment and system failures, often leading to equipment design improvements. First, the level of the study was defined as the system. Then, according to observations, accident statistics, and interviews with managers, supervisors, and workers, high-risk systems were determined. The boundaries of the system were established and information regarding the relevant components, their functions and interactions was gathered. To prevent confusion between similar pieces of equipment, a unique system identifier was developed. After that, all failure modes and their causes for each piece of equipment or system were listed, and the immediate effects of each failure mode, as well as its interactive effects on other equipment or systems, were described. The risk priority number was determined according to global and local criteria. Results: Actions and solutions were proposed to reduce the likelihood and severity of failures and to raise their detectability. Conclusion: This study illustrated that although at first sight the drug manufacturing plant may seem safe, there are still many hazardous conditions that could cause serious accidents. The results suggest it is necessary to: (1) develop a comprehensive manual for periodic and regular inspection of workplace instruments in order to recognize unknown failures and their causes, (2) develop a comprehensive program for system maintenance and repair, and (3) conduct worker training.
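
    The risk priority number used in FMEA is conventionally the product of severity, occurrence, and detectability ratings, and failure modes are ranked by it. A minimal sketch with hypothetical failure modes and ratings; the study's own global and local criteria are not reproduced here:

        # Each failure mode is rated 1-10 for severity (S), occurrence (O), detection (D);
        # the risk priority number is RPN = S * O * D, and a higher RPN means higher priority.
        failure_modes = [
            {"mode": "Mixer seal leak",          "S": 7, "O": 4, "D": 3},
            {"mode": "Tablet press overheating", "S": 8, "O": 3, "D": 6},
            {"mode": "Mislabelled packaging",    "S": 9, "O": 2, "D": 5},
        ]

        for fm in failure_modes:
            fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

        for fm in sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True):
            print(f'{fm["mode"]:<26} RPN = {fm["RPN"]}')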

  18. Assessment of compressive failure process of cortical bone materials using damage-based model.

    Science.gov (United States)

    Ng, Theng Pin; R Koloor, S S; Djuansjah, J R P; Abdul Kadir, M R

    2017-02-01

    The main failure factors of cortical bone are aging or osteoporosis, accident and high energy trauma or physiological activities. However, the mechanism of damage evolution coupled with yield criterion is considered as one of the unclear subjects in failure analysis of cortical bone materials. Therefore, this study attempts to assess the structural response and progressive failure process of cortical bone using a brittle damaged plasticity model. For this reason, several compressive tests are performed on cortical bone specimens made of bovine femur, in order to obtain the structural response and mechanical properties of the material. Complementary finite element (FE) model of the sample and test is prepared to simulate the elastic-to-damage behavior of the cortical bone using the brittle damaged plasticity model. The FE model is validated in a comparative method using the predicted and measured structural response as load-compressive displacement through simulation and experiment. FE results indicated that the compressive damage initiated and propagated at central region where maximum equivalent plastic strain is computed, which coincided with the degradation of structural compressive stiffness followed by a vast amount of strain energy dissipation. The parameter of compressive damage rate, which is a function dependent on damage parameter and the plastic strain is examined for different rates. Results show that considering a similar rate to the initial slope of the damage parameter in the experiment would give a better sense for prediction of compressive failure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. VALIDATING A COMPUTER-BASED TECHNIQUE FOR ASSESSING STABILITY TO FAILURE STRESS

    Directory of Open Access Journals (Sweden)

    I. F. Arshava

    2013-03-01

    An upsurge of interest in implicit personality assessment, currently observed both in personality psycho-diagnostics and in experimental studies of social attitudes and prejudices, signals the shifting of researchers' attention from defining between-person personality taxonomy to specifying comprehensive within-person processes, the dynamics of which can be captured at the level of an individual case. This research examines the possibility of the implicit assessment of the individual's stability vs. susceptibility to failure stress by comparing the degrees of efficacy in the voluntary self-regulation of a computer-simulated information-processing activity under different conditions (patent of Ukraine № 91842, issued in 2010). By exposing two groups of participants (university undergraduates) to processing information whose scope exceeds the human short-term memory capacity at one of the stages of the modeled activity, an unexpected and unavoidable failure is elicited. The participants who retain stability of their self-regulation behavior after having been exposed to failure, i.e. who keep processing information as effectively as they did prior to failure, are claimed to retain homeostasis and thus possess emotional stability. Those who lose homeostasis after failure and display lower standards of self-regulation behavior are considered to be susceptible to stress. The validity of the suggested type of implicit diagnostics was empirically tested by clustering (K-means algorithm) two samples of the participants on the properties of their self-regulation behavior and testing between-cluster differences on a set of explicitly assessed variables: Action control efficacy (Kuhl, 2001), preferred strategies of Coping with Stressful Situations (Endler, Parker, 1990), Purpose-in-Life orientation (a Russian version of the test by Crumbaugh and Maholick, modified by D. Leontiev, 1992), and Psychological Well-being (Ryff, 1989)

  20. Calibration of the inertial consistency index to assess road safety on horizontal curves of two-lane rural roads.

    Science.gov (United States)

    Llopis-Castelló, David; Camacho-Torregrosa, Francisco Javier; García, Alfredo

    2018-05-26

    One of every four road fatalities occurs on horizontal curves of two-lane rural roads. In this regard, many studies have been undertaken to analyze the crash risk on this road element. Most of them were based on the concept of geometric design consistency, which can be defined as how well drivers' expectancies and road behavior relate. However, none of these studies included a variable that represents and estimates drivers' expectancies. This research presents a new local consistency model based on the Inertial Consistency Index (ICI). This consistency parameter is defined as the difference between the inertial operating speed, which represents drivers' expectations, and the operating speed, which represents road behavior. The inertial operating speed was defined as the weighted average operating speed of the preceding road section. Different lengths, periods of time, and weighting distributions were studied to identify how the inertial operating speed should be calculated. As a result, drivers' expectancies should be estimated considering 15 s along the segment and a linear weighting distribution. This was consistent with the drivers' expectancy acquisition process, which is closely related to short-term memory. A Safety Performance Function was proposed to predict the number of crashes on a horizontal curve, and consistency thresholds were defined based on the ICI; the crash rate increased as the ICI increased. Finally, the proposed consistency model was compared with previous models. In conclusion, the new Inertial Consistency Index allows a more accurate estimation of the number of crashes and a better assessment of the consistency level on horizontal curves. Therefore, highway engineers have a new tool to identify where road crashes are more likely to occur during the design stage of both new two-lane rural roads and improvements of existing highways. Copyright © 2018 Elsevier Ltd. All rights reserved.
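
    The inertial operating speed described above is a weighted average of the operating-speed profile over the preceding 15 s of travel, and the ICI is its difference from the local operating speed. A minimal sketch assuming a linear weighting over the preceding window; the speed-profile values and the exact weighting direction are illustrative assumptions, not the paper's calibration:

        import numpy as np

        def inertial_consistency_index(speeds, window_s=15):
            """ICI at the last point of a 1-s resolution operating-speed profile.

            Inertial operating speed = linearly weighted average of the preceding
            `window_s` seconds; ICI = inertial speed - current operating speed.
            """
            window = np.asarray(speeds[-(window_s + 1):-1], dtype=float)  # preceding 15 s
            weights = np.arange(1, len(window) + 1)                       # linear weights (recent samples weigh more)
            inertial_speed = np.average(window, weights=weights)
            return inertial_speed - speeds[-1]

        # Hypothetical operating-speed profile (km/h), sampled once per second,
        # approaching a sharp curve where speed drops:
        profile = [95] * 10 + [93, 91, 88, 84, 80, 75, 68]
        print(f"ICI = {inertial_consistency_index(profile):.1f} km/h")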

  1. Probabilistic assessment of fatigue life including statistical uncertainties in the S-N curve

    International Nuclear Information System (INIS)

    Sudret, B.; Hornet, P.; Stephan, J.-M.; Guede, Z.; Lemaire, M.

    2003-01-01

    A probabilistic framework is set up to assess the fatigue life of components of nuclear power plants. It is intended to incorporate all kinds of uncertainties, such as those appearing in the specimen fatigue life, the design sub-factor, the mechanical model and the applied loading. This paper details the first step, which corresponds to the statistical treatment of the fatigue specimen test data. The specimen fatigue life at stress amplitude S is represented by a lognormal random variable whose mean and standard deviation depend on S. This characterization is then used to compute the random fatigue life of a component subjected to a single kind of cycle. Specifically, the mean and coefficient of variation of this quantity are studied, as well as the reliability associated with the (deterministic) design value. (author)
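
    The statistical treatment described above represents the specimen fatigue life N at stress amplitude S as lognormal, with mean and standard deviation depending on S. A minimal sketch of sampling such a life and estimating the reliability of a deterministic design value; the S-N parameters and the sub-factor below are hypothetical, not the study's:

        import numpy as np

        rng = np.random.default_rng(0)

        def sample_fatigue_life(S, n_samples=100_000):
            """Lognormal specimen fatigue life: ln N ~ Normal(mu(S), sigma(S)).

            Hypothetical Basquin-like mean curve and scatter, for illustration only.
            """
            mu = np.log(1e12) - 3.0 * np.log(S)   # mean of ln N as a function of stress amplitude S [MPa]
            sigma = 0.4                            # scatter of ln N
            return rng.lognormal(mean=mu, sigma=sigma, size=n_samples)

        S = 200.0                                  # stress amplitude, MPa
        lives = sample_fatigue_life(S)
        design_life = np.exp(np.log(1e12) - 3.0 * np.log(S)) / 20.0   # design value = median life / sub-factor of 20
        reliability = np.mean(lives > design_life)
        print(f"P(N > design life) at S = {S:.0f} MPa: {reliability:.4f}")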

  2. Impact of support system failure limitations on probabilistic safety assessment and in regulatory decision making

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1990-01-01

    When used as a tool for safety decision making, Probabilistic Safety Assessment (PSA) is effective only to the extent that it realistically characterizes the overall frequency and consequences of various types of system and component failures. If significant support system failure events are omitted from consideration, the PSA process omits the characterization of possible unique contributors to core damage risk, possibly underestimates the frequency of core damage, and reduces the future utility of the PSA as a decision making tool for the omitted support system. This paper is based on a review of several recent US PSA studies and the author's participation in several International Atomic Energy Agency (IAEA) sponsored peer reviews. 21 refs., 2 figs., 1 tab

  3. Noninvasive radiographic assessment of cardiovascular function in acute and chronic respiratory failure

    International Nuclear Information System (INIS)

    Berger, H.J.; Matthay, R.A.

    1981-01-01

    Noninvasive radiographic techniques have provided a means of studying the natural history and pathogenesis of cardiovascular performance in acute and chronic respiratory failure. Chest radiography, radionuclide angiocardiography and thallium-201 imaging, and M mode and cross-sectional echocardiography have been employed. Each of these techniques has specific uses, attributes and limitations. For example, measurement of descending pulmonary arterial diameters on the plain chest radiograph allows determination of the presence or absence of pulmonary arterial hypertension. Right and left ventricular performance can be evaluated at rest and during exercise using radionuclide angiocardiography. The biventricular response to exercise and to therapeutic interventions also can be assessed with this approach. Evaluation of the pulmonary valve echogram and echocardiographic right ventricular dimensions have been shown to reflect right ventricular hemodynamics and size. Each of these noninvasive techniques has been applied to the study of patients with respiratory failure and has provided important physiologic data

  4. Determination of the time to failure curve as a function of stress for a highly irradiated AISI 304 stainless steel after constant load tests in simulated PWR water environment

    International Nuclear Information System (INIS)

    Pokor, C.; Massoud, J.P.; Wintergerst, M.; Toivonen, A.; Ehrnsten, U.; Karlsen, W.

    2011-01-01

    The structures of Reactor Pressure Vessel Internals are subjected to an intense neutron flux. Under these operating conditions, the microstructure and the mechanical properties of the austenitic stainless steel components change. In addition, these components are subjected to stress of either manufacturing origin or generated under operation. Cases of baffle bolts cracking have occurred in CP0 Nuclear Power Plant units. The mechanism of degradation of these bolts is Irradiation-Assisted Stress Corrosion Cracking. In order to obtain a better understanding of this mechanism and its principal parameters of influence, a set of stress corrosion tests (mainly constant load tests) were launched within the framework of the EDF project 'PWR Internals' using materials from a Chooz A baffle corner (SA 304). These tests aim to quantify the influence on IASCC of the applied stress, temperature and environment (primary water, higher lithium concentration, inert environment) for an irradiation dose close to 30 dpa. A curve showing time to failure as a function of the stress was determined. The shape of this curve is consistent with the few data that are available in the literature. A stress threshold of about 50 % of the yield strength value at the test temperature has been determined, below which cracking in that environment seems impossible. After irradiation this material is sensitive to intergranular fracture in a primary environment, but also in an inert environment (argon) at 340 C. The tests also showed a negative effect of increased lithium concentration on the time to failure and on the stress threshold. (authors)

  5. Assessing Technical Performance and Determining the Learning Curve in Cleft Palate Surgery Using a High-Fidelity Cleft Palate Simulator.

    Science.gov (United States)

    Podolsky, Dale J; Fisher, David M; Wong Riff, Karen W; Szasz, Peter; Looi, Thomas; Drake, James M; Forrest, Christopher R

    2018-06-01

    This study assessed technical performance in cleft palate repair using a newly developed assessment tool and high-fidelity cleft palate simulator through a longitudinal simulation training exercise. Three residents performed five and one resident performed nine consecutive endoscopically recorded cleft palate repairs using a cleft palate simulator. Two fellows in pediatric plastic surgery and two expert cleft surgeons also performed recorded simulated repairs. The Cleft Palate Objective Structured Assessment of Technical Skill (CLOSATS) and end-product scales were developed to assess performance. Two blinded cleft surgeons assessed the recordings and the final repairs using the CLOSATS, end-product scale, and a previously developed global rating scale. The average procedure-specific (CLOSATS), global rating, and end-product scores increased logarithmically after each successive simulation session for the residents. Reliability of the CLOSATS (average item intraclass correlation coefficient (ICC), 0.85 ± 0.093) and global ratings (average item ICC, 0.91 ± 0.02) among the raters was high. Reliability of the end-product assessments was lower (average item ICC, 0.66 ± 0.15). Standard setting linear regression using an overall cutoff score of 7 of 10 corresponded to a pass score for the CLOSATS and the global score of 44 (maximum, 60) and 23 (maximum, 30), respectively. Using logarithmic best-fit curves, 6.3 simulation sessions are required to reach the minimum standard. A high-fidelity cleft palate simulator has been developed that improves technical performance in cleft palate repair. The simulator and technical assessment scores can be used to determine performance before operating on patients.

  6. Failure probability assessment of wall-thinned nuclear pipes using probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Lee, Sang-Min; Chang, Yoon-Suk; Choi, Jae-Boong; Kim, Young-Jin

    2006-01-01

    The integrity of nuclear piping systems has to be maintained during operation. In order to maintain this integrity, reliable assessment procedures, including fracture mechanics analysis, are required. Up to now, such assessments have been performed using conventional deterministic approaches, even though there are many uncertainties that hinder a rational evaluation. In this respect, probabilistic approaches are considered an appropriate method for piping system evaluation. The objectives of this paper are to estimate the failure probabilities of wall-thinned pipes in nuclear secondary systems and to propose limited operating conditions under different types of loadings. To do this, a probabilistic assessment program using reliability index and simulation techniques was developed and applied to evaluate the failure probabilities of wall-thinned pipes subjected to internal pressure, bending moment and a combined loading of the two. The sensitivity analysis results, as well as the prototypal integrity assessment results, showed the promising applicability of the probabilistic assessment program and the necessity of practical evaluation reflecting combined loading conditions and operation under limited conditions
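
    A simulation-based failure probability of the kind used above can be sketched as a Monte Carlo estimate of P(load effect > resistance) for a wall-thinned section. The limit state and distributions below are hypothetical stand-ins, not the program's actual models:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        # Hypothetical random variables for a wall-thinned pipe section (illustration only):
        flow_stress = rng.normal(350.0, 25.0, n)      # MPa
        wall_thickness = rng.normal(5.0, 0.5, n)      # mm, thinned from nominal
        pressure = rng.normal(10.0, 1.0, n)           # MPa
        radius = 150.0                                 # mm, treated as deterministic

        # Simple limit state: hoop stress exceeding the flow stress counts as failure
        hoop_stress = pressure * radius / wall_thickness
        failure_probability = np.mean(hoop_stress > flow_stress)
        print(f"Estimated failure probability: {failure_probability:.2e}")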

  7. Efficacy of reciprocating and rotary NiTi instruments for retreatment of curved root canals assessed by micro-CT.

    Science.gov (United States)

    Rödig, T; Reicherts, P; Konietschke, F; Dullin, C; Hahn, W; Hülsmann, M

    2014-10-01

    To compare the efficacy of reciprocating and rotary NiTi instruments in removing filling material from curved root canals using micro-computed tomography. Sixty curved root canals were prepared and filled with gutta-percha and sealer. After determination of root canal curvatures and radii in two directions as well as volumes of filling material, the teeth were assigned to three comparable groups (n = 20). Retreatment was performed using Reciproc, ProTaper Universal Retreatment or Hedström files. Percentages of residual filling material and dentine removal were assessed using micro-CT imaging. Working time and procedural errors were recorded. Statistical analysis was performed by analysis of variance procedures. No significant differences amongst the three retreatment techniques concerning residual filling material were detected (P > 0.05). Hedström files removed significantly more dentine than ProTaper Universal Retreatment (P < 0.05). Reciproc and ProTaper Universal Retreatment were significantly faster than Hedström files (P = 0.0001). No procedural errors such as instrument fracture, blockage, ledging or perforation were detected for Hedström files. Three perforations were recorded for ProTaper Universal Retreatment, and in both NiTi groups, one instrument fracture occurred. Remnants of filling material were observed in all samples with no significant differences between the three techniques. Hedström files removed significantly more dentine than ProTaper Universal Retreatment, but no significant differences between the two NiTi systems were detected. Procedural errors were observed with ProTaper Universal Retreatment and Reciproc. © 2014 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  8. A human reliability analysis (HRA) method for identifying and assessing the error of commission (EOC) from a diagnosis failure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Yun; Kang, Dae Il

    2005-01-01

    The study deals with a method for systematically identifying and assessing the EOC events that might be caused by a diagnosis failure or misdiagnosis of the expected events in accident scenarios of nuclear power plants. The method for EOC identification and assessment consists of three steps: analysis of the potential for a diagnosis failure (or misdiagnosis), identification of the EOC events arising from the diagnosis failure, and quantitative assessment of the identified EOC events. As a tool for analysing a diagnosis failure, the MisDiagnosis Tree Analysis (MDTA) technique is proposed, together with a taxonomy of misdiagnosis causes. Guidance on the identification of EOC events is also given, as well as a classification system and data for quantitative assessment. As an application of the proposed method, EOC identification and assessment for the Younggwang 3 and 4 plants, and an evaluation of their impact on the plant risk, were performed. As a result, six events or event sequences were considered for diagnosis failures and about 20 new Human Failure Events (HFEs) involving EOCs were identified. According to the assessment of the risk impact of the identified HFEs, they increase the CDF by 11.4 % of the current CDF value, which corresponds to 10.2 % of the new CDF. The small loss of coolant accident (SLOCA) turned out to be a major contributor to the increase of the CDF, resulting in a 9.2 % increase of the current CDF.

  9. Cost development of future technologies for power generation-A study based on experience curves and complementary bottom-up assessments

    International Nuclear Information System (INIS)

    Neij, Lena

    2008-01-01

    Technology foresight studies have become an important tool in identifying realistic ways of reducing the impact of modern energy systems on the climate and the environment. Studies on the future cost development of advanced energy technologies are of special interest. One approach widely adopted for the analysis of future cost is the experience curve approach. The question is, however, how robust this approach is, and which experience curves should be used in energy foresight analysis. This paper presents an analytical framework for the analysis of future cost development of new energy technologies for electricity generation; the analytical framework is based on an assessment of available experience curves, complemented with bottom-up analysis of sources of cost reductions and, for some technologies, judgmental expert assessments of long-term development paths. The results of these three methods agree in most cases, i.e. the cost (price) reductions described by the experience curves match the incremental cost reduction described in the bottom-up analysis and the judgmental expert assessments. For some technologies, the bottom-up analysis confirms large uncertainties in future cost development not captured by the experience curves. Experience curves with a learning rate ranging from 0% to 20% are suggested for the analysis of future cost development

  10. The learning curve, interobserver, and intraobserver agreement of endoscopic confocal laser endomicroscopy in the assessment of mucosal barrier defects.

    Science.gov (United States)

    Chang, Jeff; Ip, Matthew; Yang, Michael; Wong, Brendon; Power, Theresa; Lin, Lisa; Xuan, Wei; Phan, Tri Giang; Leong, Rupert W

    2016-04-01

    Confocal laser endomicroscopy can dynamically assess intestinal mucosal barrier defects and increased intestinal permeability (IP). These are functional features that do not have corresponding appearance on histopathology. As such, previous pathology training may not be beneficial in learning these dynamic features. This study aims to evaluate the diagnostic accuracy, learning curve, inter- and intraobserver agreement for identifying features of increased IP in experienced and inexperienced analysts and pathologists. A total of 180 endoscopic confocal laser endomicroscopy (Pentax EC-3870FK; Pentax, Tokyo, Japan) images of the terminal ileum, subdivided into 6 sets of 30 were evaluated by 6 experienced analysts, 13 inexperienced analysts, and 2 pathologists, after a 30-minute teaching session. Cell-junction enhancement, fluorescein leak, and cell dropout were used to represent increased IP and were either present or absent in each image. For each image, the diagnostic accuracy, confidence, and quality were assessed. Diagnostic accuracy was significantly higher for experienced analysts compared with inexperienced analysts from the first set (96.7% vs 83.1%, P 0.86 for experienced observers. Features representative of increased IP can be rapidly learned with high inter- and intraobserver agreement. Confidence and image quality were significant predictors of accurate interpretation. Previous pathology training did not have an effect on learning. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  11. Industry-Cost-Curve Approach for Modeling the Environmental Impact of Introducing New Technologies in Life Cycle Assessment.

    Science.gov (United States)

    Kätelhön, Arne; von der Assen, Niklas; Suh, Sangwon; Jung, Johannes; Bardow, André

    2015-07-07

    The environmental costs and benefits of introducing a new technology depend not only on the technology itself, but also on the responses of the market where substitution or displacement of competing technologies may occur. An internationally accepted method taking both technological and market-mediated effects into account, however, is still lacking in life cycle assessment (LCA). For the introduction of a new technology, we here present a new approach for modeling the environmental impacts within the framework of LCA. Our approach is motivated by consequential life cycle assessment (CLCA) and aims to contribute to the discussion on how to operationalize consequential thinking in LCA practice. In our approach, we focus on new technologies producing homogeneous products such as chemicals or raw materials. We employ the industry cost-curve (ICC) for modeling market-mediated effects. Thereby, we can determine substitution effects at a level of granularity sufficient to distinguish between competing technologies. In our approach, a new technology alters the ICC potentially replacing the highest-cost producer(s). The technologies that remain competitive after the new technology's introduction determine the new environmental impact profile of the product. We apply our approach in a case study on a new technology for chlor-alkali electrolysis to be introduced in Germany.
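
    The industry-cost-curve logic described above can be sketched as sorting producers by unit cost, filling demand along the curve, and letting a cheaper new technology displace the highest-cost producers that no longer fit under demand; the resulting mix determines the average environmental impact. A minimal sketch with hypothetical producers, costs and impact factors (not the paper's data):

        def market_mix(producers, demand):
            """Fill demand along the industry cost curve (cheapest first); return (name, supplied, impact) tuples."""
            mix, remaining = [], demand
            for name, cost, capacity, impact in sorted(producers, key=lambda p: p[1]):
                supplied = min(capacity, remaining)
                if supplied > 0:
                    mix.append((name, supplied, impact))
                    remaining -= supplied
            return mix

        def average_impact(mix):
            total = sum(supplied for _, supplied, _ in mix)
            return sum(supplied * impact for _, supplied, impact in mix) / total

        # Hypothetical producers: (name, unit cost, capacity, impact per unit), illustration only
        incumbents = [("plant A", 40, 50, 1.2), ("plant B", 55, 60, 1.0), ("plant C", 70, 40, 1.5)]
        new_tech   = ("new electrolysis", 50, 40, 0.6)

        demand = 120
        print("before:", average_impact(market_mix(incumbents, demand)))
        print("after: ", average_impact(market_mix(incumbents + [new_tech], demand)))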

  12. Assessment of the French and US embrittlement trend curves applied to RPV materials irradiated in the BR2 materials test reactor

    International Nuclear Information System (INIS)

    Chaouadi, R.; Gerard, R.; Boagaerts, A.S.

    2011-01-01

    The irradiation embrittlement of reactor pressure vessels (RPVs) is monitored through surveillance programs associated with predictive formulas, the so-called embrittlement trend curves. These formulas are generally empirically derived and contain the major embrittlement-inducing elements such as copper, nickel and phosphorus. A number of such trend curves are used in the various regulatory guides of the US, France, Germany, Russia and Japan. These trend curves are often supported by surveillance data and regularly assessed in view of updated surveillance databases. With the recent worldwide move towards life extension of existing reactors beyond their initially scheduled lifetime of 40 years, adequate and accurate modeling of irradiation embrittlement becomes a concern for long term operation. The aim of this work is to assess the performance of the embrittlement trend curves used in a regulatory perspective. The work presented here is limited to the US and French trend curves because the reactor pressure vessels of the Belgian nuclear power plants are either of Westinghouse or Framatome design. The chemical composition of the Belgian RPVs being very close to that of the French 900 MW units, the French trend curve is used, except for the Doel 1-2 units for which these curves are not applicable due to the higher copper content of the welds; in that case, the US trend curves are used. Specifically, the performance of these trend curves in representing the experimental data obtained in the BR2 reactor is evaluated; the French (FIM, FIS) and the US (Reg. Guide 1.99 Rev. 2, ASTM E900-02, EWO and EONY) formulas are of prime interest. The results obtained clearly show that the French trend curves tend to over-estimate the actual irradiation hardening while the US curves under-estimate it. Within the long term operation perspective, both over- and under-estimation are undesirable and therefore the

  13. Assessing responsiveness of generic and specific health related quality of life measures in heart failure

    Directory of Open Access Journals (Sweden)

    Johnson Jeffrey A

    2006-11-01

    Background Responsiveness, or sensitivity to clinical change, is an important consideration in the selection of a health-related quality of life (HRQL) measure for trials or clinical applications. Many approaches can be used to assess responsiveness, which may affect the interpretation of study results. We compared the relative responsiveness of generic and heart failure specific HRQL instruments, as measured both by common psychometric indices and by external clinical criteria. Methods We analyzed data collected at baseline and 6 weeks in 298 subjects with heart failure on the following HRQL measures: EQ-5D (US, UK, and VAS scoring), Kansas City Cardiomyopathy Questionnaire (KCCQ; Clinical and Overall Summary Scores), and RAND12 (Physical and Mental Component Summaries). Three external indicators of clinical change were used to classify subjects as improved, deteriorated, or unchanged: 6-minute walk test, New York Heart Association (NYHA) class, and physician global rating of change. Four responsiveness statistics (t-test, effect size, Guyatt's responsiveness statistic, and standardized response mean) were used to evaluate the responsiveness of the selected measures. The median rank of each HRQL measure across responsiveness indices and clinical criteria was then determined. Results The average age of subjects was 60 years, 75 percent were male, and subjects had moderate to severe heart failure symptoms. Overall, the KCCQ Summary Scores had the highest relative ranking, irrespective of the responsiveness index or external criterion used. Importantly, we observed that the relative ranking of responsiveness of the generic measures (i.e., EQ-5D, RAND12) was influenced by both the responsiveness indices and the external criterion used. Conclusion The disease-specific KCCQ was the most responsive HRQL measure for assessing change over a 6-week period, although generic measures provide information for which the KCCQ is not suitable. The responsiveness of generic HRQL measures may

  14. Assessing the impact of heart failure specialist services on patient populations

    Directory of Open Access Journals (Sweden)

    Lyratzopoulos Georgios

    2004-05-01

    Background The assessment of the impact of healthcare interventions may help commissioners of healthcare services to make optimal decisions. This can be particularly the case if the impact assessment relates to specific patient populations and uses timely local data. We examined the potential impact on readmissions and mortality of specialist heart failure services capable of delivering treatments such as b-blockers and Nurse-Led Educational Intervention (N-LEI). Methods Statistical modelling of prevented or postponed events among previously hospitalised patients, using estimates of: treatment uptake and contraindications (based on local audit data); treatment effectiveness and intolerance (based on literature); and annual number of hospitalizations per patient and annual risk of death (based on routine data). Results Optimal treatment uptake among eligible but untreated patients would over one year prevent or postpone 11% of all expected readmissions and 18% of all expected deaths for spironolactone, 13% of all expected readmissions and 22% of all expected deaths for b-blockers (carvedilol), and 20% of all expected readmissions and an uncertain number of deaths for N-LEI. Optimal combined treatment uptake for all three interventions during one year among all eligible but untreated patients would prevent or postpone 37% of all expected readmissions and a minimum of 36% of all expected deaths. Conclusion In a population of previously hospitalised patients with low previous uptake of b-blockers and no uptake of N-LEI, optimal combined uptake of interventions through specialist heart failure services can potentially help prevent or postpone approximately four times as many readmissions and a minimum of twice as many deaths compared with simply optimising uptake of spironolactone (not necessarily requiring specialist services). Examination of the impact of different heart failure interventions can inform rational planning of relevant healthcare

  15. Assessing the impact of heart failure specialist services on patient populations.

    Science.gov (United States)

    Lyratzopoulos, Georgios; Cook, Gary A; McElduff, Patrick; Havely, Daniel; Edwards, Richard; Heller, Richard F

    2004-05-24

    The assessment of the impact of healthcare interventions may help commissioners of healthcare services to make optimal decisions. This can be particularly the case if the impact assessment relates to specific patient populations and uses timely local data. We examined the potential impact on readmissions and mortality of specialist heart failure services capable of delivering treatments such as b-blockers and Nurse-Led Educational Intervention (N-LEI). Statistical modelling of prevented or postponed events among previously hospitalised patients, using estimates of: treatment uptake and contraindications (based on local audit data); treatment effectiveness and intolerance (based on literature); and annual number of hospitalization per patient and annual risk of death (based on routine data). Optimal treatment uptake among eligible but untreated patients would over one year prevent or postpone 11% of all expected readmissions and 18% of all expected deaths for spironolactone, 13% of all expected readmissions and 22% of all expected deaths for b-blockers (carvedilol) and 20% of all expected readmissions and an uncertain number of deaths for N-LEI. Optimal combined treatment uptake for all three interventions during one year among all eligible but untreated patients would prevent or postpone 37% of all expected readmissions and a minimum of 36% of all expected deaths. In a population of previously hospitalised patients with low previous uptake of b-blockers and no uptake of N-LEI, optimal combined uptake of interventions through specialist heart failure services can potentially help prevent or postpone approximately four times as many readmissions and a minimum of twice as many deaths compared with simply optimising uptake of spironolactone (not necessarily requiring specialist services). Examination of the impact of different heart failure interventions can inform rational planning of relevant healthcare services.

  16. [Implantable sensors for outpatient assessment of ventricular filling pressure in advanced heart failure: Which telemonitoring design is optimal?]

    Science.gov (United States)

    Herrmann, E; Fichtlscherer, S; Hohnloser, S H; Zeiher, A M; Aßmus, B

    2016-12-01

    Patients with advanced heart failure suffer from frequent hospitalizations. Non-invasive hemodynamic telemonitoring for assessment of ventricular filling pressure has been shown to reduce hospitalizations. We report on the right ventricular pressure (RVP), pulmonary artery pressure (PAP) and left atrial pressure (LAP) sensors for non-invasive assessment of the ventricular filling pressure. A literature search concerning the available implantable pressure sensors for non-invasive hemodynamic telemonitoring in patients with advanced heart failure was performed. To date, only implantation of the PAP sensor has been shown to reduce hospitalizations for cardiac decompensation and to improve quality of life. The right ventricular pressure sensor missed the primary endpoint of a significant reduction of hospitalizations, and clinical data for the left atrial pressure sensor are still pending. The implantation of a pressure sensor for assessment of pulmonary artery filling pressure is suitable for reducing hospitalizations for heart failure and for improving quality of life in patients with advanced heart failure.

  17. Assessment of p-y Curves from Numerical Methods for a non-Slender Monopile in Cohesionless Soil

    DEFF Research Database (Denmark)

    Ibsen, Lars Bo; Roesen, Hanne Ravn; Wolf, Torben K.

    2013-01-01

    In current design, the stiff, large-diameter monopile is a widely used foundation solution for offshore wind turbines. Winds and waves subject the monopile to considerable lateral loads. The current design guidelines apply the p-y curve method with formulations for the curves based on slender piles.... However, the behaviour of stiff monopiles during lateral loading is not fully understood. In this paper, a case study from Barrow Offshore Wind Farm is used in a 3D finite element model. The analysis forms a basis for the extraction of p-y curves, which are used in an evaluation of the traditional curves...

  18. Assessment of p-y Curves from Numerical Methods for a non-Slender Monopile in Cohesionless Soil

    DEFF Research Database (Denmark)

    Wolf, Torben K.; Rasmussen, Kristian L.; Hansen, Mette

    In current design, the stiff, large-diameter monopile is a widely used foundation solution for offshore wind turbines. Winds and waves subject the monopile to considerable lateral loads. The current design guidelines apply the p-y curve method with formulations for the curves based on slender piles.... However, the behaviour of stiff monopiles during lateral loading is not fully understood. In this paper, a case study from Barrow Offshore Wind Farm is used in a 3D finite element model. The analysis forms a basis for the extraction of p-y curves, which are used in an evaluation of the traditional curves...

  19. Assessment of the causes of failures of roto-dynamic equipment in Cirus

    International Nuclear Information System (INIS)

    Rao, K.N.; Singh, S.; Ganeshan, P.

    1994-01-01

    As a part of the Cirus reactor life extension program study, a service life evaluation of critical roto-dynamic equipment in Cirus, such as the primary coolant pumps and their concrete foundation structures, pressurised water loop pumps, main air compressors, and supply and exhaust fans, was performed. An assessment of the causes of failures of roto-dynamic equipment in Cirus was carried out. Based on an assessment of the degradation-mitigating features and a comparison with similar roto-dynamic equipment and their concrete foundation structures, it was concluded that life extension of this roto-dynamic equipment and the associated structures is feasible. To support this conclusion, a program involving (a) non-destructive testing, (b) surveillance and monitoring and (c) preventive maintenance is recommended. (author). 4 refs

  20. Strategic environmental assessment can help solve environmental impact assessment failures in developing countries

    International Nuclear Information System (INIS)

    Alshuwaikhat, Habib M.

    2005-01-01

    The current trend of industrialization and urbanization in developing nations has a huge impact on anthropogenic and natural ecosystems. Pollution sources increase with the expansion of cities and cause contamination of water, air and soil. The absence of urban environmental planning and management strategies has resulted in greater concern for future urban development. This paper advocates the adoption of strategic environmental assessment (SEA) as a means to achieve sustainable development in developing countries. It investigates project-level environmental impact assessment (EIA) and its limitations. The exploration of SEA and its features are addressed. The effective implementation of SEA can create a roadmap for sustainable development. In many developing countries, the lack of transparency and accountability and ineffective public participation in the development of the policy, plan and program (PPP) would be mitigated by the SEA process. Moreover, the proactive and broadly based characteristics of SEA would benefit the institutional development of the PPP process, which is rarely experienced in many developing countries. The paper also explores the prospects for SEA and its guiding principles in developing countries. Finally, the paper calls for a coordinated effort between all government, nongovernment and international organizations involved with PPPs to enable developing countries to pursue a path of sustainable development through the development and application of strategic environmental assessment

  1. Development of a Watershed-Scale Long-Term Hydrologic Impact Assessment Model with the Asymptotic Curve Number Regression Equation

    Directory of Open Access Journals (Sweden)

    Jichul Ryu

    2016-04-01

    In this study, 52 asymptotic Curve Number (CN) regression equations were developed for combinations of representative land covers and hydrologic soil groups. In addition, to overcome the limitations of the original Long-term Hydrologic Impact Assessment (L-THIA) model when it is applied to larger watersheds, a watershed-scale L-THIA Asymptotic CN (ACN) regression equation model (watershed-scale L-THIA ACN model) was developed by integrating the asymptotic CN regressions and various modules for direct runoff, baseflow and channel routing. The watershed-scale L-THIA ACN model was applied to four watersheds in South Korea to evaluate the accuracy of its streamflow predictions. The coefficient of determination (R2) and Nash–Sutcliffe Efficiency (NSE) values for observed versus simulated streamflows over intervals of eight days were greater than 0.6 for all four watersheds. The watershed-scale L-THIA ACN model, including the asymptotic CN regression equation method, can simulate long-term streamflow sufficiently well with the ten parameters that have been added for the characterization of streamflow.
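    The asymptotic CN approach referenced above lets the effective curve number vary with rainfall depth rather than staying fixed. A minimal sketch of one common (Hawkins-type) asymptotic form combined with the standard SCS-CN direct-runoff equation is shown below; the coefficient values are hypothetical and are not the fitted coefficients from this study.

```python
import math

def asymptotic_cn(p_mm, cn_inf, k):
    """Hawkins-type asymptotic curve number as a function of rainfall depth (mm).
    cn_inf and k are fitted, land-cover/soil-group specific coefficients (assumed here)."""
    return cn_inf + (100.0 - cn_inf) * math.exp(-k * p_mm)

def scs_direct_runoff(p_mm, cn):
    """Standard SCS-CN direct runoff (mm) with initial abstraction Ia = 0.2*S."""
    s = 25400.0 / cn - 254.0           # potential maximum retention (mm)
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

if __name__ == "__main__":
    # Hypothetical coefficients for one land-cover / hydrologic-soil-group combination.
    for p in (10.0, 30.0, 60.0, 100.0):
        cn = asymptotic_cn(p, cn_inf=65.0, k=0.05)
        print(f"P = {p:5.1f} mm  CN = {cn:5.1f}  Q = {scs_direct_runoff(p, cn):6.2f} mm")
```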

  2. Two viewpoints for software failures and their relation in probabilistic safety assessment of digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2015-01-01

    As the use of digital systems in nuclear power plants increases, the reliability of the software becomes one of the important issues in probabilistic safety assessment. In this paper, two viewpoints for a software failure during the operation of a digital system or a statistical software test are identified, and the relation between them is provided. In conventional software reliability analysis, a failure is mainly viewed with respect to the system operation. A new viewpoint with respect to the system input is suggested. The failure probability density functions for the two viewpoints are defined, and the relation between the two failure probability density functions is derived. Each failure probability density function can be derived from the other failure probability density function by applying the derived relation between the two failure probability density functions. The usefulness of the derived relation is demonstrated by applying it to the failure data obtained from the software testing of a real system. The two viewpoints and their relation, as identified in this paper, are expected to help us extend our understanding of the reliability of safety-critical software. (author)

  3. Assessment selection in human-automation interaction studies: The Failure-GAM2E and review of assessment methods for highly automated driving.

    Science.gov (United States)

    Grane, Camilla

    2018-01-01

    Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM2E model, whose purpose is to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM2E includes a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM2E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through feet movements and a proposed Optimal Risk Management Model. Failure-GAM2E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Failure Monitoring and Condition Assessment of Steel-Concrete Adhesive Connection Using Ultrasonic Waves

    Directory of Open Access Journals (Sweden)

    Magdalena Rucka

    2018-02-01

    Adhesive bonding is increasingly being incorporated into civil engineering applications. Recently, the use of structural adhesives in steel-concrete composite systems has attracted particular interest. The aim of the study is an experimental investigation of damage assessment of the connection between steel and concrete during mechanical degradation. Nine specimens, each consisting of a concrete cube and two adhesively bonded steel plates, were examined. The inspection was based on ultrasound monitoring during push-out tests. Ultrasonic waves were excited and registered by means of piezoelectric transducers every two seconds until specimen failure. To determine the slip between the steel and concrete, a photogrammetric method was applied. The procedure of damage evaluation is based on monitoring the changes in the amplitude and phase shift of signals measured during subsequent phases of degradation. To quantify discrepancies between the reference signal and the other registered signals, the Sprague and Geers metric was applied. The results showed the possibilities and limitations of the proposed approach in the diagnostics of adhesive connections between steel and concrete, depending on the failure modes.
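    The Sprague and Geers metric mentioned above separates the discrepancy between two transient signals into a magnitude part and a phase part and combines them into a single error value. A minimal sketch of that calculation is given below; the signals are synthetic, and the implementation follows the commonly published form of the metric rather than the exact processing used in the study.

```python
import numpy as np

def sprague_geers(reference, candidate, dt=1.0):
    """Sprague & Geers magnitude (M), phase (P) and combined (C) error metrics
    between a reference signal and a candidate signal sampled at the same instants."""
    r = np.asarray(reference, dtype=float)
    c = np.asarray(candidate, dtype=float)
    psi_rr = np.trapz(r * r, dx=dt)
    psi_cc = np.trapz(c * c, dx=dt)
    psi_rc = np.trapz(r * c, dx=dt)
    m_err = np.sqrt(psi_cc / psi_rr) - 1.0
    p_err = np.arccos(np.clip(psi_rc / np.sqrt(psi_rr * psi_cc), -1.0, 1.0)) / np.pi
    return m_err, p_err, np.hypot(m_err, p_err)

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 1000)
    baseline = np.sin(2 * np.pi * 5 * t)               # reference (undamaged) signal
    degraded = 0.8 * np.sin(2 * np.pi * 5 * t - 0.3)   # later, degraded signal (assumed)
    print(sprague_geers(baseline, degraded, dt=t[1] - t[0]))
```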

  5. Assessment of surge arrester failure rate and application studies in Hellenic high voltage transmission lines

    Energy Technology Data Exchange (ETDEWEB)

    Christodoulou, C.A.; Fotis, G.P.; Gonos, I.F.; Stathopulos, I.A. [National Technical University of Athens, School of Electrical and Computer Engineering, High Voltage Laboratory, 9 Iroon Politechniou St., Zografou Campus, 157 80 Athens (Greece); Ekonomou, L. [A.S.PE.T.E. - School of Pedagogical and Technological Education, Department of Electrical Engineering Educators, N. Heraklion, 141 21 Athens (Greece)

    2010-02-15

    The use of transmission line surge arresters to improve the lightning performance of transmission lines is becoming more common. Especially in areas with high soil resistivity and ground flash density, surge arresters constitute the most effective means of protection. In this paper a methodology for assessing the surge arrester failure rate based on the electrogeometrical model is presented. Critical currents that exceed the arresters' rated energy stress were estimated by the use of a simulation tool. The methodology is applied to operating Hellenic transmission lines of 150 kV. Several case studies are analyzed by installing surge arresters at different intervals, in relation to the region's tower footing resistance and ground flash density. The obtained results are compared with real records of outage rate, showing the effectiveness of the surge arresters in reducing the recorded failure rate. The presented methodology can prove valuable to electric power system designers aiming at more effective lightning protection, reducing operational costs and providing continuity of service. (author)

  6. Readability Assessment of Online Patient Education Material on Congestive Heart Failure

    Science.gov (United States)

    2017-01-01

    Background Online health information is used increasingly by the general population. However, this information typically favors only a small percentage of readers, which can result in suboptimal medical outcomes for patients. Objective The readability of online patient education materials on the topic of congestive heart failure was assessed with six readability assessment tools. Methods The search phrase “congestive heart failure” was entered into the Google search engine. Of the first 100 websites, only 70 met the selection and exclusion criteria and were included. These were then assessed with six readability assessment tools. Results Only 5 of 70 websites were within the limits of the recommended sixth-grade readability level. The mean readability scores were as follows: the Flesch-Kincaid Grade Level (9.79), Gunning-Fog Score (11.95), Coleman-Liau Index (15.17), Simple Measure of Gobbledygook (SMOG) index (11.39), and the Flesch Reading Ease (48.87). Conclusion Most of the analyzed websites were found to be above the sixth-grade readability level recommendations. Efforts need to be made to better tailor online patient education materials to the general population. PMID:28656111
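    Two of the indices cited above, the Flesch-Kincaid Grade Level and the Flesch Reading Ease, are simple functions of average sentence length and average syllables per word. The sketch below shows those two formulas with a deliberately crude vowel-group syllable counter; production readability tools use more careful syllable and sentence detection, so treat this as an illustration only.

```python
import re

def count_syllables(word):
    """Very rough vowel-group syllable count (a simplification for illustration)."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text):
    """Return (Flesch-Kincaid Grade Level, Flesch Reading Ease) for a text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syll = sum(count_syllables(w) for w in words)
    fk_grade = 0.39 * n_words / sentences + 11.8 * n_syll / n_words - 15.59
    flesch_ease = 206.835 - 1.015 * n_words / sentences - 84.6 * n_syll / n_words
    return fk_grade, flesch_ease

if __name__ == "__main__":
    sample = ("Congestive heart failure means the heart cannot pump blood as well as it should. "
              "Swelling, tiredness and shortness of breath are common symptoms.")
    grade, ease = readability(sample)
    print(f"Flesch-Kincaid grade: {grade:.1f}  Flesch Reading Ease: {ease:.1f}")
```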

  7. Assessment of Ex-Vitro Anaerobic Digestion Kinetics of Crop Residues Through First Order Exponential Models: Effect of LAG Phase Period and Curve Factor

    Directory of Open Access Journals (Sweden)

    Abdul Razaque Sahito

    2013-04-01

    Kinetic studies of the AD (Anaerobic Digestion) process are useful for predicting the performance of digesters and designing appropriate digesters, and are also helpful in understanding inhibitory mechanisms of biodegradation. The aim of this study was to assess the kinetics of anaerobic digestion of crop residues with buffalo dung. Seven crop residues, namely bagasse, banana plant waste, canola straw, cotton stalks, rice straw, sugarcane trash and wheat straw, were selected from the field and were analyzed for MC (Moisture Content), TS (Total Solids) and VS (Volatile Solids) with standard methods. In the present study, three first order exponential models, namely the exponential model, the exponential lag phase model and the exponential curve factor model, were used to assess the kinetics of the AD process of crop residues, and the effects of the lag phase and curve factor were analyzed on the basis of statistical hypothesis testing and information theory. The AD of crop residues with buffalo dung follows first order kinetics. Of the three models, the simple exponential model was the poorest, while the first order exponential curve factor model was the best-fit model. In addition to the statistical hypothesis testing, the exponential curve factor model has the lowest value of AIC (Akaike's Information Criterion) and can reproduce methane production data more accurately. Furthermore, there is an inverse linear relationship between the lag phase period and the curve factor.
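    As a minimal illustration of the model comparison described above, the sketch below fits the simple first order exponential model B(t) = B0(1 − e^(−kt)) and its lag-phase variant B(t) = B0(1 − e^(−k(t − λ))) to a synthetic cumulative methane curve and ranks them by a least-squares AIC. The data and starting values are assumptions, and the curve factor variant is omitted because its exact functional form is not given in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, b0, k):
    return b0 * (1.0 - np.exp(-k * t))

def first_order_lag(t, b0, k, lag):
    return b0 * (1.0 - np.exp(-k * np.clip(t - lag, 0.0, None)))

def aic_least_squares(y, y_hat, n_params):
    """AIC for a least-squares fit (up to an additive constant)."""
    rss = np.sum((y - y_hat) ** 2)
    n = len(y)
    return n * np.log(rss / n) + 2 * n_params

if __name__ == "__main__":
    # Hypothetical cumulative methane yields (mL/g VS) over digestion time (days).
    t = np.array([0, 2, 5, 10, 15, 20, 30, 40, 50], dtype=float)
    y = np.array([0, 15, 60, 130, 175, 200, 230, 242, 248], dtype=float)

    for name, model, p0 in [("first order", first_order, (250.0, 0.1)),
                            ("first order + lag", first_order_lag, (250.0, 0.1, 1.0))]:
        popt, _ = curve_fit(model, t, y, p0=p0, maxfev=10000)
        aic = aic_least_squares(y, model(t, *popt), len(popt))
        print(f"{name:18s} params={np.round(popt, 3)}  AIC={aic:.1f}")
```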

  8. Assessment of ex-vitro anaerobic digestion kinetics of crop residues through first order exponential models: effect of lag phase period and curve factor

    International Nuclear Information System (INIS)

    Sahito, A.R.; Brohi, K.M.

    2013-01-01

    Kinetic studies of the AD (Anaerobic Digestion) process are useful for predicting the performance of digesters and designing appropriate digesters, and are also helpful in understanding inhibitory mechanisms of biodegradation. The aim of this study was to assess the kinetics of anaerobic digestion of crop residues with buffalo dung. Seven crop residues, namely bagasse, banana plant waste, canola straw, cotton stalks, rice straw, sugarcane trash and wheat straw, were selected from the field and were analyzed for MC (Moisture Content), TS (Total Solids) and VS (Volatile Solids) with standard methods. In the present study, three first order exponential models, namely the exponential model, the exponential lag phase model and the exponential curve factor model, were used to assess the kinetics of the AD process of crop residues, and the effects of the lag phase and curve factor were analyzed on the basis of statistical hypothesis testing and information theory. The AD of crop residues with buffalo dung follows first order kinetics. Of the three models, the simple exponential model was the poorest, while the first order exponential curve factor model was the best-fit model. In addition to the statistical hypothesis testing, the exponential curve factor model has the lowest value of AIC (Akaike's Information Criterion) and can reproduce methane production data more accurately. Furthermore, there is an inverse linear relationship between the lag phase period and the curve factor. (author)

  9. Liver failure after hepatectomy: A risk assessment using the pre-hepatectomy shear wave elastography technique

    Energy Technology Data Exchange (ETDEWEB)

    Han, Hong, E-mail: han.hong@zs-hospital.sh.cn [Department of Ultrasound, Zhongshan Hospital, Fudan University, No. 180 Fenglin Road, Xuhui District, Shanghai 200032 (China); Hu, Hao; Xu, Ya Dan [Zhongshan Hospital, Fudan University, Shanghai Institute of Medical Imaging, No. 180 Fenglin Road, Xuhui District, Shanghai 200032 (China); Wang, Wen Ping, E-mail: puguang61@126.com [Department of Ultrasound, Zhongshan Hospital, Fudan University, No. 180 Fenglin Road, Xuhui District, Shanghai 200032 (China); Ding, Hong; Lu, Qing [Department of Ultrasound, Zhongshan Hospital, Fudan University, No. 180 Fenglin Road, Xuhui District, Shanghai 200032 (China)

    2017-01-15

    Objective: To determine the efficacy of liver stiffness (LS) measurements obtained with the Shear Wave Elastography (SWE) technique for predicting post-hepatectomy liver failure (PHLF) among patients with hepatocellular carcinoma (HCC). Methods: Data from eighty consecutive patients undergoing hepatectomy for HCC were prospectively identified and evaluated with preoperative SWE. The SWE was measured with advanced ultrasound equipment (Philips EPIQ7; Philips Healthcare, Seattle, WA, USA). PHLF was classified according to the International Study Group of Liver Surgery (ISGLS) recommendations. Results: SWE was successfully performed in 77 patients. According to the ISGLS criteria, PHLF occurred in 35.1% of patients (27 patients), including 2/25 patients with Grade A/B, respectively. Elevated SWE values (P = 0.002) and histological cirrhosis (P = 0.003) were independent predictors of PHLF in the multivariate analysis. Patients with SWE values of 6.9 kPa or higher were identified as being at higher risk of PHLF (area under the curve: 0.843; sensitivity: 77.8%; specificity: 78.0%). The postoperative course of the median Model for End-stage Liver Disease (MELD) score showed irregular changes among patients with an SWE >6.9 kPa, whereas in patients with an SWE <6.9 kPa the median MELD score gradually decreased. Conclusion: LS measured with SWE is a valid and reliable method for the prediction of PHLF grade A/B among patients with HCC. SWE could become a routine examination for the preoperative evaluation of PHLF.
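    A threshold such as the 6.9 kPa cutoff above is the kind of value typically read off an ROC curve, for example at the point maximising the Youden index (sensitivity + specificity − 1). The sketch below shows that procedure on synthetic stiffness data; the numbers are invented and will not reproduce the study's AUC or cutoff.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Synthetic liver-stiffness values (kPa): PHLF cases assumed stiffer on average.
stiffness = np.concatenate([rng.normal(6.0, 1.2, 50), rng.normal(8.5, 1.5, 27)])
phlf = np.concatenate([np.zeros(50, dtype=int), np.ones(27, dtype=int)])

fpr, tpr, thresholds = roc_curve(phlf, stiffness)
youden = tpr - fpr                      # sensitivity + specificity - 1
best = np.argmax(youden)
print(f"AUC = {roc_auc_score(phlf, stiffness):.3f}")
print(f"Optimal cutoff ~ {thresholds[best]:.1f} kPa "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```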

  10. Fibre failure assessment in carbon fibre reinforced polymers under fatigue loading by synchrotron X-ray computed tomography

    OpenAIRE

    Garcea, Serafina; Sinclair, Ian; Spearing, Simon

    2016-01-01

    In situ fatigue experiments using synchrotron X-ray computed tomography (SRCT) are used to assess the underpinning micromechanisms of fibre failure in double notch carbon/epoxy coupons. Observations showed fibre breaks along the 0º ply splits, associated with the presence and failure of bridging fibres, as well as fibres failed in the bulk composite within the 0º plies. A tendency for cluster formation, with multiple adjacent breaks in the bulk composite was observed when higher peak loads we...

  11. Modeling of Electrical Cable Failure in a Dynamic Assessment of Fire Risk

    Science.gov (United States)

    Bucknor, Matthew D.

    ... complexity to existing cable failure techniques and tuned to empirical data can better approximate the temperature response of cables located in tightly packed cable bundles. The new models also provide a way to determine the conditions inside a cable bundle, which allows for separate treatment of cables on the interior of the bundle and cables on the exterior of the bundle. The results from the DET analysis show that the overall assessed probability of cable failure can be significantly reduced by more realistically accounting for the influence that the fire brigade has on a fire progression scenario. The shielding analysis results demonstrate a significant reduction in the temperature response of a shielded versus a non-shielded cable bundle; however, the computational cost of using a fire progression model that can capture these effects may be prohibitive for performing DET analyses with currently available computational fluid dynamics models and computational resources.

  12. Hormonal and cardiovascular reflex assessment in a female patient with pure autonomic failure

    Directory of Open Access Journals (Sweden)

    Heno Ferreira Lopes

    2000-09-01

    We report the case of a 72-year-old female with pure autonomic failure, a rare entity, whose diagnosis of autonomic dysfunction was established with a series of complementary tests. For approximately 2 years, the patient had been experiencing dizziness and a tendency to fall, significant weight loss, generalized weakness, dysphagia, intestinal constipation, blurred vision, dry mouth, and changes in her voice. She underwent clinical assessment and laboratory tests (biochemical tests, chest X-ray, digestive endoscopy, colonoscopy, chest computed tomography, abdomen and pelvis computed tomography, abdominal ultrasound, and ambulatory blood pressure monitoring). Measurements of catecholamines and plasma renin activity were performed at rest and after physical exercise. Finally, the patient underwent physiological and pharmacological autonomic tests that further characterized the dysautonomia.

  13. Procedures for conducting common cause failure analysis in probabilistic safety assessment

    International Nuclear Information System (INIS)

    1992-05-01

    The principal objective of this report is to supplement the procedure developed in Mosleh et al. (1988, 1989) by providing more explicit guidance for a practical approach to common cause failures (CCF) analysis. The detailed CCF analysis following that procedure would be very labour intensive and time consuming. This document identifies a number of options for performing the more labour intensive parts of the analysis in an attempt to achieve a balance between the need for detail, the purpose of the analysis and the resources available. The document is intended to be compatible with the Agency's Procedures for Conducting Probabilistic Safety Assessments for Nuclear Power Plants (IAEA, 1992), but can be regarded as a stand-alone report to be used in conjunction with NUREG/CR-4780 (Mosleh et al., 1988, 1989) to provide additional detail, and discussion of key technical issues

  14. The constant failure rate model for fault tree evaluation as a tool for unit protection reliability assessment

    International Nuclear Information System (INIS)

    Vichev, S.; Bogdanov, D.

    2000-01-01

    The purpose of this paper is to introduce the fault tree analysis method as a tool for unit protection reliability estimation. The constant failure rate model is applied for the reliability assessment, and especially the availability assessment. For that purpose, an example of a unit primary equipment structure and an example fault tree for a simplified unit protection system are presented (author)
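    Under the constant failure rate model, each basic event can be assigned a steady-state or test-interval-based unavailability, and the fault tree gates simply combine these values. The sketch below shows that arithmetic for a deliberately simplified, hypothetical protection fault tree; the failure rates, repair times and tree structure are placeholders, not data from the paper.

```python
def q_repairable(lam_per_h, mttr_h):
    """Steady-state unavailability of a repairable component (constant failure rate)."""
    return lam_per_h * mttr_h / (1.0 + lam_per_h * mttr_h)

def q_tested(lam_per_h, test_interval_h):
    """Approximate unavailability of a periodically tested standby component."""
    return lam_per_h * test_interval_h / 2.0

def gate_and(*q):
    """All inputs must fail (independent basic events assumed)."""
    p = 1.0
    for qi in q:
        p *= qi
    return p

def gate_or(*q):
    """Any input failure fails the gate."""
    p = 1.0
    for qi in q:
        p *= (1.0 - qi)
    return 1.0 - p

if __name__ == "__main__":
    # Hypothetical simplified unit-protection fault tree:
    # top event = (relay A AND relay B fail) OR measurement transformer fails.
    relay = q_tested(lam_per_h=2e-6, test_interval_h=8760.0)   # assumed data
    ct = q_repairable(lam_per_h=1e-6, mttr_h=24.0)             # assumed data
    top = gate_or(gate_and(relay, relay), ct)
    print(f"Relay unavailability: {relay:.2e}, top-event unavailability: {top:.2e}")
```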

  15. [Application of decision curve analysis to the evaluation of an MRI predictive model for early assessment of pathological complete response to neoadjuvant therapy in breast cancer].

    Science.gov (United States)

    He, Y J; Li, X T; Fan, Z Q; Li, Y L; Cao, K; Sun, Y S; Ouyang, T

    2018-01-23

    Objective: To construct a dynamic enhanced MRI-based predictive model for early assessment of pathological complete response (pCR) to neoadjuvant therapy in breast cancer, and to evaluate the clinical benefit of the model using a decision curve. Methods: From December 2005 to December 2007, 170 patients with breast cancer treated with neoadjuvant therapy were identified, and their MR images before neoadjuvant therapy and at the end of the first cycle of neoadjuvant therapy were collected. A logistic regression model was used to detect independent factors for predicting pCR and to construct the predictive model accordingly; receiver operating characteristic (ROC) curve and decision curve analyses were then used to evaluate the predictive model. Results: ΔArea(max) and Δslope(max) were independent predictive factors for pCR, with OR = 0.942 (95% CI: 0.918-0.967) and OR = 0.961 (95% CI: 0.940-0.987), respectively. The area under the ROC curve (AUC) for the constructed model was 0.886 (95% CI: 0.820-0.951). The decision curve showed that, for threshold probabilities above 0.4, the predictive model provided increasing net benefit as the threshold probability increased. Conclusions: The constructed predictive model for pCR is of potential clinical value, with an AUC >0.85. Decision curve analysis indicates that the model yields a net benefit of 3 to 8 percent over the likely threshold probability range of 80% to 90%.
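    Decision-curve analysis, as used above, evaluates a model by its net benefit at each threshold probability: the true-positive rate is credited and the false-positive rate is penalised by the odds of the threshold. The sketch below computes that quantity on synthetic predictions; the outcome labels and probabilities are invented for illustration only.

```python
import numpy as np

def net_benefit(y_true, y_prob, threshold):
    """Decision-curve net benefit of a model at a given threshold probability."""
    y_true = np.asarray(y_true)
    pred_pos = np.asarray(y_prob) >= threshold
    n = len(y_true)
    tp = np.sum(pred_pos & (y_true == 1))
    fp = np.sum(pred_pos & (y_true == 0))
    return tp / n - (fp / n) * threshold / (1.0 - threshold)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = rng.integers(0, 2, 200)                                       # synthetic pCR outcomes
    prob = np.clip(0.3 * y + rng.normal(0.4, 0.2, 200), 0.01, 0.99)   # synthetic model output
    for pt in (0.4, 0.6, 0.8):
        print(f"threshold {pt:.1f}: net benefit {net_benefit(y, prob, pt):+.3f}")
```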

  16. Canagliflozin and Heart Failure in Type 2 Diabetes Mellitus: Results From the CANVAS Program (Canagliflozin Cardiovascular Assessment Study).

    Science.gov (United States)

    Rådholm, Karin; Figtree, Gemma; Perkovic, Vlado; Solomon, Scott D; Mahaffey, Kenneth W; de Zeeuw, Dick; Fulcher, Greg; Barrett, Terrance D; Shaw, Wayne; Desai, Mehul; Matthews, David R; Neal, Bruce

    2018-03-11

    BACKGROUND: Canagliflozin is a sodium glucose cotransporter 2 inhibitor that reduces the risk of cardiovascular events. We report the effects on heart failure and cardiovascular death overall, in those with and without a baseline history of heart failure, and in other participant subgroups. METHODS: The CANVAS Program (Canagliflozin Cardiovascular Assessment Study) enrolled 10 142 participants with type 2 diabetes mellitus and high cardiovascular risk. Participants were randomly assigned to canagliflozin or placebo and followed for a mean of 188 weeks. The primary end point for these analyses was adjudicated cardiovascular death or hospitalized heart failure. RESULTS: Participants with a history of heart failure at baseline (14.4%) were more frequently women, white, and hypertensive and had a history of prior cardiovascular disease (all P values significant). Cardiovascular death or hospitalized heart failure was reduced in those treated with canagliflozin compared with placebo (16.3 versus 20.8 per 1000 patient-years; hazard ratio [HR], 0.78; 95% confidence interval [CI], 0.67-0.91), as was fatal or hospitalized heart failure (HR, 0.70; 95% CI, 0.55-0.89) and hospitalized heart failure alone (HR, 0.67; 95% CI, 0.52-0.87). The benefit on cardiovascular death or hospitalized heart failure may be greater in patients with a prior history of heart failure (HR, 0.61; 95% CI, 0.46-0.80) compared with those without heart failure at baseline (HR, 0.87; 95% CI, 0.72-1.06; P for interaction = 0.021). The effects of canagliflozin compared with placebo on other cardiovascular outcomes and key safety outcomes were similar in participants with and without heart failure at baseline (all interaction P values >0.130), except for a possibly reduced absolute rate of events attributable to osmotic diuresis among those with a prior history of heart failure (P = 0.03). CONCLUSIONS: In patients with type 2 diabetes mellitus and an elevated risk of cardiovascular disease, canagliflozin reduced the risk of cardiovascular death or hospitalized heart failure.

  17. A systematic methodology for creep master curve construction using the stepped isostress method (SSM): a numerical assessment

    Science.gov (United States)

    Miranda Guedes, Rui

    2018-02-01

    Long-term creep of viscoelastic materials is experimentally inferred through accelerating techniques based on the time-temperature superposition principle (TTSP) or the time-stress superposition principle (TSSP). According to these principles, a given property measured for short times at a higher temperature or higher stress level remains the same as that obtained for longer times at a lower temperature or lower stress level, except that the curves are shifted parallel to the horizontal axis, matching a master curve. These procedures enable the construction of creep master curves from short-term experimental tests. The Stepped Isostress Method (SSM) is an evolution of the classical TSSP method. The SSM technique further reduces the number of test specimens required to obtain the master curve, since only one specimen is necessary. The classical approach, using creep tests, demands at least one specimen per stress level to produce the set of creep curves upon which TSSP is applied to obtain the master curve. This work proposes an analytical method to process the SSM raw data. The method is validated using numerical simulations that reproduce the SSM tests based on two different viscoelastic models. One model represents the viscoelastic behavior of a graphite/epoxy laminate and the other represents an adhesive based on epoxy resin.

  18. Failure mode effects and criticality analysis: innovative risk assessment to identify critical areas for improvement in emergency department sepsis resuscitation.

    Science.gov (United States)

    Powell, Emilie S; O'Connor, Lanty M; Nannicelli, Anna P; Barker, Lisa T; Khare, Rahul K; Seivert, Nicholas P; Holl, Jane L; Vozenilek, John A

    2014-06-01

    Sepsis is an increasing problem in the practice of emergency medicine, as its prevalence is increasing and optimal care to reduce mortality requires significant resources and time. Evidence-based septic shock resuscitation strategies exist and rely on appropriate recognition and diagnosis, but adherence to the recommendations, and therefore outcomes, still vary. Our objective was to perform a multi-institutional prospective risk assessment, using failure mode effects and criticality analysis (FMECA), to identify high-risk failures in ED sepsis resuscitation. We conducted a FMECA, which prospectively identifies critical areas for improvement in systems and processes of care, across three diverse hospitals. A multidisciplinary group of participants described the process of emergency department (ED) sepsis resuscitation and then created a comprehensive map and a table listing all process steps and identified process failures. High-risk failures in sepsis resuscitation from each of the institutions were compiled to identify common high-risk failures. Common high-risk failures included limited availability of equipment to place the central venous catheter and conduct invasive monitoring, and cognitive overload leading to errors in decision-making. Additionally, we identified great variability in care processes across institutions. Several common high-risk failures in sepsis care exist: a disparity in resources available across hospitals, a lack of adherence to the invasive components of care, and cognitive barriers that affect expert clinicians' decision-making capabilities. Future work may concentrate on dissemination of non-invasive alternatives and overcoming cognitive barriers in diagnosis and knowledge translation.

  19. Assessment of Core Failure Limits for Light Water Reactor Fuel under Reactivity Initiated Accidents

    International Nuclear Information System (INIS)

    Jernkvist, Lars Olof; Massih, Ali R.

    2004-12-01

    Core failure limits for high-burnup light water reactor UO2 fuel rods, subjected to postulated reactivity initiated accidents (RIAs), are here assessed by use of best-estimate computational methods. The considered RIAs are the hot zero power rod ejection accident (HZP REA) in pressurized water reactors and the cold zero power control rod drop accident (CZP CRDA) in boiling water reactors. Burnup dependent core failure limits for these events are established by calculating the fuel radial average enthalpy connected with incipient fuel pellet melting for fuel burnups in the range of 30 to 70 MWd/kgU. The postulated HZP REA and CZP CRDA result in lower enthalpies for pellet melting than RIAs that take place at rated power. Consequently, the enthalpy thresholds presented here are lower bounds to RIAs at rated power. The calculations are performed with best-estimate models, which are applied in the FRAPCON-3.2 and SCANAIR-3.2 computer codes. Based on the results of three-dimensional core kinetics analyses, the considered power transients are simulated by a Gaussian pulse shape, with a fixed width of either 25 ms (REA) or 45 ms (CRDA). Notwithstanding the differences in postulated accident scenarios between the REA and the CRDA, the calculated core failure limits for these two events are similar. The calculated enthalpy thresholds for fuel pellet melting decrease gradually with fuel burnup, from approximately 960 J/g UO2 at 30 MWd/kgU to 810 J/g UO2 at 70 MWd/kgU. The decline is due to depression of the UO2 melting temperature with increasing burnup, in combination with burnup related changes to the radial power distribution within the fuel pellets. The presented fuel enthalpy thresholds for incipient UO2 melting provide best-estimate core failure limits for low- and intermediate-burnup fuel. However, pulse reactor tests on high-burnup fuel rods indicate that the accumulation of gaseous fission products within the pellets may lead to fuel dispersal into the coolant at

  20. Assessment of the impact of fueling machine failure on the safety of the CANDU-PHWR

    International Nuclear Information System (INIS)

    Al-Kusayer, T.A.

    1982-01-01

    A survey of possible LOCA (Loss-of-Coolant Accident) initiating events that might take place in CANDU-PHWRs (Canadian Deuterium Uranium-Pressurized Heavy Water Reactors) has been conducted, covering both direct and indirect initiators. Among the 22 initiating events surveyed in this study, four direct initiators were selected and analyzed briefly. Those selected were a pump suction piping break, an isolation valve piping break, a bleed valve failure, and a fueling machine interface failure. These were selected as examples of failures that could take place on the inlet side, the outlet side, or the PHTS (Primary Heat Transport System) interfaces. The Pickering NGS (Unit-A) was used for this case study. A double failure (failure of the protective devices to operate when the process equipment fault occurs) and a triple failure (failure of the protective devices and the ECCS as well as the process equipment) were found to be highly improbable

  1. Assessment of heart rate, acidosis, consciousness, oxygenation, and respiratory rate to predict noninvasive ventilation failure in hypoxemic patients.

    Science.gov (United States)

    Duan, Jun; Han, Xiaoli; Bai, Linfu; Zhou, Lintong; Huang, Shicong

    2017-02-01

    To develop and validate a scale using variables easily obtained at the bedside for prediction of failure of noninvasive ventilation (NIV) in hypoxemic patients. The test cohort comprised 449 patients with hypoxemia who were receiving NIV. This cohort was used to develop a scale that considers heart rate, acidosis, consciousness, oxygenation, and respiratory rate (referred to as the HACOR scale) to predict NIV failure, defined as the need for intubation after NIV intervention. The highest possible score was 25 points. To validate the scale, a separate group of 358 hypoxemic patients were enrolled in the validation cohort. The failure rate of NIV was 47.8% and 39.4% in the test and validation cohorts, respectively. In the test cohort, patients with NIV failure had higher HACOR scores at initiation and after 1, 12, 24, and 48 h of NIV than those with successful NIV. At 1 h of NIV the area under the receiver operating characteristic curve was 0.88, showing good predictive power for NIV failure. Using 5 points as the cutoff value, the sensitivity, specificity, positive predictive value, negative predictive value, and diagnostic accuracy for NIV failure were 72.6%, 90.2%, 87.2%, 78.1%, and 81.8%, respectively. These results were confirmed in the validation cohort. Moreover, the diagnostic accuracy for NIV failure exceeded 80% in subgroups classified by diagnosis, age, or disease severity, and also at 1, 12, 24, and 48 h of NIV. Among patients with NIV failure and a HACOR score of >5 at 1 h of NIV, hospital mortality was lower in those intubated at ≤12 h of NIV than in those intubated later [58/88 (66%) vs. 138/175 (79%); P = 0.03]. The HACOR scale variables are easily obtained at the bedside. The scale appears to be an effective way of predicting NIV failure in hypoxemic patients. Early intubation in high-risk patients may reduce hospital mortality.
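    The predictive values reported above follow directly from the sensitivity, specificity and the cohort's failure rate (prevalence). The sketch below reproduces that relationship; plugging in the abstract's rounded values gives predictive values and accuracy close to those reported, which serves here only as a consistency check, not a reanalysis.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive/negative predictive value and accuracy from test characteristics."""
    tp = sensitivity * prevalence
    fn = (1.0 - sensitivity) * prevalence
    fp = (1.0 - specificity) * (1.0 - prevalence)
    tn = specificity * (1.0 - prevalence)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    accuracy = tp + tn
    return ppv, npv, accuracy

if __name__ == "__main__":
    # Values approximating those reported for the HACOR cutoff of >5 points at 1 h of NIV.
    ppv, npv, acc = predictive_values(sensitivity=0.726, specificity=0.902, prevalence=0.478)
    print(f"PPV {ppv:.1%}  NPV {npv:.1%}  accuracy {acc:.1%}")
```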

  2. Assessment of p-y curves from numerical methods for a non-slender monopile in cohesionless soil

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, L. B.; Ravn Roesen, H. [Aalborg Univ. Dept. of Civil Engineering, Aalborg (Denmark); Hansen, Mette; Kirk Wolf, T. [COWI, Kgs. Lyngby (Denmark); Lange Rasmussen, K. [Niras, Aalborg (Denmark)

    2013-06-15

    In current design, the monopile is a widely used solution as the foundation of offshore wind turbines. Winds and waves subject the monopile to considerable lateral loads. The behaviour of monopiles under lateral loading is not fully understood, and the current design guidances apply the p-y curve method in a Winkler model approach. The p-y curve method was originally developed for the jacket piles used in the oil and gas industry, which are much more slender than the monopile foundation. In recent years, 3D finite element analysis has become a tool for the investigation of complex geotechnical situations, such as the laterally loaded monopile. In this paper a 3D FEA is conducted as the basis for an extraction of p-y curves, which are used in an evaluation of the traditional curves. Two different methods are applied to create the data points used for the p-y curves: a force producing a response similar to that seen in the ULS situation is applied stepwise, thereby creating the most realistic soil response. This method, however, does not generate sufficient data points around the rotation point of the pile. Therefore, a forced horizontal displacement of the entire pile is also applied, whereby displacements are created over the entire length of the pile. The response is extracted from the interface and from the nearby soil elements respectively, to investigate the influence this has on the computed curves. p-y curves are obtained near the rotation point by evaluating the soil response during a prescribed displacement, but this response is not in clear agreement with the response during an applied load. Two different material models are applied. It is found that the applied material models have a significant influence on the stiffness of the evaluated p-y curves. The p-y curves evaluated by means of FEA are compared to the conventional p-y curve formulation, which provides a much stiffer response. It is found that the best response is computed by implementing the Hardening Soil model and

  3. Failure Assessment for the High-Strength Pipelines with Constant-Depth Circumferential Surface Cracks

    Directory of Open Access Journals (Sweden)

    X. Liu

    2018-01-01

    In long-distance oil and gas transportation systems, the application of high-strength pipeline steels can efficiently reduce construction and operation costs by increasing the operational pressure and reducing the pipe wall thickness. Failure assessment is an important issue in the design, construction, and maintenance of pipelines. Small circumferential surface cracks with constant depth in welded pipelines are of practical interest. This work provides an engineering estimation procedure based upon the GE/EPRI method to determine the J-integral for thin-walled pipelines with small constant-depth circumferential surface cracks subject to tension and bending loads. The values of the elastic influence functions for the stress intensity factor and the plastic influence functions for the fully plastic J-integral estimation are derived in tabulated form through a series of three-dimensional finite element calculations for different crack geometries and material properties. To check the confidence of the J-estimation solution in practical application, J-integral values obtained from detailed finite element (FE) analyses are compared with those estimated from the new influence functions. Excellent agreement of the FE results with the proposed J-estimation solutions for both tension and bending loads indicates that the new solutions can be applied for accurate structural integrity assessment of high-strength pipelines with constant-depth circumferential surface cracks.
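    In GE/EPRI-type estimation schemes of the kind referenced above, the J-integral is commonly split into an elastic part, obtained from a stress intensity factor via an elastic influence function, and a fully plastic part scaled by a tabulated influence function h1 together with a Ramberg-Osgood hardening law. The sketch below shows that generic split; the influence-function values, geometry and material constants are placeholders, not the tabulated solutions derived in the paper.

```python
import math

def j_estimate(load, limit_load, stress, crack_depth,
               f_elastic, h1_plastic, char_length,
               e_mod, nu, sigma0, alpha, n_hard, plane_strain=True):
    """GE/EPRI-type J estimate: J = Je + Jp (influence functions assumed tabulated)."""
    e_prime = e_mod / (1.0 - nu ** 2) if plane_strain else e_mod
    k = f_elastic * stress * math.sqrt(math.pi * crack_depth)   # elastic stress intensity factor
    j_elastic = k ** 2 / e_prime
    eps0 = sigma0 / e_mod                                       # reference strain
    j_plastic = (alpha * sigma0 * eps0 * char_length * h1_plastic
                 * (load / limit_load) ** (n_hard + 1))
    return j_elastic + j_plastic

if __name__ == "__main__":
    # Placeholder geometry, material and influence-function values (not from the paper).
    j = j_estimate(load=0.8, limit_load=1.0, stress=300e6, crack_depth=0.003,
                   f_elastic=1.1, h1_plastic=2.5, char_length=0.003,
                   e_mod=200e9, nu=0.3, sigma0=500e6, alpha=1.0, n_hard=8)
    print(f"Estimated J ~ {j / 1000:.1f} kJ/m^2")
```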

  4. The results of questionnaire on quantitative assessment of 123I-metaiodobenzylguanidine myocardial scintigraphy in heart failure

    International Nuclear Information System (INIS)

    Nishimura, Tsunehiko; Sugishita, Yasurou; Sasaki, Yasuhito.

    1997-01-01

    This study was carried out by a working group set up in cooperation between the Japanese Society of Nuclear Medicine and the Japanese Circulation Society. We evaluated the usefulness of quantitative assessment of 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy in heart failure from the results of a questionnaire. Forty-nine (72.1%) of 68 selected institutions participated in this study. The rate of use of MIBG myocardial scintigraphy in heart failure was 41.1%. The imaging protocol mostly comprised both planar and SPECT imaging at 15 min and 3.6 hr after intravenous injection of 111 MBq of MIBG. The quantitative assessment was mostly done by heart/mediastinum (H/M) ratio and washout rate analysis based on planar imaging. The mean normal values of the H/M ratio were 2.34±0.36 and 2.49±0.40 for early and delayed images, respectively. The normal value of the washout rate was 27.74±5.34%. The corresponding H/M ratios in heart failure were 1.87±0.27 and 1.75±0.24 for early and delayed images, respectively, and the washout rate was 42.30±6.75%. These parameters were very useful for the evaluation of heart failure. In conclusion, MIBG myocardial scintigraphy was widely used not only for early detection and severity assessment, but also for determining the indication for therapy and evaluating prognosis in heart failure patients. (author)

  5. The results of questionnaire on quantitative assessment of 123I-metaiodobenzylguanidine myocardial scintigraphy in heart failure

    Energy Technology Data Exchange (ETDEWEB)

    Nishimura, Tsunehiko [Osaka Univ., Suita (Japan). Medical school; Sugishita, Yasurou; Sasaki, Yasuhito

    1997-12-01

    This study was carried out by a working group set up in cooperation between the Japanese Society of Nuclear Medicine and the Japanese Circulation Society. We evaluated the usefulness of quantitative assessment of 123I-metaiodobenzylguanidine (MIBG) myocardial scintigraphy in heart failure from the results of a questionnaire. Forty-nine (72.1%) of 68 selected institutions participated in this study. The rate of use of MIBG myocardial scintigraphy in heart failure was 41.1%. The imaging protocol mostly comprised both planar and SPECT imaging at 15 min and 3.6 hr after intravenous injection of 111 MBq of MIBG. The quantitative assessment was mostly done by heart/mediastinum (H/M) ratio and washout rate analysis based on planar imaging. The mean normal values of the H/M ratio were 2.34±0.36 and 2.49±0.40 for early and delayed images, respectively. The normal value of the washout rate was 27.74±5.34%. The corresponding H/M ratios in heart failure were 1.87±0.27 and 1.75±0.24 for early and delayed images, respectively, and the washout rate was 42.30±6.75%. These parameters were very useful for the evaluation of heart failure. In conclusion, MIBG myocardial scintigraphy was widely used not only for early detection and severity assessment, but also for determining the indication for therapy and evaluating prognosis in heart failure patients. (author)
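    The H/M ratio and washout rate above are simple ratios of region-of-interest counts taken from the early and delayed planar images. The sketch below shows one common way of computing them, with mediastinal background subtraction and without decay correction; conventions differ between centres, and the counts used here are hypothetical.

```python
def h_m_ratio(heart_counts_per_pixel, mediastinum_counts_per_pixel):
    """Heart-to-mediastinum ratio from planar counts per pixel."""
    return heart_counts_per_pixel / mediastinum_counts_per_pixel

def washout_rate(early_heart, early_med, delayed_heart, delayed_med):
    """Percent washout between early and delayed planar images
    (mediastinal background subtracted; decay correction omitted for simplicity)."""
    early = early_heart - early_med
    delayed = delayed_heart - delayed_med
    return (early - delayed) / early * 100.0

if __name__ == "__main__":
    # Hypothetical counts per pixel from early (15 min) and delayed planar images.
    print(f"Early H/M:   {h_m_ratio(140.0, 60.0):.2f}")
    print(f"Delayed H/M: {h_m_ratio(100.0, 50.0):.2f}")
    print(f"Washout:     {washout_rate(140.0, 60.0, 100.0, 50.0):.1f}%")
```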

  6. Using Probabilistic Risk Assessment to Model Medication System Failures in Long-Term Care Facilities

    National Research Council Canada - National Science Library

    Comden, Sharon C; Marx, David; Murphy-Carley, Margaret; Hale, Misti

    2005-01-01

    .... Discussion: The models provide contextual maps of the errors and behaviors that lead to medication delivery system failures, including unanticipated risks associated with regulatory practices and common...

  7. Serial Sonographic Assessment of Pulmonary Edema in Patients With Hypertensive Acute Heart Failure.

    Science.gov (United States)

    Martindale, Jennifer L; Secko, Michael; Kilpatrick, John F; deSouza, Ian S; Paladino, Lorenzo; Aherne, Andrew; Mehta, Ninfa; Conigiliaro, Alyssa; Sinert, Richard

    2018-02-01

    Objective measures of clinical improvement in patients with acute heart failure (AHF) are lacking. The aim of this study was to determine whether repeated lung sonography could semiquantitatively capture changes in pulmonary edema (B-lines) in patients with hypertensive AHF early in the course of treatment. We conducted a feasibility study in a cohort of adults with acute onset of dyspnea, severe hypertension in the field or at triage (systolic blood pressure ≥ 180 mm Hg), and a presumptive diagnosis of AHF. Patients underwent repeated dyspnea and lung sonographic assessments using a 10-cm visual analog scale (VAS) and an 8-zone scanning protocol. Lung sonographic assessments were performed at the time of triage, initial VAS improvement, and disposition from the emergency department. Sonographic pulmonary edema was independently scored offline in a randomized and blinded fashion by using a scoring method that accounted for both the sum of discrete B-lines and degree of B-line fusion. Sonographic pulmonary edema scores decreased significantly from initial to final sonographic assessments (P < .001). The median percentage decrease among the 20 included patient encounters was 81% (interquartile range, 55%-91%). Although sonographic pulmonary edema scores correlated with VAS scores (ρ = 0.64; P < .001), the magnitude of the change in these scores did not correlate with each other (ρ = -0.04; P = .89). Changes in sonographic pulmonary edema can be semiquantitatively measured by serial 8-zone lung sonography using a scoring method that accounts for B-line fusion. Sonographic pulmonary edema improves in patients with hypertensive AHF during the initial hours of treatment. © 2017 by the American Institute of Ultrasound in Medicine.

  8. Failure Modes Taxonomy for Reliability Assessment of Digital Instrumentation and Control Systems for Probabilistic Risk Analysis - Failure modes taxonomy for reliability assessment of digital I and C systems for PRA

    International Nuclear Information System (INIS)

    Amri, A.; Blundell, N.; ); Authen, S.; Betancourt, L.; Coyne, K.; Halverson, D.; Li, M.; Taylor, G.; Bjoerkman, K.; Brinkman, H.; Postma, W.; Bruneliere, H.; Chirila, M.; Gheorge, R.; Chu, L.; Yue, M.; Delache, J.; Georgescu, G.; Deleuze, G.; Quatrain, R.; Thuy, N.; Holmberg, J.-E.; Kim, M.C.; Kondo, K.; Mancini, F.; Piljugin, E.; Stiller, J.; Sedlak, J.; Smidts, C.; Sopira, V.

    2015-01-01

    Digital protection and control systems appear as upgrades in older nuclear power plants (NPP), and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital systems, there is a need to quantitatively assess the reliability of the digital systems in a justifiable manner. Due to the many unique attributes of digital systems (e.g., functions are implemented by software, units of the system interact in a communication network, faults can be identified and handled online), a number of modelling and data collection challenges exist, and international consensus on the reliability modelling has not yet been reached. The objective of the task group called DIGREL has been to develop a taxonomy of failure modes of digital components for the purposes of probabilistic risk analysis (PRA). An activity focused on the development of a common taxonomy of failure modes is seen as an important step towards standardised digital instrumentation and control (I and C) reliability assessment techniques for PRA. Needs from PRA have guided the work, meaning, e.g., that the I and C system and its failures are studied from the point of view of their functional significance. The taxonomy will be the basis of future modelling and quantification efforts. It will also help to define a structure for data collection and to review PRA studies. The proposed failure modes taxonomy has been developed by first collecting examples of taxonomies provided by the task group organisations. This material showed some variety in the handling of I and C hardware failure modes, depending on the context where the failure modes have been defined. Regarding the software part of I and C, failure modes defined in NPP PRAs have been simple - typically a software CCF failing identical processing units. The DIGREL task group has defined a new failure modes taxonomy based on a hierarchical definition of five levels of abstraction: 1. system level (complete

  9. Non-invasive assessment of distribution volume ratios and binding potential: tissue heterogeneity and interindividually averaged time-activity curves

    Energy Technology Data Exchange (ETDEWEB)

    Reimold, M.; Mueller-Schauenburg, W.; Dohmen, B.M.; Bares, R. [Department of Nuclear Medicine, University of Tuebingen, Otfried-Mueller-Strasse 14, 72076, Tuebingen (Germany); Becker, G.A. [Nuclear Medicine, University of Leipzig, Leipzig (Germany); Reischl, G. [Radiopharmacy, University of Tuebingen, Tuebingen (Germany)

    2004-04-01

    Due to the stochastic nature of radioactive decay, any measurement of radioactivity concentration requires spatial averaging. In pharmacokinetic analysis of time-activity curves (TAC), such averaging over heterogeneous tissues may introduce a systematic error (heterogeneity error) but may also improve the accuracy and precision of parameter estimation. In addition to spatial averaging (inevitable due to limited scanner resolution and intended in ROI analysis), interindividual averaging may theoretically be beneficial, too. The aim of this study was to investigate the effect of such averaging on the binding potential (BP) calculated with Logan's non-invasive graphical analysis and the "simplified reference tissue method" (SRTM) proposed by Lammertsma and Hume, on the basis of simulated and measured positron emission tomography data: [11C]d-threo-methylphenidate (dMP) and [11C]raclopride (RAC) PET. dMP was not quantified with SRTM since the low k2 (washout rate constant from the first tissue compartment) introduced a high noise sensitivity. Even for considerably different shapes of TAC (dMP PET in parkinsonian patients and healthy controls, [11C]raclopride in patients with and without haloperidol medication) and a high variance in the rate constants (e.g. simulated standard deviation of K1 = 25%), the BP obtained from average TAC was close to the mean BP (<5%). However, unfavourably distributed parameters, especially a correlated large variance in two or more parameters, may lead to larger errors. In Monte Carlo simulations, interindividual averaging before quantification reduced the variance from the SRTM (beyond a critical signal to noise ratio) and the bias in Logan's method. Interindividual averaging may further increase accuracy when there is an error term in the reference tissue assumption E = DV2 - DV' (DV2 = distribution volume of the first tissue compartment, DV' ...
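    Logan's non-invasive graphical analysis, referred to above, regresses the running integral of the target-region TAC (normalised by the instantaneous target activity) against the corresponding reference-region term; once the plot becomes linear, the slope estimates the distribution volume ratio (DVR), and BP = DVR − 1. The sketch below illustrates this with synthetic TACs and an assumed reference-region efflux constant k2'; it is not the authors' implementation.

```python
import numpy as np

def cumtrapz(y, t):
    """Cumulative trapezoidal integral of y(t), same length as y (starts at 0)."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))
    return out

def logan_ref_bp(t, c_roi, c_ref, k2_ref, t_star):
    """Non-invasive Logan graphical analysis with a reference region.
    Returns DVR and BP = DVR - 1; k2_ref (reference-region efflux) is assumed known."""
    x = (cumtrapz(c_ref, t) + c_ref / k2_ref) / c_roi
    y = cumtrapz(c_roi, t) / c_roi
    use = t >= t_star                       # linear portion of the plot
    slope, _ = np.polyfit(x[use], y[use], 1)
    return slope, slope - 1.0

if __name__ == "__main__":
    # Synthetic time-activity curves (arbitrary units) for illustration only.
    t = np.linspace(1, 90, 60)                         # minutes
    c_ref = 10.0 * (np.exp(-0.05 * t) - np.exp(-0.5 * t))
    c_roi = 18.0 * (np.exp(-0.03 * t) - np.exp(-0.4 * t))
    dvr, bp = logan_ref_bp(t, c_roi, c_ref, k2_ref=0.15, t_star=30.0)
    print(f"DVR = {dvr:.2f}, BP = {bp:.2f}")
```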

  10. Application of a few orthogonal polynomials to the assessment of the fracture failure probability of a spherical tank

    International Nuclear Information System (INIS)

    Cao Tianjie; Zhou Zegong

    1993-01-01

    This paper presents some methods to assess the fracture failure probability of a spherical tank. These methods convert the assessment of the fracture failure probability into the calculation of the moments of cracks and a one-dimensional integral. In the paper, we first derive series formulae to calculate the moments of cracks under crack fatigue growth and the moments of crack opening displacements according to the JWES-2805 code. We then use the first n moments of crack opening displacements and a few orthogonal polynomials to compose the probability density function of the crack opening displacement. Lastly, the fracture failure probability is obtained according to interference theory. An example shows that these methods are simpler, quicker, and more accurate. At the same time, these methods avoid the disadvantage of Edgeworth's series method. (author)

  11. Flexible meta-regression to assess the shape of the benzene-leukemia exposure-response curve.

    NARCIS (Netherlands)

    Vlaanderen, J.J.; Portengen, L.; Rothman, N.; Lan, Q.; Kromhout, H.; Vermeulen, R.

    2010-01-01

    BACKGROUND: Previous evaluations of the shape of the benzene-leukemia exposure-response curve (ERC) were based on a single set or on small sets of human occupational studies. Integrating evidence from all available studies that are of sufficient quality combined with flexible meta-regression models

  12. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    Science.gov (United States)

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases in most cases is overcome by combining the output of urban drainage models and damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become a standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancements, pushed forward by increasing computer capacity. The details of the flooding propagation process on the surface and the details of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly connected to data availability; this remains the main bottleneck in expected flooding damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly
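    The core of such a damage assessment is the coupling of simulated flood depths with a depth-damage curve: the curve returns a damage fraction for each depth, which is multiplied by the exposed value. The sketch below shows that step with an entirely hypothetical curve and property values; real curves are locally calibrated and carry the uncertainty discussed above.

```python
import numpy as np

# Hypothetical depth-damage curve: flood depth (m) vs. fraction of property value lost.
depth_pts = np.array([0.0, 0.1, 0.5, 1.0, 2.0, 3.0])
damage_fraction = np.array([0.0, 0.05, 0.25, 0.45, 0.70, 0.85])

def expected_damage(simulated_depths_m, exposed_values):
    """Combine urban-drainage-model flood depths with a depth-damage curve."""
    fractions = np.interp(simulated_depths_m, depth_pts, damage_fraction)
    return float(np.sum(fractions * exposed_values))

if __name__ == "__main__":
    depths = np.array([0.2, 0.0, 0.8, 1.5])           # flood depth per property (model output)
    values = np.array([180e3, 200e3, 150e3, 220e3])   # exposed value per property (assumed)
    print(f"Expected event damage: {expected_damage(depths, values):,.0f} EUR")
```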

  13. Assessment of a Business-to-Consumer (B2C) model for Telemonitoring patients with Chronic Heart Failure (CHF)

    NARCIS (Netherlands)

    A.S. Grustam (Andrija); Vrijhoef, H.J.M. (Hubertus J. M.); R. Koymans (Ron); Hukal, P. (Philipp); J.L. Severens (Hans)

    2017-01-01

    Background: The purpose of this study is to assess the Business-to-Consumer (B2C) model for telemonitoring patients with Chronic Heart Failure (CHF) by analysing the value it creates, both for organizations or ventures that provide telemonitoring services based on it, and for society.

  14. Assessing the Value-Added by the Environmental Testing Process with the Aide of Physics/Engineering of Failure Evaluations

    Science.gov (United States)

    Cornford, S.; Gibbel, M.

    1997-01-01

    NASA's Code QT Test Effectiveness Program is funding a series of applied research activities focused on utilizing the principles of physics and engineering of failure and those of engineering economics to assess and improve the value-added by the various validation and verification activities to organizations.

  15. Pertussis toxin treatment of whole blood. A novel approach to assess G protein function in congestive heart failure

    NARCIS (Netherlands)

    Maisel, A. S.; Michel, M. C.; Insel, P. A.; Ennis, C.; Ziegler, M. G.; Phillips, C.

    1990-01-01

    This study was designed to assess G protein function in mononuclear leukocytes (MNL) of patients with congestive heart failure (CHF). MNL membranes were ADP-ribosylated in vitro in the presence of pertussis or cholera toxin. The amount of pertussis toxin substrates did not differ significantly

  16. Traditional and new composite endpoints in heart failure clinical trials : facilitating comprehensive efficacy assessments and improving trial efficiency

    NARCIS (Netherlands)

    Anker, Stefan D.; Schroeder, Stefan; Atar, Dan; Bax, Jeroen J.; Ceconi, Claudio; Cowie, Martin R.; Crisp, Adam; Dominjon, Fabienne; Ford, Ian; Ghofrani, Hossein-Ardeschir; Gropper, Savion; Hindricks, Gerhard; Hlatky, Mark A.; Holcomb, Richard; Honarpour, Narimon; Jukema, J. Wouter; Kim, Albert M.; Kunz, Michael; Lefkowitz, Martin; Le Floch, Chantal; Landmesser, Ulf; McDonagh, Theresa A.; McMurray, John J.; Merkely, Bela; Packer, Milton; Prasad, Krishna; Revkin, James; Rosano, Giuseppe M. C.; Somaratne, Ransi; Stough, Wendy Gattis; Voors, Adriaan A.; Ruschitzka, Frank

    Composite endpoints are commonly used as the primary measure of efficacy in heart failure clinical trials to assess the overall treatment effect and to increase the efficiency of trials. Clinical trials still must enrol large numbers of patients to accrue a sufficient number of outcome events and

  17. Clinical evaluation of new methods for the assessment of heart failure

    NARCIS (Netherlands)

    J.A.M. Wijbenga (Anke)

    1999-01-01

    Although every physician seems to know the term "heart failure", there is no general agreement on its definition. Due to the complex nature of heart failure and the changing insights into its pathophysiology over time, many different definitions exist. Some focus on clinical

  18. Direct and indirect assessment of skeletal muscle blood flow in chronic congestive heart failure

    International Nuclear Information System (INIS)

    LeJemtel, T.H.; Scortichini, D.; Katz, S.

    1988-01-01

    In patients with chronic congestive heart failure (CHF), skeletal muscle blood flow can be measured directly by the continuous thermodilution technique and by the xenon-133 clearance method. The continuous thermodilution technique requires retrograde catheterization of the femoral vein and, thus, cannot be repeated conveniently in patients during evaluation of pharmacologic interventions. The xenon-133 clearance, which requires only an intramuscular injection, allows repeated determination of skeletal muscle blood flow. In patients with severe CHF, a fixed capacity of the skeletal muscle vasculature to dilate appears to limit maximal exercise performance. Moreover, the changes in peak skeletal muscle blood flow noted during long-term administration of captopril, an angiotensin-converting enzyme inhibitor, appear to correlate with the changes in aerobic capacity. In patients with CHF, resting supine deep femoral vein oxygen content can be used as an indirect measurement of resting skeletal muscle blood flow. The absence of a steady state complicates the determination of peak skeletal muscle blood flow reached during graded bicycle or treadmill exercise in patients with chronic CHF. Indirect assessments of skeletal muscle blood flow and metabolism during exercise performed at submaximal work loads are currently being developed in patients with chronic CHF.

  19. On the major ductile fracture methodologies for failure assessment of nuclear reactor components

    International Nuclear Information System (INIS)

    Cruz, Julio R.B.; Andrade, Arnaldo H.P. de; Landes, John D.

    1996-01-01

    In structures like nuclear reactor components there is a special concern with the loads that may occur under postulated accident conditions. These loads can cause the stresses to go well beyond the linear elastic limits, requiring the use of ductile fracture mechanics methods to predict the structural behavior. Since the use of numerical methods to apply EPFM concepts is expensive and time consuming, the existence of analytical engineering procedures is of great relevance. The lack of precision in detail, as compared with numerical nonlinear analyses, is compensated by the possibility of quick failure assessments. This is a determinant factor in situations where a systematic evaluation of a large range of geometries and loading conditions is necessary, as in the application of the Leak-Before-Break (LBB) concept to nuclear piping. This paper outlines four ductile fracture analytical methods, pointing out the positive and negative aspects of each one. The objective is to take advantage of this critical review to conceive a new methodology, one that would gather the strong points of the major existing methods and try to eliminate some of their drawbacks. (author)

  20. Performance of Noninvasive Assessment in the Diagnosis of Right Heart Failure After Left Ventricular Assist Device.

    Science.gov (United States)

    Joly, Joanna M; El-Dabh, Ashraf; Marshell, Ramey; Chatterjee, Arka; Smith, Michelle G; Tresler, Margaret; Kirklin, James K; Acharya, Deepak; Rajapreyar, Indranee N; Tallaj, José A; Pamboukian, Salpy V

    2018-06-01

    Right heart failure (RHF) after left ventricular assist device (LVAD) is associated with poor outcomes. Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) defines RHF as elevated right atrial pressure (RAP) plus venous congestion. The purpose of this study was to examine the diagnostic performance of the noninvasive INTERMACS criteria using RAP as the gold standard. We analyzed 108 patients with LVAD who underwent 341 right heart catheterizations (RHC) between January 1, 2006, and December 31, 2013. Physical exam, echocardiography, and laboratory data at the time of RHC were collected. Conventional two-by-two tables were used and missing data were excluded. The noninvasive INTERMACS definition of RHF is 32% sensitive (95% CI, 0.21-0.44) and 97% specific (95% CI, 0.95-0.99) for identifying elevated RAP. Clinical assessment failed to identify two-thirds of LVAD patients with RAP > 16 mm Hg. More than half of patients with elevated RAP did not have venous congestion, which may represent a physiologic opportunity to mitigate the progression of disease before end-organ damage occurs. One-quarter of patients who met the noninvasive definition of RHF did not actually have elevated RAP, potentially exposing patients to unnecessary therapies. In practice, if any component of the INTERMACS definition is present or equivocal, our data suggest RHC is warranted to establish the diagnosis.
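
    The sensitivity and specificity quoted above come from conventional two-by-two tables against the RAP gold standard. A minimal sketch of that calculation, using invented counts rather than the study data, might look like the following.

        # Sensitivity/specificity from a 2x2 table with simple Wald confidence
        # intervals. Counts below are made up for illustration only.
        import math

        tp, fn = 16, 34   # noninvasive RHF criteria met among elevated-RAP cases
        tn, fp = 281, 10  # criteria not met among normal-RAP cases

        def proportion_ci(successes, total, z=1.96):
            p = successes / total
            half = z * math.sqrt(p * (1 - p) / total)
            return p, max(0.0, p - half), min(1.0, p + half)

        sens, s_lo, s_hi = proportion_ci(tp, tp + fn)
        spec, c_lo, c_hi = proportion_ci(tn, tn + fp)
        print(f"sensitivity {sens:.2f} (95% CI {s_lo:.2f}-{s_hi:.2f})")
        print(f"specificity {spec:.2f} (95% CI {c_lo:.2f}-{c_hi:.2f})")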

  1. Analysis of multiple failure accident scenarios for development of probabilistic safety assessment model for KALIMER-600

    International Nuclear Information System (INIS)

    Kim, T.W.; Suk, S.D.; Chang, W.P.; Kwon, Y.M.; Jeong, H.Y.; Lee, Y.B.; Ha, K.S.; Kim, S.J.

    2009-01-01

    A sodium-cooled fast reactor (SFR), KALIMER-600, is under development at KAERI. Its fuel is the metal fuel of U-TRU-Zr and it uses sodium as coolant. Its advantages are found in the aspects of an excellent uranium resource utilization, inherent safety features, and nonproliferation. The probabilistic safety assessment (PSA) will be one of the initiating subjects for designing it from the aspects of a risk informed design (RID) as well as a technology-neutral licensing (TNL). The core damage is defined as coolant voiding, fuel melting, or cladding damage. Accident scenarios which lead to the core damage should be identified for the development of a Level-1 PSA model. The SSC-K computer code is used to identify the conditions which lead to core damage. KALIMER-600 has passive safety features such as passive shutdown functions, passive pump coast-down features, and passive decay heat removal systems. It has inherent reactivity feedback effects such as Doppler, sodium void, core axial expansion, control rod axial expansion, core radial expansion, etc. The accidents which are analyzed are the multiple failure accidents such as an unprotected transient overpower, a loss of flow, and a loss of heat sink events with degraded safety systems or functions. The safety functions to be considered here are a reactor trip, inherent reactivity feedback features, the pump coast-down, and the passive decay heat removal. (author)

  2. Serviceability Assessment for Cascading Failures in Water Distribution Network under Seismic Scenario

    Directory of Open Access Journals (Sweden)

    Qing Shuang

    2016-01-01

    The stability of water service is a key concern in industrial production, public safety, and academic research. The paper establishes a service evaluation model for the water distribution network (WDN). The serviceability is measured in three aspects: (1) the functionality of structural components under a disaster environment; (2) the recognition of the cascading failure process; and (3) the calculation of system reliability. The node and edge failures in the WDN are interrelated under seismic excitations. The cascading failure process accounts for the balance of water supply and demand. The matrix-based system reliability (MSR) method is used to represent the system events and calculate the non-failure probability. An example is used to illustrate the proposed method. The cascading failure processes under different node failures are simulated, the serviceability is analyzed, and the critical node is identified. The results show that an aged network has a greater influence on system service under a seismic scenario, and that maintenance can improve the disaster resistance of the WDN. Priority should be given to controlling the time between the initial failure and the first secondary failure, because taking post-disaster emergency measures within this period can largely cut down the spread of the cascade effect in the whole WDN.
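
    The serviceability measure described above depends on whether demand nodes can still be supplied once components fail and the failure cascades. As a much simplified stand-in for the paper's matrix-based system reliability and cascading-failure model, the sketch below only checks post-failure connectivity from the source to each demand node on a toy network (layout and demands are invented).

        # Toy connectivity-based serviceability check for a water distribution
        # network after node failures. This simplifies the paper's MSR and
        # cascading-failure model: it only tests whether demand nodes remain
        # connected to the source. Network layout and demands are made up.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([("source", "A"), ("A", "B"), ("B", "C"),
                          ("A", "D"), ("D", "C"), ("C", "E")])
        demand = {"B": 20.0, "C": 30.0, "E": 10.0}  # nodal demands (L/s), assumed

        def serviceability(graph, failed_nodes):
            g = graph.copy()
            g.remove_nodes_from(failed_nodes)
            served = sum(q for node, q in demand.items()
                         if node in g and nx.has_path(g, "source", node))
            return served / sum(demand.values())

        print("no failure:   ", serviceability(G, []))
        print("node D fails: ", serviceability(G, ["D"]))
        print("node A fails: ", serviceability(G, ["A"]))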

  3. Critical Assessment of Effective Interfacial Potentials Based on a Density Functional Theory for Wetting Phenomena on Curved Substrates

    Czech Academy of Sciences Publication Activity Database

    Nold, A.; Malijevský, Alexandr; Kalliadasis, S.

    2011-01-01

    Roč. 197, č. 1 (2011), s. 185-191 ISSN 1951-6355 R&D Projects: GA AV ČR IAA400720710 Grant - others:EPSRC(GB) EP/E046029; FP7 ITN(XE) 214919; ERC(XE) 247301 Institutional research plan: CEZ:AV0Z40720504 Keywords : wetting phenomena * curved substrates * theory Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 1.562, year: 2011

  4. Enucleation ratio efficacy might be a better predictor to assess learning curve of holmium laser enucleation of the prostate

    Directory of Open Access Journals (Sweden)

    Chang Wook Jeong

    2012-06-01

    PURPOSE: To appraise the evaluation methods for the learning curve and to analyze the non-mentor-aided learning curve and early complications following holmium laser enucleation of the prostate. MATERIALS AND METHODS: One hundred and forty (n=140) consecutive patients who underwent HoLEP from July 2008 to July 2010 by a single surgeon (SJO) were enrolled. Perioperative clinical variables, including enucleation time, morcellation time, enucleation ratio (enucleation weight/transitional zone volume), enucleation efficacy (enucleated weight/enucleation time), enucleation ratio efficacy (enucleation ratio/enucleation time), and early complication rate were analyzed. RESULTS: Mean prostate volume was 62.7 mL (range 21-162) and preoperative International Prostate Symptom Score (IPSS) was 19.0 (4-35). Mean enucleation time and morcellation time were 49.9±23.8 (S.D.) min and 11.0±9.7 min, respectively. Median duration of postoperative indwelling catheter was 1 (1-7) day and median hospital stay was 1 (1-6) day. There were a total of 31 surgery-related complications in 27 patients (19.3%), and all were manageable. There was an increasing trend of enucleation efficacy in the first 50 cases. However, enucleation efficacy was linearly correlated with the prostate size (correlation coefficient R=0.701, p<0.001). But, enucleation ratio efficacy could eliminate the confounding effect of the prostate size (R=-0.101, p=0.233). The plateau of enucleation ratio efficacy was reached around the twenty-fifth case. CONCLUSIONS: Our results demonstrated that the operative learning curve plateau is reached after about 25 cases. We propose that a more appropriate parameter for estimating the operative learning curve is enucleation ratio efficacy, rather than enucleation efficacy.

  5. Fuzzy-logic assessment of failure hazard in pipelines due to mining activity

    Directory of Open Access Journals (Sweden)

    A. A. Malinowska

    2015-11-01

    The present research is aimed at a critical analysis of a method presently used for evaluating failure hazard in linear objects in mining areas. A fuzzy model of failure hazard of a linear object was created on the basis of the experience gathered so far. The rules of a Mamdani fuzzy model have been used in the analyses. Finally, the scaled model was integrated with a Geographic Information System (GIS), which was used to evaluate failure hazard in a water pipeline in a mining area.

  6. Disadvantages of using the area under the receiver operating characteristic curve to assess imaging tests: A discussion and proposal for an alternative approach

    International Nuclear Information System (INIS)

    Halligan, Steve; Altman, Douglas G.; Mallett, Susan

    2015-01-01

    The objectives are to describe the disadvantages of the area under the receiver operating characteristic curve (ROC AUC) to measure diagnostic test performance and to propose an alternative based on net benefit. We use a narrative review supplemented by data from a study of computer-assisted detection for CT colonography. We identified problems with ROC AUC. Confidence scoring by readers was highly non-normal, and score distribution was bimodal. Consequently, ROC curves were highly extrapolated with AUC mostly dependent on areas without patient data. AUC depended on the method used for curve fitting. ROC AUC does not account for prevalence or different misclassification costs arising from false-negative and false-positive diagnoses. Change in ROC AUC has little direct clinical meaning for clinicians. An alternative analysis based on net benefit is proposed, based on the change in sensitivity and specificity at clinically relevant thresholds. Net benefit incorporates estimates of prevalence and misclassification costs, and it is clinically interpretable since it reflects changes in correct and incorrect diagnoses when a new diagnostic test is introduced. ROC AUC is most useful in the early stages of test assessment whereas methods based on net benefit are more useful to assess radiological tests where the clinical context is known. Net benefit is more useful for assessing clinical impact. (orig.)
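
    The net-benefit alternative proposed above weighs true positives against false positives at a clinically chosen threshold probability. A common decision-curve formulation is NB = TP/n - (FP/n) * p_t/(1 - p_t); the sketch below applies it to invented counts, since the CT colonography data are not reproduced here.

        # Net benefit at a clinically relevant threshold probability p_t
        # (decision-curve style): NB = TP/n - (FP/n) * p_t / (1 - p_t).
        # Counts are illustrative, not from the cited study.
        def net_benefit(tp: int, fp: int, n: int, p_t: float) -> float:
            return tp / n - (fp / n) * (p_t / (1.0 - p_t))

        n = 1000
        without_cad = net_benefit(tp=60, fp=80, n=n, p_t=0.10)
        with_cad = net_benefit(tp=72, fp=110, n=n, p_t=0.10)
        print(f"net benefit without CAD: {without_cad:.4f}")
        print(f"net benefit with CAD:    {with_cad:.4f}")
        print(f"change in net benefit:   {with_cad - without_cad:+.4f}")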

  7. Assessment of cardiac sympathetic nerve activity in children with chronic heart failure using quantitative iodine-123 metaiodobenzylguanidine imaging

    International Nuclear Information System (INIS)

    Karasawa, Kensuke; Ayusawa, Mamoru; Noto, Nobutaka; Sumitomo, Naokata; Okada, Tomoo; Harada, Kensuke

    2000-01-01

    Cardiac sympathetic nerve activity in children with chronic heart failure was examined by quantitative iodine-123 metaiodobenzylguanidine (MIBG) myocardial imaging in 33 patients aged 7.5±6.1 years (range 0-18 years), including 8 with cardiomyopathy, 15 with congenital heart disease, 3 with anthracycline cardiotoxicity, 3 with myocarditis, 3 with primary pulmonary hypertension and 1 with Pompe's disease. Anterior planar images were obtained 15 min and 3 hr after the injection of iodine-123 MIBG. The cardiac iodine-123 MIBG uptake was assessed as the heart to upper mediastinum uptake activity ratio of the delayed image (H/M) and the cardiac percentage washout rate (%WR). The severity of chronic heart failure was class I (no medication) in 8 patients, class II (no symptom with medication) in 9, class III (symptom even with medication) in 10 and class IV (late cardiac death) in 6. H/M was 2.33±0.22 in chronic heart failure class I, 2.50±0.34 in class II, 1.95±0.61 in class III, and 1.39±0.29 in class IV (p<0.05). %WR was 24.8±12.8% in chronic heart failure class I, 23.3±10.2% in class II, 49.2±24.5% in class III, and 66.3±26.5% in class IV (p<0.05). The low H/M and high %WR were proportionate to the severity of chronic heart failure. Cardiac iodine-123 MIBG showed cardiac adrenergic neuronal dysfunction in children with severe chronic heart failure. Quantitative iodine-123 MIBG myocardial imaging is clinically useful as a predictor of therapeutic outcome and mortality in children with chronic heart failure. (author)

  8. Assessment of cardiac sympathetic nerve activity in children with chronic heart failure using quantitative iodine-123 metaiodobenzylguanidine imaging

    Energy Technology Data Exchange (ETDEWEB)

    Karasawa, Kensuke; Ayusawa, Mamoru; Noto, Nobutaka; Sumitomo, Naokata; Okada, Tomoo; Harada, Kensuke [Nihon Univ., Tokyo (Japan). School of Medicine

    2000-12-01

    Cardiac sympathetic nerve activity in children with chronic heart failure was examined by quantitative iodine-123 metaiodobenzylguanidine (MIBG) myocardial imaging in 33 patients aged 7.5±6.1 years (range 0-18 years), including 8 with cardiomyopathy, 15 with congenital heart disease, 3 with anthracycline cardiotoxicity, 3 with myocarditis, 3 with primary pulmonary hypertension and 1 with Pompe's disease. Anterior planar images were obtained 15 min and 3 hr after the injection of iodine-123 MIBG. The cardiac iodine-123 MIBG uptake was assessed as the heart to upper mediastinum uptake activity ratio of the delayed image (H/M) and the cardiac percentage washout rate (%WR). The severity of chronic heart failure was class I (no medication) in 8 patients, class II (no symptom with medication) in 9, class III (symptom even with medication) in 10 and class IV (late cardiac death) in 6. H/M was 2.33±0.22 in chronic heart failure class I, 2.50±0.34 in class II, 1.95±0.61 in class III, and 1.39±0.29 in class IV (p<0.05). %WR was 24.8±12.8% in chronic heart failure class I, 23.3±10.2% in class II, 49.2±24.5% in class III, and 66.3±26.5% in class IV (p<0.05). The low H/M and high %WR were proportionate to the severity of chronic heart failure. Cardiac iodine-123 MIBG showed cardiac adrenergic neuronal dysfunction in children with severe chronic heart failure. Quantitative iodine-123 MIBG myocardial imaging is clinically useful as a predictor of therapeutic outcome and mortality in children with chronic heart failure. (author)

  9. Multinational Assessment of Accuracy of Equations for Predicting Risk of Kidney Failure: A Meta-analysis

    NARCIS (Netherlands)

    Tangri, N.; Grams, M.E.; Levey, A.S.; Coresh, J.; Appel, L.J.; Astor, B.C.; Chodick, G.; Collins, A.J.; Djurdjev, O.; Elley, C.R.; Evans, M.; Garg, A.X.; Hallan, S.I.; Inker, L.A.; Ito, S.; Jee, S.H.; Kovesdy, C.P.; Kronenberg, F.; Heerspink, H.J.; Marks, A.; Nadkarni, G.N.; Navaneethan, S.D.; Nelson, R.G.; Titze, S.; Sarnak, M.J.; Stengel, B.; Woodward, M.; Iseki, K.; Wetzels, J.F.M.; et al.,

    2016-01-01

    IMPORTANCE: Identifying patients at risk of chronic kidney disease (CKD) progression may facilitate more optimal nephrology care. Kidney failure risk equations, including such factors as age, sex, estimated glomerular filtration rate, and calcium and phosphate concentrations, were previously

  10. Cohesive laws for assessment of materials failure:Theory, experimental methods and application

    DEFF Research Database (Denmark)

    Sørensen, Bent F.

    -directional laminates that are bonded together by adhesive joints, since thermosetting polymers, unlike metals, cannot be welded together. The structure can therefore fail by: - delamination (cracking along interfaces between layers inside the laminates) - adhesive joint failure (cracking along the laminate...

  11. Methodology for failure assessment of SMART SG tube with once-through helical-coiled type

    International Nuclear Information System (INIS)

    Kim, Young Jin; Choi, Shin Beom; Cho, Doo Ho; Chang, Yoon Suk

    2010-09-01

    In this research project, the existing integrity evaluation method for a SMART steam generator tube with a crack-like flaw was reviewed to determine a subject analysis model and to investigate the possibility of failure under crack closure behavior. Furthermore, a failure pressure estimation was proposed for SMART steam generator tubes containing wear-type defects. For each subject, the following issues are addressed: 1. Determination of the subject analysis model for a SMART SG tube containing a crack-like flaw; 2. Applicability review of the existing integrity evaluation method and investigation of the failure possibility for a SMART SG tube containing a crack-like flaw; 3. Development of a failure pressure estimation model for a SMART SG tube with a wear-type defect. It is anticipated that if the technologies developed in this study are applied, structural integrity can be estimated accurately.

  12. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection.

    Science.gov (United States)

    Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly

    2017-05-18

    The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (SD) time to complete a manual SOFA score calculation was 61.6 s (33). Among the 24% (12/50) of usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools. ©Christopher Ansel Aakre, Jaben E Kitson, Man Li, Vitaly Herasevich. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 18.05.2017.
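
    For context on what such a calculator automates, the SOFA score sums six organ-system sub-scores, each rated 0-4. The sketch below encodes two of the sub-scores (coagulation and renal) with the commonly cited cut-offs; the thresholds are shown for illustration only and should be verified against the published SOFA definition before any clinical use.

        # Two illustrative SOFA sub-scores (coagulation and renal). Cut-offs are
        # the commonly cited ones but should be checked against the published
        # SOFA definition; the full score also needs respiratory, liver,
        # cardiovascular and CNS components.
        def sofa_coagulation(platelets_k_per_uL: float) -> int:
            for score, cutoff in ((4, 20), (3, 50), (2, 100), (1, 150)):
                if platelets_k_per_uL < cutoff:
                    return score
            return 0

        def sofa_renal(creatinine_mg_dL: float) -> int:
            for score, cutoff in ((4, 5.0), (3, 3.5), (2, 2.0), (1, 1.2)):
                if creatinine_mg_dL >= cutoff:
                    return score
            return 0

        print(sofa_coagulation(85), sofa_renal(2.6))  # -> 2 2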

  13. Using the Human Activity Profile to Assess Functional Performance in Heart Failure.

    Science.gov (United States)

    Ribeiro-Samora, Giane Amorim; Pereira, Danielle Aparecida Gomes; Vieira, Otávia Alves; de Alencar, Maria Clara Noman; Rodrigues, Roseane Santo; Carvalho, Maria Luiza Vieira; Montemezzo, Dayane; Britto, Raquel Rodrigues

    2016-01-01

    To investigate (1) the validity of using the Human Activity Profile (HAP) in patients with heart failure (HF) to estimate functional capacity; (2) the association between the HAP and 6-Minute Walk Test (6MWT) distance; and (3) the ability of the HAP to differentiate between New York Heart Association (NYHA) functional classes. In a cross-sectional study, we evaluated 62 clinically stable patients with HF (mean age, 47.98 years; NYHA class I-III). Variables included maximal functional capacity as measured by peak oxygen uptake (VO2) using a cardiopulmonary exercise test (CPET), peak VO2 as estimated by the HAP, and exercise capacity as measured by the 6MWT. The difference between the measured (CPET) and estimated (HAP) peak VO2 against the average values showed a bias of 2.18 mL/kg/min (P = .007). No agreement was seen between these measures when applying the Bland-Altman method. Peak VO2 in the HAP showed a moderate association with the 6MWT distance (r = 0.62; P < .0001). Peak VO2 in the HAP was able to statistically differentiate NYHA functional classes I, II, and III (P < .05). The estimated peak VO2 using the HAP was not concordant with the gold standard CPET measure. On the contrary, the HAP was able to differentiate NYHA functional class associated with the 6MWT distance; therefore, the HAP is a useful tool for assessing functional performance in patients with HF.

  14. An assessment of underground and aboveground steam system failures in the SRS waste tank farms

    International Nuclear Information System (INIS)

    Hsu, T.C.; Shurrab, M.S.; Wiersma, B.J.

    1997-01-01

    Underground steam system failures in waste tank farms at the Savannah River Site (SRS) increased significantly in the 3-4 year period prior to 1995. The primary safety issues created by the failures were the formation of sub-surface voids in soil, the loss of steam jet transfer and waste evaporation capability, and the loss of heating and ventilation to the tanks. The average annual cost for excavation and repair of the underground steam system was estimated to be several million dollars. These factors prompted engineering personnel to reconsider long-term solutions to the problem. The primary cause of these failures was the inadequate thermal insulation utilized for steam lines associated with older tanks. The failure mechanisms were either pitting or localized general corrosion on the exterior of the pipe beneath the thermal insulation. The most realistic and practical solution is to replace the underground lines by installing aboveground steam systems, although this option will incur significant initial capital costs. Steam system components installed aboveground in other areas of the tank farms have experienced few failures while in continuous use. As a result, piecewise installation of temporary aboveground steam systems has been implemented in F-area whenever opportunities, i.e., failures, present themselves.

  15. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  16. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  17. The role of minimum supply and social vulnerability assessment for governing critical infrastructure failure: current gaps and future agenda

    Directory of Open Access Journals (Sweden)

    M. Garschagen

    2018-04-01

    Increased attention has lately been given to the resilience of critical infrastructure in the context of natural hazards and disasters. The major focus therein is on the sensitivity of critical infrastructure technologies and their management contingencies. However, strikingly little attention has been given to assessing and mitigating social vulnerabilities towards the failure of critical infrastructure and to the development, design and implementation of minimum supply standards in situations of major infrastructure failure. Addressing this gap and contributing to a more integrative perspective on critical infrastructure resilience is the objective of this paper. It asks which role social vulnerability assessments and minimum supply considerations can, should and do – or do not – play for the management and governance of critical infrastructure failure. In its first part, the paper provides a structured review on achievements and remaining gaps in the management of critical infrastructure and the understanding of social vulnerabilities towards disaster-related infrastructure failures. Special attention is given to the current state of minimum supply concepts with a regional focus on policies in Germany and the EU. In its second part, the paper then responds to the identified gaps by developing a heuristic model on the linkages of critical infrastructure management, social vulnerability and minimum supply. This framework helps to inform a vision of a future research agenda, which is presented in the paper's third part. Overall, the analysis suggests that the assessment of socially differentiated vulnerabilities towards critical infrastructure failure needs to be undertaken more stringently to inform the scientifically and politically difficult debate about minimum supply standards and the shared responsibilities for securing them.

  18. The role of minimum supply and social vulnerability assessment for governing critical infrastructure failure: current gaps and future agenda

    Science.gov (United States)

    Garschagen, Matthias; Sandholz, Simone

    2018-04-01

    Increased attention has lately been given to the resilience of critical infrastructure in the context of natural hazards and disasters. The major focus therein is on the sensitivity of critical infrastructure technologies and their management contingencies. However, strikingly little attention has been given to assessing and mitigating social vulnerabilities towards the failure of critical infrastructure and to the development, design and implementation of minimum supply standards in situations of major infrastructure failure. Addressing this gap and contributing to a more integrative perspective on critical infrastructure resilience is the objective of this paper. It asks which role social vulnerability assessments and minimum supply considerations can, should and do - or do not - play for the management and governance of critical infrastructure failure. In its first part, the paper provides a structured review on achievements and remaining gaps in the management of critical infrastructure and the understanding of social vulnerabilities towards disaster-related infrastructure failures. Special attention is given to the current state of minimum supply concepts with a regional focus on policies in Germany and the EU. In its second part, the paper then responds to the identified gaps by developing a heuristic model on the linkages of critical infrastructure management, social vulnerability and minimum supply. This framework helps to inform a vision of a future research agenda, which is presented in the paper's third part. Overall, the analysis suggests that the assessment of socially differentiated vulnerabilities towards critical infrastructure failure needs to be undertaken more stringently to inform the scientifically and politically difficult debate about minimum supply standards and the shared responsibilities for securing them.

  19. Risk assessment of the emergency processes: Healthcare failure mode and effect analysis.

    Science.gov (United States)

    Taleghani, Yasamin Molavi; Rezaei, Fatemeh; Sheikhbardsiri, Hojat

    2016-01-01

    Ensuring patient safety is the first vital step in improving the quality of care, and the emergency ward is known as a high-risk area of health care. The present study was conducted to evaluate selected risk processes of the emergency surgery department of the Qaem treatment-educational centre in Mashhad using Healthcare Failure Mode and Effects Analysis (HFMEA). In this combined study (qualitative action research and quantitative cross-sectional), the failure modes and effects of 5 high-risk procedures of the emergency surgery department were identified and analyzed according to HFMEA. The failure modes were classified with the "nursing errors in clinical management" (NECM) model, the effective causes of error were classified with the Eindhoven model, and improvement strategies were determined with the theory of inventive problem solving. Quantitative data were analyzed with descriptive statistics (total points), and qualitative data through content analysis and agreement among the team members. In the 5 processes selected by a rating-based voting method, 23 steps, 61 sub-processes and 217 potential failure modes were identified by HFMEA; 25 (11.5%) failure modes were detected as high-risk errors and transferred to the decision tree. The most and the fewest failure modes fell in the categories of care errors (54.7%) and knowledge and skill (9.5%), respectively. Also, 29.4% of preventive measures were in the category of human resource management strategy. "Revision and re-engineering of processes", "continuous monitoring of the work", "preparation and revision of operating procedures and policies", "developing criteria for evaluating personnel performance", "designing educational content suited to employee needs", "training patients", "reducing the workload and manpower shortage", "improving team

  10. Assessment of insulin, leptin and vitamin C in chronic renal failure patients before and after haemodialysis

    International Nuclear Information System (INIS)

    Ahmed, A.M.; El-Yamani, N.A.; Youssif, Z.A.; Abdel-Razik, D.E.

    2006-01-01

    The present study was carried out to investigate the relative interaction between insulin, leptin and vitamin C in male patients with chronic renal failure undergoing regular haemodialysis (3 times/week). The study was carried out on 20 healthy volunteers as controls (group I) and 20 patients with chronic renal failure (group II), who were studied before dialysis (A) and after dialysis (B). The serum results showed significant increases in creatinine, insulin and leptin levels in the patient groups as compared to the controls. On the other hand, significant decreases in the levels of glucose and vitamin C were recorded

  1. Motor systems energy efficiency supply curves: A methodology for assessing the energy efficiency potential of industrial motor systems

    International Nuclear Information System (INIS)

    McKane, Aimee; Hasanbeigi, Ali

    2011-01-01

    Motor-driven equipment accounts for approximately 60% of manufacturing final electricity use worldwide. A major barrier to effective policymaking, and to more global acceptance of the energy efficiency potential in industrial motor systems, is the lack of a transparent methodology for quantifying the magnitude and cost-effectiveness of these energy savings. This paper presents the results of groundbreaking analyses conducted for five countries and one region to begin to address this barrier. Using a combination of expert opinion and available data from the United States, Canada, the European Union, Thailand, Vietnam, and Brazil, bottom-up energy efficiency supply curve models were constructed to estimate the cost-effective electricity efficiency potentials and CO2 emission reduction for three types of motor systems (compressed air, pumping, and fan) in industry for the selected countries/region. Based on these analyses, the share of cost-effective electricity saving potential of these systems as compared to the total motor system energy use in the base year varies between 27% and 49% for pumping, 21% and 47% for compressed air, and 14% and 46% for fan systems. The total technical saving potential varies between 43% and 57% for pumping, 29% and 56% for compressed air, and 27% and 46% for fan systems. - Highlights: • Development of conservation supply curves for industrial motor systems. • An innovative approach combining available aggregate country-level data with expert opinion. • Results show both the cost-effective and the technical potential for energy saving and their costs. • Policy implications of the results are briefly discussed.
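
    An energy efficiency supply curve of the kind used in this analysis ranks measures by their cost of conserved energy and accumulates their savings; the cost-effective potential is the cumulative saving of all measures whose cost falls below the energy price. The sketch below builds such a curve from invented measure data using a standard annualization of capital cost (the discount rate, lifetimes, costs and prices are placeholders, not values from the study).

        # Build a simple energy efficiency supply curve: annualize each measure's
        # cost, compute its cost of conserved energy (CCE), sort ascending, and
        # accumulate savings. Measure data below are invented placeholders.
        measures = [  # (name, capital cost [$], lifetime [yr], annual saving [MWh])
            ("pump impeller trimming",    40_000, 15, 1_200),
            ("VSD on fans",              120_000, 12, 1_800),
            ("compressed-air leak repair", 15_000,  5,   700),
            ("high-efficiency motors",     90_000, 20,   900),
        ]
        discount_rate = 0.10          # assumed
        electricity_price = 60.0      # $/MWh, assumed

        def annualized_cost(capital, lifetime, r):
            # capital recovery factor
            return capital * r / (1.0 - (1.0 + r) ** -lifetime)

        curve = sorted(
            ((name, annualized_cost(cost, life, discount_rate) / save, save)
             for name, cost, life, save in measures),
            key=lambda row: row[1])

        cumulative = 0.0
        for name, cce, save in curve:
            cumulative += save
            flag = "cost-effective" if cce <= electricity_price else ""
            print(f"{name:28s} CCE {cce:6.1f} $/MWh  cum. {cumulative:7.0f} MWh {flag}")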

  2. Integrated assessment of energy efficiency technologies and CO_2 abatement cost curves in China’s road passenger car sector

    International Nuclear Information System (INIS)

    Peng, Bin-Bin; Fan, Ying; Xu, Jin-Hua

    2016-01-01

    Highlights: • Energy efficiency technologies in Chinese passenger cars are classified in detail. • CO_2-reduction potential and abatement cost are analyzed for technology bundles. • Marginal abatement cost curve is established from both micro and macro perspectives. • Spark ignition, diesel and hybrid electric vehicle paths should be firstly promoted. • Technology promotion should start from the area of taxies and high-performance cars. - Abstract: Road transport is one of the main sources of energy consumption and CO_2 emissions. It is essential to conserve energy and reduce emissions by promoting energy efficiency technologies (EETs) in this sector. This study first identifies EETs for the passenger cars and then classifies them into various technology bundles. It then analyzes the CO_2-reduction potentials and emissions abatement costs of 55 type-path, 246 type-path-technology, and 465 type-path-subtechnology bundles from micro-vehicular and macro-industrial perspectives during 2010–2030, based on which marginal abatement cost (MAC) curve for China’s road passenger car sector is established. Results show that the cumulative CO_2-reduction potential of EETs on passenger cars in China during 2010–2030 is about 2698.8 Mt, but only 4% is cost-effective. The EETs with low emissions abatement costs are mainly available in the spark ignition (SI), diesel, and hybrid electric vehicle (HEV) paths on the taxis and high-performance cars, and also in the transmission, vehicle body and SI technologies on the private cars, which could be promoted at present. The technologies with large emissions reduction potential are mainly available in the plug-in hybrid electric vehicle (PHEV) and electric vehicle (EV) paths, which would be the main channels for reducing carbon emissions in the long run.

  3. An assessment of the linear damage summation method for creep-fatigue failure with reference to a cast of type 316 stainless steel tested at 570 deg. C

    International Nuclear Information System (INIS)

    Wareing, J.; Bretherton, I.

    This paper presents preliminary results from the programme of hold period tests on cast BQ of type 316 stainless steel at 570 deg. C. The results of tensile hold period tests on a relatively low ductility cast of type 316 stainless steel have indicated that the failure mechanism changes from a creep-fatigue interaction failure to a creep dominated failure at low strain levels. An assessment of the linear damage summation approach for failure prediction indicates that it is inappropriate for creep-fatigue interaction failures. For creep dominated fracture, failure occurs when the accumulated relaxation strain exhausts the material ductility, i.e. N_f·ε_R = D, where N_f is the number of cycles to failure, ε_R the relaxation strain per cycle and D the available ductility. The failure criterion based on a creep damage summation in terms of time to fracture underestimates life.
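
    The two criteria contrasted above can be written compactly: linear damage summation declares failure when the fatigue and creep damage fractions sum to unity, n/N_f + t/t_r = 1, whereas ductility exhaustion declares failure when the accumulated relaxation strain uses up the available creep ductility, N_f·ε_R = D. A small sketch comparing the two endurance predictions for invented material data follows.

        # Compare endurance predictions from (a) linear damage summation,
        # n/N_fatigue + t/t_rupture = 1, and (b) ductility exhaustion,
        # N * eps_relax = ductility. All data are illustrative, not cast BQ values.
        N_fatigue = 10_000       # continuous-cycling endurance (cycles), assumed
        t_rupture = 5_000.0      # creep rupture life at the relaxed stress (h), assumed
        hold_time = 1.0          # tensile hold per cycle (h)
        eps_relax = 4.0e-5       # relaxation strain per cycle, assumed
        ductility = 0.3          # available creep ductility, assumed

        # (a) linear damage summation: N*(1/N_fatigue + hold_time/t_rupture) = 1
        N_lds = 1.0 / (1.0 / N_fatigue + hold_time / t_rupture)

        # (b) ductility exhaustion: N * eps_relax = ductility
        N_de = ductility / eps_relax

        print(f"linear damage summation: {N_lds:,.0f} cycles")
        print(f"ductility exhaustion:    {N_de:,.0f} cycles")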

  4. An assessment of consumers’ subconscious responses to frontline employees’ attractiveness in a service failure and recovery situation

    Directory of Open Access Journals (Sweden)

    Christo Boshoff

    2017-06-01

    Background: Initial analyses of the impact of physical attractiveness in a business context have supported the ‘what is beautiful is good’ contention. However, in circumstances characterised by negative emotions, duress and stress, very little is known about how human beings respond at the subconscious level to the attractiveness of frontline service providers. Aim: The purpose of this study was to assess whether consumers who complain to a frontline service provider about a service failure respond differently at the subconscious level when the service provider involved in the service encounter is attractive compared with one who is less attractive. Method: Forty respondents were exposed to a video clip of a service failure and service recovery situation. While viewing the hypothetical scenario, two neuro-physiological measurements were used to collect data at the subconscious level, namely galvanic skin response (GSR) and electroencephalography (EEG). Results: The results suggest that, at the subconscious level, customers respond differently to the service recovery efforts depending on the attractiveness of the frontline service provider who attempts to rectify the service failure. Conclusion: The results seem to suggest that the physical attractiveness of a frontline service provider moderates (or softens) the negative emotions that a complaining customer might experience during a service failure and complaint situation – consistent with the ‘what is beautiful is good’ contention.

  5. An application of failure mode and effect analysis (FMEA) to assess risks in the petrochemical industry in Iran

    Directory of Open Access Journals (Sweden)

    Mehdi Kangavari

    2015-06-01

    Petrochemical industries have a high rate of accidents. Failure mode and effect analysis (FMEA) is a systematic method capable of analyzing the risks of systems from the concept phase to system disposal, detecting failures at the design stage, and determining control measures and corrective actions to reduce their impacts. The objectives of this research were to perform FMEA to identify risks in an Iranian petrochemical plant and to determine the decrease of the risk priority number (RPN) after implementation of intervention programs. This interventional study was performed at one petrochemical plant in Tehran, Iran, in 2014. Relevant information about job categories and plant processes was gathered using brainstorming techniques, fishbone diagrams, and group decision making. The data were collected through interviews, observation, and document investigations and were recorded in FMEA worksheets. The necessary corrective measures were implemented on the basis of the results of the initial FMEA. Forty-eight failure modes were identified in the welding unit by applying FMEA. Welding processes, especially working at height, received the highest RPN. The RPN for working at height before the corrective actions was 120, and the score was reduced to 96 after corrective measures were performed. The calculated RPN for all processes was significantly reduced (p≤0.001) by implementing the corrective actions. RPN scores in all studied processes decreased effectively after performing corrective actions. The FMEA method is a useful tool for identifying risk intervention priorities and assessing their effectiveness in the studied petrochemical plant.
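
    The risk priority numbers quoted above follow the standard FMEA product RPN = severity × occurrence × detection, with each factor rated on a 1-10 scale and re-scored after corrective action. The sketch below uses invented ratings chosen so that the product reproduces the reported change from 120 to 96 for working at height.

        # Standard FMEA risk priority number: RPN = severity * occurrence * detection
        # (each rated 1-10). Ratings below are invented for illustration.
        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            name: str
            severity: int
            occurrence: int
            detection: int

            @property
            def rpn(self) -> int:
                return self.severity * self.occurrence * self.detection

        before = FailureMode("fall while welding at height", severity=8,
                             occurrence=5, detection=3)
        after = FailureMode("fall while welding at height", severity=8,
                            occurrence=4, detection=3)  # after guard rails/training

        print(f"RPN before corrective action: {before.rpn}")  # 120
        print(f"RPN after corrective action:  {after.rpn}")   # 96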

  6. Using impedance cardiography to assess left ventricular systolic function via postural change in patients with heart failure.

    Science.gov (United States)

    DeMarzo, Arthur P; Calvin, James E; Kelly, Russell F; Stamos, Thomas D

    2005-01-01

    For the diagnosis and management of heart failure, it would be useful to have a simple point-of-care test for assessing ventricular function that could be performed by a nurse. An impedance cardiography (ICG) parameter called systolic amplitude (SA) can serve as an indicator of left ventricular systolic function (LVSF). This study tested the hypothesis that patients with normal LVSF should have a significant increase in SA in response to an increase in end-diastolic volume caused by postural change from sitting upright to supine, while patients with depressed LVSF associated with heart failure should have a minimal increase or a decrease in SA from upright to supine. ICG data were obtained in 12 patients without heart disease and with normal LVSF and 18 patients with clinically diagnosed heart failure. Consistent with the hypothesis, patients with normal LVSF had a significant increase in SA from upright to supine, whereas heart failure patients had a minimal increase or a decrease in SA from upright to supine. This ICG procedure may be useful for monitoring the trend of patient response to titration of beta blockers and other medications. ICG potentially could be used to detect worsening LVSF and provide a means of measurement for adjusting treatment.

  7. Research Article. Characteristics of Sleep Apnea Assessed Before Discharge in Patients Hospitalized with Acute Heart Failure

    Directory of Open Access Journals (Sweden)

    Kocsis Ildikó

    2017-03-01

    Objectives. Evaluation of the characteristics of sleep apnea (SA) in patients hospitalized with acute heart failure, considering that undiagnosed SA could contribute to early rehospitalization. Methods. 56 consecutive patients (13 women, 43 men, mean age 63.12 years) with acute heart failure, in stable condition, underwent nocturnal polygraphy before hospital discharge. The type and severity of SA were determined. Besides descriptive statistics, correlations between the severity of SA and clinical and paraclinical characteristics were also analyzed (t-test, chi-square test, significance at alpha 30/h. The apnea was predominantly obstructive (32 cases vs. 12 with central SA). Comparing the patients with mild or no SA with those with severe SA, we did not find statistically significant correlations (p>0.05) between the severity of SA and the majority of main clinical and paraclinical characteristics - age, sex, BMI, cardiac substrates of heart failure, comorbidities. Paradoxically, arterial hypertension (p=0.028) and atrial fibrillation (p=0.041) were significantly more prevalent in the group with mild or no SA. Conclusions. Before discharge, moderate or severe SA is present in the majority of patients hospitalized with acute heart failure and is not related to the majority of patient-related factors. Finding significant SA in this setting is important, because its therapy could play an important role in preventing readmissions and improving prognosis.

  8. Multinational Assessment of Accuracy of Equations for Predicting Risk of Kidney Failure : A Meta-analysis

    NARCIS (Netherlands)

    Tangri, Navdeep; Grams, Morgan E.; Levey, Andrew S.; Coresh, Josef; Appel, Lawrence J.; Astor, Brad C.; Chodick, Gabriel; Collins, Allan J.; Djurdjev, Ognjenka; Elley, Raina; Evans, Marie; Garg, Amit X.; Hallan, Stein I.; Inker, Lesley A.; Ito, Sadayoshi; Jee, Sun Ha; Kovesdy, Csaba P.; Kronenberg, Florian; Heerspink, Hiddo J. Lambers; Marks, Angharad; Nadkarni, Girish N.; Navaneethan, Sankar D.; Nelson, Robert G.; Titze, Stephanie; Sarnak, Mark J.; Stengel, Benedicte; Woodward, Mark; Iseki, Kunitoshi

    2016-01-01

    IMPORTANCE Identifying patients at risk of chronic kidney disease (CKD) progression may facilitate more optimal nephrology care. Kidney failure risk equations were previously developed and validated in 2 Canadian cohorts. Validation in other regions and in CKD populations not under the care of a

  9. Assessment of the de Hirsch Predictive Index Tests of Reading Failure.

    Science.gov (United States)

    Askov, Warren; And Others

    The predictive validity and the general usability of a battery of 10 tests reported by de Hirsch, Jansky, and Langford, the de Hirsch Predictive Index Tests of reading failure, were examined. The de Hirsch battery was administered to 433 kindergarten children in six public schools. When the pupils entered first grade, the Metropolitan Readiness…

  10. Risk Assessment Planning for Airborne Systems: An Information Assurance Failure Mode, Effects and Criticality Analysis Methodology

    Science.gov (United States)

    2012-06-01

    MasterCard and Visa Investigate Data Breach. New York Times, March 30, 2012. Stamatis, D. (2003). Failure Mode and Effect Analysis: FMEA from Theory to Execution.

  11. Assessing changes in failure probability of dams in a changing climate

    Science.gov (United States)

    Mallakpour, I.; AghaKouchak, A.; Moftakhari, H.; Ragno, E.

    2017-12-01

    Dams are crucial infrastructures and provide resilience against hydrometeorological extremes (e.g., droughts and floods). In 2017, California experienced series of flooding events terminating a 5-year drought, and leading to incidents such as structural failure of Oroville Dam's spillway. Because of large socioeconomic repercussions of such incidents, it is of paramount importance to evaluate dam failure risks associated with projected shifts in the streamflow regime. This becomes even more important as the current procedures for design of hydraulic structures (e.g., dams, bridges, spillways) are based on the so-called stationary assumption. Yet, changes in climate are anticipated to result in changes in statistics of river flow (e.g., more extreme floods) and possibly increasing the failure probability of already aging dams. Here, we examine changes in discharge under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. In this study, we used routed daily streamflow data from ten global climate models (GCMs) in order to investigate possible climate-induced changes in streamflow in northern California. Our results show that while the average flow does not show a significant change, extreme floods are projected to increase in the future. Using the extreme value theory, we estimate changes in the return periods of 50-year and 100-year floods in the current and future climates. Finally, we use the historical and future return periods to quantify changes in failure probability of dams in a warming climate.
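
    Return-period shifts of the kind examined here are typically estimated by fitting a generalized extreme value (GEV) distribution to annual maximum flows for each climate period and comparing return levels. The sketch below uses synthetic data, not the routed GCM streamflow from the study.

        # Fit a GEV to annual maximum flows and compare the 100-year return level
        # between a "historical" and a "future" sample. Data are synthetic.
        from scipy.stats import genextreme

        historical = genextreme.rvs(c=-0.1, loc=800, scale=200, size=60, random_state=1)
        future = genextreme.rvs(c=-0.1, loc=900, scale=260, size=60, random_state=2)

        def return_level(sample, T):
            c, loc, scale = genextreme.fit(sample)
            # flow exceeded with annual probability 1/T
            return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

        q100_hist = return_level(historical, 100)
        q100_fut = return_level(future, 100)
        print(f"100-yr flood, historical: {q100_hist:,.0f} m^3/s")
        print(f"100-yr flood, future:     {q100_fut:,.0f} m^3/s")

        # annual exceedance probability of the historical 100-yr flood in the future
        c, loc, scale = genextreme.fit(future)
        p_exceed = 1.0 - genextreme.cdf(q100_hist, c, loc=loc, scale=scale)
        print(f"future annual exceedance probability of that flood: {p_exceed:.3f}")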

  12. Failure prediction of low-carbon steel pressure vessel and cylindrical models

    International Nuclear Information System (INIS)

    Zhang, K.D.; Wang, W.

    1987-01-01

    The failure loads predicted by failure assessment methods (namely the net-section stress criterion; the EPRI engineering approach for elastic-plastic analysis; the CEGB failure assessment route; the modified R6 curve by Milne for strain hardening; and the failure assessment curve based on J estimation by Ainsworth) have been compared with burst test results on externally axially sharp-notched pressure vessel and open-ended cylinder models made from typical low-carbon steel St45 seamless tube, which has a transverse true stress-strain curve of the straight-line-and-parabola type and a high ratio of ultimate strength to yield strength. It was concluded from the comparison that whilst the net-section stress criterion and the CEGB route did not give conservative predictions, Milne's modified curve gave a conservative and accurate prediction; Ainsworth's curve gave a fairly conservative prediction; and the EPRI solutions could also give a good prediction under certain conditions, although those conditions remain somewhat uncertain. It is suggested that Milne's modified R6 curve be used in the failure assessment of low-carbon steel pressure vessels. (author)
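
    For readers unfamiliar with the assessment routes compared here, a material-independent failure assessment curve of the R6 Option 1 type is commonly written as Kr = (1 - 0.14*Lr^2)*[0.3 + 0.7*exp(-0.65*Lr^6)] up to a material-dependent cut-off Lr,max, and a flawed component is judged acceptable if its assessment point (Lr, Kr) lies below the curve. The sketch below evaluates that generic curve for an invented assessment point; it is illustrative only and is not the specific Milne or Ainsworth modification examined in the paper.

        # Generic R6 Option-1-style failure assessment curve,
        # Kr = (1 - 0.14*Lr**2) * (0.3 + 0.7*exp(-0.65*Lr**6)), cut off at Lr_max.
        # The assessment point is invented; this is not the Milne or Ainsworth
        # variant evaluated in the paper.
        import math

        def fad_curve(Lr: float, Lr_max: float = 1.2) -> float:
            # Lr_max is material dependent (flow stress / yield); 1.2 assumed here
            if Lr > Lr_max:
                return 0.0
            return (1.0 - 0.14 * Lr**2) * (0.3 + 0.7 * math.exp(-0.65 * Lr**6))

        def acceptable(Kr: float, Lr: float) -> bool:
            return Kr <= fad_curve(Lr)

        # hypothetical assessment point for an axially notched vessel
        Lr = 0.85   # reference stress / yield stress
        Kr = 0.55   # stress intensity factor / fracture toughness
        print(f"curve at Lr={Lr}: {fad_curve(Lr):.3f}")
        print("assessment point acceptable:", acceptable(Kr, Lr))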

  13. Failure Impact Assessment for Large-Scale Landslides Located Near Human Settlement: Case Study in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Ming-Chien Chung

    2018-05-01

    In 2009, Typhoon Morakot caused over 680 deaths and more than 20,000 landslides in Taiwan. From 2010 to 2015, the Central Geological Survey of the Ministry of Economic Affairs identified 1047 potential large-scale landslides in Taiwan, of which 103 may have affected human settlements. This paper presents an analytical procedure that can be applied to assess the possible impact of a landslide collapse on nearby settlements. In this paper, existing technologies, including interpretation of remote sensing images, hydrogeological investigation, and numerical analysis, are integrated to evaluate potential failure scenarios and the landslide scale of a specific case: the Xinzhuang landslide. GeoStudio and RAMMS analysis modes and hazard classification produced the following results: (1) evaluation of the failure mechanisms and the influence zones of large-scale landslides; (2) assessment of the migration and accumulation of the landslide mass after failure; and (3) a landslide hazard and evacuation map. The results of the case study show that this analytical procedure can quantitatively estimate potential threats to human settlements. Furthermore, it can be applied to other villages and used as a reference in disaster prevention and evacuation planning.

  14. Assessment of obstructive sleep apnoea treatment success or failure after maxillomandibular advancement.

    Science.gov (United States)

    de Ruiter, M H T; Apperloo, R C; Milstein, D M J; de Lange, J

    2017-11-01

    Maxillomandibular advancement (MMA) is an alternative therapeutic option that is highly effective for treating obstructive sleep apnoea (OSA). MMA provides a solution for OSA patients who have difficulty accepting lifelong treatment with continuous positive airway pressure or mandibular advancement devices. The goal of this study was to investigate the different characteristics that determine OSA treatment success or failure after MMA. The apnoea-hypopnoea index (AHI) was used to determine the success or failure of OSA treatment after MMA. Sixty-two patients underwent MMA for moderate and severe OSA. A 71% success rate was observed, with a mean AHI reduction of 69%. A statistically significantly larger neck circumference was measured in patients with failed OSA treatment following MMA (P=0.008), and patients with failed OSA treatment after MMA were older: 58 vs. 53 years, respectively (P=0.037). Cephalometric analysis revealed no differences between successful and failed OSA treatment outcomes. There was no difference in maxillary and mandibular advancement between patients with successful and failed MMA treatment. The complications most frequently reported following MMA were sensory disturbances of the inferior alveolar nerve (60%) and malocclusion (24%). The results suggest that age and neck girth may be important factors that could predict susceptibility to OSA treatment failure by MMA.

  15. Prospective assessment of the occurrence of anemia in patients with heart failure: results from the Study of Anemia in a Heart Failure Population (STAMINA-HFP) Registry.

    Science.gov (United States)

    Adams, Kirkwood F; Patterson, James H; Patterson, John H; Oren, Ron M; Mehra, Mandeep R; O'Connor, Christopher M; Piña, Ileana L; Miller, Alan B; Chiong, Jun R; Dunlap, Stephanie H; Cotts, William G; Felker, Gary M; Schocken, Douglas D; Schwartz, Todd A; Ghali, Jalal K

    2009-05-01

    Although a potentially important pathophysiologic factor in heart failure, the prevalence and predictors of anemia have not been well studied in unselected patients with heart failure. The Study of Anemia in a Heart Failure Population (STAMINA-HFP) Registry prospectively studied the prevalence of anemia and the relationship of hemoglobin to health-related quality of life and outcomes among patients with heart failure. A random selection algorithm was used to reduce bias during enrollment of patients seen in specialty clinics or clinics of community cardiologists with experience in heart failure. In this initial report, data on prevalence and correlates of anemia were analyzed in 1,076 of the 1,082 registry patients who had clinical characteristics and hemoglobin determined by finger-stick at baseline. Overall (n = 1,082), the registry patients were 41% female and 73% white with a mean age (±SD) of 64 ± 14 years (68 ± 13 years in community and 57 ± 14 years in specialty sites, P 70 years affected. Initial results from the STAMINA-HFP Registry suggest that anemia is a common comorbidity in unselected outpatients with heart failure. Given the strong association of anemia with adverse outcomes in heart failure, this study supports further investigation concerning the importance of anemia as a therapeutic target in this condition.

  16. Assessment of the left ventricular systolic and diastolic function by the left ventricular density curve derived from intravenous digital subtraction angiography in children

    International Nuclear Information System (INIS)

    Horigome, Hitoshi; Satoh, Hideo; Isobe, Takeshi; Takita, Hitoshi

    1991-01-01

    To evaluate the left ventricular (LV) systolic and diastolic function, fifty-four children with various heart diseases underwent intravenous digital subtraction angiography (IV-DSA). A global left ventricular density curve was obtained through densitometry of the DSA images. The curve was smoothed by a third-degree Fourier transformation, and systolic and diastolic indexes were obtained. In the control group, consisting of Kawasaki disease without coronary lesion and mild pulmonary stenosis, the peak ejection rate (PER) and the peak filling rate in early diastole (PFR-E) correlated positively with the heart rate (HR) in a quadratic curve manner [PER: r= 0.93 p<0.01, PFR-E: r= 0.94 p<0.01]. Time from end-diastole to PER (T-PER) and time from end-systole to PFR (T-PFR) correlated negatively with HR [T-PER: r=-0.86 p<0.01, T-PFR: r=-0.91 p<0.01]. However, T-PER/RR and T-PFR/RR values were rather constant (20.9±3.2% and 17.0±2.6%, respectively). We also found significant correlations of PER and PFR-E with left ventricular ejection fraction (LVEF). Patients with corrected tetralogy of Fallot and with cardiomyopathies showed not only abnormal systolic indexes but also some depressed diastolic indexes. The LV density curve also disclosed isolated diastolic dysfunction in a group with aortic stenosis and in two patients with coronary lesions. The correlation between LVEF derived from the density curve and from the conventional area-length method was high [r= 0.91 p<0.001]. To evaluate reproducibility, we were able to obtain the digital data twice, with an interval of over one month, in 24 patients. The intraobserver correlation was satisfactory. We applied the remasking method, which improved the quality of digital images under spontaneous breathing. Our results indicate that IV-DSA is a less-invasive and clinically reliable method for the assessment of LV function in children. (author)
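    As a rough sketch of the curve processing described above, the snippet below fits a three-harmonic (third-degree) Fourier series to a sampled density curve by least squares and reads the peak ejection and filling rates from the derivative of the smoothed curve; the sampled data, units and cycle length are hypothetical, not patient measurements.

      import numpy as np

      def fourier_smooth(t, y, n_harmonics=3):
          # Least-squares fit of a truncated Fourier series over one cardiac cycle;
          # returns the smoothed curve and its time derivative on the same samples.
          T = t[-1] - t[0]
          cols = [np.ones_like(t)]
          dcols = [np.zeros_like(t)]
          for k in range(1, n_harmonics + 1):
              w = 2.0 * np.pi * k / T
              cols += [np.cos(w * t), np.sin(w * t)]
              dcols += [-w * np.sin(w * t), w * np.cos(w * t)]
          A, dA = np.column_stack(cols), np.column_stack(dcols)
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          return A @ coef, dA @ coef

      # Hypothetical density samples over one cardiac cycle (arbitrary units).
      t = np.linspace(0.0, 0.8, 40)                      # 0.8 s cycle
      y = 100.0 - 35.0 * np.sin(np.pi * t / 0.8) ** 2    # crude ejection/filling shape
      smooth, rate = fourier_smooth(t, y)
      per = -rate.min()   # peak ejection rate (fastest density decrease)
      pfr = rate.max()    # peak filling rate (fastest density increase)
      print(f"PER ~ {per:.1f} units/s, PFR ~ {pfr:.1f} units/s")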

  17. Assessment of the left ventricular systolic and diastolic function by the left ventricular density curve derived from intravenous digital subtraction angiography in children

    Energy Technology Data Exchange (ETDEWEB)

    Horigome, Hitoshi; Satoh, Hideo; Isobe, Takeshi; Takita, Hitoshi (Tsukuba Univ., Ibaraki (Japan). Inst. of Clinical Medicine)

    1991-05-01

    To evaluate the left ventricular (LV) systolic and diastolic function, fifty-four children with various heart diseases underwent intravenous digital subtraction angiography (IV-DSA). A global left ventricular density curve was obtained through densitometry of the DSA images. The curve was smoothed by a third-degree Fourier transformation, and systolic and diastolic indexes were obtained. In the control group, consisting of Kawasaki disease without coronary lesion and mild pulmonary stenosis, the peak ejection rate (PER) and the peak filling rate in early diastole (PFR-E) correlated positively with the heart rate (HR) in a quadratic curve manner (PER: r= 0.93 p<0.01, PFR-E: r= 0.94 p<0.01). Time from end-diastole to PER (T-PER) and time from end-systole to PFR (T-PFR) correlated negatively with HR (T-PER: r=-0.86 p<0.01, T-PFR: r=-0.91 p<0.01). However, T-PER/RR and T-PFR/RR values were rather constant (20.9±3.2% and 17.0±2.6%, respectively). We also found significant correlations of PER and PFR-E with left ventricular ejection fraction (LVEF). Patients with corrected tetralogy of Fallot and with cardiomyopathies showed not only abnormal systolic indexes but also some depressed diastolic indexes. The LV density curve also disclosed isolated diastolic dysfunction in a group with aortic stenosis and in two patients with coronary lesions. The correlation between LVEF derived from the density curve and from the conventional area-length method was high (r= 0.91 p<0.001). To evaluate reproducibility, we were able to obtain the digital data twice, with an interval of over one month, in 24 patients. The intraobserver correlation was satisfactory. We applied the remasking method, which improved the quality of digital images under spontaneous breathing. Our results indicate that IV-DSA is a less-invasive and clinically reliable method for the assessment of LV function in children. (author).

  18. Comparing passive angle-torque curves recorded simultaneously with a load cell versus an isokinetic dynamometer during dorsiflexion stretch tolerance assessments.

    Science.gov (United States)

    Buckner, Samuel L; Jenkins, Nathaniel D M; Costa, Pablo B; Ryan, Eric D; Herda, Trent J; Cramer, Joel T

    2015-05-01

    The purpose of the present study was to compare the passive angle-torque curves and the passive stiffness (PS, N·m·°⁻¹) values recorded simultaneously from a load cell versus an isokinetic dynamometer during dorsiflexion stretch tolerance assessments in vivo. Nine healthy men (mean ± SD age = 21.4 ± 1.6 years) completed stretch tolerance assessments on a custom-built apparatus where passive torque was measured simultaneously by an isokinetic dynamometer and a load cell. Passive torque values corresponding to the last 10° of dorsiflexion, verified by surface electromyographic amplitude, were analyzed for each device (θ1, θ2, θ3, …, θ10). Passive torque values measured with the load cell were greater (p ≤ 0.05) than the dynamometer torque values for θ4 through θ10. There were more statistical differentiations among joint angles for passive torque measured by the load cell, and the load cell measured a greater (p ≤ 0.01) increase in passive torque and PS than the isokinetic dynamometer. These findings suggest that, when examining the angle-torque curves from passive dorsiflexion stretch tolerance tests, a load cell placed under the distal end of the foot may be more sensitive than the torque recorded from an isokinetic dynamometer.

  19. Assessment of various failure theories for weight and cost optimized laminated composites using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Goyal, T. [Indian Institute of Technology Kanpur. Dept. of Aerospace Engineering, UP (India); Gupta, R. [Infotech Enterprises Ltd., Hyderabad (India)

    2012-07-01

    In this work, a minimum weight-cost design for laminated composites is presented. A genetic algorithm has been developed for the optimization process. The Maximum-Stress, Tsai-Wu and Tsai-Hill failure criteria have been used, along with a buckling analysis parameter, for the margin-of-safety calculations. The design variables include the material, chosen from Carbon-Epoxy, Glass-Epoxy and Kevlar-Epoxy; the number of plies; the ply orientation angles, varying from -75 deg. to 90 deg. in intervals of 15 deg.; and the ply thicknesses, which depend on the material in use. The total cost is the sum of the material cost and the layup cost; the layup cost is a function of the ply angle. Validation studies for solution convergence and weight-cost inverse proportionality are carried out. One set of results for shear loading is also validated against the literature for a particular case. A Pareto-optimal solution set is demonstrated for biaxial loading conditions and then extended to applied moments. It is found that the global optimum for shear loading depends on the failure criterion, with the Maximum-Stress criterion giving the lightest and cheapest and the Tsai-Wu criterion the heaviest and costliest optimized laminates. Optimized weight results from the three criteria are plotted for a comparative study. This work gives a globally optimized laminated composite as well as a set of other locally optimal laminates for a given set of loading conditions. The current algorithm also provides adequate data to support the use of different failure criteria for varying loadings. This work can find use in industry and/or academia considering the increased use of laminated composites in modern wind blades. (Author)
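    As a hedged illustration of one of the margin-of-safety checks named above, the sketch below evaluates the plane-stress Tsai-Wu failure index for a single ply; the strength values and ply stresses are hypothetical, and the interaction term F12 uses the common -0.5*sqrt(F11*F22) approximation rather than any value from the paper.

      import math

      def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
          # Plane-stress Tsai-Wu failure index; failure is predicted when the index >= 1.
          # Xc and Yc are compressive strengths entered as positive magnitudes.
          F1, F2 = 1.0 / Xt - 1.0 / Xc, 1.0 / Yt - 1.0 / Yc
          F11, F22, F66 = 1.0 / (Xt * Xc), 1.0 / (Yt * Yc), 1.0 / S**2
          F12 = -0.5 * math.sqrt(F11 * F22)   # common interaction-term approximation
          return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
                  + F66 * t12**2 + 2.0 * F12 * s1 * s2)

      # Hypothetical carbon/epoxy ply strengths (MPa) and ply stresses (MPa).
      idx = tsai_wu_index(s1=800.0, s2=30.0, t12=40.0,
                          Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0)
      print(f"Tsai-Wu index: {idx:.2f} ({'fails' if idx >= 1 else 'survives'})")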

  20. Virtual reality as a metric for the assessment of laparoscopic psychomotor skills. Learning curves and reliability measures.

    Science.gov (United States)

    Gallagher, A G; Satava, R M

    2002-12-01

    The objective assessment of the psychomotor skills of surgeons is now a priority; however, this is a difficult task because of measurement difficulties associated with the assessment of surgery in vivo. In this study, virtual reality (VR) was used to overcome these problems. Twelve experienced (>50 minimal-access procedures), 12 inexperienced laparoscopic surgeons (Virtual Reality (MIST VR). Experienced laparoscopic surgeons performed the tasks significantly (p < 0.01) faster, with less error, more economy in the movement of instruments and the use of diathermy, and with greater consistency in performance. The standardized coefficient alpha for performance measures ranged from a = 0.89 to 0.98, showing high internal measurement consistency. Test-retest reliability ranged from r = 0.96 to r = 0.5. VR is a useful tool for evaluating the psychomotor skills needed to perform laparoscopic surgery.

  1. Assessment of inotropic and vasodilating effects of milrinone lactate in patients with dilated cardiomyopathy and severe heart failure

    Directory of Open Access Journals (Sweden)

    Edson Antonio Bregagnollo

    1999-02-01

    OBJECTIVE: To assess the hemodynamic and vasodilating effects of milrinone lactate (ML) in patients with dilated cardiomyopathy (DCM) and New York Heart Association (NYHA) class III and IV heart failure. METHODS: Twenty patients with DCM and NYHA class III and IV heart failure were studied. The hemodynamic and vasodilating effects of ML, administered intravenously, were evaluated. The following variables were compared before and during drug infusion: cardiac output (CO) and cardiac index (CI); pulmonary capillary wedge pressure (PCWP); mean aortic pressure (MAP); mean pulmonary artery pressure (MPAP); mean right atrial pressure (MRAP); left ventricular systolic and end-diastolic pressures (LVSP and LVEDP, respectively); peak rate of left ventricular pressure rise (dP/dt); systemic vascular resistance (SVR); pulmonary vascular resistance (PVR); and heart rate (HR). RESULTS: All patients showed a significant improvement in the analysed parameters of cardiac performance, with an increase in CO and CI, a significant improvement in myocardial contractility (dP/dt), and reductions in LVEDP, PCWP, PAP, MAP, MRAP, SVR and PVR. No significant increase in HR was observed. CONCLUSION: Milrinone lactate is an inotropic, vasodilating drug that, when administered intravenously, has beneficial effects on cardiac performance and myocardial contractility. It also promotes a reduction of SVR and PVR in patients with DCM and NYHA class III and IV heart failure.

  2. Trial application of the candidate root cause categorization scheme and preliminary assessment of selected data bases for the root causes of component failures program

    International Nuclear Information System (INIS)

    Bruske, S.Z.; Cadwallader, L.C.; Stepina, P.L.

    1985-04-01

    The objective of the Nuclear Regulatory Commission's (NRC) Root Causes of Component Failures Program is to develop and apply a categorization scheme for identifying root causes of failures for components that comprise safety and safety support systems of nuclear power plants. Results from this program will provide valuable input in the areas of probabilistic risk assessment, reliability assurance, and application of risk assessments in the inspection program. This report presents the trial application and assessment of the candidate root cause categorization scheme to three failure data bases: the In-Plant Reliability Data System (IPRDS), the Licensee Event Report (LER) data base, and the Nuclear Plant Reliability Data System (NPRDS). Results of the trial application/assessment show that significant root cause information can be obtained from these failure data bases

  3. Controller recovery from equipment failures in air traffic control: A framework for the quantitative assessment of the recovery context

    International Nuclear Information System (INIS)

    Subotic, Branka; Schuster, Wolfgang; Majumdar, Arnab; Ochieng, Washington

    2014-01-01

    Air Traffic Control (ATC) involves a complex interaction of human operators (primarily air traffic controllers), equipment and procedures. On the rare occasions when equipment malfunctions, controllers play a crucial role in the recovery process of the ATC system for continued safe operation. Research on human performance in other safety critical industries using human reliability assessment techniques has shown that the context in which recovery from failures takes place has a significant influence on the outcome of the process. This paper investigates the importance of context in which air traffic controller recovery from equipment failures takes place, defining it in terms of 20 Recovery Influencing Factors (RIFs). The RIFs are used to develop a novel approach for the quantitative assessment of the recovery context based on a metric referred to as the Recovery Context Indicator (RCI). The method is validated by a series of simulation exercises conducted at a specific ATC Centre. The proposed method is useful to assess recovery enhancement approaches within ATC centres

  4. Assessment of the coronary venous system in heart failure patients by blood pool agent enhanced whole-heart MRI

    Energy Technology Data Exchange (ETDEWEB)

    Manzke, Robert [University Hospital of Ulm, Department of Internal Medicine II, Ulm (Germany); Philips Research Europe, Clinical Sites Research, Hamburg (Germany); Binner, Ludwig; Bornstedt, Axel; Merkle, Nico; Lutz, Anja; Gradinger, Robert [University Hospital of Ulm, Department of Internal Medicine II, Ulm (Germany); Rasche, Volker [University Hospital of Ulm, Department of Internal Medicine II, Ulm (Germany); Experimental Cardiovascular Imaging, Internal Medicine II, Ulm (Germany)

    2011-04-15

    To investigate the feasibility of MRI for non-invasive assessment of the coronary sinus (CS) and the number and course of its major tributaries in heart failure patients. Fourteen non-ischaemic heart failure patients scheduled for cardiac resynchronisation therapy (CRT) underwent additional whole-heart coronary venography. MRI was performed 1 day before device implantation. The visibility, location and dimensions of the CS and its major tributaries were assessed and the number of potential implantation sites identified. The MRI results were validated by X-ray venography conventionally acquired during the device implantation procedure. The right atrium (RA), CS and mid-cardiac vein (MCV) could be visualised in all patients. 36% of the identified candidate branches were located posterolaterally, 48% laterally and 16% anterolaterally. The average diameter of the CS was quantified as 9.8 mm, the posterior interventricular vein (PIV) 4.6 mm, posterolateral segments 3.3 mm, lateral 2.9 mm and anterolateral 2.9 mm. Concordance with X-ray in terms of number and location of candidate branches was given in most cases. Contrast-enhanced MRI venography appears feasible for non-invasive pre-interventional assessment of the course of the CS and its major tributaries. (orig.)

  5. Assessment of the coronary venous system in heart failure patients by blood pool agent enhanced whole-heart MRI

    International Nuclear Information System (INIS)

    Manzke, Robert; Binner, Ludwig; Bornstedt, Axel; Merkle, Nico; Lutz, Anja; Gradinger, Robert; Rasche, Volker

    2011-01-01

    To investigate the feasibility of MRI for non-invasive assessment of the coronary sinus (CS) and the number and course of its major tributaries in heart failure patients. Fourteen non-ischaemic heart failure patients scheduled for cardiac resynchronisation therapy (CRT) underwent additional whole-heart coronary venography. MRI was performed 1 day before device implantation. The visibility, location and dimensions of the CS and its major tributaries were assessed and the number of potential implantation sites identified. The MRI results were validated by X-ray venography conventionally acquired during the device implantation procedure. The right atrium (RA), CS and mid-cardiac vein (MCV) could be visualised in all patients. 36% of the identified candidate branches were located posterolaterally, 48% laterally and 16% anterolaterally. The average diameter of the CS was quantified as 9.8 mm, the posterior interventricular vein (PIV) 4.6 mm, posterolateral segments 3.3 mm, lateral 2.9 mm and anterolateral 2.9 mm. Concordance with X-ray in terms of number and location of candidate branches was given in most cases. Contrast-enhanced MRI venography appears feasible for non-invasive pre-interventional assessment of the course of the CS and its major tributaries. (orig.)

  6. Failures in risk assessment and risk management for cosmetic preservatives in Europe and the impact on public health

    DEFF Research Database (Denmark)

    Schwensen, Jakob F; White, Ian R; Thyssen, Jacob P

    2015-01-01

    BACKGROUND: In view of the current and unprecedented increase in contact allergy to methylisothiazolinone (MI), we characterized and evaluated two recent epidemics of contact allergy to preservatives used in cosmetic products to address failures in risk assessment and risk management. OBJECTIVE......: To evaluate temporal trends of preservative contact allergy. METHODS: The study population included consecutive patch tested eczema patients seen at a university hospital between 1985 and 2013. A total of 23 138 patients were investigated for a contact allergy. RESULTS: The overall prevalence of contact...... the proportion of patients with current clinical disease attributable to methyldibromo glutaronitrile contact allergy decreased significantly following the ban on its use in cosmetic products (p

  7. Lagrangian Curves on Spectral Curves of Monopoles

    International Nuclear Information System (INIS)

    Guilfoyle, Brendan; Khalid, Madeeha; Ramon Mari, Jose J.

    2010-01-01

    We study Lagrangian points on smooth holomorphic curves in TP¹ equipped with a natural neutral Kaehler structure, and prove that they must form real curves. By virtue of the identification of TP¹ with the space LE³ of oriented affine lines in Euclidean 3-space, these Lagrangian curves give rise to ruled surfaces in E³, which we prove have zero Gauss curvature. Each ruled surface is shown to be the tangent lines to a curve in E³, called the edge of regression of the ruled surface. We give an alternative characterization of these curves as the points in E³ where the number of oriented lines in the complex curve Σ that pass through the point is less than the degree of Σ. We then apply these results to the spectral curves of certain monopoles and construct the ruled surfaces and edges of regression generated by the Lagrangian curves.

  8. A procedure to identify and to assess risk parameters in a SCR (Steel Catenary Riser) due to the fatigue failure

    Energy Technology Data Exchange (ETDEWEB)

    Stefane, Wania [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Faculdade de Engenharia Mecanica; Morooka, Celso K. [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Dept. de Engenharia de Petroleo. Centro de Estudos de Petroleo; Pezzi Filho, Mario [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). E and P. ENGP/IPMI/ES; Matt, Cyntia G.C.; Franciss, Ricardo [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas (CENPES)

    2009-12-19

    The discovery of offshore fields in ultra-deep water and the presence of reservoirs located at great depths below the seabed require innovative solutions for offshore oil production systems. Many riser configurations have emerged as economically viable technological solutions for these scenarios. Therefore, the study and development of methodologies applied to riser design, and of procedures to calculate and dimension production risers taking into account the effects of metocean conditions (waves, current and platform motion) on fatigue failure, is fundamental. The random nature of these conditions, as well as the mechanical characteristics of the riser components, calls for a probabilistic treatment to ensure the greatest reliability for risers and minimum risks associated with different aspects of the operation, such as the safety of the installation, economic concerns and the environment. The current work presents a procedure for the identification and assessment of the main risk parameters when considering fatigue failure. The static and dynamic behaviour of the Steel Catenary Riser (SCR) under the effects of metocean conditions, and uncertainties related to the total cumulative damage (Miner-Palmgren's rule), are taken into account. The methodology adopted is probabilistic and the approach is analytical. The procedure is based on the First Order Reliability Method (FORM), which usually requires low computational effort and gives acceptable accuracy. The suggested procedure is applied to two practical cases, one using data available from the literature and the second with data collected from an actual Brazilian offshore field operation. For both cases, results for the probability of failure due to fatigue were obtained at different locations along the length of an SCR connected to a semi-submersible platform. From these results, the sensitivity of the probability of fatigue failure of an SCR could be verified, and the most effective parameter could also be identified.
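    As a minimal sketch of the Miner-Palmgren bookkeeping that underlies the fatigue limit state discussed above, the snippet below sums cycle-by-cycle damage against a one-slope S-N curve; the stress-range histogram and the S-N constants are hypothetical and only illustrate the deterministic core that a FORM analysis would wrap with random variables.

      def miner_damage(stress_ranges, cycle_counts, a, m):
          # Miner-Palmgren cumulative damage D = sum(n_i / N_i) for a one-slope
          # S-N curve N(S) = a * S**(-m); D >= 1 indicates fatigue failure.
          return sum(n / (a * S**(-m)) for S, n in zip(stress_ranges, cycle_counts))

      # Hypothetical annual stress-range histogram (MPa) and cycles per year at one SCR location.
      S = [20.0, 40.0, 80.0]
      n = [2.0e6, 2.0e5, 5.0e3]
      D_year = miner_damage(S, n, a=1.0e12, m=3.0)   # hypothetical S-N constants (slope m = 3)
      print(f"Annual damage: {D_year:.3e}; fatigue life ~ {1.0 / D_year:.1f} years")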

  9. Failure of Grass Covered Flood Defences with Roads on Top Due to Wave Overtopping: A Probabilistic Assessment Method

    Directory of Open Access Journals (Sweden)

    Juan P. Aguilar-López

    2018-06-01

    Hard structures, i.e., roads, are commonly found on top of flood defences, such as dikes, in order to ensure access and connectivity between flood-protected areas. Several climate change future-scenario studies have concluded that flood defences will be required to withstand more severe storms than the ones used for their original design. Therefore, this paper presents a probabilistic methodology to assess the effect of a road on top of a dike: it gives the failure probability of the grass cover due to wave overtopping over a wide range of design storms. The methodology was developed by building two different dike configurations in computational fluid dynamics Navier-Stokes solution software, one with a road on top and one without a road. Both models were validated with experimental data collected from field-scale experiments. Later, both models were used to produce data sets for training simpler and faster emulators. These emulators were coupled to a simplified erosion model, which allowed testing storm scenarios and yielded statistical failure probabilities conditioned on local scouring. From these results it was estimated that the dike with a road has a higher probability of failure (5 × 10−5 < Pf < 1 × 10−4) than a dike without a road (Pf < 1 × 10−6) if realistic spatial distributions of grass quality are assumed. The coupled emulator-erosion model was able to yield realistic probabilities, given all the uncertainties in the modelling process, and it seems to be a promising tool for quantifying grass cover erosion failure.
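    As a hedged sketch of how a trained emulator can be wrapped in a Monte Carlo loop to estimate a cover-failure probability like the ones quoted above, the snippet below uses an invented closed-form stand-in for the emulator and hypothetical distributions for storm severity and grass quality; it illustrates the workflow only and does not reproduce the paper's models or numbers.

      import numpy as np

      rng = np.random.default_rng(42)

      def emulator_scour_depth(wave_height, grass_quality):
          # Stand-in for a trained overtopping/erosion emulator (hypothetical response surface).
          return 0.04 * wave_height**2 * (1.5 - grass_quality)

      def failure_probability(n_samples=200_000, critical_depth=0.2):
          # Crude Monte Carlo estimate of P(scour depth > critical cover depth).
          wave_height = rng.gumbel(1.2, 0.4, n_samples)       # storm severity (m), hypothetical
          grass_quality = rng.uniform(0.4, 1.0, n_samples)    # spatially variable cover quality
          scour = emulator_scour_depth(wave_height, grass_quality)
          return float(np.mean(scour > critical_depth))

      print(f"Pf ~ {failure_probability():.2e}")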

  10. A discussion about simplified methodologies for failure assessment of nuclear reactor components

    International Nuclear Information System (INIS)

    Cruz, J.R.B.; Andrade, A.H.P. de; Landes, J.D.

    1996-01-01

    Failure of nuclear reactor components like pressure vessels and piping must be avoided during all phases of reactor operation. Especially severe loading conditions come from postulated accident scenarios during which the integrity of the component is required. The use of Fracture Mechanics concepts to investigate the mechanical behavior of flawed structures in the non-linear regime is a complex subject, because the crack driving force (expressed in terms of J or CTOD) is not only a function of the cracked geometry but also depends on the plastic flow properties of the material. Since numerical solutions by the finite element method are expensive and time consuming, the existence of simplified engineering procedures is of great relevance. These allow a ready identification of the main parameters affecting the crack driving force, and permit a fast and simple evaluation of the structural integrity of the cracked component. This paper presents an overview of the major simplified ductile fracture methodologies that have been proposed in the literature, trying to point out their similarities, strong points and negative aspects. Once the best characteristics of each method are identified, they could then be combined to develop a single methodology, one that would be both easy to use and capable of making accurate failure predictions.

  11. Methodology for probability of failure assessment of offshore pipelines; Metodologia qualitativa de avaliacao da probabilidade de falha de dutos rigidos submarinos estaticos

    Energy Technology Data Exchange (ETDEWEB)

    Pezzi Filho, Mario [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2005-07-01

    This study presents a methodology for assessing the likelihood of failure for every failure mechanism defined for static carbon steel offshore pipelines. The methodology is intended to comply with the Integrity Management policy established by the Company. Decision trees are used for the development of the methodology and for the evaluation of the extent and significance of these failure mechanisms. Decision trees also enable the visualization of the logical structure of the algorithms which will eventually be used in risk assessment software. The benefits of the proposed methodology are presented, and it is recommended that it be tested on static offshore pipelines installed in different assets for validation. (author)

  12. Assessment of diagnostic value of various tumors markers (CEA, CA199, CA50) for colorectal neoplasm with logistic regression and ROC curve

    International Nuclear Information System (INIS)

    Gu Ping; Huang Gang; Han Yuan

    2007-01-01

    Objective: To assess the diagnostic value of CEA, CA199 and CA50 for colorectal neoplasms by logistic regression and ROC curve analysis. Methods: Serum CEA (by CLIA), CA199 (by ECLIA) and CA50 (by IRMA) levels were measured in 75 patients with colorectal cancer, 35 patients with benign colorectal disorders and 49 controls. The areas under the ROC curves (AUCs) of CEA, CA199 and CA50, obtained from the logistic regression results, were compared. Results: In the cancer vs benign disorder comparison, the AUC of CA50 was larger than the AUC of CA199. The AUC of combined CEA and CA50 was the largest: not only larger than the AUC of CEA, CA50 or CA199 alone, but also larger than the AUC of the combined three markers (0.875 vs 0.604). In the cancer vs control comparison, the AUC of the combination of CEA, CA199 and CA50 was larger than the AUC of CEA, CA199 or CA50 alone. In both the cancer-benign disorder and cancer-control comparisons, the AUC of CEA was larger than the AUC of CA199 or CA50. Conclusion: CEA is of definite value in the diagnosis of colorectal cancer. For differential diagnosis, the combination of CEA and CA50 gives more information, while the combination of the three tumor markers is less helpful. As an advanced statistical method, logistic regression can improve diagnostic sensitivity and specificity. (authors)
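    As a minimal, hedged sketch of the combined-marker analysis described above, the snippet below fits a logistic regression on two markers with scikit-learn and compares ROC AUCs; the marker values and labels are synthetic placeholders, not the study data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      n = 120
      y = rng.integers(0, 2, n)                    # 1 = colorectal cancer, 0 = benign/control (synthetic)
      cea = rng.lognormal(1.0 + 0.8 * y, 0.6)      # hypothetical serum marker levels
      ca50 = rng.lognormal(2.0 + 0.5 * y, 0.7)

      X = np.column_stack([cea, ca50])
      model = LogisticRegression().fit(X, y)
      combined_score = model.predict_proba(X)[:, 1]

      print("AUC, CEA alone:  ", round(roc_auc_score(y, cea), 3))
      print("AUC, CEA + CA50: ", round(roc_auc_score(y, combined_score), 3))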

  13. The Reinvention of General Relativity: A Historiographical Framework for Assessing One Hundred Years of Curved Space-time.

    Science.gov (United States)

    Blum, Alexander; Lalli, Roberto; Renn, M Jürgen

    2015-09-01

    The history of the theory of general relativity presents unique features. After its discovery, the theory was immediately confirmed and rapidly changed established notions of space and time. The further implications of general relativity, however, remained largely unexplored until the mid 1950s, when it came into focus as a physical theory and gradually returned to the mainstream of physics. This essay presents a historiographical framework for assessing the history of general relativity by taking into account in an integrated narrative intellectual developments, epistemological problems, and technological advances; the characteristics of post-World War II and Cold War science; and newly emerging institutional settings. It argues that such a framework can help us understand this renaissance of general relativity as a result of two main factors: the recognition of the untapped potential of general relativity and an explicit effort at community building, which allowed this formerly disparate and dispersed field to benefit from the postwar changes in the scientific landscape.

  14. Anatomical curve identification

    Science.gov (United States)

    Bowman, Adrian W.; Katina, Stanislav; Smith, Joanna; Brown, Denise

    2015-01-01

    Methods for capturing images in three dimensions are now widely available, with stereo-photogrammetry and laser scanning being two common approaches. In anatomical studies, a number of landmarks are usually identified manually from each of these images and these form the basis of subsequent statistical analysis. However, landmarks express only a very small proportion of the information available from the images. Anatomically defined curves have the advantage of providing a much richer expression of shape. This is explored in the context of identifying the boundary of breasts from an image of the female torso and the boundary of the lips from a facial image. The curves of interest are characterised by ridges or valleys. Key issues in estimation are the ability to navigate across the anatomical surface in three-dimensions, the ability to recognise the relevant boundary and the need to assess the evidence for the presence of the surface feature of interest. The first issue is addressed by the use of principal curves, as an extension of principal components, the second by suitable assessment of curvature and the third by change-point detection. P-spline smoothing is used as an integral part of the methods but adaptations are made to the specific anatomical features of interest. After estimation of the boundary curves, the intermediate surfaces of the anatomical feature of interest can be characterised by surface interpolation. This allows shape variation to be explored using standard methods such as principal components. These tools are applied to a collection of images of women where one breast has been reconstructed after mastectomy and where interest lies in shape differences between the reconstructed and unreconstructed breasts. They are also applied to a collection of lip images where possible differences in shape between males and females are of interest. PMID:26041943

  15. Assessment of Intralaminar Progressive Damage and Failure Analysis Using an Efficient Evaluation Framework

    Science.gov (United States)

    Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl

    2017-01-01

    Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further extended when attempting to integrate new fiber-reinforced composite materials, due to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework which uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.

  16. Assessing cell fusion and cytokinesis failure as mechanisms of clone 9 hepatocyte multinucleation in vitro.

    Science.gov (United States)

    Simic, Damir; Euler, Catherine; Thurby, Christina; Peden, Mike; Tannehill-Gregg, Sarah; Bunch, Todd; Sanderson, Thomas; Van Vleet, Terry

    2012-08-01

    In this in vitro model of hepatocyte multinucleation, separate cultures of rat Clone 9 cells are labeled with either red or green cell tracker dyes (Red Cell Tracker CMPTX or Vybrant CFDA SE Cell Tracer), plated together in mixed-color colonies, and treated with positive or negative control agents for 4 days. The fluorescent dyes become cell-impermeant after entering cells and are not transferred to adjacent cells in a population, but are inherited by daughter cells after fusion. The mixed-color cultures are then evaluated microscopically for multinucleation and analysis of the underlying mechanism (cell fusion/cytokinesis). Multinucleated cells containing only one dye have undergone cytokinesis failure, whereas dual-labeled multinucleated cells have resulted from fusion.

  17. An automated assessment method for the potential loss related to a dam failure

    International Nuclear Information System (INIS)

    Marche, C.; McNeil, E.; Boyer, R.

    1994-01-01

    The overall risk associated with the failure of a dam or group of dams is a measure of the probability and severity of its effects on people, property and the environment. A methodology for flooding impact studies based on deterministic analysis of water depth and velocity is proposed. The methodology can be used for flooding impact studies on entities whose geographical position is either a single point (such as a building), a series of linked points (e.g. roads) or a polygon (e.g. a crop field). Software implementation of the methodology is based on numerical cartography, including terrain numerical modelling, free-surface flow modelling and dual kriging. The software delineates the flood contours, identifies entities located in the flooding area, estimates flow conditions at the location of each affected entity and evaluates the corresponding impacts on these entities. 4 refs., 7 figs

  18. Assessment of leptin and some Antioxidants in Blood of Chronic Renal Failure Patients

    International Nuclear Information System (INIS)

    Ahmed, A.M.

    2004-01-01

    The aim of the present work was to study the effect of haemodialysis on the status of the antioxidants glutathione peroxidase (GPx) and vitamin C, and the role of the hormone leptin in redox homeostasis, in patients with chronic renal failure (CRF). This study was carried out on 25 patients (15 females and 10 males) with CRF, aged 19-55 years, in addition to 25 healthy controls (10 females and 15 males). Patients underwent regular haemodialysis for 4 hours three times weekly, and blood samples were collected before haemodialysis. In this study, plasma leptin was significantly higher in the CRF group than in normal controls. Vitamin C and GPx were significantly decreased in the CRF group in comparison with normal controls. There was a non-significant difference in serum leptin level between males and females in both the control and patient groups. Patients showed a significantly lower body mass index (BMI) and albumin and higher cholesterol. In the control group, serum leptin levels showed a significant positive correlation with BMI, while the CRF group had a significant negative correlation. In the CRF group, serum leptin showed a significant negative correlation with serum albumin and a non-significant negative correlation with both creatinine and cholesterol. This is probably due to the malnutrition that commonly occurs in renal failure. Serum leptin levels showed a non-significant negative correlation with both GPx and vitamin C in the control group, while in the patient group leptin showed a significant positive correlation with GPx and vitamin C. In conclusion, leptin acts on energy metabolism and plays a role in the modulation of cellular redox balance, while oxidative stress plays a role in many disease states; these diseases have an increased incidence in uremia and particularly in haemodialysis.

  19. An assessment of BWR [boiling water reactor] Mark-II containment challenges, failure modes, and potential improvements in performance

    International Nuclear Information System (INIS)

    Kelly, D.L.; Jones, K.R.; Dallman, R.J.; Wagner, K.C.

    1990-07-01

    This report assesses challenges to BWR Mark II containment integrity that could potentially arise from severe accidents. Also assessed are some potential improvements that could prevent core damage or containment failure, or could mitigate the consequences of such a failure by reducing the release of fission products to the environment. These challenges and improvements are analyzed via a limited quantitative risk/benefit analysis of a generic BWR/4 reactor with Mark II containment. Point estimate frequencies of the dominant core damage sequences are obtained, and simple containment event trees are constructed to evaluate the response of the containment to these severe accident sequences. The resulting containment release modes are then binned into source term release categories, which provide inputs to the consequence analysis. The output of the consequence analysis is used to construct an overall base case risk profile. Potential improvements and sensitivities are evaluated by modifying the event tree split fractions, thus generating a revised risk profile. Several important sensitivity cases are examined to evaluate the impact of phenomenological uncertainties on the final results. 75 refs., 25 figs., 65 tabs

  20. A Failure Criterion for Concrete

    DEFF Research Database (Denmark)

    Ottosen, N. S.

    1977-01-01

    A four-parameter failure criterion containing all three stress invariants explicitly is proposed for short-time loading of concrete. It corresponds to a smooth convex failure surface with curved meridians, which open in the negative direction of the hydrostatic axis, and the trace in the deviatoric...

  1. Assessment of a Business-to-Consumer (B2C) model for Telemonitoring patients with Chronic Heart Failure (CHF).

    Science.gov (United States)

    Grustam, Andrija S; Vrijhoef, Hubertus J M; Koymans, Ron; Hukal, Philipp; Severens, Johan L

    2017-10-11

    The purpose of this study is to assess the Business-to-Consumer (B2C) model for telemonitoring patients with Chronic Heart Failure (CHF) by analysing the value it creates, both for organizations or ventures that provide telemonitoring services based on it, and for society. The business model assessment was based on the following categories: caveats, venture type, six-factor alignment, strategic market assessment, financial viability, valuation analysis, sustainability, societal impact, and technology assessment. The venture valuation was performed for three jurisdictions (countries) - Singapore, the Netherlands and the United States - in order to show the opportunities in a small, medium-sized, and large country (i.e. population). The business model assessment revealed that B2C telemonitoring is viable and profitable in the Innovating in Healthcare Framework. Analysis of the ecosystem revealed an average-to-excellent fit with the six factors. The structure and financing fit was average, public policy and technology alignment was good, while consumer alignment and accountability fit was deemed excellent. The financial prognosis revealed that the venture is viable and profitable in Singapore and the Netherlands but not in the United States due to relatively high salary inputs. The B2C model in telemonitoring CHF potentially creates value for patients, shareholders of the service provider, and society. However, the validity of the results could be improved, for instance by using a peer-reviewed framework, a systematic literature search, case-based cost/efficiency inputs, and varied scenario inputs.

  2. The Assessing of the Failure Behavior of Glass/Polyester Composites Subject to Quasi Static Stresses

    Science.gov (United States)

    Stanciu, M. D.; Savin, A.; Teodorescu-Drăghicescu, H.

    2017-06-01

    Using glass-fabric-reinforced composites for the structure of wind turbine blades requires high mechanical strength, especially under cyclic stresses. Studies have shown that approximately 50% of composite material failures occur because of fatigue. The behaviour of composites under cyclic stresses involves three stages with respect to stiffness variation: the first stage is characterized by an accelerated decline in stiffness with micro-cracking; the second stage by a slight decrease in stiffness with the occurrence of delamination; and the third stage by larger decreases in strength and the occurrence of fracture. The aim of the paper is to analyse the behaviour of composites reinforced with glass fibre fabric type RT500 and polyester resin subjected to cyclic tensile loading in a pulsating quasi-static regime with asymmetry coefficient R = 0. The samples were tested with the universal tensile machine LS100 Lloyd Instruments Plus, with a load capacity of 100 kN. The load was applied at different speeds of 1 mm/min, 10 mm/min and 20 mm/min. After the tests, it was observed that the greatest permanent strains were recorded in the first load cycles, when energy stored by the material was lost due to internal friction. With an increasing number of cycles, the ability of the glass/polyester composites to store deformation energy decreases, and a flow phenomenon appears, characterized by large displacements at smaller loading forces.

  3. Risk assessment of failure modes of gas diffuser liner of V94.2 siemens gas turbine by FMEA method

    Science.gov (United States)

    Mirzaei Rafsanjani, H.; Rezaei Nasab, A.

    2012-05-01

    Failure of the welded connection between the gas diffuser liner and the exhaust casing is one of the failure modes of V94.2 gas turbines, and it has occurred in some power plants. This defect is one of the uncertainties customers face when deciding whether to accept the final commissioning of this product. Accordingly, the risk priority of this failure was evaluated by the failure modes and effects analysis (FMEA) method to find out whether this failure is catastrophic for turbine performance and harmful to humans. Using the history of 110 gas turbines of this model operating in several power plants, the severity number, occurrence number and detection number of the failure were determined, and consequently the Risk Priority Number (RPN) of the failure was determined. Finally, a criticality matrix of the potential failures was created, which showed that the failure modes are located in the safe zone.
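    As a minimal sketch of the FMEA bookkeeping described above (RPN = severity × occurrence × detection, each rated on a 1-10 scale), the snippet below ranks a couple of failure modes by RPN; the failure modes and ratings are hypothetical, not the values determined for the V94.2 fleet.

      from dataclasses import dataclass

      @dataclass
      class FailureMode:
          name: str
          severity: int     # 1-10
          occurrence: int   # 1-10
          detection: int    # 1-10 (10 = hardest to detect)

          @property
          def rpn(self) -> int:
              return self.severity * self.occurrence * self.detection

      # Hypothetical failure modes for illustration only.
      modes = [
          FailureMode("Diffuser liner weld crack", severity=7, occurrence=3, detection=4),
          FailureMode("Exhaust casing distortion", severity=5, occurrence=2, detection=3),
      ]
      for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
          print(f"{fm.name}: RPN = {fm.rpn}")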

  4. Risk Assessment of Total Coliform in X WTP’s Water Production Using Failure Mode And Effect Analysis Method

    Directory of Open Access Journals (Sweden)

    Bella Apriliani Amanda

    2017-07-01

    The greatest risk in drinking water supply is a failure to provide safe drinking water for communities. The IPA Kedunguling testing report of March 2016 noted that samples exceeded the total coliform limit of the quality standard Peraturan Menteri Kesehatan RI No 492/2010. The presence of total coliforms indicates contamination of the water by pathogens, meaning the water is not safe to consume. The disinfection process has an important role in pathogen inactivation. Disinfectant performance is influenced by temperature, pH, turbidity, and the presence of organic materials. One way to control the quality of the water produced is to use a risk management approach, the Failure Modes and Effect Analysis (FMEA) method. The potential risks should be measured to determine the causes of the problems and to find appropriate risk reduction measures. The risk assessment uses the Risk Priority Number (RPN) scale as a basis for prioritizing remedial action on the issues. Based on the identification and risk analysis using FMEA, it is known that the greatest risks of failure are the setting of the chlorine dose and organic substances (high risk level category); residual chlorine (moderate risk level category); and turbidity and pH (very low risk level category). Improvements proposed to reduce the presence of total coliforms in IPA Kedunguling are: increasing residual chlorine to 0.6 mg/l, setting a daily chlorine level, controlling the formation of DBPs by lowering the concentration of organic precursors using granular activated carbon (GAC) or aeration, lowering the dose of disinfectant, removing DBPs after the compounds are formed using granular activated carbon (GAC), monitoring turbidity and pH, and regularly washing the filters.

  5. Ultrasound of Jugular Veins for Assessment of Acute Dyspnea in Emergency Departments and for the Assessment of Acute Heart Failure.

    Science.gov (United States)

    Tzadok, Batsheva; Shapira, Shay; Tal-Or, Eran

    2018-05-01

    When a patient arrives at the emergency department (ED) presenting with symptoms of acute decompensated heart failure (ADHF), it is possible to reach a definitive diagnosis through many different avenues, including medical history, physical examination, echocardiography, chest X-ray, and B-type natriuretic peptide (BNP) levels. Point-of-care ultrasound (POCUS) has become a mainstream tool for diagnosis and treatment in the field of emergency medicine, as well as in various other departments in the hospital setting. Currently, the main methods of diagnosing ADHF using POCUS are pleural B-lines and inferior vena cava (IVC) width and respiratory variation. The aim was to examine the potential use and benefits of bedside ultrasound of the jugular veins in the evaluation of dyspneic patients for identification of ADHF. A blood BNP level was drawn from each participant at the time of recruitment. The area and size of the internal jugular vein (IJV) during inspiration and expiration were examined. Our results showed that the respiratory area change of the IJVs identified ADHF in our ED with a specificity and sensitivity of nearly 70%. Ultrasound of the IJV may be a useful tool for the diagnosis of ADHF because it is easy to measure and requires little skill. It is also not affected by patient body habitus.

  6. Failure to use routine prevention of disability (POD) assessment resulting in permanent disability

    Directory of Open Access Journals (Sweden)

    Erika Zoulba

    2016-06-01

    Disability is one of the problems in leprosy or Morbus Hansen (MH); it can cause the patient to lose his autonomy and may affect his social relationships with family and community. Disability occurs due to neurological inflammation that can manifest as silent neuritis (which develops without any pain). Silent neuritis can be recognized early with routine prevention of disability (POD) assessment. A 19-year-old male patient was referred from a District General Hospital with a history of numbness and stiffness of the 4th and 5th fingers of his left hand starting 1 month before admission. The patient had been referred by a Community Health Center (CHC, or PUSKESMAS) after one year of treatment and RFT. During his treatment at the CHC, no assessment of the peripheral nerves or POD had ever been performed. The POD assessment at our hospital demonstrated a sensory deficit at some assessment points on both palms and reduced muscle strength of the first and 5th fingers of both hands. Nerve conduction velocity (NCV) testing, performed at the Neurology Department outpatient clinic, showed multiple mononeuropathy MH with irreversible damage. Nerve damage is still considered reversible when it has been present for less than 6 months. In this case, the silent neuritis was not detected early and treatment was delayed, as shown by the NCV, which revealed irreversible nerve damage.

  7. MYOCARDIAL PERFUSION ASSESSMENT IN FORECASTING EFFECT OF CORONARY ANGIOPLASTY IN PATIENTS WITH ISCHEMIC CHRONIC HEART FAILURE

    Directory of Open Access Journals (Sweden)

    A. B. Mironkov

    2015-01-01

    Aim. To define the influence of left ventricular (LV) perfusion defects on the dynamics of clinical status after coronary angioplasty in patients with pronounced myocardial dysfunction of ischemic etiology. Materials and methods. We examined 86 patients (81 men and 5 women) aged from 46 to 73 years, before and 2-3 days after percutaneous coronary intervention, with the diagnosis: CAD, CHF of NYHA class III-IV, and LV echocardiography parameters of ejection fraction less than 40% and end-diastolic volume more than 200 ml. Myocardial perfusion defects were estimated using ECG-gated single photon emission computed tomography. The predictors were defined as: perfusion defects at the LV apex (in score), perfusion defects in the areas of the LAD, LCx and RCA (%), and the LV global perfusion defect (in score and %). Results. In 42% of cases the 6-minute walk test distance increased up to 3-fold and the NYHA class decreased by 2 classes (group 1). In 28 cases the 6-minute walk test distance increased up to 2-fold and the NYHA class decreased by 1 class. In 22 patients the 6-minute walk test distance increased by less than 50% of the reference value and there was no change in NYHA class (these 50 patients formed group 2). The initial extent of the LV global perfusion defect was 41.2 ± 4.0% in group 1 and 58.3 ± 2.4% in group 2 (p = 0.0004). Similar values were obtained for the perfusion indicators in the LAD area and at the LV apex. The prevalence of myocardial perfusion defects at rest reflects the extent of cardiosclerosis in the cardiac muscle. Conclusion. The degree of LV myocardial perfusion defects in patients with pronounced heart failure of ischemic etiology is the key indicator influencing the clinical efficiency of coronary angioplasty. The critical value for the definition of a favourable revascularization prognosis is 60% or more perfusion defects, testifying that focal cardiosclerosis prevails over the functioning myocardium in the cardiac muscle.

  8. Small Data, Online Learning and Assessment Practices in Higher Education: A Case Study of Failure?

    Science.gov (United States)

    Watson, Cate; Wilson, Anna; Drew, Valerie; Thompson, Terrie Lynn

    2017-01-01

    In this paper, we present an in-depth case study of a single student who failed an online module which formed part of a master's programme in Professional Education and Leadership. We use this case study to examine assessment practices in higher education in the online environment. In taking this approach, we go against the current predilection…

  9. Failure and fatigue life assessment of steel railway bridges with brittle material

    NARCIS (Netherlands)

    Maljaars, J.

    2014-01-01

    Some existing steel bridges have been constructed from steels with a toughness that does not fulfil the requirements in modern standards. In such a case, standards for bridges do not provide an alternative assessment route. Yet such bridges may still be fit for purpose. This paper presents an

  10. A note on families of fragility curves

    International Nuclear Information System (INIS)

    Kaplan, S.; Bier, V.M.; Bley, D.C.

    1989-01-01

    In the quantitative assessment of seismic risk, uncertainty in the fragility of a structural component is usually expressed by putting forth a family of fragility curves, with probability serving as the parameter of the family. Commonly, a lognormal shape is used both for the individual curves and for the expression of uncertainty over the family. A so-called composite single curve can also be drawn and used for purposes of approximation. This composite curve is often regarded as equivalent to the mean curve of the family. The equality seems intuitively reasonable but, according to the authors, has never been proven. The present paper proves this equivalence hypothesis mathematically. Moreover, the authors show that this equivalence hypothesis between fragility curves is itself equivalent to an identity property of the standard normal probability curve. Thus, in the course of proving the fragility curve hypothesis, the authors have also proved a rather obscure, but interesting and perhaps previously unrecognized, property of the standard normal curve.
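    A small numerical check of the equivalence discussed above, under the standard lognormal parameterization (median capacity A_m, aleatory log-standard-deviation beta_R, epistemic log-standard-deviation beta_U), is sketched below; the parameter values are hypothetical, and the composite curve uses beta_C = sqrt(beta_R**2 + beta_U**2).

      import numpy as np
      from scipy.stats import norm

      A_m, beta_R, beta_U = 0.9, 0.3, 0.4   # hypothetical median capacity (g) and log-std devs
      a = np.linspace(0.05, 3.0, 60)        # ground-acceleration grid (g)

      # Composite single curve: one lognormal with beta_C = sqrt(beta_R^2 + beta_U^2).
      composite = norm.cdf(np.log(a / A_m) / np.hypot(beta_R, beta_U))

      # Mean curve: average the family (parameterized by confidence level q) over q.
      q = np.linspace(0.0005, 0.9995, 2000)
      medians = A_m * np.exp(-beta_U * norm.ppf(q))          # lower median at higher confidence
      family = norm.cdf(np.log(a[:, None] / medians) / beta_R)
      mean_curve = family.mean(axis=1)

      print("max |mean - composite| =", float(np.abs(mean_curve - composite).max()))  # ~1e-3 or smaller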

  11. ECM using Edwards curves

    DEFF Research Database (Denmark)

    Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja

    2013-01-01

    -arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large...
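
    For readers unfamiliar with the curve form referred to above: an Edwards curve over a field is x² + y² = 1 + d·x²y², and its addition law is complete for suitable d. The sketch below is an illustrative affine implementation, not the paper's optimized extended-coordinate EECM arithmetic; the modulus, the parameter d and the choice of affine coordinates are all simplifying assumptions.

```python
# Illustrative affine Edwards-curve arithmetic (a simplification, not the
# paper's EECM code). Curve: x^2 + y^2 = 1 + d*x^2*y^2.
p = 2**31 - 1      # hypothetical prime modulus; in ECM one works modulo a composite n,
d = 123456         # and a failing modular inversion there reveals a factor of n.

def edwards_add(P, Q, modulus=p):
    (x1, y1), (x2, y2) = P, Q
    t = d * x1 * x2 * y1 * y2 % modulus
    x3 = (x1 * y2 + y1 * x2) * pow(1 + t, -1, modulus) % modulus
    y3 = (y1 * y2 - x1 * x2) * pow(1 - t, -1, modulus) % modulus
    return (x3, y3)

def scalar_mult(k, P):
    R = (0, 1)                    # neutral element of the Edwards addition law
    while k:
        if k & 1:
            R = edwards_add(R, P)
        P = edwards_add(P, P)
        k >>= 1
    return R
```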

  12. Fission product release assessment for end fitting failure in Candu reactor loaded with CANFLEX-NU fuel bundles

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Dirk Joo; Jeong, Chang Joon; Lee, Kang Moon; Suk, Ho Chun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    Fission product release (FPR) assessment for End Fitting Failure (EFF) in CANDU reactor loaded with CANFLEX-natural uranium (NU) fuel bundles has been performed. The predicted results are compared with those for the reactor loaded with standard 37-element bundles. The total channel I-131 release at the end of transient for EFF accident is calculated to be 380.8 TBq and 602.9 TBq for the CANFLEX bundle and standard bundle channel cases, respectively. They are 4.9% and 7.9% of total inventory, respectively. The lower total releases of the CANFLEX bundle O6 channel are attributed to the lower initial fuel temperatures caused by the lower linear element power of the CANFLEX bundle compared with the standard bundle. 4 refs., 1 fig., 4 tabs. (Author)

  13. On the need for revising healthcare failure mode and effect analysis for assessing potential for patient harm in healthcare processes

    International Nuclear Information System (INIS)

    Abrahamsen, Håkon Bjorheim; Abrahamsen, Eirik Bjorheim; Høyland, Sindre

    2016-01-01

    Healthcare Failure Mode and Effect Analysis is a proactive, systematic method adapted from safety-critical industries increasingly used to assess the potential for patient harm in high-risk healthcare processes. In this paper we review and discuss this method. We point to some weaknesses and finally argue for two adjustments. One adjustment is regarding the way in which risk is evaluated, and the other is to adopt a broader evaluation of barrier performance. Examples are given from prehospital critical care and from the operating room environment within hospitals to illustrate these ideas. - Highlights: • This article discusses the appropriateness of using HFMEA in healthcare processes. • We conclude that HFMEA has an important role to play in such contexts. • We argue for two adjustments in the traditional HFMEA. • One is regarding the way risk is evaluated. • The other is to adopt a broader evaluation of barrier performance.

  14. Fission product release assessment for end fitting failure in Candu reactor loaded with CANFLEX-NU fuel bundles

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Dirk Joo; Jeong, Chang Joon; Lee, Kang Moon; Suk, Ho Chun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    Fission product release (FPR) assessment for End Fitting Failure (EFF) in CANDU reactor loaded with CANFLEX-natural uranium (NU) fuel bundles has been performed. The predicted results are compared with those for the reactor loaded with standard 37-element bundles. The total channel I-131 release at the end of transient for EFF accident is calculated to be 380.8 TBq and 602.9 TBq for the CANFLEX bundle and standard bundle channel cases, respectively. They are 4.9% and 7.9% of total inventory, respectively. The lower total releases of the CANFLEX bundle O6 channel are attributed to the lower initial fuel temperatures caused by the lower linear element power of the CANFLEX bundle compared with the standard bundle. 4 refs., 1 fig., 4 tabs. (Author)

  15. The Role of Device Diagnostic Algorithms in the Assessment and Management of Patients with Systolic Heart Failure: A Review

    Directory of Open Access Journals (Sweden)

    Andrew C. T. Ha

    2011-01-01

    Full Text Available Hospitalization due to heart failure (HF) exacerbation represents a major burden in health care and portends a poor long-term prognosis for patients. As a result, there is considerable interest in developing novel tools and strategies to better detect the onset of volume overload, as HF hospitalizations may be reduced if appropriate interventions can be promptly delivered. One such innovation is the use of device-based diagnostic parameters in HF patients with implantable cardioverter defibrillators (ICD) and/or cardiac resynchronization therapy (CRT) devices. These diagnostic algorithms can effectively monitor and detect changes in patients' HF status, as well as predict one's risk of HF hospitalization. This paper will review the role of these device diagnostic parameters in the assessment and management of HF patients in ambulatory settings. In addition, the integration of these novel algorithms in existing HF disease management models will be discussed.

  16. Australia's pesticide environmental risk assessment failure: the case of diuron and sugarcane.

    Science.gov (United States)

    Holmes, Glen

    2014-11-15

    In November 2012, the Australian Pesticide and Veterinary Medicines Authority (APVMA) concluded a 12 year review of the PSII herbicide diuron. One of the primary concerns raised during the review was the potential impact on aquatic ecosystems, particularly in the catchments draining to the Great Barrier Reef. The environmental risk assessment process used by the APVMA utilised a runoff risk model developed and validated under European farming conditions. However, the farming conditions in the sugarcane regions of the Great Barrier Reef catchments have environmental parameters beyond the currently validated bounds of the model. The use of the model to assess environmental risk in these regions is therefore highly inappropriate, demonstrating the pitfalls of a one size fits all approach. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Point-of-Care Ultrasonography to Assess Portal Vein Pulsatility and the Effect of Inhaled Milrinone and Epoprostenol in Severe Right Ventricular Failure: A Report of 2 Cases.

    Science.gov (United States)

    Tremblay, Jan-Alexis; Beaubien-Souligny, William; Elmi-Sarabi, Mahsa; Desjardins, Georges; Denault, André Y

    2017-10-15

    This article describes 2 patients with severe acute right ventricular failure causing circulatory shock. Portal vein pulsatility assessed by bedside ultrasonography suggested clinically relevant venous congestion. Management included cardiac preload reduction and combined inhalation of milrinone and epoprostenol to reduce right ventricular afterload. Portal vein ultrasonography may be useful in assessing right ventricular function in the acutely ill patient.

  18. Engineering failure assessment methods applied to pressurized components; Bruchmechanische Bewertung druckfuehrender Komponenten mittels ingenieurmaessiger Bewertungsverfahren

    Energy Technology Data Exchange (ETDEWEB)

    Zerbst, U.; Beeck, F.; Scheider, I.; Brocks, W. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Werkstofforschung

    1998-11-01

    Under the roof of SINTAP (Structural Integrity Assessment Procedures for European Industry), a European BRITE-EURAM project, a study is being carried out into the possibility of establishing, on the basis of existing models, a standard European flaw assessment method. The R6 routine and the ETM are important existing examples in this context. The paper presents the two methods, explaining their advantages and shortcomings as well as their common features. Their applicability is demonstrated on two internally pressurized vessels containing a surface crack and a through-wall crack, respectively. Both the R6 routine and the ETM results have been compared with component tests carried out at TWI in the early 1980s and are found to yield acceptably conservative predictions, i.e. they do not underestimate the tolerable load of the components too strongly. (orig./CB)
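
    Both R6 and ETM express the result as a point on a failure assessment diagram (FAD). The snippet below is a minimal sketch of such an assessment using the widely published R6 (Revision 3) Option 1 curve; the material properties, applied stress intensity factor and reference stress are hypothetical values chosen only to illustrate the bookkeeping, not data from the SINTAP vessel tests.

```python
# Illustrative failure assessment diagram check using the widely published
# R6 Option 1 curve Kr = (1 - 0.14*Lr^2)*(0.3 + 0.7*exp(-0.65*Lr^6)).
# All numbers below are hypothetical, not the SINTAP pressure-vessel data.
import numpy as np

def option1_curve(Lr):
    return (1.0 - 0.14 * Lr**2) * (0.3 + 0.7 * np.exp(-0.65 * Lr**6))

sigma_y, sigma_u = 300.0, 450.0                 # MPa, hypothetical steel
Lr_max = (sigma_y + sigma_u) / (2.0 * sigma_y)  # plastic-collapse cut-off

K_I, K_mat = 40.0, 120.0        # applied SIF and toughness, MPa*sqrt(m) (assumed)
sigma_ref = 220.0               # reference stress for the flawed section (assumed)

Lr = sigma_ref / sigma_y        # proximity to plastic collapse
Kr = K_I / K_mat                # proximity to brittle fracture
acceptable = Lr <= Lr_max and Kr <= option1_curve(Lr)
print(f"Lr={Lr:.2f}, Kr={Kr:.2f}, assessment point inside FAD: {acceptable}")
```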

  19. Incorporating cumulative effects into environmental assessments of mariculture: Limitations and failures of current siting methods

    International Nuclear Information System (INIS)

    King, Sarah C.; Pushchak, Ronald

    2008-01-01

    Assessing and evaluating the cumulative impacts of multiple marine aquaculture facilities has proved difficult in environmental assessment. A retrospective review of 23 existing mariculture farms in southwestern New Brunswick was conducted to determine whether cumulative interactions would have justified site approvals. Based on current scientific evidence of cumulative effects, six new criteria were added to a set of far-field impacts and other existing criteria were expanded to include regional and cumulative environmental impacts in Hargrave's [Hargrave BT. A traffic light decision system for marine finfish aquaculture siting. Ocean Coast Manag 2002; 45:215-35.] Traffic Light Decision Support System (DSS) presently used in Canadian aquaculture environmental assessments. Before mitigation, 19 of the 23 sites failed the amended set of criteria and after considering mitigation, 8 sites failed. Site and ecosystem indices yielded varying site acceptability scores; however, many sites would not have been approved if siting decisions had been made within a regional management framework and cumulative impact criteria were considered in the site evaluation process

  20. Risk assessment of Giardia from a full scale MBR sewage treatment plant caused by membrane integrity failure.

    Science.gov (United States)

    Zhang, Yu; Chen, Zhimin; An, Wei; Xiao, Shumin; Yuan, Hongying; Zhang, Dongqing; Yang, Min

    2015-04-01

    Membrane bioreactors (MBR) are highly efficient at intercepting particles and microbes and have become an important technology for wastewater reclamation. However, many pathogens can accumulate in activated sludge due to the long residence time usually adopted in MBR, and thus may pose health risks when membrane integrity problems occur. This study presents data from a survey on the occurrence of water-borne Giardia pathogens in reclaimed water from a full-scale wastewater treatment plant with MBR experiencing membrane integrity failure, and assessed the associated risk for green space irrigation. Due to membrane integrity failure, the MBR effluent turbidity varied between 0.23 and 1.90 NTU over a period of eight months. Though this turbidity level still met reclaimed water quality standards (≤5 NTU), Giardia were detected at concentrations of 0.3 to 95 cysts/10 L, with a close correlation between effluent turbidity and Giardia concentration. All β-giardin gene sequences of Giardia in the WWTP influents were genotyped as Assemblages A and B, both of which are known to infect humans. An exponential dose-response model was applied to assess the risk of infection by Giardia. The risk in the MBR effluent with chlorination was 9.83×10⁻³, higher than the acceptable annual risk of 1.0×10⁻⁴. This study suggested that membrane integrity is very important for keeping a low pathogen level, and multiple barriers are needed to ensure the biological safety of MBR effluent. Copyright © 2015. Published by Elsevier B.V.
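
    As a rough illustration of the exponential dose-response calculation mentioned above (the paper's own exposure assumptions are not reproduced here), the sketch below uses the commonly cited exponential parameter for Giardia together with hypothetical ingestion-volume and exposure-frequency values.

```python
# Minimal sketch (assumed inputs, not the paper's calculation) of an exponential
# dose-response risk estimate for Giardia in irrigation water.
import numpy as np

r = 0.0199                  # commonly cited exponential dose-response parameter for Giardia
conc = 9.5                  # cysts per litre of effluent (upper end of the reported 0.03-9.5/L)
ingestion = 0.001           # litres accidentally ingested per irrigation exposure (assumed)
exposures_per_year = 50     # assumed exposure frequency

dose = conc * ingestion
p_single = 1.0 - np.exp(-r * dose)                      # infection risk per exposure
p_annual = 1.0 - (1.0 - p_single) ** exposures_per_year
print(f"per-exposure risk = {p_single:.2e}, annual risk = {p_annual:.2e}")
```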

  1. Analysis of non simultaneous common mode failures. Application to the reliability assessment of the decay heat removal of the RNR 1500 project

    International Nuclear Information System (INIS)

    Natta, M.; Bloch, M.

    1991-01-01

    Operating experience with the LMFBR PHENIX has shown many cases of failures of identical and redundant components that were close in time but not simultaneous and were due to the same causes, such as a design error, an inappropriate material, corrosion, etc. Since decay heat removal (DHR) must be assured for a long period after shutdown of the reactor, the overall reliability of the DHR system depends strongly on this type of successive common-cause failure, for which the usual β-factor methods are not appropriate since they imply that the several failures are simultaneous. In this communication, two methods are presented. The first was used to assess the reliability of the DHR system of the RNR 1500 project. In this method, the occurrence of successive failures on n identical trains is modelled by a sudden jump of the failure rate from the value λ attributed to the first failure to the value λ' attributed to the (n-1) still available trains. This method leads to a quite natural quantification of the benefit of diversity for highly redundant systems. For the RNR 1500 project, where, in case of the loss of the normal DHR path through the steam generators, the decay heat is removed by four separate sodium loops of 26 MW unit capacity in forced convection, the probabilistic assessment shows that it is necessary to diversify the sodium-sodium heat exchanger in order to meet the upper limit of 10⁻⁷/year for the probability of failure of DHR. A separate assessment for the main sequence leading to DHR loss was performed using a different method in which the successive failures are interpreted as a premature end of life, the lifetimes being directly used as random variables. This Monte Carlo type method, which can be applied to any type of lifetime distribution, leads to results consistent with those obtained with the first one.
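
    A minimal Monte Carlo sketch of the failure-rate-jump idea described above is given below; the number of trains, the rates λ and λ', and the mission time are hypothetical and are not the RNR 1500 figures.

```python
# Minimal Monte Carlo sketch (assumed parameters) of the "failure-rate jump"
# model described above: n redundant trains start with rate lam; once the first
# failure occurs, the remaining trains see an elevated common-cause rate lam_prime.
import numpy as np

rng = np.random.default_rng(1)
n_trains, lam, lam_prime = 4, 1e-4, 1e-2     # per-hour rates, hypothetical
mission_time = 24.0 * 30                     # 30-day decay-heat-removal mission (hours)
n_sim = 50_000

def all_trains_fail():
    t_first = rng.exponential(1.0 / lam, n_trains).min()
    if t_first > mission_time:
        return False
    # Remaining trains fail at the elevated rate after the first failure.
    t_rest = t_first + rng.exponential(1.0 / lam_prime, n_trains - 1)
    return t_rest.max() <= mission_time

p_loss = sum(all_trains_fail() for _ in range(n_sim)) / n_sim
print(f"estimated probability of losing all {n_trains} trains: {p_loss:.2e}")
```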

  2. Echocardiographic assessment of right ventricular function in routine practice: Which parameters are useful to predict one-year outcome in advanced heart failure patients with dilated cardiomyopathy?

    Science.gov (United States)

    Kawata, Takayuki; Daimon, Masao; Kimura, Koichi; Nakao, Tomoko; Lee, Seitetsu L; Hirokawa, Megumi; Kato, Tomoko S; Watanabe, Masafumi; Yatomi, Yutaka; Komuro, Issei

    2017-10-01

    Right ventricular (RV) function has recently gained attention as a prognostic predictor of outcome even in patients who have left-sided heart failure. Since several conventional echocardiographic parameters of RV systolic function have been proposed, our aim was to determine whether any of these parameters (tricuspid annular plane systolic excursion: TAPSE, tissue Doppler derived systolic tricuspid annular motion velocity: S', fractional area change: FAC) are associated with outcome in advanced heart failure patients with dilated cardiomyopathy (DCM). We retrospectively enrolled 68 DCM patients who were New York Heart Association (NYHA) Class III or IV and had a reduced left ventricular (LV) ejection fraction. NYHA functional class IV, plasma brain natriuretic peptide concentration, intravenous inotrope use, left atrial volume index, and FAC were associated with outcome, whereas TAPSE and S' were not. Receiver-operating characteristic curve analysis was used to identify the optimal FAC cut-off value for identifying patients with an event. All rights reserved.

  3. Acute renal failure requiring renal replacement therapy in the intensive care unit: impact on prognostic assessment for shared decision making.

    Science.gov (United States)

    Johnson, Robert F; Gustin, Jillian

    2011-07-01

    A 69-year-old female was receiving renal replacement therapy (RRT) for acute renal failure (ARF) in an intensive care unit (ICU). Consultation was requested from the palliative medicine service to facilitate a shared decision-making process regarding goals of care. Clinician responsibility in shared decision making includes the formulation and expression of a prognostic assessment providing the necessary perspective for a spokesperson to match patient values with treatment options. For this patient, ARF requiring RRT in the ICU was used as a focal point for preparing a prognostic assessment. A prognostic assessment should include the outcomes of most importance to a discussion of goals of care: mortality risk and survivor functional status, in this case including renal recovery. A systematic review of the literature was conducted to document published data regarding these outcomes for adult patients receiving RRT for ARF in the ICU. Forty-one studies met the inclusion criteria. The combined mean values for short-term mortality, long-term mortality, renal-function recovery of short-term survivors, and renal-function recovery of long-term survivors were 51.7%, 68.6%, 82.0%, and 88.4%, respectively. This case example illustrates a process for formulating and expressing a prognostic assessment for an ICU patient requiring RRT for ARF. Data from the literature review provide baseline information that requires adjustment to reflect specific patient circumstances. The nature of the acute primary process, comorbidities, and severity of illness are key modifiers. Finally, the prognostic assessment is expressed during a family meeting using recommended principles of communication.

  4. Novel risk stratification with time course assessment of in-hospital mortality in patients with acute heart failure.

    Directory of Open Access Journals (Sweden)

    Takeshi Yagyu

    Full Text Available Patients with acute heart failure (AHF) show various clinical courses during hospitalization. We aimed to identify time course predictors of in-hospital mortality and to establish a sequentially assessable risk model. We enrolled 1,035 consecutive AHF patients into derivation (n = 597) and validation (n = 438) cohorts. For risk assessments at admission, we utilized Get With the Guidelines-Heart Failure (GWTG-HF) risk scores. We examined significant predictors of in-hospital mortality from 11 variables obtained during hospitalization and developed a risk stratification model using multiple logistic regression analysis. Across both cohorts, 86 patients (8.3%) died during hospitalization. Using backward stepwise selection, we identified five time-course predictors: catecholamine administration, minimum platelet concentration, maximum blood urea nitrogen, total bilirubin, and C-reactive protein levels; and established a time course risk score that could sequentially assess a patient's risk status. The addition of the time course risk score improved the discriminative ability of the GWTG-HF risk score (c-statistics in derivation and validation cohorts: 0.776 to 0.888 [p = 0.002] and 0.806 to 0.902 [p<0.001], respectively). A calibration plot revealed a good relationship between observed and predicted in-hospital mortalities in both cohorts (Hosmer-Lemeshow chi-square statistics: 6.049 [p = 0.642] and 5.993 [p = 0.648], respectively). In each group of initial low-intermediate risk (GWTG-HF risk score <47) and initial high risk (GWTG-HF risk score ≥47), in-hospital mortality was about 6- to 9-fold higher in the high time course risk score group than in the low-intermediate time course risk score group (initial low-intermediate risk group: 20.3% versus 2.2% [p<0.001], initial high risk group: 57.6% versus 8.5% [p<0.001]). A time course assessment related to in-hospital mortality during the hospitalization of AHF patients can clearly categorize a patient's on…

  5. Older driver failures of attention at intersections: using change blindness methods to assess turn decision accuracy.

    Science.gov (United States)

    Caird, Jeff K; Edwards, Christopher J; Creaser, Janet I; Horrey, William J

    2005-01-01

    A modified version of the flicker technique to induce change blindness was used to examine the effects of time constraints on decision-making accuracy at intersections on a total of 62 young (18-25 years), middle-aged (26-64 years), young-old (65-73 years), and old-old (74+ years) drivers. Thirty-six intersection photographs were manipulated so that one object (i.e., pedestrian, vehicle, sign, or traffic control device) in the scene would change when the images were alternated for either 5 or 8 s using the modified flicker method. Young and middle-aged drivers made significantly more correct decisions than did young-old and old-old drivers. Logistic regression analysis of the data indicated that age and/or time were significant predictors of decision performance in 14 of the 36 intersections. Actual or potential applications of this research include driving assessment and crash investigation.

  6. Tsunami-hazard assessment based on subaquatic slope-failure susceptibility and tsunami-inundation modeling

    Science.gov (United States)

    Anselmetti, Flavio; Hilbe, Michael; Strupler, Michael; Baumgartner, Christoph; Bolz, Markus; Braschler, Urs; Eberli, Josef; Liniger, Markus; Scheiwiller, Peter; Strasser, Michael

    2015-04-01

    Due to their smaller dimensions and confined bathymetry, lakes act as model oceans that may be used as analogues for the much larger oceans and their margins. Numerous studies in the perialpine lakes of Central Europe have shown that their shores were repeatedly struck by several-meters-high tsunami waves, which were caused by subaquatic slides usually triggered by earthquake shaking. A profound knowledge of these hazards, their intensities and recurrence rates is needed in order to perform thorough tsunami-hazard assessment for the usually densely populated lake shores. In this context, we present results of a study combining i) basinwide slope-stability analysis of subaquatic sediment-charged slopes with ii) identification of scenarios for subaquatic slides triggered by seismic shaking, iii) forward modeling of resulting tsunami waves and iv) mapping of intensity of onshore inundation in populated areas. Sedimentological, stratigraphical and geotechnical knowledge of the potentially unstable sediment drape on the slopes is required for slope-stability assessment. Together with critical ground accelerations calculated from already failed slopes and paleoseismic recurrence rates, scenarios for subaquatic sediment slides are established. Following a previously used approach, the slides are modeled as a Bingham plastic on a 2D grid. The effect on the water column and wave propagation are simulated using the shallow-water equations (GeoClaw code), which also provide data for tsunami inundation, including flow depth, flow velocity and momentum as key variables. Combining these parameters leads to so called «intensity maps» for flooding that provide a link to the established hazard mapping framework, which so far does not include these phenomena. The current versions of these maps consider a 'worst case' deterministic earthquake scenario, however, similar maps can be calculated using probabilistic earthquake recurrence rates, which are expressed in variable amounts of

  7. Expected dose for the early failure scenario classes in the 2008 performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Helton, J.C.; Hansen, C.W.; Sallaberry, C.J.

    2014-01-01

    Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. In support of this development and an associated license application to the U.S. Nuclear Regulatory Commission (NRC), the DOE completed an extensive performance assessment (PA) for the proposed YM repository in 2008. This presentation describes the determination of expected dose to the reasonably maximally exposed individual (RMEI) specified in the NRC regulations for the YM repository for the early waste package (WP) failure scenario class and the early drip shield (DS) failure scenario class in the 2008 YM PA. The following topics are addressed: (i) properties of the early failure scenario classes and the determination of dose and expected dose the RMEI, (ii) expected dose and uncertainty in expected dose to the RMEI from the early WP failure scenario class, (iii) expected dose and uncertainty in expected dose to the RMEI from the early DS failure scenario class, (iv) expected dose and uncertainty in expected dose to the RMEI from the combined early WP and early DS failure scenario class with and without the inclusion of failures resulting from nominal processes, and (v) uncertainty in the occurrence of early failure scenario classes. The present article is part of a special issue of Reliability Engineering and System Safety devoted to the 2008 YM PA; additional articles in the issue describe other aspects of the 2008 YM PA. - Highlights: • Extensive work has been carried out by the U.S. DOE in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. • Properties of the early failure scenario classes (i.e. early waste package failure and early drip shield failure) in the 2008 YM performance assessment are described. • Determination of dose, expected dose and expected (mean

  8. Sequential Oxygenation Index and Organ Dysfunction Assessment within the First 3 Days of Mechanical Ventilation Predict the Outcome of Adult Patients with Severe Acute Respiratory Failure

    Directory of Open Access Journals (Sweden)

    Hsu-Ching Kao

    2013-01-01

    Full Text Available Objective. To determine early predictors of outcomes of adult patients with severe acute respiratory failure. Method. 100 consecutive adult patients with severe acute respiratory failure were evaluated in this retrospective study. Data including comorbidities, Sequential Organ Failure Assessment (SOFA) score, Acute Physiological Assessment and Chronic Health Evaluation II (APACHE II) score, PaO2, FiO2, PaO2/FiO2, PEEP, mean airway pressure (mPaw), and oxygenation index (OI) on the 1st and the 3rd day of mechanical ventilation, and the change in OI within 3 days, were recorded. The primary outcome was hospital mortality; the secondary outcome measure was ventilator weaning failure. Results. 38 out of 100 (38%) patients died within the study period. 48 patients (48%) failed to wean from the ventilator. Multivariate analysis showed that day 3 OI and SOFA score were independent predictors of hospital mortality. Preexisting cerebrovascular accident (CVA) was the predictor of weaning failure. The Kaplan-Meier method demonstrated that a higher day 3 OI was associated with shorter survival time (log-rank test). Conclusion. Early OI (within 3 days) and SOFA score were predictors of mortality in severe acute respiratory failure. In the future, prospective studies measuring serial OIs in a larger study cohort are required to further consolidate our findings.
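
    For reference, the oxygenation index discussed above is usually computed as OI = (FiO2 [%] × mean airway pressure) / PaO2; the sketch below applies that standard formula to hypothetical ventilator settings (it is not the study's data).

```python
# Quick sketch of the oxygenation index (OI) used above, with hypothetical values.
def oxygenation_index(fio2_fraction: float, mean_airway_pressure_cmH2O: float,
                      pao2_mmHg: float) -> float:
    # OI = (FiO2 [%] x mean airway pressure) / PaO2
    return fio2_fraction * 100.0 * mean_airway_pressure_cmH2O / pao2_mmHg

# Example: FiO2 0.8, mPaw 18 cmH2O, PaO2 75 mmHg -> OI = 19.2
print(oxygenation_index(0.8, 18.0, 75.0))
```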

  9. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy

    International Nuclear Information System (INIS)

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-01-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events. (paper)
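
    For context, the crisp risk priority number that the paper's fuzzy rule-based model replaces is simply the product of severity, occurrence and detection ratings; the sketch below shows that conventional calculation with hypothetical failure modes and scores, not values from the brachytherapy analysis.

```python
# Traditional FMECA risk priority number, RPN = severity x occurrence x detection
# (the paper replaces this crisp product with a fuzzy rule base); scores are hypothetical.
failure_modes = {
    "afterloader source stuck in applicator": {"severity": 9, "occurrence": 3, "detection": 4},
    "wrong dwell-time data entry":            {"severity": 8, "occurrence": 4, "detection": 6},
}
for name, s in failure_modes.items():
    rpn = s["severity"] * s["occurrence"] * s["detection"]
    print(f"{name}: RPN = {rpn}")
```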

  10. 2-D Electrical Resistivity Tomography (ERT) Assessment of Ground Failure in Urban Area

    Science.gov (United States)

    Nordiana, M. M.; Bery, A. A.; Taqiuddin, Z. M.; Jinmin, M.; Abir, I. A.

    2018-04-01

    This study was carried out to assess the foundation defects around an urban area in Selangor, Malaysia using 2-D electrical resistivity tomography (ERT). The affected structure is a three-storey house with severe foundation-related cracks. Six 2-D ERT survey lines with 5 m minimum electrode spacing using a pole-dipole array were executed parallel to the building's wall. Four boreholes were drilled to identify the depth to the competent layer and to verify the 2-D ERT results. The 2-D resistivity inversion model shows that the study area consists of two main zones. The first zone shows low resistivity values of 100-1000 Ωm at 20-70 m depth. The second zone is the granite bedrock, with resistivity of more than 3500 Ωm at depths greater than 70 m. These results were complemented and confirmed by the borehole records. The ERT and borehole records suggest that the clay, sand, saturated zone, highly weathered zone and boulders at foundation depths may lead to ground movements, which affected the stability of the building.

  11. The China Patient-centered Evaluative Assessment of Cardiac Events (China PEACE) retrospective heart failure study design.

    Science.gov (United States)

    Yu, Yuan; Zhang, Hongzhao; Li, Xi; Lu, Yuan; Masoudi, Frederick A; Krumholz, Harlan M; Li, Jing

    2018-05-10

    Heart failure (HF) is a leading cause of hospitalisation in China, which is experiencing a rapid increase in cardiovascular disease prevalence. Yet, little is known about current burden of disease, quality of care and treatment outcomes of HF in China. The objective of this paper is to describe the study methodology, data collection and abstraction, and progress to date of the China Patient-centered Evaluative Assessment of Cardiac Events 5 Retrospective Heart Failure Study (China PEACE 5r-HF). The China PEACE 5r-HF Study will examine a nationally representative sample of more than 10 000 patient records hospitalised for HF in 2015 in China. The study is a retrospective cohort study. Patients have been selected using a two-stage sampling design stratified by economic-geographical regions. We will collect patient characteristics, diagnostic testing, treatments and in-hospital outcomes, including death and complications, and charges of hospitalisation. Data quality will be monitored by a central coordinating centre and will address case ascertainment, data abstraction and data management. As of October 2017, we have sampled 15 538 medical records from 189 hospitals, and have received 15 057 (96.9%) of these for data collection, and completed data abstraction and quality control on 7971. The Central Ethics Committee at the Chinese National Center for Cardiovascular Diseases approved the study. All collaborating hospitals accepted central ethics committee approval with the exception of 15 hospitals, which obtained local approval by internal ethics committees. Findings will be disseminated in future peer-reviewed papers and will serve as a foundation for improving the care for HF in China. NCT02877914. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  12. Mapping basin-wide subaquatic slope failure susceptibility as a tool to assess regional seismic and tsunami hazards

    Science.gov (United States)

    Strasser, Michael; Hilbe, Michael; Anselmetti, Flavio S.

    2010-05-01

    occurred. Comparison of reconstructed critical stability conditions with the known distribution of landslide deposits reveals minimum and maximum threshold conditions for slopes that failed or remained stable, respectively. The resulting correlations reveal good agreements and suggest that the slope stability model generally succeeds in reproducing past events. The basin-wide mapping of subaquatic slope failure susceptibility through time thus can also be considered as a promising paleoseismologic tool that allows quantification of past earthquake ground shaking intensities. Furthermore, it can be used to assess the present-day slope failure susceptibility allowing for identification of location and estimation of size of future, potentially tsunamigenic subaquatic landslides. The new approach presented in our comprehensive lake study and resulting conceptual ideas can be vital to improve our understanding of larger marine slope instabilities and related seismic and oceanic geohazards along formerly glaciated ocean margins and closed basins worldwide.

  13. Environmental risk assessment of low density polyethylene unit using the method of failure mode and effect analysis

    Directory of Open Access Journals (Sweden)

    Salati Parinaz

    2012-01-01

    Full Text Available The ninth olefin plant of Arya Sasol Petrochemical Company (A.S.P.C.) is regarded as the largest gas olefin unit located in the Pars Special Economic Energy Zone (P.S.E.E.Z.). Considering the importance of this petrochemical unit, its environmental assessment is necessary in order to identify and reduce potential hazards. For this purpose, after determining the scope of the study area and identifying and measuring the environmental parameters, an environmental risk assessment of the unit was carried out using Environmental Failure Mode and Effect Analysis (EFMEA). Using this method, sources of environmental risk were identified, rated and prioritized. In addition, the impacts of the environmental aspects arising from the unit's activities, as well as their consequences, were analyzed. The identified impacts were prioritized based on the Risk Priority Number (RPN) and the severity of the consequences imposed on the affected environment. The statistical calculations showed that environmental aspects with a risk priority number higher than 15 carry a high level of risk. Results obtained for the Low Density Polyethylene Unit revealed that the highest risk belongs to the emergency vent system, with a risk priority number of 48; it arises from imperfect performance of the reactor safety system, leading to emissions of ethylene gas, particles and radioactive steam, as well as air and noise pollution. A secondary assessment of the environmental aspects, based on the difference between the calculated RPNs and the activities' risk levels, showed that employing modern methods and risk assessment has remarkably reduced the severity of risk and consequently the damages and losses incurred on the environment.

  14. Effects of cardiac energy efficiency in diastolic heart failure. Assessment with positron emission tomography with 11C-acetate

    International Nuclear Information System (INIS)

    Hasegawa, Shinji; Yamamoto, Kazuhiro; Sakata, Yasushi; Takeda, Yasuharu; Kajimoto, Katsufumi; Kanai, Yasukazu; Hori, Masatsugu; Hatazawa, Jun

    2008-01-01

    Diastolic heart failure (DHF) has become a high social burden, and its major underlying cardiovascular disease is hypertensive heart disease. However, the pathogenesis of DHF remains to be clarified. This study aimed to assess the effects of cardiac energy efficiency in DHF patients. ¹¹C-Acetate positron emission tomography and echocardiography were conducted in 11 DHF Japanese patients and 10 normal volunteers. The myocardial clearance rate of radiolabeled ¹¹C-acetate was measured to calculate the work metabolic index (WMI), an index of cardiac efficiency. The ratio of peak mitral E wave velocity to peak early diastolic septal myocardial velocity (E/e') was calculated to assess left ventricular (LV) filling pressure. The LV mass index was greater and the mean age was higher in the DHF patients than in the normal volunteers. There was no difference in WMI between the two groups. However, WMI varied widely among the DHF patients and was inversely correlated with E/e' (r=-0.699, p=0.017). In contrast, there was no correlation in the normal volunteers. In conclusion, the inefficiency of energy utilization is not a primary cause of diastolic dysfunction or DHF, and cardiac efficiency may not affect diastolic function in normal hearts. However, the energy-wasting state may induce the elevation of LV filling pressure in DHF patients, which was considered to principally result from the progressive diastolic dysfunction. (author)

  15. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 3: Hardware component failure data; Volume 5, Revision 4

    International Nuclear Information System (INIS)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E.

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data

  16. Failure mode taxonomy for assessing the reliability of Field Programmable Gate Array based Instrumentation and Control systems

    International Nuclear Information System (INIS)

    McNelles, Phillip; Zeng, Zhao Chang; Renganathan, Guna; Chirila, Marius; Lu, Lixuan

    2017-01-01

    Highlights: • The use FPGAs in I&C systems in Nuclear Power Plants is an important issue (IAEA). • OECD-NEA published a failure mode taxonomy for software-based digital I&C systems. • This paper extends the OECD-NEA taxonomy to model FPGA-based systems. • FPGA failure modes, failure effects, uncovering methods are categorized/described. • Provides an example of modelling an FPGA-Based RTS/ESFAS using the FPGA taxonomy. - Abstract: Field Programmable Gate Arrays (FPGAs) are a form of programmable digital hardware configured to perform digital logic functions. This configuration (programming) is performed using Hardware Description Language (HDL), making FPGAs a form of HDL Programmed Device (HPD). In the nuclear field, FPGAs have seen use in upgrades and replacements of obsolete Instrumentation and Control (I&C) systems. This paper expands upon previous work that resulted in extensive FPGA failure mode data, to allow for the application of the OECD-NEA failure modes taxonomy. The OECD-NEA taxonomy presented a method to model digital (software-based) I&C systems, based on the hardware and software failure modes, failure uncovering effects and levels of abstraction, using a Reactor Trip System/Engineering Safety Feature Actuation System (RTS/ESFAS) as an example system. To create the FPGA taxonomy, this paper presents an additional “sub-component” level of abstraction, to demonstrate the effect of the FPGA failure modes and failure categories on an FPGA-based system. The proposed FPGA taxonomy is based on the FPGA failure modes, failure categories, failure effects and uncovering situations. The FPGA taxonomy is applied to the RTS/ESFAS test system, to demonstrate the effects of the anticipated FPGA failure modes on a digital I&C system, and to provide a modelling example for this proposed taxonomy.

  17. Preclinical endoscopic training using a part-task simulator: learning curve assessment and determination of threshold score for advancement to clinical endoscopy.

    Science.gov (United States)

    Jirapinyo, Pichamol; Abidi, Wasif M; Aihara, Hiroyuki; Zaki, Theodore; Tsay, Cynthia; Imaeda, Avlin B; Thompson, Christopher C

    2017-10-01

    Preclinical simulator training has the potential to decrease endoscopic procedure time and patient discomfort. This study aims to characterize the learning curve of endoscopic novices in a part-task simulator and propose a threshold score for advancement to initial clinical cases. Twenty novices with no prior endoscopic experience underwent repeated endoscopic simulator sessions using the part-task simulator. Simulator scores were collected; their inverse was averaged and fit to an exponential curve. The incremental improvement after each session was calculated. Plateau was defined as the session after which the incremental improvement in the simulator score model was less than 5%. Additionally, all participants filled out questionnaires regarding the simulator experience after sessions 1, 5, 10, 15, and 20. A visual analog scale and the NASA task load index were used to assess levels of comfort and demand. Twenty novices underwent 400 simulator sessions. Mean simulator scores at sessions 1, 5, 10, 15, and 20 were 78.5 ± 5.95, 176.5 ± 17.7, 275.55 ± 23.56, 347 ± 26.49, and 441.11 ± 38.14. The best-fit exponential model was [time/score] = 26.1 × [session #]^(−0.615); r² = 0.99. This corresponded to an incremental improvement in score of 35% after the first session, 22% after the second, 16% after the third, and so on. Incremental improvement dropped below 5% after the 12th session, corresponding to a predicted score of 265. Simulator training was related to higher comfort maneuvering an endoscope and increased readiness for supervised clinical endoscopy, both plateauing between sessions 10 and 15. Mental demand, physical demand, and frustration levels decreased with increased simulator training. Preclinical training using an endoscopic part-task simulator appears to increase comfort level and decrease the mental and physical demand associated with endoscopy. Based on a rigorous model, we recommend that novices complete a minimum of 12 training sessions.
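
    A short sketch of the plateau calculation implied by the reported fit is given below; it reconstructs the incremental-improvement figures from the published power-law coefficients rather than from the study's raw scores, and the definition of incremental improvement as the relative drop in the time/score ratio is an assumption.

```python
# Sketch (assumed reconstruction, not the study's raw data) of the learning-curve
# analysis described above: the time/score ratio follows y = 26.1 * session**-0.615,
# and the plateau is the session after which the relative improvement in y falls below 5%.
import numpy as np

a, b = 26.1, -0.615
sessions = np.arange(1, 21)
y = a * sessions.astype(float) ** b          # fitted time/score ratio per session

# Relative improvement gained by completing one more session.
improvement = (y[:-1] - y[1:]) / y[:-1]      # ~35%, 22%, 16%, ... as reported
plateau = int(sessions[:-1][improvement < 0.05][0])
print(f"improvement after session 1: {improvement[0]:.0%}")
print(f"incremental improvement drops below 5% after session {plateau}")
```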

  18. Contractibility of curves

    Directory of Open Access Journals (Sweden)

    Janusz Charatonik

    1991-11-01

    Full Text Available Results concerning the contractibility of curves (equivalently: of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.

  19. Visual assessment of brain magnetic resonance imaging detects injury to cognitive regulatory sites in patients with heart failure.

    Science.gov (United States)

    Pan, Alan; Kumar, Rajesh; Macey, Paul M; Fonarow, Gregg C; Harper, Ronald M; Woo, Mary A

    2013-02-01

    Heart failure (HF) patients exhibit depression and executive function impairments that contribute to HF mortality. Using specialized magnetic resonance imaging (MRI) analysis procedures, brain changes appear in areas regulating these functions (mammillary bodies, hippocampi, and frontal cortex). However, specialized MRI procedures are not part of the standard clinical assessment for HF (which is usually a visual evaluation), and it is unclear whether visual MRI examination can detect changes in these structures. Using brain MRI, we visually examined the mammillary bodies and frontal cortex for global tissue changes, and the hippocampi for global and regional tissue changes, in 17 HF and 50 control subjects. Significant global changes emerged in the right mammillary body (HF 1.18 ± 1.13 vs control 0.52 ± 0.74; P = .024), right hippocampus (HF 1.53 ± 0.94 vs control 0.80 ± 0.86; P = .005), and left frontal cortex (HF 1.76 ± 1.03 vs control 1.24 ± 0.77; P = .034). Comparison of the visual method with specialized MRI techniques corroborates right hippocampal and left frontal cortical, but not mammillary body, tissue changes. Visual examination of brain MRI can detect damage in HF in areas regulating depression and executive function, including the right hippocampus and left frontal cortex. Visual MRI assessment in HF may facilitate evaluation of injury to these structures and assessment of the impact of potential treatments for this damage. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Broadening failure rate distributions in PRA uncertainty analyses

    International Nuclear Information System (INIS)

    Martz, H.F.

    1984-01-01

    Several recent nuclear power plant probabilistic risk assessments (PRAs) have utilized broadened Reactor Safety Study (RSS) component failure rate population variability curves to compensate for such things as expert overvaluation bias in the estimates upon which the curves are based. A simple empirical Bayes model with two components of variation is proposed for use in estimating the between-expert variability curve in the presence of such biases. Under certain conditions this curve is a population variability curve. Comparisons are made with the existing method. The popular procedure appears to be generally much more conservative than the empirical Bayes method in removing such biases. In one case the broadened curve based on the popular method is more than two orders of magnitude broader than the empirical Bayes curve. In another case it is found that the maximum justifiable degree of broadening of the RSS curve is to increase α from 5% to 12%, which is significantly less than the 20% value recommended in the popular approach. 15 references, 1 figure, 5 tables

  1. Texas curve margin of safety.

    Science.gov (United States)

    2013-01-01

    This software can be used to assist with the assessment of margin of safety for a horizontal curve. It is intended for use by engineers and technicians responsible for safety analysis or management of rural highway pavement or traffic control devices...

  2. Sepsis patients in the emergency department : stratification using the Clinical Impression Score, Predisposition, Infection, Response and Organ dysfunction score or quick Sequential Organ Failure Assessment score?

    NARCIS (Netherlands)

    Quinten, Vincent M.; van Meurs, Matijs; Wolffensperger, Anna E.; ter Maaten, Jan C.; Ligtenberg, Jack J M

    2017-01-01

    OBJECTIVE: The aim of this study was to compare the stratification of sepsis patients in the emergency department (ED) for ICU admission and mortality using the Predisposition, Infection, Response and Organ dysfunction (PIRO) and quick Sequential Organ Failure Assessment (qSOFA) scores with clinical

  3. Longitudinal microvascularity in achilles tendinopathy (power doppler ultrasound, magnetic resonance imaging time-intensity curves and the Victorian Institute of Sport Assessment-Achilles questionnaire): a pilot study

    International Nuclear Information System (INIS)

    Richards, Paula J.; McCall, Iain W.; Day, Christopher; Belcher, John; Maffulli, Nicola

    2010-01-01

    To evaluate the imaging of the natural history of Achilles tendinopathy microvascularisation in comparison with symptoms, using a validated disease-specific questionnaire [the Victorian Institute of Sport Assessment-Achilles (VISA-A)]. A longitudinal prospective pilot study of nine patients with post-contrast magnetic resonance imaging (MRI), time-intensity curve (TIC) enhancement, ultrasound (US) and power Doppler (PD) evaluation of tendinopathy of the mid-Achilles tendon undergoing conservative management (eccentric exercise) over 1 year. There were five men and four women [mean age 47 (range 30-62) years]. Six asymptomatic tendons with normal US and MRI appearance showed less enhancement than the tibial metaphysis did and showed a flat, constant, but very low rate of enhancement in the bone and Achilles tendon (9-73 arbitrary TIC units). These normal Achilles tendons on imaging showed a constant size throughout the year (mean 4.9 mm). At baseline the TIC enhancement in those with tendinopathy ranged from 90 arbitrary units to 509 arbitrary units. Over time, 11 abnormal Achilles tendons, whose symptoms settled, were associated with a reduction in MRI enhancement mirrored by a reduction in the number of vessels on power Doppler (8.0 to 2.7), with an improvement in morphology and a reduction in tendon size (mean 15-10.6 mm). One tendon did not change its abnormal imaging features, despite improving symptoms. Two patients developed contralateral symptoms and tendinopathy, and one had more abnormal vascularity on power Doppler and higher MRI TIC peaks in the asymptomatic side. In patient with conservatively managed tendinopathy of the mid-Achilles tendon over 1 year there was a reduction of MRI enhancement and number of vessels on power Doppler, followed by morphological improvements and a reduction in size. Vessels per se related to the abnormal morphology and size of the tendon rather than symptoms. Symptoms improve before the Achilles size reduces and the

  4. The sequential organ failure assessment (SOFA) score is an effective triage marker following staggered paracetamol (acetaminophen) overdose.

    Science.gov (United States)

    Craig, D G; Zafar, S; Reid, T W D J; Martin, K G; Davidson, J S; Hayes, P C; Simpson, K J

    2012-06-01

    The sequential organ failure assessment (SOFA) score is an effective triage marker following single time point paracetamol (acetaminophen) overdose, but has not been evaluated following staggered overdoses (multiple supratherapeutic doses over >8 h, resulting in a cumulative dose of >4 g/day). To evaluate the prognostic accuracy of the SOFA score following staggered paracetamol overdose, a time-course analysis was performed of 50 staggered paracetamol overdoses admitted to a tertiary liver centre. Individual timed laboratory samples were correlated with corresponding clinical parameters and the daily SOFA scores were calculated. A total of 39/50 (78%) patients developed hepatic encephalopathy. The area under the SOFA receiver operating characteristic curve for death/liver transplantation was 87.4 (95% CI 73.2-95.7), 94.3 (95% CI 82.5-99.1), and 98.4 (95% CI 84.3-100.0) at 0, 24 and 48 h postadmission, respectively. A low SOFA score following staggered paracetamol overdose is associated with a good prognosis. Both the SOFA and APACHE II scores could improve the triage of high-risk staggered paracetamol overdose patients. © 2012 Blackwell Publishing Ltd.

  5. A risk assessment methodology to evaluate the risk failure of managed aquifer recharge in the Mediterranean Basin

    Science.gov (United States)

    Rodríguez-Escales, Paula; Canelles, Arnau; Sanchez-Vila, Xavier; Folch, Albert; Kurtzman, Daniel; Rossetto, Rudy; Fernández-Escalante, Enrique; Lobo-Ferreira, João-Paulo; Sapiano, Manuel; San-Sebastián, Jon; Schüth, Christoph

    2018-06-01

    Managed aquifer recharge (MAR) can be affected by many risks. Those risks are related to different technical and non-technical aspects of recharge, such as water availability, water quality, legislation and social issues. Many other works have acknowledged risks of this nature theoretically; however, their definition and quantification have not been developed. In this study, risk definition and quantification were performed by means of fault trees and probabilistic risk assessment (PRA). We defined a fault tree with 65 basic events applicable to the operation phase. We then applied this methodology to six different managed aquifer recharge sites located in the Mediterranean Basin (Portugal, Spain, Italy, Malta, and Israel). The probabilities of the basic events were defined by expert criteria, based on the knowledge of the different managers of the facilities. From that, we conclude that at all sites the non-technical aspects were perceived by the experts as being as important as, or even more important than, the technical aspects. Regarding the risk results, we observe that the total risk in three of the six sites was equal to or above 0.90. That would mean that these MAR facilities have a risk of failure equal to or higher than 90% over a period of 2-6 years. The other three sites presented lower risks (75%, 29%, and 18% for Malta, Menashe, and Serchio, respectively).
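
    To make the fault-tree arithmetic concrete, the sketch below combines a few basic-event probabilities with OR and AND gates under an independence assumption; the event names and probabilities are hypothetical and do not correspond to the paper's 65 basic events or its expert-elicited values.

```python
# Minimal sketch of fault-tree gate arithmetic for a probabilistic risk assessment
# (hypothetical events and probabilities, assuming independent basic events).
def p_or(*probs):
    # Intermediate/top event occurs if any input event occurs.
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    # Event occurs only if all input events occur.
    q = 1.0
    for p in probs:
        q *= p
    return q

clogging        = p_or(0.30, 0.15)     # physical OR biological clogging
water_shortage  = p_and(0.40, 0.50)    # drought AND no alternative source
permit_problems = 0.20                 # regulatory/legal basic event

p_failure = p_or(clogging, water_shortage, permit_problems)
print(f"probability of MAR operation failure: {p_failure:.2f}")
```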

  6. Assessment of predictive models for the failure of titanium and ferrous alloys due to hydrogen effects. Report for the period of June 16 to September 15, 1981

    International Nuclear Information System (INIS)

    Archbold, T.F.; Bower, R.B.; Polonis, D.H.

    1982-04-01

    The 1977 version of the Simpson-Puls-Dutton model appears to be the most amenable with respect to utilizing known or readily estimated quantities. The Pardee-Paton model requires extensive calculations involving estimated quantities. Recent observations by Koike and Suzuki on vanadium support the general assumption that crack growth in hydride forming metals is determined by the rate of hydride formation, and their hydrogen atmosphere-displacive transformation model is of potential interest in explaining hydrogen embrittlement in ferrous alloys as well as hydride formers. The discontinuous nature of cracking due to hydrogen embrittlement appears to depend very strongly on localized stress intensities, thereby pointing to the role of microstructure in influencing crack initiation, fracture mode and crack path. The initiation of hydrogen induced failures over relatively short periods of time can be characterized with fair reliability using measurements of the threshold stress intensity. The experimental conditions for determining K/sub Th/ and ΔK/sub Th/ are designed to ensure plane strain conditions in most cases. Plane strain test conditions may be viewed as a conservative basis for predicting delayed failure. The physical configuration of nuclear waste canisters may involve elastic/plastic conditions rather than a state of plane strain, especially with thin-walled vessels. Under these conditions, alternative predictive tests may be considered, including COD and R-curve methods. The double cantilever beam technique employed by Boyer and Spurr on titanium alloys offers advantages for examining hydrogen induced delayed failure over long periods of time. 88 references

  7. Echo and BNP serial assessment in ambulatory heart failure care: Data on loop diuretic use and renal function

    Directory of Open Access Journals (Sweden)

    Frank Lloyd Dini

    2016-12-01

    Full Text Available We compared the follow-up data on loop diuretic use and renal function, as assessed by serum creatinine levels and the estimated glomerular filtration rate (eGFR), of two groups of consecutive ambulatory HF patients: 1) the clinically-guided group, in which management was clinically driven based on the institutional protocol of the HF Unit of the Cardiovascular and Thoracic Department of Pisa (standard of care), and 2) the echo and B-type natriuretic peptide (BNP) guided group (patients conforming to the protocol of the Network Labs Ultrasound (NEBULA) in HF Study Group: Pisa, Perugia, Pavia, Verona, Auckland, and Veruno), in which therapy was delivered according to the serial assessment of BNP and echocardiography. Patients whose follow-up was based on standard of care had a significantly higher prevalence of worsening renal function, which was likely related to higher diuretic dosages, whilst better management of renal function was observed in the echo-BNP-guided group. The data are related to “Echo and natriuretic peptide guided therapy improves outcome and reduces worsening renal function in systolic heart failure: An observational study of 1137 outpatients” (A. Simioniuc, E. Carluccio, S. Ghio, A. Rossi, P. Biagioli, G. Reboldi, G.G. Galeotti, F. Lu, C. Zara, G. Whalley, P.G. Temporelli, F.L. Dini, 2016; K.J. Harjai, H.K. Dinshaw, E. Nunez, M. Shah, H. Thompson, T. Turgut, H.O. Ventura, 1999; A. Ahmed, A. Husain, T.E. Love, G. Gambassi, L.J. Dell׳Italia, G.S. Francis, M. Gheorghiade, R.M. Allman, S. Meleth, R.C. Bourge, 2006 [1–3]).

  8. Echo and BNP serial assessment in ambulatory heart failure care: Data on loop diuretic use and renal function.

    Science.gov (United States)

    Dini, Frank Lloyd; Simioniuc, Anca; Carluccio, Erberto; Ghio, Stefano; Rossi, Andrea; Biagioli, Paolo; Reboldi, Gianpaolo; Galeotti, Gian Giacomo; Lu, Fei; Zara, Cornelia; Whalley, Gillian; Temporelli, Pier Luigi

    2016-12-01

    We compared the follow-up data on loop diuretic use and renal function, as assessed by serum creatinine levels and the estimated glomerular filtration rate (eGFR), of two groups of consecutive ambulatory HF patients: 1) the clinically-guided group, in which management was clinically driven based on the institutional protocol of the HF Unit of the Cardiovascular and Thoracic Department of Pisa (standard of care), and 2) the echo and B-type natriuretic peptide (BNP) guided group (patients conforming to the protocol of the Network Labs Ultrasound (NEBULA) in HF Study Group: Pisa, Perugia, Pavia; Verona, Auckland, and Veruno), in which therapy was delivered according to the serial assessment of BNP and echocardiography. Patients whose follow-up was based on standard of care had a significantly higher prevalence of worsening renal function, which was likely related to higher diuretic dosages, whereas better management of renal function was observed in the echo-BNP-guided group. The data are related to "Echo and natriuretic peptide guided therapy improves outcome and reduces worsening renal function in systolic heart failure: An observational study of 1137 outpatients" (A. Simioniuc, E. Carluccio, S. Ghio, A. Rossi, P. Biagioli, G. Reboldi, G.G. Galeotti, F. Lu, C. Zara, G. Whalley, P.G. Temporelli, F.L. Dini, 2016; K.J. Harjai, H.K. Dinshaw, E. Nunez, M. Shah, H. Thompson, T. Turgut, H.O. Ventura, 1999; A. Ahmed, A. Husain, T.E. Love, G. Gambassi, L.J. Dell'Italia, G.S. Francis, M. Gheorghiade, R.M. Allman, S. Meleth, R.C. Bourge, 2006) [1], [2], [3].

  9. Failure of Passive Immune Transfer in Calves: A Meta-Analysis on the Consequences and Assessment of the Economic Impact.

    Directory of Open Access Journals (Sweden)

    Didier Raboisson

    Full Text Available Low colostrum intake at birth results in the failure of passive transfer (FPT) due to the inadequate ingestion of colostral immunoglobulins (Ig). FPT is associated with an increased risk of mortality and decreased health and longevity. Despite the known management practices associated with low FPT, it remains an important issue in the field. Neither a quantitative analysis of FPT consequences nor an assessment of its total cost are available. To address this point, a meta-analysis on the adjusted associations between FPT and its outcomes was first performed. Then, the total costs of FPT in European systems were calculated using a stochastic method with adjusted values as the input parameters. The adjusted risks (and 95% confidence intervals) for mortality, bovine respiratory disease, diarrhoea and overall morbidity in the case of FPT were 2.12 (1.43-3.13), 1.75 (1.50-2.03), 1.51 (1.05-2.17) and 1.91 (1.63-2.24), respectively. The mean (and 95% prediction interval) total costs per calf with FPT were estimated to be €60 (€10-109) and €80 (€20-139) for dairy and beef, respectively. As a result of the double-step stochastic method, the proposed economic estimation constitutes the first estimate available for FPT. The results are presented in a way that facilitates their use in the field and, with limited effort, combines the cost of each contributor to increase the applicability of the economic assessment to the situations farm-advisors may face. The present economic estimates are also an important tool to evaluate the profitability of measures that aim to improve colostrum intake and FPT prevention.

  10. Design fatigue curve for Hastelloy-X

    International Nuclear Information System (INIS)

    Nishiguchi, Isoharu; Muto, Yasushi; Tsuji, Hirokazu

    1983-12-01

    In the design of components intended for elevated temperature service, such as those of the experimental Very High-Temperature gas-cooled Reactor (VHTR), it is essential to prevent fatigue failure and creep-fatigue failure. The evaluation method which uses design fatigue curves is adopted in the design rules. This report discusses several aspects of these design fatigue curves for Hastelloy-X (-XR), which is considered for use as a heat-resistant alloy in the VHTR. Examination of fatigue data gathered by a literature search, including unpublished data, showed that Brinkman's equation is suitable for the design curve of Hastelloy-X (-XR), where the total strain range Δε_t is used as the independent variable and the fatigue life N_f is transformed into log(log N_f). (author)
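
    As a rough illustration of the transformed fitting described above, the sketch below fits a straight line to log(total strain range) versus log(log N_f) and inverts it to predict fatigue life. It is a minimal sketch assuming hypothetical data points and a linear relation in the transformed coordinates; it is not the report's Brinkman correlation and not actual Hastelloy-X data.

    ```python
    # Minimal sketch: fitting a fatigue curve with the life axis transformed as
    # log10(log10(Nf)), as described above. Data values are hypothetical.
    import numpy as np

    # hypothetical (total strain range [%], cycles to failure) pairs
    strain_range = np.array([2.0, 1.0, 0.6, 0.4, 0.3])
    cycles = np.array([3.0e2, 2.0e3, 1.5e4, 8.0e4, 3.0e5])

    # transform the life axis: Nf -> log10(log10(Nf))
    y = np.log10(np.log10(cycles))
    x = np.log10(strain_range)

    # least-squares straight line in the transformed coordinates
    slope, intercept = np.polyfit(x, y, 1)

    def predicted_life(d_eps):
        """Predicted cycles to failure for a total strain range (hypothetical fit)."""
        return 10.0 ** (10.0 ** (slope * np.log10(d_eps) + intercept))

    print(predicted_life(0.5))  # predicted life at 0.5 % total strain range
    ```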

  11. An analytical model for interactive failures

    International Nuclear Information System (INIS)

    Sun Yong; Ma Lin; Mathew, Joseph; Zhang Sheng

    2006-01-01

    In some systems, failures of certain components can interact with each other and accelerate the failure rates of these components. Such failures are defined as interactive failures. Interactive failure is a prevalent cause of failure associated with complex systems, particularly mechanical systems. The failure risk of an asset will be underestimated if the interactive effect is ignored, so interactive failures need to be considered when failure risk is assessed. However, the literature is silent on previous research work in this field. This paper introduces the concepts of interactive failure, develops an analytical model to analyse this type of failure quantitatively, and verifies the model using case studies and experiments.

  12. Assessment of the risk of failure of high voltage substations due to environmental conditions and pollution on insulators.

    Science.gov (United States)

    Castillo Sierra, Rafael; Oviedo-Trespalacios, Oscar; Candelo, John E; Soto, Jose D

    2015-07-01

    Pollution on electrical insulators is one of the greatest causes of failure of substations subjected to high levels of salinity and environmental pollution. Considering leakage current as the main indicator of pollution on insulators, this paper focuses on establishing the effect of environmental conditions on the risk of failure due to pollution on insulators and on determining the significant change in the magnitude of the pollution on the insulators during dry and humid periods. Hierarchical segmentation analysis was used to establish the effect of environmental conditions on the risk of failure due to pollution on insulators. The Kruskal-Wallis test was used to determine the significant changes in the magnitude of the pollution due to climate periods. An important result was the discovery that leakage current was more common on insulators during dry periods than humid ones. There was also a higher risk of failure due to pollution during dry periods. During the humid period, various temperatures and wind directions produced a small change in the risk of failure. As a technical result, operators of electrical substations can now identify the cause of an increase in risk of failure due to pollution in the area. The research provides a contribution towards understanding the behaviour of the leakage current under conditions similar to those of the Colombian Caribbean coast and how they affect the risk of failure of the substation due to pollution.
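
    The non-parametric comparison mentioned above can be outlined with a standard Kruskal-Wallis test; the leakage-current samples below are invented placeholders, not measurements from the study.

    ```python
    # Minimal sketch of a Kruskal-Wallis test comparing leakage-current magnitudes
    # across climate periods. The sample values are invented placeholders.
    from scipy.stats import kruskal

    leakage_dry_season   = [12.1, 15.4, 13.8, 16.0, 14.2, 15.1]   # mA, placeholders
    leakage_humid_season = [ 9.8, 10.4,  8.9, 11.2, 10.1,  9.5]
    leakage_transition   = [11.0, 12.3, 10.8, 11.9, 12.0, 11.4]

    stat, p_value = kruskal(leakage_dry_season, leakage_humid_season, leakage_transition)
    print(f"H = {stat:.2f}, p = {p_value:.4f}")  # a small p suggests the periods differ
    ```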

  13. Generic component failure data base

    International Nuclear Information System (INIS)

    Eide, S.A.; Calley, M.B.

    1992-01-01

    This report discusses a comprehensive generic component failure data base which has been developed for light water reactor probabilistic risk assessments. The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) was used to generate component failure rates. Using this approach, most of the failure rates are based on actual plant data rather than existing estimates

  14. Considerations for reference pump curves

    International Nuclear Information System (INIS)

    Stockton, N.B.

    1992-01-01

    This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturers' pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point
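
    The two curve-construction techniques named above, a least-squares polynomial over a limited operating range and a cubic spline through the reference points, can be sketched as follows. The flow and differential-pressure values are hypothetical, not plant test data.

    ```python
    # Minimal sketch of two ways to build a reference pump curve from baseline test
    # points: a least-squares polynomial over a limited range and a cubic spline.
    # The data points are hypothetical.
    import numpy as np
    from scipy.interpolate import CubicSpline

    flow = np.array([100.0, 200.0, 300.0, 400.0, 500.0])   # gpm
    dp   = np.array([95.0,  92.0,  86.0,  76.0,  62.0])    # psid

    poly = np.polynomial.Polynomial.fit(flow, dp, deg=2)    # quadratic least-squares fit
    spline = CubicSpline(flow, dp)                          # cubic spline interpolation

    test_flow = 350.0
    print("polynomial reference dP:", poly(test_flow))
    print("spline reference dP:   ", spline(test_flow))
    ```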

  15. JUMPING THE CURVE

    Directory of Open Access Journals (Sweden)

    René Pellissier

    2012-01-01

    Full Text Available This paper explores the notion of jumping the curve, following from Handy's S-curve onto a new curve with new rules, policies and procedures. It claims that the curve does not generally lie in wait but has to be invented by leadership. The focus of this paper is the identification (mathematically and inferentially) of that point in time, known as the cusp in catastrophe theory, when it is time to change - pro-actively, pre-actively or reactively. These three scenarios are addressed separately and discussed in terms of the relevance of each.

  16. Hemodynamic changes during weaning: can we assess and predict cardiac-related weaning failure by transthoracic echocardiography?

    Science.gov (United States)

    Voga, Gorazd

    2010-01-01

    Cardiac-related failure of weaning from mechanical ventilation is an important reason for prolonged mechanical ventilation, intensive care unit treatment, and increased morbidity and mortality. When transthoracic echocardiography (TTE) is routinely performed before a weaning trial, patients at high risk of cardiac-related failure can be detected by low left ventricular (LV) ejection fraction, diastolic dysfunction, and elevated LV filling pressure. During the weaning trial, a further increase of LV filling pressure and progression of diastolic failure can be observed by repeated TTE. Owing to certain limitations concerning patients and methodology, TTE cannot be employed in every patient and invasive hemodynamic monitoring is still mandatory in selected patients with repetitive weaning failure.

  17. Respiratory Failure

    Science.gov (United States)

    Respiratory failure happens when not enough oxygen passes from your lungs into your blood. Your body's organs, ... brain, need oxygen-rich blood to work well. Respiratory failure also can happen if your lungs can' ...

  18. Assessment of cerebral microbleeds by susceptibility-weighted imaging at 3T in patients with end-stage organ failure.

    Science.gov (United States)

    Sparacia, Gianvincenzo; Cannella, Roberto; Lo Re, Vincenzina; Gambino, Angelo; Mamone, Giuseppe; Miraglia, Roberto

    2018-02-17

    Cerebral microbleeds (CMBs) are small rounded lesions representing cerebral hemosiderin deposits surrounded by macrophages that result from previous microhemorrhages. The aim of this study was to review the distribution of cerebral microbleeds in patients with end-stage organ failure and their association with specific end-stage organ failure risk factors. Between August 2015 and June 2017, we evaluated 15 patients, 9 males and 6 females (mean age 65.5 years). The patient population was subdivided into three groups according to the organ failure: (a) chronic kidney failure (n = 8), (b) restrictive cardiomyopathy undergoing heart transplantation (n = 1), and (c) end-stage liver failure undergoing liver transplantation (n = 6). The MR exams were performed on a 3T MR unit and the SWI sequence was used for the detection of CMBs. CMBs were classified as supratentorial lobar, supratentorial non-lobar, or infratentorial. A total of 91 microbleeds were observed in 15 patients. Fifty-nine CMB lesions (64.8%) had a supratentorial lobar distribution, 17 CMB lesions (18.8%) had a supratentorial non-lobar distribution, and the remaining 15 CMB lesions (16.4%) were infratentorial. An overall predominance of supratentorial multiple lobar localizations was found in all types of end-stage organ failure. The presence of CMBs was significantly correlated with age, hypertension, and specific end-stage organ failure risk factors (p failure. The improved detection of CMBs with SWI sequences may contribute to a more accurate identification of patients with cerebral risk factors to prevent complications during or after the organ transplantation.

  19. A Global Assessment of Circulating Prolysyl Oxidase in Nonischemic Patients With Garden-variety Heart Failure With Preserved Ejection Fraction.

    Science.gov (United States)

    Muñoz Calvo, Benjamín; Villa Martínez, Ana; López Orgil, Susana; López Andrés, Natalia; Román García, Feliciano; Víctor Palomares, Virginia; de la Calle de la Villa, Esther; Nadador Patiño, Verónica; Arribas-Gómez, Ignacio

    2018-05-25

    Lysyl oxidase is overexpressed in the myocardium of patients with hypertensive cardiomyopathy. We aimed to explore whether patients with hypertensive-metabolic heart failure with preserved ejection fraction (HM-HFpEF) also have increased concentrations of circulating prolysyl oxidase (cpLOX) and its possible consequences. We quantified cpLOX concentrations in 85 nonischemic patients with stage C, HM-HFpEF, and compared them with those of 51 healthy controls. We also assessed the correlations of cpLOX with myocardial stiffness parameters, collagen turnover products and fibrogenic cytokines, as well as the predictive value of plasma proenzyme levels at 1-year of follow-up. We detected raised cpLOX values and found that they correlated with calculated E/E' ratios and stiffness constants. The subgroup of patients with type I diastolic dysfunction showed a single negative correlation between cpLOX and B-type natriuretic peptide whereas patients with a restrictive diastolic pattern showed a strong correlation between cpLOX and galectin-3. Kaplan-Meier analysis revealed that cpLOX > 52.20 ng/mL slightly increased the risk of a fatal outcome (log-rank = 4.45; P = .034). When Cox regression was used, cpLOX was found to be a significant independent predictor of cardiovascular death or hospitalization due to the decompensation of HM-HFpEF (HR, 1.360; 95%CI, 1.126-1.638; P = .046). Patients with symptomatic HM-HFpEF show high cpLOX serum levels associated with restrictive diastolic filling indices. These levels represent a moderate risk factor for poor clinical outcome. Throughout the natural history of HM-HFpEF, we observed that cpLOX concentrations were initially negatively correlated with B-type natriuretic peptide but positively correlated with galectin-3 as advanced diastolic dysfunction developed. Copyright © 2018. Published by Elsevier España, S.L.U.

  20. Failure assessment and evaluation of critical crack length for a fresh Zr-2 pressure tube of an Indian PHWR

    International Nuclear Information System (INIS)

    Krishnan, Suresh; Bhasin, Vivek; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1996-01-01

    Fracture analysis of Zr-2 pressure tubes having a through-wall axial crack was done using the finite element method. The analysis was done for tubes in the as-received condition. During reactor operation the mechanical properties of Zr-2 undergo changes, so the analysis is valid for pressure tubes of newly commissioned reactors. The main aim of the study was to determine the critical crack length of pressure tubes under normal operating conditions. Elastic-plastic fracture analysis was done for different crack lengths to determine applied J-integral values. The tearing modulus instability concept was used to evaluate the critical crack length. One of the important parameters studied was the effect of crack face pressure, which the leaking fluid exerts on the crack faces/lips of a through-wall axial crack. Its effect was found to be significant for pressure tubes: it increases the applied J-integral values. Approximate analytical solutions which take into account the plasticity ahead of the crack tip are available and widely used, but these formulae do not take into account the crack face pressure. Since, for the present situation, the effect of crack face pressure is significant, detailed finite element analysis was necessary. Detailed 3D finite element analysis gives an insight into the variation of J-integral values over the thickness of the pressure tube. It was found that J values are maximum at the middle layer of the tube. A peak factor on J values was defined and evaluated as the ratio of maximum J to average J across the thickness; the crack opening area for each crack length was also evaluated. The knowledge of crack opening area is useful for leak-before-break studies. The failure assessment was also done using the Central Electricity Generating Board (CEGB) R-6 method considering ductile tearing. The reserve factors (or safety margins) for different crack lengths were evaluated using the R-6 method. (author). 30 refs., 21 figs., 34 tabs
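
    For context, the R-6 route mentioned above assesses a point (Lr, Kr) against a failure assessment curve; the commonly quoted Option 1 form is Kr = (1 - 0.14 Lr^2)[0.3 + 0.7 exp(-0.65 Lr^6)]. The sketch below checks a point against that curve and estimates a proportional reserve factor. The input values and the Lr cut-off are hypothetical, and this is not the analysis reported in the paper.

    ```python
    # Minimal sketch of an R-6 (Option 1) failure assessment diagram check.
    # Input values are hypothetical; the paper's own R-6 evaluation is not reproduced.
    import math

    def r6_option1(lr):
        """Commonly quoted R-6 Option 1 curve Kr = f(Lr)."""
        return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

    def reserve_factor(kr, lr, lr_max=1.8, steps=10000):
        """Proportional reserve factor: scale (Lr, Kr) along a ray from the origin
        until the assessment line (or the Lr cut-off) is reached."""
        for i in range(1, steps):
            f = i / 100.0
            if f * lr > lr_max or f * kr > r6_option1(f * lr):
                return f
        return steps / 100.0

    kr = 0.45   # applied K / material toughness (hypothetical)
    lr = 0.60   # applied load / plastic limit load (hypothetical)
    print("inside FAD:", kr <= r6_option1(lr) and lr <= 1.8)
    print("reserve factor ~", reserve_factor(kr, lr))
    ```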

  1. Heart Failure

    Science.gov (United States)

    Heart failure is a condition in which the heart can't pump enough blood to meet the body's needs. Heart failure does not mean that your heart has stopped ... and shortness of breath Common causes of heart failure are coronary artery disease, high blood pressure and ...

  2. Definition of containment failure

    International Nuclear Information System (INIS)

    Cybulskis, P.

    1982-01-01

    Core meltdown accidents of the types considered in probabilistic risk assessments (PRA's) have been predicted to lead to pressures that will challenge the integrity of containment structures. Review of a number of PRA's indicates considerable variation in the predicted probability of containment failure as a function of pressure. Since the results of PRA's are sensitive to the prediction of the occurrence and the timing of containment failure, better understanding of realistic containment capabilities and a more consistent approach to the definition of containment failure pressures are required. Additionally, since the size and location of the failure can also significantly influence the prediction of reactor accident risk, further understanding of likely failure modes is required. The thresholds and modes of containment failure may not be independent

  3. A curve-fitting approach to estimate the arterial plasma input function for the assessment of glucose metabolic rate and response to treatment.

    NARCIS (Netherlands)

    Vriens, D.; Geus-Oei, L.F. de; Oyen, W.J.G.; Visser, E.P.

    2009-01-01

    For the quantification of dynamic (18)F-FDG PET studies, the arterial plasma time-activity concentration curve (APTAC) needs to be available. This can be obtained using serial sampling of arterial blood or an image-derived input function (IDIF). Arterial sampling is invasive and often not feasible

  4. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice, in which the safety assessment is done separately per individual flood defence section. The main advantage of the new approach is that it will result in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities of the following elements: - Failure mechanisms: a flood defence system can fail due to different failure mechanisms. - Time periods: failure probabilities are first computed for relatively small time scales (…). Besides the safety assessment of flood defense systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
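
    The series-system idea stated above (the system fails if any element fails) can be illustrated with a small sketch. Under an independence assumption the combination reduces to 1 - Π(1 - p_i); this is only a bounding simplification, since Hydra-Ring itself treats correlations between mechanisms, sections and time periods explicitly.

    ```python
    # Minimal sketch of combining element failure probabilities for a flood defence
    # system treated as a series system. Full dependence and independence give the
    # usual lower and upper bounds; Hydra-Ring models correlations explicitly.
    def combine_independent(probs):
        """System failure probability if all element failures are independent."""
        p_survive = 1.0
        for p in probs:
            p_survive *= (1.0 - p)
        return 1.0 - p_survive

    def bounds(probs):
        """Simple series-system bounds: [max(p_i), 1 - prod(1 - p_i)]."""
        return max(probs), combine_independent(probs)

    # hypothetical annual failure probabilities of dike segments / structures
    element_probs = [1e-3, 5e-4, 2e-3, 1e-4]
    print("fully dependent (lower bound):", bounds(element_probs)[0])
    print("independent (upper bound):    ", bounds(element_probs)[1])
    ```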

  5. Tornado-Shaped Curves

    Science.gov (United States)

    Martínez, Sol Sáez; de la Rosa, Félix Martínez; Rojas, Sergio

    2017-01-01

    In Advanced Calculus, our students wonder if it is possible to graphically represent a tornado by means of a three-dimensional curve. In this paper, we show it is possible by providing the parametric equations of such tornado-shaped curves.
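
    As one concrete possibility (illustrative only, not necessarily the parametrization given in the paper), a tornado-like shape can be produced by a conical spiral whose radius grows with height:

    ```python
    # Minimal sketch of one simple tornado-like space curve: a conical spiral whose
    # radius grows with height. These equations are illustrative assumptions.
    import numpy as np

    t = np.linspace(0.0, 20.0 * np.pi, 2000)   # curve parameter
    a, b = 0.05, 0.5                           # radius growth rate and vertical speed
    x = a * t * np.cos(t)
    y = a * t * np.sin(t)
    z = b * t

    print(x[:3], y[:3], z[:3])                 # first few points of the curve
    ```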

  6. Simulating Supernova Light Curves

    International Nuclear Information System (INIS)

    Even, Wesley Paul; Dolence, Joshua C.

    2016-01-01

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth's atmosphere.

  7. Simulating Supernova Light Curves

    Energy Technology Data Exchange (ETDEWEB)

    Even, Wesley Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dolence, Joshua C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-05

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth’s atmosphere.

  8. Image scaling curve generation

    NARCIS (Netherlands)

    2012-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency and the image is then

  9. Image scaling curve generation.

    NARCIS (Netherlands)

    2011-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency and the image is then

  10. Tempo curves considered harmful

    NARCIS (Netherlands)

    Desain, P.; Honing, H.

    1993-01-01

    In the literature of musicology, computer music research and the psychology of music, timing or tempo measurements are mostly presented in the form of continuous curves. The notion of these tempo curves is dangerous, despite its widespread use, because it lulls its users into the false impression

  11. The curve shortening problem

    CERN Document Server

    Chou, Kai-Seng

    2001-01-01

    Although research in curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results. The authors present a complete treatment of the Gage-Hamilton theorem, a clear, detailed exposition of Grayson's convexity theorem, a systematic discussion of invariant solutions, applications to the existence of simple closed geodesics on a surface, and a new, almost convexity theorem for the generalized curve shortening problem. Many questions regarding curve shortening remain outstanding. With its careful exposition and complete guide to the literature, The Curve Shortening Problem provides not only an outstanding starting point for graduate students and new investigations, but a superb reference that presents intriguing new results for those already active in the field.

  12. Failure to Fail

    Directory of Open Access Journals (Sweden)

    Samuel Vriezen

    2013-07-01

    Full Text Available Between pessimism and optimism, Samuel Vriezen attempts to intuit a third way through an assessment of failure and negativity in the consonances and tensions between the prosody of Irish playwright Samuel Beckett and American poet Gertrude Stein.

  13. Acute kidney failure

    Science.gov (United States)

    Alternative names: Renal failure - acute; ARF; Kidney injury - acute. Reference: Devarajan P. Biomarkers for assessment of renal function during acute kidney injury. In: Alpern RJ, Moe OW, Caplan M, ...

  14. Failure mode and effects analysis based risk profile assessment for stereotactic radiosurgery programs at three cancer centers in Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Teixeira, Flavia C., E-mail: flavitiz@gmail.com [CNEN—Comissao Nacional de Energia Nuclear, Rio de Janeiro, RJ 22290-901, Brazil and LCR/UERJ—Laboratorio de Ciencias Radiologicas/Universidade do Estado do Rio de Janeiro, Rio de Janeiro, RJ 20550-013 (Brazil); Almeida, Carlos E. de [LCR/UERJ—Laboratorio de Ciencias Radiologicas/Universidade do Estado do Rio de Janeiro, Rio de Janeiro, RJ 20550-013 (Brazil); Saiful Huq, M. [Department of Radiation Oncology, University of Pittsburgh Cancer Institute and UPMC Cancer Center, Pittsburgh, Pennsylvania 15232 (United States)

    2016-01-15

    Purpose: The goal of this study was to evaluate the safety and quality management program for stereotactic radiosurgery (SRS) treatment processes at three radiotherapy centers in Brazil by using three industrial engineering tools: (1) process mapping, (2) failure modes and effects analysis (FMEA), and (3) fault tree analysis. Methods: The recommendations of Task Group 100 of the American Association of Physicists in Medicine were followed to apply the three tools described above to create a process tree for the SRS procedure at each radiotherapy center, and then FMEA was performed. Failure modes were identified for all process steps and values of the risk priority number (RPN) were calculated from O, S, and D (RPN = O × S × D) values assigned by a professional team responsible for patient care. Results: The treatment planning subprocess presented the highest number of failure modes for all centers. The total numbers of failure modes were 135, 104, and 131 for centers I, II, and III, respectively. The highest RPN value for each center is as follows: center I (204), center II (372), and center III (370). Failure modes with RPN ≥ 100: center I (22), center II (115), and center III (110). Failure modes characterized by S ≥ 7 represented 68% of the failure modes for center III, 62% for center II, and 45% for center I. Failure modes with RPN values ≥ 100 and S ≥ 7, D ≥ 5, and O ≥ 5 were considered as high priority in this study. Conclusions: The results of the present study show that the safety risk profiles for the same stereotactic radiotherapy process are different at three radiotherapy centers in Brazil. Although this is the same treatment process, the present study showed that the risk priority is different and it will lead to implementation of different safety interventions among the centers. Therefore, the current practice of applying universal device-centric QA is not adequate to address all possible failures in clinical processes at different centers.
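
    The RPN bookkeeping described above (RPN = O × S × D, with high-priority modes flagged by RPN ≥ 100 together with S ≥ 7, D ≥ 5 and O ≥ 5) is straightforward to script. The failure-mode entries below are invented placeholders, not the study's data.

    ```python
    # Minimal sketch of FMEA risk priority number (RPN) scoring and prioritisation
    # using the thresholds quoted above. The failure modes listed are placeholders.
    failure_modes = [
        # (description, occurrence O, severity S, detectability D)
        ("wrong CT dataset imported",         3, 9, 6),
        ("target contour misses lesion",      5, 8, 5),
        ("collimator size transcribed wrong", 2, 7, 3),
    ]

    scored = [(name, o, s, d, o * s * d) for (name, o, s, d) in failure_modes]

    high_priority = [row for row in scored
                     if row[4] >= 100 and row[2] >= 7 and row[3] >= 5 and row[1] >= 5]

    for name, o, s, d, rpn in sorted(scored, key=lambda r: r[4], reverse=True):
        flag = "HIGH" if (name, o, s, d, rpn) in high_priority else ""
        print(f"{rpn:4d}  O={o} S={s} D={d}  {name} {flag}")
    ```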

  15. Failure mode and effects analysis based risk profile assessment for stereotactic radiosurgery programs at three cancer centers in Brazil

    International Nuclear Information System (INIS)

    Teixeira, Flavia C.; Almeida, Carlos E. de; Saiful Huq, M.

    2016-01-01

    Purpose: The goal of this study was to evaluate the safety and quality management program for stereotactic radiosurgery (SRS) treatment processes at three radiotherapy centers in Brazil by using three industrial engineering tools: (1) process mapping, (2) failure modes and effects analysis (FMEA), and (3) fault tree analysis. Methods: The recommendations of Task Group 100 of the American Association of Physicists in Medicine were followed to apply the three tools described above to create a process tree for the SRS procedure at each radiotherapy center, and then FMEA was performed. Failure modes were identified for all process steps and values of the risk priority number (RPN) were calculated from O, S, and D (RPN = O × S × D) values assigned by a professional team responsible for patient care. Results: The treatment planning subprocess presented the highest number of failure modes for all centers. The total numbers of failure modes were 135, 104, and 131 for centers I, II, and III, respectively. The highest RPN value for each center is as follows: center I (204), center II (372), and center III (370). Failure modes with RPN ≥ 100: center I (22), center II (115), and center III (110). Failure modes characterized by S ≥ 7 represented 68% of the failure modes for center III, 62% for center II, and 45% for center I. Failure modes with RPN values ≥ 100 and S ≥ 7, D ≥ 5, and O ≥ 5 were considered as high priority in this study. Conclusions: The results of the present study show that the safety risk profiles for the same stereotactic radiotherapy process are different at three radiotherapy centers in Brazil. Although this is the same treatment process, the present study showed that the risk priority is different and it will lead to implementation of different safety interventions among the centers. Therefore, the current practice of applying universal device-centric QA is not adequate to address all possible failures in clinical processes at different centers.

  16. Long-term effects as the cause of failure in electronic components

    International Nuclear Information System (INIS)

    Renz, H.; Kreichgauer, H.

    1989-01-01

    After a brief presentation of the utilisation properties of electronic components, their failure rates are discussed with particular reference to the so-called bath-tub curve. The main emphasis is on the construction and manufacture of integrated circuits and the possible types and causes of failure arising from the individual manufacturing stages (layout faults, internal corrosion, masking and etching errors, leakage currents, inadequate heat removal, etc.). A technical insurance assessment is then provided of the long-term failures associated with technological matters. (orig.) [de
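
    As a generic illustration of the bath-tub curve mentioned above (not tied to the paper's data), the overall hazard rate of a component is often sketched as the sum of a decreasing infant-mortality term, a roughly constant random-failure term, and an increasing wear-out term, for example built from Weibull hazards with illustrative parameters.

    ```python
    # Minimal sketch of a bath-tub hazard curve built from three Weibull hazard terms:
    # infant mortality (shape < 1), random failures (shape = 1), wear-out (shape > 1).
    # All parameter values are illustrative only.
    import numpy as np

    def weibull_hazard(t, shape, scale):
        return (shape / scale) * (t / scale) ** (shape - 1)

    t = np.linspace(0.01, 10.0, 200)          # time in arbitrary units
    h = (weibull_hazard(t, 0.5, 1.0)          # early (infant-mortality) failures
         + weibull_hazard(t, 1.0, 5.0)        # constant random-failure region
         + weibull_hazard(t, 4.0, 8.0))       # wear-out failures

    print("hazard at t = 0.1, 3, 9:", h[np.searchsorted(t, [0.1, 3.0, 9.0])])
    ```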

  17. Comparison of linear-elastic-plastic, and fully plastic failure models in the assessment of piping integrity

    International Nuclear Information System (INIS)

    Streit, R.D.

    1981-01-01

    The failure evaluation of Pressurized Water Reactor (PWR) primary coolant loop pipe is often based on a plastic limit load criterion; i.e., failure occurs when the stress on the pipe section exceeds the material flow stress. In addition, however, the piping system must be safe against crack propagation at stresses less than those leading to plastic instability. In this paper, elastic, elastic-plastic, and fully-plastic failure models are evaluated, and the requirements for piping integrity based on these models are compared. The model yielding the more critical criterion for the given geometry and loading conditions defines the appropriate failure criterion. The pipe geometry and loading used in this study were chosen based on an evaluation of a guillotine break in a PWR primary coolant loop. It is assumed that the piping may contain cracks. Since a deep circumferential crack can lead to a guillotine pipe break without prior leaking, and thus without warning, it is the focus of the failure model comparison study. The hot leg pipe, a 29 in. I.D. by 2.5 in. wall thickness stainless pipe, was modeled in this investigation. Cracks up to 90% through the wall were considered. The loads considered in this evaluation result from the internal pressure, dead weight, and seismic stresses. For the case considered, the internal pressure contributes the most to the failure loading. The maximum moment stress due to the dead weight and seismic moments is simply added to the pressure stress. Thus, with the circumferential crack geometry and uniform pressure stress, the problem is axisymmetric. It is analyzed using NIKE2D, an implicit, finite deformation, finite element code for analyzing two-dimensional elastic-plastic problems. (orig./GL)

  18. Learning Curve? Which One?

    Directory of Open Access Journals (Sweden)

    Paulo Prochno

    2004-07-01

    Full Text Available Learning curves have been studied for a long time. These studies provided strong support to the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999, for a comprehensive review of learning curve studies). But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000), but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is then on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several individual ongoing learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.
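
    The relation implied above, unit cost falling at a decreasing rate as cumulative output grows, is commonly written as the power law c_n = c_1 · n^(-b). The sketch below fits b to hypothetical unit-cost data and reports the corresponding progress ratio; it illustrates the standard formulation, not the plant studied in the paper.

    ```python
    # Minimal sketch of fitting a classic learning (experience) curve c_n = c_1 * n**(-b)
    # to hypothetical unit-cost observations, and reporting the progress ratio 2**(-b).
    import numpy as np

    cumulative_units = np.array([1, 2, 4, 8, 16, 32])
    unit_cost        = np.array([100.0, 83.0, 70.0, 58.0, 49.0, 41.0])  # hypothetical

    slope, log_c1 = np.polyfit(np.log(cumulative_units), np.log(unit_cost), 1)
    b = -slope                                # convention: cost falls, so b > 0

    progress_ratio = 2.0 ** (-b)              # cost multiplier per doubling of output
    print(f"learning exponent b = {b:.3f}, progress ratio = {progress_ratio:.2f}")
    ```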

  19. ASSESSING THE NON-FINANCIAL PREDICTORS OF THE SUCCESS AND FAILURE OF YOUNG FIRMS IN THE NETHERLANDS

    Directory of Open Access Journals (Sweden)

    Philip VERGAUWEN

    2005-01-01

    Full Text Available In this study, the Lussier (1995) success and failure prediction model is improved and tested on a sample of Dutch firms. Besides clearly defining a specific business plan, work experience is added as a variable, and, contrary to previous research, the discrete variables are dealt with appropriately this time. The results of this improved model show that product/service timing, planning, management experience, knowledge of marketing, economic timing, professional advice, and having a business partner are predictors of success and failure for young firms in the Netherlands.

  20. An assessment of BWR [boiling water reactor] Mark III containment challenges, failure modes, and potential improvements in performance

    International Nuclear Information System (INIS)

    Schroeder, J.A.; Pafford, D.J.; Kelly, D.L.; Jones, K.R.; Dallman, F.J.

    1991-01-01

    This report describes risk-significant challenges posed to Mark III containment systems by severe accidents as identified for Grand Gulf. Design similarities and differences between the Mark III plants that are important to containment performance are summarized. The accident sequences responsible for the challenges and the postulated containment failure modes associated with each challenge are identified and described. Improvements are discussed that have the potential either to prevent or delay containment failure, or to mitigate the offsite consequences of a fission product release. For each of these potential improvements, a qualitative analysis is provided. A limited quantitative risk analysis is provided for selected potential improvements. 21 refs., 5 figs., 46 tabs

  1. Assessment of genetic mutations in the XRCC2 coding region by high resolution melting curve analysis and the risk of differentiated thyroid carcinoma in Iran

    Directory of Open Access Journals (Sweden)

    Shima Fayaz

    2012-01-01

    Full Text Available Homologous recombination (HR) is the major pathway for repairing double strand breaks (DSBs) in eukaryotes, and XRCC2 is an essential component of the HR repair machinery. To evaluate the potential role of mutations in gene repair by HR in individuals susceptible to differentiated thyroid carcinoma (DTC), we used high resolution melting (HRM) analysis, a recently introduced method for detecting mutations, to examine the entire XRCC2 coding region in an Iranian population. HRM analysis was used to screen for mutations in three XRCC2 coding regions in 50 patients and 50 controls. There was no variation in the HRM curves obtained from the analysis of exons 1 and 2 in the case and control groups. In exon 3, an Arg188His polymorphism (rs3218536) was detected as a new melting curve group (OR: 1.46; 95%CI: 0.432-4.969; p = 0.38) compared with the normal melting curve. We also found a new Ser150Arg polymorphism in exon 3 of the control group. These findings suggest that genetic variations in the XRCC2 coding region have no potential effects on susceptibility to DTC. However, further studies with larger populations are required to confirm this conclusion.

  2. 2D-speckle tracking right ventricular strain to assess right ventricular systolic function in systolic heart failure. Analysis of the right ventricular free and posterolateral walls.

    Science.gov (United States)

    Mouton, Stéphanie; Ridon, Héléne; Fertin, Marie; Pentiah, Anju Duva; Goémine, Céline; Petyt, Grégory; Lamblin, Nicolas; Coisne, Augustin; Foucher-Hossein, Claude; Montaigne, David; de Groote, Pascal

    2017-10-15

    Right ventricular (RV) systolic function is a powerful prognostic factor in patients with systolic heart failure. The accurate estimation of RV function remains difficult. The aim of the study was to determine the diagnostic accuracy of 2D-speckle tracking RV strain in patients with systolic heart failure, analyzing both the free and posterolateral walls. Seventy-six patients with dilated cardiomyopathy (left ventricular end-diastolic volume ≥ 75 ml/m²) and left ventricular ejection fraction ≤ 45% had an analysis of the RV strain. Feasibility, reproducibility and diagnostic accuracy of RV strain were analyzed and compared to other echocardiographic parameters of RV function. RV dysfunction was defined as an RV ejection fraction ≤ 40% measured by radionuclide angiography. RV strain feasibility was 93.9% for the free wall and 79.8% for the posterolateral wall. RV strain reproducibility was good (intra-observer and inter-observer bias and limits of agreement of 0.16 ± 1.2% [-2.2-2.5] and 0.84 ± 2.4 [-5.5-3.8], respectively). Patients with left heart failure have an RV systolic dysfunction that can be unmasked by advanced echocardiographic imaging: mean RV strain was -21 ± 5.7% in patients without RV dysfunction and -15.8 ± 5.1% in patients with RV dysfunction (p = 0.0001). Mean RV strain showed the highest diagnostic accuracy to predict depressed RVEF (area under the curve (AUC) 0.75) with moderate sensitivity (60.5%) but high specificity (87.5%) using a cutoff value of -16%. RV strain seems to be a promising and more efficient measure than previous RV echocardiographic parameters for the diagnosis of RV systolic dysfunction. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. The crime kuznets curve

    OpenAIRE

    Buonanno, Paolo; Fergusson, Leopoldo; Vargas, Juan Fernando

    2014-01-01

    We document the existence of a Crime Kuznets Curve in US states since the 1970s. As income levels have risen, crime has followed an inverted U-shaped pattern, first increasing and then dropping. The Crime Kuznets Curve is not explained by income inequality. In fact, we show that during the sample period inequality has risen monotonically with income, ruling out the traditional Kuznets Curve. Our finding is robust to adding a large set of controls that are used in the literature to explain the...

  4. A self-controlled case series to assess the effectiveness of beta blockers for heart failure in reducing hospitalisations in the elderly

    Directory of Open Access Journals (Sweden)

    Pratt Nicole L

    2011-07-01

    Full Text Available Abstract. Background: To determine the suitability of using the self-controlled case series design to assess improvements in health outcomes, using the effectiveness of beta blockers for heart failure in reducing hospitalisations as the example. Methods: The Australian Government Department of Veterans' Affairs administrative claims database was used to undertake a self-controlled case series in elderly patients aged 65 years or over to compare the risk of a heart failure hospitalisation during periods of being exposed and unexposed to a beta blocker. Two studies, the first using a one-year period and the second using a four-year period, were undertaken to determine if the estimates varied due to changes in severity of heart failure over time. Results: In the one-year period, 3,450 patients and in the four-year period, 12,682 patients had at least one hospitalisation for heart failure. The one-year period showed a non-significant decrease in hospitalisations for heart failure 4-8 months after starting beta blockers (RR, 0.76; 95% CI, 0.57-1.02) and a significant decrease in the 8-12 months post-initiation of a beta blocker for heart failure (RR, 0.62; 95% CI, 0.39-0.99). For the four-year study there was an increased risk of hospitalisation less than eight months post-initiation and a significant but smaller decrease in the 8-12 month window (RR, 0.90; 95% CI, 0.82-0.98). Conclusions: The results of the one-year observation period are similar to those observed in randomised clinical trials, indicating that the self-controlled case series method can be successfully applied to assess health outcomes. However, the result appears sensitive to the study periods used and further research to understand the appropriate applications of this method in pharmacoepidemiology is still required. The results also illustrate the benefits of extending beta blocker utilisation to the older age group of heart failure patients in which their use is common but the evidence is

  5. Increased crop failure due to climate change: assessing adaptation options using models and socio-economic data for wheat in China

    Energy Technology Data Exchange (ETDEWEB)

    Challinor, Andrew J [Institute for Climate and Atmospheric Science, School of Earth and Environment, University of Leeds, Leeds LS2 9JT (United Kingdom); Simelton, Elisabeth S; Fraser, Evan D G [Sustainability Research Institute, School of Earth and Environment, University of Leeds, Leeds LS2 9JT (United Kingdom); Hemming, Debbie; Collins, Mathew, E-mail: a.j.challinor@leeds.ac.uk [Met Office Hadley Centre, FitzRoy Road, Exeter EX1 3PB (United Kingdom)

    2010-07-15

    Tools for projecting crop productivity under a range of conditions, and assessing adaptation options, are an important part of the endeavour to prioritize investment in adaptation. We present ensemble projections of crop productivity that account for biophysical processes, inherent uncertainty and adaptation, using spring wheat in Northeast China as a case study. A parallel 'vulnerability index' approach uses quantitative socio-economic data to account for autonomous farmer adaptation. The simulations show crop failure rates increasing under climate change, due to increasing extremes of both heat and water stress. Crop failure rates increase with mean temperature, with increases in maximum failure rates being greater than those in median failure rates. The results suggest that significant adaptation is possible through either socio-economic measures such as greater investment, or biophysical measures such as drought or heat tolerance in crops. The results also show that adaptation becomes increasingly necessary as mean temperature and the associated number of extremes rise. The results, and the limitations of this study, also suggest directions for research for linking climate and crop models, socio-economic analyses and crop variety trial data in order to prioritize options such as capacity building, plant breeding and biotechnology.

  6. Contraceptive failure

    DEFF Research Database (Denmark)

    Rasch, Vibeke

    2002-01-01

    Most studies focusing on contraceptive failure in relation to pregnancy have focused on contraceptive failure among women having induced abortions, thereby neglecting those women who, despite contraceptive failure, accept the pregnancy and intend to carry the fetus to term. To get a more complete picture of the problem of contraceptive failure, this study focuses on contraceptive failure among women with diverse pregnancy outcomes. In all, 3520 pregnant women attending Odense University Hospital were included: 373 had induced abortions, 435 had spontaneous abortions, 97 had ectopic pregnancies, and 2614 received antenatal care. The variables studied comprise age, partner relationship, number of births, occupational and economic situation, and contraceptive use. Contraceptive failure, defined as contraceptive use (condom, diaphragm, IUD, oral contraception, or another modern method...

  7. Assessing Preservice Teachers' Mathematics Cognitive Failures as Related to Mathematics Anxiety and Performance in Undergraduate Calculus

    Science.gov (United States)

    Awofala, Adeneye O. A.; Odogwu, Helen N.

    2017-01-01

    The study investigated mathematics cognitive failures as related to mathematics anxiety, gender and performance in calculus among 450 preservice teachers from four public universities in the South West geo-political zone of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  8. Analytical and computational methodology to assess the over pressures generated by a potential catastrophic failure of a cryogenic pressure vessel

    Energy Technology Data Exchange (ETDEWEB)

    Zamora, I.; Fradera, J.; Jaskiewicz, F.; Lopez, D.; Hermosa, B.; Aleman, A.; Izquierdo, J.; Buskop, J.

    2014-07-01

    Idom has participated in the risk evaluation of Safety Important Class (SIC) structures due to over pressures generated by a catastrophic failure of a cryogenic pressure vessel at ITER plant site. The evaluation implements both analytical and computational methodologies achieving consistent and robust results. (Author)

  9. Analytical and computational methodology to assess the over pressures generated by a potential catastrophic failure of a cryogenic pressure vessel

    International Nuclear Information System (INIS)

    Zamora, I.; Fradera, J.; Jaskiewicz, F.; Lopez, D.; Hermosa, B.; Aleman, A.; Izquierdo, J.; Buskop, J.

    2014-01-01

    Idom has participated in the risk evaluation of Safety Important Class (SIC) structures due to over pressures generated by a catastrophic failure of a cryogenic pressure vessel at ITER plant site. The evaluation implements both analytical and computational methodologies achieving consistent and robust results. (Author)

  10. Heart Failure

    OpenAIRE

    McMurray, John; Ponikowski, Piotr

    2011-01-01

    Heart failure occurs in 3% to 4% of adults aged over 65 years, usually as a consequence of coronary artery disease or hypertension, and causes breathlessness, effort intolerance, fluid retention, and increased mortality. The 5-year mortality in people with systolic heart failure ranges from 25% to 75%, often owing to sudden death following ventricular arrhythmia. Risks of cardiovascular events are increased in people with left ventricular systolic dysfunction (LVSD) or heart failure.

  11. Bond yield curve construction

    Directory of Open Access Journals (Sweden)

    Kožul Nataša

    2014-01-01

    Full Text Available In the broadest sense, the yield curve indicates the market's view of the evolution of interest rates over time. However, given that the cost of borrowing is closely linked to creditworthiness (ability to repay), different yield curves will apply to different currencies, market sectors, or even individual issuers. As government borrowing is indicative of interest rate levels available to other market players in a particular country, and considering that bond issuance still remains the dominant form of sovereign debt, this paper describes yield curve construction using bonds. The relationship between zero-coupon yield, par yield and yield to maturity is given and their usage in determining curve discount factors is described. Their usage in deriving forward rates and pricing related derivative instruments is also discussed.
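
    To make the relation between zero-coupon yields, discount factors and forward rates concrete, the sketch below derives discount factors and implied one-year forward rates from a hypothetical set of annually compounded zero rates. It does not implement the full bond-based bootstrap discussed in the paper.

    ```python
    # Minimal sketch: from annually compounded zero-coupon yields, derive discount
    # factors and implied one-year forward rates. The rates are hypothetical.
    zero_rates = {1: 0.020, 2: 0.024, 3: 0.027, 4: 0.029}   # maturity (years) -> zero yield

    discount = {t: 1.0 / (1.0 + r) ** t for t, r in zero_rates.items()}

    forwards = {}
    for t in sorted(zero_rates):
        if t == 1:
            forwards[t] = zero_rates[1]                       # first period = spot rate
        else:
            forwards[t] = discount[t - 1] / discount[t] - 1.0 # forward for year t-1 -> t

    for t in sorted(zero_rates):
        print(f"{t}y  zero={zero_rates[t]:.3%}  df={discount[t]:.4f}  fwd={forwards[t]:.3%}")
    ```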

  12. SRHA calibration curve

    Data.gov (United States)

    U.S. Environmental Protection Agency — a UV calibration curve for SRHA quantitation. This dataset is associated with the following publication: Chang, X., and D. Bouchard. Surfactant-Wrapped Multiwalled...

  13. Bragg Curve Spectroscopy

    International Nuclear Information System (INIS)

    Gruhn, C.R.

    1981-05-01

    An alternative utilization is presented for the gaseous ionization chamber in the detection of energetic heavy ions, which is called Bragg Curve Spectroscopy (BCS). Conceptually, BCS involves using the maximum data available from the Bragg curve of the stopping heavy ion (HI) for purposes of identifying the particle and measuring its energy. A detector has been designed that measures the Bragg curve with high precision. From the Bragg curve the range from the length of the track, the total energy from the integral of the specific ionization over the track, the dE/dx from the specific ionization at the beginning of the track, and the Bragg peak from the maximum of the specific ionization of the HI are determined. This last signal measures the atomic number, Z, of the HI unambiguously
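
    The quantities listed above (range, total energy, entrance dE/dx and Bragg peak) can all be read off a sampled Bragg curve. The sketch below does so for a synthetic specific-ionization profile; it is not tied to the detector described in the report.

    ```python
    # Minimal sketch: extract range, total energy, entrance dE/dx and Bragg-peak height
    # from a sampled Bragg curve (specific ionization vs. depth). The profile is synthetic.
    import numpy as np

    depth = np.linspace(0.0, 5.0, 501)                      # cm along the track
    # synthetic specific-ionization profile that rises to a peak near the end of range
    ionization = np.where(depth < 4.0,
                          1.0 + 0.5 * depth + 3.0 * np.exp(-(depth - 4.0) ** 2 / 0.1),
                          0.0)

    track_range   = depth[np.nonzero(ionization > 0.0)[0].max()]  # end of the track
    total_energy  = np.trapz(ionization, depth)                   # integral of dE/dx
    entrance_dedx = ionization[0]                                 # specific ionization at entry
    bragg_peak    = ionization.max()                              # peak specific ionization

    print(track_range, total_energy, entrance_dedx, bragg_peak)
    ```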

  14. ROBUST DECLINE CURVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sutawanir Darwis

    2012-05-01

    Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations due to the effect of treatments of the well carried out to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting lmRobMM approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. For the case study, we use the oil production data at TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
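
    The paper uses the robust lmRobMM routine; as a generic stand-in, the sketch below fits an exponential decline q(t) = q_i · exp(-D t) to hypothetical daily rates with a soft-L1 loss so that a few treatment-related outliers are down-weighted.

    ```python
    # Minimal sketch of robust decline-curve fitting: an exponential decline model
    # fitted with a soft-L1 loss so outlying rate observations are down-weighted.
    # This is a generic stand-in for the paper's lmRobMM approach; data are hypothetical.
    import numpy as np
    from scipy.optimize import least_squares

    t = np.arange(0, 365, 5, dtype=float)                     # days
    q_true = 800.0 * np.exp(-0.002 * t)                       # underlying decline
    rng = np.random.default_rng(0)
    q_obs = q_true + rng.normal(0.0, 10.0, t.size)
    q_obs[::20] += 150.0                                      # a few outliers (well treatments)

    def residuals(params):
        qi, d = params
        return qi * np.exp(-d * t) - q_obs

    fit = least_squares(residuals, x0=[500.0, 0.001], loss="soft_l1", f_scale=20.0)
    qi_hat, d_hat = fit.x
    print(f"initial rate ~ {qi_hat:.1f}, decline rate ~ {d_hat:.5f} per day")
    ```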

  15. Power Curve Measurements FGW

    DEFF Research Database (Denmark)

    Georgieva Yankova, Ginka; Federici, Paolo

    This report describes power curve measurements carried out on a given turbine in a chosen period. The measurements are carried out in accordance to IEC 61400-12-1 Ed. 1 and FGW Teil 2.

  16. Curves and Abelian varieties

    CERN Document Server

    Alexeev, Valery; Clemens, C Herbert; Beauville, Arnaud

    2008-01-01

    This book is devoted to recent progress in the study of curves and abelian varieties. It discusses both classical aspects of this deep and beautiful subject as well as two important new developments, tropical geometry and the theory of log schemes. In addition to original research articles, this book contains three surveys devoted to singularities of theta divisors, of compactified Jacobians of singular curves, and of "strange duality" among moduli spaces of vector bundles on algebraic varieties.

  17. A curve-fitting approach to estimate the arterial plasma input function for the assessment of glucose metabolic rate and response to treatment.

    Science.gov (United States)

    Vriens, Dennis; de Geus-Oei, Lioe-Fee; Oyen, Wim J G; Visser, Eric P

    2009-12-01

    For the quantification of dynamic (18)F-FDG PET studies, the arterial plasma time-activity concentration curve (APTAC) needs to be available. This can be obtained using serial sampling of arterial blood or an image-derived input function (IDIF). Arterial sampling is invasive and often not feasible in practice; IDIFs are biased because of partial-volume effects and cannot be used when no large arterial blood pool is in the field of view. We propose a mathematic function, consisting of an initial linear rising activity concentration followed by a triexponential decay, to describe the APTAC. This function was fitted to 80 oncologic patients and verified for 40 different oncologic patients by area-under-the-curve (AUC) comparison, Patlak glucose metabolic rate (MR(glc)) estimation, and therapy response monitoring (Delta MR(glc)). The proposed function was compared with the gold standard (serial arterial sampling) and the IDIF. To determine the free parameters of the function, plasma time-activity curves based on arterial samples in 80 patients were fitted after normalization for administered activity (AA) and initial distribution volume (iDV) of (18)F-FDG. The medians of these free parameters were used for the model. In 40 other patients (20 baseline and 20 follow-up dynamic (18)F-FDG PET scans), this model was validated. The population-based curve, individually calibrated by AA and iDV (APTAC(AA/iDV)), by 1 late arterial sample (APTAC(1 sample)), and by the individual IDIF (APTAC(IDIF)), was compared with the gold standard of serial arterial sampling (APTAC(sampled)) using the AUC. Additionally, these 3 methods of APTAC determination were evaluated with Patlak MR(glc) estimation and with Delta MR(glc) for therapy effects using serial sampling as the gold standard. Excellent individual fits to the function were derived with significantly different decay constants (P AUC from APTAC(AA/iDV), APTAC(1 sample), and APTAC(IDIF) with the gold standard (APTAC(sampled)) were 0
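
    The input-function model described above, an initial linear rise followed by a tri-exponential decay, can be written compactly as in the sketch below. The parameter values are placeholders, not the population medians fitted in the study.

    ```python
    # Minimal sketch of the APTAC model form described above: a linear rise up to a
    # peak time, followed by a tri-exponential decay. All parameter values are
    # placeholders, not the population values fitted in the study.
    import numpy as np

    def aptac(t, t_peak, amplitudes, decay_constants):
        """Plasma activity concentration vs. time (arbitrary units)."""
        t = np.asarray(t, dtype=float)
        peak_value = sum(amplitudes)                      # continuity at t = t_peak
        rise  = peak_value * t / t_peak
        decay = sum(a * np.exp(-l * (t - t_peak))
                    for a, l in zip(amplitudes, decay_constants))
        return np.where(t < t_peak, rise, decay)

    t = np.linspace(0.0, 60.0, 601)                       # minutes
    curve = aptac(t, t_peak=0.6,
                  amplitudes=(40.0, 8.0, 3.0),            # placeholder values
                  decay_constants=(4.0, 0.3, 0.01))
    print(curve[:3], curve[-1])
    ```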

  18. Curve of Spee and Its Relationship with Dentoskeletal Morphology

    Directory of Open Access Journals (Sweden)

    Prerna Raje Batham

    2013-01-01

    Conclusion: The curve of Spee is related to various dentoskeletal variables. Thus, the determination of this relationship is useful to assess the feasibility of leveling the curve of Spee by orthodontic treatment.

  19. Failure data specialization in quantitative risk assessments of process plants; Especializacao de dados de falha em analise quantitativa de riscos de plantas de processo

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Antonio C.O. [Bayer S.A., Sao Paulo, SP (Brazil); Melo, P.F. Frutuoso e [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia

    2005-07-01

    The aim of this paper is to show how Bayesian inference in reliability studies can be used to update failure rates in safety analyses, and to examine the impact of this approach on quantitative risk assessments of industrial process plants. The approach provides a structured and auditable way of distinguishing an industrial installation with sound design and maintenance practices from one with a low level of quality in these areas. In general, the evidence on failure rates, and hence the frequencies of occurrence of the scenarios whose risks are considered in the assessment, is taken from generic data banks rather than from the installation under analysis. Using plant-specific data requires a dedicated effort to develop a data bank, that is, a maintenance management system that allows the relevant data to be recorded, such as SAP{sup R} and its PM module. (author)
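    The record does not spell out the inference model, but a common concrete form of such failure-rate specialization is the conjugate gamma-Poisson update sketched below, in which a generic-data prior is combined with plant-specific operating experience; all numbers are hypothetical.

```python
# Sketch of a conjugate gamma-Poisson Bayesian update of a failure rate,
# one common way to specialize generic data with plant-specific evidence
# (illustrative only; the paper does not state its exact model).

def update_failure_rate(alpha0, beta0, n_failures, exposure_hours):
    """Gamma(alpha0, beta0) prior on the failure rate [1/h], Poisson evidence:
    n_failures observed over exposure_hours of operation."""
    alpha_post = alpha0 + n_failures
    beta_post = beta0 + exposure_hours
    mean = alpha_post / beta_post
    return alpha_post, beta_post, mean

# Generic-data prior: mean 1e-5 /h with wide uncertainty (hypothetical numbers)
alpha0, beta0 = 0.5, 5.0e4
# Plant-specific evidence: 2 failures in 200,000 component-hours (hypothetical)
a, b, posterior_mean = update_failure_rate(alpha0, beta0, 2, 2.0e5)
print(f"posterior mean failure rate: {posterior_mean:.2e} per hour")
```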

  20. The influence of the epoxy interlayer on the assessment of failure conditions of push-out test specimens

    Czech Academy of Sciences Publication Activity Database

    Klusák, Jan; Helincks, P.; Seitl, Stanislav; De Corte, W.; Boel, V.; De Schutter, G.

    525-526, č. 1 (2013), s. 61-64 ISSN 1013-9826 R&D Projects: GA ČR GAP108/10/2049; GA ČR(CZ) GAP105/11/1551 Institutional support: RVO:68081723 Keywords : push-out test * generalized fracture mechanics * failure initiation * steel-concrete joint * epoxy adhesive layer Subject RIV: JL - Materials Fatigue, Friction Mechanics

  1. Key indicator tools for shallow slope failure assessment using soil chemical property signatures and soil colour variables.

    Science.gov (United States)

    Othman, Rashidi; Hasni, Shah Irani; Baharuddin, Zainul Mukrim; Hashim, Khairusy Syakirin Has-Yun; Mahamod, Lukman Hakim

    2017-10-01

    Slope failure has become a major concern in Malaysia due to the rapid development and urbanisation in the country. It poses severe threats to highway construction, residential areas, natural resources and tourism activities. The extent of damage resulting from such events can be lessened if a long-term early warning system to predict landslide-prone areas is implemented. Thus, this study aims to characterise the relationship between Oxisol properties and soil colour variables so that they can be used as key indicators to forecast shallow slope failure. The concentration of each soil property in slope soil was evaluated at two different localities, comprising 120 soil samples from stable and unstable slopes located along the North-South Highway (PLUS) and East-West Highway (LPT). Analysis of variance established highly significant differences between soils from stable and unstable slopes. The key indicators of shallow slope failure were a high value of L* (62), low values of c* (20) and h* (66), low concentrations of iron (53 mg kg⁻¹) and aluminium oxide (37 mg kg⁻¹), low soil TOC (0.5%), low CEC (3.6 cmol/kg), slightly acidic soil pH (4.9), a high sand fraction (68%) and a low clay fraction (20%).
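    As an illustration of how the reported signatures could be turned into a screening rule, the sketch below counts how many of the quoted indicator values a soil sample matches; the cut-offs are taken directly from the figures above, but the scoring rule itself is an assumption, not the authors' published model.

```python
# Illustrative screening rule built from the indicator values quoted in the
# record (L*, chroma c*, hue h*, Fe, Al oxide, TOC, CEC, pH, sand, clay).
# The cut-offs and the "count of matching signatures" logic are assumptions.

FAILURE_SIGNATURE = {            # attribute: (comparison, threshold)
    "L_star":      (">=", 62),
    "c_star":      ("<=", 20),
    "h_star":      ("<=", 66),
    "Fe_mg_kg":    ("<=", 53),
    "Al_mg_kg":    ("<=", 37),
    "TOC_pct":     ("<=", 0.5),
    "CEC_cmol_kg": ("<=", 3.6),
    "pH":          ("<=", 4.9),
    "sand_pct":    (">=", 68),
    "clay_pct":    ("<=", 20),
}

def failure_signature_score(sample: dict) -> int:
    """Count how many unstable-slope signatures a soil sample matches."""
    score = 0
    for key, (op, threshold) in FAILURE_SIGNATURE.items():
        value = sample[key]
        if (op == ">=" and value >= threshold) or (op == "<=" and value <= threshold):
            score += 1
    return score

sample = {"L_star": 64, "c_star": 18, "h_star": 60, "Fe_mg_kg": 40,
          "Al_mg_kg": 30, "TOC_pct": 0.4, "CEC_cmol_kg": 3.0, "pH": 4.7,
          "sand_pct": 70, "clay_pct": 15}
print("signatures matched:", failure_signature_score(sample), "of", len(FAILURE_SIGNATURE))
```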

  2. Approximation by planar elastic curves

    DEFF Research Database (Denmark)

    Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge

    2016-01-01

    We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven optimization is then used to find the approximating elastic curve.

  3. Power Curve Measurements REWS

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

    This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed.2 [1], with some deviations mostly regarding uncertainty calculation. Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast. The measurements have been performed using DTU’s measurement equipment.
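    The equivalent wind speed referred to above is commonly computed as a cube-weighted average of the lidar speeds over the rotor area (the rotor equivalent wind speed of IEC 61400-12-1 Ed.2). The sketch below shows one simplified reading of that definition; segment handling, veer correction and the actual profile data in the report may differ.

```python
# Sketch of a rotor-equivalent wind speed (REWS) calculation from a measured
# wind speed profile (simplified; illustrative data only).
import numpy as np

def rews(heights, speeds, hub_height, rotor_diameter, n=2000):
    """Cube-weighted average of the wind speed over the rotor area."""
    R = rotor_diameter / 2.0
    z = np.linspace(hub_height - R, hub_height + R, n)         # heights across the rotor
    chord = 2.0 * np.sqrt(np.maximum(R**2 - (z - hub_height)**2, 0.0))
    v = np.interp(z, heights, speeds)                           # interpolate lidar profile
    v3_mean = np.trapz(v**3 * chord, z) / np.trapz(chord, z)    # area-weighted mean of v^3
    return v3_mean ** (1.0 / 3.0)

# Hypothetical 10-minute mean profile (heights in m, speeds in m/s)
heights = [30, 50, 80, 110, 130]
speeds  = [6.8, 7.4, 8.0, 8.4, 8.6]
print(f"REWS = {rews(heights, speeds, hub_height=80, rotor_diameter=100):.2f} m/s")
```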

  4. Interfaces and strain in InGaAsP/InP heterostructures assessed with dynamical simulations of high-resolution x-ray diffraction curves

    International Nuclear Information System (INIS)

    Vandenberg, J.M.

    1995-01-01

    The interfacial structure of a lattice-matched InGaAs/InP/(100)InP superlattice with a long period of ∼630 Angstrom has been studied by fully dynamical simulations of high-resolution x-ray diffraction curves. This structure exhibits a very symmetrical x-ray pattern enveloping a large number of closely spaced satellite intensities with pronounced maxima and minima. The dynamical analysis shows that the position and shape of these maxima and minima are extremely sensitive to the number N of molecular layers and the atomic spacing d of the InGaAs and InP layers, and in particular to the presence of strained interfacial layers. The structural model of strained interfaces was also applied to an epitaxial lattice-matched 700 Angstrom InP/400 Angstrom InGaAsP/(100)InP heterostructure. 9 refs., 3 figs

  5. Validation of the NASA-TLX Score in Ongoing Assessment of Mental Workload During a Laparoscopic Learning Curve in Bariatric Surgery.

    Science.gov (United States)

    Ruiz-Rabelo, Juan Francisco; Navarro-Rodriguez, Elena; Di-Stasi, Leandro Luigi; Diaz-Jimenez, Nelida; Cabrera-Bermon, Juan; Diaz-Iglesias, Carlos; Gomez-Alvarez, Manuel; Briceño-Delgado, Javier

    2015-12-01

    Fatigue and mental workload are directly associated with high-complexity tasks. In general, difficult tasks produce a higher mental workload, leaving little opportunity to deal with new/unexpected events and increasing the likelihood of performance errors. The laparoscopic Roux-en-Y gastric bypass (LRYGB) learning curve is considered to be one of the most difficult to complete in laparoscopic surgery. We wished to validate the National Aeronautics and Space Administration Task Load Index (NASA-TLX) in LRYGB and identify factors that could provoke a higher mental workload for surgeons during the learning curve. A single surgeon was enrolled to undertake 70 consecutive LRYGB procedures with two internal surgeons mentoring the first 35 cases. Patients were consecutive and ranked from case 35 to case 105 according to the date of the surgical procedure ("case rank"). Self-ratings of satisfaction, performance, and fatigue were measured at the end of surgery using a validated NASA-TLX questionnaire. The procedure was recorded for later viewing by two external evaluators. General data for patients and surgical variables were collected prospectively. A moderate correlation between the NASA-TLX score, BMI, operative time, and volumes of blood drainage was observed. There was no correlation between the NASA-TLX score and duration of hospital stay or time of drain removal. BMI ≥50 kg/m(2), male sex, inexperienced first assistant, and type 2 diabetes mellitus were identified as independent predictive factors of a higher NASA-TLX score. The NASA-TLX is a valid tool to gauge mental workload in LRYGB.
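    For reference, the standard weighted NASA-TLX score combines six subscale ratings (0-100) with weights derived from 15 pairwise comparisons, as sketched below; the study may have used a variant such as the unweighted raw TLX, and all ratings shown are hypothetical.

```python
# Sketch of the standard weighted NASA-TLX computation: six subscale ratings
# (0-100) combined with weights obtained from 15 pairwise comparisons.
# Treat this purely as an illustration of the scoring scheme.

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx(ratings: dict, weights: dict) -> float:
    """Weighted NASA-TLX workload score (0-100)."""
    assert sum(weights.values()) == 15, "weights must come from 15 pairwise comparisons"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

ratings = {"mental": 80, "physical": 40, "temporal": 65,
           "performance": 30, "effort": 75, "frustration": 55}   # hypothetical case
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}      # hypothetical comparisons
print(f"overall workload: {nasa_tlx(ratings, weights):.1f}")
```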

  6. Curved electromagnetic missiles

    International Nuclear Information System (INIS)

    Myers, J.M.; Shen, H.M.; Wu, T.T.

    1989-01-01

    Transient electromagnetic fields can exhibit interesting behavior in the limit of great distances from their sources. In situations of finite total radiated energy, the energy reaching a distant receiver can decrease with distance much more slowly than the usual r⁻². Cases of such slow decrease have been referred to as electromagnetic missiles. All of the wide variety of known missiles propagate in essentially straight lines. A sketch is presented here of a missile that can follow a path that is strongly curved. An example of a curved electromagnetic missile is explicitly constructed and some of its properties are discussed. References to details available elsewhere are given

  7. Algebraic curves and cryptography

    CERN Document Server

    Murty, V Kumar

    2010-01-01

    It is by now a well-known paradigm that public-key cryptosystems can be built using finite Abelian groups and that algebraic geometry provides a supply of such groups through Abelian varieties over finite fields. Of special interest are the Abelian varieties that are Jacobians of algebraic curves. All of the articles in this volume are centered on the theme of point counting and explicit arithmetic on the Jacobians of curves over finite fields. The topics covered include Schoof's ℓ-adic point counting algorithm, the p-adic algorithms of Kedlaya and Denef-Vercauteren, explicit arithmetic on

  8. IGMtransmission: Transmission curve computation

    Science.gov (United States)

    Harrison, Christopher M.; Meiksin, Avery; Stock, David

    2015-04-01

    IGMtransmission is a Java graphical user interface that implements Monte Carlo simulations to compute the corrections to colors of high-redshift galaxies due to intergalactic attenuation based on current models of the Intergalactic Medium. The effects of absorption due to neutral hydrogen are considered, with particular attention to the stochastic effects of Lyman Limit Systems. Attenuation curves are produced, as well as colors for a wide range of filter responses and model galaxy spectra. Photometric filters are included for the Hubble Space Telescope, the Keck telescope, the Mt. Palomar 200-inch, the SUBARU telescope and UKIRT; alternative filter response curves and spectra may be readily uploaded.

  9. Heart failure: a weak link in CHA2 DS2 -VASc.

    Science.gov (United States)

    Friberg, Leif; Lund, Lars H

    2018-02-15

    In atrial fibrillation, stroke risk is assessed by the CHA2DS2-VASc score. Heart failure is included in CHA2DS2-VASc, but the rationale is uncertain. Our objective was to test if heart failure is a risk factor for stroke, independent of other risk factors in CHA2DS2-VASc. We studied 300 839 patients with atrial fibrillation in the Swedish Patient Register 2005-11. Three definitions of heart failure were used in order to assess the robustness of the results. In the main analysis, heart failure was defined by a hospital discharge diagnosis of heart failure as first or second diagnosis and a filled prescription of a diuretic within 3 months before index + 30 days. The second definition counted first or second discharge diagnoses alone, and the third counted any heart failure diagnosis in open or hospital care before index + 30 days. Associations with outcomes were assessed with multivariable Cox analyses. Patients with heart failure were older (80.5 vs. 74.0 years) and had a higher crude stroke rate than the 3.1% observed in patients without heart failure. Adjustment for the cofactors in CHA2DS2-VASc eradicated the difference in stroke risk between patients with and without heart failure (hazard ratio 1.01 with 95% confidence interval 0.96-1.05). The area under the receiver operating characteristic curve for CHA2DS2-VASc was not improved by points for heart failure. A clinical diagnosis of heart failure was not an independent risk factor for stroke in patients with atrial fibrillation, which may have implications for anticoagulation management. © 2018 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.
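    For readers unfamiliar with the score, the sketch below reproduces the standard CHA2DS2-VASc point assignment, in which a clinical diagnosis of heart failure contributes the single point whose justification the study questions; the field names and the example patient are illustrative.

```python
# Sketch of the standard CHA2DS2-VASc point assignment (heart failure = 1 point).

def cha2ds2_vasc(age, female, heart_failure, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    score = 0
    score += 1 if heart_failure else 0                      # C: congestive heart failure
    score += 1 if hypertension else 0                       # H: hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)    # A2 / A: age
    score += 1 if diabetes else 0                           # D: diabetes mellitus
    score += 2 if prior_stroke_tia else 0                   # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0                   # V: vascular disease
    score += 1 if female else 0                             # Sc: sex category (female)
    return score

# Example: 78-year-old woman with heart failure and hypertension -> 5 points
print(cha2ds2_vasc(age=78, female=True, heart_failure=True, hypertension=True,
                   diabetes=False, prior_stroke_tia=False, vascular_disease=False))
```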

  10. A practical approach to assess leg muscle oxygenation during ramp-incremental cycle ergometry in heart failure

    Directory of Open Access Journals (Sweden)

    A.C. Barroco

    2017-10-01

    Heart failure is characterized by the inability of the cardiovascular system to maintain oxygen (O2) delivery (i.e., muscle blood flow in non-hypoxemic patients) to meet O2 demands. The resulting increase in fractional O2 extraction can be non-invasively tracked by the deoxygenated hemoglobin concentration (deoxi-Hb) measured by near-infrared spectroscopy (NIRS). We aimed to establish a simplified approach to extract deoxi-Hb-based indices of impaired muscle O2 delivery during rapidly-incrementing exercise in heart failure. We continuously probed the right vastus lateralis muscle with continuous-wave NIRS during a ramp-incremental cardiopulmonary exercise test in 10 patients (left ventricular ejection fraction <35%) and 10 age-matched healthy males. Deoxi-Hb is reported as % of total response (onset to peak exercise) in relation to work rate. Patients showed lower maximum exercise capacity and O2 uptake-work rate than controls (P<0.05). The deoxi-Hb response profile as a function of work rate was S-shaped in all subjects, i.e., it presented three distinct phases. Increased muscle deoxygenation in patients compared to controls was demonstrated by: i) a steeper mid-exercise deoxi-Hb-work rate slope (2.2±1.3 vs 1.0±0.3% peak/W, respectively; P<0.05), and ii) a late-exercise increase in deoxi-Hb, which contrasted with stable or decreasing deoxi-Hb in all controls. A steeper deoxi-Hb-work rate slope was associated with lower peak work rate in patients (r=–0.73; P=0.01). This simplified approach to deoxi-Hb interpretation might prove useful in clinical settings to quantify impairments in O2 delivery by NIRS during ramp-incremental exercise in individual heart failure patients.

  11. Proposal on How To Conduct a Biopharmaceutical Process Failure Mode and Effect Analysis (FMEA) as a Risk Assessment Tool.

    Science.gov (United States)

    Zimmermann, Hartmut F; Hentschel, Norbert

    2011-01-01

    With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here
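    The scoring scheme described above reduces to a simple calculation: each failure mode receives 1-10 ratings for occurrence, severity and detectability, and their product gives the risk priority number used for ranking. A minimal sketch, with invented example entries, is shown below.

```python
# Minimal sketch of FMEA risk priority number (RPN) bookkeeping:
# RPN = occurrence x severity x detectability, used to rank failure modes.
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    mode: str
    occurrence: int     # 1-10
    severity: int       # 1-10
    detectability: int  # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        return self.occurrence * self.severity * self.detectability

modes = [
    FailureMode("cell culture", "contamination of bioreactor", 3, 9, 5),
    FailureMode("chromatography", "wrong buffer conductivity", 4, 6, 3),
    FailureMode("filling", "incorrect fill volume", 2, 7, 2),
]

for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.step}: {fm.mode}")
```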

  12. Application of ISO22000, failure mode, and effect analysis (FMEA) cause and effect diagrams and pareto in conjunction with HACCP and risk assessment for processing of pastry products.

    Science.gov (United States)

    Varzakas, Theodoros H

    2011-09-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO22000 analysis with HACCP is carried out over pastry processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were identified as the processes with the highest RPN (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a pastry processing industry is considered imperative.

  13. Heart Failure

    Science.gov (United States)

    ... Other diseases. Chronic diseases — such as diabetes, HIV, hyperthyroidism, hypothyroidism, or a buildup of iron (hemochromatosis) or ... transplantation or support with a ventricular assist device. Prevention The key to preventing heart failure is to ...

  14. Learning from uncertain curves

    DEFF Research Database (Denmark)

    Mallasto, Anton; Feragen, Aasa

    2017-01-01

    We introduce a novel framework for statistical analysis of populations of nondegenerate Gaussian processes (GPs), which are natural representations of uncertain curves. This allows inherent variation or uncertainty in function-valued data to be properly incorporated in the population analysis. Us...

  15. Power Curve Measurements

    DEFF Research Database (Denmark)

    Federici, Paolo; Kock, Carsten Weber

    This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance with the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...

  16. Power Curve Measurements, FGW

    DEFF Research Database (Denmark)

    Vesth, Allan; Kock, Carsten Weber

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  17. Power Curve Measurements

    DEFF Research Database (Denmark)

    Federici, Paolo; Vesth, Allan

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  18. Power Curve Measurements

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Gómez Arranz, Paula

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  19. Carbon Lorenz Curves

    NARCIS (Netherlands)

    Groot, L.F.M.|info:eu-repo/dai/nl/073642398

    2008-01-01

    The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries.

  20. The Axial Curve Rotator.

    Science.gov (United States)

    Hunter, Walter M.

    This document contains detailed directions for constructing a device that mechanically produces the three-dimensional shape resulting from the rotation of any algebraic line or curve around either axis on the coordinate plane. The device was developed in response to student difficulty in visualizing, and thus grasping the mathematical principles…

  1. Nacelle lidar power curve

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Wagner, Rozenn

    This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance to the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...

  2. Power curve report

    DEFF Research Database (Denmark)

    Vesth, Allan; Kock, Carsten Weber

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance to Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present...

  3. Textbook Factor Demand Curves.

    Science.gov (United States)

    Davis, Joe C.

    1994-01-01

    Maintains that teachers and textbook graphics follow the same basic pattern in illustrating changes in demand curves when product prices increase. Asserts that the use of computer graphics will enable teachers to be more precise in their graphic presentation of price elasticity. (CFR)

  4. ECM using Edwards curves

    NARCIS (Netherlands)

    Bernstein, D.J.; Birkner, P.; Lange, T.; Peters, C.P.

    2013-01-01

    This paper introduces EECM-MPFQ, a fast implementation of the elliptic-curve method of factoring integers. EECM-MPFQ uses fewer modular multiplications than the well-known GMP-ECM software, takes less time than GMP-ECM, and finds more primes than GMP-ECM. The main improvements above the
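    For illustration, the sketch below shows the unified addition law on an Edwards curve x² + y² = 1 + d·x²·y² over a prime field, the curve shape that EECM exploits; EECM-MPFQ itself uses optimized twisted-Edwards coordinates and large curve catalogues, so this toy code only conveys the basic group law.

```python
# Toy sketch of Edwards-curve arithmetic over a prime field (not EECM-MPFQ).
# Curve: x^2 + y^2 = 1 + d*x^2*y^2 (mod p); (0, 1) is the neutral element.

def edwards_add(P, Q, d, p):
    (x1, y1), (x2, y2) = P, Q
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + y1 * x2) * pow((1 + t) % p, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow((1 - t) % p, -1, p) % p
    return x3, y3

def edwards_scalar_mul(k, P, d, p):
    """Double-and-add using the unified addition law."""
    R = (0, 1)
    while k:
        if k & 1:
            R = edwards_add(R, P, d, p)
        P = edwards_add(P, P, d, p)
        k >>= 1
    return R

# Tiny toy example: p prime, d a non-square mod p, point found by brute force
p, d = 1009, 11
P = next((x, y) for x in range(1, p) for y in range(p)
         if (x * x + y * y - 1 - d * x * x * y * y) % p == 0)
print("P =", P, " 5*P =", edwards_scalar_mul(5, P, d, p))
```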

  5. Power Curve Measurements FGW

    DEFF Research Database (Denmark)

    Federici, Paolo; Kock, Carsten Weber

    The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of the power performance of the turbine.

  6. The five-point Likert scale for dyspnea can properly assess the degree of pulmonary congestion and predict adverse events in heart failure outpatients

    Directory of Open Access Journals (Sweden)

    Cristina K. Weber

    2014-01-01

    OBJECTIVES: Proper assessment of dyspnea is important in patients with heart failure. Our aim was to evaluate the use of the 5-point Likert scale for dyspnea to assess the degree of pulmonary congestion and to determine the prognostic value of this scale for predicting adverse events in heart failure outpatients. METHODS: We undertook a prospective study of outpatients with moderate to severe heart failure. The 5-point Likert scale was applied during regular outpatient visits, along with clinical assessments. Lung ultrasound with ≥15 B-lines and an amino-terminal portion of pro-B-type natriuretic peptide (NT-proBNP) level >1000 pg/mL were used as references for pulmonary congestion. The patients were then assessed every 30 days during follow-up to identify adverse clinical outcomes. RESULTS: We included 58 patients (65.5% male, age 43.5±11 years) with a mean left ventricular ejection fraction of 27±6%. In total, 29.3% of these patients had heart failure of ischemic etiology. Additionally, pulmonary congestion, as diagnosed by lung ultrasound, was present in 58% of patients. A higher degree of dyspnea (3 or 4 points on the 5-point Likert scale) was significantly correlated with a higher number of B-lines (p = 0.016). Patients stratified into Likert = 3-4 were at increased risk of admission compared with those in class 1-2 after adjusting for age, left ventricular ejection fraction, New York Heart Association functional class and levels of NT-proBNP >1000 pg/mL (HR = 4.9, 95% CI 1.33-18.64, p = 0.017). CONCLUSION: In our series, higher baseline scores on the 5-point Likert scale were related to pulmonary congestion and were independently associated with adverse events during follow-up. This simple clinical tool can help to identify patients who are more likely to decompensate and whose treatment should be intensified.

  7. Optofluidic Fabry-Pérot Micro-Cavities Comprising Curved Surfaces for Homogeneous Liquid Refractometry—Design, Simulation, and Experimental Performance Assessment

    Directory of Open Access Journals (Sweden)

    Noha Gaber

    2016-04-01

    In the scope of miniaturized optical sensors for liquid refractometry, this work details the design, numerical simulation, and experimental characterization of a Fabry-Pérot resonator consisting of two deeply-etched silicon cylindrical mirrors with a micro-tube in between holding the liquid analyte under study. The curved surfaces of the tube and the cylindrical mirrors provide three-dimensional light confinement and enable achieving stability for the cavity illuminated by a Gaussian beam input. The resonant optofluidic cavity attains a high quality factor (Q) of over 2800, which is necessary for a sensitive refractometer, not only by providing a sharp interference spectrum peak that enables accurate tracing of peak wavelength shifts, but also by providing steep peak sides, which enables detection of refractive index changes through power level variations when operating at a fixed wavelength. The latter method can achieve refractometry without the need for spectroscopy tools, provided certain criteria explained in the details are met. By experimentally measuring mixtures of acetone and toluene with different ratios, refractive index variations of 0.0005 < Δn < 0.0022 could be detected, with a sensitivity as high as 5500 μW/RIU.

  8. Noninvasive Assessment of Preload Reserve Enhances Risk Stratification of Patients With Heart Failure With Reduced Ejection Fraction.

    Science.gov (United States)

    Matsumoto, Kensuke; Onishi, Akira; Yamada, Hirotsugu; Kusunose, Kenya; Suto, Makiko; Hatani, Yutaka; Matsuzoe, Hiroki; Tatsumi, Kazuhiro; Tanaka, Hidekazu; Hirata, Ken-Ichi

    2018-05-01

    The leg-positive pressure maneuver can safely and noninvasively apply preload stress without an increase in total body fluid volume. The purpose of this study was to determine whether preload stress could be useful for risk stratification of patients with heart failure with reduced ejection fraction. For this study, 120 consecutive patients with heart failure with reduced ejection fraction were prospectively recruited. The stroke work index was estimated as the product of stroke volume index and mean blood pressure, and the E/e' ratio was calculated to estimate ventricular filling pressure. The echocardiographic parameters were obtained both at rest and during leg-positive pressure stress. During the median follow-up period of 20 months, 30 patients developed adverse cardiovascular events. During preload stress, stroke work index increased significantly (from 3280±1371 to 3857±1581 mm Hg·mL/m2; P<0.001) along with minimal changes in ventricular filling pressure (E/e', from 16±10 to 17±9; P<0.05) in patients without cardiovascular events. However, patients with cardiovascular events showed impairment of the Frank-Starling mechanism (stroke work index, from 2863±969 to 2903±1084 mm Hg·mL/m2; P=0.70) and a serious increase in E/e' ratio (from 19±11 to 25±14; P<0.001). Both the patients without contractile reserve and those without diastolic reserve exhibited worse event-free survival than the others (P<0.001). In a Cox proportional-hazards analysis, the changes in stroke work index (hazard ratio: 0.44 per 500 mm Hg·mL/m2 increase; P=0.001) and in E/e' (hazard ratio: 2.58 per 5-U increase; P<0.001) were predictors of cardiovascular events. Contractile reserve and diastolic reserve during leg-positive pressure stress are important determinants of cardiovascular outcomes for patients with heart failure with reduced ejection fraction. © 2018 American Heart Association, Inc.

  9. SU-F-T-250: What Does It Take to Correctly Assess the High Failure Modes of an Advanced Radiotherapy Procedure Such as Stereotactic Body Radiation Therapy?

    International Nuclear Information System (INIS)

    Han, D; Vile, D; Rosu, M; Palta, J

    2016-01-01

    Purpose: Assess the correct implementation of the risk-based methodology of TG 100 to optimize quality management and patient safety procedures for Stereotactic Body Radiation Therapy. Methods: A detailed process map of the SBRT treatment procedure was generated by a team of three physicists with varying clinical experience at our institution to assess the potential high-risk failure modes. The probabilities of occurrence (O), severity (S) and detectability (D) for potential failure modes in each step of the process map were assigned by these individuals independently on a scale from 1 to 10. The risk priority numbers (RPN) were computed and analyzed. The 30 highest potential failure modes from each physicist’s analysis were then compared. Results: The RPN values assessed by the three physicists ranged from 30 to 300. The magnitudes of the RPN values from each physicist were different, and there was no concordance in the highest RPN values recorded by the three physicists independently. The 10 highest RPN values belonged to sub-steps of CT simulation, contouring and delivery in the SBRT process map. For these 10 highest RPN values, at least two physicists, irrespective of their length of experience, had concordance, but no general conclusions emerged. Conclusion: This study clearly shows that the risk-based assessment of a clinical process map requires a great deal of preparation, group discussions, and participation by all stakeholders. One group, albeit physicists, cannot effectively implement the risk-based methodology proposed by TG 100. It should be a team effort in which the physicists can certainly play the leading role. This also corroborates the TG 100 recommendation that risk-based assessment of clinical processes is a multidisciplinary team effort.

  10. SU-F-T-250: What Does It Take to Correctly Assess the High Failure Modes of an Advanced Radiotherapy Procedure Such as Stereotactic Body Radiation Therapy?

    Energy Technology Data Exchange (ETDEWEB)

    Han, D; Vile, D; Rosu, M; Palta, J [Virginia Commonwealth University, Richmond, VA (United States)

    2016-06-15

    Purpose: Assess the correct implementation of the risk-based methodology of TG 100 to optimize quality management and patient safety procedures for Stereotactic Body Radiation Therapy. Methods: A detailed process map of the SBRT treatment procedure was generated by a team of three physicists with varying clinical experience at our institution to assess the potential high-risk failure modes. The probabilities of occurrence (O), severity (S) and detectability (D) for potential failure modes in each step of the process map were assigned by these individuals independently on a scale from 1 to 10. The risk priority numbers (RPN) were computed and analyzed. The 30 highest potential failure modes from each physicist’s analysis were then compared. Results: The RPN values assessed by the three physicists ranged from 30 to 300. The magnitudes of the RPN values from each physicist were different, and there was no concordance in the highest RPN values recorded by the three physicists independently. The 10 highest RPN values belonged to sub-steps of CT simulation, contouring and delivery in the SBRT process map. For these 10 highest RPN values, at least two physicists, irrespective of their length of experience, had concordance, but no general conclusions emerged. Conclusion: This study clearly shows that the risk-based assessment of a clinical process map requires a great deal of preparation, group discussions, and participation by all stakeholders. One group, albeit physicists, cannot effectively implement the risk-based methodology proposed by TG 100. It should be a team effort in which the physicists can certainly play the leading role. This also corroborates the TG 100 recommendation that risk-based assessment of clinical processes is a multidisciplinary team effort.

  11. Nitrile/Buna N Material Failure Assessment for an O-Ring used on the Gaseous Hydrogen Flow Control Valve (FCV) of the Space Shuttle Main Engine

    Science.gov (United States)

    Wingard, Doug

    2006-01-01

    After the rollout of Space Shuttle Discovery in April 2005 in preparation for return-to-flight, there was a failure of the Orbiter (OV-103) helium signature leak test in the gaseous hydrogen (GH2) system. Leakage was attributed to the Flow Control Valve (FCV) in Main Engine 3. The FCV determined to be the source of the leak for OV-103 is designated as LV-58. The nitrile/Buna N rubber O-ring seal was removed from LV-58, and failure analysis indicated radial cracks providing leak paths in one quadrant. Cracks were eventually found in 6 of 9 FCV O-rings among the three Shuttle Orbiters, though none were as severe as those for LV-58, OV-103. Testing by EM10 at MSFC on all 9 FCV O-rings included: laser dimensional, Shore A hardness and properties from a dynamic mechanical analyzer (DMA) and an Instron tensile machine. The following test data were obtained on the cracked quadrant of the LV-58, OV-103 O-ring: (1) the estimated compression set was only 9.5%, compared to none for the rest of the O-ring; (2) Shore A hardness for the O.D. was higher by almost 4 durometer points than for the rest of the O-ring; and (3) DMA data showed that the storage/elastic modulus E was almost 25% lower than for the rest of the O-ring. Of the 8 FCV O-rings tested on an Instron, 4 yielded tensile strengths that were below the MIL spec requirement of 1350 psi, a likely influence of rubber cracking. Comparisons were made between values of modulus determined by DMA (elastic) and Instron (Young's). Each nitrile/Buna N O-ring used in the FCV conforms to the MIL-P-25732C specification. A number of such O-rings taken from shelf storage at MSFC and Kennedy Space Center (KSC) were used to generate a reference curve of DMA glass transition temperature (Tg) vs. shelf storage time ranging from 8 to 26 years. A similar reference curve of TGA onset temperature (of rubber weight loss) vs. shelf storage time was also generated. The DMA and TGA data for the used FCV O-rings were compared to the reference curves.

  12. Safety assessment for electricity generation failure accident of gas cooled nuclear power plant using system dynamics (SD) method

    Energy Technology Data Exchange (ETDEWEB)

    Woo, Tae Ho [Seoul National Univ. (Korea, Republic of). Dept. of Nuclear Engineering

    2013-04-15

    Power production failure occurs upon loss of coolant in nuclear power plants (NPPs), and air ingress is a serious accident in gas cooled NPPs. The quantification in this study is performed by the system dynamics (SD) method, which is based on feedback algorithms. The Vensim software package is used for the simulation, which is carried out by the Monte-Carlo method. Two kinds of considerations, economic and safety properties, are important in NPPs. The result shows the stability of the operation when the power output can be determined. The maximum value of risk is 11.77 in the 43rd year and the minimum value is 0.0 in several years. Thus, the success of the coolant circulation is simulated by the dynamical values. (orig.)

  13. Assessment of myocardial washout of Tc-99m-sestamibi in patients with chronic heart failure. Comparison with normal control

    Energy Technology Data Exchange (ETDEWEB)

    Kumita, Shin-ichiro; Seino, Yoshihiko; Cho, Keiichi; Nakajo, Hidenobu; Toba, Masahiro; Fukushima, Yoshimitsu; Takano, Teruo; Kumazaki, Tatsuo [Nippon Medical School, Tokyo (Japan); Okamoto, Noriake [Bristol-Myers Squibb K.K., Tokyo (Japan)

    2002-06-01

    In contrast to {sup 201}TlCl, {sup 99m}Tc-sestamibi shows very slow myocardial clearance after its initial myocardial uptake. In the present study, myocardial washout of {sup 99m}Tc-sestamibi was calculated in patients with non-ischemic chronic heart failure (CHF) and compared with biventricular parameters obtained from first-pass and ECG-gated myocardial perfusion SPECT data. After administration of {sup 99m}Tc-sestamibi, 25 patients with CHF and 8 normal controls (NC) were examined by ECG-gated myocardial perfusion SPECT and planar data acquisition in the early and delayed (interval of 3 hours) phase. Left ventricular ejection fraction (LVEF, %), peak filling rate (PFR, sec{sup -1}), end-diastolic volume (LVEDV, ml) and end-systolic volume (LVESV, ml) were automatically calculated from the ECG-gated SPECT data. Myocardial washout rates over 3 hours were calculated from the early and delayed planar images. Myocardial washout rates in the CHF group (39.6{+-}5.2%) were significantly higher than those in the NC group (31.2{+-}5.5%, p<0.01). The myocardial washout rates for the 33 subjects showed significant correlations with LVEF (r=-0.61, p<0.001), PFR (r=-0.47, p<0.01), LVEDV (r=0.45, p<0.01) and LVESV (r=0.48, p<0.01). The myocardial washout rate of {sup 99m}Tc-sestamibi is considered to be a novel marker for the diagnosis of myocardial damage in patients with chronic heart failure. (author)

  14. Global experience curves for wind farms

    International Nuclear Information System (INIS)

    Junginger, M.; Faaij, A.; Turkenburg, W.C.

    2005-01-01

    In order to forecast the technological development and cost of wind turbines and the production costs of wind electricity, frequent use is made of the so-called experience curve concept. Experience curves of wind turbines are generally based on data describing the development of national markets, which cause a number of problems when applied for global assessments. To analyze global wind energy price development more adequately, we compose a global experience curve. First, underlying factors for past and potential future price reductions of wind turbines are analyzed. Also possible implications and pitfalls when applying the experience curve methodology are assessed. Second, we present and discuss a new approach of establishing a global experience curve and thus a global progress ratio for the investment cost of wind farms. Results show that global progress ratios for wind farms may lie between 77% and 85% (with an average of 81%), which is significantly more optimistic than progress ratios applied in most current scenario studies and integrated assessment models. While the findings are based on a limited amount of data, they may indicate faster price reduction opportunities than so far assumed. With this global experience curve we aim to improve the reliability of describing the speed with which global costs of wind power may decline
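    The progress ratios quoted above come from the standard experience-curve relation C(x) = C0·(x/x0)^b, where the progress ratio 2^b is the relative cost level reached after each doubling of cumulative capacity. The sketch below fits b by ordinary least squares in log-log space; the capacity and cost figures are invented placeholders, not the study's data.

```python
# Sketch of fitting an experience curve C(x) = C0 * (x / x0)**b and deriving
# the progress ratio PR = 2**b. All data points below are hypothetical.
import numpy as np

cum_capacity = np.array([100, 200, 400, 800, 1600, 3200], float)   # MW (hypothetical)
unit_cost    = np.array([1300, 1080, 900, 760, 640, 540], float)   # EUR/kW (hypothetical)

# Fit log(C) = log(C0) + b * log(x) by least squares
b, logC0 = np.polyfit(np.log(cum_capacity), np.log(unit_cost), 1)
progress_ratio = 2.0 ** b
print(f"experience exponent b = {b:.3f}")
print(f"progress ratio = {progress_ratio:.1%} (learning rate = {1 - progress_ratio:.1%})")
```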

  15. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  16. Carbon Lorenz Curves

    Energy Technology Data Exchange (ETDEWEB)

    Groot, L. [Utrecht University, Utrecht School of Economics, Janskerkhof 12, 3512 BL Utrecht (Netherlands)

    2008-11-15

    The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries. These tools allow policy-makers and the general public to grasp at a single glance the impact of conventional distribution rules such as equal caps or grandfathering, or more sophisticated ones, on the distribution of greenhouse gas emissions. Second, using the Samuelson rule for the optimal provision of a public good, the Pareto-optimal distribution of carbon emissions is compared with the distribution that follows if countries follow Nash-Cournot abatement strategies. It is shown that the Pareto-optimal distribution under the Samuelson rule can be approximated by the equal cap division, represented by the diagonal in the Lorenz curve diagram.
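    The computation behind such a carbon Lorenz curve is the same as for income: sort countries by per-capita emissions, accumulate population and emission shares, and derive a Gini index from the area under the curve. A minimal sketch with invented country figures follows.

```python
# Sketch of a population-weighted Lorenz curve and Gini index for per-capita
# emissions. Country figures below are invented placeholders.
import numpy as np

population = np.array([50, 200, 80, 1300, 330], float)    # millions (hypothetical)
emissions  = np.array([40, 100, 300, 9000, 5000], float)  # Mt CO2 (hypothetical)

order = np.argsort(emissions / population)                 # lowest per-capita emitters first
pop_share = np.cumsum(population[order]) / population.sum()
emi_share = np.cumsum(emissions[order]) / emissions.sum()

# Prepend the origin so the curve starts at (0, 0)
x = np.concatenate(([0.0], pop_share))
y = np.concatenate(([0.0], emi_share))

gini = 1.0 - np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]))   # trapezoid rule
print("Lorenz curve points:", list(zip(np.round(x, 3), np.round(y, 3))))
print(f"Gini index of per-capita emissions: {gini:.3f}")
```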

  17. Assessment of Lymph Nodes and Prostate Status Using Early Dynamic Curves with (18)F-Choline PET/CT in Prostate Cancer.

    Science.gov (United States)

    Mathieu, Cédric; Ferrer, Ludovic; Carlier, Thomas; Colombié, Mathilde; Rusu, Daniela; Kraeber-Bodéré, Françoise; Campion, Loic; Rousseau, Caroline

    2015-01-01

    Dynamic image acquisition with (18)F-Choline [fluorocholine (FCH)] PET/CT in prostate cancer is mostly used to overcome bladder repletion, which could obstruct the loco-regional analysis. The aim of our study was to analyze early dynamic FCH acquisitions to define pelvic lymph node or prostate pathological status. Retrospective analysis was performed on 39 patients for initial staging (n = 18) or after initial treatment (n = 21). Patients underwent 10-min dynamic acquisitions centered on the pelvis, after injection of 3-4 MBq/kg of FCH. Whole-body images were acquired about 1 h after injection using a PET/CT GE Discovery LS (GE-LS) or Siemens Biograph mCT (mCT). Maximum and mean SUV according to time were measured on nodal and prostatic lesions. SUVmean was corrected for partial volume effect (PVEC) with suitable recovery coefficients. The status of each lesion was based on histological results or patient follow-up (>6 months). A Mann-Whitney test and ANOVA were used to compare means, and receiver operating characteristic (ROC) curve analysis was performed. The median PSA was 8.46 ng/mL and the median Gleason score was 3 + 4. Ninety-two lesions (43 lymph nodes and 49 prostate lesions) were analyzed, including 63 malignant lesions. In early dynamic acquisitions, the maximum and mean SUV were significantly higher on mCT and GE-LS, respectively, in malignant versus benign lesions. Early dynamic imaging using FCH PET/CT allowed prostate cancer detection in situations where proof of malignancy is difficult to obtain.

  18. Clinical significance of power spectral analysis of heart rate variability and {sup 123}I-metaiodobenzylguanidine (MIBG) myocardial imaging for assessing the severity of heart failure

    Energy Technology Data Exchange (ETDEWEB)

    Ishida, Yoshio; Fukuoka, Shuji; Shimotsu, Yoriko; Sasaki, Tatsuya; Kamakura, Shiro; Yasumura, Yoshio; Miyatake, Kunio; Shimomura, Katsuro [National Cardiovascular Center, Suita, Osaka (Japan); Tani, Akihiro

    1997-04-01

    The significance of power spectral analysis of heart rate variability and of MIBG myocardial imaging for assessing sympathetic nervous function was evaluated in patients with congestive heart failure due to dilated cardiomyopathy. Subjects were 10 normal volunteers and 8 patients with NYHA class II severity; 10 normals and 25 patients with NYHA II and III; and 17 patients treated with a beta-blocker (metoprolol 5-40 mg). ECG was recorded with a portable recorder to measure RR intervals over 24 hr, which were used for power spectral analysis. Early and delayed imaging with 111 MBq of {sup 123}I-MIBG was performed at 15 min and 4 hr, respectively, after its intravenous administration, for acquisition of anterior planar and SPECT images. Myocardial blood flow SPECT was also performed with 111 MBq of {sup 201}Tl given intravenously, and the difference in total defect scores between the MIBG and Tl images was computed. MIBG myocardial sympathetic nerve imaging in these patients was found useful for assessing the severity of heart failure, for identifying patients at risk during beta-blocker treatment, and for assessing the risk of complicating ventricular tachycardia. (K.H.)
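    Power spectral analysis of heart rate variability of the kind used here typically resamples the RR-interval series onto an even time grid and integrates the spectrum over the low-frequency (0.04-0.15 Hz) and high-frequency (0.15-0.40 Hz) bands. The sketch below shows that generic processing chain; the band limits follow common HRV practice and the data are synthetic, since the record does not describe the exact method.

```python
# Sketch of frequency-domain HRV analysis: resample the RR tachogram to an
# even grid, estimate the power spectrum, and integrate the LF and HF bands.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
rr = 0.85 + 0.05 * rng.standard_normal(600)      # hypothetical RR intervals [s]
t_beat = np.cumsum(rr)                           # beat occurrence times [s]

fs = 4.0                                         # resampling frequency [Hz]
t_even = np.arange(t_beat[0], t_beat[-1], 1.0 / fs)
rr_even = np.interp(t_even, t_beat, rr)          # evenly sampled tachogram

freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

lf, hf = band_power(0.04, 0.15), band_power(0.15, 0.40)
print(f"LF power {lf:.2e} s^2, HF power {hf:.2e} s^2, LF/HF = {lf / hf:.2f}")
```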

  19. Dynamics of curved fronts

    CERN Document Server

    Pelce, Pierre

    1989-01-01

    In recent years, much progress has been made in the understanding of interface dynamics of various systems: hydrodynamics, crystal growth, chemical reactions, and combustion. Dynamics of Curved Fronts is an important contribution to this field and will be an indispensable reference work for researchers and graduate students in physics, applied mathematics, and chemical engineering. The book consists of a 100 page introduction by the editor and 33 seminal articles from various disciplines.

  20. International Wage Curves

    OpenAIRE

    David G. Blanchflower; Andrew J. Oswald

    1992-01-01

    The paper provides evidence for the existence of a negatively sloped locus linking the level of pay to the rate of regional (or industry) unemployment. This "wage curve" is estimated using microeconomic data for Britain, the US, Canada, Korea, Austria, Italy, Holland, Switzerland, Norway, and Germany. The average unemployment elasticity of pay is approximately -0.1. The paper sets out a multi-region efficiency wage model and argues that its predictions are consistent with the data.

  1. Estimating Corporate Yield Curves

    OpenAIRE

    Antionio Diaz; Frank Skinner

    2001-01-01

    This paper represents the first study of retail deposit spreads of UK financial institutions using stochastic interest rate modelling and the market comparable approach. By replicating quoted fixed deposit rates using the Black Derman and Toy (1990) stochastic interest rate model, we find that the spread between fixed and variable rates of interest can be modeled (and priced) using an interest rate swap analogy. We also find that we can estimate an individual bank deposit yield curve as a spr...

  2. LCC: Light Curves Classifier

    Science.gov (United States)

    Vo, Martin

    2017-08-01

    Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished by attributes of light curves or any time series, including shapes, histograms, or variograms, or by other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering for visualizing the natural separation of the sample. The package can be also used for automatic tuning parameters of used methods (for example, number of hidden neurons or binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curve Classifier can also be used for simple downloading of light curves and all available information of queried stars. It natively can connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and command line UI, the program can be used through a web interface. Users can create jobs for ”training” methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifier and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.

  3. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Guide to data processing and revision: Part 3, Hardware component failure data entry and revision procedures

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gertman, D.I.; Gilbert, B.G.; Reece, W.J.

    1988-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability (HEP) and hardware component failure data (HCFD). The NUCLARR system software resides on an IBM (or compatible) personal micro-computer. Users can perform data base searches to furnish HEP estimates and HCFD rates. In this manner, the NUCLARR system can be used to support a variety of risk assessment activities. This volume, Volume 3 of a 5-volume series, presents the procedures used to process HEP and HCFD for entry in NUCLARR and describes how to modify the existing NUCLARR taxonomy in order to add equipment types or action verbs. Volume 3 also specifies the various roles of the administrative staff on assignment to the NUCLARR Clearinghouse who are tasked with maintaining the data base, dealing with user requests, and processing NUCLARR data

  4. Coupled Large Scale Hydro-mechanical Modelling for cap-rock Failure Risk Assessment of CO2 Storage in Deep Saline Aquifers

    International Nuclear Information System (INIS)

    Rohmer, J.; Seyedi, D.M.

    2010-01-01

    This work presents a numerical strategy of large scale hydro-mechanical simulations to assess the risk of damage in cap-rock formations during a CO2 injection process. The proposed methodology is based on the development of a sequential coupling between a multiphase fluid flow code (TOUGH2) and a hydro-mechanical calculation code (Code-Aster) that enables us to perform coupled hydro-mechanical simulation at a regional scale. The likelihood of different cap-rock damage mechanisms can then be evaluated based on the results of the coupled simulations. A scenario based approach is proposed to take into account the effect of the uncertainty of model parameters on damage likelihood. The developed methodology is applied to the cap-rock failure analysis of the deep aquifer of the Dogger formation in the context of the Paris basin multilayered geological system as a demonstration example. The simulation is carried out at a regional scale (100 km) considering an industrial mass injection rate of CO2 of 10 Mt/y. The assessment of the stress state after 10 years of injection is conducted through the developed sequential coupling. Two failure mechanisms have been taken into account, namely tensile fracturing and shear slip reactivation of pre-existing fractures. To deal with the large uncertainties due to sparse data on the layer formations, a scenario based strategy is undertaken. It consists of defining a first reference modelling scenario considering the mean values of the hydro-mechanical properties for each layer. A sensitivity analysis is then carried out and shows the importance of both the initial stress state and the reservoir hydraulic properties on the cap-rock failure tendency. On this basis, a second scenario denoted 'critical' is defined so that the most influential model parameters are taken in their worst configuration. None of these failure criteria is activated for the considered conditions. At a phenomenological level, this study points out three key
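    The two damage mechanisms named above are usually written as a tension cut-off and a Mohr-Coulomb slip condition on effective stresses. The sketch below evaluates both for a given post-injection stress state; the criterion forms are textbook expressions and the parameter values are hypothetical, since the study's exact formulations are not given in the record.

```python
# Sketch of the two cap-rock failure checks named in the record -- tensile
# fracturing and shear slip reactivation of a pre-existing fracture -- as
# simple effective-stress criteria (compression positive). Illustrative only.
import math

def tensile_failure(sigma3_eff, tensile_strength):
    """Tension cut-off: fails if the minimum effective stress becomes more
    tensile than -T0 (compression positive, so tension is negative)."""
    return sigma3_eff < -tensile_strength

def shear_slip(sigma1_eff, sigma3_eff, angle_from_sigma1_deg, cohesion, friction_angle_deg):
    """Mohr-Coulomb slip check on a plane whose normal makes the given angle
    with the sigma1 direction."""
    theta = math.radians(angle_from_sigma1_deg)
    sn = 0.5 * (sigma1_eff + sigma3_eff) + 0.5 * (sigma1_eff - sigma3_eff) * math.cos(2 * theta)
    tau = 0.5 * (sigma1_eff - sigma3_eff) * math.sin(2 * theta)
    mu = math.tan(math.radians(friction_angle_deg))
    return abs(tau) > cohesion + mu * sn

# Hypothetical effective stresses (MPa) after injection-induced overpressure
sigma1_eff, sigma3_eff = 18.0, 6.0
print("tensile fracturing:", tensile_failure(sigma3_eff, tensile_strength=2.0))
print("shear slip on 30 deg plane:", shear_slip(sigma1_eff, sigma3_eff, 30.0,
                                                cohesion=0.0, friction_angle_deg=30.0))
```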

  5. Development of an Electronic Medical Record Based Alert for Risk of HIV Treatment Failure in a Low-Resource Setting

    Science.gov (United States)

    Puttkammer, Nancy; Zeliadt, Steven; Balan, Jean Gabriel; Baseman, Janet; Destiné, Rodney; Domerçant, Jean Wysler; France, Garilus; Hyppolite, Nathaelf; Pelletier, Valérie; Raphael, Nernst Atwood; Sherr, Kenneth; Yuhas, Krista; Barnhart, Scott

    2014-01-01

    Background The adoption of electronic medical record systems in resource-limited settings can help clinicians monitor patients' adherence to HIV antiretroviral therapy (ART) and identify patients at risk of future ART failure, allowing resources to be targeted to those most at risk. Methods Among adult patients enrolled on ART from 2005–2013 at two large, public-sector hospitals in Haiti, ART failure was assessed after 6–12 months on treatment, based on the World Health Organization's immunologic and clinical criteria. We identified models for predicting ART failure based on ART adherence measures and other patient characteristics. We assessed performance of candidate models using area under the receiver operating curve, and validated results using a randomly-split data sample. The selected prediction model was used to generate a risk score, and its ability to differentiate ART failure risk over a 42-month follow-up period was tested using stratified Kaplan Meier survival curves. Results Among 923 patients with CD4 results available during the period 6–12 months after ART initiation, 196 (21.2%) met ART failure criteria. The pharmacy-based proportion of days covered (PDC) measure performed best among five possible ART adherence measures at predicting ART failure. Average PDC during the first 6 months on ART was 79.0% among cases of ART failure and 88.6% among cases of non-failure (p<0.01). When additional information including sex, baseline CD4, and duration of enrollment in HIV care prior to ART initiation were added to PDC, the risk score differentiated between those who did and did not meet failure criteria over 42 months following ART initiation. Conclusions Pharmacy data are most useful for new ART adherence alerts within iSanté. Such alerts offer potential to help clinicians identify patients at high risk of ART failure so that they can be targeted with adherence support interventions, before ART failure occurs. PMID:25390044
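    The proportion of days covered measure highlighted above can be computed directly from dispensing records: it is the fraction of days in an observation window on which the patient held dispensed supply. The sketch below shows one way to do this; the handling of overlapping refills is an assumption, since the record does not describe how iSanté resolves them.

```python
# Sketch of a pharmacy-based proportion-of-days-covered (PDC) calculation.
# Overlapping refills are shifted forward here, which is one common convention.
from datetime import date, timedelta

def proportion_of_days_covered(dispensations, window_start, window_end):
    """dispensations: list of (pickup_date, days_supply). Returns PDC in [0, 1]."""
    covered = set()
    for pickup, days_supply in sorted(dispensations):
        start = max(pickup, window_start)
        # shift forward past days already covered by a previous refill
        while start in covered:
            start += timedelta(days=1)
        for k in range(days_supply):
            day = start + timedelta(days=k)
            if window_start <= day <= window_end:
                covered.add(day)
    window_days = (window_end - window_start).days + 1
    return len(covered) / window_days

refills = [(date(2013, 1, 1), 30), (date(2013, 2, 5), 30), (date(2013, 3, 20), 30)]
pdc = proportion_of_days_covered(refills, date(2013, 1, 1), date(2013, 6, 30))
print(f"PDC over the first 6 months: {pdc:.1%}")
```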

  6. Development of an electronic medical record based alert for risk of HIV treatment failure in a low-resource setting.

    Directory of Open Access Journals (Sweden)

    Nancy Puttkammer

    The adoption of electronic medical record systems in resource-limited settings can help clinicians monitor patients' adherence to HIV antiretroviral therapy (ART) and identify patients at risk of future ART failure, allowing resources to be targeted to those most at risk. Among adult patients enrolled on ART from 2005-2013 at two large, public-sector hospitals in Haiti, ART failure was assessed after 6-12 months on treatment, based on the World Health Organization's immunologic and clinical criteria. We identified models for predicting ART failure based on ART adherence measures and other patient characteristics. We assessed performance of candidate models using area under the receiver operating curve, and validated results using a randomly-split data sample. The selected prediction model was used to generate a risk score, and its ability to differentiate ART failure risk over a 42-month follow-up period was tested using stratified Kaplan Meier survival curves. Among 923 patients with CD4 results available during the period 6-12 months after ART initiation, 196 (21.2%) met ART failure criteria. The pharmacy-based proportion of days covered (PDC) measure performed best among five possible ART adherence measures at predicting ART failure. Average PDC during the first 6 months on ART was 79.0% among cases of ART failure and 88.6% among cases of non-failure (p<0.01). When additional information including sex, baseline CD4, and duration of enrollment in HIV care prior to ART initiation were added to PDC, the risk score differentiated between those who did and did not meet failure criteria over 42 months following ART initiation. Pharmacy data are most useful for new ART adherence alerts within iSanté. Such alerts offer potential to help clinicians identify patients at high risk of ART failure so that they can be targeted with adherence support interventions, before ART failure occurs.

  7. Investigation of learning and experience curves

    Energy Technology Data Exchange (ETDEWEB)

    Krawiec, F.; Thornton, J.; Edesess, M.

    1980-04-01

    The applicability of learning and experience curves for predicting future costs of solar technologies is assessed, and the major test case is the production economics of heliostats. Alternative methods for estimating cost reductions in systems manufacture are discussed, and procedures for using learning and experience curves to predict costs are outlined. Because adequate production data often do not exist, production histories of analogous products/processes are analyzed and learning and aggregated cost curves for these surrogates estimated. If the surrogate learning curves apply, they can be used to estimate solar technology costs. The steps involved in generating these cost estimates are given. Second-generation glass-steel and inflated-bubble heliostat design concepts, developed by MDAC and GE, respectively, are described; a costing scenario for 25,000 units/yr is detailed; surrogates for cost analysis are chosen; learning and aggregate cost curves are estimated; and aggregate cost curves for the GE and MDAC designs are estimated. However, an approach that combines a neoclassical production function with a learning-by-doing hypothesis is needed to yield a cost relation compatible with the historical learning curve and the traditional cost function of economic theory.
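
    As a point of reference for the cost-projection methods discussed in the record above, the sketch below implements the textbook single-factor experience curve; the 80% progress ratio and the unit counts are illustrative assumptions, not values from the cited report.

```python
import math

def experience_curve_cost(first_unit_cost, cumulative_units, progress_ratio):
    """Standard experience-curve form: each doubling of cumulative production
    multiplies unit cost by the progress ratio, i.e. C(x) = C1 * x**(-b)
    with b = -log2(progress_ratio). All values here are illustrative."""
    b = -math.log2(progress_ratio)
    return first_unit_cost * cumulative_units ** (-b)

# Illustrative: 80% progress ratio, cost of the 25,000th unit relative to the first
print(experience_curve_cost(1.0, 25_000, 0.80))
```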

  8. Development of a Short-term Failure Assessment of High Density Polyethylene Pipe Welds - Application of the Limit Load Analysis -

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Ho-Wan; Han, Jae-Jun; Kim, Yun-Jae [Korea University, Seoul (Korea, Republic of); Kim, Jong-Sung [Sunchon National University, Suncheon (Korea, Republic of); Kim, Jeong-Hyeon; Jang, Chang-Heui [KAIST, Daejeon (Korea, Republic of)

    2015-04-15

    In the US, the number of cases of subterranean water contamination from tritium leaking through a damaged buried nuclear power plant pipe continues to increase, and the degradation of the buried metal piping is emerging as a major issue. A pipe blocked from corrosion and/or degradation can lead to loss of cooling capacity in safety-related piping resulting in critical issues related to the safety and integrity of nuclear power plant operation. The ASME Boiler and Pressure Vessel Codes Committee (BPVC) has recently approved Code Case N-755 that describes the requirements for the use of polyethylene (PE) pipe for the construction of Section III, Division 1 Class 3 buried piping systems for service water applications in nuclear power plants. This paper contains tensile and slow crack growth (SCG) test results for high-density polyethylene (HDPE) pipe welds under the environmental conditions of a nuclear power plant. Based on these tests, the fracture surface of the PENT specimen was analyzed, and the fracture mechanisms of each fracture area were determined. Finally, by using 3D finite element analysis, limit loads of HDPE related to premature failure were verified.

  9. Vitamin D and Heart Failure.

    Science.gov (United States)

    Marshall Brinkley, D; Ali, Omair M; Zalawadiya, Sandip K; Wang, Thomas J

    2017-10-01

    Vitamin D is principally known for its role in calcium homeostasis, but preclinical studies implicate multiple pathways through which vitamin D may affect cardiovascular function and influence risk for heart failure. Many adults with cardiovascular disease have low vitamin D status, making it a potential therapeutic target. We review the rationale and potential role of vitamin D supplementation in the prevention and treatment of chronic heart failure. Substantial observational evidence has associated low vitamin D status with the risk of heart failure, ventricular remodeling, and clinical outcomes in heart failure, including mortality. However, trials assessing the influence of vitamin D supplementation on surrogate markers and clinical outcomes in heart failure have generally been small and inconclusive. There are insufficient data to recommend routine assessment or supplementation of vitamin D for the prevention or treatment of chronic heart failure. Prospective trials powered for clinical outcomes are warranted.

  10. Failure Modes

    DEFF Research Database (Denmark)

    Jakobsen, K. P.; Burcharth, H. F.; Ibsen, Lars Bo

    1999-01-01

    The present appendix contains the derivation of ten different limit state equations divided on three different failure modes. Five of the limit state equations can be used independently of the characteristics of the subsoil, whereas the remaining five can be used for either drained or undrained s...

  11. A national assessment of underground natural gas storage: identifying wells with designs likely vulnerable to a single-point-of-failure

    Science.gov (United States)

    Michanowicz, Drew R.; Buonocore, Jonathan J.; Rowland, Sebastian T.; Konschnik, Katherine E.; Goho, Shaun A.; Bernstein, Aaron S.

    2017-05-01

    The leak of processed natural gas (PNG) from October 2015 to February 2016 from the Aliso Canyon storage facility, near Los Angeles, California, was the largest single accidental release of greenhouse gases in US history. The Interagency Task Force on Natural Gas Storage Safety and California regulators recently recommended operators phase out single-point-of-failure (SPF) well designs. Here, we develop a national dataset of UGS well activity in the continental US to assess regulatory data availability and uncertainty, and to assess the prevalence of certain well design deficiencies including single-point-of-failure designs. We identified 14 138 active UGS wells associated with 317 active UGS facilities in 29 states using regulatory and company data. State-level wellbore datasets contained numerous reporting inconsistencies that limited data concatenation. We identified 2715 active UGS wells across 160 facilities that, like the failed well at Aliso Canyon, predated the storage facility, and therefore were not originally designed for gas storage. The majority (88%) of these repurposed wells are located in OH, MI, PA, NY, and WV. Repurposed wells have a median age of 74 years, and the 2694 repurposed wells constructed prior to 1979 are particularly likely to exhibit design-related deficiencies. An estimated 210 active repurposed wells were constructed before 1917—before cement zonal isolation methods were utilized. These wells are located in OH, PA, NY, and WV and represent the highest priority related to potential design deficiencies that could lead to containment loss. This national baseline assessment identifies regulatory data uncertainties, highlights a potentially widespread vulnerability of the natural gas supply chain, and can aid in prioritization and oversight for high-risk wells and facilities.

  12. Uniformization of elliptic curves

    OpenAIRE

    Ülkem, Özge; Ulkem, Ozge

    2015-01-01

    Every elliptic curve E defined over C is analytically isomorphic to C*/q^Z for some q ∊ C*. Similarly, Tate has shown that if E is defined over a p-adic field K, then E is analytically isomorphic to K*/q^Z for some q ∊ K*. Further, the isomorphism E(K) ≅ K*/q^Z respects the action of the Galois group Gal(K̄/K), where K̄ is the algebraic closure of K. I will explain the construction of this isomorphism.

  13. Assessment of the knowledge and perception of support of patients with heart failure SOPICA study IN SPAIN.

    Science.gov (United States)

    Miró, Ò; Escoda, R; Martín-Sánchez, F J; Herrero, P; Jacob, J; Rizzi, M; Aguirre, A; Andueza, J A; Bueno, H; Llorens, P

    2016-01-01

    To understand the perceptions of patients with heart failure (HF) concerning their disease, treatment and support, as well as the specialists who provide care after a decompensation, and to determine whether there is a relationship between the type of specialist involved in the follow-up and the medium-term prognosis. A multicentre, prospective cohort study consecutively included patients with acute HF in the emergency department. The patients were interviewed by telephone 91-180 days after their emergency department visit. We investigated the relationship between the type of specialist who performed the follow-up and the emergency department visits or hospitalisations using Cox regression models, with progressive adjustment by groups of potential confounders of these relationships. We interviewed 785 patients. Thirty-three percent (95%CI: 30%-36%) considered their disease mild, 64% (60%-67%) required help from third parties for daily activities, 65% (61%-68%) had no recent therapeutic changes, and 69% (67%-72%) received the same treatment in the exacerbations. The perceived support varied significantly depending on the factor under consideration (from greater to lesser: family, hospital, emergency department, health centre, religion and patient associations; p<.05 in all comparisons). Thirty-nine percent (36%-43%) of the patients with decompensations consulted directly with the emergency department, with no prior changes in treatment. At discharge, general practitioners (74%, 71%-77%) and cardiologists (74%, 70%-77%) were the most involved in the follow-up, although the specialty was not related to the prognosis. There are various aspects of the perception of patients with HF concerning their disease that are susceptible to future interventions. Patient follow-up involves various specialties, but all achieve similar results in the medium term. Copyright © 2016. Published by Elsevier España, S.L.U.

  14. Role of Biomarkers for the Prevention, Assessment, and Management of Heart Failure: A Scientific Statement From the American Heart Association.

    Science.gov (United States)

    Chow, Sheryl L; Maisel, Alan S; Anand, Inder; Bozkurt, Biykem; de Boer, Rudolf A; Felker, G Michael; Fonarow, Gregg C; Greenberg, Barry; Januzzi, James L; Kiernan, Michael S; Liu, Peter P; Wang, Thomas J; Yancy, Clyde W; Zile, Michael R

    2017-05-30

    Natriuretic peptides have led the way as a diagnostic and prognostic tool for the diagnosis and management of heart failure (HF). More recent evidence suggests that natriuretic peptides along with the next generation of biomarkers may provide added value to medical management, which could potentially lower risk of mortality and readmissions. The purpose of this scientific statement is to summarize the existing literature and to provide guidance for the utility of currently available biomarkers. The writing group used systematic literature reviews, published translational and clinical studies, clinical practice guidelines, and expert opinion/statements to summarize existing evidence and to identify areas of inadequacy requiring future research. The panel reviewed the most relevant adult medical literature excluding routine laboratory tests using MEDLINE, EMBASE, and Web of Science through December 2016. The document is organized and classified according to the American Heart Association to provide specific suggestions, considerations, or contemporary clinical practice recommendations. A number of biomarkers associated with HF are well recognized, and measuring their concentrations in circulation can be a convenient and noninvasive approach to provide important information about disease severity and helps in the detection, diagnosis, prognosis, and management of HF. These include natriuretic peptides, soluble suppressor of tumorgenicity 2, highly sensitive troponin, galectin-3, midregional proadrenomedullin, cystatin-C, interleukin-6, procalcitonin, and others. There is a need to further evaluate existing and novel markers for guiding therapy and to summarize their data in a standardized format to improve communication among researchers and practitioners. HF is a complex syndrome involving diverse pathways and pathological processes that can manifest in circulation as biomarkers. A number of such biomarkers are now clinically available, and monitoring their

  15. Roc curves for continuous data

    CERN Document Server

    Krzanowski, Wojtek J

    2009-01-01

    Since ROC curves have become ubiquitous in many application areas, the various advances have been scattered across disparate articles and texts. ROC Curves for Continuous Data is the first book solely devoted to the subject, bringing together all the relevant material to provide a clear understanding of how to analyze ROC curves. The fundamental theory of ROC curves: The book first discusses the relationship between the ROC curve and numerous performance measures and then extends the theory into practice by describing how ROC curves are estimated. Further building on the theory, the authors prese
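
    The record above concerns the theory and estimation of ROC curves; as a rough, self-contained illustration of how an empirical ROC curve and its area are obtained from scores and labels (not code from the book, and ignoring tied scores), consider:

```python
def roc_points(scores, labels):
    """Empirical ROC: sweep a threshold down the sorted scores and record
    (false positive rate, true positive rate) at each step."""
    pairs = sorted(zip(scores, labels), key=lambda p: -p[0])
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Area under the empirical ROC curve by the trapezoidal rule."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

print(auc(roc_points([0.9, 0.7, 0.4, 0.2], [1, 1, 0, 0])))  # 1.0 for a perfect ranking
```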

  16. Updating of adventitious fuel pin failure frequency in sodium-cooled fast reactors and probabilistic risk assessment on consequent severe accident in Monju

    International Nuclear Information System (INIS)

    Fukano, Yoshitaka; Kurisaka, Kenichi; Nishimura, Masahiro; Naruto, Kenichi

    2015-01-01

    Experimental studies, deterministic approaches and probabilistic risk assessments (PRAs) on local fault (LF) propagation in sodium-cooled fast reactors (SFRs) have been performed in many countries because LFs have been historically considered as one of the possible causes of severe accidents. Adventitious-fuel-pin-failures (AFPFs) have been considered to be the most dominant initiators of LFs in these PRAs because of their high frequency of occurrence during reactor operation and possibility of fuel-element-failure-propagation (FEFP). A PRA on FEFP from AFPF (FEFPA) in the Japanese prototype SFR (Monju) was performed in this study based on the state-of-the-art knowledge, reflecting the most recent operation procedures under off-normal conditions. Frequency of occurrence of AFPF in SFRs which was the initiating event of the event tree in this PRA was updated using a variety of methods based on the above-mentioned latest review on experiences of this phenomenon. As a result, the frequency of occurrence of, and the core damage frequency (CDF) from, AFPF in Monju was significantly reduced to a negligible magnitude compared with those in the existing PRAs. It was, therefore concluded that the CDF of FEFPA in Monju could be comprised in that of anticipated transient without scram or protected loss of heat sink events from both the viewpoint of occurrence probability and consequences. (author)

  17. Types of Heart Failure

    Science.gov (United States)


  18. Classes of Heart Failure

    Science.gov (United States)


  19. Failure to identify an acute exercise effect on executive function assessed by the Wisconsin Card Sorting Test

    Directory of Open Access Journals (Sweden)

    Chun-Chih Wang

    2015-03-01

    Conclusion: Acute aerobic exercise failed to influence executive function as assessed by the WCST, revealing that this classical neuropsychological test tapping executive function may not be sensitive to acute exercise. Our findings suggest that acute exercise does not broadly affect the entire family of executive functions, or its effect on a specific aspect of executive function may be task-dependent, as proposed by Etnier and Chang (2009).

  20. Failure Analysis

    International Nuclear Information System (INIS)

    Iorio, A.F.; Crespi, J.C.

    1987-01-01

    After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy-water reactor refuelling machine failed. The gearbox was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and several gear teeth broken at the root. Motion was transmitted through a speed-reducing device with controlled, adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis and to recommend a solution to prevent further failures. (Author)

  1. SINTAP: draft of a unified European failure assessment procedure. An introduction; SINTAP: Entwurf einer vereinheitlichten europaeischen Fehlerbewertungsprozedur. Eine Einfuehrung

    Energy Technology Data Exchange (ETDEWEB)

    Zerbst, U.; Kocak, M. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Werkstofforschung; Wiesner, C. [The Welding Institute (TWI), Cambridge (United Kingdom). Structural Integrity Dept.; Hodulak, L. [Fraunhofer-Institut fuer Werkstoffmechanik (IWM), Freiburg im Breisgau (Germany)

    1999-07-01

    Fracture mechanics based flaw assessment concepts are increasingly used in industrial regulations and standards. A considerable number of different guidelines and procedures are available which are partly based on each other but also exhibit significant differences. Against this background, the EU sponsored SINTAP, an interdisciplinary Brite-Euram project. SINTAP stands for 'Structural Integrity Assessment Procedures for European Industry'. From 1996 to 1999, seventeen organisations from nine European countries participated in the project, the aim of which was to unify the available procedures. The present report is an introduction to the content, the structure and the scientific background of the procedure which was generated within SINTAP. (orig.) [German original, translated] Fracture mechanics assessment concepts are increasingly becoming part of industrial codes and sector standards internationally. A large number of such regulations and computer programs already exist, which partly build on one another but also partly show significant differences. Against this background, the EU funded SINTAP, an interdisciplinary Brite-Euram project, from 1996 to 1999; its aim was to unify the existing approaches, and seventeen institutions from nine European countries took part. SINTAP stands for 'Structural Integrity Assessment Procedures for European Industry'. The present report is an introduction to the content, structure and scientific background of the SINTAP procedure that resulted from the project. (orig.)

  2. Precision-Recall-Gain Curves:PR Analysis Done Right

    OpenAIRE

    Flach, Peter; Kull, Meelis

    2015-01-01

    Precision-Recall analysis abounds in applications of binary classification where true negatives do not add value and hence should not affect assessment of the classifier's performance. Perhaps inspired by the many advantages of receiver operating characteristic (ROC) curves and the area under such curves for accuracy-based performance assessment, many researchers have taken to report Precision-Recall (PR) curves and associated areas as performance metric. We demonstrate in this paper that thi...
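
    The abstract is truncated before any definitions appear, so the sketch below reproduces the precision-gain and recall-gain transformations as I understand them from the Precision-Recall-Gain literature; treat the exact formulas as an assumption to be checked against Flach and Kull (2015).

```python
def precision_gain(precision, pi):
    """Precision gain: rescales precision so that the always-positive baseline
    (precision = pi, the positive prevalence) maps to 0 and a perfect score
    maps to 1. Formula reproduced from memory; verify against the paper."""
    return (precision - pi) / ((1 - pi) * precision)

def recall_gain(recall, pi):
    """Recall gain, defined analogously to precision gain."""
    return (recall - pi) / ((1 - pi) * recall)

# Illustrative: 10% positives, a classifier with precision 0.5 and recall 0.4
pi = 0.10
print(precision_gain(0.5, pi), recall_gain(0.4, pi))
```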

  3. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    Cooper, S.E.; Lofgren, E.V.; Samanta, P.K.; Wong Seemeng

    1993-01-01

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  4. Curved Josephson junction

    International Nuclear Information System (INIS)

    Dobrowolski, Tomasz

    2012-01-01

    A constant-curvature one- and quasi-one-dimensional Josephson junction is considered. On the basis of Maxwell's equations, a sine–Gordon equation describing the influence of curvature on kink motion is obtained. It is shown that the method of geometrical reduction of the sine–Gordon model from a three-dimensional to a lower-dimensional manifold leads to an identical form of the sine–Gordon equation. - Highlights: ► The dynamics of the phase in a curved Josephson junction is investigated. ► The geometrical reduction is applied to the sine–Gordon model. ► The results of geometrical reduction and the fundamental research are compared.
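
    For orientation, the flat (curvature-free) sine–Gordon equation referred to above has the standard dimensionless form shown below; the curvature-dependent generalisation derived in the paper is not reproduced here.

```latex
% Standard sine-Gordon equation for the Josephson phase (dimensionless units)
\frac{\partial^2 \varphi}{\partial t^2}
  - \frac{\partial^2 \varphi}{\partial x^2}
  + \sin\varphi = 0
```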

  5. Curved-Duct

    Directory of Open Access Journals (Sweden)

    Je Hyun Baekt

    2000-01-01

    Full Text Available A numerical study is conducted on the fully-developed laminar flow of an incompressible viscous fluid in a square duct rotating about an axis perpendicular to the axial direction of the duct. In the straight duct, the rotation produces vortices due to the Coriolis force. Generally two vortex cells are formed, and the axial velocity distribution is distorted by the effect of this Coriolis force. When the convective force is weak, two counter-rotating vortices appear with a quasi-parabolic axial velocity profile at low rotation rates. As the rotation rate increases, the axial velocity on the vertical centreline of the duct begins to flatten, and the vortex centres move toward the wall under the effect of the Coriolis force. When the convective inertia force is strong, a double-vortex secondary flow appears in the transverse planes of the duct at low rotation rates, but as the speed of rotation increases the secondary flow splits into an asymmetric configuration of four counter-rotating vortices. If the rotation rate is increased further, the secondary flow restabilizes to a slightly asymmetric double-vortex configuration. A numerical study is also conducted on the laminar flow of an incompressible viscous fluid in a 90°-bend square duct that rotates about an axis parallel to the axial direction of the inlet. In the 90°-bend square duct, the flow is shaped by both the Coriolis and centrifugal forces: the centrifugal force dominates the secondary flow in the curved region, while the Coriolis force dominates in the downstream region.

  6. Elliptic curves for applications (Tutorial)

    NARCIS (Netherlands)

    Lange, T.; Bernstein, D.J.; Chatterjee, S.

    2011-01-01

    More than 25 years ago, elliptic curves over finite fields were suggested as a group in which the Discrete Logarithm Problem (DLP) can be hard. Since then many researchers have scrutinized the security of the DLP on elliptic curves with the result that for suitably chosen curves only exponential

  7. Titration Curves: Fact and Fiction.

    Science.gov (United States)

    Chamberlain, John

    1997-01-01

    Discusses ways in which datalogging equipment can enable titration curves to be measured accurately and how computing power can be used to predict the shape of curves. Highlights include sources of error, use of spreadsheets to generate titration curves, titration of a weak acid with a strong alkali, dibasic acids, weak acid and weak base, and…

  8. Aging, Maturation and Growth of Sauropodomorph Dinosaurs as Deduced from Growth Curves Using Long Bone Histological Data: An Assessment of Methodological Constraints and Solutions.

    Science.gov (United States)

    Griebeler, Eva Maria; Klein, Nicole; Sander, P Martin

    2013-01-01

    Information on aging, maturation, and growth is important for understanding life histories of organisms. In extinct dinosaurs, such information can be derived from the histological growth record preserved in the mid-shaft cortex of long bones. Here, we construct growth models to estimate ages at death, ages at sexual maturity, ages at which individuals were fully-grown, and maximum growth rates from the growth record preserved in long bones of six sauropod dinosaur individuals (one indeterminate mamenchisaurid, two Apatosaurus sp., two indeterminate diplodocids, and one Camarasaurus sp.) and one basal sauropodomorph dinosaur individual (Plateosaurus engelhardti). Using these estimates, we establish allometries between body mass and each of these traits and compare these to extant taxa. Growth models considered for each dinosaur individual were the von Bertalanffy model, the Gompertz model, and the logistic model (LGM), all of which have inherently fixed inflection points, and the Chapman-Richards model in which the point is not fixed. We use the arithmetic mean of the age at the inflection point and of the age at which 90% of asymptotic mass is reached to assess respectively the age at sexual maturity or the age at onset of reproduction, because unambiguous indicators of maturity in Sauropodomorpha are lacking. According to an AIC-based model selection process, the LGM was the best model for our sauropodomorph sample. Allometries established are consistent with literature data on other Sauropodomorpha. All Sauropodomorpha reached full size within a time span similar to scaled-up modern mammalian megaherbivores and had similar maximum growth rates to scaled-up modern megaherbivores and ratites, but growth rates of Sauropodomorpha were lower than of an average mammal. Sauropodomorph ages at death probably were lower than that of average scaled-up ratites and megaherbivores. Sauropodomorpha were older at maturation than scaled-up ratites and average mammals, but

  9. Aging, Maturation and Growth of Sauropodomorph Dinosaurs as Deduced from Growth Curves Using Long Bone Histological Data: An Assessment of Methodological Constraints and Solutions.

    Directory of Open Access Journals (Sweden)

    Eva Maria Griebeler

    Full Text Available Information on aging, maturation, and growth is important for understanding life histories of organisms. In extinct dinosaurs, such information can be derived from the histological growth record preserved in the mid-shaft cortex of long bones. Here, we construct growth models to estimate ages at death, ages at sexual maturity, ages at which individuals were fully-grown, and maximum growth rates from the growth record preserved in long bones of six sauropod dinosaur individuals (one indeterminate mamenchisaurid, two Apatosaurus sp., two indeterminate diplodocids, and one Camarasaurus sp.) and one basal sauropodomorph dinosaur individual (Plateosaurus engelhardti). Using these estimates, we establish allometries between body mass and each of these traits and compare these to extant taxa. Growth models considered for each dinosaur individual were the von Bertalanffy model, the Gompertz model, and the logistic model (LGM), all of which have inherently fixed inflection points, and the Chapman-Richards model in which the point is not fixed. We use the arithmetic mean of the age at the inflection point and of the age at which 90% of asymptotic mass is reached to assess respectively the age at sexual maturity or the age at onset of reproduction, because unambiguous indicators of maturity in Sauropodomorpha are lacking. According to an AIC-based model selection process, the LGM was the best model for our sauropodomorph sample. Allometries established are consistent with literature data on other Sauropodomorpha. All Sauropodomorpha reached full size within a time span similar to scaled-up modern mammalian megaherbivores and had similar maximum growth rates to scaled-up modern megaherbivores and ratites, but growth rates of Sauropodomorpha were lower than of an average mammal. Sauropodomorph ages at death probably were lower than that of average scaled-up ratites and megaherbivores. Sauropodomorpha were older at maturation than scaled-up ratites and average
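
    The growth models named in the two records above are standard sigmoidal functions of age; the forms below are common textbook parameterisations (asymptotic mass A, rate constant k, inflection age t_i), given only for orientation since the papers' exact parameterisations may differ.

```latex
% Common parameterisations of the three fixed-inflection growth models
M_{\mathrm{logistic}}(t)    = \frac{A}{1 + e^{-k\,(t - t_i)}} \qquad
M_{\mathrm{Gompertz}}(t)    = A\, e^{-e^{-k\,(t - t_i)}}      \qquad
M_{\mathrm{Bertalanffy}}(t) = A\,\bigl(1 - e^{-k\,(t - t_0)}\bigr)^{3}
```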

  10. Lower head failure analysis

    International Nuclear Information System (INIS)

    Rempe, J.L.; Thinnes, G.L.; Allison, C.M.; Cronenberg, A.W.

    1991-01-01

    The US Nuclear Regulatory Commission is sponsoring a lower vessel head research program to investigate plausible modes of reactor vessel failure in order to determine (a) which modes have the greatest likelihood of occurrence during a severe accident and (b) the range of core debris and accident conditions that lead to these failures. This paper presents the methodology and preliminary results of an investigation of reactor designs and thermodynamic conditions using analytic closed-form approximations to assess the important governing parameters in non-dimensional form. Preliminary results illustrate the importance of vessel and tube geometrical parameters, material properties, and external boundary conditions on predicting vessel failure. Thermal analyses indicate that steady-state temperature distributions will occur in the vessel within several hours, although the exact time is dependent upon vessel thickness. In-vessel tube failure is governed by the tube-to-debris mass ratio within the lower head, where most penetrations are predicted to fail if surrounded by molten debris. Melt penetration distance is dependent upon the effective flow diameter of the tube. Molten debris is predicted to penetrate through tubes with a larger effective flow diameter, such as a boiling water reactor (BWR) drain nozzle. Ex-vessel tube failure for depressurized reactor vessels is predicted to be more likely for a BWR drain nozzle penetration because of its larger effective diameter. At high pressures (between ∼0.1 MPa and ∼12 MPa) ex-vessel tube rupture becomes a dominant failure mechanism, although tube ejection dominates control rod guide tube failure at lower temperatures. However, tube ejection and tube rupture predictions are sensitive to the vessel and tube radial gap size and material coefficients of thermal expansion

  11. Learning curves in health professions education.

    Science.gov (United States)

    Pusic, Martin V; Boutis, Kathy; Hatala, Rose; Cook, David A

    2015-08-01

    Learning curves, which graphically show the relationship between learning effort and achievement, are common in published education research but are not often used in day-to-day educational activities. The purpose of this article is to describe the generation and analysis of learning curves and their applicability to health professions education. The authors argue that the time is right for a closer look at using learning curves-given their desirable properties-to inform both self-directed instruction by individuals and education management by instructors.A typical learning curve is made up of a measure of learning (y-axis), a measure of effort (x-axis), and a mathematical linking function. At the individual level, learning curves make manifest a single person's progress towards competence including his/her rate of learning, the inflection point where learning becomes more effortful, and the remaining distance to mastery attainment. At the group level, overlaid learning curves show the full variation of a group of learners' paths through a given learning domain. Specifically, they make overt the difference between time-based and competency-based approaches to instruction. Additionally, instructors can use learning curve information to more accurately target educational resources to those who most require them.The learning curve approach requires a fine-grained collection of data that will not be possible in all educational settings; however, the increased use of an assessment paradigm that explicitly includes effort and its link to individual achievement could result in increased learner engagement and more effective instructional design.
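
    The article above treats a learning curve as a measure of learning (y-axis), a measure of effort (x-axis), and a mathematical linking function; the sketch below fits one commonly used linking function, a power law, to hypothetical practice data (all numbers are invented for illustration).

```python
import numpy as np

# Hypothetical practice data: attempts (effort) vs. assessment score (learning).
effort = np.array([1, 2, 4, 8, 16, 32], dtype=float)
score = np.array([40, 52, 61, 70, 76, 81], dtype=float)

# One common linking function is a power law, score = a * effort**b,
# fitted here by ordinary least squares on the log-log scale.
b, log_a = np.polyfit(np.log(effort), np.log(score), 1)
a = np.exp(log_a)
print(f"fitted curve: score ~ {a:.1f} * effort^{b:.2f}")

# The fitted curve can be inverted to estimate the effort needed to reach
# a mastery criterion, e.g. a score of 85 on this invented scale.
print("effort to reach 85:", (85 / a) ** (1 / b))
```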

  12. Imaging assessment of a portable hemodialysis device: detection of possible failure modes and monitoring of functional performance.

    Science.gov (United States)

    Olorunsola, Olufoladare G; Kim, Steven H; Chang, Ryan; Kuo, Yuo-Chen; Hetts, Steven W; Heller, Alex; Kant, Rishi; Saeed, Maythem; Fissell, William H; Roy, Shuvo; Wilson, Mark W

    2014-03-27

    The purpose of this study was to investigate the utility and limitations of various imaging modalities in the noninvasive assessment of a novel compact hemodialyzer under development for renal replacement therapy, with specific aim towards monitoring its functional performance. The prototype is a 4×3×6 cm aluminum cartridge housing "blood" and "dialysate" flow paths arranged in parallel. A sheet of semipermeable silicon nanopore membranes forms the blood-dialysate interface, allowing passage of small molecules. Blood flow was simulated using a peristaltic pump to instill iodinated contrast through the blood compartment, while de-ionized water was instilled through the dialysate compartment at a matched rate in the countercurrent direction. Images were acquired under these flow conditions using multi-detector computed tomography (MDCT), fluoroscopy, high-resolution quantitative computed tomography (HR-QCT), and magnetic resonance imaging (MRI). MDCT was used to monitor contrast diffusion efficiency by plotting contrast density as a function of position along the path of flow through the cartridge during steady state infusion at 1 and 20 mL/min. Both linear and exponential regressions were used to model contrast decay along the flow path. Both linear and exponential models of contrast decay appeared to be reasonable approximations, yielding similar results for contrast diffusion during a single pass through the cartridge. There was no measurable difference in contrast diffusion when comparing 1 mL/min and 20 mL/min flow rates. Fluoroscopy allowed a gross qualitative assessment of flow within the device, and revealed flow inhomogeneity within the corner of the cartridge opposite the blood inlet port. MRI and HR-QCT were both severely limited due to the paramagnetic properties and high atomic number of the target material, respectively. During testing, we encountered several causes of device malfunction, including leak formation, trapped gas, and contrast
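
    The study above modelled contrast decay along the flow path with both linear and exponential regressions; the snippet below shows one generic way to fit both forms to position-density samples (the positions and Hounsfield values are made up, not data from the study).

```python
import numpy as np

# Hypothetical contrast density (HU) sampled along the cartridge flow path (cm).
position_cm = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
contrast_hu = np.array([400.0, 355.0, 310.0, 280.0, 245.0, 220.0])

# Linear model: c(x) = c0 + m*x, fitted by ordinary least squares.
m_lin, c0_lin = np.polyfit(position_cm, contrast_hu, 1)

# Exponential model: c(x) = c0 * exp(-k*x), fitted on the log scale.
neg_k, log_c0 = np.polyfit(position_cm, np.log(contrast_hu), 1)

print(f"linear fit: {c0_lin:.0f} {m_lin:+.1f}*x HU")
print(f"exponential fit: {np.exp(log_c0):.0f} * exp({neg_k:.3f}*x) HU")
```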

  13. Nutrition in Heart Failure

    Directory of Open Access Journals (Sweden)

    Reci Meseri

    2013-10-01

    Full Text Available Heart failure is defined as a decreased ability of the heart due to various causes. It is seen in 2-3% of the population, but the prevalence increases sharply after the age of seventy. The objectives of nutrition therapy in heart failure are to prevent water retention and edema, to avoid hard-to-digest foods and to offer a balanced diet. In order to avoid fluid retention and edema, daily sodium and fluid intake must be monitored carefully. The main dilemma for heart failure patients is the obesity-cachexia dilemma. Since one of the main causes of heart failure is cardiovascular disease, the patient may be obese in the first phase. In the later phases, cachexia may appear. Cachexia has been shown to be associated with mortality. Within this period, patients should not be over-fed, and the patient should pass slowly from a catabolic to an anabolic state. If the gastrointestinal tract is functional, oral/enteral feeding must be preferred. Multivitamin and mineral supports may be beneficial, as they may replace the increased losses, enhance the anti-inflammatory response and act as anti-oxidants. Large, controlled and well-designed studies must be conducted in order to evaluate the benefits of nutritional practices such as nutritional assessment, enteral feeding and nutrient support in heart failure patients.

  14. A Journey Between Two Curves

    Directory of Open Access Journals (Sweden)

    Sergey A. Cherkis

    2007-03-01

    Full Text Available A typical solution of an integrable system is described in terms of a holomorphic curve and a line bundle over it. The curve provides the action variables while the time evolution is a linear flow on the curve's Jacobian. Even though the system of Nahm equations is closely related to the Hitchin system, the curves appearing in these two cases have very different nature. The former can be described in terms of some classical scattering problem while the latter provides a solution to some Seiberg-Witten gauge theory. This note identifies the setup in which one can formulate the question of relating the two curves.

  15. Assessment of missiles generated by pressure component failure and its application to recent gas-cooled nuclear plant design

    International Nuclear Information System (INIS)

    Tulacz, J.; Smith, R.E.

    1980-01-01

    Methods for establishing characteristics of missiles following pressure barrier rupture have been reviewed in order to enable evaluation of structural response to missile impact and to aid the design of barriers to protect essential plant on gas cooled nuclear plant against unacceptable damage from missile impact. Methods for determining structural response of concrete barriers to missile impact have been reviewed and some methods used for assessing the adequacy of steel barriers on gas-cooled nuclear plant have been described. The possibility of making an incredibility case for some of the worst missiles based on probability arguments is briefly discussed. It is shown that there may be scope for such arguments but there are difficulties in quantifying some of the probability factors. (U.K.)

  16. Failure probability of PWR reactor coolant loop piping

    International Nuclear Information System (INIS)

    Lo, T.; Woo, H.H.; Holman, G.S.; Chou, C.K.

    1984-02-01

    This paper describes the results of assessments performed on the PWR coolant loop piping of Westinghouse and Combustion Engineering plants. For direct double-ended guillotine break (DEGB), consideration was given to crack existence probability, initial crack size distribution, hydrostatic proof test, preservice inspection, leak detection probability, crack growth characteristics, and failure criteria based on the net section stress failure and tearing modulus stability concept. For indirect DEGB, fragilities of major component supports were estimated. The system level fragility was then calculated based on the Boolean expression involving these fragilities. Indirect DEGB due to seismic effects was calculated by convolving the system level fragility and the seismic hazard curve. The results indicate that the probability of occurrence of both direct and indirect DEGB is extremely small, thus, postulation of DEGB in design should be eliminated and replaced by more realistic criteria
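
    The indirect-DEGB calculation described above convolves a system-level fragility with a seismic hazard curve; a minimal numerical sketch of that convolution, with an entirely hypothetical hazard curve and lognormal fragility, is:

```python
import numpy as np
from scipy.stats import lognorm

# Peak ground acceleration grid (g); all values below are purely illustrative.
a = np.linspace(0.01, 3.0, 300)

# Hazard curve: annual frequency of exceeding acceleration a (assumed power law).
hazard = 1e-3 * (a / 0.1) ** -2.5

# Fragility curve: conditional probability of system failure given a, modelled
# as lognormal with median capacity 1.0 g and logarithmic std. dev. 0.4 (assumed).
fragility = lognorm.cdf(a, s=0.4, scale=1.0)

# Annual failure frequency: integrate fragility against the (negative) slope
# of the hazard curve, here with a simple uniform-grid sum.
integrand = fragility * -np.gradient(hazard, a)
annual_failure_freq = float(np.sum(integrand) * (a[1] - a[0]))
print(annual_failure_freq)
```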

  17. Soil Water Retention Curve

    Science.gov (United States)

    Johnson, L. E.; Kim, J.; Cifelli, R.; Chandra, C. V.

    2016-12-01

    Potential water retention, S, is one of parameters commonly used in hydrologic modeling for soil moisture accounting. Physically, S indicates total amount of water which can be stored in soil and is expressed in units of depth. S can be represented as a change of soil moisture content and in this context is commonly used to estimate direct runoff, especially in the Soil Conservation Service (SCS) curve number (CN) method. Generally, the lumped and the distributed hydrologic models can easily use the SCS-CN method to estimate direct runoff. Changes in potential water retention have been used in previous SCS-CN studies; however, these studies have focused on long-term hydrologic simulations where S is allowed to vary at the daily time scale. While useful for hydrologic events that span multiple days, the resolution is too coarse for short-term applications such as flash flood events where S may not recover its full potential. In this study, a new method for estimating a time-variable potential water retention at hourly time-scales is presented. The methodology is applied for the Napa River basin, California. The streamflow gage at St Helena, located in the upper reaches of the basin, is used as the control gage site to evaluate the model performance as it is has minimal influences by reservoirs and diversions. Rainfall events from 2011 to 2012 are used for estimating the event-based SCS CN to transfer to S. As a result, we have derived the potential water retention curve and it is classified into three sections depending on the relative change in S. The first is a negative slope section arising from the difference in the rate of moving water through the soil column, the second is a zero change section representing the initial recovery the potential water retention, and the third is a positive change section representing the full recovery of the potential water retention. Also, we found that the soil water moving has traffic jam within 24 hours after finished first
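
    For readers unfamiliar with the SCS-CN quantities used above, the standard relations between curve number, potential retention S, and direct runoff are sketched below in US customary units; the CN value and rainfall depth are illustrative, and the hourly time-variable S proposed in the abstract is not reproduced here.

```python
def potential_retention_inches(curve_number):
    """SCS-CN potential maximum retention S (inches): S = 1000/CN - 10."""
    return 1000.0 / curve_number - 10.0

def scs_direct_runoff(precip_inches, curve_number, initial_abstraction_ratio=0.2):
    """Direct runoff Q from the standard SCS-CN relation
    Q = (P - Ia)^2 / (P - Ia + S), with Ia = 0.2 * S by default."""
    s = potential_retention_inches(curve_number)
    ia = initial_abstraction_ratio * s
    if precip_inches <= ia:
        return 0.0
    return (precip_inches - ia) ** 2 / (precip_inches - ia + s)

# Illustrative event: 3 inches of rain on a CN = 75 catchment
print(potential_retention_inches(75), scs_direct_runoff(3.0, 75))
```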

  18. Heart failure - tests

    Science.gov (United States)

    CHF - tests; Congestive heart failure - tests; Cardiomyopathy - tests; HF - tests ... the best test to: Identify which type of heart failure (systolic, diastolic, valvular) Monitor your heart failure and ...

  19. Low-Probability High-Consequence (LPHC) Failure Events in Geologic Carbon Sequestration Pipelines and Wells: Framework for LPHC Risk Assessment Incorporating Spatial Variability of Risk

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Budnitz, Robert J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-31

    If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject the CO2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of
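
    The report above stresses that pipeline risk is distributed along the route while well risk is concentrated at a point; a minimal sketch of how a spatially resolved pipeline risk profile can be integrated along the route is shown below, with every number (route length, failure rate, exposure, consequence model) an invented placeholder rather than a value from the project.

```python
import numpy as np

# Position along a hypothetical CO2 pipeline (km) and, at each position, the
# population within the potential impact zone (all values illustrative).
x_km = np.linspace(0.0, 100.0, 1001)
exposed_population = 50 + 450 * np.exp(-((x_km - 60.0) / 5.0) ** 2)  # a town near km 60

failure_rate_per_km_yr = 1e-4          # rupture frequency per km-year (assumed)
fatality_prob_given_rupture = 1e-2     # per exposed person, given a rupture (assumed)

# Spatially resolved risk: expected fatalities per km-year along the route,
# and the route total obtained by summing over pipeline length.
risk_density = failure_rate_per_km_yr * fatality_prob_given_rupture * exposed_population
print("peak risk density (fatalities/km-yr):", risk_density.max())
print("route total (fatalities/yr):", float(np.sum(risk_density) * (x_km[1] - x_km[0])))
```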

  20. Design of a multicentre randomized controlled trial to assess the safety and efficacy of dose titration by specialized nurses in patients with heart failure. ETIFIC study protocol.

    Science.gov (United States)

    Oyanguren, Juana; García-Garrido, LLuisa; Nebot Margalef, Magdalena; Lekuona, Iñaki; Comin-Colet, Josep; Manito, Nicolás; Roure, Julia; Ruiz Rodriguez, Pilar; Enjuanes, Cristina; Latorre, Pedro; Torcal Laguna, Jesús; García-Gutiérrez, Susana

    2017-11-01

    Heart failure (HF) is associated with many hospital admissions and relatively high mortality, rates decreasing with administration of beta-blockers (BBs), angiotensin-converting-enzyme inhibitors, angiotensin II receptor blockers, and mineralocorticoid receptor antagonists. The effect is dose dependent, suboptimal doses being common in clinical practice. The 2012 European guidelines recommend close monitoring and dose titration by HF nurses. Our main aim is to compare BB doses achieved by patients after 4 months in intervention (HF nurse-managed) and control (cardiologist-managed) groups. Secondary aims include comparing doses of the other aforementioned drugs achieved after 4 months, adverse events, and outcomes at 6 months in the two groups. We have designed a multicentre (20 hospitals) non-inferiority randomized controlled trial, including patients with new-onset HF, left ventricular ejection fraction ≤40%, and New York Heart Association class II-III, with no contraindications to BBs. We will also conduct qualitative analysis to explore potential barriers to and facilitators of dose titration by HF nurses. In the intervention group, HF nurses will implement titration as prescribed by cardiologists, following a protocol. In controls, cardiologists will both prescribe and titrate doses. The study variables are doses of each of the drugs after 4 months relative to the target dose (%), New York Heart Association class, left ventricular ejection fraction, N-terminal pro B-type natriuretic peptide levels, 6 min walk distance, comorbidities, renal function, readmissions, mortality, quality of life, and psychosocial characteristics. The trial seeks to assess whether titration by HF nurses of drugs recommended in practice guidelines is safe and not inferior to direct management by cardiologists. The results could have an impact on clinical practice. © 2017 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of

  1. Assessment of vasodilator therapy in patients with severe congestive heart failure: limitations of measurements of left ventricular ejection fraction and volumes

    International Nuclear Information System (INIS)

    Firth, B.G.; Dehmer, G.J.; Markham, R.V. Jr.; Willerson, J.T.; Hillis, L.D.

    1982-01-01

    Although noninvasive techniques are often used to assess the effect of vasodilator therapy in patients with congestive heart failure, it is unknown whether changes in noninvasively determined left ventricular ejection fraction, volume, or dimension reliably reflect alterations in intracardiac pressure and flow. Accordingly, we compared the acute effect of sodium nitroprusside on left ventricular volume and ejection fraction (determined scintigraphically) with its effect on intracardiac pressure and forward cardiac index (determined by thermodilution) in 12 patients with severe, chronic congestive heart failure and a markedly dilated left ventricle. Nitroprusside (infused at 1.3 +/- 1.1 [mean +/- standard deviation] microgram/kg/min) caused a decrease in mean systemic arterial, mean pulmonary arterial, and mean pulmonary capillary wedge pressure as well as a concomitant increase in forward cardiac index. Simultaneously, left ventricular end-diastolic and end-systolic volume indexes decreased, but the scintigraphically determined cardiac index did not change significantly. Left ventricular ejection fraction averaged 0.19 +/- 0.05 before nitroprusside administration and increased by less than 0.05 units in response to nitroprusside in 11 of 12 patients. The only significant correlation between scintigraphically and invasively determined variables was that between the percent change in end-diastolic volume index and the percent change in pulmonary capillary wedge pressure (r = 0.68, p = 0.01). Although nitroprusside produced changes in scintigraphically determined left ventricular ejection fraction, end-systolic volume index, and cardiac index, these alterations bore no predictable relation to changes in intracardiac pressure, forward cardiac index, or vascular resistance. Furthermore, nitroprusside produced a considerably greater percent change in the invasively measured variables than in the scintigraphically determined ones

  2. Comparison of performance of various tumour response criteria in assessment of regorafenib activity in advanced gastrointestinal stromal tumours after failure of imatinib and sunitinib.

    Science.gov (United States)

    Shinagare, Atul B; Jagannathan, Jyothi P; Kurra, Vikram; Urban, Trinity; Manola, Judith; Choy, Edwin; Demetri, George D; George, Suzanne; Ramaiya, Nikhil H

    2014-03-01

    To compare performance of various tumour response criteria (TRCs) in assessment of regorafenib activity in patients with advanced gastrointestinal stromal tumour (GIST) with prior failure of imatinib and sunitinib. Twenty participants in a phase II trial received oral regorafenib (median duration 47 weeks; interquartile range (IQR) 24-88) with computed tomography (CT) imaging at baseline and every two months thereafter. Tumour response was prospectively determined using Response Evaluation Criteria in Solid Tumours (RECIST) 1.1, and retrospectively reassessed for comparison per RECIST 1.0, World Health Organization (WHO) and Choi criteria, using the same target lesions. Clinical benefit rate [CBR; complete or partial response (CR or PR) or stable disease (SD) ≥16 weeks] and progression-free survival (PFS) were compared between various TRCs using kappa statistics. Performance of TRCs in predicting overall survival (OS) was compared by comparing OS in groups with progression-free intervals less than or greater than 20 weeks by each TRC using c-statistics. PR was more frequent by Choi (90%) than RECIST 1.1, RECIST 1.0 and WHO (20% each); however, CBR was similar between various TRCs (overall CBR 85-90%, 95-100% agreement between all TRC pairs). PFS per RECIST 1.0 was similar to RECIST 1.1 (median 44 weeks versus 58 weeks), and shorter for WHO (median 34 weeks) and Choi (median 24 weeks). With RECIST 1.1, RECIST 1.0 and WHO, there was moderate concordance between PFS and OS (c-statistics 0.596-0.679). Choi criteria had less favourable concordance (c-statistic 0.506). RECIST 1.1 and WHO performed somewhat better than Choi criteria as TRC for response evaluation in patients with advanced GIST after prior failure on imatinib and sunitinib. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Fermions in curved spacetimes

    Energy Technology Data Exchange (ETDEWEB)

    Lippoldt, Stefan

    2016-01-21

    In this thesis we study a formulation of Dirac fermions in curved spacetime that respects general coordinate invariance as well as invariance under local spin base transformations. We emphasize the advantages of the spin base invariant formalism both from a conceptual as well as from a practical viewpoint. This suggests that local spin base invariance should be added to the list of (effective) properties of (quantum) gravity theories. We find support for this viewpoint by the explicit construction of a global realization of the Clifford algebra on a 2-sphere which is impossible in the spin-base non-invariant vielbein formalism. The natural variables for this formulation are spacetime-dependent Dirac matrices subject to the Clifford-algebra constraint. In particular, a coframe, i.e. vielbein field is not required. We disclose the hidden spin base invariance of the vielbein formalism. Explicit formulas for the spin connection as a function of the Dirac matrices are found. This connection consists of a canonical part that is completely fixed in terms of the Dirac matrices and a free part that can be interpreted as spin torsion. The common Lorentz symmetric gauge for the vielbein is constructed for the Dirac matrices, even for metrics which are not linearly connected. Under certain criteria, it constitutes the simplest possible gauge, demonstrating why this gauge is so useful. Using the spin base formulation for building a field theory of quantized gravity and matter fields, we show that it suffices to quantize the metric and the matter fields. This observation is of particular relevance for field theory approaches to quantum gravity, as it can serve for a purely metric-based quantization scheme for gravity even in the presence of fermions. Hence, in the second part of this thesis we critically examine the gauge, and the field-parametrization dependence of renormalization group flows in the vicinity of non-Gaussian fixed points in quantum gravity. While physical

  4. Assessment of sustained effects of levosimendan and dobutamine on left ventricular systolic functions by using novel tissue Doppler derived indices in patients with advanced heart failure.

    Science.gov (United States)

    Oner, Ender; Erturk, Mehmet; Birant, Ali; Kurtar Mansıroglu, Aslı; Akturk, Ibrahim Faruk; Karakurt, Huseyin; Yalcin, Ahmet Arif; Uzun, Fatih; Somuncu, Mustafa Umut; Yildirim, Aydin

    2015-01-01

    Previous studies comparing levosimendan vs. dobutamine have revealed that levosimendan is better in relieving symptoms. Echocardiographic studies have been done using second measurements immediately following a dobutamine infusion or while it was still being administered. The aim of our study was assessment of sustained effects of 24 h levosimendan and dobutamine infusions on left ventricular systolic functions. A total of 61 patients with acutely decompensated heart failure with New York Heart Association (NYHA) class III or IV symptoms were randomized to receive either levosimendan or dobutamine 2:1 in an open label fashion. Before and 5 days after the initiation of infusions, functional class was assessed, N-terminal prohormone of B-type natriuretic peptide (NT-proBNP) levels and left ventricular ejection fraction (LVEF), mitral inflow peak E and A wave velocity, and E/A ratios were measured; using tissue Doppler imaging, isovolumic myocardial acceleration (IVA), peak myocardial velocity during isovolumic contraction (IVV), peak systolic velocity during ejection period (Sa), early (E') and late (A') diastolic velocities, and E'/A' and E/E' ratios were measured. The NYHA class improved in both groups, but improvements were prominent in the levosimendan group. NT-proBNP levels were significantly reduced in the levosimendan group. Improvements in LVEF and diastolic indices were significant in the levosimendan group. Tissue Doppler-derived systolic indices of IVV and IVA increased significantly in the levosimendan group. Improvements in left ventricular systolic and diastolic functions continue after a levosimendan infusion.

  5. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.

    1999-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound master curve has the same inherent degree of safety as originally intended for the K IC -reference curve. Similarly, the 1% lower bound master curve corresponds to the K IR -reference curve. (orig.)
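
    The 'master curve' referred to in this and the following record is standardised in ASTM E1921; to the best of my knowledge its central relations take the form below (quoted from memory as orientation, not as the paper's exact formulation), from which a lower-bound curve at any chosen failure probability p follows directly.

```latex
% Master-curve relations (ASTM E1921 form, quoted from memory; verify before use)
P_f = 1 - \exp\!\left[-\left(\frac{K_{Jc} - K_{\min}}{K_0 - K_{\min}}\right)^{4}\right],
\qquad K_{\min} = 20\ \mathrm{MPa\sqrt{m}}

% Median toughness versus temperature, with reference temperature T_0 (both in deg C)
K_{Jc(\mathrm{med})} = 30 + 70\,\exp\!\bigl[0.019\,(T - T_0)\bigr]\ \mathrm{MPa\sqrt{m}}

% Curve at cumulative failure probability p (e.g. p = 0.05 or 0.01)
K_{Jc(p)} = K_{\min} + \left[\ln\!\frac{1}{1-p}\right]^{1/4}\,(K_0 - K_{\min})
```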

  6. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.

    1998-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound Master curve corresponds to the KIR reference curve. (orig.)

  7. Nanowire failure: long = brittle and short = ductile.

    Science.gov (United States)

    Wu, Zhaoxuan; Zhang, Yong-Wei; Jhon, Mark H; Gao, Huajian; Srolovitz, David J

    2012-02-08

    Experimental studies of the tensile behavior of metallic nanowires show a wide range of failure modes, ranging from ductile necking to brittle/localized shear failure, often in wires of the same diameter. We performed large-scale molecular dynamics simulations of copper nanowires with a range of nanowire lengths and provide unequivocal evidence for a transition in nanowire failure mode with change in nanowire length. Short nanowires fail via a ductile mode with serrated stress-strain curves, while long wires exhibit extreme shear localization and abrupt failure. We developed a simple model for predicting the critical nanowire length for this failure mode transition and showed that it is in excellent agreement with both the simulation results and the extant experimental data. The present results provide a new paradigm for the design of nanoscale mechanical systems that demarcates graceful and catastrophic failure.

  8. Diuretics for heart failure.

    Science.gov (United States)

    Faris, Rajaa F; Flather, Marcus; Purcell, Henry; Poole-Wilson, Philip A; Coats, Andrew J S

    2012-02-15

    Chronic heart failure is a major cause of morbidity and mortality worldwide. Diuretics are regarded as the first-line treatment for patients with congestive heart failure since they provide symptomatic relief. The effects of diuretics on disease progression and survival remain unclear. The objective was to assess the harms and benefits of diuretics for chronic heart failure. Updated searches were run in the Cochrane Central Register of Controlled Trials in The Cochrane Library (CENTRAL Issue 1 of 4, 2011), MEDLINE (1966 to 22 February 2011), EMBASE (1980 to 2011 Week 07) and HERDIN database (1990 to February 2011). We hand-searched pertinent journals and inspected the reference lists of papers. We also contacted manufacturers and researchers in the field. No language restrictions were applied. Double-blinded randomised controlled trials of diuretic therapy comparing one diuretic with placebo, or one diuretic with another active agent (e.g. ACE inhibitors, digoxin) in patients with chronic heart failure were included. Two authors independently abstracted the data and assessed the eligibility and methodological quality of each trial. Extracted data were analysed by determining the odds ratio for dichotomous data, and difference in means for continuous data, of the treated group compared with controls. The likelihood of heterogeneity of the study population was assessed by the Chi-square test. If there was no evidence of statistical heterogeneity and pooling of results was clinically appropriate, a combined estimate was obtained using the fixed-effects model. This update has not identified any new studies for inclusion. The review includes 14 trials (525 participants), 7 were placebo-controlled, and 7 compared diuretics against other agents such as ACE inhibitors or digoxin. We analysed the data for mortality and for worsening heart failure. Mortality data were available in 3 of the placebo-controlled trials (202 participants). Mortality was lower for participants treated with diuretics than for
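
    As an illustration of the pooling step described in the abstract, the sketch below combines trial-level odds ratios with an inverse-variance fixed-effect model and a chi-square heterogeneity test. The 2x2 counts are hypothetical, and the review itself may have used a different estimator (e.g. Mantel-Haenszel).

```python
import numpy as np
from scipy import stats

# Hypothetical per-trial 2x2 counts: (deaths_tx, survivors_tx, deaths_ctrl, survivors_ctrl).
trials = [
    (3, 47, 8, 42),
    (5, 95, 12, 88),
    (2, 28, 6, 24),
]

log_or = np.array([np.log((a * d) / (b * c)) for a, b, c, d in trials])
var = np.array([1 / a + 1 / b + 1 / c + 1 / d for a, b, c, d in trials])
w = 1.0 / var                                   # inverse-variance weights
pooled = np.sum(w * log_or) / np.sum(w)         # fixed-effect pooled log odds ratio
se = np.sqrt(1.0 / np.sum(w))
q = np.sum(w * (log_or - pooled) ** 2)          # Cochran's Q heterogeneity statistic
p_het = stats.chi2.sf(q, df=len(trials) - 1)

print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se):.2f}-{np.exp(pooled + 1.96 * se):.2f}), "
      f"heterogeneity p = {p_het:.2f}")
```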

  9. Microvascular Anastomosis: Proposition of a Learning Curve.

    Science.gov (United States)

    Mokhtari, Pooneh; Tayebi Meybodi, Ali; Benet, Arnau; Lawton, Michael T

    2018-04-14

    Learning to perform a microvascular anastomosis is one of the most difficult tasks in cerebrovascular surgery. Previous studies offer little regarding the optimal protocols to maximize learning efficiency. This failure stems mainly from lack of knowledge about the learning curve of this task. To delineate this learning curve and provide information about its various features including acquisition, improvement, consistency, stability, and recall. Five neurosurgeons with an average of 5 yr of surgical experience and without any experience in bypass surgery performed microscopic anastomoses on progressively smaller-caliber silastic tubes (Biomet, Palm Beach Gardens, Florida) during 24 consecutive sessions. After 1-, 2-, and 8-wk retention intervals, they performed a recall test on 0.7-mm silastic tubes. The anastomoses were rated based on anastomosis patency and presence of any leaks. The improvement rate was faster during initial sessions compared to the final practice sessions. Performance decline was observed in the first session of working on a smaller-caliber tube. However, this rapidly improved during the following sessions of practice. Temporary plateaus were seen in certain segments of the curve. The retention interval between the acquisition and recall phase did not cause a regression to the prepractice performance level. Learning the fine motor task of microvascular anastomosis follows the basic rules of learning such as the "power law of practice." Our results also support the improvement of performance during consecutive sessions of practice. The objective evidence provided may help in developing optimized learning protocols for microvascular anastomosis.
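
    The "power law of practice" invoked above can be illustrated with a short log-log fit; the per-session completion times below are hypothetical stand-ins for whatever performance measure a study records.

```python
import numpy as np

# Hypothetical mean anastomosis completion times (minutes) per practice session.
session = np.arange(1, 9)
minutes = np.array([42.0, 33.5, 29.0, 26.2, 24.5, 23.1, 22.0, 21.3])

# Power law of practice: T_n = T_1 * n**(-b)  <=>  log T_n = log T_1 - b * log n
slope, intercept = np.polyfit(np.log(session), np.log(minutes), 1)
t1, b = np.exp(intercept), -slope
print(f"T1 ~= {t1:.1f} min, learning exponent b ~= {b:.2f}")
```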

  10. Advanced composites structural concepts and materials technologies for primary aircraft structures: Structural response and failure analysis

    Science.gov (United States)

    Dorris, William J.; Hairr, John W.; Huang, Jui-Tien; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    Non-linear analysis methods were adapted and incorporated in the finite element-based DIAL code. These methods are necessary to evaluate the global response of a stiffened structure under combined in-plane and out-of-plane loading. These methods include the arc-length method and a target point analysis procedure. A new interface material model was implemented that can model elastic-plastic behavior of the bond adhesive. Direct application of this method is in skin/stiffener interface failure assessment. Addition of the AML (angle minus longitudinal or load) failure procedure and Hashin's failure criteria provides added capability in the failure predictions. Interactive Stiffened Panel Analysis modules were developed as interactive pre- and post-processors. Each module provides the means of performing self-initiated finite element-based analysis of primary structures such as a flat or curved stiffened panel; a corrugated flat sandwich panel; and a curved geodesic fuselage panel. This module brings finite element analysis into the design of composite structures without requiring the user to know much about the techniques and procedures needed to actually perform a finite element analysis from scratch. An interactive finite element code was developed to predict bolted joint strength considering material and geometrical non-linearity. The developed method conducts an ultimate strength failure analysis using a set of material degradation models.
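
    To make the lamina failure-index idea concrete, here is a rough plane-stress Hashin-type check; the abstract does not spell out the exact criteria implemented in the DIAL code, so this sketch is only illustrative and the matrix-compression mode in particular is simplified.

```python
def hashin_2d_indices(s11, s22, t12, XT, XC, YT, YC, S):
    """Simplified plane-stress Hashin-type failure indices for a lamina.
    An index >= 1 indicates predicted failure in that mode. Stresses and
    strengths are in consistent units (e.g. MPa)."""
    if s11 >= 0.0:
        fiber = (s11 / XT) ** 2 + (t12 / S) ** 2      # fiber tension
    else:
        fiber = (s11 / XC) ** 2                       # fiber compression
    if s22 >= 0.0:
        matrix = (s22 / YT) ** 2 + (t12 / S) ** 2     # matrix tension
    else:
        matrix = (s22 / YC) ** 2 + (t12 / S) ** 2     # matrix compression (simplified)
    return fiber, matrix

# Hypothetical ply stresses and strengths (MPa).
print(hashin_2d_indices(s11=900.0, s22=30.0, t12=40.0,
                        XT=1500.0, XC=1200.0, YT=50.0, YC=200.0, S=70.0))
```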

  11. Heart failure - home monitoring

    Science.gov (United States)

    MedlinePlus patient instructions: //medlineplus.gov/ency/patientinstructions/000113.htm

  12. Prospective Assessment of Patterns of Failure After High-Precision Definitive (Chemo)Radiation in Head-and-Neck Squamous Cell Carcinoma

    International Nuclear Information System (INIS)

    Gupta, Tejpal; Jain, Sandeep; Agarwal, Jai Prakash; Ghosh-Laskar, Sarbani; Phurailatpam, Reena; Pai-Shetty, Rajershi; Dinshaw, Ketayun A.

    2011-01-01

    Purpose: To prospectively analyze patterns of failure in patients with head-and-neck squamous cell carcinoma treated with definitive high-precision radiotherapy with a focus on location of failure relative to target volume coverage. Methods and Materials: Sixty patients treated with three-dimensional conformal radiotherapy or intensity-modulated radiation therapy were included. Locoregional failure volume was defined on the planning data set at relapse, and dose received was analyzed by use of dose-volume histograms. Results: Thirteen patients were deemed to have had locoregional failures, of which two did not have any viable tumor on salvage neck dissection, leaving eleven patients with proven persistent or recurrent locoregional disease. Of these, 9 patients had in-field failure, 1 marginal failure, and 1 both in-field and marginal failures. Overall, only 2 of 11 patients (18%) with relapse had any marginal failure. Of the 20 sites of locoregional failure, 15 (75%) were in-field and 5 (25%) marginal. Distant metastases were detected in 3 patients, whereas a second new primary developed in 3 others. With a median follow-up of 26 months (interquartile range, 18-31 months) for surviving patients, the 3-year local control, locoregional control, disease-free survival, and overall survival rates were 75.3%, 74%, 67.2%, and 60.5%, respectively. Conclusions: Locoregional relapse remains the predominant pattern of failure in head-and-neck squamous cell carcinoma treated with high-precision definitive radiotherapy with the majority of failures occurring 'in-field' within the high-dose volume. Marginal failures can occur, particularly in the vicinity of the spared parotid gland. The therapeutic index of high-precision conformal radiotherapy is largely dependent on adequate selection and delineation of target volumes and organs at risk.

  13. The role of post-failure brittleness of soft rocks in the assessment of stability of intact masses: FDEM technique applications to ideal problems

    Science.gov (United States)

    Lollino, Piernicola; Andriani, Gioacchino Francesco; Fazio, Nunzio Luciano; Perrotti, Michele

    2016-04-01

    Strain-softening under low confinement stress, i.e. the drop of strength that occurs in the post-failure stage, represents a key factor of the stress-strain behavior of rocks. However, this feature of the rock behavior is generally underestimated or even neglected in the assessment of boundary value problems of intact soft rock masses. This is typically the case when the stability of intact rock masses is treated by means of limit equilibrium or finite element analyses, for which rigid-plastic or elastic perfectly-plastic constitutive models, generally implementing peak strength conditions of the rock, are respectively used. In fact, the aforementioned numerical techniques have intrinsic limitations that do not allow material brittleness to be accounted for, either because of the method's assumptions or, as in the case of the finite element method, because of numerical stability problems, unless sophisticated regularization techniques are implemented. However, for those problems that concern the stability of intact soft rock masses at low stress levels, as for example the stability of shallow underground caves or that of rock slopes, the brittle stress-strain response of the rock in the post-failure stage cannot be disregarded due to the risk of overestimating the stability factor. This work aims to highlight the role of post-peak brittleness of soft rocks in the analysis of specific ideal problems by means of a hybrid finite-discrete element technique (FDEM) that allows the brittle stress-strain behavior of the rock to be simulated properly. In particular, the stability of two ideal cases, represented by a shallow underground rectangular cave and a vertical cliff, has been analyzed by implementing a post-peak brittle behavior of the rock, and the comparison with a non-brittle response of the rock mass is also explored. To this purpose, the mechanical behavior of a soft calcarenite belonging to the Calcarenite di Gravina formation, extensively

  14. Intricate Assessment and Evaluation of Effect of Bruxism on Long-term Survival and Failure of Dental Implants: A Comparative Study.

    Science.gov (United States)

    Yadav, Kajal; Nagpal, Abhishek; Agarwal, S K; Kochhar, Aarti

    2016-08-01

    Dental implants are a common treatment for missing teeth. Various risk factors are responsible for the failure of dental implants and the occurrence of postoperative complications. Bruxism is one such factor responsible for the failure of dental implants. The actual relation between bruxism and dental implants is a subject of long-term controversy. Hence, we carried out this retrospective analysis to assess the complications occurring in dental implants in patients with and without bruxism. The present study included 1100 patients who were treated for rehabilitation by dental implant procedures at 21 dental offices of Ghaziabad (India) from 2004 to 2014. The diagnosis of bruxism was confirmed by analyzing the patients' clinical records and assessing their photographs. Patients who returned for follow-up were clinically re-evaluated to confirm the diagnosis of bruxism. Systematic questionnaires, as used by previous workers, were used to evaluate the patients' self-awareness of the condition. Mechanical complications were assessed only for those occurring on the restoration surfaces of the dental implants. All results were analyzed with Statistical Package for Social Sciences (SPSS) software. Student's t-test and Pearson's chi-square test were used to evaluate the level of significance. In both bruxers and non-bruxers, the largest number of dental implants was placed in the anterior maxillary region. A significant difference was obtained when comparing the two groups for the dimensions of the dental implants used. Comparison of total implant failures between the bruxer and non-bruxer groups yielded a statistically significant result. A statistically significant difference was also obtained when comparing the two study groups on the health parameters of hypertension, diabetes, and smoking habit. Success of dental implants is significantly

  15. Models of genus one curves

    OpenAIRE

    Sadek, Mohammad

    2010-01-01

    In this thesis we give insight into the minimisation problem of genus one curves defined by equations other than Weierstrass equations. We are interested in genus one curves given as double covers of P1, plane cubics, or complete intersections of two quadrics in P3. By minimising such a curve we mean making the invariants associated to its defining equations as small as possible using a suitable change of coordinates. We study the non-uniqueness of minimisations of the genus one curves des...

  16. Determination of sieve grading curves using an optical device

    OpenAIRE

    PHAM, AM; DESCANTES, Yannick; DE LARRARD, François

    2011-01-01

    The grading curve of an aggregate is a fundamental characteristic for mix design that can easily be modified to adjust several mix properties. While sieve analysis remains the reference method to determine this curve, optical devices are developing, allowing easier and faster assessment of aggregate grading. Unfortunately, optical grading results significantly differ from sieve grading curves. As a consequence, getting full acceptance of these new methods requires building bridges between the...

  17. Comparison of regional versus global assessment of left ventricular function in patients with left ventricular dysfunction, heart failure, or both after myocardial infarction: the valsartan in acute myocardial infarction echocardiographic study

    DEFF Research Database (Denmark)

    Thune, Jens Jakob; Køber, Lars; Pfeffer, Marc A

    2006-01-01

    or complementary information about prognosis after MI. METHODS: Echocardiography was performed in 610 patients with LV dysfunction, heart failure, or both after MI enrolled in the Valsartan in Acute MI trial. LVEF was estimated by biplane Simpson's rule, and WMI was assessed using a 16-segment model in 502...

  18. Assessment of the Successes and Failures of Decentralized Energy Solutions and Implications for the Water–Energy–Food Security Nexus: Case Studies from Developing Countries

    Directory of Open Access Journals (Sweden)

    Dawit Diriba Guta

    2017-06-01

    Full Text Available Access to reliable and affordable energy is vital for sustainable development. In the off-grid areas of developing countries, decentralized energy solutions have received increasing attention due to their contributions to reducing poverty. However, most of the rural population in many developing countries still has little or no access to modern energy technologies. This paper assesses the factors that determine the successes and failures of decentralized energy solutions based on local harmonized case studies from heterogeneous contexts from Asia, sub-Saharan Africa, and South America. The case studies were analyzed through the coupled lenses of energy transition and the Water–Energy–Food Security (WEF) Nexus. The findings indicate that access to modern decentralized energy solutions has not resulted in complete energy transitions due to various tradeoffs with the other domains of the WEF Nexus. On the other hand, the case studies point at the potential for improvements in food security, incomes, health, the empowerment of women, and resource conservation when synergies between decentralized energy solutions and other components of the WEF Nexus are present.

  19. Nuclear cardiology and heart failure

    International Nuclear Information System (INIS)

    Giubbini, Raffaele; Bertagna, Francesco; Milan, Elisa; Mut, Fernando; Dondi, Maurizio; Metra, Marco; Rodella, Carlo

    2009-01-01

    The prevalence of heart failure in the adult population is increasing. It varies between 1% and 2%, although it mainly affects elderly people (6-10% of people over the age of 65 years will develop heart failure). The syndrome of heart failure arises as a consequence of an abnormality in cardiac structure, function, rhythm, or conduction. Coronary artery disease is the leading cause of heart failure and it accounts for this disorder in 60-70% of all patients affected. Nuclear techniques provide unique information on left ventricular function and perfusion by gated-single photon emission tomography (SPECT). Myocardial viability can be assessed by both SPECT and PET imaging. Finally, autonomic dysfunction has been shown to increase the risk of death in patients with heart disease and this may be applicable to all patients with cardiac disease regardless of aetiology. MIBG scanning has a very promising prognostic value in patients with heart failure. (orig.)

  20. Nuclear cardiology and heart failure

    Energy Technology Data Exchange (ETDEWEB)

    Giubbini, Raffaele; Bertagna, Francesco [University of Brescia, Department of Nuclear Medicine, Brescia (Italy); Milan, Elisa [Ospedale Di Castelfranco Veneto, Nuclear Medicine Unit, Castelfranco Veneto (Italy); Mut, Fernando; Dondi, Maurizio [International Atomic Energy Agency, Nuclear Medicine Section, Division of Human Health, Vienna (Austria); Metra, Marco [University of Brescia, Department of Cardiology, Brescia (Italy); Rodella, Carlo [Health Physics Department, Spedali Civili di Brescia, Brescia (Italy)

    2009-12-15

    The prevalence of heart failure in the adult population is increasing. It varies between 1% and 2%, although it mainly affects elderly people (6-10% of people over the age of 65 years will develop heart failure). The syndrome of heart failure arises as a consequence of an abnormality in cardiac structure, function, rhythm, or conduction. Coronary artery disease is the leading cause of heart failure and it accounts for this disorder in 60-70% of all patients affected. Nuclear techniques provide unique information on left ventricular function and perfusion by gated-single photon emission tomography (SPECT). Myocardial viability can be assessed by both SPECT and PET imaging. Finally, autonomic dysfunction has been shown to increase the risk of death in patients with heart disease and this may be applicable to all patients with cardiac disease regardless of aetiology. MIBG scanning has a very promising prognostic value in patients with heart failure. (orig.)

  1. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/Q_GM)^b], where Q_GM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
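
    The discharge-normalized rating curve described above can be fitted with a simple log-log regression, as sketched here on hypothetical paired samples of Q and C.

```python
import numpy as np

# Hypothetical paired samples: discharge Q (m^3/s) and suspended-sediment
# concentration C (mg/L).
Q = np.array([12.0, 35.0, 80.0, 150.0, 420.0, 900.0])
C = np.array([15.0, 40.0, 95.0, 160.0, 520.0, 1100.0])

Q_gm = np.exp(np.mean(np.log(Q)))                    # geometric mean discharge
slope, intercept = np.polyfit(np.log(Q / Q_gm), np.log(C), 1)
b, a_hat = slope, np.exp(intercept)                  # C = a_hat * (Q/Q_GM)**b
print(f"a_hat = {a_hat:.1f} mg/L at Q_GM, b = {b:.2f}")
```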

  2. ASSESSMENT OF THE CHANGES IN BLOOD PRESSURE CIRCADIAN PROFILE AND VARIABILITY IN PATIENTS WITH CHRONIC HEART FAILURE AND ARTERIAL HYPERTENSION DURING COMBINED THERAPY INCLUDING IVABRADINE

    Directory of Open Access Journals (Sweden)

    M. V. Surovtseva

    2012-01-01

    Full Text Available Aim. To assess the changes in blood pressure (BP) circadian profile and variability in patients with chronic heart failure (CHF) of ischemic etiology and arterial hypertension (HT) due to the complex therapy including ivabradine. Material and methods. Patients (n=90) with CHF class II–III NYHA associated with stable angina II-III class and HT were examined. The patients were randomized into 3 groups depending on received drugs: perindopril and ivabradine - group 1; perindopril, bisoprolol and ivabradine - group 2; perindopril and bisoprolol - group 3. The duration of therapy was 6 months. Ambulatory BP monitoring (ABPM) was assessed at baseline and after treatment. Results. More significant reduction in average 24-hours systolic BP was found in groups 1 and 2 compared to group 3 (Δ%: -19.4±0.4, -21.1±0.4 and -11.8±0.6, respectively), as well as diastolic BP (Δ%: -10.6±0.6, -12.9±0.4 and -4.3±0.3, respectively) and other ABPM indicators. Improvement of BP circadian rhythm was found due to increase in the number of «Dipper» patients (p=0.016). More significant reduction in average daily and night systolic and diastolic BP (p=0.001), as well as daily and night BP variability (p=0.001), was also found in patients of group 2 compared to those of group 1. Conclusion. Moderate antihypertensive effect (in respect of both diastolic and systolic BP) was shown when ivabradine was included into the complex therapy of patients with ischemic CHF and HT. The effect was more pronounced when ivabradine was combined with perindopril and bisoprolol. This was accompanied by reduction in high BP daily variability and improvement of the BP circadian rhythm.

  3. ASSESSMENT OF THE CHANGES IN BLOOD PRESSURE CIRCADIAN PROFILE AND VARIABILITY IN PATIENTS WITH CHRONIC HEART FAILURE AND ARTERIAL HYPERTENSION DURING COMBINED THERAPY INCLUDING IVABRADINE

    Directory of Open Access Journals (Sweden)

    M. V. Surovtseva

    2015-12-01

    Full Text Available Aim. To assess the changes in blood pressure (BP) circadian profile and variability in patients with chronic heart failure (CHF) of ischemic etiology and arterial hypertension (HT) due to the complex therapy including ivabradine. Material and methods. Patients (n=90) with CHF class II–III NYHA associated with stable angina II-III class and HT were examined. The patients were randomized into 3 groups depending on received drugs: perindopril and ivabradine - group 1; perindopril, bisoprolol and ivabradine - group 2; perindopril and bisoprolol - group 3. The duration of therapy was 6 months. Ambulatory BP monitoring (ABPM) was assessed at baseline and after treatment. Results. More significant reduction in average 24-hours systolic BP was found in groups 1 and 2 compared to group 3 (Δ%: -19.4±0.4, -21.1±0.4 and -11.8±0.6, respectively), as well as diastolic BP (Δ%: -10.6±0.6, -12.9±0.4 and -4.3±0.3, respectively) and other ABPM indicators. Improvement of BP circadian rhythm was found due to increase in the number of «Dipper» patients (p=0.016). More significant reduction in average daily and night systolic and diastolic BP (p=0.001), as well as daily and night BP variability (p=0.001), was also found in patients of group 2 compared to those of group 1. Conclusion. Moderate antihypertensive effect (in respect of both diastolic and systolic BP) was shown when ivabradine was included into the complex therapy of patients with ischemic CHF and HT. The effect was more pronounced when ivabradine was combined with perindopril and bisoprolol. This was accompanied by reduction in high BP daily variability and improvement of the BP circadian rhythm.

  4. Clinical relationship of myocardial sympathetic nervous activity to cardiovascular functions in chronic heart failure. Assessment by myocardial scintigraphy with 123I-metaiodobenzylguanidine

    International Nuclear Information System (INIS)

    Wada, Yukoh; Miura, Masaetsu; Fujiwara, Satomi; Mori, Shunpei; Seiji, Kazumasa; Kimura, Tokihisa

    2003-01-01

    The aim of this study was to clarify the relationship between cardiac sympathetic nervous activity (SNA) assessed by radioiodinated metaiodobenzylguanidine (123I-MIBG), an analogue of norepinephrine, and cardiovascular functions in patients with chronic heart failure (CHF). Subjects were 17 patients with CHF. A dose of 111 MBq of 123I-MIBG was administered intravenously, and 5-minute anterior planar images were obtained 15 minutes (early image) and 3 hours (delayed image) after the injection. The heart/mediastinum (H/M) count ratio was defined to quantify cardiac 123I-MIBG uptake. The washout ratio (WR) of 123I-MIBG from the heart was calculated as follows: (early counts - delayed counts)/early counts x 100 (%). Echocardiography was performed on all patients within 1 week of 123I-MIBG scintigraphy to measure stroke volume index (SVI). Blood pressure and heart rate (HR) in the resting state were also recorded to calculate cardiovascular functions including cardiac output, pulse pressure (PP), and mean blood pressure. Significant linear correlations were found between the early H/M ratio of 123I-MIBG and SVI, and between the delayed H/M ratio of 123I-MIBG and SVI, respectively. WR of 123I-MIBG was correlated with HR, and was inversely correlated with SVI and with PP, respectively. It is likely that a decrease in SVI is associated with enhanced cardiac SNA in severe CHF. 123I-MIBG scintigraphy is effective in assessing the cardiac functional status and SNA in patients with CHF in vivo. Moreover, changes in PP and HR are good indicators of alterations in SNA. (author)
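
    A tiny worked example of the H/M ratio and washout ratio defined in the abstract, using hypothetical ROI counts and no background or decay correction:

```python
# Hypothetical planar 123I-MIBG counts in heart and mediastinum regions of interest.
heart_early, mediastinum_early = 95.0, 50.0
heart_delayed, mediastinum_delayed = 70.0, 48.0

hm_early = heart_early / mediastinum_early          # early H/M ratio
hm_delayed = heart_delayed / mediastinum_delayed    # delayed H/M ratio
washout = (heart_early - heart_delayed) / heart_early * 100.0  # WR (%) as defined above

print(f"H/M early = {hm_early:.2f}, H/M delayed = {hm_delayed:.2f}, WR = {washout:.1f}%")
```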

  5. Quantum fields in curved space

    International Nuclear Information System (INIS)

    Birrell, N.D.; Davies, P.C.W.

    1982-01-01

    The book presents a comprehensive review of the subject of gravitational effects in quantum field theory. Quantum field theory in Minkowski space, quantum field theory in curved spacetime, flat spacetime examples, curved spacetime examples, stress-tensor renormalization, applications of renormalization techniques, quantum black holes and interacting fields are all discussed in detail. (U.K.)

  6. Impact of Different Surgeons on Dental Implant Failure.

    Science.gov (United States)

    Chrcanovic, Bruno Ramos; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    To assess the influence of several factors on the prevalence of dental implant failure, with special consideration of the placement of implants by different dental surgeons. This retrospective study is based on 2,670 patients who received 10,096 implants at one specialist clinic. Only the data of patients and implants treated by surgeons who had inserted a minimum of 200 implants at the clinic were included. Kaplan-Meier curves were stratified with respect to the individual surgeon. A generalized estimating equation (GEE) method was used to account for the fact that repeated observations (several implants) were placed in a single patient. The factors bone quantity, bone quality, implant location, implant surface, and implant system were analyzed with descriptive statistics separately for each individual surgeon. A total of 10 surgeons were eligible. The differences between the survival curves of each individual were statistically significant. The multivariate GEE model showed the following variables to be statistically significant: surgeon, bruxism, intake of antidepressants, location, implant length, and implant system. The surgeon with the highest absolute number of failures was also the one who inserted the most implants in sites of poor bone and used turned implants in most cases, whereas the surgeon with the lowest absolute number of failures used mainly modern implants. Separate survival analyses of turned and modern implants stratified for the individual surgeon showed statistically significant differences in cumulative survival. Different levels of failure incidence could be observed between the surgeons, occasionally reaching significant levels. Although a direct causal relationship could not be ascertained, the results of the present study suggest that the surgeons' technique, skills, and/or judgment may negatively influence implant survival rates.
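
    A minimal sketch of surgeon-stratified Kaplan-Meier curves with a log-rank comparison, using the lifelines package on hypothetical implant-level data; it omits the GEE adjustment the study applied for multiple implants per patient.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

# Hypothetical implant-level records: follow-up (months), failure flag, surgeon.
df = pd.DataFrame({
    "months":  [12, 60, 84, 6, 48, 120, 36, 90, 18, 72],
    "failed":  [1, 0, 0, 1, 0, 0, 1, 0, 1, 0],
    "surgeon": ["A", "A", "B", "B", "A", "C", "C", "B", "C", "A"],
})

kmf = KaplanMeierFitter()
for name, grp in df.groupby("surgeon"):
    kmf.fit(grp["months"], grp["failed"], label=f"surgeon {name}")
    print(kmf.survival_function_.tail(1))   # cumulative implant survival per surgeon

# Log-rank test across the surgeon-stratified survival curves.
result = multivariate_logrank_test(df["months"], df["surgeon"], df["failed"])
print(f"log-rank p = {result.p_value:.3f}")
```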

  7. Resiliency of the Nation's Power Grid: Assessing Risks of Premature Failure of Large Power Transformers Under Climate Warming and Increased Heat Waves

    Science.gov (United States)

    Schlosser, C. A.; Gao, X.; Morgan, E.

    2017-12-01

    The aging pieces of our nation's power grid - the largest machine ever built - are at a critical time. Key assets in the transmission system, including large power transformers (LPTs), are approaching their originally designed lifetimes. Moreover, extreme weather and climate events upon which these design lifetimes are partially based are expected to change. In particular, more frequent and intense heat waves can accelerate the degradation of LPTs' insulation/cooling system. Thus, there are likely thousands of LPTs across the United States under increasing risk of premature failure - yet this risk has not been assessed. In this study, we investigate the impact of climate warming and corresponding shifts in heat waves for critical LPTs located in the Northeast corridor of the United States to assess: To what extent do changes in heat waves/events present a rising threat to the transformer network over the Northeast U.S. and to what extent can climate mitigation reduce this risk? This study focuses on a collection of LPTs with a high degree of "betweenness" - while recognizing other factors such as: connectivity, voltage rating, MVA rating, approximate price, weight, location/proximity to major transportation routes, and age. To assess the risk of future change in heat wave occurrence we use an analogue method, which detects the occurrence of heat waves based on associated large-scale atmospheric conditions. This method is compared to the more conventional approach that uses model-simulated daily maximum temperature. Under future climate warming scenarios, multi-model medians of both methods indicate strong increases in heat wave frequency during the latter half of this century. Under weak climate mitigation - the risks imposed from heat wave occurrence could quadruple, but a modest mitigation scenario cuts the increasing threat in half. As important, the analogue method substantially improves the model consensus through reduction of the interquartile range by a

  8. Extended analysis of cooling curves

    International Nuclear Information System (INIS)

    Djurdjevic, M.B.; Kierkus, W.T.; Liliac, R.E.; Sokolowski, J.H.

    2002-01-01

    Thermal Analysis (TA) is the measurement of changes in a physical property of a material that is heated through a phase transformation temperature range. The temperature changes in the material are recorded as a function of the heating or cooling time in such a manner that allows for the detection of phase transformations. In order to increase accuracy, characteristic points on the cooling curve have been identified using the first derivative curve plotted versus time. In this paper, an alternative approach to the analysis of the cooling curve has been proposed. The first derivative curve has been plotted versus temperature and all characteristic points have been identified with the same accuracy achieved using the traditional method. The new cooling curve analysis also enables the Dendrite Coherency Point (DCP) to be detected using only one thermocouple. (author)
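
    The alternative analysis described above, plotting the first derivative against temperature rather than time, can be sketched in a few lines; the cooling curve here is a toy function, not measured data.

```python
import numpy as np

# Toy cooling curve: time (s) and temperature (deg C) with a mild thermal arrest.
t = np.linspace(0.0, 300.0, 601)
T = 700.0 - 0.8 * t + 15.0 * np.exp(-((t - 120.0) / 20.0) ** 2)

dTdt = np.gradient(T, t)                 # first derivative of the cooling curve
order = np.argsort(T)                    # index the derivative by temperature
derivative_vs_temperature = np.column_stack([T[order], dTdt[order]])
# Characteristic (phase-transformation) points now appear directly on the
# temperature axis instead of the time axis.
```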

  9. Elimination of chromatographic and mass spectrometric problems in GC-MS analysis of Lavender essential oil by multivariate curve resolution techniques: Improving the peak purity assessment by variable size moving window-evolving factor analysis.

    Science.gov (United States)

    Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan

    2015-03-01

    In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak shape deformation (non-Gaussian peaks), low S/N ratio and co-elution (overlapped and/or embedded peaks) require researchers to handle them in order to save time, money and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, which resulted in extending the number of identified constituents from 56 to 143 components. It was found that the most abundant constituents of the Iranian Lavender essential oil are α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%) and camphene (6.50%). This indicates that the Iranian type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed the composition similarity between Iranian and Italian (Sardinia Island) Lavenders.
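
    The variable size moving window-evolving factor analysis introduced in the paper is not detailed in the abstract; the sketch below is only a fixed-window simplification showing the underlying idea of tracking singular values along the elution axis to flag impure (co-eluting) peak regions.

```python
import numpy as np

def moving_window_efa(X, window=15, n_values=3):
    """Singular values of successive sub-matrices of a GC-MS data matrix X
    (rows = elution times, columns = m/z channels). More than one significant
    singular value inside a window suggests a co-eluting, impure peak region."""
    n_rows = X.shape[0]
    out = np.full((n_rows - window + 1, n_values), np.nan)
    for i in range(n_rows - window + 1):
        s = np.linalg.svd(X[i:i + window], compute_uv=False)
        k = min(n_values, s.size)
        out[i, :k] = s[:k]
    return out

# Usage on a hypothetical 500-scan x 300-channel chromatogram:
# purity_map = moving_window_efa(chromatogram, window=15)
```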

  10. Pathophysiological Characteristics Underlying Different Glucose Response Curves

    DEFF Research Database (Denmark)

    Hulman, Adam; Witte, Daniel R; Vistisen, Dorte

    2018-01-01

    different glucose curve patterns and studied their stability and reproducibility over 3 years of follow-up. RESEARCH DESIGN AND METHODS: We analyzed data from participants without diabetes from the observational cohort from the European Group for the Study of Insulin Resistance: Relationship between Insulin Sensitivity and Cardiovascular Disease study; participants had a five-time point OGTT at baseline (n = 1,443) and after 3 years (n = 1,045). Measures of insulin sensitivity and secretion were assessed at baseline with a euglycemic-hyperinsulinemic clamp and intravenous glucose tolerance test. Heterogeneous ... and secretion. The glucose patterns identified at follow-up were similar to those at baseline, suggesting that the latent class method is robust. We integrated our classification model into an easy-to-use online application that facilitates the assessment of glucose curve patterns for other studies. CONCLUSIONS ...

  11. IDF-curves for precipitation In Belgium

    International Nuclear Information System (INIS)

    Mohymont, Bernard; Demarde, Gaston R.

    2004-01-01

    The Intensity-Duration-Frequency (IDF) curves for precipitation constitute a relationship between the intensity, the duration and the frequency of rainfall amounts. The intensity of precipitation is expressed in mm/h, the duration or aggregation time is the length of the interval considered, while the frequency stands for the probability of occurrence of the event. IDF-curves constitute a classical and useful tool that is primarily used to dimension hydraulic structures in general, such as sewer systems, and that is consequently used to assess the risk of inundation. In this presentation, the IDF relation for precipitation is studied for different locations in Belgium. These locations correspond to two long-term, high-quality precipitation networks of the RMIB: (a) the daily precipitation depths of the climatological network (more than 200 stations, 1951-2001 baseline period); (b) the high-frequency 10-minute precipitation depths of the hydrometeorological network (more than 30 stations, 15 to 33 years baseline period). For the station of Uccle, an uninterrupted time series of more than one hundred years of 10-minute rainfall data is available. The proposed technique for assessing the curves is based on maximum annual values of precipitation. A new analytical formula for the IDF-curves was developed such that these curves stay valid for aggregation times ranging from 10 minutes to 30 days (when fitted with appropriate data). Moreover, all parameters of this formula have physical dimensions. Finally, adequate spatial interpolation techniques are used to provide nationwide extreme precipitation depths for short- to long-term durations with a given return period. These values are estimated on the grid points of the Belgian ALADIN domain used in the operational weather forecasts at the RMIB. (Author)
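
    One common way to obtain depths, and hence intensities, for a given duration and return period is to fit an extreme-value distribution to annual maxima. The sketch below uses a Gumbel fit on hypothetical 60-minute annual maxima; it is not the analytical IDF formula developed in the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum rainfall depths (mm) for a 60-minute duration.
annual_max_60min = np.array([18.2, 22.5, 15.8, 30.1, 24.3, 19.9, 27.4, 21.0,
                             33.6, 17.5, 25.8, 20.4, 28.9, 23.1, 16.7])

loc, scale = stats.gumbel_r.fit(annual_max_60min)   # Gumbel (EV1) fit to annual maxima

for return_period in (2, 10, 50, 100):              # return periods in years
    depth = stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc=loc, scale=scale)
    print(f"T = {return_period:>3} yr: depth = {depth:5.1f} mm "
          f"-> intensity = {depth:5.1f} mm/h over 60 min")
```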

  12. Modeling of Triangular Lattice Space Structures with Curved Battens

    Science.gov (United States)

    Chen, Tzikang; Wang, John T.

    2005-01-01

    Techniques for simulating an assembly process of lattice structures with curved battens were developed. The shape of the curved battens, the tension in the diagonals, and the compression in the battens were predicted for the assembled model. To be able to perform the assembly simulation, a cable-pulley element was implemented, and geometrically nonlinear finite element analyses were performed. Three types of finite element models were created from assembled lattice structures for studying the effects of design and modeling variations on the load carrying capability. Discrepancies in the predictions from these models were discussed. The effects of diagonal constraint failure were also studied.

  13. Investigating sensitivity, specificity, and area under the curve of the Clinical COPD Questionnaire, COPD Assessment Test, and Modified Medical Research Council scale according to GOLD using St George's Respiratory Questionnaire cutoff 25 (and 20 as reference

    Directory of Open Access Journals (Sweden)

    Tsiligianni IG

    2016-05-01

    Full Text Available Ioanna G Tsiligianni,1,2 Harma J Alma,1,2 Corina de Jong,1,2 Danijel Jelusic,3 Michael Wittmann,3 Michael Schuler,4 Konrad Schultz,3 Boudewijn J Kollen,1 Thys van der Molen,1,2 Janwillem WH Kocks1,2 1Department of General Practice, 2GRIAC Research Institute, University Medical Center Groningen, University of Groningen, Groningen, the Netherlands; 3Klinik Bad Reichenhall, Center for Rehabilitation, Pulmonology and Orthopedics, Bad Reichenhall, 4Department of Medical Psychology, Psychotherapy and Rehabilitation Sciences, University of Würzburg, Würzburg, Germany Background: In the GOLD (Global initiative for chronic Obstructive Lung Disease) strategy document, the Clinical COPD Questionnaire (CCQ), COPD Assessment Test (CAT), or modified Medical Research Council (mMRC) scale are recommended for the assessment of symptoms using the cutoff points of CCQ ≥1, CAT ≥10, and mMRC scale ≥2 to indicate symptomatic patients. The current study investigates the criterion validity of the CCQ, CAT and mMRC scale based on a reference cutoff point of St George’s Respiratory Questionnaire (SGRQ) ≥25, as suggested by GOLD, following sensitivity and specificity analysis. In addition, areas under the curve (AUCs) of the CCQ, CAT, and mMRC scale were compared using two SGRQ cutoff points (≥25 and ≥20). Materials and methods: Two data sets were used: study A, 238 patients from a pulmonary rehabilitation program; and study B, 101 patients from primary care. Receiver-operating characteristic (ROC) curves were used to assess the correspondence between the recommended cutoff points of the questionnaires. Results: Sensitivity, specificity, and AUC scores for cutoff point SGRQ ≥25 were: study A, 0.99, 0.43, and 0.96 for CCQ ≥1, 0.92, 0.48, and 0.89 for CAT ≥10, and 0.68, 0.91, and 0.91 for mMRC ≥2; study B, 0.87, 0.77, and 0.9 for CCQ ≥1, 0.76, 0.73, and 0.82 for CAT ≥10, and 0.21, 1, and 0.81 for mMRC ≥2. Sensitivity, specificity, and AUC scores for

  14. Investigating sensitivity, specificity, and area under the curve of the Clinical COPD Questionnaire, COPD Assessment Test, and Modified Medical Research Council scale according to GOLD using St George's Respiratory Questionnaire cutoff 25 (and 20) as reference.

    Science.gov (United States)

    Tsiligianni, Ioanna G; Alma, Harma J; de Jong, Corina; Jelusic, Danijel; Wittmann, Michael; Schuler, Michael; Schultz, Konrad; Kollen, Boudewijn J; van der Molen, Thys; Kocks, Janwillem Wh

    2016-01-01

    In the GOLD (Global initiative for chronic Obstructive Lung Disease) strategy document, the Clinical COPD Questionnaire (CCQ), COPD Assessment Test (CAT), or modified Medical Research Council (mMRC) scale are recommended for the assessment of symptoms using the cutoff points of CCQ ≥1, CAT ≥10, and mMRC scale ≥2 to indicate symptomatic patients. The current study investigates the criterion validity of the CCQ, CAT and mMRC scale based on a reference cutoff point of St George's Respiratory Questionnaire (SGRQ) ≥25, as suggested by GOLD, following sensitivity and specificity analysis. In addition, areas under the curve (AUCs) of the CCQ, CAT, and mMRC scale were compared using two SGRQ cutoff points (≥25 and ≥20). Two data sets were used: study A, 238 patients from a pulmonary rehabilitation program; and study B, 101 patients from primary care. Receiver-operating characteristic (ROC) curves were used to assess the correspondence between the recommended cutoff points of the questionnaires. Sensitivity, specificity, and AUC scores for cutoff point SGRQ ≥25 were: study A, 0.99, 0.43, and 0.96 for CCQ ≥1, 0.92, 0.48, and 0.89 for CAT ≥10, and 0.68, 0.91, and 0.91 for mMRC ≥2; study B, 0.87, 0.77, and 0.9 for CCQ ≥1, 0.76, 0.73, and 0.82 for CAT ≥10, and 0.21, 1, and 0.81 for mMRC ≥2. Sensitivity, specificity, and AUC scores for cutoff point SGRQ ≥20 were: study A, 0.99, 0.73, and 0.99 for CCQ ≥1, 0.91, 0.73, and 0.94 for CAT ≥10, and 0.66, 0.95, and 0.94 for mMRC ≥2; study B, 0.8, 0.89, and 0.89 for CCQ ≥1, 0.69, 0.78, and 0.8 for CAT ≥10, and 0.18, 1, and 0.81 for mMRC ≥2. Based on data from these two different samples, this study showed that the suggested cutoff point for the SGRQ (≥25) did not seem to correspond well with the established cutoff points of the CCQ or CAT scales, resulting in low specificity levels. The correspondence with the mMRC scale seemed satisfactory, though not optimal. The SGRQ threshold of ≥20
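
    A small sketch of how sensitivity, specificity, and AUC can be computed for one questionnaire against the SGRQ ≥25 reference; the scores and labels below are hypothetical.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical CCQ scores and reference labels (1 = SGRQ >= 25).
ccq = np.array([0.4, 1.2, 2.5, 1.3, 3.1, 0.6, 1.8, 0.3, 2.2, 0.9])
sgrq_high = np.array([0, 1, 1, 0, 1, 0, 1, 0, 1, 1])

auc = roc_auc_score(sgrq_high, ccq)               # area under the ROC curve

predicted_pos = ccq >= 1.0                        # GOLD cutoff CCQ >= 1
tp = np.sum(predicted_pos & (sgrq_high == 1))
fn = np.sum(~predicted_pos & (sgrq_high == 1))
tn = np.sum(~predicted_pos & (sgrq_high == 0))
fp = np.sum(predicted_pos & (sgrq_high == 0))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"AUC = {auc:.2f}, sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```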

  15. Computational aspects of algebraic curves

    CERN Document Server

    Shaska, Tanush

    2005-01-01

    The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove

  16. Assessing customers’ perception regarding service failure and recovery strategies and consumer future behavior in the restaurant industry ;Evidence from Mashhad, Iran

    OpenAIRE

    Memarbashi, Shahryar

    2012-01-01

    ABSTRACT: The main objective of this research is to conduct a study to analyse service failure categories and service recovery strategies used and future customer behavior in the context of hotel restaurants in Mashhad, Iran. Also to evaluate the impact of demographic characteristics of customers on service failure, service recovery and customer intention. The Thesis involves 300 respondents from Mashhad, Iran. The sampling unit is hotel restaurant customers and the data needed for the researc...

  17. The costs of heart failure in Poland from the public payer's perspective. Polish programme assessing diagnostic procedures, treatment and costs in patients with heart failure in randomly selected outpatient clinics and hospitals at different levels of care: POLKARD.

    Science.gov (United States)

    Czech, Marcin; Opolski, Grzegorz; Zdrojewski, Tomasz; Dubiel, Jacek S; Wizner, Barbara; Bolisęga, Dorota; Fedyk-Łukasik, Małgorzata; Grodzicki, Tomasz

    2013-01-01

    Heart failure (HF) is a chronic disease of great clinical and economic significance for both the healthcare system and patients themselves. To determine the consumption of medical resources for treatment and care of HF patients and to estimate the related costs. The study involved 400 primary care practices and 396 specialist outpatient clinics, as well as 259 hospitals at all reference levels. The sample was representative and supplemented with patient interview data. Based on the consumption of particular resources and the unit costs of services in 2011, costs of care for HF patients in Poland were estimated. Separate analyses were conducted depending on the stage of the disease (according to NYHA classification I-IV). The public payer's perspective and a one year time horizon were adopted. Direct annual costs of an HF patient's treatment in Poland may range between PLN 3,373.23 and 7,739.49 (2011), the main cost item being hospitalisation. The total costs for the healthcare system could be as high as PLN 1,703 million, which is 3.16% of the National Health Fund's budget (Ex. rate from 05.03.2012: 1 EUR = 4.14 PLN). The costs of treating heart failure in Poland are high; proper allocation of resources to diagnostic procedures and treatment may contribute to rationalisation of the relevant expenditure.

  18. 51Cr - erythrocyte survival curves

    International Nuclear Information System (INIS)

    Paiva Costa, J. de.

    1982-07-01

    Sixteen patients were studied, fifteen of them in a hemolytic state, plus a normal individual as a control. The aim was to obtain better techniques for the analysis of the erythrocyte survival curves, according to the recommendations of the International Committee of Hematology. The radiochromium method was used as a tracer. A review of the international literature relevant to the work in progress was first carried out, making it possible to establish comparisons and clarify phenomena observed in our investigation. Several parameters affecting both the exponential and the linear curves were considered in this study. Analysis of the erythrocyte survival curves in the studied group revealed that the elution factor did not give a quantitatively homogeneous response in all cases, although the results of the analysis of these curves were established through programs run on an electronic calculator. (Author) [pt

  19. Melting curves of gamma-irradiated DNA

    International Nuclear Information System (INIS)

    Hofer, H.; Altmann, H.; Kehrer, M.

    1978-08-01

    Melting curves of gamma-irradiated DNA, and data derived from them, are reported. The diminished stability is explained by base destruction. DNA denatures completely at room temperature if at least every fifth base pair is broken or weakened by irradiation. (author)

  20. Management of the learning curve

    DEFF Research Database (Denmark)

    Pedersen, Peter-Christian; Slepniov, Dmitrij

    2016-01-01

    Purpose – This paper focuses on the management of the learning curve in overseas capacity expansions. The purpose of this paper is to unravel the direct as well as indirect influences on the learning curve and to advance the understanding of how these affect its management. Design...... the dimensions of the learning process involved in a capacity expansion project and identified the direct and indirect labour influences on the production learning curve. On this basis, the study proposes solutions to managing learning curves in overseas capacity expansions. Furthermore, the paper concludes...... with measures that have the potential to significantly reduce the non-value-added time when establishing new capacities overseas. Originality/value – The paper uses a longitudinal in-depth case study of a Danish wind turbine manufacturer and goes beyond a simplistic treatment of the lead time and learning...

  1. Sex- and Site-Specific Normative Data Curves for HR-pQCT.

    Science.gov (United States)

    Burt, Lauren A; Liang, Zhiying; Sajobi, Tolulope T; Hanley, David A; Boyd, Steven K

    2016-11-01

    The purpose of this study was to develop age-, site-, and sex-specific centile curves for common high-resolution peripheral quantitative computed tomography (HR-pQCT) and finite-element (FE) parameters for males and females older than 16 years. Participants (n = 866) from the Calgary cohort of the Canadian Multicentre Osteoporosis Study (CaMos) between the ages of 16 and 98 years were included in this study. Participants' nondominant radius and left tibia were scanned using HR-pQCT. Standard and automated segmentation methods were performed and FE analysis estimated apparent bone strength. Centile curves were generated for males and females at the tibia and radius using the generalized additive models for location, scale, and shape (GAMLSS) package in R. After GAMLSS analysis, age-, sex-, and site-specific centiles (10th, 25th, 50th, 75th, 90th) for total bone mineral density and trabecular number as well as failure load have been calculated. Clinicians and researchers can use these reference curves as a tool to assess bone health and changes in bone quality. © 2016 American Society for Bone and Mineral Research.
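
    GAMLSS is an R package; a rough Python stand-in for building centile curves is quantile regression of the outcome on a smooth function of age, as sketched below on synthetic data. The variable names and the quadratic age term are assumptions, not the paper's model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: age (years) and a hypothetical failure-load outcome (kN).
rng = np.random.default_rng(0)
age = rng.uniform(16, 98, 500)
load = 6.0 - 0.03 * (age - 16) + rng.normal(0.0, 0.8, age.size)
df = pd.DataFrame({"age": age, "load": load})

# Fit one quantile-regression curve per centile of interest.
centile_fits = {}
for q in (0.10, 0.25, 0.50, 0.75, 0.90):
    centile_fits[q] = smf.quantreg("load ~ age + I(age ** 2)", df).fit(q=q)

print(centile_fits[0.50].params)   # coefficients of the median (50th centile) curve
```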

  2. Growth curves for Laron syndrome.

    OpenAIRE

    Laron, Z; Lilos, P; Klinger, B

    1993-01-01

    Growth curves for children with Laron syndrome were constructed on the basis of repeated measurements made throughout infancy, childhood, and puberty in 24 (10 boys, 14 girls) of the 41 patients with this syndrome investigated in our clinic. Growth retardation was already noted at birth, the birth length ranging from 42 to 46 cm in the 12/20 available measurements. The postnatal growth curves deviated sharply from the normal from infancy on. Both sexes showed no clear pubertal spurt. Girls co...

  3. Flow over riblet curved surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Loureiro, J B R; Freire, A P Silva, E-mail: atila@mecanica.ufrj.br [Mechanical Engineering Program, Federal University of Rio de Janeiro (COPPE/UFRJ), C.P. 68503, 21.941-972, Rio de Janeiro, RJ (Brazil)

    2011-12-22

    The present work studies the mechanics of turbulent drag reduction over curved surfaces by riblets. The effects of surface modification on flow separation over steep and smooth curved surfaces are investigated. Four types of two-dimensional surfaces are studied based on the morphometric parameters that describe the body of a blue whale. Local measurements of mean velocity and turbulence profiles are obtained through laser Doppler anemometry (LDA) and particle image velocimetry (PIV).

  4. Association between Functional Variables and Heart Failure after Myocardial Infarction in Rats

    Energy Technology Data Exchange (ETDEWEB)

    Polegato, Bertha F.; Minicucci, Marcos F.; Azevedo, Paula S.; Gonçalves, Andréa F.; Lima, Aline F.; Martinez, Paula F.; Okoshi, Marina P.; Okoshi, Katashi; Paiva, Sergio A. R.; Zornoff, Leonardo A. M., E-mail: lzornoff@fmb.unesp.br [Faculdade de Medicina de Botucatu - Universidade Estadual Paulista ' Júlio de mesquita Filho' - UNESP Botucatu, SP (Brazil)

    2016-02-15

    Heart failure prediction after acute myocardial infarction may have important clinical implications. To analyze the functional echocardiographic variables associated with heart failure in an infarction model in rats. The animals were divided into two groups: control and infarction. Subsequently, the infarcted animals were divided into groups: with and without heart failure. The predictive values were assessed by logistic regression. The cutoff values predictive of heart failure were determined using ROC curves. Six months after surgery, 88 infarcted animals and 43 control animals were included in the study. Myocardial infarction increased left cavity diameters and the mass and wall thickness of the left ventricle. Additionally, myocardial infarction resulted in systolic and diastolic dysfunction, characterized by lower area variation fraction values, posterior wall shortening velocity, E-wave deceleration time, associated with higher values of E / A ratio and isovolumic relaxation time adjusted by heart rate. Among the infarcted animals, 54 (61%) developed heart failure. Rats with heart failure have higher left cavity mass index and diameter, associated with worsening of functional variables. The area variation fraction, the E/A ratio, E-wave deceleration time and isovolumic relaxation time adjusted by heart rate were functional variables predictors of heart failure. The cutoff values of functional variables associated with heart failure were: area variation fraction < 31.18%; E / A > 3.077; E-wave deceleration time < 42.11 and isovolumic relaxation time adjusted by heart rate < 69.08. In rats followed for 6 months after myocardial infarction, the area variation fraction, E/A ratio, E-wave deceleration time and isovolumic relaxation time adjusted by heart rate are predictors of heart failure onset.
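
    The ROC-based cutoff determination described above can be sketched as follows; the fractional area change values and outcomes are hypothetical, and choosing the cutoff by the Youden index is an assumption about how the thresholds were derived.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical data: fractional area change (%) and heart failure outcome (1 = yes).
fac = np.array([22, 35, 28, 41, 19, 30, 25, 38, 27, 33, 21, 45], dtype=float)
hf = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0])

# Lower FAC predicts failure, so use the negated value as the ranking score.
fpr, tpr, thresholds = roc_curve(hf, -fac)
youden = tpr - fpr
cutoff = -thresholds[np.argmax(youden)]   # FAC at or below this value predicts failure
print(f"ROC-derived FAC cutoff ~= {cutoff:.1f}%")
```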

  5. Association between Functional Variables and Heart Failure after Myocardial Infarction in Rats

    International Nuclear Information System (INIS)

    Polegato, Bertha F.; Minicucci, Marcos F.; Azevedo, Paula S.; Gonçalves, Andréa F.; Lima, Aline F.; Martinez, Paula F.; Okoshi, Marina P.; Okoshi, Katashi; Paiva, Sergio A. R.; Zornoff, Leonardo A. M.

    2016-01-01

    Heart failure prediction after acute myocardial infarction may have important clinical implications. The objective was to analyze the functional echocardiographic variables associated with heart failure in an infarction model in rats. The animals were divided into two groups: control and infarction. Subsequently, the infarcted animals were divided into groups with and without heart failure. The predictive values were assessed by logistic regression, and the cutoff values predictive of heart failure were determined using ROC curves. Six months after surgery, 88 infarcted animals and 43 control animals were included in the study. Myocardial infarction increased left cavity diameters and the mass and wall thickness of the left ventricle. Additionally, myocardial infarction resulted in systolic and diastolic dysfunction, characterized by lower values of area variation fraction, posterior wall shortening velocity, and E-wave deceleration time, associated with higher values of the E/A ratio and isovolumic relaxation time adjusted by heart rate. Among the infarcted animals, 54 (61%) developed heart failure. Rats with heart failure had a higher left cavity mass index and diameter, associated with worsening of the functional variables. The area variation fraction, the E/A ratio, E-wave deceleration time, and isovolumic relaxation time adjusted by heart rate were the functional variables predictive of heart failure. The cutoff values associated with heart failure were: area variation fraction < 31.18%; E/A ratio > 3.077; E-wave deceleration time < 42.11; and isovolumic relaxation time adjusted by heart rate < 69.08. In rats followed for 6 months after myocardial infarction, the area variation fraction, E/A ratio, E-wave deceleration time, and isovolumic relaxation time adjusted by heart rate are predictors of heart failure onset.

  6. Intersection numbers of spectral curves

    CERN Document Server

    Eynard, B.

    2011-01-01

    We compute the symplectic invariants of an arbitrary spectral curve with only one branch point in terms of integrals of characteristic classes in the moduli space of curves. Our formula associates to any spectral curve a characteristic class, which is determined by the Laplace transform of the spectral curve. This hints at the key role of the Laplace transform in mirror symmetry. When the spectral curve is y=\sqrt{x}, the formula gives the Kontsevich--Witten intersection numbers; when the spectral curve is chosen to be the Lambert function \exp{x}=y\exp{-y}, it gives the ELSV formula for Hurwitz numbers; and when one chooses the mirror of C^3 with framing f, i.e. \exp{-x}=\exp{-yf}(1-\exp{-y}), it gives the Marino-Vafa formula, i.e. the generating function of Gromov-Witten invariants of C^3. In this sense the formula generalizes the ELSV, Marino-Vafa and Mumford formulas.
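
    For readability, the three special spectral curves named above and the invariants they generate can be set in display form (this restates the inline formulas of the abstract, no additional results):

```latex
\begin{align*}
  y &= \sqrt{x}                            && \text{Kontsevich--Witten intersection numbers} \\
  e^{x} &= y\, e^{-y}                      && \text{(Lambert) ELSV formula for Hurwitz numbers} \\
  e^{-x} &= e^{-y f}\,\bigl(1 - e^{-y}\bigr) && \text{(mirror of } \mathbb{C}^{3} \text{ with framing } f\text{) Mari\~no--Vafa formula}
\end{align*}
```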

  7. Dissolution glow curve in LLD

    International Nuclear Information System (INIS)

    Haverkamp, U.; Wiezorek, C.; Poetter, R.

    1990-01-01

    Lyoluminescence dosimetry is based upon light emission during dissolution of previously irradiated dosimetric materials. The lyoluminescence signal is expressed in the dissolution glow curve. These curves begin, depending on the dissolution system, with a high peak followed by an exponentially decreasing intensity. System parameters that influence the shape of the dissolution glow curve are, for example, the injection speed, the temperature and pH value of the solution, and the design of the dissolution cell. The initial peak does not correlate significantly with the absorbed dose; it is mainly an effect of the injection. The decay of the curve consists of two exponential components: one fast and one slow. The components depend on the absorbed dose and the dosimetric materials used. In particular, the slow component correlates with the absorbed dose. In contrast to the fast component, the argument of the exponential function of the slow component is independent of the dosimetric materials investigated: trehalose, glucose and mannitol. The maximum value following the initial peak and the integral light output are measures of the absorbed dose. The reason for the different light outputs of various dosimetric materials after irradiation with the same dose is their differing solubility. The character of the dissolution glow curves is the same following irradiation with photons, electrons or neutrons. (author)
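
    As a purely illustrative sketch of the two-component decay described above (synthetic signal, hypothetical amplitudes and decay constants; not the authors' analysis), the decaying part of a dissolution glow curve can be fitted as the sum of a fast and a slow exponential, and the light output integrated:

```python
# Illustrative sketch: fit the decay of a dissolution glow curve with a fast and
# a slow exponential component, then integrate the light output. Synthetic data.
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import trapezoid

def two_exp(t, a_fast, k_fast, a_slow, k_slow):
    return a_fast * np.exp(-k_fast * t) + a_slow * np.exp(-k_slow * t)

t = np.linspace(0, 60, 300)                      # time after the injection peak (s)
signal = two_exp(t, 8.0, 0.5, 2.0, 0.05)         # synthetic "measured" intensity
signal += np.random.default_rng(1).normal(0, 0.05, t.size)

p0 = (5.0, 0.3, 1.0, 0.02)                       # rough initial guesses
params, _ = curve_fit(two_exp, t, signal, p0=p0)
a_fast, k_fast, a_slow, k_slow = params

print("slow-component amplitude (dose-related):", a_slow)
print("integrated light output:", trapezoid(signal, t))
```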

  8. Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.

    Science.gov (United States)

    Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M

    2014-12-01

    In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.
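
    A minimal sketch of the data-depth idea for curves (here the simple band depth over pairs of ensemble members, applied to a synthetic ensemble; this is not the authors' implementation):

```python
# Illustrative sketch: band-depth ranking of an ensemble of 1-D curves.
# The deepest curve plays the role of the functional median in a curve boxplot.
import numpy as np
from itertools import combinations

def band_depth(curves):
    """curves: array of shape (n_curves, n_points). Returns one depth per curve."""
    n = curves.shape[0]
    depth = np.zeros(n)
    pairs = list(combinations(range(n), 2))
    for k in range(n):
        inside = 0
        for i, j in pairs:
            lower = np.minimum(curves[i], curves[j])
            upper = np.maximum(curves[i], curves[j])
            if np.all((curves[k] >= lower) & (curves[k] <= upper)):
                inside += 1
        depth[k] = inside / len(pairs)
    return depth

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
ensemble = np.array([np.sin(2 * np.pi * x) + rng.normal(0, 0.2) for _ in range(20)])

depths = band_depth(ensemble)
median_curve = ensemble[np.argmax(depths)]       # deepest curve = functional median
print("depth of each ensemble member:", np.round(depths, 2))
```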

  9. Heart failure - medicines

    Science.gov (United States)

    CHF - medicines; Congestive heart failure - medicines; Cardiomyopathy - medicines; HF - medicines ... You will need to take most of your heart failure medicines every day. Some medicines are taken ...

  10. Treatment of thoraco-lumbar curves in adolescent females affected by idiopathic scoliosis with a progressive action short brace (PASB): assessment of results according to the SRS committee on bracing and nonoperative management standardization criteria

    Directory of Open Access Journals (Sweden)

    Perisano Carlo

    2009-09-01

    Background: The effectiveness of conservative treatment of scoliosis is controversial. Some studies suggest that bracing is effective in stopping curve progression, whilst others did not report such an effect. The purpose of the present study was to assess the effectiveness of the Progressive Action Short Brace (PASB) in the correction of thoraco-lumbar curves, in agreement with the Scoliosis Research Society (SRS) Committee on Bracing and Nonoperative Management Standardisation Criteria. Methods: Fifty adolescent females (mean age 11.8 ± 0.5 years) with a thoraco-lumbar curve and a pre-treatment Risser score ranging from 0 to 2 were enrolled. The minimum duration of follow-up was 24 months (mean: 55.4 ± 44.5 months). Antero-posterior radiographs were used to estimate the curve magnitude (CM) and the torsion of the apical vertebra (TA) at 5 time points: beginning of treatment (t1), one year after the beginning of treatment (t2), intermediate time between t1 and t4 (t3), end of weaning (t4), and 2-year minimum follow-up from t4 (t5). Three outcomes were distinguished: curve correction, curve stabilisation and curve progression. The Kruskal-Wallis and Spearman rank correlation tests were used for statistical analysis. Results: The mean CM was 29.30 ± 5.16 SD at t1 and 14.67 ± 7.65 SD at t5. TA was 12.70 ± 6.14 SD at t1 and 8.95 ± 5.82 at t5. The variations in Cobb and Perdriolle degrees across t1-t5, and between CM at t5 and t1 and between TA at t5 and t1, were statistically significant. Curve correction was accomplished in 94% of patients, whereas curve stabilisation was obtained in 6% of patients. Conclusion: The PASB, due to its particular biomechanical action on vertebral modelling, is highly effective in correcting thoraco-lumbar curves.
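
    For illustration only, the kind of nonparametric testing named above could be run as follows (the measurement vectors are hypothetical stand-ins, not the study data):

```python
# Illustrative sketch: Kruskal-Wallis test across time points and Spearman rank
# correlation between curve magnitude and apical torsion. Synthetic measurements.
import numpy as np
from scipy.stats import kruskal, spearmanr

rng = np.random.default_rng(3)
cm_t1 = rng.normal(29.3, 5.2, 50)   # curve magnitude at start of treatment (deg)
cm_t3 = rng.normal(20.0, 6.0, 50)   # intermediate time point
cm_t5 = rng.normal(14.7, 7.7, 50)   # 2-year minimum follow-up
ta_t5 = rng.normal(9.0, 5.8, 50)    # apical torsion at follow-up (deg)

h_stat, p_kw = kruskal(cm_t1, cm_t3, cm_t5)
rho, p_sp = spearmanr(cm_t5, ta_t5)
print(f"Kruskal-Wallis: H={h_stat:.2f}, p={p_kw:.4f}")
print(f"Spearman: rho={rho:.2f}, p={p_sp:.4f}")
```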

  11. Triggers of State Failure

    Science.gov (United States)

    2010-03-01

    ...to 1990) with the aid of additional collateral data. The authors of the document did not attempt to track events that were still in progress... state failure, and the authors of the document acknowledged that it would not be very useful to reproduce all of this information. The document draws on... Carment, David. Assessing state failure: implications for theory and policy. Third World Quarterly, Vol. 24, No. 3, pp. 407-427. DRDC

  12. TELECOMMUNICATIONS INFRASTRUCTURE AND GDP /JIPP CURVE/

    Directory of Open Access Journals (Sweden)

    Mariana Kaneva

    2016-07-01

    The relationship between telecommunications infrastructure and economic activity is discussed in many scientific papers. Most authors use the Jipp curve for research and analysis. Many doubts about the correctness of the Jipp curve arise when econometric models are applied. The aim of this study is a review of the Jipp curve, refining the possibility of its application under modern conditions. The methodology used in the study is based on dynamic econometric models, including tests for nonstationarity and tests for causality. The focus of this study is on methodological problems in measuring the local density of different types of telecommunication networks. The study offers a specific methodology for assessing the Jipp law through a VAR approach and Granger causality tests. It is proved that the mechanical substitution of momentary aggregated variables (such as the number of subscribers of a telecommunication network at the end of the year) and periodically aggregated variables (such as GDP per capita) into the Jipp curve is methodologically wrong. Researchers have to reconsider the relationship set out in the Jipp curve by including additional variables that characterize the telecommunications sector and the economic activity in a particular country within a specified time period. GDP per capita should not be regarded as the single factor determining the local density of telecommunications infrastructure. New econometric models studying the relationship between investments in telecommunications infrastructure and economic development need not be only linear regression models, but may also be other econometric models. New econometric models should be proposed after testing and validation against sound economic theory and econometric methodology.
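
    A minimal sketch of the VAR/Granger-causality step described above, using statsmodels on two synthetic series (the series, the differencing, and the lag order are assumptions for illustration; a real analysis would first test for nonstationarity and choose lags formally):

```python
# Illustrative sketch: does GDP per capita Granger-cause telecommunications density?
# Both series are synthetic stand-ins, first-differenced to reduce nonstationarity.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(4)
n = 40
gdp = np.cumsum(rng.normal(1.0, 0.5, n))          # GDP per capita (synthetic trend)
noise = rng.normal(0, 0.3, n)
density = np.empty(n)
density[0] = noise[0]
for t in range(1, n):
    density[t] = 0.5 * gdp[t - 1] + noise[t]       # density responds to lagged GDP

data = pd.DataFrame({"density": np.diff(density), "gdp": np.diff(gdp)})
# Tests whether the second column (gdp) Granger-causes the first (density).
results = grangercausalitytests(data[["density", "gdp"]], maxlag=2)
```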

  13. Variation in Cognitive Failures: An Individual Differences Investigation of Everyday Attention and Memory Failures

    Science.gov (United States)

    Unsworth, Nash; Brewer, Gene A.; Spillers, Gregory J.

    2012-01-01

    The present study examined individual differences in everyday cognitive failures assessed by diaries. A large sample of participants completed various cognitive ability measures in the laboratory. Furthermore, a subset of these participants also recorded everyday cognitive failures (attention, retrospective memory, and prospective memory failures)…

  14. Failure analysis of parameter-induced simulation crashes in climate models

    Science.gov (United States)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
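
    By way of illustration (not the study's actual pipeline), a failure classifier of this kind can be built and scored by AUC as follows; the 18-dimensional parameter samples and the failure rule are synthetic assumptions:

```python
# Illustrative sketch: predict simulation failure from parameter values with an
# SVM classifier and score it by ROC AUC. All samples and labels are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(500, 18))              # normalized parameter samples
# Synthetic failure rule: crashes cluster in a corner of two "mixing" parameters.
y = ((X[:, 0] > 0.85) & (X[:, 3] < 0.2)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = SVC(kernel="rbf", probability=True).fit(X_train, y_train)
p_fail = clf.predict_proba(X_test)[:, 1]           # predicted probability of failure
print("validation AUC:", roc_auc_score(y_test, p_fail))
```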

  15. A practical procedure for the selection of time-to-failure models based on the assessment of trends in maintenance data

    International Nuclear Information System (INIS)

    Louit, D.M.; Pascual, R.; Jardine, A.K.S.

    2009-01-01

    Reliability studies often rely on false premises, such as the assumption of independent and identically distributed times between failures (a renewal process). This can lead to erroneous model se