WorldWideScience

Sample records for kaplan-meier estimated probability

  1. About an adaptively weighted Kaplan-Meier estimate.

    Science.gov (United States)

    Plante, Jean-François

    2009-09-01

    The minimum averaged mean squared error nonparametric adaptive weights use data from m possibly different populations to draw inference about one population of interest. The definition of these weights is based on properties of the empirical distribution function. We use the Kaplan-Meier estimate to let the weights accommodate right-censored data and use them to define the weighted Kaplan-Meier estimate. The proposed estimate is smoother than the usual Kaplan-Meier estimate and converges uniformly in probability to the target distribution. Simulations show that the performance of the weighted Kaplan-Meier estimate on finite samples exceeds that of the usual Kaplan-Meier estimate. A case study is also presented.

  2. Modified Weighted Kaplan-Meier Estimator

    Directory of Open Access Journals (Sweden)

    Mohammad Shafiq

    2007-01-01

    Full Text Available In many medical studies the majority of study subjects do not reach the event of interest during the study period. In such situations survival probabilities can be estimated for censored observations by the Kaplan-Meier estimator. However, in the case of heavy censoring these estimates are biased and overestimate the survival probabilities. For heavy censoring a new method was proposed (Bahrawar Jan, 2005) to estimate the survival probabilities by weighting the censored observations by the non-censoring rate. But the main defect of this weighted method is that it gives zero weight to the last censored observation. To overcome this difficulty a new weight is proposed which also gives a non-zero weight to the last censored observation.

  3. Understanding survival analysis: Kaplan-Meier estimate.

    Science.gov (United States)

    Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal

    2010-10-01

    The Kaplan-Meier estimate is one of the best options for measuring the fraction of subjects living for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects who survived or were saved by that intervention over a period of time. The time starting from a defined point to the occurrence of a given event, for example death, is called the survival time, and the analysis of such group data is called survival analysis. The analysis can be affected by subjects who are uncooperative and refuse to remain in the study, by subjects who do not experience the event before the end of the study although they would have experienced it or died if observation had continued, or by subjects with whom we lose touch midway through the study. We label these situations as censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time in spite of all these difficulties associated with subjects or situations. The survival curve can be created assuming various situations. It involves computing the probability of the event occurring at a certain point in time and multiplying these successive probabilities by any earlier computed probabilities to get the final estimate. The estimate can be calculated for two groups of subjects, and the statistical difference between their survival curves can also be tested. This can be used in Ayurveda research when comparing two drugs and looking at the survival of subjects.
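
The product-limit computation described in this record (a conditional survival probability at each event time, multiplied into the running product) can be sketched in a few lines of Python. This is a minimal illustrative implementation on made-up data, not the code of any statistical package:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.

    times  : observed follow-up times
    events : 1 if the event occurred at that time, 0 if censored
    Returns a list of (event_time, survival_probability) steps.
    """
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        m = sum(1 for tt, _ in data if tt == t)             # all obs tied at t
        at_risk = n - i
        if d > 0:
            surv *= 1.0 - d / at_risk   # conditional survival past t
            curve.append((t, surv))
        i += m                          # censored obs leave the risk set
    return curve

# Five subjects: events at t=1, 2, 3; censored at t=2 and t=4
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```

Censored observations contribute to the risk set up to their censoring time and then drop out without forcing a step in the curve, which is exactly how they enter the product-limit formula.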

  4. The Kaplan-Meier Integral in the Presence of Covariates

    DEFF Research Database (Denmark)

    Gerds, Thomas A.; Beyersmann, Jan; Starkopf, Liis

    2017-01-01

    In a series of papers, Winfried Stute introduced and studied the Kaplan-Meier integral as an estimator of parameters of the joint distribution of survival times and covariates based on right-censored survival times. We present a review of this work and show that his estimator has an inverse probability of censoring weighting (IPCW) representation. We further investigate large sample bias and efficiency. As a central application in a biostatistical context, Kaplan-Meier integrals are used to estimate transition probabilities in a non-Markov illness-death model. We extend already existing...

  5. The Kaplan-Meier Theatre

    Science.gov (United States)

    Gerds, Thomas A.

    2016-01-01

    Survival is difficult to estimate when observation periods of individuals differ in length. Students imagine sailing the Titanic and then recording whether they "live" or "die." A clever algorithm is performed which results in the Kaplan-Meier estimate of survival.

  6. Competing risk bias was common in Kaplan-Meier risk estimates published in prominent medical journals.

    Science.gov (United States)

    van Walraven, Carl; McAlister, Finlay A

    2016-01-01

    Risk estimates from Kaplan-Meier curves are well known to medical researchers, reviewers, and editors. In this study, we determined the proportion of Kaplan-Meier analyses published in prominent medical journals that are potentially biased because of competing events ("competing risk bias"). We randomly selected 100 studies that had at least one Kaplan-Meier analysis and were recently published in prominent medical journals. Susceptibility to competing risk bias was determined by examining the outcome and potential competing events. In susceptible studies, bias was quantified using a previously validated prediction model when the numbers of outcomes and competing events were reported. Forty-six studies (46%) contained Kaplan-Meier analyses susceptible to competing risk bias. Sixteen of the susceptible studies (34.8%) reported the numbers of outcomes and competing events; in six of these studies (6/16, 37.5%), the outcome risk from the Kaplan-Meier estimate (relative to the true risk) was biased upward by 10% or more. Almost half of Kaplan-Meier analyses published in medical journals are susceptible to competing risk bias and may overestimate event risk. This bias was found to be quantitatively important in a third of such studies. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. The Kaplan-Meier theatre

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander

    2016-01-01

    Survival probabilities are not straightforward to obtain when observation periods of individuals differ in length. The Kaplan–Meier theatre is a classroom activity, which starts with a data collection exercise where students imagine sailing on the Titanic. Several students ‘fall in the water’ where... The Kaplan–Meier method assumes that censored individuals have the same survival chances as the individuals who are still observed. During the Kaplan–Meier theatre, students perform a clever algorithm (Efron 1967), which translates the assumption into action and results in the Kaplan–Meier estimate...

  8. The analysis of competing events like cause-specific mortality--beware of the Kaplan-Meier method

    NARCIS (Netherlands)

    Verduijn, Marion; Grootendorst, Diana C.; Dekker, Friedo W.; Jager, Kitty J.; le Cessie, Saskia

    2011-01-01

    Kaplan-Meier analysis is a popular method used for analysing time-to-event data. In case of competing event analyses such as that of cardiovascular and non-cardiovascular mortality, however, the Kaplan-Meier method profoundly overestimates the cumulative mortality probabilities for each of the...

  9. Kaplan-Meier Survival Analysis Overestimates the Risk of Revision Arthroplasty: A Meta-analysis.

    Science.gov (United States)

    Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter D; Ghali, William A; Marshall, Deborah A

    2015-11-01

    Although Kaplan-Meier survival analysis is commonly used to estimate the cumulative incidence of revision after joint arthroplasty, it theoretically overestimates the risk of revision in the presence of competing risks (such as death). Because the magnitude of overestimation is not well documented, the potential associated impact on clinical and policy decision-making remains unknown. We performed a meta-analysis to answer the following questions: (1) To what extent does the Kaplan-Meier method overestimate the cumulative incidence of revision after joint replacement compared with alternative competing-risks methods? (2) Is the extent of overestimation influenced by followup time or rate of competing risks? We searched Ovid MEDLINE, EMBASE, BIOSIS Previews, and Web of Science (1946, 1980, 1980, and 1899, respectively, to October 26, 2013) and included article bibliographies for studies comparing estimated cumulative incidence of revision after hip or knee arthroplasty obtained using both Kaplan-Meier and competing-risks methods. We excluded conference abstracts, unpublished studies, or studies using simulated data sets. Two reviewers independently extracted data and evaluated the quality of reporting of the included studies. Among 1160 abstracts identified, six studies were included in our meta-analysis. The principal reason for the steep attrition (1160 to six) was that the initial search was for studies in any clinical area that compared the cumulative incidence estimated using the Kaplan-Meier versus competing-risks methods for any event (not just the cumulative incidence of hip or knee revision); we did this to minimize the likelihood of missing any relevant studies. We calculated risk ratios (RRs) comparing the cumulative incidence estimated using the Kaplan-Meier method with the competing-risks method for each study and used DerSimonian and Laird random effects models to pool these RRs. Heterogeneity was explored using stratified meta-analyses and...

  10. Kaplan-Meier survival analysis overestimates cumulative incidence of health-related events in competing risk settings: a meta-analysis.

    Science.gov (United States)

    Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter; Ghali, William A; Marshall, Deborah A

    2018-01-01

    Kaplan-Meier survival analysis overestimates cumulative incidence in competing risks (CRs) settings. The extent of overestimation (or its clinical significance) has been questioned, and CRs methods are infrequently used. This meta-analysis compares the Kaplan-Meier method to the cumulative incidence function (CIF), a CRs method. We searched MEDLINE, EMBASE, BIOSIS Previews, Web of Science (1992-2016), and article bibliographies for studies estimating cumulative incidence using the Kaplan-Meier method and CIF. For studies with sufficient data, we calculated pooled risk ratios (RRs) comparing Kaplan-Meier and CIF estimates using DerSimonian and Laird random effects models. We performed stratified meta-analyses by clinical area, rate of CRs (CRs/events of interest), and follow-up time. Of 2,192 identified abstracts, we included 77 studies in the systematic review and meta-analyzed 55. The pooled RR demonstrated the Kaplan-Meier estimate was 1.41 [95% confidence interval (CI): 1.36, 1.47] times higher than the CIF. Overestimation was highest among studies with high rates of CRs [RR = 2.36 (95% CI: 1.79, 3.12)], studies related to hepatology [RR = 2.60 (95% CI: 2.12, 3.19)], and obstetrics and gynecology [RR = 1.84 (95% CI: 1.52, 2.23)]. The Kaplan-Meier method overestimated the cumulative incidence across 10 clinical areas. Using CRs methods will ensure accurate results inform clinical and policy decisions. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. [Survival analysis with competing risks: estimating failure probability].

    Science.gov (United States)

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
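
The overestimation that this record (and records 8-10) describes can be reproduced on toy data. The following sketch contrasts the naive 1 − KM estimate, which wrongly treats competing events as censoring, with the Aalen-Johansen form of the cumulative incidence function; the data are invented for illustration and the code is not from any published analysis:

```python
def cif_vs_km(times, causes):
    """Contrast the naive 1-KM estimate with the cumulative incidence
    function (CIF) for cause 1 in a competing-risks setting.

    causes: 1 = event of interest, 2 = competing event, 0 = censored
    Returns (naive_one_minus_km, cif) at the end of follow-up.
    """
    data = sorted(zip(times, causes))
    n = len(data)
    surv_any = 1.0    # event-free survival (any event type)
    km_naive = 1.0    # KM that censors the competing events
    cif = 0.0
    i = 0
    while i < n:
        t = data[i][0]
        d1 = sum(1 for tt, c in data if tt == t and c == 1)
        d2 = sum(1 for tt, c in data if tt == t and c == 2)
        m = sum(1 for tt, _ in data if tt == t)   # all obs tied at t
        at_risk = n - i
        cif += surv_any * d1 / at_risk            # Aalen-Johansen increment
        surv_any *= 1.0 - (d1 + d2) / at_risk
        km_naive *= 1.0 - d1 / at_risk
        i += m
    return 1.0 - km_naive, cif

# Cause-1 events at t=1 and t=3, a competing event at t=2, censoring at t=4
naive, cif = cif_vs_km([1, 2, 3, 4], [1, 2, 1, 0])
print(naive, cif)   # the naive 1-KM exceeds the CIF
```

The CIF increments use the overall event-free survival, so subjects removed by the competing event are never "redistributed" to the event of interest; the naive 1 − KM does redistribute them, which is the source of the upward bias.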

  12. Application of Kaplan-Meier analysis in reliability evaluation of products cast from aluminium alloys

    OpenAIRE

    J. Szymszal; A. Gierek; J. Kliś

    2010-01-01

    The article evaluates the reliability of AlSi17CuNiMg alloys using a Kaplan-Meier-based technique, very popular as a survival estimation tool in medical science. The main object of survival analysis is a group (or groups) of units for which the time to occurrence of an event (failure) is estimated. For example, in medicine, the failure can be a patient's death. In this study, the failure was the specimen fracture during a periodical fatigue test, while the ...

  13. Análisis de supervivencia en presencia de riesgos competitivos: estimadores de la probabilidad de suceso Survival analysis with competing risks: estimating failure probability

    Directory of Open Access Journals (Sweden)

    Javier Llorca

    2004-10-01

    Full Text Available Objective: To show the impact of competing risks of death on survival analysis. Method: We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. Results: The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Conclusions: Survival analysis should be adjusted for competing risks of death to avoid the overestimation of the risk of rejection produced with the Kaplan-Meier method.

  14. Biostatistics with emphasis on life table survival rate calculations (including Kaplan Meier) and the logrank test

    International Nuclear Information System (INIS)

    Mould, Richard F.

    1995-01-01

    Purpose/Objective: To explain some of the most useful statistical calculation procedures which are relevant to radiation oncologists and to provide insights on what tests and procedures should be used in various situations such as when survival rates and their associated standard errors have to be determined. To describe some of the problems and pitfalls in clinical trial designs which have to be overcome if a trial is to have the possibility of reaching a successful conclusion. To review methods of computing criteria to quantitatively describe criteria of success (eg. quality of life, long-term survival, cure) of radiation oncology and to suggest possible future statistical improvements in this area. Chi-Squared Test: The chi-squared test is probably the most useful of the tests of statistical significance for the radiation oncologist. Applications will be described, including goodness of fit tests and 2x2 contingency tables which are the simplest of the generalized nxm contingency tables. Degrees of Freedom and P<0.05 for Significance Testing: An Introduction will be given to the meaning of P<0.05 in relation to significance testing and the use of tables of critical values of a test statistic (eg. chi-squared) which are given as a function of degrees of freedom and P-values. Survival Rate Calculations for Grouped and Ungrouped Data: The life-table method (sometimes termed the actuarial method) will be explained for both grouped data (eg. survival times grouped in annual intervals for patients who have died and for those who are still alive or lost to follow-up) and for ungrouped data (when individual survival times are used). The method for ungrouped data is variously termed the Kaplan-Meier or Product Limit method. Logrank Test: This is the most useful test for comparison of the survival experience of two groups of patients and its use will be explained. In part the computation is similar to that for the Kaplan-Meier/Product Limit method

  15. Biostatistics with emphasis on life table survival rate calculations (including Kaplan Meier) and the logrank test

    Energy Technology Data Exchange (ETDEWEB)

    Mould, Richard F

    1995-07-01

    Purpose/Objective: To explain some of the most useful statistical calculation procedures which are relevant to radiation oncologists and to provide insights on what tests and procedures should be used in various situations such as when survival rates and their associated standard errors have to be determined. To describe some of the problems and pitfalls in clinical trial designs which have to be overcome if a trial is to have the possibility of reaching a successful conclusion. To review methods of computing criteria to quantitatively describe criteria of success (eg. quality of life, long-term survival, cure) of radiation oncology and to suggest possible future statistical improvements in this area. Chi-Squared Test: The chi-squared test is probably the most useful of the tests of statistical significance for the radiation oncologist. Applications will be described, including goodness of fit tests and 2x2 contingency tables which are the simplest of the generalized nxm contingency tables. Degrees of Freedom and P<0.05 for Significance Testing: An Introduction will be given to the meaning of P<0.05 in relation to significance testing and the use of tables of critical values of a test statistic (eg. chi-squared) which are given as a function of degrees of freedom and P-values. Survival Rate Calculations for Grouped and Ungrouped Data: The life-table method (sometimes termed the actuarial method) will be explained for both grouped data (eg. survival times grouped in annual intervals for patients who have died and for those who are still alive or lost to follow-up) and for ungrouped data (when individual survival times are used). The method for ungrouped data is variously termed the Kaplan-Meier or Product Limit method. Logrank Test: This is the most useful test for comparison of the survival experience of two groups of patients and its use will be explained. In part the computation is similar to that for the Kaplan-Meier/Product Limit method.
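
The logrank comparison described in records 14-15 can be illustrated with a minimal two-sample sketch: at each pooled event time, compare the observed events in one group with the number expected under equal hazards, and accumulate the standard hypergeometric variance. This is toy illustrative code, not the course material itself:

```python
def logrank_statistic(times1, events1, times2, events2):
    """Two-sample logrank chi-squared statistic (1 degree of freedom).

    Larger values give stronger evidence against equal survival curves;
    compare with the chi-squared critical value 3.84 for P < 0.05.
    """
    # distinct times at which an event (not a censoring) occurred
    pooled = sorted({t for t, e in zip(times1 + times2, events1 + events2)
                     if e == 1})
    obs_minus_exp = 0.0
    var = 0.0
    for t in pooled:
        n1 = sum(1 for x in times1 if x >= t)   # at risk in group 1
        n2 = sum(1 for x in times2 if x >= t)   # at risk in group 2
        d1 = sum(1 for x, e in zip(times1, events1) if x == t and e == 1)
        d2 = sum(1 for x, e in zip(times2, events2) if x == t and e == 1)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue                            # variance term undefined
        exp1 = d * n1 / n                       # expected events in group 1
        obs_minus_exp += d1 - exp1
        var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return obs_minus_exp ** 2 / var if var > 0 else 0.0

# Clearly separated groups: all events early vs. all events late
print(logrank_statistic([1, 2, 3], [1, 1, 1], [4, 5, 6], [1, 1, 1]))
```

Two identical samples give a statistic of exactly zero, since the observed events match the expectation at every event time.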

  16. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves.

    Science.gov (United States)

    Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J

    2012-02-01

    The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using information on the number of events and numbers at risk where available. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside KM curves.

  17. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves

    Directory of Open Access Journals (Sweden)

    Guyot Patricia

    2012-02-01

    Full Text Available Abstract Background The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. Methods We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using information on the number of events and numbers at risk where available. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. Results The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. Conclusion The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside KM curves.
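
The inversion at the heart of the reconstruction described in records 16-17 rests on the KM recursion S_i = S_{i-1}(1 − d_i/n_i). The following is a deliberately simplified sketch of that inversion only, on made-up inputs; the published algorithm additionally handles interval splitting, censoring placement and risk-table interpolation, all of which are omitted here:

```python
def reconstruct_events(surv_probs, n_at_risk):
    """Back out approximate event counts from digitised KM steps.

    surv_probs : survival probability just after each KM step
                 (the curve starts at S = 1.0 before the first step)
    n_at_risk  : number at risk just before each step, e.g. read off
                 the risk table published under the curve
    Inverts S_i = S_{i-1} * (1 - d_i / n_i)
          => d_i = n_i * (1 - S_i / S_{i-1}), rounded to an integer.
    """
    events = []
    prev = 1.0
    for s, n in zip(surv_probs, n_at_risk):
        d = round(n * (1.0 - s / prev))   # events at this step
        events.append(d)
        prev = s
    return events

# Steps down to 0.8, 0.6, 0.3 with 5, 4, 2 subjects at risk
print(reconstruct_events([0.8, 0.6, 0.3], [5, 4, 2]))
```

Differences between successive numbers at risk, minus the recovered events, then give the censoring counts per interval, which is how the individual patient data are approximated.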

  18. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    Science.gov (United States)

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups.

  19. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    Directory of Open Access Journals (Sweden)

    Arnd Gross

    Full Text Available BACKGROUND: Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use, especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. RESULTS: On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. CONCLUSIONS: We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups.

  20. KMWin – A Convenient Tool for Graphical Presentation of Results from Kaplan-Meier Survival Time Analysis

    Science.gov (United States)

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Background Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. Results On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. Conclusions We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups. PMID:22723912

  21. Gastric emptying of solids in humans: improved evaluation by Kaplan-Meier plots, with special reference to obesity and gender

    International Nuclear Information System (INIS)

    Grybaeck, P.; Naeslund, E.; Hellstroem, P.M.; Jacobsson, H.; Backman, L.

    1996-01-01

    It has been suggested that obesity is associated with an altered rate of gastric emptying, and that there are also sex differences in gastric emptying. The results of earlier studies examining gastric emptying rates in obesity and in males and females have proved inconsistent. The aim of this study was to investigate the influence of obesity and gender on gastric emptying, by extending conventional evaluation methods with Kaplan-Meier plots, in order to assess whether these factors have to be accounted for when interpreting results of scintigraphic gastric emptying tests. Twenty-one normal-weight volunteers and nine obese subjects were fed a standardised technetium-99m labelled albumin omelette. Imaging data were acquired at 5- and 10-min intervals in both posterior and anterior projections with the subjects in the sitting position. The half-emptying time, analysed by Kaplan-Meier plot (log-rank test), was shorter in obese subjects than in normal-weight subjects and longer in females than in males. Also, the lag phase and half-emptying time were shorter in obese females than in normal-weight females. This study shows an association between gastric emptying rate and both obesity and gender. Therefore, body mass index and gender have to be accounted for when interpreting results of scintigraphic gastric emptying studies. (orig.). With 6 figs., 4 tabs

  22. Gastric emptying of solids in humans: improved evaluation by Kaplan-Meier plots, with special reference to obesity and gender

    Energy Technology Data Exchange (ETDEWEB)

    Grybaeck, P. [Department of Diagnostic Radiology, Karolinska Hospital, Stockholm (Sweden); Naeslund, E. [Department of Surgery, Karolinska Institute at Danderyd Hospital, Stockholm (Sweden); Hellstroem, P.M. [Department of Internal Medicine, Karolinska Hospital, Stockholm (Sweden); Jacobsson, H. [Department of Diagnostic Radiology, Karolinska Hospital, Stockholm (Sweden)]; [Department of Nuclear Medicine, Karolinska Hospital, Stockholm (Sweden); Backman, L. [Department of Surgery, Karolinska Institute at Danderyd Hospital, Stockholm (Sweden)

    1996-12-01

    It has been suggested that obesity is associated with an altered rate of gastric emptying, and that there are also sex differences in gastric emptying. The results of earlier studies examining gastric emptying rates in obesity and in males and females have proved inconsistent. The aim of this study was to investigate the influence of obesity and gender on gastric emptying, by extending conventional evaluation methods with Kaplan-Meier plots, in order to assess whether these factors have to be accounted for when interpreting results of scintigraphic gastric emptying tests. Twenty-one normal-weight volunteers and nine obese subjects were fed a standardised technetium-99m labelled albumin omelette. Imaging data were acquired at 5- and 10-min intervals in both posterior and anterior projections with the subjects in the sitting position. The half-emptying time, analysed by Kaplan-Meier plot (log-rank test), was shorter in obese subjects than in normal-weight subjects and longer in females than in males. Also, the lag phase and half-emptying time were shorter in obese females than in normal-weight females. This study shows an association between gastric emptying rate and both obesity and gender. Therefore, body mass index and gender have to be accounted for when interpreting results of scintigraphic gastric emptying studies. (orig.). With 6 figs., 4 tabs.

  3. A versatile test for equality of two survival functions based on weighted differences of Kaplan-Meier curves.

    Science.gov (United States)

    Uno, Hajime; Tian, Lu; Claggett, Brian; Wei, L J

    2015-12-10

    With censored event time observations, the logrank test is the most popular tool for testing the equality of two underlying survival distributions. Although this test is asymptotically distribution free, it may not be powerful when the proportional hazards assumption is violated. Various other novel testing procedures have been proposed, which generally are derived by assuming a class of specific alternative hypotheses with respect to the hazard functions. The test considered by Pepe and Fleming (1989) is based on a linear combination of weighted differences of the two Kaplan-Meier curves over time and is a natural tool to assess the difference of two survival functions directly. In this article, we take a similar approach but choose weights that are proportional to the observed standardized difference of the estimated survival curves at each time point. The new proposal automatically makes weighting adjustments empirically. The new test statistic is aimed at a one-sided general alternative hypothesis and is distributed with a short right tail under the null hypothesis but with a heavy tail under the alternative. The results from extensive numerical studies demonstrate that the new procedure performs well under various general alternatives with a caution of a minor inflation of the type I error rate when the sample size is small or the number of observed events is small. The survival data from a recent cancer comparative study are utilized for illustrating the implementation of the process. Copyright © 2015 John Wiley & Sons, Ltd.
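The building block of such tests, an integrated (weighted) difference between two Kaplan-Meier curves, can be sketched in a few lines. The version below uses the constant weight w(t) = 1, i.e. a plain Pepe-Fleming-type integrated difference rather than the adaptive standardized weights proposed in this article, and runs on invented toy data (all function names are illustrative):

```python
def km_step(times, events):
    """Kaplan-Meier curve as a step function [(t, S(t))], starting at (0, 1).
    events: 1 = event, 0 = right-censored."""
    data = sorted(zip(times, events))
    n = len(data); surv = 1.0; curve = [(0.0, 1.0)]
    i = 0
    while i < len(data):
        t = data[i][0]; d = c = 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]; c += 1 - data[i][1]; i += 1
        if d:
            surv *= 1 - d / n
            curve.append((t, surv))
        n -= d + c
    return curve

def surv_at(curve, t):
    """Evaluate a step curve at time t."""
    s = 1.0
    for tt, ss in curve:
        if tt <= t:
            s = ss
        else:
            break
    return s

def integrated_km_difference(g1, g2, tau, step=0.01):
    """Riemann approximation of the area between two KM curves up to tau."""
    c1, c2 = km_step(*g1), km_step(*g2)
    total, t = 0.0, 0.0
    while t < tau:
        total += (surv_at(c1, t) - surv_at(c2, t)) * step
        t += step
    return total

# Toy groups: (times, event indicators)
treat   = ([4, 6, 8, 10, 12], [1, 0, 1, 1, 0])
control = ([2, 3, 5, 7, 9],   [1, 1, 1, 0, 1])
print(round(integrated_km_difference(treat, control, 10), 3))
```

A positive value indicates better survival in the first group over the chosen horizon; the adaptive proposal replaces the constant weight with a data-driven standardized one.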

  4. Measuring survival time: a probability-based approach useful in healthcare decision-making.

    Science.gov (United States)

    2011-01-01

    In some clinical situations, the choice between treatment options takes into account their impact on patient survival time. Due to practical constraints (such as loss to follow-up), survival time is usually estimated using a probability calculation based on data obtained in clinical studies or trials. The two techniques most commonly used to estimate survival times are the Kaplan-Meier method and the actuarial method. Despite their limitations, they provide useful information when choosing between treatment options.
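The Kaplan-Meier product-limit calculation mentioned above is short enough to sketch directly. This is a minimal, illustrative Python implementation; the function name and toy data are invented for the example, not taken from the article:

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.
    times: observed follow-up times; events: 1 = event, 0 = right-censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = c = 0  # events / censorings tied at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            surv *= 1.0 - d / n_at_risk  # product-limit update
            curve.append((t, surv))
        n_at_risk -= d + c
    return curve

# Toy data for 6 subjects ('+' marks censoring): 3, 5+, 7, 7, 9+, 12
for t, s in kaplan_meier([3, 5, 7, 7, 9, 12], [1, 0, 1, 1, 0, 1]):
    print(t, round(s, 4))
```

Note that the survival estimate only drops at observed event times; censored observations merely reduce the risk set.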

  5. ["That flesh, pink and perishable": analysis of disease-free survival analysis in breast cancer in Gipuzkoa (Spain) in the presence of competing risks].

    Science.gov (United States)

    Martínez-Camblor, Pablo; Larrañaga, Nerea; Sarasqueta, Cristina; Mitxelena, María José; Basterretxea, Mikel

    2009-01-01

    To analyze time of disease-free survival and relative survival in women diagnosed with breast cancer in the province of Gipuzkoa within the context of competing risks by assessing differences between the direct use of the Kaplan-Meier estimator and the multiple decrement method on the one hand, and relative survival on the other. All registered breast cancer cases in Gipuzkoa in 1995 and 1996 with stages other than stage IV were included. An 8-year follow-up for recurrence and a 10-year follow-up for survival were performed. Time of disease-free survival was studied by the multiple decrement model. Observed survival and survival corrected by the expected mortality in the population (relative survival) were also studied. Estimation of the probability of recurrence at 8 years with the multiple decrement method was 8.8% lower than that obtained with the Kaplan-Meier method. The difference between the observed and relative survival rates at 10 years was 10.8%. Both results show how, in this case, the Kaplan-Meier estimator overestimates both the probability of recurrence and that of mortality from the disease. Two issues are often overlooked when performing survival analyses: firstly, because of the lack of independence between survival time and censoring time, the results obtained by the Kaplan-Meier estimator are uninterpretable; secondly, it is an incontrovertible fact that, one way or another, every subject eventually fails. In this approach, survival analyses must take into account the probability of failure in the general population of reference. The results obtained in this study show that superficial use of the Kaplan-Meier estimator overestimates both the probability of recurrence and that of mortality caused by the disease.
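The gap the authors describe, between one minus the Kaplan-Meier estimate (competing events treated as censored) and the multiple-decrement / Aalen-Johansen cumulative incidence, can be reproduced on toy data. The sketch below assumes distinct observation times (no ties) and uses invented values, not the Gipuzkoa dataset:

```python
def naive_km_risk(times, causes, cause, horizon):
    """1 - KM(horizon), censoring competing events (the biased approach)."""
    data = sorted(zip(times, causes))
    n, surv = len(data), 1.0
    for t, c in data:
        if t > horizon:
            break
        if c == cause:
            surv *= 1.0 - 1.0 / n
        n -= 1
    return 1.0 - surv

def aalen_johansen_risk(times, causes, cause, horizon):
    """Cumulative incidence of `cause`, accounting for competing risks."""
    data = sorted(zip(times, causes))
    n, overall_surv, cif = len(data), 1.0, 0.0
    for t, c in data:
        if t > horizon:
            break
        if c != 0:  # any event (cause 0 = censored)
            if c == cause:
                cif += overall_surv * (1.0 / n)
            overall_surv *= 1.0 - 1.0 / n
        n -= 1
    return cif

# causes: 0 = censored, 1 = recurrence, 2 = death without recurrence
times  = [2, 3, 4, 5, 6, 8, 9, 10]
causes = [1, 2, 1, 0, 2, 1, 0, 2]
print(naive_km_risk(times, causes, 1, 10))      # overestimates
print(aalen_johansen_risk(times, causes, 1, 10))
```

On these toy data the naive estimate exceeds the cumulative incidence, mirroring the overestimation reported in the article.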

  6. Applying Kaplan-Meier to Item Response Data

    Science.gov (United States)

    McNeish, Daniel

    2018-01-01

    Some IRT models can be equivalently modeled in alternative frameworks such as logistic regression. Logistic regression can also model time-to-event data, which concerns the probability of an event occurring over time. Using the relation between time-to-event models and logistic regression and the relation between logistic regression and IRT, this…

  7. Challenges in risk estimation using routinely collected clinical data: The example of estimating cervical cancer risks from electronic health-records.

    Science.gov (United States)

    Landy, Rebecca; Cheung, Li C; Schiffman, Mark; Gage, Julia C; Hyun, Noorie; Wentzensen, Nicolas; Kinney, Walter K; Castle, Philip E; Fetterman, Barbara; Poitras, Nancy E; Lorey, Thomas; Sasieni, Peter D; Katki, Hormuzd A

    2018-06-01

    Electronic health-records (EHR) are increasingly used by epidemiologists studying disease following surveillance testing to provide evidence for screening intervals and referral guidelines. Although cost-effective, undiagnosed prevalent disease and interval censoring (in which asymptomatic disease is only observed at the time of testing) raise substantial analytic issues when estimating risk that cannot be addressed using Kaplan-Meier methods. Based on our experience analysing EHR from cervical cancer screening, we previously proposed the logistic-Weibull model to address these issues. Here we demonstrate how the choice of statistical method can impact risk estimates. We use observed data on 41,067 women in the cervical cancer screening program at Kaiser Permanente Northern California, 2003-2013, as well as simulations to evaluate the ability of different methods (Kaplan-Meier, Turnbull, Weibull and logistic-Weibull) to accurately estimate risk within a screening program. Cumulative risk estimates from the statistical methods varied considerably, with the largest differences occurring for prevalent disease risk when baseline disease ascertainment was random but incomplete. Kaplan-Meier underestimated risk at earlier times and overestimated risk at later times in the presence of interval censoring or undiagnosed prevalent disease. Turnbull performed well, though was inefficient and not smooth. The logistic-Weibull model performed well, except when event times didn't follow a Weibull distribution. We have demonstrated that methods for right-censored data, such as Kaplan-Meier, result in biased estimates of disease risks when applied to interval-censored data, such as screening programs using EHR data. The logistic-Weibull model is attractive, but the model fit must be checked against Turnbull non-parametric risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Bias and precision of methods for estimating the difference in restricted mean survival time from an individual patient data meta-analysis

    Directory of Open Access Journals (Sweden)

    Béranger Lueza

    2016-03-01

    Full Text Available Abstract Background The difference in restricted mean survival time, rmstD(t*), the area between two survival curves up to time horizon t*, is often used in cost-effectiveness analyses to estimate the treatment effect in randomized controlled trials. A challenge in individual patient data (IPD) meta-analyses is to account for the trial effect. We aimed at comparing different methods to estimate the rmstD(t*) from an IPD meta-analysis. Methods We compared four methods: the area between Kaplan-Meier curves (experimental vs. control arm), ignoring the trial effect (Naïve Kaplan-Meier); the area between Peto curves computed at quintiles of event times (Peto-quintile); and the weighted average of the areas between either trial-specific Kaplan-Meier curves (Pooled Kaplan-Meier) or trial-specific exponential curves (Pooled Exponential). In a simulation study, we varied the between-trial heterogeneity for the baseline hazard and for the treatment effect (possibly correlated), the overall treatment effect, the time horizon t*, the number of trials and of patients, the use of a fixed or DerSimonian-Laird random effects model, and the proportionality of hazards. We compared the methods in terms of bias, empirical and average standard errors. We used IPD from the Meta-Analysis of Chemotherapy in Nasopharynx Carcinoma (MAC-NPC) and its updated version MAC-NPC2 for illustration, which included respectively 1,975 and 5,028 patients in 11 and 23 comparisons. Results The Naïve Kaplan-Meier method was unbiased, whereas the Pooled Exponential and, to a much lesser extent, the Pooled Kaplan-Meier methods showed a bias with non-proportional hazards. The Peto-quintile method underestimated the rmstD(t*), except with non-proportional hazards at t* = 5 years. In the presence of treatment effect
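The simplest of the four methods, Naïve Kaplan-Meier, reduces to integrating a Kaplan-Meier step curve up to t*. A hedged sketch of that area calculation for one arm (toy data, not the MAC-NPC IPD; function names are illustrative):

```python
def km_curve(times, events):
    """Kaplan-Meier step curve [(t, S(t))], starting at (0, 1)."""
    data = sorted(zip(times, events))
    n, surv, curve = len(data), 1.0, [(0.0, 1.0)]
    i = 0
    while i < len(data):
        t = data[i][0]; d = c = 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]; c += 1 - data[i][1]; i += 1
        if d:
            surv *= 1 - d / n
            curve.append((t, surv))
        n -= d + c
    return curve

def rmst(times, events, t_star):
    """Restricted mean survival time: area under the KM step curve on [0, t*]."""
    curve = km_curve(times, events)
    area = 0.0
    for (t0, s0), (t1, _) in zip(curve, curve[1:]):
        if t1 >= t_star:
            return area + s0 * (t_star - t0)
        area += s0 * (t1 - t0)
    return area + curve[-1][1] * (t_star - curve[-1][0])

# Toy arm: times 3, 5+, 7, 7, 9+, 12 ('+' = censored), horizon t* = 10
print(rmst([3, 5, 7, 7, 9, 12], [1, 0, 1, 1, 0, 1], 10))
```

The rmstD(t*) between two arms is then just the difference of the two areas; the pooled methods in the article repeat this per trial and average with trial weights.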

  9. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  10. Survival modeling for the estimation of transition probabilities in model-based economic evaluations in the absence of individual patient data: a tutorial.

    Science.gov (United States)

    Diaby, Vakaramoko; Adunlin, Georges; Montero, Alberto J

    2014-02-01

    Survival modeling techniques are increasingly being used as part of decision modeling for health economic evaluations. As many models are available, it is imperative for interested readers to know about the steps in selecting and using the most suitable ones. The objective of this paper is to propose a tutorial for the application of appropriate survival modeling techniques to estimate transition probabilities, for use in model-based economic evaluations, in the absence of individual patient data (IPD). An illustration of the use of the tutorial is provided based on the final progression-free survival (PFS) analysis of the BOLERO-2 trial in metastatic breast cancer (mBC). An algorithm was adopted from Guyot and colleagues, and was then run in the statistical package R to reconstruct IPD, based on the final PFS analysis of the BOLERO-2 trial. It should be emphasized that the reconstructed IPD represent an approximation of the original data. Afterwards, we fitted parametric models to the reconstructed IPD in the statistical package Stata. Both statistical and graphical tests were conducted to verify the relative and absolute validity of the findings. Finally, the equations for transition probabilities were derived using the general equation for transition probabilities used in model-based economic evaluations, and the parameters were estimated from fitted distributions. The results of the application of the tutorial suggest that the log-logistic model best fits the reconstructed data from the latest published Kaplan-Meier (KM) curves of the BOLERO-2 trial. Results from the regression analyses were confirmed graphically. An equation for transition probabilities was obtained for each arm of the BOLERO-2 trial. In this paper, a tutorial was proposed and used to estimate the transition probabilities for model-based economic evaluation, based on the results of the final PFS analysis of the BOLERO-2 trial in mBC. 
The results of our study can serve as a basis for any model
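The general equation for transition probabilities referred to in the tutorial, tp(t) = 1 − S(t)/S(t − u) for cycle length u, is easy to sketch from a fitted parametric survival function. The Weibull form and parameter values below are invented for illustration (the paper itself found a log-logistic model fit the reconstructed BOLERO-2 data best):

```python
import math

def weibull_survival(t, lam, gamma):
    """S(t) = exp(-lam * t**gamma); lam, gamma are hypothetical fitted values."""
    return math.exp(-lam * t ** gamma)

def transition_prob(t, cycle, lam, gamma):
    """Probability of the event occurring during (t - cycle, t],
    conditional on being event-free at t - cycle."""
    if t <= 0:
        return 0.0
    return 1.0 - weibull_survival(t, lam, gamma) / weibull_survival(t - cycle, lam, gamma)

# Monthly cycles with hypothetical lam = 0.05, gamma = 1.3
for month in (1, 6, 12):
    print(month, round(transition_prob(month, 1.0, 0.05, 1.3), 4))
```

With gamma = 1 (constant hazard) the transition probability is the same in every cycle; gamma != 1 yields the time-dependent transition probabilities that motivate fitting flexible distributions.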

  11. Estimation of Unemployment Duration in Botoşani County Using Survival Analysis

    Directory of Open Access Journals (Sweden)

    Darabă Gabriel

    2017-01-01

    Full Text Available In this paper we aim at estimating the unemployment duration in Botosani County in order to study the impact of individual characteristics (gender, age, place of residence, unemployment benefit, etc.) on the length of unemployment spells. We use the Cox regression model to measure the effects of gender, age, residential environment, etc. on the hazard rate of leaving unemployment, and the Kaplan-Meier estimator to compare survival probabilities among different categories of unemployed persons. The study is carried out on a sample of 200 unemployment spells registered with the Employment Agency of Botoşani County from January 2012 to December 2015. The results reveal that place of residence, unemployment benefit and unemployed category have a significant impact on unemployment spells.

  12. The development of a new algorithm to calculate a survival function in non-parametric ways

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    In this study, a generalized formula of the Kaplan-Meier method is developed. The idea of this algorithm is that the result of the Kaplan-Meier estimator is the same as that of the redistribute-to-the-right algorithm; hence, the result of the Kaplan-Meier estimator is used when we redistribute to the right. This can be explained in the following steps: first, the same mass is distributed to all the points. Second, when a censored point is reached, its mass must be redistributed to the right according to the following rule: normalize the masses located to the right of the censored point, and redistribute the mass of the censored point to the right in proportion to the normalized masses. This is the main idea of the algorithm. It is more efficient than the PL-estimator in the sense that it decreases the mass beyond the censored region. Just like the redistribute-to-the-right algorithm, this method is sufficient for the probability theory
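The equivalence described above, that redistribute-to-the-right reproduces the Kaplan-Meier product-limit estimate, can be checked numerically. A sketch on toy data (helper names invented; '+' in the comment marks censoring):

```python
def km_surv_at_events(data):
    """Kaplan-Meier S(t) at each distinct event time; data = [(t, event)]."""
    data = sorted(data)
    n, surv, out = len(data), 1.0, {}
    i = 0
    while i < len(data):
        t = data[i][0]; d = c = 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]; c += 1 - data[i][1]; i += 1
        if d:
            surv *= 1 - d / n
            out[t] = surv
        n -= d + c
    return out

def redistribute_to_the_right(data):
    """Each point starts with mass 1/n; a censored point's mass is passed,
    proportionally, to all points strictly to its right."""
    data = sorted(data)
    n = len(data)
    mass = [1.0 / n] * n
    for i, (t, event) in enumerate(data):
        if event == 0:
            right_total = sum(mass[i + 1:])
            if right_total > 0:
                for j in range(i + 1, n):
                    mass[j] += mass[i] * mass[j] / right_total
            mass[i] = 0.0
    # survival at an event time = total mass strictly to its right
    surv = {}
    for i, (t, event) in enumerate(data):
        if event:
            surv[t] = sum(m for (tt, _), m in zip(data, mass) if tt > t)
    return surv

data = [(3, 1), (5, 0), (7, 1), (7, 1), (9, 0), (12, 1)]  # 3, 5+, 7, 7, 9+, 12
print(km_surv_at_events(data))
print(redistribute_to_the_right(data))
```

Both functions return the same survival values at every event time, which is the equivalence the algorithm exploits.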

  13. «Esa corporeidad mortal y rosa»: análisis del tiempo libre de enfermedad del cáncer de mama en Gipuzkoa en presencia de riesgos competitivos "That deadly and pink corporeity": Analysis of disease-free survival analysis in breast cancer in Gipuzkoa (Spain) in the presence of competing risks

    Directory of Open Access Journals (Sweden)

    Pablo Martínez-Camblor

    2009-12-01

    Objective: To analyze time of disease-free survival and relative survival in women diagnosed with breast cancer in the province of Gipuzkoa within the context of competing risks by assessing differences between the direct use of the Kaplan-Meier estimator and the multiple decrement method on the one hand, and relative survival on the other. Methods: All registered breast cancer cases in Gipuzkoa in 1995 and 1996 with stages other than stage IV were included. An 8-year follow-up for recurrence and a 10-year follow-up for survival were performed. Time of disease-free survival was studied by the multiple decrement model. Observed survival and survival corrected by the expected mortality in the population (relative survival) were also studied. Results: Estimation of the probability of recurrence at 8 years with the multiple decrement method was 8.8% lower than that obtained with the Kaplan-Meier method. The difference between the observed and relative survival rates at 10 years was 10.8%. Both results show how, in this case, the Kaplan-Meier estimator overestimates both the probability of recurrence and that of mortality from the disease. Conclusions: Two issues are often overlooked when performing survival analyses: firstly, because of the lack of independence between survival time and censoring time, the results obtained by the Kaplan-Meier estimator are uninterpretable; secondly, it is an incontrovertible fact that, one way or another, every subject eventually fails. In this approach, survival analyses must take into account the probability of failure in the general population of reference. The results obtained in this study show that superficial use of the Kaplan-Meier estimator overestimates both the probability of recurrence and that of mortality caused by the disease.

  14. Survival chance in papillary thyroid cancer in Hungary: individual survival probability estimation using the Markov method

    International Nuclear Information System (INIS)

    Esik, Olga; Tusnady, Gabor; Daubner, Kornel; Nemeth, Gyoergy; Fuezy, Marton; Szentirmay, Zoltan

    1997-01-01

    Purpose: The typically benign, but occasionally rapidly fatal clinical course of papillary thyroid cancer has raised the need for individual survival probability estimation, to tailor the treatment strategy exclusively to a given patient. Materials and methods: A retrospective study was performed on 400 papillary thyroid cancer patients with a median follow-up time of 7.1 years to establish a clinical database for uni- and multivariate analysis of the prognostic factors related to survival (Kaplan-Meier product limit method and Cox regression). For a more precise prognosis estimation, the effect of the most important clinical events were then investigated on the basis of a Markov renewal model. The basic concept of this approach is that each patient has an individual disease course which (besides the initial clinical categories) is affected by special events, e.g. internal covariates (local/regional/distant relapses). On the supposition that these events and the cause-specific death are influenced by the same biological processes, the parameters of transient survival probability characterizing the speed of the course of the disease for each clinical event and their sequence were determined. The individual survival curves for each patient were calculated by using these parameters and the independent significant clinical variables selected from multivariate studies, summation of which resulted in a mean cause-specific survival function valid for the entire group. On the basis of this Markov model, prediction of the cause-specific survival probability is possible for extrastudy cases, if it is supposed that the clinical events occur within new patients in the same manner and with the similar probability as within the study population. Results: The patient's age, a distant metastasis at presentation, the extent of the surgical intervention, the primary tumor size and extent (pT), the external irradiation dosage and the degree of TSH suppression proved to be

  15. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  16. Survival Analysis of Patients with End Stage Renal Disease

    Science.gov (United States)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) under Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (pulmonary congestion and cardiovascular disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for Kaplan-Meier estimates. These lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with pulmonary congestion, cardiovascular disease, or both. It also shows that female patients have a greater risk of death than males. The probability of risk was given by the equation R = 1 − e^(−H(t)), where e^(−H(t)) is the survival function and H(t) is the cumulative hazard function, which was created using Cox regression.
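The abstract's risk formula, R = 1 − e^(−H(t)) with survival function S(t) = e^(−H(t)), can be illustrated with a simple parametric cumulative hazard. The Weibull form and parameter values here are invented for the sketch, not estimates from the hospital data:

```python
import math

def cumulative_hazard(t, lam, gamma):
    """H(t) for a Weibull hazard h(t) = lam * gamma * t**(gamma - 1)."""
    return lam * t ** gamma

def risk(t, lam, gamma):
    """R(t) = 1 - S(t) = 1 - exp(-H(t))."""
    return 1.0 - math.exp(-cumulative_hazard(t, lam, gamma))

# Hypothetical parameters lam = 0.02, gamma = 1.5
for t in (1, 5, 10):
    print(t, round(risk(t, 0.02, 1.5), 4))
```

Because H(t) is non-decreasing, R(t) rises and S(t) falls with time, which is exactly the qualitative conclusion the paper reports.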

  17. Impact of BCL2 and p53 on postmastectomy radiotherapy response in high-risk breast cancer. A subgroup analysis of DBCG82 b&c

    DEFF Research Database (Denmark)

    Kyndi, M.; Sorensen, F.B.; Alsner, J.

    2008-01-01

    Purpose. To examine p53 and BCL2 expression in high-risk breast cancer patients randomized to postmastectomy radiotherapy (PMRT). Patients and methods. The present analysis included 1000 of 3 083 high-risk breast cancer patients randomly assigned to PMRT in the DBCG82 b&c studies. Tissue microarray......, Kaplan-Meier probability plots, Log-rank test, and Cox univariate and multivariate regression analyses. Results. p53 accumulation was not significantly associated with increased overall mortality, DM or LRR probability in univariate or multivariate Cox regression analyses. Kaplan-Meier probability plots showed a significantly improved overall survival after PMRT for the BCL2 positive subgroup, whereas practically no survival improvement was seen after PMRT for the BCL2 negative subgroup. In multivariate analysis of OS, however, no significant interaction was found between BCL2......

  18. Statistical evaluation of a project to estimate fish trajectories through the intakes of Kaplan hydropower turbines

    Science.gov (United States)

    Sutton, Virginia Kay

    This paper examines statistical issues associated with estimating paths of juvenile salmon through the intakes of Kaplan turbines. Passive sensors, hydrophones, detecting signals from ultrasonic transmitters implanted in individual fish released into the preturbine region were used to obtain the information to estimate fish paths through the intake. Aim and location of the sensors affects the spatial region in which the transmitters can be detected, and formulas relating this region to sensor aiming directions are derived. Cramer-Rao lower bounds for the variance of estimators of fish location are used to optimize placement of each sensor. Finally, a statistical methodology is developed for analyzing angular data collected from optimally placed sensors.

  19. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date....... It is common practice to apply the Kaplan-Meier or Aalen-Johansen estimator to the total sample and report either the estimated cumulative incidence curve or just a single point on the curve as a description of the disease risk. METHODS: We argue that, whenever the disease or disorder of interest is influenced...

  20. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  1. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  2. Assessing the effect of quantitative and qualitative predictors on gastric cancer individuals survival using hierarchical artificial neural network models.

    Science.gov (United States)

    Amiri, Zohreh; Mohammad, Kazem; Mahmoudi, Mahmood; Parsaeian, Mahbubeh; Zeraati, Hojjat

    2013-01-01

    There are numerous unanswered questions in the application of artificial neural network models for analysis of survival data. In most studies, independent variables have been studied as qualitative dichotomous variables, and the results of using discrete and continuous quantitative, ordinal, or multinomial categorical predictive variables in these models are not well understood in comparison to conventional models. This study was designed and conducted to examine the application of these models in determining the survival of gastric cancer patients, in comparison to the Cox proportional hazards model. We studied the postoperative survival of 330 gastric cancer patients who underwent surgery at a surgical unit of the Iran Cancer Institute over a five-year period. Covariates of age, gender, history of substance abuse, cancer site, type of pathology, presence of metastasis, stage, and number of complementary treatments were entered in the models, and survival probabilities were calculated at 6, 12, 18, 24, 36, 48, and 60 months using the Cox proportional hazards and neural network models. We estimated coefficients of the Cox model and the weights in the neural network (with 3, 5, and 7 nodes in the hidden layer) in the training group, and used them to derive predictions in the study group. Predictions with these two methods were compared with those of the Kaplan-Meier product limit estimator as the gold standard. Comparisons were performed with the Friedman and Kruskal-Wallis tests. 
Survival probabilities at different times were determined using the Cox proportional hazards and a neural network with three nodes in the hidden layer; the ratios of standard errors with these two methods to the Kaplan-Meier method were 1.1593 and 1.0071, respectively, revealing a significant difference between Cox and Kaplan-Meier (P neural network, and the neural network and the standard (Kaplan-Meier), as well as better accuracy for the neural network (with 3 nodes in the hidden layer

  3. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  4. Meier-Gorlin syndrome

    NARCIS (Netherlands)

    Munnik, S.A. de; Hoefsloot, E.H.; Roukema, J.; Schoots, J.; Knoers, N.V.A.M.; Brunner, H.G.; Jackson, A.P.; Bongers, E.M.H.F.

    2015-01-01

Meier-Gorlin syndrome (MGS) is a rare autosomal recessive primordial dwarfism disorder, characterized by microtia, patellar aplasia/hypoplasia, and proportionate short stature. Associated clinical features encompass feeding problems, congenital pulmonary emphysema, mammary hypoplasia in females

  5. The estimation of small probabilities and risk assessment

    International Nuclear Information System (INIS)

    Kalbfleisch, J.D.; Lawless, J.F.; MacKay, R.J.

    1982-01-01

    The primary contribution of statistics to risk assessment is in the estimation of probabilities. Frequently the probabilities in question are small, and their estimation is particularly difficult. The authors consider three examples illustrating some problems inherent in the estimation of small probabilities

  6. Impact of BCL2 and p53 on postmastectomy radiotherapy response in high-risk breast cancer. A subgroup analysis of DBCG82 b&c

    DEFF Research Database (Denmark)

    Kyndi, Marianne; Sørensen, Flemming Brandt; Knudsen, Helle

    2008-01-01

PURPOSE: To examine p53 and BCL2 expression in high-risk breast cancer patients randomized to postmastectomy radiotherapy (PMRT). PATIENTS AND METHODS: The present analysis included 1 000 of 3 083 high-risk breast cancer patients randomly assigned to PMRT in the DBCG82 b&c studies. Tissue microarray sections were stained with immunohistochemistry for p53 and BCL2. Statistical analyses included Kappa statistics, χ2 or exact tests, Kaplan-Meier probability plots, Log-rank test, and Cox univariate and multivariate regression analyses. RESULTS: p53 accumulation was not significantly associated with increased overall mortality, DM or LRR probability in univariate or multivariate Cox regression analyses. Kaplan-Meier probability plots showed a significantly improved overall survival after PMRT for the BCL2 positive subgroup, whereas practically no survival improvement was seen after PMRT for the BCL2 negative subgroup. In multivariate analysis of OS, however, no significant interaction was found between BCL2 and randomization status.

  7. Some Supplementary Methods for the Analysis of the Delis-Kaplan Executive Function System

    Science.gov (United States)

    Crawford, John R.; Garthwaite, Paul H.; Sutherland, David; Borland, Nicola

    2011-01-01

    Supplementary methods for the analysis of the Delis-Kaplan Executive Function System (Delis, Kaplan, & Kramer, 2001) are made available, including (a) quantifying the number of abnormally low achievement scores exhibited by an individual and accompanying this with an estimate of the percentage of the normative population expected to exhibit at…

  8. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.

  9. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, must be reliably detected by these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
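
The binomial point-estimate demonstration described above can be sketched directly. A minimal illustration (the POD values and the allowance for missed detections are assumptions for illustration, not taken from the handbook):

```python
from math import comb

def prob_pass(p_true, n=29, max_misses=0):
    """Probability that a demonstration with n flaws passes, allowing up
    to max_misses missed detections, under a binomial model with true
    probability of detection p_true."""
    return sum(comb(n, k) * (1 - p_true) ** k * p_true ** (n - k)
               for k in range(max_misses + 1))

# A 29-of-29 demonstration supports a 90%-POD claim at roughly 95%
# confidence because a procedure whose true POD is only 0.90 rarely passes:
print(round(prob_pass(0.90), 4))  # ~0.047 chance of (falsely) passing
print(round(prob_pass(0.98), 4))  # a genuinely good procedure passes far more often
```

Trading off PPD against POF then amounts to choosing `n`, `max_misses`, and the flaw size so that both probabilities are acceptable.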

  10. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

Crash probability estimation is an important method to predict the potential reduction in crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To capture driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, which considers both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction in crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way to evaluate the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Estimating Loss to Follow-Up in HIV-Infected Patients on Antiretroviral Therapy: The Effect of the Competing Risk of Death in Zambia and Switzerland

    Science.gov (United States)

    Mwango, Albert; Stringer, Jeffrey; Ledergerber, Bruno; Mulenga, Lloyd; Bucher, Heiner C.; Westfall, Andrew O.; Calmy, Alexandra; Boulle, Andrew; Chintu, Namwinga; Egger, Matthias; Chi, Benjamin H.

    2011-01-01

    Background Loss to follow-up (LTFU) is common in antiretroviral therapy (ART) programmes. Mortality is a competing risk (CR) for LTFU; however, it is often overlooked in cohort analyses. We examined how the CR of death affected LTFU estimates in Zambia and Switzerland. Methods and Findings HIV-infected patients aged ≥18 years who started ART 2004–2008 in observational cohorts in Zambia and Switzerland were included. We compared standard Kaplan-Meier curves with CR cumulative incidence. We calculated hazard ratios for LTFU across CD4 cell count strata using cause-specific Cox models, or Fine and Gray subdistribution models, adjusting for age, gender, body mass index and clinical stage. 89,339 patients from Zambia and 1,860 patients from Switzerland were included. 12,237 patients (13.7%) in Zambia and 129 patients (6.9%) in Switzerland were LTFU and 8,498 (9.5%) and 29 patients (1.6%), respectively, died. In Zambia, the probability of LTFU was overestimated in Kaplan-Meier curves: estimates at 3.5 years were 29.3% for patients starting ART with CD4 cells Switzerland since only few patients died. The results from Cox and Fine and Gray models were similar: in Zambia the risk of loss to follow-up and death increased with decreasing CD4 counts at the start of ART, whereas in Switzerland there was a trend in the opposite direction, with patients with higher CD4 cell counts more likely to be lost to follow-up. Conclusions In ART programmes in low-income settings the competing risk of death can substantially bias standard analyses of LTFU. The CD4 cell count and other prognostic factors may be differentially associated with LTFU in low-income and high-income settings. PMID:22205933

  12. Analytical Method to Estimate Fatigue Life Time Duration in Service for Runner Blade Mechanism of Kaplan Turbines

    Directory of Open Access Journals (Sweden)

    Ana – Maria Budai

    2010-10-01

Full Text Available The paper presents an analytical method that can be used to determine the fatigue life duration in service for the runner blade mechanism of Kaplan turbines. The study was made for the lever button of the runner blade mechanism, using two analytical relations to calculate the maximum number of stress cycles for which the mechanism works without any damage. To estimate the fatigue life duration, a formula obtained from one of the most common cumulative damage methodologies is used, taking into consideration the real exploitation conditions of a specified Kaplan turbine.

  13. Meier-Gorlin syndrome

    OpenAIRE

    de Munnik, Sonja A; Hoefsloot, Elisabeth H; Roukema, Jolt; Schoots, Jeroen; Knoers, Nine V A M; Brunner, Han G; Jackson, Andrew P; Bongers, Ernie M H F

    2015-01-01

Meier-Gorlin syndrome (MGS) is a rare autosomal recessive primordial dwarfism disorder, characterized by microtia, patellar aplasia/hypoplasia, and proportionate short stature. Associated clinical features encompass feeding problems, congenital pulmonary emphysema, mammary hypoplasia in females and urogenital anomalies, such as cryptorchidism and hypoplastic labia minora and majora. Typical facial characteristics during childhood comprise a small mouth with full lips and micro-retrognathia...

  14. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  15. Impact of BCL2 and p53 on postmastectomy radiotherapy response in high-risk breast cancer. A subgroup analysis of DBCG82 b and c

    International Nuclear Information System (INIS)

    Kyndi, M.; Alsner, J.; Nielsen, H.M.; Overgaard, J.; Soerensen, F.B.; Knudsen, H.; Overgaard, M.

    2008-01-01

    Purpose. To examine p53 and BCL2 expression in high-risk breast cancer patients randomized to postmastectomy radiotherapy (PMRT). Patients and methods. The present analysis included 1 000 of 3 083 high-risk breast cancer patients randomly assigned to PMRT in the DBCG82 b and c studies. Tissue microarray sections were stained with immunohistochemistry for p53 and BCL2. Median potential follow-up was 17 years. Clinical endpoints were locoregional recurrence (LRR), distant metastases (DM), overall mortality, and overall survival (OS). Statistical analyses included Kappa statistics, χ2 or exact tests, Kaplan-Meier probability plots, Log-rank test, and Cox univariate and multivariate regression analyses. Results. p53 accumulation was not significantly associated with increased overall mortality, DM or LRR probability in univariate or multivariate Cox regression analyses. Kaplan-Meier probability plots showed reduced OS and improved DM and LRR probabilities after PMRT within subgroups of both p53 negative and p53 positive patients. Negative BCL2 expression was significantly associated with increased overall mortality, DM and LRR probability in multivariate Cox regression analyses. Kaplan-Meier probability plots showed a significantly improved overall survival after PMRT for the BCL2 positive subgroup, whereas practically no survival improvement was seen after PMRT for the BCL2 negative subgroup. In multivariate analysis of OS, however, no significant interaction was found between BCL2 and randomization status. Significant reductions in LRR probability after PMRT were recorded within both the BCL2 positive and BCL2 negative subgroups. Conclusion. p53 was not associated with survival after radiotherapy in high-risk breast cancer, but BCL2 might be

  16. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  17. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  18. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use ± impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
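
The Bayesian calculation against which the residents' estimates were compared can be written out in odds form. A minimal sketch (the pre-test probability, sensitivity, and specificity below are hypothetical, not the study's literature-derived values):

```python
def post_test_probability(pretest, likelihood_ratio):
    """Bayes' theorem in odds form: post-test odds = pre-test odds * LR."""
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Hypothetical numbers: 20% pre-test probability and a positive test with
# sensitivity 0.90 and specificity 0.80, so LR+ = 0.90 / (1 - 0.80) = 4.5
lr_pos = 0.90 / (1 - 0.80)
print(round(post_test_probability(0.20, lr_pos), 3))  # 0.529
```

The regression toward the middle of the scale reported in the study is exactly a deviation from this kind of calculation at the extremes.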

  19. Unsteady load on an oscillating Kaplan turbine runner

    Science.gov (United States)

    Puolakka, O.; Keto-Tokoi, J.; Matusiak, J.

    2013-02-01

    A Kaplan turbine runner oscillating in turbine waterways is subjected to a varying hydrodynamic load. Numerical simulation of the related unsteady flow is time-consuming and research is very limited. In this study, a simplified method based on unsteady airfoil theory is presented for evaluation of the unsteady load for vibration analyses of the turbine shaft line. The runner is assumed to oscillate as a rigid body in spin and axial heave, and the reaction force is resolved into added masses and dampings. The method is applied on three Kaplan runners at nominal operating conditions. Estimates for added masses and dampings are considered to be of a magnitude significant for shaft line vibration. Moderate variation in the added masses and minor variation in the added dampings is found in the frequency range of interest. Reference results for added masses are derived by solving the boundary value problem for small motions of inviscid fluid using the finite element method. Good correspondence is found in the added mass estimates of the two methods. The unsteady airfoil method is considered accurate enough for design purposes. Experimental results are needed for validation of unsteady load analyses.

  20. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date...... by calendar time trends, the total sample Kaplan-Meier and Aalen-Johansen estimators do not provide useful estimates of the general risk in the target population. We present some alternatives to this type of analysis. RESULTS: We show how a proportional hazards model may be used to extrapolate disease risk...... estimates if proportionality is a reasonable assumption. If not reasonable, we instead advocate that a more useful description of the disease risk lies in the age-specific cumulative incidence curves across strata given by time of entry or perhaps just the end of follow-up estimates across all strata...

  1. Fisher classifier and its probability of error estimation

    Science.gov (United States)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.

  2. Initial high anti-emetic efficacy of granisetron with dexamethasone is not maintained over repeated cycles.

    Science.gov (United States)

    de Wit, R.; van den Berg, H.; Burghouts, J.; Nortier, J.; Slee, P.; Rodenburg, C.; Keizer, J.; Fonteyn, M.; Verweij, J.; Wils, J.

    1998-01-01

    We have reported previously that the anti-emetic efficacy of single agent 5HT3 antagonists is not maintained when analysed with the measurement of cumulative probabilities. Presently, the most effective anti-emetic regimen is a combination of a 5HT3 antagonist plus dexamethasone. We, therefore, assessed the sustainment of efficacy of such a combination in 125 patients, scheduled to receive cisplatin > or = 70 mg m(-2) either alone or in combination with other cytotoxic drugs. Anti-emetic therapy was initiated with 10 mg of dexamethasone and 3 mg of granisetron intravenously, before cisplatin. On days 1-6, patients received 8 mg of dexamethasone and 1 mg of granisetron twice daily by oral administration. Protection was assessed during all cycles and calculated based on cumulative probability analyses using the method of Kaplan-Meier and a model for transitional probabilities. Irrespective of the type of analysis used, the anti-emetic efficacy of granisetron/dexamethasone decreased over cycles. The initial complete acute emesis protection rate of 66% decreased to 30% according to the method of Kaplan-Meier and to 39% using the model for transitional probabilities. For delayed emesis, the initial complete protection rate of 52% decreased to 21% (Kaplan-Meier) and to 43% (transitional probabilities). In addition, we observed that protection failure in the delayed emesis period adversely influenced the acute emesis protection in the next cycle. We conclude that the anti-emetic efficacy of a 5HT3 antagonist plus dexamethasone is not maintained over multiple cycles of highly emetogenic chemotherapy, and that the acute emesis protection is adversely influenced by protection failure in the delayed emesis phase. PMID:9652766

  3. A probability score for preoperative prediction of type 2 diabetes remission following RYGB surgery

    Science.gov (United States)

    Still, Christopher D.; Wood, G. Craig; Benotti, Peter; Petrick, Anthony T.; Gabrielsen, Jon; Strodel, William E.; Ibele, Anna; Seiler, Jamie; Irving, Brian A.; Celaya, Melisa P.; Blackstone, Robin; Gerhard, Glenn S.; Argyropoulos, George

    2014-01-01

    BACKGROUND Type 2 diabetes (T2D) is a metabolic disease with significant medical complications. Roux-en-Y gastric bypass (RYGB) surgery is one of the few interventions that remit T2D in ~60% of patients. However, there is no accurate method for predicting preoperatively the probability for T2D remission. METHODS A retrospective cohort of 2,300 RYGB patients at Geisinger Clinic was used to identify 690 patients with T2D and complete electronic data. Two additional T2D cohorts (N=276, and N=113) were used for replication at 14 months following RYGB. Kaplan-Meier analysis was used in the primary cohort to create survival curves until remission. A Cox proportional hazards model was used to estimate the hazard ratios on T2D remission. FINDINGS Using 259 preoperative clinical variables, four (use of insulin, age, HbA1c, and type of antidiabetic medication) were sufficient to develop an algorithm that produces a type 2 diabetes remission (DiaRem) score over five years. The DiaRem score spans from 0 to 22 and was divided into five groups corresponding to five probability-ranges for T2D remission: 0–2 (88%–99%), 3–7 (64%–88%), 8–12 (23%–49%), 13–17 (11%–33%), 18–22 (2%–16%). The DiaRem scores in the replication cohorts, as well as under various definitions of diabetes remission, conformed to the DiaRem score of the primary cohort. INTERPRETATION The DiaRem score is a novel preoperative method for predicting the probability (from 2% to 99%) for T2D remission following RYGB surgery. FUNDING This research was supported by the Geisinger Health System and the National Institutes of Health. PMID:24579062
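
The five score bands quoted in the abstract can be turned into a simple lookup. A minimal sketch (the band-to-probability ranges are taken from the abstract; the function itself and its handling of out-of-range scores are illustrative, and the scoring of the four clinical inputs is not reproduced here):

```python
# DiaRem score bands and their published five-year remission probability ranges
DIAREM_BANDS = [
    (range(0, 3),   (0.88, 0.99)),
    (range(3, 8),   (0.64, 0.88)),
    (range(8, 13),  (0.23, 0.49)),
    (range(13, 18), (0.11, 0.33)),
    (range(18, 23), (0.02, 0.16)),
]

def remission_probability_range(score):
    """Return the (low, high) T2D remission probability range for a
    DiaRem score between 0 and 22."""
    for band, prob_range in DIAREM_BANDS:
        if score in band:
            return prob_range
    raise ValueError("DiaRem score must be an integer from 0 to 22")

print(remission_probability_range(5))  # (0.64, 0.88)
```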

  4. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test on age dates representing the main failure episode of the Holocene Storegga landslide complex.
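
The Poisson-Gamma conjugate model has a closed-form posterior: with a Gamma(a, b) prior on the rate λ and n events observed over total time T, the posterior is Gamma(a + n, b + T). A minimal sketch (the prior parameters and the event count/window are illustrative, not the paper's Santa Barbara or Port Valdez data):

```python
from math import exp, lgamma, log

def posterior_mean(n_events, total_time, a=1.0, b=0.0):
    """Posterior mean of the Poisson rate λ under a Gamma(a, b) prior."""
    return (a + n_events) / (b + total_time)

def posterior_pdf(lam, n_events, total_time, a=1.0, b=0.0):
    """Density of the Gamma(a + n, b + T) posterior, evaluated in log
    space for numerical stability."""
    shape, rate = a + n_events, b + total_time
    log_pdf = (shape * log(rate) - lgamma(shape)
               + (shape - 1) * log(lam) - rate * lam)
    return exp(log_pdf)

# Hypothetical record: 4 dated landslides over a 10 kyr observation window,
# with a weakly informative prior (a = 1, b = 0)
print(posterior_mean(4, 10.0))  # 0.5 events per kyr
```

The full posterior density, rather than just its mean, is what supplies the "range of uncertainty" in λ that the abstract emphasizes.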

  5. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.

  6. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.

  7. Kaplan turbine tip vortex cavitation - analysis and prevention

    Science.gov (United States)

    Motycak, L.; Skotak, A.; Kupcik, R.

    2012-11-01

The work is focused on one type of Kaplan turbine runner cavitation: tip vortex cavitation. For a detailed description of the tip vortex, CFD analysis is used. On the basis of this analysis it is possible to estimate the intensity of the cavitating vortex core and the danger of cavitation pitting on the blade surface and runner chamber. In the paper, ways to avoid the pitting effect of the tip vortex are described. In order to protect the blade surface against pitting, the following possibilities are discussed: changing the geometry of the runner blade, changing the dimension of the tip clearance and, finally, installing anti-cavitation lips. Knowledge of the shape and intensity of the tip vortex helps to design the anti-cavitation lips in a more sophisticated way. Finally, the results of model tests of the Kaplan runner with and without anti-cavitation lips are compared with the results of the CFD analysis.

  8. Competing approaches to analysis of failure times with competing risks.

    Science.gov (United States)

    Farley, T M; Ali, M M; Slaymaker, E

    2001-12-15

    For the analysis of time to event data in contraceptive studies when individuals are subject to competing causes for discontinuation, some authors have recently advocated the use of the cumulative incidence rate as a more appropriate measure to summarize data than the complement of the Kaplan-Meier estimate of discontinuation. The former method estimates the rate of discontinuation in the presence of competing causes, while the latter is a hypothetical rate that would be observed if discontinuations for the other reasons could not occur. The difference between the two methods of analysis is the continuous time equivalent of a debate that took place in the contraceptive literature in the 1960s, when several authors advocated the use of net (adjusted or single decrement life table rates) rates in preference to crude rates (multiple decrement life table rates). A small simulation study illustrates the interpretation of the two types of estimate - the complement of the Kaplan-Meier estimate corresponds to a hypothetical rate where discontinuations for other reasons did not occur, while the cumulative incidence gives systematically lower estimates. The Kaplan-Meier estimates are more appropriate when estimating the effectiveness of a contraceptive method, but the cumulative incidence estimates are more appropriate when making programmatic decisions regarding contraceptive methods. Other areas of application, such as cancer studies, may prefer to use the cumulative incidence estimates, but their use should be determined according to the application. Copyright 2001 John Wiley & Sons, Ltd.
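
The distinction between the two estimators can be made concrete on toy data: treating the competing cause as censoring (complement of Kaplan-Meier) systematically exceeds the cumulative incidence. A minimal sketch with invented data, assuming distinct event times (the Aalen-Johansen form of the cumulative incidence is used here):

```python
def one_minus_km(data):
    """Complement of the Kaplan-Meier estimate for cause 1, treating
    competing events (cause 2) as censoring -- the hypothetical rate
    that would be seen if the competing cause could not occur."""
    at_risk, surv = len(data), 1.0
    for _, cause in sorted(data):
        if cause == 1:
            surv *= 1 - 1 / at_risk
        at_risk -= 1
    return 1 - surv

def cumulative_incidence(data):
    """Cumulative incidence of cause 1 in the presence of competing
    cause 2 -- the rate actually observed."""
    at_risk, overall_surv, cif = len(data), 1.0, 0.0
    for _, cause in sorted(data):
        if cause == 1:
            cif += overall_surv / at_risk
        if cause in (1, 2):
            overall_surv *= 1 - 1 / at_risk
        at_risk -= 1
    return cif

# (time, cause): 1 = discontinuation of interest, 2 = competing cause, 0 = censored
data = [(1, 1), (2, 2), (3, 1), (4, 0), (5, 2), (6, 0)]
print(round(one_minus_km(data), 3))          # 0.375 -- hypothetical rate
print(round(cumulative_incidence(data), 3))  # 0.333 -- crude rate (lower)
```

As the abstract argues, the 1 − KM figure answers the effectiveness question while the cumulative incidence answers the programmatic one.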

  9. Elevated plasma vitamin B12 levels and cancer prognosis: A population-based cohort study

    DEFF Research Database (Denmark)

    Arendt, Johan Frederik Håkonsen; Farkas, Dora Kormendine; Pedersen, Lars

    2015-01-01

    patients without a plasma Cbl measurement. Patients treated with Cbl were excluded. Survival probability was assessed using Kaplan-Meier curves. Mortality risk ratios (MRR) were computed using Cox proportional hazard regression, adjusted for age, sex, calendar year, cancer stage and comorbidity, scored...

  10. Genetics Home Reference: Meier-Gorlin syndrome

    Science.gov (United States)


  11. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...

  12. Censoring: a new approach for detection limits in total-reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.; Braziewicz, J.

    2004-01-01

    It is shown that the detection limits in total-reflection X-ray fluorescence (TXRF), which restrict the quantification of very low concentrations of trace elements in samples, can be accounted for using the statistical concept of censoring. We demonstrate that incomplete TXRF measurements containing so-called 'nondetects', i.e. non-measured concentrations falling below the detection limits and represented by the estimated detection limit values, can be viewed as left random-censored data, which can be further analyzed using the Kaplan-Meier (KM) method to correct for the nondetects. Within this approach, which uses the Kaplan-Meier product-limit estimator to obtain the cumulative distribution function corrected for the nondetects, the mean value and median of the detection-limit-censored concentrations can be estimated in a non-parametric way. Monte Carlo simulations show that the Kaplan-Meier approach yields highly accurate estimates for the mean and median concentrations, within a few percent of the simulated, uncensored data. This means that the uncertainties of the KM-estimated mean value and median are in fact limited only by the number of studied samples and not by the applied correction procedure for nondetects itself. On the other hand, it is observed that, in cases where the concentration of a given element is not measured in all the samples, simple approaches to estimating a mean concentration value from the data yield erroneous, systematically biased results. The discussed random left-censoring approach was applied to analyze TXRF detection-limit-censored concentration measurements of trace elements in biomedical samples. We emphasize that the Kaplan-Meier approach allows one to estimate mean concentrations substantially below the mean level of the detection limits. Consequently, this approach provides a new way to lower the effective detection limits of the TXRF method, which is of prime interest for
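    The standard "flipping" trick for applying the Kaplan-Meier product-limit estimator to left-censored data can be sketched as follows (toy concentrations, not measured data): the transformation x → M − x turns nondetects into right-censored observations, KM is applied, and the mean is flipped back.

```python
# Left-censored concentrations: (value, detected). For nondetects the
# value stored is the detection limit and detected is False.
samples = [(1.2, True), (0.8, False), (2.5, True), (0.5, False),
           (3.1, True), (1.0, True), (0.9, False), (2.0, True)]

def km_left_censored_mean(samples):
    """Kaplan-Meier mean for left-censored data via the flip trick:
    x -> M - x turns left-censoring into right-censoring."""
    M = max(v for v, _ in samples) + 1.0       # any constant above the data
    flipped = sorted((M - v, d) for v, d in samples)
    n = len(flipped)
    surv = 1.0
    mean_flipped = 0.0                          # area under the KM survival curve
    prev_t = 0.0
    for i, (t, event) in enumerate(flipped):
        at_risk = n - i
        mean_flipped += surv * (t - prev_t)
        prev_t = t
        if event:
            surv *= 1 - 1 / at_risk
    return M - mean_flipped                     # flip the mean back

km_mean = km_left_censored_mean(samples)
```

On this toy set the KM mean falls below the naive mean obtained by substituting detection-limit values for the nondetects, illustrating how the correction removes the upward bias.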

  13. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made

  14. Internal Medicine residents use heuristics to estimate disease probability

    Directory of Open Access Journals (Sweden)

    Sen Phang

    2015-12-01

    Conclusions: Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.

  15. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  16. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  17. Carbonic anhydrase IX and response to postmastectomy radiotherapy in high-risk breast cancer: a subgroup analysis of the DBCG82 b and c trials

    DEFF Research Database (Denmark)

    Kyndi, M.; Sorensen, F.B.; Alsner, J.

    2008-01-01

    -points were loco-regional recurrence, distant metastases, disease-specific survival and overall survival. Statistical analyses included kappa statistics, chi(2) or exact tests, Kaplan-Meier probability plots, Log-rank test and Cox regression analyses. Results CA IX was assessable in 945 cores. The percentage...

  18. Methods for estimating drought streamflow probabilities for Virginia streams

    Science.gov (United States)

    Austin, Samuel H.

    2014-01-01

    Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million streamflow daily values collected over the period of record (January 1, 1900 through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded the 46,704 equations with statistically significant fit statistics and parameter ranges published in two tables in this report. These model equations produce summer month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.
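    Once fitted, each of the report's equations is just the logistic function of winter streamflow; the sketch below applies that equation with illustrative placeholder coefficients, not values from the report's tables.

```python
import math

def drought_probability(b0, b1, winter_flow):
    """Probability that summer flow falls below the drought threshold,
    from a fitted logistic model: p = 1 / (1 + exp(-(b0 + b1 * x))).
    Coefficients b0 and b1 here are illustrative placeholders."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * winter_flow)))

# Illustrative: a negative slope means higher winter flow implies a lower
# probability of summer drought flows.
p_low_winter = drought_probability(2.0, -0.05, 10.0)    # low winter flow
p_high_winter = drought_probability(2.0, -0.05, 100.0)  # high winter flow
```

With a negative slope, the predicted drought probability for the high-winter-flow basin is well below that for the low-winter-flow basin, which is the qualitative behaviour the report's models exploit.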

  19. Dental age estimation: the role of probability estimates at the 10 year threshold.

    Science.gov (United States)

    Lucas, Victoria S; McDonald, Fraser; Neil, Monica; Roberts, Graham

    2014-08-01

    The use of probability at the 18 year threshold has simplified the reporting of dental age estimates for emerging adults. The availability of simple-to-use, widely available software has enabled the development of the probability threshold for individual teeth in growing children. Tooth development stage data from a previous study at the 10 year threshold were reused to estimate the probability of developing teeth being above or below the 10 year threshold using the NORMDIST function in Microsoft Excel. The probabilities within an individual subject are averaged to give a single probability that a subject is above or below 10 years old. To test the validity of this approach, dental panoramic radiographs of 50 female and 50 male children within 2 years of the chronological age were assessed with the chronological age masked. Once the whole validation set of 100 radiographs had been assessed, the masking was removed and the chronological age and dental age were compared to determine whether the dental age correctly or incorrectly identified a validation subject as above or below the 10 year threshold. The probability estimates correctly identified children as above or below on 94% of occasions. Only 2% of the validation group with a chronological age of less than 10 years were assigned to the over-10-year group. This study indicates the very high accuracy of assignment at the 10 year threshold. Further work at other legally important age thresholds is needed to explore the value of this approach to the technique of age estimation. Copyright © 2014. Published by Elsevier Ltd.
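    The NORMDIST-based calculation described above amounts to evaluating a normal CDF per tooth and averaging the results; a sketch with hypothetical per-tooth reference means and standard deviations (not the study's data):

```python
import math

def norm_cdf(x, mean, sd):
    """Equivalent of Excel's NORMDIST(x, mean, sd, TRUE)."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

# Hypothetical per-tooth reference data: (mean age, sd) for the
# development stage observed in each assessable tooth of one subject.
teeth = [(9.1, 0.9), (9.6, 1.1), (10.4, 1.3), (9.9, 1.0)]

# Per-tooth probability that the observed stage is reached before age 10,
# averaged to a single subject-level probability for the 10 year threshold.
p_under_10 = sum(norm_cdf(10.0, m, s) for m, s in teeth) / len(teeth)
```

The subject is then classified as under or over 10 years according to whether the averaged probability exceeds 0.5.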

  20. Influence of Working Environment on Fatigue Life Time Duration for Runner Blades of Kaplan Turbines

    Directory of Open Access Journals (Sweden)

    Ana-Maria Budai

    2010-10-01

    Full Text Available The paper presents an analytical study of the influence of the working environment on the in-service fatigue life of Kaplan turbine runner blades. The study uses only analytical methods, with input data obtained from in-situ measurements on a Kaplan turbine. To calculate the maximum number of stress cycles that the runner blades can withstand without damage, an analytical relation known in the specialized literature as Morrow’s relation was used. Fatigue life is then estimated with a formula derived from one of the most common cumulative damage methodologies, taking into consideration the real operating conditions of the specified Kaplan turbine.

  1. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
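    The update described in the abstract can be sketched with a discretized prior over the core-melt frequency and a Poisson zero-event likelihood; all numbers below are illustrative placeholders, not values from the Rasmussen study.

```python
import math

# Hypothetical discretized prior over core-melt frequency per reactor-year
lambdas = [1e-5, 1e-4, 1e-3, 1e-2]
prior = [0.25, 0.25, 0.25, 0.25]

T = 200.0  # reactor-years of operation with no core melt (illustrative)

# Likelihood of observing zero events in T reactor-years (Poisson)
like = [math.exp(-lam * T) for lam in lambdas]
norm = sum(p * l for p, l in zip(prior, like))
posterior = [p * l / norm for p, l in zip(prior, like)]

prior_mean = sum(p * lam for p, lam in zip(prior, lambdas))
post_mean = sum(p * lam for p, lam in zip(posterior, lambdas))
```

The "no core melt yet" observation down-weights the high-frequency hypotheses, so the posterior mean frequency drops by a modest factor relative to the prior mean, mirroring the factor-of-2-to-4 decrease reported.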

  2. Kaplan turbine tip vortex cavitation – analysis and prevention

    International Nuclear Information System (INIS)

    Motycak, L; Skotak, A; Kupcik, R

    2012-01-01

    The work focuses on one type of Kaplan turbine runner cavitation – tip vortex cavitation. The tip vortex is described in detail using CFD analysis, on the basis of which it is possible to estimate the intensity of the cavitating vortex core and the risk of cavitation pitting on the blade surface and in the runner chamber. The paper describes ways to avoid the pitting effect of the tip vortex. To protect the blade surface against pitting, the following options are discussed: changing the geometry of the runner blade, changing the tip clearance dimension and, finally, installing anti-cavitation lips. Knowledge of the shape and intensity of the tip vortex helps to design the anti-cavitation lips in a more sophisticated way. Finally, the results of model tests of the Kaplan runner with and without anti-cavitation lips are compared with the results of the CFD analysis.

  3. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue for improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. This provides a reduced set of tuning parameters that may lead to the reliable estimation of rare event probability for various problems. The relevance of the obtained results is assessed on a series of real-world aerospace problems

  4. Allelic drop-out probabilities estimated by logistic regression

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic drop-out...... is occurring. With this discussion, we hope to improve the drop-out model so that it can be used for practical forensic genetics and to stimulate further discussion. We discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions....

  5. Research on the cavitation characteristic of Kaplan turbine under sediment flow condition

    International Nuclear Information System (INIS)

    Weili, L; Jinling, L; Xingqi, L; Yuan, L

    2010-01-01

    The sediment concentration in many rivers of the world is very high, and Kaplan turbines running in these rivers are usually seriously abraded. Because of the sand, the probability of cavitation is greatly increased. Under the joint action and mutual promotion of cavitation and sand erosion, serious abrasion can occur, the hydraulic performance of the Kaplan turbine may degrade, and the safety and stability of the turbine are greatly threatened. It is therefore important to investigate the cavitation characteristics of Kaplan turbines under sediment-laden flow conditions. In this paper, numerical simulation of the cavitation characteristics of a Kaplan turbine in pure water and in solid-liquid two-phase flow was performed. A solid-liquid two-fluid model was adopted, and the distributions of pressure, velocity and particle concentration on the turbine blade surface for different particle diameters and concentrations were obtained. A particle trajectory model was used to investigate the region and degree of runner blade abrasion under different conditions. The results showed that serious sand abrasion occurs near the blade head and outlet under large-flow-rate working conditions, and relatively slight abrasion near the blade flange under small-flow-rate working conditions. The higher the sediment concentration and the larger the sand diameter, the more seriously the runner is abraded and the greater the decrease in efficiency. Further analysis of the combined effects of wear and cavitation shows that cavitation in silt-laden flow is more serious than in pure water. The runner cavitation performance worsens at high sand concentration and large particle diameter, and the efficiency decreases greatly with increasing sediment concentration.

  6. Research on the cavitation characteristic of Kaplan turbine under sediment flow condition

    Energy Technology Data Exchange (ETDEWEB)

    Weili, L; Jinling, L; Xingqi, L; Yuan, L, E-mail: liaoweili2004@163.co [Institute of Water Resources and Hydro-Electric Engineering, Xi' an University of Technology No.5 South Jinhua Road, Xi' an, Shaanxi, 710048 (China)

    2010-08-15

    The sediment concentration in many rivers of the world is very high, and Kaplan turbines running in these rivers are usually seriously abraded. Because of the sand, the probability of cavitation is greatly increased. Under the joint action and mutual promotion of cavitation and sand erosion, serious abrasion can occur, the hydraulic performance of the Kaplan turbine may degrade, and the safety and stability of the turbine are greatly threatened. It is therefore important to investigate the cavitation characteristics of Kaplan turbines under sediment-laden flow conditions. In this paper, numerical simulation of the cavitation characteristics of a Kaplan turbine in pure water and in solid-liquid two-phase flow was performed. A solid-liquid two-fluid model was adopted, and the distributions of pressure, velocity and particle concentration on the turbine blade surface for different particle diameters and concentrations were obtained. A particle trajectory model was used to investigate the region and degree of runner blade abrasion under different conditions. The results showed that serious sand abrasion occurs near the blade head and outlet under large-flow-rate working conditions, and relatively slight abrasion near the blade flange under small-flow-rate working conditions. The higher the sediment concentration and the larger the sand diameter, the more seriously the runner is abraded and the greater the decrease in efficiency. Further analysis of the combined effects of wear and cavitation shows that cavitation in silt-laden flow is more serious than in pure water. The runner cavitation performance worsens at high sand concentration and large particle diameter, and the efficiency decreases greatly with increasing sediment concentration.

  7. Research on the cavitation characteristic of Kaplan turbine under sediment flow condition

    Science.gov (United States)

    Weili, L.; Jinling, L.; Xingqi, L.; Yuan, L.

    2010-08-01

    The sediment concentration in many rivers of the world is very high, and Kaplan turbines running in these rivers are usually seriously abraded. Because of the sand, the probability of cavitation is greatly increased. Under the joint action and mutual promotion of cavitation and sand erosion, serious abrasion can occur, the hydraulic performance of the Kaplan turbine may degrade, and the safety and stability of the turbine are greatly threatened. It is therefore important to investigate the cavitation characteristics of Kaplan turbines under sediment-laden flow conditions. In this paper, numerical simulation of the cavitation characteristics of a Kaplan turbine in pure water and in solid-liquid two-phase flow was performed. A solid-liquid two-fluid model was adopted, and the distributions of pressure, velocity and particle concentration on the turbine blade surface for different particle diameters and concentrations were obtained. A particle trajectory model was used to investigate the region and degree of runner blade abrasion under different conditions. The results showed that serious sand abrasion occurs near the blade head and outlet under large-flow-rate working conditions, and relatively slight abrasion near the blade flange under small-flow-rate working conditions. The higher the sediment concentration and the larger the sand diameter, the more seriously the runner is abraded and the greater the decrease in efficiency. Further analysis of the combined effects of wear and cavitation shows that cavitation in silt-laden flow is more serious than in pure water. The runner cavitation performance worsens at high sand concentration and large particle diameter, and the efficiency decreases greatly with increasing sediment concentration.

  8. Comparison of density estimators. [Estimation of probability density functions

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.; Monahan, J.F.

    1977-09-01

    Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed, as are some simulations. The object is to compare the performance of the various methods in small samples and their sensitivity to change in their parameters, and to attempt to discover at what point a sample is so small that density estimation can no longer be worthwhile. (RWR)
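    As a minimal example of the kind of estimator being compared, a Gaussian kernel density estimate, one of the classical nonparametric methods; the sample below is made up:

```python
import math

def kernel_density(sample, x, bandwidth):
    """Gaussian kernel density estimate at x: the average of normal
    'bumps' centred on the observations."""
    n = len(sample)
    coef = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    return coef * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                      for s in sample)

sample = [-1.2, -0.4, 0.0, 0.3, 1.1]
```

The bandwidth is the tuning parameter whose sensitivity such comparison studies examine: too small and the estimate is spiky, too large and it oversmooths.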

  9. Simplified tools for measuring retention in care in antiretroviral treatment program in Ethiopia: cohort and current retention in care.

    Science.gov (United States)

    Assefa, Yibeltal; Worku, Alemayehu; Wouters, Edwin; Koole, Olivier; Haile Mariam, Damen; Van Damme, Wim

    2012-01-01

    Patient retention in care is a critical challenge for antiretroviral treatment programs, mainly because retention in care is related to adherence to treatment and patient survival. It is therefore imperative that health facilities and programs measure patient retention in care. However, the currently available tools for measuring retention in care, such as the Kaplan-Meier estimate, have many practical limitations. The objective of this study was to develop simplified tools for measuring retention in care. Retrospective cohort data were collected from patient registers in nine health facilities in Ethiopia. Retention in care was the primary outcome for the study. Tools were developed to measure "current retention" in care during a specific period of time for a specific "ART-age group" and "cohort retention" in care among patients who were followed for the last "Y" number of years on ART. "Probability of retention" based on the tool for "cohort retention" in care was compared with "probability of retention" based on Kaplan-Meier. We found that the new tools make it possible to measure "current retention" and "cohort retention" in care. We also found that the tools were easy to use and did not require advanced statistical skills. Both "current retention" and "cohort retention" are lower among patients in the first two "ART-age groups" and "ART-age cohorts" than in subsequent "ART-age groups" and "ART-age cohorts". The "probability of retention" based on the new tools was found to be similar to the "probability of retention" based on Kaplan-Meier. The simplified tools for "current retention" and "cohort retention" will enable practitioners and program managers to measure and monitor rates of retention in care easily and appropriately. We therefore recommend that health facilities and programs start to use these tools in their efforts to improve retention in care and patient outcomes.
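    The similarity the authors report between the simplified "cohort retention" measure and the Kaplan-Meier probability can be illustrated on a toy cohort (hypothetical patients, not study data): when no one is censored before the census point, the two coincide exactly.

```python
# Toy cohort: (months followed, still in care) for patients who all
# started ART at least 12 months before the census date.
cohort = [(12, True), (12, True), (3, False), (12, True),
          (8, False), (12, True), (12, True), (5, False)]

def cohort_retention(cohort, months):
    """Simplified 'cohort retention': the fraction of the cohort still
    in care at the census point."""
    assert all(t == months or not in_care for t, in_care in cohort)
    return sum(in_care for _, in_care in cohort) / len(cohort)

def km_retention(cohort):
    """Kaplan-Meier retention probability at the census point.  With no
    censoring before that point it reduces to the same fraction."""
    surv = 1.0
    at_risk = len(cohort)
    for t in sorted(t for t, in_care in cohort if not in_care):
        surv *= 1 - 1 / at_risk
        at_risk -= 1
    return surv
```

The simplified measure needs only a register count, which is why the authors argue it requires no advanced statistical skills.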

  10. Simplified tools for measuring retention in care in antiretroviral treatment program in Ethiopia: cohort and current retention in care.

    Directory of Open Access Journals (Sweden)

    Yibeltal Assefa

    Full Text Available INTRODUCTION: Patient retention in care is a critical challenge for antiretroviral treatment programs, mainly because retention in care is related to adherence to treatment and patient survival. It is therefore imperative that health facilities and programs measure patient retention in care. However, the currently available tools for measuring retention in care, such as the Kaplan-Meier estimate, have many practical limitations. The objective of this study was to develop simplified tools for measuring retention in care. METHODS: Retrospective cohort data were collected from patient registers in nine health facilities in Ethiopia. Retention in care was the primary outcome for the study. Tools were developed to measure "current retention" in care during a specific period of time for a specific "ART-age group" and "cohort retention" in care among patients who were followed for the last "Y" number of years on ART. "Probability of retention" based on the tool for "cohort retention" in care was compared with "probability of retention" based on Kaplan-Meier. RESULTS: We found that the new tools make it possible to measure "current retention" and "cohort retention" in care. We also found that the tools were easy to use and did not require advanced statistical skills. Both "current retention" and "cohort retention" are lower among patients in the first two "ART-age groups" and "ART-age cohorts" than in subsequent "ART-age groups" and "ART-age cohorts". The "probability of retention" based on the new tools was found to be similar to the "probability of retention" based on Kaplan-Meier. CONCLUSION: The simplified tools for "current retention" and "cohort retention" will enable practitioners and program managers to measure and monitor rates of retention in care easily and appropriately. We therefore recommend that health facilities and programs start to use these tools in their efforts to improve retention in care and patient outcomes.

  11. Estimating the probability that the Taser directly causes human ventricular fibrillation.

    Science.gov (United States)

    Sun, H; Haemmerich, D; Rahko, P S; Webster, J G

    2010-04-01

    This paper describes the first methodology and results for estimating the order of probability for Tasers directly causing human ventricular fibrillation (VF). The probability of an X26 Taser causing human VF was estimated using: (1) current density near the human heart estimated by using 3D finite-element (FE) models; (2) prior data of the maximum dart-to-heart distances that caused VF in pigs; (3) minimum skin-to-heart distances measured in erect humans by echocardiography; and (4) dart landing distribution estimated from police reports. The estimated mean probability of human VF was 0.001 for data from a pig having a chest wall resected to the ribs and 0.000006 for data from a pig with no resection when inserting a blunt probe. The VF probability for a given dart location decreased with the dart-to-heart horizontal distance (radius) on the skin surface.

  12. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcome using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs we review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
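    Probability estimation with k-NN, as described in the abstract, reduces to a local average of class labels; a minimal one-dimensional sketch on toy data (not from the paper):

```python
def knn_probability(train, x, k=3):
    """Estimate P(Y = 1 | X = x) as the fraction of positive labels
    among the k nearest neighbours: a nonparametric regression
    estimate of the class probability."""
    neighbours = sorted(train, key=lambda pair: abs(pair[0] - x))[:k]
    return sum(y for _, y in neighbours) / k

# Toy one-dimensional training data: (feature, label)
train = [(0.1, 0), (0.3, 0), (0.4, 0), (0.9, 1), (1.1, 1), (1.3, 1)]
```

Unlike a misspecified logistic model, the estimate makes no assumption about the shape of P(Y = 1 | X = x); consistency follows from letting k grow with the sample size, which is the regression-learning-machine framing the paper reviews.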

  13. The estimation of collision probabilities in complicated geometries

    International Nuclear Information System (INIS)

    Roth, M.J.

    1969-04-01

    This paper demonstrates how collision probabilities in complicated geometries may be estimated. It is assumed that the reactor core may be divided into a number of cells each with simple geometry so that a collision probability matrix can be calculated for each cell by standard methods. It is then shown how these may be joined together. (author)

  14. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
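    The Bayes'-theorem relationship underlying such an estimator can be sketched in a few lines. The prior copying rate and the two likelihoods are assumed known here for illustration; estimating them from data is the article's actual contribution:

```python
# Hypothetical sketch: posterior probability of copying given an
# answer-copying statistic, via Bayes' theorem. All inputs are invented.
def posterior_copying(prior, like_copy, like_no_copy):
    """P(copy | stat) = prior*f1 / (prior*f1 + (1 - prior)*f0)."""
    num = prior * like_copy
    return num / (num + (1.0 - prior) * like_no_copy)

# A rare prior (1%) with a statistic far more likely under copying.
p = posterior_copying(prior=0.01, like_copy=5.0, like_no_copy=0.1)
```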

  15. MAINTAINANCE OF KAPLAN TURBINE TO ENHANCE THE EFFICIENCY

    OpenAIRE

    Mr. Shakti Prasanna Khadanga*; Nitish Kumar; Milind Kumar Singh; L. Raj Kumar

    2016-01-01

    Hydro power plants are a source of renewable energy and reduce the burning of fossil fuels, and with it the resulting pollution. This project depicts how sediment erosion occurs in a Kaplan turbine and the various components of the Kaplan turbine where erosion actually takes place. Erosion not only reduces the efficiency [7] and life of the hydro power turbine but also causes problems in operation and maintenance. We conducted some necessary tests on a Kaplan turbine in the fluid power laboratory. We are d...

  16. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics, and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in many fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  17. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical as they have been shown to be significant contributors to core melt probability in pressurized water reactors

  18. Clinical outcomes in patients with node-negative breast cancer treated based on the recurrence score results: evidence from a large prospectively designed registry.

    Science.gov (United States)

    Stemmer, Salomon M; Steiner, Mariana; Rizel, Shulamith; Soussan-Gutman, Lior; Ben-Baruch, Noa; Bareket-Samish, Avital; Geffen, David B; Nisenbaum, Bella; Isaacs, Kevin; Fried, Georgeta; Rosengarten, Ora; Uziely, Beatrice; Svedman, Christer; McCullough, Debbie; Maddala, Tara; Klang, Shmuel H; Zidan, Jamal; Ryvo, Larisa; Kaufman, Bella; Evron, Ella; Karminsky, Natalya; Goldberg, Hadassah; Shak, Steven; Liebermann, Nicky

    2017-01-01

    The 21-gene Recurrence Score® (RS) assay is a validated prognostic/predictive tool in ER + early-stage breast cancer. However, clinical outcome data from prospective studies in RS ≥ 11 patients are lacking, as are relevant real-life clinical practice data. In this retrospective analysis of a prospectively designed registry, we evaluated treatments/clinical outcomes in patients undergoing RS-testing through Clalit Health Services. The analysis included N0 ER + HER2-negative breast cancer patients who were RS-tested from 1/2006 through 12/2010. Medical records were reviewed to verify treatments/recurrences/survival. The cohort included 1801 patients (median follow-up, 6.2 years). Median age was 60 years, 50.4% were grade 2 and 81.1% had invasive ductal carcinoma; 48.9% had RS < 18, 40.7% RS 18-30, and 10.4% RS ≥ 31, with chemotherapy use of 1.4, 23.7, and 87.2%, respectively. The 5-year Kaplan-Meier estimates for distant recurrence were 0.8, 3.0, and 8.6%, for patients with RS < 18, RS 18-30 and RS ≥ 31, respectively; the corresponding 5-year Kaplan-Meier estimates for breast cancer death were 0.0, 0.9, and 6.2%. Chemotherapy-untreated patients with RS < 11 (n = 304) and 11-25 (n = 1037) (TAILORx categorization) had 5-year Kaplan-Meier estimates for distant recurrence risk/breast cancer death of 1.0%/0.0% and 1.3%/0.4%, respectively. Our results extend those of the prospective TAILORx trial: the 5-year Kaplan-Meier estimates for distant recurrence and breast cancer death rate for the RS < 18 patients were very low, supporting the use of endocrine therapy alone. Furthermore, in chemotherapy-untreated patients with RS 11-25 (where TAILORx patients were randomized to chemoendocrine or endocrine therapy alone), 5-year distant recurrence rates were also very low, suggesting that chemotherapy would not have conferred clinically meaningful benefit.
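    The Kaplan-Meier estimates quoted above are products of conditional survival probabilities over observed event times. A minimal pure-Python version of the estimator, using made-up follow-up times rather than registry data:

```python
# Minimal Kaplan-Meier estimator. times: follow-up times;
# events: 1 = event observed, 0 = censored. Illustration data only.
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct time with at least one event."""
    data = sorted(zip(times, events))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        at_risk = n - i                           # still under observation at t
        if d > 0:
            s *= 1.0 - d / at_risk                # conditional survival factor
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)  # skip past all ties at t
    return curve

curve = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 1])
```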

  19. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/2.5n, where n is the number of trials. © 2011 Society for Risk Analysis.
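    The closing approximation can be stated in a few lines of code; the trial counts below are illustrative:

```python
# Sketch of the abstract's closing remark: with zero events in n trials the
# MLE is 0, while the minimax-motivated estimate is approximately 1/(2.5 n).
def event_probability(events, n):
    if events == 0:
        return 1.0 / (2.5 * n)   # approximation quoted in the abstract
    return events / n            # ordinary MLE for nonzero data

p_zero = event_probability(0, 100)   # 0.004 rather than 0
p_some = event_probability(3, 100)   # ordinary MLE, 0.03
```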

  20. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the π estimator that uses the probability sample can be

  1. Meier-Gorlin syndrome Clinical genetics and genomics

    NARCIS (Netherlands)

    S. de Munnik (Sonja); E.H. Hoefsloot (Lies); J. Roukema (Jolt); J. Schoots (Jeroen); N.V.A.M. Knoers (Nine); H.G. Brunner; A.P. Jackson (Andrew); E. Bongers (Ernie)

    2015-01-01

    Meier-Gorlin syndrome (MGS) is a rare autosomal recessive primordial dwarfism disorder, characterized by microtia, patellar aplasia/hypoplasia, and proportionate short stature. Associated clinical features encompass feeding problems, congenital pulmonary emphysema, mammary hypoplasia

  2. Meier-Gorlin syndrome Clinical genetics and genomics

    NARCIS (Netherlands)

    De Munnik, Sonja A.; Hoefsloot, Elisabeth H.; Roukema, Jolt; Schoots, Jeroen; Knoers, Nine Vam; Brunner, Han G.; Jackson, Andrew P.; Bongers, Ernie Mhf

    2015-01-01

    Meier-Gorlin syndrome (MGS) is a rare autosomal recessive primordial dwarfism disorder, characterized by microtia, patellar aplasia/hypoplasia, and proportionate short stature. Associated clinical features encompass feeding problems, congenital pulmonary emphysema, mammary hypoplasia in females

  3. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break
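    The four-factor structure described above is simply a product of conditional probabilities. A sketch with placeholder values, not the SRS numbers:

```python
# The four-factor chain from the abstract, multiplied into a per-weld
# large-break probability. The inputs below are invented placeholders.
def large_break_probability(p_igscc, p_miss_ut, p_no_leak_detect, p_unstable):
    """Product of the four conditional probabilities listed in the text:
    IGSCC present, missed by UT, not detected by leakage, grows to instability."""
    return p_igscc * p_miss_ut * p_no_leak_detect * p_unstable

p = large_break_probability(1e-2, 1e-1, 1e-2, 1e-3)
```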

  4. Kaplan SpellRead. What Works Clearinghouse Intervention Report

    Science.gov (United States)

    What Works Clearinghouse, 2007

    2007-01-01

    "Kaplan SpellRead" (formerly known as "SpellRead Phonological Auditory Training"[R]) is a literacy program for struggling readers in grades 2 or above, including special education students, English language learners, and students more than two years below grade level in reading. "Kaplan SpellRead" integrates the…

  5. Optimization of Kaplan turbines. A contribution to economic efficiency; Optimierung von Kaplan-Turbinen. Ein Beitrag zur Betriebswirtschaftlichkeit

    Energy Technology Data Exchange (ETDEWEB)

    Sevcik, Petr

    2009-07-01

    The Kaplan turbine has the best theoretical efficiency chart over the total range of operation. In order to achieve these good properties, the turbine has to be adjusted optimally. In general, these settings are performed by the manufacturer of the turbine during commissioning. In practice one often meets Kaplan turbines whose cam setting does not correspond to the optimal control line. The author of the contribution under consideration reports on possible causes of these errors and on methods by which the cam setting can be optimized cost-effectively and power losses minimized.

  6. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    International Nuclear Information System (INIS)

    Galetovic, Alexander; Munoz, Cristian M.

    2009-01-01

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial over- or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower

  7. Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems

    Directory of Open Access Journals (Sweden)

    Hagit Messer

    2007-11-01

    In recent years, more and more wireless communications systems are required to also provide a positioning measurement. In code division multiple access (CDMA) communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI) caused by other users in the system. This MAI is commonly managed by a power control mechanism, and yet, MAI has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism. In this mechanism, a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC), while all other users receive more reliable data during these quiet periods. Previous research had shown that the implementation of a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves the positioning accuracy. We focus on time-of-arrival (TOA) based positioning systems. We analyze the TOA estimation performance in a generalized CDMA system, in which the probability control mechanism is employed, where the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous CDMA system.

  8. Nested Cohort - R software package

    Science.gov (United States)

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.

  9. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows a better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.
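    The two-stage model can be sketched directly: a Bayesian update from observed choices, followed by probability matching. The prior, reliability, and observation counts below are illustrative assumptions, not fitted stickleback parameters:

```python
import random

# Stage one: Bayesian update that option A is best, treating each observed
# animal's choice as independent evidence with the given reliability.
def bayes_estimate(prior_a, reliability, n_chose_a, n_chose_b):
    like_a = reliability ** n_chose_a * (1 - reliability) ** n_chose_b
    like_b = reliability ** n_chose_b * (1 - reliability) ** n_chose_a
    num = prior_a * like_a
    return num / (num + (1 - prior_a) * like_b)

# Stage two: probability matching -- choose A with probability equal to
# the posterior, rather than always taking the more probable option.
def probability_match(p_a, rng=random):
    return 'A' if rng.random() < p_a else 'B'

p = bayes_estimate(prior_a=0.5, reliability=0.7, n_chose_a=3, n_chose_b=1)
```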

  10. Long-Term Survivors Using Intraoperative Radiotherapy for Recurrent Gynecologic Malignancies

    International Nuclear Information System (INIS)

    Tran, Phuoc T.; Su Zheng; Hara, Wendy; Husain, Amreen; Teng, Nelson; Kapp, Daniel S.

    2007-01-01

    Purpose: To analyze the outcomes of therapy and identify prognostic factors for patients treated with surgery followed by intraoperative radiotherapy (IORT) for gynecologic malignancies at a single institution. Methods and Materials: We performed a retrospective review of 36 consecutive patients treated with IORT to 44 sites with mean follow-up of 50 months. The primary site was the cervix in 47%, endometrium in 31%, vulva in 14%, vagina in 6%, and fallopian tubes in 3%. Previous RT had failed in 72% of patients, and 89% had recurrent disease. Of 38 IORT sessions, 84% included maximal cytoreductive surgery, including 18% exenterations. The mean age was 52 years (range, 30-74), mean tumor size was 5 cm (range, 0.5-12), previous disease-free interval was 32 months (range, 0-177), and mean IORT dose was 1,152 cGy (range, 600-1,750). RT and systemic therapy after IORT were given to 53% and 24% of the cohort, respectively. The outcomes measured were locoregional control (LRC), distant metastasis-free survival (DMFS), disease-specific survival (DSS), and treatment-related complications. Results: The Kaplan-Meier 5-year LRC, DMFS, and DSS probability for the whole group was 44%, 51%, and 47%, respectively. For cervical cancer patients, the Kaplan-Meier 5-year LRC, DMFS, and DSS estimate was 45%, 60%, and 46%, respectively. The prognostic factors found on multivariate analysis (p ≤ 0.05) were the disease-free interval for LRC, tumor size for DMFS, and cervical primary, previous surgery, and locoregional relapse for DSS. Our cohort had 10 Grade 3-4 complications associated with treatment (surgery and IORT) and a Kaplan-Meier 5-year Grade 3-4 complication-free survival rate of 72%. Conclusions: Survival for pelvic recurrence of gynecologic cancer is poor (range, 0-25%). IORT after surgery seems to confer long-term local control in carefully selected patients

  11. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    Science.gov (United States)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
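    The importance-sampling step at the core of this framework can be illustrated in one dimension: sample from a biasing density shifted toward the failure region (standing in here for one built from a low-fidelity model) and reweight by the likelihood ratio so the estimate stays unbiased. A hedged sketch on a standard-normal toy model, not the jet simulation:

```python
import math
import random

# Standard-normal density with mean mu, unit variance.
def normal_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def failure_prob_is(threshold, shift, n, seed=0):
    """Unbiased importance-sampling estimate of P(X > threshold), X ~ N(0, 1),
    sampling from the biasing density N(shift, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)               # draw from biasing density
        if x > threshold:                       # "failure" indicator
            total += normal_pdf(x, 0.0) / normal_pdf(x, shift)  # likelihood ratio
    return total / n

# True value is about 1.35e-3; plain Monte Carlo would need far more samples.
p = failure_prob_is(threshold=3.0, shift=3.0, n=20000)
```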

  12. Conditional survival of patients with diffuse large B-cell lymphoma

    DEFF Research Database (Denmark)

    Møller, Michael Boe; Pedersen, Niels Tinggaard; Christensen, Bjarne E

    2006-01-01

    BACKGROUND: Prognosis of lymphoma patients is usually estimated at the time of diagnosis, and the estimates are guided by the International Prognostic Index (IPI). However, conditional survival estimates are more informative clinically, as they consider only those patients who have already survived a period of time after treatment. Conditional survival data have not been reported for lymphoma patients. METHODS: Conditional survival was estimated with the Kaplan-Meier method for 1209 patients with diffuse large B-cell lymphoma (DLBCL) from the population-based LYFO registry of the Danish Lymphoma Group. ... Conditional survival probability provides more accurate prognostic information than the conventional survival rate estimated from the time of diagnosis.
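    Conditional survival follows from an ordinary survival curve as S(t + s) / S(s): the probability of surviving t more years given survival to s. A minimal sketch with an invented step-function curve, not LYFO data:

```python
# Step-function lookup on a Kaplan-Meier-style curve [(time, S(time))],
# sorted by time; S = 1 before the first event time.
def survival(curve, t):
    s = 1.0
    for time, value in curve:
        if time <= t:
            s = value
        else:
            break
    return s

def conditional_survival(curve, s_years, t_more):
    """P(T > s_years + t_more | T > s_years) = S(s + t) / S(s)."""
    return survival(curve, s_years + t_more) / survival(curve, s_years)

curve = [(1, 0.90), (2, 0.80), (3, 0.72), (5, 0.66)]  # illustrative values
p = conditional_survival(curve, s_years=2, t_more=3)  # S(5) / S(2)
```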

  13. Failure Probability Estimation of Wind Turbines by Enhanced Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Naess, Arvid

    2012-01-01

    This paper discusses the estimation of the failure probability of wind turbines required by codes of practice for designing them. Standard Monte Carlo (SMC) simulation may conceptually be used for this purpose as an alternative to the popular Peaks-Over-Threshold (POT) method. However, estimation of very low failure probabilities with SMC simulations leads to unacceptably high computational costs. In this study, an Enhanced Monte Carlo (EMC) method is proposed that overcomes this obstacle. The method has advantages over both POT and SMC in terms of its low computational cost and accuracy. ... is controlled by the pitch controller. This provides a fair framework for comparison of the behavior and failure event of the wind turbine with emphasis on the effect of the pitch controller. The Enhanced Monte Carlo method is then applied to the model and the failure probabilities of the model are estimated...

  14. Markov chains and semi-Markov models in time-to-event analysis.

    Science.gov (United States)

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
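    A minimal discrete-time Markov chain with an absorbing death state shows how non-constant survival probabilities fall directly out of the transition matrix. The transition probabilities below are illustrative, not from any study:

```python
# One step of a discrete-time Markov chain: propagate the state
# distribution through the transition matrix P.
def step(dist, P):
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# States: 0 = healthy, 1 = ill, 2 = dead (absorbing). Rows sum to 1.
P = [
    [0.90, 0.08, 0.02],
    [0.10, 0.70, 0.20],
    [0.00, 0.00, 1.00],
]

dist = [1.0, 0.0, 0.0]   # everyone starts healthy
survival = []
for _ in range(5):
    dist = step(dist, P)
    survival.append(1.0 - dist[2])   # P(not dead) after each step
```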

  15. ORIGINAL ARTICLES Antiretroviral treatment for children

    African Journals Online (AJOL)

    The Kaplan-Meier survival estimate for 407 children at 1 year was 84% (95% ... highly active antiretroviral therapy (HAART) to 3 million people living with HIV/AIDS in ... Furthermore, improvements in growth and body composition parameters ...

  16. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

    Intuitionistic fuzzy (IF) evidence theory, as an extension of Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory. Many works on the belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the hope of enabling rational decision making. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of probability distributions are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.

  17. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
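    The model described above can be sketched as an average of 1 - (1 - p)^k over the distribution of lifetime partner numbers k. All numbers below are illustrative assumptions, not the study's inputs:

```python
# Back-of-envelope lifetime acquisition probability: average the
# per-group probability 1 - (1 - p)^k over a partner-number distribution.
def lifetime_probability(partner_dist, p_per_partner):
    """partner_dist: {lifetime partners k: population fraction}."""
    return sum(frac * (1.0 - (1.0 - p_per_partner) ** k)
               for k, frac in partner_dist.items())

# Invented partner distribution and per-partnership acquisition probability.
dist = {1: 0.25, 2: 0.15, 4: 0.30, 10: 0.20, 20: 0.10}
p = lifetime_probability(dist, p_per_partner=0.40)
```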

  18. BAYES-HEP: Bayesian belief networks for estimation of human error probability

    International Nuclear Information System (INIS)

    Karthick, M.; Senthil Kumar, C.; Paul, Robert T.

    2017-01-01

    Human errors contribute a significant portion of risk in safety critical applications and methods for estimation of human error probability have been a topic of research for over a decade. The scarce data available on human errors and large uncertainty involved in the prediction of human error probabilities make the task difficult. This paper presents a Bayesian belief network (BBN) model for human error probability estimation in safety critical functions of a nuclear power plant. The developed model using BBN would help to estimate HEP with limited human intervention. A step-by-step illustration of the application of the method and subsequent evaluation is provided with a relevant case study and the model is expected to provide useful insights into risk assessment studies

  19. Use of probabilistic methods for estimating failure probabilities and directing ISI-efforts

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, F; Brickstad, B [University of Uppsala (Sweden)]

    1988-12-31

    Some general aspects of the role of Non-Destructive Testing (NDT) efforts on the resulting probability of core damage are discussed. A simple model for the estimation of the pipe break probability due to IGSCC is discussed. It is partly based on analytical procedures, partly on service experience from the Swedish BWR program. Estimates of the break probabilities indicate that further studies are urgently needed. It is found that the uncertainties about the initial crack configuration are large contributors to the total uncertainty. Some effects of the in-service inspection are studied and it is found that the detection probabilities influence the failure probabilities. (authors).

  20. Interview with Danny Kaplan

    Science.gov (United States)

    Rossman, Allan; Kaplan, Danny

    2017-01-01

    Danny Kaplan is DeWitt Wallace Professor of Mathematics and Computer Science at Macalester College. He received Macalester's Excellence in Teaching Award in 2006 and the CAUSE/USCOTS Lifetime Achievement Award in 2017. This interview took place via email on March 4-June 17, 2017. Topics covered in the interview include: (1) the current state of…

  1. Retinitis pigmentosa reduces the risk of proliferative diabetic retinopathy: a nationwide population-based cohort study.

    Directory of Open Access Journals (Sweden)

    Yuh-Fang Chen

    PURPOSE: To study the association between retinitis pigmentosa (RP) and the progression of diabetic retinopathy (DR). METHODS: Using the Longitudinal Health Insurance Database 2000 of Taiwan, we identified individuals with an initial diagnosis of RP during the period 1997-2008. A non-RP comparison group, 10-fold frequency matched by sex, age, index year and the year of diabetes diagnosis, was randomly selected from the same database. The occurrence of DR was observed for all subjects until the end of 2009. Kaplan-Meier curves were used to illustrate the cumulative probability of developing DR for the RP and comparison groups. The hazard ratio (HR) of DR for the RP group relative to the comparison group was estimated using a Cox proportional hazards model after adjusting for potential confounders. RESULTS: The Kaplan-Meier curves were not statistically significantly different between the RP group and the comparison group. However, the RP group had a higher cumulative probability of developing DR during the first six to seven years. The cumulative probability kept increasing and became higher in the comparison group but remained unchanged in the RP group. The HR for the RP patients compared with the comparison group was 0.96 (95% confidence interval (CI) = 0.43-2.14). Stratified by severity, RP was associated with a statistically non-significant reduced risk of proliferative DR (PDR) (HR = 0.70, 95% CI = 0.16-3.14). The HR for non-proliferative DR (NPDR) was 1.08 (95% CI = 0.40-2.86). CONCLUSION: In this study, RP was not statistically significantly associated with the incidence of DR.

  2. A procedure for estimation of pipe break probabilities due to IGSCC

    International Nuclear Information System (INIS)

    Bergman, M.; Brickstad, B.; Nilsson, F.

    1998-06-01

    A procedure has been developed for estimation of the failure probability of welds joints in nuclear piping susceptible to intergranular stress corrosion cracking. The procedure aims at a robust and rapid estimate of the failure probability for a specific weld with known stress state. Random properties are taken into account of the crack initiation rate, the initial crack length, the in-service inspection efficiency and the leak rate. A computer realization of the procedure has been developed for user friendly applications by design engineers. Some examples are considered to investigate the sensitivity of the failure probability to different input quantities. (au)

  3. Estimating the Probability of Wind Ramping Events: A Data-driven Approach

    OpenAIRE

    Wang, Cheng; Wei, Wei; Wang, Jianhui; Qiu, Feng

    2016-01-01

    This letter proposes a data-driven method for estimating the probability of wind ramping events without exploiting the exact probability distribution function (PDF) of wind power. Actual wind data validates the proposed method.
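
    A distribution-free estimate of this kind can be as simple as an empirical frequency over sliding windows of the wind-power record. A hypothetical sketch (the window and threshold parameters are invented, not taken from the letter):

```python
def ramp_probability(power, window, threshold):
    """Data-driven estimate: fraction of sliding windows in which wind power
    changes by more than `threshold`, with no PDF of wind power assumed."""
    n_windows = len(power) - window
    ramps = sum(1 for i in range(n_windows)
                if abs(power[i + window] - power[i]) > threshold)
    return ramps / n_windows
```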

  4. Dynamic Model of Kaplan Turbine Regulating System Suitable for Power System Analysis

    Directory of Open Access Journals (Sweden)

    Jie Zhao

    2015-01-01

    Full Text Available Accurate modeling of Kaplan turbine regulating system is of great significance for grid security and stability analysis. In this paper, Kaplan turbine regulating system model is divided into the governor system model, the blade control system model, and the turbine and water diversion system model. The Kaplan turbine has its particularity, and the on-cam relationship between the wicket gate opening and the runner blade angle under a certain water head on the whole range was obtained by high-order curve fitting method. Progressively the linearized Kaplan turbine model, improved ideal Kaplan turbine model, and nonlinear Kaplan turbine model were developed. The nonlinear Kaplan turbine model considered the correction function of the blade angle on the turbine power, thereby improving the model simulation accuracy. The model parameters were calculated or obtained by the improved particle swarm optimization (IPSO algorithm. For the blade control system model, the default blade servomotor time constant given by value of one simplified the modeling and experimental work. Further studies combined with measured test data verified the established model accuracy and laid a foundation for further research into the influence of Kaplan turbine connecting to the grid.

  5. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared

  6. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

    The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods in considering this type of uncertainty are usually computationally intensive and not practical to solve the problem for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, parallel and any other user defined structures. A case study illustrates an application in test case prioritization
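
    The "equivalent failures and tests" idea can be illustrated on the smallest masked case: a two-component series system with one test category exercising component 1 alone and another exercising both components, with the cause of a category-B failure masked. This is a hedged toy reconstruction under those assumptions, not the paper's algorithm or data:

```python
def em_masked_series(n_a, f_a, n_b, f_b, iters=200):
    """EM estimate of component failure probabilities (p1, p2).
    Category A: n_a tests of component 1 alone, f_a failures.
    Category B: n_b tests of the series system, f_b masked failures."""
    p1, p2 = 0.5, 0.5
    for _ in range(iters):
        # E-step: expected component failures among the masked B failures
        p_sys = 1.0 - (1.0 - p1) * (1.0 - p2)    # P(a B test fails)
        e1 = f_a + f_b * p1 / p_sys              # equivalent failures, component 1
        e2 = f_b * p2 / p_sys                    # equivalent failures, component 2
        # M-step: equivalent failures divided by equivalent tests
        p1 = e1 / (n_a + n_b)
        p2 = e2 / n_b
    return p1, p2
```

    With 100 failures in 1000 category-A tests and 190 in 1000 category-B tests, the iteration settles at p1 = p2 = 0.1, the point where the expected attribution of masked failures reproduces the observed counts.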

  7. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
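
    With a discrete risk score, comparable pairs with tied risks are conventionally counted as one half. A small illustrative implementation of that pairwise count (a sketch only; the authors' estimators additionally apply inverse-probability-of-censoring weights):

```python
def concordance_probability(times, events, risks):
    """Pairwise concordance: among comparable pairs (the earlier time is an
    observed event), count risk_i > risk_j as concordant, tied risks as 1/2."""
    num, den = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # pair is comparable only if subject i has the event first
            if i != j and events[i] == 1 and times[i] < times[j]:
                den += 1
                if risks[i] > risks[j]:
                    num += 1.0
                elif risks[i] == risks[j]:
                    num += 0.5       # discrete scores produce ties
    return num / den
```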

  8. Multifractals embedded in short time series: An unbiased estimation of probability moment

    Science.gov (United States)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it provides an unbiased estimate of probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools displays significant performance advantages over the other methods. The factorial-moment-based estimation can evaluate scaling behaviors correctly in a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimate of the partition function given by the wavelet transform modulus maxima shows unacceptable fluctuations. Beyond the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order has various other uses, such as detecting nonextensive behaviors of a complex system and reconstructing the causality network between elements of a complex system.

  9. Treatment of primary tracheal carcinoma. The role of external and endoluminal radiotherapy

    International Nuclear Information System (INIS)

    Harms, W.; Wannenmacher, M.; Becker, H.; Herth, F.; Gagel, B.

    2000-01-01

    Background and Purpose: In a retrospective study the role of radiation therapy for the treatment of primary tracheal carcinoma was investigated. Patients and Methods: Between 1984 and 1997, 25 patients with primary tracheal carcinoma were treated with external beam radiotherapy (17 squamous-cell carcinoma [SCC], 8 adenoid cystic carcinoma [ACC], median dose SCC 60 Gy, ACC 55 Gy). An additional brachytherapy boost was carried out in 10/25 patients (median dose SCC 18 Gy, ACC 15 Gy). Ten patients underwent operative treatment. Results: The median survival (Kaplan-Meier) for patients with SCC was 33 months (ACC 94.2). The 1-, 2- and 5-year survival rates (Kaplan-Meier) for patients with SCC were 64.7% (ACC 85.7%), 64.7% (ACC 85.7%), and 26% (ACC 85.7%). Patients with ACC and patients with a complete remission after treatment had a significantly better survival probability (log rank test, p [de

  10. First Passage Probability Estimation of Wind Turbines by Markov Chain Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2013-01-01

    Markov Chain Monte Carlo simulation has received considerable attention within the past decade as reportedly one of the most powerful techniques for the first passage probability estimation of dynamic systems. A very popular method in this direction capable of estimating probability of rare events...... of the method by modifying the conditional sampler. In this paper, applicability of the original SS is compared to the recently introduced modifications of the method on a wind turbine model. The model incorporates a PID pitch controller which aims at keeping the rotational speed of the wind turbine rotor equal...... to its nominal value. Finally Monte Carlo simulations are performed which allow assessment of the accuracy of the first passage probability estimation by the SS methods....

  11. Estimation and asymptotic theory for transition probabilities in Markov Renewal Multi–state models

    NARCIS (Netherlands)

    Spitoni, C.; Verduijn, M.; Putter, H.

    2012-01-01

    In this paper we discuss estimation of transition probabilities for semi–Markov multi–state models. Non–parametric and semi–parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional

  12. Estimating Bird / Aircraft Collision Probabilities and Risk Utilizing Spatial Poisson Processes

    Science.gov (United States)

    2012-06-10

    Estimating Bird/Aircraft Collision Probabilities and Risk Utilizing Spatial Poisson Processes. Graduate research paper presented to the Faculty, Department of Operational Sciences. Brady J. Vaira, BS, MS, Major, USAF.

  13. Meier-Gorlin syndrome Clinical genetics and genomics

    OpenAIRE

    Munnik, Sonja; Hoefsloot, Lies; Roukema, Jolt; Schoots, Jeroen; Knoers, Nine; Brunner, H.G.; Jackson, Andrew; Bongers, Ernie

    2015-01-01

    Meier-Gorlin syndrome (MGS) is a rare autosomal recessive primordial dwarfism disorder, characterized by microtia, patellar aplasia/hypoplasia, and proportionate short stature. Associated clinical features encompass feeding problems, congenital pulmonary emphysema, mammary hypoplasia in females, and urogenital anomalies, such as cryptorchidism and hypoplastic labia minora and majora. Typical facial characteristics during childhood comprise a small mouth with full lips and micro-...

  14. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  15. Mixture models for undiagnosed prevalent disease and interval-censored incident disease: applications to a cohort assembled from electronic health records.

    Science.gov (United States)

    Cheung, Li C; Pan, Qing; Hyun, Noorie; Schiffman, Mark; Fetterman, Barbara; Castle, Philip E; Lorey, Thomas; Katki, Hormuzd A

    2017-09-30

    For cost-effectiveness and efficiency, many large-scale general-purpose cohort studies are being assembled within large health-care providers who use electronic health records. Two key features of such data are that incident disease is interval-censored between irregular visits and that there can be pre-existing (prevalent) disease. Because prevalent disease is not always immediately diagnosed, some disease diagnosed at later visits is actually undiagnosed prevalent disease. We consider prevalent disease as a point mass at time zero for clinical applications where there is no interest in the time of prevalent disease onset. We demonstrate that the naive Kaplan-Meier cumulative risk estimator underestimates risks at early time points and overestimates later risks. We propose a general family of mixture models for undiagnosed prevalent disease and interval-censored incident disease that we call prevalence-incidence models. Parameters for parametric prevalence-incidence models, such as the logistic regression and Weibull survival (logistic-Weibull) model, are estimated by direct likelihood maximization or by an EM algorithm. Non-parametric methods are proposed to calculate cumulative risks for cases without covariates. We compare naive Kaplan-Meier, logistic-Weibull, and non-parametric estimates of cumulative risk in the cervical cancer screening program at Kaiser Permanente Northern California. Kaplan-Meier provided poor estimates, while the logistic-Weibull model was a close fit to the non-parametric estimates. Our findings support our use of logistic-Weibull models to develop the risk estimates that underlie current US risk-based cervical cancer screening guidelines. Published 2017. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
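
    The point-mass-at-zero idea has a simple closed form: cumulative risk is the prevalence plus the incidence component scaled by the non-prevalent fraction. A sketch with invented parameter values (the actual logistic-Weibull model also regresses the prevalence probability on covariates):

```python
import math

def cumulative_risk(t, prev, shape, scale):
    """Prevalence-incidence mixture: point mass `prev` at t = 0 for undiagnosed
    prevalent disease, plus a Weibull component for interval-censored incidence."""
    if t < 0:
        return 0.0
    weibull_cdf = 1.0 - math.exp(-((t / scale) ** shape))
    return prev + (1.0 - prev) * weibull_cdf
```

    At t = 0 the risk already equals the prevalence, which is exactly why a naive Kaplan-Meier curve starting at zero underestimates early risk.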

  16. The Short-Term and Intermediate-Term Risk of Second Neoplasms After Diagnosis and Treatment of Unilateral Vestibular Schwannoma: Analysis of 9460 Cases

    International Nuclear Information System (INIS)

    Carlson, Matthew L.; Glasgow, Amy E.; Jacob, Jeffrey T.; Habermann, Elizabeth B.; Link, Michael J.

    2016-01-01

    Purpose: To determine the incidence of second intracranial neoplasms after the diagnosis and treatment of sporadic vestibular schwannoma (VS). Methods and Materials: Analysis of the Surveillance, Epidemiology, and End Results (SEER) database including all patients identified with a diagnosis of VS and a second intracranial tumor. The Kaplan-Meier method was used to determine the incidence of second tumors while allowing for censoring at loss to follow-up or death. Multivariable associations between treatment modality and second tumor formation were explored using Cox proportional hazards regression analysis. Two illustrative cases are also presented. Results: In all, 9460 patients with unilateral VS were identified between 2004 and 2012. Overall, 66 (0.7%) patients experienced a separate intracranial tumor, benign or malignant, after treatment of VS. Kaplan-Meier estimates for time to second neoplasm at 1, 3, and 5 years were 0.3%, 0.7%, and 0.8%, respectively. Multivariable comparison between VS treatment modalities revealed that the risk of second tumor formation was similar between radiation and surgery (hazard ratio [HR] 0.74; 95% confidence interval [CI] 0.36-1.51; P=.93) but greater for tumors managed with observation alone compared with radiation (HR 2.48; 95% CI 1.31-4.71; P<.01). A total of 6 (0.06%) intracranial malignancies were diagnosed after VS treatment. Kaplan-Meier estimates for time to malignancy at 1, 3, and 5 years were 0%, 0.1%, and 0.1%, respectively. After adjustment for age at diagnosis, sex, and treatment modality, the probability of malignancy after radiation was not greater than after observation alone or microsurgery (HR 4.88; 95% CI 0.85-28.14; P=.08) during the study period. Conclusions: The risk for the development of a second intracranial neoplasm, benign or malignant, at 5 years after treatment of unilateral VS is approximately 0.8%, whereas the risk of acquiring a separate malignancy is 0.1%, or approximately 1 per 1000 cases

  18. Human error probability estimation using licensee event reports

    International Nuclear Information System (INIS)

    Voska, K.J.; O'Brien, J.N.

    1984-07-01

    The objective of this report is to present a method for using field data from nuclear power plants to estimate human error probabilities (HEPs). These HEPs are then used in probabilistic risk assessment activities. This method of estimating HEPs is one of four being pursued in NRC-sponsored research; the other three are structured expert judgment, analysis of training simulator data, and performance modeling. The type of field data analyzed in this report is from Licensee Event Reports (LERs), which are analyzed using a method specifically developed for that purpose. However, any type of field data on human errors could be analyzed using this method with minor adjustments. This report assesses the practicality, acceptability, and usefulness of estimating HEPs from LERs and comprehensively presents the method for use.

  19. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature with an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function, and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and thus can easily be adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared under steady conditions against a fuel balance and a lambda sensor, and differences below 1% were found.
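
    The propagation idea can be sketched by Monte Carlo: add Gaussian noise to the temperature estimate and count how often an Arrhenius-like autoignition rate crosses a critical value. All constants below (B, the critical rate, the noise level) are invented for illustration, not engine data from the paper:

```python
import math
import random

def knock_probability(t_mean, t_sd=20.0, n=20000, seed=1):
    """Monte Carlo knock probability: k = exp(-B/T) with Gaussian noise on the
    in-cylinder temperature estimate; knock is declared when k exceeds k_crit."""
    B = 5000.0                         # illustrative activation-temperature constant (K)
    k_crit = math.exp(-B / 750.0)      # chosen so T = 750 K gives p close to 0.5
    rng = random.Random(seed)
    knocks = sum(1 for _ in range(n)
                 if math.exp(-B / (t_mean + rng.gauss(0.0, t_sd))) > k_crit)
    return knocks / n
```

    Because the rate is monotone in temperature, the knock probability rises smoothly with the mean temperature, mirroring the paper's point that temperature uncertainty alone can reproduce the random appearance of knock.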

  20. Estimation of probability of failure for damage-tolerant aerospace structures

    Science.gov (United States)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This

  1. Survival analysis II: Cox regression

    NARCIS (Netherlands)

    Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.

    2011-01-01

    In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the

  2. Is there racial/ethnic variance in cervical cancer- specific survival of ...

    African Journals Online (AJOL)

    incident cervical carcinoma, between 1992 and 1999, in the Surveillance Epidemiology and End Results (SEER) Data was linked with Medicare to examine the impact of race/ethnicity on overall and cancer-specific survival, using Kaplan Meier survival estimates and multivariable Cox Regression model. Results: There was ...

  3. Water hammer 2 phase analysis hydraulic system with a Kaplan turbine

    OpenAIRE

    Dudlik, A.; Koutnik, J.

    2009-01-01

    This investigation has been carried out for a case of sudden closing of a Kaplan turbine from runaway operation. This work has been done at Fraunhofer UMSICHT, supported by VH. The runaway case has been selected as it is known that the discharge through a Kaplan turbine increases with its speed, and may reach up to twice the value of nominal discharge. The simulation model consists of: a penstock; a Kaplan turbine (modelled with a valve characteristic); a draft tube. All hydraulic pipe element...

  4. Computer Aided Design of Kaplan Turbine Piston with SolidWorks

    OpenAIRE

    Camelia Jianu

    2010-01-01

    The paper presents the steps for the 3D computer-aided design (CAD) of a Kaplan turbine piston in SolidWorks. The present paper is a tutorial for the 3D geometry of a Kaplan turbine piston, dedicated to Parts Sketch and Parts Features design and to Drawing Geometry and Drawing Annotation.

  5. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    Science.gov (United States)

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
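
    For the Kaplan-Meier estimator, the variance formulation mentioned here is conventionally Greenwood's formula. A toy sketch over mission distances (invented data; simple normal-approximation limits, which may differ from the transformation the authors actually use):

```python
import math

def km_with_greenwood(distances, failures, z=1.96):
    """Kaplan-Meier survival as a function of mission distance, with Greenwood's
    variance and normal-approximation confidence limits.
    Returns [(distance, S, lower, upper)] at distances where a loss occurred."""
    data = sorted(zip(distances, failures))
    surv, gw, out, i = 1.0, 0.0, [], 0
    while i < len(data):
        t = data[i][0]
        n = sum(1 for tt, _ in data if tt >= t)                 # missions still at risk
        d = sum(1 for tt, ee in data if tt == t and ee == 1)    # losses at distance t
        if d:
            surv *= 1.0 - d / n
            gw += d / (n * (n - d)) if n > d else 0.0           # Greenwood sum
            se = surv * math.sqrt(gw)                           # Greenwood standard error
            out.append((t, surv, max(surv - z * se, 0.0), min(surv + z * se, 1.0)))
        while i < len(data) and data[i][0] == t:                # skip ties at t
            i += 1
    return out
```

    A mission aborted without a fault enters as a censored distance, shrinking the risk set without adding a survival factor, just as in the probability-of-survival-with-distance model the abstract describes.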

  6. A framework to estimate probability of diagnosis error in NPP advanced MCR

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Kim, Jong Hyun; Jang, Inseok; Seong, Poong Hyun

    2018-01-01

    Highlights: •As new type of MCR has been installed in NPPs, the work environment is considerably changed. •A new framework to estimate operators’ diagnosis error probabilities should be proposed. •Diagnosis error data were extracted from the full-scope simulator of the advanced MCR. •Using Bayesian inference, a TRC model was updated for use in advanced MCR. -- Abstract: Recently, a new type of main control room (MCR) has been adopted in nuclear power plants (NPPs). The new MCR, known as the advanced MCR, consists of digitalized human-system interfaces (HSIs), computer-based procedures (CPS), and soft controls while the conventional MCR includes many alarm tiles, analog indicators, hard-wired control devices, and paper-based procedures. These changes significantly affect the generic activities of the MCR operators, in relation to diagnostic activities. The aim of this paper is to suggest a framework to estimate the probabilities of diagnosis errors in the advanced MCR by updating a time reliability correlation (TRC) model. Using Bayesian inference, the TRC model was updated with the probabilities of diagnosis errors. Here, the diagnosis error data were collected from a full-scope simulator of the advanced MCR. To do this, diagnosis errors were determined based on an information processing model and their probabilities were calculated. However, these calculated probabilities of diagnosis errors were largely affected by context factors such as procedures, HSI, training, and others, known as PSFs (Performance Shaping Factors). In order to obtain the nominal diagnosis error probabilities, the weightings of PSFs were also evaluated. Then, with the nominal diagnosis error probabilities, the TRC model was updated. This led to the proposal of a framework to estimate the nominal probabilities of diagnosis errors in the advanced MCR.

  7. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to solve the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm, subset simulation based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper, and the probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computational efficiency and excellent accuracy compared with traditional probability analysis methods. (authors)
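    The core idea — writing a small failure probability as a product of larger conditional probabilities — can be sketched with a toy subset-simulation implementation for a generic limit-state function of standard normal inputs. This is not the authors' AP1000 model: the adaptive intermediate levels at the (1 − p0) sample quantile, the Metropolis step size, and the sample size are generic textbook choices.

```python
import numpy as np

def subset_simulation(g, dim, threshold, p0=0.1, n=1000, seed=0):
    """Estimate P(g(X) >= threshold) for X ~ N(0, I_dim) by subset simulation.

    Intermediate failure levels are set adaptively at the (1 - p0) sample
    quantile of g; conditional samples come from a simple Metropolis step
    that stays inside the current intermediate failure region."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, dim))
    y = np.array([g(v) for v in x])
    prob = 1.0
    for _ in range(50):                          # cap on the number of levels
        level = np.quantile(y, 1.0 - p0)
        if level >= threshold:                   # final level reached
            return prob * np.mean(y >= threshold)
        prob *= p0                               # P(next level | current level)
        seeds = x[y >= level]
        steps = int(np.ceil(n / len(seeds)))
        xs, ys = [], []
        for s in seeds:                          # one Markov chain per seed
            cur, cur_y = s, g(s)
            for _ in range(steps):
                cand = cur + 0.8 * rng.standard_normal(dim)
                # Metropolis accept/reject w.r.t. the standard normal density
                if rng.random() < min(1.0, np.exp(0.5 * (cur @ cur - cand @ cand))):
                    cand_y = g(cand)
                    if cand_y >= level:          # reject moves leaving the region
                        cur, cur_y = cand, cand_y
                xs.append(cur)
                ys.append(cur_y)
        x, y = np.array(xs), np.array(ys)
    return prob * np.mean(y >= threshold)
```

    For g(x) = x₁ and threshold 3, the true probability is 1 − Φ(3) ≈ 1.35e-3; the estimator reaches it with roughly 3000 model evaluations instead of the ~10⁶ a crude Monte Carlo estimate would need for comparable resolution.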

  8. Estimation of failure probabilities of linear dynamic systems by ...

    Indian Academy of Sciences (India)

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold.

  9. Aplicación y técnicas del análisis de supervivencia en las investigaciones clínicas Application and techniques of survival analysis in clinical research

    Directory of Open Access Journals (Sweden)

    Anissa Gramatges Ortiz

    2002-08-01

    Full Text Available An update on survival analysis in clinical research is presented, covering some of the most general concepts of this type of analysis and the characteristics of survival times. Methods that facilitate the estimation of survival probabilities for one or more groups of individuals are addressed, with a worked example of the calculation of probabilities by the Kaplan-Meier method. The comparison of the survival of several groups according to the factors that differentiate them is highlighted, and some of the statistical tests that make such comparisons possible are stated, such as the log-rank test and the Breslow test, the latter serving as an alternative when a departure from proportional hazards is evident, that is, when the survival curves cross.
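    The log-rank test mentioned in this record can be computed from scratch in a few lines. The sketch below implements the standard statistic (hypergeometric variance, ties allowed); the toy data in the test are invented, and SciPy is used only for the chi-squared p-value.

```python
import numpy as np
from scipy.stats import chi2

def logrank(time_a, event_a, time_b, event_b):
    """Two-sample log-rank test; returns the chi-squared statistic (1 df)
    and its p-value. Event indicators are 1 for death, 0 for censoring."""
    time_a, event_a = np.asarray(time_a), np.asarray(event_a)
    time_b, event_b = np.asarray(time_b), np.asarray(event_b)
    all_times = np.unique(np.concatenate([time_a[event_a == 1],
                                          time_b[event_b == 1]]))
    obs_a = exp_a = var = 0.0
    for t in all_times:
        n_a, n_b = np.sum(time_a >= t), np.sum(time_b >= t)   # at risk at t-
        d_a = np.sum((time_a == t) & (event_a == 1))          # deaths at t
        d_b = np.sum((time_b == t) & (event_b == 1))
        n, d = n_a + n_b, d_a + d_b
        if n < 2:
            continue
        obs_a += d_a
        exp_a += d * n_a / n                  # expected deaths in A under H0
        var += d * (n_a / n) * (n_b / n) * (n - d) / (n - 1)  # hypergeometric
    stat = (obs_a - exp_a) ** 2 / var
    return stat, chi2.sf(stat, df=1)
```

    The statistic is symmetric in the two groups, since the observed-minus-expected deaths in one group are the negative of the other's.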

  10. Analyzing survival curves at a fixed point in time for paired and clustered right-censored data

    Science.gov (United States)

    Su, Pei-Fang; Chi, Yunchan; Lee, Chun-Yi; Shyr, Yu; Liao, Yi-De

    2018-01-01

    In clinical trials, information about certain time points may be of interest in making decisions about treatment effectiveness. Rather than comparing entire survival curves, researchers can focus on the comparison at fixed time points that may have a clinical utility for patients. For two independent samples of right-censored data, Klein et al. (2007) compared survival probabilities at a fixed time point by studying a number of tests based on some transformations of the Kaplan-Meier estimators of the survival function. However, to compare the survival probabilities at a fixed time point for paired right-censored data or clustered right-censored data, their approach would need to be modified. In this paper, we extend the statistics to accommodate the possible within-paired correlation and within-clustered correlation, respectively. We use simulation studies to present comparative results. Finally, we illustrate the implementation of these methods using two real data sets. PMID:29456280
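    For the independent-sample case this record builds on, one of the transformations studied by Klein et al. (2007) is the cloglog (log(−log)) transform of the Kaplan-Meier estimate at the fixed time point. The sketch below takes the two survival estimates and their Greenwood variances as given; the numbers in the example are hypothetical, and the paired/clustered extensions of this record are not shown.

```python
import math
from scipy.stats import norm

def fixed_time_test(s1, var1, s2, var2):
    """Compare two Kaplan-Meier survival estimates at one fixed time point.

    s1, s2 are the KM estimates at that time; var1, var2 their Greenwood
    variances. Uses the cloglog transform, one of the options studied by
    Klein et al. (2007) for two independent samples."""
    g1 = var1 / (s1 * math.log(s1)) ** 2      # delta-method variance of
    g2 = var2 / (s2 * math.log(s2)) ** 2      # log(-log S)
    z = (math.log(-math.log(s1)) - math.log(-math.log(s2))) / math.sqrt(g1 + g2)
    return z, 2.0 * norm.sf(abs(z))

# Hypothetical five-year survival: 70% vs 55%, with Greenwood variances.
z, p = fixed_time_test(0.70, 0.002, 0.55, 0.003)
```

    The transform keeps the implied confidence interval for S(t) inside (0, 1), which is why it tends to behave better than the untransformed difference in small samples.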

  11. Estimation of the probability of success in petroleum exploration

    Science.gov (United States)

    Davis, J.C.

    1977-01-01

    A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum
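    The first probabilistic component above — the probability of success conditional upon geology, estimated from a training area's drilling history — can be sketched as a simple empirical frequency. The well data below are invented for illustration, and the second component (perception uncertainty via distance functions or semivariograms) is omitted.

```python
def p_oil_given_condition(wells):
    """Empirical P(discovery | geologic condition present) from a training
    area, where `wells` is a list of (condition_present, discovery) pairs."""
    hits = sum(1 for cond, disc in wells if cond and disc)
    trials = sum(1 for cond, _ in wells if cond)
    return hits / trials

# Hypothetical training-area history: (structural closure seen?, oil found?)
history = [(True, True), (True, False), (True, True), (False, False),
           (True, False), (False, True), (True, True), (False, False)]
p = p_oil_given_condition(history)   # 3 discoveries out of 5 closures
```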

  12. ASURV: Astronomical SURVival Statistics

    Science.gov (United States)

    Feigelson, E. D.; Nelson, P. I.; Isobe, T.; LaValley, M.

    2014-06-01

    ASURV (Astronomical SURVival Statistics) provides astronomy survival analysis for right- and left-censored data including the maximum-likelihood Kaplan-Meier estimator and several univariate two-sample tests, bivariate correlation measures, and linear regressions. ASURV is written in FORTRAN 77, and is stand-alone and does not call any specialized libraries.
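    ASURV implements the Kaplan-Meier estimator in FORTRAN 77; for readers who want to see the product-limit computation itself, here is a minimal Python sketch (not ASURV's code) for right-censored data.

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit (Kaplan-Meier) estimate of the survival function.

    Returns (t, S(t)) pairs at each distinct event time; `event` is 1 for
    an observed event and 0 for a right-censored observation."""
    time, event = np.asarray(time), np.asarray(event)
    surv, s = [], 1.0
    for t in np.unique(time[event == 1]):        # distinct event times
        n = np.sum(time >= t)                    # number at risk just before t
        d = np.sum((time == t) & (event == 1))   # events at t
        s *= 1.0 - d / n                         # product-limit step
        surv.append((t, s))
    return surv

# Times 1, 3+, 4, 5+, 6 ("+" marks censoring): S drops only at 1, 4 and 6.
curve = kaplan_meier([1, 3, 4, 5, 6], [1, 0, 1, 0, 1])
```

    Censored observations never trigger a step but do shrink the risk set, which is exactly how the estimator uses their partial information.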

  13. First line chemotherapy plus trastuzumab in metastatic breast cancer ...

    African Journals Online (AJOL)

    First line chemotherapy plus trastuzumab in metastatic breast cancer HER2 positive - Observational institutional study. ... The progression free survival was estimated by the Kaplan-Meier method, from the date of first cycle to the date of progression or at the last consultation, and the median was 12.8 months. Trastuzumab ...

  14. Influence of antiviral therapy on survival of patients with hepatitis B ...

    African Journals Online (AJOL)

    The mortality rates in two groups were evaluated with Kaplan-Meier estimate. ... 274 (76.9 %) died, with 89 patients belonging to the antiviral group while the ... TACE is different from systemic ... and identification of study participants was not ..... Table 3: Cox regression analysis to determine variables associated with overall ...

  15. Interdisciplinary interventional therapy for tracheobronchial stenosis with modern metal net stents; Interdisziplinaere interventionelle Therapie tracheobronchialer Stenosen mit modernen Metallmaschenstents

    Energy Technology Data Exchange (ETDEWEB)

    Rieger, J.; Linsenmaier, U.; Fedorowski, A.; Pfeifer, K.J. [Ludwig-Maximilians-Universitaet Muenchen (Germany). Inst. fuer Klinische Radiologie; Hautmann, H.; Huber, R.M. [Klinikum Innenstadt der Ludwig-Maximilians-Universitaet Muenchen (Germany). Abteilung Pulmonologie, Medizinische Klinik

    2002-08-01

    Study objectives: Assessment of the therapeutic potential of tracheobronchial stenting for obstructive tracheobronchial disease, in-vivo comparison of different stent types, and development of helpful criteria for choosing the suitable stent type. Material and Methods: Prospective case analysis. Between 1993 and 1999, 53 stents were implanted into the tracheobronchial system of 39 consecutive patients with benign or malignant airway obstruction. Every single stent (26 Strecker stents, 18 Wallstents, 6 Accuflex Nitinol stents, 1 Dumon, 1 Ruesch and 1 Palmaz stent) was recorded in a unified database. Analysis comprised clinical effectiveness, lung function where possible, relevant complications and radiologic follow-up parameters. The probability of the stents remaining within the tracheobronchial system, undislocated and uncompressed, was calculated using Kaplan-Meier analysis for three stent types. Results: Stent placement proved to be an effective treatment in 86% of the patients. Resistance could be normalized in 9/9 patients. Kaplan-Meier analysis clearly revealed a higher probability for the Wallstent and Nitinol stent to remain within the tracheobronchial system and to remain uncompressed. Dislocation also occurred more rarely. Explantation of the Wallstent, however, when desired, was much more difficult compared to the Strecker stent. The Wallstent also occasionally led to the formation of granulation tissue, especially at the proximal stent end, and as such required reintervention. (orig.)

  16. Estimated probability of the number of buildings damaged by the ...

    African Journals Online (AJOL)

    The analysis shows that the probability estimator of the building damage ... and homeowners) should reserve the cost of repair at least worth the risk of loss, to face ... Keywords: Citarum River; logistic regression; genetic algorithm; losses risk; ...

  17. Estimating migratory connectivity of birds when re-encounter probabilities are heterogeneous

    Science.gov (United States)

    Cohen, Emily B.; Hostelter, Jeffrey A.; Royle, J. Andrew; Marra, Peter P.

    2014-01-01

    Understanding the biology and conducting effective conservation of migratory species requires an understanding of migratory connectivity – the geographic linkages of populations between stages of the annual cycle. Unfortunately, for most species, we are lacking such information. The North American Bird Banding Laboratory (BBL) houses an extensive database of marking, recaptures and recoveries, and such data could provide migratory connectivity information for many species. To date, however, few species have been analyzed for migratory connectivity largely because heterogeneous re-encounter probabilities make interpretation problematic. We accounted for regional variation in re-encounter probabilities by borrowing information across species and by using effort covariates on recapture and recovery probabilities in a multistate capture–recapture and recovery model. The effort covariates were derived from recaptures and recoveries of species within the same regions. We estimated the migratory connectivity for three tern species breeding in North America and over-wintering in the tropics, common (Sterna hirundo), roseate (Sterna dougallii), and Caspian terns (Hydroprogne caspia). For western breeding terns, model-derived estimates of migratory connectivity differed considerably from those derived directly from the proportions of re-encounters. Conversely, for eastern breeding terns, estimates were merely refined by the inclusion of re-encounter probabilities. In general, eastern breeding terns were strongly connected to eastern South America, and western breeding terns were strongly linked to the more western parts of the nonbreeding range under both models. Through simulation, we found this approach is likely useful for many species in the BBL database, although precision improved with higher re-encounter probabilities and stronger migratory connectivity. We describe an approach to deal with the inherent biases in BBL banding and re-encounter data to demonstrate

  18. COMPARATIVE ANALYSIS OF ESTIMATION METHODS OF PHARMACY ORGANIZATION BANKRUPTCY PROBABILITY

    Directory of Open Access Journals (Sweden)

    V. L. Adzhienko

    2014-01-01

    Full Text Available The purpose of this study was to determine the probability of bankruptcy by various methods in order to predict a financial crisis in a pharmacy organization. The probability of pharmacy organization bankruptcy was estimated using W. Beaver's method as adopted in the Russian Federation, together with an integrated assessment of financial stability based on scoring analysis. The results obtained by the different methods are comparable and show that the risk of bankruptcy of the pharmacy organization is small.

  19. Dynamic Model of Kaplan Turbine Regulating System Suitable for Power System Analysis

    OpenAIRE

    Zhao, Jie; Wang, Li; Liu, Dichen; Wang, Jun; Zhao, Yu; Liu, Tian; Wang, Haoyu

    2015-01-01

    Accurate modeling of the Kaplan turbine regulating system is of great significance for grid security and stability analysis. In this paper, the Kaplan turbine regulating system model is divided into the governor system model, the blade control system model, and the turbine and water diversion system model. The Kaplan turbine has its particularities, and the on-cam relationship between the wicket gate opening and the runner blade angle under a certain water head over the whole range was obtained by high-o...

  20. Targeting Peripheral-Derived Regulatory T Cells as a Means of Enhancing Immune Responses Directed against Prostate Cancer

    Science.gov (United States)

    2017-08-01

    28 weeks) to complete this study. Furthermore, using Kaplan-Meier survival curves, we have discovered that TRAMP; Lck-cre; Klf2fl/fl mice do... [Figure residue: FoxP3 staining of thymus and spleen; Figure 2, Kaplan-Meier survival curve, TRAMP (black) versus TRAMP; Lck-cre; Klf2fl/fl (red).]

  1. RBPJ and EphrinB2 as Molecular Targets to Treat Brain Arteriovenous Malformation in Notch4 Induced Mouse Model

    Science.gov (United States)

    2017-10-01

    of time for mutant mice to moribundity. We recorded numbers of subjects at risk at 0, 25, 50, 75, and 100 days old and used these numbers to generate a Kaplan-Meier curve. Kaplan-Meier analysis showed that time to moribundity doubled in Notch4iGOF-EC;RbpjiΔEC mice, as compared to

  2. Modelos estimados de análisis de supervivencia para el tiempo de permanencia de los estudiantes de la Universidad Francisco de Paula Santander

    OpenAIRE

    Mawency Vergel Ortega; José Joaquín Martínez Lozano; Eduardo Ibargüen Mondragón

    2016-01-01

    The article shows factors associated with college desertion. The survival analysis technique allowed a study to be performed with students from different programs at the Francisco de Paula Santander University, considering the events: semester, abandonment, punishment, and punishment-abandonment. Using the Kaplan-Meier estimator [1], the survival function for each event of interest was estimated, along with desertion models whose variables were significant at 10% using the semi-parametric met...

  3. Two-step estimation in ratio-of-mediator-probability weighted causal mediation analysis.

    Science.gov (United States)

    Bein, Edward; Deutsch, Jonah; Hong, Guanglei; Porter, Kristin E; Qin, Xu; Yang, Cheng

    2018-04-15

    This study investigates appropriate estimation of estimator variability in the context of causal mediation analysis that employs propensity score-based weighting. Such an analysis decomposes the total effect of a treatment on the outcome into an indirect effect transmitted through a focal mediator and a direct effect bypassing the mediator. Ratio-of-mediator-probability weighting estimates these causal effects by adjusting for the confounding impact of a large number of pretreatment covariates through propensity score-based weighting. In step 1, a propensity score model is estimated. In step 2, the causal effects of interest are estimated using weights derived from the prior step's regression coefficient estimates. Statistical inferences obtained from this 2-step estimation procedure are potentially problematic if the estimated standard errors of the causal effect estimates do not reflect the sampling uncertainty in the estimation of the weights. This study extends to ratio-of-mediator-probability weighting analysis a solution to the 2-step estimation problem by stacking the score functions from both steps. We derive the asymptotic variance-covariance matrix for the indirect effect and direct effect 2-step estimators, provide simulation results, and illustrate with an application study. Our simulation results indicate that the sampling uncertainty in the estimated weights should not be ignored. The standard error estimation using the stacking procedure offers a viable alternative to bootstrap standard error estimation. We discuss broad implications of this approach for causal analysis involving propensity score-based weighting. Copyright © 2018 John Wiley & Sons, Ltd.
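    The step-2 weight construction can be illustrated in a stylized form with a binary mediator and a single discrete covariate, using empirical frequencies in place of the fitted propensity-score model of step 1. The data and variable names below are hypothetical, and the stacked-score standard-error correction that is this record's contribution is not shown.

```python
import numpy as np

def rmpw_weights(T, M, X):
    """Ratio-of-mediator-probability weights for treated units: each treated
    unit with mediator value m and covariate value x gets the weight
    P(M=m | T=0, X=x) / P(M=m | T=1, X=x); untreated units get weight 0."""
    T, M, X = map(np.asarray, (T, M, X))
    w = np.zeros(len(T))
    for x in np.unique(X):
        for m in (0, 1):
            p0 = np.mean(M[(T == 0) & (X == x)] == m)   # P(M=m | T=0, X=x)
            p1 = np.mean(M[(T == 1) & (X == x)] == m)   # P(M=m | T=1, X=x)
            w[(T == 1) & (X == x) & (M == m)] = p0 / p1
    return w
```

    Averaging the outcomes of treated units with these weights estimates the counterfactual E[Y(1, M(0))], the quantity that separates the indirect from the direct effect.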

  4. MNS16A minisatellite genotypes in relation to risk of glioma and meningioma and to glioblastoma outcome

    DEFF Research Database (Denmark)

    Andersson, U.; Osterman, P.; Sjostrom, S.

    2009-01-01

    was analysed using Kaplan-Meier estimates and equality of survival distributions using the log-rank test and Cox proportional hazard ratios. The MNS16A genotype was not associated with risk of occurrence of glioma, glioblastoma (GBM) or meningioma. For GBM there were median survivals of 15.3, 11.0 and 10...

  5. Interdisciplinary interventional therapy for tracheobronchial stenosis with modern metal net stents

    International Nuclear Information System (INIS)

    Rieger, J.; Linsenmaier, U.; Fedorowski, A.; Pfeifer, K.J.; Hautmann, H.; Huber, R.M.

    2002-01-01

    Study objectives: Assessment of the therapeutic potential of tracheobronchial stenting for obstructive tracheobronchial disease, in-vivo comparison of different stent types, and development of helpful criteria for choosing the suitable stent type. Material and Methods: Prospective case analysis. Between 1993 and 1999, 53 stents were implanted into the tracheobronchial system of 39 consecutive patients with benign or malignant airway obstruction. Every single stent (26 Strecker stents, 18 Wallstents, 6 Accuflex Nitinol stents, 1 Dumon, 1 Ruesch and 1 Palmaz stent) was recorded in a unified database. Analysis comprised clinical effectiveness, lung function where possible, relevant complications and radiologic follow-up parameters. The probability of the stents remaining within the tracheobronchial system, undislocated and uncompressed, was calculated using Kaplan-Meier analysis for three stent types. Results: Stent placement proved to be an effective treatment in 86% of the patients. Resistance could be normalized in 9/9 patients. Kaplan-Meier analysis clearly revealed a higher probability for the Wallstent and Nitinol stent to remain within the tracheobronchial system and to remain uncompressed. Dislocation also occurred more rarely. Explantation of the Wallstent, however, when desired, was much more difficult compared to the Strecker stent. The Wallstent also occasionally led to the formation of granulation tissue, especially at the proximal stent end, and as such required reintervention. (orig.)

  6. Computer Aided Design of Kaplan Turbine Piston with SolidWorks

    Directory of Open Access Journals (Sweden)

    Camelia Jianu

    2010-10-01

    Full Text Available The paper presents the steps for the 3D computer aided design (CAD) of a Kaplan turbine piston made in SolidWorks. The present paper is a tutorial for the 3D geometry of a Kaplan turbine piston, dedicated to the Parts Sketch and Parts Features design and to the Drawing Geometry and Drawing Annotation.

  7. Analysis of the Kaplan turbine draft tube effect

    International Nuclear Information System (INIS)

    Motycak, L; Skotak, A; Obrovsky, J

    2010-01-01

    The aim of this paper is to present information about possible problems and errors which can appear during numerical analyses of low head Kaplan turbines, with a focus on the runner-draft tube interaction. The setting of the numerical model, the grid size, the boundary conditions used, and the interface definition between the runner and draft tube are discussed. Data are available from physical model tests, which gives a great opportunity to compare CFD and experiment results and, on the basis of this comparison, to determine the approach to CFD flow modeling. The main purpose of the Kaplan turbine model measurement was to gather information about the real flow field. The model tests were carried out in the new hydraulic laboratory of CKD Blansko Engineering and were focused on detailed velocity measurements downstream of the runner by differential pressure probe and on velocity measurement downstream of the draft tube elbow by the Particle Image Velocimetry (PIV) method. The data from the CFD simulation were compared to the velocity measurement results. The paper also discusses the design of the original draft tube modification for flow improvement in the case of a Kaplan turbine uprating project. The results of the draft tube modification were confirmed by model tests in the hydraulic laboratory as well.

  8. Analysis of the Kaplan turbine draft tube effect

    Energy Technology Data Exchange (ETDEWEB)

    Motycak, L; Skotak, A; Obrovsky, J, E-mail: motycak.vhs@cbeng.c [CKD Blansko Engineering, a.s., Capkova 2357/5, Blansko 67801 (Czech Republic)

    2010-08-15

    The aim of this paper is to present information about possible problems and errors which can appear during numerical analyses of low head Kaplan turbines, with a focus on the runner-draft tube interaction. The setting of the numerical model, the grid size, the boundary conditions used, and the interface definition between the runner and draft tube are discussed. Data are available from physical model tests, which gives a great opportunity to compare CFD and experiment results and, on the basis of this comparison, to determine the approach to CFD flow modeling. The main purpose of the Kaplan turbine model measurement was to gather information about the real flow field. The model tests were carried out in the new hydraulic laboratory of CKD Blansko Engineering and were focused on detailed velocity measurements downstream of the runner by differential pressure probe and on velocity measurement downstream of the draft tube elbow by the Particle Image Velocimetry (PIV) method. The data from the CFD simulation were compared to the velocity measurement results. The paper also discusses the design of the original draft tube modification for flow improvement in the case of a Kaplan turbine uprating project. The results of the draft tube modification were confirmed by model tests in the hydraulic laboratory as well.

  9. Analysis of the Kaplan turbine draft tube effect

    Science.gov (United States)

    Motycak, L.; Skotak, A.; Obrovsky, J.

    2010-08-01

    The aim of this paper is to present information about possible problems and errors which can appear during numerical analyses of low head Kaplan turbines, with a focus on the runner-draft tube interaction. The setting of the numerical model, the grid size, the boundary conditions used, and the interface definition between the runner and draft tube are discussed. Data are available from physical model tests, which gives a great opportunity to compare CFD and experiment results and, on the basis of this comparison, to determine the approach to CFD flow modeling. The main purpose of the Kaplan turbine model measurement was to gather information about the real flow field. The model tests were carried out in the new hydraulic laboratory of CKD Blansko Engineering and were focused on detailed velocity measurements downstream of the runner by differential pressure probe and on velocity measurement downstream of the draft tube elbow by the Particle Image Velocimetry (PIV) method. The data from the CFD simulation were compared to the velocity measurement results. The paper also discusses the design of the original draft tube modification for flow improvement in the case of a Kaplan turbine uprating project. The results of the draft tube modification were confirmed by model tests in the hydraulic laboratory as well.

  10. Probability Density Estimation Using Neural Networks in Monte Carlo Calculations

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Cho, Jin Young; Song, Jae Seung; Kim, Chang Hyo

    2008-01-01

    The Monte Carlo neutronics analysis requires the capability for tally distribution estimation, such as an axial power distribution or a flux gradient in a fuel rod. This problem can be regarded as probability density function estimation from an observation set. We apply the neural network based density estimation method to an observation and sampling weight set produced by Monte Carlo calculations. The neural network method is compared with the histogram and the functional expansion tally methods for estimating a non-smooth density, a fission source distribution, and an absorption rate gradient in a burnable absorber rod. The application results show that the neural network method can approximate a tally distribution quite well. (authors)

  11. Demonstration Integrated Knowledge-Based System for Estimating Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, Jack L.

    1999-04-21

    Human Reliability Analysis (HRA) currently comprises at least 40 different methods that are used to analyze, predict, and evaluate human performance in probabilistic terms. Systematic HRAs allow analysts to examine human-machine relationships, identify error-likely situations, and provide estimates of relative frequencies for human errors on critical tasks, highlighting the most beneficial areas for system improvements. Unfortunately, each HRA method has a different philosophical approach, thereby producing estimates of human error probabilities (HEPs) that are a better or worse match to the error-likely situation of interest. Poor selection of methodology, or the improper application of techniques, can produce invalid HEP estimates, and such erroneous estimation of potential human failure could have severe consequences in terms of the estimated occurrence of injury, death, and/or property damage.

  12. Improved Governing of Kaplan Turbine Hydropower Plants Operating Island Grids

    OpenAIRE

    Gustafsson, Martin

    2013-01-01

    To reduce the consequences of a major fault in the electric power grid, functioning parts of the grid can be divided into smaller grid islands. The grid islands are operated in isolation from the power network, which places new demands on faster frequency regulation. This thesis investigates a Kaplan turbine hydropower plant operating an island grid. The Kaplan turbine has two control signals, the wicket gate and the turbine blade positions, controlling the mechanical power. The inputs are comb...

  13. The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation

    Science.gov (United States)

    Felder, Guido; Zischg, Andreas; Weingartner, Rolf

    2017-07-01

    Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling for PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs, the resulting peak discharges, and the reliability and plausibility of the estimates are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.

  14. [Application of Competing Risks Model in Predicting Smoking Relapse Following Ischemic Stroke].

    Science.gov (United States)

    Hou, Li-Sha; Li, Ji-Jie; Du, Xu-Dong; Yan, Pei-Jing; Zhu, Cai-Rong

    2017-07-01

    To determine factors associated with smoking relapse in men who survived their first stroke. Data were collected through face-to-face interviews with stroke patients in the hospital, and then repeated every three months via telephone over the period from 2010 to 2014. The Kaplan-Meier method and a competing risk model were adopted to estimate and predict smoking relapse rates. The Kaplan-Meier method estimated a higher relapse rate than the competing risk model. The four-year relapse rate was 43.1% after adjustment for competing risk. Exposure to environmental tobacco smoke outside of home and workplace (such as bars and restaurants) ( P =0.01), being single ( P <0.01), and a prior history of smoking at least 20 cigarettes per day ( P =0.02) were significant predictors of smoking relapse. When competing risks exist, a competing risks model should be used in data analyses. Smoking interventions should give priority to those without a spouse and those with a heavy smoking history. A smoking ban in public settings can reduce smoking relapse in stroke patients.
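    Why the Kaplan-Meier method overestimates the relapse rate relative to a competing risk model can be seen on a tiny invented data set. The sketch below computes the Aalen-Johansen cumulative incidence of relapse when death is a competing event; it is a generic textbook computation, not the study's actual model.

```python
import numpy as np

def cumulative_incidence(time, cause):
    """Aalen-Johansen cumulative incidence of cause 1 under competing risks.

    `cause` is 1 for the event of interest (relapse), 2 for the competing
    event (death), 0 for censoring. Returns (t, CIF1(t)) pairs."""
    time, cause = np.asarray(time), np.asarray(cause)
    out, surv, cif = [], 1.0, 0.0
    for t in np.unique(time[cause > 0]):
        n = np.sum(time >= t)                     # at risk just before t
        d1 = np.sum((time == t) & (cause == 1))   # relapses at t
        d_all = np.sum((time == t) & (cause > 0)) # events of any cause at t
        cif += surv * d1 / n                      # mass assigned to cause 1
        surv *= 1.0 - d_all / n                   # overall event-free survival
        out.append((t, cif))
    return out

# Toy data: relapse (1) competes with death (2); 0 = censored.
cif = cumulative_incidence([2, 3, 3, 5, 7, 8, 9], [1, 2, 1, 0, 1, 2, 0])
```

    On this toy data, the cumulative incidence of relapse after the last relapse time is 10/21 ≈ 0.476, whereas treating deaths as censoring and taking 1 − KM gives 11/21 ≈ 0.524 — the same direction of overestimation the abstract reports.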

  15. A Balanced Approach to Adaptive Probability Density Estimation

    Directory of Open Access Journals (Sweden)

    Julio A. Kovacs

    2017-04-01

    Full Text Available Our development of a Fast Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it to molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical domains such as bioinformatics, signal processing and econometrics.
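    BADE itself sets the smoothing level with a visual criterion and an efficient nearest-neighbor search; as a much simpler stand-in for the same idea of point-wise adaptive smoothing, here is a one-dimensional Gaussian KDE whose per-sample bandwidth is the distance to the k-th nearest neighbor, so sparse regions are smoothed more. This is a generic adaptive-KDE sketch, not the BADE algorithm.

```python
import numpy as np

def adaptive_kde(samples, grid, k=10):
    """Gaussian KDE with a per-sample bandwidth equal to the distance to the
    k-th nearest neighbour (1-D), evaluated on `grid`."""
    samples = np.asarray(samples, dtype=float)
    grid = np.asarray(grid, dtype=float)
    # index k skips the zero self-distance; requires len(samples) > k
    h = np.array([np.sort(np.abs(samples - x))[k] for x in samples])
    h = np.maximum(h, 1e-12)
    z = (grid[:, None] - samples[None, :]) / h
    return (np.exp(-0.5 * z * z) / (np.sqrt(2.0 * np.pi) * h)).mean(axis=1)
```

    Each kernel integrates to one, so the mixture does too; the choice of k trades variance in dense regions against bias in sparse ones, which is the balance BADE optimizes more carefully.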

  16. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects, and failure data (incorporated into the model via a Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability is a combination of several different analyses allowing us to obtain: the critical crack's length and depth, the failure probability of the defected zone thickness, and the dependency of the failure probability on the age of the natural gas transmission pipeline. Model uncertainty and uncertainty propagation analyses are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of the pipe with a crack. • Stress evaluation of the pipe with a critical crack. • Deterministic-probabilistic structural integrity analysis of a gas pipeline. • Integrated estimation of pipeline failure probability by a Bayesian method.
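    The Bayesian step, folding observed failure data into a prior obtained from structural analysis, can be illustrated with a conjugate Beta-Binomial update (a standard scheme; the paper's actual model, priors, and counts are not reproduced here, and all numbers below are hypothetical):

```python
def beta_update(prior_a, prior_b, failures, demands):
    """Conjugate update of a Beta(prior_a, prior_b) failure-probability
    prior with `failures` observed in `demands` opportunities; returns
    the posterior parameters and posterior mean."""
    a = prior_a + failures
    b = prior_b + demands - failures
    return a, b, a / (a + b)

# Structural-analysis prior with mean 1e-3 (hypothetical),
# updated with 2 observed failures in 500 recorded demands:
a, b, mean = beta_update(1.0, 999.0, 2, 500)
print(mean)   # 0.002
```

    The posterior mean shifts from the prior's 1e-3 toward the empirical rate 2/500, weighted by the prior's effective sample size.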

  17. Estimating probable flaw distributions in PWR steam generator tubes

    International Nuclear Information System (INIS)

    Gorman, J.A.; Turner, A.P.L.

    1997-01-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary-to-secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected, and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  18. Kaplan og Norton bør læses af hele ledelsen

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj

    2009-01-01

    Review of "Eksekveringsgevinsten - Øget konkurrencekraft med fokuseret strategi og drift", Robert S. Kaplan & David P. Norton, 2009, Gyldendal Business. Publication date: 8 April.

  19. Probability of Neutralization Estimation for APR1400 Physical Protection System Design Effectiveness Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myungsu; Lim, Heoksoon; Na, Janghwan; Chi, Moongoo [Korea Hydro and Nuclear Power Co. Ltd. Central Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    This work focuses on the development of a new design process compatible with international standards such as those suggested by the IAEA and the NRC. Evaluation of design effectiveness was identified as one of the areas to improve. If a design doesn't meet a certain level of effectiveness, it should be re-designed accordingly. The effectiveness can be calculated as a combination of the probability of interruption and the probability of neutralization. System Analysis of Vulnerability to Intrusion (SAVI) has been developed by Sandia National Laboratories (SNL) for that purpose. With SNL's timely detection methodology, SAVI has been used by U.S. nuclear utilities to meet the NRC requirements for PPS design effectiveness evaluation. For the SAVI calculation, the probability of neutralization (P_N) is a vital input element that must be supplied. This paper describes the elements to consider for neutralization, the probability estimation methodology, and the estimation for the APR1400 PPS design effectiveness evaluation process. Markov chain and Monte Carlo simulation are often used as simple numerical methods to estimate P_N. The results from the two methods are not always identical, even for the same situation. P_N values for the APR1400 evaluation were calculated based on the Markov chain method and modified to be applicable to a guards-to-adversaries ratio based analysis.
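    The abstract's remark that Markov-chain and Monte Carlo estimates of P_N need not coincide numerically can be illustrated on a toy engagement model (all probabilities are hypothetical; this is not SAVI's model):

```python
import random

def p_neutralize_closed(p_win, p_lose):
    """Each engagement round ends with the adversary neutralized (p_win),
    the guard force defeated (p_lose), or continues; the absorption
    probability of the 'neutralized' state is p_win / (p_win + p_lose)."""
    return p_win / (p_win + p_lose)

def p_neutralize_mc(p_win, p_lose, trials=100_000, seed=1):
    """Monte Carlo estimate of the same absorption probability."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        while True:
            u = rng.random()
            if u < p_win:
                wins += 1
                break
            if u < p_win + p_lose:
                break
    return wins / trials

exact = p_neutralize_closed(0.45, 0.15)   # ≈ 0.75
approx = p_neutralize_mc(0.45, 0.15)      # close to 0.75, not identical
```

    The closed-form Markov-chain value and the simulation agree only up to sampling error, which is why the two approaches can give different P_N values for the same scenario.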

  20. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of the extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC61400-1. This task is the focus of the present paper, by virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of the extreme value distribution instead of the approximate schemes of fitted distributions currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  1. Comparative pathogenicity of Vibrio spp., Photobacterium damselae ssp. damselae and five isolates of Aeromonas salmonicida ssp. achromogenes in juvenile Atlantic halibut (Hippoglossus hippoglossus).

    Science.gov (United States)

    Bowden, T J; Bricknell, I R; Preziosi, B M

    2018-01-01

    Juvenile Atlantic halibut (~100 mg, Hippoglossus hippoglossus) were exposed to Vibrio proteolyticus, a Vibrio spp. isolate, Photobacterium damselae ssp. damselae and five different isolates of Aeromonas salmonicida ssp. achromogenes via an hour-long bath immersion to ascertain their variation in pathogenicity to this fish species. Results were analysed using Kaplan-Meier survival analysis. Analysis of the data from challenges using A. salmonicida ssp. achromogenes revealed three survival values of zero and a spread of values from 0 to 28.43. Challenges using the Vibrio spp. isolate, V. proteolyticus and P. damselae resulted in Kaplan-Meier survival estimates of 31.21, 50.41 and 57.21, respectively. As all bacterial species tested could induce juvenile halibut mortalities, they must all be considered as potential pathogens. However, the degree of pathogenicity of A. salmonicida is isolate dependent. © 2017 John Wiley & Sons Ltd.

  2. Kaplan-Narayanan-Neuberger lattice fermions pass a perturbative test

    International Nuclear Information System (INIS)

    Aoki, S.; Levien, R.B.

    1995-01-01

    We test perturbatively a recent scheme for implementing chiral fermions on the lattice, proposed by Kaplan and modified by Narayanan and Neuberger, using as our testing ground the chiral Schwinger model. The scheme is found to reproduce the desired form of the effective action, whose real part is gauge invariant and whose imaginary part gives the correct anomaly in the continuum limit, once technical problems relating to the necessary infinite extent of the extra dimension are properly addressed. The indications from this study are that the Kaplan-Narayanan-Neuberger scheme has a good chance of being a correct lattice regularization of chiral gauge theories.

  3. On estimating probability of presence from use-availability or presence-background data.

    Science.gov (United States)

    Phillips, Steven J; Elith, Jane

    2013-06-01

    A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of "presence" data (locations where the species [or evidence of it] has been observed), together with "background" data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. We emphasize, however, that we are not arguing against
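    The paper's central point, that presence-background data determine the probability of presence only up to a constant unless prevalence is supplied, can be made concrete with a small numerical check (the response curves below are hypothetical):

```python
def presence_density(prob_presence, availability, xs):
    """Distribution of an environmental variable among presence records:
    proportional to Pr(present | x) * availability(x), then normalized."""
    w = [prob_presence(x) * availability(x) for x in xs]
    total = sum(w)
    return [v / total for v in w]

xs = [i / 10 for i in range(11)]        # environmental gradient
uniform = lambda x: 1.0                 # background availability
low = lambda x: 0.1 * (1 - x)           # low-prevalence species
high = lambda x: 0.4 * (1 - x)          # same shape, 4x the prevalence

d_low = presence_density(low, uniform, xs)
d_high = presence_density(high, uniform, xs)
# The two species generate identical presence-data distributions, so no
# method can tell them apart without extra information such as prevalence.
```

    Any method claiming to recover absolute probability of presence from such data must therefore be leaning on a parametric assumption, which is exactly the fragility the experiment exposes.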

  4. Verification of “Channel-Probability Model” of Grain Yield Estimation

    Directory of Open Access Journals (Sweden)

    ZHENG Hong-yan

    2016-07-01

    Full Text Available The "channel-probability model" of grain yield estimation was verified and discussed systematically using grain production data from 1949 to 2014 for 16 typical counties, 6 typical districts, and 31 provinces of China. The results showed the following: (1) Because the geographical scale of the nation is large enough, different climate zones and meteorological conditions compensate for one another, and the grain yield estimation error is small at the national scale. Therefore, it is not necessary to modify the estimation error by micro-trend or climate year type at the national scale. However, grain yield estimation at the provincial scale is confined to a single climate zone; the scale is smaller, so the impacts of meteorological conditions on grain yield are less complementary than at the national scale. The spatial scale of districts and counties is smaller still, so this compensation is least. Therefore, it is necessary to apply the micro-trend amendment and the climate year type amendment to the grain yield estimation for districts and counties. (2) The micro-trend modification has two formulas: generally, when the error of grain yield estimation is less than 10%, it can be modified by Y×(1-K); when the error is more than 10%, it can be modified by Y/(1+K). (3) Generally, the grain estimation has 5 grades, and some have 7 grades because of large error fluctuation. The parameters for super-high yield years and super-low yield years must depend on real-time crop growth and meteorological conditions. (4) Extensive demonstration analysis proved that the theory and method of the "channel-probability model" are scientific and practical. To improve the accuracy of grain yield estimation, the parameters can be modified with the micro-trend amendment and the climate year type amendment. If the
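    Under the reading that K is the relative estimation error (an assumption; the paper's exact definition of K is not reproduced in the abstract), the two micro-trend amendment formulas can be applied as:

```python
def micro_trend_amend(y_est, k):
    """Micro-trend amendment of a grain yield estimate y_est with
    relative error k: Y*(1-K) when the error is below 10%,
    Y/(1+K) otherwise (interpretation of K assumed, see lead-in)."""
    return y_est * (1 - k) if k < 0.10 else y_est / (1 + k)

small_error = micro_trend_amend(100.0, 0.05)   # Y*(1-K) branch
large_error = micro_trend_amend(100.0, 0.20)   # Y/(1+K) branch
```

    Both branches pull an over-estimate downward, but the division form shrinks large errors less aggressively than the multiplicative form would.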

  5. Case Report: Meier-Gorlin syndrome: Report of an additional patient ...

    African Journals Online (AJOL)

    We report a 7 year old female child with the classical triad of Meier-Gorlin syndrome (MGS), (microtia, absent patella and short stature). She had the characteristic facial features, with normal mentality and defective speech, skeletal abnormalities, conductive hearing loss, cystitis and normal growth hormone level.

  6. Improvements of a Kaplan type small turbine: Forbedre og vidreutvikle en Kaplan småturbin

    OpenAIRE

    Fjærvold, Lars

    2012-01-01

    The goal of this master thesis was to establish Hill diagrams and improve a Kaplan turbine intended for use in Afghanistan. The turbine efficiency has been tested in settings 1 and 2. Turbine efficiency in settings 3 and 4 could not be tested because the runner blades interfere with the housing, making it impossible to rotate the turbine. The efficiency was tested with an effective pressure head ranging from 2 to 8 meters. The best efficiency point was not reached because of limitations in the te...

  7. A new model to estimate prognosis in patients with hepatocellular carcinoma after Yttrium-90 radioembolization.

    Directory of Open Access Journals (Sweden)

    Zhihong Weng

    Full Text Available AIMS: The current prognostic model to estimate the survival in hepatocellular carcinoma (HCC patients treated with transarterial hepatic selective internal radiotherapy (SIRT is not fully characterized. The aim of this study was to establish a new scoring model including assessment of both tumor responses and therapy-induced systemic changes in HCC patients to predict survival at an early time point post-SIRT. METHODS AND MATERIALS: Between 2008 and 2012, 149 HCC patients treated with SIRT were included into this study. CT images and biomarkers in blood tested at one month post-SIRT were analyzed and correlated with clinical outcome. Tumor responses were assessed by RECIST 1.1, mRECIST, and Choi criteria. Kaplan-Meier methods were used to estimate survival curves. Cox regression was used in uni- and multivariable survival analyses and in the establishment of a prognostic model. RESULTS: A multivariate proportional hazards model was created based on the tumor response, the number of tumor nodules, the score of the model for end stage liver disease (MELD, and the serum C-reactive protein levels which were independent predictors of survival in HCC patients at one month post-SIRT. This prognostic model accurately differentiated the outcome of patients with different risk scores in this cohort (P<0.001. The model also had the ability to assign a predicted survival probability for individual patients. CONCLUSIONS: A new model to predict survival of HCC patients mainly based on tumor responses and therapy-induced systemic changes provides reliable prognosis and accurately discriminates the survival at an early time point after SIRT in these patients.

  8. Allelic drop-out probabilities estimated by logistic regression--Further considerations and practical implementation

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work where allelic drop-out is occurring. With this discussion, we hope to improve the drop-out model so that it can be used in practical forensic genetics and to stimulate further discussion. We discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions.
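    A minimal version of such a logistic drop-out model, fit by plain gradient descent to synthetic data, might look like the following (illustrative only; the cited model's covariates, such as PCR cycle number, are not modelled here):

```python
import math
import random

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def fit_logistic(xs, ys, lr=1.0, steps=2000):
    """Gradient-descent fit of P(drop-out | x) = sigmoid(b0 + b1 * x)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(b0 + b1 * x) - y
            g0 += err / n
            g1 += err * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Synthetic data: drop-out becomes rarer as (log) peak height x grows.
rng = random.Random(0)
xs = [rng.uniform(-2, 2) for _ in range(200)]
ys = [1 if rng.random() < sigmoid(-1.5 * x) else 0 for x in xs]
b0, b1 = fit_logistic(xs, ys)
# The fitted slope b1 comes out negative: taller peaks drop out less often.
```

    A production model would add the experimental conditions the authors discuss (number of PCR cycles, amount of DNA) as further covariates with their own slopes.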

  9. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Directory of Open Access Journals (Sweden)

    Michael F Sloma

    2017-11-01

    Full Text Available Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  10. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Science.gov (United States)

    Sloma, Michael F; Mathews, David H

    2017-11-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  11. Estimated Probability of a Cervical Spine Injury During an ISS Mission

    Science.gov (United States)

    Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G.

    2013-01-01

    Introduction: The Integrated Medical Model (IMM) utilizes historical data, cohort data, and external simulations as input factors to provide estimates of crew health, resource utilization and mission outcomes. The Cervical Spine Injury Module (CSIM) is an external simulation designed to provide the IMM with parameter estimates for 1) a probability distribution function (PDF) of the incidence rate, 2) the mean incidence rate, and 3) the standard deviation associated with the mean resulting from injury/trauma of the neck. Methods: An injury mechanism based on an idealized low-velocity blunt impact to the superior posterior thorax of an ISS crewmember was used as the simulated mission environment. As a result of this impact, the cervical spine is inertially loaded from the mass of the head, producing an extension-flexion motion deforming the soft tissues of the neck. A multibody biomechanical model was developed to estimate the kinematic and dynamic response of the head-neck system from a prescribed acceleration profile. Logistic regression was performed on a dataset containing AIS1 soft tissue neck injuries from rear-end automobile collisions with published Neck Injury Criterion values, producing an injury transfer function (ITF). An injury event scenario (IES) was constructed such that crew member 1, moving through a primary or standard translation path transferring large-volume equipment, impacts stationary crew member 2. The incidence rate for this IES was estimated from in-flight data and used to calculate the probability of occurrence. The uncertainties in the model input factors were estimated from representative datasets and expressed in terms of probability distributions. A Monte Carlo method utilizing simple random sampling was employed to propagate both aleatory and epistemic uncertain factors. Scatterplots and partial correlation coefficients (PCC) were generated to determine input factor sensitivity. CSIM was developed in the SimMechanics/Simulink environment with a

  12. Estimation of the common cause failure probabilities of the components under mixed testing schemes

    International Nuclear Information System (INIS)

    Kang, Dae Il; Hwang, Mee Jeong; Han, Sang Hoon

    2009-01-01

    For the case where trains or channels of standby safety systems consisting of more than two redundant components are tested in a staggered manner, the standby safety components within a train can be tested simultaneously or consecutively. In this case, mixed testing schemes, staggered and non-staggered testing schemes, are used for testing the components. Approximate formulas, based on the basic parameter method, were developed for the estimation of the common cause failure (CCF) probabilities of the components under mixed testing schemes. The developed formulas were applied to the four redundant check valves of the auxiliary feed water system as a demonstration study for their appropriateness. For a comparison, we estimated the CCF probabilities of the four redundant check valves for the mixed, staggered, and non-staggered testing schemes. The CCF probabilities of the four redundant check valves for the mixed testing schemes were estimated to be higher than those for the staggered testing scheme, and lower than those for the non-staggered testing scheme.

  13. ITGA3 and ITGB4 expression biomarkers estimate the risks of locoregional and hematogenous dissemination of oral squamous cell carcinoma

    International Nuclear Information System (INIS)

    Nagata, Masaki; Takahashi, Katsu; Kodama, Naoki; Kawase, Tomoyuki; Hoshina, Hideyuki; Ikeda, Nobuyuki; Shingaki, Susumu; Takagi, Ritsuo; Noman, Arhab A; Suzuki, Kenji; Kurita, Hiroshi; Ohnishi, Makoto; Ohyama, Tokio; Kitamura, Nobutaka; Kobayashi, Takanori; Uematsu, Kohya

    2013-01-01

    Molecular biomarkers are essential for monitoring treatment effects, predicting prognosis, and improving survival rate in oral squamous cell carcinoma. This study sought to verify the effectiveness of two integrin gene expression ratios as biomarkers. Gene expression analyses of integrin α3 (ITGA3), integrin β4 (ITGB4), CD9 antigen (CD9), and plakoglobin (JUP) by quantitative real-time PCR were conducted on total RNA from 270 OSCC cases. The logrank test, Cox proportional hazards model, and Kaplan-Meier estimates were performed on the gene expression ratios of ITGA3/CD9 and ITGB4/JUP and on the clinicopathological parameters for major clinical events. A high rate (around 80%) of lymph node metastasis was found in cases with a high ITGA3/CD9 ratio (high-ITGA3/CD9) and invasive histopathology (YK4). Primary site recurrence (PSR) was associated with high-ITGA3/CD9, T3-4 (TNM class), and positive margin, indicating that PSR is synergistically influenced by treatment failure and biological malignancy. A high ITGB4/JUP ratio (high-ITGB4/JUP) was revealed to be a primary contributor to distant metastasis without the involvement of clinicopathological factors, suggesting intervention of a critical step dependent on the function of the integrin β4 subunit. Kaplan-Meier curves revealed positive margin as a lethal treatment consequence in high-ITGA3/CD9 and YK4 double-positive cases. Two types of metastatic trait were found in OSCC: locoregional dissemination, which was reflected by high-ITGA3/CD9, and distant metastasis through hematogenous dissemination, uniquely distinguished by high-ITGB4/JUP. The clinical significance of the integrin biomarkers implies that biological mechanisms such as cancer cell motility and anchorage-independent survival are vital for OSCC recurrence and metastasis

  14. Kaplan-Meier estimators of distance distributions for spatial point processes

    NARCIS (Netherlands)

    Baddeley, A.J.; Gill, R.D.

    1997-01-01

    When a spatial point process is observed through a bounded window, edge effects hamper the estimation of characteristics such as the empty space function $F$, the nearest neighbour distance distribution $G$, and the reduced second order moment function $K$. Here we propose and study product-limit

  15. Reporting of loss to follow-up information in randomised controlled trials with time-to-event outcomes: a literature survey

    Directory of Open Access Journals (Sweden)

    Bender Ralf

    2011-09-01

    Full Text Available Abstract Background To assess the reporting of loss to follow-up (LTFU) information in articles on randomised controlled trials (RCTs) with time-to-event outcomes, and to assess whether discrepancies affect the validity of study results. Methods Literature survey of all issues of the BMJ, Lancet, JAMA, and New England Journal of Medicine published between 2003 and 2005. Eligible articles were reports of RCTs including at least one Kaplan-Meier plot. Articles were classified as "assessable" if sufficient information was available to assess LTFU. In these articles, LTFU information was derived from Kaplan-Meier plots, extracted from the text, and compared. Articles were then classified as "consistent" or "not consistent". Sensitivity analyses were performed to assess the validity of study results. Results 319 eligible articles were identified. 187 (59%) were classified as "assessable", as they included sufficient information for evaluation; 140 of 319 (44%) presented consistent LTFU information between the Kaplan-Meier plot and text. 47 of 319 (15%) were classified as "not consistent". These 47 articles were included in sensitivity analyses. When various imputation methods were used, the results of a chi-squared test applied to the corresponding 2 × 2 table changed, and hence were not robust, in about half of the studies. Conclusions Less than half of the articles on RCTs using Kaplan-Meier plots provide assessable and consistent LTFU information, thus questioning the validity of the results and conclusions of many studies presenting survival analyses. Authors should improve the presentation of both Kaplan-Meier plots and LTFU information, and reviewers of study publications and journal editors should critically appraise the validity of the information provided.
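    The sensitivity check described, re-running a chi-squared test on the 2 × 2 events table after imputing lost-to-follow-up subjects in different ways, is easy to reproduce in outline (the counts below are hypothetical, not taken from the survey):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (no continuity correction) for a
    2 x 2 table [[a, b], [c, d]] of events / non-events per trial arm."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Reported table: 30/100 events vs 50/100 events.
base = chi2_2x2(30, 70, 50, 50)     # ≈ 8.33, above the 3.84 cutoff (5% level)
# Same trial after imputing five LTFU subjects per arm the "unfavourable"
# way: five extra events in arm 1, five fewer in arm 2.
imputed = chi2_2x2(35, 65, 45, 55)  # ≈ 2.08, no longer significant
```

    A modest imputation flips the test across the 5% significance threshold, which is exactly the non-robustness the survey found in about half of the inconsistent articles.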

  16. Mediators of the Availability Heuristic in Probability Estimates of Future Events.

    Science.gov (United States)

    Levi, Ariel S.; Pryor, John B.

    Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…

  17. Estimating the exceedance probability of rain rate by logistic regression

    Science.gov (United States)

    Chiu, Long S.; Kedem, Benjamin

    1990-01-01

    Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.

  18. Long-Term Efficacy, Tolerability, and Renal Safety of Atazanavir/Ritonavir-based Antiretroviral Therapy in a Cohort of Treatment-Naïve Patients with HIV-1 Infection: the REMAIN Study.

    Science.gov (United States)

    Teófilo, Eugénio; Rocha-Pereira, Nuno; Kuhlmann, Birger; Antela, Antonio; Knechten, Heribert; Santos, Jesús; Jiménez-Expósito, Maria Jesús

    2016-02-01

    Boosted protease inhibitors (PIs), including ritonavir-boosted atazanavir (ATV/r), are a recommended option for the initial treatment of HIV-1 infection based upon clinical trial data; however, long-term real-life clinical data are limited. We evaluated the long-term use of ATV/r as a component of antiretroviral combination therapy in the real-life setting in the REMAIN study. This was an observational cohort study conducted at sites across Germany, Portugal, and Spain. Retrospective historical and prospective longitudinal follow-up data were extracted every six months from medical records of HIV-infected treatment-naïve patients aged ≥ 18 years initiating a first-line ATV/r-containing regimen. Eligible patients (n = 517) were followed up for a median of 3.4 years. The proportion remaining on ATV/r at 5 years was 51.5%, with an estimated Kaplan-Meier median time to treatment discontinuation of 4.9 years. Principal reasons for discontinuation were adverse events (15.9%; 8.9% due to hyperbilirubinemia) and virologic failure (6.8%). Kaplan-Meier analysis indicated sustained freedom from virologic failure, and no treatment-emergent major PI resistance occurred. ATV/r was generally well tolerated during long-term treatment, with no significant changes in estimated glomerular filtration rate over five years. In a real-life clinical setting over five years, treatment-naïve patients with HIV-1 infection initiating an ATV/r-based regimen showed sustained virologic suppression, an overall treatment persistence rate of 51.5%, an absence of treatment-emergent major PI resistance mutations at virologic failure, a long-term safety profile consistent with that observed in clinical trials, and no significant decline in renal function.

  19. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information about the variables is extracted by pre-sampling points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. The probability of functional failure is then estimated with a combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
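    The variance-reduction idea, sampling from a density concentrated in the failure region and reweighting, can be sketched for a toy limit state, P(X > 4) for standard normal X (this is a generic importance sampler, not the paper's adaptive response-surface scheme):

```python
import math
import random

def failure_prob_is(threshold, n=50_000, seed=7):
    """Importance-sampling estimate of P(X > threshold) for X ~ N(0, 1),
    using a proposal N(threshold, 1) centred on the failure boundary."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)
        if x > threshold:                 # indicator of failure
            # weight = target density / proposal density
            total += math.exp(-0.5 * x * x + 0.5 * (x - threshold) ** 2)
    return total / n

exact = 0.5 * math.erfc(4 / math.sqrt(2))   # P(X > 4) ≈ 3.17e-5
estimate = failure_prob_is(4.0)
```

    A crude Monte Carlo run of the same size would see only one or two failures on average; the importance sampler resolves the probability to within about a percent.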

  20. Multi-objective shape optimization of runner blade for Kaplan turbine

    International Nuclear Information System (INIS)

    Semenova, A (OJSC Power machines LMZ, Saint Petersburg, Russian Federation); Pylev, I (OJSC Power machines LMZ, Saint Petersburg, Russian Federation); Chirkov, D; Lyutov, A; Chemy, S; Skorospelov, V

    2014-01-01

    Automatic runner shape optimization based on extensive CFD analysis has proved to be a useful design tool in hydraulic turbomachinery. The authors previously developed an efficient method for Francis runner optimization, which was successfully applied to the design of several runners with different specific speeds. In the present work this method is extended to the task of Kaplan runner optimization. Despite the relatively simpler blade shape, Kaplan turbines have several features that complicate the optimization problem. First, Kaplan turbines normally operate in a wide range of discharges, so CFD analysis of each runner variant should be carried out for several operating points. Next, due to the high specific speed, draft tube losses have a great impact on the overall turbine efficiency and thus should be accurately evaluated. Then, the flow in the blade tip and hub clearances significantly affects the velocity profile behind the runner and the draft tube behavior. All these features are accounted for in the present optimization technique. Parameterization of the runner blade surface using 24 geometrical parameters is described in detail. For each variant of runner geometry, steady-state three-dimensional turbulent flow computations are carried out in a domain including the wicket gate, runner, draft tube, and blade tip and hub clearances. The objectives are maximization of efficiency at the best-efficiency and high-discharge operating points, with simultaneous minimization of the cavitation area on the suction side of the blade. A multiobjective genetic algorithm is used to solve the optimization problem, requiring the analysis of several thousand runner variants. The method is applied to the optimization of the runner shape for several Kaplan turbines with different heads.

  1. Multi-objective shape optimization of runner blade for Kaplan turbine

    Science.gov (United States)

    Semenova, A.; Chirkov, D.; Lyutov, A.; Chemy, S.; Skorospelov, V.; Pylev, I.

    2014-03-01

    Automatic runner shape optimization based on extensive CFD analysis has proved to be a useful design tool in hydraulic turbomachinery. The authors previously developed an efficient method for Francis runner optimization, which was successfully applied to the design of several runners with different specific speeds. In the present work this method is extended to the task of Kaplan runner optimization. Despite the relatively simpler blade shape, Kaplan turbines have several features that complicate the optimization problem. First, Kaplan turbines normally operate in a wide range of discharges, so CFD analysis of each runner variant should be carried out for several operating points. Next, due to the high specific speed, draft tube losses have a great impact on the overall turbine efficiency and thus should be accurately evaluated. Then, the flow in the blade tip and hub clearances significantly affects the velocity profile behind the runner and the draft tube behavior. All these features are accounted for in the present optimization technique. Parameterization of the runner blade surface using 24 geometrical parameters is described in detail. For each variant of runner geometry, steady-state three-dimensional turbulent flow computations are carried out in a domain including the wicket gate, runner, draft tube, and blade tip and hub clearances. The objectives are maximization of efficiency at the best-efficiency and high-discharge operating points, with simultaneous minimization of the cavitation area on the suction side of the blade. A multiobjective genetic algorithm is used to solve the optimization problem, requiring the analysis of several thousand runner variants. The method is applied to the optimization of the runner shape for several Kaplan turbines with different heads.

  2. Molecular genetics analysis of hereditary breast and ovarian cancer patients in India

    OpenAIRE

    Soumittra, Nagasamy; Meenakumari, Balaiah; Parija, Tithi; Sridevi, Veluswami; Nancy, Karunakaran N; Swaminathan, Rajaraman; Rajalekshmy, Kamalalayam R; Majhi, Urmila; Rajkumar, Thangarajan

    2009-01-01

    Abstract. Background: Hereditary cancers account for 5–10% of cancers. In this study BRCA1, BRCA2 and CHEK2*(1100delC) were analyzed for mutations in 91 HBOC/HBC/HOC families and early onset breast and early onset ovarian cancer cases. Methods: PCR-DHPLC was used for mutation screening, followed by DNA sequencing for identification and confirmation of mutations. Kaplan-Meier survival probabilities were computed for five-year survival data on breast and ovarian cancer cases separately, and differe...

  3. Mathematical, numerical and experimental analysis of the swirling flow at a Kaplan runner outlet

    International Nuclear Information System (INIS)

    Muntean, S; Ciocan, T; Susan-Resiga, R F; Cervantes, M; Nilsson, H

    2012-01-01

    The paper presents a novel mathematical model for a priori computation of the swirling flow at the Kaplan runner outlet. The model is an extension of the initial version developed by Susan-Resiga et al [1], to include the contributions of the non-negligible radial velocity and of the variable rothalpy. Simple analytical expressions are derived for these additional data from three-dimensional numerical simulations of the Kaplan turbine. The final results, i.e. the velocity component profiles, are validated against experimental data at two operating points with the same Kaplan runner blade opening but variable discharge.

  4. Mathematical, numerical and experimental analysis of the swirling flow at a Kaplan runner outlet

    Science.gov (United States)

    Muntean, S.; Ciocan, T.; Susan-Resiga, R. F.; Cervantes, M.; Nilsson, H.

    2012-11-01

    The paper presents a novel mathematical model for a priori computation of the swirling flow at the Kaplan runner outlet. The model is an extension of the initial version developed by Susan-Resiga et al [1], to include the contributions of the non-negligible radial velocity and of the variable rothalpy. Simple analytical expressions are derived for these additional data from three-dimensional numerical simulations of the Kaplan turbine. The final results, i.e. the velocity component profiles, are validated against experimental data at two operating points with the same Kaplan runner blade opening but variable discharge.

  5. Estimation of the common cause failure probabilities on the component group with mixed testing scheme

    International Nuclear Information System (INIS)

    Hwang, Meejeong; Kang, Dae Il

    2011-01-01

    Highlights: ► This paper presents a method to estimate the common cause failure probabilities for a common cause component group with mixed testing schemes. ► The CCF probabilities depend on the testing scheme, i.e. staggered or non-staggered testing. ► There are many CCCGs with specific mixed testing schemes in real plant operation. ► Therefore, a general formula applicable to both the alternate periodic testing scheme and the train level mixed testing scheme was derived. - Abstract: This paper presents a method to estimate the common cause failure (CCF) probabilities for a common cause component group (CCCG) with mixed testing schemes such as the train level mixed testing scheme or the alternate periodic testing scheme. In the train level mixed testing scheme, the components are tested in a non-staggered way within the same train, but in a staggered way between the trains. The alternate periodic testing scheme indicates that all components in the same CCCG are tested in a non-staggered way during the planned maintenance period, but in a staggered way during normal plant operation. Since the CCF probabilities depend on the testing scheme (staggered or non-staggered), CCF estimators have two kinds of formulas in accordance with the testing schemes; thus, there are general formulas to estimate the CCF probability for the staggered testing scheme and for the non-staggered testing scheme. In real plant operation, however, there are many CCCGs with specific mixed testing schemes. Recently, Barros () and Kang () proposed a CCF factor estimation method to reflect the alternate periodic testing scheme and the train level mixed testing scheme. In this paper, a general formula which is applicable to both the alternate periodic testing scheme and the train level mixed testing scheme was derived.

  6. Steam generator tubes rupture probability estimation - study of the axially cracked tube case

    International Nuclear Information System (INIS)

    Mavko, B.; Cizelj, L.; Roussel, G.

    1992-01-01

    The objective of the present study is to estimate the probability of a steam generator tube rupture due to the unstable propagation of axial through-wall cracks during a hypothetical accident. For this purpose the probabilistic fracture mechanics model was developed taking into account statistical distributions of influencing parameters. A numerical example considering a typical steam generator seriously affected by axial stress corrosion cracking in the roll transition area, is presented; it indicates the change of rupture probability with different assumptions focusing mostly on tubesheet reinforcing factor, crack propagation rate and crack detection probability. 8 refs., 4 figs., 4 tabs

  7. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  8. Estimation of the defect detection probability for ultrasonic tests on thick sections steel weldments. Technical report

    International Nuclear Information System (INIS)

    Johnson, D.P.; Toomay, T.L.; Davis, C.S.

    1979-02-01

    An inspection uncertainty analysis of published PVRC Specimen 201 data is reported to obtain an estimate of the probability of recording an indication as a function of imperfection height for ASME Section XI Code ultrasonic inspections of nuclear reactor vessel plate seams, and to demonstrate the advantages of inspection uncertainty analysis over conventional detection/nondetection counting analysis. This analysis found the probability of recording a significant defect with an ASME Section XI Code ultrasonic inspection to be very high, should such a defect exist in the plate seams of a nuclear reactor vessel. For a one-inch-high crack, for example, this analysis gives a best-estimate recording probability of .985 and a 90% lower confidence bound recording probability of .937. It is also shown that inspection uncertainty analysis gives more accurate estimates, and gives estimates over a much greater flaw size range, than is possible with conventional analysis. There is reason to believe that the estimation procedure used is conservative: the estimation is based on data generated several years ago, on very small defects, in an environment that is different from the actual in-service inspection environment.

  9. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. The code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  10. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. The code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  11. Meier-Gorlin syndrome.

    Science.gov (United States)

    de Munnik, Sonja A; Hoefsloot, Elisabeth H; Roukema, Jolt; Schoots, Jeroen; Knoers, Nine V A M; Brunner, Han G; Jackson, Andrew P; Bongers, Ernie M H F

    2015-09-17

    Meier-Gorlin syndrome (MGS) is a rare autosomal recessive primordial dwarfism disorder, characterized by microtia, patellar aplasia/hypoplasia, and proportionate short stature. Associated clinical features encompass feeding problems, congenital pulmonary emphysema, mammary hypoplasia in females and urogenital anomalies, such as cryptorchidism and hypoplastic labia minora and majora. Typical facial characteristics during childhood comprise a small mouth with full lips and micro-retrognathia. During ageing, a narrow, convex nose becomes more prominent. The diagnosis of MGS should be considered in patients with at least two of the three features of the clinical triad of microtia, patellar anomalies, and pre- and postnatal growth retardation. In patients with short stature and/or microtia, the patellae should be assessed with care, by ultrasonography before age 6 or by radiography thereafter. Mutations in one of five genes (ORC1, ORC4, ORC6, CDT1, and CDC6) of the pre-replication complex, involved in DNA replication, are detected in approximately 67-78% of patients with MGS. Patients with ORC1 and ORC4 mutations appear to have the most severe short stature and microcephaly. Management should be directed towards in-depth investigation, treatment and prevention of associated problems, such as growth retardation, feeding problems, hearing loss, luxating patellae, knee pain, gonarthrosis, and possible pulmonary complications due to congenital pulmonary emphysema with or without broncho- or laryngomalacia. Growth hormone treatment is ineffective in most patients with MGS, but may be effective in patients in whom growth continues to decrease after the first year of life (usually growth velocity normalizes after the first year) and who have low levels of IGF1. At present, few data are available on reproduction of females with MGS, but the risk of premature labor might be increased. Here, we propose experience-based guidelines for the regular care and treatment of MGS patients.

  12. Estimating success probability of a rugby goal kick and developing a ...

    African Journals Online (AJOL)

    The objective of this study was firstly to derive a formula to estimate the success probability of a particular rugby goal kick and, secondly to derive a goal kicker rating measure that could be used to rank rugby union goal kickers. Various factors that could influence the success of a particular goal kick were considered.

  13. Estimating Model Probabilities using Thermodynamic Markov Chain Monte Carlo Methods

    Science.gov (United States)

    Ye, M.; Liu, P.; Beerli, P.; Lu, D.; Hill, M. C.

    2014-12-01

    Markov chain Monte Carlo (MCMC) methods are widely used to evaluate model probability for quantifying model uncertainty. In the general procedure, MCMC simulations are first conducted for each individual model, and the MCMC parameter samples are then used to approximate the marginal likelihood of the model by calculating the geometric mean of the joint likelihood of the model and its parameters. It has been found that this geometric-mean method suffers from a numerically low convergence rate. A simple test case shows that even millions of MCMC samples are insufficient to yield an accurate estimate of the marginal likelihood. To resolve this problem, a thermodynamic method is used, in which multiple MCMC runs are performed with different values of a heating coefficient between zero and one. When the heating coefficient is zero, the MCMC run is equivalent to a random walk MC in the prior parameter space; when the heating coefficient is one, the MCMC run is the conventional one. For a simple case with an analytical form of the marginal likelihood, the thermodynamic method yields a more accurate estimate than the geometric-mean method. This is also demonstrated for a groundwater modeling case considering four alternative models postulated from different conceptualizations of a confining layer. This groundwater example shows that model probabilities estimated using the thermodynamic method are more reasonable than those obtained using the geometric-mean method. The thermodynamic method is general and can be used for a wide range of environmental problems for model uncertainty quantification.
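The heating-coefficient scheme described above can be illustrated with a small power-posterior sketch. The conjugate Gaussian model, the Metropolis settings, and the cubic temperature ladder below are assumptions chosen so that the marginal likelihood has a closed form to check against; this is not the groundwater model of the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy conjugate model: y_i ~ N(theta, 1) with prior theta ~ N(0, 1), so the
# marginal likelihood is available in closed form.
y = rng.normal(0.5, 1.0, size=20)
n = len(y)

def log_like(theta):
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((y - theta) ** 2)

def log_prior(theta):
    return -0.5 * np.log(2 * np.pi) - 0.5 * theta ** 2

def mean_loglike_at(beta, iters=6000):
    """Metropolis chain targeting prior * likelihood**beta; returns the
    post-burn-in average of the log-likelihood along the chain."""
    theta = 0.0
    lp = log_prior(theta) + beta * log_like(theta)
    trace = []
    for _ in range(iters):
        prop = theta + rng.normal(0.0, 0.5)
        lp_prop = log_prior(prop) + beta * log_like(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        trace.append(log_like(theta))
    return float(np.mean(trace[iters // 5:]))

# Ladder of heating coefficients between 0 (prior) and 1 (posterior),
# concentrated near 0 where the integrand changes fastest.
betas = np.linspace(0.0, 1.0, 11) ** 3
means = [mean_loglike_at(b) for b in betas]

# Thermodynamic integration: log Z = integral over beta of E_beta[log L].
log_ml_ti = sum(0.5 * (means[i] + means[i + 1]) * (betas[i + 1] - betas[i])
                for i in range(len(betas) - 1))

# Exact answer: marginally y ~ N(0, I + 11^T).
cov = np.eye(n) + np.ones((n, n))
log_ml_exact = (-0.5 * n * np.log(2 * np.pi)
                - 0.5 * np.linalg.slogdet(cov)[1]
                - 0.5 * y @ np.linalg.solve(cov, y))
print(log_ml_ti, log_ml_exact)  # the two should agree to within Monte Carlo error
```

The same quadrature over heating coefficients underlies the model-probability estimates in the abstract; the geometric-mean approach corresponds to using only the beta = 1 chain, which is what causes its slow convergence.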

  14. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    Science.gov (United States)

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  15. Metastatic volume: an old oncologic concept and a new prognostic factor for stage IV melanoma patients.

    Science.gov (United States)

    Panasiti, V; Curzio, M; Roberti, V; Lieto, P; Devirgiliis, V; Gobbi, S; Naspi, A; Coppola, R; Lopez, T; di Meo, N; Gatti, A; Trevisan, G; Londei, P; Calvieri, S

    2013-01-01

    The last melanoma staging system of the 2009 American Joint Committee on Cancer takes into account, for stage IV disease, the serum levels of lactate dehydrogenase (LDH) and the site of distant metastases. Our aim was to compare the significance of metastatic volume, as evaluated at the time of stage IV melanoma diagnosis, with other clinical predictors of prognosis. We conducted a retrospective multicentric study. To establish which variables were statistically correlated both with death and survival time, contingency tables were evaluated. The overall survival curves were compared using the Kaplan-Meier method. Metastatic volume and number of affected organs were statistically related to death. In detail, patients with a metastatic volume >15 cm³ had a worse prognosis than those with a volume lower than this value (survival probability at 60 months: 6.8 vs. 40.9%, respectively). The Kaplan-Meier method confirmed that survival time was significantly related to the site(s) of metastases, to elevated LDH serum levels and to melanoma stage according to the latest system. Our results suggest that metastatic volume may be considered as a useful prognostic factor for survival among melanoma patients.
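The Kaplan-Meier survival probabilities used in this and several other records here follow the product-limit recipe: at each event time, multiply the running survival estimate by the fraction of at-risk subjects who survive. A minimal sketch on hypothetical follow-up data (not the melanoma cohort):

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit estimator. `events` is 1 if the event (e.g. death) was
    observed, 0 if the subject was censored. Returns the distinct event times
    and the estimated survival probability just after each of them."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    s = 1.0
    event_times, surv = [], []
    for t in np.unique(times[events == 1]):
        n_at_risk = int(np.sum(times >= t))            # still being followed at t
        d = int(np.sum((times == t) & (events == 1)))  # events observed at t
        s *= (n_at_risk - d) / n_at_risk               # multiply successive probabilities
        event_times.append(t)
        surv.append(s)
    return np.array(event_times), np.array(surv)

# Hypothetical follow-up times in months; a 0 in the second list marks a
# censored subject (lost to follow-up or alive at study end).
t, s = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
print(t, np.round(s, 3))  # [1. 2. 4.] [0.8 0.6 0.3]
```

Note how the censored subject at month 3 leaves the risk set without forcing the curve down, which is exactly how the estimator handles incomplete observation.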

  16. Application of Enlisted Force Retention Levels and Career Field Stability

    Science.gov (United States)

    2017-03-23

    APPLICATION OF ENLISTED FORCE RETENTION LEVELS AND CAREER FIELD STABILITY. Thesis presented to the Faculty, Department of Operational Sciences, in partial fulfillment of the requirements for the degree of Master of Science in Operations Research. Jamie T. Zimmermann, MS, BS, Captain, USAF, March 2017. From Appendix B: the function proc lifetest is a nonparametric estimate of the survivor function using either the Kaplan-Meier method or the actuarial method.

  17. Prita Meier, Swahili Port Cities: The Architecture of Elsewhere

    OpenAIRE

    Longair, Sarah

    2018-01-01

    Prita Meier’s Swahili Port Cities: the Architecture of Elsewhere is a highly original and important contribution to scholarship on East Africa, and more widely for scholars interested in complicating how we understand the formation of global cities and border zone societies. It is not a conventional architectural history, yet it places buildings, in particular the coral and lime stone constructions found on the Swahili coast, at its heart. Meier uses the “materiality of city life” to offer “a...

  18. The chaplain spoke about peace in Iraq / Raivo Nikiforov ; interviewed by Eda Post

    Index Scriptorium Estoniae

    Nikiforov, Raivo

    2005-01-01

    Chaplain Lieutenant Raivo Nikiforov of the Tapa Training Centre travelled to Baghdad to hold a Christmas service for the Estonian peacekeepers and to observe how the Estonians were living. On the Iraq mission and the peacekeepers' living conditions.

  19. An optimized Line Sampling method for the estimation of the failure probability of nuclear passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Pedroni, N.

    2010-01-01

    The quantitative reliability assessment of a thermal-hydraulic (T-H) passive safety system of a nuclear power plant can be obtained by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. In this work, Line Sampling (LS) is adopted for efficient MC sampling. In the LS method, an 'important direction' pointing towards the failure domain of interest is determined and a number of conditional one-dimensional problems are solved along such direction; this allows for a significant reduction of the variance of the failure probability estimator, with respect, for example, to standard random sampling. Two issues are still open with respect to LS: first, the method relies on the determination of the 'important direction', which requires additional runs of the T-H code; second, although the method has been shown to improve the computational efficiency by reducing the variance of the failure probability estimator, no evidence has been given yet that accurate and precise failure probability estimates can be obtained with a number of samples reduced to below a few hundreds, which may be required in case of long-running models. The work presented in this paper addresses the first issue by (i) quantitatively comparing the efficiency of the methods proposed in the literature to determine the LS important direction; (ii) employing artificial neural network (ANN) regression models as fast-running surrogates of the original, long-running T-H code to reduce the computational cost associated to the
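The line-sampling idea of solving conditional one-dimensional problems along an important direction can be sketched on a toy performance function. The function, the choice of important direction, and the bisection root-finder below are illustrative assumptions, not the thermal-hydraulic passive-system model of the abstract:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Hypothetical performance function in 2-D standard normal space:
# failure when g(x1, x2) < 0. The important direction is taken along x1,
# the dominant variable in this toy problem.
def g(x1, x2):
    return 3.0 - x1 - 0.05 * x2 ** 2

def root_along_line(x2, lo=-10.0, hi=10.0, tol=1e-8):
    """Distance along the important direction at which g crosses zero,
    found by bisection (g is decreasing in x1)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid, x2) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Each sample fixes the coordinates orthogonal to the important direction and
# contributes the exact 1-D failure probability Phi(-c) along the line.
n = 200
probs = [Phi(-root_along_line(x2)) for x2 in rng.normal(size=n)]
p_hat = float(np.mean(probs))
print(p_hat)  # near Phi(-3) ~ 1.35e-3, slightly above due to the curvature term
```

Each sample contributes an exact one-dimensional probability rather than a 0/1 indicator, which is where the variance reduction over standard random sampling comes from; here a few hundred samples suffice where crude Monte Carlo would need millions.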

  20. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Chrisos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system, taking into account uncertainties of design variables; common results are estimates of a response density, which also implies estimates of its parameters. Common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time compared to a single deterministic analysis, which results in one value of the response out of the many that make up the response density. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response depends on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are 2 of 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and been used to study four different test cases that have been

  1. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    Science.gov (United States)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.
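The per-pixel estimate described above can be sketched in a simplified form. The finite support, the flat prior, and the single-pixel view below are assumptions for illustration; the paper's constrained least-squares estimation of the prior mass function is not reproduced here:

```python
import numpy as np
from math import lgamma

# Candidate true pixel values with a prior mass function over a finite set;
# both the support (1..20) and the flat prior are assumptions of this sketch.
values = np.arange(1, 21)
prior = np.full(len(values), 1.0 / len(values))

def log_poisson(k, lam):
    # log P(K = k) for K ~ Poisson(lam)
    return k * np.log(lam) - lam - lgamma(k + 1)

def map_pixel(k):
    """MAP estimate of the true intensity behind an observed photon count k:
    maximize prior(v) * Poisson(k; v) over the finite support."""
    log_post = np.log(prior) + log_poisson(k, values.astype(float))
    return int(values[np.argmax(log_post)])

print([map_pixel(k) for k in (0, 3, 7, 25)])  # [1, 3, 7, 20]
```

With a flat prior the MAP estimate coincides with the maximum-likelihood one and is clipped to the assumed support, as the k = 25 case shows; a prior estimated from the image histogram, as in the paper, would pull low-count pixels toward the more probable intensities.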

  2. A method for estimating failure rates for low probability events arising in PSA

    International Nuclear Information System (INIS)

    Thorne, M.C.; Williams, M.M.R.

    1995-01-01

    The authors develop a method for predicting failure rates and failure probabilities per event when, over a given test period or number of demands, no failures have occurred. A Bayesian approach is adopted to calculate a posterior probability distribution for the failure rate or failure probability per event subsequent to the test period. This posterior is then used to estimate effective failure rates or probabilities over a subsequent period of time or number of demands. In special circumstances, the authors' results reduce to the well-known rules of thumb, viz. 1/N and 1/T, where N is the number of demands during the failure-free test period and T is the length of the failure-free test period. However, the authors are able to give strict conditions on the validity of these rules of thumb and to improve on them when necessary.
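Under one common (assumed) prior choice — a flat prior on the failure rate — the zero-failure posterior and the 1/T rule of thumb mentioned above drop out in a few lines; the test duration here is hypothetical:

```python
import numpy as np

# Zero failures observed over a hypothetical cumulative test time T (hours).
# With a flat prior on the rate lam, the likelihood of seeing no failures is
# exp(-lam * T), so the posterior is Gamma(shape=1, rate=T), i.e. Exponential(T).
T = 5000.0

post_mean = 1.0 / T              # posterior mean rate: the 1/T rule of thumb
upper95 = -np.log(0.05) / T      # 95% credible upper bound, roughly 3/T

# Monte Carlo check of the closed-form summaries:
rng = np.random.default_rng(3)
lam = rng.gamma(shape=1.0, scale=1.0 / T, size=200_000)
print(post_mean, float(lam.mean()))             # both ~2e-4 per hour
print(upper95, float(np.quantile(lam, 0.95)))   # both ~6e-4 per hour
```

The paper's point is precisely that such rules of thumb are prior-dependent: a different prior (e.g. Jeffreys) shifts both the posterior mean and the credible bound, which is why the authors give explicit validity conditions.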

  3. Comparison of Exposure in the Kaplan Versus the Kocher Approach in the Treatment of Radial Head Fractures.

    Science.gov (United States)

    Barnes, Leslie Fink; Lombardi, Joseph; Gardner, Thomas R; Strauch, Robert J; Rosenwasser, Melvin P

    2018-01-01

    The aim of this study was to compare the complete visible surface area of the radial head, neck, and coronoid in the Kaplan and Kocher approaches to the lateral elbow. The hypothesis was that the Kaplan approach would afford greater visibility due to the differential anatomy of the intermuscular planes. Ten cadavers were dissected with the Kaplan and Kocher approaches, and the visible surface area was measured in situ using a 3-dimensional digitizer. Six measurements were taken for each approach by 2 surgeons, and the means of these measurements were analyzed. The mean surface area visible with the lateral collateral ligament (LCL) preserved in the Kaplan approach was 616.6 mm² in comparison with the surface area of 136.2 mm² visible in the Kocher approach when the LCL was preserved. Using a 2-way analysis of variance, the difference between these 2 approaches was statistically significant. When the LCL complex was incised in the Kocher approach, the average visible surface area of the Kocher approach was 456.1 mm² and was statistically less than the Kaplan approach. The average surface area of the coronoid visible using a proximally extended Kaplan approach was 197.8 mm². The Kaplan approach affords significantly greater visible surface area of the proximal radius than the Kocher approach.

  4. A retrospective study on the use of post-operative colonoscopy following potentially curative surgery for colorectal cancer in a Canadian province

    Directory of Open Access Journals (Sweden)

    Bryant Heather E

    2004-04-01

Full Text Available Abstract Background Surveillance colonoscopy is commonly recommended following potentially curative surgery for colorectal cancer. We determined factors associated with patients undergoing at least one colonoscopy within five years of surgery. Methods In this historical cohort study, data on 3918 patients aged 30 years or older residing in Alberta, Canada, who had undergone a potentially curative surgical resection for local or regional stage colorectal cancer between 1983 and 1995 were obtained from the provincial cancer registry, ministry of health and cancer clinic charts. Kaplan-Meier estimates of the probability of undergoing a post-operative colonoscopy were calculated for patient, tumor and treatment-related variables of interest. Results A colonoscopy was performed within five years of surgery in 1979 patients. The probability of undergoing a colonoscopy for those diagnosed in the 1990s was greater than for those diagnosed earlier (0.65 vs 0.55, P …. Conclusions The majority of patients undergo colonoscopy following colorectal cancer surgery. However, there are important variations in surveillance practices across different patient and treatment characteristics.

  5. Estimation of submarine mass failure probability from a sequence of deposits with age dates

    Science.gov (United States)

    Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.

    2013-01-01

    The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
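One concrete piece of the model-selection machinery described above is the maximum-likelihood fit and AIC for the exponential (Poisson-process) return-time model. The sketch below uses hypothetical inter-event times and, unlike the paper's methods, ignores age-dating uncertainty and the open intervals before the first and after the last event.

```python
import math

# Maximum-likelihood fit of the exponential return-time model and its AIC.
# The intervals are hypothetical deposit inter-age gaps (kyr).
def aic_exponential(intervals):
    n = len(intervals)
    mean = sum(intervals) / n                      # MLE of the mean return time
    loglik = sum(-math.log(mean) - t / mean for t in intervals)
    return 2 * 1 - 2 * loglik                      # AIC with one free parameter

intervals = [12.0, 8.5, 15.0, 10.2, 9.3]
print(aic_exponential(intervals))
```

Fitting each candidate renewal model the same way and comparing AIC values is the selection step the abstract describes; the model with the smallest AIC is preferred.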

  6. Verification of Kaplan turbine cam curves realization accuracy at power plant

    Directory of Open Access Journals (Sweden)

    Džepčeski Dane

    2016-01-01

Full Text Available The sustainability of an approximately constant value of Kaplan turbine efficiency over relatively large net head changes is a result of the turbine runner's variable geometry. The dependence of the runner blade position on the guide vane opening represents the turbine cam curve. The accuracy of cam curve realization is of great importance for the efficient and proper exploitation of turbines and, consequently, of complete units. For these reasons, special attention has been given to the tests designed for cam curve verification. The goal of this paper is to describe the methodology and the results of the tests performed in the process of Kaplan turbine cam curve verification.

  7. Selection of anchor values for human error probability estimation

    International Nuclear Information System (INIS)

    Buffardi, L.C.; Fleishman, E.A.; Allen, J.A.

    1989-01-01

There is a need for more dependable information to assist in the prediction of human errors in nuclear power environments. The major objective of the current project is to establish guidelines for using error probabilities from other task settings to estimate errors in the nuclear environment. This involves: (1) identifying critical nuclear tasks, (2) discovering similar tasks in non-nuclear environments, (3) finding error data for non-nuclear tasks, and (4) establishing error-rate values for the nuclear tasks based on the non-nuclear data. A key feature is the application of a classification system to nuclear and non-nuclear tasks to evaluate their similarities and differences in order to provide a basis for generalizing human error estimates across tasks. During the first eight months of the project, several classification systems have been applied to a sample of nuclear tasks. They are discussed in terms of their potential for establishing task equivalence and the transferability of human error rates across situations.

  8. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.

    Directory of Open Access Journals (Sweden)

    Tomoaki Chiba

    Full Text Available In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.

  9. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.

    Science.gov (United States)

    Chiba, Tomoaki; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.
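The stationarity assumption in these two records can be illustrated with a toy two-brand chain: iterating p_{t+1} = p_t P drives any initial share vector to the stationary distribution of the transition matrix. The matrix and shares below are hypothetical, and this shows only the forward direction, not the paper's inverse estimation of a time-varying matrix from observed shares.

```python
# Toy two-brand Markov chain: repeated application of the transition matrix
# sends any initial share vector to the stationary distribution.
def stationary_shares(P, p, iters=1000):
    for _ in range(iters):
        p = [sum(p[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]
    return p

P = [[0.9, 0.1],   # brand A retains 90% of its buyers each period
     [0.2, 0.8]]   # brand B retains 80%
print(stationary_shares(P, [0.5, 0.5]))   # converges to [2/3, 1/3]
```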

  10. Nonparametric estimation of transition probabilities in the non-Markov illness-death model: A comparative study.

    Science.gov (United States)

    de Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2015-06-01

Multi-state models are often used for modeling complex event history data. In these models the estimation of the transition probabilities is of particular interest, since they allow for long-term predictions of the process. These quantities have been traditionally estimated by the Aalen-Johansen estimator, which is consistent if the process is Markov. Several non-Markov estimators have been proposed in the recent literature, and their superiority with respect to the Aalen-Johansen estimator has been proved in situations in which the Markov condition is strongly violated. However, the existing estimators have the drawback of requiring that the support of the censoring distribution contains the support of the lifetime distribution, which is often not the case. In this article, we propose two new methods for estimating the transition probabilities in the progressive illness-death model. Some asymptotic results are derived. The proposed estimators are consistent regardless of the Markov condition and the aforementioned assumption about the censoring support. We explore the finite sample behavior of the estimators through simulations. The main conclusion of this piece of research is that the proposed estimators are much more efficient than the existing non-Markov estimators in most cases. An application to a clinical trial on colon cancer is included. Extensions to progressive processes beyond the three-state illness-death model are discussed. © 2015, The International Biometric Society.

  11. Individualism, Nationalism, and Universalism: The Educational Ideals of Mordecai M. Kaplan's Philosophy of Jewish Education

    Science.gov (United States)

    Ackerman, Ari

    2008-01-01

    This article will examine educational ideals by exploring the relation between the individual, the collective, and humanity in Kaplan's Jewish and educational philosophy. Generally the goals of individualism, nationalism, and universalism are seen as mutually exclusive. By contrast, Kaplan argues for the symbiotic relationship between…

  12. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    Science.gov (United States)

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

The probability of an aquatic animal being available for detection is typically less than one, and accounting for the probability of detection is important for obtaining robust estimates of the population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection is dependent on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased 6-7 fold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency in these two locations was much greater, and the effect of correcting for depth-specific detection probability was much smaller. The methodology has application to visual surveys of coastal megafauna, including surveys using Unmanned …
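The correction applied above is, in spirit, a Horvitz-Thompson-style division of counts by a stratum-specific availability probability. A toy sketch, with hypothetical depth strata, counts, and probabilities:

```python
# Horvitz-Thompson-style correction: divide each stratum's sighting count by
# that stratum's availability probability (all numbers hypothetical).
def corrected_abundance(counts_by_depth, availability_by_depth):
    return sum(c / availability_by_depth[d] for d, c in counts_by_depth.items())

counts = {"shallow": 40, "deep": 15}       # animals sighted per depth stratum
avail  = {"shallow": 0.8, "deep": 0.2}     # probability of being detectable
print(corrected_abundance(counts, avail))  # 40/0.8 + 15/0.2 = 125.0
```

The deep stratum contributes most of the corrected total even though it contributed few sightings, which is exactly why the Torres Strait estimates rose 6-7 fold once depth-specific availability was used.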

  13. Shifting paradigms in the estimation of survival for castration-resistant prostate cancer: A tertiary academic center experience.

    Science.gov (United States)

    Afshar, Mehran; Evison, Felicity; James, Nicholas D; Patel, Prashant

    2015-08-01

Castration-resistant prostate cancer (CRPC) has retained a guarded prognosis, with historical survival estimates of 18 to 24 months. However, the landscape of available therapy has changed, and the emphasis has altered from supportive to active treatment. Few large series from real-world populations exist in the contemporary era with fully mature survival data to confirm the indication from clinical trials that patients with CRPC are surviving far longer than the historical estimates. We aim to review a large patient cohort with CRPC and provide mature survival data. Using the electronic histopathology database at Queen Elizabeth Hospital, Birmingham, UK, all prostate-specific antigen test results between April 2006 and September 2007 were extracted, and patients satisfying the American Society for Radiation Oncology (ASTRO) definition of hormone failure were identified. Electronic records were reviewed and variables were collected, including survival, treatment, biochemistry, histopathology, and demographics. Probability of survival, and of developing metastasis or CRPC, was determined using the Kaplan-Meier method. Patients were stratified into 3 groups, namely, D0 (no metastasis at diagnosis but later appearance), D1 (no metastasis at diagnosis or at last follow-up), and D2 (metastasis at diagnosis). From 8,062 patient-prostate-specific antigen episodes, we identified 447 patients meeting the criteria. A notes review revealed 147 patients with CRPC. Median overall survival (OS) from diagnosis was 84.7 months (95% CI: 73-89), and 129 deaths had occurred (88%). Median OS from diagnosis for D0, D1, and D2 patients was 100.4, 180.1, and 58.9 months, respectively (P …. These data benefit clinicians and patients in understanding prognosis and treatment choices. Importantly, our patients were diagnosed before the current wave of novel therapeutics for CRPC, so survival for men diagnosed today may be longer than our findings suggest. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Numerical investigation of hub clearance flow in a Kaplan turbine

    Science.gov (United States)

    Wu, H.; Feng, J. J.; Wu, G. K.; Luo, X. Q.

    2012-11-01

In this paper, the flow field in a Kaplan turbine, including the hub clearance flow, has been investigated using the commercial CFD code ANSYS CFX, based on high-quality structured grids generated by ANSYS ICEM CFD. The turbulence is simulated by the k-ω based shear stress transport (SST) turbulence model together with automatic near-wall treatments. Four kinds of simulations have been conducted: for the runner geometry without hub clearance, with only the front hub clearance, with only the rear hub clearance, and with both front and rear clearances. The analysis of the obtained results focuses on the structure of the hub clearance flow and its effect on turbine performance, including hydraulic efficiency and cavitation performance, which can improve the understanding of the flow field in a Kaplan turbine.

  15. Numerical investigation of hub clearance flow in a Kaplan turbine

    International Nuclear Information System (INIS)

    Wu, H; Feng, J J; Wu, G K; Luo, X Q

    2012-01-01

In this paper, the flow field in a Kaplan turbine, including the hub clearance flow, has been investigated using the commercial CFD code ANSYS CFX, based on high-quality structured grids generated by ANSYS ICEM CFD. The turbulence is simulated by the k-ω based shear stress transport (SST) turbulence model together with automatic near-wall treatments. Four kinds of simulations have been conducted: for the runner geometry without hub clearance, with only the front hub clearance, with only the rear hub clearance, and with both front and rear clearances. The analysis of the obtained results focuses on the structure of the hub clearance flow and its effect on turbine performance, including hydraulic efficiency and cavitation performance, which can improve the understanding of the flow field in a Kaplan turbine.

  16. Deciphering the Adaptive Immune Response to Ovarian Cancer

    Science.gov (United States)

    2013-10-01

positive for CD8+ TIL. Conversely, of the cases that were positive for CD8+ TIL, approximately half also contained CD20+ TIL. By Kaplan-Meier analysis… and D). Scale bars: 100 mm (A and B) or 50 mm (C and D). Representative of 5 tumor samples. E, Kaplan-Meier curves showing that the presence of both… CH, Subramanian S, van de Rijn M, Turbin D, et al. Intraepithelial T cells and prognosis in ovarian carcinoma: novel associations with stage, tumor

  17. Evaluation and comparison of estimation methods for failure rates and probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Vaurio, Jussi K. [Fortum Power and Heat Oy, P.O. Box 23, 07901 Loviisa (Finland)]. E-mail: jussi.vaurio@fortum.com; Jaenkaelae, Kalle E. [Fortum Nuclear Services, P.O. Box 10, 00048 Fortum (Finland)

    2006-02-01

An updated parametric robust empirical Bayes (PREB) estimation methodology is presented as an alternative to several two-stage Bayesian methods used to assimilate failure data from multiple units or plants. PREB is based on prior-moment matching and avoids multi-dimensional numerical integrations. The PREB method is presented for failure-truncated and time-truncated data. Erlangian and Poisson likelihoods with a gamma prior are used for failure rate estimation, and binomial data with a beta prior are used for estimating the failure probability per demand. Combined models and assessment uncertainties are accounted for. One objective is to compare several methods with numerical examples and to show that PREB works as well as, if not better than, the alternative more complex methods, especially in demanding problems of small samples, identical data and zero failures. False claims and misconceptions are straightened out, and practical applications in risk studies are presented.
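For the per-demand branch mentioned above (binomial data with a beta prior), the conjugate update can be sketched as follows. This is the textbook beta-binomial rule rather than PREB's prior-moment matching, and the prior parameters and counts are hypothetical.

```python
# Beta prior + binomial likelihood: k failures in N demands gives a
# beta(a + k, b + N - k) posterior; return its mean (a, b, k, N hypothetical).
def posterior_failure_prob(a, b, k, N):
    return (a + k) / (a + b + N)

# Jeffreys prior (a = b = 0.5) with zero failures in 100 demands
print(posterior_failure_prob(0.5, 0.5, 0, 100))   # ~0.00495 per demand
```

Note how the zero-failure case still yields a nonzero estimate, which is the practical point of the empirical-Bayes machinery in sparse-data problems.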

  18. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

Importance sampling (IS) is a useful simulation technique for estimating a critical probability with better accuracy than Monte Carlo methods. It consists in generating random weighted samples from an auxiliary distribution rather than the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, one able to generate more of the rare random events. In practice, the optimisation of this auxiliary distribution is often very difficult. In this article, we propose to approach the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of the spatial launcher impact position, which has become an increasingly important issue in the field of aeronautics.
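The IS idea can be illustrated on a textbook rare event: estimating the Gaussian tail probability P(X > 4) by sampling from an auxiliary density shifted into the rare region. This stand-in example is not the launcher-impact model itself, and the shift choice here is the simple mean-translation heuristic, not NAIS.

```python
import math
import random

# Importance-sampling sketch: estimate P(X > c) for X ~ N(0, 1) by drawing
# from the auxiliary density N(c, 1) centred on the rare region.
def is_tail_prob(c, n=100_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(c, 1.0)                 # sample from the auxiliary PDF
        if y > c:
            # likelihood ratio phi(y) / phi(y - c) = exp(-c*y + c^2 / 2)
            total += math.exp(-c * y + c * c / 2.0)
    return total / n

print(is_tail_prob(4.0))   # close to the exact tail probability 3.17e-5
```

A crude Monte Carlo run of the same size would see only a handful of exceedances of c = 4, whereas roughly half of the auxiliary samples land in the tail, which is why the weighted estimate is far more accurate.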

19. The 'Revenge of Geography' According to Robert D. Kaplan

    NARCIS (Netherlands)

    Mamadouh, V.

    2013-01-01

The American writer Robert D. Kaplan has a new bestseller, The Revenge of Geography, in which he sets out the importance of geography for international politics. His call for more attention to geography, however, is very one-sided and a denial of everything our discipline stands for

  20. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    Science.gov (United States)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

Sagar Island, situated on the continental shelf of the Bay of Bengal, is one of the most vulnerable deltas to the occurrence of extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best fit distribution models for the annual, seasonal and monthly time series based on maximum rank with minimum value of test statistics, three statistical goodness-of-fit tests, viz. the Kolmogorov-Smirnov test (K-S), the Anderson-Darling test (A²) and the Chi-square test (χ²), were employed. The best fit probability distribution was identified from the highest overall score obtained from the three goodness-of-fit tests. Results revealed that the normal probability distribution was best fitted for annual, post-monsoon and summer season MDR, while the Lognormal, Weibull and Pearson 5 distributions were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3% levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85%) for MDR of >100 mm and moderate probabilities (37 to 46%) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
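Before any parametric fit, exceedance probabilities and return periods can be read off empirically using Weibull plotting positions; this sketch uses hypothetical MDR values (mm) and is only the nonparametric first step, not the distribution fitting itself.

```python
# Empirical exceedance probabilities and return periods via Weibull plotting
# positions (hypothetical maximum daily rainfall values, mm).
def exceedance_table(sample):
    xs = sorted(sample, reverse=True)
    n = len(xs)
    # rank m (1 = largest): exceedance prob m/(n+1), return period (n+1)/m
    return [(x, m / (n + 1), (n + 1) / m) for m, x in enumerate(xs, start=1)]

mdr = [52, 61, 48, 95, 110, 73, 88, 66, 57, 130]
for x, p, T in exceedance_table(mdr)[:3]:
    print(f"{x} mm: exceedance probability {p:.2f}, return period {T:.1f} yr")
```

A fitted distribution then smooths these empirical points and allows extrapolation to return periods longer than the record, which is what the 20- and 25-year estimates above require.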

  1. A method for the estimation of the probability of damage due to earthquakes

    International Nuclear Information System (INIS)

    Alderson, M.A.H.G.

    1979-07-01

    The available information on seismicity within the United Kingdom has been combined with building damage data from the United States to produce a method of estimating the probability of damage to structures due to the occurrence of earthquakes. The analysis has been based on the use of site intensity as the major damage producing parameter. Data for structural, pipework and equipment items have been assumed and the overall probability of damage calculated as a function of the design level. Due account is taken of the uncertainties of the seismic data. (author)

  2. Estimating the Probability of a Rare Event Over a Finite Time Horizon

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; L'Ecuyer, Pierre; Rubino, Gerardo; Tuffin, Bruno

    2007-01-01

We study an approximation for the zero-variance change of measure to estimate the probability of a rare event in a continuous-time Markov chain. The rare event occurs when the chain reaches a given set of states before some fixed time limit. The jump rates of the chain are expressed as functions of …

  3. Censoring approach to the detection limits in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.

    2004-01-01

We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which limits the determination of very low concentrations of trace elements and results in the appearance of so-called 'nondetects', can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left randomly censored data, which can further be analyzed using the Kaplan-Meier method, correcting the data for the presence of nondetects. Using this approach, the results of measured, detection-limit-censored concentrations can be interpreted in a nonparametric manner including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique we show that with the Kaplan-Meier approach the corrected mean concentrations for a population of samples can be estimated within a few percent uncertainty with respect to the simulated, uncensored data. This practically means that the final uncertainties of the estimated mean values are limited by the number of studied samples and not by the correction procedure itself. The discussed random left-censoring approach was applied to analyze XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.

  4. Annotated corpus and the empirical evaluation of probability estimates of grammatical forms

    Directory of Open Access Journals (Sweden)

    Ševa Nada

    2003-01-01

Full Text Available The aim of the present study is to demonstrate the usage of an annotated corpus in the field of experimental psycholinguistics. Specifically, we demonstrate how the manually annotated Corpus of Serbian Language (Kostić, Đ. 2001) can be used for probability estimates of grammatical forms, which allow the control of independent variables in psycholinguistic experiments. We address the issue of processing Serbian inflected forms within two subparadigms of feminine nouns. In regression analysis, almost all processing variability of inflected forms has been accounted for by the amount of information (i.e. bits) carried by the presented forms. In spite of the fact that probability distributions of inflected forms for the two paradigms differ, it was shown that the best prediction of processing variability is obtained by the probabilities derived from the predominant subparadigm, which encompasses about 80% of feminine nouns. The relevance of annotated corpora in experimental psycholinguistics is discussed in more detail.
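The "amount of information in bits" used as the predictor above is the surprisal of a form's probability, -log2(p). A sketch with hypothetical relative frequencies of Serbian case forms:

```python
import math

# Surprisal in bits of a grammatical form with relative frequency p
# (the form labels and frequencies below are hypothetical).
def bits(p):
    return -math.log2(p)

for form, p in {"nom.sg": 0.40, "gen.sg": 0.25, "dat.sg": 0.10}.items():
    print(form, round(bits(p), 2))
```

Rarer forms carry more bits, so under the abstract's regression account they should take longer to process.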

  5. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

Full Text Available System degradation is usually caused by the degradation of multiple parameters. Assessments of system reliability by the universal generating function are of low accuracy compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on multi-parameter probability density evolution is therefore presented for complexly degraded systems. First, the system output function is formulated from the transitive relation between component parameters and the system output performance. Then, a probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving this differential equation. Finally, the reliability of the degraded system is estimated. The method does not require discretising the performance parameters and yields a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.

  6. Prognostic significance of transient myocardial ischaemia after first acute myocardial infarction: five year follow up study

    DEFF Research Database (Denmark)

    Mickley, H; Nielsen, J R; Berning, J

    1995-01-01

infarction. MAIN OUTCOME MEASURES: Relation of ambulatory ST segment depression, exercise test variables, and left ventricular ejection fraction to subsequent objective (cardiac death or myocardial infarction) or subjective (need for coronary revascularisation) events. RESULTS: 23 of the 123 patients had… an association between transient ST segment depression and an adverse long term outcome was found (Kaplan-Meier analysis; P = 0.004). The presence of exercise induced angina identified a similar proportion of patients with a poor prognosis (Kaplan-Meier analysis; P … ST segment depression had high specificity but poor sensitivity. The presence of exercise induced ST segment depression was of no value in predicting combined cardiac events. Indeed, patients without exertional ST segment depression were at increased risk of future objective end points (Kaplan…

  7. Site Specific Probable Maximum Precipitation Estimates and Professional Judgement

    Science.gov (United States)

    Hayes, B. D.; Kao, S. C.; Kanney, J. F.; Quinlan, K. R.; DeNeale, S. T.

    2015-12-01

State and federal regulatory authorities currently rely upon the US National Weather Service Hydrometeorological Reports (HMRs) to determine probable maximum precipitation (PMP) estimates (i.e., rainfall depths and durations) for estimating flooding hazards for relatively broad regions in the US. PMP estimates for the contributing watersheds upstream of vulnerable facilities are used to estimate riverine flooding hazards, while site-specific estimates for small watersheds are appropriate for individual facilities such as nuclear power plants. The HMRs are often criticized due to their limitations on basin size, questionable applicability in regions affected by orographic effects, their lack of consistent methods, and generally by their age. HMR-51, which provides generalized PMP estimates for the United States east of the 105th meridian, was published in 1978 and is sometimes perceived as overly conservative. The US Nuclear Regulatory Commission (NRC) is currently reviewing several flood hazard evaluation reports that rely on site-specific PMP estimates that have been commercially developed. As such, NRC has recently investigated key areas of expert judgement via a generic audit and one in-depth site-specific review as they relate to identifying and quantifying actual and potential storm moisture sources, determining storm transposition limits, and adjusting available moisture during storm transposition. Though much of the approach reviewed was considered a logical extension of the HMRs, two key points of expert judgement stood out for further in-depth review. The first relates primarily to small storms and the use of a heuristic for storm representative dew point adjustment, developed for the Electric Power Research Institute by North American Weather Consultants in 1993, in order to harmonize historic storms for which only 12-hour dew point data were available with more recent storms in a single database. The second issue relates to the use of climatological averages for spatially …

  8. PDE-Foam - a probability-density estimation method using self-adapting phase-space binning

    CERN Document Server

    Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter

    2009-01-01

    Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space in a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multidimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...
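The flavour of self-adapting binning can be conveyed by a much-simplified 1-D analogue that places cell edges at equal-frequency quantiles of the sample; this is only an illustration of sample-adapted cells, not the PDE-Foam variance-minimizing algorithm itself.

```python
# Much-simplified 1-D analogue of adaptive binning: put cell edges at
# equal-frequency quantiles so each cell holds roughly the same number of
# sample points (sample values hypothetical).
def equal_frequency_edges(sample, n_cells):
    xs = sorted(sample)
    n = len(xs)
    return [xs[(i * n) // n_cells] for i in range(n_cells)] + [xs[-1]]

data = [0.1, 0.4, 0.2, 0.9, 0.7, 0.5, 0.3, 0.8]
print(equal_frequency_edges(data, 4))   # [0.1, 0.3, 0.5, 0.8, 0.9]
```

Narrow cells end up where the sample is dense, which is the same qualitative behaviour the cell-adjustment step in the paper aims for in many dimensions.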

  9. Estimation of the four-wave mixing noise probability-density function by the multicanonical Monte Carlo method.

    Science.gov (United States)

    Neokosmidis, Ioannis; Kamalakis, Thomas; Chipouras, Aristides; Sphicopoulos, Thomas

    2005-01-01

    The performance of high-powered wavelength-division multiplexed (WDM) optical networks can be severely degraded by four-wave-mixing- (FWM-) induced distortion. The multicanonical Monte Carlo method (MCMC) is used to calculate the probability-density function (PDF) of the decision variable of a receiver, limited by FWM noise. Compared with the conventional Monte Carlo method previously used to estimate this PDF, the MCMC method is much faster and can accurately estimate smaller error probabilities. The method takes into account the correlation between the components of the FWM noise, unlike the Gaussian model, which is shown not to provide accurate results.

  10. Survival analysis

    International Nuclear Information System (INIS)

    Badwe, R.A.

    1999-01-01

    The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model.
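The product-limit computation behind the Kaplan-Meier method described above can be sketched in a few lines; this is an illustrative hand-rolled version with made-up follow-up times, not code from any of the studies indexed here:

```python
def kaplan_meier(times, events):
    """Product-limit estimate. times: follow-up times;
    events: 1 = event observed, 0 = censored at that time.
    Returns a list of (event_time, survival_estimate) steps."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    idx = 0
    while idx < len(data):
        t = data[idx][0]
        at_t = [e for tt, e in data if tt == t]
        d = sum(at_t)                    # events observed at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk     # conditional survival past t
            curve.append((t, s))
        n_at_risk -= len(at_t)           # events + censorings leave the risk set
        idx += len(at_t)
    return curve
```

With times [1, 2, 2, 3, 4] and event indicators [1, 1, 0, 1, 0], the estimate steps down to 0.8, 0.6 and 0.3 at times 1, 2 and 3 (the tie at t = 2 contributes one event and one censoring).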

  11. Volume-Based F-18 FDG PET/CT Imaging Markers Provide Supplemental Prognostic Information to Histologic Grading in Patients With High-Grade Bone or Soft Tissue Sarcoma

    DEFF Research Database (Denmark)

    Andersen, Kim Francis; Fuglo, Hanna Maria; Rasmussen, Sine Hvid

    2015-01-01

    analysis. Kaplan-Meier survival estimates and log-rank test were used to compare the degree of equality of survival distributions. Prognostic variables with related hazard ratios (HR) were assessed using Cox proportional hazards regression analysis.Forty-one of 92 patients died during follow-up (45%; 12 BS.......05, HR 3.37 [95% CI 1.02-11.11]). No significant results were demonstrated for MTV40%.Volume-based F-18 FDG PET/CT imaging markers in terms of pretreatment estimation of TLG provide supplemental prognostic information to histologic grading, with significant independent properties for prediction...

  12. Reexamining trends in premarital sex in the United States

    OpenAIRE

    Lawrence L. Wu; Steven Martin; Paula England

    2018-01-01

    Background: In a heavily cited paper, Finer (2007) asserted that by age 30, 82% of US women born 1939-1948 engaged in premarital sex, increasing to 94% for those born 1969-1978. Using the same data, our age 30 estimates are 55% and 87% for women born 1939-1948 and 1969-1978. Our analyses thus document strikingly different levels and trends. Methods: We replicate Finer's single-decrement Kaplan-Meier estimates of premarital sex using Cycles 3-6 of the National Survey of Family Growth, the s...

  13. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  14. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.
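The setup studied in this record (noisy pointwise evaluations at independent random points, fitted by discrete least squares) can be illustrated with a small numerical sketch; the target function, noise level and polynomial dimension below are arbitrary choices for the example, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy pointwise evaluations of a target at independent random points in
# [-1, 1] (uniform sampling measure).
def f(x):
    return np.exp(x)

m, n = 200, 6                          # number of samples, polynomial dimension
x = rng.uniform(-1.0, 1.0, m)          # draws from the sampling measure
y = f(x) + rng.normal(0.0, 0.01, m)    # evaluations with zero-mean noise

# Discrete least squares in the monomial basis {1, x, ..., x^(n-1)}
V = np.vander(x, n, increasing=True)
coef, *_ = np.linalg.lstsq(V, y, rcond=None)

# Monte Carlo estimate of the error in mean-square sense w.r.t. the measure
xt = rng.uniform(-1.0, 1.0, 10_000)
err = np.sqrt(np.mean((np.vander(xt, n, increasing=True) @ coef - f(xt)) ** 2))
```

Because the noise has zero mean and small variance, the mean-square error is dominated by the best-approximation error of the degree-5 polynomial space, consistent with the stability conditions the paper analyzes.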

  15. Psychological scaling of expert estimates of human error probabilities: application to nuclear power plant operation

    International Nuclear Information System (INIS)

    Comer, K.; Gaddy, C.D.; Seaver, D.A.; Stillwell, W.G.

    1985-01-01

    The US Nuclear Regulatory Commission and Sandia National Laboratories sponsored a project to evaluate psychological scaling techniques for use in generating estimates of human error probabilities. The project evaluated two techniques: direct numerical estimation and paired comparisons. Expert estimates were found to be consistent across and within judges. Convergent validity was good, in comparison to estimates in a handbook of human reliability. Predictive validity could not be established because of the lack of actual relative frequencies of error (which will be a difficulty inherent in validation of any procedure used to estimate HEPs). Application of expert estimates in probabilistic risk assessment and in human factors is discussed

  16. Numerical simulation of turbulence flow in a Kaplan turbine -Evaluation on turbine performance prediction accuracy-

    International Nuclear Information System (INIS)

    Ko, P; Kurosawa, S

    2014-01-01

    The understanding and accurate prediction of the flow behaviour related to cavitation and pressure fluctuation in a Kaplan turbine are important to design work aimed at enhancing turbine performance, including extending the operational life span and improving turbine efficiency. In this paper, a high-accuracy turbine and cavitation performance prediction method based on the entire flow passage of a Kaplan turbine is presented and evaluated. The two-phase flow field is predicted by solving the Reynolds-averaged Navier-Stokes equations with a volume-of-fluid method tracking the free surface, combined with a Reynolds stress model. The growth and collapse of cavitation bubbles are modelled by the modified Rayleigh-Plesset equation. The prediction accuracy is evaluated by comparison with the model test results of an Ns 400 Kaplan model turbine. The experimentally measured data, including turbine efficiency, cavitation performance, and pressure fluctuation, are accurately predicted. Furthermore, the cavitation occurrence on the runner blade surface and its influence on the hydraulic loss of the flow passage are discussed. The evaluated prediction method for the turbine flow and performance is introduced to facilitate future design and research work on Kaplan-type turbines.

  17. Numerical simulation of turbulence flow in a Kaplan turbine -Evaluation on turbine performance prediction accuracy-

    Science.gov (United States)

    Ko, P.; Kurosawa, S.

    2014-03-01

    The understanding and accurate prediction of the flow behaviour related to cavitation and pressure fluctuation in a Kaplan turbine are important to design work aimed at enhancing turbine performance, including extending the operational life span and improving turbine efficiency. In this paper, a high-accuracy turbine and cavitation performance prediction method based on the entire flow passage of a Kaplan turbine is presented and evaluated. The two-phase flow field is predicted by solving the Reynolds-averaged Navier-Stokes equations with a volume-of-fluid method tracking the free surface, combined with a Reynolds stress model. The growth and collapse of cavitation bubbles are modelled by the modified Rayleigh-Plesset equation. The prediction accuracy is evaluated by comparison with the model test results of an Ns 400 Kaplan model turbine. The experimentally measured data, including turbine efficiency, cavitation performance, and pressure fluctuation, are accurately predicted. Furthermore, the cavitation occurrence on the runner blade surface and its influence on the hydraulic loss of the flow passage are discussed. The evaluated prediction method for the turbine flow and performance is introduced to facilitate future design and research work on Kaplan-type turbines.

  18. The Probability of Default Under IFRS 9: Multi-period Estimation and Macroeconomic Forecast

    Directory of Open Access Journals (Sweden)

    Tomáš Vaněk

    2017-01-01

    Full Text Available In this paper we propose a straightforward, flexible and intuitive computational framework for the multi-period probability of default estimation incorporating macroeconomic forecasts. The concept is based on Markov models, the estimated economic adjustment coefficient and the official economic forecasts of the Czech National Bank. The economic forecasts are taken into account in a separate step to better distinguish between idiosyncratic and systemic risk. This approach is also attractive from the interpretational point of view. The proposed framework can be used especially when calculating lifetime expected credit losses under IFRS 9.
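A Markov-chain version of multi-period PD estimation, as the abstract outlines, can be sketched as follows; the transition matrix is illustrative only, and the paper's economic-adjustment step using macroeconomic forecasts is omitted:

```python
import numpy as np

# One-period rating transition matrix over states (Performing, Watch, Default);
# Default is absorbing. The numbers are invented for illustration, not
# estimated from data.
P = np.array([
    [0.90, 0.08, 0.02],
    [0.30, 0.55, 0.15],
    [0.00, 0.00, 1.00],
])

def cumulative_pd(P, state, horizon):
    """Probability of having defaulted within `horizon` periods,
    starting from `state` (the default state is the last one)."""
    return np.linalg.matrix_power(P, horizon)[state, -1]

# Cumulative PDs for a performing exposure over a three-period lifetime;
# marginal per-period PDs are the increments of this sequence.
cum = [cumulative_pd(P, 0, t) for t in (1, 2, 3)]
```

For the matrix above the cumulative PD grows from 0.02 after one period to 0.05 after two; lifetime expected credit losses under IFRS 9 would then weight exposures by these cumulative probabilities.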

  19. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    Full Text Available The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
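For contrast with Asymptotic Sampling, the crude Monte Carlo baseline against which such methods are compared can be sketched on a limit state with a known failure probability; the function below is a standard textbook example, not one of the article's test functions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Limit-state function g(u): failure when g(u) <= 0. With a standard normal
# input the exact failure probability is 1 - Phi(3), about 1.35e-3.
def g(u):
    return 3.0 - u

N = 1_000_000
u = rng.standard_normal(N)       # crude Monte Carlo sampling scheme
pf_hat = np.mean(g(u) <= 0.0)    # fraction of samples in the failure domain
```

The sampling scheme enters exactly where `standard_normal` is called, which is why the article's comparison of Monte Carlo, optimized LHS and Sobol' designs matters: for small failure probabilities the variance of this estimator is driven by how the Gaussian vectors are generated.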

  20. Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios

    Energy Technology Data Exchange (ETDEWEB)

    Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W

    2005-04-21

    Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, in this paper they introduce a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. This method was benchmarked by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P(236U(d,pf))/P(238U(d,pf)), which serves as a surrogate for the known cross-section ratio of 236U(n,f)/238U(n,f). In addition, the P(238U(d,d′f))/P(236U(d,d′f)) ratio as a surrogate for the 237U(n,f)/235U(n,f) cross-section ratio was measured for the first time in an unprecedented range of excitation energies.

  1. The probability estimate of the defects of the asynchronous motors based on the complex method of diagnostics

    Science.gov (United States)

    Zhukovskiy, Yu L.; Korolev, N. A.; Babanova, I. S.; Boikov, A. V.

    2017-10-01

    This article is devoted to the development of a method for estimating the probability of failure of an asynchronous motor as part of an electric drive with a frequency converter. The proposed method is based on a comprehensive diagnostic method using vibration and electrical characteristics that takes into account the quality of the supply network and the operating conditions. The developed diagnostic system makes it possible to increase the accuracy and quality of diagnoses by determining the probability of failure-free operation of the electromechanical equipment when its parameters deviate from the norm. The system uses artificial neural networks (ANNs). The output of the system for estimating the technical condition consists of probability diagrams of the technical state and a quantitative evaluation of the defects of the asynchronous motor and its components.

  2. Treatment of Benign and Malignant Tracheobronchial Obstruction with Metal Wire Stents: Experience with a Balloon-Expandable and a Self-Expandable Stent Type

    International Nuclear Information System (INIS)

    Rieger, Johannes; Hautmann, Hubert; Linsenmaier, Ulrich; Weber, Cristoph; Treitl, Markus; Huber, R.M.; Pfeifer, Klaus-Juergen

    2004-01-01

    Over the last few years various types of metal wire stents have been increasingly employed in the treatment of both malignant and benign tracheobronchial obstruction. To date, however, few studies have investigated the in vivo properties of different stent types. We implanted 26 balloon-expandable tantalum Strecker stents (18 patients) and 18 self-expandable Wallstents (16 patients) into the tracheobronchial system of 30 patients with combined stenting in 4 patients. Mean age was 51 years (range: 0.5-79 years). Malignant disease was present in 23 patients, benign disease in seven patients. Both patients and individual stents were monitored clinically and radiographically. The probability of stents remaining within the tracheobronchial system, and of their remaining undislocated and uncompressed was calculated using Kaplan-Meier analysis for both stent types. Average stent follow-up time was 112 days until explantation and 115 days until patients' death or discharge. Kaplan-Meier analysis revealed a higher probability for the Wallstent to remain within the tracheobronchial system. Dislocation and compression occurred more rarely. Explantation, however, if desired, was more difficult compared to the Strecker stent. The Wallstent also led to the formation of granulation tissue, especially at the proximal stent end, frequently requiring reintervention. Both stent types proved to be effective therapeutic options in the management of obstructive tracheobronchial disease. The mechanical properties of the Strecker stent seem to be less favorable compared to the Wallstent but removal is easy. For benign disease, however, the Wallstent reveals limitations due to significant side effects

  3. The long-term outcomes of epilepsy surgery

    Science.gov (United States)

    Keller, Simon; Nicolson, Andrew; Biswas, Shubhabrata; Smith, David; Osman Farah, Jibril; Eldridge, Paul; Wieshmann, Udo

    2018-01-01

    Objective Despite modern anti-epileptic drug treatment, approximately 30% of epilepsies remain medically refractory, and for these patients epilepsy surgery may be a treatment option. There have been numerous studies demonstrating good outcomes of epilepsy surgery in the short to medium term; however, there are a limited number of studies looking at the long-term outcomes. The aim of this study was to ascertain the long-term outcome of resective epilepsy surgery in a large neurosurgery hospital in the U.K. Methods This is a retrospective analysis of prospectively collected data. We used the 2001 International League Against Epilepsy (ILAE) classification system to classify seizure freedom and Kaplan-Meier survival analysis to estimate the probability of seizure freedom. Results We included 284 patients who underwent epilepsy surgery (178 anterior temporal lobe resections, 37 selective amygdalohippocampectomies, 33 temporal lesionectomies, 36 extratemporal lesionectomies), with a prospective median follow-up of 5 years (range 1–27). Kaplan-Meier estimates showed that 47% (95% CI 40–58) remained seizure free (apart from simple partial seizures) at 5 years and 38% (95% CI 31–45) at 10 years after surgery. 74% (95% CI 69–80) had a greater than 50% seizure reduction at 5 years and 70% (95% CI 64–77) at 10 years. Patients who had an amygdalohippocampectomy were more likely to have seizure recurrence than patients who had an anterior temporal lobe resection (p = 0.006) or a temporal lesionectomy (p = 0.029). There was no significant difference between extratemporal and temporal lesionectomies. Hippocampal sclerosis was associated with a good outcome but declined in relative frequency over the years. Conclusion The vast majority of patients who were not seizure free experienced at least a substantial and long-lasting reduction in seizure frequency. A positive long-term outcome after epilepsy surgery is possible for many patients and especially those with

  4. Computer Aided Design of the Link-Fork Head-Piston Assembly of the Kaplan Turbine with Solidworks

    Directory of Open Access Journals (Sweden)

    Camelia Jianu

    2010-10-01

    Full Text Available The paper presents the steps for the 3D computer-aided design (CAD) of the link-fork head-piston assembly of the Kaplan turbine, made in SolidWorks. The present paper is a tutorial for the 3D geometry of a Kaplan turbine assembly, dedicated to assembly design, drawing geometry, and drawing annotation.

  5. Prevalence and mortality of cancer among HIV-infected inpatients in Beijing, China.

    Science.gov (United States)

    Yang, Jun; Su, Shu; Zhao, Hongxin; Wang, Dennis; Wang, Jiali; Zhang, Fujie; Zhao, Yan

    2016-02-16

    Cancer is responsible for elevated HIV-related morbidity and mortality. Research on HIV-infected patients with concurrent cancer is rare in China. The purpose of our study was to investigate the prevalence of and risk factors associated with cancer among HIV-infected inpatients in Beijing, and to investigate mortality and its risk factors among HIV-infected inpatients with cancer. Hospital records from a total of 1946 HIV-infected patients were collected from the Beijing Ditan Hospital. The data, from 2008 to 2013, were collected retrospectively. The cancer diagnoses included AIDS-defining cancers (ADC) and non-AIDS-defining cancers (NADC). Logistic regression was used to identify risk factors predicting the concurrence of cancer with HIV. Mortality was examined using Kaplan-Meier estimates and Cox proportional hazards models. 7.7% (149 cases) of all HIV-infected inpatients had concurrent cancer at their first hospital admission; of those, 33.6% (50 cases) had ADCs, and 66.4% (99 cases) had NADCs. The most prevalent NADCs were Hodgkin's lymphoma, gastrointestinal cancer, liver cancer, and lung cancer. Patients who did not accept antiretroviral therapy (ART) were more likely to suffer from cancer [AOR = 2.07 (1.42-3.01), p = 0.001]. Kaplan-Meier curves indicated that the survival probability of HIV-positive cancer patients was significantly lower than that of HIV-positive cancer-free patients (log-rank test). Among patients with cancer, mortality was also higher among those who did not receive ART [AHR = 2.19 (1.84-2.61)]. The prevalence of concurrent cancer among hospitalized HIV-infected patients was 7.7%. Concurrent cancer also increased mortality among HIV-infected patients. ART was protective against concurrent cancer as well as against mortality among HIV-infected cancer patients. These results highlight the importance of promoting cancer screening and early ART initiation among HIV-infected patients.

  6. Rechazo y retrasplante corneal Corneal rejection and re-transplantation

    Directory of Open Access Journals (Sweden)

    Miguel O Mokey Castellanos

    2007-06-01

    Full Text Available A retrospective observational analytical study was conducted on the corneal transplants performed at the Ophthalmology Service of the "Hermanos Ameijeiras" Hospital. Seventy-six patients had graft rejection and were compared with a control group of 89 patients who did not present rejection over a similar period. Keratoconus was the prevailing corneal condition. The highest rejection rate corresponded to keratoherpes (43.5%), whereas the lowest rate was for keratoconus (8.8%). Multiplicity of rejections was analyzed; mostly a single rejection occurred, although the number of rejections was associated with the need for re-transplantation. The results of medical or surgical management were found to be related to the cause of graft rejection. A Kaplan-Meier survival index was estimated, which showed that the occurrence of graft rejection is less probable in the first two years.

  7. Anterolateral Knee Extra-articular Stabilizers: A Robotic Sectioning Study of the Anterolateral Ligament and Distal Iliotibial Band Kaplan Fibers.

    Science.gov (United States)

    Geeslin, Andrew G; Chahla, Jorge; Moatshe, Gilbert; Muckenhirn, Kyle J; Kruckeberg, Bradley M; Brady, Alex W; Coggins, Ashley; Dornan, Grant J; Getgood, Alan M; Godin, Jonathan A; LaPrade, Robert F

    2018-05-01

    The individual kinematic roles of the anterolateral ligament (ALL) and the distal iliotibial band Kaplan fibers in the setting of anterior cruciate ligament (ACL) deficiency require further clarification. This will improve understanding of their potential contribution to residual anterolateral rotational laxity after ACL reconstruction and may influence selection of an anterolateral extra-articular reconstruction technique, which is currently a matter of debate. Hypothesis/Purpose: To compare the role of the ALL and the Kaplan fibers in stabilizing the knee against tibial internal rotation, anterior tibial translation, and the pivot shift in ACL-deficient knees. We hypothesized that the Kaplan fibers would provide greater tibial internal rotation restraint than the ALL in ACL-deficient knees and that both structures would provide restraint against internal rotation during a simulated pivot-shift test. Controlled laboratory study. Ten paired fresh-frozen cadaveric knees (n = 20) were used to investigate the effect of sectioning the ALL and the Kaplan fibers in ACL-deficient knees with a 6 degrees of freedom robotic testing system. After ACL sectioning, sectioning was randomly performed for the ALL and the Kaplan fibers. An established robotic testing protocol was utilized to assess knee kinematics when the specimens were subjected to a 5-N·m internal rotation torque (0°-90° at 15° increments), a simulated pivot shift with 10-N·m valgus and 5-N·m internal rotation torque (15° and 30°), and an 88-N anterior tibial load (30° and 90°). Sectioning of the ACL led to significantly increased tibial internal rotation (from 0° to 90°) and anterior tibial translation (30° and 90°) as compared with the intact state. Significantly increased internal rotation occurred with further sectioning of the ALL (15°-90°) and Kaplan fibers (15°, 60°-90°). At higher flexion angles (60°-90°), sectioning the Kaplan fibers led to significantly greater internal rotation

  8. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed using Fisher's linear discriminant analysis, a support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area under the receiver operating characteristic curve (Az) of 0.9096, which were superior to the results obtained by either Fisher's linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
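The two ingredients named above, kernel-based density estimation and a maximal posterior probability decision, can be combined in a short sketch; the bivariate features below are synthetic stand-ins, not actual VAG signal features:

```python
import numpy as np

rng = np.random.default_rng(2)

def kde(train, x, h=0.5):
    """Gaussian-kernel density estimate at query points x (n, 2),
    built from training samples train (m, 2) with bandwidth h."""
    d2 = ((x[:, None, :] - train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h ** 2)).mean(axis=1) / (2.0 * np.pi * h ** 2)

# Synthetic bivariate feature classes standing in for normal/abnormal signals
normal = rng.normal([0.0, 0.0], 0.7, size=(200, 2))
abnormal = rng.normal([2.0, 2.0], 0.7, size=(200, 2))

def classify(x, prior_normal=0.5):
    # Maximal posterior probability: choose the class with the larger
    # prior-weighted density (the common evidence term cancels).
    p0 = prior_normal * kde(normal, x)
    p1 = (1.0 - prior_normal) * kde(abnormal, x)
    return (p1 > p0).astype(int)  # 0 = normal, 1 = abnormal
```

Because the decision compares prior-weighted class densities point by point, it traces the same kind of contour boundary in the bivariate feature space that the paper visualizes.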

  9. A review and comparison of methods for recreating individual patient data from published Kaplan-Meier survival curves for economic evaluations: a simulation study.

    Science.gov (United States)

    Wan, Xiaomin; Peng, Liubao; Li, Yuanjian

    2015-01-01

    In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers for conducting economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods, 1) the least squares method and 2) the graphical method; and two recently proposed methods, by 3) Hoyle and Henley and 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were also applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more bias was identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty estimate compared with the Hoyle and Henley method. The traditional methods should not be preferred because of their remarkable overestimation. When the Weibull distribution was used for the fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased than the Hoyle and Henley method.
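The "least squares method" listed among the traditional approaches can be illustrated on digitized curve points; the Weibull parameters below are invented for the example, and the points are generated from the model rather than read off a real published curve:

```python
import math
import numpy as np

# Digitized (time, survival) points as would be read off a published KM curve;
# here they are generated from a Weibull (shape 1.5, scale 10) for illustration.
k_true, lam_true = 1.5, 10.0
t = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
S = np.exp(-(t / lam_true) ** k_true)

# Least squares method: linearize log(-log S) = k*log(t) - k*log(lam)
y = np.log(-np.log(S))
A = np.vstack([np.log(t), np.ones_like(t)]).T
(k_hat, b), *_ = np.linalg.lstsq(A, y, rcond=None)
lam_hat = math.exp(-b / k_hat)

# Mean survival time of the fitted Weibull: lam * Gamma(1 + 1/k)
mean_survival = lam_hat * math.gamma(1.0 + 1.0 / k_hat)
```

With noise-free points the fit recovers the parameters exactly; on real digitized curves the neglect of the numbers at risk is precisely what drives the overestimation the review attributes to the traditional methods.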

  10. Can a polynomial interpolation improve on the Kaplan-Yorke dimension?

    International Nuclear Information System (INIS)

    Richter, Hendrik

    2008-01-01

    The Kaplan-Yorke dimension can be derived using a linear interpolation between an h-dimensional Lyapunov exponent λ^(h) > 0 and an (h+1)-dimensional Lyapunov exponent λ^(h+1) < 0. In this Letter, we use a polynomial interpolation to obtain generalized Lyapunov dimensions and study the relationships among them for higher-dimensional systems
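The (linear-interpolation) Kaplan-Yorke dimension that this Letter generalizes can be computed directly from a Lyapunov spectrum; the Lorenz-63 exponents used below are approximate literature values:

```python
from itertools import accumulate

def kaplan_yorke_dimension(lyap):
    """Kaplan-Yorke dimension from a Lyapunov spectrum sorted in
    descending order: D = h + (sum of first h exponents) / |lyap[h]|,
    where h is the largest index whose partial sum is non-negative."""
    sums = list(accumulate(lyap))
    h = max((i + 1 for i, s in enumerate(sums) if s >= 0), default=0)
    if h == 0:
        return 0.0
    if h == len(lyap):
        return float(len(lyap))
    return h + sums[h - 1] / abs(lyap[h])

# Approximate literature values for the Lorenz-63 spectrum
d = kaplan_yorke_dimension([0.906, 0.0, -14.572])
```

For the Lorenz spectrum this gives roughly 2.06, the familiar fractal dimension of the Lorenz attractor; the Letter's polynomial interpolation replaces the linear step between h and h+1.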

  11. Adding gauge fields to Kaplan's fermions

    International Nuclear Information System (INIS)

    Blum, T.; Kaerkkaeinen, L.

    1994-01-01

    We experiment with adding dynamical gauge fields to Kaplan (defect) fermions. In the case of U(1) gauge theory we use an inhomogeneous Higgs mechanism to restrict the 3d gauge dynamics to a planar 2d defect. In our simulations the 3d theory produces the correct 2d gauge dynamics. We measure fermion propagators with dynamical gauge fields. They possess the correct chiral structure. The fermions at the boundary of the support of the gauge field (waveguide) are non-chiral and have a mass two times heavier than that of the chiral modes. Moreover, these modes cannot be excited by a source at the defect, implying that they are dynamically decoupled. We have also checked that the anomaly relation is fulfilled for the case of a smooth external gauge field. (orig.)

  12. Numerical and in-situ investigations of water hammer effects in Drava river Kaplan turbine hydropower plants

    International Nuclear Information System (INIS)

    Bergant, A; Gregorc, B; Gale, J

    2012-01-01

    This paper deals with critical flow regimes that may induce unacceptable water hammer in Kaplan turbine hydropower plants. Water hammer analysis should be performed for normal, emergency and catastrophic operating conditions. Hydropower plants with Kaplan turbines are usually comprised of relatively short inlet and outlet conduits. The rigid water hammer theory can be used for this case. For hydropower plants with long penstocks the elastic water hammer should be used. Some Kaplan turbine units are installed in systems with long open channels. In this case, water level oscillations in the channels should be carefully investigated. Computational results are compared with results of measurements in recently rehabilitated seven Drava river hydroelectric power plants in Slovenia. Water hammer in the six power plants is controlled by appropriate adjustment of the wicket gates and runner blades closing/opening manoeuvres. Due to very long inflow and outflow open channels in Zlatolicje HPP a special vaned pressure regulating device attenuates extreme pressures in Kaplan turbine flow-passage system and controls unsteady flow in both open channels. Comparisons of results include normal operating regimes. The agreement between computed and measured results is reasonable.

  13. Numerical and in-situ investigations of water hammer effects in Drava river Kaplan turbine hydropower plants

    Science.gov (United States)

    Bergant, A.; Gregorc, B.; Gale, J.

    2012-11-01

    This paper deals with critical flow regimes that may induce unacceptable water hammer in Kaplan turbine hydropower plants. Water hammer analysis should be performed for normal, emergency and catastrophic operating conditions. Hydropower plants with Kaplan turbines are usually comprised of relatively short inlet and outlet conduits. The rigid water hammer theory can be used for this case. For hydropower plants with long penstocks the elastic water hammer should be used. Some Kaplan turbine units are installed in systems with long open channels. In this case, water level oscillations in the channels should be carefully investigated. Computational results are compared with results of measurements in recently rehabilitated seven Drava river hydroelectric power plants in Slovenia. Water hammer in the six power plants is controlled by appropriate adjustment of the wicket gates and runner blades closing/opening manoeuvres. Due to very long inflow and outflow open channels in Zlatoličje HPP a special vaned pressure regulating device attenuates extreme pressures in Kaplan turbine flow-passage system and controls unsteady flow in both open channels. Comparisons of results include normal operating regimes. The agreement between computed and measured results is reasonable.

  14. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
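    The Gutenberg–Richter clustering estimate quoted above (≈0.0009 for an M 7 within three days of the ML 4.8 event) can be reproduced with the Reasenberg–Jones aftershock model, in which the rate of events of magnitude ≥ m at time t after a magnitude-Mm event is 10^(a + b(Mm − m)) (t + c)^(−p) and the probability follows from the time-integrated rate. A minimal sketch, assuming the generic California parameters (a = −1.67, b = 0.91, c = 0.05 day, p = 1.08), which are not stated in the abstract:

```python
import math

def rj_rate(t, m, mainshock_m, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Reasenberg-Jones aftershock rate (events/day) of magnitude >= m
    at t days after a mainshock of magnitude mainshock_m."""
    return 10.0 ** (a + b * (mainshock_m - m)) * (t + c) ** (-p)

def prob_at_least_one(m, mainshock_m, t1, t2, a=-1.67, b=0.91, c=0.05, p=1.08):
    """P(>=1 event of magnitude >= m in [t1, t2] days): one minus the
    void probability of a non-homogeneous Poisson process, using the
    closed-form integral of the rate above."""
    k = 10.0 ** (a + b * (mainshock_m - m))
    if abs(p - 1.0) < 1e-9:
        integral = k * math.log((t2 + c) / (t1 + c))
    else:
        integral = k * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return 1.0 - math.exp(-integral)

# Probability that an M>=7 event follows an M 4.8 within three days
p = prob_at_least_one(7.0, 4.8, 0.0, 3.0)
```

With these generic parameters the result is on the order of 0.001, the same ballpark as the 0.0009 figure quoted from the Gutenberg–Richter model.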

  15. Estimating the Probabilities of Default for Callable Bonds: A Duffie-Singleton Approach

    OpenAIRE

    David Wang

    2005-01-01

    This paper presents a model for estimating the default risks implicit in the prices of callable corporate bonds. The model considers three essential ingredients in the pricing of callable corporate bonds: stochastic interest rate, default risk, and call provision. The stochastic interest rate is modeled as a square-root diffusion process. The default risk is modeled as a constant spread, with the magnitude of this spread impacting the probability of a Poisson process governing the arrival of ...
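    The square-root diffusion mentioned for the stochastic interest rate is the CIR process dr = κ(θ − r)dt + σ√r dW. A minimal Euler simulation sketch with full truncation to keep the square root real; all parameter values are illustrative, not from the paper:

```python
import math, random

def simulate_cir(r0, kappa, theta, sigma, t, n_steps, rng):
    """Euler scheme with full truncation for the square-root (CIR)
    diffusion dr = kappa*(theta - r) dt + sigma*sqrt(r) dW."""
    dt = t / n_steps
    r = r0
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        r_pos = max(r, 0.0)  # truncate so sqrt stays real
        r = r + kappa * (theta - r_pos) * dt + sigma * math.sqrt(r_pos * dt) * z
    return max(r, 0.0)

rng = random.Random(42)
paths = [simulate_cir(0.05, 0.5, 0.04, 0.1, 1.0, 250, rng) for _ in range(2000)]
mean_r = sum(paths) / len(paths)  # ~ theta + (r0 - theta) * exp(-kappa)
```

The Monte Carlo mean should sit near the analytical conditional mean θ + (r0 − θ)e^(−κT) ≈ 0.046 for these inputs.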

  16. Dictionary-Based Stochastic Expectation–Maximization for SAR Amplitude Probability Density Function Estimation

    OpenAIRE

    Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B.

    2006-01-01

    International audience; In remotely sensed data analysis, a crucial problem is represented by the need to develop accurate models for the statistics of the pixel intensities. This paper deals with the problem of probability density function (pdf) estimation in the context of synthetic aperture radar (SAR) amplitude data analysis. Several theoretical and heuristic models for the pdfs of SAR data have been proposed in the literature, which have been proved to be effective for different land-cov...

  17. Persistent fatigue in young athletes: measuring the clinical course and identifying variables affecting clinical recovery.

    Science.gov (United States)

    Locke, S; Osborne, M; O'Rourke, P

    2011-02-01

    The objective of this paper is to measure the clinical course (months) in young athletes with persistent fatigue and to identify any covariates affecting the duration of recovery. This was a prospective longitudinal study of 68 athletes; 87% were elite (42 males, 26 females), aged 20.5±3.74 years (SD), who presented with the symptom of persistent fatigue. The collective duration to full clinical recovery was estimated using Kaplan-Meier product-limit curves, and covariates associated with prolonging recovery were identified from Cox proportional hazard models. The median recovery was 5 months (range 1-60 months). The range of presenting symptom duration was 0.5-36 months. The covariates identified were an increased duration of presenting symptoms [hazard ratio (HR), 1.06; 95% confidence interval (CI), 1.02-1.12; P=0.005] and the response of serum cortisol concentration to a standard exercise challenge (HR, 1.92; 95% CI, 1.09-3.38; P=0.03). Delay in recovery was not associated with categories of fatigue that included medical, training-related diagnoses, or other causes. In conclusion, the fatigued athlete represents a significant clinical problem with a median recovery of 5 months, whose collective clinical course to recovery can be estimated by Kaplan-Meier curves and appears to be a continuum. © 2009 John Wiley & Sons A/S.
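    The Kaplan-Meier product-limit curves used here multiply, at each event time, the conditional probability of surviving past that time, with censored athletes leaving the risk set without contributing an event. A hand-rolled sketch on hypothetical recovery data (not the study's data):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate. events[i] is True for an
    observed event at times[i], False for a right-censored time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    curve = []  # (time, S(t)) at each event time
    s = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_this_t = 0
        while i < len(data) and data[i][0] == t:
            at_this_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            s *= 1.0 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= at_this_t
    return curve

# Months to recovery; False marks athletes censored before recovering
times  = [1, 2, 3, 3, 5, 6, 8, 12]
events = [True, True, False, True, True, False, True, True]
km = kaplan_meier(times, events)
```

The median recovery time read off such a curve is the first time at which S(t) drops to 0.5 or below, which is how a "median recovery of 5 months" is obtained from censored data.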

  18. Evaluation of test-strategies for estimating probability of low prevalence of paratuberculosis in Danish dairy herds

    DEFF Research Database (Denmark)

    Sergeant, E.S.G.; Nielsen, Søren S.; Toft, Nils

    2008-01-01

    The aim of this study was to develop a method to estimate the probability of a low within-herd prevalence of paratuberculosis for Danish dairy herds. A stochastic simulation model was developed using the R programming environment. Features of this model included: use of age-specific estimates of test-sensitivity and specificity; use of a distribution of observed values (rather than a fixed, low value) for design prevalence; and estimates of the probability of low prevalence (Pr-Low) based on a specific number of test-positive animals, rather than for a result less than or equal to a specified cut-point number of reactors. Using this model, five herd-testing strategies were evaluated: (1) milk-ELISA on all lactating cows; (2) milk-ELISA on lactating cows <4 years old; (3) milk-ELISA on lactating cows >4 years old; (4) faecal culture on all lactating cows; and (5) milk-ELISA plus faecal culture in series on all lactating cows.

  19. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required by other techniques. In the first scheme, the prior is obtained through a true frequency-based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighted average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least-squares technique.

  20. Meier-Gorlin syndrome: Growth and secondary sexual development of a microcephalic primordial dwarfism disorder

    NARCIS (Netherlands)

    de Munnik, Sonja A.; Otten, Barto J.; Schoots, Jeroen; Bicknell, Louise S.; Aftimos, Salim; Al-Aama, Jumana Y.; van Bever, Yolande; Bober, Michael B.; Borm, George F.; Clayton-Smith, Jill; Deal, Cheri L.; Edrees, Alaa Y.; Feingold, Murray; Fryer, Alan; van Hagen, Johanna M.; Hennekam, Raoul C.; Jansweijer, Maaike C. E.; Johnson, Diana; Kant, Sarina G.; Opitz, John M.; Ramadevi, A. Radha; Reardon, Willie; Ross, Alison; Sarda, Pierre; Schrander-Stumpel, Constance T. R. M.; Sluiter, A. Erik; Temple, I. Karen; Terhal, Paulien A.; Toutain, Annick; Wise, Carol A.; Wright, Michael; Skidmore, David L.; Samuels, Mark E.; Hoefsloot, Lies H.; Knoers, Nine V. A. M.; Brunner, Han G.; Jackson, Andrew P.; Bongers, Ernie M. H. F.

    2012-01-01

    Meier-Gorlin syndrome (MGS) is a rare autosomal recessive disorder characterized by primordial dwarfism, microtia, and patellar aplasia/hypoplasia. Recently, mutations in the ORC1, ORC4, ORC6, CDT1, and CDC6 genes, encoding components of the pre-replication complex, have been identified. This complex

  1. Meier-Gorlin syndrome: Growth and secondary sexual development of a microcephalic primordial dwarfism disorder

    NARCIS (Netherlands)

    de Munnik, S.A.; Otten, B.J.; Schoots, J.; Bicknell, L.S.; Aftimos, S.; Al-Aama, J.Y.; van Bever, Y.; Bober, M.B.; Borm, G.F.; Clayton-Smith, J.; Deal, C.L.; Edrees, A.Y.; Feingold, M.; Fryer, A.; van Hagen, J.M.; Hennekam, R.C.M.; Jansweijer, M.C.E.; Johnson, D.; Kant, S.G.; Opitz, J.M.; Ramadevi, A.R.; Reardon, W.; Ross, A.; Sarda, P.; Schrander-Stumpel, C.T.R.M.; Sluiter, A.E.; Temple, I.K.; Terhal, P.A.; Toutain, A.; Wise, C.A.; Wright, M.; Skidmore, D.L.; Samuels, M.E.; Hoefsloot, L.H.; Knoers, N.V.A.M.; Brunner, H.G.; Jackson, A.P.; Bongers, M.H.F.

    2012-01-01

    Meier-Gorlin syndrome (MGS) is a rare autosomal recessive disorder characterized by primordial dwarfism, microtia, and patellar aplasia/hypoplasia. Recently, mutations in the ORC1, ORC4, ORC6, CDT1, and CDC6 genes, encoding components of the pre-replication complex, have been identified. This

  2. XRF and XANES Data for Kaplan U Paper

    Science.gov (United States)

    The dataset contains two XRF images of iron and uranium distribution on plant roots and a database of XANES data used to produce the XANES spectra figure (Figure 7) in the published paper. This dataset is associated with the following publication: Kaplan, D., R. Kukkadapu, J. Seaman, B. Arey, A. Dohnalkova, S. Buettner, D. Li, T. Varga, K. Scheckel, and P. Jaffe. Iron Mineralogy and Uranium-Binding Environment in the Rhizosphere of a Wetland Soil. Science of the Total Environment 569: 53-64 (2016).

  3. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Background: In the case of an autosomal locus, four transmission events from the parents to progeny are possible, specified by the grandparental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential for QTL detection methods. Results: A fast algorithm for the estimation of these probabilities conditional on parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular to designs composed of half- and/or full-sib families. It assumes the absence of interference. Conclusion: The theory is fully developed and an example is given.

  4. Analysis of disability onset of the elderly in Catalonia

    Directory of Open Access Journals (Sweden)

    Bermúdez Morata, Lluís

    2008-01-01

    This paper studies the time to disability in activities of daily living among the population of Catalonia aged over 60 years. The risk of suffering a disability is higher for this age group than for younger ones and, moreover, increases with age. The onset of disability reduces the ability to perform the various activities of daily living, so we focus on the time until a 60-year-old person who shows no disability becomes disabled, identifying factors linked to increases in the risk of disability. Using data from the Survey on Disabilities, Impairments and Health Status (EDDES, INE 1999) and the Kaplan-Meier estimator, survival functions are estimated that allow probabilities related to the age of disability onset to be calculated. In addition, a Weibull regression model is fitted in order to interpret how, and to what extent, individual characteristics affect the disability risk.

  5. Structural health monitoring and probability of detection estimation

    Science.gov (United States)

    Forsyth, David S.

    2016-02-01

    Structural health monitoring (SHM) methods are often based on nondestructive testing (NDT) sensors and are often proposed as replacements for NDT to lower cost and/or improve reliability. In order to take advantage of SHM for life cycle management, it is necessary to determine the Probability of Detection (POD) of the SHM system just as for traditional NDT to ensure that the required level of safety is maintained. Many different possibilities exist for SHM systems, but one of the attractive features of SHM versus NDT is the ability to take measurements very simply after the SHM system is installed. Using a simple statistical model of POD, some authors have proposed that very high rates of SHM system data sampling can result in high effective POD even in situations where an individual test has low POD. In this paper, we discuss the theoretical basis for determining the effect of repeated inspections, and examine data from SHM experiments against this framework to show how the effective POD from multiple tests can be estimated.
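    The "simple statistical model" for repeated inspections referred to above is usually the independent-trials formula POD_eff = 1 − (1 − POD)^n; the independence assumption is exactly what the paper examines against experimental data. A sketch:

```python
def effective_pod(single_pod, n_inspections):
    """POD after n statistically independent inspections, any one of
    which may detect the flaw. Independence is the key (optimistic)
    assumption discussed in the text."""
    return 1.0 - (1.0 - single_pod) ** n_inspections

# Ten low-POD looks at the same flaw
p = effective_pod(0.30, 10)
```

Under independence, ten inspections with an individual POD of only 0.30 would yield an effective POD above 0.97, which illustrates why correlated misses (a flaw the sensor systematically cannot see) must be ruled out before claiming such gains.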

  6. Fatigue Analysis of an Outer Bearing Bush of a Kaplan Turbine

    Directory of Open Access Journals (Sweden)

    Doina Frunzaverde

    2011-01-01

    The paper presents the fatigue analysis of an outer bearing bush of a Kaplan turbine. This outer bush, together with an inner one, bears the pin lever - trunnion - blade subassembly of the runner blade operating mechanism. For modeling and simulation, SolidWorks software is used.

  7. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  8. Estimating the population size and colony boundary of subterranean termites by using the density functions of directionally averaged capture probability.

    Science.gov (United States)

    Su, Nan-Yao; Lee, Sang-Hee

    2008-04-01

    Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: exponential decline phase, linear decline phase, equilibrium phase, and postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.
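    As one simple reading of the procedure described (a sketch, not the authors' exact estimator), P(e) can be taken as the intercept of an ordinary least-squares fit over the equilibrium phase, and a Lincoln-Petersen-style population projection made from it. All numbers below are hypothetical:

```python
def linregress(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical directionally averaged capture probabilities P(x) during
# the equilibrium phase, at distances x (m) from the release point
xs = [5, 10, 15, 20, 25]
ps = [0.021, 0.0195, 0.0205, 0.020, 0.019]
slope, pe = linregress(xs, ps)  # intercept taken as P(e)

# One simple reading: if each of C captured termites is an independent
# draw with probability pe from a population of size N, then N ~ C / pe
C = 180
n_hat = C / pe
```

In the equilibrium phase the fitted slope should be near zero, so the intercept is essentially the plateau value of P(x).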

  9. Calculating the dielectric anisotropy of nematic liquid crystals: a reinvestigation of the Maier–Meier theory

    International Nuclear Information System (INIS)

    Ran, Zhang; Jun, He; Zeng-Hui, Peng; Li, Xuan

    2009-01-01

    This paper investigates the average dielectric permittivity (ε̄) in the Maier–Meier theory for calculating the dielectric anisotropy (Δε) of nematic liquid crystals. Because ε̄ of nematics has the same expression as the dielectric permittivity of the isotropic state, the Onsager equation for an isotropic dielectric was used to calculate it. The computed ε̄ shows reasonable agreement with the results of the numerical methods used in the literature. Molecular parameters, such as the polarizability and its anisotropy, the dipole moment and its angle with the molecular long axis, were taken from semi-empirical quantum chemistry (MOPAC/AM1) modeling. The calculated values of Δε according to the Maier–Meier equation are in good agreement with the experimental results for the investigated compounds having different core structures and polar substituents.
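    The Maier–Meier relation behind the Δε calculation is Δε = (N h F/ε0)[Δα − F μ²/(2 k_B T)(1 − 3 cos²β)] S, with N the number density, h and F the cavity and reaction-field factors, and S the order parameter. A numerical sketch with illustrative (assumed) molecular parameters, not values from the paper:

```python
import math

EPS0  = 8.854e-12   # vacuum permittivity, F/m
KB    = 1.381e-23   # Boltzmann constant, J/K
DEBYE = 3.336e-30   # C*m per debye

def maier_meier_delta_eps(n_density, h, big_f, d_alpha, mu_debye,
                          beta_deg, s, temp):
    """Maier-Meier dielectric anisotropy:
    delta_eps = (N h F / eps0) * [d_alpha - F mu^2/(2 kB T)(1 - 3 cos^2 beta)] * S
    n_density : molecular number density (1/m^3)
    h, big_f  : cavity and reaction-field factors
    d_alpha   : polarizability anisotropy (SI units, C m^2/V)
    beta_deg  : angle between dipole moment and molecular long axis
    s         : nematic order parameter"""
    mu = mu_debye * DEBYE
    beta = math.radians(beta_deg)
    dipolar = big_f * mu * mu / (2.0 * KB * temp) * (1.0 - 3.0 * math.cos(beta) ** 2)
    return n_density * h * big_f / EPS0 * (d_alpha - dipolar) * s

# Illustrative (assumed) numbers for a strongly polar nematic:
# N = 2.4e27 m^-3, mu = 4 D at beta = 10 deg, S = 0.6, T = 300 K
d_eps = maier_meier_delta_eps(2.4e27, 1.2, 1.5, 1.67e-39, 4.0, 10.0, 0.6, 300.0)
```

For a near-axial dipole (β small), the (1 − 3 cos²β) term is negative, so the dipolar contribution adds to Δα and a large positive Δε results, as for cyano-substituted compounds.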

  10. Serum level of soluble urokinase-type plasminogen activator receptor is a strong and independent predictor of survival in human immunodeficiency virus infection

    DEFF Research Database (Denmark)

    Sidenius, N; Sier, C.F.M.; Ullum, H

    2000-01-01

    The aim of this study was to investigate the levels of soluble uPAR (suPAR) in patients with advanced HIV-1 disease and whether the serum level of suPAR is predictive of clinical outcome. Using an enzyme-linked immunosorbent assay, the level of suPAR was measured retrospectively in serum samples from 314 patients with HIV-1 infection. By Kaplan-Meier and Cox regression analyses, the serum suPAR levels were correlated to survival with AIDS-related death as the end point. High levels of serum suPAR (greater than median) were associated with poor overall survival, and Kaplan-Meier analysis on patients stratified by suPAR level demonstrated a continuous

  11. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter records from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries, and the data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live-resighting and dead-recovery data. An annual survival probability of 0.73 (95% CI = 0.67-0.78; n=3254) was obtained, and should be considered a conservative estimate due to an unknown, though not negligible, tag-loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  12. Perturbative analysis for Kaplan's lattice chiral fermions

    International Nuclear Information System (INIS)

    Aoki, S.; Hirose, H.

    1994-01-01

    Perturbation theory for lattice fermions with domain wall mass terms is developed and is applied to investigate the chiral Schwinger model formulated on the lattice by Kaplan's method. We calculate the effective action for gauge fields to one loop, and find that it contains a longitudinal component even for anomaly-free cases. From the effective action we obtain gauge anomalies and Chern-Simons currents without ambiguity. We also show that the current corresponding to the fermion number has a nonzero divergence and it flows off the wall into the extra dimension. Similar results are obtained for a proposal by Shamir, who used a constant mass term with free boundaries instead of domain walls

  13. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    Science.gov (United States)

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

    Mazzotti and Adams (2004) estimated that rapid deep slip during typically two-week-long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30–100 times relative to the probability during the ∼58 weeks between slip events. Because the corresponding absolute probability remains very low at ∼0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory-observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre-existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading-rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible for effective normal stresses of 10 MPa or more and increases by only a factor of 1.5 for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model, for effective normal stresses greater than or equal to 50 kPa, it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.

  14. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations

    International Nuclear Information System (INIS)

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use.

  15. Estimating occurrence and detection probabilities for stream-breeding salamanders in the Gulf Coastal Plain

    Science.gov (United States)

    Lamb, Jennifer Y.; Waddle, J. Hardin; Qualls, Carl P.

    2017-01-01

    Large gaps exist in our knowledge of the ecology of stream-breeding plethodontid salamanders in the Gulf Coastal Plain. Data describing where these salamanders are likely to occur along environmental gradients, as well as their likelihood of detection, are important for the prevention and management of amphibian declines. We used presence/absence data from leaf litter bag surveys and a hierarchical Bayesian multispecies single-season occupancy model to estimate the occurrence of five species of plethodontids across reaches in headwater streams in the Gulf Coastal Plain. Average detection probabilities were high (range = 0.432–0.942) and unaffected by sampling covariates specific to the use of litter bags (i.e., bag submergence, sampling season, in-stream cover). Estimates of occurrence probabilities differed substantially between species (range = 0.092–0.703) and were influenced by the size of the upstream drainage area and by the maximum proportion of the reach that dried. The effects of these two factors were not equivalent across species. Our results demonstrate that hierarchical multispecies models successfully estimate occurrence parameters for both rare and common stream-breeding plethodontids. The resulting models clarify how species are distributed within stream networks, and they provide baseline values that will be useful in evaluating the conservation statuses of plethodontid species within lotic systems in the Gulf Coastal Plain.
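    The occupancy estimation described above separates the probability that a species occupies a stream reach (ψ) from the probability of detecting it on a survey (p). A deliberately simplified single-species, constant-parameter likelihood (the paper itself uses a hierarchical Bayesian multispecies model) maximised by grid search:

```python
import math

def occupancy_loglik(psi, p, histories):
    """Log-likelihood of a single-season, single-species occupancy model
    with constant occupancy psi and detection p across sites and visits.
    An all-zero history may come from an occupied-but-missed site or an
    unoccupied site, hence the mixture term."""
    ll = 0.0
    for y in histories:
        j, d = len(y), sum(y)
        if d > 0:
            ll += math.log(psi) + d * math.log(p) + (j - d) * math.log(1 - p)
        else:
            ll += math.log(psi * (1 - p) ** j + (1 - psi))
    return ll

def fit_by_grid(histories, steps=200):
    """Crude MLE by evaluating the likelihood on a (psi, p) grid."""
    grid = [(i + 0.5) / steps for i in range(steps)]
    return max((occupancy_loglik(a, b, histories), a, b)
               for a in grid for b in grid)

# Toy detection histories: 1 = detected on that litter-bag survey
hists = [[1, 0, 1], [0, 0, 0], [1, 1, 1], [0, 1, 0], [0, 0, 0], [1, 0, 0]]
ll, psi_hat, p_hat = fit_by_grid(hists)
```

Note that the estimated ψ exceeds the naive fraction of sites with detections, because some all-zero histories are attributed to imperfect detection.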

  16. Application of the Unbounded Probability Distribution of the Johnson System for Floods Estimation

    Directory of Open Access Journals (Sweden)

    Campos-Aranda Daniel Francisco

    2015-09-01

    Design floods are key inputs for sizing new water works and for reviewing the hydrological safety of existing ones. The most reliable method for estimating their magnitudes associated with given return periods is to fit a probabilistic model to the available records of maximum annual flows. Since such a model is at first unknown, several models need to be tested in order to select the most appropriate one according to a statistical index, commonly the standard error of fit. Several probability distributions have shown versatility and consistency of results when processing flood records and therefore their application has been established as a norm or precept. The Johnson system comprises three families of distributions, one of which is the Log-Normal model with three fit parameters, which is also the border between the bounded distributions and those with no upper limit. These families of distributions have four adjustment parameters and converge to the standard normal distribution, so that their predictions are obtained with such a model. Having contrasted the three probability distributions established by precept on 31 historical records of hydrological events, the Johnson system is applied to the same data. The results of the unbounded distribution of the Johnson system (SJU) are compared with the optimal results from the three distributions. It was found that the predictions of the SJU distribution are similar to those obtained with the other models in the low return periods (<1000 years). Because of its theoretical support, the SJU model is recommended for flood estimation.
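    The unbounded Johnson family (SJU) transforms a standard normal variate z through x = ξ + λ sinh((z − γ)/δ), so design floods follow directly from normal quantiles. A sketch with assumed fit parameters, not values from the study:

```python
import math
from statistics import NormalDist

def johnson_su_quantile(T, xi, lam, gamma, delta):
    """Design flood for return period T years from a Johnson SU fit:
    x = xi + lam * sinh((z - gamma) / delta), z = Phi^-1(1 - 1/T)."""
    z = NormalDist().inv_cdf(1.0 - 1.0 / T)
    return xi + lam * math.sinh((z - gamma) / delta)

# Assumed fit parameters for an annual-maximum flow record (m^3/s)
params = dict(xi=250.0, lam=120.0, gamma=-0.8, delta=1.6)
q2 = johnson_su_quantile(2.0, **params)      # median annual flood
q100 = johnson_su_quantile(100.0, **params)  # 100-year design flood
```

Because sinh is unbounded on both sides, the SJU upper tail grows without limit, which is exactly the property that distinguishes it from the bounded Johnson families at high return periods.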

  17. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems at feasible computational cost. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors’ knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in the performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
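    RESTART belongs to the family of multilevel-splitting estimators: trajectories that cross a threshold of the importance function are retried from there, and the rare-event probability is the product of stage-wise conditional probabilities. A toy fixed-effort sketch on a negative-drift Gaussian random walk (not the paper's hybrid-system setting; the importance function here is simply the highest level reached):

```python
import random

def run_until(level, x, t, t_max, rng, drift=-0.2, sd=1.0):
    """Advance the walk from (x, t); return (x, t) at the first crossing
    of `level`, or None if t_max is reached first."""
    while t < t_max:
        x += rng.gauss(drift, sd)
        t += 1
        if x >= level:
            return x, t
    return None

def splitting_probability(levels, n_per_stage, t_max, rng):
    """Multilevel-splitting estimate of P(max_t X_t >= levels[-1]) for a
    negative-drift walk started at 0: the product of the estimated
    conditional crossing probabilities between successive levels."""
    starts = [(0.0, 0)]
    p_hat = 1.0
    for level in levels:
        hits = []
        for i in range(n_per_stage):
            x, t = starts[i % len(starts)]  # clone survivors round-robin
            hit = run_until(level, x, t, t_max, rng)
            if hit is not None:
                hits.append(hit)
        if not hits:
            return 0.0
        p_hat *= len(hits) / n_per_stage
        starts = hits
    return p_hat

rng = random.Random(7)
p = splitting_probability([2.0, 4.0, 6.0], 2000, 100, rng)
```

Each stage estimates a moderate conditional probability instead of one tiny overall probability, which is the source of the variance reduction over crude Monte Carlo.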

  18. Outcome with lenalidomide plus dexamethasone followed by early autologous stem cell transplantation in patients with newly diagnosed multiple myeloma on the ECOG-ACRIN E4A03 randomized clinical trial: long-term follow-up.

    Science.gov (United States)

    Biran, N; Jacobus, S; Vesole, D H; Callander, N S; Fonseca, R; Williams, M E; Abonour, R; Katz, M S; Rajkumar, S V; Greipp, P R; Siegel, D S

    2016-09-02

    In Eastern Cooperative Oncology Group-ACRIN E4A03, on completion of four cycles of therapy, newly diagnosed multiple myeloma patients had the option of proceeding to autologous peripheral blood stem cell transplant (ASCT) or continuing on their assigned therapy: lenalidomide plus low-dose dexamethasone (Ld) or lenalidomide plus high-dose dexamethasone (LD). This landmark analysis compared the outcomes of the 431 patients who survived their first four cycles of therapy, contrasting those pursuing early ASCT with those continuing on their assigned therapy. Survival distributions were estimated using the Kaplan-Meier method and compared with the log-rank test. Ninety patients (21%) opted for early ASCT. The 1-, 2-, 3-, 4- and 5-year survival probability estimates were higher for early ASCT versus no early ASCT at 99, 93, 91, 85 and 80% versus 94, 84, 75, 65 and 57%, respectively. The median overall survival (OS) in the early versus no early ASCT group was not reached (NR) versus 5.78 years. In patients 50, 0.25). In patients ⩾65 years of age, median OS in the early versus no early ASCT group was NR versus 5.11 years. ASCT dropped out of statistical significance (P=0.080). Patients opting for ASCT after induction Ld/LD had a higher survival probability and improvement in OS regardless of dexamethasone dose density.
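The Kaplan-Meier method used for these survival distributions is compact enough to sketch directly. This is a minimal product-limit implementation (the toy follow-up data are illustrative); censored subjects stay in the risk set up to their censoring time, which is the standard convention.

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g. death) was observed, 0 if censored
    Returns a list of (t, S(t)) pairs at each distinct event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    s, curve = 1.0, []
    for t in sorted(deaths):
        at_risk = sum(1 for u in times if u >= t)  # still under observation
        s *= 1.0 - deaths[t] / at_risk             # conditional survival at t
        curve.append((t, s))
    return curve
```

For example, five subjects with times 1, 2, 3, 4, 5 and event indicators 1, 1, 0, 1, 0 give S(1)=0.8, S(2)=0.6 and S(4)=0.3: the censored subject at t=3 leaves the risk set without forcing a drop in the curve.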

  19. On the method of logarithmic cumulants for parametric probability density function estimation.

    Science.gov (United States)

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach, which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive a general sufficient condition for the strong consistency of the MoLC estimates, which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples from the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small-sample performance of MoLC, notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast, yet not universally applicable, alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
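MoLC matches sample log-cumulants to their analytical expressions. For families like the generalized gamma or K distributions this requires numerically inverting polygamma functions, but the log-normal family makes the idea transparent: since ln X is Gaussian, the first two log-cumulants are exactly mu and sigma squared. A minimal sketch with synthetic data (the parameter values are arbitrary):

```python
import math
import random

def molc_lognormal(sample):
    """Method of log-cumulants for the log-normal family.

    For X ~ LogNormal(mu, sigma), ln X is Gaussian, so the first two
    sample log-cumulants identify the parameters directly:
        c1 = E[ln X]   -> mu
        c2 = Var[ln X] -> sigma**2
    """
    logs = [math.log(x) for x in sample]
    c1 = sum(logs) / len(logs)
    c2 = sum((v - c1) ** 2 for v in logs) / len(logs)
    return c1, math.sqrt(c2)

rng = random.Random(0)
data = [math.exp(rng.gauss(1.5, 0.4)) for _ in range(20000)]
mu_hat, sigma_hat = molc_lognormal(data)
```

For this family MoLC coincides with ML on the log-transformed sample; the harder cases discussed in the paper replace the two closed-form equations above with systems involving digamma and trigamma functions.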

  20. Skull base chordomas: analysis of dose-response characteristics

    International Nuclear Information System (INIS)

    Niemierko, Andrzej; Terahara, Atsuro; Goitein, Michael

    1997-01-01

    Objective: To extract dose-response characteristics from dose-volume histograms and corresponding actuarial survival statistics for 115 patients with skull base chordomas. Materials and Methods: We analyzed data for 115 patients with skull base chordoma treated with combined photon and proton conformal radiotherapy to doses in the range 66.6-79.2 Gy. The data set for each patient included gender, histology, age, tumor volume, prescribed dose, overall treatment time, time to recurrence or time to last observation, target dose-volume histogram, and several dosimetric parameters (minimum/mean/median/maximum target dose, percent of the target volume receiving the prescribed dose, dose to 90% of the target volume, and the Equivalent Uniform Dose (EUD)). Data were analyzed using the Kaplan-Meier survivor function estimate, the proportional hazards (Cox) model, and parametric modeling of the actuarial probability of recurrence. Parameters of dose-response characteristics were obtained using the maximum likelihood method. Results: Local failure developed in 42 patients (36%), with an actuarial local control rate at 5 years of 59.2%. The proportional hazards model revealed a significant dependence on gender, with female patients having a significantly poorer prognosis (hazard ratio of 2.3, p=0.008). The Wilcoxon and log-rank tests of the corresponding Kaplan-Meier recurrence-free survival curves confirmed the statistical significance of this effect. The Cox model with stratification by gender showed significance of tumor volume (p=0.01), the minimum target dose (p=0.02), and the EUD (p=0.02). Other parameters were not significant at the α level of 0.05, including the prescribed dose (p=0.21). Parametric analysis using a combined model of tumor control probability (to account for non-uniformity of the target dose distribution) and the Weibull failure time model (to account for censoring) allowed us to estimate
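The Equivalent Uniform Dose listed among the dosimetric parameters is commonly computed with Niemierko's generalized formula, EUD = (sum_i v_i * D_i**a)**(1/a), over the dose-volume histogram bins. A minimal sketch (the dose bins and the value of a below are illustrative, not from the study):

```python
def eud(dose_bins, volume_fractions, a):
    """Generalized equivalent uniform dose over a dose-volume histogram.

    dose_bins        -- dose received by each sub-volume (Gy)
    volume_fractions -- fractional volume of each bin (must sum to 1)
    a                -- tissue-specific exponent; large negative values
                        weight cold spots heavily, as appropriate for tumors
    """
    assert abs(sum(volume_fractions) - 1.0) < 1e-9
    s = sum(v * d ** a for d, v in zip(dose_bins, volume_fractions))
    return s ** (1.0 / a)
```

A perfectly uniform dose reproduces itself (EUD of a flat 70 Gy distribution is 70 Gy), while a cold spot pulls the EUD down toward the minimum dose when a is negative — which is why EUD can be a better predictor of tumor control than the prescribed dose alone.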

  1. Survival and mortality among users and non-users of hydroxyurea with sickle cell disease.

    Science.gov (United States)

    de Araujo, Olinda Maria Rodrigues; Ivo, Maria Lúcia; Ferreira Júnior, Marcos Antonio; Pontes, Elenir Rose Jardim Cury; Bispo, Ieda Maria Gonçalves Pacce; de Oliveira, Eveny Cristine Luna

    2015-01-01

    to estimate survival, mortality and cause of death among users and non-users of hydroxyurea with sickle cell disease. Cohort study with retrospective data collection, from 1980 to 2010, of patients receiving inpatient treatment in two Brazilian public hospitals. The survival probability was determined using the Kaplan-Meier estimator; survival calculations were performed in SPSS version 10.0 and survival curves were compared using the log-rank method, with a significance level of p=0.05. Of the 63 patients, 87% had sickle cell anemia, 39 of them using hydroxyurea, with a mean time of use of 20.0±10.0 years and a mean dose of 17.37±5.4 to 20.94±7.2 mg/kg/day, raising fetal hemoglobin. In the comparison between those using hydroxyurea and those not, the survival curve was greater among the users (p=0.014). A total of 10 deaths occurred, at a mean age of 28.1 years, with acute respiratory failure as the main cause. The survival curve is greater among users of hydroxyurea. The results indicate the importance of nurses incorporating the therapeutic advances of hydroxyurea into their care actions.
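The log-rank comparison of two survival curves used here tallies, at each event time, the observed events in one group against the number expected under the null hypothesis of identical hazards. A minimal two-group sketch (the follow-up data in the usage example are hypothetical):

```python
import math

def log_rank(times_a, events_a, times_b, events_b):
    """Two-group log-rank test.

    Returns the chi-square statistic (1 df) and its p-value, using
    P(chi2 > x) = erfc(sqrt(x / 2)) for one degree of freedom.
    """
    data = [(t, e, 0) for t, e in zip(times_a, events_a)] + \
           [(t, e, 1) for t, e in zip(times_b, events_b)]
    event_times = sorted({t for t, e, _ in data if e})
    o_minus_e, var = 0.0, 0.0
    for t in event_times:
        n1 = sum(1 for u, _, g in data if u >= t and g == 0)  # at risk, group A
        n2 = sum(1 for u, _, g in data if u >= t and g == 1)  # at risk, group B
        n = n1 + n2
        d1 = sum(1 for u, e, g in data if u == t and e and g == 0)
        d = d1 + sum(1 for u, e, g in data if u == t and e and g == 1)
        o_minus_e += d1 - d * n1 / n          # observed minus expected in A
        if n > 1:                             # hypergeometric variance term
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    stat = o_minus_e ** 2 / var
    return stat, math.erfc(math.sqrt(stat / 2.0))
```

With clearly separated toy groups (all deaths at times 1-4 versus all at times 10-13), the statistic comfortably exceeds the 3.84 critical value at the 0.05 level.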

  2. A cyclostationary multi-domain analysis of fluid instability in Kaplan turbines

    Science.gov (United States)

    Pennacchi, P.; Borghesani, P.; Chatterton, S.

    2015-08-01

    Hydraulic instabilities represent a critical problem for Francis and Kaplan turbines, reducing their useful life through increased fatigue on the components and cavitation phenomena. Whereas an extensive literature on computational fluid-dynamic models of hydraulic instability is available, the possibility of applying diagnostic techniques based on vibration measurements has not been investigated sufficiently, partly because hydro turbine units are seldom equipped with the appropriate sensors. The aim of this study is to fill this knowledge gap and to fully exploit, for this purpose, the potential of combining cyclostationary analysis tools, able to describe complex dynamics such as those of fluid-structure interactions, with order tracking procedures, which allow domain transformations and consequently the separation of synchronous and non-synchronous components. This paper focuses on experimental data obtained on a full-scale Kaplan turbine unit operating in a real power plant, tackling the issues of adapting such diagnostic tools to the analysis of hydraulic instabilities and proposing techniques and methodologies for a highly automated condition monitoring system.

  3. Assessing the Adequacy of Probability Distributions for Estimating the Extreme Events of Air Temperature in Dabaa Region

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2015-01-01

    Assessing the adequacy of probability distributions for estimating the extreme events of air temperature in the Dabaa region is one of the pre-requisites for any design purpose at the Dabaa site, and can be achieved by a probability approach. In the present study, three extreme value distributions are considered and compared to estimate the extreme events of monthly and annual maximum and minimum temperature: the Gumbel/Frechet distributions for estimating the extreme maximum values and the Gumbel/Weibull distributions for estimating the extreme minimum values. The Lieblein technique and the Method of Moments are applied for estimating the distribution parameters. Subsequently, the required design values with a given return period of exceedance are obtained. Goodness-of-fit tests involving Kolmogorov-Smirnov and Anderson-Darling are used for checking the adequacy of fitting the method/distribution for the estimation of maximum/minimum temperature. Mean Absolute Relative Deviation, Root Mean Square Error and Relative Mean Square Deviation are calculated as performance indicators to judge which distribution and method of parameter estimation are the most appropriate for estimating the extreme temperatures. The present study indicated that the Weibull distribution combined with Method of Moments estimators gives the best fit and the most reliable and accurate predictions for the extreme monthly and annual minimum temperature. The Gumbel distribution combined with Method of Moments estimators showed the best fit and accurate predictions for the extreme monthly and annual maximum temperature except for July, August, October and November. The study shows that the combination of the Frechet distribution with the Method of Moments is the most accurate for estimating the extreme maximum temperature in July, August and November, while the Gumbel distribution with the Lieblein technique is the best for October
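The Kolmogorov-Smirnov adequacy check used here reduces to the maximum distance between the empirical CDF and the candidate model's CDF. A minimal sketch with a hypothetical series of annual-minimum temperatures and a moment-fitted normal candidate (note that the classical large-sample 5% critical value 1.36/sqrt(n) assumes fully specified parameters; with fitted parameters the proper Lilliefors threshold is smaller):

```python
import random
from statistics import NormalDist

def ks_statistic(sample, cdf):
    """Two-sided Kolmogorov-Smirnov distance between the empirical CDF
    of `sample` and a candidate model CDF."""
    xs = sorted(sample)
    n = len(xs)
    return max(
        max((i + 1) / n - cdf(x), cdf(x) - i / n)
        for i, x in enumerate(xs)
    )

rng = random.Random(7)
# hypothetical annual-minimum temperature series (deg C), for illustration
sample = [rng.gauss(4.0, 2.5) for _ in range(500)]
fitted = NormalDist.from_samples(sample)        # candidate model
d = ks_statistic(sample, fitted.cdf)
critical = 1.36 / (500 ** 0.5)                  # ~5% level, known parameters
# d < critical -> no reason to reject the fitted distribution
```

Swapping `fitted.cdf` for a Gumbel, Frechet or Weibull CDF reproduces the distribution comparisons described in the record.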

  4. Large mass limit of the continuum theories in Kaplan's formulation

    International Nuclear Information System (INIS)

    Kawano, T.; Kikukawa, Y.

    1994-01-01

    Inspired by Kaplan's proposal for simulating chiral fermions on a lattice, we examine the continuum analogue of his domain-wall construction for two-dimensional chiral Schwinger models. Adopting a slightly unusual dimensional regularization, we explicitly evaluate the one-loop effective action in the limit that the domain-wall mass goes to infinity. For anomaly-free cases, the effective action turns out to be gauge invariant in the two-dimensional sense

  5. Empirical investigation on using wind speed volatility to estimate the operation probability and power output of wind turbines

    International Nuclear Information System (INIS)

    Liu, Heping; Shi, Jing; Qu, Xiuli

    2013-01-01

    Highlights: ► Ten-minute wind speed and power generation data of an offshore wind turbine are used. ► An ARMA–GARCH-M model is built to simultaneously forecast wind speed mean and volatility. ► The operation probability and expected power output of the wind turbine are predicted. ► The integrated approach produces more accurate wind power forecasts than conventional methods. - Abstract: In this paper, we introduce a quantitative methodology that performs interval estimation of wind speed, calculates the operation probability of a wind turbine, and forecasts the wind power output. The technological advantage of this methodology stems from its capability of jointly forecasting the mean and volatility of wind speed. Based on real wind speed and corresponding wind power output data from an offshore wind turbine, this methodology is applied to build an ARMA–GARCH-M model for wind speed forecasting, and then to compute the operation probability and the expected power output of the wind turbine. The results show that the developed methodology is effective, the obtained interval estimation of wind speed is reliable, and the forecasted operation probability and expected wind power output of the turbine are accurate
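The volatility half of such a model is a GARCH(1,1) recursion, which yields the interval estimates of wind speed that feed the operation-probability calculation. A minimal sketch of the variance filter and the resulting interval (the parameter values in the usage below are illustrative only; the paper's full ARMA–GARCH-M model also couples the conditional mean to the volatility):

```python
import math

def garch_filter(returns, omega, alpha, beta):
    """GARCH(1,1) conditional-variance recursion:
        sigma2[t+1] = omega + alpha * r[t]**2 + beta * sigma2[t]
    Returns the in-sample variance path and the one-step-ahead forecast.
    """
    # initialize at the unconditional variance omega / (1 - alpha - beta)
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2[:-1], sigma2[-1]

def interval_estimate(mean_forecast, variance_forecast, z=1.96):
    """95% interval for the next deviation, assuming normal innovations."""
    half = z * math.sqrt(variance_forecast)
    return mean_forecast - half, mean_forecast + half
```

With alpha = beta = 0 the recursion collapses to a constant variance omega, and a run of zero shocks makes the forecast decay geometrically toward omega / (1 - beta) — two easy sanity checks on the implementation.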

  6. Safety, tolerability, and initial efficacy of AZD6140, the first reversible oral adenosine diphosphate receptor antagonist, compared with clopidogrel, in patients with non-ST-segment elevation acute coronary syndrome: primary results of the DISPERSE-2 trial

    DEFF Research Database (Denmark)

    Cannon, Christopher P; Husted, Steen; Harrington, Robert A

    2007-01-01

    , or clopidogrel 300-mg loading dose plus 75 mg once daily for up to 12 weeks. RESULTS: The primary end point, the Kaplan-Meier rate of major or minor bleeding through 4 weeks, was 8.1% in the clopidogrel group, 9.8% in the AZD6140 90-mg group, and 8.0% in the AZD6140 180-mg group (p = 0.43 and p = 0.96, respectively, vs. clopidogrel); the major bleeding rates were 6.9%, 7.1%, and 5.1%, respectively (p = 0.91 and p = 0.35, respectively, vs. clopidogrel). Although not statistically significant, favorable trends were seen in the Kaplan-Meier rates of myocardial infarction (MI) over the entire study period (MI: 5...

  7. Meier-Gorlin syndrome: Report of an additional patient with congenital heart disease

    Directory of Open Access Journals (Sweden)

    Rabah M. Shawky

    2014-10-01

    Full Text Available We report a 7-year-old female child with the classical triad of Meier-Gorlin syndrome (MGS): microtia, absent patellae and short stature. She had the characteristic facial features, with normal mentality and defective speech, skeletal abnormalities, conductive hearing loss, cystitis and a normal growth hormone level. She suffered from recurrent chest infections during the first year of life, which improved gradually with age. Although congenital heart disease is rarely observed in MGS, our patient additionally had a fenestrated interatrial septal defect.

  8. Air injection test on a Kaplan turbine: prototype - model comparison

    Science.gov (United States)

    Angulo, M.; Rivetti, A.; Díaz, L.; Liscia, S.

    2016-11-01

    Air injection is a well-known resource for reducing the magnitude of pressure pulsations in turbines, especially of the Francis type. In the case of large Kaplan designs, even if less usual, it can be a solution to mitigate the vibrations that arise when the tip vortex cavitation phenomenon becomes erosive and induces structural vibrations. In order to study this alternative, aeration tests were performed on a Kaplan turbine at model and prototype scales. The research focused on the effectiveness of different injected air flow rates in reducing vibrations, especially at the draft tube and the discharge ring, and on the magnitude of the efficiency drop. It was found that results at both scales present the same trend, in particular for vibration levels at the discharge ring. The efficiency drop was overestimated in model tests, while on the prototype it was less than 0.2% for all power outputs. On the prototype, air has a beneficial effect in reducing pressure fluctuations up to an air flow rate of 0.2 ‰. On the model, high-speed image processing helped to quantify the volume of tip vortex cavitation, which is strongly correlated with the vibration level. The hydrophone measurements did not capture the cavitation intensity when air was injected; on the prototype, however, it was detected by a sonometer installed in the draft tube access gallery.

  9. The estimation of probable maximum precipitation: the case of Catalonia.

    Science.gov (United States)

    Casas, M Carmen; Rodríguez, Raül; Nieto, Raquel; Redaño, Angel

    2008-12-01

    A brief overview of the different techniques used to estimate the probable maximum precipitation (PMP) is presented. As a particular case, the 1-day PMP over Catalonia has been calculated and mapped with a high spatial resolution. For this purpose, the annual maximum daily rainfall series from 145 pluviometric stations of the Instituto Nacional de Meteorología (Spanish Weather Service) in Catalonia have been analyzed. In order to obtain values of PMP, an enveloping frequency factor curve based on the actual rainfall data of stations in the region has been developed. This enveloping curve has been used to estimate 1-day PMP values of all the 145 stations. Applying the Cressman method, the spatial analysis of these values has been achieved. Monthly precipitation climatological data, obtained from the application of Geographic Information Systems techniques, have been used as the initial field for the analysis. The 1-day PMP at 1 km(2) spatial resolution over Catalonia has been objectively determined, varying from 200 to 550 mm. Structures with wavelength longer than approximately 35 km can be identified and, despite their general concordance, the obtained 1-day PMP spatial distribution shows remarkable differences compared to the annual mean precipitation arrangement over Catalonia.
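The enveloping frequency factor approach described here generalizes Hershfield's statistical PMP formula, PMP = mean + K_m * s, applied to the annual-maximum series at each station. A minimal sketch (the rainfall values are hypothetical; Hershfield's classical envelope K_m = 15 is shown as a default, whereas regional studies such as this one derive their own envelope from local data):

```python
import math

def hershfield_pmp(annual_maxima, k_m=15.0):
    """Statistical PMP estimate from an annual-maximum rainfall series:
        PMP = mean + K_m * s
    where mean and s are the series mean and standard deviation, and K_m
    is an enveloping frequency factor covering all observed maxima.
    """
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in annual_maxima) / (n - 1))
    return mean + k_m * s
```

For a toy series of 80, 100 and 120 mm (mean 100, standard deviation 20), the default envelope gives a PMP of 400 mm; mapping such station values over a grid, as done for Catalonia, is then an interpolation exercise.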

  10. Repair of a Kaplan blade sealing surface without dismantling the turbine; Instandsetzung einer Kaplanschaufel-Dichtflaeche ohne Turbinendemontage

    Energy Technology Data Exchange (ETDEWEB)

    Drygas, A.; Bauer, K. [E.ON Wasserkraft GmbH, Landshut (Germany)

    2008-07-01

    In spite of the aim of minimizing maintenance costs, the runners of Kaplan turbines need to be kept in good repair. Besides preserving their main function as energy converters, ecological requirements have to be met as well; the latter call for fully functional, safe seals on the pivot-mounted Kaplan runner blades. Advanced wear of the sealing surfaces may require mechanical processing, which formerly demanded a costly dismantling of the runner. A newly developed and patented processing device now allows the worn sealing surfaces to be machined without dismantling the runner, reducing costs considerably. The device was first successfully applied to a Kaplan turbine runner with a diameter of 5.35 m. The device, so far designed for grinding, will be enhanced for lathing in order to obtain an even more efficient process combining lathing and grinding. (orig.)

  11. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  12. Estimated probability of postwildfire debris flows in the 2012 Whitewater-Baldy Fire burn area, southwestern New Mexico

    Science.gov (United States)

    Tillery, Anne C.; Matherne, Anne Marie; Verdin, Kristine L.

    2012-01-01

    In May and June 2012, the Whitewater-Baldy Fire burned approximately 1,200 square kilometers (300,000 acres) of the Gila National Forest, in southwestern New Mexico. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from 128 basins burned by the Whitewater-Baldy Fire. A pair of empirical hazard-assessment models developed by using data from recently burned basins throughout the intermountain Western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows along the burned area drainage network and for selected drainage basins within the burned area. The models incorporate measures of areal burned extent and severity, topography, soils, and storm rainfall intensity to estimate the probability and volume of debris flows following the fire. In response to the 2-year-recurrence, 30-minute-duration rainfall, modeling indicated that four basins have high probabilities of debris-flow occurrence (greater than or equal to 80 percent). For the 10-year-recurrence, 30-minute-duration rainfall, an additional 14 basins are included, and for the 25-year-recurrence, 30-minute-duration rainfall, an additional eight basins, 20 percent of the total, have high probabilities of debris-flow occurrence. In addition, probability analysis along the stream segments can identify specific reaches of greatest concern for debris flows within a basin. Basins with a high probability of debris-flow occurrence were concentrated in the west and central parts of the burned area, including tributaries to Whitewater Creek, Mineral Creek, and Willow Creek. Estimated debris-flow volumes ranged from about 3,000-4,000 cubic meters (m3) to greater than 500,000 m3 for all design storms modeled. 
Drainage basins with estimated volumes greater than 500,000 m3 included tributaries to Whitewater Creek, Willow
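The USGS empirical hazard-assessment models of this kind combine burn, terrain, soil and rainfall predictors in a logistic form, P = e^x / (1 + e^x). The sketch below shows only that functional shape: the coefficients and predictor names are hypothetical placeholders for illustration, not the fitted values published for these models.

```python
import math

# Hypothetical coefficients for illustration only; the published models
# provide fitted values for each predictor.
B0, B_BURN, B_SLOPE, B_CLAY, B_RAIN = -0.7, 0.03, 0.02, -0.2, 0.07

def debris_flow_probability(pct_burned_steep, gradient_pct, clay_pct, rain_mm_h):
    """Logistic combination of basin and storm predictors: P = e^x / (1 + e^x)."""
    x = (B0 + B_BURN * pct_burned_steep + B_SLOPE * gradient_pct
         + B_CLAY * clay_pct + B_RAIN * rain_mm_h)
    return math.exp(x) / (1.0 + math.exp(x))
```

Because storm rainfall intensity enters with a positive coefficient, re-evaluating the same basin under the 2-, 10- and 25-year design storms yields increasing probabilities, which is exactly the pattern reported above (4, then 18, then 26 basins crossing the 80-percent threshold).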

  13. Probability estimation of potential harm to human health and life caused by a hypothetical nuclear accident at the nuclear power plant

    International Nuclear Information System (INIS)

    Soloviov, Vladyslav; Pysmenniy, Yevgen

    2015-01-01

    This paper describes some general methodological aspects of the assessment of the damage to human life and health caused by a hypothetical nuclear accident at a nuclear power plant (NPP). Probabilities of death (due to cancer and non-cancer effects of radiation injury), disability and incapacity of individuals were estimated taking into account the regulations of Ukraine. According to the assessment, the probability of death due to cancer and non-cancer effects of radiation damage for individuals who received a radiation dose of 1 Sv is 0.09. The probability of disability of group 1, 2 or 3, regardless of the radiation dose, is 0.009, 0.0054 or 0.027, respectively. The probability of temporary disability of an individual who received a dose of 33 mSv (the level of potential exposure in a hypothetical nuclear accident at the NPP) is 0.16. This probability estimation of potential harm to human health and life caused by a hypothetical nuclear accident can be applied to NPPs in different countries, using the requirements of the regulations in those countries, and also to estimate the amount of insurance payments due to nuclear damage in the event of a nuclear accident at an NPP or other nuclear industry enterprise. (author)

  14. Improved long-term survival after intra-operative single high-dose ATG-Fresenius induction in renal transplantation: a single centre experience.

    Science.gov (United States)

    Kaden, Jürgen; May, Gottfried; Völp, Andreas; Wesslau, Claus

    2009-01-01

    In organ grafts donor-specific sensitization is initiated immediately after revascularization. Therefore, in 1990 we introduced intra-operative single high-dose ATG-Fresenius (ATG-F) induction in addition to standard triple drug therapy (TDT) consisting of steroids, azathioprine and cyclosporin. A total of 778 first renal transplantations from deceased donors, performed between 1987 and 1998, were included in this evaluation. This retrospective analysis of clinic records and electronic databases presents data on all recipients of first kidney grafts who received one of two different ATG-F induction regimens (1st group: 9 mg/kg body weight as a single high dose intra-operatively, n=484; 2nd group: 3 mg/kg body weight on 7 or 8 consecutive days as multiple doses, also starting intra-operatively, n=78) or standard TDT alone (3rd group: TDT alone, n=216). The 10-year patient survival rates were 72.6±2.6% (TDT + ATG-F single high-dose), 79.5±5.1% (TDT + ATG-F multiple-dose) and 67.2±3.7% (TDT alone; Kaplan-Meier estimates with standard errors; ATG-F vs TDT alone, p=0.001). The 10-year graft survival rates with censoring of patients who died with a functioning graft were 73.8±2.4%, 57.7±5.8% and 58.4±3.6% (Kaplan-Meier estimates with standard errors; 1st vs 2nd and 3rd group, respectively, p<0.001), and the 10-year graft survival rates with patient death counted as graft failure were 58.3±2.7%, 55.7±5.8% and 48.2±3.5% (Kaplan-Meier estimates with standard errors; ATG-F single high-dose vs TDT, p=0.023). In pre-sensitized recipients there were also significant differences in favour of ATG-F, more notably with the single high-dose ATG-F induction. A total of 69% of the patients in the two cohorts receiving ATG-F did not experience any transplant rejection, compared to 56% of patients undergoing TDT alone (p=0.018). The incidence of infectious complications was comparable across all groups. 
According to evidence obtained from the routine documentation of 778

  15. An electronic health record-enabled obesity database

    Directory of Open Access Journals (Sweden)

    Wood G

    2012-05-01

    Full Text Available Abstract Background The effectiveness of weight loss therapies is commonly measured using body mass index and other obesity-related variables. Although these data are often stored in electronic health records (EHRs) and potentially very accessible, few studies on obesity and weight loss have used data derived from EHRs. We developed processes for obtaining data from the EHR in order to construct a database on patients undergoing Roux-en-Y gastric bypass (RYGB) surgery. Methods Clinical data obtained as part of standard care in a bariatric surgery program at an integrated health delivery system were extracted from the EHR and deposited into a data warehouse. Data files were extracted, cleaned, and stored in research datasets. To illustrate the utility of the data, Kaplan-Meier analysis was used to estimate the length of post-operative follow-up. Results Demographic, laboratory, medication, co-morbidity, and survey data were obtained from 2028 patients who had undergone RYGB at the same institution since 2004. Pre- and post-operative diagnostic and prescribing information were available on all patients, while survey and laboratory data were available on a majority of patients. The number of patients with post-operative laboratory test results varied by test. Based on Kaplan-Meier estimates, over 74% of patients had post-operative weight data available at 4 years. Conclusion A variety of EHR-derived data related to obesity can be efficiently obtained and used to study important outcomes following RYGB.

  16. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  17. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  18. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Directory of Open Access Journals (Sweden)

    William H. Farmer

    2017-10-01

    New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighboring index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrently with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. With regard to application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.

  19. Exclusive breastfeeding duration and determinants among Brazilian children under two years of age

    Directory of Open Access Journals (Sweden)

    Sarah Warkentin

    2013-06-01

    Full Text Available OBJECTIVE: The present study described the duration and identified the determinants of exclusive breastfeeding. METHODS: The study used data from the Pesquisa Nacional de Demografia e Saúde da Criança e da Mulher 2006 (National Demographic and Health Survey on Women and Children 2006). Data were collected using questionnaires administered by trained professionals and refer to a subsample of 1,704 children aged less than 24 months. The estimated durations of exclusive breastfeeding are presented according to socioeconomic, demographic and epidemiological variables. Kaplan-Meier estimator curves were used to produce valid estimates of breastfeeding duration, and Cox's proportional hazards model was fitted to identify risks. RESULTS: The median estimated duration of exclusive breastfeeding was 60 days. The final Cox model consisted of mother's age <20 years (hazard ratio=1.53, 95% confidence interval=1.11-1.48), use of pacifier (hazard ratio=1.53, 95% confidence interval=1.37-1.71), not residing in the country's southeast region (hazard ratio=1.22, 95% confidence interval=1.07-1.40) and socioeconomic status (hazard ratio=1.28, 95% confidence interval=1.06-1.55). CONCLUSION: The Kaplan-Meier estimator corrected the underestimated duration of breastfeeding in the country as calculated by the current-status methodology. Despite the national efforts of recent decades to promote breastfeeding, the results indicate that the duration of exclusive breastfeeding is still half of that recommended for this dietary practice to promote health. Ways to revert this situation would be ongoing educational activities involving the educational and health systems, associated with advertising campaigns on television and radio, mainly targeting young mothers with low education level and low income, identified as those at high risk of weaning their children early.

  20. Improved method for estimating particle scattering probabilities to finite detectors for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Mickael, M.; Gardner, R.P.; Verghese, K.

    1988-01-01

    An improved method for calculating the total probability of particle scattering within the solid angle subtended by finite detectors is developed, presented, and tested. The limiting polar and azimuthal angles subtended by the detector are measured from the direction that most simplifies their calculation rather than from the incident particle direction. A transformation of the particle scattering probability distribution function (pdf) is made to match the transformation of the direction from which the limiting angles are measured. The particle scattering probability to the detector is estimated by evaluating the integral of the transformed pdf over the range of the limiting angles measured from the preferred direction. A general formula for transforming the particle scattering pdf is derived from basic principles and applied to four important scattering pdf's; namely, isotropic scattering in the Lab system, isotropic neutron scattering in the center-of-mass system, thermal neutron scattering by the free gas model, and gamma-ray Klein-Nishina scattering. Some approximations have been made to these pdf's to enable analytical evaluations of the final integrals. These approximations are shown to be valid over a wide range of energies and for most elements. The particle scattering probability to spherical, planar circular, and right circular cylindrical detectors has been calculated using the new and previously reported direct approach. Results indicate that the new approach is valid and is computationally faster by orders of magnitude

  1. Influence of Body Mass Index on Tumor Pathology and Survival in Uterine Cancer

    DEFF Research Database (Denmark)

    Bjerrum Kristensen, Anne; Hare-Bruun, Helle; Høgdall, Claus Kim

    2017-01-01

    OBJECTIVE: To evaluate the influence of body mass index (BMI) on endometrial tumor pathology, stage and complication rate and to identify individual prognostic factors, such as BMI, in types I and II endometrial cancer. DESIGN: Register study included all Danish women who underwent surgery...... I and II endometrial cancer were retrieved. Kaplan-Meier plot was used to illustrate differences in survival in relation to BMI. Log-rank test was used to demonstrate difference between the curves. Cox regression hazard model was used to estimate hazard ratios (HR) of the effect of BMI on overall...

  2. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    Science.gov (United States)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations, we estimated the age-specific per-contact transmission probability of the Ebola virus disease (EVD) epidemic that swept through Liberia from May 2014 to March 2015. To approximate the epidemic dynamics, we developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group, we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data on key epidemiological variables, such as the mean times to death and recovery and the case fatality rate.

  3. Factors influencing reporting and harvest probabilities in North American geese

    Science.gov (United States)

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI = 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
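The correction the authors describe is simple arithmetic: a harvest probability inferred from direct band recoveries must be divided by the probability that a recovered band is actually reported (and, where relevant, retained). A hedged sketch; the recovery probability below is invented, only the 0.73 reporting probability comes from the abstract.

```python
def harvest_probability(direct_recovery_prob, reporting_prob, retention_prob=1.0):
    """Correct a direct-recovery probability for band reporting and
    (optionally) band retention to approximate the harvest probability."""
    return direct_recovery_prob / (reporting_prob * retention_prob)

# With the mean reporting probability of 0.73 from the abstract, an
# illustrative 5% direct-recovery probability implies a harvest
# probability of roughly 6.8%:
h = harvest_probability(0.05, 0.73)
```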

  4. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    Science.gov (United States)

    Tan, Elcin

    A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecast (WRF-ARW) model. A persistent moisture flux convergence pattern, called Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that Pineapple Express causes extreme precipitation over the basin of interest. An average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the first ranked flood event 1997 case, whereas the WRF model is validated for 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of Chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 microphysics, atmospheric boundary layer, and cumulus parameterization schemes combinations. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and YSU boundary layer (TGY), based on 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation. 
Moreover, climate change effect on precipitation is discussed by emphasizing temperature increase in order to determine the

  5. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri.

    Science.gov (United States)

    2014-01-01

    Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to ann...

  6. Estimating the probability of allelic drop-out of STR alleles in forensic genetics

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2009-01-01

    In crime cases with available DNA evidence, the amount of DNA is often sparse due to the setting of the crime. In such cases, allelic drop-out of one or more true alleles in STR typing is possible. We present a statistical model for estimating the per locus and overall probability of allelic drop-out using the results of all STR loci in the case sample as reference. The methodology of logistic regression is appropriate for this analysis, and we demonstrate how to incorporate this in a forensic genetic framework.
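As a hedged illustration of the logistic-regression setup (the covariate and coefficients below are invented, not the paper's): drop-out becomes less likely as the template signal grows, so the fitted slope on a signal-strength covariate should be negative.

```python
import math, random

def fit_logistic(xs, ys, lr=0.5, iters=3000):
    """Fit P(drop-out | x) = 1/(1+exp(-(b0 + b1*x))) by batch gradient
    ascent on the Bernoulli log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic data: true drop-out probability falls as signal strength rises
# (true slope -2; everything here is a made-up stand-in for STR peak data).
rng = random.Random(0)
xs = [rng.uniform(-2.0, 2.0) for _ in range(300)]
ys = [1 if rng.random() < 1.0 / (1.0 + math.exp(2.0 * x)) else 0 for x in xs]
b0, b1 = fit_logistic(xs, ys)
```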

  7. Tempo até o transplante e sobrevida em pacientes com insuficiência renal crônica no Estado do Rio de Janeiro, Brasil, 1998-2002 Time to kidney transplantation in chronic renal failure patients in the State of Rio de Janeiro, Brazil, 1998-2002

    Directory of Open Access Journals (Sweden)

    Cynthia Braga da Cunha

    2007-04-01

    Full Text Available This study analyzes the characteristics of 14,419 chronic renal failure patients treated with hemodialysis and the time to first kidney transplantation in the State of Rio de Janeiro, Brazil, from 1998 to 2002. Survival analysis methods were used, such as the Kaplan-Meier non-parametric method and the semi-parametric Cox proportional hazards model. Besides the survival model for transplantation, time to death was analyzed to compare the two models' estimates. During the period studied, only 6.3% of patients received transplants, 32.4% were referred for transplantation, and 6.3% were included on the waiting list. Odds of transplantation were greater for those who had been referred, those on the waiting list, and younger patients. Diabetes mellitus reduced the probability of transplantation by 35%. All the estimates showed directions opposite to those obtained from the survival model for death.

  8. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2010-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of such short circuits occurring. Existing risk simulations assume that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event with an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance that leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross-section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.
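At its core, such an estimate is an empirical distribution of breakdown voltages. A minimal sketch (the breakdown voltages below are invented, not the paper's measurements):

```python
def short_probability(breakdown_voltages, applied_v):
    """Empirical probability that a bridging whisker's contact resistance
    breaks down (i.e. shorts) at `applied_v` volts or less: the fraction
    of observed breakdown voltages that do not exceed applied_v."""
    return sum(1 for bv in breakdown_voltages if bv <= applied_v) / len(breakdown_voltages)

observed = [2.0, 5.0, 5.0, 8.0, 12.0]        # hypothetical breakdown voltages
p_at_5v = short_probability(observed, 5.0)   # 3 of 5 observations <= 5 V
```

A fitted parametric curve (e.g. logistic in voltage) would smooth this step function, but the empirical fractions are the raw material either way.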

  9. First-passage Probability Estimation of an Earthquake Response of Seismically Isolated Containment Buildings

    International Nuclear Information System (INIS)

    Hahm, Dae-Gi; Park, Kwan-Soon; Koh, Hyun-Moo

    2008-01-01

    Awareness of seismic hazard and risk is increasing rapidly following the frequent occurrence of huge earthquakes such as the 2008 Sichuan earthquake, which caused about 70,000 confirmed casualties and an economic loss of 20 billion U.S. dollars. Since an earthquake load naturally contains various uncertainties, the safety of a structural system under earthquake excitation has been assessed by probabilistic approaches. In many structural applications for a probabilistic safety assessment, it is often regarded that the failure of a system will occur when the response of the structure first crosses the limit barrier within a specified interval of time. The determination of such a failure probability is usually called the 'first-passage problem' and has been extensively studied during the last few decades. However, especially for structures that show significant nonlinear dynamic behavior, an effective and accurate method for the estimation of such a failure probability is not fully established yet. In this study, we present a new approach to evaluate the first-passage probability of an earthquake response of seismically isolated structures. The proposed method is applied to the seismic isolation system for the containment buildings of a nuclear power plant. From the numerical example, we verified that the proposed method gives accurate results with less computational effort than the conventional approaches.
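A brute-force Monte Carlo baseline for the first-passage problem (the kind of conventional approach such methods are compared against) can be sketched as follows. The Gaussian random walk and barrier levels are illustrative stand-ins for a simulated structural response history, not the paper's model.

```python
import random

def first_passage_prob(barrier, steps=100, trials=5000, seed=7):
    """Monte Carlo estimate of P(response crosses +/- barrier at least
    once within `steps` time steps). A plain Gaussian random walk stands
    in for the (possibly nonlinear) structural response."""
    rng = random.Random(seed)
    crossings = 0
    for _ in range(trials):
        x = 0.0
        for _ in range(steps):
            x += rng.gauss(0.0, 1.0)
            if abs(x) >= barrier:
                crossings += 1
                break
    return crossings / trials

p_low = first_passage_prob(3.0)    # low barrier: crossed almost surely
p_high = first_passage_prob(30.0)  # high barrier: rarely crossed
```

The expense of driving `trials` high enough to resolve small failure probabilities is precisely what motivates more efficient estimators.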

  10. New master's graduates of TÜ, TPÜ and EHI / Eda Tursk, Hille Roots, Heidi Meier

    Index Scriptorium Estoniae

    Tursk, Eda

    2004-01-01

    In 2003, master's theses were defended at the University of Tartu's Department of Estonian and Finno-Ugric Linguistics and Department of Literature and Folklore by Niina Aasmäe, Piret Voll, Larissa Degel, Reet Hendrikson, Tiina Pai, Petar Kehayov, Anna Baidullina, Katrin Ennus, Kristi Jõesaar, Ell Vahtramäe, Lauri Sommer, Andreas Kalkun, Mirjam Hinrikus and Kristel Nõlvak. At Tallinn Pedagogical University, master's theses were defended in 2003 by Sirje Nootre, Merike Mägedi, Tiiu Koovit, Heidi Meier, Jaanika Stackhouse, Lilian Ossi, Annika Vamper, Marika Mikkor, Piret Õunapuu, Helin Puksand and Taimi Rosenberg. At the Estonian Institute of Humanities, Merilin Miljan defended a master's thesis in 2003.

  11. A novel multi-model probability battery state of charge estimation approach for electric vehicles using H-infinity algorithm

    International Nuclear Information System (INIS)

    Lin, Cheng; Mu, Hao; Xiong, Rui; Shen, Weixiang

    2016-01-01

    Highlights: • A novel multi-model probability battery SOC fusion estimation approach was proposed. • The linear matrix inequality-based H∞ technique is employed to estimate the SOC. • The Bayes theorem has been employed to realize the optimal weight for the fusion. • The robustness of the proposed approach is verified by different batteries. • The results show that the proposed method can promote global estimation accuracy. - Abstract: Due to the strong nonlinearity and complex time-variant property of batteries, the existing state of charge (SOC) estimation approaches based on a single equivalent circuit model (ECM) cannot provide the accurate SOC for the entire discharging period. This paper aims to present a novel SOC estimation approach based on a multiple ECMs fusion method for improving the practical application performance. In the proposed approach, three battery ECMs, namely the Thevenin model, the double polarization model and the 3rd order RC model, are selected to describe the dynamic voltage of lithium-ion batteries and the genetic algorithm is then used to determine the model parameters. The linear matrix inequality-based H-infinity technique is employed to estimate the SOC from the three models and the Bayes theorem-based probability method is employed to determine the optimal weights for synthesizing the SOCs estimated from the three models. Two types of lithium-ion batteries are used to verify the feasibility and robustness of the proposed approach. The results indicate that the proposed approach can improve the accuracy and reliability of the SOC estimation against uncertain battery materials and inaccurate initial states.
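The fusion step can be sketched independently of the H-infinity filters: each model receives a posterior weight proportional to its prior times the Gaussian likelihood of its most recent voltage residual, and the fused SOC is the weighted sum. All numbers below (residuals, noise scales, SOC values) are invented for illustration.

```python
import math

def fuse_soc(soc_estimates, residuals, sigmas, priors=None):
    """Bayes-weighted fusion of per-model SOC estimates.
    weight_i is proportional to prior_i * N(residual_i; 0, sigma_i^2)."""
    m = len(soc_estimates)
    priors = priors or [1.0 / m] * m
    likes = [math.exp(-0.5 * (r / s) ** 2) / s for r, s in zip(residuals, sigmas)]
    posts = [p * l for p, l in zip(priors, likes)]
    z = sum(posts)
    weights = [w / z for w in posts]
    fused = sum(w * soc for w, soc in zip(weights, soc_estimates))
    return fused, weights

# Three models (e.g. Thevenin, double-polarization, 3rd-order RC); the
# model with the smallest recent voltage residual dominates the fusion.
fused, w = fuse_soc([0.80, 0.78, 0.72], [0.005, 0.02, 0.08], [0.03, 0.03, 0.03])
```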

  12. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs in order to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
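The probability output of a random forest is simply the fraction of trees voting for the positive class. A toy sketch on the study's two predictors, water table depth and aridity index, with hand-set stumps (thresholds invented, not from the paper; a real forest's trees would be fitted):

```python
def forest_probability(trees, x):
    """Ensemble probability that a location's vegetation is groundwater
    dependent: the fraction of trees voting 1. Each 'tree' is a callable
    x -> {0, 1} standing in for a fitted classification tree."""
    return sum(tree(x) for tree in trees) / len(trees)

# x = (water_table_depth_m, aridity_index); shallow water tables and arid
# climates make groundwater dependence more plausible in this toy setup.
trees = [
    lambda x: 1 if x[0] < 10.0 else 0,
    lambda x: 1 if x[1] < 0.5 else 0,
    lambda x: 1 if x[0] < 5.0 else 0,
    lambda x: 1 if x[0] < 10.0 and x[1] < 1.0 else 0,
]
p_shallow = forest_probability(trees, (3.0, 0.3))   # all four trees vote 1
p_deep = forest_probability(trees, (40.0, 0.3))     # only one tree votes 1
```

Moving the cutoff applied to these vote fractions trades sensitivity against specificity, which is why the paper stresses the consequences of the chosen cutoff probability.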

  13. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more and more established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
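Once a penalized Cox model is fitted, the expected survival probability for a new patient follows from the baseline survival and the patient's linear predictor. A hedged arithmetic sketch (the baseline survival and linear-predictor values are invented):

```python
import math

def cox_survival(baseline_survival_t, linear_predictor):
    """Expected survival probability at a fixed timepoint t under a Cox
    model: S(t | x) = S0(t) ** exp(x'beta). Here `linear_predictor` would
    be the sum of the selected biomarker (and biomarker-by-treatment)
    terms for one patient."""
    return baseline_survival_t ** math.exp(linear_predictor)

s0_at_5y = 0.85                               # hypothetical baseline S0(5 years)
s_protective = cox_survival(s0_at_5y, -0.5)   # lp < 0: better than baseline
s_adverse = cox_survival(s0_at_5y, 0.8)       # lp > 0: worse than baseline
```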

  14. Pesticide mixtures in the Swedish streams: Environmental risks, contributions of individual compounds and consequences of single-substance oriented risk mitigation.

    Science.gov (United States)

    Gustavsson, Mikael; Kreuger, Jenny; Bundschuh, Mirco; Backhaus, Thomas

    2017-11-15

    This paper presents the ecotoxicological assessment and environmental risk evaluation of complex pesticide mixtures occurring in freshwater ecosystems in southern Sweden. The evaluation is based on exposure data collected between 2002 and 2013 by the Swedish pesticide monitoring program and includes 1308 individual samples, detecting mixtures of up to 53 pesticides (modal=8). Pesticide mixture risks were evaluated using three different scenarios for non-detects (best-case, worst-case and using the Kaplan-Meier method). The risk of each scenario was analyzed using Swedish Water Quality Objectives (WQO) and trophic-level specific environmental thresholds. Using the Kaplan-Meier method the environmental risk of 73% of the samples exceeded acceptable levels, based on an assessment using Concentration-Addition and WQOs for the individual pesticides. Algae were the most sensitive organism group. However, analytical detection limits, especially for insecticides, were insufficient to analyze concentrations at or near their WQO's. Thus, the risk of the analyzed pesticide mixtures to crustaceans and fish is systematically underestimated. Treating non-detects as being present at their individual limit of detection increased the estimated risk by a factor 100 or more, compared to the best-case or the Kaplan-Meier scenario. Pesticide mixture risks are often driven by only 1-3 compounds. However, the risk-drivers (i.e., individual pesticides explaining the largest share of potential effects) differ substantially between sites and samples, and 83 of the 141 monitored pesticides need to be included in the assessment to account for 95% of the risk at all sites and years. Single-substance oriented risk mitigation measures that would ensure that each individual pesticide is present at a maximum of 95% of its individual WQO, would also reduce the mixture risk, but only from a median risk quotient of 2.1 to a median risk quotient of 1.8. 
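The sensitivity of the risk quotient to the treatment of non-detects reduces to bounding arithmetic: the best case sets non-detects to zero, the worst case to the limit of detection, and the Kaplan-Meier treatment falls between the two. Concentrations, WQOs and LODs below are invented for illustration.

```python
def mixture_risk_quotient(detected, nondetects, scenario):
    """Concentration-addition risk quotient: sum of concentration/WQO
    ratios. `detected` and `nondetects` are lists of (value, wqo) pairs;
    for a non-detect, the value is its limit of detection (LOD)."""
    rq = sum(c / wqo for c, wqo in detected)
    if scenario == "worst":       # non-detects assumed present at the LOD
        rq += sum(lod / wqo for lod, wqo in nondetects)
    elif scenario != "best":      # best case: non-detects assumed absent
        raise ValueError("scenario must be 'best' or 'worst'")
    return rq

detected = [(0.10, 0.05), (0.02, 0.04)]       # detected pesticides
nondetects = [(0.01, 0.002), (0.05, 0.10)]    # first LOD far above its WQO
rq_best = mixture_risk_quotient(detected, nondetects, "best")
rq_worst = mixture_risk_quotient(detected, nondetects, "worst")
```

The first hypothetical non-detect, whose LOD is five times its WQO, shows how insufficient detection limits alone can inflate the worst-case bound by a factor of 100 or more in the paper's data.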
Also, acceptable total risk levels would still

  15. First hitting probabilities for semi markov chains and estimation

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos

    2017-01-01

    We first consider a stochastic system described by an absorbing semi-Markov chain with finite state space and we introduce the absorption probability to a class of recurrent states. Afterwards, we study the first hitting probability to a subset of states for an irreducible semi-Markov chain...
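For the hitting probabilities, only the embedded jump chain of the semi-Markov process matters, and those probabilities satisfy a linear fixed point h = r + Q h. A minimal sketch on a symmetric gambler's-ruin chain, chosen because its answers are known exactly (this is a generic illustration, not the paper's estimator):

```python
def absorption_probabilities(Q, r, iters=5000):
    """Probability of hitting a target absorbing class from each transient
    state, via fixed-point iteration of h = r + Q h.
    Q[i][j]: transient-to-transient transition probabilities;
    r[i]: one-step probability of entering the target class from i."""
    n = len(r)
    h = [0.0] * n
    for _ in range(iters):
        h = [r[i] + sum(Q[i][j] * h[j] for j in range(n)) for i in range(n)]
    return h

# Symmetric random walk on {0,1,2,3,4}, absorbed at 0 and 4; from state 2
# the probability of reaching 4 before 0 is exactly 1/2.
Q = [[0.0, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.5, 0.0]]
r = [0.0, 0.0, 0.5]   # only state 3 can jump straight to 4
h = absorption_probabilities(Q, r)   # expected: [0.25, 0.5, 0.75]
```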

  16. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric......-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing...... for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately...

  17. Probable mode prediction for H.264 advanced video coding P slices using removable SKIP mode distortion estimation

    Science.gov (United States)

    You, Jongmin; Jeong, Jechang

    2010-02-01

    The H.264/AVC (advanced video coding) is used in a wide variety of applications including digital broadcasting and mobile applications, because of its high compression efficiency. The variable block mode scheme in H.264/AVC contributes much to its high compression efficiency but causes a selection problem. In general, rate-distortion optimization (RDO) is the optimal mode selection strategy, but it is computationally intensive. For this reason, the H.264/AVC encoder requires a fast mode selection algorithm for use in applications that require low-power and real-time processing. A probable mode prediction algorithm for the H.264/AVC encoder is proposed. To reduce the computational complexity of RDO, the proposed method selects probable modes among all allowed block modes using removable SKIP mode distortion estimation. Removable SKIP mode distortion is used to estimate whether or not a further divided block mode is appropriate for a macroblock. It is calculated using a no-motion reference block with a few computations. Then the proposed method reduces complexity by performing the RDO process only for probable modes. Experimental results show that the proposed algorithm can reduce encoding time by an average of 55.22% without significant visual quality degradation and increased bit rate.

  18. Going Large or Going Small in Plant Design: Comparison between a P.P. with three small Kaplan turbines and a P.P. with just one Large Kaplan turbine

    Science.gov (United States)

    Castro-Otero, C.

    2017-04-01

    Very often small turbine manufacturers are asked to produce sizeable turbines, too large in terms of physical dimensions, power or design capacity. In these cases clever alternative solutions must be found to meet customers' needs. For instance, in the old days twin-runner Francis turbines were an option instead of one large machine, and if a too-large Pelton turbine cannot be manufactured or designed, a good option is to install a medium-size Francis and a small Pelton. Likewise, a similar approach needs to be taken should the manufacturer be asked for a too-large Kaplan. Facing this situation, a good option is to install three or more small Kaplan turbines. This particular case was studied in depth and, after all the considerations had been made, the following question arose: Is this a way out for the manufacturer, or is it really the best option for the customer? The choice made as a way out for the manufacturer became the best option for the customer and a success for both parties. This paper aims to encourage developers and engineering firms to look beyond the traditional solution in order to find the best option in plant design.

  19. The Role of Vaginal Brachytherapy in the Treatment of Surgical Stage I Papillary Serous or Clear Cell Endometrial Cancer

    International Nuclear Information System (INIS)

    Barney, Brandon M.; Petersen, Ivy A.; Mariani, Andrea; Dowdy, Sean C.; Bakkum-Gamez, Jamie N.; Haddock, Michael G.

    2013-01-01

    Objectives: The optimal adjuvant therapy for International Federation of Gynecology and Obstetrics (FIGO) stage I papillary serous (UPSC) or clear cell (CC) endometrial cancer is unknown. We report on the largest single-institution experience using adjuvant high-dose-rate vaginal brachytherapy (VBT) for surgically staged women with FIGO stage I UPSC or CC endometrial cancer. Methods and Materials: From 1998-2011, 103 women with FIGO 2009 stage I UPSC (n=74), CC (n=21), or mixed UPSC/CC (n=8) endometrial cancer underwent total abdominal hysterectomy with bilateral salpingo-oophorectomy followed by adjuvant high-dose-rate VBT. Nearly all patients (n=98, 95%) also underwent extended lymph node dissection of pelvic and paraortic lymph nodes. All VBT was performed with a vaginal cylinder, treating to a dose of 2100 cGy in 3 fractions. Thirty-five patients (34%) also received adjuvant chemotherapy. Results: At a median follow-up time of 36 months (range, 1-146 months), 2 patients had experienced vaginal recurrence, and the 5-year Kaplan-Meier estimate of vaginal recurrence was 3%. The rates of isolated pelvic recurrence, locoregional recurrence (vaginal + pelvic), and extrapelvic recurrence (including intraabdominal) were similarly low, with 5-year Kaplan-Meier estimates of 4%, 7%, and 10%, respectively. The estimated 5-year overall survival was 84%. On univariate analysis, delivery of chemotherapy did not affect recurrence or survival. Conclusions: VBT is effective at preventing vaginal relapse in women with surgical stage I UPSC or CC endometrial cancer. In this cohort of patients who underwent comprehensive surgical staging, the risk of isolated pelvic or extrapelvic relapse was low, implying that more extensive adjuvant radiation therapy is likely unnecessary.

  20. Actions and Beliefs : Estimating Distribution-Based Preferences Using a Large Scale Experiment with Probability Questions on Expectations

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.; van Soest, A.H.O.

    2005-01-01

    We combine the choice data of proposers and responders in the ultimatum game, their expectations elicited in the form of subjective probability questions, and the choice data of proposers ("dictator") in a dictator game to estimate a structural model of decision making under uncertainty. We use a

  1. Estimating Probable Maximum Precipitation by Considering Combined Effect of Typhoon and Southwesterly Air Flow

    Directory of Open Access Journals (Sweden)

    Cheng-Chin Liu

    2016-01-01

    Full Text Available Typhoon Morakot hit southern Taiwan in 2009, bringing 48 hr of heavy rainfall [close to the Probable Maximum Precipitation (PMP)] to the Tsengwen Reservoir catchment. This extreme rainfall event resulted from the combined (co-movement) effect of two climate systems (i.e., typhoon and southwesterly air flow). Based on the traditional PMP estimation method (i.e., the storm transposition method, STM), two PMP estimation approaches that consider the combined effect, the Amplification Index (AI) and Independent System (IS) approaches, are proposed in this work. The AI approach assumes that the southwesterly air flow precipitation in a typhoon event could reach its maximum value. The IS approach assumes that the typhoon and southwesterly air flow are independent weather systems. Based on these assumptions, calculation procedures for the two approaches were constructed for a case study on the Tsengwen Reservoir catchment. The results show that the PMP estimates for 6- to 60-hr durations using the two approaches are approximately 30% larger than the PMP estimates using the traditional STM without considering the combined effect. This work pioneers a PMP estimation method that considers the combined effect of a typhoon and southwesterly air flow. Further studies on this issue are essential and encouraged.

  2. Comparison of long-term results between osteo-odonto-keratoprosthesis and tibial bone keratoprosthesis.

    Science.gov (United States)

    Charoenrook, Victor; Michael, Ralph; de la Paz, Maria Fideliz; Temprano, José; Barraquer, Rafael I

    2018-04-01

    To compare the anatomical and functional results between osteo-odonto-keratoprosthesis (OOKP) and keratoprosthesis using tibial bone autograft (Tibial bone KPro). We reviewed the charts of 258 patients; 145 had OOKP whereas 113 had Tibial bone KPro implanted. Functional success was defined as best corrected visual acuity ≥0.05 on the decimal scale and anatomical success as retention of the keratoprosthesis lamina. Kaplan-Meier survival curves were calculated for anatomical and functional survival and to estimate the probability of post-op complications. The anatomical survival for both KPro groups was not significantly different and was estimated as 67% for OOKP and 54% for Tibial bone KPro at 10 years after surgery. There was also no difference found after subdividing by primary diagnosis groups such as chemical injury, thermal burn, trachoma, and all autoimmune cases combined. Estimated functional survival at 10 years post-surgery was 49% for OOKP and 25% for Tibial bone KPro, which was significantly different. The probability of patients with Tibial bone KPro developing one or more post-operative complications at 10 years after surgery (65%) was significantly higher than for those with OOKP (40%). Mucous membrane necrosis and retroprosthetic membrane formation were more common in Tibial bone KPro than OOKP. Both types of autologous biological KPro, OOKP and Tibial bone KPro, had statistically similar rates of keratoprosthesis extrusion. Although the functional success rate was significantly higher for OOKP, it may have been influenced by a better visual potential in the patients in this group. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Smooth conditional distribution function and quantiles under random censorship.

    Science.gov (United States)

    Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine

    2002-09-01

    We consider a nonparametric random design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable as well as the explanatory variable are unidimensional and continuous. We propose and discuss two classes of estimators which are smooth with respect to the response variable as well as to the covariate. Some simulations demonstrate that the new methods have better mean square error performances than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).

  4. Nonrhabdomyosarcoma soft tissue sarcoma (NRSTS) in pediatric and young adult patients: Results from a prospective study using limited-margin radiotherapy.

    Science.gov (United States)

    Tinkle, Christopher L; Fernandez-Pineda, Israel; Sykes, April; Lu, Zhaohua; Hua, Chia-Ho; Neel, Michael D; Bahrami, Armita; Shulkin, Barry L; Kaste, Sue C; Pappo, Alberto; Spunt, Sheri L; Krasin, Matthew J

    2017-11-15

    Indications for and delivery of adjuvant therapies for pediatric nonrhabdomyosarcoma soft tissue sarcoma (NRSTS) have been derived largely from adult studies; therefore, significant concern remains regarding radiation exposure to normal tissue. The authors report long-term treatment outcomes and toxicities for pediatric and young adult patients with high-grade NRSTS who were treated on a prospective trial using limited-margin radiotherapy. Sixty-two patients (ages 3-22 years) with predominantly high-grade NRSTS requiring radiation were treated on a phase 2 institutional study of conformal external-beam radiotherapy and/or brachytherapy using a 1.5-cm to 2-cm anatomically constrained margin. The cumulative incidence of local failure was estimated by Gray's method, survival by the Kaplan-Meier method, and predictors of disease outcome by a competing-risk regression model; toxicity was reported according to CTCAE v2.0. At a median follow-up of 5.1 years (range, 0.2-10.9 years), 9 patients had experienced local failure. The 5-year overall cumulative incidence of local failure was 14.8% (95% confidence interval [CI], 7.2%-25%), and all but 1 local failure occurred outside the highest-dose irradiation volume. The 5-year Kaplan-Meier estimates for event-free and overall survival were 49.3% (95% CI, 36.3%-61.1%) and 67.9% (95% CI, 54.2%-78.3%), respectively. Multivariable analysis indicated that younger age was the only independent predictor of local recurrence (P = .004). The 5-year cumulative incidence of grade 3 or 4 late toxicity was 15% (95% CI, 7.2%-25.3%). The delivery of limited-margin radiotherapy using conformal external-beam radiotherapy or brachytherapy provides a high rate of local tumor control without an increase in marginal failures and with acceptable treatment-related morbidity. Cancer 2017;123:4419-29. © 2017 American Cancer Society.

  5. Validating a Local Failure Risk Stratification for Use in Prospective Studies of Adjuvant Radiation Therapy for Bladder Cancer

    International Nuclear Information System (INIS)

    Baumann, Brian C.; He, Jiwei; Hwang, Wei-Ting; Tucker, Kai N.; Bekelman, Justin E.; Herr, Harry W.; Lerner, Seth P.; Guzzo, Thomas J.; Malkowicz, S. Bruce; Christodouleas, John P.

    2016-01-01

    Purpose: To inform prospective trials of adjuvant radiation therapy (adj-RT) for bladder cancer after radical cystectomy, a locoregional failure (LF) risk stratification was proposed. This stratification was developed and validated using surgical databases that may not reflect the outcomes expected in prospective trials. Our purpose was to assess sources of bias that may affect the stratification model's validity or alter the LF risk estimates for each subgroup: time bias due to evolving surgical techniques; trial accrual bias due to inclusion of patients who would be ineligible for adj-RT trials because of early disease progression, death, or loss to follow-up shortly after cystectomy; bias due to different statistical methods to estimate LF; and subgrouping bias due to different definitions of the LF subgroups. Methods and Materials: The LF risk stratification was developed using a single-institution cohort (n=442, 1990-2008) and the multi-institutional SWOG 8710 cohort (n=264, 1987-1998) treated with radical cystectomy with or without chemotherapy. We evaluated the sensitivity of the stratification to sources of bias using Fine-Gray regression and Kaplan-Meier analyses. Results: Year of radical cystectomy was not associated with LF risk on univariate or multivariate analysis after controlling for risk group. By use of more stringent inclusion criteria, 26 SWOG patients (10%) and 60 patients from the single-institution cohort (14%) were excluded. Analysis of the remaining patients confirmed 3 subgroups with significantly different LF risks with 3-year rates of 7%, 17%, and 36%, respectively (P<.01), nearly identical to the rates without correcting for trial accrual bias. Kaplan-Meier techniques estimated higher subgroup LF rates than competing risk analysis. The subgroup definitions used in the NRG-GU001 adj-RT trial were validated. Conclusions: These sources of bias did not invalidate the LF risk stratification or substantially change the model's LF estimates.

  6. Current recommendations on the estimation of transition probabilities in Markov cohort models for use in health care decision-making: a targeted literature review

    Directory of Open Access Journals (Sweden)

    Olariu E

    2017-09-01

    Full Text Available Elena Olariu,1 Kevin K Cadwell,1 Elizabeth Hancock,1 David Trueman,1 Helene Chevrou-Severac2 1PHMR Ltd, London, UK; 2Takeda Pharmaceuticals International AG, Zurich, Switzerland Objective: Although Markov cohort models represent one of the most common forms of decision-analytic models used in health care decision-making, correct implementation of such models requires reliable estimation of transition probabilities. This study sought to identify consensus statements or guidelines that detail how such transition probability matrices should be estimated. Methods: A literature review was performed to identify relevant publications in the following databases: Medline, Embase, the Cochrane Library, and PubMed. Electronic searches were supplemented by manual searches of health technology assessment (HTA) websites in Australia, Belgium, Canada, France, Germany, Ireland, Norway, Portugal, Sweden, and the UK. One reviewer assessed studies for eligibility. Results: Of the 1,931 citations identified in the electronic searches, no studies met the inclusion criteria for full-text review, and no guidelines on transition probabilities in Markov models were identified. Manual searching of the websites of HTA agencies identified ten guidelines on economic evaluations (Australia, Belgium, Canada, France, Germany, Ireland, Norway, Portugal, Sweden, and the UK). All identified guidelines provided general guidance on how to develop economic models, but none provided guidance on the calculation of transition probabilities. One relevant publication was identified following review of the reference lists of HTA agency guidelines: the International Society for Pharmacoeconomics and Outcomes Research taskforce guidance. This provided limited guidance on the use of rates and probabilities. Conclusions: There is limited formal guidance available on the estimation of transition probabilities for use in decision-analytic models. Given the increasing importance of cost
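The ISPOR taskforce guidance cited in this record distinguishes rates from probabilities. As an illustration of the standard conversion (a minimal sketch, not taken from the review; function names are my own), a constant event rate r over a cycle of length t yields a transition probability p = 1 - exp(-r*t), assuming exponentially distributed event times:

```python
import math

def rate_to_probability(rate, cycle_length=1.0):
    """Per-cycle transition probability from a constant event rate,
    assuming exponentially distributed event times."""
    return 1.0 - math.exp(-rate * cycle_length)

def probability_to_rate(prob, cycle_length=1.0):
    """Inverse conversion: recover the constant rate from a per-cycle probability."""
    return -math.log(1.0 - prob) / cycle_length

p = rate_to_probability(0.2)   # annual rate 0.2 -> ~0.181 per 1-year cycle
r = probability_to_rate(p)     # recovers 0.2
```

Note that dividing an annual probability by 12 to get a monthly one is not equivalent to this conversion; the exponential form is what keeps rates additive across cycles.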

  7. Estimates of mean consequences and confidence bounds on the mean associated with low-probability seismic events in total system performance assessments

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James

    2007-01-01

    An approach is described to estimate mean consequences and confidence bounds on the mean of seismic events with low probability of breaching components of the engineered barrier system. The approach is aimed at complementing total system performance assessment models used to understand consequences of scenarios leading to radionuclide releases in geologic nuclear waste repository systems. The objective is to develop an efficient approach to estimate mean consequences associated with seismic events of low probability, employing data from a performance assessment model with a modest number of Monte Carlo realizations. The derived equations and formulas were tested with results from a specific performance assessment model. The derived equations appear to be one method to estimate mean consequences without having to use a large number of realizations. (authors)
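The factorization underlying this kind of estimate can be sketched as follows: writing the mean consequence as P(event) x E[consequence | event] lets the conditional expectation be estimated from a modest number of Monte Carlo realizations run given that the event occurs. This is a hedged illustration, not the authors' method; names and the sampling distribution are arbitrary:

```python
import random

def mean_consequence(p_event, conditional_samples):
    """Unconditional mean consequence factored as
    P(event) * E[consequence | event]; the conditional expectation is
    estimated from Monte Carlo realizations conditioned on the event."""
    return p_event * (sum(conditional_samples) / len(conditional_samples))

# Example: a 1-in-10,000 seismic event; consequences (given the event)
# drawn here from an arbitrary exponential stand-in distribution.
random.seed(42)
samples = [random.expovariate(1 / 5.0) for _ in range(1000)]
estimate = mean_consequence(1e-4, samples)
```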

  8. Estimating species occurrence, abundance, and detection probability using zero-inflated distributions.

    Science.gov (United States)

    Wenger, Seth J; Freeman, Mary C

    2008-10-01

    Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a better fit to the data than other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
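A zero-inflated abundance model of the kind described mixes an occupancy probability with a count distribution. As a rough sketch of the core idea (a zero-inflated Poisson without the detection layer the paper adds; parameter names are illustrative):

```python
import math

def zip_pmf(k, psi, lam):
    """Zero-inflated Poisson: with probability 1 - psi the site is
    unoccupied (so the count is necessarily 0); with probability psi
    the count follows a Poisson(lam) distribution."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return (1 - psi) + psi * poisson if k == 0 else psi * poisson

# The mixture is a proper distribution: probabilities sum to 1.
total = sum(zip_pmf(k, psi=0.6, lam=2.0) for k in range(50))
```

The detection component in the paper would sit on top of this: observed counts are binomial draws from the true abundance, given a per-survey detection probability.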

  9. Estimation of the nuclear fuel assembly eigenfrequencies in the probability sense

    Directory of Open Access Journals (Sweden)

    Zeman V.

    2014-12-01

    Full Text Available The paper deals with the estimation of upper and lower limits of nuclear fuel assembly eigenfrequencies when the assembly's design and operation parameters are random variables. Each parameter is defined by its mean value and standard deviation or by a range of values. The gradient and three-sigma criterion approach is applied to calculate the upper and lower limits of fuel assembly eigenfrequencies in the probability sense. The presented analytical approach for calculating eigenfrequency sensitivity is based on the modal synthesis method and the decomposition of the fuel assembly into six identical revolved fuel rod segments, a centre tube, and a load-bearing skeleton linked by spacer grids. The method is applied to the Russian TVSA-T fuel assembly in the WWER1000/320-type reactor core at the Czech nuclear power plant Temelín.

  10. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience, and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 x 10^-7 per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile.
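The per-mile figure quoted above is simple arithmetic: observed accidents divided by total shipment-miles. A sketch with hypothetical mileage (the abstract does not give the actual shipping volume):

```python
def accident_rate(accidents, total_miles):
    """Per-mile accident probability: observed accidents over total shipment-miles."""
    return accidents / total_miles

# Hypothetical figures: one recorded accident over 2 million shipment-miles
# gives the 5 x 10^-7 per mile order of magnitude cited in the review.
rate = accident_rate(1, 2_000_000)  # 5e-7 per mile
```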

  11. Estimation of probability of coastal flooding: A case study in the Norton Sound, Alaska

    Science.gov (United States)

    Kim, S.; Chapman, R. S.; Jensen, R. E.; Azleton, M. T.; Eisses, K. J.

    2010-12-01

    Along the Norton Sound, Alaska, coastal communities have been exposed to flooding induced by extra-tropical storms. A lack of observational data, especially on long-term variability, makes it difficult to assess the probability of coastal flooding, which is critical in planning for the development and evacuation of these communities. We estimated the probability of coastal flooding with the help of an existing storm surge model using ADCIRC and a wave model using WAM for Western Alaska, covering the Norton Sound as well as the adjacent Bering Sea and Chukchi Sea. Surface pressure, winds, and ice coverage were analyzed and put in a gridded format at 3-hour intervals over the entire Alaskan Shelf by Ocean Weather Inc. (OWI) for the period between 1985 and 2009. OWI also analyzed surface conditions for storm events over the 31-year period between 1954 and 1984. The correlation between water levels recorded by the NOAA tide gauge and local meteorological conditions at Nome between 1992 and 2005 suggested that strong local winds with prevailing southerly components are good proxies for high-water events. We also heuristically selected local winds with prevailing westerly components at Shaktoolik, at the eastern end of the Norton Sound, to capture additional flood events during the continuous meteorological record between 1985 and 2009. Frequency analyses were performed using the simulated water levels and wave heights for the 56-year period between 1954 and 2009. Different methods of estimating return periods were compared, including the method in the FEMA guideline, extreme value statistics, and fits to statistical distributions such as Weibull and Gumbel. The estimates are similar, as expected, but with some variation.

  12. The length of stay and the occurrence of adverse drug events: a question of nursing

    Directory of Open Access Journals (Sweden)

    Keroulay Estebanez Roque

    2011-09-01

    Full Text Available The objective of the study was to estimate the effect of time and of individual characteristics on the occurrence of adverse drug events in patients with cardiac conditions. This is an evaluative study of the occurrence of adverse drug events carried out in a public cardiology hospital located in the city of Rio de Janeiro. Survival was analyzed using the Kaplan-Meier method. The probability of surviving free of an adverse drug event up to 30, 60, and 100 days was 96%, 93%, and 73%, respectively. Adverse drug events are marker conditions that describe the performance of health services. Detecting adverse events in hospitals makes it possible to identify the failures that occur in the medication system and to implement strategies to reduce them.

  13. Extraneural metastases of medulloblastoma: desmoplastic variants may have prolonged survival.

    Science.gov (United States)

    Young, Robert J; Khakoo, Yasmin; Yhu, Stephen; Wolden, Suzanne; De Braganca, Kevin C; Gilheeney, Stephen W; Dunkel, Ira J

    2015-04-01

    Extraneural metastases from CNS medulloblastoma are rare and poorly described. The purpose of this study is to describe the clinical and radiological characteristics of a large single institution series of patients with medulloblastoma who developed extraneural metastases. We retrospectively reviewed a departmental database over a 20 year period for all patients with medulloblastoma who developed extraneural metastases. Chart and imaging reviews were performed, and overall survival (OS) estimated by the Kaplan-Meier method. We found 14 patients with medulloblastoma and extraneural metastases. The median age at initial diagnosis was 16.3 years (range, 3.2-44.2), and the most common subtype was desmoplastic (n = 6, 42.9%). After initial gross total resection, most patients received radiation therapy alone (n = 10, 71.4%). Metastases to bone were most common (n = 11, 78.6%) followed by metastases to bone marrow (n = 6, 42.9%), usually to the spine. The median time from initial diagnosis to first extraneural metastasis was 1.5 years (range, 0.2-17.4), and the median OS from extraneural metastasis to death was 3.3 years (range, 0-18). The Kaplan-Meier estimate of 5 year OS from extraneural metastasis diagnosis was 40.0% (95% CI, 20.2-79.2). Extraneural metastases from medulloblastoma may rarely develop after initial diagnosis to involve bone and bone marrow. We found that desmoplastic variant extraneural tumors had longer survival than nondesmoplastic variants, suggesting that histopathological and more recent molecular subtyping have important roles in determining the prognosis of medulloblastoma patients. © 2014 Wiley Periodicals, Inc.

  14. Cardiovascular Risk Factors and 5-year Mortality in the Copenhagen Stroke Study

    DEFF Research Database (Denmark)

    Kammersgaard, Lars Peter; Olsen, Tom Skyhøj

    2005-01-01

    population. METHODS: We studied 905 ischemic stroke patients from the community-based Copenhagen Stroke Study. Patients had a CT scan and stroke severity was measured by the Scandinavian Stroke Scale on admission. A comprehensive evaluation was performed by a standardized medical examination...... and questionnaire for cardiovascular risk factors, age, and sex. Follow-up was performed 5 years after stroke, and data on mortality were obtained for all, except 6, who had left the country. Five-year mortality was calculated by the Kaplan-Meier procedure and the influence of multiple predictors was analyzed...... by Cox proportional hazards analyses adjusted for age, gender, stroke severity, and risk factor profile. RESULTS: In Kaplan-Meier analyses atrial fibrillation (AF), ischemic heart disease, diabetes, and previous stroke were associated with increased mortality, while smoking and alcohol intake were...

  16. Numerical investigation of tip clearance cavitation in Kaplan runners

    Science.gov (United States)

    Nikiforova, K.; Semenov, G.; Kuznetsov, I.; Spiridonov, E.

    2016-11-01

    There is a gap between the Kaplan runner blade and the shroud that gives rise to a special kind of cavitation: cavitation in the tip leakage flow. Two types of cavitation caused by the presence of the clearance gap are known: tip vortex cavitation, which appears at the core of the rolled-up vortex on the blade suction side, and tip clearance cavitation, which appears precisely in the gap between the blade tip edge and the shroud. In this work, a numerical investigation of a model Kaplan runner has been performed, taking into account variable tip clearance for several cavitation regimes. The focus is on the structure and origination mechanism of cavitation in the tip leakage flow. Calculations have been performed with a 3-D unsteady numerical model for a two-phase medium. Turbulent flow was modeled using the full Reynolds-averaged Navier-Stokes equations with corrections for streamline curvature and system rotation. The liquid-vapor medium is described by a simplified Euler approach based on the model of interpenetrating continua, within which the two-phase medium is considered a quasi-homogeneous mixture with a common velocity field and a continuous density distribution for both phases. As a result, engineering techniques have been developed for calculating cavitation conditioned by the tip clearance in a model turbine runner. A detailed visualization of the flow was carried out and the vortex structure on the suction side of the blade was reproduced. Based on spectral analysis of the obtained data, the frequency range with the maximum pulsation amplitude was identified and the frequency carrying maximum energy was determined. A comparison between numerical results and experimental data has also been performed; the location of the cavitation zone agrees well with experiment for all analyzed regimes.

  17. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  18. New methods for estimating follow-up rates in cohort studies

    Directory of Open Access Journals (Sweden)

    Xiaonan Xue

    2017-12-01

    Full Text Available Abstract Background The follow-up rate, a standard index of the completeness of follow-up, is important for assessing the validity of a cohort study. A common method for estimating the follow-up rate, the "Percentage Method", defined as the fraction of all enrollees who developed the event of interest or had complete follow-up, can severely underestimate the degree of follow-up. Alternatively, the median follow-up time does not indicate the completeness of follow-up, and the reverse Kaplan-Meier based method and Clark's Completeness Index (CCI) also have limitations. Methods We propose a new definition for the follow-up rate, the Person-Time Follow-up Rate (PTFR), which is the observed person-time divided by the total person-time assuming no dropouts. The PTFR cannot be calculated directly since the event times for dropouts are not observed. Therefore, two estimation methods are proposed: a formal person-time method (FPT), in which the expected total follow-up time is calculated using the event rate estimated from the observed data, and a simplified person-time method (SPT), which avoids estimation of the event rate by assigning full follow-up time to all events. Simulations were conducted to measure the accuracy of each method, and each method was applied to a prostate cancer recurrence study dataset. Results Simulation results showed that the FPT has the highest accuracy overall. In most situations, the computationally simpler SPT and CCI methods are only slightly biased. When applied to a retrospective cohort study of cancer recurrence, the FPT, CCI and SPT showed substantially greater 5-year follow-up than the Percentage Method (92%, 92% and 93% vs 68%). Conclusions The person-time methods correct a systematic error in the standard Percentage Method for calculating follow-up rates. The easy-to-use SPT and CCI methods can be used in tandem to obtain an accurate and tight interval for the PTFR. However, the FPT is recommended when event rates and
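The contrast between the Percentage Method and the simplified person-time (SPT) method can be sketched directly from the definitions above (a toy illustration, not the authors' code):

```python
def percentage_method(n_event, n_complete, n_enrolled):
    """Standard 'Percentage Method': fraction of enrollees who developed
    the event or had complete follow-up."""
    return (n_event + n_complete) / n_enrolled

def spt_follow_up_rate(observed_time, planned_time, event_flags):
    """Simplified person-time (SPT) rate: observed person-time divided by
    person-time assuming no dropouts, assigning full follow-up to events."""
    observed = sum(planned if had_event else seen
                   for seen, planned, had_event
                   in zip(observed_time, planned_time, event_flags))
    return observed / sum(planned_time)

# Four subjects, each planned for 5 years of follow-up: one event,
# one completer, and two dropouts at 2 and 1 years.
pct = percentage_method(n_event=1, n_complete=1, n_enrolled=4)   # 0.50
spt = spt_follow_up_rate([5, 5, 2, 1], [5, 5, 5, 5],
                         [True, False, False, False])            # 0.65
```

The dropouts still contributed 3 person-years of observation, which the Percentage Method discards entirely; this is the systematic underestimate the abstract describes.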

  19. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
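Adjusting a raw count by an estimated capture probability, as described above, is a one-line computation: N_hat = n / p. A sketch with hypothetical numbers (the record gives no specific counts):

```python
def abundance_estimate(n_captured, capture_prob):
    """Adjust the sampled count by the predicted capture probability:
    N_hat = n / p."""
    return n_captured / capture_prob

# Hypothetical: 46 smallmouth bass sampled with a cumulative capture
# probability estimated at 0.74 implies roughly 62 fish present.
n_hat = abundance_estimate(46, 0.74)
```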

  20. Decreased graft survival in liver transplant recipients of donors with positive blood cultures: a review of the United Network for Organ Sharing dataset.

    Science.gov (United States)

    Huaman, Moises A; Vilchez, Valery; Mei, Xiaonan; Shah, Malay B; Daily, Michael F; Berger, Jonathan; Gedaly, Roberto

    2017-06-01

    Liver transplantation using blood culture positive donors (BCPD) has allowed a significant expansion of the donor pool. We aimed to characterize BCPD and assess the outcomes of BCPD liver transplant recipients. We retrieved data from the United Network for Organ Sharing (UNOS) registry on all adults who underwent primary, single-organ deceased-donor liver transplantation in the USA between 2008 and 2013. Patients were classified into two cohorts: the BCPD cohort and the non-BCPD cohort. One-year graft and patient survival were compared between cohorts using Kaplan-Meier estimates and Cox models. A total of 28 961 patients were included. There were 2316 (8.0%) recipients of BCPD. BCPD were more likely to be older, female, black, diabetic, hypertensive, and obese compared to non-BCPD. Graft survival was significantly lower in BCPD recipients compared to non-BCPD recipients (Kaplan-Meier, 0.85 vs. 0.87; P = 0.009). Results remained significant in propensity-matched analysis (P = 0.038). BCPD was independently associated with decreased graft survival (adjusted HR; 1.10, 95% CI 1.01-1.20; P = 0.04). There were no significant differences in patient survival between study groups. BCPD was associated with decreased graft survival in liver transplant recipients. Studies are needed to identify subgroups of BCPD with the highest risk of graft failure and characterize the underlying pathogenic mechanisms. © 2016 Steunstichting ESOT.
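Kaplan-Meier estimates like those compared in this record follow from the product-limit computation: at each distinct event time, the running survival is multiplied by (1 - d/n), where d is the number of events at that time and n the number still at risk. A minimal pure-Python sketch (illustrative only, not the UNOS analysis code):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up times; events: True for an event, False for censoring.
    Returns a list of (time, survival) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival, curve = 1.0, []
    for t, group in groupby(data, key=lambda pair: pair[0]):
        group = list(group)
        deaths = sum(1 for _, observed in group if observed)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(group)  # events and censorings both leave the risk set
    return curve

# Five subjects: events at t=1, 2, 3; censored at t=2 and t=4.
curve = kaplan_meier([1, 2, 2, 3, 4], [True, True, False, True, False])
# survival steps: 0.8 at t=1, 0.6 at t=2, 0.3 at t=3
```

Censored subjects contribute to the risk set up to their censoring time but never to the numerator, which is what distinguishes this from a naive fraction-surviving calculation.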

  1. Reverse hybrid total hip arthroplasty

    DEFF Research Database (Denmark)

    Wangen, Helge; Havelin, Leif I.; Fenstad, Anne M

    2017-01-01

    Background and purpose - The use of a cemented cup together with an uncemented stem in total hip arthroplasty (THA) has become popular in Norway and Sweden during the last decade. The results of this prosthetic concept, reverse hybrid THA, have been sparsely described. The Nordic Arthroplasty....... Patients and methods - From the NARA, we extracted data on reverse hybrid THAs from January 1, 2000 until December 31, 2013. 38,415 such hips were studied and compared with cemented THAs. The Kaplan-Meier method and Cox regression analyses were used to estimate the prosthesis survival and the relative risk...

  2. Breast conserving treatment in Denmark, 1989-1998. A nationwide population-based study of the Danish Breast Cancer Co-operative Group

    DEFF Research Database (Denmark)

    Ewertz, M.; Kempel, M.M.; During, M.

    2008-01-01

    patients in Denmark. PATIENTS AND METHODS: To evaluate the results of this treatment, we performed a nationwide population-based follow-up study of patients aged less than 75 years treated in Denmark from 1989 to 1998 based on the database of Danish Breast Cancer Cooperative Group. RESULTS: At 15 years...... of follow-up, the Kaplan-Meier estimate of overall survival was 69% among 3 758 patients who received the recommended treatment. Within the first 10 years of follow-up, the cumulative incidences of loco-regional recurrences, distant metastases or other malignant disease, or death as a first event were 9...

  3. Mortality impact of positive blood cultures in patients with suspected community-acquired bacteraemia. A Danish population-based cohort study

    DEFF Research Database (Denmark)

    Søgaard, Mette; Nørgaard, Mette; Sørensen, Henrik Toft

    2009-01-01

    for age, gender, coexisting chronic diseases, marital status, use of immunosuppressives, and calendar period. Further, we conducted analyses restricted to patients with a discharge diagnosis of infectious diseases (ICD-10 codes A00-B99). Results: In total, 1,665 (8.2%) patients had positive blood culture...... System. We computed Kaplan-Meier curves and product limit estimates for the main study variables. Next, time-dependent Cox regression analyses were used to compare the risk of death in patients with positive blood cultures and patients with negative cultures at days 0-7, 8-30, and 31-180, controlling...

  4. Rib fractures after percutaneous radiofrequency and microwave ablation of lung tumors: incidence and relevance.

    Science.gov (United States)

    Alexander, Erica S; Hankins, Carol A; Machan, Jason T; Healey, Terrance T; Dupuy, Damian E

    2013-03-01

    To retrospectively identify the incidence and probable risk factors for rib fractures after percutaneous radiofrequency ablation (RFA) and microwave ablation (MWA) of neoplasms in the lung and to identify complications related to these fractures. Institutional review board approval was obtained for this HIPAA-compliant retrospective study. Study population was 163 patients treated with MWA and/or RFA for 195 lung neoplasms between February 2004 and April 2010. Follow-up computed tomographic images of at least 3 months were retrospectively reviewed by board-certified radiologists to determine the presence of rib fractures. Generalized estimating equations were performed to assess the effect that patient demographics, tumor characteristics, treatment parameters, and ablation zone characteristics had on development of rib fractures. Kaplan-Meier curve was used to estimate patients' probability of rib fracture after ablation as a function of time. Clinical parameters (ie, pain in ribs or chest, organ damage caused by fractured rib) were evaluated for patients with confirmed fracture. Rib fractures in proximity to the ablation zone were found in 13.5% (22 of 163) of patients. Estimated probability of fracture was 9% at 1 year and 22% at 3 years. Women were more likely than were men to develop fracture after ablation (P = .041). Patients with tumors closer to the chest wall were more likely to develop fracture (P = .0009), as were patients with ablation zones that involved visceral pleura (P = .039). No patients with rib fractures that were apparently induced by RFA and MWA had organ injury or damage related to fracture, and 9.1% (2 of 22) of patients reported mild pain. Rib fractures were present in 13.5% of patients after percutaneous RFA and MWA of lung neoplasms. Patients who had ablations performed close to the chest wall should be monitored for rib fractures.
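    The Kaplan-Meier probability-of-fracture curve used in this study is an instance of the standard product-limit computation, which can be sketched in a few lines. This is a generic illustration with invented follow-up data, not the study's code.

```python
# Generic Kaplan-Meier product-limit estimator (illustrative sketch, not
# the study's code). times: follow-up times; events: 1 = event observed
# (e.g. rib fracture), 0 = censored at that time.
def kaplan_meier(times, events):
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []                    # (time, S(t)) at each distinct event time
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for (tt, e) in data if tt == t]
        deaths = sum(tied)
        if deaths:
            surv *= 1.0 - deaths / at_risk   # product-limit step
            curve.append((t, surv))
        at_risk -= len(tied)      # ties leave the risk set together
        i += len(tied)
    return curve

# Invented follow-up times in years; 0 marks a censored subject.
curve = kaplan_meier([1, 2, 3, 4, 5, 6], [1, 1, 0, 1, 0, 1])
# S(1) = 5/6, S(2) = 2/3, S(4) = 4/9, S(6) = 0
```

    The estimated probability of an event by time t, as reported in the abstract, is then simply 1 − S(t).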

  5. A Satellite Mortality Study to Support Space Systems Lifetime Prediction

    Science.gov (United States)

    Fox, George; Salazar, Ronald; Habib-Agahi, Hamid; Dubos, Gregory

    2013-01-01

    Estimating the operational lifetime of satellites and spacecraft is a complex process. Operational lifetime can differ from mission design lifetime for a variety of reasons. Unexpected mortality can occur due to human errors in design and fabrication, to human errors in launch and operations, to random anomalies of hardware and software or even satellite function degradation or technology change, leading to unrealized economic or mission return. This study focuses on data collection of public information using, for the first time, a large, publically available dataset, and preliminary analysis of satellite lifetimes, both operational lifetime and design lifetime. The objective of this study is the illustration of the relationship of design life to actual lifetime for some representative classes of satellites and spacecraft. First, a Weibull and Exponential lifetime analysis comparison is performed on the ratio of mission operating lifetime to design life, accounting for terminated and ongoing missions. Next a Kaplan-Meier survivor function, standard practice for clinical trials analysis, is estimated from operating lifetime. Bootstrap resampling is used to provide uncertainty estimates of selected survival probabilities. This study highlights the need for more detailed databases and engineering reliability models of satellite lifetime that include satellite systems and subsystems, operations procedures and environmental characteristics to support the design of complex, multi-generation, long-lived space systems in Earth orbit.
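    The bootstrap step described above, resampling operating lifetimes and re-estimating the Kaplan-Meier survivor function to get uncertainty bounds, can be sketched as follows. The lifetimes, censoring indicators, and parameters here are invented for illustration and are not the study's data.

```python
import random

def km_survival_at(times, events, t):
    """Kaplan-Meier estimate of S(t); events: 1 = failure, 0 = censored."""
    data = sorted(zip(times, events))
    at_risk, surv, i = len(data), 1.0, 0
    while i < len(data) and data[i][0] <= t:
        tt = data[i][0]
        tied = [e for (x, e) in data if x == tt]
        if sum(tied):
            surv *= 1.0 - sum(tied) / at_risk
        at_risk -= len(tied)
        i += len(tied)
    return surv

def bootstrap_ci(times, events, t, n_boot=2000, alpha=0.05, seed=42):
    """Percentile-bootstrap interval for S(t): resample subjects with
    replacement and re-estimate the survivor function each time."""
    rng = random.Random(seed)
    n = len(times)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(km_survival_at([times[i] for i in idx],
                                    [events[i] for i in idx], t))
    stats.sort()
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

# Invented satellite operating lifetimes (years); 0 = mission still ongoing.
life = [3, 5, 6, 6, 8, 9, 11, 12, 14, 15]
oper = [1, 1, 0, 1, 1, 0, 1, 0, 1, 1]
lo, hi = bootstrap_ci(life, oper, t=10)
```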

  6. Estimating survival probabilities by exposure levels: utilizing vital statistics and complex survey data with mortality follow-up.

    Science.gov (United States)

    Landsman, V; Lou, W Y W; Graubard, B I

    2015-05-20

    We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data constructed from a nationally representative complex survey with linked mortality records, are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for complex multistage sample design: (1) the leaving-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of a two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
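    The paper's explicit estimator is not reproduced here, but the accounting idea behind the second step, dividing an overall hazard rate across exposure levels so that the population-weighted average is preserved, can be sketched as follows. The proportions and relative hazards are invented.

```python
import math

def split_hazard(overall_hazard, exposure_props, relative_hazards):
    """Divide an overall hazard rate across exposure levels so that the
    population-weighted average reproduces the overall rate. A sketch of
    the two-step idea only; the paper's estimator differs in detail."""
    denom = sum(p * r for p, r in zip(exposure_props, relative_hazards))
    base = overall_hazard / denom
    return [base * r for r in relative_hazards]

# Invented example: 30% smokers with twice the hazard of non-smokers,
# overall annual hazard 0.010 from vital statistics.
h_smoker, h_nonsmoker = split_hazard(0.010, [0.3, 0.7], [2.0, 1.0])

# Constant-hazard survival probability over 10 years: S = exp(-h * t)
s_smoker = math.exp(-h_smoker * 10)
```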

  7. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials (ASTM) strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
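    The Weibull risk-of-rupture referenced here has a simple closed form for the two-parameter model, P_f = 1 − exp(−(σ/σ₀)^m). The sketch below uses invented parameter values, not the values from the ASTM specification or the paper.

```python
import math

def weibull_failure_probability(stress, char_strength, modulus):
    """Two-parameter Weibull risk of rupture:
    P_f = 1 - exp(-(stress / char_strength) ** modulus).
    The survival probability (reliability) is 1 - P_f."""
    return 1.0 - math.exp(-((stress / char_strength) ** modulus))

# Invented values: characteristic strength 20 MPa, Weibull modulus 8.
pf_half = weibull_failure_probability(10.0, 20.0, 8.0)   # well below sigma_0
pf_char = weibull_failure_probability(20.0, 20.0, 8.0)   # at sigma_0
# At the characteristic strength, P_f = 1 - 1/e ~ 0.632 by definition.
```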

  8. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    Science.gov (United States)

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process is dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small (n …) samples, allowing researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
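    Under a normal model with known σ, the probability that the sample mean falls within k standard deviations of the true mean has the closed form 2Φ(k√n) − 1, which can be written with the error function. The sketch below uses this normal approximation only; it is not the authors' exact small-sample procedure.

```python
import math

def prob_mean_within(k, n):
    """P(|sample mean - mu| < k * sigma) for an i.i.d. normal sample of
    size n with known sigma: 2*Phi(k*sqrt(n)) - 1, written via erf.
    Normal approximation only, not the authors' small-sample method."""
    return math.erf(k * math.sqrt(n) / math.sqrt(2.0))

p = prob_mean_within(0.5, 10)   # chance x-bar is within 0.5 sd of mu
```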

  9. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.

  10. O11.4. EDUCATION, EMPLOYMENT AND DISABILITY AMONG YOUNG PERSONS WITH EARLY PSYCHOSIS PARTICIPATING IN A COORDINATED SPECIALTY CARE PROGRAM

    Science.gov (United States)

    Smith, Thomas; Humensky, Jennifer; Scodes, Jennifer; Wall, Melanie; Nossel, Ilana; Dixon, Lisa

    2018-01-01

    Abstract Background Comprehensive early treatment programs for individuals with early psychosis have demonstrated success internationally, spurring rapid expansion of the model in the United States. Between 2014–2016, U.S. federal funding to states to support Coordinated Specialty Care (CSC) for individuals with early psychosis increased to $50 million annually (Dixon, 2017). New York State (NYS) was an early adopter and has rapidly expanded CSC across the state. This study prospectively evaluated education and employment outcomes over time within NYS’s CSC program, OnTrackNY. Methods Employment and education trajectories were assessed for individuals with early psychosis who had at least one three-month follow-up assessment, from the program’s inception in October 2013, through September 2016 (N=325). Rates of Social Security Administration (SSA) disability enrollment were assessed for individuals enrolled from October 2013 to June 2017 (n=679). Education and employment status was estimated using longitudinal logistic models utilizing generalized estimating equations with an autoregressive covariance structure to account for within-subject correlations over time. To test how education/employment changed over time, pre-specified contrasts were tested from the longitudinal model for the mean change in sequential follow-up visits. A Kaplan-Meier estimator with discrete time to event and censoring at last observed follow-up month with no event was used to estimate the probability of any education/employment by one year after admission and to estimate the risk of disability by two years after admission. Results Approximately 40% of individuals with early psychosis were engaged in school or work upon enrollment in a CSC program; engagement increased to 80% after 6 months of care. The estimated probability of being employed or in school at some time during the year after admission was 87.9% (95% Confidence Interval (CI)= [82.9, 92.0]). Relative to women, men had

  11. The Educational Philosophies of Mordecai Kaplan and Michael Rosenak: Surprising Similarities and Illuminating Differences

    Science.gov (United States)

    Schein, Jeffrey; Caplan, Eric

    2014-01-01

    The thoughts of Mordecai Kaplan and Michael Rosenak present surprising commonalities as well as illuminating differences. Similarities include the perception that Judaism and Jewish education are in crisis, the belief that Jewish peoplehood must include commitment to meaningful content, the need for teachers to teach from a position of…

  12. An Estimation of a Passive Infra-Red Sensor Probability of Detection

    International Nuclear Information System (INIS)

    Osman, E.A.; El-Gazar, M.I.; Shaat, M.K.; El-Kafas, A.A.; Zidan, W.I.; Wadoud, A.A.

    2009-01-01

    Passive Infra-Red (PIR) sensors are among the many detection sensors used to detect intrusion at nuclear sites. In this work, an estimation of a PIR sensor's probability of detection at a hypothetical facility is presented. Sensor performance testing was performed to determine whether a particular sensor would be acceptable in a proposed design. We had access to a sensor test field in which the sensor of interest was already properly installed and whose parameters had been set to optimal levels by preliminary testing. The PIR sensor construction, operation, and design for the investigated nuclear site are explained. Walking and running intrusion tests were carried out inside the field areas of the PIR sensor to evaluate the sensor's performance during the intrusion process. Ten trials were performed experimentally for the intrusion process via a passive infra-red sensor network system. The performance and intrusion sensing of the PIR sensors inside the internal zones were recorded and evaluated.

  13. Metagenes Associated with Survival in Non-Small Cell Lung Cancer

    Science.gov (United States)

    Urgard, Egon; Vooder, Tõnu; Võsa, Urmo; Välk, Kristjan; Liu, Mingming; Luo, Cheng; Hoti, Fabian; Roosipuu, Retlav; Annilo, Tarmo; Laine, Jukka; Frenz, Christopher M.; Zhang, Liqing; Metspalu, Andres

    2011-01-01

    NSCLC (non-small cell lung cancer) comprises about 80% of all lung cancer cases worldwide. Surgery is the most effective treatment for patients with early-stage disease. However, 30%–55% of these patients develop recurrence within 5 years. Therefore, markers that can be used to accurately classify early-stage NSCLC patients into different prognostic groups may be helpful in selecting patients who should receive specific therapies. A previously published dataset was used to evaluate gene expression profiles of different NSCLC subtypes. A moderated two-sample t-test was used to identify differentially expressed genes between all tumor samples and cancer-free control tissue, between SCC samples and AC/BC samples and between stage I tumor samples and all other tumor samples. Gene expression microarray measurements were validated using qRT-PCR. Bayesian regression analysis and Kaplan-Meier survival analysis were performed to determine metagenes associated with survival. We identified 599 genes which were down-regulated and 402 genes which were up-regulated in NSCLC compared to the normal lung tissue and 112 genes which were up-regulated and 101 genes which were down-regulated in AC/BC compared to the SCC. Further, for stage Ib patients, metagenes potentially associated with survival were identified. Genes that were expressed differently between normal lung tissue and cancer showed enrichment in gene ontology terms associated with mitosis and proliferation. Bayesian regression and Kaplan-Meier analysis showed that gene-expression patterns and metagene profiles can be applied to predict the probability of different survival outcomes in NSCLC patients. PMID:21695068

  14. Clinical Significance of Preoperative Albumin and Globulin Ratio in Patients with Gastric Cancer Undergoing Treatment

    Directory of Open Access Journals (Sweden)

    Min-jie Mao

    2017-01-01

    Full Text Available Background. The pretreatment albumin and globulin ratio (AGR) was an inflammation-associated factor which was related to the overall survival in various malignancies. The aim of this study was to evaluate the prognostic value of AGR in patients with gastric cancer. Method. This retrospective study included 862 cases pathologically diagnosed with gastric cancer. All patients were randomly divided into a testing group (431 cases) and a validation group (431 cases). The relationships of AGR with clinicopathologic characteristics and prognosis were analyzed by Kaplan-Meier and Cox regression methods. Results. In the testing group, the median overall survival was 26.90 months and the cutoff value of AGR was 1.50 based on R language. Kaplan-Meier analysis showed that lower AGR was correlated with poorer overall survival. Multivariate analysis demonstrated that AGR was an independent prognostic factor for overall survival (HR: 0.584, 95% CI = 0.351–0.973, and p = 0.039). In the validation group, the median overall survival was 24.10 months. Lower AGR (≤1.50) also had a significantly poorer overall survival by Kaplan-Meier analysis. According to multivariate analysis, the AGR was also confirmed to be an independent prognostic factor for overall survival (HR: 0.578, 95% CI = 0.373–0.897, and p = 0.015). Conclusions. Our study suggested that the pretreatment AGR could be a prognostic biomarker for overall survival in patients with gastric cancer.

  15. Mortality, Causes of Death and Associated Factors Relate to a Large HIV Population-Based Cohort.

    Science.gov (United States)

    Garriga, César; García de Olalla, Patricia; Miró, Josep M; Ocaña, Inma; Knobel, Hernando; Barberá, Maria Jesús; Humet, Victoria; Domingo, Pere; Gatell, Josep M; Ribera, Esteve; Gurguí, Mercè; Marco, Andrés; Caylà, Joan A

    2015-01-01

    Antiretroviral therapy has led to a decrease in HIV-related mortality and to the emergence of non-AIDS defining diseases as competing causes of death. This study estimates the HIV mortality rate and its risk factors with regard to different causes in a large city from January 2001 to June 2013. We followed up 3137 newly diagnosed HIV non-AIDS cases. Causes of death were classified as HIV-related, non-HIV-related and external. We examined the effect of risk factors on survival using mortality rates, Kaplan-Meier plots and Cox models. Finally, we estimated survival for each main cause-of-death group using Fine and Gray models. 182 deaths were found [14.0/1000 person-years of follow-up (py); 95% confidence interval (CI): 12.0-16.1/1000 py]; 81.3% of them had a known cause of death. The mortality rate from HIV-related and non-HIV-related causes was the same (4.9/1000 py; CI: 3.7-6.1/1000 py), while that from external causes was lower (1.7/1000 py; CI: 1.0-2.4/1000 py). Kaplan-Meier estimates showed worse survival in intravenous drug users (IDU) and heterosexuals than in men having sex with men (MSM). Factors associated with HIV-related causes of death included IDU male (subhazard ratio (sHR): 3.2; CI: 1.5-7.0); factors associated with non-HIV-related causes of death included ageing (sHR: 1.5; CI: 1.4-1.7) and heterosexual female (sHR: 2.8; CI: 1.1-7.3) versus MSM. Factors associated with external causes of death were IDU male (sHR: 28.7; CI: 6.7-123.2) and heterosexual male (sHR: 11.8; CI: 2.5-56.4) versus MSM. There are important differences in survival among transmission groups. Improved treatment is especially necessary in IDUs and heterosexual males.

  16. Bias Correction Methods Explain Much of the Variation Seen in Breast Cancer Risks of BRCA1/2 Mutation Carriers.

    Science.gov (United States)

    Vos, Janet R; Hsu, Li; Brohet, Richard M; Mourits, Marian J E; de Vries, Jakob; Malone, Kathleen E; Oosterwijk, Jan C; de Bock, Geertruida H

    2015-08-10

    Recommendations for treating patients who carry a BRCA1/2 mutation are mainly based on cumulative lifetime risks (CLTRs) of breast cancer determined from retrospective cohorts. These risks vary widely (27% to 88%), and it is important to understand why. We analyzed the effects of methods of risk estimation and bias correction and of population factors on CLTRs in this retrospective clinical cohort of BRCA1/2 carriers. The following methods to estimate the breast cancer risk of BRCA1/2 carriers were identified from the literature: Kaplan-Meier, frailty, and modified segregation analyses with bias correction consisting of including or excluding index patients combined with including or excluding first-degree relatives (FDRs) or different conditional likelihoods. These were applied to clinical data of BRCA1/2 families derived from our family cancer clinic, for whom a simulation was also performed to evaluate the methods. CLTRs and 95% CIs were estimated and compared with the reference CLTRs. CLTRs ranged from 35% to 83% for BRCA1 and 41% to 86% for BRCA2 carriers at age 70 years (width of 95% CIs: 10% to 35% and 13% to 46%, respectively). Relative bias varied from -38% to +16%. Bias correction with inclusion of index patients and untested FDRs gave the smallest bias: +2% (SD, 2%) in BRCA1 and +0.9% (SD, 3.6%) in BRCA2. Much of the variation in breast cancer CLTRs in retrospective clinical BRCA1/2 cohorts is due to the bias-correction method, whereas a smaller part is due to population differences. Kaplan-Meier analyses with bias correction that includes index patients and a proportion of untested FDRs provide suitable CLTRs for carriers counseled in the clinic. © 2015 by American Society of Clinical Oncology.

  17. Short-term outcomes after incontinent conduit for gynecologic cancer: comparison of ileal, sigmoid, and transverse colon.

    Science.gov (United States)

    Tabbaa, Zaid M; Janco, Jo Marie T; Mariani, Andrea; Dowdy, Sean C; McGree, Michaela E; Weaver, Amy L; Cliby, William A

    2014-06-01

    The aim of this study is to estimate the overall rates of significant incontinent conduit-related complications and compare rates between conduit types. This was a retrospective review of 166 patients who underwent incontinent urinary diversion from April 1993 through April 2013. Patients were categorized by conduit type: ileal, sigmoid colon, and transverse colon. Significant conduit-related complications were assessed at 30 and 90 days after surgery. Significant conduit-related complication was defined as any of the following: ureteral stricture, conduit leak, conduit obstruction, conduit ischemia, ureteral anastomotic leak, stent obstruction requiring intervention via interventional radiology procedure or reoperation, and renal failure. A total of 166 patients underwent formation of an incontinent urinary conduit, most commonly during exenteration for gynecologic malignancy. There were 129 ileal, 11 transverse colon, and 26 sigmoid conduits. The overall significant conduit-related complication rate within 30 days was 15.1%. Complication rates for ileal, transverse and sigmoid conduits were 14.7%, 0%, and 23.1%, respectively (Fisher's exact test, p=0.24). By 90 days, the Kaplan-Meier estimated rates of significant complications were 21.8% overall, and 22.3%, 0%, and 28.9%, respectively, by conduit type (log-rank test, p=0.19). The most common significant conduit-related complications were conduit or ureteral anastomotic leaks and conduit obstructions. By 1 and 2 years following surgery, the Kaplan-Meier estimated overall rate of significant conduit-related complication increased to 26.5% and 30.1%, respectively. Our study suggests that there are multiple appropriate tissue sites for use in incontinent conduit formation, and surgical approach should be individualized. Most significant conduit-related complications occur within 90 days after surgery. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data
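    The method-of-moments option among the five compared methods can be illustrated with a conjugate Beta prior for a failure probability: the prior's parameters are matched to the mean and variance of past observed failure rates, then updated with new binomial data. The past-data values below are invented, and this is only a sketch of one of the methods, not the paper's implementation.

```python
def beta_prior_from_moments(past_rates):
    """Method-of-moments Beta(a, b) prior for a failure probability,
    matched to the sample mean and variance of past failure rates."""
    n = len(past_rates)
    mean = sum(past_rates) / n
    var = sum((x - mean) ** 2 for x in past_rates) / (n - 1)
    common = mean * (1.0 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common

def beta_binomial_update(a, b, failures, trials):
    """Conjugate posterior after observing `failures` in `trials`."""
    return a + failures, b + trials - failures

# Invented past failure fractions from comparable components:
a, b = beta_prior_from_moments([0.10, 0.20, 0.15, 0.05])
a_post, b_post = beta_binomial_update(a, b, failures=1, trials=20)
posterior_mean = a_post / (a_post + b_post)
```

    With conjugacy, the posterior stays in the Beta family, which makes the effect of the assumed prior information on the posterior easy to trace.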

  19. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers-Part I

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Bayliss, Jon; Ludwig, Larry

    2008-01-01

    Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From this data, we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross sectioned and studied using a focused ion beam (FIB).

  20. Estimation of Extreme Responses and Failure Probability of Wind Turbines under Normal Operation by Controlled Monte Carlo Simulation

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri

    of the evolution of the PDF of a stochastic process; hence an alternative to the FPK. The considerable advantage of the introduced method over FPK is that its solution does not require high computational cost which extends its range of applicability to high order structural dynamic problems. The problem...... an alternative approach for estimation of the first excursion probability of any system is based on calculating the evolution of the Probability Density Function (PDF) of the process and integrating it on the specified domain. Clearly this provides the most accurate results among the three classes of the methods....... The solution of the Fokker-Planck-Kolmogorov (FPK) equation for systems governed by a stochastic differential equation driven by Gaussian white noise will give the sought time variation of the probability density function. However the analytical solution of the FPK is available for only a few dynamic systems...

  1. An approach for estimating the breach probabilities of moraine-dammed lakes in the Chinese Himalayas using remote-sensing data

    Directory of Open Access Journals (Sweden)

    X. Wang

    2012-10-01

    Full Text Available To make first-order estimates of the probability of moraine-dammed lake outburst flood (MDLOF) and prioritize the probabilities of breaching posed by potentially dangerous moraine-dammed lakes (PDMDLs) in the Chinese Himalayas, an objective approach is presented. We first select five indicators to identify PDMDLs according to four predesigned criteria. The climatic background was regarded as the climatic precondition of the moraine-dam failure, and under different climatic preconditions, we distinguish the trigger mechanisms of MDLOFs and subdivide them into 17 possible breach modes, with each mode having three or four components; we combined the precondition, modes and components to construct a decision-making tree of moraine-dam failure. Conversion guidelines were established so as to quantify the probabilities of components of a breach mode employing the historic performance method combined with expert knowledge and experience. The region of the Chinese Himalayas was chosen as a study area where there have been frequent MDLOFs in recent decades. The results show that the breaching probabilities (P) of 142 PDMDLs range from 0.037 to 0.345, and they can be further categorized as 43 lakes with very high breach probabilities (P ≥ 0.24), 47 lakes with high breach probabilities (0.18 ≤ P < 0.24), 24 lakes with mid-level breach probabilities (0.12 ≤ P < 0.18), 24 lakes with low breach probabilities (0.06 ≤ P < 0.12), and four lakes with very low breach probabilities (P < 0.06).

  2. Estimation of delayed neutron emission probability by using the gross theory of nuclear β-decay

    International Nuclear Information System (INIS)

    Tachibana, Takahiro

    1999-01-01

    The delayed neutron emission probabilities (Pn-values) of fission products are necessary in the study of reactor physics; e.g. in the calculation of total delayed neutron yields and in the summation calculation of decay heat. In this report, the Pn-values estimated by the gross theory for some fission products are compared with experiment, and it is found that, on the average, the semi-gross theory somewhat underestimates the experimental Pn-values. A modification of the β-decay strength function is briefly discussed to get more reasonable Pn-values. (author)

  3. Application of the standard lymphadenectomy in gastric adenocarcinoma

    International Nuclear Information System (INIS)

    Aguirre Fernandez, Roberto Eduardo; LLera Dominguez, Gerardo de la; Pennon Guerra, Maria; Alvarez Perez, Alvaro

    2011-01-01

    The 5-year postoperative survival of 132 patients operated on for gastric adenocarcinoma who underwent lymphadenectomy of the first two ganglionic levels was assessed. The epidemiological parameters were determined, and survival was estimated by the Kaplan-Meier method.

  4. The impact of socioeconomic status on the association between biomedical and psychosocial well-being and all-cause mortality in older Spanish adults.

    Science.gov (United States)

    Doménech-Abella, Joan; Mundó, Jordi; Moneta, Maria Victoria; Perales, Jaime; Ayuso-Mateos, José Luis; Miret, Marta; Haro, Josep Maria; Olaya, Beatriz

    2018-03-01

    The aim of this paper was to analyze the effect of biomedical and psychosocial well-being, based on distinct successful aging (SA) models, on time to mortality, and to determine whether this effect was modified by socioeconomic status (SES) in a nationally representative sample of older Spanish adults. Data were taken from a 3-year follow-up study with 2783 participants aged 50 or over. Vital status was ascertained using national registers or by asking participants' relatives. Kaplan-Meier curves were used to estimate the time to death by SES and by levels of biomedical and psychosocial SA. Cox proportional hazards regression models were conducted to explore interactions between SES and SA models while adjusting for gender, age, and marital status. Lower levels of SES and of biomedical and psychosocial SA were associated with a lower probability of survival. Only the interaction between SES and biomedical SA was significant. Biomedical SA affected mortality rates among individuals with low SES but not among those with medium or high SES, whereas psychosocial SA affected mortality regardless of SES. Promoting equal access to the health care system and improving psychosocial well-being could be protective factors against premature mortality in older Spanish adults with low SES.

  5. Development and Validation of a Calculator for Estimating the Probability of Urinary Tract Infection in Young Febrile Children.

    Science.gov (United States)

    Shaikh, Nader; Hoberman, Alejandro; Hum, Stephanie W; Alberty, Anastasia; Muniz, Gysella; Kurs-Lasky, Marcia; Landsittel, Douglas; Shope, Timothy

    2018-06-01

    Accurately estimating the probability of urinary tract infection (UTI) in febrile preverbal children is necessary to appropriately target testing and treatment. To develop and test a calculator (UTICalc) that can first estimate the probability of UTI based on clinical variables and then update that probability based on laboratory results. Review of electronic medical records of febrile children aged 2 to 23 months who were brought to the emergency department of Children's Hospital of Pittsburgh, Pittsburgh, Pennsylvania. An independent training database comprising 1686 patients brought to the emergency department between January 1, 2007, and April 30, 2013, and a validation database of 384 patients were created. Five multivariable logistic regression models for predicting risk of UTI were trained and tested. The clinical model included only clinical variables; the remaining models incorporated laboratory results. Data analysis was performed between June 18, 2013, and January 12, 2018. Documented temperature of 38°C or higher in children aged 2 months to less than 2 years. With the use of culture-confirmed UTI as the main outcome, cutoffs for high and low UTI risk were identified for each model. The resultant models were incorporated into a calculation tool, UTICalc, which was used to evaluate medical records. A total of 2070 children were included in the study. The training database comprised 1686 children, of whom 1216 (72.1%) were female and 1167 (69.2%) white. The validation database comprised 384 children, of whom 291 (75.8%) were female and 200 (52.1%) white. Compared with the American Academy of Pediatrics algorithm, the clinical model in UTICalc reduced testing by 8.1% (95% CI, 4.2%-12.0%) and decreased the number of UTIs that were missed from 3 cases to none. 
Compared with empirically treating all children with a leukocyte esterase test result of 1+ or higher, the dipstick model in UTICalc would have reduced the number of treatment delays by 10.6% (95% CI

  6. Are recommendations from carcinoma of the cervix Patterns of Care studies (PCS) in the United States of America (USA) applicable to centers in developing countries?

    International Nuclear Information System (INIS)

    Craighead, Peter S.; Smulian, Hubert G.; Groot, Henk J. de; Toit, Pierre F.M. du

    1997-01-01

    Purpose: To compare patient demographics, treatment resources, practice patterns, and outcome results for squamous cell carcinoma of the uterine cervix (SCC) between the 1978 and 1983 Patterns of Care studies (PCS) in the United States of America (USA) and a nonacademic center within a developing country. Methods and Materials: Patient details (race, age, stage, and number per year), treatment used, and treatment outcome were retrieved from the charts of the 1160 cases registered at this center with SCC of the cervix between 1976 and 1985. Demographic variables and Kaplan-Meier survival estimates were calculated and compared with results from published PCS reviews. Results: There is a significant difference in the racial group presentation of cervix cancer at this center compared with the PCS reviews (p < 0.005), and median ages are significantly lower at this center (p < 0.001). The proportion of patients with Stage III or more was significantly higher at this center than at the PCS centers (24% at PCS centers vs. 47% at this center, p < 0.001). There were also vast differences in facility resources. Fewer cases at this center underwent intracavitary insertions than at PCS centers. Mean Point A doses were significantly reduced for this center compared with the PCS reviews. Kaplan-Meier estimates were similar for Stage I and II in PCS centers and this center, but were inferior for this center in Stage III patients (p < 0.05 for OS and p < 0.01 for LC). Late morbidity rates were similar for both PCS centers and this center. Conclusion: PCS recommendations may be applicable to nonacademic centers within developing countries, if the latter use staging techniques that are consistent with the PCS staging guidelines.

  7. Lifetime anxiety disorder and current anxiety symptoms associated with hastened depressive recurrence in bipolar disorder.

    Science.gov (United States)

    Shah, Saloni; Kim, Jane P; Park, Dong Yeon; Kim, Hyun; Yuen, Laura D; Do, Dennis; Dell'Osso, Bernardo; Hooshmand, Farnaz; Miller, Shefali; Wang, Po W; Ketter, Terence A

    2017-09-01

    To assess differential relationships between lifetime anxiety disorder/current anxiety symptoms and longitudinal depressive severity in bipolar disorder (BD). Stanford BD Clinic outpatients enrolled during 2000-2011 were assessed with the Systematic Treatment Enhancement Program for BD (STEP-BD) Affective Disorders Evaluation and followed with the STEP-BD Clinical Monitoring Form while receiving naturalistic treatment for up to two years. Baseline unfavorable illness characteristics/current mood symptoms and times to depressive recurrence/recovery were compared in patients with versus without lifetime anxiety disorder/current anxiety symptoms. Among 105 currently recovered patients, lifetime anxiety disorder was significantly associated with 10/27 (37.0%) demographic/other unfavorable illness characteristics/current mood symptoms/current psychotropics, hastened depressive recurrence (driven by earlier onset age), and a significantly (> two-fold) higher Kaplan-Meier estimated depressive recurrence rate, whereas current anxiety symptoms were significantly associated with 10/27 (37.0%) demographic/other unfavorable illness characteristics/current mood symptoms/current psychotropics and hastened depressive recurrence (driven by lifetime anxiety disorder), but only a numerically higher Kaplan-Meier estimated depressive recurrence rate. In contrast, among 153 currently depressed patients, lifetime anxiety disorder/current anxiety symptoms were not significantly associated with time to depressive recovery or depressive recovery rate. American tertiary BD clinic referral sample, open naturalistic treatment. 
Research is needed regarding differential relationships between lifetime anxiety disorder and current anxiety symptoms and hastened/delayed depressive recurrence/recovery - specifically whether lifetime anxiety disorder versus current anxiety symptoms has marginally more robust association with hastened depressive recurrence, and whether both have marginally more robust

  8. Miners’ return to work following injuries in coal mines

    Directory of Open Access Journals (Sweden)

    Ashis Bhattacherjee

    2016-12-01

    Background: Occupational injuries in mines are common and result in severe socio-economic consequences. Earlier studies have revealed the role of multiple factors, such as demographic factors, behavioral factors, health-related factors, working environment, and working conditions, in mine injuries. However, there is a dearth of information about the role of some of these factors in delayed return to work (RTW) following a miner's injury. These factors may include personal characteristics of the injured person and his or her family, the injured person's social and economic status, and job characteristics. This study was conducted to assess the role of some of these factors in return to work following coal miners' injuries. Material and Methods: A study was conducted on 109 injured workers from an underground coal mine in the years 2000–2009. A questionnaire, completed through personal interviews, covered among other items age, height, weight, seniority, alcohol consumption, sleep duration, presence of disease, job stress, job satisfaction, and injury type. The data were analyzed using Kaplan-Meier estimates and the Cox proportional hazard model. Results: The Kaplan-Meier estimates revealed that a lower number of dependents, longer sleep duration, no job stress, no disease, no alcohol addiction, and higher monthly income had a great impact on early return to work after injury. The Cox regression analysis revealed that the significant risk factors influencing miners' return to work included presence of disease, job satisfaction, and injury type. Conclusions: Mine management should pay attention to significant risk factors for injuries in order to develop effective preventive measures. Med Pr 2016;67(6):729–742

  9. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    The desired number of fields is sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates, including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation being positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noise in possibly realistic ranges. In all cases examined, the proportionator…

  10. Analysis of tongue cancer recurrence considering the presence of competing events.

    Directory of Open Access Journals (Sweden)

    Raúl Vicente Mantilla Quispe

    2008-10-01

    Objective: To estimate the probability of recurrence in patients with tongue cancer according to age, pathological nodal status, and type of treatment, considering death before recurrence as a competing event. Material and methods: Retrospective case series of 290 patients with tongue cancer treated at INEN between 1977 and 2000. Twenty-nine patients treated with radiotherapy alone were excluded; of the remaining 261 patients, 31 received treatment of the primary tumor only. Results: Recurrence occurred in 36.8%, local recurrence being the most frequent. The cumulative incidence of recurrence at 5 years was estimated at 44.7% with the Kaplan-Meier method and at 42.4% with the competing-risks analysis. In univariate analysis accounting for competing risks, the cumulative incidences were 56.8% and 38.9% in patients aged 45 years or less and over 45 years, respectively (p = 0.1556); 29.4% and 50.5% in patients with negative and positive pathological nodes, respectively (p = 0.0002); and 37.5% in patients treated with surgery versus 47.4% with radiotherapy (p = 0.03). In multivariate analysis with competing-risks regression, no difference in recurrence was found between treatment types (RR = 1.146, p = 0.620). Conclusions: The small difference between the Kaplan-Meier results and those accounting for competing events is due to the low rate of the competing event, all the more so because it involves death unrelated to the disease. The recurrence rate was similar to that reported in the literature. A significant difference in recurrence rate was found only in the group with positive nodal involvement. Although comparison with the standard methods, Kaplan-Meier and Cox regression, shows similar results, competing events should be taken into account. (Rev Med Hered 2008;19:145-151.)
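The contrast the abstract draws between the Kaplan-Meier estimate and the competing-risks (cumulative incidence) estimate can be shown on synthetic data. A minimal sketch, with function and data of our own (not from the study); event code 1 is the event of interest, 2 the competing event, 0 censoring:

```python
def km_and_cif(times, events):
    """Compare 1 - S_KM (competing event treated as censoring) with the
    cumulative incidence function (Aalen-Johansen) for event type 1.
    Assumes distinct event times for simplicity."""
    data = sorted(zip(times, events))
    n = len(data)
    s_km = 1.0   # KM survival for event 1, censoring the competing event
    s_all = 1.0  # KM survival for "any event", used to weight CIF increments
    cif = 0.0
    for i, (t, e) in enumerate(data):
        n_risk = n - i
        if e == 1:
            cif += s_all / n_risk          # Aalen-Johansen increment
            s_km *= 1.0 - 1.0 / n_risk
        if e in (1, 2):
            s_all *= 1.0 - 1.0 / n_risk
    return 1.0 - s_km, cif

# With competing events present, 1 - KM overstates the true incidence:
one_minus_km, cif = km_and_cif([1, 2, 3, 4], [2, 1, 2, 1])
```

Here 1 - KM is 1.0 while the cumulative incidence is 0.5; when no competing events occur, the two estimates coincide, mirroring the abstract's observation that they differ only to the extent that the competing event actually happens.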

  11. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    Science.gov (United States)

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by random noise in retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions of the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.

  12. Third-line Targeted Therapy in Metastatic Renal Cell Carcinoma: Results from the International Metastatic Renal Cell Carcinoma Database Consortium

    DEFF Research Database (Denmark)

    Wells, J Connor; Stukalin, Igor; Norton, Craig

    2017-01-01

    and were included in the analysis. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Patients were analyzed for overall survival (OS) and progression-free survival using Kaplan-Meier curves, and were evaluated for overall response. Cox regression analyses were used to determine the statistical association...

  13. Incidence and Outcome of BRCA Mutations in Unselected Patients with Triple Receptor-Negative Breast Cancer.

    LENUS (Irish Health Repository)

    Gonzalez-Angulo, Ana M

    2011-03-01

    To investigate the incidence of germline and somatic BRCA1/2 mutations in unselected patients with triple-negative breast cancer (TNBC) and to determine the prognostic significance of carrying a mutation. Methods: DNA was obtained from 77 TNBC and normal tissues. BRCA1/2 exons and flanking regions were sequenced from tumor tissue, and patients were classified as mutant or wild type (WT). Sequencing was repeated from normal tissue to identify germline and somatic mutations. Patient characteristics were compared with the chi-square test. Survival was estimated by the Kaplan-Meier method and compared with the log-rank test. Cox proportional hazards models were fit to determine the independent association of mutation status with outcome.

  14. Magnitude of bacteraemia predicts one-year mortality

    DEFF Research Database (Denmark)

    Gradel, Kim Oren; Schønheyder, Henrik Carl; Søgaard, Mette

    Objectives: All hospitals in our region use the BacT/Alert® blood culture (BC) system with a 3-bottle BC set for adults. We hypothesized that the magnitude of bacteremia (i.e., the number of positive bottles in the initial BC set) predicted one-year mortality. Methods: In a population-based study we … with a BC index of 1 (i.e., one positive bottle) were chosen as the reference group. We computed Kaplan-Meier curves and performed Cox regression analyses to estimate mortality rate ratios (MRRs) with 95% confidence intervals [CIs] 30 and 365 days after the initial BC sampling date, first in crude analyses … mortality.

  15. Parental consanguineous marriages and clinical response to chemotherapy in locally advanced breast cancer patients.

    Science.gov (United States)

    Saadat, Mostafa; Khalili, Maryam; Omidvari, Shahpour; Ansari-Lari, Maryam

    2011-03-28

    The main aim of the present study was to investigate the association between parental consanguinity and clinical response to chemotherapy in females affected with locally advanced breast cancer. A consecutive series of 92 patients were prospectively included in this study. Clinical assessment of treatment was accomplished by comparing initial tumor size with preoperative tumor size using the revised RECIST guideline (version 1.1). Clinical response was defined as complete response, partial response, or no response. Kaplan-Meier survival analysis was used to evaluate the association between parental marriages (first-cousin vs unrelated marriages) and clinical response to chemotherapy (complete and partial response vs no response). The number of courses of chemotherapy was considered as the time variable in the analysis. Kaplan-Meier analysis revealed that offspring of unrelated marriages had a poorer response to chemotherapy (log rank statistic = 5.10, df = 1, P = 0.023). Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. To estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated from the model parameters and the attributes of specific buildings can subsequently be visualized in probability maps.
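A fitted logistic model of this kind turns building attributes into a fire probability via the standard sigmoid. A minimal sketch; the coefficients, intercept, and feature values below are purely illustrative, not the study's:

```python
import math

def fire_probability(features, coeffs, intercept):
    """Logistic regression: p = 1 / (1 + exp(-(b0 + b·x)))."""
    z = intercept + sum(b * x for b, x in zip(coeffs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical building with two attributes (e.g. an age score and an
# alarm-system indicator); coefficients are invented for the sketch.
p = fire_probability([1.0, 0.0], [0.8, -1.2], -2.0)
```

The resulting per-building probabilities are exactly the values one would rasterize into the probability maps the abstract describes.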

  17. Estimation method for first excursion probability of secondary system with impact and friction using maximum response

    International Nuclear Information System (INIS)

    Shigeru Aoki

    2005-01-01

    The secondary system, such as piping, tanks, and other mechanical equipment, is installed in the primary system, such as a building. Important secondary systems should be designed to maintain their function even when subjected to destructive earthquake excitations. The secondary system has many nonlinear characteristics. Impact and friction characteristics, which are observed in mechanical supports and joints, are common nonlinearities, and in the form of impact dampers and friction dampers they are used to reduce seismic response. In this paper, analytical methods for the first excursion probability of a secondary system with impact and friction subjected to earthquake excitation are proposed. Using these methods, the effects of impact force, gap size, and friction force on the first excursion probability are examined. When the tolerance level is normalized by the maximum response of the secondary system without impact or friction characteristics, the variation of the first excursion probability is very small for various values of the natural period. In order to examine the effectiveness of the proposed method, the obtained results are compared with those obtained by a simulation method. Some estimation methods for the maximum response of secondary systems with nonlinear characteristics have been developed. (author)

  18. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    Human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). Several methods are used for analyzing human error, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), and new methods for human reliability analysis (HRA) are under development. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze an FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probabilities and can be applied to any kind of operator action, including severe accident management strategies.
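The core quantity in such a time-reliability calculation is the probability that the required time exceeds the available time. A Monte Carlo sketch under assumed lognormal time distributions (the parameters below are invented for illustration; the study derived its distributions from MAAP runs and LHS sampling):

```python
import math
import random

random.seed(7)
n = 200_000

def lognormal(mu, sigma):
    # Sample exp(N(mu, sigma)); mu is the log of the median time.
    return math.exp(random.gauss(mu, sigma))

# Hypothetical time distributions, in minutes (illustrative parameters only):
# required time median 20 min, available time median 35 min.
failures = sum(
    lognormal(math.log(20.0), 0.3) > lognormal(math.log(35.0), 0.2)
    for _ in range(n)
)
hep = failures / n   # human error probability of non-response
```

Under these assumed parameters the non-response probability works out to roughly 6%; with the study's actual MAAP-derived distributions the number would of course differ.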

  19. The preoperative plasma fibrinogen level is an independent prognostic factor for overall survival of breast cancer patients who underwent surgical treatment.

    Science.gov (United States)

    Wen, Jiahuai; Yang, Yanning; Ye, Feng; Huang, Xiaojia; Li, Shuaijie; Wang, Qiong; Xie, Xiaoming

    2015-12-01

    Previous studies have suggested that plasma fibrinogen contributes to tumor cell proliferation, progression, and metastasis. The current study was performed to evaluate the prognostic relevance of preoperative plasma fibrinogen in breast cancer patients. Data of 2073 consecutive breast cancer patients, who underwent surgery between January 2002 and December 2008 at the Sun Yat-sen University Cancer Center, were retrospectively evaluated. Plasma fibrinogen levels were routinely measured before surgery. Participants were grouped by the cutoff value estimated by receiver operating characteristic (ROC) curve analysis. Overall survival (OS) was assessed using Kaplan-Meier analysis, and a multivariate Cox proportional hazards regression model was used to evaluate the independent prognostic value of plasma fibrinogen level. The optimal cutoff value of preoperative plasma fibrinogen was determined to be 2.83 g/L. The Kaplan-Meier analysis showed that patients with high fibrinogen levels had significantly shorter OS than patients with low fibrinogen levels, and multivariate analysis showed that plasma fibrinogen level was an independent prognostic factor for OS in breast cancer patients (HR = 1.475, 95% confidence interval (CI): 1.177-1.848, p = 0.001). Subgroup analyses revealed that plasma fibrinogen level was an unfavorable prognostic parameter in stage II-III, Luminal-subtype, and triple-negative breast cancer patients. Elevated preoperative plasma fibrinogen was independently associated with poor prognosis in breast cancer patients and may serve as a valuable parameter for risk assessment. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
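The ROC-derived cutoff step can be sketched with Youden's J statistic, one common definition of an "optimal" ROC cutoff. The function and toy data are ours; the study's 2.83 g/L value came from its own cohort:

```python
def youden_cutoff(values, outcome):
    """Return the cutoff maximizing sensitivity + specificity - 1 (Youden's J)."""
    best_c, best_j = None, -1.0
    for c in sorted(set(values)):
        pred = [v >= c for v in values]           # classify "high" at >= c
        pos = [p for p, o in zip(pred, outcome) if o]
        neg = [p for p, o in zip(pred, outcome) if not o]
        sens = sum(pos) / len(pos)                # true-positive rate
        spec = sum(not p for p in neg) / len(neg)  # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

# Toy fibrinogen-like values (g/L) with event indicators:
cutoff, j = youden_cutoff([2.0, 2.1, 2.5, 2.9, 3.1, 3.4],
                          [False, False, False, True, True, True])
```

On this perfectly separable toy data the chosen cutoff is 2.9 g/L with J = 1; on real data J is well below 1 and the cutoff trades sensitivity against specificity.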

  20. Factors affecting commencement and cessation of smoking behaviour in Malaysian adults

    Directory of Open Access Journals (Sweden)

    Ghani Wan

    2012-03-01

    Background: The tobacco consumption peak in developed countries has passed; however, consumption is on the increase in many developing countries. Apart from cigarettes, consumption of local hand-rolled cigarettes such as bidi and rokok daun is prevalent in specific communities. Although factors associated with smoking initiation and cessation have been investigated elsewhere, the only available data for Malaysia concern prevalence. This study aims to investigate factors associated with smoking initiation and cessation, which is imperative in designing intervention programs. Methods: Data were collected from 11,697 adults by trained recording clerks on sociodemographic characteristics, practice of other risk habits, and details of smoking such as type, duration, and frequency. Smoking commencement and cessation were analyzed using Kaplan-Meier estimates and log-rank tests. Univariate and multivariate Cox proportional hazard regression models were used to calculate hazard rate ratios. Results: Males had a much higher prevalence of the habit (61.7%) than females (5.8%). Cessation was most common among the Chinese and those regularly consuming alcoholic beverages. The Kaplan-Meier plot shows that although males are more likely to start smoking, females are less likely to stop. A history of betel quid chewing and alcohol consumption significantly increased the likelihood of commencement (p < 0.05). Conclusions: Gender, ethnicity, history of quid chewing, and alcohol consumption were found to be important factors in smoking commencement, while ethnicity, betel quid chewing, and type of tobacco smoked influenced cessation.

  1. Fragility estimation for seismically isolated nuclear structures by high confidence low probability of failure values and bi-linear regression

    International Nuclear Information System (INIS)

    Carausu, A.

    1996-01-01

    A method for the fragility estimation of seismically isolated nuclear power plant structures is proposed. The relationship between the ground motion intensity parameter (e.g. peak ground velocity or peak ground acceleration) and the response of isolated structures is expressed in terms of a bi-linear regression line, whose coefficients are estimated by the least-squares method from available data on seismic input and structural response. The notion of a high confidence low probability of failure (HCLPF) value is also used for deriving compound fragility curves for coupled subsystems. (orig.)
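The least-squares step the abstract describes can be sketched for one segment of the bi-linear relation. The intensity-response pairs below are invented for illustration, not taken from the paper:

```python
def least_squares_line(x, y):
    """Ordinary least-squares fit y ≈ a*x + b (one linear segment)."""
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    sxy = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
    sxx = sum((xi - xm) ** 2 for xi in x)
    a = sxy / sxx
    return a, ym - a * xm

# Hypothetical peak ground acceleration (g) vs. isolated-structure response:
pga = [0.1, 0.2, 0.3, 0.4, 0.5]
resp = [0.8, 1.7, 2.5, 3.6, 4.4]
a, b = least_squares_line(pga, resp)
```

In the paper's bi-linear setting, two such fits (with a break point) replace the single line shown here.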

  2. Comparative Study of Barotrauma Risk during Fish Passage through Kaplan Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Richmond, Marshall C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Hydrology Group; Romero-Gomez, Pedro [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Hydrology Group; Serkowski, John A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Hydrology Group; Rakowski, Cynthia L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Hydrology Group; Graf, Michael J. [Voith Hydro, York, PA (United States)

    2015-10-01

    Rapid pressure changes in hydroelectric turbine flows can cause barotrauma that can be hazardous to the passage of fish, in particular migratory juvenile salmonids. Although numerous laboratory tests have evaluated the effect of rapid decompression in fish species of relevance, numerical modeling studies offer the advantage of predicting, for new turbine designs, the potential risks of mortality and injury from rapid pressure change during turbine passage. However, rapid pressure change is only one of several hydraulic risks encountered by fish during turbine passage in addition to blade strike, shear, and turbulence. To better understand the role of rapid pressure changes, the present work focuses on the application of a computational fluid dynamics based method for evaluating the risk of pressure-related mortality to fish passing through an early 1960s era original hydroelectric Kaplan turbine at Wanapum Dam (Columbia River, Washington), and a modern advanced Kaplan turbine installed in 2005. The results show that the modeling approach acceptably reproduced the nadir pressure distributions compared to field data previously collected at the site using an autonomous sensor. Our findings show that the new advanced-design unit performs better, in terms of reduced barotrauma risk to fish from exposure to low pressures, than the original turbine unit. The outcomes allow for comparative analyses of turbine designs and operations prior to installation, an advantage that can potentially be integrated in the process of designing new turbine units to achieve superior environmental performance. Overall, the results show that modern turbine designs can achieve the multiple objectives of increasing power generation, lowering cavitation potential, and reducing barotrauma risks to passing fish.

  3. Modelling the survivorship of Nigeria children in their first 10 years of ...

    African Journals Online (AJOL)

    Fagbamigbe

    Conflict of Interest: Authors declared no conflict of interest ... Keywords: Survivorship, Nigeria, children mortality, Kaplan Meier, Brass Indirect method, Prediction ... variables or sex of older siblings, post-neonatal mortality is 12% higher and 2nd ... Relationship between maternal education and child survival in developing ...

  4. Quantitative estimation of the human error probability during soft control operations

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jung, Wondea

    2013-01-01

    Highlights: ► An HRA method to evaluate execution HEP for soft control operations was proposed. ► The soft control tasks were analyzed and design-related influencing factors were identified. ► An application to evaluate the effects of soft controls was performed. - Abstract: In this work, a method was proposed for quantifying human errors that can occur during operation executions using soft controls. Soft controls of advanced main control rooms have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to identify the human error modes and quantify the error probability for evaluating the reliability of the system and preventing errors. This work suggests an evaluation framework for quantifying the execution error probability using soft controls. In the application result, it was observed that the human error probabilities of soft controls showed both positive and negative results compared to the conventional controls according to the design quality of advanced main control rooms

  5. Uniform Estimate of the Finite-Time Ruin Probability for All Times in a Generalized Compound Renewal Risk Model

    Directory of Open Access Journals (Sweden)

    Qingwu Gao

    2012-01-01

    Full Text Available We discuss the uniformly asymptotic estimate of the finite-time ruin probability for all times in a generalized compound renewal risk model, where the interarrival times of successive accidents and all the claim sizes caused by an accident are two sequences of random variables following a wide dependence structure. This wide dependence structure allows random variables to be either negatively dependent or positively dependent.

  6. Beyond reliability, multi-state failure analysis of satellite subsystems: A statistical approach

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Reliability is widely recognized as a critical design attribute for space systems. In recent articles, we conducted nonparametric analyses and Weibull fits of satellite and satellite subsystems reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we extend our investigation of failures of satellites and satellite subsystems beyond the binary concept of reliability to the analysis of their anomalies and multi-state failures. In reliability analysis, the system or subsystem under study is considered to be either in an operational or failed state; multi-state failure analysis introduces 'degraded states' or partial failures, and thus provides more insights through finer resolution into the degradation behavior of an item and its progression towards complete failure. The database used for the statistical analysis in the present work identifies five states for each satellite subsystem: three degraded states, one fully operational state, and one failed state (complete failure). Because our dataset is right-censored, we calculate the nonparametric probability of transitioning between states for each satellite subsystem with the Kaplan-Meier estimator, and we derive confidence intervals for each probability of transitioning between states. We then conduct parametric Weibull fits of these probabilities using the Maximum Likelihood Estimation (MLE) approach. After validating the results, we compare the reliability versus multi-state failure analyses of three satellite subsystems: the thruster/fuel; the telemetry, tracking, and control (TTC); and the gyro/sensor/reaction wheel subsystems. The results are particularly revealing of the insights that can be gleaned from multi-state failure analysis and the deficiencies, or blind spots, of the traditional reliability analysis. 
In addition to the specific results provided here, which should prove particularly useful to the space industry, this work highlights the importance
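    The product-limit construction behind the Kaplan-Meier estimator used in this and several of the studies above can be sketched in a few lines of Python; the failure and censoring times below are illustrative, not taken from the satellite database.

```python
# Minimal Kaplan-Meier product-limit estimator for right-censored data.
# The times and event flags below are illustrative only.

def kaplan_meier(times, observed):
    """Return a list of (time, survival probability) steps.

    times    -- time of event or censoring for each item
    observed -- True if the event (failure) occurred, False if censored
    """
    data = sorted(zip(times, observed))
    n_at_risk = len(data)
    survival = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, ev in data if tt == t and ev)
        removed = sum(1 for tt, _ in data if tt == t)
        if deaths:
            # Product-limit update: S(t) *= (n - d) / n at each event time.
            survival *= (n_at_risk - deaths) / n_at_risk
            steps.append((t, survival))
        n_at_risk -= removed
        i += removed
    return steps

times = [2, 3, 3, 5, 7, 8, 8, 9]
observed = [True, True, False, True, False, True, True, False]
print(kaplan_meier(times, observed))
```

Censored items (observed = False) leave the risk set without triggering a survival step, which is exactly how the estimator accommodates right-censored satellite lifetimes.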

  7. Incident pregnancy and pregnancy outcomes among HIV-infected women in Uganda and Zimbabwe.

    Science.gov (United States)

    Lancaster, Kathryn E; Kwok, Cynthia; Rinaldi, Anne; Byamugisha, Josaphat; Magwali, Tulani; Nyamapfeni, Prisca; Salata, Robert A; Morrison, Charles S

    2015-12-01

    To describe pregnancy outcomes among HIV-infected women and examine factors associated with live birth among those receiving and not receiving combination antiretroviral therapy (cART). The present analysis included women with HIV from Uganda and Zimbabwe who participated in a prospective cohort study during 2001-2009. Incident pregnancies and pregnancy outcomes were recorded quarterly. The Kaplan-Meier method was used to estimate incident pregnancy probabilities; factors associated with live birth were evaluated by Poisson regression with generalized estimating equations. Among 306 HIV-infected women, there were 160 incident pregnancies (10.1 per 100 women-years). The pregnancy rate was higher among cART-naïve women than among those receiving cART (10.7 vs 5.5 per 100 women-years; P=0.047), and it was higher in Uganda than in Zimbabwe (14.4 vs 7.7 per 100 women-years; Ppregnancy (relative risk 0.8; 95% confidence interval 0.7-1.0). Women not receiving cART have higher pregnancy rates than do those receiving cART, but cART use might not affect the risk of adverse pregnancy outcomes. Timely prenatal care and monitoring of illnesses during pregnancy should be incorporated into treatment services for HIV-infected women. Copyright © 2015 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  8. Evaluation of success after second Ahmed glaucoma valve implantation.

    Science.gov (United States)

    Nilforushan, Naveed; Yadgari, Maryam; Jazayeri, Anis Alsadat; Karimi, Nasser

    2016-03-01

    To evaluate the outcome of the second Ahmed glaucoma valve (AGV) surgery in eyes with failed previous AGV surgery. Retrospective case series. Following chart review, 36 eyes of 34 patients with second AGV implantation were enrolled in this study. The primary outcome measure was surgical success defined in terms of intraocular pressure (IOP) control using two criteria: success was defined as IOP ≤21 mmHg (criterion 1) or IOP ≤16 mmHg (criterion 2), with at least 20% reduction in IOP, either with no medication (complete success) or with no more than two medications (qualified success). Kaplan-Meier survival analysis was used to determine the probability of surgical success. The average age of the patients was 32.7 years (range 4-65), and the mean duration of follow-up was 21.4 months (range 6-96). Preoperatively, the mean IOP was 26.94 mmHg (standard deviation [SD] 7.03), and the patients were using 2.8 glaucoma medications on average (SD 0.9). The mean IOP decreased significantly to 13.28 mmHg (SD 3.59) at the last postoperative visit (P = 0.00), while the patients needed even fewer glaucoma medications on average (1.4 ± 1.1, P = 0.00). Surgical success of second glaucoma drainage devices (Kaplan-Meier analysis) at 6, 12, 18, and 42 months was 94%, 85%, 80%, and 53%, respectively, according to criterion 1, and 94%, 85%, 75%, and 45%, respectively, according to criterion 2. Repeated AGV implantation seems to be a safe treatment modality with an acceptable success rate in cases with failed previous AGV surgery.

  9. The Influence of Phacoemulsification on Surgical Outcomes of Trabeculectomy with Mitomycin-C for Uveitic Glaucoma.

    Directory of Open Access Journals (Sweden)

    Asaho Nishizawa

    Full Text Available To evaluate the influence of phacoemulsification after trabeculectomy on the postoperative intraocular pressure (IOP) in eyes with uveitic glaucoma (UG). Kumamoto University Hospital, Kumamoto, Japan. A retrospective cohort study. The medical records of patients with UG who had trabeculectomy with mitomycin-C (MMC) were reviewed. Complete and qualified surgical failures were defined by an IOP of ≥21 mmHg (condition A), ≥18 mmHg (condition B), or ≥15 mmHg (condition C) without and with glaucoma eye drops, respectively. Kaplan-Meier survival analysis with the generalized Wilcoxon test and Cox proportional hazards model analysis were conducted. Post-trabeculectomy phacoemulsification was treated as a time-dependent variable. Phacoemulsification was performed in 24 (30%) of the 80 included eyes, and these were divided into two groups: group I (8 eyes, phacoemulsification within 1 year after trabeculectomy) and group II (16 eyes, after 1 year following trabeculectomy). Multivariable Cox proportional hazards model analysis showed that post-trabeculectomy phacoemulsification was a significant factor for both complete success and qualified success based upon condition C (P = 0.0432 and P = 0.0488, respectively), but not for the other conditions. Kaplan-Meier survival analyses indicated significant differences in success probabilities between group I and group II for complete success and qualified success based upon condition C (P = 0.020 and P = 0.013, respectively). There was also a significant difference for qualified success based upon condition B (P = 0.034), while there was no significant difference for the other conditions. Post-trabeculectomy phacoemulsification, especially within 1 year, can worsen the prognosis for IOP control after trabeculectomy with MMC in UG eyes.

  10. A Point System to Forecast Hepatocellular Carcinoma Risk Before and After Treatment Among Persons with Chronic Hepatitis C.

    Science.gov (United States)

    Xing, Jian; Spradling, Philip R; Moorman, Anne C; Holmberg, Scott D; Teshale, Eyasu H; Rupp, Loralee B; Gordon, Stuart C; Lu, Mei; Boscarino, Joseph A; Schmidt, Mark A; Trinacty, Connie M; Xu, Fujie

    2017-11-01

    Risk of hepatocellular carcinoma (HCC) may be difficult to determine in the clinical setting. We aimed to develop a scoring system to forecast HCC risk among patients with chronic hepatitis C. Using data from the Chronic Hepatitis Cohort Study collected during 2005-2014, we derived HCC risk scores for males and females using an extended Cox model with the aspartate aminotransferase-to-platelet ratio index (APRI) as a time-dependent variable and mean Kaplan-Meier survival functions from patient data at two study sites, and used data collected at two separate sites for external validation. For model calibration, we used the Greenwood-Nam-D'Agostino goodness-of-fit statistic to examine differences between predicted and observed risk. Of 12,469 patients (1628 with a history of sustained viral response [SVR]), 504 developed HCC; median follow-up was 6 years. Final predictors in the model included age, alcohol abuse, interferon-based treatment response, and APRI. Point values, ranging from -3 to 14 (males) and -3 to 12 (females), were established using hazard ratios of the predictors aligned with 1-, 3-, and 5-year Kaplan-Meier survival probabilities of HCC. Discriminatory capacity was high (c-index 0.82 for males and 0.84 for females), and external calibration demonstrated no differences between predicted and observed HCC risk for 1-, 3-, and 5-year forecasts among males (all p values >0.97) and for 3- and 5-year risk among females (all p values >0.87). This scoring system, based on age, alcohol abuse history, treatment response, and APRI, can be used to forecast up to a 5-year risk of HCC among hepatitis C patients before and after SVR.
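    The abstract does not detail how the hazard ratios were converted into integer point values. One common approach for building such scoring systems, sketched here with hypothetical hazard ratios rather than the study's estimates, is to assign each predictor points proportional to its log hazard ratio, scaled so the smallest effect is worth one point:

```python
import math

# Hypothetical hazard ratios for illustration only (NOT from the study).
hazard_ratios = {
    "age_per_decade": 1.6,
    "alcohol_abuse": 2.2,
    "no_SVR": 3.0,
    "APRI_high": 2.5,
}

# Sullivan-style scoring: points proportional to the log hazard ratio,
# scaled so the weakest predictor contributes one point.
base = min(math.log(hr) for hr in hazard_ratios.values())
points = {name: round(math.log(hr) / base) for name, hr in hazard_ratios.items()}
print(points)
```

The study's actual point values (which range down to -3) also encode protective effects, which this simple positive-only sketch does not reproduce.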

  11. Effects of Parental Union Dissolution on Child Mortality and Child Schooling in Burkina Faso

    Directory of Open Access Journals (Sweden)

    Jean-François Kobiané

    2013-10-01

    Full Text Available Background: Family structure and union dissolution have been among the most thoroughly studied determinants of children's wellbeing worldwide. To date, however, few of these studies have examined sub-Saharan Africa, especially countries in West Africa where marital breakdowns are not uncommon. Objective: We attempt to examine the effects of a mother's divorce and widowhood on children's risk of mortality under age 5 and on their probability of entering primary school. Methods: Survival data analysis methods, specifically Kaplan-Meier estimation and piecewise exponential models, are used for the analysis, based on data from the 2000 Migration and Urban Integration Survey of Burkina Faso. Results: Compared to those of intact families, children of divorced parents experience higher estimated mortality risks under age 5 and a lower probability of entering school, even after controlling for various other factors. This effect is large and significant during the first two years after the divorce. The death of the father is also found to greatly reduce a child's likelihood of entering school, but its effect on mortality is not significant. Conclusions: The results indicate that the family context plays an important role in determining two important aspects of children's welfare: their probabilities of dying before age 5 and of entering school. Comments: Children of divorced parents or a deceased father are living in precarious situations and their specific needs should be taken into account in policies in order to improve the wellbeing of all children. Attention must be directed to the first two years following the union dissolution.

  12. Unsteady numerical simulation of the flow in the U9 Kaplan turbine model

    International Nuclear Information System (INIS)

    Javadi, Ardalan; Nilsson, Håkan

    2014-01-01

    The Reynolds-averaged Navier-Stokes equations with the RNG k-ε turbulence model closure are utilized to simulate the unsteady turbulent flow throughout the whole flow passage of the U9 Kaplan turbine model. The U9 Kaplan turbine model comprises 20 stationary guide vanes and 6 rotating blades (696.3 RPM), working at best efficiency load (0.71 m 3 /s). The computations are conducted using a general finite volume method, using the OpenFOAM CFD code. A dynamic mesh is used together with a sliding GGI interface to include the effect of the rotating runner. The clearance is included in the guide vane. The hub and tip clearances are also included in the runner. An analysis is conducted of the unsteady behavior of the flow field, the pressure fluctuation in the draft tube, and the coherent structures of the flow. The tangential and axial velocity distributions at three sections in the draft tube are compared against LDV measurements. The numerical result is in reasonable agreement with the experimental data, and the important flow physics close to the hub in the draft tube is captured. The hub and tip vortices and an on-axis forced vortex are captured. The numerical results show that the frequency of the forced vortex is 1/5 of the runner rotation frequency.

  13. Unsteady numerical simulation of the flow in the U9 Kaplan turbine model

    Science.gov (United States)

    Javadi, Ardalan; Nilsson, Håkan

    2014-03-01

    The Reynolds-averaged Navier-Stokes equations with the RNG k-ε turbulence model closure are utilized to simulate the unsteady turbulent flow throughout the whole flow passage of the U9 Kaplan turbine model. The U9 Kaplan turbine model comprises 20 stationary guide vanes and 6 rotating blades (696.3 RPM), working at best efficiency load (0.71 m3/s). The computations are conducted using a general finite volume method, using the OpenFOAM CFD code. A dynamic mesh is used together with a sliding GGI interface to include the effect of the rotating runner. The clearance is included in the guide vane. The hub and tip clearances are also included in the runner. An analysis is conducted of the unsteady behavior of the flow field, the pressure fluctuation in the draft tube, and the coherent structures of the flow. The tangential and axial velocity distributions at three sections in the draft tube are compared against LDV measurements. The numerical result is in reasonable agreement with the experimental data, and the important flow physics close to the hub in the draft tube is captured. The hub and tip vortices and an on-axis forced vortex are captured. The numerical results show that the frequency of the forced vortex is 1/5 of the runner rotation frequency.

  14. [Estimation of the risk of upper digestive tract bleeding in patients with portal cavernomatosis].

    Science.gov (United States)

    Couselo, M; Ibáñez, V; Mangas, L; Gómez-Chacón, J; Vila Carbó, J J

    2011-01-01

    The aim of this study was to determine the risk of upper gastrointestinal bleeding (UGB) after the diagnosis of portal cavernoma in children, and to investigate several potential risk factors. We retrospectively analyzed 13 cases of portal cavernoma and estimated the risk of UGB with Kaplan-Meier survival analysis. We calculated the incidence rate of the sample and the number of haemorrhages per year for each patient individually. From the moment of diagnosis, various parameters were recorded: age, platelets, leukocytes, hemoglobin, hematocrit, prothrombin time, and number of bleedings. The relation between these parameters and the risk of bleeding was assessed with Cox analysis. The patients were followed for a median period of 7.1 years. Ten patients (77%) presented at least 1 episode of UGB after the diagnosis. The median survival time until the first haemorrhage was 314 days. After the diagnosis, the incidence rate of the sample was 0.43 episodes of upper gastrointestinal bleeding per person-year. The number of individual bleedings per person ranged from 0 to 2.2 episodes per year. There are few published data on the risk of bleeding in children with portal cavernoma. In our sample, we found an incidence rate of 0.43 and a median survival time of 314 days until the first episode of bleeding after the diagnosis, but we were not able to find a statistically significant association between the studied variables and the risk of bleeding.

  15. Estimation of probability for the presence of claw and digital skin diseases by combining cow- and herd-level information using a Bayesian network

    DEFF Research Database (Denmark)

    Ettema, Jehan Frans; Østergaard, Søren; Kristensen, Anders Ringgaard

    2009-01-01

    , the data has been used to estimate the random effect of herd on disease prevalence and to find conditional probabilities of cows being lame, given the presence of the three diseases. By considering the 50 herds representative for the Danish population, the estimates for risk factors, conditional...

  16. Comparison of transmission dynamics between Streptococcus uberis and Streptococcus agalactiae intramammary infections.

    Science.gov (United States)

    Leelahapongsathon, Kansuda; Schukken, Ynte Hein; Pinyopummintr, Tanu; Suriyasathaporn, Witaya

    2016-02-01

    The objectives of this study were to determine the transmission parameters (β), durations of infection, and basic reproductive numbers (R0) of both Streptococcus agalactiae and Streptococcus uberis as pathogens causing mastitis outbreaks in dairy herds. A 10-mo longitudinal study was performed using 2 smallholder dairy herds with mastitis outbreaks caused by Strep. agalactiae and Strep. uberis, respectively. Both herds had poor mastitis control management and did not change their milking management during the entire study period. Quarter milk samples were collected at monthly intervals from all lactating animals in each herd for bacteriological identification. The durations of infection for Strep. uberis intramammary infection (IMI) and Strep. agalactiae IMI were examined using Kaplan-Meier survival curves, and the Kaplan-Meier survival functions for Strep. uberis IMI and Strep. agalactiae IMI were compared using the log-rank test. The spread of Strep. uberis and Strep. agalactiae through the population was determined by the transmission parameter, β, the probability per unit of time that one infectious quarter will infect another quarter, assuming that all other quarters are susceptible. For the Strep. uberis outbreak herd (31 cows), 56 new infections and 28 quarters with spontaneous cure were observed. For the Strep. agalactiae outbreak herd (19 cows), 26 new infections and 9 quarters with spontaneous cure were observed. The duration of infection for Strep. agalactiae (mean=270.84 d) was significantly longer than the duration of infection for Strep. uberis (mean=187.88 d). The transmission parameters (β) estimated (including 95% confidence interval) for Strep. uberis IMI and Strep. agalactiae IMI were 0.0155 (0.0035-0.0693) and 0.0068 (0.0008-0.0606), respectively. The R0 (including 95% confidence interval) during the study were 2.91 (0.63-13.47) and 1.86 (0.21-16.61) for Strep. uberis IMI and Strep. agalactiae IMI, respectively. In conclusion, the transmission
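    The reported basic reproductive numbers are consistent with the standard relation R0 = β × D, where D is the mean duration of infection; this can be checked directly against the figures quoted in the abstract:

```python
# Sanity-check R0 = beta * mean duration of infection (in days)
# using the values reported in the abstract.
def r0(beta, duration_days):
    return beta * duration_days

print(round(r0(0.0155, 187.88), 2))  # Strep. uberis, reported R0 = 2.91
print(round(r0(0.0068, 270.84), 2))  # Strep. agalactiae, reported R0 = 1.86
```

The agalactiae product comes out near 1.84 rather than the reported 1.86, a small discrepancy attributable to rounding of the published β and duration estimates.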

  17. A nationwide study of serous “borderline” ovarian tumors in Denmark 1978–2002

    DEFF Research Database (Denmark)

    Hannibal, Charlotte Gerd; Vang, Russell; Junge, Jette

    2014-01-01

    OBJECTIVE: To describe the study population and estimate overall survival of women with a serous "borderline" ovarian tumor (SBT) in Denmark over 25 years relative to the general population. METHODS: The Danish Pathology Data Bank and the Danish Cancer Registry were used to identify 1487 women...... as noninvasive or invasive. Medical records were collected from hospital departments and reviewed. Data were analyzed using Kaplan-Meier and relative survival was estimated with follow-up through September 2, 2013. RESULTS: A cohort of 1042 women with a confirmed SBT diagnosis was identified. Women with stage I...... had an overall survival similar to the overall survival expected from the general population (p=0.3), whereas women with advanced stage disease had a poorer one (pwomen with noninvasive (pwomen with advanced stage...

  18. STAT3 inhibitor enhances chemotherapy drug efficacy by ...

    African Journals Online (AJOL)

    Immunohistochemistry and Kaplan-Meier method of survival analysis were used to determine chemoresistance trends in patients. STAT3 inhibitor treatment, RNAi or ectopic overexpression of STAT3 or MUC1 in NSCLC cells were used to determine their inter-molecular relation and for modulating stemness-related genes.

  19. The influence of obesity on response to tumour necrosis factor-α inhibitors in psoriatic arthritis

    DEFF Research Database (Denmark)

    Højgaard, Pil; Glintborg, Bente; Kristensen, Lars Erik

    2016-01-01

    OBJECTIVES: To investigate the impact of obesity on response to the first TNF-α inhibitor (TNFI) treatment course in patients with PsA followed in routine care. METHODS: We performed an observational cohort study based on the Danish and Icelandic biologics registries. Kaplan-Meier plots, Cox and ...

  20. Naive Probability: Model-based Estimates of Unique Events

    Science.gov (United States)

    2014-05-04

    of inference. Argument and Computation, 1–17, iFirst. Khemlani, S., & Johnson-Laird, P.N. (2012b). Theories of the syllogism: A meta -analysis...is the probability that… 1 space tourism will achieve widespread popularity in the next 50 years? advances in material science will lead to the... governments dedicate more resources to contacting extra-terrestrials? 8 the United States adopts an open border policy of universal acceptance? English is

  1. A classification scheme of erroneous behaviors for human error probability estimations based on simulator data

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2017-01-01

    Because it has been indicated that empirical data supporting the estimates used in human reliability analysis (HRA) is insufficient, several databases have been constructed recently. To generate quantitative estimates from human reliability data, it is important to appropriately sort the erroneous behaviors found in the reliability data. Therefore, this paper proposes a scheme to classify the erroneous behaviors identified by the HuREX (Human Reliability data Extraction) framework through a review of the relevant literature. A case study of the human error probability (HEP) calculations is conducted to verify that the proposed scheme can be successfully implemented for the categorization of the erroneous behaviors and to assess whether the scheme is useful for the HEP quantification purposes. Although continuously accumulating and analyzing simulator data is desirable to secure more reliable HEPs, the resulting HEPs were insightful in several important ways with regard to human reliability in off-normal conditions. From the findings of the literature review and the case study, the potential and limitations of the proposed method are discussed. - Highlights: • A taxonomy of erroneous behaviors is proposed to estimate HEPs from a database. • The cognitive models, procedures, HRA methods, and HRA databases were reviewed. • HEPs for several types of erroneous behaviors are calculated as a case study.

  2. A developmental study of risky decisions on the cake gambling task: age and gender analyses of probability estimation and reward evaluation.

    Science.gov (United States)

    Van Leijenhorst, Linda; Westenberg, P Michiel; Crone, Eveline A

    2008-01-01

    Decision making, or the process of choosing between competing courses of action, is highly sensitive to age-related change, showing development throughout adolescence. In this study, we tested whether the development of decision making under risk is related to changes in risk-estimation abilities. Participants (N = 93) between 8 and 30 years of age performed a child-friendly gambling task, the Cake Gambling task, which was inspired by the Cambridge Gambling Task (Rogers et al., 1999), previously shown to be sensitive to orbitofrontal cortex (OFC) damage. The task allowed comparisons of the contributions to risk perception of (1) the ability to estimate probabilities and (2) the ability to evaluate rewards. Adult performance patterns were highly similar to those found in previous reports, showing increased risk taking with increases in the probability of winning and the magnitude of potential reward. Behavioral patterns in children and adolescents did not differ from adult patterns, showing a similar ability for probability estimation and reward evaluation. These data suggest that participants 8 years and older perform like adults in a gambling task previously shown to depend on the OFC, in which all the information needed to make an advantageous decision is given on each trial and no information needs to be inferred from previous behavior. Interestingly, at all ages, females were more risk-averse than males. These results suggest that the increase in real-life risky behavior that is seen in adolescence is not a consequence of changes in risk perception abilities. The findings are discussed in relation to theories about the protracted development of the prefrontal cortex.

  3. A Method to Estimate the Probability that Any Individual Cloud-to-Ground Lightning Stroke was Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa; Roeder, WIlliam P.; Merceret, Francis J.

    2011-01-01

    A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force station. Future applications could include forensic meteorology.
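    The integral described, a bivariate Gaussian density over a disk that need not be centered on the error ellipse, has no general closed form, but it is straightforward to approximate by Monte Carlo sampling. The ellipse parameters, point, and radius below are illustrative values, not actual lightning location errors:

```python
import math
import random

def prob_within_radius(mu, sigma, rho, point, radius, n=200_000, seed=1):
    """Monte Carlo estimate of the probability that a stroke lies within
    `radius` of `point`, given a bivariate Gaussian location error with
    mean mu=(mx, my), standard deviations sigma=(sx, sy), correlation rho."""
    random.seed(seed)
    mx, my = mu
    sx, sy = sigma
    hits = 0
    for _ in range(n):
        z1 = random.gauss(0.0, 1.0)
        z2 = random.gauss(0.0, 1.0)
        # Correlated bivariate normal via a Cholesky-style construction.
        x = mx + sx * z1
        y = my + sy * (rho * z1 + math.sqrt(1.0 - rho**2) * z2)
        if (x - point[0]) ** 2 + (y - point[1]) ** 2 <= radius ** 2:
            hits += 1
    return hits / n

# Illustrative numbers: most likely stroke location 0.5 km from the point
# of interest, 1-sigma semi-axes of 0.4 and 0.6 km, mild correlation.
p = prob_within_radius(mu=(0.0, 0.0), sigma=(0.4, 0.6), rho=0.3,
                       point=(0.5, 0.0), radius=1.0)
print(round(p, 3))
```

Because the disk is an arbitrary region, the same sampler works whether or not the point of interest lies inside the error ellipse, which is the key feature of the technique described above.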

  4. average probability of failure on demand estimation for burner

    African Journals Online (AJOL)

    HOD

    Pij – Probability from state i to j. 1. INTRODUCTION. In the process .... the numerical value of the PFD as result of components, sub-system ... ignored in probabilistic risk assessment it may lead to ...... Markov chains for a holistic modeling of SIS.

  5. FuzzyStatProb: An R Package for the Estimation of Fuzzy Stationary Probabilities from a Sequence of Observations of an Unknown Markov Chain

    Directory of Open Access Journals (Sweden)

    Pablo J. Villacorta

    2016-07-01

    Full Text Available Markov chains are well-established probabilistic models of a wide variety of real systems that evolve along time. Countless examples of applications of Markov chains that successfully capture the probabilistic nature of real problems include areas as diverse as biology, medicine, social science, and engineering. One interesting feature which characterizes certain kinds of Markov chains is their stationary distribution, which stands for the global fraction of time the system spends in each state. The computation of the stationary distribution requires precise knowledge of the transition probabilities. When the only information available is a sequence of observations drawn from the system, such probabilities have to be estimated. Here we review an existing method to estimate fuzzy transition probabilities from observations and, with them, obtain the fuzzy stationary distribution of the resulting fuzzy Markov chain. The method also works when the user directly provides fuzzy transition probabilities. We provide an implementation in the R environment that is the first available to the community and serves as a proof of concept. We demonstrate the usefulness of our proposal with computational experiments on a toy problem, namely a time-homogeneous Markov chain that guides the randomized movement of an autonomous robot that patrols a small area.
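    Stripped of the fuzziness, the crisp pipeline such a package builds on is short: count the observed transitions, normalize each row to obtain the transition matrix, then iterate the chain until the state distribution stops changing. A minimal Python sketch follows (FuzzyStatProb itself is an R package, and the observation sequence here is made up):

```python
from collections import Counter

def transition_matrix(seq, states):
    """Estimate row-stochastic transition probabilities from a sequence."""
    counts = Counter(zip(seq, seq[1:]))
    P = []
    for s in states:
        row_total = sum(counts[(s, t)] for t in states)
        P.append([counts[(s, t)] / row_total if row_total else 0.0
                  for t in states])
    return P

def stationary(P, iters=500):
    """Stationary distribution by repeated application of the chain."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

seq = list("AABABBBABAABBBAB")  # illustrative observed state sequence
states = ["A", "B"]
P = transition_matrix(seq, states)
pi = stationary(P)
print(P, [round(x, 3) for x in pi])
```

The fuzzy version replaces each point estimate in P with a fuzzy number reflecting sampling uncertainty, and propagates that uncertainty into the stationary distribution.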

  6. Local Recurrence of Hepatocellular Carcinoma after Segmental Transarterial Chemoembolization: Risk Estimates Based on Multiple Prognostic Factors

    International Nuclear Information System (INIS)

    Park, Seung Hyun; Cho, Yun Ku; Ahn, Yong Sik; Park, Yoon Ok; Kim, Jae Kyun; Chung, Jin Wook

    2007-01-01

    To determine the prognostic factors for local recurrence of nodular hepatocellular carcinoma after segmental transarterial chemoembolization. Seventy-four nodular hepatocellular carcinoma tumors ≤5 cm were retrospectively analyzed for local recurrence after segmental transarterial chemoembolization using follow-up CT images (median follow-up, 17 months; range, 4-77 months). The tumors were divided into four groups (IA, IB, IIA, and IIB) according to whether the one-month follow-up CT imaging after segmental transarterial chemoembolization showed homogeneous (Group I) or inhomogeneous (Group II) iodized oil accumulation, and whether the tumors were located within a liver segment (Group A) or in a segmental border zone (Group B). Comparison of tumor characteristics between Group IA and the other three groups was performed using the chi-square test. Local recurrence rates were compared among the groups using Kaplan-Meier estimation and the log-rank test. Local tumor recurrence occurred in 19 hepatocellular carcinoma tumors (25.7%). There were 28, 18, 17, and 11 tumors in Groups IA, IB, IIA, and IIB, respectively. One of 28 (3.6%) tumors in Group IA, and 18 of 46 (39.1%) tumors in the other three groups, showed local recurrence. Comparisons between Group IA and the other three groups showed that the tumor characteristics were similar. One-, two-, and three-year estimated local recurrence rates in Group IA were 0%, 11.1%, and 11.1%, respectively. The difference between Group IA and the other three groups was statistically significant (p = 0.000). An acceptably low rate of local recurrence was observed for small or intermediate nodular tumors located within the liver segment with homogeneous iodized oil accumulation.

  7. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013 : [summary].

    Science.gov (United States)

    2015-01-01

    Traditionally, the Iowa DOT has used the Iowa Runoff Chart and single-variable regional regression equations (RREs) from a USGS report (published in 1987) as the primary methods to estimate annual exceedance-probability discharge (AEPD) for small...

  8. Estimating landholders' probability of participating in a stewardship program, and the implications for spatial conservation priorities.

    Directory of Open Access Journals (Sweden)

    Vanessa M Adams

    Full Text Available The need to integrate social and economic factors into conservation planning has become a focus of academic discussions and has important practical implications for the implementation of conservation areas, both private and public. We conducted a survey in the Daly Catchment, Northern Territory, to inform the design and implementation of a stewardship payment program. We used a choice model to estimate the likely level of participation in two legal arrangements--conservation covenants and management agreements--based on payment level and proportion of properties required to be managed. We then spatially predicted landholders' probability of participating at the resolution of individual properties and incorporated these predictions into conservation planning software to examine the potential for the stewardship program to meet conservation objectives. We found that the properties that were least costly, per unit area, to manage were also the least likely to participate. This highlights a tension between planning for a cost-effective program and planning for a program that targets properties with the highest probability of participation.

  9. Influence of the level of fit of a density probability function to wind-speed data on the WECS mean power output estimation

    International Nuclear Information System (INIS)

    Carta, Jose A.; Ramirez, Penelope; Velazquez, Sergio

    2008-01-01

    Static methods which are based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method which we use in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by resolving the integral, usually using numerical evaluation techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time-series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Ra²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) the Ra² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as Ra² increases.
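
    The static method this record describes, integrating the product of a wind-speed probability density function and the turbine power curve, can be sketched numerically. All parameters below (Weibull shape k and scale c, cut-in/rated/cut-out speeds, 330 kW rated power, a linear ramp-up) are illustrative assumptions, not the paper's fitted values.

```python
import math

def weibull_pdf(v, k=2.0, c=8.0):
    """Weibull wind-speed density; k, c are assumed example parameters."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def power_curve(v, cut_in=3.0, rated_v=13.0, cut_out=25.0, rated_p=330.0):
    """Simplified power curve in kW: linear ramp between cut-in and rated speed."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_v:
        return rated_p
    return rated_p * (v - cut_in) / (rated_v - cut_in)

def mean_power(n=10000, v_max=30.0):
    """Mean power P = integral of f(v) * P(v) dv, by the trapezoidal rule."""
    h = v_max / n
    total = 0.0
    for i in range(n + 1):
        v = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * weibull_pdf(v) * power_curve(v)
    return total * h

print(f"estimated mean power output: {mean_power():.1f} kW")
```

    A quasi-dynamic (chronological) estimate would instead average `power_curve(v)` over an hourly wind-speed time series; the paper's relative error ε compares the two.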

  10. SAMSN1 is highly expressed and associated with a poor survival in glioblastoma multiforme.

    Directory of Open Access Journals (Sweden)

    Yong Yan

    Full Text Available OBJECTIVES: To study the expression pattern and prognostic significance of SAMSN1 in glioma. METHODS: Affymetrix and Arraystar gene microarray data in the setting of glioma were analyzed to preliminarily study the expression pattern of SAMSN1 in glioma tissues, and hierarchical clustering of the gene microarray data was performed to filter out genes with prognostic value in malignant glioma. Survival analysis by Kaplan-Meier estimates stratified by SAMSN1 expression was then made based on the data of more than 500 GBM cases provided by The Cancer Genome Atlas (TCGA) project. Finally, we detected the expression of SAMSN1 in large numbers of glioma and normal brain tissue samples using Tissue Microarray (TMA). Survival analysis by Kaplan-Meier estimates in each grade of glioma was stratified by SAMSN1 expression. Multivariate survival analysis was made by Cox proportional hazards regression models in corresponding groups of glioma. RESULTS: With the expression data of SAMSN1 and 68 other genes, high-grade glioma could be classified into two groups with clearly different prognoses. Gene and large-sample tissue microarrays showed high expression of SAMSN1 in glioma, particularly in GBM. Survival analysis based on the TCGA GBM data matrix and the TMA multi-grade glioma dataset found that SAMSN1 expression was closely related to the prognosis of GBM, for both PFS and OS (P<0.05). Multivariate survival analysis with Cox proportional hazards regression models confirmed that high expression of SAMSN1 was a strong risk factor for PFS and OS of GBM patients. CONCLUSION: SAMSN1 is over-expressed in glioma as compared with normal brain, especially in GBM. High expression of SAMSN1 is a significant risk factor for the progression-free and overall survival of GBM.

  11. Long-Term Outcomes With Intraoperative Radiotherapy as a Component of Treatment for Locally Advanced or Recurrent Uterine Sarcoma

    Energy Technology Data Exchange (ETDEWEB)

    Barney, Brandon M., E-mail: barney.brandon@mayo.edu [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Petersen, Ivy A. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Dowdy, Sean C.; Bakkum-Gamez, Jamie N. [Division of Gynecologic Surgery, Mayo Clinic, Rochester, Minnesota (United States); Haddock, Michael G. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States)

    2012-05-01

    Purpose: To report our institutional experience with intraoperative radiotherapy (IORT) as a component of treatment for women with locally advanced or recurrent uterine sarcoma. Methods and Materials: From 1990 to 2010, 16 women with primary (n = 3) or locoregionally recurrent (n = 13) uterine sarcoma received IORT as a component of combined modality treatment. Tumor histology studies found leiomyosarcoma (n = 9), endometrial stromal sarcoma (n = 4), and carcinosarcoma (n = 3). Surgery consisted of gross total resection in 2 patients, subtotal resection in 6 patients, and resection with close surgical margins in 8 patients. The median IORT dose was 12.5 Gy (range, 10-20 Gy). All patients received perioperative external beam radiotherapy (EBRT; median dose, 50.4 Gy; range, 20-62.5 Gy), and 6 patients also received perioperative systemic therapy. Results: Seven of the 16 patients are alive at a median follow-up of 44 months (range, 11-203 months). The 3-year Kaplan-Meier estimate of local relapse (within the EBRT field) was 7%, and central control (within the IORT field) was 100%. No local failures occurred in any of the 6 patients who underwent subtotal resection. The 3-year freedom from distant relapse was 48%, with failures occurring most frequently in the lungs or mediastinum. Median survival was 18 months, and 3-year Kaplan-Meier estimates of cause-specific and overall survival were 58% and 53%, respectively. Three patients (19%) experienced late Grade 3 toxicity. Conclusions: A combined modality approach with perioperative EBRT, surgery, and IORT for locally advanced or recurrent uterine sarcoma resulted in excellent local disease control with acceptable toxicity, even in patients with positive resection margins. With this approach, some patients were able to experience long-term freedom from recurrence.

  12. Long-Term Outcomes With Intraoperative Radiotherapy as a Component of Treatment for Locally Advanced or Recurrent Uterine Sarcoma

    International Nuclear Information System (INIS)

    Barney, Brandon M.; Petersen, Ivy A.; Dowdy, Sean C.; Bakkum-Gamez, Jamie N.; Haddock, Michael G.

    2012-01-01

    Purpose: To report our institutional experience with intraoperative radiotherapy (IORT) as a component of treatment for women with locally advanced or recurrent uterine sarcoma. Methods and Materials: From 1990 to 2010, 16 women with primary (n = 3) or locoregionally recurrent (n = 13) uterine sarcoma received IORT as a component of combined modality treatment. Tumor histology studies found leiomyosarcoma (n = 9), endometrial stromal sarcoma (n = 4), and carcinosarcoma (n = 3). Surgery consisted of gross total resection in 2 patients, subtotal resection in 6 patients, and resection with close surgical margins in 8 patients. The median IORT dose was 12.5 Gy (range, 10–20 Gy). All patients received perioperative external beam radiotherapy (EBRT; median dose, 50.4 Gy; range, 20–62.5 Gy), and 6 patients also received perioperative systemic therapy. Results: Seven of the 16 patients are alive at a median follow-up of 44 months (range, 11–203 months). The 3-year Kaplan-Meier estimate of local relapse (within the EBRT field) was 7%, and central control (within the IORT field) was 100%. No local failures occurred in any of the 6 patients who underwent subtotal resection. The 3-year freedom from distant relapse was 48%, with failures occurring most frequently in the lungs or mediastinum. Median survival was 18 months, and 3-year Kaplan-Meier estimates of cause-specific and overall survival were 58% and 53%, respectively. Three patients (19%) experienced late Grade 3 toxicity. Conclusions: A combined modality approach with perioperative EBRT, surgery, and IORT for locally advanced or recurrent uterine sarcoma resulted in excellent local disease control with acceptable toxicity, even in patients with positive resection margins. With this approach, some patients were able to experience long-term freedom from recurrence.

  13. Miners' return to work following injuries in coal mines.

    Science.gov (United States)

    Bhattacherjee, Ashis; Kunar, Bijay Mihir

    2016-12-22

    The occupational injuries in mines are common and result in severe socio-economic consequences. Earlier studies have revealed the role of multiple factors such as demographic factors, behavioral factors, health-related factors, working environment, and working conditions in mine injuries. However, there is a dearth of information about the role of some of these factors in delayed return to work (RTW) following a miner's injury. These factors likely include the personal characteristics of injured persons and their families, the injured person's social and economic status, and job characteristics. This study was conducted to assess the role of some of these factors in return to work following coal miners' injuries. A study was conducted for 109 injured workers from an underground coal mine in the years 2000-2009. A questionnaire, completed through personal interviews, included, among others, age, height, weight, seniority, alcohol consumption, sleeping duration, presence of diseases, job stress, job satisfaction, and injury type. The data were analyzed using Kaplan-Meier estimates and the Cox proportional hazards model. The Kaplan-Meier estimates revealed that a lower number of dependents, longer sleep duration, no job stress, no disease, no alcohol addiction, and higher monthly income have a great impact on early return to work after injury. The Cox regression analysis revealed that the significant risk factors which influenced miners' return to work included presence of disease, job satisfaction and injury type. The mine management should pay attention to significant risk factors for injuries in order to develop effective preventive measures. Med Pr 2016;67(6):729-742. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
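
    The product-limit computation behind the Kaplan-Meier estimates used in studies like this one can be sketched in a few lines. The data below are toy values (days until return to work, with event = 1 for returned and event = 0 for censored), not the study's data.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.

    Returns [(t, S(t))] at each distinct event time t, where S(t) is the
    estimated probability of remaining event-free past t. Subjects censored
    at an event time are counted in the risk set for that time (the usual
    convention).
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = c = 0  # events and censorings at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            surv *= 1.0 - d / n_at_risk  # product-limit step
            curve.append((t, surv))
        n_at_risk -= d + c
    return curve

# toy data: 5 workers; times in days, 0 = censored (never returned in study)
curve = kaplan_meier([3, 5, 5, 8, 10], [1, 1, 0, 1, 0])
for t, s in curve:
    print(f"day {t}: S = {s:.2f}")
```

    A log-rank test, as used in the record above, then compares such curves between groups (e.g. job stress vs. no job stress).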

  14. Statins Reduces the Risk of Dementia in Patients with Late-Onset Depression: A Retrospective Cohort Study.

    Science.gov (United States)

    Yang, Ya-Hsu; Teng, Hao-Wei; Lai, Yen-Ting; Li, Szu-Yuan; Lin, Chih-Ching; Yang, Albert C; Chan, Hsiang-Lin; Hsieh, Yi-Hsuan; Lin, Chiao-Fan; Hsu, Fu-Ying; Liu, Chih-Kuang; Liu, Wen-Sheng

    2015-01-01

    Patients with late-onset depression (LOD) have been reported to run a higher risk of subsequent dementia. The present study was conducted to assess whether statins can reduce the risk of dementia in these patients. We used the data from the National Health Insurance of Taiwan during 1996-2009. Standardized Incidence Ratios (SIRs) were calculated for LOD and subsequent dementia. The criteria for LOD diagnosis included age ≥65 years, diagnosis of depression after 65 years of age, at least three service claims, and treatment with antidepressants. The time-dependent Cox proportional hazards model was applied for multivariate analyses. Propensity scores with the one-to-one nearest-neighbor matching model were used to select matching patients for validation studies. Kaplan-Meier estimates were used to assess survival after diagnosis of LOD in the group of patients who developed dementia. In total, 45,973 patients aged ≥65 years were enrolled. The prevalence of LOD was 12.9% (5,952/45,973). Patients with LOD had a higher incidence of subsequent dementia compared with those without LOD (Odds Ratio: 2.785; 95% CI 2.619-2.958). Among patients with LOD, lipid-lowering agent (LLA) users (for at least 3 months) had a lower incidence of subsequent dementia than non-users (Hazard Ratio = 0.781, 95% CI 0.685-0.891). Nevertheless, only statin users showed a reduced risk of dementia (Hazard Ratio = 0.674, 95% CI 0.547-0.832) while other LLAs did not, which was further validated by Kaplan-Meier estimates after we used propensity scores with the one-to-one nearest-neighbor matching model to control for confounding factors. Statins may reduce the risk of subsequent dementia in patients with LOD.

  15. Intravitreal chemotherapy in retinoblastoma: expanded use beyond intravitreal seeds.

    Science.gov (United States)

    Abramson, David H; Ji, Xunda; Francis, Jasmine H; Catalanotti, Federica; Brodie, Scott E; Habib, Larissa

    2018-06-06

    Ophthalmic artery chemosurgery (OAC) has changed the face of retinoblastoma treatment and led to a higher rate of globe salvage. The introduction of intravitreal chemotherapy (IVitC) has further enhanced globe salvage with increased success in treatment of intravitreal seeds. Our group has seen success at treating non-vitreous disease that is refractory to OAC using IVitC. This study was undertaken to quantify and report on this success. A retrospective review was used to identify patients treated with IVitC for indications other than vitreous seeds from two centres. The indication, prior and concurrent treatment, response time and duration of treatment were documented. Kaplan-Meier estimates were used to evaluate ocular and recurrence-free survival. Ocular toxicity was evaluated using the 30 Hz flicker electroretinogram (ERG). Continuous and categorical variables were compared with Student's t-test and χ² test, respectively. Fifty-six eyes from 52 retinoblastoma patients were identified. There were no disease-related or treatment-related deaths. One patient developed a second primary malignancy (pinealoblastoma) and subsequent leptomeningeal spread. Ninety-eight per cent of the eyes showed clinical regression. Recurrence was seen in 14.3%. Of the recurrences, five occurred in retinal tumours and three in subretinal seeds. The Kaplan-Meier estimated risk of recurrence in all patients treated was 83.5% (95% CI 7.9 to 14.1) at 10 months. The mean change in ERG over treatment course was -17.7 μV. Intravitreal chemotherapy is successful for the treatment of subretinal seeds and recurrent retinal tumours and could be considered as adjunctive therapy in globe-sparing treatment of retinoblastoma. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Estimation of peak discharge quantiles for selected annual exceedance probabilities in northeastern Illinois

    Science.gov (United States)

    Over, Thomas M.; Saito, Riki J.; Veilleux, Andrea G.; Sharpe, Jennifer B.; Soong, David T.; Ishii, Audrey L.

    2016-06-28

    This report provides two sets of equations for estimating peak discharge quantiles at annual exceedance probabilities (AEPs) of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, and 0.002 (recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively) for watersheds in Illinois based on annual maximum peak discharge data from 117 watersheds in and near northeastern Illinois. One set of equations was developed through a temporal analysis with a two-step least squares-quantile regression technique that measures the average effect of changes in the urbanization of the watersheds used in the study. The resulting equations can be used to adjust rural peak discharge quantiles for the effect of urbanization, and in this study the equations also were used to adjust the annual maximum peak discharges from the study watersheds to 2010 urbanization conditions. The other set of equations was developed by a spatial analysis. This analysis used generalized least-squares regression to fit the peak discharge quantiles computed from the urbanization-adjusted annual maximum peak discharges from the study watersheds to drainage-basin characteristics. The peak discharge quantiles were computed by using the Expected Moments Algorithm following the removal of potentially influential low floods defined by a multiple Grubbs-Beck test. To improve the quantile estimates, regional skew coefficients were obtained from a newly developed regional skew model in which the skew increases with the urbanized land use fraction.
The drainage-basin characteristics used as explanatory variables in the spatial analysis include drainage area, the fraction of developed land, the fraction of land with poorly drained soils or likely water, and the basin slope estimated as the ratio of the basin relief to basin perimeter. This report also provides the following: (1) examples to illustrate the use of the spatial and urbanization-adjustment equations for estimating peak discharge quantiles at ungaged

  17. Building vulnerability to hydro-geomorphic hazards: Estimating damage probability from qualitative vulnerability assessment using logistic regression

    Science.gov (United States)

    Ettinger, Susanne; Mounaud, Loïc; Magill, Christina; Yao-Lafourcade, Anne-Françoise; Thouret, Jean-Claude; Manville, Vern; Negulescu, Caterina; Zuccaro, Giulio; De Gregorio, Daniela; Nardone, Stefano; Uchuchoque, Juan Alexis Luque; Arguedas, Anita; Macedo, Luisa; Manrique Llerena, Nélida

    2016-10-01

    The focus of this study is an analysis of building vulnerability through investigating impacts from the 8 February 2013 flash flood event along the Avenida Venezuela channel in the city of Arequipa, Peru. On this day, 124.5 mm of rain fell within 3 h (monthly mean: 29.3 mm) triggering a flash flood that inundated at least 0.4 km² of urban settlements along the channel, affecting more than 280 buildings, 23 of a total of 53 bridges (pedestrian, vehicle and railway), and leading to the partial collapse of sections of the main road, paralyzing central parts of the city for more than one week. This study assesses the aspects of building design and site-specific environmental characteristics that render a building vulnerable by considering the example of a flash flood event in February 2013. A statistical methodology is developed that enables estimation of damage probability for buildings. The applied method uses observed inundation height as a hazard proxy in areas where more detailed hydrodynamic modeling data is not available. Building design and site-specific environmental conditions determine the physical vulnerability. The mathematical approach considers both physical vulnerability and hazard related parameters and helps to reduce uncertainty in the determination of descriptive parameters, parameter interdependency and respective contributions to damage. This study aims to (1) enable the estimation of damage probability for a certain hazard intensity, and (2) obtain data to visualize variations in damage susceptibility for buildings in flood prone areas. Data collection is based on a post-flood event field survey and the analysis of high (sub-metric) spatial resolution images (Pléiades 2012, 2013). An inventory of 30 city blocks was collated in a GIS database in order to estimate the physical vulnerability of buildings. As many as 1103 buildings were surveyed along the affected drainage and 898 buildings were included in the statistical analysis. Univariate and
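
    The core idea in this record, a logistic regression that maps observed inundation height to a damage probability, can be sketched as follows. The coefficients b0 and b1 below are hypothetical placeholders for illustration, not fitted values from the Arequipa survey.

```python
import math

def damage_probability(inundation_height_m, b0=-3.0, b1=2.0):
    """Logistic model: P(damage) = 1 / (1 + exp(-(b0 + b1 * h))).

    b0, b1 are illustrative placeholder coefficients; a real analysis
    would fit them (plus further vulnerability covariates) to survey data.
    """
    z = b0 + b1 * inundation_height_m
    return 1.0 / (1.0 + math.exp(-z))

for h in (0.5, 1.0, 1.5, 2.0):
    print(f"inundation height {h} m -> P(damage) = {damage_probability(h):.2f}")
```

    With these placeholder coefficients the probability crosses 0.5 at h = 1.5 m; the fitted model in the study would additionally condition on building design and site characteristics.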

  18. Age at sexual initiation and factors associated with it among youths ...

    African Journals Online (AJOL)

    Bernt Lindtjorn

    Objective: The objective of the study was to determine the median age at first sexual intercourse and the associated ... Wollo Zone, Amhara National Regional State, North East ... censored ... Fig 2: Kaplan-Meier curve showing log survival by residence (rural/urban) ... renting and selling pornography films.

  19. Epilepsy in Rett syndrome--lessons from the Rett networked database

    DEFF Research Database (Denmark)

    Nissenkorn, Andreea; Levy-Drummer, Rachel S; Bondi, Ori

    2015-01-01

    collected. Statistical analysis was done using the IBM SPSS Version 21 software, logistic regression, and Kaplan-Meier survival curves. RESULTS: Epilepsy was present in 68.1% of the patients, with uncontrolled seizures in 32.6% of the patients with epilepsy. Mean age of onset of epilepsy was 4...

  20. Survival of Root-filled Teeth in the Swedish Adult Population

    DEFF Research Database (Denmark)

    Fransson, Helena; Dawson, Victoria S; Frisk, Fredrik

    2016-01-01

    INTRODUCTION: The aim was to assess survival in the Swedish population of teeth treated by nonsurgical root canal treatment during 2009. METHODS: Data from the Swedish Social Insurance Agency were analyzed by Kaplan-Meier analysis to assess cumulative tooth survival during a period of 5-6 years o...

  1. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    International Nuclear Information System (INIS)

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method. - Highlights: • An indirect probability model (IPM) is proposed for mechanism reliability evaluation. • The dynamic equation, degradation function and Kaplan–Meier estimator are used. • Then the simplified form of indirect probability model is proposed. • The experimental results agree well with the predicted results.

  2. ipw: An R Package for Inverse Probability Weighting

    Directory of Open Access Journals (Sweden)

    Ronald B. Geskus

    2011-10-01

    Full Text Available We describe the R package ipw for estimating inverse probability weights. We show how to use the package to fit marginal structural models through inverse probability weighting, to estimate causal effects. Our package can be used with data from a point treatment situation as well as with a time-varying exposure and time-varying confounders. It can be used with binomial, categorical, ordinal and continuous exposure variables.
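
    The record above describes the R package ipw; as a language-neutral sketch of the same point-treatment idea (here in Python, with made-up toy data and simple stratification standing in for the propensity model the package would fit), inverse probability weighting looks like this:

```python
from collections import defaultdict

# Toy point-treatment data (entirely made up): tuples of
# (L = binary confounder, A = binary treatment, Y = outcome).
data = [
    (0, 0, 1.0), (0, 0, 2.0), (0, 1, 2.5), (0, 1, 3.0),
    (1, 0, 3.0), (1, 1, 4.0), (1, 1, 5.0), (1, 1, 4.5),
]

# Propensity P(A=1 | L) by stratification (a stand-in for the
# logistic regression a real analysis would use).
counts = defaultdict(lambda: [0, 0])  # L -> [n, n_treated]
for l, a, _ in data:
    counts[l][0] += 1
    counts[l][1] += a
prop = {l: n1 / n for l, (n, n1) in counts.items()}

def ipw_mean(arm):
    """Weighted mean outcome in one arm; weights are 1 / P(A = arm | L),
    normalized (Hajek-style) so they sum to 1 within the arm."""
    num = den = 0.0
    for l, a, y in data:
        if a != arm:
            continue
        p = prop[l] if arm == 1 else 1.0 - prop[l]
        w = 1.0 / p
        num += w * y
        den += w
    return num / den

effect = ipw_mean(1) - ipw_mean(0)
print(f"IPW-estimated marginal treatment effect: {effect:.3f}")
```

    The package generalizes this to time-varying exposures and confounders, and to the stabilized weights used in marginal structural models.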

  3. Efficient simulation of tail probabilities of sums of correlated lognormals

    DEFF Research Database (Denmark)

    Asmussen, Søren; Blanchet, José; Juneja, Sandeep

    We consider the problem of efficient estimation of tail probabilities of sums of correlated lognormals via simulation. This problem is motivated by the tail analysis of portfolios of assets driven by correlated Black-Scholes models. We propose two estimators that can be rigorously shown to be efficient ... optimize the scaling parameter of the covariance. The second estimator decomposes the probability of interest in two contributions and takes advantage of the fact that large deviations for a sum of correlated lognormals are (asymptotically) caused by the largest increment. Importance sampling...

  4. Community Music during the New Deal: The Contributions of Willem Van de Wall and Max Kaplan

    Science.gov (United States)

    Krikun, Andrew

    2010-01-01

    Willem Van de Wall (1887-1953) and Max Kaplan (1911-98) built careers spanning music performance, music education, adult education, sociology, social work, music therapy and community music. Willem Van de Wall was a seminal influence on the development of the fields of music therapy and adult education--researching the role of music in…

  5. Progression to Legal Blindness in Patients With Normal Tension Glaucoma: Hospital-Based Study.

    Science.gov (United States)

    Sawada, Akira; Rivera, Jonathan A; Takagi, Daisuke; Nishida, Takashi; Yamamoto, Tetsuya

    2015-06-01

    To determine the probability of an eye with normal tension glaucoma (NTG) progressing to legal blindness under standard ophthalmic care. Patients diagnosed with NTG (n = 382) between 1985 and 2007 at Gifu University Hospital were followed for at least 5 years under standard ophthalmic care. The collected data included the best-corrected visual acuity (BCVA), intraocular pressure (IOP), and visual field status. Blindness was defined as a BCVA of blindness in one or both eyes. The mean follow-up period after diagnosis was 13.3 ± 5.4 years with a range of 5.0 to 29.1 years. At diagnosis, 18 patients (4.7%) had unilateral blindness due to glaucoma. At final examination, 34 patients had progressed to unilateral blindness and 5 to bilateral blindness. The Kaplan-Meier life table analysis estimate for unilateral blindness was 5.8 ± 1.3% at 10 years and 9.9 ± 1.9% at 20 years. Similarly, that for bilateral blindness was 0.3 ± 0.3% at 10 years and 1.4 ± 0.8% at 20 years. A Cox proportional hazard model analysis showed that a lower initial BCVA (P blindness in at least one eye. The probability of blindness in eyes with NTG is much lower than previously reported in patients with high-tension glaucoma. Nevertheless, special care should be taken to follow NTG patients, and especially those with worse BCVA and more advanced visual field loss at diagnosis.

  6. Tumor Control Probability Modeling for Stereotactic Body Radiation Therapy of Early-Stage Lung Cancer Using Multiple Bio-physical Models

    Science.gov (United States)

    Liu, Feng; Tai, An; Lee, Percy; Biswas, Tithi; Ding, George X.; El Naqa, Isaam; Grimm, Jimm; Jackson, Andrew; Kong, Feng-Ming (Spring); LaCouture, Tamara; Loo, Billy; Miften, Moyed; Solberg, Timothy; Li, X Allen

    2017-01-01

    Purpose To analyze pooled clinical data using different radiobiological models and to understand the relationship between biologically effective dose (BED) and tumor control probability (TCP) for stereotactic body radiotherapy (SBRT) of early-stage non-small cell lung cancer (NSCLC). Methods and Materials The clinical data of 1-, 2-, 3-, and 5-year actuarial or Kaplan-Meier TCP from 46 selected studies were collected for SBRT of NSCLC in the literature. The TCP data were separated for Stage T1 and T2 tumors if possible, otherwise collected for combined stages. BED was calculated at isocenters using six radiobiological models. For each model, the independent model parameters were determined from a fit to the TCP data using the least chi-square (χ²) method with either one set of parameters regardless of tumor stage or two sets for T1 and T2 tumors separately. Results The fits to the clinical data yield consistent results of large α/β ratios of about 20 Gy for all models investigated. The regrowth model, which accounts for tumor repopulation and heterogeneity, leads to a better fit to the data; the fits of the other five models were indistinguishable from one another. The models based on the fitting parameters predict that T2 tumors require about 1 Gy of additional physical dose at isocenters per fraction (≤5 fractions) to achieve the optimal TCP when compared to T1 tumors. Conclusion This systematic analysis of a large set of published clinical data using different radiobiological models shows that local TCP for SBRT of early-stage NSCLC has a strong dependence on BED, with large α/β ratios of about 20 Gy. The six models predict that a BED (calculated with α/β of 20) of 90 Gy is sufficient to achieve TCP ≥ 95%. Among the models considered, the regrowth model leads to a better fit to the clinical data. PMID:27871671
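
    The linear-quadratic BED referred to throughout this record is BED = n·d·(1 + d/(α/β)). A quick check using the large α/β ratio of about 20 Gy reported above (the 3 × 18 Gy scheme below is a common lung SBRT example chosen for illustration, not a value taken from the paper):

```python
def bed(n_fractions, dose_per_fraction, alpha_beta=20.0):
    """Biologically effective dose (Gy) under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

# 3 fractions of 18 Gy with alpha/beta = 20 Gy:
# 54 * (1 + 18/20) = 54 * 1.9, i.e. about 102.6 Gy, above the ~90 Gy
# threshold the models associate with TCP >= 95%.
print(f"BED(3 x 18 Gy, a/b = 20) = {bed(3, 18.0):.1f} Gy")
```

    Note how sensitive the result is to α/β: the same schedule evaluated with the conventional α/β = 10 Gy would give 54 × 2.8 = 151.2 Gy, which is why the fitted ratio matters for pooling TCP data.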

  7. Primary mediastinal large B-cell lymphoma: Clinical features, prognostic factors and survival with RCHOP in Arab patients in the PET scan era.

    Science.gov (United States)

    Al Shemmari, Salem; Sankaranarayanan, Sreedharan P; Krishnan, Yamini

    2014-07-01

    PMBCL is a distinct type of non-Hodgkin's lymphoma with specific clinicopathological features. To clarify clinical features, treatment alternatives and outcomes, we evaluated 28 Arab patients treated with chemotherapy or radiotherapy between 2006 and 2011. PMBCL patients identified according to the WHO classification and treated at KCCC between 2006 and 2011 were included in this study. Demographic and clinical data are presented as means or medians. Overall survival was estimated using the Kaplan-Meier method. Survival rates were compared using the log-rank test. A P value less than 0.05 was considered statistically significant. PMBCL is a lymphoma entity seen in the young with good survival. The role of PET scan for response evaluation and the type of consolidation therapy need to be further clarified.

  8. Descriptive epidemiology and natural history of idiopathic venous thromboembolism in U.S. active duty enlisted personnel, 1998-2007.

    Science.gov (United States)

    Freeman, Randall J; Li, Yuanzhang; Niebuhr, David W

    2011-05-01

    The estimated incidence of idiopathic venous thromboembolism (IVTE) cases in the United States ranges from 24,000 to 282,000/year. This analysis explores the incidence and prevalence of IVTE in the military and if cases experience increased attrition. The Defense Medical Surveillance System was searched for incident IVTE cases from 1998 through 2007. Enlisted cases were each matched to 3 controls. Kaplan-Meier survival analysis and Cox proportional hazard modeling were performed. We matched 463 cases to 1,389 controls. Outpatient IVTE rates have increased markedly from 1998 through 2007. Cases of all-cause attrition risk (0.56 [95% CI = 0.44, 0.72]) and rates were significantly less than controls (p readiness, and medical costs.

  9. Estimating inverse probability weights using super learner when weight-model specification is unknown in a marginal structural Cox model context.

    Science.gov (United States)

    Karim, Mohammad Ehsanul; Platt, Robert W

    2017-06-15

    Correct specification of the inverse probability weighting (IPW) model is necessary for consistent inference from a marginal structural Cox model (MSCM). In practical applications, researchers are typically unaware of the true specification of the weight model. Nonetheless, IPWs are commonly estimated using parametric models, such as the main-effects logistic regression model. In practice, assumptions underlying such models may not hold and data-adaptive statistical learning methods may provide an alternative. Many candidate statistical learning approaches are available in the literature. However, the optimal approach for a given dataset is impossible to predict. Super learner (SL) has been proposed as a tool for selecting an optimal learner from a set of candidates using cross-validation. In this study, we evaluate the usefulness of a SL in estimating IPW in four different MSCM simulation scenarios, in which we varied the specification of the true weight model specification (linear and/or additive). Our simulations show that, in the presence of weight model misspecification, with a rich and diverse set of candidate algorithms, SL can generally offer a better alternative to the commonly used statistical learning approaches in terms of MSE as well as the coverage probabilities of the estimated effect in an MSCM. The findings from the simulation studies guided the application of the MSCM in a multiple sclerosis cohort from British Columbia, Canada (1995-2008), to estimate the impact of beta-interferon treatment in delaying disability progression. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N^2-1)-dimensional volume and (N^2-2)-dimensional hyperarea of the (separable and nonseparable) NxN density matrices, based on the Bures (minimal monotone) metric, and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 x 10^9 well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be, independently of the metric (each of the seven inducing Haar measure) employed, twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate, 33.9982, of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases.

  11. Estimated probabilities, volumes, and inundation-area depths of potential postwildfire debris flows from Carbonate, Slate, Raspberry, and Milton Creeks, near Marble, Gunnison County, Colorado

    Science.gov (United States)

    Stevens, Michael R.; Flynn, Jennifer L.; Stephens, Verlin C.; Verdin, Kristine L.

    2011-01-01

    During 2009, the U.S. Geological Survey, in cooperation with Gunnison County, initiated a study to estimate the potential for postwildfire debris flows to occur in the drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble, Colorado. Currently (2010), these drainage basins are unburned but could be burned by a future wildfire. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of postwildfire debris-flow occurrence and debris-flow volumes for drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble. Data for the postwildfire debris-flow models included drainage basin area; area burned and burn severity; percentage of burned area; soil properties; rainfall total and intensity for the 5- and 25-year-recurrence, 1-hour-duration rainfall; and topographic and soil property characteristics of the drainage basins occupied by the four creeks. A quasi-two-dimensional floodplain computer model (FLO-2D) was used to estimate the spatial distribution and the maximum instantaneous depth of the postwildfire debris-flow material during debris flow on the existing debris-flow fans that issue from the outlets of the four major drainage basins. The postwildfire debris-flow probabilities at the outlet of each drainage basin range from 1 to 19 percent for the 5-year-recurrence, 1-hour-duration rainfall, and from 3 to 35 percent for the 25-year-recurrence, 1-hour-duration rainfall. The largest probabilities for postwildfire debris flow are estimated for Raspberry Creek (19 and 35 percent), whereas estimated debris-flow probabilities for the three other creeks range from 1 to 6 percent. The estimated postwildfire debris-flow volumes at the outlet of each creek range from 7,500 to 101,000 cubic meters for the 5-year-recurrence, 1-hour-duration rainfall, and from 9,400 to 126,000 cubic meters for

  12. Estimated probability of stroke among medical outpatients in Enugu ...

    African Journals Online (AJOL)

    Risk factors for stroke were evaluated using a series of laboratory tests, medical history and physical examinations. The 10‑year probability of stroke was determined by applying the Framingham stroke risk equation. Statistical analysis was performed with the use of the SPSS 17.0 software package (SPSS Inc., Chicago, IL, ...

  13. Probability intervals for the top event unavailability of fault trees

    International Nuclear Information System (INIS)

    Lee, Y.T.; Apostolakis, G.E.

    1976-06-01

    The evaluation of probabilities of rare events is of major importance in the quantitative assessment of the risk from large technological systems. In particular, for nuclear power plants the complexity of the systems, their high reliability and the lack of significant statistical records have led to the extensive use of logic diagrams in the estimation of low probabilities. The estimation of probability intervals for the probability of existence of the top event of a fault tree is examined. Given the uncertainties of the primary input data, a method is described for the evaluation of the first four moments of the top event occurrence probability. These moments are then used to estimate confidence bounds by several approaches which are based on standard inequalities (e.g., Tchebycheff, Cantelli, etc.) or on empirical distributions (the Johnson family). Several examples indicate that the Johnson family of distributions yields results which are in good agreement with those produced by Monte Carlo simulation
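
    The distribution-free bounds mentioned above need only the first two moments of the top-event probability. A minimal sketch (the example mean and variance are illustrative, not from the report):

```python
import math

def cantelli_upper_bound(mean, var, alpha):
    """One-sided Cantelli inequality: P(X >= mean + k*sigma) <= 1/(1 + k^2);
    setting 1/(1 + k^2) = alpha gives k = sqrt(1/alpha - 1)."""
    return mean + math.sqrt(1.0 / alpha - 1.0) * math.sqrt(var)

def chebyshev_upper_bound(mean, var, alpha):
    """Two-sided Tchebycheff inequality: P(|X - mean| >= k*sigma) <= 1/k^2,
    used here for a conservative upper bound at level alpha."""
    return mean + math.sqrt(1.0 / alpha) * math.sqrt(var)

# Example: top-event probability with mean 1e-4 and variance 1e-10
upper_95 = cantelli_upper_bound(1e-4, 1e-10, 0.05)
```

    For a one-sided statement the Cantelli bound is always the tighter of the two, which is one reason such moment bounds are compared against fitted Johnson-family distributions.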

  14. Clinical outcome among HIV-infected patients starting saquinavir hard gel compared to ritonavir or indinavir

    DEFF Research Database (Denmark)

    Kirk, O; Mocroft, A; Pradier, C

    2001-01-01

    -up within the EuroSIDA study. METHODS: Changes in plasma viral load (pVL) and CD4 cell count from baseline were compared between treatment groups. Time to new AIDS-defining events and death were compared in Kaplan--Meier models, and Cox models were established to further assess differences in clinical...

  15. C-reactive protein-to-albumin ratio is a predictor of hepatitis B virus related decompensated cirrhosis: time-dependent receiver operating characteristics and decision curve analysis.

    Science.gov (United States)

    Huang, Si-Si; Xie, Dong-Mei; Cai, Yi-Jing; Wu, Jian-Min; Chen, Rui-Chong; Wang, Xiao-Dong; Song, Mei; Zheng, Ming-Hua; Wang, Yu-Qun; Lin, Zhuo; Shi, Ke-Qing

    2017-04-01

    Hepatitis B virus (HBV) infection remains a major health problem and HBV-related-decompensated cirrhosis (HBV-DC) usually leads to a poor prognosis. Our aim was to determine the utility of inflammatory biomarkers in predicting mortality of HBV-DC. A total of 329 HBV-DC patients were enrolled. Survival estimates for the entire study population were generated using the Kaplan-Meier method. The prognostic values for model for end-stage liver disease (MELD) score, Child-Pugh score, and inflammatory biomarkers neutrophil/lymphocyte ratio, C-reactive protein-to-albumin ratio (CAR), and lymphocyte-to-monocyte ratio (LMR) for HBV-DC were compared using time-dependent receiver operating characteristic curves and time-dependent decision curves. The survival time was 23.1±15.8 months. Multivariate analysis identified age, CAR, LMR, and platelet count as prognostic independent risk factors. Kaplan-Meier analysis indicated that CAR of at least 1.0 (hazard ratio, 7.19; 95% confidence interval, 4.69-11.03), and LMR less than 1.9 (hazard ratio, 2.40; 95% confidence interval, 1.69-3.41) were independently associated with mortality of HBV-DC. The time-dependent receiver operating characteristic indicated that CAR showed the best performance in predicting mortality of HBV-DC compared with LMR, MELD score, and Child-Pugh score. The results were also confirmed by time-dependent decision curves. CAR and LMR were associated with the prognosis of HBV-DC. CAR was superior to LMR, MELD score, and Child-Pugh score in HBV-DC mortality prediction.

  16. Predicting long-term graft survival in adult kidney transplant recipients

    Directory of Open Access Journals (Sweden)

    Brett W Pinsky

    2012-01-01

    Full Text Available The ability to accurately predict a population's long-term survival has important implications for quantifying the benefits of transplantation. To identify a model that can accurately predict a kidney transplant population's long-term graft survival, we retrospectively studied the United Network of Organ Sharing data from 13,111 kidney-only transplants completed in 1988-1989. Nineteen-year death-censored graft survival (DCGS) projections were calculated and compared with the population's actual graft survival. The projection curves were created using a two-part estimation model that (1) fits a Kaplan-Meier survival curve immediately after transplant (Part A) and (2) uses truncated observational data to model a survival function for long-term projection (Part B). Projection curves were examined using varying amounts of time to fit both parts of the model. The accuracy of the projection curve was determined by examining whether predicted survival fell within the 95% confidence interval for the 19-year Kaplan-Meier survival, and the sample size needed to detect the difference in projected versus observed survival in a clinical trial. The 19-year DCGS was 40.7% (39.8-41.6%). Excellent predictability (41.3%) can be achieved when Part A is fit for three years and Part B is projected using two additional years of data. Using less than five total years of data tended to overestimate the population's long-term survival. Accurate prediction of long-term DCGS is possible, but requires attention to the quantity of data used in the projection method.
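
    The two-part idea (a Kaplan-Meier fit early, a parametric tail later) can be sketched as below. The exponential tail with a crude constant hazard is an illustrative stand-in; the article's actual Part B model is not specified here:

```python
import numpy as np

def kaplan_meier(times, events):
    """Return (event_times, survival) for the Kaplan-Meier estimator."""
    order = np.argsort(times)
    t, d = np.asarray(times)[order], np.asarray(events)[order]
    at_risk, s = len(t), 1.0
    out_t, out_s = [], []
    for ti in np.unique(t):
        mask = t == ti
        deaths = int(d[mask].sum())
        if deaths:
            s *= 1.0 - deaths / at_risk
            out_t.append(ti)
            out_s.append(s)
        at_risk -= int(mask.sum())
    return np.array(out_t), np.array(out_s)

def project(times, events, cutoff, horizon):
    """Part A: KM survival up to `cutoff`; Part B: exponential tail with a
    crude constant hazard (events per unit follow-up time). The tail model
    is a simplification, not the article's Part B."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    _, s = kaplan_meier(np.minimum(times, cutoff),
                        np.where(times <= cutoff, events, 0))
    s0 = s[-1] if len(s) else 1.0
    lam = events.sum() / times.sum()
    return s0 * np.exp(-lam * (horizon - cutoff))
```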

  17. Sharp probability estimates for Shor's order-finding algorithm

    OpenAIRE

    Bourdon, P. S.; Williams, H. T.

    2006-01-01

    Let N be a large positive integer, let b > 1 be an integer relatively prime to N, and let r be the order of b modulo N. Finally, let QC be a quantum computer whose input register has the size specified in Shor's original description of his order-finding algorithm. We prove that when Shor's algorithm is implemented on QC, then the probability P of obtaining a (nontrivial) divisor of r exceeds 0.7 whenever N exceeds 2^{11}-1 and r exceeds 39, and we establish that 0.7736 is an asymptotic lower...
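
    For small cases, the order r that the quantum algorithm targets can be checked classically by brute force (a sketch; this is of course infeasible for the cryptographically sized N the algorithm is designed for):

```python
from math import gcd

def multiplicative_order(b, N):
    """Smallest r > 0 with b**r = 1 (mod N); requires gcd(b, N) == 1."""
    if gcd(b, N) != 1:
        raise ValueError("b must be coprime to N")
    r, x = 1, b % N
    while x != 1:
        x = (x * b) % N
        r += 1
    return r
```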

  18. Determining the sample size required to establish whether a medical device is non-inferior to an external benchmark.

    Science.gov (United States)

    Sayers, Adrian; Crowther, Michael J; Judge, Andrew; Whitehouse, Michael R; Blom, Ashley W

    2017-08-28

    The use of benchmarks to assess the performance of implants such as those used in arthroplasty surgery is a widespread practice. It provides surgeons, patients and regulatory authorities with the reassurance that implants used are safe and effective. However, it is not currently clear how, or with how many implants, an implant should be statistically compared with a benchmark to assess whether that implant is superior, equivalent, non-inferior or inferior to the performance benchmark of interest. We aim to describe the methods and sample size required to conduct a one-sample non-inferiority study of a medical device for the purposes of benchmarking. Simulation study of a national register of medical devices. We simulated data, with and without a non-informative competing risk, to represent an arthroplasty population and describe three methods of analysis (z-test, 1-Kaplan-Meier and competing risks) commonly used in surgical research. We evaluate the performance of each method using power, bias, root-mean-square error, coverage and CI width. 1-Kaplan-Meier provides an unbiased estimate of implant net failure, which can be used to assess if a surgical device is non-inferior to an external benchmark. Small non-inferiority margins require significantly more individuals to be at risk compared with current benchmarking standards. A non-inferiority testing paradigm provides a useful framework for determining if an implant meets the required performance defined by an external benchmark. Current contemporary benchmarking standards have limited power to detect non-inferiority, and substantially larger sample sizes, in excess of 3200 procedures, are required to achieve a power greater than 60%. It is clear that when benchmarking implant performance, net failure estimated using 1-KM is preferential to crude failure estimated by competing risk models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No
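
    The one-sample comparison of a 1-KM failure estimate against an external benchmark reduces to a one-sided z-test. A minimal sketch (the failure estimate, its standard error, benchmark and margin below are hypothetical inputs; the standard error would typically come from Greenwood's formula):

```python
from statistics import NormalDist

def non_inferiority_z(failure_est, se, benchmark, margin, alpha=0.05):
    """One-sided z-test of H0: F >= benchmark + margin against
    H1: F < benchmark + margin, where F is the 1-KM failure estimate
    and se its standard error. Returns True when non-inferiority
    can be declared at level alpha."""
    z = (failure_est - (benchmark + margin)) / se
    return z < NormalDist().inv_cdf(alpha)

# e.g. 4.8% estimated failure (SE 0.8%) vs a 5% benchmark with a 2% margin
ok = non_inferiority_z(0.048, 0.008, benchmark=0.05, margin=0.02)
```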

  19. Mortality, Causes of Death and Associated Factors Relate to a Large HIV Population-Based Cohort.

    Directory of Open Access Journals (Sweden)

    César Garriga

    Full Text Available Antiretroviral therapy has led to a decrease in HIV-related mortality and to the emergence of non-AIDS defining diseases as competing causes of death. This study estimates the HIV mortality rate and its risk factors with regard to different causes in a large city from January 2001 to June 2013. We followed up 3137 newly diagnosed HIV non-AIDS cases. Causes of death were classified as HIV-related, non-HIV-related and external. We examined the effect of risk factors on survival using mortality rates, Kaplan-Meier plots and Cox models. Finally, we estimated survival for each main cause-of-death group through Fine and Gray models. 182 deaths were found [14.0/1000 person-years of follow-up (py); 95% confidence interval (CI): 12.0-16.1/1000 py], and 81.3% of them had a known cause of death. The mortality rate was the same for HIV-related and non-HIV-related causes (4.9/1000 py; CI: 3.7-6.1/1000 py); it was lower for external causes (1.7/1000 py; CI: 1.0-2.4/1000 py). Kaplan-Meier estimates showed worse survival in intravenous drug users (IDU) and heterosexuals than in men having sex with men (MSM). Factors associated with HIV-related causes of death include: IDU male (subhazard ratio (sHR): 3.2; CI: 1.5-7.0) and <200 CD4 at diagnosis (sHR: 2.7; CI: 1.3-5.7) versus ≥500 CD4. Factors associated with non-HIV-related causes of death include: ageing (sHR: 1.5; CI: 1.4-1.7) and heterosexual female (sHR: 2.8; CI: 1.1-7.3) versus MSM. Factors associated with external causes of death were IDU male (sHR: 28.7; CI: 6.7-123.2) and heterosexual male (sHR: 11.8; CI: 2.5-56.4) versus MSM. There are important differences in survival among transmission groups. Improved treatment is especially necessary in IDUs and heterosexual males.

  20. Whole abdominal irradiation in ovarian carcinoma

    International Nuclear Information System (INIS)

    Romestaing, P.; Gallo, C.; Gerard, J.F.; Ardiet, J.M.; Carrie, C.

    1989-01-01

    The prognosis of ovarian cancers, which are frequently diagnosed at a late stage, can probably be improved by whole abdominal radiotherapy. 45 patients in Lyon and 8 patients in Montelimar (7 stage I or C, 10 stage II and 36 stage III) were treated by whole abdominal radiotherapy, generally after 6 courses of chemotherapy (46 cases). The overall 5-year survival of this group of patients was 48% (Kaplan-Meier method). When the patients treated by complete resection at 1st look surgery (19 cases) are compared with those in whom 1st look surgery was incomplete (34 cases), the actuarial survival was 83% versus 27%. This study demonstrates that whole abdominal radiotherapy is feasible without any serious long-term complications after two operations and 6 courses of chemotherapy. These encouraging results need to be confirmed by randomized prospective studies. [fr]

  1. Failure rate of cemented and uncemented total hip replacements

    DEFF Research Database (Denmark)

    Makela, K. T.; Matilainen, M.; Pulkkinen, P.

    2014-01-01

    Objective To assess the failure rate of cemented, uncemented, hybrid, and reverse hybrid total hip replacements in patients aged 55 years or older. Design Register study. Setting Nordic Arthroplasty Register Association database (combined data from Sweden, Norway, Denmark, and Finland). Participants 347 899 total hip replacements performed during 1995-2011. Main outcome measures Probability of implant survival (Kaplan-Meier analysis) along with implant survival with revision for any reason as endpoint (Cox multiple regression) adjusted for age, sex, and diagnosis in age groups 55-64, 65-74, and 75 years or older. Results The proportion of total hip replacements using uncemented implants increased rapidly towards the end of the study period. The 10 year survival of cemented implants in patients aged 65 to 74 and 75 or older (93.8%, 95% confidence interval 93.6% to 94.0%, and 95.9%, 95

  2. A method to combine hydrodynamics and constructive design in the optimization of the runner blades of Kaplan turbines

    International Nuclear Information System (INIS)

    Miclosina, C O; Balint, D I; Campian, C V; Frunzaverde, D; Ion, I

    2012-01-01

    This paper deals with the optimization of axial hydraulic turbines of Kaplan type. The optimization of the runner blade is presented systematically from two points of view: hydrodynamic and constructive. We attempt to combine these aspects in order to obtain safer operation when unsteady effects occur in the runner of the turbine. The design and optimization of the runner blade is performed with QTurbo3D software developed at the Center for Research in Hydraulics, Automation and Thermal Processes (CCHAPT) from 'Eftimie Murgu' University of Resita, Romania. QTurbo3D software offers possibilities to design the meridian channel of hydraulic turbines, design the blades, and optimize the runner blade. 3D modeling and motion analysis of the runner blade operating mechanism are accomplished using SolidWorks software. The purpose of the motion study is to obtain forces, torques or stresses in the runner blade operating mechanism, necessary to estimate its lifetime. This paper clearly states the importance of combining hydrodynamics with structural design in the optimization procedure of the runner of hydraulic turbines.

  3. A method to combine hydrodynamics and constructive design in the optimization of the runner blades of Kaplan turbines

    Science.gov (United States)

    Miclosina, C. O.; Balint, D. I.; Campian, C. V.; Frunzaverde, D.; Ion, I.

    2012-11-01

    This paper deals with the optimization of axial hydraulic turbines of Kaplan type. The optimization of the runner blade is presented systematically from two points of view: hydrodynamic and constructive. We attempt to combine these aspects in order to obtain safer operation when unsteady effects occur in the runner of the turbine. The design and optimization of the runner blade is performed with QTurbo3D software developed at the Center for Research in Hydraulics, Automation and Thermal Processes (CCHAPT) from "Eftimie Murgu" University of Resita, Romania. QTurbo3D software offers possibilities to design the meridian channel of hydraulic turbines, design the blades, and optimize the runner blade. 3D modeling and motion analysis of the runner blade operating mechanism are accomplished using SolidWorks software. The purpose of the motion study is to obtain forces, torques or stresses in the runner blade operating mechanism, necessary to estimate its lifetime. This paper clearly states the importance of combining hydrodynamics with structural design in the optimization procedure of the runner of hydraulic turbines.

  4. Infant mortality in a very low birth weight cohort from a public hospital in Rio de Janeiro, RJ, Brazil

    Directory of Open Access Journals (Sweden)

    Regina Coeli Azeredo Cardoso

    2013-09-01

    Full Text Available OBJECTIVES: to evaluate infant mortality in very low birth weight newborns from a public hospital in Rio de Janeiro, Brazil (2002-2006). METHODS: a retrospective cohort study was performed using the probabilistic linkage method to identify infant mortality. Mortality proportions were calculated according to birth weight intervals and period of death. The Kaplan-Meier method was used to estimate overall cumulative survival probability. The association between maternal schooling and survival of very low birth weight infants was evaluated by means of Cox proportional hazard models adjusted for prenatal care, birth weight, and gestational age. RESULTS: the study included 782 very low birth weight newborns. Of these, 28.6% died before one year of age. Neonatal mortality was 19.5%, and early neonatal mortality was 14.9%. Mortality was highest in the lowest weight group (71.6%). Newborns whose mothers had less than four years of schooling had 2.5 times higher risk of death than those whose mothers had eight years of schooling or more, even after adjusting for intermediate factors. CONCLUSIONS: the results showed high mortality among very low birth weight infants. Low schooling was an independent predictor of infant death in this low-income population sample.

  5. Longevity of anterior resin-bonded bridges: survival rates of two tooth preparation designs.

    Science.gov (United States)

    Abuzar, M; Locke, J; Burt, G; Clausen, G; Escobar, K

    2018-04-16

    Significant developments have occurred in the design of resin-bonded bridges (RBB) over the past two decades. They are commonly used as an alternative treatment option for a single missing tooth. The longevity of these bridges needs to be further investigated to evaluate long-term outcomes for this option to remain relevant. A cohort of patients who received anterior resin-bonded bridges (ARBB) over two decades was studied retrospectively. Longevity of 206 ARBB was assessed using Kaplan-Meier probability estimates. The two modified tooth preparation designs investigated were: (A) mesial and distal vertical grooves only; and (B) one proximal groove adjacent to the pontic and two palatal grooves. Age and gender of the patient cohort were also recorded. Overall survival rate of ARBB was 98% at 5 years, 97.2% at 10 years, and 95.1% from 12 years till 21 years. Survival curves showed minor differences when compared for the two designs, age groups and gender of ARBB recipients. Differences in the proportion of surviving bridges for design A (95.96%) and design B (98.13%) were not statistically significant (Fisher's exact test). Anterior RBB with described tooth preparation designs demonstrate a high survival rate. © 2018 Australian Dental Association.
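
    A Fisher's exact test like the one reported can be computed directly from the hypergeometric distribution. A sketch follows; the 2x2 counts in the example call are hypothetical, since the paper does not give per-design group sizes:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sums the hypergeometric probabilities of all tables no more likely
    than the observed one (margins held fixed)."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def p(x):  # probability that cell (1,1) equals x, margins fixed
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: surviving vs failed bridges for designs A and B
p_value = fisher_exact_two_sided(95, 4, 105, 2)
```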

  6. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    Science.gov (United States)

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review risk or cumulative incidence of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
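
    Several of the measures listed follow directly from Bayes' theorem. A compact sketch with illustrative numbers (the sensitivity, specificity and pre-test prevalence are hypothetical, not from the guide):

```python
def predictive_values(sens, spec, prev):
    """Post-test probabilities via Bayes' theorem from sensitivity,
    specificity and pre-test prevalence (e.g. transition-to-psychosis
    risk in the population assessed)."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

def likelihood_ratios(sens, spec):
    """Positive and negative likelihood ratios of the test."""
    return sens / (1 - spec), (1 - sens) / spec

# Illustrative numbers: 90% sensitivity, 80% specificity, 20% prevalence
ppv, npv = predictive_values(0.9, 0.8, 0.2)
```

    The same test applied in a lower-prevalence population yields a much lower PPV, which is why pre-test risk enrichment matters in the CHR state.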

  7. Pressure relieving support surfaces (PRESSURE) trial: cost effectiveness analysis.

    Science.gov (United States)

    Iglesias, Cynthia; Nixon, Jane; Cranny, Gillian; Nelson, E Andrea; Hawkins, Kim; Phillips, Angela; Torgerson, David; Mason, Su; Cullum, Nicky

    2006-06-17

    To assess the cost effectiveness of alternating pressure mattresses compared with alternating pressure overlays for the prevention of pressure ulcers in patients admitted to hospital. Cost effectiveness analysis carried out alongside the pressure relieving support surfaces (PRESSURE) trial; a multicentre UK based pragmatic randomised controlled trial. 11 hospitals in six UK NHS trusts. Intention to treat population comprising 1971 participants. Kaplan-Meier estimates of restricted mean time to development of pressure ulcers and total costs for treatment in hospital. Alternating pressure mattresses were associated with lower overall costs (283.6 pounds sterling per patient on average, 95% confidence interval -377.59 pounds sterling to 976.79 pounds sterling) mainly due to reduced length of stay in hospital, and greater benefits (a delay in time to ulceration of 10.64 days on average, -24.40 to 3.09). The differences in health benefits and total costs for hospital stay between alternating pressure mattresses and alternating pressure overlays were not statistically significant; however, a cost effectiveness acceptability curve indicated that on average alternating pressure mattresses compared with alternating pressure overlays were associated with an 80% probability of being cost saving. Alternating pressure mattresses for the prevention of pressure ulcers are more likely to be cost effective and are more acceptable to patients than alternating pressure overlays.
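
    A cost-effectiveness acceptability curve is simply the share of bootstrap replicates with positive net monetary benefit at each willingness-to-pay value. A minimal sketch (the replicate values below are hypothetical, not trial data):

```python
import numpy as np

def ceac(delta_cost, delta_effect, lambdas):
    """Cost-effectiveness acceptability curve: for each willingness-to-pay
    value lam, the fraction of bootstrap replicates whose net monetary
    benefit, lam * delta_effect - delta_cost, is positive."""
    dc = np.asarray(delta_cost, float)
    de = np.asarray(delta_effect, float)
    return np.array([np.mean(lam * de - dc > 0) for lam in lambdas])

# Hypothetical bootstrap replicates of (incremental cost, incremental effect)
curve = ceac([-100.0, 50.0, 200.0, -50.0], [1.0, 2.0, -1.0, 0.5], [0.0, 100.0])
```

    At a willingness to pay of zero, the curve reduces to the probability of being cost saving, the quantity quoted in the abstract.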

  8. The significant impact of acute kidney injury on CKD in patients who survived over 10 years after myeloablative allogeneic SCT.

    Science.gov (United States)

    Shimoi, T; Ando, M; Munakata, W; Kobayashi, T; Kakihana, K; Ohashi, K; Akiyama, H; Sakamaki, H

    2013-01-01

    There are no well-defined studies of chronic kidney disease (CKD) among long-term survivors after hematopoietic SCT. A retrospective longitudinal study was conducted to characterize CKD in 77 subjects who had undergone myeloablative allogeneic SCT, all of whom had their serum creatinine (Cr) levels followed up during the 10-year period after SCT. Their mean (range) survival time was 14.4 (10.5-20.2) years. CKD was defined as a persistent decrease in the Cr-based estimated glomerular filtration rate to below 60 mL/min/1.73 m². Acute kidney injury (AKI) was defined as an increase in Cr within the first 100 days after SCT, and its severity was classified into three stages according to the AKIN criteria. Kaplan-Meier and Cox proportional hazards regression analyses evaluated the association between AKI and the incidence of CKD. The cumulative incidence of CKD increased over time and reached 34% at 10 years. After adjusting for known risks for post-SCT CKD, each AKIN stage was strongly associated with the incidence of CKD. The incidence of CKD probably increases over time among subjects who are alive at >10 years after SCT. This study places a new emphasis on AKI as an important risk factor for CKD in post-SCT subjects.
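
    The creatinine-based AKIN staging used here can be sketched as a simple classifier. This is a simplification: the 48-hour timing window and the urine-output criteria of the full AKIN definition are omitted:

```python
def akin_stage(baseline_cr, peak_cr):
    """Simplified AKIN staging from serum creatinine in mg/dL.
    Returns 0 (no AKI) through 3; timing and urine-output criteria
    are deliberately omitted in this sketch."""
    ratio = peak_cr / baseline_cr
    rise = peak_cr - baseline_cr
    if ratio > 3.0 or (peak_cr >= 4.0 and rise >= 0.5):
        return 3  # >3x baseline, or Cr >= 4.0 with an acute rise >= 0.5
    if ratio > 2.0:
        return 2  # >2x to 3x baseline
    if ratio >= 1.5 or rise >= 0.3:
        return 1  # 1.5x to 2x baseline, or absolute rise >= 0.3 mg/dL
    return 0
```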

  9. Axial U(1) current in Grabowska and Kaplan's formulation

    Science.gov (United States)

    Hamada, Yu; Kawai, Hikaru

    2017-06-01

    Recently, Grabowska and Kaplan [Phys. Rev. Lett. 116, 211602 (2016); Phys. Rev. D 94, 114504 (2016)] suggested a nonperturbative formulation of a chiral gauge theory, which consists of the conventional domain-wall fermion and a gauge field that evolves by gradient flow from one domain wall to the other. We introduce two sets of domain-wall fermions belonging to complex conjugate representations so that the effective theory is a 4D vector-like gauge theory. Then, as a natural definition of the axial-vector current, we consider a current that generates simultaneous phase transformations for the massless modes in 4 dimensions. However, this current is exactly conserved and does not reproduce the correct anomaly. In order to investigate this point precisely, we consider the mechanism of the conservation. We find that this current includes not only the axial current on the domain wall but also a contribution from the bulk, which is nonlocal in the sense of 4D fields. Therefore, the local current is obtained by subtracting the bulk contribution from it.

  10. Coupled skinny baker's maps and the Kaplan-Yorke conjecture

    Science.gov (United States)

    Gröger, Maik; Hunt, Brian R.

    2013-09-01

    The Kaplan-Yorke conjecture states that for ‘typical’ dynamical systems with a physical measure, the information dimension and the Lyapunov dimension coincide. We explore this conjecture in a neighborhood of a system for which the two dimensions do not coincide because the system consists of two uncoupled subsystems. We are interested in whether coupling ‘typically’ restores the equality of the dimensions. The particular subsystems we consider are skinny baker's maps, and we consider uni-directional coupling. For coupling in one of the possible directions, we prove that the dimensions coincide for a prevalent set of coupling functions, but for coupling in the other direction we show that the dimensions remain unequal for all coupling functions. We conjecture that the dimensions prevalently coincide for bi-directional coupling. On the other hand, we conjecture that the phenomenon we observe for a particular class of systems with uni-directional coupling, where the information and Lyapunov dimensions differ robustly, occurs more generally for many classes of uni-directionally coupled systems (also called skew-product systems) in higher dimensions.

  11. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
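
    The core rendering computation of such an app is the mapping from a probability to the two slice angles. A sketch (the function name is illustrative, not from the app):

```python
def wheel_slices(p):
    """Angles in degrees of the 'event' and 'no event' slices of a
    two-slice probability wheel for probability p."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must lie in [0, 1]")
    return 360.0 * p, 360.0 * (1.0 - p)
```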

  12. Estimation of long-term probabilities for inadvertent intrusion into radioactive waste management areas

    International Nuclear Information System (INIS)

    Eedy, W.; Hart, D.

    1988-05-01

    The risk to human health from radioactive waste management sites can be calculated as the product of the probability of accidental exposure (intrusion) times the probability of a health effect from such exposure. This report reviews the literature and evaluates methods used to predict the probabilities for unintentional intrusion into radioactive waste management areas in Canada over a 10,000-year period. Methods to predict such probabilities are available. They generally assume a long-term stability in terms of existing resource uses and society in the management area. The major potential for errors results from the unlikeliness of these assumptions holding true over such lengthy periods of prediction
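
    The risk product described in the first sentence, together with an aggregation over the prediction period, can be sketched as follows (treating years as independent and identically distributed is itself a strong assumption over 10,000 years, as the report notes):

```python
def annual_risk(p_intrusion, p_effect):
    """Risk per year = P(intrusion) * P(health effect | intrusion)."""
    return p_intrusion * p_effect

def cumulative_risk(annual, years):
    """P(at least one harmful outcome) over `years`, assuming
    independent, identically distributed years."""
    return 1.0 - (1.0 - annual) ** years
```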

  13. Comparative studies of parameters based on the most probable versus an approximate linear extrapolation distance estimates for circular cylindrical absorbing rod

    International Nuclear Information System (INIS)

    Wassef, W.A.

    1982-01-01

    Estimates and techniques that are valid for calculating the linear extrapolation distance for an infinitely long circular cylindrical absorbing region are reviewed. Two estimates in particular are considered: the most probable value, and the value resulting from an approximate technique based on matching the integral transport equation inside the absorber with the diffusion approximation in the surrounding infinite scattering medium. The effective diffusion parameters and the blackness of the cylinder are then derived and subjected to comparative studies. A computer code is set up to calculate and compare the different parameters, which is useful in reactor analysis and serves to establish beneficial estimates that are amenable to direct application in reactor design codes

  14. LINEAR REGRESSION MODEL ESTIMATION FOR RIGHT CENSORED DATA

    Directory of Open Access Journals (Sweden)

    Ersin Yılmaz

    2016-05-01

    Full Text Available In this study we first define right-censored data: briefly, a right-censored observation is one whose true value is known only to exceed the recorded value, for example because of the limits of a measuring device. A linear regression model is then estimated for a response variable that is observed subject to right censoring, with fully observed explanatory variables. Because of the censoring, Kaplan-Meier weights are used in estimating the model; with these weights the regression estimator is consistent and unbiased. A semiparametric regression method is also available for censored data and likewise gives useful results. This study may be particularly useful for health research, since censored data arise frequently in medical studies.
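
    The Kaplan-Meier weighting referred to here is commonly implemented via Stute-type weights, in which each uncensored response receives the jump of the Kaplan-Meier estimator at its value. A minimal sketch under that assumption (function names and the simulated data are ours, not the paper's):

```python
import numpy as np

def stute_weights(y, delta):
    """Kaplan-Meier (Stute) weights for right-censored responses.
    y: observed responses (min of true response and censoring value);
    delta: 1 if uncensored, 0 if censored."""
    n = len(y)
    order = np.argsort(y, kind="stable")
    d = np.asarray(delta, float)[order]
    w = np.zeros(n)
    prod = 1.0
    for i in range(n):                       # n - i subjects remain at risk
        w[i] = d[i] * prod / (n - i)
        prod *= ((n - i - 1) / (n - i)) ** d[i]
    out = np.zeros(n)
    out[order] = w                           # map back to original order
    return out

def km_weighted_ols(x, y, delta):
    """Weighted least squares y ~ a + b*x with Kaplan-Meier weights."""
    w = stute_weights(y, delta)
    X = np.column_stack([np.ones_like(x), x])
    WX = X * w[:, None]
    return np.linalg.solve(X.T @ WX, WX.T @ y)

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y_true = 1.0 + 2.0 * x + rng.normal(0, 0.1, 200)
c = rng.uniform(1.0, 5.0, 200)               # random right-censoring values
y = np.minimum(y_true, c)
delta = (y_true <= c).astype(int)
print(km_weighted_ols(x, y, delta))          # roughly [1.0, 2.0]
```

With no censoring at all, every weight reduces to 1/n and the estimator collapses to ordinary least squares.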

  15. Infants with prenatally diagnosed kidney anomalies have an increased risk of urinary tract infections

    DEFF Research Database (Denmark)

    Rasmussen, Maria; Sunde, Lone; Andersen, René F

    2017-01-01

    AIM: This study estimated the urinary tract infection (UTI) risk in a nationwide cohort of infants prenatally diagnosed with parenchymal kidney anomalies compared with a comparison cohort. METHODS: A Danish population-based nationwide cohort of foetuses diagnosed with parenchymal kidney anomalies between 2007 and 2012 had previously been identified. These were compared with foetuses without kidney anomalies who were prenatally scanned the same year. Live born infants were followed from birth until the diagnosis of UTI, emigration, death or two years of age. Cumulative incidences of UTIs were computed. Mortality was estimated using the Kaplan-Meier method. RESULTS: We identified 412 foetuses with parenchymal kidney anomalies out of 362 069 who underwent ultrasound scans and 277 were born alive. The overall risk of a UTI before the age of two years was 19%, and it was 14% among infants without...

  16. Developing a probability-based model of aquifer vulnerability in an agricultural region

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized using two parameter-classification methods: selecting the rating with the maximum estimation probability, and calculating an expected value. Moreover, the probability-based estimation and assessment gave excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the developed model's capacity to predict pollution, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N concentrations exceeding 0.5 mg/L, which indicate anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and of characterizing parameter uncertainty via the probability estimation processes.
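
    The two parameter-classification rules named in the abstract (selecting the rating with the maximum estimation probability versus calculating an expected value) can be illustrated with a toy example; the class probabilities and ratings below are hypothetical, not from the study:

```python
import numpy as np

# Hypothetical indicator-kriging output at one grid cell: the estimated
# probability of each rating class for a single DRASTIC parameter.
ratings = np.array([1.0, 3.0, 5.0, 8.0, 10.0])     # illustrative rating scale
probs = np.array([0.05, 0.15, 0.40, 0.30, 0.10])   # must sum to 1

max_prob_rating = ratings[np.argmax(probs)]        # rule 1: most probable class
expected_rating = float(probs @ ratings)           # rule 2: probability-weighted value

print(max_prob_rating, expected_rating)            # 5.0 and 5.9
```

The two rules disagree whenever the probability mass is spread across classes, which is exactly the parameter uncertainty the probabilistic model propagates.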

  17. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  18. The Factor Structure and Age-Related Factorial Invariance of the Delis-Kaplan Executive Function System (D-KEFS)

    Science.gov (United States)

    Latzman, Robert D.; Markon, Kristian E.

    2010-01-01

    There has been an increased interest in the structure of and relations among executive functions. The present study examined the factor structure as well as age-related factorial invariance of the Delis-Kaplan Executive Function System (D-KEFS), a widely used inventory aimed at assessing executive functions. Analyses were first conducted using data…

  19. Influence of the level of fit of a density probability function to wind-speed data on the WECS mean power output estimation

    Energy Technology Data Exchange (ETDEWEB)

    Carta, Jose A. [Department of Mechanical Engineering, University of Las Palmas de Gran Canaria, Campus de Tafira s/n, 35017 Las Palmas de Gran Canaria, Canary Islands (Spain); Ramirez, Penelope; Velazquez, Sergio [Department of Renewable Energies, Technological Institute of the Canary Islands, Pozo Izquierdo Beach s/n, 35119 Santa Lucia, Gran Canaria, Canary Islands (Spain)

    2008-10-15

    Static methods which are based on statistical techniques to estimate the mean power output of a WECS (wind energy conversion system) have been widely employed in the scientific literature related to wind energy. In the static method which we use in this paper, for a given wind regime probability distribution function and a known WECS power curve, the mean power output of a WECS is obtained by resolving the integral, usually using numerical evaluation techniques, of the product of these two functions. In this paper an analysis is made of the influence of the level of fit between an empirical probability density function of a sample of wind speeds and the probability density function of the adjusted theoretical model on the relative error ε made in the estimation of the mean annual power output of a WECS. The mean power output calculated through the use of a quasi-dynamic or chronological method, that is to say using time-series of wind speed data and the power versus wind speed characteristic of the wind turbine, serves as the reference. The suitability of the distributions is judged from the adjusted R² statistic (Rₐ²). Hourly mean wind speeds recorded at 16 weather stations located in the Canarian Archipelago, an extensive catalogue of wind-speed probability models and two wind turbines of 330 and 800 kW rated power are used in this paper. Among the general conclusions obtained, the following can be pointed out: (a) that the Rₐ² statistic might be useful as an initial gross indicator of the relative error made in the mean annual power output estimation of a WECS when a probabilistic method is employed; (b) the relative errors tend to decrease, in accordance with a trend line defined by a second-order polynomial, as Rₐ² increases. (author)
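
    The static method described, integrating the product of a wind-speed probability density function and the turbine power curve, can be sketched as follows. The Weibull parameters and the idealized power curve are illustrative assumptions, not the paper's fitted models or its 330 and 800 kW turbines:

```python
import numpy as np

def weibull_pdf(v, k, c):
    """Weibull wind-speed density with shape k and scale c (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

def power_curve(v, cut_in=3.0, rated_v=13.0, cut_out=25.0, rated_p=800.0):
    """Idealized power curve (kW): cubic ramp between cut-in and rated speed,
    constant rated power up to cut-out, zero elsewhere. Parameters assumed."""
    p = np.where((v >= cut_in) & (v < rated_v),
                 rated_p * (v**3 - cut_in**3) / (rated_v**3 - cut_in**3), 0.0)
    return np.where((v >= rated_v) & (v <= cut_out), rated_p, p)

# Mean power = integral of P(v) * f(v) dv, evaluated numerically.
v = np.linspace(0.0, 30.0, 3001)
dv = v[1] - v[0]
mean_power = float((power_curve(v) * weibull_pdf(v, k=2.0, c=8.0)).sum() * dv)
print(f"{mean_power:.0f} kW")   # somewhere between 0 and the 800 kW rated power
```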

  20. Intraoperative Radiation Therapy for Locally Advanced and Recurrent Soft-Tissue Sarcomas in Adults

    International Nuclear Information System (INIS)

    Tran, Phuoc T.; Hara, Wendy; Su Zheng; Lin, H. Jill; Bendapudi, Pavan K.; Norton, Jeffrey; Teng, Nelson; King, Christopher R.; Kapp, Daniel S.

    2008-01-01

    Purpose: To analyze the outcomes of and identify prognostic factors for patients treated with surgery and intraoperative radiotherapy (IORT) for locally advanced and recurrent soft-tissue sarcoma in adults from a single institution. Methods and Materials: We retrospectively reviewed 50 consecutive patients treated with IORT to 62 sites of disease. Primary sites included retroperitoneum-pelvis (78%), extremity (8%), and other (14%). Most patients had recurrent disease, failing prior surgery (70%) and/or radiation (32%). Mean disease-free interval (DFI) before IORT was 1.9 years (range, 2 weeks-5.4 years). The IORT was delivered with orthovoltage X-rays using individually sized beveled cone applicators. Clinical characteristics were as follows: mean tumor size, 10 cm (range, 1-25 cm); high-grade histologic subtype (72%); and mean dose, 1,159 cGy (range, 600-1,600 cGy). Postoperative radiation or chemotherapy was administered to 37% of IORT sites and 32% of patients, respectively. Outcomes measured were infield control (IFC), locoregional control (LRC), distant metastasis-free survival (DMFS), disease-specific survival (DSS), and treatment-related complications. Mean and median follow-up of living patients were 59 and 35 months, respectively. Results: Kaplan-Meier 5-year IFC, LRC, DMFS, and DSS probabilities for the entire group were 55%, 26%, 51%, and 25%, respectively. Prognostic factors found to be significant (p < 0.05) on multivariate analysis were prior DFI and tumor size for LRC, extremity location and leiomyosarcoma histologic subtype for DMFS, and prior DFI for DSS. Our cohort had five Grade 3/4 complications associated with treatment, or a 5-year Kaplan-Meier Grade 3/4 complication-free survival rate of 85%. Conclusions: IORT after tumor reductive surgery is well tolerated and seems to confer IFC in carefully selected patients

  1. Birth interval and its predictors among married women in Dabat ...

    African Journals Online (AJOL)

    2008-12-30

    Birth intervals (the time between two successive live births), if short, are associated with diverse complications. We assessed birth interval and its predictors among 613 married women who gave birth from January 1 to December 30, 2008. Data were collected in April 2012. Life table and Kaplan-Meier curve were used to ...

  2. Pressure pulsation in Kaplan turbines: Prototype-CFD comparison

    International Nuclear Information System (INIS)

    Rivetti, A; Lucino, C; Liscia, S; Muguerza, D; Avellan, F

    2012-01-01

    Pressure pulsation phenomena in a large Kaplan turbine are investigated by means of numerical simulations (CFD) and prototype measurements in order to study the dynamic behavior of flow due to the blade passage and its interaction with other components of the turbine. Numerical simulations are performed with the commercial software Ansys CFX code, solving the incompressible Unsteady Reynolds-Averaged Navier-Stokes equations under a finite volume scheme. The computational domain involves the entire machine at prototype scale. Special care is taken in the discretization of the wicket gate overhang and runner blade gap. Prototype measurements are performed using pressure transducers at different locations between the wicket gate outlet and the draft tube inlet. CFD results are then compared with time signals from prototype measurements at identical locations to validate the numerical model. A detailed analysis focused on the tip gap flow and the pressure field at the discharge ring. From a rotating reference frame perspective, it is found that the mean pressure fluctuates in accordance with the wicket gate passage. Moreover, in the prototype measurements a pressure frequency revealing the presence of modulated cavitation at the discharge ring is distinguished, as also verified by the shape of the erosion patches, which is in concordance with the number of wicket gates.

  4. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis in determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using a logistic regression and stepwise variable selection was proposed, and the effects of the PSF on HEPs related with the soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and the operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitation of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
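
    Estimating PSF effects on HEPs with logistic regression, and reading the fitted coefficients as multiplicative effects on the odds, can be sketched on synthetic data. The PSF names, the simulation, and the Newton-Raphson fitter below are our assumptions, not the authors' data or code:

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Logistic regression by Newton-Raphson; X must include an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        H = X.T @ (X * W[:, None]) + 1e-9 * np.eye(X.shape[1])  # Hessian (ridged)
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

rng = np.random.default_rng(1)
n = 2000
poor_procedure = rng.integers(0, 2, n)   # hypothetical binary PSF: poor procedure quality
low_practice = rng.integers(0, 2, n)     # hypothetical binary PSF: low practice level
logit = -3.0 + 1.1 * poor_procedure + 0.7 * low_practice
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)  # simulated errors

X = np.column_stack([np.ones(n), poor_procedure, low_practice])
beta = fit_logistic(X, y)
print(np.exp(beta[1:]))   # odds multipliers per PSF, roughly e^1.1 and e^0.7
```

Exponentiating a coefficient gives the multiplicative factor by which that PSF scales the error odds, which is the "multiplicative form" the abstract mentions.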

  5. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.

  6. Socialization into a Civilization: The Dewey-Kaplan Synthesis in American Jewish Schooling in the Early 20th Century

    Science.gov (United States)

    Jacobs, Benjamin M.

    2009-01-01

    This historical study focuses on how John Dewey's theory of education as socialization and Mordecai Kaplan's theory of Judaism as a civilization together served as an ideological base and pedagogical framework for the creation of "progressive," "reconstructed" American Jewish school programs in the early 20th century…

  7. Improved Variable Window Kernel Estimates of Probability Densities

    OpenAIRE

    Hall, Peter; Hu, Tien Chung; Marron, J. S.

    1995-01-01

    Variable window width kernel density estimators, with the width varying proportionally to the square root of the density, have been thought to have superior asymptotic properties. The rate of convergence has been claimed to be as good as those typical for higher-order kernels, which makes the variable width estimators more attractive because no adjustment is needed to handle the negativity usually entailed by the latter. However, in a recent paper, Terrell and Scott show that these results ca...
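
    The variable-width estimator discussed, with bandwidth varying inversely with the square root of the density (Abramson's square-root law), can be sketched using a fixed-bandwidth pilot estimate. All names and parameter choices below are illustrative:

```python
import numpy as np

def variable_kde(x_eval, data, h0=None):
    """Abramson-style KDE: local bandwidth h_i = h0 * lambda_i with
    lambda_i proportional to pilot(data_i)**-0.5 (Gaussian kernels)."""
    data = np.asarray(data, float)
    n = len(data)
    if h0 is None:
        h0 = 1.06 * data.std() * n ** (-0.2)   # rule-of-thumb pilot bandwidth
    # Fixed-bandwidth pilot density at each data point.
    d = (data[:, None] - data[None, :]) / h0
    pilot = np.exp(-0.5 * d**2).sum(axis=1) / (n * h0 * np.sqrt(2 * np.pi))
    # Square-root law, normalized by the geometric mean of the pilot values.
    lam = (pilot / np.exp(np.mean(np.log(pilot)))) ** -0.5
    h = h0 * lam
    u = (x_eval[:, None] - data[None, :]) / h[None, :]
    return (np.exp(-0.5 * u**2) / (h * np.sqrt(2 * np.pi))).sum(axis=1) / n

rng = np.random.default_rng(2)
data = rng.normal(0.0, 1.0, 500)
grid = np.linspace(-5.0, 5.0, 1001)
dens = variable_kde(grid, data)
print(dens.sum() * (grid[1] - grid[0]))   # total mass, close to 1
```

Points in low-density regions get wider kernels, which is what yields the claimed bias reduction without the negativity of higher-order kernels.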

  8. Findings from a comprehensive diarrhoea prevention and treatment programme in Lusaka, Zambia

    Directory of Open Access Journals (Sweden)

    Samuel Bosomprah

    2016-06-01

    Full Text Available Abstract Background The Programme for the Awareness and Elimination of Diarrhoea (PAED) was a pilot comprehensive diarrhoea prevention and control programme aimed to reduce post-neonatal, all-cause under-five mortality by 15 % in Lusaka Province. Interventions included introduction of the rotavirus vaccine, improved clinical case management of diarrhoea, and a comprehensive community prevention and advocacy campaign on hand washing with soap, exclusive breastfeeding up to 6 months of age, and the use of ORS and zinc. This study aimed to assess the impact of PAED on under-5 mortality. Methods The study was a pre-post evaluation design. The Demographic and Health Survey style population-based two-stage approach was used to collect data at the beginning of the intervention and 3 years following the start of intervention implementation in Lusaka Province. The primary outcome of interest was the all-cause, post-neonatal under-five mortality rate, defined as the probability of dying after the 28th day and before the fifth birthday among children aged 1–59 months. Kaplan-Meier time-to-event analysis was used to estimate the probability of death; this probability was multiplied by 1000 to yield the post-neonatal mortality rate. A survival-time inverse probability weighting model was used to estimate the Average Treatment Effect (ATE). Results The percentage of children under age 5 who had diarrhoea in the 2 weeks preceding the survey declined from 15.8 % (95 % CI: 15.2 %, 16.4 %) in 2012 to 12.7 % (95 % CI: 12.3 %, 13.2 %) in 2015. Over the same period, mortality in post-neonatal children under 5 years of age declined by 34 %, from an estimated rate of 29 deaths per 1000 live births (95 % CI: 26, 32) to 19 deaths per 1000 live births (95 % CI: 16, 21). When every child in the population of children aged 1–59 months is exposed to the intervention, the average time-to-death was estimated to
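
    The Kaplan-Meier (product-limit) estimate used here, and the conversion of a death probability into a rate per 1000, can be sketched with toy data; the follow-up times below are invented for illustration:

```python
def kaplan_meier(times, events):
    """Product-limit estimator. events[i] = 1 for death, 0 for censoring.
    Returns a list of (event_time, survival probability S(t)) pairs."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, s, out = n, 1.0, []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = removed = 0
        while i < n and times[order[i]] == t:   # handle tied times
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk         # multiply successive survival factors
            out.append((t, s))
        at_risk -= removed
    return out

# Toy follow-up data in months: (time, 1 = death / 0 = censored), illustrative only.
data = [(2, 1), (5, 0), (9, 1), (12, 0), (14, 1), (20, 0), (24, 0), (24, 0)]
curve = kaplan_meier([t for t, _ in data], [e for _, e in data])
prob_death = 1.0 - curve[-1][1]
print(curve)
print(f"deaths per 1000: {1000 * prob_death:.0f}")   # → deaths per 1000: 453
```

Each factor (1 - deaths/at_risk) is the conditional survival probability at one event time; multiplying them gives the cumulative survival, exactly as described in the head of this page.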

  9. All-cause mortality and multimorbidity in older adults: The role of social support and loneliness.

    Science.gov (United States)

    Olaya, Beatriz; Domènech-Abella, Joan; Moneta, Maria Victoria; Lara, Elvira; Caballero, Francisco Félix; Rico-Uribe, Laura Alejandra; Haro, Josep Maria

    2017-12-01

    To determine whether the effect of multimorbidity on time to mortality is modified by level of social support and loneliness in a representative sample of 2113 participants aged 60+. Vital status was ascertained through national registers or by asking participants' relatives. Baseline variables included number of illnesses, self-perceived social support (Oslo social support scale) and loneliness (UCLA loneliness scale). Kaplan-Meier survival curves were used to estimate the time to death by multimorbidity, social support and loneliness. Adjusted Cox proportional hazards regression models were conducted to explore interactions between multimorbidity and social support and loneliness. Multimorbidity was associated with a low probability of survival, whereas high loneliness and low social support were not related with time to death. Only the multimorbidity × social support interaction was significant. Participants with low social support and 2 chronic diseases, compared with none, presented a lower probability of survival (HR = 2.43, 95% CI = 1.14-5.18, p < 0.05) than those with high social support. For participants with low social support, there were no differences between having one, two or more than two diseases. When there is high social support, the probability of death is significantly lower if one or two chronic diseases are present, compared with more than two. These findings indicate that having a supportive social environment increases the survival of people with physical illnesses, especially those with one or two. For those with more than two illnesses, survival remains unchanged regardless of the level of social support, and other protective factors should be explored in future research. Geriatric health professionals are encouraged to evaluate social relationships and stimulate support given by relatives, friends or neighbors. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Adaptive estimation of binomial probabilities under misclassification

    NARCIS (Netherlands)

    Albers, Willem/Wim; Veldman, H.J.

    1984-01-01

    If misclassification occurs the standard binomial estimator is usually seriously biased. It is known that an improvement can be achieved by using more than one observer in classifying the sample elements. Here it will be investigated which number of observers is optimal given the total number of
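
    The bias described arises because the observed positive rate mixes true positives and false positives. When the misclassification rates are known (the paper's adaptive setting instead estimates them using additional observers), the standard correction inverts that mixture; a minimal sketch with illustrative rates:

```python
# Under misclassification, the observed positive rate is
#   p_obs = p * (1 - beta) + (1 - p) * alpha,
# with alpha = P(false positive) and beta = P(false negative), so the naive
# binomial estimator targets p_obs rather than p. With alpha and beta known:
def corrected_estimate(p_obs, alpha, beta):
    return (p_obs - alpha) / (1.0 - alpha - beta)

p_true, alpha, beta = 0.30, 0.05, 0.10                 # illustrative values
p_obs = p_true * (1 - beta) + (1 - p_true) * alpha     # 0.305: what naive estimation sees
print(corrected_estimate(p_obs, alpha, beta))          # recovers p_true (0.30)
```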

  11. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
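
    Under the exponential (memoryless) interval model used here, the probability of an eruption within the next year follows from the mean repose time alone; a sketch with an illustrative mean interval, not the paper's fitted parameter:

```python
import math

def annual_probability(mean_interval_years):
    """P(at least one event in the next year) under an exponential
    (memoryless) interval model with the given mean repose time."""
    return 1.0 - math.exp(-1.0 / mean_interval_years)

# Illustrative: a mean repose time of about 7000 years gives an annual
# probability close to 1/7000.
p = annual_probability(7000.0)
print(f"{p:.2e}")   # ~1.43e-04
```

For rare events the result is essentially the rate 1/mean, which is why such annual probabilities scale directly with the fitted mean recurrence interval.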

  12. Efficacy, outcomes, and cost-effectiveness of desensitization using IVIG and rituximab.

    Science.gov (United States)

    Vo, Ashley A; Petrozzino, Jeffrey; Yeung, Kai; Sinha, Aditi; Kahwaji, Joseph; Peng, Alice; Villicana, Rafael; Mackowiak, John; Jordan, Stanley C

    2013-03-27

    Transplantation rates are very low for the broadly sensitized patient (panel reactive antibody [PRA]>80%; HS). Here, we examine the efficacy, outcomes, and cost-effectiveness of desensitization using high-dose intravenous immunoglobulin (IVIG) and rituximab to improve transplantation rates in HS patients. From July 2006 to December 2011, 207 HS (56 living donors/151 deceased donors) patients (donor-specific antibody positive, PRA>80%) were desensitized using IVIG and rituximab. After desensitization, responsive patients proceeded to transplantation with an acceptable crossmatch. Cost and outcomes of desensitization were compared with dialysis. Of the 207 treated patients, 146 (71%) were transplanted. At 48 months, patient and graft survival by Kaplan-Meier were 95% and 87.5%, respectively. The total 3-year cost for patients treated in the desensitization arm was $219,914 per patient compared with $238,667 per patient treated in the dialysis arm. Thus, each patient treated with desensitization is estimated to save the U.S. healthcare system $18,753 in 2011 USD. Overall, estimated patient survival at the end of 3 years was 96.6% for patients in the desensitization arm of the model (based on Cedars-Sinai survival rate) compared with 79.0% for an age, end-stage renal disease etiology, and PRA matched group of patients remaining on dialysis during the study period. We conclude that desensitization with IVIG+rituximab is clinically and cost-effective, with both financial savings and an estimated 17.6% greater probability of 3-year survival associated with desensitization versus dialysis alone. However, the benefits of desensitization and transplantation are limited by organ availability and allocation policies.
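
    The quoted saving and survival benefit follow directly from the per-arm figures in the abstract; a quick arithmetic check:

```python
# Three-year per-patient figures quoted in the abstract (2011 USD).
cost_desensitization = 219_914
cost_dialysis = 238_667
print(cost_dialysis - cost_desensitization)            # 18753, the quoted saving

survival_desens = 96.6                                  # estimated 3-year survival, %
survival_dialysis = 79.0
print(round(survival_desens - survival_dialysis, 1))    # 17.6 percentage points
```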

  13. Role of IKK-alpha in the EGFR Signaling Regulation

    Science.gov (United States)

    2014-09-01

    (Figure legend excerpt) (E) Kaplan-Meier overall survival curves of IKK-alpha in a breast cancer patient data set.

  14. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  15. Protective Effect of Hydroxychloroquine on Renal Damage in Patients with Lupus Nephritis: Data from LUMINA, a Multiethnic U.S. Cohort

    Science.gov (United States)

    Pons-Estel, Guillermo J.; Alarcón, Graciela S.; McGwin, Gerald; Danila, Maria I.; Zhang, Jie; Bastian, Holly M.; Reveille, John D.; Vilá, Luis M.

    2010-01-01

    Objective To assess if hydroxychloroquine can delay the development of renal damage in lupus nephritis patients. Methods Lupus nephritis patients (n=256) from LUMINA (n=635), a multiethnic cohort of African Americans, Hispanics and Caucasians, age ≥16 years, disease duration ≤5 years at baseline (T0), were studied. Renal damage was defined per the SLICC Damage Index (≥1 of the following lasting at least six months: a reduced estimated/measured glomerular filtration rate, proteinuria, or end-stage renal disease). The association between hydroxychloroquine use and renal damage (as defined, or omitting proteinuria) was estimated using Cox proportional regression analyses adjusting for potential confounders. Kaplan-Meier survival curves based on hydroxychloroquine intake or World Health Organization (WHO) Class glomerulonephritis were also derived. Results Sixty-three (31.0%) of 203 patients developed renal damage over a mean (standard deviation) disease duration of 5.2 (3.5) years. The most frequent renal damage domain item was proteinuria. Hydroxychloroquine takers (79.3%) exhibited a lower frequency of WHO Class IV glomerulonephritis, lower disease activity and received lower glucocorticoid doses than non-takers. After adjusting for confounders, hydroxychloroquine was protective against the occurrence of renal damage in full (HR=0.12; 95% CI 0.02-0.97; p=0.0464) and reduced (HR=0.29; 95% CI 0.13-0.68; p=0.0043) models. Omitting proteinuria provided comparable results. The cumulative probability of renal damage occurrence was higher in hydroxychloroquine non-takers and in WHO Class IV glomerulonephritis. The protective effect of hydroxychloroquine in retarding the occurrence of renal damage in SLE is still evident. PMID:19479701

  16. Duration of Psoriatic Skin Disease as Risk Factor for Subsequent Onset of Psoriatic Arthritis

    DEFF Research Database (Denmark)

    Egeberg, Alexander; Skov, Lone; Zachariae, Claus

    2018-01-01

    It is unclear whether psoriasis is a progressive disease that requires early aggressive intervention. This population-based study identified patients with psoriasis and psoriatic arthritis (PsA). Survival analysis and Kaplan-Meier life table techniques were used. The study comprised 10,011 psoriasis patients (severe n = 4,618), and 1,269 patients also had PsA. Incidence of PsA increased with duration of cutaneous symptoms (p = 0.0001). Psoriasis diagnosed before age 20 or 30 years, respectively, suggested a lower risk of PsA than psoriasis diagnosed after age 50 years, yet age at first cutaneous symptoms did not predict development of PsA. No clear association with disease severity was found. PsA incidence appeared stable with longer duration of psoriasis, but further data are needed to firmly establish the relationship with age of psoriasis onset.

  17. Maternal smoking during pregnancy and the risk of pediatric cardiovascular diseases of the offspring: A population-based cohort study with up to 18-years of follow up.

    Science.gov (United States)

    Leybovitz-Haleluya, Noa; Wainstock, Tamar; Landau, Daniella; Sheiner, Eyal

    2018-06-01

    Cigarette smoke is a well-known reproductive toxicant. We aimed to study the long-term effect of cigarette smoking during pregnancy on the risk of childhood cardiovascular morbidity in the offspring. A population-based cohort analysis was performed comparing total and subtypes of cardiovascular-related pediatric hospitalizations among offspring of smoking mothers versus offspring of non-smoking mothers. The analysis included all singletons born between the years 1999-2014. A Kaplan-Meier survival curve was used to compare the cumulative cardiovascular morbidity, and a Cox proportional hazards model was constructed to adjust for confounders. The study population included 242,342 newborns that met the inclusion criteria; among them, 2861 were born to smoking mothers. Offspring of smoking mothers had higher rates of cardiovascular-related hospitalizations (1.3% vs. 0.6%, OR 2.1, 95% CI 1.5-2.9; p < 0.001; Kaplan-Meier log-rank test p < 0.001). Smoking exposure during pregnancy is associated with an increased risk of long-term pediatric cardiovascular morbidity in the offspring. Copyright © 2018 Elsevier Inc. All rights reserved.
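A Kaplan-Meier log-rank comparison like the one reported here can be sketched with a bare two-sample log-rank chi-square statistic. This is a generic illustration on made-up data, not the study's actual analysis:

```python
from itertools import chain

def logrank(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic for right-censored data.

    At each distinct event time, compare the observed events in group 1
    with the number expected under the null of identical hazards, then
    form (sum of O - E)^2 / (sum of hypergeometric variances).
    """
    event_times = sorted({t for t, e in chain(zip(times1, events1),
                                              zip(times2, events2)) if e})
    o_minus_e = var = 0.0
    for t in event_times:
        n1 = sum(1 for x in times1 if x >= t)   # at risk in group 1
        n2 = sum(1 for x in times2 if x >= t)   # at risk in group 2
        d1 = sum(1 for x, e in zip(times1, events1) if x == t and e)
        d2 = sum(1 for x, e in zip(times2, events2) if x == t and e)
        n, d = n1 + n2, d1 + d2
        o_minus_e += d1 - n1 * d / n
        if n > 1:
            var += n1 * n2 * d * (n - d) / (n * n * (n - 1))
    return o_minus_e ** 2 / var
```

Two identical samples give a statistic of 0; clearly separated event times give a large statistic, which would then be referred to a chi-square distribution with 1 degree of freedom for the p-value.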

  18. Estimation of Lifetime Duration for a Lever Pin of Runner Blade Operating Mechanism using a Graphic – analytic Method

    Directory of Open Access Journals (Sweden)

    Ana-Maria Budai

    2015-09-01

    Full Text Available This paper presents a graphic-analytic method that can be used to estimate the fatigue lifetime of an operating mechanism lever pin of a Kaplan turbine. The presented calculation algorithm is adapted from the one used by Fuji Electric to perform strength calculations for the refurbishment of a Romanian hydropower plant equipped with a Kaplan turbine. The graphic part includes a 3D fatigue diagram for rotating bending stress designed by Fuji Electric specialists.

  19. Exposure-response relationship of regorafenib efficacy in patients with hepatocellular carcinoma.

    Science.gov (United States)

    Solms, Alexander; Reinecke, Isabel; Fiala-Buskies, Sabine; Keunecke, Anne; Drenth, Henk-Jan; Bruix, Jordi; Meinhardt, Gerold; Cleton, Adriaan; Ploeger, Bart

    2017-11-15

    To explore the relationship between regorafenib exposure and efficacy in patients with hepatocellular carcinoma (HCC) who had disease progression during sorafenib treatment (RESORCE). Exposure-response (ER) analyses for regorafenib were performed using data from a phase 3, randomized, placebo-controlled trial (RESORCE). Patients received 160 mg regorafenib or placebo once daily (3 weeks on/1 week off in a 4-week cycle) with best supportive care until disease progression, death, or unacceptable toxicity. Kaplan-Meier analyses for overall survival (OS) and time-to-progression (TTP) were performed in which regorafenib-treated patients were grouped into four categories according to their estimated average exposure over 4 weeks in cycle 1. While this analysis primarily focused on efficacy, a potential correlation between exposure and treatment-emergent adverse events (TEAEs) was also evaluated. If any differences were observed between Kaplan-Meier plots, the ER analysis continued with a multivariate Cox regression analysis to evaluate the correlation between exposure quartile categories and the efficacy and safety parameters while taking into consideration the effect of the predefined clinically relevant demographic and baseline covariates. The functional form of the ER relationship within the regorafenib treatment group was subsequently evaluated. Based on visual assessment of the Kaplan-Meier plots, no meaningful relationship between the exposure categories and TEAEs was observed, although median OS and TTP tended to be longer in the higher exposure categories. Further ER analyses, which considered the effects of predefined covariates and the different shapes of the ER relationship, focused on efficacy. The baseline risk factors Eastern Cooperative Oncology Group (ECOG) performance status ≥1, alpha-fetoprotein levels ≥400 ng/ml, and aspartate transaminase or alanine transaminase levels >3× upper limit of normal were significantly associated with OS (Pregorafenib
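Grouping patients into four exposure categories for Kaplan-Meier analysis, as described in this record, amounts to a rank-based quartile binning. A hypothetical helper (illustrative values only, not the trial's exposure data):

```python
def quartile_labels(values):
    """Assign each value an exposure quartile label 0-3 by rank.

    Ties in rank order fall where the sort places them; the lowest
    quarter of values gets 0, the highest quarter gets 3.
    """
    order = sorted(range(len(values)), key=values.__getitem__)
    labels = [0] * len(values)
    for rank, idx in enumerate(order):
        labels[idx] = min(3, rank * 4 // len(values))
    return labels
```

Each label can then index a separate Kaplan-Meier curve, one per exposure quartile, for visual comparison before any Cox modeling.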

  20. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P_c) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D P_c” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D P_c” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R_c. For close-proximity satellites, such as those orbiting in formations or clusters, R_c variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R_c analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D P_c” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P_c < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P_c ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P_c screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
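The traditional “2D P_c” approximation discussed in this record reduces to integrating a bivariate Gaussian (centered on the miss distance) over a disc of combined hard-body radius in the encounter plane. A minimal numerical sketch, assuming an uncorrelated diagonal covariance for simplicity and using a midpoint-rule grid (not the CARA implementation):

```python
import math

def pc_2d(miss_x, miss_y, sigma_x, sigma_y, radius, n=400):
    """Encounter-plane collision probability: integrate a 2D Gaussian
    density over the disc x^2 + y^2 <= radius^2 on an n-by-n midpoint grid.

    (miss_x, miss_y): relative miss distance in the encounter plane
    (sigma_x, sigma_y): position uncertainty standard deviations
    radius: combined hard-body radius of the two objects
    """
    h = 2 * radius / n
    total = 0.0
    for i in range(n):
        x = -radius + (i + 0.5) * h
        for j in range(n):
            y = -radius + (j + 0.5) * h
            if x * x + y * y <= radius * radius:
                gx = math.exp(-0.5 * ((x - miss_x) / sigma_x) ** 2)
                gy = math.exp(-0.5 * ((y - miss_y) / sigma_y) ** 2)
                total += gx * gy
    return total * h * h / (2 * math.pi * sigma_x * sigma_y)
```

When the uncertainty is much larger than the hard-body radius, the density is nearly constant over the disc, so P_c is approximately (disc area) × (density at the miss point), which provides an easy sanity check.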