WorldWideScience

Sample records for survival analysis techniques

  1. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.

  2. Survival analysis

    International Nuclear Information System (INIS)

    Badwe, R.A.

    1999-01-01

    The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model.
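
    As a minimal illustration of the Kaplan-Meier estimation described in this record (not the authors' own analysis), the R survival package can be used as sketched below; the data frame and variable names are hypothetical:

      # Sketch: Kaplan-Meier estimate of a survival function (hypothetical data)
      library(survival)
      d <- data.frame(
        time   = c(5, 8, 12, 12, 15, 21, 30, 42),   # follow-up time (e.g., months)
        status = c(1, 0, 1, 1, 0, 1, 0, 1)          # 1 = event (recurrence/death), 0 = censored
      )
      km <- survfit(Surv(time, status) ~ 1, data = d)
      summary(km)                                            # survival proportion at each event time
      plot(km, xlab = "Months", ylab = "Proportion surviving")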

  3. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enabling…

  4. Use of a Survival Analysis Technique in Understanding Game Performance in Instructional Games. CRESST Report 812

    Science.gov (United States)

    Kim, Jinok; Chung, Gregory K. W. K.

    2012-01-01

    In this study we compared the effects of two math game designs on math and game performance, using discrete-time survival analysis (DTSA) to model players' risk of not advancing to the next level in the game. 137 students were randomly assigned to two game conditions. The game covered the concept of a unit and the addition of like-sized fractional…
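
    Discrete-time survival analysis of this kind is commonly fitted as a person-period logistic regression; the sketch below is a generic illustration under that assumption, not the authors' actual model, and all variable names and values are hypothetical:

      # Sketch: discrete-time survival via a person-period logistic model (hypothetical data)
      # One row per student per game level attempted; fail = 1 if the student did not advance.
      pp <- data.frame(
        student   = c(1,1,1, 2,2, 3,3,3, 4, 5,5,5, 6,6),
        level     = c(1,2,3, 1,2, 1,2,3, 1, 1,2,3, 1,2),
        condition = c("A","A","A", "B","B", "A","A","A", "B", "B","B","B", "A","A"),
        fail      = c(0,0,1, 0,1, 0,0,0, 1, 0,0,1, 0,0)
      )
      dtsa <- glm(fail ~ factor(level) + condition, family = binomial, data = pp)
      summary(dtsa)   # exp(coef) gives odds ratios for not advancing, by level and condition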

  5. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...

  6. Development and validation of technique for in-vivo 3D analysis of cranial bone graft survival

    Science.gov (United States)

    Bernstein, Mark P.; Caldwell, Curtis B.; Antonyshyn, Oleh M.; Ma, Karen; Cooper, Perry W.; Ehrlich, Lisa E.

    1997-05-01

    Bone autografts are routinely employed in the reconstruction of facial deformities resulting from trauma, tumor ablation or congenital malformations. The combined use of post-operative 3D CT and SPECT imaging provides a means for quantitative in vivo evaluation of bone graft volume and osteoblastic activity. The specific objectives of this study were: (1) Determine the reliability and accuracy of interactive computer-assisted analysis of bone graft volumes based on 3D CT scans; (2) Determine the error in CT/SPECT multimodality image registration; (3) Determine the error in SPECT/SPECT image registration; and (4) Determine the reliability and accuracy of CT-guided SPECT uptake measurements in cranial bone grafts. Five human cadaver heads served as anthropomorphic models for all experiments. Four cranial defects were created in each specimen with inlay and onlay split skull bone grafts and reconstructed to skull and malar recipient sites. To acquire all images, each specimen was CT scanned and coated with Technetium-doped paint. For purposes of validation, skulls were landmarked with 1/16-inch ball-bearings and Indium. This study provides a new technique relating anatomy and physiology for the analysis of cranial bone graft survival.

  7. Biostatistics series module 9: Survival analysis

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2017-01-01

    Survival analysis is concerned with “time to event” data. Conventionally, it dealt with cancer death as the event in question, but it can handle any event occurring over a time frame, and this need not always be adverse in nature. When the outcome of a study is the time to an event, it is often not possible to wait until the event in question has happened to all the subjects, for example, until all are dead. In addition, subjects may leave the study prematurely. Such situations lead to what are called censored observations, as complete information is not available for these subjects. The data set is thus an assemblage of times to the event in question and times after which no more information on the individual is available. Survival analysis methods are the only techniques capable of handling censored observations without treating them as missing data. They also make no assumption regarding normal distribution of time to event data. Descriptive methods for exploring survival times in a sample include life table and Kaplan–Meier techniques as well as various kinds of distribution fitting as advanced modeling techniques. The Kaplan–Meier cumulative survival probability over time plot has become the signature plot for biomedical survival analysis. Several techniques are available for comparing the survival experience in two or more groups – the log-rank test is popularly used. This test can also be used to produce an odds ratio as an estimate of risk of the event in the test group; this is called the hazard ratio (HR). Limitations of the traditional log-rank test have led to various modifications and enhancements. Finally, survival analysis offers different regression models for estimating the impact of multiple predictors on survival. Cox's proportional hazards model is the most general of the regression methods, allowing the hazard function to be modeled on a set of explanatory variables without making restrictive assumptions concerning the…
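
    A hedged R sketch of the group comparison outlined above (log-rank test plus a hazard ratio from a Cox model), using the survival package and purely hypothetical data:

      # Sketch: comparing two groups with the log-rank test and a hazard ratio (hypothetical data)
      library(survival)
      d <- data.frame(
        time   = c(6, 7, 10, 15, 19, 25, 8, 12, 20, 23, 33, 40),
        status = c(1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 1, 0),
        group  = rep(c("control", "test"), each = 6)
      )
      survdiff(Surv(time, status) ~ group, data = d)   # log-rank test
      fit <- coxph(Surv(time, status) ~ group, data = d)
      exp(coef(fit))                                   # hazard ratio, test vs. control
      exp(confint(fit))                                # 95% confidence interval for the HR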

  8. Home visit program improves technique survival in peritoneal dialysis.

    Science.gov (United States)

    Martino, Francesca; Adıbelli, Z; Mason, G; Nayak, A; Ariyanon, W; Rettore, E; Crepaldi, Carlo; Rodighiero, Mariapia; Ronco, Claudio

    2014-01-01

    Peritoneal dialysis (PD) is a home therapy, and technique survival is related to the adherence to PD prescription at home. The presence of a home visit program could improve PD outcomes. We evaluated its effects on clinical outcome during 1 year of follow-up. This was a case-control study. The case group included all 96 patients who performed PD in our center on January 1, 2013, and who attended a home visit program; the control group included all 92 patients who performed PD on January 1, 2008. The home visit program consisted of several additional visits to reinforce patients' confidence in PD management in their own environment. Outcomes were defined as technique failure, peritonitis episode, and hospitalization. Clinical and dialysis features were evaluated for each patient. The case group was significantly older (p = 0.048), with a lower grade of autonomy (p = 0.033), but a better hemoglobin level (p = 0.02) than the control group. During the observational period, we had 11 episodes of technique failure. We found a significant reduction in the rate of technique failure in the case group (p = 0.004). Furthermore, survival analysis showed a significant extension of PD treatment in the patients supported by the home visit program (52 vs. 48.8 weeks, p = 0.018). We did not find any difference between the two groups in terms of peritonitis and hospitalization rate; however, trends toward a reduction of Gram-positive peritonitis rates as well as prevalence and duration of hospitalization related to PD problems were identified in the case group. The retrospective nature of the analysis was a limitation of this study. The home visit program improves the survival of PD patients and could reduce the rate of Gram-positive peritonitis and hospitalization. Video Journal Club "Cappuccino with Claudio Ronco" at http://www.karger.com/?doi=365168.

  9. Mid-term survival analysis of closed wedge high tibial osteotomy: A comparative study of computer-assisted and conventional techniques.

    Science.gov (United States)

    Bae, Dae Kyung; Song, Sang Jun; Kim, Kang Il; Hur, Dong; Jeong, Ho Yeon

    2016-03-01

    The purpose of the present study was to compare the clinical and radiographic results and survival rates between computer-assisted and conventional closing wedge high tibial osteotomies (HTOs). Data from a consecutive cohort comprised of 75 computer-assisted HTOs and 75 conventional HTOs were retrospectively reviewed. The Knee Society knee and function scores, Hospital for Special Surgery (HSS) score and femorotibial angle (FTA) were compared between the two groups. Survival rates were also compared, with procedure failure as the endpoint. The knee and function scores at one year postoperatively were slightly better in the computer-assisted group than in the conventional group (90.1 vs. 86.1 and 82.0 vs. 76.0, respectively). The HSS scores at one year postoperatively were slightly better for the computer-assisted HTOs than for the conventional HTOs (89.5 vs. 81.8). The proportion of FTA inliers was higher in the computer-assisted group than in the conventional HTO group (88.0% vs. 58.7%), and the mean postoperative FTA was greater in the computer-assisted group than in the conventional HTO group (valgus 9.0° vs. valgus 7.6°). The clinical and radiographic results were better in the computer-assisted group than in the conventional HTO group. Mid-term survival rates did not differ between computer-assisted and conventional HTOs. A comparative analysis of longer-term survival rates is required to demonstrate the long-term benefit of computer-assisted HTO. Level of evidence: III. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Demisability and survivability sensitivity to design-for-demise techniques

    Science.gov (United States)

    Trisolini, Mirko; Lewis, Hugh G.; Colombo, Camilla

    2018-04-01

    The paper is concerned with examining the effects that design-for-demise solutions can have not only on the demisability of components, but also on their survivability, that is, their capability to withstand impacts from space debris. First, two models are introduced: a demisability model to predict the behaviour of spacecraft components during atmospheric re-entry, and a survivability model to assess the vulnerability of spacecraft structures against space debris impacts. Two indices that evaluate the level of demisability and survivability are also proposed. The two models are then used to study the sensitivity of the demisability and survivability indices as a function of typical design-for-demise options. The demisability and the survivability can in fact be influenced by the same design parameters in a competing fashion; that is, while the demisability is improved, the survivability is worsened, and vice versa. The analysis shows how the design-for-demise solutions influence the demisability and the survivability independently. In addition, the effect that a solution has simultaneously on the two criteria is assessed. The results show which of the design-for-demise parameters most influence the demisability and the survivability. For these design parameters, maps are presented describing their influence on the demisability and survivability indices. These maps represent a useful tool to quickly assess the level of demisability and survivability that can be expected from a component when specific design parameters are changed.

  11. Survival analysis II: Cox regression

    NARCIS (Netherlands)

    Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.

    2011-01-01

    In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the…
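
    For illustration only, a minimal Cox regression with adjustment for other variables can be written as below; it uses the example lung dataset bundled with the R survival package rather than any data from this article:

      # Sketch: Cox proportional hazards regression with adjustment for confounders
      library(survival)
      # 'lung' ships with the survival package: time, status (1 = censored, 2 = dead), age, sex, ph.ecog, ...
      fit <- coxph(Surv(time, status) ~ sex + age + ph.ecog, data = lung)
      summary(fit)   # exp(coef) = hazard ratio for sex, adjusted for age and ECOG performance score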

  12. Multivariate survival analysis and competing risks

    CERN Document Server

    Crowder, Martin J

    2012-01-01

    Multivariate Survival Analysis and Competing Risks introduces univariate survival analysis and extends it to the multivariate case. It covers competing risks and counting processes and provides many real-world examples, exercises, and R code. The text discusses survival data, survival distributions, frailty models, parametric methods, multivariate data and distributions, copulas, continuous failure, parametric likelihood inference, and non- and semi-parametric methods. There are many books covering survival analysis, but very few that cover the multivariate case in any depth. Written for a graduate-level audience in statistics/biostatistics, this book includes practical exercises and R code for the examples. The author is renowned for his clear writing style, and this book continues that trend. It is an excellent reference for graduate students and researchers looking for grounding in this burgeoning field of research.

  13. Deriving stable multi-parametric MRI radiomic signatures in the presence of inter-scanner variations: survival prediction of glioblastoma via imaging pattern analysis and machine learning techniques

    Science.gov (United States)

    Rathore, Saima; Bakas, Spyridon; Akbari, Hamed; Shukla, Gaurav; Rozycki, Martin; Davatzikos, Christos

    2018-02-01

    There is mounting evidence that assessment of multi-parametric magnetic resonance imaging (mpMRI) profiles can noninvasively predict survival in many cancers, including glioblastoma. The clinical adoption of mpMRI as a prognostic biomarker, however, depends on its applicability in a multicenter setting, which is hampered by inter-scanner variations. This concept has not been addressed in existing studies. We developed a comprehensive set of within-patient normalized tumor features such as intensity profile, shape, volume, and tumor location, extracted from multicenter mpMRI of two large cohorts (n_patients = 353), comprising the Hospital of the University of Pennsylvania (HUP: n_patients = 252, n_scanners = 3) and The Cancer Imaging Archive (TCIA: n_patients = 101, n_scanners = 8). Inter-scanner harmonization was conducted by normalizing the tumor intensity profile with that of the contralateral healthy tissue. The extracted features were integrated by support vector machines to derive survival predictors. The predictors' generalizability was evaluated within each cohort by two cross-validation configurations: i) pooled/scanner-agnostic, and ii) across scanners (training in multiple scanners and testing in one). The median survival in each configuration was used as a cut-off to divide patients into long- and short-survivors. Accuracy (ACC) for predicting long- versus short-survivors in these configurations was ACC_pooled = 79.06% and ACC_pooled = 84.7%, and ACC_across = 73.55% and ACC_across = 74.76%, in the HUP and TCIA datasets, respectively. The hazard ratio at the 95% confidence interval was 3.87 (2.87-5.20, P < 0.001) and 6.65 (3.57-12.36, P < 0.001) for the HUP and TCIA datasets, respectively. Our findings suggest that adequate data normalization coupled with machine learning classification allows robust prediction of survival estimates on mpMRI acquired by multiple scanners.

  14. Additive interaction in survival analysis

    DEFF Research Database (Denmark)

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise

    2012-01-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects is the relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures… an empirical example of interaction between education and smoking on risk of lung cancer. We argue that deviations from additivity of effects are important for public health interventions and clinical decision-making, and such estimations should be encouraged in prospective studies on health. A detailed…
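
    One commonly used surrogate measure of additive interaction derived from a multiplicative (Cox) model is the relative excess risk due to interaction (RERI); a rough R sketch with hypothetical hazard ratios, not estimates from the study:

      # Sketch: relative excess risk due to interaction (RERI) from Cox hazard ratios
      # HR10: exposure A only; HR01: exposure B only; HR11: both exposures (reference = neither)
      HR10 <- 1.8; HR01 <- 2.5; HR11 <- 6.0   # hypothetical estimates
      RERI <- HR11 - HR10 - HR01 + 1          # > 0 suggests super-additive (synergistic) interaction
      RERI                                    # 2.7 in this hypothetical example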

  15. Radiation therapy for early stage seminoma of testis. Analysis of survival and gastrointestinal toxicity in patients treated with modern megavoltage techniques over 10 years

    International Nuclear Information System (INIS)

    Yeoh, E.; O'Brein, P.C.; Razali, M.

    1993-01-01

    Seventy-seven patients were treated with megavoltage irradiation to the paraaortic and/or pelvic nodal areas for stage I and non-bulky (…) stage II seminoma of the testis. … Analysis with respect to age (≤ 34 vs. > 34 years), stage (I vs. II) and dose of radiation (≤ 30 Gy vs. > 30 Gy) showed none of these variables to have a significant influence on overall survival or on the incidence of late complications. These findings are discussed in the light of recent studies of a surveillance policy following orchidectomy for stage I seminoma of the testis. Given that gastrointestinal toxicity is the major toxicity associated with the treatment of stage I patients, the data from this study should assist clinicians and their patients to arrive at an informed decision regarding adjuvant radiotherapy. 15 refs., 1 tab., 2 figs

  16. A Framework for RFID Survivability Requirement Analysis and Specification

    Science.gov (United States)

    Zuo, Yanjun; Pimple, Malvika; Lande, Suhas

    Many industries are becoming dependent on Radio Frequency Identification (RFID) technology for inventory management and asset tracking. The data collected about tagged objects through RFID are used in various high-level business operations. The RFID system should hence be highly available, reliable, dependable and secure. In addition, this system should be able to resist attacks and perform recovery in case of security incidents. Together these requirements give rise to the notion of a survivable RFID system. The main goal of this paper is to analyze and specify the requirements for an RFID system to become survivable. These requirements, if met, can assist the system in resisting devastating attacks and recovering quickly from damage. This paper proposes techniques and approaches for RFID survivability requirements analysis and specification. From the perspective of system acquisition and engineering, survivability requirements analysis is an important first step in survivability specification, compliance formulation, and proof verification.

  17. Understanding survival analysis: Kaplan-Meier estimate.

    Science.gov (United States)

    Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal

    2010-10-01

    The Kaplan-Meier estimate is one of the best options for measuring the fraction of subjects living for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects who survived or were saved after that intervention over a period of time. The time starting from a defined point to the occurrence of a given event, for example death, is called the survival time, and the analysis of such group data is called survival analysis. The analysis can be complicated by subjects who are uncooperative and refuse to remain in the study, by subjects who do not experience the event or die before the end of the study although they would have experienced it or died had observation continued, or by subjects with whom we lose touch midway through the study. We label these situations as censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time in spite of all these difficulties associated with subjects or situations. The survival curve can be created assuming various situations. It involves computing the probabilities of occurrence of the event at certain points in time and multiplying these successive probabilities by any earlier computed probabilities to obtain the final estimate. This can be calculated for two groups of subjects, as can the statistical difference in their survival. This can be used in Ayurveda research when comparing two drugs and looking at the survival of subjects.
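
    The successive multiplication of conditional probabilities described above can be written out directly. A simplified R sketch with hypothetical follow-up times (ties handled per observation, with the event listed before the censoring at the same time):

      # Sketch: the Kaplan-Meier product-limit estimate computed "by hand"
      time   <- c(3, 5, 5, 8, 10, 12)    # hypothetical follow-up times
      status <- c(1, 1, 0, 1, 0, 1)      # 1 = event, 0 = censored
      ord    <- order(time); time <- time[ord]; status <- status[ord]
      n.risk <- length(time) - seq_along(time) + 1        # subjects still at risk
      cond   <- ifelse(status == 1, 1 - 1 / n.risk, 1)    # conditional survival at each time
      surv   <- cumprod(cond)                             # running product = S(t)
      data.frame(time, status, n.risk, surv)              # matches survfit() for these data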

  18. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    … positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon…

  19. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such…

  20. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical…

  21. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: physical condition of the building system (PC), effect on asset (EA), effect on occupants (EO) and maintenance cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used in comparing the importance of the criteria against the main objective, whereas a structured proforma was used in quantifying the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and later multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning.
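
    The prioritization arithmetic described in this record (criteria weights multiplied by defect severity scores and summed per building system) can be sketched as a weighted sum in R; the weights, systems and scores below are hypothetical, not those of the study:

      # Sketch: multi-criteria prioritization by weighted sum (hypothetical weights and scores)
      weights  <- c(PC = 0.40, EA = 0.25, EO = 0.20, MC = 0.15)   # criteria weights
      severity <- rbind(                                          # defect severity per system and criterion
        electrical = c(0.9, 0.8, 0.9, 0.6),
        ceiling    = c(0.3, 0.2, 0.4, 0.3),
        roofing    = c(0.7, 0.6, 0.5, 0.5)
      )
      risk <- severity %*% weights                 # weighted risk value per building system
      risk[order(-risk), , drop = FALSE]           # rank systems by criticality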

  22. SURVIVAL ANALYSIS AND LENGTH-BIASED SAMPLING

    Directory of Open Access Journals (Sweden)

    Masoud Asgharian

    2010-12-01

    When survival data are collected as part of a prevalent cohort study, the recruited cases have already experienced their initiating event. These prevalent cases are then followed for a fixed period of time, at the end of which the subjects will either have failed or have been censored. When interest lies in estimating the survival distribution, from onset, of subjects with the disease, one must take into account that the survival times of the cases in a prevalent cohort study are left truncated. When it is possible to assume that there has not been any epidemic of the disease over the past period of time that covers the onset times of the subjects, one may assume that the underlying incidence process that generates the initiating event times is a stationary Poisson process. Under such an assumption, the survival times of the recruited subjects are called “length-biased”. I discuss the challenges one is faced with in analyzing this type of data. To address the theoretical aspects of the work, I present asymptotic results for the NPMLE of the length-biased as well as the unbiased survival distribution. I also discuss estimating the unbiased survival function using only the follow-up time. This addresses the case where the onset times are either unknown or known with uncertainty. Some of our most recent work and open questions will be presented. These include some aspects of analysis of covariates, strong approximation, functional LIL and density estimation under length-biased sampling with right censoring. The results will be illustrated with survival data from patients with dementia, collected as part of the Canadian Study of Health and Aging (CSHA).
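
    Left-truncated follow-up of this kind is often handled, as a first step, with the counting-process form of Surv() in R, which conditions on survival up to the delayed entry time. This is the standard left-truncation adjustment, not the length-bias-corrected NPMLE discussed in the abstract, and the data below are hypothetical:

      # Sketch: handling left truncation (delayed entry) with the counting-process Surv()
      library(survival)
      d <- data.frame(
        entry  = c(2, 5, 1, 7, 3),     # time from onset to recruitment (left-truncation time)
        exit   = c(9, 6, 8, 12, 10),   # time from onset to failure or censoring
        status = c(1, 0, 1, 1, 0)      # 1 = failed, 0 = censored
      )
      fit <- survfit(Surv(entry, exit, status) ~ 1, data = d)   # survival conditional on delayed entry
      summary(fit)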

  23. A Taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    Brodsky, J.B.; Groer, P.G.

    1984-09-01

    A method of survival analysis using hazard functions is developed. The method uses the well-known mathematical theory of Taylor series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)

  24. Radiotelemetry; techniques and analysis

    Science.gov (United States)

    Sybill K. Amelon; David C. Dalton; Joshua J. Millspaugh; Sandy A. Wolf

    2009-01-01

    Radiotelemetry has become an important tool in studies of animal behavior, ecology, management, and conservation. From the first decades following the introduction of radio transmitters, radiotelemetry emerged as a prominent and critically important tool in wildlife science for the study of physiology, animal movements (migration, dispersal, and home range), survival...

  25. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.

  26. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean for comparison of the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
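
    A bare-bones sketch of Monte Carlo propagation of input uncertainty to an output dose, reporting the mean, median and 90th percentile; the dose model and parameter distributions here are hypothetical stand-ins, not those of the assessment:

      # Sketch: Monte Carlo propagation of input uncertainty to a dose estimate (hypothetical model)
      set.seed(1)
      n    <- 10000
      k    <- rlnorm(n, meanlog = log(1e-4), sdlog = 1)   # hypothetical transfer coefficient
      conc <- runif(n, min = 50, max = 200)               # hypothetical source concentration
      dose <- k * conc                                    # simple multiplicative dose model
      c(mean = mean(dose), median = median(dose), p90 = unname(quantile(dose, 0.90)))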

  27. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

    Applying the survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure model. The division at 4 Earth radii separates small exoplanets from large exoplanets above. When combined with the recently-discovered radius gap at 2 Earth radii, it supports the treatment of planets 2-4 Earth rad...

  28. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: Composition of initial product or virgin coolant: composition of macro components and amounts of organic and inorganic impurities; Coolant during and after operation: determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); Control of systems for purifying and regenerating the coolant after use: dissolved pressurization gases; Detection of intermediate products during decomposition, which are generally very unstable (free radicals); Degree of fouling and film formation: tests to determine potential formation of films; Corrosion of structural elements and canning materials; Health and safety: toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity…

  29. Using Survival Analysis to Evaluate Medical Equipment Battery Life.

    Science.gov (United States)

    Kuhajda, David

    2016-01-01

    As hospital medical device managers obtain more data, opportunities exist for using the data to improve medical device management, enhance patient safety, and evaluate costs of decisions. As a demonstration of the ability to use data analytics, this article applies survival analysis statistical techniques to assist in making decisions on medical equipment maintenance. The analysis was performed on a large amount of data related to failures of an infusion pump manufacturer's lithium battery and two aftermarket replacement lithium batteries from one hospital facility. The survival analysis resulted in statistical evidence showing that one of the third-party batteries had a lower survival curve than the infusion pump manufacturer's battery. This lower survival curve translates to a shorter expected life before replacement is needed. The data suggested that to limit unexpected failures, replacing batteries at a two-year interval, rather than the current industry recommendation of three years, may be warranted. For less than $5,400 in additional annual cost, the risk of unexpected battery failures can be reduced from an estimated 28% to an estimated 7%.
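
    A hedged R sketch of this kind of comparison: a log-rank test between battery brands and the estimated risk of failure by two versus three years; the failure times below are hypothetical, not the hospital's data:

      # Sketch: comparing battery brands and reading failure risk at 2 vs. 3 years (hypothetical data)
      library(survival)
      d <- data.frame(
        years  = c(1.2, 2.5, 3.1, 3.4, 2.0, 2.8, 0.9, 1.5, 1.8, 2.2, 2.6, 3.0),
        failed = c(1,   1,   0,   0,   1,   0,   1,   1,   1,   1,   0,   1),
        brand  = rep(c("OEM", "aftermarket"), each = 6)
      )
      survdiff(Surv(years, failed) ~ brand, data = d)   # test whether the survival curves differ
      fit <- survfit(Surv(years, failed) ~ brand, data = d)
      summary(fit, times = c(2, 3))   # 1 - survival = estimated risk of failure by 2 and 3 years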

  30. Techniques for measuring red cell, platelet, and WBC survival

    International Nuclear Information System (INIS)

    Mayer, K.; Freeman, J.E.

    1986-01-01

    Blood cell survival studies yield valuable information concerning production and destruction of cells circulating in the bloodstream. Methodologies for the measurement of red cell survival include nonisotopic methods such as differential agglutination and hemolysis. The isotopic label may be radioactive or, if not, will require availability of a mass spectrograph. These methods fall into two categories: one where red cells of all ages are labeled (51Cr, DFP32, etc.) and those employing a cohort label of newly formed cells (14C-glycine, 75Se-methionine, etc.). Interpretation of results for the methodology employed and the mechanism of destruction, random or by senescence, are discussed. A similar approach is presented for platelet and leukocyte survival studies. The inherent difficulties and complications of sequestration, storage, and margination of these cells are emphasized and discussed. 38 references

  31. Causal inference in survival analysis using pseudo-observations.

    Science.gov (United States)

    Andersen, Per K; Syriopoulou, Elisavet; Parner, Erik T

    2017-07-30

    Causal inference for non-censored response variables, such as binary or quantitative outcomes, is often based on either (1) direct standardization ('G-formula') or (2) inverse probability of treatment assignment weights ('propensity score'). To do causal inference in survival analysis, one needs to address right-censoring, and often, special techniques are required for that purpose. We will show how censoring can be dealt with 'once and for all' by means of so-called pseudo-observations when doing causal inference in survival analysis. The pseudo-observations can be used as a replacement of the outcomes without censoring when applying 'standard' causal inference methods, such as (1) or (2) earlier. We study this idea for estimating the average causal effect of a binary treatment on the survival probability, the restricted mean lifetime, and the cumulative incidence in a competing risks situation. The methods will be illustrated in a small simulation study and via a study of patients with acute myeloid leukemia who received either myeloablative or non-myeloablative conditioning before allogeneic hematopoietic cell transplantation. We will estimate the average causal effect of the conditioning regime on outcomes such as the 3-year overall survival probability and the 3-year risk of chronic graft-versus-host disease. Copyright © 2017 John Wiley & Sons, Ltd.
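
    Pseudo-observations for the survival probability at a fixed time are usually obtained by a leave-one-out jackknife on the Kaplan-Meier estimator. A minimal R sketch with hypothetical data (dedicated packages exist, but the computation itself is short):

      # Sketch: jackknife pseudo-observations for the survival probability at time t0
      library(survival)
      surv_at <- function(time, status, t0) {
        fit <- survfit(Surv(time, status) ~ 1)
        summary(fit, times = t0, extend = TRUE)$surv        # S_hat(t0)
      }
      pseudo_obs <- function(time, status, t0) {
        n     <- length(time)
        s_all <- surv_at(time, status, t0)
        sapply(seq_len(n), function(i) {
          n * s_all - (n - 1) * surv_at(time[-i], status[-i], t0)   # jackknife formula
        })
      }
      # Hypothetical data: pseudo-observations for the 3-year survival probability
      tt <- c(1.1, 2.4, 3.5, 0.8, 4.2, 2.9, 3.8, 1.6)
      ss <- c(1,   0,   1,   1,   0,   1,   0,   1)
      po <- pseudo_obs(tt, ss, t0 = 3)
      po   # these can now be regressed on treatment/covariates with standard (e.g., GEE) methods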

  32. Prediction of lung cancer patient survival via supervised machine learning classification techniques.

    Science.gov (United States)

    Lynch, Chip M; Abdollahi, Behnaz; Fuqua, Joshua D; de Carlo, Alexandra R; Bartholomai, James A; Balgemann, Rayeanne N; van Berkel, Victor H; Frieboes, Hermann B

    2017-12-01

    Outcomes for cancer patients have been previously estimated by applying various machine learning techniques to large datasets such as the Surveillance, Epidemiology, and End Results (SEER) program database. In particular for lung cancer, it is not well understood which types of techniques would yield more predictive information, and which data attributes should be used in order to determine this information. In this study, a number of supervised learning techniques are applied to the SEER database to classify lung cancer patients in terms of survival, including linear regression, Decision Trees, Gradient Boosting Machines (GBM), Support Vector Machines (SVM), and a custom ensemble. Key data attributes in applying these methods include tumor grade, tumor size, gender, age, stage, and number of primaries, with the goal of enabling comparison of predictive power between the various methods. The prediction is treated as a continuous target, rather than a classification into categories, as a first step towards improving survival prediction. The results show that the predicted values agree with actual values for low to moderate survival times, which constitute the majority of the data. The best performing technique was the custom ensemble with a Root Mean Square Error (RMSE) value of 15.05. The most influential model within the custom ensemble was GBM, while Decision Trees may be inapplicable as they had too few discrete outputs. The results further show that among the five individual models generated, the most accurate was GBM with an RMSE value of 15.32. Although SVM underperformed with an RMSE value of 15.82, statistical analysis singles out the SVM as the only model that generated a distinctive output. The results of the models are consistent with a classical Cox proportional hazards model used as a reference technique. We conclude that application of these supervised learning techniques to lung cancer data in the SEER database may be of use to estimate patient survival time.
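
    To illustrate the general workflow (treating survival time as a continuous target and scoring predictions by RMSE on a holdout set), here is a sketch using synthetic data and an ordinary linear model as a stand-in for the GBM/ensemble models of the study:

      # Sketch: survival time as a continuous regression target, scored by RMSE (synthetic data)
      set.seed(42)
      n <- 500
      d <- data.frame(
        age   = rnorm(n, 68, 9),
        size  = rlnorm(n, log(25), 0.5),          # hypothetical tumour size, mm
        stage = sample(1:4, n, replace = TRUE)
      )
      d$months <- pmax(1, 60 - 8 * d$stage - 0.2 * (d$age - 68) + rnorm(n, 0, 10))  # synthetic outcome
      train <- sample(n, 0.8 * n)
      fit   <- lm(months ~ age + size + stage, data = d[train, ])   # stand-in for GBM/ensemble
      pred  <- predict(fit, newdata = d[-train, ])
      sqrt(mean((pred - d$months[-train])^2))                       # RMSE on the holdout set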

  33. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.

  34. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  35. Surface analysis: the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they can…

  36. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  37. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  38. CASAS: Cancer Survival Analysis Suite, a web-based application.

    Science.gov (United States)

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis:  quantile, landmark and competing risks, in addition to standard survival analysis.  The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots.  Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.

  39. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    Science.gov (United States)

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.

  40. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  41. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    This article aims to check the stress analysis technique based on 3D models, making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database having the idealized model obtained using ANSYS and working directly from documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (performed at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state of the art achieved in this field.

  42. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  43. FACTORS AND COMPLICATIONS AFFECTING CATHETER AND TECHNIQUE SURVIVAL WITH PERMANENT SINGLE-LUMEN DIALYSIS CATHETERS

    NARCIS (Netherlands)

    DEMEESTER, J; VANHOLDER, R; DEROOSE, J; RINGOIR, S

    1994-01-01

    This long-term study on the outcome of permanent silicone single-lumen dialysis catheters consisted of 43 surgically inserted catheters in 33 patients. All catheters were attached to a pressure-pressure single-cannula dialysis system. Technique and catheter survival were 80 and 59% at 1 year, and 63…

  44. Patient and technique survival in continuous ambulatory peritoneal dialysis in a single center of the west of Mexico.

    Science.gov (United States)

    Rojas-Campos, Enrique; Alcántar-Medina, Mario; Cortés-Sanabria, Laura; Martínez-Ramírez, Héctor R; Camarena, José L; Chávez, Salvador; Flores, Antonio; Nieves, Juan J; Monteón, Francisco; Gómez-Navarro, Benjamin; Cueto-Manzano, Alfonso M

    2007-01-01

    In Mexico, CAPD survival has been analyzed in few studies from the center of the country. However, there are concerns that such results may not represent what occurs in other province centers of our country, particularly in our geographical area. Objective: to evaluate patient and technique survival on CAPD in a single center of the west of Mexico, and to compare them with other reported series. Design: retrospective cohort study. Setting: tertiary care, teaching hospital located in Guadalajara, Jalisco. Patients from our CAPD program (1999-2002) were retrospectively studied. Interventions: clinical and biochemical variables at the start of dialysis and at the end of the follow-up were recorded and considered in the analysis of risk factors. Endpoints were patient status (alive, dead or lost to follow-up) and technique status at the end of the study (June 2002). Forty-nine patients were included. Mean patient survival (± SE) was 3.32 ± 0.22 years (95% CI: 2.9-3.8 years). Patients in the present study were younger (39 ± 17 years), had a larger body surface area (1.72 ± 0.22 m2), lower hematocrit (25.4 ± 5.2%), albumin (2.6 ± 0.6 g/dL), and cholesterol (173 ± 44 mg/dL), and higher urea (300 ± 93 mg/dL) and creatinine (14.9 ± 5.6 mg/dL) than those in other Mexican series. In univariate analysis, several variables were significantly associated with the outcomes (p …). Patients in this center of the west of Mexico were younger, had a higher body surface area and initiated peritoneal dialysis with a more deteriorated general status than patients reported in other Mexican series; in spite of the latter, patient and technique survival were not different. In our setting, older pre-dialysis age and lower CrCl significantly predicted mortality, while older pre-dialysis age and a higher peritonitis rate predicted technique failure.

  45. Prognostic and survival analysis of 837 Chinese colorectal cancer patients.

    Science.gov (United States)

    Yuan, Ying; Li, Mo-Dan; Hu, Han-Guang; Dong, Cai-Xia; Chen, Jia-Qi; Li, Xiao-Fen; Li, Jing-Jing; Shen, Hong

    2013-05-07

    To develop a prognostic model to predict survival of patients with colorectal cancer (CRC), survival data of 837 CRC patients undergoing surgery between 1996 and 2006 were collected and analyzed by univariate analysis and a Cox proportional hazards regression model to reveal the prognostic factors for CRC. All data were recorded using a standard data form and analyzed using SPSS version 18.0 (SPSS, Chicago, IL, United States). Survival curves were calculated by the Kaplan-Meier method. The log-rank test was used to assess differences in survival. Univariate hazard ratios and significant, independent predictors of disease-specific survival were identified by Cox proportional hazards analysis. The stepwise procedure was set to a threshold of 0.05. Statistical significance was defined as P < 0.05. Univariate analysis suggested that age, preoperative obstruction, serum carcinoembryonic antigen level at diagnosis, status of resection, tumor size, histological grade, pathological type, lymphovascular invasion, invasion of adjacent organs, and tumor node metastasis (TNM) staging were positive prognostic factors (P < 0.05). … Survival analysis showed a statistically significant difference in 3-year survival among the lymph node ratio (LNR) groups: LNR1, 73%; LNR2, 55%; and LNR3, 42% (P < 0.05). Multivariate analysis showed that histological grade, depth of bowel wall invasion, and number of metastatic lymph nodes were the most important prognostic factors for CRC if the interaction with the TNM staging system was not considered (P < 0.05). When the TNM staging was taken into account, histological grade lost its statistical significance, while the specific TNM staging system showed a statistically significant difference (P < 0.0001). The overall survival of CRC patients improved between 1996 and 2006. LNR is a powerful factor for estimating the survival of stage III CRC patients.

  6. A survival analysis on critical components of nuclear power plants

    International Nuclear Information System (INIS)

    Durbec, V.; Pitner, P.; Riffard, T.

    1995-06-01

    Some tubes of heat exchangers of nuclear power plants may be affected by Primary Water Stress Corrosion Cracking (PWSCC) in highly stressed areas. These defects can shorten the lifetime of the component and lead to its replacement. In order to reduce the risk of cracking, a preventive remedial operation called shot peening was applied to the French reactors between 1985 and 1988. To assess and investigate the effects of shot peening, a statistical analysis was carried out on the tube degradation results obtained from in-service inspections that are regularly conducted using non-destructive tests. The statistical method used is based on the Cox proportional hazards model, a powerful tool in the analysis of survival data, implemented in PROC PHREG, recently available in SAS/STAT. This technique has a number of major advantages, including the ability to deal with censored failure-time data and with the complication of time-dependent covariables. The paper focuses on the modelling and a presentation of the results given by SAS. They provide estimates of how the relative risk of degradation changes after peening and indicate for which values of the prognostic factors analyzed the treatment is likely to be most beneficial. (authors). 2 refs., 3 figs., 6 tabs
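
    The record highlights the Cox model's ability to handle censored data and time-dependent covariates via SAS PROC PHREG. A rough open-source analogue, offered only as an assumption-laden sketch and not a reproduction of the EDF analysis, is the counting-process ("start-stop") format in lifelines; the tube-level column names are invented for illustration.

      # Sketch of a Cox model with a time-dependent covariate in counting-process format.
      # Illustrative analogue of a PROC PHREG analysis; hypothetical columns: tube_id, start, stop,
      # cracked (event indicator), stress, peened (0 before shot peening, 1 after).
      import pandas as pd
      from lifelines import CoxTimeVaryingFitter

      long_df = pd.read_csv("tube_inspections_long.csv")  # hypothetical start-stop data

      ctv = CoxTimeVaryingFitter()
      ctv.fit(long_df,
              id_col="tube_id",
              event_col="cracked",
              start_col="start",
              stop_col="stop")
      ctv.print_summary()  # the hazard ratio for `peened` estimates how the relative risk changes after peening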

  7. Radiofrequency Ablation of Colorectal Liver Metastases: Small Size Favorably Predicts Technique Effectiveness and Survival

    International Nuclear Information System (INIS)

    Veltri, Andrea; Sacchetto, Paola; Tosetti, Irene; Pagano, Eva; Fava, Cesare; Gandini, Giovanni

    2008-01-01

    The objective of this study was to analyze long-term results of radiofrequency thermal ablation (RFA) for colorectal metastases (MTS), in order to evaluate predictors for adverse events, technique effectiveness, and survival. One hundred ninety-nine nonresectable MTS (0.5-8 cm; mean, 2.9 cm) in 122 patients underwent a total of 166 RFA sessions, percutaneously or during surgery. The technique was 'simple' or 'combined' with vascular occlusion. The mean follow-up time was 24.2 months. Complications, technique effectiveness, and survival rates were statistically analyzed. Adverse events occurred in 8.1% of lesions (major complication rate: 1.1%), 7.1% with the simple and 16.7% with the combined technique (p = 0.15). Early complete response was obtained in 151 lesions (81.2%), but 49 lesions (26.3%) recurred locally after a mean of 10.4 months. Sustained complete ablation was achieved in 66.7% of lesions ≤3 cm versus 33.3% of lesions >3 cm (p [...] 3 cm (p = 0.006). We conclude that 'simple' RFA is safe and successful for MTS ≤3 cm, contributing to prolonged survival when patients can be completely treated.

  8. Breast cancer data analysis for survivability studies and prediction.

    Science.gov (United States)

    Shukla, Nagesh; Hagenbuchner, Markus; Win, Khin Than; Yang, Jack

    2018-03-01

    Breast cancer is the most common cancer affecting females worldwide. Breast cancer survivability prediction is a challenging and complex research task. Existing approaches engage statistical methods or supervised machine learning to assess/predict the survival prospects of patients. The main objectives of this paper are to develop a robust data analytical model which can assist in (i) a better understanding of breast cancer survivability in the presence of missing data, (ii) providing better insights into factors associated with patient survivability, and (iii) establishing cohorts of patients that share similar properties. Unsupervised data mining methods, viz. the self-organising map (SOM) and density-based spatial clustering of applications with noise (DBSCAN), are used to create patient cohort clusters. These clusters, with associated patterns, were used to train a multilayer perceptron (MLP) model for improved patient survivability analysis. A large dataset available from the SEER program is used in this study to identify patterns associated with the survivability of breast cancer patients. Information gain was computed for the purpose of variable selection. All of these methods are data-driven and require little (if any) input from users or experts. SOM consolidated patients into cohorts of patients with similar properties. From this, DBSCAN identified and extracted nine cohorts (clusters). It is found that patients in each of the nine clusters have different survivability times. The separation of patients into clusters improved the overall survival prediction accuracy based on MLP and revealed intricate conditions that affect the accuracy of a prediction. A new, entirely data-driven approach based on unsupervised learning methods improves understanding and helps identify patterns associated with the survivability of patients. The results of the analysis can be used to segment the historical patient data into clusters or subsets, which share common variable values and
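
    A much-simplified version of the cohort-then-predict idea (cluster patients, then train an MLP per cohort) is sketched below. It omits the SOM stage, assumes scikit-learn as the toolkit, and uses hypothetical file and column names; it is not the authors' SEER pipeline.

      # Simplified cohort-then-predict sketch: cluster patients, then train an MLP within each cluster.
      # The SOM stage of the paper is omitted; DBSCAN is applied directly to scaled features.
      # File name, feature columns and the survivability label are hypothetical.
      import pandas as pd
      from sklearn.preprocessing import StandardScaler
      from sklearn.cluster import DBSCAN
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      data = pd.read_csv("seer_breast.csv")  # hypothetical extract
      X = StandardScaler().fit_transform(data.drop(columns=["survived_5yr"]))
      y = data["survived_5yr"].values

      clusters = DBSCAN(eps=0.8, min_samples=25).fit_predict(X)

      # Train one survivability classifier per cohort (ignoring DBSCAN noise, labelled -1)
      for c in sorted(set(clusters) - {-1}):
          Xc, yc = X[clusters == c], y[clusters == c]
          Xtr, Xte, ytr, yte = train_test_split(Xc, yc, test_size=0.3, random_state=0)
          clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(Xtr, ytr)
          print(f"cluster {c}: held-out accuracy = {clf.score(Xte, yte):.3f}")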

  9. Aplicación y técnicas del análisis de supervivencia en las investigaciones clínicas Application and techniques of survival analysis in clinical research

    Directory of Open Access Journals (Sweden)

    Anissa Gramatges Ortiz

    2002-08-01

    An update on survival analysis in clinical research is presented. Some of the most general concepts of this type of analysis and the characteristics of survival times are described. Topics related to the different methods that allow estimation of survival probabilities for one or more groups of individuals are addressed, with a worked example of the calculation of probabilities by the Kaplan-Meier method. The comparison of the survival of several groups according to the different factors that distinguish them is highlighted, and some of the statistical tests that make such comparisons possible are presented, such as the log-rank test and the Breslow test, the latter as an alternative to the former when a departure from proportional hazards is evident, that is, when the survival curves cross.

  10. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves

    Directory of Open Access Journals (Sweden)

    Guyot Patricia

    2012-02-01

    Background The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are median time to events and Cox Hazard Ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. Methods We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data was assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. Results The validation exercise established there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or total number of events are reported. Conclusion The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs should report information on numbers at risk and total number of events alongside KM curves.

  11. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves.

    Science.gov (United States)

    Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J

    2012-02-01

    The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are median time to events and Cox Hazard Ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data was assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs should report information on numbers at risk and total number of events alongside KM curves.
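
    The core inversion step behind this kind of reconstruction can be illustrated as follows. This is a heavily simplified sketch of the idea only, not the published algorithm (which also allocates censoring within risk-set intervals); the digitised values below are invented.

      # Simplified illustration of recovering approximate event counts from a digitised
      # Kaplan-Meier curve. NOT the full published algorithm: censoring within intervals
      # is ignored, so this only works well when numbers at risk are reported densely.
      import numpy as np

      def approx_events(times, surv, n_at_risk):
          """times, surv: digitised KM steps; n_at_risk: number at risk at each time point."""
          events = []
          for k in range(1, len(times)):
              # product-limit step: S(t_k) = S(t_{k-1}) * (1 - d_k / n_k)  =>  solve for d_k
              if surv[k - 1] > 0:
                  d_k = n_at_risk[k - 1] * (1.0 - surv[k] / surv[k - 1])
              else:
                  d_k = 0.0
              events.append(int(round(d_k)))
          return events

      # Toy digitised curve (hypothetical numbers)
      t = [0, 6, 12, 18, 24]
      S = [1.00, 0.90, 0.78, 0.70, 0.61]
      n = [100, 90, 76, 60, 48]
      print(approx_events(t, S, n))  # approximate deaths in each interval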

  12. Survival Analysis of Patients with End Stage Renal Disease

    Science.gov (United States)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) under Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both diseases. The analysis also shows that female patients have a greater risk of death compared to males. The probability of risk was given by the equation R = 1 − exp(−H(t)), where exp(−H(t)) is the survival function and H(t) is the cumulative hazard function, obtained using Cox regression.
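
    The quoted relation R = 1 − exp(−H(t)) can be evaluated directly once a cumulative hazard has been fitted. The sketch below uses the lifelines WeibullFitter as an assumed stand-in for the NCSS/SPSS analyses; the durations and event indicators are invented.

      # Sketch: fit a Weibull survival model and evaluate R(t) = 1 - exp(-H(t)).
      # lifelines is assumed as the toolkit (the study used NCSS/SPSS); data are hypothetical.
      import numpy as np
      from lifelines import WeibullFitter

      T = np.array([55, 60, 62, 64, 66, 70, 71, 73, 75, 80], dtype=float)  # ages (years)
      E = np.array([1,  0,  1,  1,  0,  1,  1,  0,  1,  1])                # 1 = death observed

      wf = WeibullFitter().fit(T, event_observed=E)
      print("Weibull scale and shape:", wf.lambda_, wf.rho_)

      t_grid = np.array([60.0, 65.0, 70.0, 75.0])
      H = wf.cumulative_hazard_at_times(t_grid)  # cumulative hazard H(t)
      R = 1.0 - np.exp(-H)                       # probability of the event occurring by time t
      print(R)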

  13. A new technique for assessing fish passage survival at hydro power stations

    International Nuclear Information System (INIS)

    Heisey, P.G.; Mathur, D.; D'Allesandro, L.

    1993-01-01

    The HI-Z Turb'N Tag recovery method is presented as a new technique that has been successfully used at ten hydropower stations to determine turbine or spillway passage survival of fish. According to this technique, fish are tagged with the Turb'N Tag, which is pear-shaped, made of inflatable latex, and ca 35 mm long and 13 mm wide. The tag is designed to inflate after passage through the turbine, where it then floats the fish to the surface where it can be easily spotted and netted. One tag is sufficient to retrieve fish less than 18 cm long, while three tags may be needed for fish longer than 30 cm. In tests, fish were recovered in under 10 minutes from the tailrace after being tagged and released into a turbine. The tag allowed over 90% recovery of fish in most tests. The technique had minimal effect on the well-being of both hardy and sensitive species and provided an opportunity to examine recovered fish for injuries and retain them up to 72 h to assess possible delayed effects. The technique overcomes most of the logistical problems associated with conventional methods (netting, radio telemetry, mass mark-recapture) to determine turbine passage survival. The technique can also be used to assess effects of spill and fish bypass structures. 9 refs., 2 figs., 1 tab

  14. Mediation analysis of the relationship between institutional research activity and patient survival

    DEFF Research Database (Denmark)

    Rochon, Justine; du Bois, Andreas; Lange, Theis

    2014-01-01

    BACKGROUND: Recent studies have suggested that patients treated in research-active institutions have better outcomes than patients treated in research-inactive institutions. However, little attention has been paid to explaining such effects, probably because techniques for mediation analysis existing so far have not been applicable to survival data. METHODS: We investigated the underlying mechanisms using a recently developed method for mediation analysis of survival data. Our analysis of the effect of research activity on patient survival was based on 352 patients who had been diagnosed [...] mediated through either optimal surgery or chemotherapy. Taken together, about 26% of the beneficial effect of research activity was mediated through the proposed pathways. CONCLUSIONS: Mediation analysis allows proceeding from the question "Does it work?" to the question "How does it work?" In particular...

  15. Survival analysis for customer satisfaction: A case study

    Science.gov (United States)

    Hadiyat, M. A.; Wahyudi, R. D.; Sari, Y.

    2017-11-01

    Most customer satisfaction surveys are conducted periodically to track their dynamics. One of the goals of such surveys is to evaluate the service design by recognizing the trend of the satisfaction score. Many researchers recommend redesigning the service when satisfaction scores are decreasing, so that the service life cycle can be anticipated qualitatively. However, these scores are usually recorded on a Likert scale and have quantitative properties. Thus, they should also be analyzed with a quantitative model, so that the service life cycle can be predicted by applying survival analysis. This paper discusses a starting point for customer satisfaction survival analysis, with a case study in a healthcare service.

  16. Evaluation of capture techniques on lesser prairie-chicken trap injury and survival

    Science.gov (United States)

    Grisham, Blake A.; Boal, Clint W.; Mitchell, Natasia R.; Gicklhorn, Trevor S.; Borsdorf, Philip K.; Haukos, David A.; Dixon, Charles

    2015-01-01

    Ethical treatment of research animals is required under the Animal Welfare Act. This includes trapping methodologies that reduce unnecessary pain and duress. Traps used in research should optimize animal welfare conditions within the context of the proposed research study. Several trapping techniques are used in the study of lesser prairie-chickens, despite lack of knowledge of trap injury caused by the various methods. From 2006 to 2012, we captured 217, 40, and 144 lesser prairie-chickens Tympanuchus pallidicinctus using walk-in funnel traps, rocket nets, and drop nets, respectively, in New Mexico and Texas, to assess the effects of capture technique on injury and survival of the species. We monitored radiotagged, injured lesser prairie-chickens 7–65 d postcapture to assess survival rates of injured individuals. Injuries occurred disproportionately among trap type, injury type, and sex. The predominant injuries were superficial cuts to the extremities of males captured in walk-in funnel traps. However, we observed no mortalities due to trapping, postcapture survival rates of injured birds did not vary across trap types, and the daily survival probability of an injured and uninjured bird was ≥99%. Frequency and intensity of injuries in walk-in funnel traps are due to the passive nature of these traps (researcher cannot select specific individuals for capture) and incidental capture of individuals not needed for research. Comparatively, rocket nets and drop nets allow observers to target birds for capture and require immediate removal of captured individuals from the trap. Based on our results, trap injuries would be reduced if researchers monitor and immediately remove birds from walk-in funnels before they injure themselves; move traps to target specific birds and reduce recaptures; limit the number of consecutive trapping days on a lek; and use proper netting techniques that incorporate quick, efficient, trained handling procedures.

  17. [Survival analysis with competing risks: estimating failure probability].

    Science.gov (United States)

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
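
    The overestimation described here can be made concrete by comparing 1 − KM (competing event treated as censoring) with a cumulative incidence estimate. The snippet is a sketch on invented data; lifelines and its AalenJohansenFitter are assumed to be available as a stand-in for the multiple decrement calculation.

      # Sketch: naive 1 - Kaplan-Meier vs. cumulative incidence under a competing risk.
      # Hypothetical data: event_type 0 = censored, 1 = chronic rejection, 2 = death before rejection.
      import numpy as np
      from lifelines import KaplanMeierFitter, AalenJohansenFitter

      durations  = np.array([3, 5, 6, 8, 10, 12, 15, 18, 20, 24], dtype=float)
      event_type = np.array([1, 2, 1, 0,  2,  1,  0,  1,  2,  0])

      # Naive approach: censor the competing risk (tends to overestimate rejection probability)
      kmf = KaplanMeierFitter().fit(durations, event_observed=(event_type == 1))
      naive = 1.0 - kmf.survival_function_

      # Cumulative incidence accounting for the competing risk of death
      ajf = AalenJohansenFitter().fit(durations, event_type, event_of_interest=1)

      print(naive.tail(1))
      print(ajf.cumulative_density_.tail(1))  # typically lower than the naive 1 - KM estimate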

  18. Causal inference in survival analysis using pseudo-observations

    DEFF Research Database (Denmark)

    Andersen, Per K; Syriopoulou, Elisavet; Parner, Erik T

    2017-01-01

    Causal inference for non-censored response variables, such as binary or quantitative outcomes, is often based on either (1) direct standardization ('G-formula') or (2) inverse probability of treatment assignment weights ('propensity score'). To do causal inference in survival analysis, one needs ...
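
    A minimal sketch of the pseudo-observation idea, on invented data and assuming lifelines for the Kaplan-Meier steps (this is not the authors' implementation): the jackknife pseudo-value for subject i at a fixed horizon t0 is n*S(t0) - (n-1)*S_{-i}(t0), where S is the Kaplan-Meier estimate; the pseudo-values can then be fed to ordinary regression machinery such as GEE.

      # Sketch: jackknife pseudo-observations for survival at a fixed horizon t0.
      # pseudo_i = n * S_hat(t0) - (n - 1) * S_hat_{-i}(t0), computed from Kaplan-Meier fits.
      import numpy as np
      from lifelines import KaplanMeierFitter

      def km_surv_at(T, E, t0):
          kmf = KaplanMeierFitter().fit(T, event_observed=E)
          return float(kmf.survival_function_at_times(t0).iloc[0])

      def pseudo_observations(T, E, t0):
          n = len(T)
          s_full = km_surv_at(T, E, t0)
          pseudo = np.empty(n)
          for i in range(n):
              mask = np.arange(n) != i           # leave subject i out
              s_loo = km_surv_at(T[mask], E[mask], t0)
              pseudo[i] = n * s_full - (n - 1) * s_loo
          return pseudo

      T = np.array([2., 4., 5., 7., 9., 11., 14., 16.])
      E = np.array([1,  1,  0,  1,  0,  1,   1,   0])
      print(pseudo_observations(T, E, t0=8.0))  # can now be regressed on covariates, e.g. via GEE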

  19. Bernstein - Von Mises theorem and its application in survival analysis

    Czech Academy of Sciences Publication Activity Database

    Timková, Jana

    2010-01-01

    Roč. 22, č. 3 (2010), s. 115-122 ISSN 1210-8022. [16. letní škola JČMF Robust 2010. Králíky, 30.01.2010-05.02.2010] R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * bayesian asymptotics * survival function Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2010/SI/timkova-bernstein - von mises theorem and its application in survival analysis.pdf

  20. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  1. Prognostic and survival analysis of presbyopia: The healthy twin study

    Science.gov (United States)

    Lira, Adiyani; Sung, Joohon

    2015-12-01

    Presbyopia, a vision condition in which the eye loses its flexibility to focus on near objects, is part of ageing process which mostly perceptible in the early or mid 40s. It is well known that age is its major risk factor, while sex, alcohol, poor nutrition, ocular and systemic diseases are known as common risk factors. However, many other variables might influence the prognosis. Therefore in this paper we developed a prognostic model to estimate survival from presbyopia. 1645 participants which part of the Healthy Twin Study, a prospective cohort study that has recruited Korean adult twins and their family members based on a nation-wide registry at public health agencies since 2005, were collected and analyzed by univariate analysis as well as Cox proportional hazard model to reveal the prognostic factors for presbyopia while survival curves were calculated by Kaplan-Meier method. Besides age, sex, diabetes, and myopia; the proposed model shows that education level (especially engineering program) also contribute to the occurrence of presbyopia as well. Generally, at 47 years old, the chance of getting presbyopia becomes higher with the survival probability is less than 50%. Furthermore, our study shows that by stratifying the survival curve, MZ has shorter survival with average onset time about 45.8 compare to DZ and siblings with 47.5 years old. By providing factors that have more effects and mainly associate with presbyopia, we expect that we could help to design an intervention to control or delay its onset time.

  2. Direct Survival Analysis: a new stock assessment method

    Directory of Open Access Journals (Sweden)

    Eduardo Ferrandis

    2007-03-01

    Full Text Available In this work, a new stock assessment method, Direct Survival Analysis, is proposed and described. The parameter estimation of the Weibull survival model proposed by Ferrandis (2007 is obtained using trawl survey data. This estimation is used to establish a baseline survival function, which is in turn used to estimate the specific survival functions in the different cohorts considered through an adaptation of the separable model of the fishing mortality rates introduced by Pope and Shepherd (1982. It is thus possible to test hypotheses on the evolution of survival during the period studied and to identify trends in recruitment. A link is established between the preceding analysis of trawl survey data and the commercial catch-at-age data that are generally obtained to evaluate the population using analytical models. The estimated baseline survival, with the proposed versions of the stock and catch equations and the adaptation of the Separable Model, may be applied to commercial catch-at-age data. This makes it possible to estimate the survival corresponding to the landing data, the initial size of the cohort and finally, an effective age of first capture, in order to complete the parameter model estimation and consequently the estimation of the whole survival and mortality, along with the reference parameters that are useful for management purposes. Alternatively, this estimation of an effective age of first capture may be obtained by adapting the demographic structure of trawl survey data to that of the commercial fleet through suitable selectivity models of the commercial gears. The complete model provides the evaluation of the stock at any age. The coherence (and hence the mutual “calibration” between the two kinds of information may be analysed and compared with results obtained by other methods, such as virtual population analysis (VPA, in order to improve the diagnosis of the state of exploitation of the population. The model may be

  3. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
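
    The AFT comparison described above can be sketched as follows; lifelines is assumed here (the paper itself supplies R code), the data file and columns are hypothetical, and the left-censoring treatment is simplified to an ordinary event indicator.

      # Sketch: compare parametric AFT models for differential expression on one protein's
      # peak intensities. Simplifies the paper's left-censoring treatment; file and columns
      # (intensity, observed, group) are hypothetical.
      import pandas as pd
      from lifelines import WeibullAFTFitter, LogNormalAFTFitter, LogLogisticAFTFitter

      df = pd.read_csv("peak_intensities.csv")
      cols = ["intensity", "observed", "group"]

      for Fitter in (WeibullAFTFitter, LogNormalAFTFitter, LogLogisticAFTFitter):
          aft = Fitter().fit(df[cols], duration_col="intensity", event_col="observed")
          # The coefficient on `group` tests for a shift in (log) intensity between conditions.
          print(Fitter.__name__, "log-likelihood:", aft.log_likelihood_)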

  4. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.

  5. Pregnancy associated nasopharyngeal carcinoma: A retrospective case-control analysis of maternal survival outcomes

    International Nuclear Information System (INIS)

    Cheng, Yi-Kan; Zhang, Fan; Tang, Ling-Long; Chen, Lei; Zhou, Guan-Qun; Zeng, Mu-Sheng; Kang, Tie-Bang; Jia, Wei-Hua; Shao, Jian-Yong; Mai, Hai-Qiang; Guo, Ying; Ma, Jun

    2015-01-01

    Background: Pregnancy-associated nasopharyngeal carcinoma (PANPC) has been associated with poor survival. Recent advances in radiation technology and imaging techniques, and the introduction of chemotherapy have improved survival in nasopharyngeal carcinoma (NPC); however, it is not clear whether these changes have improved survival in PANPC. Therefore, the purpose of this study was to compare five-year maternal survival in patients with PANPC and non-pregnant patients with NPC. Methods: After adjusting for age, stage and chemotherapy mode, we conducted a retrospective case-control study among 36 non-metastatic PANPC patients and 36 non-pregnant NPC patients (control group) who were treated at our institution between 2000 and 2010. Results: The median age of both groups was 30 years (range, 23–35 years); median follow-up for all patients was 70 months. Locoregionally-advanced disease accounted for 83.3% of all patients with PANPC and 92.9% of patients who developed NPC during pregnancy. In both the PANPC and control groups, 31 patients (86.1%) received chemotherapy and all patients received definitive radiotherapy. The five-year rates for overall survival (70% vs. 78%, p = 0.72), distant metastasis-free survival (79% vs. 76%, p = 0.77), loco-regional relapse-free survival (97% vs. 91%, p = 0.69) and disease-free survival (69% vs. 74%, p = 0.98) were not significantly different between the PANPC and control groups. Multivariate analysis using a Cox proportional hazards model revealed that only N-classification was significantly associated with five-year OS. Conclusion: This study demonstrates that, in the modern treatment era, pregnancy itself may not negatively influence survival outcomes in patients with NPC; however, pregnancy may delay the diagnosis of NPC

  6. Survival analysis in hematologic malignancies: recommendations for clinicians

    Science.gov (United States)

    Delgado, Julio; Pereira, Arturo; Villamor, Neus; López-Guillermo, Armando; Rozman, Ciril

    2014-01-01

    The widespread availability of statistical packages has undoubtedly helped hematologists worldwide in the analysis of their data, but has also led to the inappropriate use of statistical methods. In this article, we review some basic concepts of survival analysis and also make recommendations about how and when to perform each particular test using SPSS, Stata and R. In particular, we describe a simple way of defining cut-off points for continuous variables and the appropriate and inappropriate uses of the Kaplan-Meier method and Cox proportional hazard regression models. We also provide practical advice on how to check the proportional hazards assumption and briefly review the role of relative survival and multiple imputation. PMID:25176982
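
    For the proportional hazards check recommended here, one possible route (an assumption on our part, since the authors discuss SPSS, Stata and R) is lifelines' diagnostics on a fitted Cox model; the data frame and column names are placeholders.

      # Sketch: checking the proportional hazards assumption for a fitted Cox model.
      # Hypothetical columns: time, event, age, stage.
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.statistics import proportional_hazard_test

      train = pd.read_csv("hematology_cohort.csv")[["time", "event", "age", "stage"]]

      cph = CoxPHFitter().fit(train, duration_col="time", event_col="event")

      # Score test on scaled Schoenfeld residuals; small p-values flag violations.
      results = proportional_hazard_test(cph, train, time_transform="rank")
      results.print_summary()

      # Convenience wrapper that also suggests remedies (stratification, time-varying effects).
      cph.check_assumptions(train, p_value_threshold=0.05)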

  7. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques to the environmental sciences are presented and reviewed.

  8. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also referenced. (Author)

  9. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron-induced prompt gamma-ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical method, electrolysis method, chemical method and spray method, are described in the first part. The second part contains sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part contains the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part contains basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system.

  10. Chemical analysis by nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron-induced prompt gamma-ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical method, electrolysis method, chemical method and spray method, are described in the first part. The second part contains sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part contains the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part contains basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system.

  11. Evaluating disease management program effectiveness: an introduction to survival analysis.

    Science.gov (United States)

    Linden, Ariel; Adams, John L; Roberts, Nancy

    2004-01-01

    Currently, the most widely used method in the disease management industry for evaluating program effectiveness is the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Survival analysis allows for the inclusion of data from censored cases, those subjects who either "survived" the program without experiencing the event (e.g., achievement of target clinical levels, hospitalization) or left the program prematurely, due to disenrollment from the health plan or program, or were lost to follow-up. Additionally, independent variables may be included in the model to help explain the variability in the outcome measure. In order to maximize the potential of this statistical method, validity of the model and research design must be assured. This paper reviews survival analysis as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.

  12. Chondrocyte survival in osteochondral transplant cylinders depends on the harvesting technique.

    Science.gov (United States)

    Hafke, Benedikt; Petri, Maximilian; Suero, Eduardo; Neunaber, Claudia; Kwisda, Sebastian; Krettek, Christian; Jagodzinski, Michael; Omar, Mohamed

    2016-07-01

    In autologous osteochondral transplantation, the edges of the harvested plug are particularly susceptible to mechanical or thermal damage to the chondrocytes. We hypothesised that the applied harvesting device has an impact on chondrocyte vitality. Both knees of five blackhead sheep (ten knees) underwent open osteochondral plug harvesting with three different circular harvesting devices (osteoarticular transfer system harvester [OATS; diameter 8 mm; Arthrex, Munich, Germany], diamond cutter [DC; diameter 8.35 mm; Karl Storz, Tuttlingen, Germany] and hollow reamer with cutting crown [HRCC; diameter 7 mm; Dannoritzer, Tuttlingen, Germany]) from distinctly assigned anatomical sites of the knee joint. The rotary cutters (DC and HRCC) were either used with (+) or without cooling (-). Surgical cuts of the cartilage with a scalpel blade were chosen as the control method. After cryotomy cutting, chondrocyte vitality was assessed using fluorescence microscopy and a Live/Dead assay. There were distinct patterns of chondrocyte vitality, with reproducible accumulations of dead chondrocytes along the harvesting edge. No statistical difference in chondrocyte survivorship was seen between the OATS technique and the control method, or between the HRCC+ technique and the control method (P > 0.05). The DC+, HRCC- and DC- techniques yielded significantly lower chondrocyte survival rates compared with the control method (P [...] vitality.

  13. Survival of the Fittest: An Active Queue Management Technique for Noisy Packet Flows

    Directory of Open Access Journals (Sweden)

    Shirish S. Karande

    2007-01-01

    We present a novel active queue management (AQM) technique to demonstrate the efficacy of practically harnessing the predictive utility of SSR indications for improved video communication. We consider a network within which corrupted packets are relayed over multiple hops, but a certain percentage of packets needs to be dropped at an intermediate node due to congestion. We propose an AQM technique, survival of the fittest (SOTF), to be employed at the relay node, within which we use packet state information, available from SSR indications and checksums, to drop packets with the highest corruption levels. On the basis of actual 802.11b measurements we show that such side information (SI)-aware processing within the network can provide significant performance benefits over an SI-unaware scheme, random queue management (RQM), which is forced to randomly discard packets. With trace-based simulations, we show the utility of the proposed AQM technique in improving the error recovery performance of cross-layer FEC schemes. Finally, with the help of H.264-based video simulations these improvements are shown to translate into a significant improvement in video quality.
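
    As a toy illustration of the drop policy described above (keep the "fittest" packets, discard those whose side information indicates the most corruption when the queue overflows), consider the sketch below. The packet fields, corruption metric and queue limit are invented and do not reproduce the paper's 802.11b implementation.

      # Toy sketch of a "survival of the fittest" drop policy: when the queue exceeds its
      # limit, discard the packets whose side information indicates the most corruption.
      import heapq
      from dataclasses import dataclass, field

      @dataclass(order=True)
      class Packet:
          corruption: float  # e.g., fraction of symbols flagged by SSR indications (hypothetical metric)
          seq: int = field(compare=False)
          payload: bytes = field(compare=False, default=b"")

      class SOTFQueue:
          def __init__(self, capacity):
              self.capacity = capacity
              self._heap = []  # stores packets keyed on negated corruption (max-heap behaviour)

          def enqueue(self, pkt):
              heapq.heappush(self._heap, Packet(-pkt.corruption, pkt.seq, pkt.payload))
              while len(self._heap) > self.capacity:
                  dropped = heapq.heappop(self._heap)  # drops the most corrupted packet first
                  print(f"drop seq={dropped.seq} corruption={-dropped.corruption:.2f}")

          def drain(self):
              return sorted((Packet(-p.corruption, p.seq, p.payload) for p in self._heap),
                            key=lambda p: p.seq)

      q = SOTFQueue(capacity=3)
      for seq, c in enumerate([0.05, 0.40, 0.10, 0.80, 0.02]):
          q.enqueue(Packet(c, seq))
      print([p.seq for p in q.drain()])  # survivors: the least corrupted packets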

  14. Effect of Salinity Adaptation Technique on Survival and Growth Rate of Patin Catfish, Pangasius sp.

    Directory of Open Access Journals (Sweden)

    K. Nirmala

    2007-01-01

    This study was carried out to determine the effect of salinity adaptation techniques on the growth and survival of patin catfish Pangasius sp. fry. Fry of 1.5-2.0 inches in length were reared in water with different initial salinities of 1, 2, 3, 4 and 5 ppt. Salinity was then increased daily by an increment equal to the initial salinity until the fish died. The results showed that fry could survive with an initial salinity adaptation of 1 ppt followed by increases of 1 ppt/day up to a salinity of 27 ppt. In the other treatments, all fry died once the salinity reached 18-25 ppt. Keywords: patin catfish, Pangasius, adaptation, salinity

  15. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

    DEFF Research Database (Denmark)

    Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

    2014-01-01

    A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented concentrates on longevity studies. The framework presented allows models based on continuous time to be combined with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function is assumed to be piece-wise constant. The methods presented are implemented in such a way that large and complex quantitative genetic data can be analyzed.

  16. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

    DEFF Research Database (Denmark)

    Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

    2013-01-01

    A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented concentrates on longevity studies. The framework presented allows models based on continuous time to be combined with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function is assumed to be piece-wise constant. The methods presented are implemented in such a way that large and complex quantitative genetic data can be analyzed.

  17. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  18. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  19. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  20. Survival analysis of heart failure patients: A case study.

    Directory of Open Access Journals (Sweden)

    Tanvir Ahmad

    This study was focused on survival analysis of heart failure patients who were admitted to the Institute of Cardiology and Allied Hospital, Faisalabad, Pakistan, during April-December (2015). All the patients were aged 40 years or above, having left ventricular systolic dysfunction, belonging to NYHA class III and IV. Cox regression was used to model mortality considering age, ejection fraction, serum creatinine, serum sodium, anemia, platelets, creatinine phosphokinase, blood pressure, gender, diabetes and smoking status as potential contributors to mortality. A Kaplan-Meier plot was used to study the general pattern of survival, which showed a high intensity of mortality in the initial days and then a gradual increase up to the end of the study. Martingale residuals were used to assess the functional form of variables. Results were validated by computing the calibration slope and the discrimination ability of the model via bootstrapping. For graphical prediction of survival probability, a nomogram was constructed. Age, renal dysfunction, blood pressure, ejection fraction and anemia were found to be significant risk factors for mortality among heart failure patients.

  1. Survival analysis of heart failure patients: A case study.

    Science.gov (United States)

    Ahmad, Tanvir; Munir, Assia; Bhatti, Sajjad Haider; Aftab, Muhammad; Raza, Muhammad Ali

    2017-01-01

    This study was focused on survival analysis of heart failure patients who were admitted to the Institute of Cardiology and Allied Hospital, Faisalabad, Pakistan, during April-December (2015). All the patients were aged 40 years or above, having left ventricular systolic dysfunction, belonging to NYHA class III and IV. Cox regression was used to model mortality considering age, ejection fraction, serum creatinine, serum sodium, anemia, platelets, creatinine phosphokinase, blood pressure, gender, diabetes and smoking status as potential contributors to mortality. A Kaplan-Meier plot was used to study the general pattern of survival, which showed a high intensity of mortality in the initial days and then a gradual increase up to the end of the study. Martingale residuals were used to assess the functional form of variables. Results were validated by computing the calibration slope and the discrimination ability of the model via bootstrapping. For graphical prediction of survival probability, a nomogram was constructed. Age, renal dysfunction, blood pressure, ejection fraction and anemia were found to be significant risk factors for mortality among heart failure patients.
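
    The martingale-residual check of functional form mentioned above can be sketched as follows; lifelines and matplotlib are assumed, and the file and column names are placeholders for the study's variables.

      # Sketch: martingale residuals to assess the functional form of a continuous covariate
      # (here, ejection fraction) in a Cox model. Column names are hypothetical placeholders.
      import pandas as pd
      import matplotlib.pyplot as plt
      from lifelines import CoxPHFitter

      train = pd.read_csv("heart_failure.csv")[
          ["time", "death", "age", "ejection_fraction", "serum_creatinine"]
      ]
      cph = CoxPHFitter().fit(train, duration_col="time", event_col="death")

      resid = cph.compute_residuals(train, kind="martingale")

      # A smoothed residual-vs-covariate plot that is roughly flat supports a linear term;
      # curvature suggests transforming or categorising the covariate.
      plt.scatter(train.loc[resid.index, "ejection_fraction"], resid.values.ravel(), s=10)
      plt.xlabel("ejection fraction")
      plt.ylabel("martingale residual")
      plt.show()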

  2. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  3. Multilevel survival analysis of health inequalities in life expectancy

    Directory of Open Access Journals (Sweden)

    Merlo Juan

    2009-08-01

    Background The health status of individuals is determined by multiple factors operating at both micro and macro levels and their interactive effects. Measures of health inequalities should reflect such determinants explicitly through sources of levels, combining mean differences at group levels and the variation of individuals, for the benefit of decision making and intervention planning. Measures derived recently from marginal models, such as beta-binomial and frailty survival models, address this issue to some extent, but are limited in handling data with complex structures. Beta-binomial models are also limited in relation to measuring inequalities of life expectancy (LE) directly. Methods We propose a multilevel survival model analysis that estimates life expectancy based on survival time with censored data. The model explicitly disentangles total health inequalities in terms of variance components of life expectancy compared to the source of variation at the level of individuals in households and parishes and so on, and estimates group differences of inequalities at the same time. Adjusted distributions of life expectancy by gender and by household socioeconomic level are calculated. Relative and absolute health inequality indices are derived based on model estimates. The model-based analysis is illustrated on a large Swedish cohort of 22,680 men and 26,474 women aged 65–69 in 1970 and followed up for 30 years. Model-based inequality measures are compared to the conventional calculations. Results Much variation of life expectancy is observed at individual and household levels. Contextual effects at the Parish and Municipality level are negligible. Women have longer life expectancy than men and lower inequality. There is marked inequality by the level of household socioeconomic status, measured by the median life expectancy in each socio-economic group and the variation in life expectancy within each group. Conclusion Multilevel

  4. On the analysis of clonogenic survival data: Statistical alternatives to the linear-quadratic model

    International Nuclear Information System (INIS)

    Unkel, Steffen; Belka, Claus; Lauber, Kirsten

    2016-01-01

    [...] the extraction of scores of radioresistance, which displayed significant correlations with the estimated parameters of the regression models. Undoubtedly, LQ regression is a robust method for the analysis of clonogenic survival data. Nevertheless, alternative approaches, including non-linear regression and multivariate techniques such as cluster analysis and principal component analysis, represent versatile tools for the extraction of parameters and/or scores of the cellular response towards ionizing irradiation with a more intuitive biological interpretation. The latter are highly informative for correlation analyses with other types of data, including functional genomics data that are increasingly being generated.
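
    As a concrete anchor for the discussion, a minimal non-linear least-squares fit of the linear-quadratic model S(D) = exp(-(alpha*D + beta*D^2)) to clonogenic surviving fractions might look like the sketch below; the dose points and surviving fractions are invented for illustration.

      # Sketch: non-linear least-squares fit of the linear-quadratic (LQ) model
      # S(D) = exp(-(alpha*D + beta*D^2)) to clonogenic surviving fractions (invented data).
      import numpy as np
      from scipy.optimize import curve_fit

      def lq_model(dose, alpha, beta):
          return np.exp(-(alpha * dose + beta * dose ** 2))

      dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
      surv = np.array([1.0, 0.78, 0.55, 0.22, 0.07, 0.018])

      (alpha, beta), _ = curve_fit(lq_model, dose, surv, p0=(0.2, 0.02))
      print(f"alpha = {alpha:.3f} Gy^-1, beta = {beta:.4f} Gy^-2, alpha/beta = {alpha / beta:.1f} Gy")

      # A common summary score, the surviving fraction at 2 Gy (SF2):
      print("SF2 =", lq_model(2.0, alpha, beta))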

  5. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    A proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental application that has been carried out at JAERI Takasaki. (author)

  6. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and the extension of the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)

  7. The analysis of survival data in nephrology: basic concepts and methods of Cox regression

    NARCIS (Netherlands)

    van Dijk, Paul C.; Jager, Kitty J.; Zwinderman, Aeilko H.; Zoccali, Carmine; Dekker, Friedo W.

    2008-01-01

    How much does the survival of one group differ from the survival of another group? How do differences in age in these two groups affect such a comparison? To obtain a quantity to compare the survival of different patient groups and to account for confounding effects, a multiple regression technique

  8. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus together with several aspects regarding the analysis and terminology used in the determination of this element are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely; segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA) is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the detection instrumental technique used with the aim to facilitate their study and obtain an overall scope. Finally, the analytical characteristics of numerous flow-methods reported in the literature are provided in the form of a table and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soils leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soils extracts and cyanobacterial biofilms are tabulated.

  9. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  10. Volumetric and MGMT parameters in glioblastoma patients: Survival analysis

    International Nuclear Information System (INIS)

    Iliadis, Georgios; Kotoula, Vassiliki; Chatzisotiriou, Athanasios; Televantou, Despina; Eleftheraki, Anastasia G; Lambaki, Sofia; Misailidou, Despina; Selviaridis, Panagiotis; Fountzilas, George

    2012-01-01

    In this study several tumor-related volumes were assessed by means of a computer-based application and a survival analysis was conducted to evaluate the prognostic significance of pre- and postoperative volumetric data in patients harboring glioblastomas. In addition, MGMT (O6-methylguanine methyltransferase)-related parameters were compared with those of volumetry in order to observe possible relevance of this molecule in tumor development. We prospectively analyzed 65 patients suffering from glioblastoma (GBM) who underwent radiotherapy with concomitant adjuvant temozolomide. For the purpose of volumetry, T1- and T2-weighted magnetic resonance (MR) sequences were used, acquired both pre- and postoperatively (pre-radiochemotherapy). The volumes measured on preoperative MR images were necrosis, enhancing tumor and edema (including the tumor) and, on postoperative ones, net-enhancing tumor. Age, sex, performance status (PS) and type of operation were also included in the multivariate analysis. MGMT was assessed for promoter methylation with Multiplex Ligation-dependent Probe Amplification (MLPA), for RNA expression with real-time PCR, and for protein expression with immunohistochemistry in a total of 44 cases with available histologic material. In the multivariate analysis a negative impact was shown for pre-radiochemotherapy net-enhancing tumor on the overall survival (OS) (p = 0.023) and for preoperative necrosis on progression-free survival (PFS) (p = 0.030). Furthermore, the multivariate analysis confirmed the importance of PS in PFS and OS of patients. MGMT promoter methylation was observed in 13/23 (43.5%) evaluable tumors; complete methylation was observed in 3/13 methylated tumors only. A high rate of MGMT protein positivity (> 20% positive neoplastic nuclei) was inversely associated with pre-operative tumor necrosis (p = 0.021). Our findings indicate that volumetric parameters may have a significant role in the prognosis of GBM patients. Furthermore

  11. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis

  12. Noninvasive embryo assessment technique based on buoyancy and its association with embryo survival after cryopreservation.

    Science.gov (United States)

    Wessels, Cara; Penrose, Lindsay; Ahmad, Khaliq; Prien, Samuel

    2017-11-01

    Embryo cryopreservation offers many benefits by allowing genetic preservation, genetic screening, cost reduction, global embryo transport and single embryo transfer. However, freezing of embryos decreases embryo viability, as intracellular ice crystal formation often damages embryos. Success rates of frozen embryo transfer are expected to be 15-20% less than fresh embryo transfer. We have developed a noninvasive embryo assessment technique (NEAT) which enables us to predict embryo viability based on buoyancy. The purpose of this research was twofold: first, to determine whether NEAT, through a specific gravity device, can detect embryo survival of cryopreservation; second, to relate embryo buoyancy to embryo viability for establishing pregnancies in sheep. Descent times were measured for 169 mouse blastocysts before cryopreservation according to standard protocol, and post-thaw descent times were measured again. There was a significant difference in post-thaw descent times measured with NEAT between those blastocysts which demonstrated viability and those that did not. Further studies in a larger-scale commercial setting will evaluate the efficacy of NEAT. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Survival analysis and classification methods for forest fire size.

    Science.gov (United States)

    Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances.
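    The model comparison described above, a Cox proportional hazards model against accelerated failure time alternatives with fire size playing the role of the "time" variable, can be sketched as follows. The data file, column names and use of the lifelines library are illustrative assumptions, and categorical covariates would normally be dummy-coded rather than passed as numeric codes.

```python
# Sketch: Cox proportional hazards vs. a Weibull accelerated failure time (AFT)
# model on right-censored fire-growth data. Column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter, WeibullAFTFitter

fires = pd.read_csv("fires.csv")   # hypothetical: one row per fire that grew
cols = ["growth_ha", "held", "fire_weather_index",
        "fuel_type_code", "attack_method_code"]   # codes used for brevity only

cox = CoxPHFitter().fit(fires[cols], duration_col="growth_ha", event_col="held")
aft = WeibullAFTFitter().fit(fires[cols], duration_col="growth_ha", event_col="held")

# Rough comparison of discriminative ability (higher concordance is better).
print("Cox concordance:", cox.concordance_index_)
print("Weibull AFT concordance:", aft.concordance_index_)
```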

  14. Survival analysis and classification methods for forest fire size

    Science.gov (United States)

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at “being held” (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at “being held” exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances. PMID:29320497

  15. Survival analysis of dialysis patients in selected hospitals of lahore city

    International Nuclear Information System (INIS)

    Ahmad, Z.; Shahzad, I.

    2015-01-01

    Several factors directly or indirectly affect the survival time of End Stage Renal Disease (ESRD) patients. This study was done to analyse the survival rate of ESRD patients in Lahore city and to evaluate the influence of various risk and prognostic factors on the survival of these patients. Methods: A sample of 40 patients was taken from Jinnah Hospital Lahore and Lahore General Hospital using the convenience sampling technique. The Log Rank Test was used to determine whether there were significant differences between the categories of the qualitative variables, and Multivariate Cox Regression Analysis was used to analyse the effect of different clinical and socio-economic variables on the survival time of these patients. Results: Qualitative variables such as age, marital status, BMI, comorbid factors, diabetes type, gender, income level, place, and risk factors such as diabetes, ischemic heart disease, hypertension and hepatitis status were analysed with the Log Rank Test. Age and comorbid factors were found to be statistically significant, indicating that survival distributions differed across their categories. In the Cox Regression analysis, the coefficients of body mass, serum albumin and family history of diabetes were found to be significant. Conclusions: Of the factors included in the analysis, the clinical variables body mass, serum albumin and family history of diabetes made the most significant contribution to the survival status of ESRD patients. (author)
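    A minimal sketch of the two analysis steps reported above, a log-rank comparison between categories of a qualitative variable followed by a multivariate Cox regression, is given below using the lifelines library; the data file and column names are hypothetical.

```python
# Sketch: log-rank test between two groups, then a multivariate Cox regression.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

esrd = pd.read_csv("esrd_patients.csv")   # hypothetical file

# Log-rank test: do diabetic and non-diabetic patients have different survival?
diab = esrd[esrd["diabetes"] == 1]
nondiab = esrd[esrd["diabetes"] == 0]
result = logrank_test(diab["months"], nondiab["months"],
                      event_observed_A=diab["died"],
                      event_observed_B=nondiab["died"])
print("log-rank p-value:", result.p_value)

# Multivariate Cox regression on clinical and socio-economic covariates.
cph = CoxPHFitter()
cph.fit(esrd[["months", "died", "body_mass", "serum_albumin", "family_history_dm"]],
        duration_col="months", event_col="died")
cph.print_summary()
```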

  16. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement

  17. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  18. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques dominate this step. Metabolomic studies also require application of a proper analytical technique for the determination of endogenic metabolites present in biological matrices at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely with plant condition and between plants, and applying a single analysis technique to all plant conditions and plants limits the accuracy of the existing motion analysis. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
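    As a hedged illustration of the kind of model mentioned above, the sketch below trains a Random Forest regressor to predict armature response times from operating conditions. The feature names and data file are hypothetical and this is not MHI's system.

```python
# Sketch: Random Forest regression of CRDM armature closed-time on operating
# conditions, using scikit-learn. All column names are invented placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

data = pd.read_csv("crdm_operations.csv")          # hypothetical operational data
features = ["coil_current", "coolant_temperature", "plant_id", "rod_position"]
X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["closed_time_ms"], test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out operations:", model.score(X_test, y_test))
```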

  20. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, which allows the characteristics of road texture to be analysed and the pavement behaviour to be monitored. The second technique provides a dense 3D road model suitable for the estimation of road macro parameters.

  1. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    Science.gov (United States)

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. In a second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategy, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparison of the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare svm-based survival models based on ranking constraints, based on regression constraints and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models based only on ranking constraints. This work gives empirical evidence that svm-based models using regression constraints perform significantly better than svm-based models based on ranking constraints. Our experiments show a comparable performance for methods
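    The concordance index used above, both as a ranking objective and as an evaluation measure, can be illustrated with a plain NumPy implementation; this is a textbook version of Harrell's c-index for right-censored data, not the authors' code.

```python
# Sketch: Harrell's concordance index for right-censored survival data.
import numpy as np

def concordance_index(time, event, risk_score):
    """Fraction of comparable pairs whose risk ordering matches their survival ordering.
    A pair (i, j) is comparable if the subject with the shorter time had an event."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=bool)
    risk = np.asarray(risk_score, dtype=float)
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] and time[i] < time[j]:      # i failed before j was observed
                comparable += 1
                if risk[i] > risk[j]:               # higher risk -> earlier failure
                    concordant += 1
                elif risk[i] == risk[j]:            # ties count as half
                    concordant += 0.5
    return concordant / comparable

# Toy usage: a perfectly concordant risk score gives a c-index of 1.0
print(concordance_index([2, 5, 7, 10], [1, 1, 0, 1], [4.0, 3.0, 2.0, 1.0]))
```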

  2. An Approach to Addressing Selection Bias in Survival Analysis

    Science.gov (United States)

    Carlin, Caroline S.; Solid, Craig A.

    2014-01-01

    This work proposes a frailty model that accounts for non-random treatment assignment in survival analysis. Using Monte Carlo simulation, we found that estimated treatment parameters from our proposed endogenous selection survival model (esSurv) closely parallel the consistent two-stage residual inclusion (2SRI) results, while offering computational and interpretive advantages. The esSurv method greatly enhances computational speed relative to 2SRI by eliminating the need for bootstrapped standard errors, and generally results in smaller standard errors than those estimated by 2SRI. In addition, esSurv explicitly estimates the correlation of unobservable factors contributing to both treatment assignment and the outcome of interest, providing an interpretive advantage over the residual parameter estimate in the 2SRI method. Comparisons with commonly used propensity score methods and with a model that does not account for non-random treatment assignment show clear bias in these methods that is not mitigated by increased sample size. We illustrate using actual dialysis patient data comparing mortality of patients with mature arteriovenous grafts for venous access to mortality of patients with grafts placed but not yet ready for use at the initiation of dialysis. We find strong evidence of endogeneity (with estimate of correlation in unobserved factors ρ̂ = 0.55), and estimate a mature-graft hazard ratio of 0.197 in our proposed method, with a similar 0.173 hazard ratio using 2SRI. The 0.630 hazard ratio from a frailty model without a correction for the non-random nature of treatment assignment illustrates the importance of accounting for endogeneity. PMID:24845211
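    A minimal sketch of the 2SRI comparator discussed above (not of the proposed esSurv estimator) is given below: a first-stage treatment model produces a residual that is then included in a second-stage Cox model. The data set, instruments and variable names are hypothetical, and in practice the 2SRI standard errors would be bootstrapped.

```python
# Sketch: generic two-stage residual inclusion (2SRI) for a binary treatment
# and a survival outcome. All variable names are invented placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("dialysis.csv")      # hypothetical patient-level data
instruments = ["facility_graft_rate", "distance_to_surgeon"]   # assumed instruments
confounders = ["age", "diabetes"]

# Stage 1: model treatment assignment and keep the raw residual.
stage1 = LogisticRegression(max_iter=1000).fit(
    df[instruments + confounders], df["mature_graft"])
df["treatment_residual"] = (
    df["mature_graft"] - stage1.predict_proba(df[instruments + confounders])[:, 1])

# Stage 2: survival model including the first-stage residual to absorb
# unobserved factors correlated with both treatment and mortality.
cph = CoxPHFitter()
cph.fit(df[["followup_days", "death", "mature_graft", "treatment_residual"] + confounders],
        duration_col="followup_days", event_col="death")
cph.print_summary()   # in 2SRI, standard errors are usually bootstrapped
```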

  3. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received deep attention from lithographers. A new approach for illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change does not change the shape of the customized illumination mode.

  4. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and now predictions from this type of analysis are important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, and eliminates logic errors and errors of oversight in this part of the analysis. Automated analysis then affords the analyst a powerful tool to allow his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.
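    As a minimal illustration of the basic computation behind fault tree analysis, the sketch below propagates basic-event failure probabilities through AND/OR gates under an independence assumption; the example system and probabilities are invented.

```python
# Sketch: evaluating a tiny fault tree with independent basic events.
def and_gate(*probabilities):
    """All inputs must fail: multiply the failure probabilities."""
    p = 1.0
    for prob in probabilities:
        p *= prob
    return p

def or_gate(*probabilities):
    """Any input failing is enough: 1 minus the product of survival probabilities."""
    p = 1.0
    for prob in probabilities:
        p *= (1.0 - prob)
    return 1.0 - p

# Hypothetical system: cooling fails if power is lost OR both redundant pumps fail.
p_power_loss = 1e-3
p_pump_a = 5e-2
p_pump_b = 5e-2
p_top = or_gate(p_power_loss, and_gate(p_pump_a, p_pump_b))
print(f"Top event probability: {p_top:.2e}")   # ~3.5e-3
```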

  5. Mechanisms and mediation in survival analysis: towards an integrated analytical framework.

    Science.gov (United States)

    Pratschke, Jonathan; Haase, Trutz; Comber, Harry; Sharp, Linda; de Camargo Cancela, Marianna; Johnson, Howard

    2016-02-29

    A wide-ranging debate has taken place in recent years on mediation analysis and causal modelling, raising profound theoretical, philosophical and methodological questions. The authors build on the results of these discussions to work towards an integrated approach to the analysis of research questions that situate survival outcomes in relation to complex causal pathways with multiple mediators. The background to this contribution is the increasingly urgent need for policy-relevant research on the nature of inequalities in health and healthcare. The authors begin by summarising debates on causal inference, mediated effects and statistical models, showing that these three strands of research have powerful synergies. They review a range of approaches which seek to extend existing survival models to obtain valid estimates of mediation effects. They then argue for an alternative strategy, which involves integrating survival outcomes within Structural Equation Models via the discrete-time survival model. This approach can provide an integrated framework for studying mediation effects in relation to survival outcomes, an issue of great relevance in applied health research. The authors provide an example of how these techniques can be used to explore whether the social class position of patients has a significant indirect effect on the hazard of death from colon cancer. The results suggest that the indirect effects of social class on survival are substantial and negative (-0.23 overall). In addition to the substantial direct effect of this variable (-0.60), its indirect effects account for more than one quarter of the total effect. The two main pathways for this indirect effect, via emergency admission (-0.12), on the one hand, and hospital caseload, on the other, (-0.10) are of similar size. The discrete-time survival model provides an attractive way of integrating time-to-event data within the field of Structural Equation Modelling. The authors demonstrate the efficacy
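    The discrete-time survival model advocated above can be sketched as a logistic regression on a person-period data set, as below; in the full framework this equation would be embedded in a Structural Equation Model, and the expansion step, file and column names here are illustrative assumptions.

```python
# Sketch: discrete-time survival fitted as a logistic regression on
# person-period data. Not the authors' SEM-based estimator.
import pandas as pd
import statsmodels.formula.api as smf

patients = pd.read_csv("colon_cancer.csv")   # hypothetical: one row per patient

# Expand to one row per patient per follow-up interval ("person-period" format).
rows = []
for _, p in patients.iterrows():
    for t in range(1, int(p["followup_intervals"]) + 1):
        last = t == int(p["followup_intervals"])
        rows.append({
            "interval": t,
            "death": int(last and p["died"] == 1),   # event only in the final interval
            "social_class": p["social_class"],
            "emergency_admission": p["emergency_admission"],
            "hospital_caseload": p["hospital_caseload"],
        })
person_period = pd.DataFrame(rows)

# Discrete-time hazard: logistic regression with a baseline hazard per interval.
model = smf.logit(
    "death ~ C(interval) + social_class + emergency_admission + hospital_caseload",
    data=person_period,
).fit()
print(model.summary())
```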

  6. Mechanisms and mediation in survival analysis: towards an integrated analytical framework

    Directory of Open Access Journals (Sweden)

    Jonathan Pratschke

    2016-02-01

    Full Text Available Abstract Background A wide-ranging debate has taken place in recent years on mediation analysis and causal modelling, raising profound theoretical, philosophical and methodological questions. The authors build on the results of these discussions to work towards an integrated approach to the analysis of research questions that situate survival outcomes in relation to complex causal pathways with multiple mediators. The background to this contribution is the increasingly urgent need for policy-relevant research on the nature of inequalities in health and healthcare. Methods The authors begin by summarising debates on causal inference, mediated effects and statistical models, showing that these three strands of research have powerful synergies. They review a range of approaches which seek to extend existing survival models to obtain valid estimates of mediation effects. They then argue for an alternative strategy, which involves integrating survival outcomes within Structural Equation Models via the discrete-time survival model. This approach can provide an integrated framework for studying mediation effects in relation to survival outcomes, an issue of great relevance in applied health research. The authors provide an example of how these techniques can be used to explore whether the social class position of patients has a significant indirect effect on the hazard of death from colon cancer. Results The results suggest that the indirect effects of social class on survival are substantial and negative (-0.23 overall). In addition to the substantial direct effect of this variable (-0.60), its indirect effects account for more than one quarter of the total effect. The two main pathways for this indirect effect, via emergency admission (-0.12), on the one hand, and hospital caseload, on the other, (-0.10) are of similar size. Conclusions The discrete-time survival model provides an attractive way of integrating time-to-event data within the field of

  7. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques

  8. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  9. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain, and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects the research group has under way, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  10. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions where binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  11. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  12. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  13. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  14. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electron bombardment of atoms or molecules (gas ion source) and thermal ionization (thermoionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is 10^5 to 10^6 times greater. This shows that almost the entire sample is not needed for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique referred to as ''Microfluorination'' corrects this anomaly and exploits the advantages of the electron bombardment method of ionization

  15. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  16. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  17. Development of known-fate survival monitoring techniques for juvenile wild pigs (Sus scrofa)

    Science.gov (United States)

    David A. Keiter; John C. Kilgo; Mark A. Vukovich; Fred L. Cunningham; James C. Beasley

    2017-01-01

    Context. Wild pigs are an invasive species linked to numerous negative impacts on natural and anthropogenic ecosystems in many regions of the world. Robust estimates of juvenile wild pig survival are needed to improve population dynamics models to facilitate management of this economically and ecologically...

  18. Study of Hip Fracture Risk using Tree Structured Survival Analysis

    Directory of Open Access Journals (Sweden)

    Lu Y

    2003-01-01

    Full Text Available In this study, the risk of hip fracture in postmenopausal women is examined by classifying the women into subgroups with respect to this risk. Women in the same subgroup have a similar risk, whereas women in different subgroups have different hip fracture risks. The subgroups were derived by means of Tree Structured Survival Analysis (TSSA) from the data of 7,665 women in the SOF (Study of Osteoporotic Fractures). Bone mineral density (BMD) of the forearm, femoral neck, hip and spine was measured in all study participants. The time from the BMD measurement to hip fracture was recorded as the endpoint. A sample of 75% of the participants was used to build the prognostic subgroups (training data set), while the remaining 25% served to validate the results (validation data set). From the training data set, TSSA identified 4 subgroups whose hip fracture risks, over a mean follow-up of 6.5 years, were 19%, 9%, 4% and 1%. Assignment to the subgroups was based on the BMD of Ward's triangle and of the femoral neck, and on age. These results were reproduced with the validation data set, which confirmed the usefulness of the classification rules in a clinical setting. TSSA thus allowed a meaningful, informative and reproducible identification of prognostic subgroups based on age and BMD values.
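    Once subgroup membership has been assigned by rules of this kind, the subgroups can be compared with Kaplan-Meier curves, as in the sketch below using the lifelines library; the data file, column names and subgroup labels are hypothetical, and this is not the original TSSA implementation.

```python
# Sketch: one Kaplan-Meier fracture-free survival curve per risk subgroup.
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

sof = pd.read_csv("sof_subgroups.csv")   # hypothetical: years, fracture, subgroup

ax = plt.subplot(111)
for label, grp in sof.groupby("subgroup"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["years"], event_observed=grp["fracture"], label=f"subgroup {label}")
    kmf.plot_survival_function(ax=ax)
plt.xlabel("Years since BMD measurement")
plt.ylabel("Hip-fracture-free survival")
plt.show()
```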

  19. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. A simpler approach was achieved by using the MathCAD program as the programming tool, which was nevertheless powerful enough for the required calculation, plotting and file transfer. (Author)
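    A naive numerical version of the inverse Abel transform used in this kind of interferogram analysis is sketched below; it assumes a radially symmetric projection sampled on a uniform grid and handles the integrable singularity only crudely, so it is an illustration rather than a production implementation.

```python
# Sketch: naive inverse Abel transform by direct quadrature.
import numpy as np

def inverse_abel(F, dy):
    """f(r) = -(1/pi) * integral_r^R  F'(y) / sqrt(y^2 - r^2) dy.

    F  : projection values at y = 0, dy, 2*dy, ...
    dy : sample spacing
    The singular point y = r is simply skipped, which is adequate for a sketch.
    """
    y = np.arange(len(F)) * dy
    dFdy = np.gradient(F, dy)
    f = np.zeros_like(y)
    for i in range(len(F)):
        j = np.arange(i + 1, len(F))
        if j.size == 0:
            continue
        integrand = dFdy[j] / np.sqrt(y[j] ** 2 - y[i] ** 2)
        f[i] = -np.trapz(integrand, y[j]) / np.pi
    return f

# Toy check: the projection of f(r) = 1 for r < 1 is F(y) = 2*sqrt(1 - y^2),
# so the inversion should return values close to 1 away from the edge.
y = np.linspace(0, 1, 200)
F = 2.0 * np.sqrt(np.clip(1.0 - y ** 2, 0.0, None))
print(inverse_abel(F, y[1] - y[0])[:5])
```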

  20. Causal Mediation Analysis of Survival Outcome with Multiple Mediators.

    Science.gov (United States)

    Huang, Yen-Tsung; Yang, Hwai-I

    2017-05-01

    Mediation analyses have been a popular approach to investigate the effect of an exposure on an outcome through a mediator. Mediation models with multiple mediators have been proposed for continuous and dichotomous outcomes. However, development of multimediator models for survival outcomes is still limited. We present methods for multimediator analyses using three survival models: Aalen additive hazard models, Cox proportional hazard models, and semiparametric probit models. Effects through mediators can be characterized by path-specific effects, for which definitions and identifiability assumptions are provided. We derive closed-form expressions for path-specific effects for the three models, which are intuitively interpreted using a causal diagram. Mediation analyses using Cox models under the rare-outcome assumption and Aalen additive hazard models consider effects on log hazard ratio and hazard difference, respectively; analyses using semiparametric probit models consider effects on difference in transformed survival time and survival probability. The three models were applied to a hepatitis study where we investigated effects of hepatitis C on liver cancer incidence mediated through baseline and/or follow-up hepatitis B viral load. The three methods show consistent results on respective effect scales, which suggest an adverse estimated effect of hepatitis C on liver cancer not mediated through hepatitis B, and a protective estimated effect mediated through the baseline (and possibly follow-up) of hepatitis B viral load. Causal mediation analyses of survival outcome with multiple mediators are developed for additive hazard and proportional hazard and probit models with utility demonstrated in a hepatitis study.

  1. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  2. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to increase production quality is improved plant equipment monitoring capability. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operations of the movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude and angle-demodulation circuitry has permitted remote status monitoring of several types of medium and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized
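    The amplitude-demodulation step underlying current signature analysis can be illustrated with a synthetic example: a line-frequency current amplitude-modulated by a slow load fluctuation, with the envelope recovered via the Hilbert transform. The signal parameters are invented, and this is a generic envelope detector rather than the patented ORNL method.

```python
# Sketch: amplitude demodulation of a synthetic motor current signal.
import numpy as np
from scipy.signal import hilbert

fs = 10_000                                # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
load_modulation = 1.0 + 0.05 * np.sin(2 * np.pi * 7 * t)   # 7 Hz load variation
current = load_modulation * np.sin(2 * np.pi * 60 * t)     # modulated line current

# Analytic signal -> instantaneous amplitude (the demodulated load signature).
envelope = np.abs(hilbert(current))
print("recovered modulation depth ~", (envelope.max() - envelope.min()) / 2)  # ~0.05
```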

  3. Bayesian Analysis for EMP Survival Probability of Solid State Relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    The principle of estimating the parameter p of a binomial distribution by the Bayesian method, together with several non-informative priors, is introduced. The survival probability of a DC solid state relay under current injection at a certain amplitude is obtained by this method. (authors)
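    A minimal sketch of this kind of Bayesian estimate, using a conjugate Beta prior for the binomial survival probability, is given below; the Jeffreys prior Beta(1/2, 1/2) is one common non-informative choice, and the test counts are invented.

```python
# Sketch: Beta-Binomial posterior for a survival probability p with a
# Jeffreys prior. After s survivals in n trials the posterior is
# Beta(s + 1/2, n - s + 1/2).
from scipy.stats import beta

n, s = 20, 18                     # hypothetical: 18 of 20 relays survive the test
posterior = beta(s + 0.5, n - s + 0.5)

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
print("P(p > 0.8):", 1 - posterior.cdf(0.8))
```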

  4. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U. S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  5. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in the risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at a block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis

  6. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  7. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes: the radio-immuno-analysis and the auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters ( 125 I, 57 Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodation, tritium); purification of the labelled compound (dialysis, gel-filtering or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  8. Survival analysis approach to account for non-exponential decay rate effects in lifetime experiments

    Energy Technology Data Exchange (ETDEWEB)

    Coakley, K.J., E-mail: kevincoakley@nist.gov [National Institute of Standards and Technology, 325 Broadway, Boulder, CO 80305 (United States); Dewey, M.S.; Huber, M.G. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States); Huffer, C.R.; Huffman, P.R. [North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Marley, D.E. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States); North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Mumm, H.P. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States); O' Shaughnessy, C.M. [University of North Carolina at Chapel Hill, 120 E. Cameron Ave., CB #3255, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Schelhammer, K.W. [North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Thompson, A.K.; Yue, A.T. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States)

    2016-03-21

    In experiments that measure the lifetime of trapped particles, in addition to loss mechanisms with exponential survival probability functions, particles can be lost by mechanisms with non-exponential survival probability functions. Failure to account for such loss mechanisms produces systematic measurement error and associated systematic uncertainties in these measurements. In this work, we develop a general competing risks survival analysis method to account for the joint effect of loss mechanisms with either exponential or non-exponential survival probability functions, and a method to quantify the size of systematic effects and associated uncertainties for lifetime estimates. As a case study, we apply our survival analysis formalism and method to the Ultra Cold Neutron lifetime experiment at NIST. In this experiment, neutrons can escape a magnetic trap before they decay due to a wall loss mechanism with an associated non-exponential survival probability function.
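
    The joint effect described above can be illustrated numerically. The following sketch is not the authors' code: it assumes an exponential beta-decay channel with an invented lifetime and an invented Weibull-type wall-loss channel, and shows how ignoring the non-exponential channel biases a naive exponential lifetime fit.

        import numpy as np

        # Two independent competing loss mechanisms: exponential beta decay plus a
        # hypothetical non-exponential (Weibull-like) wall-loss channel. All numbers
        # are assumptions for illustration only.
        tau_beta = 880.0                               # assumed beta-decay lifetime, s
        t = np.linspace(0.0, 3000.0, 301)              # observation times, s

        S_beta = np.exp(-t / tau_beta)                 # exponential survival function
        S_wall = np.exp(-(t / 2500.0) ** 1.5)          # non-exponential survival function

        # For independent competing risks, the joint survival probability is the product.
        S_total = S_beta * S_wall

        # Fitting a single exponential to the joint curve biases the estimated lifetime.
        slope = np.polyfit(t, np.log(S_total), 1)[0]
        print("apparent lifetime from a pure-exponential fit: %.1f s" % (-1.0 / slope))
        print("beta-decay lifetime assumed in this sketch:    %.1f s" % tau_beta)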

  9. Survival analysis approach to account for non-exponential decay rate effects in lifetime experiments

    International Nuclear Information System (INIS)

    Coakley, K.J.; Dewey, M.S.; Huber, M.G.; Huffer, C.R.; Huffman, P.R.; Marley, D.E.; Mumm, H.P.; O'Shaughnessy, C.M.; Schelhammer, K.W.; Thompson, A.K.; Yue, A.T.

    2016-01-01

    In experiments that measure the lifetime of trapped particles, in addition to loss mechanisms with exponential survival probability functions, particles can be lost by mechanisms with non-exponential survival probability functions. Failure to account for such loss mechanisms produces systematic measurement error and associated systematic uncertainties in these measurements. In this work, we develop a general competing risks survival analysis method to account for the joint effect of loss mechanisms with either exponential or non-exponential survival probability functions, and a method to quantify the size of systematic effects and associated uncertainties for lifetime estimates. As a case study, we apply our survival analysis formalism and method to the Ultra Cold Neutron lifetime experiment at NIST. In this experiment, neutrons can escape a magnetic trap before they decay due to a wall loss mechanism with an associated non-exponential survival probability function.

  10. Survival of irradiated glia and glioma cells studied with a new cloning technique

    International Nuclear Information System (INIS)

    Nilsson, S.; Carlsson, J.; Larsson, B.; Ponten, J.

    1980-01-01

    A method allowing cloning of monolayer cultured cells with a low plating efficiency was developed. Cells were grown in several small palladium squares to obtain a high cell density. These squares were surrounded by non-adhesive agarose to prevent long-distance migration and thereby mixing of the clones. By using easily cloned hamster cells for comparison, it was found that the survival curves were similar to the curves obtained with conventional cloning. The new method was used to compare the radiosensitivity of cultured human glia and glioma cells, which both have a low plating efficiency. The survival curves for the glioma cells had D0-values of 1.5 to 2.5 Gy and large shoulders (extrapolation numbers around 5), indicating that they were rather resistant and had a high capacity for accumulation of sublethal damage. The survival curves for glia cells had lower D0-values (1.3 to 1.5 Gy) and no shoulders at all, indicating that they were more sensitive than the glioma cells. (author)
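
    The D0-values and extrapolation numbers quoted above describe fitted survival curves; one common parameterization (an assumption here, not necessarily the model used by the authors) is the multi-target single-hit model, sketched below with the shoulder controlled by the extrapolation number n.

        import numpy as np

        def multitarget_survival(dose, d0, n):
            """Multi-target single-hit model: S(D) = 1 - (1 - exp(-D/D0))**n.

            d0 sets the slope of the exponential tail; the extrapolation number n
            sets the size of the shoulder (n = 1 gives a pure exponential curve).
            """
            return 1.0 - (1.0 - np.exp(-dose / d0)) ** n

        doses = np.arange(0.0, 10.5, 0.5)                      # Gy
        glioma = multitarget_survival(doses, d0=2.0, n=5.0)    # assumed values in the quoted range
        glia = multitarget_survival(doses, d0=1.4, n=1.0)      # no shoulder, more sensitive

        for d, s_glioma, s_glia in zip(doses, glioma, glia):
            print(f"{d:4.1f} Gy   glioma {s_glioma:.3f}   glia {s_glia:.3f}")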

  11. Integrated survival analysis using an event-time approach in a Bayesian framework.

    Science.gov (United States)

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest for chicks with lower birth weights and/or whose nests were within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.
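
    The piece-wise constant hazard function at the core of the analysis can be sketched in a few lines. The intervals and hazard values below are invented for illustration; the survival probability is the exponential of minus the cumulative hazard.

        import numpy as np

        breaks = np.array([0.0, 7.0, 14.0, 28.0])   # assumed age intervals, days
        hazards = np.array([0.06, 0.03, 0.01])      # assumed daily mortality hazards

        def cumulative_hazard(t):
            """Integrate the step-function hazard from age 0 to age t."""
            H = 0.0
            for lo, hi, h in zip(breaks[:-1], breaks[1:], hazards):
                H += h * np.clip(t - lo, 0.0, hi - lo)
            return H + hazards[-1] * max(t - breaks[-1], 0.0)  # extend the last hazard

        for t in (5, 10, 20, 30):
            print(f"day {t:2d}: survival S(t) = {np.exp(-cumulative_hazard(t)):.3f}")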

  12. Integrated survival analysis using an event-time approach in a Bayesian framework

    Science.gov (United States)

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest for chicks with lower birth weights and/or whose nests were within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.

  13. Influence of conformal radiotherapy technique on survival after chemoradiotherapy for patients with stage III non-small cell lung cancer in the National Cancer Data Base.

    Science.gov (United States)

    Sher, David J; Koshy, Matthew; Liptay, Michael J; Fidler, Mary Jo

    2014-07-01

    Definitive chemoradiotherapy is a core treatment modality for patients with stage III non-small cell lung cancer (NSCLC). Although radiotherapy (RT) technologies have advanced dramatically, to the authors' knowledge relatively little is known regarding the importance of irradiation technique on outcome, particularly given the competing risk of distant metastasis. The National Cancer Data Base was used to determine predictors of overall survival (OS) in patients with AJCC stage III NSCLC who were treated with chemoradiotherapy, focusing on the importance of conformal RT (CRT). Patients with stage III NSCLC who were treated with chemoradiotherapy between 2003 and 2005 in the National Cancer Data Base were included. RT technique was defined as conventional, 3-dimensional-conformal, or intensity-modulated RT (IMRT), the latter 2 combined as CRT. Cox proportional hazards regression was performed for univariable and multivariable analyses of OS. The median, 3-year, and 5-year survival outcomes for the 13,292 patients were 12.9 months, 19%, and 11%, respectively. The 3-year and 5-year survival probabilities of patients receiving CRT versus no CRT were 22% versus 19% and 14% versus 11%, respectively (P < .0001). On multivariable analysis, CRT was found to be significantly associated with improved OS (hazards ratio, 0.89). This effect was confirmed on sensitivity analyses, including restricting the cohort to minimum 6-month survivors, young patients with stage IIIA disease, and propensity score-matching. Institutional academic status and patient volume were not found to be associated with OS. CRT was found to be independently associated with a survival advantage. These results reflect the importance of optimal locoregional therapy in patients with stage III NSCLC and provide motivation for further study of advanced RT technologies in patients with NSCLC. © 2014 American Cancer Society.
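
    Multivariable Cox regression of the kind reported here can be reproduced in outline with the lifelines Python package. The sketch below is hedged: it uses the package's bundled Rossi recidivism demo data, not the National Cancer Data Base cohort, and simply shows where the hazard ratios come from.

        from lifelines import CoxPHFitter
        from lifelines.datasets import load_rossi   # bundled demo data, not the NCDB cohort

        df = load_rossi()                           # columns: week (time), arrest (event), covariates
        cph = CoxPHFitter()
        cph.fit(df, duration_col="week", event_col="arrest")
        cph.print_summary()                         # the exp(coef) column gives hazard ratios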

  14. A nonparametric approach to medical survival data: Uncertainty in the context of risk in mortality analysis

    International Nuclear Information System (INIS)

    Janurová, Kateřina; Briš, Radim

    2014-01-01

    Medical right-censored survival data from about 850 patients are evaluated to analyze the uncertainty related to the risk of mortality on the one hand, and to compare two basic surgery techniques in the context of risk of mortality on the other hand. The colorectal data come from patients who underwent colectomy in the University Hospital of Ostrava. Two basic operating techniques are used for the colectomy: either traditional (open) or minimally invasive (laparoscopic). A basic question arising for the colectomy operation is which type of operation to choose to guarantee a longer overall survival time. Two non-parametric approaches have been used to quantify the probability of mortality with uncertainties. In fact, the complement of this probability, i.e. the survival function with corresponding confidence levels, is calculated and evaluated. The first approach considers standard nonparametric estimators resulting from both the Kaplan–Meier estimator of the survival function, in connection with Greenwood's formula, and the Nelson–Aalen estimator of the cumulative hazard function, including a confidence interval for the survival function. The second, innovative approach, represented by Nonparametric Predictive Inference (NPI), uses lower and upper probabilities for quantifying uncertainty and provides a model of a predictive survival function instead of the population survival function. The traditional log-rank test on the one hand and the nonparametric predictive comparison of two groups of lifetime data on the other hand have been compared to evaluate the risk of mortality in the context of the mentioned surgery techniques. The size of the difference between the two groups of lifetime data has been considered and analyzed as well. Both nonparametric approaches led to the same conclusion: the minimally invasive operating technique guarantees the patient a significantly longer survival time in comparison with the traditional operating technique.
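
    The estimators named above are available off the shelf; the sketch below (with invented follow-up times, not the Ostrava colectomy data) shows the Kaplan–Meier estimate with Greenwood-type confidence bands, the Nelson–Aalen cumulative hazard, and a log-rank comparison of the two surgery groups, assuming the lifelines package.

        import numpy as np
        from lifelines import KaplanMeierFitter, NelsonAalenFitter
        from lifelines.statistics import logrank_test

        # Invented survival times in months; event = 1 means death observed, 0 censored.
        t_open = np.array([5, 12, 20, 34, 40, 58, 60])
        e_open = np.array([1, 1, 1, 1, 0, 1, 0])
        t_lap = np.array([8, 15, 25, 44, 52, 60, 60])
        e_lap = np.array([1, 0, 1, 0, 1, 0, 0])

        kmf = KaplanMeierFitter()
        kmf.fit(t_lap, e_lap, label="laparoscopic")
        print(kmf.survival_function_)       # Kaplan-Meier estimate
        print(kmf.confidence_interval_)     # Greenwood-type confidence bands

        naf = NelsonAalenFitter()
        naf.fit(t_open, e_open, label="open")
        print(naf.cumulative_hazard_)       # Nelson-Aalen cumulative hazard

        result = logrank_test(t_open, t_lap, event_observed_A=e_open, event_observed_B=e_lap)
        print("log-rank p-value:", result.p_value)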

  15. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques combined with fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermal acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
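
    The PSD step mentioned above can be sketched with scipy; the signal below is synthetic (a 120 Hz oscillation plus noise standing in for a flame luminosity trace), so all numbers are assumptions rather than burner data.

        import numpy as np
        from scipy.signal import welch

        fs = 1000.0                                    # assumed sampling rate, Hz
        t = np.arange(0, 5, 1 / fs)
        luminosity = np.sin(2 * np.pi * 120 * t) + 0.5 * np.random.randn(t.size)  # synthetic trace

        freqs, psd = welch(luminosity, fs=fs, nperseg=1024)   # Welch power spectral density
        print("dominant oscillation near %.1f Hz" % freqs[np.argmax(psd)])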

  16. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations of each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  17. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  18. Survival analysis using S analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...

  19. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics, and computer vision as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  20. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to determine the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness within grid cells, were found, and these can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open-channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D floodplain model should be developed by considering all of the hydrological and high-resolution topographic parameters discussed in this review, to enhance the understanding of the causes and effects of flooding.

  1. Analysis of survival data with dependent censoring copula-based approaches

    CERN Document Server

    Emura, Takeshi

    2018-01-01

    This book introduces readers to copula-based statistical methods for analyzing survival data involving dependent censoring. Primarily focusing on likelihood-based methods performed under copula models, it is the first book solely devoted to the problem of dependent censoring. The book demonstrates the advantages of the copula-based methods in the context of medical research, especially with regard to cancer patients’ survival data. Needless to say, the statistical methods presented here can also be applied to many other branches of science, especially in reliability, where survival analysis plays an important role. The book can be used as a textbook for graduate coursework or a short course aimed at (bio-) statisticians. To deepen readers’ understanding of copula-based approaches, the book provides an accessible introduction to basic survival analysis and explains the mathematical foundations of copula-based survival models.
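
    The dependent-censoring setting that the book addresses can be simulated directly; the sketch below (not taken from the book) draws failure and censoring times whose dependence is induced by a Clayton copula with an assumed dependence parameter.

        import numpy as np

        rng = np.random.default_rng(1)
        theta = 2.0                         # assumed Clayton dependence parameter (> 0)
        n = 5000

        u = rng.uniform(size=n)
        w = rng.uniform(size=n)
        # Conditional inversion for the Clayton copula: draw V given U.
        v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)

        failure = -np.log(u) / 0.10         # exponential failure times (assumed rate 0.10)
        censor = -np.log(v) / 0.05          # dependent exponential censoring times (assumed rate 0.05)
        observed = np.minimum(failure, censor)
        delta = (failure <= censor).astype(int)   # 1 = event observed, 0 = censored

        print("proportion of failures observed:", delta.mean())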

  2. Texture analysis for survival prediction of pancreatic ductal adenocarcinoma patients with neoadjuvant chemotherapy

    Science.gov (United States)

    Chakraborty, Jayasree; Langdon-Embry, Liana; Escalon, Joanna G.; Allen, Peter J.; Lowery, Maeve A.; O'Reilly, Eileen M.; Do, Richard K. G.; Simpson, Amber L.

    2016-03-01

    Pancreatic ductal adenocarcinoma (PDAC) is the fourth leading cause of cancer-related death in the United States. The five-year survival rate for all stages is approximately 6%, and approximately 2% when presenting with distant disease.1 Only 10-20% of all patients present with resectable disease, but recurrence rates are high with only 5 to 15% remaining free of disease at 5 years. At this time, we are unable to distinguish between resectable PDAC patients with occult metastatic disease from those with potentially curable disease. Early classification of these tumor types may eventually lead to changes in initial management including the use of neoadjuvant chemotherapy or radiation, or in the choice of postoperative adjuvant treatments. Texture analysis is an emerging methodology in oncologic imaging for quantitatively assessing tumor heterogeneity that could potentially aid in the stratification of these patients. The present study derives several texture-based features from CT images of PDAC patients, acquired prior to neoadjuvant chemotherapy, and analyzes their performance, individually as well as in combination, as prognostic markers. A fuzzy minimum redundancy maximum relevance method with leave-one-image-out technique is included to select discriminating features from the set of extracted features. With a naive Bayes classifier, the proposed method predicts the 5-year overall survival of PDAC patients prior to neoadjuvant therapy and achieves the best results in terms of the area under the receiver operating characteristic curve of 0.858 and accuracy of 83.0% with four-fold cross-validation techniques.
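
    Only the final classification step is easy to sketch without the images; the code below assumes scikit-learn, uses random numbers in place of the extracted CT texture features, and reports four-fold cross-validated AUC and accuracy with a naive Bayes classifier.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import StratifiedKFold, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 4))        # placeholder for selected texture features
        y = rng.integers(0, 2, size=60)     # placeholder 5-year survival labels

        cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)
        auc = cross_val_score(GaussianNB(), X, y, cv=cv, scoring="roc_auc")
        acc = cross_val_score(GaussianNB(), X, y, cv=cv, scoring="accuracy")
        print("4-fold AUC      %.3f +/- %.3f" % (auc.mean(), auc.std()))
        print("4-fold accuracy %.3f +/- %.3f" % (acc.mean(), acc.std()))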

  3. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. For the achievement of reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify some common deficiencies, to suggest desirable precautions in the analysis, and to relate advice for authors, readers, and editors. A total of 242 survival analysis articles were included for evaluation from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for the proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80%, the lowest being 75.25%, and 55 articles were 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks of the proportional hazards assumption, no report of testing for interactions and collinearity between independent variables, and no report of the sample size calculation method. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could make potentially inaccurate
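
    One of the checks the review found missing, a test of the proportional hazards assumption, takes only a few lines with the lifelines package; the sketch below runs it on the package's bundled demo data rather than any of the reviewed studies.

        from lifelines import CoxPHFitter
        from lifelines.datasets import load_rossi
        from lifelines.statistics import proportional_hazard_test

        df = load_rossi()                                        # bundled demo data
        cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

        result = proportional_hazard_test(cph, df, time_transform="rank")
        result.print_summary()      # small p-values flag covariates violating the PH assumption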

  4. Bayesian linear regression with skew-symmetric error distributions with applications to survival analysis

    KAUST Repository

    Rubio, Francisco J.; Genton, Marc G.

    2016-01-01

    are censored. The latter scenario is of interest in the context of accelerated failure time models, which are relevant in survival analysis. We present a simulation study that demonstrates good frequentist properties of the posterior credible intervals

  5. Acute Myeloid Leukemia: analysis of epidemiological profile and survival rate.

    Science.gov (United States)

    de Lima, Mariana Cardoso; da Silva, Denise Bousfield; Freund, Ana Paula Ferreira; Dacoregio, Juliana Shmitz; Costa, Tatiana El Jaick Bonifácio; Costa, Imaruí; Faraco, Daniel; Silva, Maurício Laerte

    2016-01-01

    To describe the epidemiological profile and the survival rate of patients with acute myeloid leukemia (AML) in a state reference pediatric hospital. Clinical-epidemiological, observational, retrospective, descriptive study. The study included new cases of patients with AML, diagnosed between 2004 and 2012, younger than 15 years. Of the 51 patients studied, 84% were white; 45% were females and 55%, males. Regarding age, 8% were younger than 1 year, 47% were aged between 1 and 10 years, and 45% were older than 10 years. The main signs/symptoms were fever (41.1%), asthenia/lack of appetite (35.2%), and hemorrhagic manifestations (27.4%). The most affected extra-medullary site was the central nervous system (14%). In 47% of patients, the white blood cell (WBC) count was below 10,000/mm³ at diagnosis. The minimal residual disease (MRD) was less than 0.1%, on the 15th day of treatment in 16% of the sample. Medullary relapse occurred in 14% of cases. When comparing the bone marrow MRD with the vital status, it was observed that 71.42% of the patients with type M3 AML were alive, as were 54.05% of those with non-M3 AML. The death rate was 43% and the main proximate cause was septic shock (63.6%). In this study, the majority of patients were male, white, and older than 1 year. Most patients with WBC count <10,000/mm³ at diagnosis lived. Overall survival was higher in patients with MRD <0.1%. The prognosis was better in patients with AML-M3. Copyright © 2016 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  6. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  7. Parametric and semiparametric models with applications to reliability, survival analysis, and quality of life

    CERN Document Server

    Nikulin, M; Mesbah, M; Limnios, N

    2004-01-01

    Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.

  8. Prognostic classification index in Iranian colorectal cancer patients: Survival tree analysis

    Directory of Open Access Journals (Sweden)

    Amal Saki Malehi

    2016-01-01

    Full Text Available Aims: The aim of this study was to determine a prognostic index for separating homogeneous subgroups of colorectal cancer (CRC) patients based on clinicopathological characteristics using survival tree analysis. Methods: The current study was conducted at the Research Center of Gastroenterology and Liver Disease, Shahid Beheshti Medical University in Tehran, between January 2004 and January 2009. A total of 739 patients who had already been diagnosed with CRC based on pathology reports were enrolled. The data included demographic and clinicopathological characteristics of the patients. Tree-structured survival analysis based on a recursive partitioning algorithm was implemented to evaluate prognostic factors. The probability curves were calculated according to the Kaplan-Meier method, and the hazard ratio was estimated as the effect size of interest. Results: There were 526 males (71.2%) among these patients. The mean survival time from diagnosis was 42.46 (±3.4). The survival tree identified three variables as the main prognostic factors, and based on these, four prognostic subgroups were constructed. The log-rank test showed good separation of the survival curves. Patients with stage I-IIIA disease treated with surgery as the first treatment showed low risk (median = 34 months), whereas patients with stage IIIB or IV disease and age over 68 years had the worst survival outcome (median = 9.5 months). Conclusion: Constructing a prognostic classification index via a survival tree can aid researchers in assessing interactions between clinical variables and determining the cumulative effect of these variables on survival outcome.
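
    The core step of the recursive partitioning used here is choosing, for each covariate, the cutpoint that best separates survival; the hedged sketch below (invented data, lifelines assumed for the log-rank statistic) searches candidate age cutpoints and keeps the one with the largest log-rank statistic.

        import numpy as np
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(3)
        age = rng.integers(30, 85, size=200)
        time = rng.exponential(scale=np.where(age > 68, 12.0, 40.0))   # synthetic months of follow-up
        event = rng.uniform(size=200) < 0.8                            # roughly 80% observed events

        best = None
        for cut in range(40, 80, 2):
            left, right = age <= cut, age > cut
            if left.sum() < 20 or right.sum() < 20:      # minimum node size
                continue
            stat = logrank_test(time[left], time[right],
                                event_observed_A=event[left],
                                event_observed_B=event[right]).test_statistic
            if best is None or stat > best[1]:
                best = (cut, stat)

        print("best split: age <= %d (log-rank statistic %.1f)" % best)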

  9. Breastfeeding practices in a public health field practice area in Sri Lanka: a survival analysis

    Directory of Open Access Journals (Sweden)

    Agampodi Thilini C

    2007-10-01

    Full Text Available Abstract Background Exclusive breastfeeding up to the completion of the sixth month of age is the national infant feeding recommendation for Sri Lanka. The objective of the present study was to collect data on exclusive breastfeeding up to six months and to describe the association between exclusive breastfeeding and selected socio-demographic factors. Methods A clinic-based cross-sectional study was conducted in the Medical Officer of Health area, Beruwala, Sri Lanka in June 2006. Mothers with infants aged 4 to 12 months attending the 19 child welfare clinics in the area were included in the study. Infants with specific feeding problems (cleft lip and palate, and primary lactose intolerance) were excluded. A cluster sampling technique was used and consecutive infants fulfilling the inclusion criteria were enrolled. A total of 219 mothers participated in the study. The statistical methods used were survival analysis (Kaplan-Meier survival curves) and the Cox proportional hazards model. Results All 219 mothers had initiated breastfeeding. The median duration of exclusive breastfeeding was four months (95% CI 3.75, 4.25). The rates of exclusive breastfeeding at 4 and 6 months were 61.6% (135/219) and 15.5% (24/155), respectively. Bivariate analysis showed that Muslim ethnicity (p = 0.004), lower levels of parental education (p Conclusion The rate of breastfeeding initiation and exclusive breastfeeding up to the fourth month is very high in the Medical Officer of Health area, Beruwala, Sri Lanka. However, exclusive breastfeeding up to six months is still low and the prevalence of inappropriate feeding practices is high.

  10. Direct lexical control of eye movements in reading: Evidence from a survival analysis of fixation durations

    Science.gov (United States)

    Reingold, Eyal M.; Reichle, Erik D.; Glaholt, Mackenzie G.; Sheridan, Heather

    2013-01-01

    Participants’ eye movements were monitored in an experiment that manipulated the frequency of target words (high vs. low) as well as their availability for parafoveal processing during fixations on the pre-target word (valid vs. invalid preview). The influence of the word-frequency by preview validity manipulation on the distributions of first fixation duration was examined by using ex-Gaussian fitting as well as a novel survival analysis technique which provided precise estimates of the timing of the first discernible influence of word frequency on first fixation duration. Using this technique, we found a significant influence of word frequency on fixation duration in normal reading (valid preview) as early as 145 ms from the start of fixation. We also demonstrated an equally rapid non-lexical influence on first fixation duration as a function of initial landing position (location) on target words. The time-course of frequency effects, but not location effects was strongly influenced by preview validity, demonstrating the crucial role of parafoveal processing in enabling direct lexical control of reading fixation times. Implications for models of eye-movement control are discussed. PMID:22542804
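
    Ex-Gaussian fitting of fixation-duration distributions can be sketched with scipy's exponentially modified normal distribution; the durations below are simulated stand-ins for eye-tracking data, and the conversion from scipy's (K, loc, scale) parameters to the usual (mu, sigma, tau) is shown.

        import numpy as np
        from scipy.stats import exponnorm

        rng = np.random.default_rng(7)
        # Simulated first-fixation durations (normal plus exponential component), ms.
        durations_ms = rng.normal(180, 25, size=500) + rng.exponential(60, size=500)

        K, loc, scale = exponnorm.fit(durations_ms)
        mu, sigma, tau = loc, scale, K * scale      # ex-Gaussian parameters
        print(f"mu = {mu:.1f} ms, sigma = {sigma:.1f} ms, tau = {tau:.1f} ms")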

  11. A comparison of machine learning techniques for survival prediction in breast cancer.

    Science.gov (United States)

    Vanneschi, Leonardo; Farinaccio, Antonella; Mauri, Giancarlo; Antoniotti, Mauro; Provero, Paolo; Giacobini, Mario

    2011-05-11

    The ability to accurately classify cancer patients into risk classes, i.e. to predict the outcome of the pathology on an individual basis, is a key ingredient in making therapeutic decisions. In recent years gene expression data have been successfully used to complement the clinical and histological criteria traditionally used in such prediction. Many "gene expression signatures" have been developed, i.e. sets of genes whose expression values in a tumor can be used to predict the outcome of the pathology. Here we investigate the use of several machine learning techniques to classify breast cancer patients using one of such signatures, the well established 70-gene signature. We show that Genetic Programming performs significantly better than Support Vector Machines, Multilayered Perceptrons and Random Forests in classifying patients from the NKI breast cancer dataset, and comparably to the scoring-based method originally proposed by the authors of the 70-gene signature. Furthermore, Genetic Programming is able to perform an automatic feature selection. Since the performance of Genetic Programming is likely to be improvable compared to the out-of-the-box approach used here, and given the biological insight potentially provided by the Genetic Programming solutions, we conclude that Genetic Programming methods are worth further investigation as a tool for cancer patient classification based on gene expression data.
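
    The comparison framework (minus Genetic Programming, which has no off-the-shelf scikit-learn implementation) can be sketched as cross-validated scoring of several classifiers; the data below are synthetic stand-ins for the 70-gene-signature expression values of the NKI cohort.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC

        # Synthetic stand-in for 70 gene-expression features per patient.
        X, y = make_classification(n_samples=200, n_features=70, n_informative=10,
                                   random_state=0)

        models = {
            "SVM": SVC(),
            "MLP": MLPClassifier(max_iter=2000, random_state=0),
            "Random Forest": RandomForestClassifier(random_state=0),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=5)
            print(f"{name:13s} accuracy {scores.mean():.3f} +/- {scores.std():.3f}")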

  12. A comparison of machine learning techniques for survival prediction in breast cancer

    Directory of Open Access Journals (Sweden)

    Vanneschi Leonardo

    2011-05-01

    Full Text Available Abstract Background The ability to accurately classify cancer patients into risk classes, i.e. to predict the outcome of the pathology on an individual basis, is a key ingredient in making therapeutic decisions. In recent years gene expression data have been successfully used to complement the clinical and histological criteria traditionally used in such prediction. Many "gene expression signatures" have been developed, i.e. sets of genes whose expression values in a tumor can be used to predict the outcome of the pathology. Here we investigate the use of several machine learning techniques to classify breast cancer patients using one of such signatures, the well established 70-gene signature. Results We show that Genetic Programming performs significantly better than Support Vector Machines, Multilayered Perceptrons and Random Forests in classifying patients from the NKI breast cancer dataset, and comparably to the scoring-based method originally proposed by the authors of the 70-gene signature. Furthermore, Genetic Programming is able to perform an automatic feature selection. Conclusions Since the performance of Genetic Programming is likely to be improvable compared to the out-of-the-box approach used here, and given the biological insight potentially provided by the Genetic Programming solutions, we conclude that Genetic Programming methods are worth further investigation as a tool for cancer patient classification based on gene expression data.

  13. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications-now expanded and revised.This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.This edi

  14. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 °K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  15. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, firstly, to apply two interesting functional analysis techniques to the design of supervisory systems for complex processes, and secondly, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to a process: an example of a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of the application of functional analysis for the design of a 'human'-centered supervisory system. Then the basic principles of the two techniques, applied to the WSPC system, are presented. Finally, the different results obtained from the two techniques are discussed.

  16. Exposure, hazard, and survival analysis of diffusion on social networks.

    Science.gov (United States)

    Wu, Jiacheng; Crawford, Forrest W; Kim, David A; Stafford, Derek; Christakis, Nicholas A

    2018-04-29

    Sociologists, economists, epidemiologists, and others recognize the importance of social networks in the diffusion of ideas and behaviors through human societies. To measure the flow of information on real-world networks, researchers often conduct comprehensive sociometric mapping of social links between individuals and then follow the spread of an "innovation" from reports of adoption or change in behavior over time. The innovation is introduced to a small number of individuals who may also be encouraged to spread it to their network contacts. In conjunction with the known social network, the pattern of adoptions gives researchers insight into the spread of the innovation in the population and factors associated with successful diffusion. Researchers have used widely varying statistical tools to estimate these quantities, and there is disagreement about how to analyze diffusion on fully observed networks. Here, we describe a framework for measuring features of diffusion processes on social networks using the epidemiological concepts of exposure and competing risks. Given a realization of a diffusion process on a fully observed network, we show that classical survival regression models can be adapted to estimate the rate of diffusion, and actor/edge attributes associated with successful transmission or adoption, while accounting for the topology of the social network. We illustrate these tools by applying them to a randomized network intervention trial conducted in Honduras to estimate the rate of adoption of 2 health-related interventions-multivitamins and chlorine bleach for water purification-and determine factors associated with successful social transmission. Copyright © 2018 John Wiley & Sons, Ltd.

  17. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that combines edge detection, Markov random fields (MRF), watershed segmentation and region merging was presented for performing image segmentation and edge detection tasks. It first applies edge detection to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained with K-means clustering and the minimum-distance rule. The region process is then modeled by an MRF to obtain an image containing regions of different intensity. Gradient values are calculated and the watershed technique is applied. The DIS is computed for each pixel to identify all edges (weak or strong) in the image, yielding the DIS map, which serves as prior knowledge about likely region boundaries for the subsequent MRF step; the MRF step in turn produces an image carrying both edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are refined using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The final edge map is obtained through a merging process based on average region intensities. Common edge detectors are then applied to the MRF-segmented image and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
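
    The gradient-plus-watershed stage of such a pipeline (the DIS and MRF steps are not reproduced) can be sketched with scikit-image; the marker thresholds below follow the library's standard coins example and are assumptions, not values from the paper.

        import numpy as np
        from skimage import data, filters, segmentation

        image = data.coins()                      # sample grayscale image
        gradient = filters.sobel(image)           # edge-strength (gradient) map

        markers = np.zeros_like(image, dtype=int) # 0 = unlabeled
        markers[image < 30] = 1                   # background seeds (assumed threshold)
        markers[image > 150] = 2                  # foreground seeds (assumed threshold)

        labels = segmentation.watershed(gradient, markers)
        boundaries = segmentation.find_boundaries(labels)
        print("closed-boundary pixels found:", int(boundaries.sum()))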

  18. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first and/or second order terms of the Taylor Series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of Keff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. This technique can also offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response
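
    The form of the estimate, a first- and second-order Taylor expansion of the response in the perturbed parameter, can be illustrated generically; the sketch below is not MCNP, just a stand-in response function showing how the two orders compare with the exact change.

        def response(x):
            return 1.0 / (1.0 + 0.8 * x)        # stand-in response R(x), not a real tally

        x0, dx = 1.0, 0.2                        # 20% perturbation of the parameter
        h = 1e-4
        d1 = (response(x0 + h) - response(x0 - h)) / (2 * h)                    # R'(x0)
        d2 = (response(x0 + h) - 2 * response(x0) + response(x0 - h)) / h ** 2  # R''(x0)

        first_order = d1 * dx
        second_order = first_order + 0.5 * d2 * dx ** 2
        print("exact change     :", response(x0 + dx) - response(x0))
        print("1st-order Taylor :", first_order)
        print("1st + 2nd order  :", second_order)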

  19. Data Analysis Techniques for Physical Scientists

    Science.gov (United States)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  20. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complimentary surface analytical instrumentation to perform a complete analysis of biological systems.

  1. Bayesian Analysis for Dynamic Generalized Linear Latent Model with Application to Tree Survival Rate

    Directory of Open Access Journals (Sweden)

    Yu-sheng Cheng

    2014-01-01

    Full Text Available The logistic regression model is the most popular regression technique available for modeling categorical data, especially dichotomous variables. The classic logistic regression model is typically used to interpret the relationship between response variables and explanatory variables. However, in real applications, most data sets are collected through follow-up, which leads to temporal correlation among the data. To characterize the correlations among the different variables, a new method based on latent variables is introduced in this study, and latent variables following an AR(1) model are used to depict the time dependence. In the framework of Bayesian analysis, parameter estimates and statistical inferences are carried out via a Gibbs sampler with a Metropolis-Hastings (MH) algorithm. Model comparison based on the Bayes factor, and forecasting/smoothing of the tree survival rate, are established. A simulation study is conducted to assess the performance of the proposed method, and a pika data set is analyzed to illustrate the real application. Since Bayes factor approaches vary significantly, efficiency tests have been performed in order to decide which solution provides a better tool for the analysis of real relational data sets.
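
    A minimal random-walk Metropolis-Hastings step of the kind that sits inside such a Gibbs sampler can be sketched for a single logistic regression coefficient; everything below (data, prior variance, proposal scale) is an assumption for illustration, and the AR(1) latent-variable updates are omitted.

        import numpy as np

        rng = np.random.default_rng(11)
        x = rng.normal(size=200)                 # simulated covariate
        beta_true = 1.2
        y = rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-beta_true * x))   # simulated outcomes

        def log_post(beta):
            eta = beta * x
            # Bernoulli log-likelihood plus an assumed N(0, 10) prior on beta.
            return np.sum(y * eta - np.log1p(np.exp(eta))) - 0.5 * beta ** 2 / 10.0

        beta, draws = 0.0, []
        for _ in range(5000):
            proposal = beta + 0.3 * rng.normal()          # random-walk proposal
            if np.log(rng.uniform()) < log_post(proposal) - log_post(beta):
                beta = proposal                            # accept
            draws.append(beta)

        print("posterior mean of beta:", np.mean(draws[1000:]))    # discard burn-in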

  2. Survival Analysis of US Air Force Officer Retention Rate

    Science.gov (United States)

    2017-03-23

  3. Introduction to SURPH.1 analysis of release-recapture data for survival studies

    International Nuclear Information System (INIS)

    Smith, S.G.; Skalski, J.R.; Schlechte, J.W.; Hoffmann, A.; Cassen, V.

    1994-12-01

    Program SURPH is the culmination of several years of research to develop a comprehensive computer program to analyze survival studies of fish and wildlife populations. Development of this software was motivated by the advent of the PIT-tag (Passive Integrated Transponder) technology that permits the detection of salmonid smolt as they pass through hydroelectric facilities on the Snake and Columbia Rivers in the Pacific Northwest. Repeated detections of individually tagged smolt and analysis of their capture-histories permits estimates of downriver survival probabilities. Eventual installation of detection facilities at adult fish ladders will also permit estimation of ocean survival and upstream survival of returning salmon using the statistical methods incorporated in SURPH.1. However, the utility of SURPH.1 far exceeds solely the analysis of salmonid tagging studies. Release-recapture and radiotelemetry studies from a wide range of terrestrial and aquatic species have been analyzed using SURPH.1 to estimate discrete time survival probabilities and investigate survival relationships. The interactive computing environment of SURPH.1 was specifically developed to allow researchers to investigate the relationship between survival and capture processes and environmental, experimental and individual-based covariates. Program SURPH.1 represents a significant advancement in the ability of ecologists to investigate the interplay between morphologic, genetic, environmental and anthropogenic factors on the survival of wild species. It is hoped that this better understanding of risk factors affecting survival will lead to greater appreciation of the intricacies of nature and to improvements in the management of wild resources. This technical report is an introduction to SURPH.1 and provides a user guide for both the UNIX and MS-Windows® applications of the SURPH software

  4. Survival after Second and Subsequent Recurrences in Osteosarcoma: A Retrospective Multicenter Analysis.

    Science.gov (United States)

    Tirtei, Elisa; Asaftei, Sebastian D; Manicone, Rosaria; Cesari, Marilena; Paioli, Anna; Rocca, Michele; Ferrari, Stefano; Fagioli, Franca

    2017-05-01

    Purpose Osteosarcoma (OS) is the most common primary bone tumor. Despite complete surgical removal and intensive chemotherapeutic treatment, 30%-35% of patients with OS have local or systemic recurrence. Some patients survive multiple recurrences, but overall survival after OS recurrence is poor. This analysis aims to describe and identify factors influencing post-relapse survival (PRS) after a second OS relapse. Methods This is a retrospective analysis of 60 patients with a second relapse of OS of the extremities in 2 Italian centers between 2003 and 2013. Results Treatment for first and subsequent relapses was planned according to institutional guidelines. After complete surgical remission (CSR) following the first recurrence, patients experienced a second OS relapse with a median disease-free interval (DFI) of 6 months. Lung disease was prevalent: 44 patients (76%) had pulmonary metastases. Survival after the second relapse was 22% at 5 years. Lung disease only correlated with better survival at 5 years (33.6%) compared with other sites of recurrence (5%; p = 0.008). Patients with a single pulmonary lesion had a better 5-year second PRS (42%; p = 0.02). Patients who achieved a second CSR had a 5-year second PRS of 33.4%. Chemotherapy (p<0.001) benefited patients without a third CSR. Conclusions This analysis confirms the importance of an aggressive, repeated surgical approach. Lung metastases only, the number of lesions, DFI and CSR influenced survival. It also confirms the importance of chemotherapy in patients in whom surgical treatment is not feasible.

  5. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  6. Hybrid chemical and nondestructive-analysis technique

    International Nuclear Information System (INIS)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  7. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...
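
    Optimal filtering for a transient of known shape amounts to cross-correlating the data with the template; the sketch below is a toy version with white noise and an invented burst waveform, not a real gravitational-wave template or detector noise model.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 4096
        template = np.zeros(n)
        # Toy burst template (not a real GW waveform): a short windowed sinusoid.
        template[2000:2128] = np.sin(2 * np.pi * 0.05 * np.arange(128)) * np.hanning(128)

        data = rng.normal(size=n)
        data += np.roll(template, 500)          # bury the signal at a 500-sample offset

        # Matched filter as FFT-based cross-correlation (white-noise assumption).
        output = np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(template)), n)
        print("filter output peaks near lag:", int(np.argmax(np.abs(output))))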

  8. A retrospective analysis of survival and prognostic factors after stereotactic radiosurgery for aggressive meningiomas

    International Nuclear Information System (INIS)

    Ferraro, Daniel J; Zoberi, Imran; Simpson, Joseph R; Jaboin, Jerry J; Funk, Ryan K; Blackett, John William; Ju, Michelle R; DeWees, Todd A; Chicoine, Michael R; Dowling, Joshua L; Rich, Keith M; Drzymala, Robert E

    2014-01-01

    While most meningiomas are benign, aggressive meningiomas are associated with high levels of recurrence and mortality. A single institution’s Gamma Knife radiosurgical experience with atypical and malignant meningiomas is presented, stratified by the most recent WHO classification. Thirty-one patients with atypical and 4 patients with malignant meningiomas treated with Gamma Knife radiosurgery between July 2000 and July 2011 were retrospectively reviewed. All patients underwent prior surgical resection. Overall survival was the primary endpoint and rate of disease recurrence in the brain was a secondary endpoint. Patients who had previous radiotherapy or prior surgical resection were included. Kaplan-Meier and Cox proportional hazards models were used to estimate survival and identify factors predictive of recurrence and survival. Post-Gamma Knife recurrence was identified in 11 patients (31.4%) with a median overall survival of 36 months and progression-free survival of 25.8 months. Nine patients (25.7%) had died. Three-year overall survival (OS) and progression-free survival (PFS) rates were 78.0% and 65.0%, respectively. WHO grade II 3-year OS and PFS were 83.4% and 70.1%, while WHO grade III 3-year OS and PFS were 33.3% and 0%. Recurrence rate was significantly higher in patients with a prior history of benign meningioma, nuclear atypia, high mitotic rate, spontaneous necrosis, and WHO grade III diagnosis on univariate analysis; only WHO grade III diagnosis was significant on multivariate analysis. Overall survival was adversely affected in patients with WHO grade III diagnosis, prior history of benign meningioma, prior fractionated radiotherapy, larger tumor volume, and higher isocenter number on univariate analysis; WHO grade III diagnosis and larger treated tumor volume were significant on multivariate analysis. Atypical and anaplastic meningiomas remain difficult tumors to treat. WHO grade III diagnosis and treated tumor volume were significantly

  9. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a major security threat, so studying malware behavior is important for identifying and classifying it. Using SSDT hooking we can obtain malware behavior by running it in a controlled environment and capturing its interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be compared with those of other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.

  10. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    domain or in the frequency domain. However their .... computer to speech analysis led to important elaborations ... tool for the estimation of formant trajectory (10), ... prediction Linear prediction In effect determines the filter .... Radio Res. Lab.

  11. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  12. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis uses a network sniffer to capture packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary format of the original message content. To obtain the information the packets contain, they must be decoded according to the TCP/IP protocol stack specifications, restoring the format and content at each protocol layer, up to the actual data transferred at the application tier.

  13. Survival analysis of cancer risk reduction strategies for BRCA1/2 mutation carriers.

    Science.gov (United States)

    Kurian, Allison W; Sigal, Bronislava M; Plevritis, Sylvia K

    2010-01-10

    Women with BRCA1/2 mutations inherit high risks of breast and ovarian cancer; options to reduce cancer mortality include prophylactic surgery or breast screening, but their efficacy has never been empirically compared. We used decision analysis to simulate risk-reducing strategies in BRCA1/2 mutation carriers and to compare resulting survival probability and causes of death. We developed a Monte Carlo model of breast screening with annual mammography plus magnetic resonance imaging (MRI) from ages 25 to 69 years, prophylactic mastectomy (PM) at various ages, and/or prophylactic oophorectomy (PO) at ages 40 or 50 years in 25-year-old BRCA1/2 mutation carriers. With no intervention, survival probability by age 70 is 53% for BRCA1 and 71% for BRCA2 mutation carriers. The most effective single intervention for BRCA1 mutation carriers is PO at age 40, yielding a 15% absolute survival gain; for BRCA2 mutation carriers, the most effective single intervention is PM, yielding a 7% survival gain if performed at age 40 years. The combination of PM and PO at age 40 improves survival more than any single intervention, yielding 24% survival gain for BRCA1 and 11% for BRCA2 mutation carriers. PM at age 25 instead of age 40 offers minimal incremental benefit (1% to 2%); substituting screening for PM yields a similarly minimal decrement in survival (2% to 3%). Although PM at age 25 plus PO at age 40 years maximizes survival probability, substituting mammography plus MRI screening for PM seems to offer comparable survival. These results may guide women with BRCA1/2 mutations in their choices between prophylactic surgery and breast screening.
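
    At its core, a decision model of this kind repeatedly simulates competing event times under each strategy and counts who is still alive at age 70. The Python sketch below is a deliberately toy Monte Carlo with made-up exponential mortality hazards and made-up risk-reduction factors (none of the numbers come from the study, and intervention timing is ignored); it only illustrates the simulation structure, not the authors' calibrated model:

      import numpy as np

      rng = np.random.default_rng(0)
      N = 100_000

      # Entirely hypothetical annual mortality hazards between ages 25 and 70
      # (NOT the study's calibrated inputs), just to show the mechanics.
      H_BREAST, H_OVARIAN, H_OTHER = 0.012, 0.010, 0.004

      def survival_to_70(breast_factor, ovarian_factor):
          """Fraction of simulated carriers alive at age 70 under a strategy
          that multiplies each cancer hazard by the given factor."""
          years = 70 - 25
          t_breast  = rng.exponential(1.0 / (H_BREAST * breast_factor), N)
          t_ovarian = rng.exponential(1.0 / (H_OVARIAN * ovarian_factor), N)
          t_other   = rng.exponential(1.0 / H_OTHER, N)
          t_death = np.minimum.reduce([t_breast, t_ovarian, t_other])
          return (t_death > years).mean()

      print("no intervention (toy)  :", survival_to_70(1.0, 1.0))
      print("PM only (toy)          :", survival_to_70(0.1, 1.0))
      print("PM + PO (toy)          :", survival_to_70(0.1, 0.2))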

  14. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
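
    The propagation idea (perturb each channel within its one-sigma error, re-run the unfold, and take percentiles of the resulting fluxes) can be sketched in a few lines of Python. The unfold step below is a stand-in placeholder (a simple weighted sum), since the actual Dante unfold algorithm is not reproduced here; only the Monte Carlo perturbation loop mirrors the described technique:

      import numpy as np

      rng = np.random.default_rng(1)

      # Measured channel voltages and their one-sigma fractional errors
      # (calibration plus unfold); both arrays are illustrative placeholders.
      voltages = np.array([0.82, 1.31, 0.54, 0.97, 0.21])
      sigma_frac = np.array([0.05, 0.04, 0.07, 0.05, 0.10])

      def unfold(v):
          # Placeholder for the real spectral unfold: a weighted sum standing
          # in for "channel voltages -> total radiated flux".
          weights = np.array([1.2, 0.8, 1.5, 1.0, 2.0])
          return float(weights @ v)

      # One thousand perturbed voltage sets, each unfolded to a flux value.
      trials = np.array([
          unfold(voltages * (1.0 + sigma_frac * rng.standard_normal(voltages.size)))
          for _ in range(1000)
      ])

      lo, mid, hi = np.percentile(trials, [16, 50, 84])
      print(f"flux = {mid:.3f} (+{hi - mid:.3f} / -{mid - lo:.3f})  [68% interval]")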

  15. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  16. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  17. Survival Analysis and its Associated Factors of Beta Thalassemia Major in Hamadan Province

    Directory of Open Access Journals (Sweden)

    Reza Zamani

    2015-05-01

    Full Text Available Background: There currently is a lack of knowledge about the long-term survival of patients with beta thalassemia (BT), particularly in regions with a low incidence of the disease. The aim of the present study was to determine the survival rate of patients with BT major and the factors associated with survival time. Methods: This retrospective cohort study was performed in Hamadan province, located in the west of Iran. The study included patients referred to the provincial hospitals during a 16-year period from 1997 to 2013. The follow-up of each subject was calculated from the date of birth to the date of death. Demographic and clinical data were extracted from patients' medical records using a checklist. Statistical analysis included the Kaplan-Meier method to analyze survival, the log-rank test to compare curves between groups, and Cox regression for multivariate prognostic analysis. Results: A total of 133 patients with BT major were enrolled, 54.9% of whom were male and 66.2% urban. The 10-, 20- and 30-year survival rates for all patients were 98.3%, 88.4% and 80.5%, respectively. Based on hazard ratios (HR), we found that accompanying diseases (P=0.01), blood type (P=0.03) and residency status (P=0.01) were significant predictors of the survival time of patients. Conclusion: The survival rate of BT patients has improved. Future research, such as prospective designs with reliable sources of data, is required to estimate survival rates and to identify other prognostic factors.
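
    The analysis pipeline named here (Kaplan-Meier estimation, a log-rank comparison between groups, and a multivariate Cox model) maps directly onto standard survival-analysis tooling. The sketch below assumes the Python lifelines library and an invented two-group dataset; it mirrors the workflow, not the study's data:

      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.statistics import logrank_test

      # Invented follow-up data: years of survival, death indicator, one covariate.
      df = pd.DataFrame({
          "years":       [12, 18, 30,  8, 22, 28, 15, 27,  5, 20],
          "died":        [ 1,  1,  0,  1,  0,  0,  1,  0,  1,  0],
          "comorbidity": [ 1,  0,  0,  1,  1,  0,  1,  0,  1,  1],
      })

      grp1 = df[df.comorbidity == 1]
      grp0 = df[df.comorbidity == 0]

      # Log-rank test: do the two survival curves differ?
      res = logrank_test(grp1["years"], grp0["years"],
                         event_observed_A=grp1["died"], event_observed_B=grp0["died"])
      print("log-rank p-value:", res.p_value)

      # Multivariate prognostic analysis with Cox regression.
      cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
      print(cph.summary[["exp(coef)", "p"]])   # hazard ratios and p-values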

  18. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)

  19. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistic based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed

  20. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray Fluorescence (XRF) techniques. These cigarettes were found to contain the elements: Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements with concentrations of more than 1% by weight were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb, and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  1. Decentralized control using compositional analysis techniques

    NARCIS (Netherlands)

    Kerber, F.; van der Schaft, A. J.

    2011-01-01

    Decentralized control strategies aim at achieving a global control target by means of distributed local controllers acting on individual subsystems of the overall plant. In this sense, decentralized control is a dual problem to compositional analysis where a global verification task is decomposed

  2. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan

    2016-01-01

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social

  3. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  4. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis, characterization of thin films, electronic and optoelectronic material formed by ion implantation, nanometre science and technology, plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  5. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis, characterization of thin films, electronic and optoelectronic material formed by ion implantation, nanometre science and technology, plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  6. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  7. Modeling time-to-event (survival) data using classification tree analysis.

    Science.gov (United States)

    Linden, Ariel; Yarnold, Paul R

    2017-12-01

    Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
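
    One way to reproduce this kind of head-to-head accuracy comparison is to score each model's predicted survival probabilities at a clinically relevant horizon with a Brier score. The Python sketch below uses a simplified Brier score that ignores censoring weights (the integrated, censoring-adjusted version used in the paper is more involved) and hypothetical prediction arrays; it shows only the scoring idea, not classification tree analysis itself:

      import numpy as np

      # Hypothetical predicted probabilities of surviving past t* = 24 months
      # from two competing models, plus the observed outcomes.
      pred_cox  = np.array([0.80, 0.55, 0.30, 0.70, 0.15, 0.60])
      pred_tree = np.array([0.75, 0.50, 0.20, 0.85, 0.10, 0.65])
      time      = np.array([30,   24,   10,   36,    6,   18])
      event     = np.array([0,     1,    1,    0,    1,    1])
      t_star = 24

      # Keep only subjects whose status at t* is known (event before t*, or
      # follow-up past t*); a full Brier score would reweight for censoring.
      known = (time > t_star) | ((time <= t_star) & (event == 1))
      alive_at_tstar = (time > t_star).astype(float)

      def brier(pred):
          return np.mean((pred[known] - alive_at_tstar[known]) ** 2)

      # Lower is better.
      print("Brier at 24 months, Cox :", round(brier(pred_cox), 3))
      print("Brier at 24 months, tree:", round(brier(pred_tree), 3))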

  8. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  9. Application of activation techniques to biological analysis

    International Nuclear Information System (INIS)

    Bowen, H.J.M.

    1981-01-01

    Applications of activation analysis in the biological sciences are reviewed for the period of 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or x-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use of it has been made in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials

  10. SURVIVAL ANALYSIS AND GROWTH OF Cordia trichotoma, BORAGINACEAE, LAMIALES, IN MATO GROSSO DO SUL STATE, BRAZIL

    Directory of Open Access Journals (Sweden)

    Sergio Luiz Salvadori

    2013-12-01

    Full Text Available http://dx.doi.org/10.5902/1980509812357The evaluation of a plant survival percentage and growth may reflect its competitive ability in plantcommunity. Cordia trichotoma is a common native tree in Mato Grosso do Sul State and one of the mostpromising for planting. This study monitored the survival percentage and growth of Cordia trichotomaunder different conditions such as weeding and receiving or not fertilization. The experiment started inSeptember 2008 and it was concluded in March 2010. The seeds collection and sowing were held in urbanarea of Mundo Novo Municipality and the area for permanent planting to measure seedlings survival andgrowth was set at Japorã Municipality, Fazenda Santa Clara. Seedlings were planted in two categories: theuse or not of fertilizer and crowing resulting in four distinct groups: block fertilizer bare earth (ATN, bareland block without fertilizer (BTN, fertilizer and crown block (AC and without fertilizer and crownedblock (BC. The results indicated high survival of Cordia trichotoma in the seedling transplant system from bed to bags. The BC block showed the highest percentage of survival, but the smaller increments in height.The AC, ATN and BTN blocks presented the same survival pattern and similar average growth. However,there may be differences in nutritional and chemical composition of the soil suggesting sector analysis forfuture studies.

  11. Re-analysis of survival data of cancer patients utilizing additive homeopathy.

    Science.gov (United States)

    Gleiss, Andreas; Frass, Michael; Gaertner, Katharina

    2016-08-01

    In this short communication we present a re-analysis of homeopathic patient data in comparison to control patient data from the same Outpatient's Unit "Homeopathy in malignant diseases" of the Medical University of Vienna. In this analysis we took into account a probable immortal time bias. For patients suffering from advanced stages of cancer and surviving the first 6 or 12 months after diagnosis, respectively, the results show that utilizing homeopathy gives a statistically significant (p<0.001) advantage over control patients regarding survival time. In conclusion, bearing in mind all limitations, the results of this retrospective study suggest that patients with advanced stages of cancer might benefit from additional homeopathic treatment until a survival time of up to 12 months after diagnosis. Copyright © 2016 Elsevier Ltd. All rights reserved.
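
    The methodological crux is immortal time: patients had to survive long enough to ever receive add-on homeopathy, so a fair comparison restricts attention to patients alive at a landmark such as 6 or 12 months and classifies exposure by that point. The Python sketch below shows a generic landmark restriction on an invented data frame; it is a schematic of the idea, not the authors' statistical model:

      import pandas as pd

      # Invented cohort: survival in months from diagnosis, death indicator,
      # and whether add-on treatment was ever started (and when, in months).
      df = pd.DataFrame({
          "surv_months":   [3, 14, 27, 9, 40, 22, 6, 55, 18, 31],
          "died":          [1,  1,  0, 1,  0,  1, 1,  0,  1,  0],
          "addon":         [0,  1,  1, 0,  1,  0, 0,  1,  0,  1],
          "addon_started": [None, 4, 7, None, 10, None, None, 5, None, 9],
      })

      landmark = 6  # months

      # Landmark analysis: keep only patients still alive at the landmark,
      # classify them by treatment received *before* the landmark, and measure
      # survival from the landmark onward.
      lm = df[df.surv_months > landmark].copy()
      lm["exposed"] = (lm.addon == 1) & (lm.addon_started.astype(float) <= landmark)
      lm["surv_from_landmark"] = lm.surv_months - landmark

      print(lm.groupby("exposed")["surv_from_landmark"].describe()[["count", "mean"]])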

  12. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  13. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.; and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  14. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions

  15. When will I succeed in my first-year diploma? Survival analysis in Dutch higher education

    NARCIS (Netherlands)

    Bruinsma, Marjon; Jansen, Ellen P. W. A.

    2009-01-01

    The goal of this study was to illustrate survival analysis with higher education data and gain insight into a limited set of factors that predict when students passed their first-year examination at a Dutch university. Study participants consisted of 565 first-year students in four departments. Data

  16. Revealing the equivalence of two clonal survival models by principal component analysis

    International Nuclear Information System (INIS)

    Lachet, Bernard; Dufour, Jacques

    1976-01-01

    The principal component analysis of 21 chlorella cell survival curves, adjusted by one-hit and two-hit target models, led to quite similar projections on the principal plane: the homologous parameters of these models are linearly correlated; the reason for the statistical equivalence of these two models, in the present state of experimental inaccuracy, is revealed [fr

  17. Survival analysis of postoperative nausea and vomiting in patients receiving patient-controlled epidural analgesia

    Directory of Open Access Journals (Sweden)

    Shang-Yi Lee

    2014-11-01

    Conclusion: Survival analysis using Cox regression showed that the average consumption of opioids played an important role in postoperative nausea and vomiting, a result not found by logistic regression. Therefore, the incidence of postoperative nausea and vomiting in patients cannot be reliably determined on the basis of a single visit at one point in time.

  18. It's Deja Vu All over Again: Using Multiple-Spell Discrete-Time Survival Analysis.

    Science.gov (United States)

    Willett, John B.; Singer, Judith D.

    1995-01-01

    The multiple-spell discrete-time survival analysis method is introduced and illustrated using longitudinal data on exit from and reentry into the teaching profession. The method is applicable to many educational problems involving the sequential occurrence of disparate events or episodes. (SLD)
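
    Discrete-time survival analysis is typically fitted by expanding each person into one row per time period at risk and running a logistic regression on the period-specific event indicator; the multiple-spell version repeats this expansion for each spell. The Python sketch below (pandas plus statsmodels, invented single-spell data) shows the person-period expansion and the discrete-time hazard model:

      import pandas as pd
      import statsmodels.api as sm

      # Invented single-spell data: years until leaving teaching (or censoring).
      people = pd.DataFrame({
          "pid":    [1, 2, 3, 4, 5, 6],
          "years":  [2, 5, 3, 1, 4, 5],
          "event":  [1, 0, 1, 1, 1, 0],   # 1 = left, 0 = still teaching (censored)
          "female": [1, 0, 1, 0, 1, 0],
      })

      # Person-period expansion: one record per person per year at risk.
      rows = []
      for _, p in people.iterrows():
          for t in range(1, int(p.years) + 1):
              rows.append({
                  "pid": p.pid,
                  "period": t,
                  "female": p.female,
                  "left": int(p.event == 1 and t == p.years),  # event in this period?
              })
      pp = pd.DataFrame(rows)

      # Discrete-time hazard model: logistic regression on the expanded data.
      X = sm.add_constant(pp[["period", "female"]].astype(float))
      fit = sm.Logit(pp["left"], X).fit(disp=0)
      print(fit.params)   # log-odds of exiting in a given period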

  19. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not sufficient to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients (% S here) are obtained on Dowex 50 W as a function of HCl concentration by a batch method. These % S data are utilized to obtain elution curves. The % S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50 W is examined as a function of HCl concentration and found to be decreasing while the % S of the rare earths is increasing. It is interpreted that Cl- and rare earth ions move into the resin phase separately and that the charge and the charge densities of these ions are responsible for the different % S curves. Dehydration appears to play an important role in the upturn of the % S curves at higher HCl concentrations

  20. Survival Rate and Associated Factors of Childhood Leukemia in Iran: A Systematic Review and Meta Analysis

    Directory of Open Access Journals (Sweden)

    Yousef Veisani

    2017-02-01

    Full Text Available Context: Recent reviews have shown that about 18% of all child cancers are leukemia. Tracking the survival rate can help researchers improve the quality of life of patients through improved screening or the discovery of better treatments. Objectives: This review aimed at estimating the 5-year survival rates and associated factors of childhood leukemia in Iran. Data Sources: We carried out a systematic review through a search of relevant studies published in English (PubMed, Scopus, Google Scholar, and ISI) and Persian databases (Magiran, Medlib, SID, and Iran Medex). Study Selection: The study included all epidemiologic studies that estimated survival rate in children with leukemia in Iran during the years 2002 to 2015, and a standardized manner was used for extraction of information. Data Extraction: The entire text or summary of all searched articles was extracted; related articles were then selected and irrelevant ones were excluded. Fixed and random effects models were calculated in STATA using standard meta-analysis methods. Heterogeneity was assessed by the I² statistic. Results: The overall 5-year survival rate in patients with childhood leukemia in Iran was 0.65 (95% CI, 0.62 to 0.67; 10 studies); in the acute lymphoblastic leukemia (ALL) subtype it was 71.0% (95% CI, 68.0 to 74.0), and in the acute myeloid leukemia (AML) subtype it was 46.0%. Results of the meta-analysis showed significantly poorer survival with relapse (hazard ratio (HR) 1.59, 95% confidence interval (CI) 1.27 to 1.98) and white blood cell (WBC) counts ≥ 50,000 (HR 2.92, 95% CI 1.23 to 4.60). Conclusions: The results showed that 5-year survival rates in patients with AML were lower than in patients with ALL. The results of this meta-analysis strongly support the need for future research, action, and guidance for clinicians to improve health-related quality of life and outcomes for children with leukemia.
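
    Pooling study-level survival proportions with fixed- and random-effects weights and an I² heterogeneity statistic takes only a few lines. The Python sketch below uses invented proportions and sample sizes with a plain normal approximation on the proportion scale (published meta-analyses often transform proportions first); it illustrates the standard inverse-variance and DerSimonian-Laird computations, not this review's data:

      import numpy as np

      # Invented 5-year survival proportions and sample sizes from k studies.
      p = np.array([0.62, 0.71, 0.58, 0.69, 0.66])
      n = np.array([120,   85,  200,   60,  150])

      var = p * (1 - p) / n          # normal-approximation variance of each proportion
      w_fixed = 1 / var              # inverse-variance (fixed-effect) weights
      pooled_fixed = np.sum(w_fixed * p) / np.sum(w_fixed)

      # Cochran's Q and I^2 heterogeneity.
      Q = np.sum(w_fixed * (p - pooled_fixed) ** 2)
      k = len(p)
      I2 = max(0.0, (Q - (k - 1)) / Q) * 100

      # DerSimonian-Laird between-study variance and random-effects pooling.
      tau2 = max(0.0, (Q - (k - 1)) /
                 (np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)))
      w_rand = 1 / (var + tau2)
      pooled_random = np.sum(w_rand * p) / np.sum(w_rand)
      se_random = np.sqrt(1 / np.sum(w_rand))

      print(f"fixed-effect pooled survival : {pooled_fixed:.3f}")
      print(f"random-effects pooled        : {pooled_random:.3f} "
            f"(95% CI {pooled_random - 1.96*se_random:.3f} to {pooled_random + 1.96*se_random:.3f})")
      print(f"I^2 = {I2:.1f}%")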

  1. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. It first reviews the most widespread techniques in the field of Takagi-Sugeno fuzzy systems, as well as the most relevant results on polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models obtained via Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  2. Fissure sealants in caries prevention:a practice-based study using survival analysis

    OpenAIRE

    Leskinen, K. (Kaja)

    2010-01-01

    Abstract The purpose of this study was to analyse the effectiveness and cost of fissure sealant treatment in preventing dental caries in children in a practice-based research network using survival analysis. The survival times of first permanent molars in children were analysed in three countries: in Finland (age cohorts 1970–1972 and 1980–1982), in Sweden (1980–1982) and in Greece (1980–1982), and additionally at two municipal health centres in Finland (age cohorts 1988–1990 in Kemi...

  3. Mechanisms and mediation in survival analysis: towards an integrated analytical framework.

    LENUS (Irish Health Repository)

    Haase, Trutz

    2016-02-29

    A wide-ranging debate has taken place in recent years on mediation analysis and causal modelling, raising profound theoretical, philosophical and methodological questions. The authors build on the results of these discussions to work towards an integrated approach to the analysis of research questions that situate survival outcomes in relation to complex causal pathways with multiple mediators. The background to this contribution is the increasingly urgent need for policy-relevant research on the nature of inequalities in health and healthcare.

  4. Two-stage meta-analysis of survival data from individual participants using percentile ratios

    Science.gov (United States)

    Barrett, Jessica K; Farewell, Vern T; Siannis, Fotios; Tierney, Jayne; Higgins, Julian P T

    2012-01-01

    Methods for individual participant data meta-analysis of survival outcomes commonly focus on the hazard ratio as a measure of treatment effect. Recently, Siannis et al. (2010, Statistics in Medicine 29:3030–3045) proposed the use of percentile ratios as an alternative to hazard ratios. We describe a novel two-stage method for the meta-analysis of percentile ratios that avoids distributional assumptions at the study level. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22825835

  5. Gene expression meta-analysis identifies chromosomal regions involved in ovarian cancer survival

    DEFF Research Database (Denmark)

    Thomassen, Mads; Jochumsen, Kirsten M; Mogensen, Ole

    2009-01-01

    the relation of gene expression and chromosomal position to identify chromosomal regions of importance for early recurrence of ovarian cancer. By use of *Gene Set Enrichment Analysis*, we have ranked chromosomal regions according to their association to survival. Over-representation analysis including 1...... using death (P = 0.015) and recurrence (P = 0.002) as outcome. The combined mutation score is strongly associated to upregulation of several growth factor pathways....

  6. Survival analysis of a treatment data for cancer of the larynx

    International Nuclear Information System (INIS)

    Khan, K.

    2002-01-01

    In this paper a survival analysis of the survival times is carried out. A Cox regression model is fitted to the survival times under the assumption of proportional hazards. A model is selected after inclusion and exclusion of factors and variables as explanatory variables. The assumption of proportional hazards is tested in the manner suggested by Harrell (1986) and is supported by these tests. However, the plot of Schoenfeld residuals against dose gives slight evidence against the validity of the proportional hazards assumption; the assumption seems to be satisfied for the variable time. The martingale residuals suggest no pattern for the variable age. The functional form of dose is not linear, hence a quadratic dose term is used as an explanatory variable. A comparison of logistic regression analysis and survival analysis is also made in this paper. It can be concluded that the Cox proportional hazards model is a better model than the logistic model, as it is more parsimonious and utilizes more information. (author)

  7. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  8. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  9. Effects of non-surgical factors on digital replantation survival rate: a meta-analysis.

    Science.gov (United States)

    Ma, Z; Guo, F; Qi, J; Xiang, W; Zhang, J

    2016-02-01

    This study aimed to evaluate the risk factors affecting the survival rate of digital replantation by meta-analysis. A computer search of the MEDLINE, OVID, EMBASE, and CNKI databases was conducted to identify citations for digital replantation, with digit or finger or thumb or digital or fingertip and replantation as keywords. RevMan 5.2 software was used to calculate the pooled odds ratios. In total, there were 4678 amputated digits in 2641 patients. Gender and ischemia time had no significant influence on the survival rate of digit replantation (P > 0.05). Age, injured hand, injury type, zone, and the method of preserving the amputated digit significantly influenced the survival rate of digital replantation (P < 0.05). Children, right hand, crush or avulsion injury, and little finger are the risk factors that adversely affect the outcome. Level 5*. © The Author(s) 2015.

  10. Clinicopathological analysis of recurrence patterns and prognostic factors for survival after hepatectomy for colorectal liver metastasis

    Directory of Open Access Journals (Sweden)

    Okuda Junji

    2010-09-01

    Full Text Available Abstract Background: Hepatectomy is recommended as the most effective therapy for liver metastasis from colorectal cancer (CRCLM). It is crucial to elucidate the prognostic clinicopathological factors. Methods: Eighty-three patients undergoing initial hepatectomy for CRCLM were retrospectively analyzed with respect to characteristics of the primary colorectal and metastatic hepatic tumors, operation details and prognosis. Results: The overall 5-year survival rate after initial hepatectomy for CRCLM was 57.5%, and the median survival time was 25 months. Univariate analysis identified significant prognostic factors for poor survival that included depth of the primary colorectal cancer (≥ serosal invasion) and the hepatic resection margin. Conclusions: Optimal surgical strategies in conjunction with effective chemotherapeutic regimens need to be established in patients with risk factors for recurrence and poor outcomes as listed above.

  11. Nonparametric Bayesian inference for mean residual life functions in survival analysis.

    Science.gov (United States)

    Poynor, Valerie; Kottas, Athanasios

    2018-01-19

    Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life (MRL) function, which provides the expected remaining lifetime given that a subject has survived (i.e. is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the MRL function characterizes the survival distribution. We develop general Bayesian nonparametric inference for MRL functions built from a Dirichlet process mixture model for the associated survival distribution. The resulting model for the MRL function admits a representation as a mixture of the kernel MRL functions with time-dependent mixture weights. This model structure allows for a wide range of shapes for the MRL function. Particular emphasis is placed on the selection of the mixture kernel, taken to be a gamma distribution, to obtain desirable properties for the MRL function arising from the mixture model. The inference method is illustrated with a data set of two experimental groups and a data set involving right censoring. The supplementary material available at Biostatistics online provides further results on empirical performance of the model, using simulated data examples. © The Author 2018. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
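
    The mean residual life function defined here, MRL(t) = E[T - t | T > t] = (integral of S(u) du from t to infinity) / S(t), is easy to evaluate numerically once a survival function is available. The Python sketch below does the plug-in calculation for a Weibull example on a grid; it is not the Dirichlet process mixture model developed in the paper:

      import numpy as np

      # Survival function of a Weibull(shape = 1.5, scale = 10) lifetime.
      shape, scale = 1.5, 10.0
      def S(t):
          return np.exp(-(t / scale) ** shape)

      # MRL(t) = (integral of S from t to infinity) / S(t), on a grid whose upper
      # limit is chosen large enough that S is effectively zero beyond it.
      grid = np.linspace(0.0, 80.0, 801)
      s_vals = S(grid)
      tail = np.array([np.trapz(s_vals[i:], grid[i:]) for i in range(len(grid))])
      mrl = tail / s_vals

      for t in (0.0, 5.0, 10.0, 20.0):
          i = int(round(t / 0.1))              # grid spacing is 0.1
          print(f"MRL({t:4.1f}) = {mrl[i]:.2f}")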

  12. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in analysis, preservation, and documentation of art works and artifacts are described with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  13. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can then choose an appropriate technique combination to perform analysis on the basis of resources. This research evaluated currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help the decision makers and the software safety analysts to choose the best SSA combination and arrange their own software safety plan. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity

  14. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    Science.gov (United States)

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
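
    The modelling recipe (a Cox model on preoperative CA 19-9 plus quantitative image features, judged by a concordance index on held-out data) can be sketched generically. The example below assumes the Python lifelines library and uses randomly generated stand-in features; it demonstrates the train/test c-index workflow, not the study's texture-extraction pipeline:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.utils import concordance_index

      rng = np.random.default_rng(7)
      n = 160

      # Stand-in preoperative predictors: log CA 19-9 and two "texture" features.
      X = pd.DataFrame({
          "log_ca199": rng.normal(4.0, 1.0, n),
          "texture_1": rng.normal(0.0, 1.0, n),
          "texture_2": rng.normal(0.0, 1.0, n),
      })
      risk = 0.5 * X["log_ca199"] + 0.8 * X["texture_1"]
      t_event = rng.exponential(24.0 / np.exp(risk - risk.mean()))   # months
      t_cens = rng.exponential(36.0, n)
      df = X.assign(months=np.minimum(t_event, t_cens),
                    dead=(t_event <= t_cens).astype(int))

      train, test = df.iloc[:112], df.iloc[112:]                     # ~70/30 split

      cph = CoxPHFitter().fit(train, duration_col="months", event_col="dead")
      pred_risk = np.asarray(cph.predict_partial_hazard(test)).ravel()

      # Concordance index on the held-out set (higher risk should mean shorter survival).
      cindex = concordance_index(test["months"], -pred_risk, test["dead"])
      print(f"test-set c-index: {cindex:.2f}")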

  15. Marital status independently predicts testis cancer survival--an analysis of the SEER database.

    Science.gov (United States)

    Abern, Michael R; Dude, Annie M; Coogan, Christopher L

    2012-01-01

    Previous reports have shown that married men with malignancies have improved 10-year survival over unmarried men. We sought to investigate the effect of marital status on 10-year survival in a U.S. population-based cohort of men with testis cancer. We examined 30,789 cases of testis cancer reported to the Surveillance, Epidemiology, and End Results (SEER 17) database between 1973 and 2005. All staging was converted to the 1997 AJCC TNM system. Patients less than 18 years of age at time of diagnosis were excluded. A subgroup analysis of patients with stages I or II non-seminomatous germ cell tumors (NSGCT) was performed. Univariate analysis using t-tests and χ(2) tests compared characteristics of patients separated by marital status. Multivariate analysis was performed using a Cox proportional hazard model to generate Kaplan-Meier survival curves, with all-cause and cancer-specific mortality as the primary endpoints. 20,245 cases met the inclusion criteria. Married men were more likely to be older (38.9 vs. 31.4 years), Caucasian (94.4% vs. 92.1%), stage I (73.1% vs. 61.4%), and to have seminoma as the tumor histology (57.3% vs. 43.4%). On multivariate analysis, married status (HR 0.58) independently predicted improved survival; in the stages I or II NSGCT subgroup, married status (HR 0.60) remained a predictor, and the rate of RPLND did not differ between married and unmarried men (44.8% vs. 43.4%, P = 0.33). Marital status is an independent predictor of improved overall and cancer-specific survival in men with testis cancer. In men with stages I or II NSGCT, RPLND is an additional predictor of improved overall survival. Marital status does not appear to influence whether men undergo RPLND. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
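
    A very rough way to gauge how much of a treatment effect 'passes through' a binary intermediate outcome is the classical difference method: compare the treatment coefficient in Cox models with and without the candidate surrogate. The Python sketch below, on invented data and assuming the lifelines library, is only that crude illustration; the counterfactual time-to-event mediation techniques refined in the paper handle confounding and effect decomposition far more carefully:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(3)
      n = 400

      treat = rng.integers(0, 2, n)                                   # randomised arm
      pcr = (rng.uniform(size=n) < 0.10 + 0.30 * treat).astype(int)   # intermediate response
      risk = -0.2 * treat - 1.0 * pcr
      t_event = rng.exponential(60.0 / np.exp(risk))
      t_cens = rng.exponential(80.0, n)
      df = pd.DataFrame({
          "months": np.minimum(t_event, t_cens),
          "event": (t_event <= t_cens).astype(int),
          "treat": treat,
          "pcr": pcr,
      })

      # Total effect: Cox model with treatment only.
      total = CoxPHFitter().fit(df[["months", "event", "treat"]],
                                duration_col="months", event_col="event")
      # "Direct" effect: Cox model additionally adjusted for the intermediate.
      direct = CoxPHFitter().fit(df, duration_col="months", event_col="event")

      b_total = total.params_["treat"]
      b_direct = direct.params_["treat"]
      print(f"total log-HR {b_total:.3f}, log-HR adjusted for pCR {b_direct:.3f}")
      print(f"crude proportion 'mediated': {1 - b_direct / b_total:.1%}")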

  17. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, poles and zeros identification. The preliminary general scheme of digital MCA is discussed, as well as some other important techniques about its engineering design. All these lay the foundation of developing homemade digital nuclear spectrometers. (authors)
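
    Of the algorithms simulated, trapezoidal shaping is the easiest to illustrate: a short recursive filter turns each exponentially decaying preamplifier pulse into a trapezoid whose flat top encodes the pulse height. The numpy sketch below follows a widely used recursive formulation (rise length k, flat top l - k, decay-compensation constant M) with made-up pulse parameters; it stands in for the MATLAB simulations mentioned in the record:

      import numpy as np

      # Synthetic preamplifier pulse: exponential decay with time constant tau (samples).
      N, A, tau, t0 = 600, 1.0, 120.0, 50
      v = np.zeros(N)
      v[t0:] = A * np.exp(-np.arange(N - t0) / tau)

      # Recursive trapezoidal shaper: rise of k samples, flat top of (l - k) samples,
      # M compensates the exponential decay of the input pulse.
      k, l = 40, 60
      M = 1.0 / (np.exp(1.0 / tau) - 1.0)

      def delayed(x, d):
          out = np.zeros_like(x)
          out[d:] = x[:-d]
          return out

      d = v - delayed(v, k) - delayed(v, l) + delayed(v, k + l)
      p = np.cumsum(d)
      s = np.cumsum(p + M * d)

      # For this input the flat-top height is A * k * (M + 1), so the normalised
      # peak recovers the pulse amplitude.
      print("recovered amplitude:", s.max() / (k * (M + 1)))   # close to A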

  18. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  19. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  20. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
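
    A small key-space probe of double random phase encoding can be written compactly: encrypt with two random phase masks, then decrypt with a deliberately perturbed Fourier-plane key and record the reconstruction error as the perturbation grows. The numpy sketch below does exactly that on a random test 'image'; the mask size and perturbation levels are arbitrary choices, not those of the paper:

      import numpy as np

      rng = np.random.default_rng(42)
      N = 64
      img = rng.random((N, N))                       # stand-in for the input image

      phi1 = rng.random((N, N))                      # input-plane phase key
      phi2 = rng.random((N, N))                      # Fourier-plane phase key
      P1, P2 = np.exp(2j * np.pi * phi1), np.exp(2j * np.pi * phi2)

      # Double random phase encoding.
      cipher = np.fft.ifft2(np.fft.fft2(img * P1) * P2)

      def decrypt(c, key2):
          # Undo the Fourier-plane mask, then the input-plane mask.
          field = np.fft.ifft2(np.fft.fft2(c) * np.conj(key2)) * np.conj(P1)
          return np.abs(field)

      for eps in (0.0, 0.01, 0.05, 0.2):
          wrong_key = np.exp(2j * np.pi * (phi2 + eps * rng.random((N, N))))
          rec = decrypt(cipher, wrong_key)
          nmse = np.sum((rec - img) ** 2) / np.sum(img ** 2)
          print(f"key perturbation {eps:4.2f}  ->  NMSE {nmse:.4f}")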

  1. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  2. Survival analysis of patients with uveal melanoma after organ preserving and liquidation treatment

    Directory of Open Access Journals (Sweden)

    E. E. Grishina

    2018-01-01

    Full Text Available Rationale: Uveal melanoma is the most common primary malignancy of the eye. Aim: To evaluate survival in patients with uveal melanoma stratified according to the type of treatment and to identify factors significantly associated with their survival. Materials and methods: The study was performed on data extracted from the medical files and follow-up forms of patients with uveal melanoma seen in the Ophthalmological Clinical Hospital of the Department of Healthcare, Moscow, from 1977 to 2012. Analysis of survival was used to assess the life longevity of patients with uveal melanoma. The analysis was censored at January 2013, when the vital status (dead or alive) of all patients was assessed. The factors included in the study analysis were those taken from the follow-up forms. The incidence of uveal melanoma in Moscow (2012) was 0.9 per 100,000 of the population, whereas its prevalence was 11.1 per 100,000. Results: 698 patients with uveal melanoma were included in the study, among them 260 (37%) men (aged from 19 to 87 years, median age 60 years) and 438 (63%) women (aged from 18 to 93 years, median age 63 years); therefore, the proportion of women under follow-up monitoring was 26% higher than that of men. Liquidation treatment (mostly enucleation) was performed in 358 (51%) of the patients, whereas organ-preserving treatment was performed in 340 (49%). At 5, 7, and 10 years of follow-up, the disease-specific survival of patients with uveal melanoma after organ-preserving treatment (median survival not reached) and after liquidation treatment (median, 88 months) was 89 ± 2%, 83 ± 3%, and 75 ± 4% versus 63 ± 3%, 52 ± 4%, and 47 ± 5%, respectively (p = 0.001). Overall survival and disease-specific survival of the patients after liquidation treatment were significantly lower than in the patients after organ-preserving treatment. According to multiple regression analysis, this was associated not with the type of

  3. ATM and p53 combined analysis predicts survival in glioblastoma multiforme patients: A clinicopathologic study.

    Science.gov (United States)

    Romano, Francesco Jacopo; Guadagno, Elia; Solari, Domenico; Borrelli, Giorgio; Pignatiello, Sara; Cappabianca, Paolo; Del Basso De Caro, Marialaura

    2018-06-01

    Glioblastoma is one of the most malignant cancers, with a distinguishing dismal prognosis: surgery followed by chemo- and radiotherapy represents the current standard of care, and chemo- and radioresistance underlie disease recurrence and the short overall survival of patients suffering from this malignancy. ATM is a kinase activated by autophosphorylation upon DNA double-strand breaks arising from errors during replication, byproducts of metabolism, chemotherapy or ionizing radiation; TP53 is one of the best-known tumor suppressors, with a preeminent role in DNA damage response and repair. To study the effects of the immunohistochemical expression of p-ATM and p53 in glioblastoma patients, 21 cases were retrospectively examined. In normal brain tissue, p-ATM was expressed only in neurons; conversely, in tumor cells, the protein showed variable cytoplasmic expression (score: +, ++, +++), being completely undetectable in three cases. Statistical analysis revealed that a high p-ATM score (++/+++) strongly correlated with shorter survival (P = 0.022). No difference in overall survival was registered between p53 normally expressed (NE) and overexpressed (OE) glioblastoma patients (P = 0.669). Survival analysis performed on the results from the combined assessment of the two proteins showed that patients with an NE p53/low p-ATM score had longer overall survival than the NE p53/high p-ATM score counterpart. Cox regression analysis confirmed this finding (HR = 0.025; 95% CI = 0.002-0.284; P = 0.003). Our study outlined the immunohistochemical expression of p-ATM/p53 in glioblastomas and provided data on their possible prognostic/predictive role. A "non-oncogene addiction" to ATM for NE p53 glioblastoma could be postulated, strengthening the rationale for development of ATM-inhibiting drugs. © 2018 Wiley Periodicals, Inc.

  4. Kaplan-Meier Survival Analysis Overestimates the Risk of Revision Arthroplasty: A Meta-analysis.

    Science.gov (United States)

    Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter D; Ghali, William A; Marshall, Deborah A

    2015-11-01

    Although Kaplan-Meier survival analysis is commonly used to estimate the cumulative incidence of revision after joint arthroplasty, it theoretically overestimates the risk of revision in the presence of competing risks (such as death). Because the magnitude of overestimation is not well documented, the potential associated impact on clinical and policy decision-making remains unknown. We performed a meta-analysis to answer the following questions: (1) To what extent does the Kaplan-Meier method overestimate the cumulative incidence of revision after joint replacement compared with alternative competing-risks methods? (2) Is the extent of overestimation influenced by followup time or rate of competing risks? We searched Ovid MEDLINE, EMBASE, BIOSIS Previews, and Web of Science (1946, 1980, 1980, and 1899, respectively, to October 26, 2013) and included article bibliographies for studies comparing estimated cumulative incidence of revision after hip or knee arthroplasty obtained using both Kaplan-Meier and competing-risks methods. We excluded conference abstracts, unpublished studies, or studies using simulated data sets. Two reviewers independently extracted data and evaluated the quality of reporting of the included studies. Among 1160 abstracts identified, six studies were included in our meta-analysis. The principal reason for the steep attrition (1160 to six) was that the initial search was for studies in any clinical area that compared the cumulative incidence estimated using the Kaplan-Meier versus competing-risks methods for any event (not just the cumulative incidence of hip or knee revision); we did this to minimize the likelihood of missing any relevant studies. We calculated risk ratios (RRs) comparing the cumulative incidence estimated using the Kaplan-Meier method with the competing-risks method for each study and used DerSimonian and Laird random effects models to pool these RRs. Heterogeneity was explored using stratified meta-analyses and
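    As a hedged illustration of the overestimation discussed above, the sketch below uses simulated data (not the registries analysed in the meta-analysis) to contrast 1 minus the Kaplan-Meier estimate with the Aalen-Johansen cumulative incidence when death competes with revision.

```python
# Illustrative sketch: 1 - Kaplan-Meier vs Aalen-Johansen cumulative incidence
# of revision when death is a competing risk. All data are simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
t_revision = rng.exponential(20.0, n)      # hypothetical time to revision (years)
t_death = rng.exponential(10.0, n)         # hypothetical time to death
t_censor = rng.uniform(5.0, 25.0, n)       # administrative censoring

time = np.minimum.reduce([t_revision, t_death, t_censor])
event = np.select([t_revision == time, t_death == time], [1, 2], default=0)

order = np.argsort(time)
time, event = time[order], event[order]
at_risk = n - np.arange(n)                 # subjects at risk just before each time

# Kaplan-Meier treating death as censoring: cumulative incidence = 1 - S_KM(t)
km_surv = np.cumprod(1.0 - (event == 1) / at_risk)
km_cuminc = 1.0 - km_surv

# Aalen-Johansen: overall survival uses all events; hazard of revision only
overall_surv = np.cumprod(1.0 - (event > 0) / at_risk)
prev_surv = np.concatenate(([1.0], overall_surv[:-1]))
aj_cuminc = np.cumsum(prev_surv * (event == 1) / at_risk)

horizon = time <= 15.0
print("cumulative incidence of revision by 15 years:")
print("  1 - KM        :", km_cuminc[horizon][-1])
print("  Aalen-Johansen:", aj_cuminc[horizon][-1])
```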

  5. Modelling lecturer performance index of private university in Tulungagung by using survival analysis with multivariate adaptive regression spline

    Science.gov (United States)

    Hasyim, M.; Prastyo, D. D.

    2018-03-01

    Survival analysis models the relationship between independent variables and survival time as the dependent variable. In practice, not all survival data can be recorded completely, for various reasons; in such situations the data are called censored data. Moreover, several models for survival analysis require strict assumptions. One approach in survival analysis is nonparametric modelling, which relaxes these assumptions. In this research, the nonparametric approach employed is Multivariate Adaptive Regression Splines (MARS). This study aims to measure the performance of a private university's lecturers. The survival time in this study is the duration needed by a lecturer to obtain the professional certificate. The results show that research activity is a significant factor, along with developing course materials, good publication in international or national journals, and participation in research collaboration.

  6. Trends in Testicular Cancer Survival: A Large Population-based Analysis.

    Science.gov (United States)

    Sui, Wilson; Morrow, David C; Bermejo, Carlos E; Hellenthal, Nicholas J

    2015-06-01

    To determine whether discrepancies in testicular cancer outcomes between Caucasians and non-Caucasians are changing over time. Although testicular cancer is more common in Caucasians, studies have shown that other races have worse outcomes. Using the Surveillance, Epidemiology, and End Results registry, we identified 29,803 patients diagnosed with histologically confirmed testicular cancer between 1983 and 2011. Of these, 12,650 patients (42%) had 10-year follow-up data. We stratified the patients by age group, stage, race, and year of diagnosis and assessed 10-year overall and cancer-specific survival in each cohort. Cox proportional hazard models were used to determine the relative contributions of each stratum to cancer-specific survival. Predicted overall 10-year survival of Caucasian patients with testicular cancer increased slightly from 88% to 89% over the period studied, whereas predicted cancer-specific 10-year survival dropped slightly from 94% to 93%. In contrast, non-Caucasian men demonstrated larger changes in 10-year overall (84%-86%) and cancer-specific (88%-91%) survival. On univariate analysis, race was significantly associated with testicular cancer death, with non-Caucasian men being 1.69 times more likely to die of testicular cancer than Caucasians (hazard ratio, 1.69; 95% confidence interval, 1.33-2.16). These data show a convergence in cancer-specific survival between racial groups over time, suggesting that diagnostic and treatment discrepancies may be improving for non-Caucasians. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Survival analysis of female dogs with mammary tumors after mastectomy: epidemiological, clinical and morphological aspects

    Directory of Open Access Journals (Sweden)

    Maria Luíza de M. Dias

    2016-03-01

    Full Text Available Abstract: Mammary gland tumors are the most common type of tumor in bitches, but research on survival time after diagnosis is scarce. The purpose of this study was to investigate the relationship between survival time after mastectomy and a number of clinical and morphological variables. Data were collected retrospectively on bitches with mammary tumors seen at the Small Animal Surgery Clinic Service at the University of Brasília. All subjects had undergone mastectomy. Survival analysis was conducted using Cox's proportional hazard method. Of the 139 subjects analyzed, 68 died and 71 survived until the end of the study (64 months). Mean age was 11.76 years (SD=2.71), and 53.84% were small dogs. 76.92% of the tumors were malignant, and 65.73% had both thoracic and inguinal glands affected. Survival time in months was associated with age (hazard rate ratio [HRR]=1.23, p-value=1.4x10^-4), animal size (HRR between giant and small animals=2.61, p-value=0.02), nodule size (HRR=1.09, p-value=0.03), histological type (HRR between solid carcinoma and carcinoma in a mixed tumor=2.40, p-value=0.02), time between diagnosis and surgery (TDS, HRR=1.21, p-value=2.7x10^-15), and the interaction TDS*follow-up time (HRR=0.98, p-value=1.6x10^-11). The present study is one of the few on the subject matter. Several important covariates were evaluated, and age, animal size, nodule size, histological type, TDS and TDS*follow-up time were identified as significantly associated with survival time.
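    For readers unfamiliar with the modelling step, the sketch below shows how a Cox proportional hazards fit of this general kind can be obtained with the lifelines package; the data frame is simulated and the covariates are simplified stand-ins, not the study's variables.

```python
# Hedged sketch: fitting a Cox proportional hazards model of the kind used in
# this study with the lifelines package, on simulated (not the study's) data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 200
age = rng.normal(11, 2.5, n)                   # hypothetical age at surgery (years)
nodule = rng.gamma(2.0, 1.5, n)                # hypothetical nodule size (cm)

# Simulate survival times whose hazard rises with age and nodule size.
baseline = 60.0
time = rng.exponential(baseline * np.exp(-0.1 * (age - 11) - 0.15 * nodule))
censor = rng.uniform(10, 64, n)                # follow-up ends at varying times
df = pd.DataFrame({
    "months": np.minimum(time, censor),
    "died": (time <= censor).astype(int),      # 0 = censored at end of follow-up
    "age": age,
    "nodule_cm": nodule,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()   # hazard ratios (exp(coef)), 95% CIs, p-values
```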

  8. Outcome predictors in the management of intramedullary classic ependymoma: An integrative survival analysis.

    Science.gov (United States)

    Wang, Yinqing; Cai, Ranze; Wang, Rui; Wang, Chunhua; Chen, Chunmei

    2018-06-01

    This is a retrospective study.The aim of this study was to illustrate the survival outcomes of patients with classic ependymoma (CE) and identify potential prognostic factors.CE is the most common category of spinal ependymomas, but few published studies have discussed predictors of the survival outcome.A Boolean search of the PubMed, Embase, and OVID databases was conducted by 2 investigators independently. The objects were intramedullary grade II ependymoma according to 2007 WHO classification. Univariate Kaplan-Meier analysis and Log-Rank tests were performed to identify variables associated with progression-free survival (PFS) or overall survival (OS). Multivariate Cox regression was performed to assess hazard ratios (HRs) with 95% confidence intervals (95% CIs). Statistical analysis was performed by SPSS version 23.0 (IBM Corp.) with statistical significance defined as P analysis showed that patients who had undergone total resection (TR) had better PFS and OS than those with subtotal resection (STR) and biopsy (P = .002, P = .004, respectively). Within either univariate or multivariate analysis (P = .000, P = .07, respectively), histological type was an independent prognostic factor for PFS of CE [papillary type: HR 0.002, 95% CI (0.000-0.073), P = .001, tanycytic type: HR 0.010, 95% CI (0.000-0.218), P = .003].It was the first integrative analysis of CE to elucidate the correlation between kinds of factors and prognostic outcomes. Definite histological type and safely TR were foundation of CE's management. 4.

  9. Survival after radiotherapy in gastric cancer: Systematic review and meta-analysis

    International Nuclear Information System (INIS)

    Valentini, Vincenzo; Cellini, Francesco; Minsky, Bruce D.; Mattiucci, Gian Carlo; Balducci, Mario; D'Agostino, Giuseppe; D'Angelo, Elisa; Dinapoli, Nicola; Nicolotti, Nicola; Valentini, Chiara; La Torre, Giuseppe

    2009-01-01

    Background and purpose: A systematic review and meta-analysis was performed to assess the impact of radiotherapy on both 3- and 5-year survival in patients with resectable gastric cancer. Methods: Randomized Clinical Trials (RCTs) in which radiotherapy, (preoperative, postoperative and/or intraoperative), was compared with surgery alone or surgery plus chemotherapy in resectable gastric cancer were identified by searching web-based databases and supplemented by manual examination of reference lists. Meta-analysis was performed using Risk Ratios (RRs). Random or fixed effects models were used to combine data. The methodological quality was evaluated by Chalmers' score. Results: Radiotherapy had a significant impact on 5-year survival. Using an intent to treat (ITT) and a Per Protocol (PP) analysis, the overall 5-year RR was 1.26 (95% CI: 1.08-1.48; NNT = 17) and 1.31 (95% CI: 1.04-1.66; NNT = 13), respectively. Although the quality of the studies was variable, the data were consistent and no clear publication bias was found. Conclusion: This meta-analysis showed a statistically significant 5-year survival benefit with the addition of radiotherapy in patients with resectable gastric cancer. Radiotherapy remains a standard component in the treatment of resectable gastric cancer and new RCTs need to address the impact of new conformal radiotherapy technologies.
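    A minimal sketch of DerSimonian-Laird random-effects pooling of risk ratios of the kind used here is given below; the study-level numbers are invented for illustration and are not the trials included in this meta-analysis.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of risk ratios on
# the log scale; numbers are made up for illustration.
import numpy as np

rr = np.array([1.10, 1.35, 1.22, 1.48, 1.05])         # hypothetical study RRs
se_log_rr = np.array([0.15, 0.20, 0.10, 0.25, 0.18])  # hypothetical SEs of log(RR)

y = np.log(rr)
w_fixed = 1.0 / se_log_rr**2

# Cochran's Q and the DerSimonian-Laird estimate of between-study variance tau^2
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
q = np.sum(w_fixed * (y - y_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_random = 1.0 / (se_log_rr**2 + tau2)
y_pooled = np.sum(w_random * y) / np.sum(w_random)
se_pooled = np.sqrt(1.0 / np.sum(w_random))

ci = np.exp(y_pooled + np.array([-1.96, 1.96]) * se_pooled)
print("pooled RR = %.2f (95%% CI %.2f-%.2f), tau^2 = %.3f"
      % (np.exp(y_pooled), ci[0], ci[1], tau2))
```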

  10. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  11. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  12. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)]

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co(III) porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  13. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and contain a variety of provenance information and age characteristics. Analyzing and researching ancient ceramics with modern analytical methods provides the scientific foundation for the study of Chinese porcelain. The function and application of nuclear analysis techniques are discussed according to their properties. (authors)

  14. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  15. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  16. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Practical experience in athletics has shown that the more complex the element, the more difficult its technique. The fouetté at 720° is one of the most difficult types of fouetté: its execution depends on a high level of technique during the performer's rotation. Performing this element requires not only good physical condition but also mastery of correct technique by the dancer. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The analysis was based on stereoscopic imaging and theoretical analysis.

  17. Tracheostomy mechanical ventilation in patients with amyotrophic lateral sclerosis: clinical features and survival analysis.

    Science.gov (United States)

    Spataro, Rossella; Bono, Valeria; Marchese, Santino; La Bella, Vincenzo

    2012-12-15

    Tracheostomy mechanical ventilation (TMV) is performed in amyotrophic lateral sclerosis (ALS) patients with respiratory failure or when non-invasive ventilation (NIV) is no longer effective. We evaluated the clinical characteristics and survival of a cohort of tracheostomized ALS patients, followed in a single ALS Clinical Center. Between 2001 and 2010, 87 out of 279 ALS patients were submitted to TMV. Onset was spinal in 62 and bulbar in 25. After tracheostomy, most patients were followed up through telephone interviews with caregivers. A complete survival analysis could be performed in fifty-two TMV patients. 31.3% of ALS patients underwent tracheostomy, with a male prevalence (M/F=1.69) and a median age of 61 years (interquartile range=47-66). After tracheostomy, nearly all patients were under home care. TMV ALS patients were more likely than non-tracheostomized (NT) patients to be implanted with a PEG device, although the bulbar-/spinal-onset ratio did not differ between the two groups. Kaplan-Meier analysis showed that tracheostomy increases median survival (TMV, 47 months vs NT, 31 months, p=0.008), with the greatest effect in patients younger than 60 at onset (TMV ≤ 60 years, 57.5 months vs NT ≤ 60 years, 38.5 months, p=0.002). TMV is increasingly performed in ALS patients. Nearly all TMV patients live at home and most of them are fed through a PEG device. Survival after tracheostomy is generally increased, with a stronger effect in patients younger than 60. This survival advantage is apparently lost when TMV is performed in patients older than 60. The results of this study might be useful for the decision-making process of patients and their families about this advanced palliative care. Copyright © 2012. Published by Elsevier B.V.

  18. Determinants of malignant pleural mesothelioma survival and burden of disease in France: a national cohort analysis.

    Science.gov (United States)

    Chouaid, Christos; Assié, Jean Baptiste; Andujar, Pascal; Blein, Cecile; Tournier, Charlène; Vainchtock, Alexandre; Scherpereel, Arnaud; Monnet, Isabelle; Pairon, Jean Claude

    2018-04-01

    This study was undertaken to determine the healthcare burden of malignant pleural mesothelioma (MPM) in France and to analyze its associations with socioeconomic deprivation, population density, and management outcomes. A national hospital database was used to extract incident MPM patients in the years 2011 and 2012. Cox models were used to analyze 1- and 2-year survival according to sex, age, co-morbidities, management, population-density index, and social deprivation index. The analysis included 1,890 patients (76% men; age: 73.6 ± 10.0 years; 84% with significant co-morbidities; 57% living in urban zones; 53% in highly underprivileged areas). Only 1% underwent a curative surgical procedure; 65% received at least one chemotherapy cycle, 72% of them with at least one pemetrexed and/or bevacizumab administration. One- and 2-year survival rates were 64% and 48%, respectively. Median survival was 14.9 (95% CI: 13.7-15.7) months. The mean cost per patient was 27,624 ± 17,263 euros (31% representing pemetrexed and bevacizumab costs). Multivariate analyses retained male sex, age >70 years, chronic renal failure, chronic respiratory failure, and never receiving pemetrexed as factors of poor prognosis. After adjusting the analysis for age, sex, and co-morbidities, living in a rural/semi-rural area was associated with better 2-year survival (HR: 0.83 [95% CI: 0.73-0.94]; P < 0.01); the social deprivation index was not significantly associated with survival. With approximately 1,000 new cases per year in France, MPM represents a significant national healthcare burden. Co-morbidities, sex, age, and place of residence appear to be significant prognostic factors. © 2018 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.

  19. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Full Text Available Arguably a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: (1) loan term length and (2) loan purpose. The analysis is conducted using a survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed to compare the complete survival period of the cohorts. The findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer-term loans are riskier than shorter-term ones, and the least risky loans are those used for credit card payoff.
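    A hedged sketch of the two analysis steps described above, on simulated loan data, is given below: a Kaplan-Meier estimate of the 12-month probability of default per cohort, followed by a log-rank comparison of the full survival curves (using the lifelines package).

```python
# Illustrative sketch on simulated loans: 12-month probability of default per
# cohort from Kaplan-Meier, then a log-rank test comparing the cohorts.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
n = 1000
short_term = rng.integers(0, 2, n).astype(bool)            # hypothetical 36- vs 60-month loans
true_time = rng.exponential(np.where(short_term, 90, 60))  # months to default
observed = np.minimum(true_time, 36.0)                     # observation window of the vintage
defaulted = (true_time <= 36.0).astype(int)

results = {}
for label, mask in [("short", short_term), ("long", ~short_term)]:
    kmf = KaplanMeierFitter()
    kmf.fit(observed[mask], event_observed=defaulted[mask])
    results[label] = 1.0 - kmf.predict(12.0)               # 12-month probability of default
print("12-month PD:", results)

lr = logrank_test(observed[short_term], observed[~short_term],
                  event_observed_A=defaulted[short_term],
                  event_observed_B=defaulted[~short_term])
print("log-rank p-value:", lr.p_value)
```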

  20. Chemoembolization With Doxorubicin-Eluting Beads for Unresectable Hepatocellular Carcinoma: Five-Year Survival Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Malagari, Katerina, E-mail: kmalag@otonet.gr [University of Athens, Second Department of Radiology (Greece); Pomoni, Mary [University of Athens, Imaging and Research Unit (Greece); Moschouris, Hippocrates, E-mail: hipmosch@gmail.com [Tzanion Hospital, Department of Radiology (Greece); Bouma, Evanthia [University of Athens, Imaging and Research Unit (Greece); Koskinas, John [Ippokration Hospital, University of Athens, Department of Internal Medicine and Hepatology (Greece); Stefaniotou, Aspasia [University of Athens, Imaging and Research Unit (Greece); Marinis, Athanasios [Tzanion Hospital, Department of Surgery (Greece); Kelekis, Alexios; Alexopoulou, Efthymia [University of Athens, Second Department of Radiology (Greece); Chatziioannou, Achilles [University of Athens, First Department of Radiology (Greece); Chatzimichael, Katerina [University of Athens, Second Department of Radiology (Greece); Dourakis, Spyridon [Ippokration Hospital, University of Athens, Department of Internal Medicine and Hepatology (Greece); Kelekis, Nikolaos [University of Athens, Second Department of Radiology (Greece); Rizos, Spyros [Tzanion Hospital, Department of Surgery (Greece); Kelekis, Dimitrios [University of Athens, Imaging and Research Unit (Greece)

    2012-10-15

    Purpose: The purpose of this study was to report on the 5-year survival of hepatocellular carcinoma (HCC) patients treated with DC Bead loaded with doxorubicin (DEB-DOX) in a scheduled scheme in up to three treatments and thereafter on demand. Materials and Methods: 173 HCC patients not suitable for curable treatments were prospectively enrolled (mean age 70.4 ± 7.4 years). Child-Pugh (Child) class was A/B (102/71 [59/41 %]), Okuda stage was 0/1/2 (91/61/19 [53.2/35.7/11.1 %]), and mean lesion diameter was 7.6 ± 2.1 cm. Lesion morphology was one dominant ≤5 cm (22 %), one dominant >5 cm (41.6 %), multifocal ≤5 (26 %), and multifocal >5 (10.4 %). Results: Overall survival at 1, 2, 3, 4, and 5 years was 93.6, 83.8, 62, 41.04, and 22.5 %, with higher rates achieved in Child class A compared with Child class B patients (95, 88.2, 61.7, 45, and 29.4 % vs. 91.5, 75, 50.7, 35.2, and 12.8 %). Mean overall survival was 43.8 months (range 1.2-64.8). Cumulative survival was better for Child class A compared with Child class B patients (p = 0.029). For patients with dominant lesions ≤5 cm 1-, 2-, 3-, 4-, and 5-year survival rates were 100, 95.2, 71.4, 66.6, and 47.6 % for Child class A and 94.1, 88.2, 58.8, 41.2, 29.4, and 23.5 % for Child class B patients. Regarding DEB-DOX treatment, multivariate analysis identified number of lesions (p = 0.033), lesion vascularity (p < 0.0001), initially achieved complete response (p < 0.0001), and objective response (p = 0.046) as significant and independent determinants of 5-year survival. Conclusion: DEB-DOX results in high rates of 5-year survival for patients not amenable to curative treatments. Number of lesions, lesion vascularity, and local response were significant independent determinants of 5-year survival.

  1. Chemoembolization With Doxorubicin-Eluting Beads for Unresectable Hepatocellular Carcinoma: Five-Year Survival Analysis

    International Nuclear Information System (INIS)

    Malagari, Katerina; Pomoni, Mary; Moschouris, Hippocrates; Bouma, Evanthia; Koskinas, John; Stefaniotou, Aspasia; Marinis, Athanasios; Kelekis, Alexios; Alexopoulou, Efthymia; Chatziioannou, Achilles; Chatzimichael, Katerina; Dourakis, Spyridon; Kelekis, Nikolaos; Rizos, Spyros; Kelekis, Dimitrios

    2012-01-01

    Purpose: The purpose of this study was to report on the 5-year survival of hepatocellular carcinoma (HCC) patients treated with DC Bead loaded with doxorubicin (DEB-DOX) in a scheduled scheme in up to three treatments and thereafter on demand. Materials and Methods: 173 HCC patients not suitable for curable treatments were prospectively enrolled (mean age 70.4 ± 7.4 years). Child-Pugh (Child) class was A/B (102/71 [59/41 %]), Okuda stage was 0/1/2 (91/61/19 [53.2/35.7/11.1 %]), and mean lesion diameter was 7.6 ± 2.1 cm. Lesion morphology was one dominant ≤5 cm (22 %), one dominant >5 cm (41.6 %), multifocal ≤5 (26 %), and multifocal >5 (10.4 %). Results: Overall survival at 1, 2, 3, 4, and 5 years was 93.6, 83.8, 62, 41.04, and 22.5 %, with higher rates achieved in Child class A compared with Child class B patients (95, 88.2, 61.7, 45, and 29.4 % vs. 91.5, 75, 50.7, 35.2, and 12.8 %). Mean overall survival was 43.8 months (range 1.2–64.8). Cumulative survival was better for Child class A compared with Child class B patients (p = 0.029). For patients with dominant lesions ≤5 cm 1-, 2-, 3-, 4-, and 5-year survival rates were 100, 95.2, 71.4, 66.6, and 47.6 % for Child class A and 94.1, 88.2, 58.8, 41.2, 29.4, and 23.5 % for Child class B patients. Regarding DEB-DOX treatment, multivariate analysis identified number of lesions (p = 0.033), lesion vascularity (p < 0.0001), initially achieved complete response (p < 0.0001), and objective response (p = 0.046) as significant and independent determinants of 5-year survival. Conclusion: DEB-DOX results in high rates of 5-year survival for patients not amenable to curative treatments. Number of lesions, lesion vascularity, and local response were significant independent determinants of 5-year survival.

  2. Tracheostomy and invasive mechanical ventilation in amyotrophic lateral sclerosis: decision-making factors and survival analysis.

    Science.gov (United States)

    Kimura, Fumiharu

    2016-04-28

    Invasive and/or non-invasive mechanical ventilation are the most important options for respiratory management in amyotrophic lateral sclerosis. We evaluated the frequency, clinical characteristics, decision-making factors regarding ventilation, and survival of 190 people with amyotrophic lateral sclerosis from 1990 until 2013. Thirty-one percent of patients underwent tracheostomy invasive ventilation, with the rate increasing over the past 20 years. The proportion of tracheostomy invasive ventilation in patients >65 years old was significantly increased after 2000 (25%) as compared to before (10%). After 2010, the standard use of non-invasive ventilation showed a tendency to reduce the frequency of tracheostomy invasive ventilation. Mechanical ventilation prolonged median survival (75 months with tracheostomy invasive ventilation and 43 months with non-invasive ventilation vs 32 months for the natural course). The life-extending effects of tracheostomy invasive ventilation were greater in younger patients (≤65 years old at the time of ventilation support) than in older patients. Presence of a partner and care at home were associated with better survival. The following factors were related to the decision to perform tracheostomy invasive ventilation: age ≤65 years; greater use of non-invasive ventilation; presence of a spouse; faster tracheostomy; higher progression rate; and preserved motor function. No patients who underwent tracheostomy invasive ventilation died from a decision to withdraw mechanical ventilation. The present study identifies factors related to the decision-making process and survival after tracheostomy, and helps clinicians and family members expand their knowledge about ventilation.

  3. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia); Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Raduan, Farhana, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Sagap, Ismail, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia); Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women, respectively, and the second most common in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include times from tumor removal until the first, second and third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. Data on tumor recurrence and overall survival of colorectal cancer patients were taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
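    As a generic illustration of the combination step, the sketch below stacks hypothetical per-endpoint statistics and combines them through an assumed joint covariance into a single chi-square statistic; it is not the paper's exact implementation of the global score test or of the Wei, Lin and Weissfeld method.

```python
# Generic sketch of combining marginal (per-endpoint) statistics into one global
# chi-square statistic via their joint covariance. All numbers are hypothetical.
import numpy as np
from scipy.stats import chi2

# Treatment-effect statistics for: 1st, 2nd, 3rd recurrence, and death
u = np.array([1.8, 1.2, 0.9, 1.5])

# Assumed covariance of the four statistics (endpoints measured on the same
# patients are correlated, so off-diagonal terms are non-zero)
v = np.array([[1.00, 0.45, 0.30, 0.35],
              [0.45, 1.00, 0.50, 0.30],
              [0.30, 0.50, 1.00, 0.25],
              [0.35, 0.30, 0.25, 1.00]])

global_stat = u @ np.linalg.solve(v, u)        # U' V^{-1} U
p_value = chi2.sf(global_stat, df=len(u))
print(f"global chi-square = {global_stat:.2f}, p = {p_value:.4f}")
```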

  4. The survival analysis of beta thalassemia major patients in South East of Iran

    International Nuclear Information System (INIS)

    Roudbari, M.; Soltani-Rad, M.; Roudbari, S.

    2008-01-01

    The objective was to determine the survival of transfused beta-thalassemia major patients and its related factors in the Southeast of Iran. This cross-sectional study was performed in Zahedan, Iran in 2007. The sample included patients who were referred to the Zahedan Thalassemia Center from 1998 to 2006. The data were collected using the patients' records, which were completed by the staff during transfusion. The data included demographic and medical information: blood group, blood Rh, the kind of transfused blood (KTB), annual number of transfusions (ANOT), accompanying disease (AD), hemoglobin (Hb) and ferritin level. For data analysis, the Kaplan-Meier method and log-rank test, together with Cox regression, were used. Forty-six of 578 patients died and 99% survived the first year. The survival proportions at 5, 10, 15 and 20 years of age were 97.9%, 97%, 92.1% and 81.2%, respectively. Survival time showed significant relationships with ANOT (p=0.0053), KTB (p=0.003), Hb (p=0.002), ferritin level (p=0.0087) and AD (p=0.00). Regular transfusion, attention to screening of transfused blood, and increasing families' knowledge of the disease to prevent the bearing of a thalassemic fetus are recommended; finally, the detection and treatment of accompanying diseases are of great importance for extending the lifetime of these patients. (author)

  5. SPSS survival manual a step by step guide to data analysis using SPSS

    CERN Document Server

    Pallant, Julie

    2010-01-01

    In this thoroughly revised edition of her bestselling text, now covering up to version 18 of the SPSS software, Julie Pallant guides you through the entire research process, helping you choose the right data analysis technique for your project.

  6. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  7. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  8. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  9. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also discussed. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed as well. The possibility of applying CE in food control laboratories, where analysis of the composition of food and food products is conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  10. Surrogacy of progression-free survival (PFS) for overall survival (OS) in esophageal cancer trials with preoperative therapy: Literature-based meta-analysis.

    Science.gov (United States)

    Kataoka, K; Nakamura, K; Mizusawa, J; Kato, K; Eba, J; Katayama, H; Shibata, T; Fukuda, H

    2017-10-01

    There have been no reports evaluating progression-free survival (PFS) as a surrogate endpoint in resectable esophageal cancer. This study was conducted to evaluate the trial-level correlations between PFS and overall survival (OS) in resectable esophageal cancer with preoperative therapy and to explore the potential benefit of PFS as a surrogate endpoint for OS. A systematic literature search of randomized trials with preoperative chemotherapy or preoperative chemoradiotherapy for esophageal cancer reported from January 1990 to September 2014 was conducted using PubMed and the Cochrane Library. Weighted linear regression using the sample size of each trial as a weight was used to estimate the coefficient of determination (R²) between PFS and OS. The primary analysis included trials in which the HR for both PFS and OS was reported. The sensitivity analysis included trials in which either HR or median survival time of PFS and OS was reported. In the sensitivity analysis, HR was estimated from the median survival time of PFS and OS, assuming an exponential distribution. Of 614 articles, 10 trials were selected for the primary analysis and 15 for the sensitivity analysis. The primary analysis did not show a correlation between treatment effects on PFS and OS (R² = 0.283, 95% CI [0.00-0.90]). The sensitivity analysis did not show an association between PFS and OS (R² = 0.084, 95% CI [0.00-0.70]). Although the number of randomized controlled trials evaluating preoperative therapy for esophageal cancer is limited at the moment, PFS is not suitable as a surrogate endpoint for OS for use as a primary endpoint. Copyright © 2017 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
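    The sensitivity-analysis step lends itself to a small sketch: under an exponential assumption the hazard rate is ln(2)/median, so a hazard ratio can be approximated from reported medians, and R² then comes from a weighted regression of log HR for OS on log HR for PFS. All numbers below are hypothetical.

```python
# Hedged sketch: approximate HRs from reported medians under an exponential
# assumption, then a weighted trial-level regression of log HR(OS) on log HR(PFS).
import numpy as np

def hr_from_medians(median_control, median_experimental):
    # lambda = ln(2) / median  =>  HR = lambda_exp / lambda_ctrl = median_ctrl / median_exp
    return median_control / median_experimental

# hypothetical per-trial medians (months) and sample sizes
pfs_ctrl, pfs_exp = np.array([10, 12, 9]), np.array([13, 15, 10])
os_ctrl, os_exp = np.array([20, 24, 18]), np.array([24, 30, 19])
n_trial = np.array([150, 300, 100])

x = np.log(hr_from_medians(pfs_ctrl, pfs_exp))
y = np.log(hr_from_medians(os_ctrl, os_exp))

# weighted least squares with trial sample size as weight
# (np.polyfit squares its weights, so pass sqrt(n) to weight residuals by n)
coef = np.polyfit(x, y, deg=1, w=np.sqrt(n_trial))
fitted = np.polyval(coef, x)
ss_res = np.sum(n_trial * (y - fitted) ** 2)
ss_tot = np.sum(n_trial * (y - np.average(y, weights=n_trial)) ** 2)
print("weighted R^2 =", 1.0 - ss_res / ss_tot)
```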

  11. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  12. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  13. Analysis on Lung Cancer Survival from 2001 to 2007 in Qidong, China

    Directory of Open Access Journals (Sweden)

    Jian ZHU

    2011-01-01

    Full Text Available Background and objective Lung cancer is one of the most important malignancies in China. Survival rates of lung cancer in the population-based cancer registry of Qidong for the years 2001-2007 were analysed in order to provide a basis for prognosis assessment and control of this cancer. Methods A total of 4,451 registered lung cancer cases were followed up to December 31st, 2009. Death certificate only (DCO) cases were excluded, leaving 4,382 cases for survival analysis. The cumulative observed survival rate (OS) and relative survival rate (RS) were calculated using Hakulinen's method as implemented in the SURV 3.01 software developed at the Finnish Cancer Registry. Results The 1-, 3-, and 5-year OS rates were 23.73%, 11.89% and 10.01%, and the RS rates were 24.86%, 13.69% and 12.73%, respectively. The 1-, 3-, and 5-year RS of males vs females were 23.70% vs 27.89%, 12.58% vs 16.53%, and 11.73% vs 15.21%, respectively, with statistically significant differences (χ2=13.77, P=0.032). RS for the age groups 15-34, 35-44, 45-54, 55-64, 65-74 and 75+ was 35.46%, 17.66%, 11.97%, 13.49%, 10.61% and 15.14%, respectively. Remarkable improvement could be seen for the 5-year RS in this setting compared with that for the years 1972-2000. Conclusion Lung cancer survival outcomes in Qidong have improved gradually over the past decades. Further measures for the prevention, diagnosis and treatment of lung cancer should be taken.
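    A simplified sketch of the idea behind relative survival is given below: observed cohort survival divided by the survival expected for a matched general population. The life-table value is invented, and this is not the Hakulinen estimator as implemented in SURV 3.01.

```python
# Simplified illustration of relative survival = observed / expected survival.
# The expected survival would normally come from age- and sex-matched general
# population life tables; here a single hypothetical annual value is used.
import numpy as np

years = np.array([1, 3, 5])
observed_survival = np.array([0.24, 0.12, 0.10])   # hypothetical cohort estimates

expected_annual = 0.97                             # hypothetical life-table value
expected_survival = expected_annual ** years

relative_survival = observed_survival / expected_survival
for t, rs in zip(years, relative_survival):
    print(f"{t}-year relative survival: {rs:.3f}")
```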

  14. The costs of treating acute heart failure: an economic analysis of the SURVIVE trial.

    Science.gov (United States)

    de Lissovoy, Gregory; Fraeman, Kathy; Salon, Jeff; Chay Woodward, Tatia; Sterz, Raimund

    2008-01-01

    To estimate the incremental cost per life year gained with levosimendan relative to dobutamine in treatment of acute heart failure based on the Survival of Patients with Acute Heart Failure in Need of Intravenous Inotropic Support (SURVIVE) trial. SURVIVE enrolled 1,327 patients (levosimendan 664, dobutamine 663) from nine nations with 180-day survival from date of randomisation as the primary endpoint. Hospital resource utilisation was determined via clinical case reports. Unit costs were derived from hospital payment schedules for France, Germany and the UK, and represent a third-party payer perspective. Cost-effectiveness analysis was performed for a subset of the SURVIVE patient population selected in accordance with current levosimendan labeling. Mortality in the levosimendan group was 26 versus 28% for dobutamine (hazard ratio 0.91, 95% confidence interval 0.74-1.13, p=0.40). Initial hospitalisation length of stay was identical (levosimendan 14.4, dobutamine 14.5, p=0.98). Slightly lower rates of readmission were observed for levosimendan relative to dobutamine at 31 (p=0.13) and 180 days (p=0.23). Mean costs excluding study drug were equivalent for the index admission (levosimendan €5,060, dobutamine €4,952; p=0.91) and complete episode (levosimendan €5,396, dobutamine €5,275; p=0.93). At an acquisition cost of €600 per vial, there is at least 50% likelihood that levosimendan is cost effective relative to dobutamine if willingness to pay is equal to or greater than €15,000 per life year gained.

  15. Survival analysis with functional covariates for partial follow-up studies.

    Science.gov (United States)

    Fang, Hong-Bin; Wu, Tong Tong; Rapoport, Aaron P; Tan, Ming

    2016-12-01

    Predictive or prognostic analysis plays an increasingly important role in the era of personalized medicine to identify subsets of patients whom the treatment may benefit the most. Although various time-dependent covariate models are available, such models require that covariates be followed in the whole follow-up period. This article studies a new class of functional survival models where the covariates are only monitored in a time interval that is shorter than the whole follow-up period. This paper is motivated by the analysis of a longitudinal study on advanced myeloma patients who received stem cell transplants and T cell infusions after the transplants. The absolute lymphocyte cell counts were collected serially during hospitalization. Those patients are still followed up if they are alive after hospitalization, while their absolute lymphocyte cell counts cannot be measured after that. Another complication is that absolute lymphocyte cell counts are sparsely and irregularly measured. The conventional method using a Cox model with time-varying covariates is not applicable because of the different lengths of observation periods. Analysis based on each single observation obviously underutilizes available information and, more seriously, may yield misleading results. This so-called partial follow-up study design represents an increasingly common predictive modeling problem where we have multiple serial biomarkers up to a certain time point, which is shorter than the total length of follow-up. We therefore propose a solution to the partial follow-up design. The new method combines functional principal components analysis and survival analysis with selection of those functional covariates. It also has the advantage of handling sparse and irregularly measured longitudinal observations of covariates and measurement errors. Our analysis based on functional principal components reveals that it is the patterns of the trajectories of absolute lymphocyte cell counts, instead of
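    A rough sketch of the kind of pipeline described (not the authors' implementation) is given below: each patient's sparse, irregular marker trajectory is interpolated onto a common grid over the monitored interval, principal-component scores are extracted, and the scores enter a Cox model. All data are simulated, and the lifelines package is assumed for the Cox fit.

```python
# Rough sketch of an FPCA-then-Cox pipeline on simulated, sparsely observed
# trajectories; not the authors' method, which also handles measurement error
# and covariate selection.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n, grid = 150, np.linspace(0, 30, 16)          # days 0-30 of monitoring

curves, times, events = [], [], []
for _ in range(n):
    slope = rng.normal(0.05, 0.03)             # patient-specific recovery rate
    obs_days = np.sort(rng.uniform(0, 30, rng.integers(4, 9)))  # sparse, irregular visits
    obs_vals = 0.2 + slope * obs_days + rng.normal(0, 0.05, obs_days.size)
    curves.append(np.interp(grid, obs_days, obs_vals))          # crude smoothing step
    true_t = rng.exponential(24 * np.exp(3 * slope))            # survival after monitoring
    c = rng.uniform(6, 60)
    times.append(min(true_t, c)); events.append(int(true_t <= c))

X = np.vstack(curves)
X_centered = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(X_centered, full_matrices=False)       # principal components
scores = X_centered @ vt[:2].T                                  # first two FPC scores

df = pd.DataFrame({"time": times, "event": events,
                   "fpc1": scores[:, 0], "fpc2": scores[:, 1]})
CoxPHFitter().fit(df, duration_col="time", event_col="event").print_summary()
```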

  16. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  17. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.

  18. Association between obesity with disease-free survival and overall survival in triple-negative breast cancer: A meta-analysis.

    Science.gov (United States)

    Mei, Lin; He, Lin; Song, Yuhua; Lv, Yang; Zhang, Lijiu; Hao, Fengxi; Xu, Mengmeng

    2018-05-01

    To investigate the relationship between obesity and disease-free survival (DFS) and overall survival (OS) in triple-negative breast cancer. Citations were searched in PubMed, the Cochrane Library, and Web of Science. Random-effects model meta-analysis was conducted using RevMan software version 5.0, and publication bias was evaluated by Egger's regression with Stata software version 12. Nine studies (4412 patients) were included in the DFS meta-analysis and 8 studies (4392 patients) in the OS meta-analysis. There was no statistically significant association between obesity and DFS (P = .60) or OS (P = .71) in triple-negative breast cancer (TNBC) patients. Obesity has no impact on DFS and OS in patients with TNBC.

  19. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  20. Hypofractionated radiation therapy for invasive thyroid carcinoma in dogs: a retrospective analysis of survival

    International Nuclear Information System (INIS)

    Brearley, M.J.; Hayes, A.M.; Murphy, S.

    1999-01-01

    Thirteen dogs with invasive thyroid carcinoma (WHO classification T2b or T3b) seen between January 1991 and October 1997 were treated by external beam irradiation. Four once-weekly fractions of 9 gray of 4 MeV X-rays were administered. Four of the dogs died of progression of the primary disease and four from metastatic spread. Of the remaining dogs, three died of unrelated problems and two were still alive at the time of censoring. Kaplan-Meier analysis of the survival time from first dose to death from either primary or metastatic disease gave a median survival time of 96 weeks (mean 85 weeks, range six to 247 weeks). Radiographic evidence of pulmonary metastatic disease at presentation had no prognostic value, whereas crude growth rate was a highly significant factor. The present series indicates that radiation therapy should be considered an important modality for the control of invasive thyroid carcinoma in the dog.

  1. Mathematical analysis of 51Cr-labelled red cell survival curves in congenital haemolytic anaemias

    International Nuclear Information System (INIS)

    Kasfiki, A.G.; Antipas, S.E.; Dimitriou, P.A.; Gritzali, F.A.; Melissinos, K.G.

    1982-01-01

    The parameters of 51Cr-labelled red cell survival curves were calculated in 26 patients with homozygous β-thalassaemia, 8 with sickle-cell anaemia and 3 with s-β-thalassaemia, using a non-linear weighted least squares analysis computer program. In thalassaemic children the calculated parameters denote that the shortening of the mean cell life is due to early senescence alone, while there is some evidence that in thalassaemic adults additional extracellular destruction mechanisms participate as well. Red cell survival curves from patients with sickle-cell anaemia and s-β-thalassaemia resemble each other, while their parameters indicate an initial rapid loss of radioactivity, early senescence and the presence of extracellular red cell destruction factors. (orig.)
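    As a hedged illustration of the fitting step, the sketch below fits a two-parameter red-cell survival model by weighted non-linear least squares with SciPy; the parametric form (exponential random loss times a linear senescence term) is one common choice and not necessarily the exact model used in this paper, and the data points are made up.

```python
# Hedged sketch: weighted non-linear least squares fit of a parametric 51Cr
# survival curve. Model form and data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def survival_model(t, lam, T):
    """Fraction of 51Cr activity remaining at day t: random destruction at rate
    lam combined with linear senescence ending at potential lifespan T."""
    return np.exp(-lam * t) * np.clip(1.0 - t / T, 0.0, None)

days = np.array([1, 3, 7, 10, 14, 18, 22, 26, 30])
activity = np.array([0.97, 0.90, 0.78, 0.70, 0.58, 0.47, 0.37, 0.28, 0.20])
sigma = 0.02 * np.ones_like(activity)          # measurement uncertainty -> weights

params, cov = curve_fit(survival_model, days, activity,
                        p0=[0.02, 60.0], sigma=sigma, absolute_sigma=True)
lam, T = params
print(f"random-destruction rate = {lam:.4f}/day, potential lifespan = {T:.1f} days")
print("parameter SEs:", np.sqrt(np.diag(cov)))
```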

  2. Retrospective Analysis of the Survival Benefit of Induction Chemotherapy in Stage IVa-b Nasopharyngeal Carcinoma.

    Science.gov (United States)

    Lan, Xiao-Wen; Zou, Xue-Bin; Xiao, Yao; Tang, Jie; OuYang, Pu-Yun; Su, Zhen; Xie, Fang-Yun

    2016-01-01

    The value of adding induction chemotherapy to chemoradiotherapy in locoregionally advanced nasopharyngeal carcinoma (LA-NPC) remains controversial, yet high-risk patients with LA-NPC have poor outcomes after chemoradiotherapy. We aimed to assess the survival benefits of induction chemotherapy in stage IVa-b NPC. A total of 602 patients with stage IVa-b NPC treated with intensity-modulated radiation therapy (IMRT) and concurrent chemotherapy with or without induction chemotherapy were retrospectively analyzed. Overall survival (OS), locoregional relapse-free survival (LRFS), distant metastasis-free survival (DMFS) and progression-free survival (PFS) were evaluated using the Kaplan-Meier method, log-rank test and Cox regression analysis. In univariate analysis, 5-year OS was 83.2% for induction chemotherapy plus concurrent chemotherapy and 74.8% for concurrent chemotherapy alone, corresponding to an absolute risk reduction of 8.4% (P = 0.022). Compared to concurrent chemotherapy alone, addition of induction chemotherapy improved 5-year DMFS (83.2% vs. 74.4%, P = 0.018) but not 5-year LRFS (83.7% vs. 83.0%, P = 0.848) or PFS (71.9% vs. 66.0%, P = 0.12). Age, T category, N category, chemotherapy strategy and clinical stage were associated with 5-year OS (P = 0.017, P = 0.031, P = 0.007, P = 0.022, P = 0.001, respectively). In multivariate analysis, induction chemotherapy plus concurrent chemotherapy was an independent favorable prognostic factor for OS (HR, 0.62; 95% CI, 0.43-0.90, P = 0.012) and DMFS (HR, 0.57; 95% CI, 0.38-0.83, P = 0.004). In subgroup analysis, induction chemotherapy significantly improved 5-year DMFS in stage IVa (86.8% vs. 77.3%, P = 0.008), but provided no significant benefit in stage IVb. In patients with stage IVa-b NPC treated with IMRT, addition of induction chemotherapy to concurrent chemotherapy significantly improved 5-year OS and 5-year DMFS. This study provides a basis for selection of high risk patients in future clinical therapeutic

  3. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text : With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening universities and national research centers to their local, national and international environments. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a program of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  4. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze the design and recommend improvements. The recommendations, if incorporated, would yield a 38% system-wide savings and a shipping container that is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? and What would that cost? Using logic and a disciplined approach, the result of the Value Analysis is a design that performs the necessary functions at high quality and the lowest overall cost

  5. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Besides its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. This is followed by a description of different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominantly nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  7. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. Major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying reliability of communication networks, and preliminary models for assessing reliability of safety-critical communication networks

  8. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as in understanding the fundamental aspects of grape and wine production so as to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  9. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as in understanding the fundamental aspects of grape and wine production so as to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  10. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis through automation and easy data handling, in addition to providing the accuracy required by quality control and research applications

  11. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of the conditions of a major spill by pouring crude oil onto the cells from perforated cans, and the in-situ bioremediation of the polluted soils using techniques that consisted of manipulating different variables within the soil environment. The analysis of soil characteristics after a ...

  12. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.…
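
    The two indices named in this record are simple to compute. The sketch below is a minimal Python illustration on an invented response matrix: item facility as the proportion of correct answers and a point-biserial discrimination index against the rest-of-test score; it is not the author's original procedure.

```python
# Minimal sketch of classical item analysis: item facility (proportion correct)
# and a point-biserial discrimination index (item score vs. rest-of-test score).
# The response matrix below is invented for illustration.
import numpy as np

# rows = examinees, columns = items; 1 = correct, 0 = incorrect
X = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [1, 1, 0, 0, 1],
    [0, 1, 0, 1, 0],
])

facility = X.mean(axis=0)  # proportion of examinees answering each item correctly

discrimination = []
for j in range(X.shape[1]):
    rest = X.sum(axis=1) - X[:, j]          # total score excluding item j
    r = np.corrcoef(X[:, j], rest)[0, 1]    # point-biserial = Pearson r with a 0/1 item
    discrimination.append(r)

for j, (p, r) in enumerate(zip(facility, discrimination)):
    print(f"item {j + 1}: facility = {p:.2f}, discrimination = {r:.2f}")
```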

  13. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  14. Meta-regression analysis of commensal and pathogenic Escherichia coli survival in soil and water.

    Science.gov (United States)

    Franz, Eelco; Schijven, Jack; de Roda Husman, Ana Maria; Blaak, Hetty

    2014-06-17

    The extent to which pathogenic and commensal E. coli (respectively PEC and CEC) can survive, and which factors predominantly determine the rate of decline, are crucial issues from a public health point of view. The goal of this study was to provide a quantitative summary of the variability in E. coli survival in soil and water over a broad range of individual studies and to identify the most important sources of variability. To that end, a meta-regression analysis on available literature data was conducted. The considerable variation in reported decline rates indicated that the persistence of E. coli is not easily predictable. The meta-analysis demonstrated that for soil and water, the type of experiment (laboratory or field), the matrix subtype (type of water and soil), and temperature were the main factors included in the regression analysis. A higher average decline rate in soil of PEC compared with CEC was observed. The regression models explained at best 57% of the variation in decline rate in soil and 41% of the variation in decline rate in water. This indicates that additional factors, not included in the current meta-regression analysis, are of importance but rarely reported. More complete reporting of experimental conditions may allow future inference on the global effects of these variables on the decline rate of E. coli.
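
    A meta-regression of this kind boils down to regressing reported decline rates on study-level moderators. The following Python sketch illustrates that idea with statsmodels on invented data; the moderator names and coding are assumptions, and the published analysis may have used weighting schemes not reproduced here.

```python
# Illustrative meta-regression sketch (invented data): regress reported E. coli
# decline rates on study-level moderators such as setting, matrix and temperature,
# in the spirit of the analysis described above. Weighted least squares with
# inverse-variance weights would be a natural refinement when precision estimates exist.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 40
studies = pd.DataFrame({
    "field": rng.integers(0, 2, n),              # 1 = field study, 0 = laboratory
    "temperature": rng.uniform(4, 30, n),        # deg C
    "manure_amended": rng.integers(0, 2, n),     # hypothetical matrix-subtype flag
})
# invented "observed" decline rates (per day)
studies["decline_rate"] = (0.2 + 0.02 * studies["temperature"]
                           + 0.1 * studies["field"] + rng.normal(0, 0.1, n))

X = sm.add_constant(studies[["field", "temperature", "manure_amended"]])
model = sm.OLS(studies["decline_rate"], X).fit()
print(model.summary())
print("share of variance explained (R^2):", round(model.rsquared, 2))
```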

  15. Young patients with colorectal cancer have poor survival in the first twenty months after operation and predictable survival in the medium and long-term: Analysis of survival and prognostic markers

    Directory of Open Access Journals (Sweden)

    Wickramarachchi RE

    2010-09-01

    Full Text Available Abstract Objectives This study compares clinico-pathological features in young (<40 years) and older (>50 years) patients with colorectal cancer, survival in the young, and the influence of pre-operative clinical and histological factors on survival. Materials and methods A twelve year prospective database of colorectal cancer was analysed. Fifty-three young patients were compared with forty seven consecutive older patients over fifty years old. An analysis of survival was undertaken in young patients using Kaplan-Meier graphs, non-parametric methods, Cox's Proportional Hazard Ratios and Weibull Hazard models. Results Young patients comprised 13.4 percent of 397 patients with colorectal cancer. Duration of symptoms and presentation in the young was similar to older patients (median, range: young patients 6 months, 2 weeks to 2 years; older patients 4 months, 4 weeks to 3 years; p > 0.05). In both groups, the majority presented without bowel obstruction (young - 81%, older - 94%). Cancer proximal to the splenic flexure was more frequent in young than in older patients. Synchronous cancers were found exclusively in the young. Mucinous tumours were seen in 16% of young and 4% of older patients. Conclusion If patients who are less than 40 years old with colorectal cancer survive twenty months after operation, the prognosis improves and their survival becomes predictable.
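
    Since the record lists Weibull hazard models among the methods, the sketch below fits a parametric Weibull survival model with the lifelines package on invented data; a fitted shape parameter below 1 corresponds to a hazard that declines with time, broadly in line with the reported pattern of high early risk.

```python
# Minimal sketch (invented data): fit a parametric Weibull survival model, as one
# way to model the pattern described above where the early hazard is high and
# declines after roughly 20 months. Uses the lifelines package.
import numpy as np
from lifelines import WeibullFitter

rng = np.random.default_rng(2)
n = 53
# Weibull with shape < 1 gives a hazard that decreases with time
T = rng.weibull(0.8, n) * 40.0            # survival times in months (hypothetical)
E = rng.random(n) < 0.7                   # 1 = death observed, 0 = censored

wf = WeibullFitter()
wf.fit(T, event_observed=E)
print("scale (lambda_):", round(wf.lambda_, 2))
print("shape (rho_):   ", round(wf.rho_, 2))   # rho_ < 1 => decreasing hazard
```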

  16. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
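
    A standard way to estimate the electronic temperature from several emission peaks, once their intensities are known, is a Boltzmann plot. The Python sketch below illustrates that calculation with invented line constants; it is not the authors' LPO/spline pipeline, only the temperature-estimation step that follows it.

```python
# Boltzmann-plot sketch for electron temperature estimation from several emission
# lines of one species. This is a textbook approach in the spirit of the abstract,
# not the authors' exact algorithm; the line constants and intensities are invented.
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

# columns: measured intensity I, wavelength lambda (nm), statistical weight g,
# transition probability A (1/s), upper-level energy E_k (eV) -- all hypothetical
lines = np.array([
    #   I      lam     g     A       E_k
    [1200.0, 420.0,  5.0, 2.0e7,  3.3],
    [ 800.0, 430.0,  7.0, 1.5e7,  3.8],
    [ 450.0, 440.0,  3.0, 1.8e7,  4.4],
    [ 260.0, 450.0,  5.0, 1.2e7,  5.0],
])
I, lam, g, A, E_k = lines.T

# ln(I*lambda / (g*A)) = -E_k / (k_B * T) + const
y = np.log(I * lam / (g * A))
slope, intercept = np.polyfit(E_k, y, 1)
T_e = -1.0 / (K_B_EV * slope)
print(f"estimated electron temperature: {T_e:.0f} K")
```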

  17. Integrative analysis of survival-associated gene sets in breast cancer.

    Science.gov (United States)

    Varn, Frederick S; Ung, Matthew H; Lou, Shao Ke; Cheng, Chao

    2015-03-12

    Patient gene expression information has recently become a clinical feature used to evaluate breast cancer prognosis. The emergence of prognostic gene sets that take advantage of these data has led to a rich library of information that can be used to characterize the molecular nature of a patient's cancer. Identifying robust gene sets that are consistently predictive of a patient's clinical outcome has become one of the main challenges in the field. We supplied patient gene expression data and gene sets from MSigDB to our previously established BASE algorithm to develop the gene set activity score (GSAS), a metric that quantitatively assesses a gene set's activity level in a given patient. We utilized this metric, along with patient time-to-event data, to perform survival analyses to identify the gene sets that were significantly correlated with patient survival. We then performed cross-dataset analyses to identify robust prognostic gene sets and to classify patients by metastasis status. Additionally, we created a gene set network based on component gene overlap to explore the relationship between gene sets derived from MSigDB. We developed a novel gene set based on this network's topology and applied the GSAS metric to characterize its role in patient survival. Using the GSAS metric, we identified 120 gene sets that were significantly associated with patient survival in all datasets tested. The gene overlap network analysis yielded a novel gene set enriched in genes shared by the robustly predictive gene sets. This gene set was highly correlated with patient survival when used alone. Most interestingly, removal of the genes in this gene set from the gene pool on MSigDB resulted in a large reduction in the number of predictive gene sets, suggesting a prominent role for these genes in breast cancer progression. The GSAS metric provided a useful medium by which we systematically investigated how gene sets from MSigDB relate to breast cancer patient survival. We used

  18. Talent in Female Gymnastics: a Survival Analysis Based upon Performance Characteristics.

    Science.gov (United States)

    Pion, J; Lenoir, M; Vandorpe, B; Segers, V

    2015-11-01

    This study investigated the link between the anthropometric, physical and motor characteristics assessed during talent identification and dropout in young female gymnasts. Three cohorts of female gymnasts (n=243; 6-9 years) completed a test battery for talent identification. Performance levels were monitored over 5 years of competition. Kaplan-Meier and Cox Proportional Hazards analyses were conducted to determine the survival rate and the characteristics that influence dropout, respectively. Kaplan-Meier analysis indicated that only 18% of the female gymnasts that passed the baseline talent identification test survived at the highest competition level 5 years later. The Cox Proportional Hazards model indicated that gymnasts with a score in the best quartile for a specific characteristic significantly increased their chances of survival by 45-129%. These characteristics were: basic motor skills (129%), shoulder strength (96%), leg strength (53%) and 3 gross motor coordination items (45-73%). These results suggest that test batteries commonly used for talent identification in young female gymnasts may also provide valuable insights into future dropout. Therefore, multidimensional test batteries deserve a prominent place in the selection process. The individual test results should encourage trainers to invest in the early development of basic physical and motor characteristics to prevent attrition. © Georg Thieme Verlag KG Stuttgart · New York.
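
    The hazard-ratio interpretation quoted above can be reproduced in miniature with a Cox model containing a binary best-quartile covariate. The following Python sketch uses the lifelines package on invented dropout data; the covariate, follow-up times and censoring rule are illustrative assumptions.

```python
# Illustrative sketch (invented data): Cox proportional hazards model for dropout,
# with a binary covariate flagging gymnasts scoring in the best quartile on a test
# item, in the spirit of the analysis described above. Uses the lifelines package.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 243
top_quartile = (rng.random(n) < 0.25).astype(int)        # best quartile on, e.g., shoulder strength
years_followed = rng.exponential(3 + 2 * top_quartile)   # time to dropout (hypothetical)
years_followed = np.minimum(years_followed, 5.0)         # administrative censoring at 5 years
dropout = (years_followed < 5.0).astype(int)             # 1 = dropped out, 0 = still competing

df = pd.DataFrame({"years": years_followed,
                   "dropout": dropout,
                   "top_quartile": top_quartile})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="dropout")
cph.print_summary()
# A hazard ratio below 1 for `top_quartile` indicates a reduced dropout risk,
# which is how "increased chances of survival" is usually read in this setting.
print("hazard ratio:", float(np.exp(cph.params_["top_quartile"])))
```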

  19. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the response of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response were investigated and compared for dose values in this interval. These techniques included thermal pre-treatment and different methods of glow curve analysis. The results obtained showed the need to develop specific software that permits automatic background subtraction of the glow curves for each dosemeter. This software was developed and has been tested. Preliminary results showed that the software increases the reproducibility of the response. (author)

  20. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, obtained for example by regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method

  1. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, diaries, fruits...

  2. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
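
    To make the contrast with linear regression concrete, the sketch below applies one of the listed smoothers (LOWESS, via statsmodels) to invented data with a nonlinear input-output relationship and compares the variance explained with that of a straight-line fit; it is only the first ingredient of the stepwise procedure described above.

```python
# Minimal sketch (invented data): use a LOESS/LOWESS smoother to measure how much
# of a model output's variance a single input explains, even when the relationship
# is nonlinear. This mirrors only one of the smoothing procedures listed above,
# not the full stepwise methodology.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(4)
n = 500
x = rng.uniform(-2, 2, n)                 # one model input
y = np.sin(2 * x) + rng.normal(0, 0.4, n) # model output with a nonlinear dependence

# lowess with return_sorted=False gives smoothed values in the original order of x
smoothed = lowess(y, x, frac=0.3, return_sorted=False)

ss_total = np.sum((y - y.mean()) ** 2)
ss_resid = np.sum((y - smoothed) ** 2)
nonlinear_r2 = 1.0 - ss_resid / ss_total

# Compare with a plain linear-regression R^2, which understates the sensitivity here
slope, intercept = np.polyfit(x, y, 1)
linear_r2 = 1.0 - np.sum((y - (slope * x + intercept)) ** 2) / ss_total

print(f"LOESS-based R^2:  {nonlinear_r2:.2f}")
print(f"linear-model R^2: {linear_r2:.2f}")
```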

  3. Survival, causes of death, and prognostic factors in systemic sclerosis: analysis of 947 Brazilian patients.

    Science.gov (United States)

    Sampaio-Barros, Percival D; Bortoluzzo, Adriana B; Marangoni, Roberta G; Rocha, Luiza F; Del Rio, Ana Paula T; Samara, Adil M; Yoshinari, Natalino H; Marques-Neto, João Francisco

    2012-10-01

    To analyze survival, prognostic factors, and causes of death in a large cohort of patients with systemic sclerosis (SSc). From 1991 to 2010, 947 patients with SSc were treated at 2 referral university centers in Brazil. Causes of death were considered SSc-related and non-SSc-related. Multiple logistic regression analysis was used to identify prognostic factors. Survival at 5 and 10 years was estimated using the Kaplan-Meier method. One hundred sixty-eight patients died during the followup. Among the 110 deaths considered related to SSc, there was a predominance of lung (48.1%) and heart (24.5%) involvement. Most of the 58 deaths not related to SSc were caused by infection, cardiovascular or cerebrovascular disease, and cancer. Male sex, modified Rodnan skin score (mRSS) > 20, osteoarticular involvement, lung involvement, and renal crisis were the main prognostic factors associated with death. The overall survival rate was 90% at 5 years and 84% at 10 years. Patients had a worse prognosis if they had diffuse SSc (85% vs 92% at 5 yrs and 77% vs 87% at 10 yrs, compared to limited SSc), male sex (77% vs 90% at 5 yrs and 64% vs 86% at 10 yrs, compared to female sex), and mRSS > 20 (83% vs 90% at 5 yrs and 66% vs 86% at 10 yrs, compared to mRSS < 20). Survival was worse in male patients with diffuse SSc, and lung and heart involvement represented the main causes of death in this South American series of patients with SSc.

  4. Association of body mass index and survival in pediatric leukemia: a meta-analysis.

    Science.gov (United States)

    Orgel, Etan; Genkinger, Jeanine M; Aggarwal, Divya; Sung, Lillian; Nieder, Michael; Ladas, Elena J

    2016-03-01

    Obesity is a worldwide epidemic in children and adolescents. Adult cohort studies have reported an association between higher body mass index (BMI) and increased leukemia-related mortality; whether a similar effect exists in childhood leukemia remains controversial. We conducted a meta-analysis to determine whether a higher BMI at diagnosis of pediatric acute lymphoblastic leukemia (ALL) or acute myeloid leukemia (AML) is associated with worse event-free survival (EFS), overall survival (OS), and cumulative incidence of relapse (CIR). We searched 4 electronic databases from inception through March 2015 without language restriction and included studies in pediatric ALL or AML (0-21 y of age) reporting BMI as a predictor of survival or relapse. Higher BMI, defined as obese (≥95%) or overweight/obese (≥85%), was compared with lower BMI [nonoverweight/obese (<85%)]. In ALL, EFS was poorer in children with a higher BMI (RR: 1.35; 95% CI: 1.20, 1.51) than in those at a lower BMI. A higher BMI was associated with significantly increased mortality (RR: 1.31; 95% CI: 1.09, 1.58) and a statistically nonsignificant trend toward greater risk of relapse (RR: 1.17; 95% CI: 0.99, 1.38) compared with a lower BMI. In AML, a higher BMI was significantly associated with poorer EFS and OS (RR: 1.36; 95% CI: 1.16, 1.60 and RR: 1.56; 95% CI: 1.32, 1.86, respectively) than was a lower BMI. Higher BMI at diagnosis is associated with poorer survival in children with pediatric ALL or AML. © 2016 American Society for Nutrition.
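
    The basic pooling machinery behind such a meta-analysis is inverse-variance weighting of log relative risks. The Python sketch below shows a fixed-effect version on invented study-level estimates; the actual analysis may well have used a random-effects model and other adjustments not reproduced here.

```python
# Sketch of inverse-variance (fixed-effect) pooling of study-level relative risks,
# the basic machinery behind a meta-analysis like the one summarized above. The
# individual study RRs and confidence intervals below are invented.
import numpy as np

# hypothetical per-study relative risks with 95% CIs: (RR, lower, upper)
studies = [
    (1.60, 1.10, 2.33),
    (1.25, 0.90, 1.74),
    (1.45, 1.05, 2.00),
    (1.70, 1.15, 2.51),
]

log_rr = np.array([np.log(rr) for rr, lo, hi in studies])
# standard error on the log scale recovered from the CI width
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for rr, lo, hi in studies])

w = 1.0 / se**2                       # inverse-variance weights
pooled_log = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

rr = np.exp(pooled_log)
ci = np.exp([pooled_log - 1.96 * pooled_se, pooled_log + 1.96 * pooled_se])
print(f"pooled RR = {rr:.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```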

  5. Survival analysis using primary care electronic health record data: A systematic review of the literature.

    Science.gov (United States)

    Hodgkins, Adam Jose; Bonney, Andrew; Mullan, Judy; Mayne, Darren John; Barnett, Stephen

    2018-01-01

    An emerging body of research involves observational studies in which survival analysis is applied to data obtained from primary care electronic health records (EHRs). This systematic review of these studies examined the utility of using this approach. An electronic literature search of the Scopus, PubMed, Web of Science, CINAHL, and Cochrane databases was conducted. Search terms and exclusion criteria were chosen to select studies where survival analysis was applied to data extracted wholly from EHRs used in primary care medical practice. A total of 46 studies that met the inclusion criteria for the systematic review were examined. All were published within the past decade (2005-2014), with a majority (n = 26, 57%) being published between 2012 and 2014. Even though citation rates varied from nil to 628, over half (n = 27, 59%) of the studies were cited 10 times or more. The median number of subjects was 18,042, with five studies including over 1,000,000 patients. Of the included studies, 35 (76%) were published in specialty journals and 11 (24%) in general medical journals. The many conditions studied largely corresponded well with conditions important to general practice. Survival analysis applied to primary care electronic medical data is a research approach that has been frequently used in recent times. The utility of this approach was demonstrated by the ability to produce research with large numbers of subjects, across a wide range of conditions and with the potential for high impact. Importantly, primary care data were thus available to inform primary care practice.

  6. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as to the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  7. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor and removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed

  8. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in MENT has been developed and tested. The capabilities of MENT are demonstrated with the example of doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for MENT and only 0.1% for the Fourier algorithm

  9. Demographic and Socio-economic Determinants of Birth Interval Dynamics in Manipur: A Survival Analysis

    Directory of Open Access Journals (Sweden)

    Sanajaoba Singh N,

    2011-01-01

    Full Text Available The birth interval is a major determinant of levels of fertility in high-fertility populations. A house-to-house survey of 1225 women in Manipur, a tiny state in North Eastern India, was carried out to investigate birth interval patterns and their determinants. Using survival analysis, among the nine explanatory variables of interest, only three factors – infant mortality, lactation and use of contraceptive devices – have a highly significant effect (P < 0.01) on the duration of the birth interval, and a further three factors – age at marriage of the wife, parity and sex of the child – are found to be significant (P < 0.05) for the duration variable.

  10. Parent-child communication and marijuana initiation: evidence using discrete-time survival analysis.

    Science.gov (United States)

    Nonnemaker, James M; Silber-Ashley, Olivia; Farrelly, Matthew C; Dench, Daniel

    2012-12-01

    This study supplements existing literature on the relationship between parent-child communication and adolescent drug use by exploring whether parental and/or adolescent recall of specific drug-related conversations differentially impact youth's likelihood of initiating marijuana use. Using discrete-time survival analysis, we estimated the hazard of marijuana initiation using a logit model to obtain an estimate of the relative risk of initiation. Our results suggest that parent-child communication about drug use is either not protective (no effect) or - in the case of youth reports of communication - potentially harmful (leading to increased likelihood of marijuana initiation). Copyright © 2012 Elsevier Ltd. All rights reserved.
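
    Discrete-time survival analysis with a logit link is usually implemented by expanding the data to person-period form and fitting a logistic regression. The sketch below does exactly that in Python with statsmodels on invented data; the covariate coding and age range are assumptions, not the study's specification.

```python
# Minimal discrete-time survival sketch (invented data): expand each youth into
# one record per year at risk ("person-period" data) and fit a logistic model for
# the hazard of marijuana initiation, with a parent-child communication indicator
# as the covariate of interest. Covariates and coding are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 400
talked = rng.integers(0, 2, n)                       # 1 = reported drug-related talks
age_first_use = rng.integers(13, 19, n)              # hypothetical initiation ages
initiated = rng.random(n) < 0.5                      # some never initiate by age 18

rows = []
for i in range(n):
    last_age = age_first_use[i] if initiated[i] else 18
    for age in range(13, last_age + 1):
        event = int(initiated[i] and age == last_age)
        rows.append({"age": age, "talked": talked[i], "event": event})
pp = pd.DataFrame(rows)

# hazard model: logit P(initiate at age t | not yet initiated) ~ age dummies + talked
X = pd.get_dummies(pp["age"], prefix="age", drop_first=True).astype(float)
X["talked"] = pp["talked"].astype(float)
X = sm.add_constant(X)
fit = sm.Logit(pp["event"], X).fit(disp=False)
print("coefficient for 'talked':", fit.params["talked"],
      "-> odds ratio:", np.exp(fit.params["talked"]))
```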

  11. Survival analysis to explore the characteristics of employee assistance program (EAP) referrals that remain employed.

    Science.gov (United States)

    Macdonald, S; Albert, W; Maynard, M; French, P

    1989-02-01

    This study examined characteristics of referrals to employee assistance programs (EAP) associated with subsequent termination of employment. As well, relationships between characteristics of the referrals and program characteristics were explored. Longitudinal data were collected at several time periods for 163 referrals to EAPs from five organizations. Survival analysis was conducted to determine which variables were associated with termination of employment. Females, cohabitating couples, and employees who worked for the organization for 5 or more years were most likely to remain employed. One interesting finding was that people with alcohol problems were significantly more likely to be formal referrals.

  12. Estimation of failure criteria in multivariate sensory shelf life testing using survival analysis.

    Science.gov (United States)

    Giménez, Ana; Gagliardi, Andrés; Ares, Gastón

    2017-09-01

    For most food products, shelf life is determined by changes in their sensory characteristics. A predetermined increase or decrease in the intensity of a sensory characteristic has frequently been used to signal that a product has reached the end of its shelf life. Considering that all attributes change simultaneously, the concept of multivariate shelf life allows a single measurement of deterioration that takes into account all these sensory changes at a certain storage time. The aim of the present work was to apply survival analysis to estimate failure criteria in multivariate sensory shelf life testing using two case studies, hamburger buns and orange juice, by modelling the relationship between consumers' rejection of the product and the deterioration index estimated using PCA. In both studies, a panel of 13 trained assessors evaluated the samples using descriptive analysis, whereas a panel of 100 consumers answered a "yes" or "no" question regarding intention to buy or consume the product. PC1 explained the great majority of the variance, indicating that all sensory characteristics evolved similarly with storage time. Thus, PC1 could be regarded as an index of sensory deterioration, and a single failure criterion could be estimated through survival analysis for 25% and 50% consumer rejection. The proposed approach based on multivariate shelf life testing may increase the accuracy of shelf life estimations. Copyright © 2017 Elsevier Ltd. All rights reserved.
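
    The two-step idea, PC1 as a deterioration index followed by a failure criterion at a chosen rejection percentage, can be sketched as follows in Python with scikit-learn and statsmodels on invented data; a plain logistic curve stands in here for the survival-analysis model actually used in the paper.

```python
# Two-step sketch (invented data): (1) summarize trained-panel attributes with PCA
# and treat PC1 as a sensory deterioration index; (2) relate consumer rejection to
# that index and read off the index value at 25% and 50% rejection.
import numpy as np
from sklearn.decomposition import PCA
import statsmodels.api as sm

rng = np.random.default_rng(6)

# trained-panel data: rows = storage times, columns = sensory attributes (0-10 scale)
storage_days = np.array([0, 3, 6, 9, 12, 15])
attributes = np.column_stack([
    1.0 + 0.5 * storage_days + rng.normal(0, 0.3, 6),   # e.g. off-flavour
    8.0 - 0.4 * storage_days + rng.normal(0, 0.3, 6),   # e.g. freshness
    2.0 + 0.3 * storage_days + rng.normal(0, 0.3, 6),   # e.g. staleness
])
deterioration = PCA(n_components=1).fit_transform(attributes).ravel()
if deterioration[-1] < deterioration[0]:                 # fix PCA sign so index grows with storage
    deterioration = -deterioration

# consumer data: each consumer saw the sample from one storage time, yes/no rejection
n_cons = 300
idx = rng.integers(0, 6, n_cons)
p_reject = 1 / (1 + np.exp(-(deterioration[idx] - 1.0)))  # invented "true" rejection curve
reject = (rng.random(n_cons) < p_reject).astype(int)

X = sm.add_constant(deterioration[idx])
fit = sm.Logit(reject, X).fit(disp=False)
b0, b1 = fit.params

# deterioration index at which 25% and 50% of consumers reject the product
for target in (0.25, 0.50):
    d_star = (np.log(target / (1 - target)) - b0) / b1
    print(f"failure criterion at {int(target * 100)}% rejection: index = {d_star:.2f}")
```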

  13. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  14. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method for highly redundant structures having a large number of collapse modes. This approach makes the best use of the merits of the optimization technique while exploiting the idea of the PNET method. The analytical process involves the minimization of the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential solution of a series of NLP (Nonlinear Programming) problems, in which the correlation condition of the PNET method pertaining to the representative mode is taken as an additional constraint in the next analysis. Over succeeding iterations, the final analysis is reached when the collapse probability of the subsequent mode is far smaller than the value for the first mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, a conventional Monte Carlo simulation is also carried out using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  15. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  16. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (nuclear power plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system to diagnose abnormal conditions in NPPs. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  17. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
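
    The Bayesian step described above can be illustrated with a toy grid posterior: given a forward model of isotope ratio versus burnup and a measured ratio, compute the posterior over burnup. Everything in the sketch below, the linear forward model, the uncertainty and the measured value, is an invented stand-in for NOVA's reactor-physics database.

```python
# Toy sketch of the general idea behind the Bayesian step described above: infer a
# fuel parameter (burnup) from a measured stable-isotope ratio, given a forward
# model of ratio vs. burnup. The forward model and numbers below are invented.
import numpy as np

burnup_grid = np.linspace(1.0, 50.0, 500)          # GWd/tU, hypothetical range

def predicted_ratio(burnup):
    """Invented forward model: isotope ratio as a smooth function of burnup."""
    return 0.05 + 0.004 * burnup

measured = 0.135
sigma = 0.01                                        # measurement uncertainty (invented)

prior = np.ones_like(burnup_grid)                   # flat prior over the grid
likelihood = np.exp(-0.5 * ((measured - predicted_ratio(burnup_grid)) / sigma) ** 2)
posterior = prior * likelihood
posterior /= np.trapz(posterior, burnup_grid)       # normalize to a density

mean_bu = np.trapz(burnup_grid * posterior, burnup_grid)
print(f"posterior mean burnup: {mean_bu:.1f} GWd/tU")
```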

  18. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  19. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (nuclear power plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system to diagnose abnormal conditions in NPPs. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  20. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  1. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
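
    Of the condensation techniques the book compares, static (Guyan) condensation is the simplest to show. The sketch below reduces an invented 4-DOF stiffness/mass pair onto two master DOFs; it is a generic textbook illustration, not material from the book itself.

```python
# Minimal sketch of static (Guyan) condensation: eliminate the "slave" degrees of
# freedom and keep a reduced stiffness/mass model on the "master" DOFs.
# The matrices are invented.
import numpy as np

# small symmetric stiffness and mass matrices (4 DOFs, invented values)
K = np.array([[ 4.0, -2.0,  0.0,  0.0],
              [-2.0,  4.0, -2.0,  0.0],
              [ 0.0, -2.0,  4.0, -2.0],
              [ 0.0,  0.0, -2.0,  3.0]])
M = np.diag([1.0, 1.0, 1.0, 0.5])

masters = [0, 3]          # DOFs kept in the reduced model
slaves = [1, 2]           # DOFs condensed out

Kmm = K[np.ix_(masters, masters)]
Kms = K[np.ix_(masters, slaves)]
Kss = K[np.ix_(slaves, slaves)]

# transformation u = T u_m with the slave DOFs following the masters statically
T = np.vstack([np.eye(len(masters)), -np.linalg.solve(Kss, Kms.T)])
# reorder rows so T maps master DOFs back to the original DOF ordering
order = np.argsort(masters + slaves)
T = T[order, :]

K_red = T.T @ K @ T       # equals Kmm - Kms Kss^{-1} Ksm
M_red = T.T @ M @ T
print("reduced stiffness:\n", K_red)
print("reduced mass:\n", M_red)
```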

  2. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.

  3. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important, and difficult, to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis, including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  4. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
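
    The Markov side of the hybrid approach can be illustrated with a very small example: a duplex unit with repair modeled as a three-state continuous-time Markov chain and solved with a matrix exponential. The rates and mission times below are invented and unrelated to the fault-tolerant parallel processor model in the record.

```python
# Tiny example of the Markov side of the hybrid approach described above: a duplex
# processor with repair, modeled as a 3-state continuous-time Markov chain and
# solved with a matrix exponential. States: 2 units up, 1 up, system failed.
# Failure and repair rates are invented.
import numpy as np
from scipy.linalg import expm

lam = 1e-3     # per-unit failure rate (1/hour), hypothetical
mu = 1e-1      # repair rate (1/hour), hypothetical

# generator matrix Q, rows sum to zero; state order: [2 up, 1 up, failed]
Q = np.array([[-2 * lam,      2 * lam,  0.0],
              [      mu, -(mu + lam),   lam],
              [     0.0,          0.0,  0.0]])   # failure state is absorbing

p0 = np.array([1.0, 0.0, 0.0])          # start with both units working
for t in (100.0, 1000.0, 10000.0):      # mission times in hours
    pt = p0 @ expm(Q * t)
    print(f"t = {t:>7.0f} h  unreliability = {pt[2]:.3e}")
```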

  5. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  6. A review of residual stress analysis using thermoelastic techniques

    International Nuclear Information System (INIS)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S; Burguete, R L

    2009-01-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  7. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  8. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies to meet reactor operating requirements, e.g., thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor at both rated and uprated power conditions.

  9. Recovery and Resource Allocation Strategies to Maximize Mobile Network Survivability by Using Game Theories and Optimization Techniques

    Directory of Open Access Journals (Sweden)

    Pei-Yu Chen

    2013-01-01

    Full Text Available With more and more mobile device users, an increasingly important and critical issue is how to efficiently evaluate mobile network survivability. In this paper, a novel metric called the Average Degree of Disconnectivity (Average DOD) is proposed, in which compromise probabilities are calculated by the contest success function. The Average DOD metric is used to evaluate the damage degree of the network: the larger the value of the Average DOD, the greater the damage to the network. A multiround network attack-defense scenario is formulated as a mathematical model to help network operators predict the strategies that both the cyber attacker and the network defender are likely to take, with the Average DOD used to evaluate the resulting damage to the network. In each round, the attacker could use its attack resources to launch attacks on the nodes of the target network, while the network defender could reallocate its existing resources to recover compromised nodes and allocate defense resources to protect the surviving nodes of the network. To solve this problem, the gradient method and game theory are adopted to find the optimal resource allocation strategies for both the cyber attacker and the mobile network defender.
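
    As a toy illustration of the contest-success-function idea mentioned here, the sketch below uses the generic ratio (Tullock) form, not necessarily the exact formulation in the paper; the node names, effort values and exponent are invented.

    ```python
    def contest_success_probability(attack_effort: float, defense_effort: float, m: float = 1.0) -> float:
        """Ratio-form (Tullock) contest success function: probability that the
        attacker compromises a node given the effort both sides commit to it."""
        if attack_effort <= 0:
            return 0.0
        return attack_effort**m / (attack_effort**m + defense_effort**m)

    # Hypothetical per-node efforts committed in one round of the attack-defense game.
    nodes = [("core-1", 8.0, 12.0), ("core-2", 5.0, 3.0), ("edge-7", 2.0, 2.0)]
    for name, attack, defense in nodes:
        p = contest_success_probability(attack, defense, m=1.5)
        print(f"{name}: P(compromised) = {p:.2f}")
    ```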

  10. Survival Analysis of Factors Influencing Cyclic Fatigue of Nickel-Titanium Endodontic Instruments

    Directory of Open Access Journals (Sweden)

    Eva Fišerová

    2015-01-01

    Full Text Available Objective. The aim of this study was to validate a survival analysis assessing the effect of type of rotary system, canal curvature, and instrument size on cyclic resistance. Materials and Methods. Cyclic fatigue testing was carried out in stainless steel artificial canals with radii of curvature of 3 or 5 mm and an angle of curvature of 60 degrees. All the instruments were new and 25 mm in working length, and ISO colour coding indicated the instrument size (yellow for size 20; red for size 25). Wizard Navigator, Mtwo, ProTaper, and Revo-S instruments were passively rotated at 250 rotations per minute, and the time to fracture was recorded. Subsequently, fractographic analysis of the broken tips was performed by scanning electron microscope. The data were then analysed by the Kaplan-Meier estimator of the survival function, the Cox proportional hazards model, the Wald test for regression covariates, and the Wald test for the significance of the regression model. Conclusion. The lifespan registered for the tested instruments was Mtwo > Wizard Navigator > Revo-S > ProTaper; 5 mm radius > 3 mm radius; and yellow > red in the ISO colour coding system.
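
    A minimal sketch of this type of analysis (Kaplan-Meier estimation plus a Cox proportional hazards fit, whose summary table reports Wald statistics), using the Python lifelines package on made-up cycles-to-failure data rather than the study's measurements:

    ```python
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    # Hypothetical cycles-to-failure data; every instrument fractured (event = 1).
    df = pd.DataFrame({
        "time":       [310, 295, 410, 540, 620, 180, 205, 330, 450, 500],
        "event":      [1] * 10,
        "radius_5mm": [0, 1, 1, 0, 1, 0, 0, 1, 1, 0],   # 1 = 5 mm radius, 0 = 3 mm
        "size_25":    [1, 0, 1, 0, 1, 1, 0, 0, 1, 0],   # 1 = ISO 25 (red), 0 = ISO 20
    })

    km = KaplanMeierFitter()
    km.fit(df["time"], event_observed=df["event"])
    print(km.survival_function_.head())              # Kaplan-Meier survival estimate

    cox = CoxPHFitter()
    cox.fit(df, duration_col="time", event_col="event")
    cox.print_summary()                               # coefficients with Wald z-statistics and p-values
    ```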

  11. Individual data meta-analysis for the study of survival after pulmonary metastasectomy in colorectal cancer patients: A history of resected liver metastases worsens the prognosis.

    Science.gov (United States)

    Zabaleta, Jon; Iida, Tomohiko; Falcoz, Pierre E; Salah, Samer; Jarabo, José R; Correa, Arlene M; Zampino, Maria G; Matsui, Takashi; Cho, Sukki; Ardissone, Francesco; Watanabe, Kazuhiro; Gonzalez, Michel; Gervaz, Pascal; Emparanza, Jose I; Abraira, Víctor

    2018-03-21

    To assess the impact of a history of liver metastases on survival in patients undergoing surgery for lung metastases from colorectal carcinoma, we reviewed recent studies identified by searching MEDLINE and EMBASE using the Ovid interface, with the following search terms: lung metastasectomy, pulmonary metastasectomy, lung metastases and lung metastasis, supplemented by manual searching. Inclusion criteria were that the research concerned patients with lung metastases from colorectal cancer undergoing surgery with curative intent, and had been published between 2007 and 2014. Exclusion criteria were that the paper was a review, concerned surgical techniques themselves (without follow-up), or included patients treated non-surgically. Using Stata 14, we performed aggregate-data and individual-data meta-analyses using random-effects and Cox multilevel models, respectively. We collected data on 3501 patients from 17 studies. The overall median survival was 43 months. In the aggregate-data meta-analysis, the hazard ratio for patients with previous liver metastases was 1.19 (95% CI 0.90-1.47), with low heterogeneity (I² = 4.3%). In the individual-data meta-analysis, the hazard ratio for these patients was 1.37 (95% CI 1.14-1.64). Multivariate analysis identified tumour-infiltrated pulmonary lymph nodes among the factors significantly affecting survival. The analysis protocol was registered in PROSPERO (CRD42015017838). Copyright © 2018 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
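
    A compact sketch of the aggregate-data step (inverse-variance random-effects pooling of log hazard ratios with the DerSimonian-Laird estimator of between-study variance); the per-study hazard ratios and confidence intervals below are invented for illustration, not values from the 17 included studies.

    ```python
    import numpy as np

    # Hypothetical per-study hazard ratios and 95% CIs for "previous liver metastases".
    hr = np.array([1.10, 1.35, 0.95, 1.50, 1.20])
    ci_low = np.array([0.80, 1.00, 0.60, 1.05, 0.85])
    ci_high = np.array([1.51, 1.82, 1.50, 2.14, 1.69])

    y = np.log(hr)                                        # effect sizes on the log scale
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # standard errors from the CI width
    w = 1.0 / se**2                                       # fixed-effect (inverse-variance) weights

    theta_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - theta_fe) ** 2)                   # Cochran's Q heterogeneity statistic
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL estimate

    w_re = 1.0 / (se**2 + tau2)                           # random-effects weights
    theta_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    print(f"Pooled HR = {np.exp(theta_re):.2f} "
          f"(95% CI {np.exp(theta_re - 1.96 * se_re):.2f}-{np.exp(theta_re + 1.96 * se_re):.2f}), "
          f"tau^2 = {tau2:.3f}")
    ```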

  12. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  13. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  14. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis based on correlation techniques. The usage of mobile phones is almost unavoidable these days, and as such the authors have made a systematic survey, through a well-prepared questionnaire, of the extent to which mobile phones are used. The samples cover various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.

  15. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
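
    The transfer-function idea can be sketched as a frequency-domain deconvolution: the measured rear-side temperature signal is divided by the system's transfer function, with a small regularisation term to keep noise under control. The code below is a generic one-dimensional illustration with a synthetic exponential response, not the calorimeter's actual calibration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, dt = 2048, 1e-3                        # samples and sampling step (s), assumed
    t = np.arange(n) * dt

    # Hypothetical "true" energy-flux pulse and a first-order thermal impulse response.
    flux_true = np.exp(-((t - 0.5) / 0.02) ** 2)
    tau = 0.05                                 # assumed thermal time constant (s)
    h = (dt / tau) * np.exp(-t / tau)          # impulse response of the sensor/target

    temperature = np.convolve(flux_true, h)[:n] + 0.001 * rng.standard_normal(n)

    # Transfer-function (Wiener-like) deconvolution in the Fourier domain.
    H = np.fft.rfft(h)
    Y = np.fft.rfft(temperature)
    eps = 1e-3 * np.max(np.abs(H)) ** 2        # regularisation to avoid noise blow-up
    flux_rec = np.fft.irfft(Y * np.conj(H) / (np.abs(H) ** 2 + eps), n)

    print("peak of reconstructed flux at t =", t[np.argmax(flux_rec)], "s")
    ```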

  16. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
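
    As background, the FDTD scheme for such interconnects discretises the telegrapher's equations, and the time step is bounded by the Courant condition. In compact single-line form (standard per-unit-length notation rather than the paper's multiconductor matrices):

    ```latex
    \frac{\partial V}{\partial x} = -R\,I - L\,\frac{\partial I}{\partial t},
    \qquad
    \frac{\partial I}{\partial x} = -G\,V - C\,\frac{\partial V}{\partial t},
    \qquad
    \Delta t \le \Delta x\,\sqrt{LC}
    ```

    For bundled SWCNT lines R, L, G and C become per-unit-length matrices, and the stability bound is set by the fastest propagating mode.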

  17. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty

  18. Acute lymphoblastic leukemia in children and adolescents: prognostic factors and analysis of survival

    Science.gov (United States)

    Lustosa de Sousa, Daniel Willian; de Almeida Ferreira, Francisco Valdeci; Cavalcante Félix, Francisco Helder; de Oliveira Lopes, Marcos Vinicios

    2015-01-01

    Objective To describe the clinical and laboratory features of children and adolescents with acute lymphoblastic leukemia treated at three referral centers in Ceará and evaluate prognostic factors for survival, including age, gender, presenting white blood cell count, immunophenotype, DNA index and early response to treatment. Methods Seventy-six under 19-year-old patients with newly diagnosed acute lymphoblastic leukemia treated with the Grupo Brasileiro de Tratamento de Leucemia da Infância – acute lymphoblastic leukemia-93 and -99 protocols between September 2007 and December 2009 were analyzed. The diagnosis was based on cytological, immunophenotypic and cytogenetic criteria. Associations between variables, prognostic factors and response to treatment were analyzed using the chi-square test and Fisher's exact test. Overall and event-free survival were estimated by Kaplan–Meier analysis and compared using the log-rank test. A Cox proportional hazards model was used to identify independent prognostic factors. Results The average age at diagnosis was 6.3 ± 0.5 years and males were predominant (65%). The most frequently observed clinical features were hepatomegaly, splenomegaly and lymphadenopathy. Central nervous system involvement and mediastinal enlargement occurred in 6.6% and 11.8%, respectively. B-acute lymphoblastic leukemia was more common (89.5%) than T-acute lymphoblastic leukemia. A DNA index >1.16 was found in 19% of patients and was associated with favorable prognosis. On Day 8 of induction therapy, 95% of the patients had lymphoblast counts <1000/μL and white blood cell counts <5.0 × 10⁹/L. The remission induction rate was 95%, the induction mortality rate was 2.6% and overall survival was 72%. Conclusion The prognostic factors identified are compatible with the literature. The 5-year overall and event-free survival rates were lower than those reported for developed countries. As shown by the multivariate analysis, age and baseline white blood cell count were identified as independent prognostic factors.
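
    A small sketch of the Kaplan-Meier/log-rank comparison step using the Python lifelines package; the follow-up times, event flags and group labels below are invented, not the cohort data from this study.

    ```python
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Hypothetical follow-up times (months) and event flags (1 = death) for two risk groups.
    t_low,  e_low  = [60, 58, 55, 60, 42, 60, 37, 60], [0, 0, 1, 0, 1, 0, 1, 0]
    t_high, e_high = [12, 20, 35, 60, 8, 15, 40, 25],  [1, 1, 1, 0, 1, 1, 1, 1]

    for label, t, e in [("standard risk", t_low, e_low), ("high risk", t_high, e_high)]:
        km = KaplanMeierFitter(label=label).fit(t, event_observed=e)
        print(label, "median survival:", km.median_survival_time_)

    result = logrank_test(t_low, t_high, event_observed_A=e_low, event_observed_B=e_high)
    print("log-rank p-value:", result.p_value)
    ```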

  19. Acute lymphoblastic leukemia in children and adolescents: prognostic factors and analysis of survival

    Directory of Open Access Journals (Sweden)

    Daniel Willian Lustosa de Sousa

    2015-08-01

    Full Text Available OBJECTIVE: To describe the clinical and laboratory features of children and adolescents with acute lymphoblastic leukemia treated at three referral centers in Ceará and evaluate prognostic factors for survival, including age, gender, presenting white blood cell count, immunophenotype, DNA index and early response to treatment.METHODS: Seventy-six under 19-year-old patients with newly diagnosed acute lymphoblastic leukemia treated with the Grupo Brasileiro de Tratamento de Leucemia da Infância - acute lymphoblastic leukemia-93 and -99 protocols between September 2007 and December 2009 were analyzed. The diagnosis was based on cytological, immunophenotypic and cytogenetic criteria. Associations between variables, prognostic factors and response to treatment were analyzed using the chi-square test and Fisher's exact test. Overall and event-free survival were estimated by Kaplan-Meier analysis and compared using the log-rank test. A Cox proportional hazards model was used to identify independent prognostic factors.RESULTS: The average age at diagnosis was 6.3 ± 0.5 years and males were predominant (65%). The most frequently observed clinical features were hepatomegaly, splenomegaly and lymphadenopathy. Central nervous system involvement and mediastinal enlargement occurred in 6.6% and 11.8%, respectively. B-acute lymphoblastic leukemia was more common (89.5%) than T-acute lymphoblastic leukemia. A DNA index >1.16 was found in 19% of patients and was associated with favorable prognosis. On Day 8 of induction therapy, 95% of the patients had lymphoblast counts <1000/µL and white blood cell counts <5.0 × 10⁹/L. The remission induction rate was 95%, the induction mortality rate was 2.6% and overall survival was 72%.CONCLUSION: The prognostic factors identified are compatible with the literature. The 5-year overall and event-free survival rates were lower than those reported for developed countries. As shown by the multivariate analysis, age and baseline white blood cell count were identified as independent prognostic factors.

  20. Missing data and censoring in the analysis of progression-free survival in oncology clinical trials.

    Science.gov (United States)

    Denne, J S; Stone, A M; Bailey-Iacona, R; Chen, T-T

    2013-01-01

    Progression-free survival (PFS) is increasingly used as a primary endpoint in oncology clinical trials. However, trial conduct is often such that PFS data on some patients may be partially missing either due to incomplete follow-up for progression, or due to data that may be collected but confounded by patients stopping randomized therapy or starting alternative therapy prior to progression. Regulatory guidance on how to handle these patients in the analysis and whether to censor these patients differs between agencies. We present results of a reanalysis of 28 Phase III trials from 12 companies or institutions performed by the Pharmaceutical Research and Manufacturers Association-sponsored PFS Expert Team. We show that analyses not adhering to the intention-to-treat principle tend to give hazard ratio estimates further from unity and describe several factors associated with this shift. We present illustrative simulations to support these findings and provide recommendations for the analysis of PFS.

  1. Cost-effectiveness Analysis in R Using a Multi-state Modeling Survival Analysis Framework: A Tutorial.

    Science.gov (United States)

    Williams, Claire; Lewsey, James D; Briggs, Andrew H; Mackay, Daniel F

    2017-05-01

    This tutorial provides a step-by-step guide to performing cost-effectiveness analysis using a multi-state modeling approach. Alongside the tutorial, we provide easy-to-use functions in the statistics package R. We argue that this multi-state modeling approach using a package such as R has advantages over approaches where models are built in a spreadsheet package. In particular, using a syntax-based approach means there is a written record of what was done and the calculations are transparent. Reproducing the analysis is straightforward as the syntax just needs to be run again. The approach can be thought of as an alternative way to build a Markov decision-analytic model, which also has the option to use a state-arrival extended approach. In the state-arrival extended multi-state model, a covariate that represents patients' history is included, allowing the Markov property to be tested. We illustrate the building of multi-state survival models, making predictions from the models and assessing fits. We then proceed to perform a cost-effectiveness analysis, including deterministic and probabilistic sensitivity analyses. Finally, we show how to create 2 common methods of visualizing the results-namely, cost-effectiveness planes and cost-effectiveness acceptability curves. The analysis is implemented entirely within R. It is based on adaptions to functions in the existing R package mstate to accommodate parametric multi-state modeling that facilitates extrapolation of survival curves.
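
    The tutorial itself is implemented in R with the mstate package; as a language-neutral illustration of the underlying idea (a cohort moving through health states, with discounted costs and QALYs feeding an incremental cost-effectiveness ratio), here is a deliberately simplified three-state Markov cohort sketch in Python with invented transition probabilities, costs and utilities.

    ```python
    import numpy as np

    states = ["stable", "progressed", "dead"]
    # Hypothetical annual transition matrices for comparator and new treatment.
    P_comp = np.array([[0.80, 0.15, 0.05],
                       [0.00, 0.85, 0.15],
                       [0.00, 0.00, 1.00]])
    P_new  = np.array([[0.88, 0.09, 0.03],
                       [0.00, 0.88, 0.12],
                       [0.00, 0.00, 1.00]])

    cost    = {"comp": np.array([2000.0, 8000.0, 0.0]),
               "new":  np.array([6000.0, 8000.0, 0.0])}   # annual cost per state (assumed)
    utility = np.array([0.85, 0.55, 0.0])                  # QALY weight per state (assumed)
    horizon, disc = 20, 0.035                              # years, annual discount rate

    def run(P, c):
        occ = np.array([1.0, 0.0, 0.0])                    # whole cohort starts "stable"
        total_cost = total_qaly = 0.0
        for year in range(horizon):
            df = 1.0 / (1.0 + disc) ** year
            total_cost += df * occ @ c
            total_qaly += df * occ @ utility
            occ = occ @ P                                  # advance the cohort one cycle
        return total_cost, total_qaly

    c0, q0 = run(P_comp, cost["comp"])
    c1, q1 = run(P_new, cost["new"])
    print(f"ICER = {(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
    ```

    A multi-state survival approach, as in the tutorial, would replace the fixed transition matrices with transition intensities estimated from fitted survival models, but the bookkeeping of costs and QALYs is the same.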

  2. Different techniques of multispectral data analysis for vegetation fraction retrieval

    Science.gov (United States)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time, it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.

  3. Effect of homogenization techniques on reducing the size of microcapsules and the survival of probiotic bacteria therein.

    Science.gov (United States)

    Ding, W K; Shah, N P

    2009-08-01

    This study investigated 2 different homogenization techniques for reducing the size of calcium alginate beads during the microencapsulation process of 8 probiotic bacterial strains, namely, Lactobacillus rhamnosus, L. salivarius, L. plantarum, L. acidophilus, L. paracasei, Bifidobacterium longum, B. lactis type Bi-04, and B. lactis type Bi-07. Two different homogenization devices were used, namely, an Ultra-Turrax benchtop homogenizer and a Microfluidics microfluidizer. Various settings on the homogenization equipment were studied, such as the number of passes, speed (rpm), duration (min), and pressure (psi). The traditional mixing method using a magnetic stirrer was used as a control. The size of the microcapsules resulting from each homogenization technique and setting was measured using a light microscope and a stage micrometer. The smallest capsules, measuring 31.2 μm, were created with the microfluidizer using 26 passes at 1200 psi for 40 min. The greatest loss in viability, 3.21 log CFU/mL, was observed when using the Ultra-Turrax benchtop homogenizer at a speed of 1300 rpm for 5 min. Overall, both homogenization techniques reduced capsule sizes; however, homogenization settings at high rpm also greatly reduced the viability of the probiotic organisms.

  4. Impact of anastomotic leak on recurrence and survival after colorectal cancer surgery: a BioGrid Australia analysis.

    Science.gov (United States)

    Sammour, Tarik; Hayes, Ian P; Jones, Ian T; Steel, Malcolm C; Faragher, Ian; Gibbs, Peter

    2018-01-01

    There is conflicting evidence regarding the oncological impact of anastomotic leak following colorectal cancer surgery. This study aims to test the hypothesis that anastomotic leak is independently associated with local recurrence and with overall and cancer-specific survival. Prospectively collected data from multiple centres in Victoria between 1988 and 2015 were analysed, including all patients who underwent colon or rectal resection for cancer with anastomosis. Overall and cancer-specific survival rates and rates of local recurrence were compared using Cox regression analysis. A total of 4892 patients were included, of which 2856 had completed 5-year follow-up. The overall anastomotic leak rate was 4.0%. Cox regression analysis accounting for differences in age, sex, body mass index, American Society of Anesthesiologists score and tumour stage demonstrated that anastomotic leak was associated with significantly worse 5-year overall survival (χ² = 6.459, P = 0.011) for colon cancer, but only if early deaths were included. There was no difference in 5-year colon cancer-specific survival (χ² = 0.582, P = 0.446) or local recurrence (χ² = 0.735, P = 0.391). For rectal cancer, there was no difference in 5-year overall survival (χ² = 0.266, P = 0.606), cancer-specific survival (χ² = 0.008, P = 0.928) or local recurrence (χ² = 2.192, P = 0.139). Anastomotic leak may reduce 5-year overall survival in colon cancer patients but does not appear to influence the 5-year overall survival of rectal cancer patients. There was no effect on local recurrence or cancer-specific survival. © 2016 Royal Australasian College of Surgeons.

  5. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers (fatty acids, hydrocarbons and sterols) were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of each target compound (>50 μgC). Sufficient amounts were recovered for compounds ranging from C14 to C40 n-alkanes, with yields of approximately 80% for higher-molecular-weight compounds above C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, as well as compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in marine systems. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific. The results show its potential as a chronological tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where the planktonic foraminifera needed for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain in sufficient amounts due to dissolution of calcium carbonate. (author)

  6. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed the procedure to be validated by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  7. Effect of donor ethnicity on kidney survival in different recipient pairs: an analysis of the OPTN/UNOS database.

    Science.gov (United States)

    Callender, C O; Cherikh, W S; Traverso, P; Hernandez, A; Oyetunji, T; Chang, D

    2009-12-01

    Previous multivariate analysis performed on data from April 1, 1994, to December 31, 2000 from the Organ Procurement and Transplantation Network/United Network for Organ Sharing (OPTN/UNOS) database has shown that kidneys from black donors were associated with lower graft survival. We compared graft and patient survival of different kidney donor-to-recipient ethnic combinations to see if this result still holds in a recent cohort of US kidney transplants. We included 72,495 recipients of deceased and living donor kidney-alone transplants from 2001 to 2005. A multivariate Cox regression method was used to analyze the effect of donor-recipient ethnicity on graft and patient survival within 5 years of transplant, and to adjust for the effect of other donor, recipient, and transplant characteristics. Results are presented as hazard ratios (HR) with the 95% confidence limit (CL) and P values. Adjusted HRs for donor-recipient patient survival were: white to white (1); and white to black (1.22; P = .001). Graft survival HRs included black to black (1.40) relative to white-to-white recipients. The graft and patient survival rates for Asian and Latino/Hispanic recipients, however, were not affected by donor ethnicity. This analysis underscores the need for research to better understand the reasons for these disparities and how to improve the posttransplant graft survival rates of black kidney recipients.

  8. Explorative data analysis of MCL reveals gene expression networks implicated in survival and prognosis supported by explorative CGH analysis

    International Nuclear Information System (INIS)

    Blenk, Steffen; Engelmann, Julia C; Pinkert, Stefan; Weniger, Markus; Schultz, Jörg; Rosenwald, Andreas; Müller-Hermelink, Hans K; Müller, Tobias; Dandekar, Thomas

    2008-01-01

    Mantle cell lymphoma (MCL) is an incurable B cell lymphoma and accounts for 6% of all non-Hodgkin's lymphomas. On the genetic level, MCL is characterized by the hallmark translocation t(11;14), which is present in most cases with few exceptions. Both gene expression and comparative genomic hybridization (CGH) data vary considerably between patients, with implications for their prognosis. We compared patients above and below the median survival. Exploratory principal component analysis of the gene expression data showed that the second principal component correlates well with patient survival. Explorative analysis of the CGH data shows the same correlation. On chromosomes 7 and 9, specific genes and bands are delineated which improve prognosis prediction independently of the previously described proliferation signature. We identify a compact survival predictor of seven genes for MCL patients. After extensive re-annotation using GEPAT, we established protein networks correlating with prognosis. Well known genes (CDC2, CCND1) and further proliferation markers (WEE1, CDC25, aurora kinases, BUB1, PCNA, E2F1) form a tight interaction network, but non-proliferative genes (SOCS1, TUBA1B, CEBPB) are also shown to be associated with prognosis. Furthermore, we show that aggressive MCL implicates a gene network shift towards genes that are more highly expressed in late cell cycle states, and we refine the set of non-proliferative genes implicated in bad prognosis in MCL. The results from explorative data analysis of gene expression and CGH data are complementary to each other. Including further tests such as the Wilcoxon rank test, we point to both proliferative and non-proliferative gene networks implicated in inferior prognosis of MCL and identify suitable markers both in gene expression and CGH data
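
    A minimal sketch of the explorative first step described here (principal component analysis of an expression matrix followed by checking whether a component score tracks survival); the expression matrix and survival times below are random placeholders, not the MCL data.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n_patients, n_genes = 40, 500
    expr = rng.standard_normal((n_patients, n_genes))              # placeholder expression matrix
    survival_months = rng.exponential(scale=36, size=n_patients)   # placeholder survival times

    pca = PCA(n_components=5)
    scores = pca.fit_transform(expr)                               # patient scores on the first PCs

    rho, p = spearmanr(scores[:, 1], survival_months)              # second principal component
    print(f"PC2 vs survival: Spearman rho = {rho:.2f}, p = {p:.3f}")
    print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
    ```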

  9. Analysis of survival in breast cancer patients by using different parametric models

    Science.gov (United States)

    Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti

    2017-09-01

    In biomedical applications or clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. It is important to handle censored data properly in order to prevent biased results in the analysis. Therefore, this study was carried out to analyze right-censored data with three different parametric models: the exponential, Weibull and log-logistic models. Data on breast cancer patients from Hospital Sultan Ismail, Johor Bahru from 30 December 2008 until 15 February 2017 were used in this study to illustrate right-censored data. The variables included in this study are the survival time t of each breast cancer patient, the patient's age X1 and the treatment given to the patient X2. In order to determine the best parametric model for analysing the survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the log-likelihood value using the statistical software R. When analysing the breast cancer data, all three distributions showed consistency with the data, with the cumulative hazard function plot resembling a straight line through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it has the smallest AIC and BIC values and the largest log-likelihood.
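
    A short sketch of the model-comparison step using the Python lifelines package (the study itself used R); the survival times and censoring flags below are simulated, and the model with the smallest AIC and largest log-likelihood would be preferred, as in the study.

    ```python
    import numpy as np
    from lifelines import ExponentialFitter, WeibullFitter, LogLogisticFitter

    rng = np.random.default_rng(7)
    T = rng.weibull(1.4, size=200) * 40          # hypothetical survival times (months)
    E = rng.uniform(size=200) < 0.7              # roughly 30% right-censored observations

    for fitter in (ExponentialFitter(), WeibullFitter(), LogLogisticFitter()):
        fitter.fit(T, event_observed=E)
        name = type(fitter).__name__
        print(f"{name:18s} log-likelihood = {fitter.log_likelihood_:8.2f}  AIC = {fitter.AIC_:8.2f}")
    ```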

  10. Analysis of the Survival of Children Under Five in Indonesia and Associated Factors

    Science.gov (United States)

    Nur Islami Warrohmah, Annisa; Maniar Berliana, Sarni; Nursalam, Nursalam; Efendi, Ferry; Haryanto, Joni; Has, Eka Misbahatul M.; Ulfiana, Elida; Dwi Wahyuni, Sylvia

    2018-02-01

    The under-five mortality rate (U5MR) remains a challenge for developing nations, including Indonesia. This study aims to assess the key factors associated with the mortality of Indonesian children under five using survival analysis. Data from 14,727 live-born infants (2007-2012), taken from the nationally representative Indonesian Demographic and Health Survey, were examined. A Weibull hazard model was used to analyse the socioeconomic status and related determinants of infant mortality. The findings indicated that maternal factors (education, working status, autonomy, economic status, maternal age at birth, birth interval, type of birth, complications, history of previous mortality, breastfeeding, antenatal care and place of delivery); infant factors (birth size); residence; and environmental conditions were associated with childhood mortality. Rural or urban residence was an important determining factor of infant mortality. For example, considering a mother's education, rural educated mothers had a significant association with the survival of their infants, whereas there was no significant association between urban educated mothers and their infants' mortality. The results showed clear contextual differences that determine childhood mortality. Socio-demographic and economic factors remain critical in determining the death of infants. This study provides evidence for designing targeted interventions, as well as suggesting specific needs based on the population's place of residence, with respect to the U5MR. Further interventions should also consider the other identified variables when developing programmes to address infants' needs.

  11. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased-mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to make sure that the fault tree technique is not used beyond its range of validity. To this end, a critical review of the mathematical foundations of reliability fault tree analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of integrated software (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems.
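
    As a minimal reminder of the combinatorial side reviewed here, the top-event probability for independent basic events propagates through AND/OR gates as products and complements; the tiny sketch below evaluates a hypothetical two-level tree with made-up event probabilities.

    ```python
    from math import prod

    def and_gate(probs):
        """All inputs must fail (independent basic events)."""
        return prod(probs)

    def or_gate(probs):
        """At least one input fails (independent basic events)."""
        return 1.0 - prod(1.0 - p for p in probs)

    # Hypothetical tree: TOP = OR( AND(pump_a, pump_b), valve_stuck )
    pump_a, pump_b, valve_stuck = 1e-3, 2e-3, 5e-4
    top = or_gate([and_gate([pump_a, pump_b]), valve_stuck])
    print(f"P(top event) = {top:.3e}")
    ```

    The limitations discussed in the record arise precisely where these independence and monotonicity assumptions break down, which is where Markov or multistate treatments take over.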

  12. Temperature analysis of laser ignited metalized material using spectroscopic technique

    Science.gov (United States)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    Temperature measurement of laser-ignited aluminized nano-energetic mixtures using spectroscopy has great scope in analysing material characteristics and combustion behaviour. Spectroscopic analysis enables an in-depth study of the combustion of materials that is difficult to achieve with standard pyrometric methods. Laser ignition was used because it consumes less energy than electric ignition, while the ignited material dissipates the same energy, with the same impact, as with electric ignition. The presented research focuses primarily on the temperature analysis of an energetic material comprising explosive material mixed with nano-material and ignited by laser. A spectroscopic technique is used to estimate the temperature during the ignition process. The nano-energetic mixture used in the research does not contain any material that is sensitive to high impact.
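
    One common way to extract a temperature from an emission spectrum is to fit Planck's law under a grey-body assumption to the measured radiance; the sketch below fits synthetic data with scipy and is a generic illustration, not the processing pipeline used in this work.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

    def planck(wl, T, scale):
        """Grey-body spectral radiance vs wavelength (m); 'scale' absorbs emissivity and geometry."""
        return scale * (2 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k_B * T))

    # Synthetic spectrum at an assumed 2800 K with 2% multiplicative noise.
    wl = np.linspace(400e-9, 900e-9, 120)
    rng = np.random.default_rng(3)
    measured = planck(wl, 2800.0, 1e-12) * (1 + 0.02 * rng.standard_normal(wl.size))

    popt, _ = curve_fit(planck, wl, measured, p0=(2000.0, 1e-12),
                        bounds=([300.0, 0.0], [6000.0, np.inf]))
    print(f"fitted temperature = {popt[0]:.0f} K")
    ```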

  13. Survival Analysis of Occipital Nerve Stimulator Leads Placed under Fluoroscopic Guidance with and without Ultrasonography.

    Science.gov (United States)

    Jones, James H; Brown, Alison; Moyse, Daniel; Qi, Wenjing; Roy, Lance

    2017-11-01

    Electrical stimulation of the greater occipital nerves is performed to treat pain secondary to chronic daily headaches and occipital neuralgia. The use of fluoroscopy alone to guide the surgical placement of electrodes near the greater occipital nerves disregards the impact of tissue planes on lead stability and stimulation efficacy. We hypothesized that occipital neurostimulator (ONS) leads placed with ultrasonography combined with fluoroscopy would demonstrate increased survival rates and times when compared to ONS leads placed with fluoroscopy alone. A 2-arm retrospective chart review. A single academic medical center. This retrospective chart review analyzed the procedure notes and demographic data of patients who underwent the permanent implant of an ONS lead between July 2012 and August 2015. Patient data included the diagnosis (reason for implant), smoking tobacco use, disability, and age. ONS lead data included the date of permanent implant, the imaging modality used during permanent implant (fluoroscopy with or without ultrasonography), and, if applicable, the date and reason for lead removal. A total of 21 patients (53 leads) were included in the review. Chi-squared tests, Fisher's exact tests, 2-sample t-tests, and Wilcoxon rank-sum tests were used to compare fluoroscopy against combined fluoroscopy and ultrasonography as implant methods with respect to patient demographics. These tests were also used to evaluate the primary aim of this study, which was to compare the survival rates and times of ONS leads placed with combined ultrasonography and fluoroscopy versus those placed with fluoroscopy alone. Survival analysis was used to assess the effect of implant method, adjusted for patient demographics (age, smoking tobacco use, and disability), on the risk of lead explant. Data from 21 patients were collected, including a total of 53 ONS leads. There was no statistically significant difference between the two implant methods in lead survival rate or time, disability, or patient age.

  14. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  15. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques considered divide the methods in two groups: those that assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those that assume dependence between the variables and thus use an estimate of the within-class covariance matrix which also estimates the correlations between the variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature of the data.
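
    The two families contrasted in this abstract can be illustrated with scikit-learn: a diagonal-covariance (independence) classifier versus a discriminant analysis that estimates a shrunken full within-class covariance. The data below are simulated with strongly correlated features; this is an illustrative sketch, not the paper's experiments.

    ```python
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n, p, rho = 80, 50, 0.7
    cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)      # strongly correlated features
    X0 = rng.multivariate_normal(np.zeros(p), cov, size=n)
    X1 = rng.multivariate_normal(np.r_[0.8 * np.ones(5), np.zeros(p - 5)], cov, size=n)
    X, y = np.vstack([X0, X1]), np.r_[np.zeros(n), np.ones(n)]

    diag_model = GaussianNB()                                            # assumes independent features
    full_model = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")  # shrunken full covariance

    for name, model in [("diagonal (independence)", diag_model), ("shrinkage LDA", full_model)]:
        acc = cross_val_score(model, X, y, cv=5).mean()
        print(f"{name:25s} CV accuracy = {acc:.2f}")
    ```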

  16. Some problems of calibration technique in charged particle activation analysis

    International Nuclear Information System (INIS)

    Krasnov, N.N.; Zatolokin, B.V.; Konstantinov, I.O.

    1977-01-01

    It is shown that three different approaches to the calibration technique, based on the use of the average cross-section, the equivalent target thickness and the thick target yield, are adequate. Using the concept of thick target yield, a convenient charged particle activation equation is obtained. The possibility of simultaneous determination of two impurities, from which the same isotope is formed, is pointed out. The use of the thick target yield concept facilitates the derivation of a simple formula for absolute and comparative methods of analysis. The methodical error does not exceed 10%. Calibration and determination of the expected sensitivity based on the thick target yield concept are also very convenient, because experimental determination of thick target yield values is a much simpler procedure than obtaining an activation curve or excitation function. (T.G.)

  17. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take into account certain requirements. The complexity of these historical materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, due to the precious nature of artworks, it is also necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  18. Imaging Flow Cytometry Analysis to Identify Differences of Survival Motor Neuron Protein Expression in Patients With Spinal Muscular Atrophy.

    Science.gov (United States)

    Arakawa, Reiko; Arakawa, Masayuki; Kaneko, Kaori; Otsuki, Noriko; Aoki, Ryoko; Saito, Kayoko

    2016-08-01

    Spinal muscular atrophy is a neurodegenerative disorder caused by the deficient expression of survival motor neuron protein in motor neurons. A major goal of disease-modifying therapy is to increase survival motor neuron expression. Changes in survival motor neuron protein expression can be monitored via peripheral blood cells in patients; therefore we tested the sensitivity and utility of imaging flow cytometry for this purpose. After the immortalization of peripheral blood lymphocytes from a human healthy control subject and two patients with spinal muscular atrophy type 1 with two and three copies of SMN2 gene, respectively, we used imaging flow cytometry analysis to identify significant differences in survival motor neuron expression. A bright detail intensity analysis was used to investigate differences in the cellular localization of survival motor neuron protein. Survival motor neuron expression was significantly decreased in cells derived from patients with spinal muscular atrophy relative to those derived from a healthy control subject. Moreover, survival motor neuron expression correlated with the clinical severity of spinal muscular atrophy according to SMN2 copy number. The cellular accumulation of survival motor neuron protein was also significantly decreased in cells derived from patients with spinal muscular atrophy relative to those derived from a healthy control subject. The benefits of imaging flow cytometry for peripheral blood analysis include its capacities for analyzing heterogeneous cell populations; visualizing cell morphology; and evaluating the accumulation, localization, and expression of a target protein. Imaging flow cytometry analysis should be implemented in future studies to optimize its application as a tool for spinal muscular atrophy clinical trials. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Survival Analysis of F98 Glioma Rat Cells Following Minibeam or Broad-Beam Synchrotron Radiation Therapy

    International Nuclear Information System (INIS)

    Gil, Silvia; Sarun, Sukhéna; Biete, Albert; Prezado, Yolanda; Sabés, Manel

    2011-01-01

    In the quest for a curative radiotherapy treatment for gliomas, new delivery modes are being explored. At the Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF), a new spatially fractionated technique, called Minibeam Radiation Therapy (MBRT), is under development. The aim of this work is to compare the effectiveness of MBRT and broad-beam (BB) synchrotron radiation in treating F98 glioma rat cells. A dose escalation study was performed in order to delimit the range of doses where a therapeutic effect could be expected. These results will help in the design and optimization of the forthcoming in vivo studies at the ESRF. Two hundred thousand F98 cells were seeded per well in 24-well plates and incubated for 48 hours before being irradiated with spatially fractionated and seamless synchrotron x-rays at several doses. The percentage of each cell population (alive, early apoptotic and dead cells, the latter including both late apoptotic and necrotic cells) was assessed by flow cytometry 48 hours after irradiation, whereas the metabolic activity of surviving cells was analyzed on days 3, 4, and 9 post-irradiation using the QBlue test. The endpoint (the threshold dose from which an important enhancement in the effectiveness of both radiation treatments is achieved) obtained by flow cytometry could be established just before 12 Gy in the two irradiation schemes, whilst the endpoints assessed by the QBlue reagent, taking into account cell recovery, were set around 18 Gy in both cases. In addition, flow cytometric analysis pointed to a greater effectiveness of minibeams, due to the higher proportion of early apoptotic cells. When the valley doses in MBRT equal the dose deposited in the BB scheme, similar cell survival ratios and cell recovery were observed. However, a significant increase in the number of early apoptotic cells was found 48 hours after the minibeam irradiation in comparison with the seamless mode.

  20. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab
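
    The colorimetric part of such a flow injection system ultimately rests on a linear calibration of detector response against concentration over the working range. A small least-squares calibration sketch with made-up absorbance readings (not the paper's data):

    ```python
    import numpy as np

    # Hypothetical calibration standards for U(VI) measured at 410 nm.
    conc = np.array([35, 90, 180, 270, 360], dtype=float)        # concentration, mg/ml
    absorbance = np.array([0.051, 0.132, 0.262, 0.395, 0.522])   # detector response

    slope, intercept = np.polyfit(conc, absorbance, 1)           # linear fit A = m*C + b

    def concentration(a: float) -> float:
        """Invert the calibration to estimate concentration from a measured absorbance."""
        return (a - intercept) / slope

    print(f"slope = {slope:.4e} per mg/ml, intercept = {intercept:.4f}")
    print(f"sample at A = 0.300 -> C = {concentration(0.300):.1f} mg/ml")
    ```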

  1. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  2. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can be used to complement the results of high-energy-based techniques.

  3. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    Science.gov (United States)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations, and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
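
    The same idea can be sketched with a modern computer algebra system. The snippet below is an illustrative SymPy example (not the authors' MACSYMA code): it symbolically differentiates and integrates an assumed one-dimensional polynomial Ritz basis to form stiffness coefficients and then emits Fortran for one coefficient, standing in for the laminated elliptic plate expressions of the paper.

    ```python
    # Minimal SymPy sketch of symbolic stiffness-coefficient generation
    # (illustrative only -- not the MACSYMA code used by the authors, and a
    # 1-D beam integrand is used instead of a laminated elliptic plate).
    import sympy as sp

    x, L, EI = sp.symbols("x L EI", positive=True)

    # Simple polynomial Ritz basis satisfying clamped-end conditions (assumed).
    phi = [x**2 * (L - x)**2 * x**i for i in range(3)]

    # Stiffness coefficients k_ij = integral of EI * phi_i'' * phi_j'' dx.
    K = sp.Matrix(3, 3, lambda i, j: sp.simplify(
        sp.integrate(EI * sp.diff(phi[i], x, 2) * sp.diff(phi[j], x, 2), (x, 0, L))))

    # Emit Fortran code for one coefficient, as MACSYMA did for the full model.
    print(sp.fcode(K[0, 0], assign_to="K11"))
    ```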

  4. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  5. The application of radiotracer technique for preconcentration neutron activation analysis

    International Nuclear Information System (INIS)

    Wang Xiaolin; Chen Yinliang; Sun Ying; Fu Yibei

    1995-01-01

    The application of the radiotracer technique to preconcentration neutron activation analysis (Pre-NAA) is studied, and a method for determining the chemical yield of Pre-NAA is developed. This method has been applied to the determination of gold, iridium and rhenium in steel and rock samples, where the noble metal contents are in the range of 1-20 ng·g⁻¹ (sample). In addition, the difference in accuracy between RNAA and Pre-NAA caused by the determination of chemical yield is also discussed.

  6. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction processes of nuclear and X-ray radiation with a sample, leading to their absorption and backscattering, to the ionization of gases or excitation of fluorescent X-ray by radiation, but not to the activation of determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence by photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles and backscattering of beta radiation. The significant advantage of these methods is that they are nondestructive. (author)

  7. Survival benefit of postoperative radiation in papillary meningioma: Analysis of the National Cancer Data Base.

    Science.gov (United States)

    Sumner, Whitney A; Amini, Arya; Hankinson, Todd C; Foreman, Nicholas K; Gaspar, Laurie E; Kavanagh, Brian D; Karam, Sana D; Rusthoven, Chad G; Liu, Arthur K

    2017-01-01

    Papillary meningioma represents a rare subset of World Health Organization (WHO) Grade III meningioma that portends an overall poor prognosis. There are relatively limited data regarding the benefit of postoperative radiation therapy (PORT). We used the National Cancer Data Base (NCDB) to compare overall survival (OS) outcomes of surgically resected papillary meningioma cases undergoing PORT vs. postoperative observation. The NCDB was queried for patients with papillary meningioma, diagnosed between 2004 and 2013, who underwent upfront surgery with or without PORT. Overall survival (OS) was determined using the Kaplan-Meier method. Univariate (UVA) and multivariate (MVA) analyses were performed. In total, 190 patients were identified; 89 patients underwent PORT, 101 patients were observed. Eleven patients received chemotherapy (6 with PORT, 5 without). 2-Year OS was significantly improved with PORT vs. no PORT (93.0% vs. 74.4%), as was 5-year OS (78.5% vs. 62.5%) (hazard ratio [HR], 0.48; 95% confidence interval [CI], 0.27-0.85; p = 0.01). On MVA, patients receiving PORT had improved OS compared to observation (HR, 0.41; 95% CI, 0.22-0.76; p = 0.005). On subset analysis by age group, the benefit of PORT vs. no PORT was significant in patients ≤18 years (n = 13), with 2-year OS of 85.7% vs. 50.0% (HR, 0.08; 95% CI, 0.01-0.80; p = 0.032) and for patients >18 years (n = 184), with 2-year OS of 94.7% vs. 76.1% (HR, 0.55; 95% CI, 0.31-1.00; p = 0.049), respectively. In this large contemporary analysis, PORT was associated with improved survival for both adult and pediatric patients with papillary meningioma. PORT should be considered in those who present with this rare, aggressive tumor.
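
    For readers who want to reproduce this style of analysis on their own data, the sketch below shows a Kaplan-Meier comparison and a multivariable Cox model with the lifelines package. The DataFrame, file name, and columns (port, os_months, death, age, chemo) are hypothetical stand-ins, not the NCDB extract used by the authors.

    ```python
    # Minimal sketch of the kind of analysis described above (Kaplan-Meier OS
    # curves plus a multivariable Cox model); column names and the input file
    # are hypothetical placeholders.
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    df = pd.read_csv("meningioma_cohort.csv")    # assumed columns below

    kmf = KaplanMeierFitter()
    for label, grp in df.groupby("port"):        # 1 = postoperative RT, 0 = observation
        kmf.fit(grp["os_months"], event_observed=grp["death"], label=f"PORT={label}")
        print(label, kmf.survival_function_at_times([24, 60]).values)  # 2- and 5-year OS

    cph = CoxPHFitter()
    cph.fit(df[["os_months", "death", "port", "age", "chemo"]],
            duration_col="os_months", event_col="death")
    cph.print_summary()                          # HRs with 95% CIs, as reported above
    ```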

  8. Recursive partitioning analysis (RPA) classification predicts survival in patients with brain metastases from sarcoma.

    Science.gov (United States)

    Grossman, Rachel; Ram, Zvi

    2014-12-01

    Sarcoma rarely metastasizes to the brain, and there are no specific treatment guidelines for these tumors. The recursive partitioning analysis (RPA) classification is a well-established prognostic scale used in many malignancies. In this study we assessed the clinical characteristics of metastatic sarcoma to the brain and the validity of the RPA classification system in a subset of 21 patients who underwent surgical resection of metastatic sarcoma to the brain. We retrospectively analyzed the medical, radiological, surgical, pathological, and follow-up clinical records of 21 patients who were operated on for metastatic sarcoma to the brain between 1996 and 2012. Gliosarcomas, sarcomas of the head and neck with local extension into the brain, and metastatic sarcomas to the spine were excluded from this reported series. The patients' mean age was 49.6 ± 14.2 years (range, 25-75 years) at the time of diagnosis. Sixteen patients had a known history of systemic sarcoma, mostly in the extremities, and had previously received systemic chemotherapy and radiation therapy for their primary tumor. The mean maximal tumor diameter in the brain was 4.9 ± 1.7 cm (range 1.7-7.2 cm). The group's median preoperative Karnofsky Performance Scale score was 80, with 14 patients presenting with a score of 70 or greater. The median overall survival was 7 months (range 0.2-204 months). The median survival times stratified by the Radiation Therapy Oncology Group RPA classes were 31, 7, and 2 months for RPA classes I, II, and III, respectively (P = 0.0001). This analysis is the first to support the prognostic utility of the Radiation Therapy Oncology Group RPA classification for sarcoma brain metastases and may be used as a treatment guideline tool in this rare disease. Copyright © 2014 Elsevier Inc. All rights reserved.
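
    A stratified comparison of this kind can be sketched with a log-rank test across RPA classes and per-class Kaplan-Meier medians, as below; the dataset and column names are hypothetical, not the authors' 21-patient series.

    ```python
    # Sketch of a log-rank comparison of survival across RPA classes, in the
    # spirit of the stratified analysis above; `df` and its columns are assumed.
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import multivariate_logrank_test

    df = pd.read_csv("sarcoma_brain_mets.csv")   # assumed: months, died, rpa_class

    res = multivariate_logrank_test(df["months"], df["rpa_class"], df["died"])
    print(res.p_value)                            # analogous to the reported P = 0.0001

    kmf = KaplanMeierFitter()
    for rpa, grp in df.groupby("rpa_class"):
        kmf.fit(grp["months"], grp["died"], label=f"RPA {rpa}")
        print(f"RPA {rpa}: median survival =", kmf.median_survival_time_)
    ```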

  9. Molecular Infectious Disease Epidemiology: Survival Analysis and Algorithms Linking Phylogenies to Transmission Trees

    Science.gov (United States)

    Kenah, Eben; Britton, Tom; Halloran, M. Elizabeth; Longini, Ira M.

    2016-01-01

    Recent work has attempted to use whole-genome sequence data from pathogens to reconstruct the transmission trees linking infectors and infectees in outbreaks. However, transmission trees from one outbreak do not generalize to future outbreaks. Reconstruction of transmission trees is most useful to public health if it leads to generalizable scientific insights about disease transmission. In a survival analysis framework, estimation of transmission parameters is based on sums or averages over the possible transmission trees. A phylogeny can increase the precision of these estimates by providing partial information about who infected whom. The leaves of the phylogeny represent sampled pathogens, which have known hosts. The interior nodes represent common ancestors of sampled pathogens, which have unknown hosts. Starting from assumptions about disease biology and epidemiologic study design, we prove that there is a one-to-one correspondence between the possible assignments of interior node hosts and the transmission trees simultaneously consistent with the phylogeny and the epidemiologic data on person, place, and time. We develop algorithms to enumerate these transmission trees and show these can be used to calculate likelihoods that incorporate both epidemiologic data and a phylogeny. A simulation study confirms that this leads to more efficient estimates of hazard ratios for infectiousness and baseline hazards of infectious contact, and we use these methods to analyze data from a foot-and-mouth disease virus outbreak in the United Kingdom in 2001. These results demonstrate the importance of data on individuals who escape infection, which is often overlooked. The combination of survival analysis and algorithms linking phylogenies to transmission trees is a rigorous but flexible statistical foundation for molecular infectious disease epidemiology. PMID:27070316

  10. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  11. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological function; therefore, the Ca and K content of various foods needs to be determined. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods served as a cross-check of the analysis results and helped to overcome the limitations of the three methods. The results showed that Ca determined in food by EDXRF and AAS was not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined; the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
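
    The method-comparison statistics quoted above (p-values and Pearson correlations between paired EDXRF, AAS and NAA results) can be sketched as follows; the values are placeholders and the paired t-test is an assumption, since the abstract does not state which significance test was used.

    ```python
    # Hedged sketch of a method comparison: paired significance test and Pearson
    # correlation between two techniques; the arrays are invented placeholders.
    import numpy as np
    from scipy import stats

    ca_edxrf = np.array([812.0, 455.0, 1030.0, 690.0])    # hypothetical mg/kg values
    ca_aas   = np.array([805.0, 470.0, 1012.0, 702.0])

    t_stat, p_value = stats.ttest_rel(ca_edxrf, ca_aas)   # paired comparison (assumed test)
    r, _ = stats.pearsonr(ca_edxrf, ca_aas)               # correlation between methods
    print(f"p = {p_value:.4f}, Pearson r = {r:.4f}")
    ```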

  12. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques of quantifying caloric induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems, the relevance of this being that if there was a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and data was compared. All four systems made similar measurements but our inexperienced assessor failed to recognize responses as sporadic or scant, and we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  13. Analysis of survival for patients with chronic kidney disease primarily related to renal cancer surgery.

    Science.gov (United States)

    Wu, Jitao; Suk-Ouichai, Chalairat; Dong, Wen; Antonio, Elvis Caraballo; Derweesh, Ithaar H; Lane, Brian R; Demirjian, Sevag; Li, Jianbo; Campbell, Steven C

    2018-01-01

    To evaluate predictors of long-term survival for patients with chronic kidney disease primarily due to surgery (CKD-S). Patients with CKD-S generally have good survival that approximates that of patients who do not have CKD even after renal cancer surgery (RCS), yet there may be heterogeneity within this cohort. From 1997 to 2008, 4,246 patients underwent RCS at our centre. The median (interquartile range [IQR]) follow-up was 9.4 (7.3-11.0) years. New baseline glomerular filtration rate (GFR) was defined as the highest GFR between nadir and 6 weeks after RCS. We retrospectively evaluated three cohorts: no-CKD (new baseline GFR of ≥60 mL/min/1.73 m²); CKD-S (new baseline GFR of <60 mL/min/1.73 m² with preoperative GFR of ≥60 mL/min/1.73 m²); and CKD-M/S (preoperative GFR of <60 mL/min/1.73 m²). The primary outcome was non-renal cancer-related survival (NRCRS) for the CKD-S cohort. Kaplan-Meier analysis assessed the longitudinal impact of new baseline GFR (45-60 mL/min/1.73 m² vs <45 mL/min/1.73 m²) and Cox regression evaluated the relative impact of preoperative GFR, new baseline GFR, and relevant demographics/comorbidities. Of the 4,246 patients who underwent RCS, 931 had CKD-S and 1,113 had CKD-M/S, whilst 2,202 had no-CKD even after RCS. Partial/radical nephrectomy (PN/RN) was performed in 54%/46% of the patients, respectively. For CKD-S, 641 patients had a new baseline GFR of 45-60 mL/min/1.73 m² and 290 had a new baseline GFR of <45 mL/min/1.73 m². Kaplan-Meier analysis showed significantly reduced NRCRS for patients with CKD-S with a GFR of <45 mL/min/1.73 m² compared to those with no-CKD or CKD-S with a GFR of 45-60 mL/min/1.73 m² (both P ≤ 0.004), and competing risk analysis confirmed this (P < 0.001). Age, gender, heart disease, and new baseline GFR were all independently associated with NRCRS for patients with CKD-S (all P ≤ 0.02). Our data suggest that CKD-S is heterogeneous, and patients with a reduced new baseline GFR have compromised survival, particularly if <45 mL/min/1.73 m². Our findings may have implications regarding the choice of PN/RN in patients at risk of developing CKD.

  14. Topology based data analysis identifies a subgroup of breast cancers with a unique mutational profile and excellent survival.

    Science.gov (United States)

    Nicolau, Monica; Levine, Arnold J; Carlsson, Gunnar

    2011-04-26

    High-throughput biological data, whether generated by sequencing, transcriptional microarrays, proteomics, or other means, continues to require analytic methods that address its high dimensional aspects. Because the computational part of data analysis ultimately identifies shape characteristics in the organization of data sets, the mathematics of shape recognition in high dimensions continues to be a crucial part of data analysis. This article introduces a method that extracts information from high-throughput microarray data and, by using topology, provides greater depth of information than current analytic techniques. The method, termed Progression Analysis of Disease (PAD), first identifies robust aspects of cluster analysis, then goes deeper to find a multitude of biologically meaningful shape characteristics in these data. Additionally, because PAD incorporates a visualization tool, it provides a simple picture or graph that can be used to further explore these data. Although PAD can be applied to a wide range of high-throughput data types, it is used here as an example to analyze breast cancer transcriptional data. This identified a unique subgroup of Estrogen Receptor-positive (ER(+)) breast cancers that express high levels of c-MYB and low levels of innate inflammatory genes. These patients exhibit 100% survival and no metastasis. No supervised step beyond distinction between tumor and healthy patients was used to identify this subtype. The group has a clear and distinct, statistically significant molecular signature; it highlights coherent biology but is invisible to cluster methods, and it does not fit into the accepted classification of Luminal A/B or Normal-like subtypes of ER(+) breast cancers. We denote the group as c-MYB(+) breast cancer.

  15. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized

  16. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization could support interaction. We have analyzed service interaction cases in the context of a technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model the situational service interaction. Our contribution regarding technology-mediated service interaction design is twofold: first, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces...

  17. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss; however, predicting grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis are illustrated. Meanwhile, the traditional methods are compared with spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from the remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can monitor grasshoppers more precisely, has advantages in measuring the degree of damage and classifying damaged areas, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between hyperspectral characteristic parameters and leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near-infrared reflectance spectroscopy has been employed in identifying grasshopper species, examining species occurrences and monitoring hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. It is concluded that spectral analysis techniques could be used as a quick and exact tool for monitoring and forecasting grasshopper infestation, and will become an important means in this kind of research because of their advantages in spatial orientation and information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
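
    The NDVI mentioned above is a simple band ratio; a minimal NumPy sketch on hypothetical red and near-infrared reflectance values is shown below.

    ```python
    # Tiny sketch of the NDVI computation, using NumPy on hypothetical red and
    # near-infrared reflectance bands (not data from the study above).
    import numpy as np

    red = np.array([[0.12, 0.10], [0.22, 0.30]])   # red-band reflectance
    nir = np.array([[0.45, 0.50], [0.30, 0.28]])   # near-infrared reflectance

    ndvi = (nir - red) / (nir + red)               # NDVI = (NIR - Red) / (NIR + Red)
    print(ndvi)                                     # dense vegetation -> values near 1
    ```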

  18. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
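
    One way to implement the kernel-based idea compared above is a Gaussian-kernel-weighted estimator of the lagged correlation between two irregularly sampled series. The sketch below is illustrative only and is not the benchmark code used in the study.

    ```python
    # Sketch of a Gaussian-kernel estimator of the lagged correlation between two
    # irregularly sampled series (an illustrative implementation).
    import numpy as np

    def gaussian_kernel_corr(tx, x, ty, y, lag, h):
        """Kernel-weighted correlation of x(tx) and y(ty) at a given lag."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        # weight every observation pair by how close its time difference is to `lag`
        dt = tx[:, None] - ty[None, :] - lag
        w = np.exp(-0.5 * (dt / h) ** 2)
        return np.sum(w * np.outer(x, y)) / np.sum(w)

    rng = np.random.default_rng(0)
    tx = np.sort(rng.uniform(0, 100, 80))          # irregular sampling times
    ty = np.sort(rng.uniform(0, 100, 70))
    x = np.sin(2 * np.pi * tx / 20) + 0.3 * rng.standard_normal(tx.size)
    y = np.sin(2 * np.pi * (ty - 3) / 20) + 0.3 * rng.standard_normal(ty.size)

    print(gaussian_kernel_corr(tx, x, ty, y, lag=3.0, h=2.0))  # near the true lag
    ```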

  19. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Data mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information that may be essential to the organization; the extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but few efforts have been made in the field of criminology, and fewer still have compared the information these approaches produce. Police stations and other criminal justice agencies hold many large databases of information that can be used to predict or analyze criminal movements and involvement in criminal activity in society. Criminals can also be predicted based on the crime data. The main aim of this work is to perform a survey on the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.

  20. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs

  1. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. The productivity of the hospitals generally showed an increasing trend; however, the overall average productivity decreased. Among the several components of total productivity, variation in technological efficiency had the greatest impact on the reduction of the overall average productivity.
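
    The envelopment side of an input-oriented, constant-returns DEA model can be sketched as a small linear program; the example below uses made-up hospital inputs and outputs and omits the Malmquist decomposition, so it is only a minimal illustration of the technique, not the study's DEAP.2 analysis.

    ```python
    # Illustrative input-oriented CCR DEA efficiency scores via linear programming
    # (made-up inputs/outputs, not the authors' hospital data).
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[5.0, 8.0, 7.0, 4.0],     # inputs: e.g. beds      (rows = inputs)
                  [14., 15., 12., 10.]])    #         staff
    Y = np.array([[9.0, 5.0, 4.0, 16.]])    # outputs: e.g. admissions (rows = outputs)
    n = X.shape[1]                          # number of hospitals (DMUs)

    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                         # minimize theta
        A_in = np.hstack([-X[:, [o]], X])                   # sum(l*x) - theta*x_o <= 0
        A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])  # -sum(l*y) <= -y_o
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        print(f"hospital {o}: efficiency = {res.x[0]:.3f}")
    ```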

  2. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest-neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variations is not satisfactory, so the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that applying PCA and LDA to the normalized dataset improves performance significantly for face images with large illumination variations.
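
    A minimal scikit-learn sketch of the PCA and LDA comparison on photometrically normalized face vectors is shown below; the data files are hypothetical placeholders for databases such as Yale B or ORL, and the 1-nearest-neighbour Euclidean classifier mirrors the classifier described above.

    ```python
    # Sketch of comparing PCA and LDA pipelines on illumination-normalized faces;
    # file names and array shapes are assumptions.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    X = np.load("normalized_faces.npy")     # assumed: (n_samples, n_pixels)
    y = np.load("labels.npy")               # assumed: subject identities

    pca_knn = make_pipeline(PCA(n_components=50),
                            KNeighborsClassifier(n_neighbors=1, metric="euclidean"))
    lda_knn = make_pipeline(PCA(n_components=50),   # PCA first to avoid singular scatter matrices
                            LinearDiscriminantAnalysis(),
                            KNeighborsClassifier(n_neighbors=1, metric="euclidean"))

    print("PCA:", cross_val_score(pca_knn, X, y, cv=5).mean())
    print("LDA:", cross_val_score(lda_knn, X, y, cv=5).mean())
    ```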

  3. Tumour heterogeneity in non-small cell lung carcinoma assessed by CT texture analysis: a potential marker of survival

    International Nuclear Information System (INIS)

    Ganeshan, Balaji; Miles, Ken; Panayiotou, Elleny; Burnand, Kate; Dizdarevic, Sabina

    2012-01-01

    To establish the potential of tumour heterogeneity in non-small cell lung cancer (NSCLC), as assessed by CT texture analysis (CTTA), to provide an independent marker of survival for patients with NSCLC. Tumour heterogeneity was assessed by CTTA of unenhanced images of primary pulmonary lesions from 54 patients undergoing 18F-fluorodeoxyglucose (FDG) PET-CT for staging of NSCLC. CTTA comprised image filtration to extract fine, medium and coarse features, with quantification of the distribution of pixel values (uniformity) within the filtered images. Receiver operating characteristics identified thresholds for PET and CTTA parameters that were related to patient survival using Kaplan-Meier analysis. The median (range) survival was 29.5 (1-38) months. 24, 10, 14 and 6 patients had tumour stages I, II, III and IV, respectively. PET stage and tumour heterogeneity assessed by CTTA were significant independent predictors of survival (PET stage: odds ratio 3.85, 95% confidence limits 0.9-8.09, P = 0.002; CTTA: odds ratio 56.4, 95% confidence limits 4.79-666, P = 0.001). SUV was not significantly associated with survival. Assessment of tumour heterogeneity by CTTA of non-contrast-enhanced images has the potential to provide a novel, independent predictor of survival for patients with NSCLC. (orig.)
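
    The filtration-plus-uniformity pipeline described above can be sketched as follows, assuming Laplacian-of-Gaussian filters for the fine/medium/coarse scales and uniformity defined as the sum of squared histogram probabilities; the filter scales and the ROI file are assumptions, not the authors' exact implementation.

    ```python
    # Illustrative sketch of texture analysis: multi-scale image filtration, then
    # a uniformity statistic of the filtered pixel distribution.
    import numpy as np
    from scipy import ndimage

    image = np.load("tumour_roi.npy")            # assumed 2-D CT ROI in HU

    def uniformity(values, bins=64):
        """Uniformity = sum of squared histogram probabilities."""
        p, _ = np.histogram(values, bins=bins)
        p = p / p.sum()
        return np.sum(p ** 2)

    for name, sigma in [("fine", 1.0), ("medium", 1.5), ("coarse", 2.5)]:
        filtered = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
        print(name, "uniformity:", uniformity(filtered.ravel()))
    ```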

  4. Rethinking plant functional types in Earth System Models: pan-tropical analysis of tree survival across environmental gradients

    Science.gov (United States)

    Johnson, D. J.; Needham, J.; Xu, C.; Davies, S. J.; Bunyavejchewin, S.; Giardina, C. P.; Condit, R.; Cordell, S.; Litton, C. M.; Hubbell, S.; Kassim, A. R. B.; Shawn, L. K. Y.; Nasardin, M. B.; Ong, P.; Ostertag, R.; Sack, L.; Tan, S. K. S.; Yap, S.; McDowell, N. G.; McMahon, S.

    2016-12-01

    Terrestrial carbon cycling is a function of the growth and survival of trees. Current model representations of tree growth and survival at a global scale rely on coarse plant functional traits that are parameterized very generally. In view of the large biodiversity in tropical forests, it is important to account for functional diversity in order to better predict tropical forest responses to future climate change. Several next-generation Earth System Models are moving towards a size-structured, trait-based approach to modelling vegetation globally, but the challenge of which and how many traits are necessary to capture forest complexity remains. Additionally, the challenge of collecting sufficient trait data to describe the vast species richness of tropical forests is enormous. We propose a more fundamental approach to these problems by characterizing forests by their patterns of survival. We expect our approach to distill real-world tree survival into a reasonable number of functional types. Using 10 large-area tropical forest plots that span geographic, edaphic and climatic gradients, we model tree survival as a function of tree size for hundreds of species. We find that surprisingly few categories of size-survival functions emerge, indicating that fundamental strategies are at play across diverse forests that constrain the range of possible size-survival functions. Initial cluster analysis indicates that four to eight functional forms are necessary to describe variation in size-survival relations. Temporal variation in size-survival functions can be related to local environmental variation, allowing us to parameterize how demographically similar groups of species respond to perturbations in the ecosystem. We believe this methodology will yield a synthetic approach to classifying forest systems that will greatly reduce uncertainty and complexity in global vegetation models.
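
    One hedged way to operationalize the clustering of size-survival behaviour is to fit a per-species logistic model of survival against stem size and then cluster the fitted coefficients, as sketched below with hypothetical census columns; this is an illustration of the general approach, not the authors' model.

    ```python
    # Sketch: per-species logistic survival-vs-size fits, then clustering of the
    # fitted coefficients into a small number of "functional forms".
    import pandas as pd
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.cluster import KMeans

    df = pd.read_csv("census_survival.csv")     # assumed: species, dbh_cm, survived (0/1)

    coefs = []
    for sp, grp in df.groupby("species"):
        if grp["survived"].nunique() < 2 or len(grp) < 50:
            continue                             # need both outcomes and enough stems
        lr = LogisticRegression().fit(np.log(grp[["dbh_cm"]]), grp["survived"])
        coefs.append([lr.intercept_[0], lr.coef_[0, 0]])

    labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(np.array(coefs))
    print(np.bincount(labels))                   # species counts per functional form
    ```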

  5. Network survivability performance

    Science.gov (United States)

    1993-11-01

    This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunications networks to user expectations for network survivability and a foundation for continuing industry activities in the subject area. This report focuses on the survivability of both public and private networks and covers a wide range of users. Two frameworks are established for quantifying and categorizing service outages, and for classifying network survivability techniques and measures. The performance of the network survivability techniques is considered; however, recommended objectives are not established for network survivability performance.

  6. Risk factors for dental caries in childhood: a five-year survival analysis.

    Science.gov (United States)

    Lee, Hyo-Jin; Kim, Jin-Bom; Jin, Bo-Hyoung; Paik, Dai-Il; Bae, Kwang-Hak

    2015-04-01

    The purpose of this study was to examine the risk factors for dental caries at the individual level using survival analysis of prospective data collected over 5 years. A total of 249 first-grade students participated in a follow-up study for 5 years. All participants responded to a questionnaire inquiring about socio-demographic variables and oral health behaviors. They also received an oral examination and were tested with Dentocult SM and LB. Over 5 years, the participants received yearly oral follow-up examinations to determine the incidence of dental caries. The incidence of one or more dental caries (DC1) and four or more dental caries (DC4) were defined as increments of one or more and four or more decayed, missing, and filled permanent teeth, respectively. Socio-demographic variables, oral health behaviors and status, and caries activity tests were assessed as risk factors for DC1 and DC4. The adjusted hazard ratios (HRs) of risk factors for DC1 and DC4 were calculated using Cox proportional hazard regression models. During the 5-year follow-up period, DC1 and DC4 occurred in 87 and 25 participants, respectively. In multivariate hazard models, five or more decayed, missing, and filled primary molar teeth [HR 1.93, 95% confidence interval (CI) 1.19-3.13] and a Dentocult LB score of two or three (HR 2.21, 95% CI 1.37-3.56) were independent risk factors for DC1. For DC4, only a Dentocult LB score of two or three was an independent risk factor (HR 2.95, 95% CI 1.11-7.79). Our results suggest that dental caries incidence at an individual level can be associated with the experience of dental caries in primary teeth and with Dentocult LB, based on survival models for the 5-year prospective data. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. HIV testing in the maternity ward and the start of breastfeeding: a survival analysis

    Directory of Open Access Journals (Sweden)

    Glaucia T. Possolli

    2015-08-01

    OBJECTIVE: The purpose of this study was to analyze the factors that influence the time between birth and the beginning of breastfeeding, especially the moment when the rapid HIV test results requested at hospital admission for delivery become available. METHODS: Cohort study of 932 pregnant women who underwent rapid HIV testing on admission for delivery in Baby-Friendly Hospitals. The survival curves of time from birth to the first feeding were estimated by the Kaplan-Meier method and the joint effect of independent variables by the Cox model with a hierarchical analysis. As the survival curves were not homogeneous among the five hospitals, violating the proportional hazards assumption, the data were divided into two groups according to the median time of onset of breastfeeding at birth in women undergoing rapid HIV testing. RESULTS: Hospitals with a median time to breastfeeding onset at birth of up to 60 min were considered as having early breastfeeding onset and those with higher medians were considered as having late breastfeeding onset at birth. Risk factors common to hospitals with early and late breastfeeding onset at birth were cesarean section (RR = 1.75 [95% CI: 1.38-2.22]; RR = 3.83 [95% CI: 3.03-4.85]) and a rapid test result obtained after birth (RR = 1.45 [95% CI: 1.12-1.89]; RR = 1.65 [95% CI: 1.35-2.02]), respectively; an additional factor in hospitals with late onset was starting prenatal care in the third trimester (RR = 1.86 [95% CI: 1.16-2.97]). CONCLUSIONS: The onset of breastfeeding is postponed, even in Baby-Friendly Hospitals, when the results of the rapid HIV test requested in the maternity ward are not available at the time of delivery.

  8. Sociocultural Factors of Survival of Males and Females in Economically Active Age: a Regional Analysis

    Directory of Open Access Journals (Sweden)

    Evgeniya Khasanovna Tukhtarova

    2018-03-01

    The period when a person starts and completes his or her professional career and labour participation generally coincides with the age when self-preservation behaviour develops. It is a time when a person aims for a healthy and safe lifestyle. During this period, an individual assumes the main standards and values of the self-preservation behaviour inherent in the ethnic, social and cultural macro-environment. To research the sociocultural factors of survival, we applied econometric modelling to demographic processes using the discrete and probabilistic indicators of the mortality tables of males and females of economically active age. The econometric model included elements of the spatiotemporal characteristics of territories. These characteristics are interrelated with the indicators of survival probability and average life expectancy in the regions of Russia. We chose the major sociocultural factors by the correlation ratio of indicators and their sensitivity. The econometric analysis revealed a high degree of sensitivity to territorial variation of demographic and sociocultural factors in the regions of Russia, including a gender aspect. The most significant socio-economic factors determining the self-preservation behaviour of males are the following: (1) the size of Gross Regional Product per capita; (2) quality of health infrastructure; (3) fixed investments; and (4) the population with monetary income under the subsistence minimum (share coefficient of income differentials). Females have the same hierarchy of socio-economic factors, except for the sensitivity of the variables to regional differentiation. The household poverty factor has little significance for women, and this is the main difference between males and females. The model has shown predictive importance for assessing the above-mentioned factors in the short and medium term.

  9. Survival Outcomes in Resected Extrahepatic Cholangiocarcinoma: Effect of Adjuvant Radiotherapy in a Surveillance, Epidemiology, and End Results Analysis

    International Nuclear Information System (INIS)

    Vern-Gross, Tamara Z.; Shivnani, Anand T.; Chen, Ke; Lee, Christopher M.; Tward, Jonathan D.; MacDonald, O. Kenneth; Crane, Christopher H.; Talamonti, Mark S.; Munoz, Louis L.; Small, William

    2011-01-01

    Purpose: The benefit of adjuvant radiotherapy (RT) after surgical resection for extrahepatic cholangiocarcinoma has not been clearly established. We analyzed survival outcomes of patients with resected extrahepatic cholangiocarcinoma and examined the effect of adjuvant RT. Methods and Materials: Data were obtained from the Surveillance, Epidemiology, and End Results (SEER) program between 1973 and 2003. The primary endpoint was the overall survival time. Cox regression analysis was used to perform univariate and multivariate analyses of the following clinical variables: age, year of diagnosis, histologic grade, localized (Stage T1-T2) vs. regional (Stage T3 or greater and/or node positive) stage, gender, race, and the use of adjuvant RT after surgical resection. Results: The records for 2,332 patients were obtained. Patients with previous malignancy, distant disease, incomplete or conflicting records, atypical histologic features, and those treated with preoperative/intraoperative RT were excluded. Of the remaining 1,491 patients eligible for analysis, 473 (32%) had undergone adjuvant RT. After a median follow-up of 27 months (among surviving patients), the median overall survival time for the entire cohort was 20 months. Patients with localized and regional disease had a median survival time of 33 and 18 months, respectively (p < .001). The addition of adjuvant RT was not associated with an improvement in overall or cause-specific survival for patients with local or regional disease. Conclusion: Patients with localized disease had significantly better overall survival than those with regional disease. Adjuvant RT was not associated with an improvement in long-term overall survival in patients with resected extrahepatic bile duct cancer. Key data, including margin status and the use of combined chemotherapy, was not available through the SEER database.

  10. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.

  11. Analysis of Survival of Patients with Chronic Myeloid Leukemia Treated with Imatinib in the Last 15 Years in Lebanon.

    Science.gov (United States)

    Massoud, Marcel; Sakr, Riwa; Kerbage, Fouad; Makdissi, Joseph; Hawi, Jenny; Rached, Layale; Nasr, Fady; Chahine, Georges

    2017-07-01

    In the 2000s, the introduction of the tyrosine kinase inhibitor (TKI), imatinib, improved the survival outcomes of patients with chronic myeloid leukemia (CML). In Lebanon, we rapidly adopted this treatment strategy. To the best of our knowledge, this is the first study reporting the survival rates of Lebanese CML patients. We examined the rates of major molecular response (MMR) and complete cytogenetic response (CCyR) and analyzed the overall survival, progression-free survival, and event-free survival of CML patients treated with front-line imatinib in 3 university hospitals in Lebanon. We retrospectively reviewed the medical records of 46 patients diagnosed with CML and treated with front-line imatinib 400 mg/day from 2000 and followed up to 2015. In all patients, initially, 2 diagnostic tests were performed: cytogenetic analysis and qualitative molecular testing of the BCR-ABL transcript. The male-to-female sex ratio was 3:1. The median age at diagnosis was 49 years, and the mean age was 44.52 years. At diagnosis, 46 patients were in the chronic phase. All patients started imatinib 400 mg/day. Of the 46 patients, 35 had a typical karyotype, 8 an atypical karyotype, and 3 hypoploidism. The MMR rate at 18 months was 58.69%. The cumulative CCyR rate at 18 months of therapy with imatinib at the standard dose was 67.39%. The event-free survival rate was 75.86% and 74.14% at 5 and 8 years, respectively. The progression-free survival rate was 77.59% and 75.86% at 5 and 8 years, respectively. The overall survival rate was 98.27% and 98.27% at 5 and 8 years, respectively. Of the 46 patients, 12 developed disease progression and were salvaged by second-generation TKIs. These 12 patients were still alive with a MMR. In our study population, the achievement of a MMR and CCyR and overall survival, progression-free survival, and event-free survival were similar to previous published data. Reaching high survival rates with a first-generation TKI in a country with limited

  12. TU-EF-BRD-02: Indicators and Technique Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlone, M. [Princess Margaret Hospital (Canada)

    2015-06-15

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  13. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  14. Auto-SCT improves survival in systemic light chain amyloidosis: a retrospective analysis with 14-year follow-up.

    Science.gov (United States)

    Parmar, S; Kongtim, P; Champlin, R; Dinh, Y; Elgharably, Y; Wang, M; Bashir, Q; Shah, J J; Shah, N; Popat, U; Giralt, S A; Orlowski, R Z; Qazilbash, M H

    2014-08-01

    The optimal treatment approach remains a challenge for systemic light chain amyloidosis (AL). So far, Auto-SCT is the only modality associated with long-term survival; however, failure to show a survival benefit in a randomized study raises questions regarding its efficacy. We present a comparative outcome analysis of Auto-SCT versus conventional therapies (CTR) in AL patients treated over a 14-year period at our institution. Of the 145 AL amyloidosis patients, Auto-SCT was performed in 80 patients, with a 1-year non-relapse mortality rate of 12.5%. Novel agents were used as part of induction therapy in 56% of transplant recipients vs 46% of CTR patients. Hematological and organ responses were seen in 74.6% and 39% in the Auto-SCT arm vs 53% and 12% in the CTR arm, respectively. The projected 5-year survival for Auto-SCT vs CTR was 63% vs 38%, respectively. Landmark analysis of patients alive at 1 year after diagnosis showed an improved 5-year OS of 72% with Auto-SCT vs 65% in the CTR arm. In the multivariate analysis, age and Auto-SCT were associated with improved survival. In conclusion, Auto-SCT is associated with long-term survival for patients with AL amyloidosis.
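
    The landmark analysis mentioned above can be sketched by restricting to patients alive at 1 year and re-timing survival from that landmark; the data file and columns below are hypothetical, not the authors' registry.

    ```python
    # Sketch of a 1-year landmark analysis: keep patients alive at the landmark,
    # re-time survival from the landmark, then compare Kaplan-Meier OS by arm.
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    df = pd.read_csv("al_amyloidosis.csv")       # assumed: os_months, died, auto_sct (0/1)

    landmark = 12                                 # 1-year landmark
    lm = df[df["os_months"] >= landmark].copy()
    lm["os_months"] -= landmark                   # time measured from the landmark

    kmf = KaplanMeierFitter()
    for arm, grp in lm.groupby("auto_sct"):
        kmf.fit(grp["os_months"], grp["died"], label=f"Auto-SCT={arm}")
        print(arm, kmf.survival_function_at_times(48).values)   # ~5 years from diagnosis

    a, b = lm[lm.auto_sct == 1], lm[lm.auto_sct == 0]
    print(logrank_test(a.os_months, b.os_months, a.died, b.died).p_value)
    ```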

  15. A direct observation technique for evaluating sclerotium germination by Macrophomina phaseolina and effects of biocontrol materials on survival of sclerotia in soil.

    Science.gov (United States)

    Pratt, Robert G

    2006-08-01

    Germination of sclerotia of Macrophomina phaseolina was quantified by direct microscopic observation following application of experimental treatments in vitro and incubation of sclerotia in soil. To assay germination, pieces of agar containing sclerotia were macerated in dilute, liquid cornmeal agar on glass slides; thinly spread; and incubated in a saturated atmosphere for 18-22 h. Germinated sclerotia then were identified by morphological features of germ hyphae. Frequencies of germination were similar in three dilute agar media. Germination was not affected by air-drying sclerotia for 2 weeks, but it was significantly reduced after 4 weeks and greatly reduced or eliminated after 6 or 8 weeks. Survival of sclerotia for 14 days in soil was greatest at 50, 75, and 100% moisture-holding capacity, less at 0 and 25%, and least at 125% (flooded soil). Incorporation of ground poultry litter into soil at 5% by weight reduced survival of sclerotia after 13 days, and incorporation of litter at 10% nearly eliminated it. These results indicate that the direct-observation technique may be used to evaluate animal wastes and other agricultural byproducts for biocontrol activity against sclerotia of M. phaseolina in soil.

  16. Regression modeling strategies with applications to linear models, logistic and ordinal regression, and survival analysis

    CERN Document Server

    Harrell , Jr , Frank E

    2015-01-01

    This highly anticipated second edition features new chapters and sections, 225 new references, and comprehensive R software. In keeping with the previous edition, this book is about the art and science of data analysis and predictive modeling, which entails choosing and using multiple tools. Instead of presenting isolated techniques, this text emphasizes problem solving strategies that address the many issues arising when developing multivariable models using real data and not standard textbook examples. It includes imputation methods for dealing with missing data effectively, methods for fitting nonlinear relationships and for making the estimation of transformations a formal part of the modeling process, methods for dealing with "too many variables to analyze and not enough observations," and powerful model validation techniques based on the bootstrap.  The reader will gain a keen understanding of predictive accuracy, and the harm of categorizing continuous predictors or outcomes.  This text realistically...
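
    One of the validation ideas highlighted above, bootstrap optimism correction, can be sketched in a few lines; the example below uses a synthetic dataset and a logistic model with scikit-learn rather than the book's R code.

    ```python
    # Sketch of bootstrap (optimism-corrected) validation of a logistic model's
    # discrimination; synthetic data, illustrative only.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    X, y = make_classification(n_samples=300, n_features=8, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

    rng = np.random.default_rng(0)
    optimism = []
    for _ in range(200):                                   # bootstrap resamples
        idx = rng.integers(0, len(y), len(y))
        boot = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        auc_boot = roc_auc_score(y[idx], boot.predict_proba(X[idx])[:, 1])
        auc_orig = roc_auc_score(y, boot.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)

    print("optimism-corrected AUC:", apparent - np.mean(optimism))
    ```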

  17. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments
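
    A generic additive multi-attribute value model, of the kind used in such decision analyses to rank competing needs, can be sketched as below; the criteria, swing weights, scores, and example items are invented for illustration and do not reflect the actual Hanford elicitations.

    ```python
    # Generic additive multi-attribute value model (illustrative only; all
    # criteria, weights, scores, and item names are invented).
    import numpy as np

    criteria = ["worker_risk", "public_risk", "cost_savings", "schedule_benefit"]
    weights = np.array([0.35, 0.30, 0.20, 0.15])          # elicited swing weights (sum to 1)

    # Rows = candidate S&T needs, columns = criteria, scores on a 0-100 value scale.
    scores = np.array([[80, 60, 40, 70],
                       [50, 90, 70, 40],
                       [30, 40, 90, 80]], dtype=float)

    overall = scores @ weights                             # weighted additive value
    for name, v in zip(["need A", "need B", "need C"], overall):
        print(f"{name}: {v:.1f}")
    ```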

  18. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Directory of Open Access Journals (Sweden)

    Richard E.A. van Emmerik

    2016-03-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
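
    One common way to quantify the long-term (fractal) correlational structure referred to above is detrended fluctuation analysis; the Python sketch below is a minimal implementation on a synthetic signal and is not the specific analysis pipeline of the paper.

        # Sketch: detrended fluctuation analysis (DFA), one common estimator of the
        # fractal (long-range) correlation structure discussed above. Synthetic signal.
        import numpy as np

        def dfa_alpha(x, scales):
            """Return the DFA scaling exponent alpha of a 1-D series x."""
            y = np.cumsum(x - np.mean(x))          # integrated, mean-centred profile
            fluctuations = []
            for n in scales:
                n_boxes = len(y) // n
                rms = []
                for i in range(n_boxes):
                    seg = y[i * n:(i + 1) * n]
                    t = np.arange(n)
                    coeffs = np.polyfit(t, seg, 1)  # local linear trend
                    rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
                fluctuations.append(np.mean(rms))
            alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
            return alpha

        rng = np.random.default_rng(1)
        white_noise = rng.normal(size=4096)
        scales = np.array([16, 32, 64, 128, 256])
        print(f"DFA alpha (white noise) ~ {dfa_alpha(white_noise, scales):.2f}")  # expect ~0.5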

  19. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.

  20. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. For problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical model formulations as well as the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  1. Stereotactic Radiosurgery in the Management of Brain Metastases: An Institutional Retrospective Analysis of Survival

    International Nuclear Information System (INIS)

    Frazier, James L.; Batra, Sachin; Kapor, Sumit; Vellimana, Ananth; Gandhi, Rahul; Carson, Kathryn A.; Shokek, Ori; Lim, Michael; Kleinberg, Lawrence; Rigamonti, Daniele

    2010-01-01

    Purpose: The objective of this study was to report our experience with stereotactic radiosurgery performed with the Gamma Knife (GK) in the treatment of patients with brain metastases and to compare survival for those treated with radiosurgery alone with survival for those treated with radiosurgery and whole-brain radiotherapy. Methods and Materials: Prospectively collected demographic and clinical characteristics and treatment and survival data on 237 patients with intracranial metastases who underwent radiosurgery with the GK between 2003 and 2007 were reviewed. Kaplan-Meier and Cox proportional hazards regression analyses were used to compare survival by demographic and clinical characteristics and treatment. Results: The mean age of the patient population was 56 years. The most common tumor histologies were non-small-cell lung carcinoma (34.2%) and breast cancer (13.9%). The median overall survival time was 8.5 months from the time of treatment. The median survival times for patients with one, two/three, and four or more brain metastases were 8.5, 9.4, and 6.7 months, respectively. Patients aged 65 years or greater and those aged less than 65 years had median survival times of 7.8 and 9 months, respectively (p = 0.008). The Karnofsky Performance Score (KPS) at the time of treatment was a significant predictor of survival: those patients with a KPS of 70 or less had a median survival of 2.9 months compared with 10.3 months (p = 0.034) for those with a KPS of 80 or greater. There was no statistically significant difference in survival between patients treated with radiosurgery alone and those treated with radiosurgery plus whole-brain radiotherapy. Conclusions: Radiosurgery with the GK is an efficacious treatment modality for brain metastases. A KPS greater than 70, histology of breast cancer, smaller tumor volume, and age less than 65 years were associated with a longer median survival in our study.
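
    A minimal sketch of the Kaplan-Meier and Cox proportional hazards workflow named in the abstract, assuming the Python lifelines package; the data frame is synthetic and the covariates are illustrative stand-ins for the clinical variables.

        # Sketch: Kaplan-Meier estimation and Cox proportional hazards regression,
        # the two methods named above, on a synthetic stand-in dataset.
        import numpy as np
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        rng = np.random.default_rng(2)
        n = 200
        df = pd.DataFrame({
            "months": rng.exponential(scale=9.0, size=n),   # survival time (months)
            "event": rng.integers(0, 2, size=n),            # 1 = death observed, 0 = censored
            "age_ge_65": rng.integers(0, 2, size=n),        # illustrative covariates
            "kps_le_70": rng.integers(0, 2, size=n),
        })

        kmf = KaplanMeierFitter()
        kmf.fit(df["months"], event_observed=df["event"])
        print("median overall survival:", kmf.median_survival_time_)

        cph = CoxPHFitter()
        cph.fit(df, duration_col="months", event_col="event")
        cph.print_summary()   # hazard ratios with 95% confidence intervals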

  2. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we computed the morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared the results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.
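
    A minimal sketch of the clustering step described above, assuming scikit-learn; the morphological feature names and values are hypothetical.

        # Sketch: unsupervised clustering of quantitative morphological measurements,
        # as in the final step described above. Feature names and data are hypothetical.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        # columns: TDLU diameter (um), acini count, mean elongation, mean roundness
        features = rng.normal(loc=[250, 20, 1.4, 0.8], scale=[80, 8, 0.2, 0.1], size=(103, 4))

        X = StandardScaler().fit_transform(features)     # put features on a common scale
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
        print("TDLUs per cluster:", np.bincount(labels))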

  3. Techniques of production and analysis of polarized synchrotron radiation

    International Nuclear Information System (INIS)

    Mills, D.M.

    1992-01-01

    The use of the unique polarization properties of synchrotron radiation in the hard x-ray spectral region (E > 3 keV) is becoming increasingly important to many synchrotron radiation researchers. The radiation emitted from bending magnets and conventional (planar) insertion devices (IDs) is highly linearly polarized in the plane of the particle's orbit. Elliptically polarized x-rays can also be obtained by going off axis on a bending magnet source, albeit with considerable loss of flux. The polarization properties of synchrotron radiation can be further tailored to the researcher's specific needs through the use of specialized insertion devices such as helical and crossed undulators and asymmetrical wigglers. Even with the possibility of producing a specific polarization, there is still the need to develop x-ray optical components which can manipulate the polarization for both analysis and further modification of the polarization state. A survey of techniques for producing and analyzing both linearly and circularly polarized x-rays will be presented with emphasis on those techniques which rely on single crystal optical components.

  4. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general, above that size, uncertainties in rate constant and thermodynamic data do not allow us to a priori predict products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  5. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  6. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity on gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, according to the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been developed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes in electrophoretic gel supports and its variations. The new advances of this method are basically focused on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they basically coincide in the study of matrix metalloproteases. The field is expected to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  7. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)
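
    As a numerical illustration of the conventional flux-weighting approximation discussed above, the sketch below collapses region-wise cross sections into one homogenized value; the numbers are purely illustrative.

        # Sketch: conventional flux-volume weighting of region cross sections into a
        # single homogenized value -- the baseline approximation discussed above.
        import numpy as np

        volumes = np.array([1.0, 1.0, 0.5])        # region volumes (illustrative units)
        fluxes = np.array([1.0, 0.8, 0.3])         # region-averaged scalar fluxes
        sigma_a = np.array([0.012, 0.010, 0.025])  # region absorption cross sections (1/cm)

        weights = fluxes * volumes
        sigma_a_hom = np.sum(weights * sigma_a) / np.sum(weights)
        print(f"homogenized absorption cross section = {sigma_a_hom:.4f} 1/cm")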

  8. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to theoretical aspects while the others present the practical aspects and the...

  9. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases security risks. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis will help to identify the existing loopholes and will give strategic direction for making the Android operating system more secure.

  10. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Due to the complexity of the motion, shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. The movement itself is continuous, so that in its execution the transition from phase to phase is not noticeable. In the previously described phases of the O'Brian spinal shot put technique, a large gap and disconnection appear between the initial-position phase and the phase of overtaking the device, which represents a major problem for linking the phases, for training, and for technique advancement in primary and secondary education, as well as for students and for athletes beginning in the shot put. Therefore, this work is aimed at facilitating the methods of training of shot put technique by extending the analysis from four to six phases, which have been described and cover the complete O'Brian technique.

  11. Survival of ceramic veneers made of different materials after a minimum follow-up period of five years: a systematic review and meta-analysis.

    Science.gov (United States)

    Petridis, Haralampos P; Zekeridou, Alkisti; Malliari, Maria; Tortopidis, Dimitrios; Koidis, Petros

    2012-01-01

    The purpose of this systematic review was to compare the survival and complication rates of ceramic veneers produced with different techniques and materials after a minimum follow-up time of 5 years. A literature search was conducted, using electronic databases, relevant references, citations and journal searching, for clinical studies reporting on the survival of ceramic veneers fabricated with different techniques and materials with a mean follow-up time of at least 5 years. The search period spanned from January 1980 up to October 2010. Event rates were calculated for the following complications associated with ceramic veneers: fracture, debonding, marginal discoloration, marginal integrity, and caries. Summary estimates and 5-year event rates were reported. Comparison between subgroups of different materials, as well as statistical significance, was calculated using a mixed effects model. Nine studies were selected for final analysis from an initial yield of 409 titles. No study directly compared the incidence of complications between ceramic veneers fabricated from different materials. Four of the included studies reported on the survival of ceramic veneers made of feldspathic ceramics; four studies were on glass-ceramic veneers and one study included veneers fabricated from both materials. The mean observation time ranged between 5 and 10 years. Overall, the 5-year complication rates were low, with the exception of studies reporting on extended ceramic veneers. The most frequent complication reported was marginal discoloration (9% at 5 years), followed by compromised marginal integrity (3.9-7.7% at 5 years). There was no statistically significant difference in the event rates between the subgroups of different materials (feldspathic vs. glass-ceramic). The results of this systematic review showed that ceramic veneers fabricated from feldspathic or glass-ceramics have adequate clinical survival for at least 5 years of clinical service, with very low complication rates.

  12. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  13. Analysis of DNA repair gene polymorphisms and survival in low-grade and anaplastic gliomas

    DEFF Research Database (Denmark)

    Berntsson, Shala Ghaderi; Wibom, Carl; Sjöström, Sara

    2011-01-01

    different DNA repair genes (ATM, NEIL1, NEIL2, ERCC6 and RPA4) which were associated with survival. Finally, these eight genetic variants were adjusted for treatment, malignancy grade, patient age and gender, leaving one variant, rs4253079, mapped to ERCC6, with a significant association to survival (OR 0...

  14. Lamb survival analysis from birth to weaning in Iranian Kermani sheep.

    Science.gov (United States)

    Barazandeh, Arsalan; Moghbeli, Sadrollah Molaei; Vatankhah, Mahmood; Hossein-Zadeh, Navid Ghavi

    2012-04-01

    Survival records from 1,763 Kermani lambs born between 1996 and 2004 from 294 ewes and 81 rams were used to determine genetic and non-genetic factors affecting lamb survival. Traits included were lamb survival across five periods from birth to 7, 14, 56, 70, and 90 days of age. Traits were analyzed under Weibull proportional hazard sire models. Several binary analyses were also conducted using animal models. Statistical models included the fixed class effects of sex of lamb, month and year of birth, a covariate effect of birth weight, and random genetic effects of both sire (in survival analyses) and animal (in binary analyses). The average survival to 90 days of age was 94.8%. Hazard rates ranged from 1.00 (birth to 90 days of age) to 1.73 (birth to 7 days of age) between the two sexes, indicating that male lambs were at higher risk of mortality than females (P < 0.05). An association was also found between lamb survival and lamb birth weight, suggesting that viability and birth weight could be considered simultaneously in the selection programs to obtain optimal birth weight in Kermani lambs. Estimates of heritabilities from survival analyses were medium and ranged from 0.23 to 0.29. In addition, heritability estimates obtained from binary analyses were low and varied from 0.04 to 0.09. The results of this study suggest that progress in survival traits could be possible through managerial strategies and genetic selection.
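
    A minimal parametric Weibull survival sketch, assuming the Python lifelines package; lifelines fits an accelerated-failure-time parameterization and has no sire random effect, so this is a simplified stand-in for the Weibull proportional-hazard sire models used in the study, run on synthetic data.

        # Sketch: a parametric Weibull survival regression for lamb survival with sex and
        # birth weight as covariates. This is an AFT parameterization without a sire
        # random effect, i.e. a simplified stand-in for the models of the study.
        import numpy as np
        import pandas as pd
        from lifelines import WeibullAFTFitter

        rng = np.random.default_rng(4)
        n = 500
        df = pd.DataFrame({
            "days": rng.weibull(1.2, size=n) * 90,          # time to death or censoring (days)
            "died": rng.binomial(1, 0.1, size=n),           # 1 = died before weaning
            "male": rng.integers(0, 2, size=n),
            "birth_weight": rng.normal(4.0, 0.6, size=n),   # kg
        })
        df["days"] = df["days"].clip(lower=0.5, upper=90)   # follow-up ends at 90 days

        aft = WeibullAFTFitter()
        aft.fit(df, duration_col="days", event_col="died")
        aft.print_summary()   # covariate effects on (log) survival time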

  15. Rural factors and survival from cancer: analysis of Scottish cancer registrations.

    Science.gov (United States)

    Campbell, N C; Elliott, A M; Sharp, L; Ritchie, L D; Cassidy, J; Little, J

    2000-06-01

    In this survival study 63,976 patients diagnosed with one of six common cancers in Scotland were followed up. Increasing distance from a cancer centre was associated with less chance of diagnosis before death for stomach, breast and colorectal cancers and poorer survival after diagnosis for prostate and lung cancers.

  16. Chemotherapy increases long-term survival in patients with adult medulloblastoma--a literature-based meta-analysis.

    Science.gov (United States)

    Kocakaya, Selin; Beier, Christoph Patrick; Beier, Dagmar

    2016-03-01

    Adult medulloblastoma is a potentially curable malignant entity with an incidence of 0.5-1 per million. Valid data on prognosis, treatment, and demographics are lacking, as most current knowledge stems from retrospective studies. Surgical resection followed by radiotherapy are accepted parts of treatment regimes; however, established prognostic factors and data clarifying the role of chemotherapy are missing. We investigated 227 publications from 1969-2013, with 907 identifiable, individual patients available for meta-analysis. Demographic data, risk stratification, and treatment of these patients were similar to previous cohorts. The median overall survival (mOS) was 65 months (95% CI: 54.6-75.3), the 5-year overall survival was 50.9%, and 16% of the patients died more than 5 years after diagnosis. Incomplete resection, clinical and radiological signs of brainstem infiltration, and omission of radiotherapy were predictive of worse outcome. Metastatic disease at tumor recurrence was identified as a new prognostic factor, while neither metastasis at initial diagnosis nor desmoplastic/classic histology was correlated with survival. Patients receiving chemotherapy first-line survived significantly longer (mOS: 108 mo, 95% CI: 68.6-148.4) than patients treated with radiation alone (mOS: 57 mo, 95% CI: 39.6-74.4) or patients who received chemotherapy at tumor recurrence. This effect was not biased by tumor stage or decade of treatment. Importantly, (neo)adjuvant chemotherapy also significantly increased the chance of long-term survival (>5 y) compared with radiotherapy alone or chemotherapy at tumor recurrence. This meta-analysis clarifies relevant prognostic factors and suggests that chemotherapy as part of first-line therapy improves overall survival and increases the proportion of patients with long-term survival.

  17. Comparing survival outcomes of gross total resection and subtotal resection with radiotherapy for craniopharyngioma: a meta-analysis.

    Science.gov (United States)

    Wang, Guoqing; Zhang, Xiaoyang; Feng, Mengzhao; Guo, Fuyou

    2018-06-01

    Recent studies suggest that subtotal resection (STR) followed by radiation therapy (RT) is an appealing alternative to gross total resection (GTR) for craniopharyngioma, but it remains controversial. We conducted a meta-analysis to determine whether GTR is superior to STR with RT for craniopharyngioma. A systematic search was performed for articles published until October 2017 in the PubMed, Embase, and Cochrane Central databases. The endpoints of interest are overall survival and progression-free survival. Pooled hazard ratios (HRs) and corresponding 95% confidence intervals (CIs) were calculated using a fixed or random-effects model. The data were analyzed using Review Manager 5.3 software. A total of 744 patients (seven cohort studies) were enrolled for analysis. There were no significant differences between the GTR and STR with RT groups when the pooled HRs were compared at the end of the follow-up period. Overall survival (pooled HR = 0.76, 95% CI: 0.46-1.25, P = 0.28) and progression-free survival (pooled HR = 1.52, 95% CI: 0.42-5.44, P = 0.52) were similar between the two groups. The current meta-analysis suggests that GTR and STR with RT have similar survival outcomes for craniopharyngioma.
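
    A minimal sketch of fixed-effect inverse-variance pooling of hazard ratios on the log scale, the basic operation behind pooled HRs such as those reported above; the study values are hypothetical and are not those of this meta-analysis.

        # Sketch: fixed-effect inverse-variance pooling of study-level hazard ratios on
        # the log scale. The study HRs and confidence intervals below are hypothetical.
        import numpy as np

        hr = np.array([0.85, 0.70, 1.10, 0.60])           # per-study hazard ratios
        ci_low = np.array([0.55, 0.40, 0.70, 0.30])
        ci_high = np.array([1.30, 1.20, 1.75, 1.20])

        log_hr = np.log(hr)
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE recovered from the 95% CI
        w = 1 / se**2                                          # inverse-variance weights

        pooled = np.sum(w * log_hr) / np.sum(w)
        pooled_se = np.sqrt(1 / np.sum(w))
        print(f"pooled HR = {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}-{np.exp(pooled + 1.96 * pooled_se):.2f})")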

  18. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points should be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, the cokriging method, an extension of kriging, is proposed to calculate the structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
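
    A minimal sketch of surrogate-based reliability estimation with an ordinary kriging (Gaussian process) model from scikit-learn; it uses no gradient information, so it illustrates plain kriging rather than the cokriging of the paper, and the limit-state function is illustrative.

        # Sketch: kriging-surrogate reliability analysis. A Gaussian process is fitted to
        # a small design of limit-state evaluations, then Monte Carlo sampling on the
        # surrogate estimates the failure probability P(g < 0).
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def limit_state(x):                    # illustrative limit-state function g(x)
            return 3.0 - (x[:, 0] ** 2 + x[:, 1]) / 2.0

        rng = np.random.default_rng(5)
        X_train = rng.normal(size=(40, 2))     # small training design in standard normal space
        y_train = limit_state(X_train)

        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
        gp.fit(X_train, y_train)

        X_mc = rng.normal(size=(100_000, 2))   # cheap Monte Carlo on the surrogate
        pf = np.mean(gp.predict(X_mc) < 0.0)
        print(f"estimated failure probability ~ {pf:.4f}")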

  19. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
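
    A minimal sketch of the Innovative Trend Analysis comparison of the sorted first and second halves of a series, applied to a synthetic SPI-like record; this is the generic ITA procedure, not the station data of the study.

        # Sketch: Innovative Trend Analysis (ITA). The series is split into two halves,
        # each half is sorted, and the sorted second half is compared against the sorted
        # first half: points above the 1:1 line indicate increasing values, points below
        # indicate decreasing values. Synthetic SPI-like data stand in for a station series.
        import numpy as np

        rng = np.random.default_rng(6)
        spi = rng.normal(size=720) - np.linspace(0, 0.4, 720)  # 60 years of monthly SPI with a drying drift

        half = len(spi) // 2
        first = np.sort(spi[:half])
        second = np.sort(spi[half:])

        # Trend indicator (mean displacement from the 1:1 line, per time step of the record)
        trend_indicator = 2.0 * (second.mean() - first.mean()) / len(spi)
        share_below = np.mean(second < first)                  # fraction of points under the 1:1 line

        print(f"ITA trend indicator = {trend_indicator:.4f} per time step")
        print(f"{share_below:.0%} of sorted pairs lie below the 1:1 line (drier second half)")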

  20. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs and calculations, such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized, in order to be able to detect and size existing defects with the highest possible degree of accuracy. The information obtained during the inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period.

  1. Analysis of factors influencing survival in patients with severe acute pancreatitis.

    Science.gov (United States)

    Kim, Yeon Ji; Kim, Dae Bum; Chung, Woo Chul; Lee, Ji Min; Youn, Gun Jung; Jung, Yun Duk; Choi, Sooa; Oh, Jung Hwan

    2017-08-01

    Acute pancreatitis (AP) ranges from a mild and self-limiting disease to a fulminant illness with significant morbidity and mortality. Severe acute pancreatitis (SAP) is defined as persistent organ failure lasting for 48 h. We aimed to determine the factors that predict survival and mortality in patients with SAP. We reviewed a consecutive series of patients who were admitted with acute pancreatitis between January 2003 and January 2013. A total of 1213 cases involving 660 patients were evaluated, and 68 cases with SAP were selected for the study. Patients were graded based on the Computed Tomography Severity Index (CTSI), the Bedside Index for Severity in Acute Pancreatitis (BISAP), and Ranson's criteria. The frequency of SAP was 5.6% (68/1213 cases). Among these patients, 17 died due to pancreatitis-induced causes. We compared several factors between the survivor (n = 51) and non-survivor (n = 17) groups. On multivariate analysis, there were significant differences in the incidence of diabetes mellitus (p = .04), Ranson score (p = .03), bacteremia (p = .05) and body mass index (BMI) (p = .02) between the survivor and non-survivor groups. Bacteremia, high Ranson score, diabetes mellitus, and lower BMI were closely associated with mortality in patients with SAP. When patients with SAP show evidence of bacteremia or diabetes, aggressive treatment is necessary. For the prediction of disease mortality, the Ranson score might be a useful tool in SAP.

  2. The tourism and travel industry and its effect on the Great Recession: A multilevel survival analysis

    Directory of Open Access Journals (Sweden)

    Zdravko Šergo

    2017-12-01

    Does a country with a heavy dependence on a tourism economy have a tendency to succumb to more risk in a recession? With the shift from manufacturing-based economies in the developing world toward service-based industries, including tourism, a reliance on the tourism industry may erode economic stability in tourism-based countries, making them more prone to falling into a recession due to higher risks. In this paper, we examine the positive impact of tourism specialisation indices in the international economy on the probability of occurrence of the so-called Great Recession. This article uses multilevel survival analysis and generalised linear mixed-effects (GLMM) modelling to investigate the impact of tourism development on recession risk, in terms of frequency, duration in months, and severity, using data collected from 2007 to 2013 from 71 countries around the world, to see whether recession frequency is positively correlated with various indicators of tourism development. Two GLMMs were fitted to these data: a logistic regression and a count regression with a Poisson distribution. Results for both regressions show considerable evidence that the ratio between the number of overnight stays and the resident population, and travel services as a percentage of commercial service exports, positively affect the probability that a country from our sample experiences a recession event and can make a recession worse in terms of severity, measured in months.

  3. Adoption of SO2 emission control technologies - An application of survival analysis

    International Nuclear Information System (INIS)

    Streeter, Jialu Liu

    2016-01-01

    Using data on coal-fired electric power plants, this article investigates the contributing factors affecting the investment decisions on flue-gas desulfurization (FGD), a capital-intensive emission control technology. The paper makes two contributions to the literature. First, the public regulatory status of electric power plants is found to have a strong influence on whether FGD investment is made. Compared to deregulated power plants, those that are still under rate-of-return regulations by Public Utility Commissions are more likely to install FGD. Second, a higher rate of inspections of polluting facilities (not just electric utility power plants) in a state in the previous year is associated with a higher probability of power plants adopting FGD this year. In addition, sulfur content of coal and plant size are both positively associated with the likelihood of FGD installation. The service length of boilers is negatively associated with the likelihood. - Highlights: • Contributing factors affecting investment decisions on emission control devices. • A survival analysis framework is applied in estimation. • Data cover over 300 coal-fired electric utility power plants, 2002–2012. • Still-regulated power plants are more likely to install FGD than deregulated ones. • State-level inspection frequency leads to more FGD installation.

  4. Survival Analysis of Faculty Retention and Promotion in the Social Sciences by Gender.

    Directory of Open Access Journals (Sweden)

    Janet M Box-Steffensmeier

    Recruitment and retention of talent is central to the research performance of universities. Existing research shows that, while men are more likely than women to be promoted at the different stages of the academic career, no such difference is found when it comes to faculty retention rates. Current research on faculty retention, however, focuses on careers in science, technology, engineering, and mathematics (STEM). We extend this line of inquiry to the social sciences. We follow 2,218 tenure-track assistant professors hired since 1990 in seven social science disciplines at nineteen U.S. universities from time of hire to time of departure. We also track their time to promotion to associate and full professor. Using survival analysis, we examine gender differences in time to departure and time to promotion. Our methods account for censoring and unobserved heterogeneity, as well as effect heterogeneity across disciplines and cohorts. We find no statistically significant differences between genders in faculty retention. However, we do find that men are more likely to be granted tenure than women. When it comes to promotion to full professor, the results are less conclusive, as the effect of gender is sensitive to model specification. The results corroborate previous findings about gender patterns in faculty retention and promotion. They suggest that advances have been made when it comes to gender equality in retention and promotion, but important differences still persist.

  5. Arthritis and the Risk of Falling Into Poverty: A Survival Analysis Using Australian Data.

    Science.gov (United States)

    Callander, Emily J; Schofield, Deborah J

    2016-01-01

    Low income is known to be associated with having arthritis. However, no longitudinal studies have documented the relationship between developing arthritis and falling into poverty. The purpose of this study was to evaluate Australians who developed arthritis to determine if they had an elevated risk of falling into poverty. Survival analysis using Cox regression models was applied to nationally representative, longitudinal survey data obtained between January 1, 2007 and December 31, 2012 from Australian adults who were ages 21 years and older in 2007. The hazard ratio for falling into income poverty was 1.08 (95% confidence interval [95% CI] 1.06-1.09) in women who were diagnosed as having arthritis and 1.15 (95% CI 1.13-1.16) in men who were diagnosed as having arthritis, as compared to those who were never diagnosed as having arthritis. The hazard ratio for falling into multidimensional poverty was 1.15 (95% CI 1.14-1.17) in women who were diagnosed as having arthritis and 1.88 (95% CI 1.85-1.91) in men who were diagnosed as having arthritis. Developing arthritis increases the risk of falling into income poverty and multidimensional poverty. The risk of multidimensional poverty is greater than the risk of income poverty. Given the high prevalence of arthritis, the condition is likely an overlooked driver of poverty.

  6. Bayesian linear regression with skew-symmetric error distributions with applications to survival analysis

    KAUST Repository

    Rubio, Francisco J.

    2016-02-09

    We study Bayesian linear regression models with skew-symmetric scale mixtures of normal error distributions. These kinds of models can be used to capture departures from the usual assumption of normality of the errors in terms of heavy tails and asymmetry. We propose a general noninformative prior structure for these regression models and show that the corresponding posterior distribution is proper under mild conditions. We extend these propriety results to cases where the response variables are censored. The latter scenario is of interest in the context of accelerated failure time models, which are relevant in survival analysis. We present a simulation study that demonstrates good frequentist properties of the posterior credible intervals associated with the proposed priors. This study also sheds some light on the trade-off between increased model flexibility and the risk of over-fitting. We illustrate the performance of the proposed models with real data. Although we focus on models with univariate response variables, we also present some extensions to the multivariate case in the Supporting Information.

  7. Relationships between mastitis and functional longevity in Danish Black and White dairy cattle estimated using survival analysis

    NARCIS (Netherlands)

    Neerhof, H.J.; Madsen, P.; Ducrucq, V.; Vollema, A.R.; Jensen, I.; Korsgaard, I.R.

    2000-01-01

    The relationship between mastitis and functional longevity was assessed with survival analysis on data of Danish Black and White dairy cows. Different methods of including the effect of mastitis treatment on the culling decision by a farmer in the model were compared. The model in which mastitis

  8. Examining the Influence of Campus Climate on Students' Time to Degree: A Multilevel Discrete-Time Survival Analysis

    Science.gov (United States)

    Zhou, Ji; Castellanos, Michelle

    2013-01-01

    Utilizing longitudinal data on 3477 students from 28 institutions, we examine the effects of structural diversity and the quality of interracial relations on students' persistence towards graduation within six years. We utilize multilevel discrete-time survival analysis to account for the longitudinal persistence patterns as well as the nested…
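
    A minimal sketch of discrete-time survival analysis as a person-period logistic regression, assuming pandas and statsmodels; the institution-level random effect of the multilevel model is omitted, and the data and covariate are synthetic.

        # Sketch: discrete-time survival analysis as a person-period logistic regression.
        # Each student contributes one row per academic year until graduation or censoring;
        # the institution-level random effect used in the multilevel model is omitted here.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        students = pd.DataFrame({
            "id": np.arange(1000),
            "years_observed": rng.integers(1, 7, size=1000),   # up to 6 years of follow-up
            "graduated": rng.integers(0, 2, size=1000),        # 1 = graduated in final observed year
            "diversity_score": rng.normal(size=1000),          # illustrative campus-climate covariate
        })

        # Expand to person-period format: one record per student-year at risk.
        rows = []
        for _, s in students.iterrows():
            for year in range(1, s.years_observed + 1):
                event = int(s.graduated and year == s.years_observed)
                rows.append({"id": s.id, "year": year, "event": event,
                             "diversity_score": s.diversity_score})
        pp = pd.DataFrame(rows)

        # Logistic hazard model with a year-specific baseline hazard.
        fit = smf.logit("event ~ C(year) + diversity_score", data=pp).fit(disp=False)
        print(fit.summary())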

  9. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks process each hemodynamic parameter individually and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used for data collection from 50 subjects; (2) the acquired positions and amplitudes of the onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluated the trained models, using a majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved to be efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels, retrieved from the multi-parametric approach, and positive AIx values was verified. This approach allows the design of new hemodynamic morphology vectors and techniques for multi-parameter APW analysis, thus improving the understanding of the arterial pulse, especially when compared to traditional single-parameter analysis, where failure in measuring one component, such as Pi, can jeopardize the whole evaluation.
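
    A minimal sketch of the Random Forest classification and AUC evaluation step, assuming scikit-learn rather than the Weka implementations named above; the vectorized waveform features are synthetic.

        # Sketch: Random Forest classification of vectorized pulse-waveform features with
        # cross-validated ROC AUC, mirroring the evaluation reported above. Synthetic data.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(8)
        n = 200
        X = rng.normal(size=(n, 8))                      # e.g. onset, SP, Pi, DW positions/amplitudes
        risk = (X[:, 1] - 0.7 * X[:, 3] + rng.normal(scale=0.5, size=n) > 0).astype(int)

        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        auc = cross_val_score(clf, X, risk, cv=5, scoring="roc_auc")
        acc = cross_val_score(clf, X, risk, cv=5, scoring="accuracy")
        print(f"ROC AUC = {auc.mean():.3f} +/- {auc.std():.3f}, accuracy = {acc.mean():.3f}")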

  10. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that

  11. Meta-analysis of single-arm survival studies: a distribution-free approach for estimating summary survival curves with random effects.

    Science.gov (United States)

    Combescure, Christophe; Foucher, Yohann; Jackson, Daniel

    2014-07-10

    In epidemiologic studies and clinical trials with time-to-event outcomes (for instance, death or disease progression), survival curves are used to describe the risk of the event over time. In meta-analyses of studies reporting a survival curve, the most informative finding is a summary survival curve. In this paper, we propose a method to obtain a distribution-free summary survival curve by expanding the product-limit estimator of survival for aggregated survival data. The extension of DerSimonian and Laird's methodology for multiple outcomes is applied to account for the between-study heterogeneity. The statistics I² and H² are used to quantify the impact of the heterogeneity in the published survival curves. A statistical test for between-strata comparison is proposed, with the aim of exploring study-level factors potentially associated with survival. The performance of the proposed approach is evaluated in a simulation study. Our approach is also applied to synthesize the survival of untreated patients with hepatocellular carcinoma from aggregate data of 27 studies and the graft survival of kidney transplant recipients from individual data from six hospitals.
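
    A minimal sketch of DerSimonian-Laird random-effects pooling of study-level survival proportions at a single time point, with the I² and H² statistics mentioned above; this is a univariate simplification of the paper's summary-survival-curve method, and the study values are hypothetical.

        # Sketch: DerSimonian-Laird random-effects pooling of study-level survival
        # proportions at a fixed time point, with I^2 and H^2 heterogeneity statistics.
        # Hypothetical data; a univariate simplification of the method described above.
        import numpy as np

        surv = np.array([0.55, 0.48, 0.62, 0.40, 0.58])   # 3-year survival per study
        n_at_risk = np.array([120, 80, 200, 60, 150])

        logit = np.log(surv / (1 - surv))
        var = 1.0 / (n_at_risk * surv * (1 - surv))       # approximate variance of the logit
        w = 1.0 / var

        # Cochran's Q and the DerSimonian-Laird between-study variance tau^2
        fixed = np.sum(w * logit) / np.sum(w)
        Q = np.sum(w * (logit - fixed) ** 2)
        k = len(surv)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_star = 1.0 / (var + tau2)                       # random-effects weights
        pooled_logit = np.sum(w_star * logit) / np.sum(w_star)
        pooled_surv = 1.0 / (1.0 + np.exp(-pooled_logit))

        H2 = Q / (k - 1)
        I2 = max(0.0, (Q - (k - 1)) / Q)
        print(f"pooled 3-year survival = {pooled_surv:.2f}, I^2 = {I2:.0%}, H^2 = {H2:.2f}")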

  12. Codevelopment of conceptual understanding and critical attitude: toward a systemic analysis of the survival blanket

    Science.gov (United States)

    Viennot, Laurence; Décamp, Nicolas

    2016-01-01

    One key objective of physics teaching is the promotion of conceptual understanding. Additionally, the critical faculty is universally seen as a central quality to be developed in students. In recent years, however, teaching objectives have placed stronger emphasis on skills than on concepts, and there is a risk that conceptual structuring may be disregarded. The question therefore arises as to whether it is possible for students to develop a critical stance without a conceptual basis, leading in turn to the issue of possible links between the development of conceptual understanding and critical attitude. In an in-depth study to address these questions, the participants were seven prospective physics and chemistry teachers. The methodology included a ‘teaching interview’, designed to observe participants’ responses to limited explanations of a given phenomenon and their ensuing intellectual satisfaction or frustration. The explanatory task related to the physics of how a survival blanket works, requiring a full and appropriate system analysis of the blanket. The analysis identified five recurrent lines of reasoning and linked these to judgments of adequacy of explanation, based on metacognitive/affective (MCA) factors, intellectual (dis)satisfaction and critical stance. Recurrent themes and MCA factors were used to map the intellectual dynamics that emerged during the interview process. Participants’ critical attitude was observed to develop in strong interaction with their comprehension of the topic. The results suggest that most students need to reach a certain level of conceptual mastery before they can begin to question an oversimplified explanation, although one student’s replies show that a different intellectual dynamics is also possible. The paper ends with a discussion of the implications of these findings for future research and for decisions concerning teaching objectives and the design of learning environments.

  13. Pediatric differentiated thyroid carcinoma in stage I: risk factor analysis for disease free survival

    International Nuclear Information System (INIS)

    Wada, Nobuyuki; Rino, Yasushi; Masuda, Munetaka; Ito, Koichi; Sugino, Kiminori; Mimura, Takashi; Nagahama, Mitsuji; Kitagawa, Wataru; Shibuya, Hiroshi; Ohkuwa, Keiko; Nakayama, Hirotaka; Hirakawa, Shohei

    2009-01-01

    To examine the outcomes and risk factors in pediatric differentiated thyroid carcinoma (DTC) patients who were defined as TNM stage I, because some patients develop disease recurrence and the treatment strategy for such stage I pediatric patients is still controversial. We reviewed 57 consecutive TNM stage I patients (15 years or less) with DTC (46 papillary and 11 follicular) who underwent initial treatment at Ito Hospital between 1962 and 2004 (7 males and 50 females; mean age: 13.1 years; mean follow-up: 17.4 years). Clinicopathological results were evaluated in all patients. Multivariate analysis was performed to reveal the risk factors for disease-free survival (DFS) in these 57 patients. Extrathyroid extension and clinical lymphadenopathy at diagnosis were found in 7 and 12 patients, respectively. Subtotal/total thyroidectomy was performed in 23 patients, modified neck dissection in 38, and radioactive iodine therapy in 10. Pathological node metastasis was confirmed in 37 patients (64.9%). Fifteen patients (26.3%) exhibited local recurrence and 3 of them also developed metachronous lung metastasis. Ten of these 15 achieved disease-free status after further treatment, and no patients died of the disease. In multivariate analysis, male gender (p = 0.017), advanced tumor (T3, 4a) stage (p = 0.029), and clinical lymphadenopathy (p = 0.006) were risk factors for DFS in stage I pediatric patients. Male gender, tumor stage, and lymphadenopathy are risk factors for DFS in stage I pediatric DTC patients. Aggressive treatment (total thyroidectomy, node dissection, and RI therapy) is considered appropriate for patients with risk factors, whereas a conservative or stepwise approach may be acceptable for other patients.

  14. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach was described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., the change of distribution type for input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of assumed probabilistic distributions. It was indicated that Monte Carlo simulation by a Latin Hypercube Sampling technique and subsequent sensitivity analyses were useful for examining uncertainty propagation of fuel cycle costs, and could more efficiently provide information to decision makers than a deterministic method. The change of distribution types of input parameters showed that the values calculated by the deterministic method fell around the 40th-50th percentile of the output distribution function obtained by probabilistic simulation. Assuming a lognormal distribution of inputs, however, the values calculated by the deterministic method fell around the 85th percentile of the output distribution function obtained by probabilistic simulation. The results of the sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
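
    A minimal sketch of a Latin Hypercube cost simulation with three-point estimates mapped to triangular distributions, assuming SciPy 1.7 or later for scipy.stats.qmc; the cost components and numbers are illustrative, not those of the study.

        # Sketch: Latin Hypercube Monte Carlo propagation of fuel cycle cost inputs given
        # as three-point (low/mode/high) estimates mapped to triangular distributions.
        # Component names and numbers are illustrative only.
        import numpy as np
        from scipy.stats import qmc, triang

        # (low, mode, high) unit costs -- purely illustrative.
        three_point = {
            "uranium_purchase": (30.0, 50.0, 90.0),
            "conversion":       (5.0, 8.0, 12.0),
            "enrichment":       (80.0, 110.0, 160.0),
            "fabrication":      (200.0, 250.0, 350.0),
            "back_end":         (300.0, 400.0, 700.0),
        }

        n_samples = 10_000
        sampler = qmc.LatinHypercube(d=len(three_point), seed=0)
        u = sampler.random(n_samples)                    # uniform LHS sample in [0, 1)^d

        costs = np.zeros((n_samples, len(three_point)))
        for j, (low, mode, high) in enumerate(three_point.values()):
            c = (mode - low) / (high - low)              # scipy triangular shape parameter
            costs[:, j] = triang.ppf(u[:, j], c, loc=low, scale=high - low)

        total = costs.sum(axis=1)
        print(f"mean total cost = {total.mean():.1f}")
        print(f"5th/50th/95th percentiles = {np.percentile(total, [5, 50, 95]).round(1)}")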

  15. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances
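
    To make the hybrid design more concrete, the sketch below shows a toy case-based retrieval step that selects a telemetry-checking procedure for the current situation; the case library, features, and thresholds are invented stand-ins, and the simple check function only gestures at what a Behavior Transition Network would encode. This is not the authors' system.

    ```python
    # Hedged sketch: case-based retrieval of a telemetry-analysis procedure, loosely in
    # the spirit of the hybrid CBR/BTN design described above. Everything here is invented.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Case:
        name: str
        features: Dict[str, float]                            # situation descriptors
        procedure: Callable[[Dict[str, float]], List[str]]    # stands in for a BTN

    def eclipse_power_check(telemetry):
        issues = []
        if telemetry.get("battery_voltage", 99.0) < 24.0:     # hypothetical threshold
            issues.append("battery voltage low during eclipse")
        return issues

    case_library = [
        Case("eclipse-entry", {"in_eclipse": 1.0, "safe_mode": 0.0}, eclipse_power_check),
    ]

    def retrieve(query: Dict[str, float]) -> Case:
        # Nearest case by a simple feature distance (stand-in for a real CBR similarity metric)
        def dist(c: Case) -> float:
            keys = set(c.features) | set(query)
            return sum((c.features.get(k, 0.0) - query.get(k, 0.0)) ** 2 for k in keys)
        return min(case_library, key=dist)

    telemetry = {"in_eclipse": 1.0, "safe_mode": 0.0, "battery_voltage": 23.1}
    case = retrieve(telemetry)
    print(case.name, "->", case.procedure(telemetry))
    ```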

  16. A survival analysis of GBM patients in the West of Scotland pre- and post-introduction of the Stupp regime.

    Science.gov (United States)

    Teo, Mario; Martin, Sean; Owusu-Agyemang, Kevin; Nowicki, Stefan; Clark, Brian; Mackinnon, Mairi; Stewart, Willie; Paul, James; St George, Jerome

    2014-06-01

    It is now accepted that the concomitant administration of temozolomide with radiotherapy (Stupp regime) in the treatment of patients with newly diagnosed glioblastoma multiforme (GBM) significantly improves survival, and this practice has been adopted locally since 2004. However, survival outcomes in cancer can vary across population groups, and outcomes can be affected by a number of local factors including socioeconomic status. The West of Scotland has one of the worst socioeconomic and overall health records of any western European region. With the ongoing reorganisation and rationalisation of the National Health Service, the addition of prolonged courses of chemotherapy to patients' management adds significantly to the financial burden of a cash-strapped NHS. A survival analysis of patients with GBM was therefore performed, comparing outcomes before and after the introduction of the Stupp regime, to justify the current practice. Prospectively collected clinical data were analysed for 105 consecutive patients receiving concurrent chemoradiotherapy (Stupp regime) following surgical treatment of GBM between December 2004 and February 2009. These were compared with data for 106 consecutive GBM patients who had radical radiotherapy alone (pre-Stupp regime) after surgery between January 2001 and February 2006. The median overall survival for the post-Stupp cohort was 15.3 months (range, 2.83-50.5 months), with 1-year and 2-year overall survival rates of 65.7% and 19%, respectively. This compares with a median overall survival of 10.7 months in the pre-Stupp cohort, with 1-year and 2-year survival rates of 42.6% and 12%, respectively (log-rank test). These results support the continued use of the Stupp regime for GBM patients in the West of Scotland.
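
    A cohort comparison like the one above is usually presented as Kaplan-Meier curves with a log-rank test. The sketch below shows that workflow in Python with lifelines on synthetic survival times; the generated data are illustrative only and do not correspond to the Scottish cohorts.

    ```python
    # Hedged sketch: Kaplan-Meier curves and a log-rank test for two treatment eras,
    # using synthetic exponential survival times purely for illustration.
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(1)
    pre  = rng.exponential(scale=10.7, size=106)   # months; scales chosen arbitrarily
    post = rng.exponential(scale=15.3, size=105)
    obs_pre  = np.ones_like(pre)                   # assume all deaths observed (no censoring)
    obs_post = np.ones_like(post)

    kmf = KaplanMeierFitter()
    kmf.fit(pre, event_observed=obs_pre, label="pre-Stupp")
    ax = kmf.plot_survival_function()
    kmf.fit(post, event_observed=obs_post, label="post-Stupp")
    kmf.plot_survival_function(ax=ax)

    result = logrank_test(pre, post, event_observed_A=obs_pre, event_observed_B=obs_post)
    print(f"log-rank p-value on synthetic data: {result.p_value:.3g}")
    ```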

  17. Design and analysis methods for fish survival experiments based on release-recapture

    National Research Council Canada - National Science Library

    Burnham, Kenneth P

    1987-01-01

    .... The application of the methods developed here is more general, however, as it includes experiments to estimate survival of fish as they pass over spillways or through bypass systems and several dams...

  18. System-level analysis of genes and functions affecting survival during nutrient starvation in Saccharomyces cerevisiae.

    Science.gov (United States)

    Gresham, David; Boer, Viktor M; Caudy, Amy; Ziv, Naomi; Brandt, Nathan J; Storey, John D; Botstein, David

    2011-01-01

    An essential property of all cells is the ability to exit from active cell division and persist in a quiescent state. For single-celled microbes this primarily occurs in response to nutrient deprivation. We studied the genetic requirements for survival of Saccharomyces cerevisiae when starved for either of two nutrients: phosphate or leucine. We measured the survival of nearly all nonessential haploid null yeast mutants in mixed populations using a quantitative sequencing method that estimates the abundance of each mutant on the basis of frequency of unique molecular barcodes. Starvation for phosphate results in a population half-life of 337 hr whereas starvation for leucine results in a half-life of 27.7 hr. To measure survival of individual mutants in each population we developed a statistical framework that accounts for the multiple sources of experimental variation. From the identities of the genes in which mutations strongly affect survival, we identify genetic evidence for several cellular processes affecting survival during nutrient starvation, including autophagy, chromatin remodeling, mRNA processing, and cytoskeleton function. In addition, we found evidence that mitochondrial and peroxisome function is required for survival. Our experimental and analytical methods represent an efficient and quantitative approach to characterizing genetic functions and networks with unprecedented resolution and identified genotype-by-environment interactions that have important implications for interpretation of studies of aging and quiescence in yeast.
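
    The population half-lives quoted above come from fitting an exponential decay to viability (or mutant-abundance) measurements over time. A minimal sketch of that calculation, on made-up time points and counts, is shown below; it is not the paper's statistical framework, which additionally models barcode-counting noise.

    ```python
    # Hedged sketch: estimate a population half-life by a log-linear least-squares fit
    # of abundance versus time. Data are invented for illustration.
    import numpy as np

    hours     = np.array([0, 48, 96, 168, 264, 336], dtype=float)
    abundance = np.array([1.0, 0.91, 0.82, 0.71, 0.58, 0.50])   # relative viable fraction

    slope, intercept = np.polyfit(hours, np.log(abundance), 1)  # log N(t) = log N0 + slope * t
    half_life = np.log(2) / -slope
    print(f"estimated half-life ≈ {half_life:.0f} h")           # ~337 h for these toy numbers
    ```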

  19. Spinal bone metastases in gynecologic malignancies: a retrospective analysis of stability, prognostic factors and survival

    International Nuclear Information System (INIS)

    Foerster, Robert; Habermehl, Daniel; Bruckner, Thomas; Bostel, Tilman; Schlampp, Ingmar; Welzel, Thomas; Debus, Juergen; Rief, Harald

    2014-01-01

    The aim of this retrospective study was to evaluate the stability of spinal metastases in gynecologic cancer patients (pts) on the basis of a validated scoring system after radiotherapy (RT), to define prognostic factors for stability, and to calculate survival. Forty-four women with gynecologic malignancies and spinal bone metastases were treated at our department between January 2000 and January 2012. Of these, 34 were assessed for stability using the Taneichi score before RT and at 3 and 6 months after RT. Additionally, prognostic factors for stability, overall survival, and bone survival (the time between the first day of RT to the bone metastases and death from any cause) were calculated. Before RT, 47% of pts were unstable, and 6 months after RT, 85% of pts were stable. Karnofsky performance status (KPS) >70% (p = 0.037) and no chemotherapy (ChT) prior to RT (p = 0.046) were significantly predictive of response. Five-year overall survival was 69% and 1-year bone survival was 73%. RT is capable of improving the stability of osteolytic spinal metastases from gynecologic cancer by facilitating re-ossification in survivors. KPS may be a predictor of response. Pts who received ChT prior to RT may require additional bone-supportive treatment to overcome the imbalance in bone remodeling. Survival in women with bone metastases from gynecologic cancer remains poor.

  20. Multivariate Analysis of the Predictors of Survival for Patients with Hepatocellular Carcinoma Undergoing Transarterial Chemoembolization: Focusing on Superselective Chemoembolization

    International Nuclear Information System (INIS)

    Ji, Suk Kyeong; Cho, Yun Ku; Ahn, Yong Sik; Kim, Mi Young; Park, Yoon Ok; Kim, Jae Kyun; Kim, Wan Tae

    2008-01-01

    While the prognostic factors for survival in patients with hepatocellular carcinoma (HCC) who undergo transarterial chemoembolization (TACE) are well known, the clinical significance of performing selective TACE for HCC patients has not been clearly documented. We analyzed the potential factors for disease-free survival in these patients, including the performance of selective TACE. A total of 151 patients with HCC who underwent TACE were retrospectively analyzed for disease-free survival (median follow-up of 23 months, range: 1-88 months). Univariate and multivariate analyses were performed for 20 potential factors using the Cox proportional hazards model, including 19 baseline factors and one procedure-related factor (conventional versus selective TACE). The parameters that proved significant on univariate analysis were subsequently tested in the multivariate model. Conventional and selective TACE were performed in 40 and 111 patients, respectively. Univariate and multivariate analyses revealed that tumor multiplicity, venous tumor thrombosis, and selective TACE were the only three independent significant prognostic factors for disease-free survival (p = 0.002, 0.015 and 0.019, respectively). In our study, selective TACE was a favorable prognostic factor for the disease-free survival of patients with HCC who underwent TACE.

  1. Nitrosourea efficacy in high-grade glioma: a survival gain analysis summarizing 504 cohorts with 24193 patients.

    Science.gov (United States)

    Wolff, Johannes E A; Berrak, Su; Koontz Webb, Susannah E; Zhang, Ming

    2008-05-01

    Even though past studies have suggested efficacy of nitrosourea drugs in patients with high-grade glioma, and temozolomide has recently been shown to be significantly beneficial, no conclusive comparisons between these agents have been published. We performed a survival gain analysis of 364 studies describing 24,193 patients with high-grade glioma treated in 504 cohorts and compared the effects of the drugs. The most frequent diagnoses were glioblastoma multiforme (GBM) (72%) and anaplastic astrocytoma (22%). The mean overall survival (mOS) was 14.1 months. Outcome was influenced by several of the known prognostic factors, including histological grade, whether the tumors were newly diagnosed or recurrent, completeness of resection, patient age, and gender. This information allowed the calculation of a predicted mOS for each cohort, based on its prognostic factors and independent of treatment. Survival gain, used to characterize the influence of treatment, was subsequently defined and validated as the difference between the observed and the predicted mOS. In 62 CCNU-treated cohorts and 15 ACNU-treated cohorts the survival gain was 5.3 months and 8.9 months (P < 0.0005), respectively. No detectable survival gain was found for patients treated with various BCNU-containing regimens. In conclusion, CCNU- and ACNU-containing regimens were superior to BCNU-containing regimens.
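
    The survival-gain idea above amounts to regressing cohort-level median overall survival on treatment-independent prognostic factors and then taking each cohort's residual. A minimal sketch of that two-step calculation on fabricated cohort data is given below; the feature names and coefficients are placeholders, not the published model.

    ```python
    # Hedged sketch: survival gain = observed median OS minus median OS predicted from
    # prognostic factors alone. Cohort data below are fabricated for illustration.
    import numpy as np

    rng = np.random.default_rng(2)
    n_cohorts = 60
    grade_iv     = rng.integers(0, 2, n_cohorts)        # 1 = GBM, 0 = anaplastic astrocytoma
    recurrent    = rng.integers(0, 2, n_cohorts)
    mean_age     = rng.uniform(35, 65, n_cohorts)
    observed_mos = (20 - 6 * grade_iv - 4 * recurrent - 0.1 * mean_age
                    + rng.normal(0, 2, n_cohorts))      # months, toy generating model

    # Fit a linear model of mOS on prognostic factors (ordinary least squares)
    X = np.column_stack([np.ones(n_cohorts), grade_iv, recurrent, mean_age])
    coef, *_ = np.linalg.lstsq(X, observed_mos, rcond=None)
    predicted_mos = X @ coef

    survival_gain = observed_mos - predicted_mos        # positive = better than expected
    print("mean survival gain (≈0 by construction):", survival_gain.mean().round(3))
    ```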

  2. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
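
    The alias method mentioned above builds its tables in O(n) time and then draws each sample in O(1) time. A compact, self-contained Python version (Vose's construction) is sketched below; it is a generic illustration of the algorithm, not the MCNP subroutine described in the abstract, and the voxel weights are hypothetical.

    ```python
    # Hedged sketch: Vose's alias method for O(1) sampling from a discrete distribution,
    # e.g. picking a source voxel with probability proportional to its emission strength.
    import random

    def build_alias(probs):
        n = len(probs)
        scaled = [p * n for p in probs]              # rescale so the average bucket is 1.0
        prob, alias = [0.0] * n, [0] * n
        small = [i for i, p in enumerate(scaled) if p < 1.0]
        large = [i for i, p in enumerate(scaled) if p >= 1.0]
        while small and large:
            s, l = small.pop(), large.pop()
            prob[s], alias[s] = scaled[s], l         # bucket s: keep s w.p. scaled[s], else l
            scaled[l] -= 1.0 - scaled[s]
            (small if scaled[l] < 1.0 else large).append(l)
        for i in small + large:                      # leftovers are numerically ~1.0
            prob[i] = 1.0
        return prob, alias

    def sample(prob, alias, rng=random):
        i = rng.randrange(len(prob))                 # pick a bucket uniformly: O(1)
        return i if rng.random() < prob[i] else alias[i]

    voxel_weights = [0.1, 0.4, 0.2, 0.3]             # hypothetical voxel source strengths
    prob, alias = build_alias(voxel_weights)
    counts = [0] * len(voxel_weights)
    for _ in range(100_000):
        counts[sample(prob, alias)] += 1
    print([c / 100_000 for c in counts])             # ≈ the input weights
    ```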

  3. Trends in grazing emission x-ray analysis techniques

    International Nuclear Information System (INIS)

    Grieken, R. van; Tsuji, K.; Injuk, J.

    2000-01-01

    then, the detection limits imposed by the semiconductor industry roadmap can probably not be obtained by tube-excited GEXRF. The perspectives for tube-excited GE-XRF are thus rather poor. Future developments imply the combination of GEXRF with synchrotron radiation excitation. Grazing-emission particle-induced X-ray emission (GE-PIXE) suffers from similar quantification problems for material deposited on a carrier, but it makes PIXE a surface-sensitive technique, whereas normally the protons penetrate some tens of μm into the sample. Similarly, grazing-emission electron probe micro-analysis (GE-EPMA) allows particles on a flat carrier to be analyzed selectively, provides surface sensitivity in the nm rather than the μm range, and yields, in principle, a spatial resolution for chemical analysis similar to the size of the impinging electron beam, rather than of the electron-excited volume. Both GE-PIXE and GE-EPMA need to be explored more fully in the near future. (author)

  4. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Therese, Laurent; Guillot, Philippe; Muja, Cristina

    2011-01-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the constituting material origin, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-Ray Fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site located between the evangelical church and the parsonage led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  5. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the constituting material origin, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-Ray Fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site located between the evangelical church and the parsonage led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  6. Association between pretreatment Glasgow prognostic score and gastric cancer survival and clinicopathological features: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Zhang CX

    2016-06-01

    Full Text Available Background: The Glasgow prognostic score (GPS) is widely known as a systemic inflammation-based marker. The relationship between pretreatment GPS and gastric cancer (GC) survival and clinicopathological features remains controversial. The aim of the study was to conduct a meta-analysis of published studies to evaluate the association between pretreatment GPS and survival and clinicopathological features in GC patients. Methods: We searched the PubMed, Embase, MEDLINE, and BioMed databases for relevant studies. Combined analyses were used to assess the association between pretreatment GPS and overall survival, disease-free survival, and clinicopathological parameters using Stata Version 12.0. Results: A total of 14 studies were included in this meta-analysis, comprising 5,579 GC patients. The results indicated that pretreatment high GPS (HGPS) predicted poor overall survival (hazard ratio =1.51, 95% CI: 1.37–1.66, P<0.01) and disease-free survival (hazard ratio =1.45, 95% CI: 1.26–1.68, P<0.01) in GC patients. Pretreatment HGPS was also significantly associated with advanced tumor–node–metastasis stage (odds ratio [OR] =3.09, 95% CI: 2.11–4.53, P<0.01), lymph node metastasis (OR =4.60, 95% CI: 3.23–6.56, P<0.01), lymphatic invasion (OR =3.04, 95% CI: 2.00–4.62, P<0.01), and venous invasion (OR =3.56, 95% CI: 1.81–6.99, P<0.01). Conclusion: Our meta-analysis indicated that pretreatment HGPS could be a predictive factor of poor survival outcome and clinicopathological features for GC patients. Keywords: Glasgow prognostic score, gastric cancer, survival, clinicopathological feature
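
    Pooled hazard ratios like those above are typically obtained by combining per-study log hazard ratios with inverse-variance weights, often with a DerSimonian-Laird random-effects adjustment. The sketch below implements that standard calculation on made-up study estimates; the numbers are not taken from this meta-analysis.

    ```python
    # Hedged sketch: DerSimonian-Laird random-effects pooling of hazard ratios.
    # The per-study HRs and confidence intervals below are fabricated.
    import numpy as np

    hr       = np.array([1.4, 1.7, 1.3, 1.6, 1.5])
    ci_lower = np.array([1.1, 1.2, 0.9, 1.2, 1.1])
    ci_upper = np.array([1.8, 2.4, 1.9, 2.1, 2.0])

    y  = np.log(hr)                                   # per-study log hazard ratios
    se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
    w  = 1 / se**2                                    # fixed-effect (inverse-variance) weights

    # DerSimonian-Laird estimate of between-study variance tau^2
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1 / (se**2 + tau2)                         # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1 / np.sum(w_re))
    print(f"pooled HR = {np.exp(y_re):.2f} "
          f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f}), "
          f"I^2 = {max(0.0, (q - df) / q) * 100:.0f}%")
    ```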

  7. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisition. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab

  8. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Fernandez, R.F.; Zhang, W.; Robertson, J.D.; Majidi, V.

    1995-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorous- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag+, Ba2+, and Cd2+ in the concentration range from 10 ng/g to 1 μg/g; for Cu2+ and Pb2+ from 10 ng/g to 5 μg/g; and for Hg2+ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag+, Ba2+, Cd2+, Cu2+, and Pb2+ share common binding sites with binding efficiencies varying in the sequence Pb2+ > Cu2+ > Ag2+ > Cd2+ > Ba2+. The binding of Hg2+ involved a different binding site, with an increase in binding efficiency in the presence of Ag+. (orig.)

  9. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Robertson, J.D.; Majidi, V.

    1994-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. Five mg of dried algae powder were mixed with 5 mL of single- and multi-metal solutions. The algae cells were then collected by filtration on 0.6 μm polycarbonate membranes and analyzed by PIXE using dual-energy irradiation. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag+, Ba2+, and Cd2+ in the concentration range from 10 ng/g to 1 μg/g; for Cu2+ and Pb2+ from 10 ng/g to 5 μg/g; and for Hg2+ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium is also replaced.

  10. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used to evaluate the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process for evaluating the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, have been developed for the seismic margin review of NPP structures. The FA method was originally developed for seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the CDFM and FA procedures was performed.

  11. Analysis of Program Obfuscation Schemes with Variable Encoding Technique

    Science.gov (United States)

    Fukushima, Kazuhide; Kiyomoto, Shinsaku; Tanaka, Toshiaki; Sakurai, Kouichi

    Program analysis techniques have improved steadily over the past several decades, and software obfuscation schemes have come to be used in many commercial programs. A software obfuscation scheme transforms an original program or a binary file into an obfuscated program that is more complicated and difficult to analyze, while preserving its functionality. However, the security of obfuscation schemes has not been properly evaluated. In this paper, we analyze obfuscation schemes in order to clarify the advantages of our scheme, the XOR-encoding scheme. First, we more clearly define five types of attack models that we defined previously, and define quantitative resistance to these attacks. Then, we compare the security, functionality and efficiency of three obfuscation schemes with encoding variables: (1) Sato et al.'s scheme with linear transformation, (2) our previous scheme with affine transformation, and (3) the XOR-encoding scheme. We show that the XOR-encoding scheme is superior with regard to the following two points: (1) the XOR-encoding scheme is more secure against a data-dependency attack and a brute force attack than our previous scheme, and is as secure against an information-collecting attack and an inverse transformation attack as our previous scheme, (2) the XOR-encoding scheme does not restrict the calculable ranges of programs and the loss of efficiency is less than in our previous scheme.
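
    For readers unfamiliar with variable-encoding obfuscation, the sketch below shows the general idea of an XOR-style encoding: a variable is stored only in masked form and decoded at each use site, so its plaintext value never rests in memory under its own name. This is a simplified generic illustration, not the specific scheme analyzed in the paper, and the key handling is purely hypothetical.

    ```python
    # Hedged sketch: a generic XOR variable-encoding transformation.
    # Simplified illustration only; not the paper's XOR-encoding scheme.
    KEY = 0xA5A5_5A5A                      # per-variable mask (would be generated per build)

    def enc(value: int) -> int:
        return value ^ KEY                 # encoded representation kept in memory

    def dec(stored: int) -> int:
        return stored ^ KEY                # decode only at the point of use

    # Original code:        balance = 1200; balance = balance + 75; print(balance)
    # Obfuscated version:   the plaintext value never sits in the variable itself.
    balance_enc = enc(1200)
    balance_enc = enc(dec(balance_enc) + 75)
    print(dec(balance_enc))                # 1275, same observable behavior as the original
    ```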

  12. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    Directory of Open Access Journals (Sweden)

    Ricardo E. Izzo

    2011-06-01

    Full Text Available Basketball is a complex sport, which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. Therefore, it is important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the type of pass used, the most frequent is the two-handed chest pass, with a frequency of 39.9%. This is followed, in terms of frequency, by one-handed passes - the baseball pass, with 20.9% - the two-handed over-the-head pass, with 18.2%, and finally one- or two-handed indirect passes (bounces), with 11.2% and 9.8%. Considering the most used pass in basketball from the biomechanical point of view, the muscles involved in the correct movement include all the muscles of the upper extremity, together with the shoulder muscles and the body fixators (abdominals, hip flexors, knee extensors, and dorsal flexors of the foot). The technical and conditioning analysis considers the throwing speed, the throw height and the air resistance. In conclusion, the aim of this study is to give some guidelines to improve the mechanical execution of the movements in training, without neglecting the importance of the harmony of the movements themselves.

  13. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result over that if conventional source-sampling methods are used. However, this gain in reliability is substantially less than that observed in the model-problem results
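
    As a generic illustration of why stratifying a source helps, the sketch below compares simple random sampling with stratified sampling of a toy source domain and shows the reduced spread of the resulting Monte Carlo estimates. It is a textbook-style example under invented assumptions, not the MCNP eigenvalue implementation discussed above.

    ```python
    # Hedged sketch: stratified versus simple random sampling for a Monte Carlo mean
    # estimate over a toy source domain [0, 1). Illustrative only.
    import numpy as np

    rng = np.random.default_rng(3)
    f = lambda x: np.exp(-3 * x)          # toy "response" of the system to a source point

    def simple_estimate(n):
        return f(rng.random(n)).mean()

    def stratified_estimate(n, n_strata=16):
        per = n // n_strata
        edges = np.linspace(0, 1, n_strata + 1)
        # Draw an equal number of samples inside every stratum of the source domain
        samples = np.concatenate(
            [rng.uniform(lo, hi, per) for lo, hi in zip(edges[:-1], edges[1:])])
        return f(samples).mean()

    reps = 2000
    simple = np.array([simple_estimate(256) for _ in range(reps)])
    strat  = np.array([stratified_estimate(256) for _ in range(reps)])
    print(f"simple MC:     mean={simple.mean():.5f}, std={simple.std():.5f}")
    print(f"stratified MC: mean={strat.mean():.5f}, std={strat.std():.5f}  (smaller spread)")
    ```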

  14. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt); El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams Universty, Cairo (Egypt)

    1997-12-31

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisition. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  15. Breast cancer detection and survival among women with cosmetic breast implants: systematic review and meta-analysis of observational studies.

    Science.gov (United States)

    Lavigne, Eric; Holowaty, Eric J; Pan, Sai Yi; Villeneuve, Paul J; Johnson, Kenneth C; Fergusson, Dean A; Morrison, Howard; Brisson, Jacques

    2013-04-29

    To evaluate whether the stage distribution among women diagnosed as having breast cancer differs between those who have received breast implants for cosmetic purposes and those with no implants and to evaluate whether cosmetic breast augmentation before the detection of breast cancer is a predictor of post-diagnosis survival. Systematic review of observational studies with two meta-analyses. Systematic search of the literature published before September 2012 conducted in Medline, Embase, Global health, CINAHL, IPAB, and PsycINFO. Eligible publications were those that included women diagnosed as having breast cancer and who had had augmentation mammaplasty for cosmetic purposes. The overall odds ratio of the first meta-analysis based on 12 studies was 1.26 (95% confidence interval 0.99 to 1.60; P=0.058; I(2)=35.6%) for a non-localized stage of breast cancer at diagnosis comparing women with implants who had breast cancer and women without implants who had breast cancer. The second meta-analysis, based on five studies, evaluated the relation between cosmetic breast implantation and survival. This meta-analysis showed reduced survival after breast cancer among women who had implants compared with those who did not (overall hazard ratio for breast cancer specific mortality 1.38, 95% confidence interval 1.08 to 1.75). The research published to date suggests that cosmetic breast augmentation adversely affects the survival of women who are subsequently diagnosed as having breast cancer. These findings should be interpreted with caution, as some studies included in the meta-analysis on survival did not adjust for potential confounders. Further investigations are warranted regarding diagnosis and prognosis of breast cancer among women with breast implants.

  16. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
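
    As a rough illustration of this kind of multivariate workflow, the sketch below builds a PLS regression model for composition and a PCA projection for class exploration on synthetic "spectra"; the data, number of components, and element labels are invented stand-ins, not real LIBS spectra or calibration standards.

    ```python
    # Hedged sketch: PLS calibration and PCA exploration of synthetic "spectra".
    # Spectra and compositions are randomly generated stand-ins for LIBS data.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    n_samples, n_channels = 18, 500
    concentrations = rng.uniform(0, 50, size=(n_samples, 2))   # e.g. wt% SiO2, FeO (hypothetical)
    basis = rng.normal(size=(2, n_channels))                   # fake emission-line patterns
    spectra = concentrations @ basis + rng.normal(0, 0.5, size=(n_samples, n_channels))

    # PLS: calibrate spectra -> composition, then predict an "unknown"
    pls = PLSRegression(n_components=2)
    pls.fit(spectra, concentrations)
    print("predicted composition of sample 0:", pls.predict(spectra[:1]).round(1))

    # PCA: project spectra into a low-dimensional space for grouping/classification
    scores = PCA(n_components=2).fit_transform(spectra)
    print("first two PCA scores of sample 0:", scores[0].round(2))
    ```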

  17. Vitamin C and survival among women with breast cancer: a meta-analysis.

    Science.gov (United States)

    Harris, Holly R; Orsini, Nicola; Wolk, Alicja

    2014-05-01

    The association between dietary vitamin C intake and breast cancer survival is inconsistent and few studies have specifically examined vitamin C supplement use among women with breast cancer. The purpose of this study was to summarise results from prospective studies on the association between vitamin C supplement use and dietary vitamin C intake and breast cancer-specific mortality and total mortality. Studies were identified using the PubMed database through February 6, 2014 and by examining the references of retrieved articles. Prospective studies were included if they reported relative risks (RR) with 95% confidence intervals (95% CIs) for at least two categories or as a continuous exposure. Random-effects models were used to combine study-specific results. The ten identified studies examined vitamin C supplement use (n=6) and dietary vitamin C intake (n=7) and included 17,696 breast cancer cases, 2791 total deaths, and 1558 breast cancer-specific deaths. The summary RR (95% CI) for post-diagnosis vitamin C supplement use was 0.81 (95% CI 0.72-0.91) for total mortality and 0.85 (95% CI 0.74-0.99) for breast cancer-specific mortality. The summary RR for a 100mg per day increase in dietary vitamin C intake was 0.73 (95% CI 0.59-0.89) for total mortality and 0.78 (95% CI 0.64-0.94) for breast cancer-specific mortality. Results from this meta-analysis suggest that post-diagnosis vitamin C supplement use may be associated with a reduced risk of mortality. Dietary vitamin C intake was also statistically significantly associated with a reduced risk of total mortality and breast cancer-specific mortality. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Bruxism and dental implant failures: a multilevel mixed effects parametric survival analysis approach.

    Science.gov (United States)

    Chrcanovic, B R; Kisch, J; Albrektsson, T; Wennerberg, A

    2016-11-01

    Recent studies have suggested that the insertion of dental implants in patients diagnosed with bruxism negatively affects implant failure rates. The aim of the present study was to investigate the association between bruxism and the risk of dental implant failure. This retrospective study is based on 2670 patients who received 10 096 implants at one specialist clinic. Implant- and patient-related data were collected. Descriptive statistics were used to describe the patients and implants. Multilevel mixed effects parametric survival analysis was used to test the association between bruxism and the risk of implant failure, adjusting for several potential confounders. Criteria from a recent international consensus (Lobbezoo et al., J Oral Rehabil, 40, 2013, 2) and from the International Classification of Sleep Disorders (International classification of sleep disorders, revised: diagnostic and coding manual, American Academy of Sleep Medicine, Chicago, 2014) were used to define and diagnose the condition. The number of implants with information available for all variables totalled 3549, placed in 994 patients, with 179 implants reported as failures. The implant failure rates were 13·0% (24/185) for bruxers and 4·6% (155/3364) for non-bruxers. Bruxism was a statistically significant risk factor for implant failure (HR 3·396; 95% CI 1·314, 8·777; P = 0·012), as were implant length, implant diameter, implant surface, bone quantity D in relation to quantity A, bone quality 4 in relation to quality 1 (Lekholm and Zarb classification), smoking and the intake of proton pump inhibitors. It is suggested that bruxism may be associated with an increased risk of dental implant failure. © 2016 John Wiley & Sons Ltd.

  19. The risk of falling into poverty after developing heart disease: a survival analysis.

    Science.gov (United States)

    Callander, Emily J; Schofield, Deborah J

    2016-07-15

    Those with a low income are known to have a higher risk of developing heart disease. However, the reverse relationship - falling into income poverty after developing heart disease - has not been explored with longitudinal data. This paper aims to determine whether those with heart disease have an elevated risk of falling into poverty. Survival analysis was conducted using the longitudinal Household Income and Labour Dynamics in Australia survey, between the years 2007 and 2012. The study focused on the Australian population aged 21 years and over in 2007 who were not already in poverty and did not already have heart disease, who were followed from 2007 to 2012. Cox regression models adjusting for age, sex and time-varying co-variates (marital status, home ownership and remoteness of area of residence) were constructed to assess the risk of falling into poverty. For those aged 20 who developed heart disease, the hazard ratio for falling into income poverty was 9.24 (95 % CI: 8.97-9.51) and for falling into multidimensional poverty the hazard ratio was 14.21 (95 % CI: 13.76-14.68); for those aged 40 the hazard ratio for falling into income poverty was 3.45 (95 % CI: 3.39-3.51) and for multidimensional poverty, 5.20 (95 % CI: 5.11-5.29); and for those aged 60 the hazard ratio for falling into income poverty was 1.29 (95 % CI: 1.28-1.30) and for multidimensional poverty, 1.52 (95 % CI: 1.51-1.54), relative to those who never developed heart disease. The risk of both income and multidimensional poverty decreases with age up to the age of 70, above which those who developed heart disease had a reduced risk of poverty. For those under the age of 70, developing heart disease is associated with an increased risk of falling into both income poverty and multidimensional poverty.

  20. Stage IB - IIA cervical cancer patient’s survival rate after receiving definitive radiation and radical operation therapy followed by adjuvant radiation therapy along with analysis of factors affecting the patient’s survival rate

    Science.gov (United States)

    Ruslim, S. K.; Purwoto, G.; Widyahening, I. S.; Ramli, I.

    2017-08-01

    To evaluate the characteristics and overall survival rates of early stage cervical cancer (FIGO IB-IIA) patients who received definitive radiation therapy and those who were prescribed adjuvant postoperative radiation, and to analyze the factors that affect overall survival in both treatment groups. The medical records of 85 patients with cervical cancer FIGO stages IB-IIA who were treated at the Department of Radiotherapy of Cipto Mangunkusumo Hospital were reviewed and analyzed to determine their overall survival and the factors that affected it, comparing a definitive radiation group with an adjuvant postoperative radiation group. There were 25 patients in the definitive radiation group and 60 patients in the adjuvant radiation group. The overall survival rates in the adjuvant radiation group at years one, two, and three were 96.7%, 95%, and 93.3%, respectively. Negative lymph node metastasis showed an association with overall survival, as did a preradiation Hb level >12 g/dl, in cervical cancer FIGO stage IB-IIA patients who received definitive radiation or adjuvant postoperative radiation. Negative lymph node metastasis had an effect on the overall survival rate in the adjuvant postoperative radiation group, while a preradiation Hb level >12 g/dl tended to affect overall survival in the definitive radiation group patients.

  1. Genome analysis of Pseudoalteromonas flavipulchra JG1 reveals various survival advantages in marine environment.

    Science.gov (United States)

    Yu, Min; Tang, Kaihao; Liu, Jiwen; Shi, Xiaochong; Gulder, Tobias A M; Zhang, Xiao-Hua

    2013-10-16

    Competition between bacteria for habitat and resources is very common in the natural environment and is considered to be a selective force for survival. Many strains of the genus Pseudoalteromonas have been confirmed to produce bioactive compounds that provide advantages over their competitors. In our previous study, P. flavipulchra JG1 was found to synthesize a Pseudoalteromonas flavipulchra antibacterial protein (PfaP) with L-amino acid oxidase activity and five small chemical compounds, which were the main competitive agents of the strain. In addition, the genome of this bacterium has previously been sequenced as a Whole Genome Shotgun project (PMID: 22740664). In this study, more extensive genomic analysis was performed to identify specific genes or gene clusters related to its competitive features, and further experiments were carried out to confirm the physiological roles of these genes when competing with other microorganisms in the marine environment. The antibacterial protein PfaP may also participate in the biosynthesis of 6-bromoindolyl-3-acetic acid, indicating a synergistic effect between the antibacterial macromolecule and the small molecules. The chitinases and quorum-quenching enzymes present in P. flavipulchra, which are consistent with the strong chitinase and acyl homoserine lactone-degrading activities of strain JG1, suggest additional potential mechanisms contributing to its antibacterial/antifungal activities. Moreover, motility and rapid response mechanisms to phosphorus starvation and other stresses, such as antibiotic, oxidative and heavy metal stress, enable JG1 to adapt to deleterious, fluctuating and oligotrophic marine environments. The genome of P. flavipulchra JG1 thus exhibits significant genetic advantages over other microorganisms, encoding antimicrobial agents as well as abilities to adapt to various adverse environments. Genes involved in the synthesis of various antimicrobial substances enrich the antagonistic mechanisms of P. flavipulchra JG1 and afford the strain survival advantages in the marine environment.

  2. Impact of Interstitial Pneumonia on the Survival and Risk Factors Analysis of Patients with Hematological Malignancy

    Directory of Open Access Journals (Sweden)

    Wei-Liang Chen

    2013-01-01

    Full Text Available Background. The emergence of interstitial pneumonia (IP) in patients with hematological malignancy (HM) is becoming a challenging scenario in current practice. However, detailed characterization and investigation of outcomes and risk factors for survival have not been addressed. Methods. We conducted a retrospective study of 42,584 cancer patients covering the period between 1996 and 2008 using the institutional cancer registry system. Among 816 HM patients, 61 patients with IP were recognized. The clinical features, laboratory results, and histological types were studied to determine the impact of IP on survival and identify the profile of prognostic factors. Results. HM patients with IP showed significantly worse 5-year overall survival than those without IP (P=0.027). Overall survival showed no significant difference between infectious pneumonia and noninfectious interstitial pneumonia (IIP versus nIIP) (P=0.323). In a multivariate Cox regression model, leukocyte and platelet counts were associated with increased risk of death. Conclusions. The occurrence of IP in HM patients is associated with increased mortality. Of interest, nIIP is a prognostic indicator in patients with lymphoma but not in patients with leukemia. However, aggressive management of IP in patients with HM is strongly advised, and further prospective surveys are warranted.

  3. Trends of Incidence and Survival of Gastrointestinal Neuroendocrine Tumors in the United States: A Seer Analysis

    Directory of Open Access Journals (Sweden)

    Vassiliki L. Tsikitis, Betsy C. Wertheim, Marlon A. Guerrero

    2012-01-01

    Full Text Available OBJECTIVES: To examine trends in detection and survival of hollow viscus gastrointestinal neuroendocrine tumors (NETs) across time and geographic regions of the U.S. METHODS: We used the Surveillance, Epidemiology and End Results (SEER) database to investigate 19,669 individuals with newly diagnosed gastrointestinal NETs. Trends in incidence were tested using Poisson regression. Cox proportional hazards regression was used to examine survival. RESULTS: Incidence increased over time for NETs of all gastrointestinal sites (all P < 0.001), except appendix. Rates have risen faster for NETs of the small intestine and rectum than stomach and colon. Rectal NETs were detected at a faster pace among blacks than whites (P < 0.001) and at a slower pace in the East than in other regions (P < 0.001). We observed that appendiceal and rectal NETs carry the best prognosis and that survival of small intestinal and colon NETs has improved for both men and women. Colon NETs showed different temporal trends in survival according to geographic region (P-interaction = 0.028). Improved prognosis was more consistent across the country for small intestinal NETs. CONCLUSIONS: Incidence of gastrointestinal NETs has increased, accompanied by inconsistently improved survival for different anatomic sites among certain groups defined by race and geographic region.
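
    Incidence trends of the kind tested above are commonly modeled with Poisson regression of yearly case counts on calendar year, with the population at risk entering as an exposure offset. A minimal sketch on fabricated counts is shown below; it is not the SEER analysis itself, and the counts, populations, and trend are invented.

    ```python
    # Hedged sketch: Poisson regression for an incidence trend (annual percent change).
    # Yearly counts and populations are fabricated for illustration.
    import numpy as np
    import statsmodels.api as sm

    years  = np.arange(1990, 2010)
    pop    = np.full(years.shape, 2_000_000)                    # person-years at risk (toy)
    rate   = 1.5e-5 * np.exp(0.04 * (years - years[0]))         # true 4%/year increase
    counts = np.random.default_rng(5).poisson(rate * pop)

    X = sm.add_constant(years - years[0])
    model = sm.GLM(counts, X, family=sm.families.Poisson(), offset=np.log(pop)).fit()
    annual_change = (np.exp(model.params[1]) - 1) * 100
    print(f"estimated annual increase in incidence ≈ {annual_change:.1f}% "
          f"(p = {model.pvalues[1]:.2g})")
    ```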

  4. Hyperfractionated Accelerated Radiotherapy (HART) for Anaplastic Thyroid Carcinoma: Toxicity and Survival Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dandekar, Prasad [Head and Neck/Thyroid Unit, Royal Marsden NHS Foundation Trust, Sutton, Surrey (United Kingdom); Harmer, Clive; Barbachano, Yolanda [Department of Clinical Research and Development, Royal Marsden NHS Foundation Trust, Sutton, Surrey (United Kingdom); Rhys-Evans, Peter; Harrington, Kevin; Nutting, Christopher [Head and Neck-Thyroid Unit, Royal Marsden NHS Foundation Trust, Chelsea, London (United Kingdom); Newbold, Kate [Head and Neck/Thyroid Unit, Royal Marsden NHS Foundation Trust, Sutton, Surrey (United Kingdom); Consultant Clinical Oncologist, Royal Marsden NHS Foundation Trust, Chelsea, London (United Kingdom)

    2009-06-01

    Purpose: Anaplastic thyroid carcinoma (ATC) is one of the most aggressive cancers, and the current protocol of hyperfractionated accelerated radiotherapy was initiated to improve survival while limiting toxicities. Methods and Materials: All patients with ATC from 1991 to 2002 were accrued and received megavoltage radiotherapy from the mastoid processes to the carina up to 60 Gy in twice-daily fractions of 1.8 and 2 Gy, 6 hours apart. Results: Thirty-one patients were accrued with a median age of 69 years, and 55% were women. Debulking was performed in 26%, and total thyroidectomy in 6%, whereas 68% received radical radiotherapy alone. Local control data were available for 27 patients: 22% had a complete response, 26% had a partial response, 15% showed progressive disease, and 37% showed static disease. Median overall survival for all 31 patients was 70 days (95% confidence interval, 40-99). There was no significant difference in median survival between patients younger (70 days) and older than 70 years (42 days), between men (70 days) and women (49 days), or between patients receiving postoperative radiotherapy (77 days) and radical radiotherapy alone (35 days). Grade III or higher skin erythema was seen in 56% of patients; desquamation in 21%; dysphagia in 74%; and esophagitis in 79%. Conclusion: The current protocol failed to offer a significant survival benefit, was associated with severe toxicities, and thus was discontinued. There is a suggestion that younger patients with operable disease have longer survival, but this would require a larger study to confirm.

  5. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    Science.gov (United States)

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups.

  6. KMWin – A Convenient Tool for Graphical Presentation of Results from Kaplan-Meier Survival Time Analysis

    Science.gov (United States)

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Background Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. Results On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. Conclusions We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups. PMID:22723912

  7. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    Directory of Open Access Journals (Sweden)

    Arnd Gross

    Full Text Available BACKGROUND: Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. RESULTS: On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. CONCLUSIONS: We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups.

  8. Comparative Analysis of the Dark Ground Buffy Coat Technique (DG ...

    African Journals Online (AJOL)

    The prevalence of trypanosome infection in 65 cattle reared under an extensive system of management was determined using the dark ground buffy coat (DG) technique and the enzyme-linked immunosorbent assay (ELISA). The DG technique showed that there were 18 positive cases (27.69%) of the total number of animals, made ...

  9. A novel preconcentration technique for the PIXE analysis of water

    Energy Technology Data Exchange (ETDEWEB)

    Savage, J.M. [Element Analysis Corp., Lexington, KY (United States); Fernandez, R.F. [Element Analysis Corp., Lexington, KY (United States); Zhang, W. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Robertson, J.D. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Majidi, V. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States)

    1995-05-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag{sup +}, Ba{sup 2+}, and Cd{sup 2+} in the concentration range from 10 ng/g to 1 {mu}g/g; for Cu{sup 2+} and Pb{sup 2+} from 10 ng/g to 5 {mu}g/g; and for Hg{sup 2+} from 10 ng/g to 10 {mu}g/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 {mu}g/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions, Ag{sup +}, Ba{sup 2+}, Cd{sup 2+}, Cu{sup 2+}, and Pb{sup 2+}, share common binding sites with binding efficiencies varying in the sequence of Pb{sup 2+}>Cu{sup 2+}>Ag{sup +}>Cd{sup 2+}>Ba{sup 2+}. The binding of Hg{sup 2+} involved a different binding site with an increase in binding efficiency in the presence of Ag{sup +}. (orig.).
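
    As a toy illustration of the linear calibration implied by the ranges above, a least-squares line can be fitted to signal versus concentration and inverted for an unknown; the numbers below are invented placeholders, not PIXE data from this study.

      import numpy as np

      conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0])            # spiked standards, ug/g (hypothetical)
      signal = np.array([12.0, 58.0, 118.0, 595.0, 1190.0])   # hypothetical X-ray counts

      slope, intercept = np.polyfit(conc, signal, 1)           # least-squares calibration line
      unknown_counts = 300.0
      print("estimated concentration:", (unknown_counts - intercept) / slope, "ug/g")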

  10. Systemic meningococcal disease in children: survival analysis, Arkhangelsk region, Northwest Russia, 1991–2011

    Directory of Open Access Journals (Sweden)

    O. V. Samodova

    2012-01-01

    Full Text Available Systemic meningococcal infection requires prompt and adequate medical care. It is considered an unpredictable disease due to the extreme severity of the patient's condition and the high risk of a fatal outcome. The survival of children with systemic meningococcal infection was studied. The retrospective cohort includes all cases of systemic meningococcal disease in children that arose in the Arkhangelsk region in 1991–2011. The rate of fatal outcomes was high (41%). All deaths occurred during the first three days of illness. Survival of patients with a correct pre-admission diagnosis was higher than that of initially undiagnosed cases. Survival functions were influenced by the form of the disease and the presence of septic shock. The use of intramuscular glucocorticoid injections at the pre-admission stage, as commonly recommended, did not improve the outcome.

  11. Number of Lymph Nodes Removed and Survival after Gastric Cancer Resection: An Analysis from the US Gastric Cancer Collaborative.

    Science.gov (United States)

    Gholami, Sepideh; Janson, Lucas; Worhunsky, David J; Tran, Thuy B; Squires, Malcolm Hart; Jin, Linda X; Spolverato, Gaya; Votanopoulos, Konstantinos I; Schmidt, Carl; Weber, Sharon M; Bloomston, Mark; Cho, Clifford S; Levine, Edward A; Fields, Ryan C; Pawlik, Timothy M; Maithel, Shishir K; Efron, Bradley; Norton, Jeffrey A; Poultsides, George A

    2015-08-01

    Examination of at least 16 lymph nodes (LNs) has been traditionally recommended during gastric adenocarcinoma resection to optimize staging, but the impact of this strategy on survival is uncertain. Because recent randomized trials have demonstrated a therapeutic benefit from extended lymphadenectomy, we sought to investigate the impact of the number of LNs removed on prognosis after gastric adenocarcinoma resection. We analyzed patients who underwent gastrectomy for gastric adenocarcinoma from 2000 to 2012, at 7 US academic institutions. Patients with M1 disease or R2 resections were excluded. Disease-specific survival (DSS) was calculated using the Kaplan-Meier method and compared using log-rank and Cox regression analyses. Of 742 patients, 257 (35%) had 7 to 15 LNs removed and 485 (65%) had ≥16 LNs removed. Disease-specific survival was not significantly longer after removal of ≥16 vs 7 to 15 LNs (10-year survival, 55% vs 47%, respectively; p = 0.53) for the entire cohort, but was significantly improved in the subset of patients with stage IA to IIIA (10-year survival, 74% vs 57%, respectively; p = 0.018) or N0-2 disease (72% vs 55%, respectively; p = 0.023). Similarly, for patients who were classified to more likely be "true N0-2," based on frequentist analysis incorporating both the number of positive and of total LNs removed, the hazard ratio for disease-related death (adjusted for T stage, R status, grade, receipt of neoadjuvant and adjuvant therapy, and institution) significantly decreased as the number of LNs removed increased. The number of LNs removed during gastrectomy for adenocarcinoma appears itself to have prognostic implications for long-term survival. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
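
    A hedged sketch of the kind of Kaplan-Meier and log-rank comparison described above, using the open-source Python package lifelines (an assumption of this note, not software used by the authors); the survival times are simulated, not the US Gastric Cancer Collaborative data.

      import numpy as np
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(0)
      # simulated disease-specific survival times (months) for two node-count groups
      t_few = rng.exponential(60, size=120)              # 7-15 lymph nodes removed
      t_many = rng.exponential(75, size=200)             # >=16 lymph nodes removed
      e_few = (rng.random(120) < 0.7).astype(int)        # 1 = death from disease, 0 = censored
      e_many = (rng.random(200) < 0.7).astype(int)

      km = KaplanMeierFitter().fit(t_many, e_many, label=">=16 LNs")
      print(km.survival_function_.head())                # product-limit estimate S(t)

      res = logrank_test(t_few, t_many, event_observed_A=e_few, event_observed_B=e_many)
      print(f"log-rank p-value = {res.p_value:.3f}")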

  12. Repair or Replacement for Isolated Tricuspid Valve Pathology? Insights from a Surgical Analysis on Long-Term Survival

    Science.gov (United States)

    Farag, Mina; Arif, Rawa; Sabashnikov, Anton; Zeriouh, Mohamed; Popov, Aron-Frederik; Ruhparwar, Arjang; Schmack, Bastian; Dohmen, Pascal M.; Szabó, Gábor; Karck, Matthias; Weymann, Alexander

    2017-01-01

    Background Long-term follow-up data concerning isolated tricuspid valve pathology after replacement or reconstruction are limited. Current American Heart Association guidelines equally recommend repair and replacement when surgical intervention is indicated. Our aim was to investigate and compare operative mortality and long-term survival in patients undergoing isolated tricuspid valve repair surgery versus replacement. Material/Methods Between 1995 and 2011, 109 consecutive patients underwent surgical correction of tricuspid valve pathology at our institution for varying structural pathologies. A total of 41 (37.6%) patients underwent tricuspid annuloplasty/repair (TAP) with or without ring implantation, while 68 (62.3%) patients received tricuspid valve replacement (TVR), of which 36 (53%) were mechanical and 32 (47%) biological prostheses. Results Early survival at 30 days after surgery was 97.6% in the TAP group and 91.1% in the TVR group. After 6 months, 89.1% in the TAP group and 87.8% in the TVR group were alive. In terms of long-term survival, there was no further mortality observed after one year post surgery in either group (Log Rank p=0.919, Breslow p=0.834, Tarone-Ware p=0.880) in the Kaplan-Meier survival analysis. The 1-, 5-, and 8-year survival rates were 85.8% for the TAP group and 87.8% for the TVR group. Conclusions Surgical repair of the tricuspid valve does not show a survival benefit when compared to replacement. Hence valve replacement should be considered generously in patients with reasonable suspicion that regurgitation after repair will recur. PMID:28236633

  13. Analysis of the Incidence and Survival of Female Breast Cancer Patients in Beijing Over a 20-Year Period

    Institute of Scientific and Technical Information of China (English)

    Qijun Wang; Weixing Zhu; Xiumei Xing; Chenxu Qu

    2006-01-01

    OBJECTIVE To provide evidence for breast cancer prevention and control through epidemiological analysis of the incidence, mortality and survival rate of female breast cancer patients in Beijing. METHODS The female registration data in the Beijing urban area from 1982 to 2001 were retrospectively reviewed. The incidence, mortality and survival rate of female breast cancer patients were analyzed using routine and life-table statistical methods. RESULTS During the period of 1982 to 2001, there was a trend of an average annual increase of female breast cancer incidence of 4.6% in urban Beijing, and of 4.9% in the world-population standardized incidence. The epidemiological features of urban Beijing female breast cancer showed: (1) The incidence distribution of different age groups from 25 to 80 years elevated with two peaks at ages of 45~ and 70~ years; (2) There was an elevation in each age group over the last 20 years; (3) The incidence rate at ages of 35 to 64 reached 95.3/10⁵, causing breast cancer to become the number one cancer in females. The changes in the survival rate showed the following: the 5-year observed survival rate (OSR) increased from 62.0% in 1982~1983 to 68.7% in 1987~1988, and the relative-survival rate (RSR) increased from 66.3% to 74.2%. The 10-year OSR and RSR in 1987~1988 were 60.3% and 65.1%, and at 15 years 57.5% and 61.3%, respectively. The mortality rate of breast cancer patients fluctuated from 8 to 10 per 10⁵ population over the 20 years of study. CONCLUSION There is a trend of an annual increase in female breast cancer in Beijing. The 5-year survival is being improved gradually while the mortality remains stable. The results demonstrate that the principles of "early prevention, diagnosis and treatment" for breast cancer are effective in Beijing.
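
    For reference, the relative survival rate quoted above is conventionally defined as the observed survival divided by the survival expected in a demographically matched general population (a standard definition; the paper's exact life-table formulation is not given in the abstract), in LaTeX notation:

      \[
        \mathrm{RSR}(t) \;=\; \frac{\mathrm{OSR}(t)}{S_{\mathrm{expected}}(t)},
      \]
      % where S_expected(t) is the survival of an age- and sex-matched general population.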

  14. Analysis of single nucleotide variants of HFE gene and association to survival in The Cancer Genome Atlas GBM data.

    Science.gov (United States)

    Lee, Sang Y; Zhu, Junjia; Salzberg, Anna C; Zhang, Bo; Liu, Dajiang J; Muscat, Joshua E; Langan, Sara T; Connor, James R

    2017-01-01

    Human hemochromatosis protein (HFE) is involved in iron metabolism. Two major HFE polymorphisms, H63D and C282Y, have been associated with an increased risk of cancers. Previously, we reported decreased gender effects in overall survival based on H63D or C282Y HFE polymorphisms in patients with glioblastoma multiforme (GBM). However, the effect of other single nucleotide variation (SNV) in the HFE gene on cancer development and progression has not been systematically studied. To expand our finding in a larger sample, and to identify other HFE SNV, we analyzed the frequency of somatic SNV in the HFE gene and its relationship to survival in GBM patients using The Cancer Genome Atlas (TCGA) GBM (Caucasian only) database. We found 9 SNVs with increased frequency in normal blood samples of TCGA GBM patients compared to the 1000Genome. Among 9 SNVs, 7 SNVs were located in the intron and 2 SNVs (i.e., H63D, C282Y) in the exon of the HFE gene. The statistical analysis demonstrated that normal blood samples of TCGA GBM patients have more H63D (p = 0.0002, 95% Confidence interval (CI): 0.2119-0.3223) or C282Y (p = 0.0129, 95% CI: 0.0474-0.1159) HFE polymorphisms than 1000Genome. The Kaplan-Meier survival curve for the 264 GBM samples revealed no difference between wild type (WT) HFE and H63D, and WT HFE and C282Y GBM patients. In addition, there was no difference in the survival of male/female GBM patients based on HFE genotype. There was no correlation between HFE expression and survival. In conclusion, the current results suggest that somatic HFE polymorphisms do not impact GBM patients' survival in the TCGA data set of GBM.

  15. Analysis of single nucleotide variants of HFE gene and association to survival in The Cancer Genome Atlas GBM data.

    Directory of Open Access Journals (Sweden)

    Sang Y Lee

    Full Text Available Human hemochromatosis protein (HFE) is involved in iron metabolism. Two major HFE polymorphisms, H63D and C282Y, have been associated with an increased risk of cancers. Previously, we reported decreased gender effects in overall survival based on H63D or C282Y HFE polymorphisms in patients with glioblastoma multiforme (GBM). However, the effect of other single nucleotide variation (SNV) in the HFE gene on cancer development and progression has not been systematically studied. To expand our finding in a larger sample, and to identify other HFE SNV, we analyzed the frequency of somatic SNV in the HFE gene and its relationship to survival in GBM patients using The Cancer Genome Atlas (TCGA) GBM (Caucasian only) database. We found 9 SNVs with increased frequency in normal blood samples of TCGA GBM patients compared to the 1000Genome. Among 9 SNVs, 7 SNVs were located in the intron and 2 SNVs (i.e., H63D, C282Y) in the exon of the HFE gene. The statistical analysis demonstrated that normal blood samples of TCGA GBM patients have more H63D (p = 0.0002, 95% Confidence interval (CI): 0.2119-0.3223) or C282Y (p = 0.0129, 95% CI: 0.0474-0.1159) HFE polymorphisms than 1000Genome. The Kaplan-Meier survival curve for the 264 GBM samples revealed no difference between wild type (WT) HFE and H63D, and WT HFE and C282Y GBM patients. In addition, there was no difference in the survival of male/female GBM patients based on HFE genotype. There was no correlation between HFE expression and survival. In conclusion, the current results suggest that somatic HFE polymorphisms do not impact GBM patients' survival in the TCGA data set of GBM.

  16. Improved Survival With Radiation Therapy in High-Grade Soft Tissue Sarcomas of the Extremities: A SEER Analysis

    International Nuclear Information System (INIS)

    Koshy, Matthew; Rich, Shayna E.; Mohiuddin, Majid M.

    2010-01-01

    Purpose: The benefit of radiation therapy in extremity soft tissue sarcomas remains controversial. The purpose of this study was to determine the effect of radiation therapy on overall survival among patients with primary soft tissue sarcomas of the extremity who underwent limb-sparing surgery. Methods and Materials: A retrospective study from the Surveillance, Epidemiology, and End Results (SEER) database that included data from January 1, 1988, to December 31, 2005. A total of 6,960 patients constituted the study population. Overall survival curves were constructed using the Kaplan-Meier method for the entire cohort and for patients with low- and high-grade tumors. Hazard ratios were calculated based on multivariable Cox proportional hazards models. Results: Of the cohort, 47% received radiation therapy. There was no significant difference in overall survival among patients with low-grade tumors by receipt of radiation therapy. In high-grade tumors, the 3-year overall survival was 73% in patients who received radiation therapy vs. 63% for those who did not receive radiation therapy (p < 0.001). On multivariate analysis, patients with high-grade tumors who received radiation therapy had an improved overall survival (hazard ratio 0.67, 95% confidence interval 0.57-0.79). Among patients receiving radiation therapy, 13.5% received it in a neoadjuvant setting. The incidence of patients receiving neoadjuvant radiation did not change significantly between 1988 and 2005. Conclusions: To our knowledge, this is the largest population-based study reported in patients undergoing limb-sparing surgery for soft tissue sarcomas of the extremities. It reports that radiation was associated with improved survival in patients with high-grade tumors.
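
    A minimal sketch of the multivariable Cox proportional hazards adjustment described above, assuming the Python lifelines package and simulated (not SEER) data:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(3)
      n = 300
      rt = rng.integers(0, 2, n)             # 1 = received radiation therapy (simulated)
      grade_high = rng.integers(0, 2, n)     # 1 = high-grade tumor (simulated)
      # survival times whose mean is longer with radiation and shorter with high grade
      t = rng.exponential(scale=np.exp(1.0 + 0.5 * rt - 0.7 * grade_high), size=n)
      c = rng.exponential(scale=5.0, size=n)                 # independent censoring times
      df = pd.DataFrame({"time": np.minimum(t, c),
                         "event": (t <= c).astype(int),
                         "rt": rt, "grade_high": grade_high})

      cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
      cph.print_summary()                    # adjusted hazard ratios with 95% CIs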

  17. An analysis of the survivability of sensor darts in impacts with trees.

    Energy Technology Data Exchange (ETDEWEB)

    Prentice, John K. (Sci-Tac, Inc., Boulder, CO.); Gardner, David Randall

    2005-07-01

    A methodology was developed for computing the probability that the sensor dart for the 'Near Real-Time Site Characterization for Assured HDBT Defeat' Grand-Challenge LDRD project will survive deployment over a forested region. The probability can be decomposed into three approximately independent probabilities that account for forest coverage, branch density and the physics of an impact between the dart and a tree branch. The probability that a dart survives an impact with a tree branch was determined from the deflection induced by the impact. If a dart was deflected so that it impacted the ground at an angle of attack exceeding a user-specified threshold value, it was assumed not to survive the impact with the branch; otherwise it was assumed to have survived. A computer code was developed for calculating dart angle of attack at impact with the ground, and a Monte Carlo scheme was used to calculate the probability distribution of a sensor dart surviving an impact with a branch as a function of branch radius, length, and height from the ground. Both an early prototype design and the current dart design were used in these studies. As a general rule of thumb, we observed that for reasonably generic trees and for a threshold angle of attack of 5{sup o} (which is conservative for dart survival), the probability of reaching the ground with an angle of attack less than the threshold is on the order of 30% for the prototype dart design and 60% for the current dart design, though these numbers should be treated with some caution.
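
    A toy Monte Carlo sketch of the probability decomposition described above (forest coverage, branch encounter, impact physics); every number and distribution here is invented for illustration and does not represent the Sandia code or its inputs.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      p_forest = 0.6          # hypothetical probability the dart descends over trees
      p_branch_hit = 0.35     # hypothetical probability of striking a branch, given trees

      # hypothetical deflection model: angle of attack (degrees) after a branch impact
      aoa_after_hit = rng.gamma(shape=2.0, scale=4.0, size=n)
      threshold_deg = 5.0     # conservative survival threshold quoted in the abstract

      over_trees = rng.random(n) < p_forest
      hit_branch = over_trees & (rng.random(n) < p_branch_hit)
      survives = ~hit_branch | (aoa_after_hit < threshold_deg)
      print(f"estimated survival probability: {survives.mean():.3f}")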

  18. Needs analysis for educating community pharmacists to interface with prehospital stroke chain of survival.

    Science.gov (United States)

    Denetclaw, Tina Harrach; Cefalu, Patricia; Manila, Louis L; Panagotacos, John J

    2014-02-01

    Awareness of the American Heart Association's Stroke Chain of Survival, and willingness to learn and share this information with the public, was assessed for community pharmacists practicing near a primary stroke center. Twenty-three community pharmacies local to a primary stroke center were identified and surveyed. The surveyor showed each pharmacist a flier with a mnemonic for assessing stroke symptoms, briefly explained steps in the Stroke Chain of Survival, and noted if the pharmacist was available, listened to the entire presentation, read the information on the flier, agreed to post the flier, and if the pharmacist made any comments. The surveyor also assessed whether the Stroke Chain of Survival was new information to each pharmacist. All subjects read the information on the flier. Twenty-two (95.7%) listened to the entire presentation, and 23 (100%) were willing to post the flier. Two (11%) indicated that the parent company does not allow public posting of noncorporate information but agreed to post the flier internally. Twenty-one (91%) expressed appreciation for receiving the information. Seventeen (74%) indicated that the Stroke Chain of Survival was new information to them, 14 (61%) spontaneously remarked on the importance of the information, and 4 (17%) asked for additional information. Community pharmacists surveyed were willing to interface with the prehospital phase of the Stroke Chain of Survival; nearly 75% of them required education to do so. Community pharmacies are potentially a venue for educating the public on the Stroke Chain of Survival. It may be necessary to approach community pharmacy corporate leadership to partner with such efforts. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  19. The survival analysis on localized prostate cancer treated with neoadjuvant endocrine therapy followed by intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Gao Hong; Li Gaofeng; Wu Qinhong; Li Xuenan; Zhong Qiuzi; Xu Yonggang

    2010-01-01

    Objective: To retrospectively investigate clinical outcomes and prognostic factors in localized prostate cancer treated with neoadjuvant endocrine therapy followed by intensity modulated radiotherapy (IMRT). Methods: Between March 2003 and October 2008, 54 patients with localized prostate cancer treated by IMRT were recruited. All patients had received endocrine therapy before IMRT. The endocrine therapy included surgical castration or medical castration in combination with antiandrogens. The target of IMRT was the prostate and seminal vesicles with or without the pelvis. The biochemical failure was defined according to the Phoenix definition. By using the risk grouping standard proposed by D'Amico, patients were divided into three groups: low-risk group (n = 5), intermediate-risk group (n = 12), and high-risk group (n = 37). The Kaplan-Meier method was used to calculate the overall survival rate. Prognostic factors were analyzed by univariate and multiple Cox regression analysis. Results: The follow-up rate was 98%. The number of patients under follow-up was 39 at 3 years and 25 at 5 years. Potential prognostic factors, including risk groups, mode of endocrine therapy, time of endocrine therapy, Phoenix grouping before IMRT, the prostate specific antigen doubling time (PSADT) before radiotherapy, PSA value before IMRT, interval of endocrine therapy and IMRT, irradiation region, and irradiation dose were analyzed by survival analysis. In univariate analysis, time of endocrine therapy (75% vs 95%, χ² = 6.45, P = 0.011), Phoenix grouping before IMRT (87% vs 96%, χ² = 4.36, P = 0.037), interval of endocrine therapy and IMRT (80% vs 95%, χ² = 11.60, P = 0.001), and irradiation dose (75% vs 91%, χ² = 5.92, P = 0.015) were statistically significant prognostic factors for 3-year overall survival, and risk groups (85 vs 53 vs 29, χ² = 6.40, P = 0.041) and PSADT before IMRT (62 vs 120, U = 24.50, P = 0.003) were significant factors for the median survival time. In the multiple Cox

  20. Survival Impact of Adjuvant Radiation Therapy in Masaoka Stage II to IV Thymomas: A Systematic Review and Meta-analysis

    International Nuclear Information System (INIS)

    Lim, Yu Jin; Kim, Eunji; Kim, Hak Jae; Wu, Hong-Gyun; Yan, Jinchun; Liu, Qin; Patel, Shilpen

    2016-01-01

    Purpose: To evaluate the survival impact of postoperative radiation therapy (PORT) in stage II to IV thymomas, using systematic review and meta-analysis. Methods and Materials: A database search was conducted with EMBASE, PubMed, Web of Science, Cochrane Library, and Ovid from inception to August 2015. Thymic carcinomas were excluded, and studies comparing overall survival (OS) with and without PORT in thymomas were included. The hazard ratios (HRs) of OS were extracted, and a random-effects model was used in the pooled analysis. Results: Seven retrospective series with a total of 1724 patients were included and analyzed. Almost all of the patients underwent macroscopically complete resection, and thymoma histology was confirmed by the World Health Organization criteria. In the overall analysis of stage II to IV thymomas, OS was not altered with the receipt of PORT (HR 0.79, 95% confidence interval [CI] 0.58-1.08). Although PORT was not associated with survival difference in Masaoka stage II disease (HR 1.45, 95% CI 0.83-2.55), improved OS was observed with the addition of PORT in the discrete pooled analysis of stage III to IV (HR 0.63, 95% CI 0.40-0.99). Significant heterogeneity and publication bias were not found in the analyses. Conclusions: From the present meta-analysis of sole primary thymomas, we suggest the potential OS benefit of PORT in locally advanced tumors with macroscopically complete resection, but not in stage II disease. Further investigations with sufficient survival data are needed to establish detailed treatment indications.
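
    A hedged sketch of the random-effects pooling of hazard ratios described above (DerSimonian-Laird estimator, in Python); the seven per-study hazard ratios and confidence intervals below are invented placeholders, not the values extracted in the meta-analysis.

      import numpy as np

      hr = np.array([0.70, 1.10, 0.55, 0.85, 0.95, 0.60, 0.80])   # hypothetical study HRs
      lo = np.array([0.45, 0.70, 0.30, 0.55, 0.60, 0.35, 0.50])   # 95% CI lower bounds
      hi = np.array([1.09, 1.73, 1.01, 1.31, 1.50, 1.03, 1.28])   # 95% CI upper bounds

      y = np.log(hr)                                # pool on the log-HR scale
      se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
      w = 1.0 / se**2                               # inverse-variance (fixed-effect) weights

      # DerSimonian-Laird between-study variance tau^2
      q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
      tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
      pooled = np.sum(w_re * y) / np.sum(w_re)
      se_pooled = np.sqrt(1.0 / np.sum(w_re))
      print(f"pooled HR = {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-"
            f"{np.exp(pooled + 1.96 * se_pooled):.2f})")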

  1. Survival Impact of Adjuvant Radiation Therapy in Masaoka Stage II to IV Thymomas: A Systematic Review and Meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Yu Jin; Kim, Eunji [Department of Radiation Oncology, Seoul National University College of Medicine, Seoul (Korea, Republic of); Kim, Hak Jae, E-mail: khjae@snu.ac.kr [Department of Radiation Oncology, Seoul National University College of Medicine, Seoul (Korea, Republic of); Wu, Hong-Gyun [Department of Radiation Oncology, Seoul National University College of Medicine, Seoul (Korea, Republic of); Cancer Research Institute, Seoul National University College of Medicine, Seoul (Korea, Republic of); Institute of Radiation Medicine, Medical Research Center, Seoul National University, Seoul (Korea, Republic of); Yan, Jinchun [Department of Radiation Oncology, Dalian Medical University, Liaoning (China); Department of Radiation Oncology, Fudan University Cancer Hospital, Shanghai (China); Liu, Qin [The Wistar Institute, Philadelphia, Pennsylvania (United States); Patel, Shilpen [Department of Radiation Oncology, University of Washington Medical Center, Seattle, Washington (United States)

    2016-04-01

    Purpose: To evaluate the survival impact of postoperative radiation therapy (PORT) in stage II to IV thymomas, using systematic review and meta-analysis. Methods and Materials: A database search was conducted with EMBASE, PubMed, Web of Science, Cochrane Library, and Ovid from inception to August 2015. Thymic carcinomas were excluded, and studies comparing overall survival (OS) with and without PORT in thymomas were included. The hazard ratios (HRs) of OS were extracted, and a random-effects model was used in the pooled analysis. Results: Seven retrospective series with a total of 1724 patients were included and analyzed. Almost all of the patients underwent macroscopically complete resection, and thymoma histology was confirmed by the World Health Organization criteria. In the overall analysis of stage II to IV thymomas, OS was not altered with the receipt of PORT (HR 0.79, 95% confidence interval [CI] 0.58-1.08). Although PORT was not associated with survival difference in Masaoka stage II disease (HR 1.45, 95% CI 0.83-2.55), improved OS was observed with the addition of PORT in the discrete pooled analysis of stage III to IV (HR 0.63, 95% CI 0.40-0.99). Significant heterogeneity and publication bias were not found in the analyses. Conclusions: From the present meta-analysis of sole primary thymomas, we suggest the potential OS benefit of PORT in locally advanced tumors with macroscopically complete resection, but not in stage II disease. Further investigations with sufficient survival data are needed to establish detailed treatment indications.

  2. Nursing diagnoses in children with congenital heart disease: a survival analysis.

    Science.gov (United States)

    Martins da Silva, Viviane; Lopes, Marcos Venícios de Oliveira; Leite de Araujo, Thelma

    2007-01-01

    To analyze the relationship between nursing diagnoses and survival rates in children with congenital heart disease. A total of 270 observations were carried out in 45 children with congenital heart disease who were followed for 15 days. Differences in mean survival times were identified in children not more than 4 months of age with respect to the following diagnoses: impaired gas exchange, ineffective breathing pattern, activity intolerance, delayed growth and development, and decreased cardiac output. The main diagnoses are identified early in the hospitalization period and are conditions resulting from hemodynamic alterations and prescribed medical treatment. Congenital heart disease provokes serious hemodynamic alterations that generate human responses, which should be treated proactively.

  3. A Survival Analysis of Patients with Malignant Biliary Strictures Treated by Percutaneous Metallic Stenting

    International Nuclear Information System (INIS)

    Brountzos, Elias N.; Ptochis, Nikolaos; Panagiotou, Irene; Malagari, Katerina; Tzavara, Chara; Kelekis, Dimitrios

    2007-01-01

    Background. Percutaneous metal stenting is an accepted palliative treatment for malignant biliary obstruction. Nevertheless, factors predicting survival are not known. Methods. Seventy-six patients with inoperable malignant biliary obstruction were treated with percutaneous placement of metallic stents. Twenty patients had non-hilar lesions. Fifty-six patients had hilar lesions classified as Bismuth type I (n = 15 patients), type II (n = 26), type III (n = 12), or type IV (n = 3 patients). Technical and clinical success rates, complications, and long-term outcome were recorded. Clinical success rates, patency, and survival rates were compared in patients treated with complete (n = 41) versus partial (n = 35) liver parenchyma drainage. Survival was calculated and analyzed for potential predictors such as the tumor type, the extent of the disease, the level of obstruction, and the post-intervention bilirubin levels. Results. Stenting was technically successful in all patients (unilateral drainage in 70 patients, bilateral drainage in 6 patients) with an overall significant reduction of the post-intervention bilirubin levels (p < 0.001), resulting in a clinical success rate of 97.3%. Clinical success rates were similar in patients treated with whole-liver drainage versus partial liver drainage. Minor and major complications occurred in 8% and 15% of patients, respectively. Mean overall primary stent patency was 120 days, while the restenosis rate was 12%. Mean overall secondary stent patency was 242.2 days. Patency rates were similar in patients with complete versus partial liver drainage. Mean overall survival was 142.3 days. Survival was similar in the complete and partial drainage groups. The post-intervention serum bilirubin level was an independent predictor of survival (p < 0.001). A cut-off point in post-stenting bilirubin levels of 4 mg/dl dichotomized patients with good versus poor prognosis. Patient age and Bismuth IV lesions were also independent predictors

  4. Analysis of survival curves for Rhizopus, Mucor and Penicillia irradiated with gamma radiation

    International Nuclear Information System (INIS)

    Ishiguro, Etsuji; Danno, Akibumi; Miyazato, Mitsuru

    1994-01-01

    This study aimed to characterize the survival patterns of microorganisms treated by γ-sterilization. Although most previous work has been reported in terms of D10-values, the γ-irradiation survival curves observed here were sigmoid, and the L-values reached about half of the D10-value for each strain. It was further confirmed that if L-values were used together with D10-values for practical sterilization, the estimation of sterilization levels would become more accurate. (author). 10 refs., 3 figs., 1 tab
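
    For reference, the decimal reduction dose D10 used above is conventionally defined from the log-linear portion of a survival curve (a standard definition; the sigmoid curves reported here deviate from it in the shoulder region), in LaTeX notation:

      \[
        \log_{10}\frac{N(D)}{N_0} = -\frac{D}{D_{10}}
        \qquad\Longrightarrow\qquad
        D_{10} = \frac{D}{\log_{10}\bigl(N_0/N(D)\bigr)},
      \]
      % where N_0 is the initial viable count and N(D) the count surviving dose D.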

  5. Analysis of 5 year survival of esophageal cancer treated by radiotherapy

    International Nuclear Information System (INIS)

    Takegawa, Yoshihiro; Ohgushi, Ikuyo; Hiraki, Yoshio; Honke, Yoshifumi; Matsuki, Tsutomu; Yokoyama, Takashi; Yoshida, Mineo.

    1987-01-01

    Since 1984, a total of 1,419 patients with carcinoma of the esophagus were treated at the Department of Radiology of 17 hospitals in Chugoku-Shikoku province. The five-year survival rate was 7.3% (42/578 cases). Thirty-nine out of the forty-two cases were analyzed according to the tumor extent, localization and types of the X-ray findings. In addition, 95 patients (67 had been reported in other journals and 28 in this report) who have survived more than 5 years after radical radiotherapy were analyzed. (author)

  6. Radiation therapy improves survival in rectal small cell cancer - Analysis of Surveillance Epidemiology and End Results (SEER) data.

    Science.gov (United States)

    Modrek, Aram S; Hsu, Howard C; Leichman, Cynthia G; Du, Kevin L

    2015-04-24

    Small cell carcinoma of the rectum is a rare neoplasm with scant literature to guide treatment. We used the Surveillance Epidemiology and End Results (SEER) database to investigate the role of radiation therapy in the treatment of this cancer. The SEER database (National Cancer Institute) was queried for locoregional cases of small cell rectal cancer. Years of diagnosis were limited to 1988-2010 (most recent available) to reduce variability in staging criteria or longitudinal changes in surgery and radiation techniques. Two month conditional survival was applied to minimize bias by excluding patients who did not survive long enough to receive cancer-directed therapy. Patient demographics between the RT and No_RT groups were compared using Pearson Chi-Square tests. Overall survival was compared between patients who received radiotherapy (RT, n = 43) and those who did not (No_RT, n = 28) using the Kaplan-Meier method. Multivariate Cox proportional hazards model was used to evaluate important covariates. Median survival was significantly longer for patients who received radiation compared to those who were not treated with radiation; 26 mo vs. 8 mo, respectively (log-rank P = 0.009). We also noted a higher 1-year overall survival rate for those who received radiation (71.1% vs. 37.8%). Unadjusted hazard ratio for death (HR) was 0.495 with the use of radiation (95% CI 0.286-0.858). Among surgery, radiotherapy, sex and age at diagnosis, radiation therapy was the only significant factor for overall survival with a multivariate HR for death of 0.393 (95% CI 0.206-0.750, P = 0.005). Using SEER data, we have identified a significant survival advantage with the use of radiation therapy in the setting of rectal small cell carcinoma. Limitations of the SEER data apply to this study, particularly the lack of information on chemotherapy usage. Our findings strongly support the use of radiation therapy for patients with locoregional small cell rectal cancer.

  7. OASIS 2: online application for survival analysis 2 with features for the analysis of maximal lifespan and healthspan in aging research.

    Science.gov (United States)

    Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk

    2016-08-30

    Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.

  8. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse circumstances of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries

  9. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  10. Somatic mutation load of estrogen receptor-positive breast tumors predicts overall survival: an analysis of genome sequence data.

    Science.gov (United States)

    Haricharan, Svasti; Bainbridge, Matthew N; Scheet, Paul; Brown, Powel H

    2014-07-01

    Breast cancer is one of the most commonly diagnosed cancers in women. While there are several effective therapies for breast cancer and important single gene prognostic/predictive markers, more than 40,000 women die from this disease every year. The increasing availability of large-scale genomic datasets provides opportunities for identifying factors that influence breast cancer survival in smaller, well-defined subsets. The purpose of this study was to investigate the genomic landscape of various breast cancer subtypes and its potential associations with clinical outcomes. We used statistical analysis of sequence data generated by the Cancer Genome Atlas initiative including somatic mutation load (SML) analysis, Kaplan-Meier survival curves, gene mutational frequency, and mutational enrichment evaluation to study the genomic landscape of breast cancer. We show that ER(+), but not ER(-), tumors with high SML associate with poor overall survival (HR = 2.02). Further, these high mutation load tumors are enriched for coincident mutations in both DNA damage repair and ER signature genes. While it is known that somatic mutations in specific genes affect breast cancer survival, this study is the first to identify that SML may constitute an important global signature for a subset of ER(+) tumors prone to high mortality. Moreover, although somatic mutations in individual DNA damage genes affect clinical outcome, our results indicate that coincident mutations in DNA damage response and signature ER genes may prove more informative for ER(+) breast cancer survival. Next generation sequencing may prove an essential tool for identifying pathways underlying poor outcomes and for tailoring therapeutic strategies.

  11. Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques

    Science.gov (United States)

    Ramers, Douglas L.

    2005-01-01

    pressurizing the bottle on a test stand, and running sweeps of excitation frequencies for each of the piezo sensors and recording the resulting impedance. The sweeps were limited to 401 points by the available analyzer, and it was decided to perform individual sweeps at five different excitation frequency ranges. The frequency ranges used for the PZTs were different in two of the five ranges from the ranges used for the SCP. The bottles were pressurized to empty (no water), 0 psig, 77 psig, 155 psig, and 227 psig, in nearly uniform increments of about 77 psi. One of each of the two types of piezo sensors was fastened onto the bottle surface at two locations: about midway between the ends on the cylindrical portion of the bottle and at the very edge of one of the end domes. The data were collected in files by sensor type (2 cases), by location (2 cases), by frequency range (5 cases), and by pressure (5 cases) to produce 100 data sets of 401 impedances. After familiarization with the piezo sensing technology and obtaining the data, the team developed a set of questions to try to answer regarding the data and made assignments of responsibilities. The next section lists the questions, and the remainder of the report describes the data analysis work performed by Dr. Ramers. This includes a discussion of the data, the approach to answering the questions using statistical techniques, the use of an emergent system to investigate the data where statistical techniques were not usable, conclusions regarding the data, and recommendations.

  12. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what product customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we conduct a systematic approach to explore several of data mining techniques in business application. The experimental result reveals that all data mining techniques accomplish their goals perfectly, but each of the technique has its own characteristics and specification that demonstrate their accuracy, proficiency and preference.

  13. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Full Text Available Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting proper specific exercises for a target motor ability in judo, it is necessary first to study the structure of specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a starting point for selecting the complex of specific exercises that produces the greatest effect. In addition to developing particular muscle groups, the means of specific preparation affect the development of those motor abilities evaluated as indispensable for the qualities characteristic of judo. This paper analyses the relationship between judo techniques and specific motor abilities.

  14. Teaching Community Survival Skills to Mentally Retarded Adults: A Review and Analysis.

    Science.gov (United States)

    Martin, James E.; And Others

    1982-01-01

    The article reviews research on training mentally retarded adults in the following community survival skills: travel training, money management, meal preparation, clothing and personal care, telephone skill, housekeeping, self-medication, leisure skills, social skills, and conversation. Results are said to indicate the value of behavioral…

  15. Effects of temperature on development, survival and reproduction of insects: Experimental design, data analysis and modeling

    Science.gov (United States)

    Jacques Regniere; James Powell; Barbara Bentz; Vincent Nealis

    2012-01-01

    The developmental response of insects to temperature is important in understanding the ecology of insect life histories. Temperature-dependent phenology models permit examination of the impacts of temperature on the geographical distributions, population dynamics and management of insects. The measurement of insect developmental, survival and reproductive responses to...

  16. Fuselage Burnthrough Protection for Increased Postcrash Occupant Survivability: Safety Benefit Analysis Based on Past Accidents

    National Research Council Canada - National Science Library

    Cherry, Ray

    1999-01-01

    .... The methodology gives a reasonable assessment of the tolerance on the predicted levels. Fire hardening of fuselages will provide benefits in terms of enhanced occupant survival and may be found to be cost beneficial if low-cost solutions can be found...

  17. Survivalism and Public Opinion on Criminality: A Cross-National Analysis of Prostitution

    Science.gov (United States)

    Stack, Steven; Adamczyk, Amy; Cao, Liqun

    2010-01-01

    Explanations of variability in public opinion on crime have drawn disproportionately from the literature on specific symbolic orientations including religious fundamentalism and racial prejudice. In contrast, this article hypothesizes that public opinion is linked to the strength of a general cultural axis of nations: survivalism vs.…

  18. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    Science.gov (United States)

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

    Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome-underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
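
    A hedged sketch of the comparison described above, simulating Weibull survival times with censoring and fitting both an accelerated failure time model and a Cox proportional hazards model; it uses the Python lifelines package rather than the SAS procedures (LIFEREG, PHREG) named in the abstract, and all coefficients and variable names are invented.

      import numpy as np
      import pandas as pd
      from lifelines import WeibullAFTFitter, CoxPHFitter

      rng = np.random.default_rng(1)
      n = 500
      x = rng.integers(0, 2, n)                   # treatment indicator (simulated)
      m = 0.5 * x + rng.normal(size=n)            # mediator influenced by treatment
      # Weibull times whose scale depends on treatment and mediator (AFT form)
      scale = np.exp(2.0 + 0.3 * x + 0.4 * m)
      t = scale * rng.weibull(1.5, n)
      c = rng.exponential(np.median(t) * 2, n)    # independent censoring times
      df = pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int), "x": x, "m": m})

      WeibullAFTFitter().fit(df, duration_col="T", event_col="E").print_summary()
      CoxPHFitter().fit(df, duration_col="T", event_col="E").print_summary()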

  19. Multiparametric analysis of magnetic resonance images for glioma grading and patient survival time prediction

    International Nuclear Information System (INIS)

    Garzon, Benjamin; Emblem, Kyrre E.; Mouridsen, Kim; Nedregaard, Baard; Due-Toennessen, Paulina; Nome, Terje; Hald, John K.; Bjoernerud, Atle; Haaberg, Asta K.; Kvinnsland, Yngve

    2011-01-01

    Background. A systematic comparison of magnetic resonance imaging (MRI) options for glioma diagnosis is lacking. Purpose. To investigate multiple MR-derived image features with respect to diagnostic accuracy in tumor grading and survival prediction in glioma patients. Material and Methods. T1 pre- and post-contrast, T2 and dynamic susceptibility contrast scans of 74 glioma patients with histologically confirmed grade were acquired. For each patient, a set of statistical features was obtained from the parametric maps derived from the original images, in a region-of-interest encompassing the tumor volume. A forward stepwise selection procedure was used to find the best combinations of features for grade prediction with a cross-validated logistic model and survival time prediction with a cox proportional-hazards regression. Results. Presence/absence of enhancement paired with kurtosis of the FM (first moment of the first-pass curve) was the feature combination that best predicted tumor grade (grade II vs. grade III-IV; median AUC 0.96), with the main contribution being due to the first of the features. A lower predictive value (median AUC = 0.82) was obtained when grade IV tumors were excluded. Presence/absence of enhancement alone was the best predictor for survival time, and the regression was significant (P < 0.0001). Conclusion. Presence/absence of enhancement, reflecting transendothelial leakage, was the feature with highest predictive value for grade and survival time in glioma patients
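
    A hedged sketch of forward stepwise feature selection with a cross-validated logistic model, as described above, using scikit-learn on random placeholder features (not the authors' software or the MR-derived features themselves):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(7)
      X = rng.normal(size=(74, 10))        # 74 patients x 10 image-derived features (random here)
      y = np.array([0, 1] * 37)            # toy labels: 1 = high grade, 0 = low grade

      selected, remaining, best_auc = [], list(range(X.shape[1])), 0.0
      while remaining:
          scores = {f: cross_val_score(LogisticRegression(max_iter=1000),
                                       X[:, selected + [f]], y,
                                       cv=5, scoring="roc_auc").mean()
                    for f in remaining}
          f_best, auc = max(scores.items(), key=lambda kv: kv[1])
          if auc <= best_auc:              # stop when no candidate improves the CV AUC
              break
          selected.append(f_best)
          remaining.remove(f_best)
          best_auc = auc

      print("selected features:", selected, "cross-validated AUC:", round(best_auc, 3))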

  20. Statistical Analysis of Competing Risks: Overall Survival in a Group of Chronic Myeloid Leukemia Patients

    Czech Academy of Sciences Publication Activity Database

    Fürstová, Jana; Valenta, Zdeněk

    2011-01-01

    Roč. 7, č. 1 (2011), s. 2-10 ISSN 1801-5603 Institutional research plan: CEZ:AV0Z10300504 Keywords : competing risks * chronic myeloid leukemia (CML) * overall survival * cause-specific hazard * cumulative incidence function Subject RIV: IN - Informatics, Computer Science http://www.ejbi.eu/images/2011-1/Furstova_en.pdf
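
    For reference, the cause-specific hazard and cumulative incidence function named in the keywords above are conventionally defined as follows (standard competing-risks quantities, not formulas taken from the paper), in LaTeX notation:

      \[
        \lambda_k(t) = \lim_{\Delta t \to 0}
          \frac{\Pr\bigl(t \le T < t+\Delta t,\ \text{cause}=k \mid T \ge t\bigr)}{\Delta t},
        \qquad
        F_k(t) = \int_0^t S(u^-)\,\lambda_k(u)\,du,
      \]
      % where S(t) is the overall (all-cause) survival function.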

  1. SNP-SNP interaction analysis of NF-κB signaling pathway on breast cancer survival

    DEFF Research Database (Denmark)

    Jamshidi, Maral; Fagerholm, Rainer; Khan, Sofia

    2015-01-01

    of SNP pairs without and with an interaction term. We found two interacting pairs associating with prognosis: patients simultaneously homozygous for the rare alleles of rs5996080 and rs7973914 had worse survival (HRinteraction 6.98, 95% CI=3.3-14.4, P=1.42E-07), and patients carrying at least one rare...

  2. Multiparametric analysis of magnetic resonance images for glioma grading and patient survival time prediction

    Energy Technology Data Exchange (ETDEWEB)

    Garzon, Benjamin (Dept. of Circulation and Medical Imaging, NTNU, Trondheim (Norway)), email: benjamin.garzon@ntnu.no; Emblem, Kyrre E. (The Interventional Center, Rikshospitalet, Oslo Univ. Hospital, Oslo (Norway); Dept. of Radiology, MGH-HST AA Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States)); Mouridsen, Kim (Center of Functionally Integrative Neuroscience, Aarhus Univ., Aarhus (Denmark)); Nedregaard, Baard; Due-Toennessen, Paulina; Nome, Terje; Hald, John K. (Dept. of Radiology and Nuclear Medicine, Rikshospitalet, Oslo Univ. Hospital, Oslo (Norway)); Bjoernerud, Atle (The Interventional Center, Rikshospitalet, Oslo Univ. Hospital, Oslo (Norway)); Haaberg, Asta K. (Dept. of Circulation and Medical Imaging, NTNU, Trondheim (Norway); Dept. of Medical Imaging, St Olav' s Hospital, Trondheim (Norway)); Kvinnsland, Yngve (NordicImagingLab, Bergen (Norway))

    2011-11-15

    Background. A systematic comparison of magnetic resonance imaging (MRI) options for glioma diagnosis is lacking. Purpose. To investigate multiple MR-derived image features with respect to diagnostic accuracy in tumor grading and survival prediction in glioma patients. Material and Methods. T1 pre- and post-contrast, T2 and dynamic susceptibility contrast scans of 74 glioma patients with histologically confirmed grade were acquired. For each patient, a set of statistical features was obtained from the parametric maps derived from the original images, in a region-of-interest encompassing the tumor volume. A forward stepwise selection procedure was used to find the best combinations of features for grade prediction with a cross-validated logistic model and survival time prediction with a cox proportional-hazards regression. Results. Presence/absence of enhancement paired with kurtosis of the FM (first moment of the first-pass curve) was the feature combination that best predicted tumor grade (grade II vs. grade III-IV; median AUC 0.96), with the main contribution being due to the first of the features. A lower predictive value (median AUC = 0.82) was obtained when grade IV tumors were excluded. Presence/absence of enhancement alone was the best predictor for survival time, and the regression was significant (P < 0.0001). Conclusion. Presence/absence of enhancement, reflecting transendothelial leakage, was the feature with highest predictive value for grade and survival time in glioma patients

  3. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Abstract Credit scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Amidst the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficient and accurate nature and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed, as they are able to perform computation...

  4. Solitary plasmacytoma: population-based analysis of survival trends and effect of various treatment modalities in the USA.

    Science.gov (United States)

    Thumallapally, Nishitha; Meshref, Ahmed; Mousa, Mohammed; Terjanian, Terenig

    2017-01-05

    Solitary plasmacytoma (SP) is a localized neoplastic plasma cell disorder with an annual incidence of less than 450 cases. Given the rarity of this disorder, it is difficult to conduct large-scale population studies. Consequently, very limited information on the disorder is available, making it difficult to estimate the incidence and survival rates. Furthermore, limited information is available on the efficacy of various treatment modalities in relation to primary tumor sites. The data for this retrospective study were drawn from the Surveillance, Epidemiology and End Results (SEER) database, which comprises 18 registries; patient demographics, treatment modalities and survival rates were obtained for those diagnosed with SP from 1998 to 2007. Various prognostic factors were analyzed via Kaplan-Meier analysis and log-rank test, with 5-year relative survival rate defined as the primary outcome of interest. Cox regression analysis was employed in the multivariate analysis. The SEER search from 1998 to 2007 yielded records for 1691 SP patients. The median age at diagnosis was 63 years. The patient cohort was 62.4% male, 37.6% female, 80% Caucasian, 14.6% African American and 5.4% other races. Additionally, 57.8% had osseous plasmacytoma, and 31.9% had extraosseous involvement. Unspecified plasmacytoma was noted in 10.2% of patients. The most common treatment modalities were radiotherapy (RT) (48.8%), followed by combination surgery with RT (21.2%) and surgery alone (11.6%). Univariate analysis of prognostic factors revealed that the survival outcomes were better for younger male patients who received RT with surgery (p multiple myeloma (MM) was noted in 551 patients. Age >60 years was associated with a lower 5-year survival in patients who progressed to MM compared to those who were diagnosed initially with MM (15.1 vs 16.6%). Finally, those who received RT and progressed to MM still had a higher chance of survival than those who were diagnosed with MM initially and

  5. [Analysis of clinicopathologic and survival characteristics in patients with right- or left-sided colon cancer].

    Science.gov (United States)

    Hu, Junjie; Zhou, Zhixiang; Liang, Jianwei; Zhou, Haitao; Wang, Zheng; Zhang, Xingmao; Zeng, Weigen

    2015-07-28

    This study aimed to clarify the clinical and histological parameters and the survival difference between right- and left-sided colon cancer. We retrospectively analyzed the medical records (2006.1-2009.12) of 1,088 consecutive colon cancer patients who received surgery at our hospital. Right- and left-sided colon cancers were compared regarding clinical and histological parameters. The survival analysis was performed by the Kaplan-Meier method, and the log-rank test was used to determine the statistical significance of differences. Right-sided colon cancer was associated with older age and a more advanced stage; poorly differentiated and undifferentiated adenocarcinoma (25.2% vs 13.2%), mucinous adenocarcinoma (33.5% vs 17.3%) and vascular invasion (9.9% vs 3.9%) were more commonly seen in right-sided than in left-sided colon cancer, and all these differences were statistically significant. Median overall survival was 67 months for right-sided and 68 months for left-sided tumors. The five-year overall survival of right- vs left-sided colon cancer was: stage I/II, 91.4% vs 88.6% (P = 0.819); stage III, 66.1% vs 75.4% (P = 0.010); and stage IV, 27.8% vs 38.5% (P = 0.020). Right- and left-sided colon cancers are significantly different regarding clinical and histological parameters. Right-sided colon cancers in stage III and IV have a worse prognosis.

  6. Intraoperative radiotherapy combined with resection for pancreatic cancer. Analysis of survival rates and prognostic factors

    International Nuclear Information System (INIS)

    Kuga, Hirotaka; Nishihara, Kazuyoshi; Matsunaga, Hiroaki; Suehara, Nobuhiro; Abe, Yuji; Ihara, Takaaki; Iwashita, Toshimitsu; Mitsuyama, Shoshu

    2006-01-01

    The purpose of this study was to evaluate the efficacy of intraoperative radiotherapy (IORT) combined with surgical resection. Subjects were 69 consecutive patients with pancreatic cancer treated with surgery alone (n=31) or surgical resection combined with IORT (n=38) over the 13-year period between 1991 and 2003. We retrospectively evaluated the effects of IORT on local recurrence of cancer and on patient survival. Furthermore, clinicopathological factors affecting the 5-year survival rate in the two groups were comparatively investigated. The IORT group showed a significantly lower local recurrence rate than the surgery-alone group (7.8% and 22.6%, respectively; p<0.05). The 5-year survival probability in the IORT group was significantly higher than that in the surgery-alone group (29.9% and 3.4%, respectively; p<0.05). According to the Japanese classification of pancreatic cancer, cancer located in the pancreas body or tail, no local residual cancer after the operative procedure (R0), low-grade local cancer progression (t1, 2), and low-grade intrapancreatic neural invasion (ne0, 1) were significantly better prognostic factors in the IORT group than in the surgery-alone group. There were no significant differences between the two groups in the 5-year survival rate in terms of the sex of the patients, cancer of the pancreas head, histological type, more than R1, the presence of lymph node involvement, ne2-3, and clinical stage. IORT is a useful intraoperative adjuvant therapy for pancreatic cancer when curative resection is achieved. Our data suggest that IORT suppresses the local recurrence of cancer and provides a significant survival benefit for these patients. (author)

  7. Dedifferentiated chondrosarcoma: A survival analysis of 159 cases from the SEER database (2001-2011).

    Science.gov (United States)

    Strotman, Patrick K; Reif, Taylor J; Kliethermes, Stephanie A; Sandhu, Jasmin K; Nystrom, Lukas M

    2017-08-01

    Dedifferentiated chondrosarcoma is a rare malignancy with reported 5-year overall survival rates ranging from 7% to 24%. The purpose of this investigation is to determine the overall survival of dedifferentiated chondrosarcoma in a modern patient series and how it is impacted by patient demographics, tumor characteristics, and surgical treatment factors. This is a retrospective review of the Surveillance, Epidemiology, and End Results (SEER) database from 2001 to 2011. Kaplan Meier analyses were used for overall and disease-specific survival. Univariable and multivariable cox regression models were used to identify prognostic factors. Five year overall- and disease-specific survival was 18% (95% CI: 12-26%) and 28% (95% CI: 18-37%), respectively. Individuals with extremity tumors had a worse prognosis than individuals with a primary tumor in the chest wall or axial skeleton (HR 0.20, 95% CI: 0.07-0.56; P = 0.002 and HR 0.60, 95% CI: 0.36-0.99; P = 0.04, respectively). Patients with AJCC stage III or IV disease (HR 2.51, 95% CI: 1.50-4.20; P = 0.001), tumors larger than 8 cm (HR 2.17, 95% CI: 1.11-4.27; P = 0.046), metastatic disease at diagnosis (HR 3.25, 95% CI: 1.98-5.33; P chondrosarcoma is poor with a 5-year overall survival of 18%. Patients with a primary tumor located in the chest wall had a better prognosis. Tumors larger than 8 cm, presence of metastases at diagnosis, and treatment without surgical resection were significant predictors of mortality. © 2017 Wiley Periodicals, Inc.

  8. MUMAL: Multivariate an