WorldWideScience

Sample records for significantly improved predictions

  1. Model training across multiple breeding cycles significantly improves genomic prediction accuracy in rye (Secale cereale L.).

    Science.gov (United States)

    Auinger, Hans-Jürgen; Schönleben, Manfred; Lehermeier, Christina; Schmidt, Malthe; Korzun, Viktor; Geiger, Hartwig H; Piepho, Hans-Peter; Gordillo, Andres; Wilde, Peer; Bauer, Eva; Schön, Chris-Carolin

    2016-11-01

    Genomic prediction accuracy can be significantly increased by model calibration across multiple breeding cycles as long as selection cycles are connected by common ancestors. In hybrid rye breeding, application of genome-based prediction is expected to increase selection gain because of long selection cycles in population improvement and development of hybrid components. Essentially two prediction scenarios arise: (1) prediction of the genetic value of lines from the same breeding cycle in which model training is performed and (2) prediction of lines from subsequent cycles. It is the latter from which a reduction in cycle length and consequently the strongest impact on selection gain is expected. We empirically investigated genome-based prediction of grain yield, plant height and thousand kernel weight within and across four selection cycles of a hybrid rye breeding program. Prediction performance was assessed using genomic and pedigree-based best linear unbiased prediction (GBLUP and PBLUP). A total of 1040 S2 lines were genotyped with 16k SNPs and each year testcrosses of 260 S2 lines were phenotyped in seven or eight locations. The performance gap between GBLUP and PBLUP increased significantly for all traits when model calibration was performed on aggregated data from several cycles. Prediction accuracies obtained from cross-validation were in the order of 0.70 for all traits when data from all cycles (N_CS = 832) were used for model training and exceeded within-cycle accuracies in all cases. As long as selection cycles are connected by a sufficient number of common ancestors and prediction accuracy has not reached a plateau when increasing sample size, aggregating data from several preceding cycles is recommended for predicting genetic values in subsequent cycles despite decreasing relatedness over time.
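GBLUP as described above amounts to best linear unbiased prediction with a marker-derived genomic relationship matrix. A minimal sketch on synthetic data (all numbers invented, not from the study; the variance ratio `lam` is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 50 lines genotyped at 200 SNP markers coded 0/1/2
n, m = 50, 200
Z = rng.integers(0, 3, size=(n, m)).astype(float)
Z -= Z.mean(axis=0)                          # center each marker column
G = Z @ Z.T / m                              # genomic relationship matrix

beta = rng.normal(0.0, 0.1, size=m)          # simulated marker effects
y = Z @ beta + rng.normal(0.0, 0.5, size=n)  # simulated testcross phenotypes

# GBLUP: breeding values g_hat = G (G + lambda I)^{-1} (y - mean(y)),
# where lambda is the residual-to-genetic variance ratio (set to 1 here)
lam = 1.0
gebv = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())

# In-sample "accuracy": correlation between predicted and observed values
acc = float(np.corrcoef(gebv, y - y.mean())[0, 1])
```

In the study, accuracy is instead estimated by cross-validation, holding out lines (or whole cycles) from the training set before solving the mixed-model equations.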

  2. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris Hanley; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    field monitoring. Vibration prediction diminishes the importance of trial-and-error procedures such as drill-off tests, which are valid only for short sections. It also closes an existing gap in Mechanical Specific Energy (MSE) real-time drilling control programs applying the theory of Teale, which states that a drilling system is perfectly efficient when it spends exactly the energy needed to overcome the in-situ rock strength. Using the proprietary software tool, this paper examines the resonant vibration modes that may be initiated while drilling with different BHAs and drill string designs, showing that the combination of a proper BHA design and the correct selection of input parameters results in an overall improvement in drilling efficiency. In addition, because the BHA is analyzed predictively, the potential for vibration or stress fatigue in the drill string components is reduced, leading to a safer operation. In recent years there has been an increased focus on vibration detection, analysis, and mitigation techniques, where new technologies, such as Drilling Dynamics Data Recorders (DDDR), may provide the capability to capture high-frequency dynamics data at multiple points along the drilling system. These tools allow drilling performance improvements not possible before, opening a whole new array of opportunities for optimization and for verification of predictions calculated by the drill string dynamics modeling software tool. The results of this study identify how the dynamics of the drilling system, interacting with the formation, relate directly to inefficiencies and to possible solutions for mitigating drilling vibrations in order to improve drilling performance. Software vibration prediction and downhole measurements can also be used for non-drilling operations such as drilling out casing or reaming, where extremely high vibration levels, devastating to the cutting structure of the bit before it has even touched bottom, have

  3. Lipoprotein metabolism indicators improve cardiovascular risk prediction.

    Directory of Open Access Journals (Sweden)

    Daniël B van Schalkwijk

    Full Text Available BACKGROUND: Cardiovascular disease risk increases when lipoprotein metabolism is dysfunctional. We have developed a computational model able to derive indicators of lipoprotein production, lipolysis, and uptake processes from a single lipoprotein profile measurement. This is the first study to investigate whether lipoprotein metabolism indicators can improve cardiovascular risk prediction and therapy management. METHODS AND RESULTS: We calculated lipoprotein metabolism indicators for 1981 subjects (145 cases, 1836 controls) from the Framingham Heart Study offspring cohort in which NMR lipoprotein profiles were measured. We applied a statistical learning algorithm using a support vector machine to select conventional risk factors and lipoprotein metabolism indicators that contributed to predicting risk for general cardiovascular disease. Risk prediction was quantified by the change in the Area-Under-the-ROC-Curve (ΔAUC) and by risk reclassification (Net Reclassification Improvement (NRI) and Integrated Discrimination Improvement (IDI)). Two VLDL lipoprotein metabolism indicators (VLDLE and VLDLH) improved cardiovascular risk prediction. We added these indicators to a multivariate model with the best performing conventional risk markers. Our method significantly improved both CVD prediction and risk reclassification. CONCLUSIONS: Two calculated VLDL metabolism indicators significantly improved cardiovascular risk prediction. These indicators may help to reduce prescription of unnecessary cholesterol-lowering medication, reducing costs and possible side-effects. For clinical application, further validation is required.
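The ΔAUC criterion used above can be computed directly from the Mann-Whitney formulation of the AUC. A pure-Python sketch with made-up risk scores (not the Framingham data):

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy cohort: 1 = cardiovascular event, 0 = no event
labels = [1, 1, 1, 0, 0, 0, 0, 0]
base_scores = [0.8, 0.4, 0.6, 0.5, 0.3, 0.2, 0.7, 0.1]  # conventional risk factors only
extended    = [0.9, 0.6, 0.7, 0.4, 0.3, 0.2, 0.5, 0.1]  # + VLDL metabolism indicators

delta_auc = auc(extended, labels) - auc(base_scores, labels)
```

A positive `delta_auc` indicates the extended model ranks cases above controls more often than the base model does.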

  4. Survival prediction algorithms miss significant opportunities for improvement if used for case selection in trauma quality improvement programs.

    Science.gov (United States)

    Heim, Catherine; Cole, Elaine; West, Anita; Tai, Nigel; Brohi, Karim

    2016-09-01

    Quality improvement (QI) programs have been shown to reduce preventable mortality in trauma care. Detailed review of all trauma deaths is a time- and resource-consuming process, and the calculated probability of survival (Ps) has been proposed as an audit filter, limiting review to deaths that were 'expected to survive'. However, no Ps-based algorithm has been validated, and no study has examined elements of preventability associated with deaths classified as 'expected'. The objective of this study was to examine whether trauma performance review can be streamlined using existing mortality prediction tools without missing important areas for improvement. We conducted a retrospective study of all trauma deaths reviewed by our trauma QI program. Deaths were classified as non-preventable, possibly preventable, probably preventable or preventable. Opportunities for improvement (OPIs) involve failures in the process of care and were classified into clinical and system deviations from standards of care. TRISS and Ps were used for calculation of probability of survival. Peer-review charts were reviewed by a single investigator. Over 8 years, 626 patients were included. One-third showed elements of preventability and 4% were preventable. Preventability occurred across the entire range of the calculated Ps band. Limiting review to unexpected deaths would have missed over 50% of all preventability issues and a third of preventable deaths. 37% of patients showed OPIs. Neither TRISS nor Ps allowed for reliable identification of OPIs, and limiting peer-review to patients with unexpected deaths would have missed close to 60% of all issues in care. TRISS and Ps fail to identify a significant proportion of avoidable deaths and miss important opportunities for process and system improvement. Based on this, all trauma deaths should be subjected to expert panel review in order to maximize the output of performance improvement programs.
Copyright © 2016 Elsevier

  5. Physiologically-based, predictive analytics using the heart-rate-to-Systolic-Ratio significantly improves the timeliness and accuracy of sepsis prediction compared to SIRS.

    Science.gov (United States)

    Danner, Omar K; Hendren, Sandra; Santiago, Ethel; Nye, Brittany; Abraham, Prasad

    2017-04-01

    Enhancing the efficiency of diagnosis and treatment of severe sepsis by using physiologically-based, predictive analytical strategies has not been fully explored. We hypothesized that assessment of the heart-rate-to-systolic ratio significantly increases the timeliness and accuracy of sepsis prediction after emergency department (ED) presentation. We evaluated the records of 53,313 ED patients from a large, urban teaching hospital between January and June 2015. The HR-to-systolic ratio was compared to SIRS criteria for sepsis prediction. There were 884 patients with discharge diagnoses of sepsis, severe sepsis, and/or septic shock. Variations in three presenting variables, heart rate, systolic BP and temperature, were determined to be primary early predictors of sepsis with a 74% (654/884) accuracy compared to 34% (304/884) using SIRS criteria (p < 0.0001) in confirmed septic patients. Physiologically-based predictive analytics improved the accuracy and expediency of sepsis identification via detection of variations in the HR-to-systolic ratio. This approach may lead to earlier sepsis workup and life-saving interventions. Copyright © 2017 Elsevier Inc. All rights reserved.
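The screening variable itself is a one-line computation. A sketch of how it might be used as a flag (the cutoff value is purely illustrative; the abstract does not state the threshold the authors used):

```python
def hr_to_systolic_ratio(heart_rate, systolic_bp):
    """Heart rate (bpm) divided by systolic blood pressure (mmHg)."""
    return heart_rate / systolic_bp

def flag_possible_sepsis(heart_rate, systolic_bp, threshold=0.9):
    # threshold=0.9 is a hypothetical cutoff for illustration only;
    # the study abstract does not report the value actually used
    return hr_to_systolic_ratio(heart_rate, systolic_bp) >= threshold

# A tachycardic, hypotensive presentation yields a high ratio (120/95 ≈ 1.26)
print(flag_possible_sepsis(heart_rate=120, systolic_bp=95))
```

The intuition is that heart rate rising while systolic pressure falls (a compensated shock pattern) pushes the ratio up earlier than either variable crosses a SIRS cutoff on its own.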

  6. Audiovisual biofeedback improves motion prediction accuracy.

    Science.gov (United States)

    Pollock, Sean; Lee, Danny; Keall, Paul; Kim, Taeho

    2013-04-01

    The accuracy of motion prediction, utilized to overcome the system latency of motion management radiotherapy systems, is hampered by irregularities present in the patients' respiratory pattern. Audiovisual (AV) biofeedback has been shown to reduce respiratory irregularities. The aim of this study was to test the hypothesis that AV biofeedback improves the accuracy of motion prediction. An AV biofeedback system combined with real-time respiratory data acquisition and MR images was implemented in this project. One-dimensional respiratory data from (1) the abdominal wall (30 Hz) and (2) the thoracic diaphragm (5 Hz) were obtained from 15 healthy human subjects across 30 studies. The subjects were required to breathe with and without the guidance of AV biofeedback during each study. The obtained respiratory signals were then used in a kernel density estimation prediction algorithm. For each of the 30 studies, five different prediction times ranging from 50 to 1400 ms were tested (150 predictions performed). Prediction error was quantified as the root mean square error (RMSE); the RMSE was calculated from the difference between the real and predicted respiratory data. The statistical significance of the prediction results was determined by Student's t-test. Prediction accuracy was considerably improved by the implementation of AV biofeedback. Of the 150 respiratory predictions performed, prediction accuracy was improved 69% (103/150) of the time for abdominal wall data, and 78% (117/150) of the time for diaphragm data. The average reduction in RMSE due to AV biofeedback over unguided respiration was 26%. These results indicate that AV biofeedback improves prediction accuracy, which would result in increased efficiency of motion management techniques affected by system latencies used in radiotherapy.
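The RMSE metric and the percent reduction reported above are simple to reproduce. A sketch with illustrative one-dimensional traces (arbitrary units, not study data):

```python
import math

def rmse(actual, predicted):
    """Root mean square error between measured and predicted respiratory signals."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Illustrative respiratory traces: the predicted trace lags the measured one slightly
actual    = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
predicted = [0.1, 0.4, 0.9, 0.6, 0.1, -0.4, -0.9, -0.6]

error = rmse(actual, predicted)

def pct_reduction(unguided_rmse, guided_rmse):
    """Percent RMSE reduction attributable to AV biofeedback guidance."""
    return 100.0 * (unguided_rmse - guided_rmse) / unguided_rmse
```

With these numbers every sample is off by 0.1, so `error` is exactly 0.1; the study's 26% figure corresponds to `pct_reduction(unguided, guided)` averaged over predictions.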

  7. The Real World Significance of Performance Prediction

    Science.gov (United States)

    Pardos, Zachary A.; Wang, Qing Yang; Trivedi, Shubhendu

    2012-01-01

    In recent years, the educational data mining and user modeling communities have been aggressively introducing models for predicting student performance on external measures such as standardized tests as well as within-tutor performance. While these models have brought statistically reliable improvement to performance prediction, the real world…

  8. Mapping Soil Properties of Africa at 250 m Resolution: Random Forests Significantly Improve Current Predictions.

    Directory of Open Access Journals (Sweden)

    Tomislav Hengl

    Full Text Available 80% of arable land in Africa has low soil fertility and suffers from physical soil problems. Additionally, significant amounts of nutrients are lost every year due to unsustainable soil management practices. This is partially the result of insufficient use of soil management knowledge. To help bridge the soil information gap in Africa, the Africa Soil Information Service (AfSIS) project was established in 2008. Over the period 2008-2014, the AfSIS project compiled two point data sets: the Africa Soil Profiles (legacy) database and the AfSIS Sentinel Site database. These data sets contain over 28 thousand sampling locations and represent the most comprehensive soil sample data sets of the African continent to date. Utilizing these point data sets in combination with a large number of covariates, we have generated a series of spatial predictions of soil properties relevant to agricultural management: organic carbon, pH, sand, silt and clay fractions, bulk density, cation-exchange capacity, total nitrogen, exchangeable acidity, Al content and exchangeable bases (Ca, K, Mg, Na). We specifically investigate differences between two predictive approaches: random forests and linear regression. Results of 5-fold cross-validation demonstrate that the random forests algorithm consistently outperforms the linear regression algorithm, with average decreases of 15-75% in Root Mean Squared Error (RMSE) across soil properties and depths. Fitting and running random forests models takes an order of magnitude more time and the modelling success is sensitive to artifacts in the input data, but as long as quality-controlled point data are provided, an increase in soil mapping accuracy can be expected. Results also indicate that globally predicted soil classes (USDA Soil Taxonomy), especially Alfisols and Mollisols, help improve continental scale soil property mapping, and are among the most important predictors. This indicates a promising potential for transferring
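The random-forest vs. linear-regression comparison under 5-fold cross-validation can be sketched with scikit-learn on synthetic data (the features below are stand-ins, not the AfSIS covariates, and the non-linear target is invented to make the contrast visible):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))                    # stand-ins for terrain/climate covariates
y = np.sin(X[:, 0]) * X[:, 1] + 0.1 * rng.normal(size=300)  # non-linear "soil property"

def cv_rmse(model):
    """Mean RMSE over 5-fold cross-validation."""
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_root_mean_squared_error")
    return -scores.mean()

rf_rmse = cv_rmse(RandomForestRegressor(n_estimators=200, random_state=0))
lr_rmse = cv_rmse(LinearRegression())
decrease = 100 * (lr_rmse - rf_rmse) / lr_rmse    # % RMSE decrease from using RF
```

Because the target depends on an interaction the linear model cannot express, the forest's cross-validated RMSE comes out lower, mirroring (in miniature) the 15-75% decreases reported in the study.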

  9. Improved Wind Speed Prediction Using Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    ZHANG, Y.

    2018-05-01

    Full Text Available The wind power industry plays an important role in promoting the development of the low-carbon economy and energy transformation in the world. However, the randomness and volatility of wind speed series restrict the healthy development of the wind power industry. Accurate wind speed prediction is the key to realizing the stability of wind power integration and to guaranteeing the safe operation of the power system. In this paper, combining Empirical Mode Decomposition (EMD), the Radial Basis Function Neural Network (RBF) and the Least Squares Support Vector Machine (LS-SVM), an improved wind speed prediction model (EMD-RBF-LS-SVM) is proposed. The prediction results indicate that, compared with the traditional prediction models (RBF, LS-SVM), the EMD-RBF-LS-SVM model can weaken random fluctuations to a certain extent and significantly improve the short-term accuracy of wind speed prediction. In short, this research will significantly reduce the impact of wind power instability on the power grid, ensure the power grid supply and demand balance, reduce the operating costs of grid-connected systems, and enhance the market competitiveness of wind power.

  10. Improving Earth/Prediction Models to Improve Network Processing

    Science.gov (United States)

    Wagner, G. S.

    2017-12-01

    The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The relatively small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operator Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum likelihood probability of detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth and slowness corrections, and their associated uncertainties, are computed using a ground truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.

  11. Exploring the significance of human mobility patterns in social link prediction

    KAUST Repository

    Alharbi, Basma Mohammed

    2014-01-01

    Link prediction is a fundamental task in social networks. Recently, emphasis has been placed on forecasting new social ties using user mobility patterns, e.g., investigating physical and semantic co-locations for new proximity measures. This paper explores the effect of in-depth mobility patterns. Specifically, we study individuals' movement behavior, and quantify mobility on the basis of trip frequency, travel purpose and transportation mode. Our hybrid link prediction model is composed of two modules. The first module extracts mobility patterns, including travel purpose and mode, from raw trajectory data. The second module employs the extracted patterns for link prediction. We evaluate our method on two real data sets, GeoLife [15] and Reality Mining [5]. Experimental results show that our hybrid model significantly improves the accuracy of social link prediction compared to purely topology-based solutions. Copyright 2014 ACM.

  12. Neurophysiology in preschool improves behavioral prediction of reading ability throughout primary school.

    Science.gov (United States)

    Maurer, Urs; Bucher, Kerstin; Brem, Silvia; Benz, Rosmarie; Kranz, Felicitas; Schulz, Enrico; van der Mark, Sanne; Steinhausen, Hans-Christoph; Brandeis, Daniel

    2009-08-15

    More struggling readers could profit from additional help at the beginning of reading acquisition if dyslexia prediction were more successful. Currently, prediction is based only on behavioral assessment of early phonological processing deficits associated with dyslexia, but it might be improved by adding brain-based measures. In a 5-year longitudinal study of children with (n = 21) and without (n = 23) familial risk for dyslexia, we tested whether neurophysiological measures of automatic phoneme and tone deviance processing obtained in kindergarten would improve prediction of reading over behavioral measures alone. Together, neurophysiological and behavioral measures obtained in kindergarten significantly predicted reading in school. Particularly the late mismatch negativity measure that indicated hemispheric lateralization of automatic phoneme processing improved prediction of reading ability over behavioral measures. It was also the only significant predictor for long-term reading success in fifth grade. Importantly, this result also held for the subgroup of children at familial risk. The results demonstrate that brain-based measures of processing deficits associated with dyslexia improve prediction of reading and thus may be further evaluated to complement clinical practice of dyslexia prediction, especially in targeted populations, such as children with a familial risk.

  13. Improving contact prediction along three dimensions.

    Directory of Open Access Journals (Sweden)

    Christoph Feinauer

    2014-10-01

    Full Text Available Correlation patterns in multiple sequence alignments of homologous proteins can be exploited to infer information on the three-dimensional structure of their members. The typical pipeline to address this task, which we in this paper refer to as the three dimensions of contact prediction, is to (i) filter and align the raw sequence data representing the evolutionarily related proteins; (ii) choose a predictive model to describe a sequence alignment; (iii) infer the model parameters and interpret them in terms of structural properties, such as an accurate contact map. We show here that all three dimensions are important for overall prediction success. In particular, we show that it is possible to improve significantly along the second dimension by going beyond the pair-wise Potts models from statistical physics, which have hitherto been the focus of the field. These (simple) extensions are motivated by multiple sequence alignments often containing long stretches of gaps which, as a data feature, would be rather untypical for independent samples drawn from a Potts model. Using a large test set of proteins we show that the combined improvements along the three dimensions are as large as any reported to date.

  14. Text mining improves prediction of protein functional sites.

    Directory of Open Access Journals (Sweden)

    Karin M Verspoor

    Full Text Available We present an approach that integrates protein structure analysis and text mining for protein functional site prediction, called LEAP-FS (Literature Enhanced Automated Prediction of Functional Sites). The structure analysis was carried out using Dynamics Perturbation Analysis (DPA), which predicts functional sites at control points where interactions greatly perturb protein vibrations. The text mining extracts mentions of residues in the literature, and predicts that residues mentioned are functionally important. We assessed the significance of each of these methods by analyzing their performance in finding known functional sites (specifically, small-molecule binding sites and catalytic sites) in about 100,000 publicly available protein structures. The DPA predictions recapitulated many of the functional site annotations and preferentially recovered binding sites annotated as biologically relevant vs. those annotated as potentially spurious. The text-based predictions were also substantially supported by the functional site annotations: compared to other residues, residues mentioned in text were roughly six times more likely to be found in a functional site. The overlap of predictions with annotations improved when the text-based and structure-based methods agreed. Our analysis also yielded new high-quality predictions of many functional site residues that were not catalogued in the curated data sources we inspected. We conclude that both DPA and text mining independently provide valuable high-throughput protein functional site predictions, and that integrating the two methods using LEAP-FS further improves the quality of these predictions.

  15. Text Mining Improves Prediction of Protein Functional Sites

    Science.gov (United States)

    Cohn, Judith D.; Ravikumar, Komandur E.

    2012-01-01

    We present an approach that integrates protein structure analysis and text mining for protein functional site prediction, called LEAP-FS (Literature Enhanced Automated Prediction of Functional Sites). The structure analysis was carried out using Dynamics Perturbation Analysis (DPA), which predicts functional sites at control points where interactions greatly perturb protein vibrations. The text mining extracts mentions of residues in the literature, and predicts that residues mentioned are functionally important. We assessed the significance of each of these methods by analyzing their performance in finding known functional sites (specifically, small-molecule binding sites and catalytic sites) in about 100,000 publicly available protein structures. The DPA predictions recapitulated many of the functional site annotations and preferentially recovered binding sites annotated as biologically relevant vs. those annotated as potentially spurious. The text-based predictions were also substantially supported by the functional site annotations: compared to other residues, residues mentioned in text were roughly six times more likely to be found in a functional site. The overlap of predictions with annotations improved when the text-based and structure-based methods agreed. Our analysis also yielded new high-quality predictions of many functional site residues that were not catalogued in the curated data sources we inspected. We conclude that both DPA and text mining independently provide valuable high-throughput protein functional site predictions, and that integrating the two methods using LEAP-FS further improves the quality of these predictions. PMID:22393388

  16. Combining gene prediction methods to improve metagenomic gene annotation

    Directory of Open Access Journals (Sweden)

    Rosen Gail L

    2011-01-01

    Full Text Available Background Traditional gene annotation methods rely on characteristics that may not be available in short reads generated from next generation technology, resulting in suboptimal performance for metagenomic (environmental) samples. Therefore, in recent years, new programs have been developed that optimize performance on short reads. In this work, we benchmark three metagenomic gene prediction programs and combine their predictions to improve metagenomic read gene annotation. Results We not only analyze the programs' performance at different read-lengths like similar studies, but also separate different types of reads, including intra- and intergenic regions, for analysis. The main deficiencies are in the algorithms' ability to predict non-coding regions and gene edges, resulting in more false positives and false negatives than desired. In fact, the specificities of the algorithms are notably worse than the sensitivities. By combining the programs' predictions, we show significant improvement in specificity at minimal cost to sensitivity, resulting in a 4% improvement in accuracy for 100 bp reads and a ~1% improvement in accuracy for 200 bp reads and above. To correctly annotate the start and stop of the genes, we find that a consensus of all the predictors performs best for shorter read lengths while unanimous agreement is better for longer read lengths, boosting annotation accuracy by 1-8%. We also demonstrate use of the classifier combinations on a real dataset. Conclusions To optimize the performance for both prediction and annotation accuracies, we conclude that the consensus of all methods (or a majority vote) is the best for reads 400 bp and shorter, while using the intersection of GeneMark and Orphelia predictions is the best for reads 500 bp and longer. We demonstrate that most methods predict over 80% coding (including partially coding) reads on a real human gut sample sequenced by Illumina technology.
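The consensus vs. intersection combiners described above reduce to simple boolean logic over the per-read calls. A hedged sketch (the real predictors, e.g. GeneMark and Orphelia, are abstracted here to boolean coding/non-coding calls):

```python
def combine(predictions, mode="consensus"):
    """Combine per-read coding calls from several gene prediction programs.

    predictions: list of boolean calls, one per program.
    mode: "consensus"    - majority vote (recommended for reads <= 400 bp)
          "intersection" - all programs must agree (for reads >= 500 bp)
    """
    if mode == "consensus":
        return sum(predictions) > len(predictions) / 2
    if mode == "intersection":
        return all(predictions)
    raise ValueError(f"unknown mode: {mode}")

# Three programs vote on whether a short read is coding
calls = [True, True, False]
majority = combine(calls, "consensus")      # 2 of 3 agree -> coding
unanimous = combine(calls, "intersection")  # unanimity fails -> not coding
```

The trade-off the study reports follows directly: intersection raises specificity (fewer false positives) at the cost of sensitivity, which pays off only when individual calls are reliable, i.e. on longer reads.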

  17. Predicting significant torso trauma.

    Science.gov (United States)

    Nirula, Ram; Talmor, Daniel; Brasel, Karen

    2005-07-01

    Identification of motor vehicle crash (MVC) characteristics associated with thoracoabdominal injury would advance the development of automatic crash notification systems (ACNS) by improving triage and response times. Our objective was to determine the relationships between MVC characteristics and thoracoabdominal trauma to develop a torso injury probability model. Drivers involved in crashes from 1993 to 2001 within the National Automotive Sampling System were reviewed. Relationships between torso injury and MVC characteristics were assessed using multivariate logistic regression. Receiver operating characteristic curves were used to compare the model to current ACNS models. There were a total of 56,466 drivers. Age, ejection, braking, avoidance, velocity, restraints, passenger-side impact, rollover, and vehicle weight and type were associated with injury (p < 0.05). The area under the receiver operating characteristic curve (83.9) was significantly greater than current ACNS models. We have developed a thoracoabdominal injury probability model that may improve patient triage when used with ACNS.

  18. Can machine-learning improve cardiovascular risk prediction using routine clinical data?

    Science.gov (United States)

    Kai, Joe; Garibaldi, Jonathan M.; Qureshi, Nadeem

    2017-01-01

    Background Current approaches to predict cardiovascular risk fail to identify many people who would benefit from preventive treatment, while others receive unnecessary intervention. Machine-learning offers an opportunity to improve accuracy by exploiting complex interactions between risk factors. We assessed whether machine-learning can improve cardiovascular risk prediction. Methods Prospective cohort study using routine clinical data of 378,256 patients from UK family practices, free from cardiovascular disease at outset. Four machine-learning algorithms (random forest, logistic regression, gradient boosting machines, neural networks) were compared to an established algorithm (American College of Cardiology guidelines) to predict the first cardiovascular event over 10 years. Predictive accuracy was assessed by area under the receiver operating characteristic curve (AUC); and by sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) at 7.5% cardiovascular risk (the threshold for initiating statins). Findings 24,970 incident cardiovascular events (6.6%) occurred. Compared to the established risk prediction algorithm (AUC 0.728, 95% CI 0.723–0.735), machine-learning algorithms improved prediction: random forest +1.7% (AUC 0.745, 95% CI 0.739–0.750), logistic regression +3.2% (AUC 0.760, 95% CI 0.755–0.766), gradient boosting +3.3% (AUC 0.761, 95% CI 0.755–0.766), neural networks +3.6% (AUC 0.764, 95% CI 0.759–0.769). The highest-achieving (neural networks) algorithm predicted 4,998/7,404 cases (sensitivity 67.5%, PPV 18.4%) and 53,458/75,585 non-cases (specificity 70.7%, NPV 95.7%), correctly predicting 355 (+7.6%) more patients who developed cardiovascular disease compared to the established algorithm. Conclusions Machine-learning significantly improves the accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment, while avoiding unnecessary treatment of others.
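The sensitivity/specificity/PPV/NPV figures above are all derived from the confusion matrix at the 7.5% risk threshold. A pure-Python sketch with toy scores (invented, not the cohort data):

```python
def classification_stats(scores, labels, threshold=0.075):
    """Sensitivity, specificity, PPV and NPV at a risk threshold
    (0.075, i.e. 7.5%, is the statin-initiation cutoff used in the study)."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Toy 10-year risk scores (fractions) and observed outcomes (1 = event)
scores = [0.02, 0.09, 0.12, 0.05, 0.30, 0.09, 0.05, 0.01]
labels = [0,    1,    1,    0,    1,    0,    1,    0]

stats = classification_stats(scores, labels)
```

Raising the threshold trades sensitivity for specificity; the study's comparison holds the threshold fixed and varies the model instead.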

  19. Improving urban wind flow predictions through data assimilation

    Science.gov (United States)

    Sousa, Jorge; Gorle, Catherine

    2017-11-01

    Computational fluid dynamics is fundamentally important to several aspects of the design of sustainable and resilient urban environments. The prediction of the flow pattern, for example, can help determine pedestrian wind comfort, air quality, optimal building ventilation strategies, and wind loading on buildings. However, the significant variability and uncertainty in the boundary conditions pose a challenge when interpreting results as a basis for design decisions. To improve our understanding of the uncertainties in the models and develop better predictive tools, we started a pilot field measurement campaign on Stanford University's campus combined with a detailed numerical prediction of the wind flow. The experimental data are being used to investigate the potential use of data assimilation and inverse techniques to better characterize the uncertainty in the results and improve the confidence in current wind flow predictions. We consider the incoming wind direction and magnitude as unknown parameters and perform a set of Reynolds-averaged Navier-Stokes simulations to build a polynomial chaos expansion response surface at each sensor location. We subsequently use an inverse ensemble Kalman filter to retrieve an estimate of the probability density function of the inflow parameters. Once these distributions are obtained, the forward analysis is repeated to obtain predictions for the flow field in the entire urban canopy and the results are compared with the experimental data. We would like to acknowledge high-performance computing support from Yellowstone (ark:/85065/d7wd3xhc) provided by NCAR.

  20. A novel method for improved accuracy of transcription factor binding site prediction

    KAUST Repository

    Khamis, Abdullah M.; Motwalli, Olaa Amin; Oliva, Romina; Jankovic, Boris R.; Medvedeva, Yulia; Ashoor, Haitham; Essack, Magbubah; Gao, Xin; Bajic, Vladimir B.

    2018-01-01

    Identifying transcription factor (TF) binding sites (TFBSs) is important in the computational inference of gene regulation. Widely used computational methods of TFBS prediction based on position weight matrices (PWMs) usually have high false positive rates. Moreover, computational studies of transcription regulation in eukaryotes frequently require numerous PWM models of TFBSs due to a large number of TFs involved. To overcome these problems we developed DRAF, a novel method for TFBS prediction that requires only 14 prediction models for 232 human TFs, while at the same time significantly improving prediction accuracy. DRAF models use more features than PWM models, as they combine information from TFBS sequences and physicochemical properties of TF DNA-binding domains into machine learning models. Evaluation of DRAF on 98 human ChIP-seq datasets shows on average 1.54-, 1.96- and 5.19-fold reduction of false positives at the same sensitivities compared to models from HOCOMOCO, TRANSFAC and DeepBind, respectively. This observation suggests that one can efficiently replace the PWM models for TFBS prediction by a small number of DRAF models that significantly improve prediction accuracy. The DRAF method is implemented in a web tool and as stand-alone software freely available at http://cbrc.kaust.edu.sa/DRAF.
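    The "fold reduction of false positives at the same sensitivity" comparison used above can be sketched as follows. This is a generic evaluation helper with invented scores and labels, not DRAF's evaluation code.

```python
# Hedged sketch: count false positives of a predictor at a target sensitivity,
# so two methods can be compared at matched sensitivity (as in the DRAF study).
def fp_at_sensitivity(scores, labels, target_sens):
    """Pick the lowest score among the top positives reaching target sensitivity,
    then count negatives scoring at or above that threshold."""
    pos = sorted((s for s, l in zip(scores, labels) if l == 1), reverse=True)
    k = max(1, int(round(target_sens * len(pos))))
    thr = pos[k - 1]                    # threshold capturing k true sites
    return sum(1 for s, l in zip(scores, labels) if l == 0 and s >= thr)
```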

  2. Significance of High Resolution GHRSST on prediction of Indian Summer Monsoon

    KAUST Repository

    Jangid, Buddhi Prakash

    2017-02-24

    In this study, the Weather Research and Forecasting (WRF) model was used to assess the importance of very high resolution sea surface temperature (SST) on seasonal rainfall prediction. Two different SST datasets, available from the National Centers for Environmental Prediction (NCEP) global model analysis and the merged satellite product from the Group for High Resolution SST (GHRSST), are used as a lower boundary condition in the WRF model for the Indian Summer Monsoon (ISM) 2010. Before using NCEP SST and GHRSST for model simulation, an initial verification of NCEP SST and GHRSST is performed with buoy measurements. A root mean square difference (RMSD) of approximately 0.4 K is found for GHRSST and NCEP SST when compared with buoy observations available over the Indian Ocean during 01 May to 30 September 2010. Our analyses suggest that use of GHRSST as the lower boundary condition in the WRF model improves the low-level temperature, moisture, wind speed and rainfall prediction over the ISM region. Moreover, the temporal evolution of surface parameters such as temperature, moisture and wind speed forecasts associated with the monsoon is also improved with GHRSST forcing as a lower boundary condition. Interestingly, rainfall prediction is improved with the use of GHRSST over the Western Ghats, which is mostly not simulated in the NCEP SST based experiment.
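    The buoy verification step (the ~0.4 K RMSD figure) amounts to a root mean square difference between collocated product and buoy temperatures; a minimal sketch with invented values:

```python
import math

# Minimal sketch of the buoy verification: RMSD between a gridded SST product
# and collocated buoy measurements (temperatures in kelvin are invented).
def rmsd(product_sst, buoy_sst):
    return math.sqrt(
        sum((p - b) ** 2 for p, b in zip(product_sst, buoy_sst)) / len(buoy_sst)
    )
```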

  4. Biomarkers improve mortality prediction by prognostic scales in community-acquired pneumonia.

    Science.gov (United States)

    Menéndez, R; Martínez, R; Reyes, S; Mensa, J; Filella, X; Marcos, M A; Martínez, A; Esquinas, C; Ramirez, P; Torres, A

    2009-07-01

    Prognostic scales provide a useful tool to predict mortality in community-acquired pneumonia (CAP). However, the inflammatory response of the host, crucial in resolution and outcome, is not included in the prognostic scales. The aim of this study was to investigate whether information about the initial inflammatory cytokine profile and markers increases the accuracy of prognostic scales to predict 30-day mortality. To this aim, a prospective cohort study in two tertiary care hospitals was designed. Procalcitonin (PCT), C-reactive protein (CRP) and the systemic cytokines tumour necrosis factor alpha (TNFalpha) and interleukins IL6, IL8 and IL10 were measured at admission. Initial severity was assessed by the PSI (Pneumonia Severity Index), CURB65 (Confusion, Urea nitrogen, Respiratory rate, Blood pressure, ≥65 years of age) and CRB65 (Confusion, Respiratory rate, Blood pressure, ≥65 years of age) scales. A total of 453 hospitalised CAP patients were included. The 36 patients who died (7.8%) had significantly increased levels of IL6, IL8, PCT and CRP. In logistic regression analyses, high levels of CRP and IL6 showed independent value for predicting 30-day mortality, after adjustment for prognostic scales. Adding CRP to PSI significantly increased the area under the receiver operating characteristic curve (AUC) from 0.80 to 0.85, that of CURB65 from 0.82 to 0.85 and that of CRB65 from 0.79 to 0.85. Adding IL6 or PCT values to CRP did not significantly increase the AUC of any scale. When using two scales (PSI and CURB65/CRB65) and CRP simultaneously the AUC was 0.88. Adding CRP levels to the PSI, CURB65 and CRB65 scales improves the 30-day mortality prediction. The highest predictive value is reached with a combination of two scales and CRP. Further validation of that improvement is needed.
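    The AUC comparisons above can be reproduced in miniature with the rank-based (Mann-Whitney) formulation of the AUC; the scores and outcomes below are invented, and this is not the study's analysis code.

```python
# Hedged sketch: AUC as the probability that a randomly chosen death (outcome 1)
# receives a higher score than a randomly chosen survivor (outcome 0). Ties
# count one half, per the Mann-Whitney U convention.
def auc(scores, outcomes):
    pos = [s for s, o in zip(scores, outcomes) if o == 1]
    neg = [s for s, o in zip(scores, outcomes) if o == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Comparing `auc(psi_scores, died)` against `auc(psi_plus_crp_scores, died)` is the shape of the "adding CRP to PSI raised the AUC from 0.80 to 0.85" claim.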

  5. Improving Flash Flood Prediction in Multiple Environments

    Science.gov (United States)

    Broxton, P. D.; Troch, P. A.; Schaffner, M.; Unkrich, C.; Goodrich, D.; Wagener, T.; Yatheendradas, S.

    2009-12-01

    Flash flooding is a major concern in many fast-responding headwater catchments. There are many efforts to model and to predict these flood events, though it is not currently possible to adequately predict the nature of flash flood events with a single model; furthermore, many of these efforts do not even consider snow, which can, by itself or in combination with rainfall, cause destructive floods. The current research is aimed at broadening the applicability of flash flood modeling. Specifically, we will take a state-of-the-art flash flood model that is designed to work with warm season precipitation in arid environments, the KINematic runoff and EROSion model (KINEROS2), and combine it with a continuous subsurface flow model and an energy balance snow model. This should improve its predictive capacity in humid environments where lateral subsurface flow significantly contributes to streamflow, and it will make possible the prediction of flooding events that involve rain-on-snow or rapid snowmelt. By modeling changes in the hydrologic state of a catchment before a flood begins, we can also better understand the factors or combination of factors that are necessary to produce large floods. Broadening the applicability of an already state-of-the-art flash flood model, such as KINEROS2, is logical because flash floods can occur in all types of environments, and it may lead to better predictions, which are necessary to preserve life and property.

  6. Developing Predictive Maintenance Expertise to Improve Plant Equipment Reliability

    International Nuclear Information System (INIS)

    Wurzbach, Richard N.

    2002-01-01

    On-line equipment condition monitoring is a critical component of the world-class production and safety histories of many successful nuclear plant operators. From addressing availability and operability concerns of nuclear safety-related equipment to increasing profitability through support system reliability and reduced maintenance costs, Predictive Maintenance programs have increasingly become a vital contribution to the maintenance and operation decisions of nuclear facilities. In recent years, significant advancements have been made in the quality and portability of many of the instruments being used, and software improvements have been made as well. However, the single most influential component of the success of these programs is the impact of a trained and experienced team of personnel putting this technology to work. Changes in the nature of the power generation industry brought on by competition, mergers, and acquisitions, has taken the historically stable personnel environment of power generation and created a very dynamic situation. As a result, many facilities have seen a significant turnover in personnel in key positions, including predictive maintenance personnel. It has become the challenge for many nuclear operators to maintain the consistent contribution of quality data and information from predictive maintenance that has become important in the overall equipment decision process. These challenges can be met through the implementation of quality training to predictive maintenance personnel and regular updating and re-certification of key technology holders. The use of data management tools and services aid in the sharing of information across sites within an operating company, and with experts who can contribute value-added data management and analysis. The overall effectiveness of predictive maintenance programs can be improved through the incorporation of newly developed comprehensive technology training courses. These courses address the use of ...

  7. Accurate prediction of the functional significance of single nucleotide polymorphisms and mutations in the ABCA1 gene.

    Directory of Open Access Journals (Sweden)

    Liam R Brunham

    2005-12-01

    The human genome contains an estimated 100,000 to 300,000 DNA variants that alter an amino acid in an encoded protein. However, our ability to predict which of these variants are functionally significant is limited. We used a bioinformatics approach to define the functional significance of genetic variation in the ABCA1 gene, a cholesterol transporter crucial for the metabolism of high density lipoprotein cholesterol. To predict the functional consequence of each coding single nucleotide polymorphism and mutation in this gene, we calculated a substitution position-specific evolutionary conservation score for each variant, which considers site-specific variation among evolutionarily related proteins. To test the bioinformatics predictions experimentally, we evaluated the biochemical consequence of these sequence variants by examining the ability of cell lines stably transfected with the ABCA1 alleles to elicit cholesterol efflux. Our bioinformatics approach correctly predicted the functional impact of greater than 94% of the naturally occurring variants we assessed. The bioinformatics predictions were significantly correlated with the degree of functional impairment of ABCA1 mutations (r2 = 0.62, p = 0.0008). These results have allowed us to define the impact of genetic variation on ABCA1 function and to suggest that the in silico evolutionary approach we used may be a useful tool in general for predicting the effects of DNA variation on gene function. In addition, our data suggest that considering patterns of positive selection, along with patterns of negative selection such as evolutionary conservation, may improve our ability to predict the functional effects of amino acid variation.

  8. Addendum to the article: Misuse of null hypothesis significance testing: Would estimation of positive and negative predictive values improve certainty of chemical risk assessment?

    Science.gov (United States)

    Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf

    2015-03-01

    We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: They inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results or more extreme results if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.

  9. Decadal climate predictions improved by ocean ensemble dispersion filtering

    Science.gov (United States)

    Kadow, C.; Illing, S.; Kröner, I.; Ulbrich, U.; Cubasch, U.

    2017-06-01

    Decadal predictions by Earth system models aim to capture the state and phase of the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. While short-term weather forecasts represent an initial value problem and long-term climate projections represent a boundary condition problem, the decadal climate prediction falls in-between these two time scales. In recent years, more precise initialization techniques of coupled Earth system models and increased ensemble sizes have improved decadal predictions. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Here we show that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, called the ensemble dispersion filter, yields more accurate results than the standard decadal prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution. Our results demonstrate how decadal climate predictions benefit from ocean ensemble dispersion filtering toward the ensemble mean. Plain Language Summary: Decadal predictions aim to predict the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. The ocean memory, due to its heat capacity, holds large potential skill. In recent years, more precise initialization techniques of coupled Earth system models (incl. atmosphere and ocean) have improved decadal predictions. Ensembles are another important aspect: applying slightly perturbed predictions to trigger the famous butterfly effect results in an ensemble, so that the whole ensemble, rather than a single prediction, is evaluated.
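    The core operation described, nudging each ensemble member's ocean state toward the ensemble mean at regular intervals, can be sketched in a few lines. The relaxation factor `alpha` is an assumption for illustration, not a value from the paper.

```python
import numpy as np

# Minimal sketch of the "ensemble dispersion filter" idea: shift each member's
# state vector part-way toward the ensemble mean, reducing spread while leaving
# the ensemble mean unchanged.
def dispersion_filter(states, alpha=0.5):
    """states: (members, n) array of ocean-state vectors; alpha in (0, 1]."""
    mean = states.mean(axis=0)
    return states + alpha * (mean - states)

members = np.array([[1.0, 4.0], [3.0, 0.0]])
filtered = dispersion_filter(members)
# with alpha=0.5 the spread halves and the ensemble mean is preserved
```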

  10. Prostate Health Index improves multivariable risk prediction of aggressive prostate cancer.

    Science.gov (United States)

    Loeb, Stacy; Shin, Sanghyuk S; Broyles, Dennis L; Wei, John T; Sanda, Martin; Klee, George; Partin, Alan W; Sokoll, Lori; Chan, Daniel W; Bangma, Chris H; van Schaik, Ron H N; Slawin, Kevin M; Marks, Leonard S; Catalona, William J

    2017-07-01

    To examine the use of the Prostate Health Index (PHI) as a continuous variable in multivariable risk assessment for aggressive prostate cancer in a large multicentre US study. The study population included 728 men, with prostate-specific antigen (PSA) levels of 2-10 ng/mL and a negative digital rectal examination, enrolled in a prospective, multi-site early detection trial. The primary endpoint was aggressive prostate cancer, defined as biopsy Gleason score ≥7. First, we evaluated whether the addition of PHI improves the performance of currently available risk calculators (the Prostate Cancer Prevention Trial [PCPT] and European Randomised Study of Screening for Prostate Cancer [ERSPC] risk calculators). We also designed and internally validated a new PHI-based multivariable predictive model, and created a nomogram. Of 728 men undergoing biopsy, 118 (16.2%) had aggressive prostate cancer. The PHI predicted the risk of aggressive prostate cancer across the spectrum of values. Adding PHI significantly improved the predictive accuracy of the PCPT and ERSPC risk calculators for aggressive disease. A new model was created using age, previous biopsy, prostate volume, PSA and PHI, with an area under the curve of 0.746. The bootstrap-corrected model showed good calibration with observed risk for aggressive prostate cancer and had net benefit on decision-curve analysis. Using PHI as part of multivariable risk assessment leads to a significant improvement in the detection of aggressive prostate cancer, potentially reducing harms from unnecessary prostate biopsy and overdiagnosis. © 2016 The Authors. BJU International © 2016 BJU International, published by John Wiley & Sons Ltd.

  11. Combining Gene Signatures Improves Prediction of Breast Cancer Survival

    Science.gov (United States)

    Zhao, Xi; Naume, Bjørn; Langerød, Anita; Frigessi, Arnoldo; Kristensen, Vessela N.; Børresen-Dale, Anne-Lise; Lingjærde, Ole Christian

    2011-01-01

    Background Several gene sets for prediction of breast cancer survival have been derived from whole-genome mRNA expression profiles. Here, we develop a statistical framework to explore whether combination of the information from such sets may improve prediction of recurrence and breast cancer specific death in early-stage breast cancers. Microarray data from two clinically similar cohorts of breast cancer patients are used as training (n = 123) and test set (n = 81), respectively. Gene sets from eleven previously published gene signatures are included in the study. Principal Findings To investigate the relationship between breast cancer survival and gene expression on a particular gene set, a Cox proportional hazards model is applied using partial likelihood regression with an L2 penalty to avoid overfitting and using cross-validation to determine the penalty weight. The fitted models are applied to an independent test set to obtain a predicted risk for each individual and each gene set. Hierarchical clustering of the test individuals on the basis of the vector of predicted risks results in two clusters with distinct clinical characteristics in terms of the distribution of molecular subtypes, ER, PR status, TP53 mutation status and histological grade category, and associated with significantly different survival probabilities (recurrence: p = 0.005; breast cancer death: p = 0.014). Finally, principal components analysis of the gene signatures is used to derive combined predictors used to fit a new Cox model. This model classifies test individuals into two risk groups with distinct survival characteristics (recurrence: p = 0.003; breast cancer death: p = 0.001). The latter classifier outperforms all the individual gene signatures, as well as Cox models based on traditional clinical parameters and Adjuvant! Online for survival prediction. Conclusion Combining the predictive strength of multiple gene signatures improves prediction of breast cancer survival.

  13. Adding propensity scores to pure prediction models fails to improve predictive performance

    Directory of Open Access Journals (Sweden)

    Amy S. Nowacki

    2013-08-01

    Background. Propensity score usage seems to be growing in popularity, leading researchers to question the possible role of propensity scores in prediction modeling, despite the lack of a theoretical rationale. It is suspected that such requests are due to the lack of differentiation regarding the goals of predictive modeling versus causal inference modeling. Therefore, the purpose of this study is to formally examine the effect of propensity scores on predictive performance. Our hypothesis is that a multivariable regression model that adjusts for all covariates will perform as well as or better than those models utilizing propensity scores with respect to model discrimination and calibration. Methods. The most commonly encountered statistical scenarios for medical prediction (logistic and proportional hazards regression) were used to investigate this research question. Random cross-validation was performed 500 times to correct for optimism. The multivariable regression models adjusting for all covariates were compared with models that included adjustment for or weighting with the propensity scores. The methods were compared based on three predictive performance measures: (1) concordance indices; (2) Brier scores; and (3) calibration curves. Results. Multivariable models adjusting for all covariates had the highest average concordance index, the lowest average Brier score, and the best calibration. Propensity score adjustment and inverse probability weighting models without adjustment for all covariates performed worse than full models and failed to improve predictive performance with full covariate adjustment. Conclusion. Propensity score techniques did not improve prediction performance measures beyond multivariable adjustment. Propensity scores are not recommended if the analytical goal is pure prediction modeling.
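    Two of the performance measures named above, the concordance index and the Brier score, can be sketched for the binary-outcome case as follows. These are generic textbook formulations with invented inputs, not the study's code.

```python
# Hedged sketch: concordance index (here computed like a rank AUC over
# outcome-discordant pairs) and Brier score (mean squared error of predicted
# probabilities against 0/1 outcomes).
def concordance(pred, outcome):
    pairs = [(p, q)
             for p, op in zip(pred, outcome)
             for q, oq in zip(pred, outcome)
             if op == 1 and oq == 0]
    return sum((p > q) + 0.5 * (p == q) for p, q in pairs) / len(pairs)

def brier(pred, outcome):
    return sum((p - o) ** 2 for p, o in zip(pred, outcome)) / len(pred)
```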

  14. Reliable B cell epitope predictions: impacts of method development and improved benchmarking

    DEFF Research Database (Denmark)

    Kringelum, Jens Vindahl; Lundegaard, Claus; Lund, Ole

    2012-01-01

    biomedical applications such as rational vaccine design, development of disease diagnostics and immunotherapeutics. However, experimental mapping of epitopes is resource intensive, making in silico methods an appealing complementary approach. To date, the reported performance of methods for in silico mapping...... evaluation data set improved from 0.712 to 0.727. Our results thus demonstrate that given proper benchmark definitions, B-cell epitope prediction methods achieve highly significant predictive performances suggesting these tools to be a powerful asset in rational epitope discovery. The updated version...

  15. Multi-scale enhancement of climate prediction over land by improving the model sensitivity to vegetation variability

    Science.gov (United States)

    Alessandri, A.; Catalano, F.; De Felice, M.; Hurk, B. V. D.; Doblas-Reyes, F. J.; Boussetta, S.; Balsamo, G.; Miller, P. A.

    2017-12-01

    Here we demonstrate, for the first time, that the implementation of a realistic representation of vegetation in Earth System Models (ESMs) can significantly improve climate simulation and prediction across multiple time-scales. The effective sub-grid vegetation fractional coverage varies seasonally and at interannual time-scales in response to leaf-canopy growth, phenology and senescence. Therefore it affects biophysical parameters such as the surface resistance to evapotranspiration, albedo, roughness length, and soil field capacity. To adequately represent this effect in the EC-Earth ESM, we included an exponential dependence of the vegetation cover on the Leaf Area Index. By comparing two sets of simulations performed with and without the new variable fractional-coverage parameterization, spanning from centennial (20th Century) simulations and retrospective predictions to the decadal (5 years), seasonal (2-4 months) and weather (4 days) time-scales, we show for the first time a significant multi-scale enhancement of vegetation impacts in climate simulation and prediction over land. Particularly large effects at multiple time scales are shown over boreal winter middle-to-high latitudes over Canada, West US, Eastern Europe, Russia and eastern Siberia due to the implemented time-varying shadowing effect by tree-vegetation on snow surfaces. Over Northern Hemisphere boreal forest regions the improved representation of vegetation cover consistently corrects the winter warm biases and improves the climate change sensitivity, the decadal potential predictability, and the skill of forecasts at seasonal and weather time-scales. Significant improvements of the prediction of 2m temperature and rainfall are also shown over transitional land surface hot spots. Both the potential predictability at decadal time-scale and seasonal-forecast skill are enhanced over Sahel, North American Great Plains, Nordeste Brazil and South East Asia, mainly related to improved performance in

  16. DNCON2: improved protein contact prediction using two-level deep convolutional neural networks.

    Science.gov (United States)

    Adhikari, Badri; Hou, Jie; Cheng, Jianlin

    2018-05-01

    Significant improvements in the prediction of protein residue-residue contacts have been observed in recent years. These contacts, predicted using a variety of coevolution-based and machine learning methods, are the key contributors to the recent progress in ab initio protein structure prediction, as demonstrated in the recent CASP experiments. Continuing the development of new methods to reliably predict contact maps is essential to further improve ab initio structure prediction. In this paper we discuss DNCON2, an improved protein contact map predictor based on two-level deep convolutional neural networks. It consists of six convolutional neural networks: the first five predict contacts at 6, 7.5, 8, 8.5 and 10 Å distance thresholds, and the last one uses these five predictions as additional features to predict final contact maps. On the free-modeling datasets in the CASP10, 11 and 12 experiments, DNCON2 achieves mean precisions of 35, 50 and 53.4%, respectively, higher than 30.6% by MetaPSICOV on the CASP10 dataset, 34% by MetaPSICOV on the CASP11 dataset and 46.3% by Raptor-X on the CASP12 dataset, when top L/5 long-range contacts are evaluated. We attribute the improved performance of DNCON2 to the inclusion of short- and medium-range contacts into training, the two-level approach to prediction, the use of state-of-the-art optimization and activation functions, and a novel deep learning architecture that allows each filter in a convolutional layer to access all the input features of a protein of arbitrary length. The web server of DNCON2 is at http://sysbio.rnet.missouri.edu/dncon2/ where training and testing datasets as well as the predictions for CASP10, 11 and 12 free-modeling datasets can also be downloaded. Its source code is available at https://github.com/multicom-toolbox/DNCON2/. chengji@missouri.edu. Supplementary data are available at Bioinformatics online.
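    The "top L/5 long-range contacts" metric used in the evaluation above can be sketched as follows; the contact map and scores are invented, and the long-range cutoff of 24 residues is the usual CASP convention rather than something stated in this abstract.

```python
# Hedged sketch of the CASP-style metric: precision of the top L/5 predicted
# long-range contacts (sequence separation >= 24), for a protein of length L.
def top_l5_precision(pred, true_contacts, L):
    """pred: {(i, j): score}; true_contacts: set of (i, j) residue pairs."""
    long_range = [(pair, s) for pair, s in pred.items()
                  if abs(pair[0] - pair[1]) >= 24]
    top = sorted(long_range, key=lambda x: -x[1])[: max(1, L // 5)]
    return sum(1 for pair, _ in top if pair in true_contacts) / len(top)
```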

  17. Improved Modeling and Prediction of Surface Wave Amplitudes

    Science.gov (United States)

    2017-05-31

    Report AFRL-RV-PS-TR-2017-0162: Improved Modeling and Prediction of Surface Wave Amplitudes, Jeffry L. Stevens et al. (Leidos), under contract FA9453-14-C-0225.

  18. Improving orbit prediction accuracy through supervised machine learning

    Science.gov (United States)

    Peng, Hao; Bai, Xiaoli

    2018-05-01

    Due to the lack of information such as the space environment condition and resident space objects' (RSOs') body characteristics, current orbit predictions that are solely grounded on physics-based models may fail to achieve required accuracy for collision avoidance and have already led to satellite collisions. This paper presents a methodology to predict RSOs' trajectories with higher accuracy than that of the current methods. Inspired by the machine learning (ML) theory through which the models are learned based on large amounts of observed data and the prediction is conducted without explicitly modeling space objects and space environment, the proposed ML approach integrates physics-based orbit prediction algorithms with a learning-based process that focuses on reducing the prediction errors. Using a simulation-based space catalog environment as the test bed, the paper demonstrates three types of generalization capability for the proposed ML approach: (1) the ML model can be used to improve the same RSO's orbit information that is not available during the learning process but shares the same time interval as the training data; (2) the ML model can be used to improve predictions of the same RSO at future epochs; and (3) the ML model based on a RSO can be applied to other RSOs that share some common features.
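    The core idea, learning a correction to a physics-based propagator from its past prediction errors, can be sketched with a toy one-dimensional example. A linear least-squares fit stands in for the ML component; the dynamics and numbers are invented.

```python
import numpy as np

# Hedged sketch of residual learning on orbit prediction errors: a biased
# "physics-based" prediction is corrected by a model fit to past errors.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 50)
truth = 7000.0 + 2.0 * t                 # "true" along-track position (km)
physics = 7000.0 + 1.8 * t               # biased physics-based prediction
error = truth - physics                  # observed prediction errors

# fit error ~ a + b*t on the first half, then correct the second half
A = np.vstack([np.ones_like(t[:25]), t[:25]]).T
coef, *_ = np.linalg.lstsq(A, error[:25], rcond=None)
corrected = physics[25:] + (coef[0] + coef[1] * t[25:])
# the corrected predictions should be far closer to the truth than raw physics
```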

  19. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    Science.gov (United States)

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reverse faulting events. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier

  20. Improvement of gas entrainment prediction method. Introduction of surface tension effect

    International Nuclear Information System (INIS)

    Ito, Kei; Sakai, Takaaki; Ohshima, Hiroyuki; Uchibori, Akihiro; Eguchi, Yuzuru; Monji, Hideaki; Xu, Yongze

    2010-01-01

    A gas entrainment (GE) prediction method has been developed to establish design criteria for the large-scale sodium-cooled fast reactor (JSFR) systems. The prototype of the GE prediction method was already confirmed to give reasonable gas core lengths by simple calculation procedures. However, for simplification, the surface tension effects were neglected. In this paper, the evaluation accuracy of gas core lengths is improved by introducing the surface tension effects into the prototype GE prediction method. First, the mechanical balance between gravitational, centrifugal, and surface tension forces is considered. Then, the shape of a gas core tip is approximated by a quadratic function. Finally, using the approximated gas core shape, the authors determine the gas core length satisfying the mechanical balance. This improved GE prediction method is validated by analyzing the gas core lengths observed in simple experiments. Results show that the analytical gas core lengths calculated by the improved GE prediction method become shorter in comparison to the prototype GE prediction method, and are in good agreement with the experimental data. In addition, the experimental data under different temperature and surfactant concentration conditions are reproduced by the improved GE prediction method. (author)

  1. Plant water potential improves prediction of empirical stomatal models.

    Directory of Open Access Journals (Sweden)

    William R L Anderegg

    Full Text Available Climate change is expected to lead to increases in drought frequency and severity, with deleterious effects on many ecosystems. Stomatal responses to changing environmental conditions form the backbone of all ecosystem models, but are based on empirical relationships and are not well-tested during drought conditions. Here, we use a dataset of 34 woody plant species spanning global forest biomes to examine the effect of leaf water potential on stomatal conductance and test the predictive accuracy of three major stomatal models and a recently proposed model. We find that current leaf-level empirical models have consistent biases of over-prediction of stomatal conductance during dry conditions, particularly at low soil water potentials. Furthermore, the recently proposed stomatal conductance model yields increases in predictive capability compared to current models, and with particular improvement during drought conditions. Our results reveal that including stomatal sensitivity to declining water potential and consequent impairment of plant water transport will improve predictions during drought conditions and show that many biomes contain a diversity of plant stomatal strategies that range from risky to conservative stomatal regulation during water stress. Such improvements in stomatal simulation are greatly needed to help unravel and predict the response of ecosystems to future climate extremes.
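The idea of adding water-potential sensitivity to an empirical stomatal model can be sketched by damping a Ball-Berry-style conductance with a vulnerability term that declines as leaf water potential becomes more negative. The sigmoidal functional form and every parameter value below are illustrative assumptions, not the study's fitted model:

```python
def ball_berry(assim, rh, ca, g0=0.01, g1=9.0):
    """Classic empirical stomatal conductance model, with no psi term."""
    return g0 + g1 * assim * rh / ca

def gs_with_psi(assim, rh, ca, psi, psi50=-2.0, shape=3.0):
    """Ball-Berry conductance damped by a hypothetical sigmoidal
    water-potential vulnerability term (psi50, shape are invented)."""
    vulnerability = 1.0 / (1.0 + (psi / psi50) ** shape)
    return ball_berry(assim, rh, ca) * vulnerability

# under drought (very negative psi) the classic model over-predicts gs,
# mirroring the bias reported in the abstract
wet = gs_with_psi(10.0, 0.7, 400.0, psi=-0.5)
dry = gs_with_psi(10.0, 0.7, 400.0, psi=-3.5)
assert dry < wet < ball_berry(10.0, 0.7, 400.0)
```

The comparison at the end shows the qualitative behavior the abstract describes: without the vulnerability term, predicted conductance stays high regardless of how negative the water potential becomes.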

  2. Improving MJO Prediction and Simulation Using AGCM Coupled Ocean Model with Refined Vertical Resolution

    Science.gov (United States)

    Tu, Chia-Ying; Tseng, Wan-Ling; Kuo, Pei-Hsuan; Lan, Yung-Yao; Tsuang, Ben-Jei; Hsu, Huang-Hsiung

    2017-04-01

Precipitation in the Taiwan area is significantly influenced by the MJO (Madden-Julian Oscillation) in the boreal winter. This study therefore tackles MJO prediction and simulation with a unique model structure. The one-dimensional TKE (Turbulence Kinetic Energy) type ocean model SIT (Snow, Ice, Thermocline), with refined vertical resolution near the surface, is able to resolve the cool skin as well as the diurnal warm layer. SIT can simulate accurate SST and hence capture precise air-sea interaction. By coupling SIT with ECHAM5 (MPI-Meteorology), CAM5 (NCAR) and HiRAM (GFDL), the MJO simulations in 20-yr climate integrations conducted by the three SIT-coupled AGCMs are significantly improved compared to those driven by prescribed SST. The horizontal resolutions of ECHAM5, CAM5 and HiRAM are 2-deg., 1-deg. and 0.5-deg., respectively. This suggests that the improvement of MJO simulation by coupling SIT is AGCM-resolution independent. This study further utilizes HiRAM coupled with SIT to evaluate its MJO forecast skill. HiRAM has been recognized as one of the best models for seasonal forecasts of hurricane/typhoon activity (Zhao et al., 2009; Chen & Lin, 2011; 2013), but was not as successful in MJO forecasting. The preliminary result of the HiRAM-SIT experiment during the DYNAMO period shows improved success in MJO forecasting. These improvements of MJO prediction and simulation in both hindcast experiments and climate integrations come mainly from the better-simulated SST diurnal cycle and diurnal amplitude, which is contributed by the refined vertical resolution near the ocean surface in SIT. Keywords: MJO Predictability, DYNAMO

  3. Scale invariance properties of intracerebral EEG improve seizure prediction in mesial temporal lobe epilepsy.

    Directory of Open Access Journals (Sweden)

    Kais Gadhoumi

Full Text Available Although treatment for epilepsy is available and effective for nearly 70 percent of patients, many remain in need of new therapeutic approaches. Predicting the impending seizures in these patients could significantly enhance their quality of life if the prediction performance is clinically practical. In this study, we investigate the improvement of the performance of a seizure prediction algorithm in 17 patients with mesial temporal lobe epilepsy by means of a novel measure. Scale-free dynamics of the intracerebral EEG are quantified through robust estimates of the scaling exponents--the first cumulants--derived from a wavelet leader and bootstrap based multifractal analysis. The cumulants are investigated for the discriminability between preictal and interictal epochs. The performance of our recently published patient-specific seizure prediction algorithm is then out-of-sample tested on long-lasting data using combinations of cumulants and state similarity measures previously introduced. By using the first cumulant in combination with state similarity measures, up to 13 of 17 patients had seizures predicted above chance with clinically practical levels of sensitivity (80.5%) and specificity (25.1% of total time under warning) for prediction horizons above 25 min. These results indicate that the scale-free dynamics of the preictal state are different from those of the interictal state. Quantifiers of these dynamics may carry a predictive power that can be used to improve seizure prediction performance.

  4. Introducing Model Predictive Control for Improving Power Plant Portfolio Performance

    DEFF Research Database (Denmark)

    Edlund, Kristian Skjoldborg; Bendtsen, Jan Dimon; Børresen, Simon

    2008-01-01

This paper introduces a model predictive control (MPC) approach for construction of a controller for balancing the power generation against consumption in a power system. The objective of the controller is to coordinate a portfolio consisting of multiple power plant units in the effort to perform...... reference tracking and disturbance rejection in an economically optimal way. The performance function is chosen as a mixture of the ℓ1-norm and a linear weighting to model the economics of the system. Simulations show a significant improvement of the performance of the MPC compared to the current....
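The flavor of an MPC with a mixed ℓ1/linear cost can be illustrated on a scalar toy system. Grid search over the first control move stands in for the LP/QP solver an actual implementation would use, and the dynamics, price, and horizon below are invented for illustration:

```python
import numpy as np

# minimal receding-horizon sketch: a scalar "portfolio production" x tracks a
# reference r; the stage cost mixes an l1 tracking term with a linear
# production cost, echoing the performance function in the abstract.
def mpc_step(x, r, horizon=5, price=0.01, u_grid=np.linspace(-1, 1, 41)):
    best_u, best_cost = 0.0, float("inf")
    for u in u_grid:                      # candidate first move, held constant
        xs = x + u * np.arange(1, horizon + 1)      # predicted trajectory
        cost = np.abs(xs - r).sum() + price * abs(u) * horizon
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u                         # apply only the first move

x, r = 0.0, 3.0
for _ in range(20):                       # receding-horizon loop
    x += mpc_step(x, r)
assert abs(x - r) < 0.2                   # tracks the reference
```

In a real portfolio controller the scalar state becomes a vector of unit outputs and the grid search becomes a constrained optimization, but the receding-horizon structure is the same.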

  5. Cohort-specific imputation of gene expression improves prediction of warfarin dose for African Americans.

    Science.gov (United States)

    Gottlieb, Assaf; Daneshjou, Roxana; DeGorter, Marianne; Bourgeois, Stephane; Svensson, Peter J; Wadelius, Mia; Deloukas, Panos; Montgomery, Stephen B; Altman, Russ B

    2017-11-24

Genome-wide association studies are useful for discovering genotype-phenotype associations but are limited because they require large cohorts to identify a signal, which can be population-specific. Mapping genetic variation to genes improves power and allows the effects of both protein-coding variation as well as variation in expression to be combined into "gene level" effects. Previous work has shown that warfarin dose can be predicted using information from genetic variation that affects protein-coding regions. Here, we introduce a method that improves dose prediction by integrating tissue-specific gene expression. In particular, we use drug pathways and expression quantitative trait loci knowledge to impute gene expression, on the assumption that differential expression of key pathway genes may impact dose requirement. We focus on 116 genes from the pharmacokinetic and pharmacodynamic pathways of warfarin within training and validation sets comprising both European and African-descent individuals. We build gene-tissue signatures associated with warfarin dose in a cohort-specific manner and identify a signature of 11 gene-tissue pairs that significantly augments the International Warfarin Pharmacogenetics Consortium dosage-prediction algorithm in both populations. Our results demonstrate that imputed expression can improve dose prediction and bridge population-specific compositions. MATLAB code is available at https://github.com/assafgo/warfarin-cohort.

  6. Improving consensus contact prediction via server correlation reduction.

    Science.gov (United States)

    Gao, Xin; Bu, Dongbo; Xu, Jinbo; Li, Ming

    2009-05-06

    Protein inter-residue contacts play a crucial role in the determination and prediction of protein structures. Previous studies on contact prediction indicate that although template-based consensus methods outperform sequence-based methods on targets with typical templates, such consensus methods perform poorly on new fold targets. However, we find out that even for new fold targets, the models generated by threading programs can contain many true contacts. The challenge is how to identify them. In this paper, we develop an integer linear programming model for consensus contact prediction. In contrast to the simple majority voting method assuming that all the individual servers are equally important and independent, the newly developed method evaluates their correlation by using maximum likelihood estimation and extracts independent latent servers from them by using principal component analysis. An integer linear programming method is then applied to assign a weight to each latent server to maximize the difference between true contacts and false ones. The proposed method is tested on the CASP7 data set. If the top L/5 predicted contacts are evaluated where L is the protein size, the average accuracy is 73%, which is much higher than that of any previously reported study. Moreover, if only the 15 new fold CASP7 targets are considered, our method achieves an average accuracy of 37%, which is much better than that of the majority voting method, SVM-LOMETS, SVM-SEQ, and SAM-T06. These methods demonstrate an average accuracy of 13.0%, 10.8%, 25.8% and 21.2%, respectively. Reducing server correlation and optimally combining independent latent servers show a significant improvement over the traditional consensus methods. This approach can hopefully provide a powerful tool for protein structure refinement and prediction use.
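The core idea of this record can be illustrated with synthetic data: an equal-weight consensus is diluted by two highly correlated, uninformative servers, while a small weight search that maximizes the separation between true and false contacts on training data recovers better weights. The brute-force simplex grid below is a crude stand-in for the paper's PCA plus integer-linear-programming step, and all data are invented:

```python
import itertools
import numpy as np

rng = np.random.default_rng(7)
truth = rng.integers(0, 2, 400)                 # 1 = true contact
good = truth + rng.normal(0, 0.6, 400)          # informative server score
weak = rng.normal(0, 0.6, 400)                  # uninformative server
weak2 = weak + rng.normal(0, 0.05, 400)         # near-duplicate of `weak`
servers = np.vstack([good, weak, weak2])

train, test = slice(0, 200), slice(200, 400)

def separation(w, idx):
    """Standardized gap between true- and false-contact consensus scores."""
    s = w @ servers[:, idx]
    t = truth[idx].astype(bool)
    return (s[t].mean() - s[~t].mean()) / s.std()

# brute-force search over a weight simplex grid (ILP stand-in)
grid = [w for w in itertools.product(np.linspace(0, 1, 11), repeat=3)
        if abs(sum(w) - 1) < 1e-9]
best = max(grid, key=lambda w: separation(np.array(w), train))

equal = np.array([1 / 3] * 3)                   # naive equal-weight consensus
assert separation(np.array(best), test) > separation(equal, test)
```

The learned weights concentrate on the informative server because the two correlated servers contribute variance without adding discriminative signal, which is exactly the failure mode of simple majority voting that the paper targets.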

  7. Improving consensus contact prediction via server correlation reduction

    Directory of Open Access Journals (Sweden)

    Xu Jinbo

    2009-05-01

    Full Text Available Abstract Background Protein inter-residue contacts play a crucial role in the determination and prediction of protein structures. Previous studies on contact prediction indicate that although template-based consensus methods outperform sequence-based methods on targets with typical templates, such consensus methods perform poorly on new fold targets. However, we find out that even for new fold targets, the models generated by threading programs can contain many true contacts. The challenge is how to identify them. Results In this paper, we develop an integer linear programming model for consensus contact prediction. In contrast to the simple majority voting method assuming that all the individual servers are equally important and independent, the newly developed method evaluates their correlation by using maximum likelihood estimation and extracts independent latent servers from them by using principal component analysis. An integer linear programming method is then applied to assign a weight to each latent server to maximize the difference between true contacts and false ones. The proposed method is tested on the CASP7 data set. If the top L/5 predicted contacts are evaluated where L is the protein size, the average accuracy is 73%, which is much higher than that of any previously reported study. Moreover, if only the 15 new fold CASP7 targets are considered, our method achieves an average accuracy of 37%, which is much better than that of the majority voting method, SVM-LOMETS, SVM-SEQ, and SAM-T06. These methods demonstrate an average accuracy of 13.0%, 10.8%, 25.8% and 21.2%, respectively. Conclusion Reducing server correlation and optimally combining independent latent servers show a significant improvement over the traditional consensus methods. This approach can hopefully provide a powerful tool for protein structure refinement and prediction use.

  8. Prediction and moderation of improvement in cognitive-behavioral and psychodynamic psychotherapy for panic disorder.

    Science.gov (United States)

    Chambless, Dianne L; Milrod, Barbara; Porter, Eliora; Gallop, Robert; McCarthy, Kevin S; Graf, Elizabeth; Rudden, Marie; Sharpless, Brian A; Barber, Jacques P

    2017-08-01

    To identify variables predicting psychotherapy outcome for panic disorder or indicating which of 2 very different forms of psychotherapy-panic-focused psychodynamic psychotherapy (PFPP) or cognitive-behavioral therapy (CBT)-would be more effective for particular patients. Data were from 161 adults participating in a randomized controlled trial (RCT) including these psychotherapies. Patients included 104 women; 118 patients were White, 33 were Black, and 10 were of other races; 24 were Latino(a). Predictors/moderators measured at baseline or by Session 2 of treatment were used to predict change on the Panic Disorder Severity Scale (PDSS). Higher expectancy for treatment gains (Credibility/Expectancy Questionnaire d = -1.05, CI 95% [-1.50, -0.60]), and later age of onset (d = -0.65, CI 95% [-0.98, -0.32]) were predictive of greater change. Both variables were also significant moderators: patients with low expectancy of improvement improved significantly less in PFPP than their counterparts in CBT, whereas this was not the case for patients with average or high levels of expectancy. When patients had an onset of panic disorder later in life (≥27.5 years old), they fared as well in PFPP as CBT. In contrast, at low and mean levels of onset age, CBT was the more effective treatment. Predictive variables suggest possibly fruitful foci for improvement of treatment outcome. In terms of moderation, CBT was the more consistently effective treatment, but moderators identified some patients who would do as well in PFPP as in CBT, thereby widening empirically supported options for treatment of this disorder. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. An Improved Optimal Slip Ratio Prediction considering Tyre Inflation Pressure Changes

    Directory of Open Access Journals (Sweden)

    Guoxing Li

    2015-01-01

    Full Text Available The prediction of optimal slip ratio is crucial to vehicle control systems. Many studies have verified there is a definitive impact of tyre pressure change on the optimal slip ratio. However, the existing method of optimal slip ratio prediction has not taken into account the influence of tyre pressure changes. By introducing a second-order factor, an improved optimal slip ratio prediction considering tyre inflation pressure is proposed in this paper. In order to verify and evaluate the performance of the improved prediction, a cosimulation platform is developed by using MATLAB/Simulink and CarSim software packages, achieving a comprehensive simulation study of vehicle braking performance cooperated with an ABS controller. The simulation results show that the braking distances and braking time under different tyre pressures and initial braking speeds are effectively shortened with the improved prediction of optimal slip ratio. When the tyre pressure is slightly lower than the nominal pressure, the difference of braking performances between original optimal slip ratio and improved optimal slip ratio is the most obvious.
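A minimal form of such a pressure-aware correction is a base optimal slip ratio scaled by a second-order polynomial in the relative pressure deviation. The coefficients and nominal values below are hypothetical placeholders, not the paper's fitted values:

```python
def optimal_slip_ratio(pressure, p_nominal=2.4, s0=0.17, k1=-0.08, k2=0.35):
    """Optimal slip ratio with a second-order tyre-pressure correction.

    pressure, p_nominal: inflation pressure in bar; s0: optimal slip ratio
    at nominal pressure; k1, k2: hypothetical fitted coefficients.
    """
    dp = (pressure - p_nominal) / p_nominal   # relative pressure deviation
    return s0 * (1.0 + k1 * dp + k2 * dp ** 2)

# at nominal inflation pressure the correction vanishes
assert abs(optimal_slip_ratio(2.4) - 0.17) < 1e-12
# slight under-inflation shifts the optimum, the case the abstract highlights
print(round(optimal_slip_ratio(2.0), 4))      # prints 0.1739
```

An ABS controller would evaluate this function online from a measured or estimated tyre pressure instead of assuming a fixed optimal slip ratio.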

  10. Combining sequence-based prediction methods and circular dichroism and infrared spectroscopic data to improve protein secondary structure determinations

    Directory of Open Access Journals (Sweden)

    Lees Jonathan G

    2008-01-01

Full Text Available Abstract Background A number of sequence-based methods exist for protein secondary structure prediction. Protein secondary structures can also be determined experimentally from circular dichroism, and infrared spectroscopic data using empirical analysis methods. It has been proposed that comparable accuracy can be obtained from sequence-based predictions as from these biophysical measurements. Here we have examined the secondary structure determination accuracies of sequence prediction methods with the empirically determined values from the spectroscopic data on datasets of proteins for which both crystal structures and spectroscopic data are available. Results In this study we show that the sequence prediction methods have accuracies nearly comparable to those of spectroscopic methods. However, we also demonstrate that combining the spectroscopic and sequence techniques produces significant overall improvements in secondary structure determinations. In addition, combining the extra information content available from synchrotron radiation circular dichroism data with sequence methods also shows improvements. Conclusion Combining sequence prediction with experimentally determined spectroscopic methods for protein secondary structure content significantly enhances the accuracy of the overall results obtained.

  11. Recent Improvements in IERS Rapid Service/Prediction Center Products

    National Research Council Canada - National Science Library

    Stamatakos, N; Luzum, B; Wooden, W

    2007-01-01

...) at USNO has made several improvements to its combination and prediction products. These improvements are due to the inclusion of new input data sources as well as modifications to the combination and prediction algorithms...

  12. P-wave characteristics on routine preoperative electrocardiogram improve prediction of new-onset postoperative atrial fibrillation in cardiac surgery.

    Science.gov (United States)

    Wong, Jim K; Lobato, Robert L; Pinesett, Andre; Maxwell, Bryan G; Mora-Mangano, Christina T; Perez, Marco V

    2014-12-01

To test the hypothesis that including preoperative electrocardiogram (ECG) characteristics with clinical variables significantly improves the new-onset postoperative atrial fibrillation prediction model. Retrospective analysis. Single-center university hospital. Five hundred twenty-six patients, ≥ 18 years of age, who underwent coronary artery bypass grafting, aortic valve replacement, mitral valve replacement/repair, or a combination of valve surgery and coronary artery bypass grafting requiring cardiopulmonary bypass. Retrospective review of medical records. Baseline characteristics and cardiopulmonary bypass times were collected. Digitally-measured timing and voltages from preoperative electrocardiograms were extracted. Postoperative atrial fibrillation was defined as atrial fibrillation requiring therapeutic intervention. Two hundred eight (39.5%) patients developed postoperative atrial fibrillation. Clinical predictors included age and ejection fraction. Adding electrocardiogram variables to the prediction model with only clinical predictors significantly improved the area under the receiver operating characteristic curve, from 0.71 to 0.78 (p<0.01). Overall net reclassification improvement was 0.059 (p = 0.09). Among those who developed postoperative atrial fibrillation, the net reclassification improvement was 0.063 (p = 0.03). Several P-wave characteristics are independently associated with postoperative atrial fibrillation. Addition of these parameters improves the postoperative atrial fibrillation prediction model. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Cohort-specific imputation of gene expression improves prediction of warfarin dose for African Americans

    Directory of Open Access Journals (Sweden)

    Assaf Gottlieb

    2017-11-01

    Full Text Available Abstract Background Genome-wide association studies are useful for discovering genotype–phenotype associations but are limited because they require large cohorts to identify a signal, which can be population-specific. Mapping genetic variation to genes improves power and allows the effects of both protein-coding variation as well as variation in expression to be combined into “gene level” effects. Methods Previous work has shown that warfarin dose can be predicted using information from genetic variation that affects protein-coding regions. Here, we introduce a method that improves dose prediction by integrating tissue-specific gene expression. In particular, we use drug pathways and expression quantitative trait loci knowledge to impute gene expression—on the assumption that differential expression of key pathway genes may impact dose requirement. We focus on 116 genes from the pharmacokinetic and pharmacodynamic pathways of warfarin within training and validation sets comprising both European and African-descent individuals. Results We build gene-tissue signatures associated with warfarin dose in a cohort-specific manner and identify a signature of 11 gene-tissue pairs that significantly augments the International Warfarin Pharmacogenetics Consortium dosage-prediction algorithm in both populations. Conclusions Our results demonstrate that imputed expression can improve dose prediction and bridge population-specific compositions. MATLAB code is available at https://github.com/assafgo/warfarin-cohort

  14. Catchment coevolution: A useful framework for improving predictions of hydrological change?

    Science.gov (United States)

    Troch, Peter A.

    2017-04-01

The notion that landscape features have co-evolved over time is well known in the Earth sciences. Hydrologists have recently called for a more rigorous connection between emerging spatial patterns of landscape features and the hydrological response of catchments, and have termed this concept catchment coevolution. In this presentation we present a general framework of catchment coevolution that could improve predictions of hydrologic change. We first present empirical evidence of the interaction and feedback of landscape evolution and changes in hydrological response. From this review it is clear that the independent drivers of catchment coevolution are climate, geology, and tectonics. We identify common currency that allows comparing the levels of activity of these independent drivers, such that, at least conceptually, we can quantify the rate of evolution or aging. Knowing the hydrologic age of a catchment by itself is not very meaningful without linking age to hydrologic response. Two avenues of investigation have been used to understand the relationship between (differences in) age and hydrological response: (i) one that is based on relating present landscape features to runoff processes that are hypothesized to be responsible for the current fingerprints in the landscape; and (ii) one that takes advantage of an experimental design known as space-for-time substitution. Both methods have yielded significant insights into the hydrologic response of landscapes with different histories. If we want to make accurate predictions of hydrologic change, we will also need to be able to predict how the catchment will further coevolve in association with changes in the activity levels of the drivers (e.g., climate). There is ample evidence in the literature that suggests that whole-system prediction of catchment coevolution is, at least in principle, plausible. With this imperative we outline a research agenda that implements the concepts of catchment coevolution for building

  15. Machine Learning Principles Can Improve Hip Fracture Prediction

    DEFF Research Database (Denmark)

    Kruse, Christian; Eiken, Pia; Vestergaard, Peter

    2017-01-01

Apply machine learning principles to predict hip fractures and estimate predictor importance in Dual-energy X-ray absorptiometry (DXA)-scanned men and women. Dual-energy X-ray absorptiometry data from two Danish regions between 1996 and 2006 were combined with national Danish patient data... 0.89 [0.82; 0.95], but with poor calibration in higher probabilities. A ten-predictor subset (BMD, biochemical cholesterol and liver function tests, penicillin use and osteoarthritis diagnoses) achieved a test AUC of 0.86 [0.78; 0.94] using an “xgbTree” model. Machine learning can improve hip fracture...... prediction beyond logistic regression using ensemble models. Compiling data from international cohorts of longer follow-up and performing similar machine learning procedures has the potential to further improve discrimination and calibration....

  16. Predictability of extreme weather events for NE U.S.: improvement of the numerical prediction using a Bayesian regression approach

    Science.gov (United States)

    Yang, J.; Astitha, M.; Anagnostou, E. N.; Hartman, B.; Kallos, G. B.

    2015-12-01

Weather prediction accuracy has become very important for the Northeast U.S. given the devastating effects of extreme weather events in recent years. Weather forecasting systems are used to build strategies that prevent catastrophic losses of human life and damage to the environment. Concurrently, weather forecast tools and techniques have evolved with improved forecast skill as numerical prediction techniques are strengthened by increased super-computing resources. In this study, we examine the combination of two state-of-the-science atmospheric models (WRF and RAMS/ICLAMS) by utilizing a Bayesian regression approach to improve the prediction of extreme weather events for the NE U.S. The basic concept behind the Bayesian regression approach is to take advantage of the strengths of the two atmospheric modeling systems and, similar to the multi-model ensemble approach, limit their weaknesses, which are related to systematic and random errors in the numerical prediction of physical processes. The first part of this study is focused on retrospective simulations of seventeen storms that affected the region in the period 2004-2013. Optimal variances are estimated by minimizing the root mean square error and are applied to out-of-sample weather events. The applicability and usefulness of this approach are demonstrated by conducting an error analysis based on in-situ observations from meteorological stations of the National Weather Service (NWS) for wind speed and wind direction, and on NCEP Stage IV multi-sensor mosaicked radar data for precipitation. The preliminary results indicate a significant improvement in the statistical metrics of the modeled-observed pairs for meteorological variables using various combinations of the sixteen events as predictors of the seventeenth. This presentation will illustrate the implemented methodology and the obtained results for wind speed, wind direction and precipitation, as well as set the research steps that will be
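The core of an error-variance-minimizing combination of two forecast models can be sketched with synthetic data: weights are estimated from "retrospective" training events and applied out of sample. The error variances, sample sizes, and variable ranges below are invented for illustration and are not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(42)
truth = rng.uniform(5, 25, 2000)                 # e.g. observed wind speed (m/s)
model_a = truth + rng.normal(0, 1.0, 2000)       # lower-error "model A" forecast
model_b = truth + rng.normal(0, 2.0, 2000)       # higher-error "model B" forecast

train, test = slice(0, 1000), slice(1000, 2000)

# estimate error variances on retrospective events, weight inversely
var_a = np.var(model_a[train] - truth[train])
var_b = np.var(model_b[train] - truth[train])
w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
combined = w_a * model_a[test] + (1 - w_a) * model_b[test]

def rmse(pred, obs):
    return np.sqrt(np.mean((pred - obs) ** 2))

# the weighted combination beats either member on out-of-sample events
assert rmse(combined, truth[test]) < rmse(model_a[test], truth[test])
assert rmse(combined, truth[test]) < rmse(model_b[test], truth[test])
```

With independent errors, inverse-variance weighting provably reduces the combined error variance below that of the better member, which is the intuition behind limiting each model's weaknesses while keeping its strengths.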

  17. The use of patient factors to improve the prediction of operative duration using laparoscopic cholecystectomy.

    Science.gov (United States)

    Thiels, Cornelius A; Yu, Denny; Abdelrahman, Amro M; Habermann, Elizabeth B; Hallbeck, Susan; Pasupathy, Kalyan S; Bingener, Juliane

    2017-01-01

Reliable prediction of operative duration is essential for improving patient and care team satisfaction, optimizing resource utilization and reducing cost. Current operative scheduling systems are unreliable and contribute to costly over- and underestimation of operative time. We hypothesized that the inclusion of patient-specific factors would improve the accuracy in predicting operative duration. We reviewed all elective laparoscopic cholecystectomies performed at a single institution between 01/2007 and 06/2013. Concurrent procedures were excluded. Univariate analysis evaluated the effect of age, gender, BMI, ASA, laboratory values, smoking, and comorbidities on operative duration. Multivariable linear regression models were constructed using the significant factors (p < 0.05) and compared with the historical surgeon-specific and procedure-specific operative duration. External validation was done using the ACS-NSQIP database (n = 11,842). A total of 1801 laparoscopic cholecystectomy patients met inclusion criteria. Female sex was associated with reduced operative duration (-7.5 min, p < 0.001 vs. male sex) while increasing BMI (+5.1 min BMI 25-29.9, +6.9 min BMI 30-34.9, +10.4 min BMI 35-39.9, +17.0 min BMI 40+, all p < 0.05 vs. normal BMI), increasing ASA (+7.4 min ASA III, +38.3 min ASA IV, all p < 0.01 vs. ASA I), and elevated liver function tests (+7.9 min, p < 0.01 vs. normal) were predictive of increased operative duration on univariate analysis. A model was then constructed using these predictive factors. The traditional surgical scheduling system was poorly predictive of actual operative duration (R² = 0.001) compared to the patient factors model (R² = 0.08). The model remained predictive on external validation (R² = 0.14). The addition of surgeon as a variable in the institutional model further improved the predictive ability of the model (R² = 0.18). The use of routinely available pre-operative patient factors improves the prediction of operative
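The comparison the record describes, a flat historical-average scheduler versus a least-squares model on patient factors, can be sketched on synthetic data. The effect directions echo the abstract (female shorter; higher BMI and ASA longer), but every coefficient and distribution below is invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1800
female = rng.integers(0, 2, n).astype(float)     # synthetic patient factors
bmi = rng.normal(29, 5, n)
asa3 = rng.integers(0, 2, n).astype(float)

# synthetic "true" operative duration (minutes); coefficients are invented
duration = (90 - 7.5 * female + 1.2 * (bmi - 25) + 7.4 * asa3
            + rng.normal(0, 15, n))

def r2(pred, obs):
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1 - ss_res / ss_tot

# "traditional scheduling": a single historical average for the procedure
baseline = np.full(n, duration.mean())

# patient-factors model: ordinary least squares on the three factors
X = np.column_stack([np.ones(n), female, bmi, asa3])
beta, *_ = np.linalg.lstsq(X, duration, rcond=None)
model = X @ beta

assert r2(baseline, duration) == 0.0             # flat estimate explains nothing
assert r2(model, duration) > 0.05                # patient factors add signal
```

As in the study, most of the variance remains unexplained (individual surgical variability dominates), but even a few routinely available factors lift R² above the flat historical estimate.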

  18. A national-scale model of linear features improves predictions of farmland biodiversity.

    Science.gov (United States)

    Sullivan, Martin J P; Pearce-Higgins, James W; Newson, Stuart E; Scholefield, Paul; Brereton, Tom; Oliver, Tom H

    2017-12-01

Modelling species distribution and abundance is important for many conservation applications, but it is typically performed using relatively coarse-scale environmental variables such as the area of broad land-cover types. Fine-scale environmental data capturing the most biologically relevant variables have the potential to improve these models. For example, field studies have demonstrated the importance of linear features, such as hedgerows, for multiple taxa, but the absence of large-scale datasets of their extent prevents their inclusion in large-scale modelling studies. We assessed whether a novel spatial dataset mapping linear and woody-linear features across the UK improves the performance of abundance models of 18 bird and 24 butterfly species across 3723 and 1547 UK monitoring sites, respectively. Although improvements in explanatory power were small, the inclusion of linear features data significantly improved model predictive performance for many species. For some species, the importance of linear features depended on landscape context, with greater importance in agricultural areas. Synthesis and applications. This study demonstrates that a national-scale model of the extent and distribution of linear features improves predictions of farmland biodiversity. The ability to model spatial variability in the role of linear features such as hedgerows will be important in targeting agri-environment schemes to maximally deliver biodiversity benefits. Although this study focuses on farmland, data on the extent of different linear features are likely to improve species distribution and abundance models in a wide range of systems and also can potentially be used to assess habitat connectivity.

  19. Improvement of energy expenditure prediction from heart rate during running

    International Nuclear Information System (INIS)

    Charlot, Keyne; Borne, Rachel; Richalet, Jean-Paul; Chapelot, Didier; Pichon, Aurélien; Cornolo, Jérémy; Brugniaux, Julien Vincent

    2014-01-01

We aimed to develop new equations that predict exercise-induced energy expenditure (EE) during running more accurately than previous ones by including new parameters such as fitness level, body composition and/or running intensity in addition to heart rate (HR). Original equations predicting EE were created from data obtained during three running intensities (25%, 50% and 70% of HR reserve) performed by 50 subjects. Five equations were retained according to their accuracy, assessed from error rates, interchangeability and correlation analyses: one containing only basic parameters, two containing VO2max or speed at VO2max, and two including running speed with or without HR. Equation accuracy was further tested in an independent sample during a 40 min validation test at 50% of HR reserve. It appeared that: (1) the new basic equation was more accurate than pre-existing equations (R² = 0.809 vs. 0.737, respectively); (2) the prediction of EE was more accurate with the addition of VO2max (R² = 0.879); and (3) the equations containing running speed were the most accurate and were considered to have good agreement with indirect calorimetry. In conclusion, EE estimation during running might be significantly improved by including running speed in the predictive models, a parameter readily available with a treadmill or GPS. (paper)

  20. Towards an improved prediction of the free radical scavenging potency of flavonoids: the significance of double PCET mechanisms.

    Science.gov (United States)

    Amić, Ana; Marković, Zoran; Dimitrić Marković, Jasmina M; Stepanić, Višnja; Lučić, Bono; Amić, Dragan

    2014-01-01

The 1H(+)/1e(-) and 2H(+)/2e(-) proton-coupled electron transfer (PCET) processes of free radical scavenging by flavonoids were theoretically studied for aqueous and lipid environments using the PM6 and PM7 methods. The results reported here indicate that the significant contribution of the second PCET mechanism, resulting in the formation of a quinone/quinone methide, effectively discriminates the active from inactive flavonoids. The predictive potency of descriptors related to the energetics of second PCET mechanisms (the second O-H bond dissociation enthalpy (BDE2) related to the hydrogen atom transfer (HAT) mechanism, and the second electron transfer enthalpy (ETE2) related to the sequential proton loss electron transfer (SPLET) mechanism) is superior to the currently used indices, which are related to the first 1H(+)/1e(-) processes, and these descriptors could serve as primary descriptors in the development of QSARs (quantitative structure-activity relationships) for flavonoids. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Predicting subscriber dissatisfaction and improving retention in the wireless telecommunications industry.

    Science.gov (United States)

    Mozer, M C; Wolniewicz, R; Grimes, D B; Johnson, E; Kaushansky, H

    2000-01-01

Competition in the wireless telecommunications industry is fierce. To maintain profitability, wireless carriers must control churn, which is the loss of subscribers who switch from one carrier to another. We explore techniques from statistical machine learning to predict churn and, based on these predictions, to determine what incentives should be offered to subscribers to improve retention and maximize profitability to the carrier. The techniques include logit regression, decision trees, neural networks, and boosting. Our experiments are based on a database of nearly 47,000 U.S. domestic subscribers that includes information about their usage, billing, credit, application, and complaint history. Our experiments show that under a wide variety of assumptions concerning the cost of intervention and the retention rate resulting from intervention, using predictive techniques to identify potential churners and offering incentives can yield significant savings to a carrier. We also show the importance of a data representation crafted by domain experts. Finally, we report on a real-world test of the techniques that validates our simulation experiments.

  2. Improvement of Risk Prediction After Transcatheter Aortic Valve Replacement by Combining Frailty With Conventional Risk Scores.

    Science.gov (United States)

    Schoenenberger, Andreas W; Moser, André; Bertschi, Dominic; Wenaweser, Peter; Windecker, Stephan; Carrel, Thierry; Stuck, Andreas E; Stortecky, Stefan

    2018-02-26

This study sought to evaluate whether frailty improves mortality prediction in combination with the conventional scores. Neither the European System for Cardiac Operative Risk Evaluation (EuroSCORE) nor the Society of Thoracic Surgeons (STS) score has been evaluated in combined models with frailty for mortality prediction after transcatheter aortic valve replacement (TAVR). This prospective cohort comprised 330 consecutive TAVR patients ≥70 years of age. Conventional scores and a frailty index (based on assessment of cognition, mobility, nutrition, and activities of daily living) were evaluated to predict 1-year all-cause mortality using Cox proportional hazards regression (providing hazard ratios [HRs] with confidence intervals [CIs]) and measures of test performance (providing likelihood ratio [LR] chi-square test statistic and C-statistic). All risk scores were predictive of the outcome (EuroSCORE, HR: 1.90 [95% CI: 1.45 to 2.48], LR chi-square test statistic 19.29, C-statistic 0.67; STS score, HR: 1.51 [95% CI: 1.21 to 1.88], LR chi-square test statistic 11.05, C-statistic 0.64; frailty index, HR: 3.29 [95% CI: 1.98 to 5.47], LR chi-square test statistic 22.28, C-statistic 0.66). A combination of the frailty index with either EuroSCORE (LR chi-square test statistic 38.27, C-statistic 0.72) or STS score (LR chi-square test statistic 28.71, C-statistic 0.68) improved mortality prediction. The frailty index accounted for 58.2% and 77.6% of the predictive information in the combined model with EuroSCORE and STS score, respectively. Net reclassification improvement and integrated discrimination improvement confirmed that the added frailty index improved risk prediction. This is the first study showing that the assessment of frailty significantly enhances prediction of 1-year mortality after TAVR in combined risk models with conventional risk scores and contributes substantially to this improvement. Copyright © 2018 American College of Cardiology Foundation.

  3. SU-D-BRB-02: Combining a Commercial Autoplanning Engine with Database Dose Predictions to Further Improve Plan Quality

    Energy Technology Data Exchange (ETDEWEB)

Robertson, SP; Moore, JA; Hui, X; Cheng, Z; McNutt, TR [Johns Hopkins University, Baltimore, MD (United States); DeWeese, TL; Tran, P; Quon, H [Johns Hopkins Hospital, Baltimore, MD (United States); Bzdusek, K [Philips, Fitchburg, WI (United States); Kumar, P [Philips India Limited, Bangalore, Karnataka (India)

    2016-06-15

Purpose: Database dose predictions and a commercial autoplanning engine both improve treatment plan quality in different but complementary ways. The combination of these planning techniques is hypothesized to further improve plan quality. Methods: Four treatment plans were generated for each of 10 head and neck (HN) and 10 prostate cancer patients, including Plan-A: traditional IMRT optimization using clinically relevant default objectives; Plan-B: traditional IMRT optimization using database dose predictions; Plan-C: autoplanning using default objectives; and Plan-D: autoplanning using database dose predictions. One optimization was used for each planning method. Dose distributions were normalized to 95% of the planning target volume (prostate: 8000 cGy; HN: 7000 cGy). Objectives used in plan optimization and analysis were the larynx (25%, 50%, 90%), left and right parotid glands (50%, 85%), spinal cord (0%, 50%), rectum and bladder (0%, 20%, 50%, 80%), and left and right femoral heads (0%, 70%). Results: All objectives except larynx 25% and 50% resulted in statistically significant differences between plans (Friedman’s χ² ≥ 11.2; p ≤ 0.011). Maximum dose to the rectum (Plans A-D: 8328, 8395, 8489, 8537 cGy) and bladder (Plans A-D: 8403, 8448, 8527, 8569 cGy) were significantly increased. All other significant differences reflected a decrease in dose. Plans B-D were significantly different from Plan-A for 3, 17, and 19 objectives, respectively. Plans C-D were also significantly different from Plan-B for 8 and 13 objectives, respectively. In one case (cord 50%), Plan-D provided significantly lower dose than Plan-C (p = 0.003). Conclusion: Combining database dose predictions with a commercial autoplanning engine resulted in significant plan quality differences for the greatest number of objectives. This translated to plan quality improvements in most cases, although special care may be needed for maximum dose constraints. Further evaluation is warranted.

  4. Significant improvement in the thermal annealing process of optical resonators

    Science.gov (United States)

    Salzenstein, Patrice; Zarubin, Mikhail

    2017-05-01

Thermal annealing performed during fabrication improves the surface roughness of optical resonators, reducing stresses at the periphery of their surfaces and thus allowing higher Q-factors. After a preliminary realization, the design of the oven and the electronic method were significantly improved thanks to nichrome resistance-alloy wires and to chopped basalt fibers used for thermal insulation during the annealing process. Q-factors can then be improved.

  5. Comparison of measured and predicted thermal mixing tests using improved finite difference technique

    International Nuclear Information System (INIS)

    Hassan, Y.A.; Rice, J.G.; Kim, J.H.

    1983-01-01

The numerical diffusion introduced by the use of upwind formulations in the finite difference solution of the flow and energy equations for thermal mixing problems (cold water injection after small break LOCA in a PWR) was examined. The relative importance of numerical diffusion in the flow equations, compared to its effect on the energy equation, was demonstrated. The flow field equations were solved using both first-order accurate upwind and second-order accurate differencing schemes. The energy equation was treated using the conventional upwind and a mass-weighted skew upwind scheme. Results presented for a simple test case showed that, for thermal mixing problems, the numerical diffusion was most significant in the energy equation. The numerical diffusion effect in the flow field equations was much less significant. A comparison of predictions using the skew upwind and the conventional upwind schemes with experimental data from a two-dimensional thermal mixing test is presented. The use of the skew upwind scheme showed a significant improvement in the accuracy of the steady-state predicted temperatures. (orig./HP)

  6. Improving the Prediction of Total Surgical Procedure Time Using Linear Regression Modeling

    Directory of Open Access Journals (Sweden)

    Eric R. Edelman

    2017-06-01

Full Text Available For efficient utilization of operating rooms (ORs), accurate schedules of assigned block time and sequences of patient cases need to be made. The quality of these planning tools is dependent on the accurate prediction of total procedure time (TPT) per case. In this paper, we attempt to improve the accuracy of TPT predictions by using linear regression models based on estimated surgeon-controlled time (eSCT) and other variables relevant to TPT. We extracted data from a Dutch benchmarking database of all surgeries performed in six academic hospitals in The Netherlands from 2012 till 2016. The final dataset consisted of 79,983 records, describing 199,772 h of total OR time. Potential predictors of TPT that were included in the subsequent analysis were eSCT, patient age, type of operation, American Society of Anesthesiologists (ASA) physical status classification, and type of anesthesia used. First, we computed the predicted TPT based on a previously described fixed ratio model for each record, multiplying eSCT by 1.33. This number is based on the research performed by van Veen-Berkx et al., which showed that 33% of SCT is generally a good approximation of anesthesia-controlled time (ACT). We then systematically tested all possible linear regression models to predict TPT using eSCT in combination with the other available independent variables. In addition, all regression models were again tested without eSCT as a predictor to predict ACT separately (which leads to TPT by adding SCT). TPT was most accurately predicted using a linear regression model based on the independent variables eSCT, type of operation, ASA classification, and type of anesthesia. This model performed significantly better than the fixed ratio model and the method of predicting ACT separately. Making use of these more accurate predictions in planning and sequencing algorithms may enable an increase in utilization of ORs, leading to significant financial and productivity-related benefits.

  7. Improving the Prediction of Total Surgical Procedure Time Using Linear Regression Modeling.

    Science.gov (United States)

    Edelman, Eric R; van Kuijk, Sander M J; Hamaekers, Ankie E W; de Korte, Marcel J M; van Merode, Godefridus G; Buhre, Wolfgang F F A

    2017-01-01

    For efficient utilization of operating rooms (ORs), accurate schedules of assigned block time and sequences of patient cases need to be made. The quality of these planning tools is dependent on the accurate prediction of total procedure time (TPT) per case. In this paper, we attempt to improve the accuracy of TPT predictions by using linear regression models based on estimated surgeon-controlled time (eSCT) and other variables relevant to TPT. We extracted data from a Dutch benchmarking database of all surgeries performed in six academic hospitals in The Netherlands from 2012 till 2016. The final dataset consisted of 79,983 records, describing 199,772 h of total OR time. Potential predictors of TPT that were included in the subsequent analysis were eSCT, patient age, type of operation, American Society of Anesthesiologists (ASA) physical status classification, and type of anesthesia used. First, we computed the predicted TPT based on a previously described fixed ratio model for each record, multiplying eSCT by 1.33. This number is based on the research performed by van Veen-Berkx et al., which showed that 33% of SCT is generally a good approximation of anesthesia-controlled time (ACT). We then systematically tested all possible linear regression models to predict TPT using eSCT in combination with the other available independent variables. In addition, all regression models were again tested without eSCT as a predictor to predict ACT separately (which leads to TPT by adding SCT). TPT was most accurately predicted using a linear regression model based on the independent variables eSCT, type of operation, ASA classification, and type of anesthesia. This model performed significantly better than the fixed ratio model and the method of predicting ACT separately. Making use of these more accurate predictions in planning and sequencing algorithms may enable an increase in utilization of ORs, leading to significant financial and productivity related benefits.
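The fixed-ratio baseline both records above start from is simple arithmetic: ACT is taken to be 33% of SCT, so TPT ≈ 1.33 × eSCT. A minimal sketch, with illustrative case durations rather than data from the Dutch benchmarking database:

```python
# Fixed-ratio baseline from the abstracts: anesthesia-controlled time (ACT)
# is approximated as 33% of surgeon-controlled time (SCT), so
# TPT = eSCT + 0.33 * eSCT = 1.33 * eSCT.
def predict_tpt_fixed_ratio(esct_minutes: float) -> float:
    """Predict total procedure time from estimated surgeon-controlled time."""
    return 1.33 * esct_minutes

# Hypothetical eSCT values in minutes (not from the study).
cases = [60, 90, 120]
predictions = [predict_tpt_fixed_ratio(t) for t in cases]
print(predictions)  # about 79.8, 119.7 and 159.6 minutes
```

The paper's improved models replace this single ratio with a linear regression on eSCT, type of operation, ASA classification, and type of anesthesia.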

  8. Omega-3 fatty acid therapy dose-dependently and significantly decreased triglycerides and improved flow-mediated dilation, however, did not significantly improve insulin sensitivity in patients with hypertriglyceridemia.

    Science.gov (United States)

    Oh, Pyung Chun; Koh, Kwang Kon; Sakuma, Ichiro; Lim, Soo; Lee, Yonghee; Lee, Seungik; Lee, Kyounghoon; Han, Seung Hwan; Shin, Eak Kyun

    2014-10-20

Experimental studies demonstrate that higher intake of omega-3 fatty acids (n-3 FA) improves insulin sensitivity; however, we reported that n-3 FA 2 g therapy, the most commonly used dosage, did not significantly improve insulin sensitivity in patients despite reducing triglycerides by 21%. Therefore, we investigated the effects of different dosages of n-3 FA in patients with hypertriglyceridemia. This was a randomized, single-blind, placebo-controlled, parallel study. Age, sex, and body mass index were matched among groups. All patients were recommended to maintain a low fat diet. Forty-four patients (about 18 had metabolic syndrome/type 2 diabetes mellitus) in each group were given placebo, n-3 FA 1 (O1), 2 (O2), or 4 g (O4), respectively, daily for 2 months. n-3 FA therapy dose-dependently and significantly decreased triglycerides and triglycerides/HDL cholesterol and improved flow-mediated dilation, compared with placebo (by ANOVA). However, each n-3 FA therapy did not significantly decrease high-sensitivity C-reactive protein and fibrinogen, compared with placebo. O1 significantly increased insulin levels and decreased insulin sensitivity (determined by QUICKI) and O2 significantly decreased plasma adiponectin levels relative to baseline measurements. Of note, when compared with placebo, each n-3 FA therapy did not significantly change insulin, glucose, adiponectin, or glycated hemoglobin levels or insulin sensitivity (by ANOVA). We observed similar results in a subgroup of patients with the metabolic syndrome. n-3 FA therapy dose-dependently and significantly decreased triglycerides and improved flow-mediated dilation. Nonetheless, n-3 FA therapy did not significantly improve acute-phase reactants and insulin sensitivity in patients with hypertriglyceridemia, regardless of dosages. Copyright © 2014. Published by Elsevier Ireland Ltd.

  9. Prediction of earth rotation parameters based on improved weighted least squares and autoregressive model

    Directory of Open Access Journals (Sweden)

    Sun Zhangzhen

    2012-08-01

Full Text Available In this paper, an improved weighted least squares (WLS) method, together with an autoregressive (AR) model, is proposed to improve the prediction accuracy of earth rotation parameters (ERP). Four weighting schemes are developed and the optimal power e for determining the weight elements is studied. The results show that the improved WLS-AR model can improve ERP prediction accuracy effectively, and that for different prediction intervals of ERP, different weighting schemes should be chosen.
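The abstract does not give the four weighting schemes, but the deterministic part of the computation is ordinary weighted least squares. A minimal sketch of a closed-form WLS straight-line fit, with a hypothetical power-law weight w_i = (i+1)^e standing in for the schemes studied (the power e is the quantity the paper tunes):

```python
# Closed-form weighted least squares fit of y = a + b*x.
# The power-law weight scheme below is a hypothetical example that
# up-weights recent epochs; it is not taken from the paper.
def wls_line_fit(x, y, w):
    Sw = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (Sw * Sxy - Sx * Sy) / (Sw * Sxx - Sx * Sx)
    a = (Sy - b * Sx) / Sw
    return a, b

x = list(range(10))               # epoch index
y = [2.0 + 0.5 * xi for xi in x]  # noise-free toy ERP series
e = 2.0                           # hypothetical weighting power
w = [(i + 1) ** e for i in range(10)]
a, b = wls_line_fit(x, y, w)
print(a, b)  # recovers intercept 2.0 and slope 0.5 on this exact line
```

In the WLS-AR scheme, an AR model fitted to the residuals of such a deterministic fit then extrapolates the series beyond the last epoch.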

  10. Minimotif Miner 3.0: database expansion and significantly improved reduction of false-positive predictions from consensus sequences.

    Science.gov (United States)

    Mi, Tian; Merlin, Jerlin Camilus; Deverasetty, Sandeep; Gryk, Michael R; Bill, Travis J; Brooks, Andrew W; Lee, Logan Y; Rathnayake, Viraj; Ross, Christian A; Sargeant, David P; Strong, Christy L; Watts, Paula; Rajasekaran, Sanguthevar; Schiller, Martin R

    2012-01-01

Minimotif Miner (MnM, available at http://minimotifminer.org or http://mnm.engr.uconn.edu) is an online database for identifying new minimotifs in protein queries. Minimotifs are short contiguous peptide sequences that have a known function in at least one protein. Here we report the third release of the MnM database, which has now grown 60-fold to approximately 300,000 minimotifs. Since short minimotifs are by their nature not very complex, we also summarize a new set of false-positive filters and linear regression scoring that vastly enhance minimotif prediction accuracy on a test data set. This online database can be used to predict new functions in proteins and causes of disease.
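At its core, matching a consensus minimotif against a protein query is short-pattern scanning. A minimal sketch using a regular expression; the PxxP pattern (a classic SH3-binding consensus) and the toy sequence are illustrative, not MnM's actual data, and MnM's real pipeline adds the false-positive filters and regression scoring described above:

```python
import re

def find_motifs(sequence: str, consensus_regex: str):
    """Return (position, matched peptide) for each non-overlapping hit."""
    return [(m.start(), m.group()) for m in re.finditer(consensus_regex, sequence)]

# 'P..P' = proline, any two residues, proline (illustrative consensus).
hits = find_motifs("MKPALPSGRRP", r"P..P")
print(hits)  # [(2, 'PALP')]
```

Because such short patterns match by chance in almost any sequence, raw hits like these are exactly why filters and scoring are needed to suppress false positives.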

  11. Pulmonary edema predictive scoring index (PEPSI), a new index to predict risk of reperfusion pulmonary edema and improvement of hemodynamics in percutaneous transluminal pulmonary angioplasty.

    Science.gov (United States)

    Inami, Takumi; Kataoka, Masaharu; Shimura, Nobuhiko; Ishiguro, Haruhisa; Yanagisawa, Ryoji; Taguchi, Hiroki; Fukuda, Keiichi; Yoshino, Hideaki; Satoh, Toru

    2013-07-01

This study sought to identify useful predictors for hemodynamic improvement and risk of reperfusion pulmonary edema (RPE), a major complication of this procedure. Percutaneous transluminal pulmonary angioplasty (PTPA) has been reported to be effective for the treatment of chronic thromboembolic pulmonary hypertension (CTEPH). PTPA has not become widespread because RPE has not been well predicted. We included 140 consecutive procedures in 54 patients with CTEPH. The flow appearance of the target vessels was graded into 4 groups (Pulmonary Flow Grade), and we proposed PEPSI (Pulmonary Edema Predictive Scoring Index) = (sum total change of Pulmonary Flow Grade scores) × (baseline pulmonary vascular resistance). Correlations between occurrence of RPE and 11 variables, including hemodynamic parameters, number of target vessels, and PEPSI, were analyzed. Hemodynamic parameters significantly improved after a median observation period of 6.4 months, and the sum total changes in Pulmonary Flow Grade scores were significantly correlated with the improvement in hemodynamics. Multivariate analysis revealed that PEPSI was the strongest factor correlated with the occurrence of RPE, and PEPSI proved to be a useful marker of the risk of RPE (cutoff value 35.4, negative predictive value 92.3%). Pulmonary Flow Grade score is useful in determining therapeutic efficacy, and PEPSI is highly supportive to reduce the risk of RPE after PTPA. Using these 2 indexes, PTPA could become a safe and common therapeutic strategy for CTEPH. Copyright © 2013 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
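As defined in the abstract, PEPSI is a simple product of the summed flow-grade changes and baseline pulmonary vascular resistance. A minimal sketch; the per-vessel flow-grade changes and PVR value below are hypothetical, while the 35.4 cutoff is the one reported:

```python
# PEPSI = (sum of per-vessel changes in Pulmonary Flow Grade)
#         * (baseline pulmonary vascular resistance).
PEPSI_CUTOFF = 35.4  # cutoff reported in the abstract

def pepsi(flow_grade_changes, baseline_pvr):
    return sum(flow_grade_changes) * baseline_pvr

# Hypothetical procedure: three target vessels with flow-grade changes
# of 2, 1 and 1, and a baseline PVR of 12.5 (illustrative values only).
score = pepsi([2, 1, 1], baseline_pvr=12.5)
high_risk = score >= PEPSI_CUTOFF
print(score, high_risk)  # 50.0 True
```

In this toy case the score exceeds the cutoff, so the procedure would be flagged as high risk for RPE under the index's decision rule.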

  12. Applying a health action model to predict and improve healthy behaviors in coal miners.

    Science.gov (United States)

    Vahedian-Shahroodi, Mohammad; Tehrani, Hadi; Mohammadi, Faeze; Gholian-Aval, Mahdi; Peyman, Nooshin

    2018-05-01

    One of the most important ways to prevent work-related diseases in occupations such as mining is to promote healthy behaviors among miners. This study aimed to predict and promote healthy behaviors among coal miners by using a health action model (HAM). The study was conducted on 200 coal miners in Iran in two steps. In the first step, a descriptive study was implemented to determine predictive constructs and effectiveness of HAM on behavioral intention. The second step involved a quasi-experimental study to determine the effect of an HAM-based education intervention. This intervention was implemented by the researcher and the head of the safety unit based on the predictive construct specified in the first step over 12 sessions of 60 min. The data was collected using an HAM questionnaire and a checklist of healthy behavior. The results of the first step of the study showed that attitude, belief, and normative constructs were meaningful predictors of behavioral intention. Also, the results of the second step revealed that the mean score of attitude and behavioral intention increased significantly after conducting the intervention in the experimental group, while the mean score of these constructs decreased significantly in the control group. The findings of this study showed that HAM-based educational intervention could improve the healthy behaviors of mine workers. Therefore, it is recommended to extend the application of this model to other working groups to improve healthy behaviors.

  13. Integrating geophysics and hydrology for reducing the uncertainty of groundwater model predictions and improved prediction performance

    DEFF Research Database (Denmark)

    Christensen, Nikolaj Kruse; Christensen, Steen; Ferre, Ty

[…] the integration of geophysical data in the construction of a groundwater model increases the prediction performance. We suggest that modelers should perform a hydrogeophysical “test-bench” analysis of the likely value of geophysics data for improving groundwater model prediction performance before actually […] and the resulting predictions can be compared with predictions from the ‘true’ model. By performing this analysis we expect to give the modeler insight into how the uncertainty of model-based prediction can be reduced. […] A major purpose of groundwater modeling is to help decision-makers in efforts to manage the natural environment. Increasingly, it is recognized that both the predictions of interest and their associated uncertainties should be quantified to support robust decision making. In particular, decision […]

  14. Significant Improvement of Catalytic Efficiencies in Ionic Liquids

    International Nuclear Information System (INIS)

    Song, Choong Eui; Yoon, Mi Young; Choi, Doo Seong

    2005-01-01

The use of ionic liquids as reaction media can confer many advantages upon catalytic reactions over reactions in organic solvents. In ionic liquids, catalysts having polar or ionic character can easily be immobilized without additional structural modification, and thus the ionic solutions containing the catalyst can easily be separated from the reagents and reaction products, and then be reused. More interestingly, switching from an organic solvent to an ionic liquid often results in a significant improvement in catalytic performance (e.g., rate acceleration, (enantio)selectivity improvement and an increase in catalyst stability). In this review, some recent interesting results which nicely demonstrate this positive 'ionic liquid effect' on catalysis are discussed.

  15. NOAA's Strategy to Improve Operational Weather Prediction Outlooks at Subseasonal Time Range

    Science.gov (United States)

    Schneider, T.; Toepfer, F.; Stajner, I.; DeWitt, D.

    2017-12-01

NOAA is planning to extend operational global numerical weather prediction to the sub-seasonal time range under the auspices of its Next Generation Global Prediction System (NGGPS) and Extended Range Outlook Programs. A unification of numerical prediction capabilities for weather and subseasonal to seasonal (S2S) timescales is underway at NOAA using the Finite Volume Cubed Sphere (FV3) dynamical core as the basis for the emerging unified system. This presentation will give an overview of NOAA's strategic planning and of current activities to improve prediction at S2S timescales, which are ongoing in response to the Weather Research and Forecasting Innovation Act of 2017, Section 201. Over the short term, NOAA seeks to improve the operational capability through improvements to its ensemble forecast system to extend its range to 30 days using the new FV3 Global Forecast System model, and by using this system to provide reforecasts and re-analyses. In parallel, work is ongoing to improve NOAA's operational product suite for 30 day outlooks for temperature, precipitation and extreme weather phenomena.

  16. Training directionally selective motion pathways can significantly improve reading efficiency

    Science.gov (United States)

    Lawton, Teri

    2004-06-01

This study examined whether perceptual learning at early levels of visual processing would facilitate learning at higher levels of processing. This was examined by determining whether training the motion pathways by practicing left-right movement discrimination, as found previously, would improve the reading skills of inefficient readers significantly more than another computer game, a word discrimination game, or the reading program offered by the school. This controlled validation study found that practicing left-right movement discrimination 5-10 minutes twice a week (rapidly) for 15 weeks doubled reading fluency, and significantly improved all reading skills by more than one grade level, whereas inefficient readers in the control groups barely improved on these reading skills. In contrast to previous studies of perceptual learning, these experiments show that perceptual learning of direction discrimination significantly improved reading skills determined at higher levels of cognitive processing, thereby being generalized to a new task. The deficits in reading performance and attentional focus experienced by the person who struggles when reading are suggested to result from an information overload, resulting from timing deficits in the direction-selectivity network proposed by Russell De Valois et al. (2000), that following practice on direction discrimination go away. This study found that practicing direction discrimination rapidly transitions the inefficient 7-year-old reader to an efficient reader.

  17. Explicit Modeling of Ancestry Improves Polygenic Risk Scores and BLUP Prediction.

    Science.gov (United States)

    Chen, Chia-Yen; Han, Jiali; Hunter, David J; Kraft, Peter; Price, Alkes L

    2015-09-01

Polygenic prediction using genome-wide SNPs can provide high prediction accuracy for complex traits. Here, we investigate the question of how to account for genetic ancestry when conducting polygenic prediction. We show that the accuracy of polygenic prediction in structured populations may be partly due to genetic ancestry. However, we hypothesized that explicitly modeling ancestry could improve polygenic prediction accuracy. We analyzed three GWAS of hair color (HC), tanning ability (TA), and basal cell carcinoma (BCC) in European Americans (sample size from 7,440 to 9,822) and considered two widely used polygenic prediction approaches: polygenic risk scores (PRSs) and best linear unbiased prediction (BLUP). We compared polygenic prediction without correction for ancestry to polygenic prediction with ancestry as a separate component in the model. In 10-fold cross-validation using the PRS approach, the R² for HC increased by 66% (from 0.0456 to 0.0755) when explicitly modeling ancestry, which prevents ancestry effects from entering into each SNP effect and being overweighted. Surprisingly, explicitly modeling ancestry produces a similar improvement when using the BLUP approach, which fits all SNPs simultaneously in a single variance component and causes ancestry to be underweighted. We validate our findings via simulations, which show that the differences in prediction accuracy will increase in magnitude as sample sizes increase. In summary, our results show that explicitly modeling ancestry can be important in both PRS and BLUP prediction. © 2015 WILEY PERIODICALS, INC.

  18. CT image biomarkers to improve patient-specific prediction of radiation-induced xerostomia and sticky saliva.

    Science.gov (United States)

    van Dijk, Lisanne V; Brouwer, Charlotte L; van der Schaaf, Arjen; Burgerhof, Johannes G M; Beukinga, Roelof J; Langendijk, Johannes A; Sijtsema, Nanna M; Steenbakkers, Roel J H M

    2017-02-01

    Current models for the prediction of late patient-rated moderate-to-severe xerostomia (XER12m) and sticky saliva (STIC12m) after radiotherapy are based on dose-volume parameters and baseline xerostomia (XERbase) or sticky saliva (STICbase) scores. The purpose is to improve prediction of XER12m and STIC12m with patient-specific characteristics based on CT image biomarkers (IBMs). Planning CT scans and patient-rated outcome measures were prospectively collected for 249 head and neck cancer patients treated with definitive radiotherapy with or without systemic treatment. The potential IBMs represent geometric, CT intensity and textural characteristics of the parotid and submandibular glands. Lasso regularisation was used to create multivariable logistic regression models, which were internally validated by bootstrapping. The prediction of XER12m could be improved significantly by adding the IBM "Short Run Emphasis" (SRE), which quantifies heterogeneity of parotid tissue, to a model with mean contralateral parotid gland dose and XERbase. For STIC12m, the IBM maximum CT intensity of the submandibular gland was selected in addition to STICbase and mean dose to the submandibular glands. Prediction of XER12m and STIC12m was improved by including IBMs representing heterogeneity and density of the salivary glands, respectively. These IBMs could guide additional research into the patient-specific response of healthy tissue to radiation dose. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
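    The "Short Run Emphasis" feature selected for the xerostomia model can be illustrated on a toy image. This is a hedged sketch: the study derives SRE from a gray-level run-length matrix of parotid CT intensities, while the version below counts only horizontal runs of a small 2D array.

```python
from itertools import groupby

def short_run_emphasis(image):
    """SRE = (1/N_runs) * sum over runs of 1/len(run)^2.
    Higher values mean many short runs, i.e. more heterogeneous tissue."""
    runs = []
    for row in image:
        for _, grp in groupby(row):          # consecutive equal intensities
            runs.append(len(list(grp)))
    return sum(1.0 / (r * r) for r in runs) / len(runs)

homogeneous   = [[5, 5, 5, 5], [5, 5, 5, 5]]  # long runs -> low SRE
heterogeneous = [[1, 2, 1, 2], [2, 1, 2, 1]]  # all runs length 1 -> SRE = 1
print(short_run_emphasis(homogeneous), short_run_emphasis(heterogeneous))
# -> 0.0625 1.0
```

    The heterogeneous block scores the maximum value of 1.0, matching the abstract's use of SRE as a parotid-tissue heterogeneity measure.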

  19. E2F5 status significantly improves malignancy diagnosis of epithelial ovarian cancer

    KAUST Repository

    Kothandaraman, Narasimhan

    2010-02-24

    Background: Ovarian epithelial cancer (OEC) usually presents in the later stages of the disease. Factors, especially those associated with cell-cycle genes, affecting the genesis and tumour progression of ovarian cancer are largely unknown. We hypothesized that over-expressed transcription factors (TFs), as well as those driving the expression of the OEC over-expressed genes, could be key to OEC genesis and potentially useful tissue and serum markers for malignancy associated with OEC. Methods: Using a combination of computational (selection of candidate TF markers and malignancy prediction) and experimental approaches (tissue microarray and western blotting on patient samples), we identified and evaluated the E2F5 transcription factor, involved in cell proliferation, as a promising candidate regulatory target in early stage disease. Our hypothesis was supported by our tissue array experiments, which showed E2F5 expression only in OEC samples but not in normal and benign tissues, and by significantly positively biased expression in serum samples in western blotting studies. Results: Analysis of clinical cases shows that the E2F5 status is characteristic of a different population group than the one covered by CA125, a conventional OEC biomarker. Using E2F5 in different combinations with CA125 to distinguish malignant from benign cysts shows that the presence of CA125 or E2F5 increases sensitivity of OEC detection to 97.9% (an increase from 87.5% if only CA125 is used) and, more importantly, the presence of both CA125 and E2F5 increases specificity of OEC detection to 72.5% (an increase from 55% if only CA125 is used). This significantly improved accuracy suggests the possibility of improved diagnostics of OEC. Furthermore, detection of malignancy status in 86 cases (38 benign, 48 early and late OEC) shows that the use of E2F5 status in combination with other clinical characteristics allows for an improved detection of malignant cases with sensitivity
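    The reported gains from combining CA125 and E2F5 follow the usual logic of OR/AND marker combinations, sketched here on hypothetical data (the marker values below are toy numbers, not the study's): an OR rule flags a case if either marker is positive and tends to raise sensitivity, while an AND rule requires both and tends to raise specificity.

```python
def sens_spec(flags, truth):
    """Sensitivity and specificity of binary test flags against truth."""
    tp = sum(f and t for f, t in zip(flags, truth))
    tn = sum((not f) and (not t) for f, t in zip(flags, truth))
    return tp / sum(truth), tn / (len(truth) - sum(truth))

# truth: 1 = malignant; markers: 1 = positive test (toy values)
truth = [1, 1, 1, 1, 0, 0, 0, 0]
ca125 = [1, 1, 1, 0, 1, 1, 0, 0]
e2f5  = [1, 0, 1, 1, 0, 1, 0, 0]

either = [a or b for a, b in zip(ca125, e2f5)]    # OR rule
both   = [a and b for a, b in zip(ca125, e2f5)]   # AND rule
print(sens_spec(ca125, truth))   # -> (0.75, 0.5)  CA125 alone
print(sens_spec(either, truth))  # -> (1.0, 0.5)   sensitivity up
print(sens_spec(both, truth))    # -> (0.5, 0.75)  specificity up
```

    The same trade-off appears in the abstract's figures: OR raises sensitivity (87.5% to 97.9%) and AND raises specificity (55% to 72.5%).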

  20. Improving Air Quality (and Weather) Predictions using Advanced Data Assimilation Techniques Applied to Coupled Models during KORUS-AQ

    Science.gov (United States)

    Carmichael, G. R.; Saide, P. E.; Gao, M.; Streets, D. G.; Kim, J.; Woo, J. H.

    2017-12-01

    Ambient aerosols are important air pollutants with direct impacts on human health and on the Earth's weather and climate systems through their interactions with radiation and clouds. Their role is dependent on their distributions of size, number, phase and composition, which vary significantly in space and time. There remain large uncertainties in simulated aerosol distributions due to uncertainties in emission estimates and in chemical and physical processes associated with their formation and removal. These uncertainties lead to large uncertainties in weather and air quality predictions and in estimates of health and climate change impacts. Despite these uncertainties and challenges, regional-scale coupled chemistry-meteorological models such as WRF-Chem have significant capabilities in predicting aerosol distributions and explaining aerosol-weather interactions. We explore the hypothesis that new advances in on-line, coupled atmospheric chemistry/meteorological models, and new emission inversion and data assimilation techniques applicable to such coupled models, can be applied in innovative ways using current and evolving observation systems to improve predictions of aerosol distributions at regional scales. We investigate the impacts of assimilating AOD from geostationary satellite (GOCI) and surface PM2.5 measurements on predictions of AOD and PM in Korea during KORUS-AQ through a series of experiments. The results suggest assimilating datasets from multiple platforms can improve the predictions of aerosol temporal and spatial distributions.

  1. Improved prediction for the mass of the W boson in the NMSSM

    International Nuclear Information System (INIS)

    Staal, O.; Zeune, L.

    2015-10-01

    Electroweak precision observables, being highly sensitive to loop contributions of new physics, provide a powerful tool to test the theory and to discriminate between different models of the underlying physics. In that context, the W boson mass, M_W, plays a crucial role. The accuracy of the M_W measurement has been significantly improved in recent years, and further improvement of the experimental accuracy is expected from future LHC measurements. In order to fully exploit the precise experimental determination, an accurate theoretical prediction for M_W in the Standard Model (SM) and extensions of it is of central importance. We present the currently most accurate prediction for the W boson mass in the Next-to-Minimal Supersymmetric extension of the Standard Model (NMSSM), including the full one-loop result and all available higher-order corrections of SM and SUSY type. The evaluation of M_W is performed in a flexible framework, which facilitates the extension to other models beyond the SM. We show numerical results for the W boson mass in the NMSSM, focussing on phenomenologically interesting scenarios in which the Higgs signal can be interpreted as the lightest or second lightest CP-even Higgs boson of the NMSSM. We find that, for both Higgs signal interpretations, the NMSSM M_W prediction is well compatible with the measurement. We study the SUSY contributions to M_W in detail and investigate in particular the genuine NMSSM effects from the Higgs and neutralino sectors.

  2. CNNcon: improved protein contact maps prediction using cascaded neural networks.

    Directory of Open Access Journals (Sweden)

    Wang Ding

    Full Text Available BACKGROUND: Despite continuing progress in X-ray crystallography and high-field NMR spectroscopy for determination of three-dimensional protein structures, the number of unsolved and newly discovered sequences grows much faster than that of determined structures. Protein modeling methods can possibly bridge this huge sequence-structure gap with the development of computational science. A grand challenge is to predict three-dimensional protein structure from its primary structure (the residue sequence) alone. However, predicting residue contact maps is a crucial and promising intermediate step towards final three-dimensional structure prediction. Better predictions of local and non-local contacts between residues can transform protein sequence alignment into structure alignment, which can finally improve template-based three-dimensional protein structure predictors greatly. METHODS: CNNcon, an improved multiple-neural-network-based contact map predictor using six sub-networks and one final cascade-network, was developed in this paper. Both the sub-networks and the final cascade-network were trained and tested with their corresponding data sets. For testing, the target protein was first coded and then input to its corresponding sub-networks for prediction. After that, the intermediate results were input to the cascade-network to finish the final prediction. RESULTS: CNNcon can accurately predict an average of 58.86% of contacts at a distance cutoff of 8 Å for proteins with lengths ranging from 51 to 450. The comparison results show that the present method performs better than the compared state-of-the-art predictors. In particular, the prediction accuracy remains steady as protein sequence length increases. This indicates that CNNcon overcomes the thin density problem, with which other current predictors have trouble. This advantage makes the method valuable for the prediction of long proteins. As a result, the effective

  3. Accuracy of Genomic Prediction in Switchgrass (Panicum virgatum L.) Improved by Accounting for Linkage Disequilibrium

    Directory of Open Access Journals (Sweden)

    Guillaume P. Ramstein

    2016-04-01

    Full Text Available Switchgrass is a relatively high-yielding and environmentally sustainable biomass crop, but further genetic gains in biomass yield must be achieved to make it an economically viable bioenergy feedstock. Genomic selection (GS) is an attractive technology to generate rapid genetic gains in switchgrass and meet the goal of substantially displacing petroleum use with biofuels in the near future. In this study, we empirically assessed prediction procedures for genomic selection in two different populations, consisting of 137 and 110 half-sib families of switchgrass, tested in two locations in the United States for three agronomic traits: dry matter yield, plant height, and heading date. Marker data were produced for the families’ parents by exome capture sequencing, generating up to 141,030 polymorphic markers with available genomic-location and annotation information. We evaluated prediction procedures that varied not only by learning schemes and prediction models, but also by the way the data were preprocessed to account for redundancy in marker information. More complex genomic prediction procedures were generally not significantly more accurate than the simplest procedure, likely due to limited population sizes. Nevertheless, a highly significant gain in prediction accuracy was achieved by transforming the marker data through a marker correlation matrix. Our results suggest that marker-data transformations and, more generally, accounting for linkage disequilibrium among markers offer valuable opportunities for improving prediction procedures in GS. Some of the achieved prediction accuracies should motivate implementation of GS in switchgrass breeding programs.
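    The marker-correlation transformation can be sketched in a few lines. This is an illustrative toy (a tiny made-up genotype matrix and plain Pearson correlations), not the study's prediction procedure: the marker matrix is multiplied by the marker correlation matrix, so each transformed score pools information across markers in linkage disequilibrium.

```python
def column_means(M):
    n = len(M)
    return [sum(row[j] for row in M) / n for j in range(len(M[0]))]

def corr_matrix(M):
    """Pearson correlation between marker columns (a simple LD matrix)."""
    m = len(M[0])
    mu = column_means(M)
    sd = [sum((row[j] - mu[j]) ** 2 for row in M) ** 0.5 for j in range(m)]
    R = [[0.0] * m for _ in range(m)]
    for j in range(m):
        for k in range(m):
            num = sum((row[j] - mu[j]) * (row[k] - mu[k]) for row in M)
            R[j][k] = num / (sd[j] * sd[k])
    return R

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# genotypes coded 0/1/2: 4 lines x 3 markers (made-up values)
M = [[0, 0, 1],
     [1, 1, 0],
     [2, 2, 1],
     [0, 1, 2]]
R = corr_matrix(M)   # how strongly each marker pair is correlated
T = matmul(M, R)     # LD-adjusted marker scores used as predictors
```

    A GS model would then be trained on `T` instead of `M`; the abstract reports that this kind of transformation gave a highly significant accuracy gain.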

  4. Predictive Maintenance: One key to improved power plant availability

    International Nuclear Information System (INIS)

    Mobley; Allen, J.W.

    1986-01-01

    Recent developments in microprocessor technology have provided the ability to routinely monitor the actual mechanical condition of all rotating and reciprocating machinery, as well as process variables (i.e. pressure, temperature, flow, etc.) of other process equipment, within an operating electric power generating plant. This direct correlation between frequency-domain vibration and the actual mechanical condition of machinery, together with trending of process variables of non-rotating equipment, can provide the "key" to improving availability, reliability and thermal efficiency, and can provide the baseline information necessary for developing a realistic plan for extending the useful life of power plants. The premise of utilizing microprocessor-based Predictive Maintenance to improve power plant operation has been proven by a number of utilities. This paper provides a comprehensive discussion of the TEC approach to Predictive Maintenance and examples of successful programs

  5. Can video games be used to predict or improve laparoscopic skills?

    Science.gov (United States)

    Rosenberg, Bradley H; Landsittel, Douglas; Averch, Timothy D

    2005-04-01

    Performance of laparoscopic surgery requires adequate hand-eye coordination. Video games are an effective way to judge one's hand-eye coordination, and practicing these games may improve one's skills. Our goal was to see if there is a correlation between skill in video games and skill in laparoscopy. Also, we hoped to demonstrate that practicing video games can improve one's laparoscopic skills. Eleven medical students (nine male, two female) volunteered to participate. On day 1, each student played three commercially available video games (Top Spin, XSN Sports; Project Gotham Racing 2, Bizarre Creations; and Amped 2, XSN Sports) for 30 minutes on an X-box (Microsoft, Seattle, WA) and was judged both objectively and subjectively. Next, the students performed four laparoscopic tasks (object transfer, tracing a figure-of-eight, suture placement, and knot-tying) in a swine model and were assessed for time to complete the task, number of errors committed, and hand-eye coordination. The students were then randomized to control (group A) or "training" (i.e., video game practicing; group B) arms. Two weeks later, all students repeated the laparoscopic skills laboratory and were reassessed. Spearman correlation coefficients demonstrated a significant relation between many of the parameters, particularly time to complete each task and hand-eye coordination at the different games. There was a weaker association between video game performance and both laparoscopic errors committed and hand-eye coordination. Group B subjects did not improve significantly over those in group A in any measure (P >0.05 for all). Video game aptitude appears to predict the level of laparoscopic skill in the novice surgeon. In this study, practicing video games did not improve one's laparoscopic skill significantly, but a larger study with more practice time could prove games to be helpful.

  6. Combining specificity determining and conserved residues improves functional site prediction

    Directory of Open Access Journals (Sweden)

    Gelfand Mikhail S

    2009-06-01

    Full Text Available Abstract Background Predicting the location of functionally important sites from protein sequence and/or structure is a long-standing problem in computational biology. Most current approaches make use of sequence conservation, assuming that amino acid residues conserved within a protein family are most likely to be functionally important. Most often these approaches do not consider the many residues that act to define specific sub-functions within a family, or they make no distinction between residues important for function and those more relevant for maintaining structure (e.g. in the hydrophobic core). Many protein families bind and/or act on a variety of ligands, meaning that conserved residues often only bind a common ligand sub-structure or perform general catalytic activities. Results Here we present a novel method for functional site prediction based on identification of conserved positions, as well as those responsible for determining ligand specificity. We define Specificity-Determining Positions (SDPs) as those occupied by conserved residues within sub-groups of proteins in a family having a common specificity, but differing between groups, and thus likely to account for specific recognition events. We benchmark the approach on enzyme families of known 3D structure with bound substrates, and find that in nearly all families residues predicted by SDPsite are in contact with the bound substrate, and that the addition of SDPs significantly improves functional site prediction accuracy. We apply SDPsite to various families of proteins containing known three-dimensional structures, but lacking clear functional annotations, and discuss several illustrative examples. Conclusion The results suggest a better means to predict functional details for the thousands of protein structures determined prior to a clear understanding of molecular function.
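    The SDP definition can be made concrete on a toy alignment. This sketch is not the SDPsite algorithm (which scores positions statistically); it only illustrates the distinction the abstract draws between globally conserved positions and positions conserved within each specificity subgroup but differing between subgroups.

```python
# two subfamilies with different ligand specificities (toy alignment)
group_a = ["ACDKG", "ACDKG", "ACDKT"]   # position 4 varies within the group
group_b = ["ACHKG", "ACHKG", "ACHKG"]

def conserved_positions(seqs):
    """Positions where all sequences in the group carry the same residue."""
    return {i for i in range(len(seqs[0])) if len({s[i] for s in seqs}) == 1}

def classify(groups):
    """Split group-wise conserved positions into SDPs and global ones."""
    within = [conserved_positions(g) for g in groups]
    common = set.intersection(*within)      # conserved inside every group
    sdps, global_cons = set(), set()
    for i in common:
        residues = {g[0][i] for g in groups}
        (sdps if len(residues) > 1 else global_cons).add(i)
    return sdps, global_cons

sdps, cons = classify([group_a, group_b])
print(sorted(sdps), sorted(cons))  # -> [2] [0, 1, 3]
```

    Position 2 (D in one subfamily, H in the other) is the SDP; positions 0, 1 and 3 are globally conserved, the signal classical conservation-based predictors already use.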

  7. Improved feature selection based on genetic algorithms for real time disruption prediction on JET

    International Nuclear Information System (INIS)

    Rattá, G.A.; Vega, J.; Murari, A.

    2012-01-01

    Highlights: ► A new signal selection methodology to improve disruption prediction is reported. ► The approach is based on Genetic Algorithms. ► An advanced predictor has been created with the new set of signals. ► The new system obtains considerably higher prediction rates. - Abstract: The early prediction of disruptions is an important aspect of research in the field of Tokamak control. A very recent predictor, called “Advanced Predictor Of Disruptions” (APODIS), developed for the “Joint European Torus” (JET), implements real-time recognition of incoming disruptions with the best success rate ever achieved and outstanding stability over long periods following training. In this article, a new methodology to select the set of signal parameters that maximizes the performance of the predictor is reported. The approach is based on “Genetic Algorithms” (GAs). With the feature selection derived from GAs, a new version of APODIS has been developed. The results are significantly better than the previous version, not only in terms of success rates but also in extending the interval before the disruption in which reliable predictions are achieved. Correct disruption predictions with a success rate in excess of 90% have been achieved 200 ms before the time of the disruption. The predictor response is compared with that of JET's Protection System (JPS), and the APODIS predictor is shown to be far superior. Both systems have been carefully tested with a large number of discharges to understand their relative merits and the most profitable directions for further improvements.
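    Feature selection with a genetic algorithm, as used to build the new APODIS version, can be sketched generically. Everything below is hypothetical, not the APODIS code: bit-strings encode which signals are kept, and a stand-in fitness function plays the role of the predictor's success rate (in the real system, each candidate subset would be scored by training and evaluating the predictor).

```python
import random

random.seed(1)
N_FEATURES, POP, GENERATIONS = 10, 20, 40
GOOD = {0, 3, 7}                  # pretend these signals are informative

def fitness(mask):
    """Proxy for predictor skill: reward kept informative signals,
    penalise the rest (stand-in for a real training/evaluation run)."""
    kept = {i for i, b in enumerate(mask) if b}
    return len(kept & GOOD) - 0.2 * len(kept - GOOD)

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.1):
    return [b ^ (random.random() < rate) for b in mask]  # flip bits

pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                # truncation selection + elitism
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print([i for i, b in enumerate(best) if b])  # selected signal indices
```

    Because the top half of the population is carried over unchanged, the best fitness never decreases between generations, which is one reason GA-based selection is robust for expensive fitness functions like retraining a disruption predictor.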

  8. Benthic Light Availability Improves Predictions of Riverine Primary Production

    Science.gov (United States)

    Kirk, L.; Cohen, M. J.

    2017-12-01

    Light is a fundamental control on photosynthesis, and often the only control strongly correlated with gross primary production (GPP) in streams and rivers; yet it has received far less attention than nutrients. Because benthic light is difficult to measure in situ, surrogates such as open sky irradiance are often used. Several studies have now refined methods to quantify canopy and water column attenuation of open sky light in order to estimate the amount of light that actually reaches the benthos. Given the additional effort that measuring benthic light requires, we should ask if benthic light always improves our predictions of GPP compared to just open sky irradiance. We use long-term, high-resolution dissolved oxygen, turbidity, dissolved organic matter (fDOM), and irradiance data from streams and rivers in north-central Florida, US across gradients of size and color to build statistical models of benthic light that predict GPP. Preliminary results on a large, clear river show only modest model improvements over open sky irradiance, even in heavily canopied reaches with pulses of tannic water. However, in another spring-fed river with greater connectivity to adjacent wetlands - and hence larger, more frequent pulses of tannic water - the model improved dramatically with the inclusion of fDOM (model R2 improved from 0.28 to 0.68). River shade modeling efforts also suggest that knowing benthic light will greatly enhance our ability to predict GPP in narrower, forested streams flowing in particular directions. Our objective is to outline conditions where an assessment of benthic light conditions would be necessary for riverine metabolism studies or management strategies.

  9. Integration of biomimicry and nanotechnology for significantly improved detection of circulating tumor cells (CTCs).

    Science.gov (United States)

    Myung, Ja Hye; Park, Sin-Jung; Wang, Andrew Z; Hong, Seungpyo

    2017-12-13

    Circulating tumor cells (CTCs) have received a great deal of scientific and clinical attention as a biomarker for diagnosis and prognosis of many types of cancer. Given their potential significance in the clinic, a variety of detection methods, utilizing recent advances in nanotechnology and microfluidics, have been introduced in an effort to achieve clinically significant detection of CTCs. However, effective detection and isolation of CTCs remain a tremendous challenge due to their extreme rarity and phenotypic heterogeneity. Among the many approaches currently under development, this review focuses on a unique, promising approach that takes advantage of naturally occurring processes, achievable through application of nanotechnology, to realize significant improvement in sensitivity and specificity of CTC capture. We provide an overview of the successful outcomes of this biomimetic CTC capture system in detecting tumor cells in in vitro, in vivo, and clinical pilot studies. We also emphasize the clinical impact of CTCs as biomarkers in cancer diagnosis and prognosis, providing a cost-effective, minimally invasive method that potentially replaces or supplements existing methods such as imaging technologies and solid tissue biopsy. In addition, their potential prognostic value in guiding treatment and ultimately realizing personalized therapy is discussed. Copyright © 2017. Published by Elsevier B.V.

  10. EMUDRA: Ensemble of Multiple Drug Repositioning Approaches to Improve Prediction Accuracy.

    Science.gov (United States)

    Zhou, Xianxiao; Wang, Minghui; Katsyv, Igor; Irie, Hanna; Zhang, Bin

    2018-04-24

    The availability of large-scale genomic, epigenetic and proteomic data in complex diseases makes it possible to objectively and comprehensively identify therapeutic targets that can lead to new therapies. The Connectivity Map has been widely used to explore novel indications of existing drugs. However, the prediction accuracy of the existing methods, such as the Kolmogorov-Smirnov statistic, remains low. Here we present a novel high-performance drug repositioning approach that improves over the state-of-the-art methods. We first designed an expression-weighted cosine method (EWCos) to minimize the influence of uninformative expression changes and then developed an ensemble approach termed EMUDRA (Ensemble of Multiple Drug Repositioning Approaches) to integrate EWCos and three existing state-of-the-art methods. EMUDRA significantly outperformed the individual drug repositioning methods when applied to simulated and independent evaluation datasets. Using EMUDRA, we predicted and then experimentally validated the antibiotic rifabutin as an inhibitor of cell growth in triple negative breast cancer. EMUDRA can identify drugs that more effectively target disease gene signatures and will thus be a useful tool for identifying novel therapies for complex diseases and predicting new indications for existing drugs. The EMUDRA R package is available at doi:10.7303/syn11510888. bin.zhang@mssm.edu or zhangb@hotmail.com. Supplementary data are available at Bioinformatics online.
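    The expression-weighted cosine idea can be sketched as follows; the weighting scheme and data here are hypothetical stand-ins, since the abstract describes EWCos only at a high level. Genes with larger expression changes get more weight, so small, noisy changes contribute little to the similarity between a disease signature and a drug-induced signature.

```python
import math

def weighted_cosine(disease, drug, weights):
    """Cosine similarity in a weighted inner-product space."""
    num = sum(w * d * g for w, d, g in zip(weights, disease, drug))
    nd = math.sqrt(sum(w * d * d for w, d in zip(weights, disease)))
    ng = math.sqrt(sum(w * g * g for w, g in zip(weights, drug)))
    return num / (nd * ng)

disease = [2.0, -1.5, 0.1, 0.05]    # log fold-changes in disease
drug    = [-2.1, 1.4, 0.2, -0.1]    # drug reverses the large changes
weights = [abs(x) for x in disease] # down-weight near-zero, noisy genes

# a strongly negative score means the drug reverses the disease signature,
# the pattern sought in Connectivity Map-style repositioning
score = weighted_cosine(disease, drug, weights)
print(round(score, 3))
```

    Here the two large, reliably changed genes dominate the score, so the drug is scored as a strong signature reverser despite the two noisy genes pointing in mixed directions.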

  11. PSORTb 3.0: improved protein subcellular localization prediction with refined localization subcategories and predictive capabilities for all prokaryotes.

    Science.gov (United States)

    Yu, Nancy Y; Wagner, James R; Laird, Matthew R; Melli, Gabor; Rey, Sébastien; Lo, Raymond; Dao, Phuong; Sahinalp, S Cenk; Ester, Martin; Foster, Leonard J; Brinkman, Fiona S L

    2010-07-01

    PSORTb has remained the most precise bacterial protein subcellular localization (SCL) predictor since it was first made available in 2003. However, the recall needs to be improved, and no accurate SCL predictors yet make predictions for archaea, nor differentiate important localization subcategories, such as proteins targeted to a host cell or bacterial hyperstructures/organelles. Such improvements should preferably be encompassed in a freely available web-based predictor that can also be used as a standalone program. We developed PSORTb version 3.0 with improved recall, higher proteome-scale prediction coverage, and new refined localization subcategories. It is the first SCL predictor specifically geared for all prokaryotes, including archaea and bacteria with atypical membrane/cell wall topologies. It features an improved standalone program, with a new batch results delivery system complementing its web interface. We evaluated the most accurate SCL predictors using 5-fold cross-validation and performed an independent proteomics analysis, showing that PSORTb 3.0 is the most accurate but can benefit from being complemented by Proteome Analyst predictions. http://www.psort.org/psortb (download open source software or use the web interface). psort-mail@sfu.ca Supplementary data are available at Bioinformatics online.

  12. Improving Permafrost Hydrology Prediction Through Data-Model Integration

    Science.gov (United States)

    Wilson, C. J.; Andresen, C. G.; Atchley, A. L.; Bolton, W. R.; Busey, R.; Coon, E.; Charsley-Groffman, L.

    2017-12-01

    The CMIP5 Earth System Models were unable to adequately predict the fate of the 16 GT of permafrost carbon in a warming climate due to poor representation of Arctic ecosystem processes. The DOE Office of Science Next Generation Ecosystem Experiment (NGEE-Arctic) project aims to reduce uncertainty in the Arctic carbon cycle and its impact on the Earth's climate system through improved representation of the coupled physical, chemical and biological processes that drive how much buried carbon will be converted to CO2 and CH4, how fast this will happen, which form will dominate, and the degree to which increased plant productivity will offset increased soil carbon emissions. These processes fundamentally depend on the permafrost thaw rate and its influence on surface and subsurface hydrology through thermal erosion, land subsidence and changes to groundwater flow pathways as soil, bedrock and alluvial pore ice and massive ground ice melt. LANL and its NGEE colleagues are co-developing data and models to better understand controls on permafrost degradation and improve prediction of the evolution of permafrost and its impact on Arctic hydrology. The LANL Advanced Terrestrial Simulator (ATS) was built using a state-of-the-art HPC software framework to enable the first fully coupled 3-dimensional surface-subsurface thermal-hydrology and land surface deformation simulations of the evolution of the physical Arctic environment. Here we show how field data, including hydrology, snow, vegetation, geochemistry and soil properties, are informing the development and application of the ATS to improve understanding of controls on permafrost stability and permafrost hydrology. The ATS is being used to inform parameterizations of complex coupled physical, ecological and biogeochemical processes for implementation in the DOE ACME land model, to better predict the role of changing Arctic hydrology in the global climate system. LA-UR-17-26566.

  13. Can biomechanical variables predict improvement in crouch gait?

    Science.gov (United States)

    Hicks, Jennifer L.; Delp, Scott L.; Schwartz, Michael H.

    2011-01-01

    Many patients respond positively to treatments for crouch gait, yet surgical outcomes are inconsistent and unpredictable. In this study, we developed a multivariable regression model to determine if biomechanical variables and other subject characteristics measured during a physical exam and gait analysis can predict which subjects with crouch gait will demonstrate improved knee kinematics on a follow-up gait analysis. We formulated the model and tested its performance by retrospectively analyzing 353 limbs of subjects who walked with crouch gait. The regression model was able to predict which subjects would demonstrate ‘improved’ and ‘unimproved’ knee kinematics with over 70% accuracy, and was able to explain approximately 49% of the variance in subjects’ change in knee flexion between gait analyses. We found that improvement in stance phase knee flexion was positively associated with three variables that were drawn from knowledge about the biomechanical contributors to crouch gait: i) adequate hamstrings lengths and velocities, possibly achieved via hamstrings lengthening surgery, ii) normal tibial torsion, possibly achieved via tibial derotation osteotomy, and iii) sufficient muscle strength. PMID:21616666

  14. Improving performance of breast cancer risk prediction using a new CAD-based region segmentation scheme

    Science.gov (United States)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Qiu, Yuchen; Zheng, Bin

    2018-02-01

    The objective of this study was to develop and test a new computer-aided detection (CAD) scheme with improved region of interest (ROI) segmentation, combined with an image feature extraction framework, to improve performance in predicting short-term breast cancer risk. A dataset involving 570 sets of "prior" negative mammography screening cases was retrospectively assembled. In the next sequential "current" screening, 285 cases were positive and 285 cases remained negative. A CAD scheme was applied to all 570 "prior" negative images to stratify cases into high- and low-risk groups for having cancer detected in the "current" screening. First, a new ROI segmentation algorithm was used to automatically remove unusable areas of the mammograms. Second, from the matched bilateral craniocaudal view images, a set of 43 image features related to frequency characteristics of the ROIs was computed from the discrete cosine transform and the spatial domain of the images. Third, a support vector machine (SVM) classifier was used to classify the selected optimal image features and build a CAD-based risk prediction model. The classifier was trained using a leave-one-case-out cross-validation method. Applying this improved CAD scheme to the testing dataset yielded an area under the ROC curve of AUC = 0.70+/-0.04, significantly higher than extracting features directly from the dataset without the improved ROI segmentation step (AUC = 0.63+/-0.04). This study demonstrated that the proposed approach can improve accuracy in predicting short-term breast cancer risk, which may play an important role in helping to eventually establish an optimal personalized breast cancer screening paradigm.
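    The frequency-feature step can be illustrated with a small 2D discrete cosine transform. This is a hedged sketch (a toy 4x4 ROI and two made-up summary features), not the study's 43-feature framework or its SVM: the orthonormal DCT-II separates the ROI's mean-intensity (DC) term from the energy in its higher-frequency (AC) coefficients.

```python
import math

def dct_2d(block):
    """Orthonormal 2D DCT-II of a square block (naive O(n^4) version)."""
    n = len(block)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(block[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                    for x in range(n) for y in range(n))
            cu = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
            cv = math.sqrt(1 / n) if v == 0 else math.sqrt(2 / n)
            out[u][v] = cu * cv * s
    return out

roi = [[10, 12, 11, 10],
       [11, 60, 62, 12],
       [10, 61, 63, 11],
       [12, 11, 10, 13]]            # toy 4x4 ROI with a dense region

coeffs = dct_2d(roi)
dc_energy = coeffs[0][0]            # mean-intensity term
ac_energy = sum(c * c for row in coeffs for c in row) - dc_energy ** 2
print(round(dc_energy, 2), round(ac_energy, 2))
```

    Because this DCT is orthonormal, the total coefficient energy equals the total pixel energy (Parseval), so `ac_energy` cleanly measures how much of the ROI's energy sits in non-constant frequency components, the kind of quantity a frequency-domain feature set summarizes.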

  15. Improving protein disorder prediction by deep bidirectional long short-term memory recurrent neural networks.

    Science.gov (United States)

    Hanson, Jack; Yang, Yuedong; Paliwal, Kuldip; Zhou, Yaoqi

    2017-03-01

    Capturing long-range interactions between structural but not sequence neighbors of proteins is a long-standing challenging problem in bioinformatics. Recently, long short-term memory (LSTM) networks have significantly improved the accuracy of speech and image classification problems by remembering useful past information in long sequential events. Here, we have implemented deep bidirectional LSTM recurrent neural networks for the problem of protein intrinsic disorder prediction. The new method, named SPOT-Disorder, has steadily improved over a similar method using a traditional, window-based neural network (SPINE-D) in all datasets tested, without separate training on short and long disordered regions. Independent tests on four other datasets, including the datasets from critical assessment of structure prediction (CASP) techniques and >10 000 annotated proteins from MobiDB, confirmed SPOT-Disorder as one of the best methods in disorder prediction. Moreover, initial studies indicate that the method is more accurate in predicting functional sites in disordered regions. These results highlight the usefulness of combining LSTMs with deep bidirectional recurrent neural networks in capturing non-local, long-range interactions for bioinformatics applications. SPOT-disorder is available as a web server and as a standalone program at: http://sparks-lab.org/server/SPOT-disorder/index.php . j.hanson@griffith.edu.au or yuedong.yang@griffith.edu.au or yaoqi.zhou@griffith.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  16. Improving Saliency Models by Predicting Human Fixation Patches

    KAUST Repository

    Dubey, Rachit

    2015-04-16

    There is growing interest in studying the Human Visual System (HVS) to supplement and improve the performance of computer vision tasks. A major challenge for current visual saliency models is predicting saliency in cluttered scenes (i.e. high false positive rate). In this paper, we propose a fixation patch detector that predicts image patches that contain human fixations with high probability. Our proposed model detects sparse fixation patches with an accuracy of 84 % and eliminates non-fixation patches with an accuracy of 84 % demonstrating that low-level image features can indeed be used to short-list and identify human fixation patches. We then show how these detected fixation patches can be used as saliency priors for popular saliency models, thus, reducing false positives while maintaining true positives. Extensive experimental results show that our proposed approach allows state-of-the-art saliency methods to achieve better prediction performance on benchmark datasets.

  17. Improving Saliency Models by Predicting Human Fixation Patches

    KAUST Repository

    Dubey, Rachit; Dave, Akshat; Ghanem, Bernard

    2015-01-01

    There is growing interest in studying the Human Visual System (HVS) to supplement and improve the performance of computer vision tasks. A major challenge for current visual saliency models is predicting saliency in cluttered scenes (i.e. high false positive rate). In this paper, we propose a fixation patch detector that predicts image patches that contain human fixations with high probability. Our proposed model detects sparse fixation patches with an accuracy of 84 % and eliminates non-fixation patches with an accuracy of 84 % demonstrating that low-level image features can indeed be used to short-list and identify human fixation patches. We then show how these detected fixation patches can be used as saliency priors for popular saliency models, thus, reducing false positives while maintaining true positives. Extensive experimental results show that our proposed approach allows state-of-the-art saliency methods to achieve better prediction performance on benchmark datasets.
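The "saliency prior" step these two records describe reduces to a re-weighting: keep saliency values inside detected fixation patches and suppress them outside, which trades false positives for retained true positives. A minimal sketch, with an invented 3x4 map, a hypothetical binary patch mask, and an assumed suppression factor:

```python
def apply_patch_prior(saliency, patch_mask, suppress=0.2):
    """Down-weight saliency outside detected fixation patches.
    Inside a patch the value is kept; outside it is multiplied by
    `suppress` to cut false positives."""
    return [[s if m else s * suppress for s, m in zip(srow, mrow)]
            for srow, mrow in zip(saliency, patch_mask)]

# toy 3x4 saliency map and a detector output marking one 2x2 patch
saliency = [[0.9, 0.8, 0.1, 0.7],
            [0.6, 0.9, 0.2, 0.6],
            [0.1, 0.2, 0.1, 0.8]]
patches  = [[1, 1, 0, 0],
            [1, 1, 0, 0],
            [0, 0, 0, 0]]

refined = apply_patch_prior(saliency, patches)
print(refined[0])   # first row after re-weighting
```

The multiplicative form is one simple choice; the papers' actual combination rule with each saliency model may differ.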

  18. Improved hybrid optimization algorithm for 3D protein structure prediction.

    Science.gov (United States)

    Zhou, Changjun; Hou, Caixia; Wei, Xiaopeng; Zhang, Qiang

    2014-07-01

    A new improved hybrid optimization algorithm, PGATS, based on the toy off-lattice model, is presented for three-dimensional protein structure prediction problems. The algorithm combines particle swarm optimization (PSO), a genetic algorithm (GA), and tabu search (TS), and incorporates several improvement strategies: a stochastic disturbance factor is added to the particle swarm optimization to improve its search ability; the crossover and mutation operations of the genetic algorithm are replaced by a random linear method; and finally the tabu search algorithm is improved by appending a mutation operator. Through this combination of strategies and algorithms, protein structure prediction (PSP) in a 3D off-lattice model is achieved. The PSP problem is NP-hard, but it can be cast as a global optimization problem with many extrema and many parameters; this is the theoretical basis of the hybrid optimization algorithm proposed in this paper. The algorithm combines local search and global search, overcoming the shortcomings of any single algorithm and exploiting the advantages of each. The method is validated on the standard benchmark Fibonacci sequences and on real protein sequences. Experiments show that the proposed method outperforms the single algorithms on the accuracy of the computed protein sequence energy values, which demonstrates an effective way to predict the structure of proteins.
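Of the three components, the PSO-with-stochastic-disturbance idea is the easiest to show in isolation. The sketch below is not PGATS (no GA crossover or tabu step, and no off-lattice energy): it minimizes an invented quadratic "energy", and all parameters (inertia, acceleration, disturbance magnitude) are illustrative assumptions.

```python
import random

def energy(x, y):
    # toy "energy" standing in for the off-lattice model's potential
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

random.seed(1)
n, steps = 20, 200
pos = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(n)]
vel = [[0.0, 0.0] for _ in range(n)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=lambda p: energy(*p))[:]

for _ in range(steps):
    for i in range(n):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            # standard PSO velocity update plus a small stochastic
            # disturbance term (the paper's improvement) to help
            # particles escape local minima
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d])
                         + 0.05 * random.gauss(0.0, 1.0))
            pos[i][d] += vel[i][d]
        if energy(*pos[i]) < energy(*pbest[i]):
            pbest[i] = pos[i][:]
            if energy(*pbest[i]) < energy(*gbest):
                gbest = pbest[i][:]

print([round(c, 2) for c in gbest])  # best position found; the true minimum is (1, -2)
```

In the full hybrid, this global-search stage would hand its population to the GA-style recombination and a tabu-search local refinement.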

  19. TU-CD-BRB-01: Normal Lung CT Texture Features Improve Predictive Models for Radiation Pneumonitis

    International Nuclear Information System (INIS)

    Krafft, S; Briere, T; Court, L; Martel, M

    2015-01-01

    Purpose: Existing normal tissue complication probability (NTCP) models for radiation pneumonitis (RP) traditionally rely on dosimetric and clinical data but are limited in terms of performance and generalizability. Extraction of pre-treatment image features provides a potential new category of data that can improve NTCP models for RP. We consider quantitative measures of total lung CT intensity and texture in a framework for prediction of RP. Methods: Available clinical and dosimetric data were collected for 198 NSCLC patients treated with definitive radiotherapy. Intensity- and texture-based image features were extracted from the T50 phase of the 4D-CT acquired for treatment planning. A total of 3888 features (15 clinical, 175 dosimetric, and 3698 image features) were gathered and considered candidate predictors for modeling of RP grade ≥3. A baseline logistic regression model with mean lung dose (MLD) was first considered. Additionally, a least absolute shrinkage and selection operator (LASSO) logistic regression was applied to the set of clinical and dosimetric features, and subsequently to the full set of clinical, dosimetric, and image features. Model performance was assessed by comparing area under the curve (AUC). Results: A simple logistic fit of MLD was an inadequate model of the data (AUC∼0.5). Including clinical and dosimetric parameters within the framework of the LASSO resulted in improved performance (AUC=0.648). Analysis of the full set of clinical, dosimetric, and image features provided further and significant improvement in model performance (AUC=0.727). Conclusions: To achieve significant gains in predictive modeling of RP, new categories of data should be considered in addition to clinical and dosimetric features. We have successfully incorporated CT image features into a framework for modeling RP and have demonstrated improved predictive performance. Validation and further investigation of CT image features in the context of RP NTCP
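The LASSO's appeal for a 3888-feature problem is that its L1 penalty drives weak coefficients exactly to zero, performing selection and fitting in one step. The sketch below shows that mechanism with coordinate descent on a linear (not logistic, as in the abstract) model, on synthetic standardized data; all names and values are invented for illustration.

```python
import random

def soft_threshold(z, lam):
    """The LASSO's selection mechanism: shrink toward zero, and set
    coefficients with |z| <= lam exactly to zero."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, iters=100):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1,
    assuming each column of X is standardized (mean 0, variance 1)."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(b[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            z = sum(X[i][j] * r[i] for i in range(n)) / n
            b[j] = soft_threshold(z, lam)
    return b

random.seed(2)
n, p = 200, 5
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
# standardize columns so the coordinate update is exact
for j in range(p):
    col = [X[i][j] for i in range(n)]
    mu = sum(col) / n
    sd = (sum((c - mu) ** 2 for c in col) / n) ** 0.5
    for i in range(n):
        X[i][j] = (X[i][j] - mu) / sd
# only features 0 and 1 truly matter; 2-4 are noise
y = [2.0 * row[0] - 1.5 * row[1] + random.gauss(0, 0.1) for row in X]

b = lasso_cd(X, y, lam=0.3)
print([round(c, 2) for c in b])
```

With the penalty `lam=0.3`, the three noise features get coefficients of exactly zero, which is the behavior that lets the LASSO sift thousands of candidate predictors down to a short model.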

  20. Innovative predictive maintenance concepts to improve life cycle management

    NARCIS (Netherlands)

    Tinga, Tiedo

    2014-01-01

    For naval systems with typically long service lives, high sustainment costs and strict availability requirements, an effective and efficient life cycle management process is very important. In this paper four approaches are discussed to improve that process: physics of failure based predictive

  1. Accounting for genetic architecture improves sequence based genomic prediction for a Drosophila fitness trait.

    Science.gov (United States)

    Ober, Ulrike; Huang, Wen; Magwire, Michael; Schlather, Martin; Simianer, Henner; Mackay, Trudy F C

    2015-01-01

    The ability to predict quantitative trait phenotypes from molecular polymorphism data will revolutionize evolutionary biology, medicine and human biology, and animal and plant breeding. Efforts to map quantitative trait loci have yielded novel insights into the biology of quantitative traits, but the combination of individually significant quantitative trait loci typically has low predictive ability. Utilizing all segregating variants can give good predictive ability in plant and animal breeding populations, but gives little insight into trait biology. Here, we used the Drosophila Genetic Reference Panel to perform both a genome wide association analysis and genomic prediction for the fitness-related trait chill coma recovery time. We found substantial total genetic variation for chill coma recovery time, with a genetic architecture that differs between males and females, a small number of molecular variants with large main effects, and evidence for epistasis. Although the top additive variants explained 36% (17%) of the genetic variance among lines in females (males), the predictive ability using genomic best linear unbiased prediction and a relationship matrix using all common segregating variants was very low for females and zero for males. We hypothesized that the low predictive ability was due to the mismatch between the infinitesimal genetic architecture assumed by the genomic best linear unbiased prediction model and the true genetic architecture of chill coma recovery time. Indeed, we found that the predictive ability of the genomic best linear unbiased prediction model is markedly improved when we combine quantitative trait locus mapping with genomic prediction by only including the top variants associated with main and epistatic effects in the relationship matrix. This trait-associated prediction approach has the advantage that it yields biologically interpretable prediction models.
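The genomic best linear unbiased prediction model the abstract contrasts with its trait-associated variant can be summarized in the standard textbook form (this formulation, including VanRaden's genomic relationship matrix, is general background and is not quoted from the paper):

```latex
% Mixed model underlying GBLUP
y = X\beta + Zu + e, \qquad u \sim N(0,\, G\sigma_u^2), \quad e \sim N(0,\, I\sigma_e^2)

% Genomic relationship matrix from the centered marker matrix W
G = \frac{WW'}{2\sum_k p_k (1 - p_k)}

% BLUP of the genetic values
\hat{u} = \sigma_u^2\, G Z' V^{-1} \left( y - X\hat{\beta} \right),
\qquad V = Z G Z' \sigma_u^2 + I \sigma_e^2
```

The trait-associated approach described above amounts to restricting the columns of W to the top variants with significant main and epistatic effects, which changes G and relaxes the infinitesimal assumption.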

  2. Improved feature selection based on genetic algorithms for real time disruption prediction on JET

    Energy Technology Data Exchange (ETDEWEB)

    Ratta, G.A., E-mail: garatta@gateme.unsj.edu.ar [GATEME, Facultad de Ingenieria, Universidad Nacional de San Juan, Avda. San Martin 1109 (O), 5400 San Juan (Argentina); JET EFDA, Culham Science Centre, OX14 3DB Abingdon (United Kingdom); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense, 40, 28040 Madrid (Spain); JET EFDA, Culham Science Centre, OX14 3DB Abingdon (United Kingdom); Murari, A. [Associazione EURATOM-ENEA per la Fusione, Consorzio RFX, 4-35127 Padova (Italy); JET EFDA, Culham Science Centre, OX14 3DB Abingdon (United Kingdom)

    2012-09-15

    Highlights: ► A new signal selection methodology to improve disruption prediction is reported. ► The approach is based on Genetic Algorithms. ► An advanced predictor has been created with the new set of signals. ► The new system obtains considerably higher prediction rates. - Abstract: The early prediction of disruptions is an important aspect of research in the field of Tokamak control. A very recent predictor, called 'Advanced Predictor Of Disruptions' (APODIS), developed for the 'Joint European Torus' (JET), implements real-time recognition of incoming disruptions with the best success rate achieved to date and outstanding stability over long periods following training. In this article, a new methodology to select the set of signal parameters in order to maximize the performance of the predictor is reported. The approach is based on 'Genetic Algorithms' (GAs). With the feature selection derived from GAs, a new version of APODIS has been developed. The results are significantly better than the previous version, not only in terms of success rates but also in extending the interval before the disruption in which reliable predictions are achieved. Correct disruption predictions with a success rate in excess of 90% have been achieved 200 ms before the time of the disruption. The predictor response is compared with that of JET's Protection System (JPS), and the APODIS predictor is shown to be far superior. Both systems have been carefully tested with a wide number of discharges to understand their relative merits and the most profitable directions for further improvements.
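GA-based feature selection of the kind described here encodes each candidate feature subset as a binary chromosome and evolves the population toward subsets that maximize predictor performance. The sketch below is not APODIS: the fitness function is a stub (rewarding four arbitrarily designated "informative" features, with a size penalty), where the real system would score the disruption predictor's success rate; all constants are illustrative.

```python
import random

random.seed(3)
N_FEATURES, POP, GENS = 12, 30, 40

# Stub fitness: reward chromosomes that pick the 4 "informative" features.
# In the real system this would be the trained predictor's success rate.
INFORMATIVE = {0, 3, 5, 9}
def fitness(chrom):
    hits = sum(1 for i in INFORMATIVE if chrom[i])
    cost = 0.05 * sum(chrom)          # penalize large feature sets
    return hits - cost

def tournament(pop):
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(POP)]
for _ in range(GENS):
    nxt = []
    while len(nxt) < POP:
        p1, p2 = tournament(pop), tournament(pop)
        cut = random.randrange(1, N_FEATURES)      # one-point crossover
        child = p1[:cut] + p2[cut:]
        if random.random() < 0.1:                   # bit-flip mutation
            j = random.randrange(N_FEATURES)
            child[j] = 1 - child[j]
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
print(sorted(i for i, bit in enumerate(best) if bit))
```

Because the fitness call wraps the whole train-and-evaluate pipeline, the same loop works whatever predictor sits behind it; the expensive part in practice is evaluating fitness, not the GA bookkeeping.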

  3. Solar radio proxies for improved satellite orbit prediction

    Science.gov (United States)

    Yaya, Philippe; Hecker, Louis; Dudok de Wit, Thierry; Fèvre, Clémence Le; Bruinsma, Sean

    2017-12-01

    Specification and forecasting of solar drivers to thermosphere density models is critical for satellite orbit prediction and debris avoidance. Satellite operators routinely forecast orbits up to 30 days into the future. This requires forecasts of the drivers to these orbit prediction models such as the solar Extreme-UV (EUV) flux and geomagnetic activity. Most density models use the 10.7 cm radio flux (F10.7 index) as a proxy for solar EUV. However, daily measurements at other centimetric wavelengths have also been performed by the Nobeyama Radio Observatory (Japan) since the 1950s, thereby offering prospects for improving orbit modeling. Here we present a pre-operational service at the Collecte Localisation Satellites company that collects these different observations in one single homogeneous dataset and provides a 30-day forecast on a daily basis. Interpolation and preprocessing algorithms were developed to fill in missing data and remove anomalous values. We compared various empirical time series prediction techniques and selected a multi-wavelength non-recursive analogue neural network. The prediction of the 30 cm flux, and to a lesser extent that of the 10.7 cm flux, performs better than NOAA's present prediction of the 10.7 cm flux, especially during periods of high solar activity. In addition, we find that the DTM-2013 density model (Drag Temperature Model) performs better with (past and predicted) values of the 30 cm radio flux than with the 10.7 flux.

  4. Solar radio proxies for improved satellite orbit prediction

    Directory of Open Access Journals (Sweden)

    Yaya Philippe

    2017-01-01

    Full Text Available Specification and forecasting of solar drivers to thermosphere density models is critical for satellite orbit prediction and debris avoidance. Satellite operators routinely forecast orbits up to 30 days into the future. This requires forecasts of the drivers to these orbit prediction models, such as the solar Extreme-UV (EUV) flux and geomagnetic activity. Most density models use the 10.7 cm radio flux (F10.7 index) as a proxy for solar EUV. However, daily measurements at other centimetric wavelengths have also been performed by the Nobeyama Radio Observatory (Japan) since the 1950s, thereby offering prospects for improving orbit modeling. Here we present a pre-operational service at the Collecte Localisation Satellites company that collects these different observations in one single homogeneous dataset and provides a 30-day forecast on a daily basis. Interpolation and preprocessing algorithms were developed to fill in missing data and remove anomalous values. We compared various empirical time series prediction techniques and selected a multi-wavelength non-recursive analogue neural network. The prediction of the 30 cm flux, and to a lesser extent that of the 10.7 cm flux, performs better than NOAA's present prediction of the 10.7 cm flux, especially during periods of high solar activity. In addition, we find that the DTM-2013 density model (Drag Temperature Model) performs better with (past and predicted) values of the 30 cm radio flux than with the 10.7 flux.

  5. A two-stage approach for improved prediction of residue contact maps

    Directory of Open Access Journals (Sweden)

    Pollastri Gianluca

    2006-03-01

    Full Text Available Abstract Background Protein topology representations such as residue contact maps are an important intermediate step towards ab initio prediction of protein structure. Although improvements have occurred over the last years, the problem of accurately predicting residue contact maps from primary sequences is still largely unsolved. Among the reasons for this are the unbalanced nature of the problem (with far fewer examples of contacts than non-contacts), the formidable challenge of capturing long-range interactions in the maps, and the intrinsic difficulty of mapping one-dimensional input sequences into two-dimensional output maps. In order to alleviate these problems and achieve improved contact map predictions, in this paper we split the task into two stages: the prediction of a map's principal eigenvector (PE) from the primary sequence, and the reconstruction of the contact map from the PE and primary sequence. Predicting the PE from the primary sequence consists in mapping a vector into a vector. This task is less complex than mapping vectors directly into two-dimensional matrices, since the size of the problem is drastically reduced and so is the scale length of the interactions that need to be learned. Results We develop architectures composed of ensembles of two-layered bidirectional recurrent neural networks to classify the components of the PE in 2, 3 and 4 classes from protein primary sequence, predicted secondary structure, and hydrophobicity interaction scales. Our predictor, tested on a non-redundant set of 2171 proteins, achieves classification performances of up to 72.6%, 16% above a base-line statistical predictor. We design a system for the prediction of contact maps from the predicted PE. Our results show that predicting maps through the PE yields sizeable gains, especially for long-range contacts, which are particularly critical for accurate protein 3D reconstruction. The final predictor's accuracy on a non-redundant set of 327 targets is 35

  6. Research on Improved Depth Belief Network-Based Prediction of Cardiovascular Diseases

    Directory of Open Access Journals (Sweden)

    Peng Lu

    2018-01-01

    Full Text Available Quantitative analysis and prediction can help to reduce the risk of cardiovascular disease. Quantitative prediction based on traditional models has low accuracy, and the variance of predictions from shallow neural networks is large. In this paper, a cardiovascular disease prediction model based on an improved deep belief network (DBN) is proposed. Using the reconstruction error, the network depth is determined automatically, and unsupervised training and supervised optimization are combined. This ensures the accuracy of model prediction while guaranteeing stability. Thirty experiments were performed independently on the Statlog (Heart) and Heart Disease Database data sets in the UCI database. Experimental results showed that the mean prediction accuracy was 91.26% and 89.78%, respectively, and the variance of prediction accuracy was 5.78 and 4.46, respectively.

  7. Inhaler Reminders Significantly Improve Asthma Patients' Use of Controller Medications

    Science.gov (United States)

    Inhaler reminders significantly improve asthma patients’ use of controller medications. Published Online: July 22, ... the burden and risk of asthma, but many patients do not use them regularly. This poor adherence ...

  8. Healthy, wealthy, and wise: retirement planning predicts employee health improvements.

    Science.gov (United States)

    Gubler, Timothy; Pierce, Lamar

    2014-09-01

    Are poor physical and financial health driven by the same underlying psychological factors? We found that the decision to contribute to a 401(k) retirement plan predicted whether an individual acted to correct poor physical-health indicators revealed during an employer-sponsored health examination. Using this examination as a quasi-exogenous shock to employees' personal-health knowledge, we examined which employees were more likely to improve their health, controlling for differences in initial health, demographics, job type, and income. We found that existing retirement-contribution patterns and future health improvements were highly correlated. Employees who saved for the future by contributing to a 401(k) showed improvements in their abnormal blood-test results and health behaviors approximately 27% more often than noncontributors did. These findings are consistent with an underlying individual time-discounting trait that is both difficult to change and domain interdependent, and that predicts long-term individual behaviors in multiple dimensions. © The Author(s) 2014.

  9. Improved prediction of signal peptides: SignalP 3.0

    DEFF Research Database (Denmark)

    Bendtsen, Jannick Dyrløv; Nielsen, Henrik; von Heijne, G.

    2004-01-01

    We describe improvements of the currently most popular method for prediction of classically secreted proteins, SignalP. SignalP consists of two different predictors based on neural network and hidden Markov model algorithms, where both components have been updated. Motivated by the idea that the ...

  10. Can decadal climate predictions be improved by ocean ensemble dispersion filtering?

    Science.gov (United States)

    Kadow, C.; Illing, S.; Kröner, I.; Ulbrich, U.; Cubasch, U.

    2017-12-01

    Decadal predictions by Earth system models aim to capture the state and phase of the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. While short-term weather forecasts represent an initial value problem and long-term climate projections represent a boundary condition problem, decadal climate prediction falls in between these two time scales. The ocean memory, due to its heat capacity, holds big potential skill on the decadal scale. In recent years, more precise initialization techniques of coupled Earth system models (incl. atmosphere and ocean) have improved decadal predictions. Ensembles are another important aspect. Applying slightly perturbed predictions results in an ensemble. Instead of using and evaluating a single prediction, employing the whole ensemble or its ensemble average improves a prediction system. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Here we show that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, called ensemble dispersion filter, results in more accurate results than the standard decadal prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution. Our results demonstrate how decadal climate predictions benefit from ocean ensemble dispersion filtering toward the ensemble mean. This study is part of MiKlip (fona-miklip.de), a major project on decadal climate prediction in Germany. We focus on the Max-Planck-Institute Earth System Model using the low-resolution version (MPI-ESM-LR) and MiKlip's basic initialization strategy, as in the 2017 published decadal climate forecast: http
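The core operation, shifting every ensemble member part of the way toward the ensemble mean at fixed intervals, can be shown with a toy scalar state. Everything numeric here is an assumption for illustration: the damped-persistence "ocean" dynamics, the relaxation fraction, and the three-step "seasonal" interval; the real filter acts on full 3-D ocean fields.

```python
import random

def step(state):
    """Toy member dynamics: damped persistence plus noise, standing in
    for one model time step of the ocean state."""
    return 0.95 * state + random.gauss(0.0, 0.1)

def dispersion_filter(members, alpha=0.5):
    """Shift every member a fraction `alpha` of the way toward the
    ensemble mean (alpha=1 would collapse the ensemble entirely)."""
    mean = sum(members) / len(members)
    return [m + alpha * (mean - m) for m in members]

random.seed(4)
members = [random.gauss(0.0, 1.0) for _ in range(10)]

for t in range(1, 121):        # e.g. 120 monthly steps
    members = [step(m) for m in members]
    if t % 3 == 0:             # apply the filter at "seasonal" intervals
        members = dispersion_filter(members)

spread = max(members) - min(members)
print(round(spread, 3))
```

The choice of `alpha` controls the trade-off the paper explores: damping ensemble dispersion (so members do not drift apart and lose the initialized signal) while keeping enough spread for the ensemble to remain informative.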

  11. Accounting for genetic architecture improves sequence based genomic prediction for a Drosophila fitness trait.

    Directory of Open Access Journals (Sweden)

    Ulrike Ober

    Full Text Available The ability to predict quantitative trait phenotypes from molecular polymorphism data will revolutionize evolutionary biology, medicine and human biology, and animal and plant breeding. Efforts to map quantitative trait loci have yielded novel insights into the biology of quantitative traits, but the combination of individually significant quantitative trait loci typically has low predictive ability. Utilizing all segregating variants can give good predictive ability in plant and animal breeding populations, but gives little insight into trait biology. Here, we used the Drosophila Genetic Reference Panel to perform both a genome wide association analysis and genomic prediction for the fitness-related trait chill coma recovery time. We found substantial total genetic variation for chill coma recovery time, with a genetic architecture that differs between males and females, a small number of molecular variants with large main effects, and evidence for epistasis. Although the top additive variants explained 36% (17%) of the genetic variance among lines in females (males), the predictive ability using genomic best linear unbiased prediction and a relationship matrix using all common segregating variants was very low for females and zero for males. We hypothesized that the low predictive ability was due to the mismatch between the infinitesimal genetic architecture assumed by the genomic best linear unbiased prediction model and the true genetic architecture of chill coma recovery time. Indeed, we found that the predictive ability of the genomic best linear unbiased prediction model is markedly improved when we combine quantitative trait locus mapping with genomic prediction by only including the top variants associated with main and epistatic effects in the relationship matrix. This trait-associated prediction approach has the advantage that it yields biologically interpretable prediction models.

  12. TMDIM: an improved algorithm for the structure prediction of transmembrane domains of bitopic dimers

    Science.gov (United States)

    Cao, Han; Ng, Marcus C. K.; Jusoh, Siti Azma; Tai, Hio Kuan; Siu, Shirley W. I.

    2017-09-01

    α-Helical transmembrane proteins are the most important drug targets in rational drug development. However, solving the experimental structures of these proteins remains difficult, and therefore computational methods to accurately and efficiently predict the structures are in great demand. We present an improved structure prediction method, TMDIM, based on Park et al. (Proteins 57:577-585, 2004) for predicting bitopic transmembrane protein dimers. The three major algorithmic improvements are the introduction of packing type classification, multiple-condition decoy filtering, and cluster-based candidate selection. In a test of predicting nine known bitopic dimers, approximately 78% of our predictions achieved a successful fit (RMSD below the threshold). The method is available as a web application implemented in PHP, MySQL and Apache, with all major browsers supported.

  13. Merging economics and epidemiology to improve the prediction and management of infectious disease.

    Science.gov (United States)

    Perrings, Charles; Castillo-Chavez, Carlos; Chowell, Gerardo; Daszak, Peter; Fenichel, Eli P; Finnoff, David; Horan, Richard D; Kilpatrick, A Marm; Kinzig, Ann P; Kuminoff, Nicolai V; Levin, Simon; Morin, Benjamin; Smith, Katherine F; Springborn, Michael

    2014-12-01

    Mathematical epidemiology, one of the oldest and richest areas in mathematical biology, has significantly enhanced our understanding of how pathogens emerge, evolve, and spread. Classical epidemiological models, the standard for predicting and managing the spread of infectious disease, assume that contacts between susceptible and infectious individuals depend on their relative frequency in the population. The behavioral factors that underpin contact rates are not generally addressed. There is, however, an emerging class of models that addresses the feedbacks between infectious disease dynamics and the behavioral decisions driving host contact. Referred to as "economic epidemiology" or "epidemiological economics," the approach explores the determinants of decisions about the number and type of contacts made by individuals, using insights and methods from economics. We show how the approach has the potential both to improve predictions of the course of infectious disease, and to support development of novel approaches to infectious disease management.
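The feedback the review describes can be shown with a minimal discrete-time SIR model in which the transmission rate falls as prevalence rises (people cut contacts when disease is common). The functional form `beta(I) = beta0 / (1 + k*I)` and all parameter values below are illustrative assumptions, not taken from the paper.

```python
def run_sir(beta0, k, gamma=0.1, days=300, i0=0.001):
    """Discrete-time SIR with prevalence-dependent transmission:
    beta(I) = beta0 / (1 + k * I).  k = 0 recovers the classical model
    with frequency-dependent contacts; k > 0 adds the behavioral
    feedback that economic epidemiology models endogenously."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):
        beta = beta0 / (1.0 + k * i)   # behavioral response to prevalence
        new_inf = beta * s * i
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak, r

peak_classic, final_classic = run_sir(beta0=0.3, k=0.0)
peak_adaptive, final_adaptive = run_sir(beta0=0.3, k=50.0)
print(round(peak_classic, 3), round(peak_adaptive, 3))
```

Even this crude feedback flattens the epidemic peak and shrinks the final size relative to the classical model, which is why ignoring behavior tends to overpredict outbreaks; the economic models go further by deriving the contact response from individual incentives rather than positing it.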

  14. A new model using routinely available clinical parameters to predict significant liver fibrosis in chronic hepatitis B.

    Directory of Open Access Journals (Sweden)

    Wai-Kay Seto

    Full Text Available OBJECTIVE: We developed a predictive model for significant fibrosis in chronic hepatitis B (CHB) based on routinely available clinical parameters. METHODS: 237 treatment-naïve CHB patients [58.4% hepatitis B e antigen (HBeAg)-positive] who had undergone liver biopsy were randomly divided into two cohorts: training group (n = 108) and validation group (n = 129). Liver histology was assessed for fibrosis. All common demographics, viral serology, viral load and liver biochemistry were analyzed. RESULTS: Based on 12 available clinical parameters (age, sex, HBeAg status, HBV DNA, platelet, albumin, bilirubin, ALT, AST, ALP, GGT and AFP), a model to predict significant liver fibrosis (Ishak fibrosis score ≥3) was derived using the five best parameters (age, ALP, AST, AFP and platelet). Using the formula log(index+1) = 0.025 + 0.0031(age) + 0.1483 log(ALP) + 0.004 log(AST) + 0.0908 log(AFP+1) - 0.028 log(platelet), the PAPAS (Platelet/Age/Phosphatase/AFP/AST) index predicts significant fibrosis with an area under the receiver operating characteristic (AUROC) curve of 0.776 [0.797 for patients with ALT <2× upper limit of normal (ULN)]. The negative predictive value to exclude significant fibrosis was 88.4%. This predictive power is superior to other non-invasive models using common parameters, including the AST/platelet/GGT/AFP (APGA) index, AST/platelet ratio index (APRI), and the FIB-4 index (AUROC of 0.757, 0.708 and 0.723 respectively). Using the PAPAS index, 67.5% of liver biopsies for patients being considered for treatment with ALT <2×ULN could be avoided. CONCLUSION: The PAPAS index can predict and exclude significant fibrosis, and may reduce the need for liver biopsy in CHB patients.
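The PAPAS formula reported in the abstract, log(index+1) = 0.025 + 0.0031(age) + 0.1483 log(ALP) + 0.004 log(AST) + 0.0908 log(AFP+1) - 0.028 log(platelet), can be implemented as a small calculator. Two caveats: the abstract does not state the logarithm base, so base 10 is assumed here, and the example input values are invented, not patient data or published cutoffs.

```python
from math import log10

def papas_index(age, alp, ast, afp, platelet):
    """PAPAS index from the published formula. Base-10 logarithms are
    an assumption; the abstract does not specify the base."""
    rhs = (0.025
           + 0.0031 * age
           + 0.1483 * log10(alp)
           + 0.004 * log10(ast)
           + 0.0908 * log10(afp + 1)
           - 0.028 * log10(platelet))
    return 10 ** rhs - 1   # invert log(index + 1) = rhs

# illustrative values only (units as in routine liver panels)
print(round(papas_index(age=50, alp=90, ast=40, afp=4, platelet=200), 3))
```

Note the signs track the clinical intuition encoded in the model: age, ALP, AST and AFP push the index up, while a higher platelet count pushes it down.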

  15. Meta-Analysis of Predictive Significance of the Black Hole Sign for Hematoma Expansion in Intracerebral Hemorrhage.

    Science.gov (United States)

    Zheng, Jun; Yu, Zhiyuan; Guo, Rui; Li, Hao; You, Chao; Ma, Lu

    2018-04-27

    Hematoma expansion is related to unfavorable prognosis in intracerebral hemorrhage (ICH). The black hole sign is a novel marker on non-contrast computed tomography for predicting hematoma expansion. However, its predictive values are different in previous studies. Thus, this meta-analysis was conducted to evaluate the predictive significance of the black hole sign for hematoma expansion in ICH. A systematic literature search was performed. Original researches on the association between the black hole sign and hematoma expansion in ICH were included. Sensitivity and specificity were pooled to assess the predictive accuracy. Summary receiver operating characteristics curve (SROC) was developed. Deeks' funnel plot asymmetry test was used to assess the publication bias. Five studies with a total of 1495 patients were included in this study. The pooled sensitivity and specificity of the black hole sign for predicting hematoma expansion were 0.30 and 0.91, respectively. The area under the curve was 0.78 in SROC curve. There was no significant publication bias. This meta-analysis shows that the black hole sign is a helpful imaging marker for predicting hematoma expansion in ICH. Although the black hole sign has a relatively low sensitivity, its specificity is relatively high. Copyright © 2018 Elsevier Inc. All rights reserved.
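The per-study quantities pooled in a meta-analysis like this come from each study's 2×2 table of the sign against observed hematoma expansion. A minimal sketch with hypothetical counts (the real analysis pooled five studies with a proper bivariate model, not the naive count-summing shown here):

```python
def sens_spec(tp, fp, fn, tn):
    """Sensitivity and specificity from one study's 2x2 table."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical per-study counts: (TP, FP, FN, TN) for the imaging sign
studies = [(20, 15, 45, 170),
           (35, 20, 80, 300),
           (12,  8, 30, 125)]

for s in studies:
    se, sp = sens_spec(*s)
    print(round(se, 2), round(sp, 2))

# naive pooling by summing counts; published meta-analyses instead fit
# a bivariate random-effects model that respects between-study variation
tp, fp, fn, tn = (sum(col) for col in zip(*studies))
pooled_se, pooled_sp = sens_spec(tp, fp, fn, tn)
print(round(pooled_se, 2), round(pooled_sp, 2))
```

The low-sensitivity/high-specificity pattern the abstract reports means a present sign is informative (few false alarms) while an absent sign rules little out, which is exactly what the 2×2 arithmetic makes visible.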

  16. Improved Helicopter Rotor Performance Prediction through Loose and Tight CFD/CSD Coupling

    Science.gov (United States)

    Ickes, Jacob C.

    Helicopters and other Vertical Take-Off or Landing (VTOL) vehicles exhibit an interesting combination of structural dynamic and aerodynamic phenomena which together drive the rotor performance. The combination of factors involved make simulating the rotor a challenging and multidisciplinary effort, and one which is still an active area of interest in the industry because of the money and time it could save during design. Modern tools allow the prediction of rotorcraft physics from first principles. Analysis of the rotor system with this level of accuracy provides the understanding necessary to improve its performance. There has historically been a divide between the comprehensive codes which perform aeroelastic rotor simulations using simplified aerodynamic models, and the very computationally intensive Navier-Stokes Computational Fluid Dynamics (CFD) solvers. As computer resources become more available, efforts have been made to replace the simplified aerodynamics of the comprehensive codes with the more accurate results from a CFD code. The objective of this work is to perform aeroelastic rotorcraft analysis using first-principles simulations for both fluids and structural predictions using tools available at the University of Toledo. Two separate codes are coupled together in both loose coupling (data exchange on a periodic interval) and tight coupling (data exchange each time step) schemes. To allow the coupling to be carried out in a reliable and efficient way, a Fluid-Structure Interaction code was developed which automatically performs primary functions of loose and tight coupling procedures. Flow phenomena such as transonics, dynamic stall, locally reversed flow on a blade, and Blade-Vortex Interaction (BVI) were simulated in this work. Results of the analysis show aerodynamic load improvement due to the inclusion of the CFD-based airloads in the structural dynamics analysis of the Computational Structural Dynamics (CSD) code. Improvements came in the form

  17. Significant Improvements in Pyranometer Nighttime Offsets Using High-Flow DC Ventilation

    Energy Technology Data Exchange (ETDEWEB)

    Kutchenreiter, Mark; Michalski, J.J.; Long, C.N.; Habte, Aron

    2017-05-22

    Accurate solar radiation measurements using pyranometers are required to understand radiative impacts on the Earth's energy budget, solar energy production, and to validate radiative transfer models. Ventilators of pyranometers, which are used to keep the domes clean and dry, also affect instrument thermal offset accuracy. This poster presents a high-level overview of the ventilators for single-black-detector pyranometers and black-and-white pyranometers. For single-black-detector pyranometers with ventilators, high-flow-rate (50-CFM and higher), 12-V DC fans lower the offsets, lower the scatter, and improve the predictability of nighttime offsets compared to lower-flow-rate (35-CFM), 120-V AC fans operated in the same type of environmental setup. Black-and-white pyranometers, which are used to measure diffuse horizontal irradiance, sometimes show minor improvement with DC fan ventilation, but their offsets are always small, usually no more than 1 W/m2, whether AC- or DC-ventilated.

  18. Changes in the Oswestry Disability Index that predict improvement after lumbar fusion.

    Science.gov (United States)

    Djurasovic, Mladen; Glassman, Steven D; Dimar, John R; Crawford, Charles H; Bratcher, Kelly R; Carreon, Leah Y

    2012-11-01

    Clinical studies use both disease-specific and generic health outcomes measures. Disease-specific measures focus on health domains most relevant to the clinical population, while generic measures assess overall health-related quality of life. There is little information about which domains of the Oswestry Disability Index (ODI) are most important in determining improvement in overall health-related quality of life, as measured by the 36-Item Short Form Health Survey (SF-36), after lumbar spinal fusion. The objective of the study is to determine which clinical elements assessed by the ODI most influence improvement of overall health-related quality of life. A single tertiary spine center database was used to identify patients undergoing lumbar fusion for standard degenerative indications. Patients with complete preoperative and 2-year outcomes measures were included. Pearson correlation was used to assess the relationship between improvement in each item of the ODI with improvement in the SF-36 physical component summary (PCS) score, as well as achievement of the SF-36 PCS minimum clinically important difference (MCID). Multivariate regression modeling was used to examine which items of the ODI best predicted achievement for the SF-36 PCS MCID. The effect size and standardized response mean were calculated for each of the items of the ODI. A total of 1104 patients met inclusion criteria (674 female and 430 male patients). The mean age at surgery was 57 years. All items of the ODI showed significant correlations with the change in SF-36 PCS score and achievement of MCID for the SF-36 PCS, but only pain intensity, walking, and social life had r values > 0.4 reflecting moderate correlation. These 3 variables were also the dimensions that were independent predictors of the SF-36 PCS, and they were the only dimensions that had effect sizes and standardized response means that were moderate to large. Of the health dimensions measured by the ODI, pain intensity, walking

  19. Concurrent Modeling of Hydrodynamics and Interaction Forces Improves Particle Deposition Predictions.

    Science.gov (United States)

    Jin, Chao; Ren, Carolyn L; Emelko, Monica B

    2016-04-19

    It is widely believed that media surface roughness enhances particle deposition; numerous, but inconsistent, examples of this effect have been reported. Here, a new mathematical framework describing the effects of hydrodynamics and interaction forces on particle deposition on rough spherical collectors in the absence of an energy barrier was developed and validated. In addition to quantifying the DLVO force, the model includes improved descriptions of flow field profiles and hydrodynamic retardation functions. This work demonstrates that hydrodynamic effects can significantly alter particle deposition relative to expectations when only the DLVO force is considered. Moreover, the combined effects of hydrodynamics and interaction forces on particle deposition on rough, spherical media are not additive, but synergistic. Notably, the developed model's particle deposition predictions are in closer agreement with experimental observations than those from current models, demonstrating the importance of including roughness impacts in particle deposition description/simulation. Consideration of hydrodynamic contributions to particle deposition may help to explain discrepancies between model-based expectations and experimental outcomes and improve descriptions of particle deposition during physicochemical filtration in systems with nonsmooth collector surfaces.

  20. Improved prediction of aerodynamic noise from wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Guidati, G.; Bareiss, R.; Wagner, S. [Univ. of Stuttgart, Inst. of Aerodynamics and Gasdynamics, Stuttgart (Germany)

    1997-12-31

    This paper focuses on an improved prediction model for inflow-turbulence noise which takes the true airfoil shape into account. Predictions are compared to the results of acoustic measurements on three 2D models of 0.25 m chord. Two of the models have NACA-636xx airfoils of 12% and 18% relative thickness. The third airfoil was acoustically optimized by using the new prediction model. In the experiments the turbulence intensity of the flow was strongly increased by mounting a grid with 60 mm wide meshes and 12 mm thick rods onto the tunnel exhaust nozzle. The sound radiated from the airfoil was distinguished from the tunnel background noise by using an acoustic antenna consisting of a cross array of 36 microphones in total. Applying a standard beam-forming algorithm makes it possible to determine how much noise is radiated from different parts of the models. This procedure normally results in a peak at the leading and trailing edge of the airfoil. The strength of the leading-edge peak is taken as the source strength for inflow-turbulence noise. (LN) 14 refs.

  1. BAYESIAN FORECASTS COMBINATION TO IMPROVE THE ROMANIAN INFLATION PREDICTIONS BASED ON ECONOMETRIC MODELS

    Directory of Open Access Journals (Sweden)

    Mihaela Simionescu

    2014-12-01

    Full Text Available There are many types of econometric models used in predicting the inflation rate, but in this study we used a Bayesian shrinkage combination approach. This methodology is used in order to improve the prediction accuracy by including information that is not captured by the econometric models. Therefore, experts’ forecasts are utilized as prior information; for Romania these predictions are provided by the Institute for Economic Forecasting (Dobrescu macromodel), the National Commission for Prognosis and the European Commission. The empirical results for Romanian inflation show the superiority of a fixed effects model compared to other types of econometric models like VAR, Bayesian VAR, simultaneous equations model, dynamic model, log-linear model. The Bayesian combinations that used experts’ predictions as priors, when the shrinkage parameter tends to infinity, improved the accuracy of all forecasts based on individual models, outperforming also zero and equal weights predictions and naïve forecasts.
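The shrinkage behavior described above can be illustrated with a minimal sketch. The function name and the precision-weighting form are illustrative assumptions, not the paper's exact estimator; the point is only the two limiting cases of the shrinkage parameter:

```python
def shrinkage_combine(model_forecast, expert_prior, lam):
    """Combine an econometric model forecast with an expert prior.

    lam >= 0 is the shrinkage parameter: lam = 0 leaves the model
    forecast unchanged, while lam -> infinity collapses the
    combination onto the expert prior (the limiting case the study
    found most accurate).
    """
    w = lam / (1.0 + lam)  # weight placed on the expert prior
    return (1.0 - w) * model_forecast + w * expert_prior
```

With lam = 0 the model forecast passes through untouched; with a very large lam the output is essentially the experts' prior, matching the limiting case reported in the abstract.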

  2. Bilirubin nomogram for prediction of significant hyperbilirubinemia in north Indian neonates.

    Science.gov (United States)

    Pathak, Umesh; Chawla, Deepak; Kaur, Saranjit; Jain, Suksham

    2013-04-01

    (i) To construct an hour-specific serum total bilirubin (STB) nomogram in neonates born at ≥35 weeks of gestation; (ii) to evaluate the efficacy of pre-discharge bilirubin measurement in predicting hyperbilirubinemia needing treatment. Diagnostic test performance in a prospective cohort study. Teaching hospital in Northern India. Healthy neonates with gestation ≥35 weeks or birth weight ≥2000 g. Serum total bilirubin was measured in all enrolled neonates at 24 ± 6, 72-96 and 96-144 h of postnatal age and when indicated clinically. Neonates were followed up during hospital stay and after discharge till completion of the 7th postnatal day. The key outcome was significant hyperbilirubinemia (SHB), defined as need of phototherapy based on modified American Academy of Pediatrics (AAP) guidelines. In neonates born at 38 or more weeks of gestation the middle line, and in neonates born at 37 or fewer completed weeks of gestation the lower line, of phototherapy thresholds was used to initiate phototherapy. For construction of the nomogram, STB values were clubbed in six-hour epochs (age ± 3 hours) for postnatal age up to 48 h and twelve-hour epochs (age ± 6 hours) for age beyond 48 h. Predictive ability of the nomogram was assessed by calculating sensitivity, specificity, positive predictive value, negative predictive value and likelihood ratio, by plotting the receiver-operating characteristics (ROC) curve and calculating the c-statistic. 997 neonates (birth weight: 2627 ± 536 g, gestation: 37.8 ± 1.5 weeks) were enrolled, of which 931 completed follow-up. Among enrolled neonates 344 (34.5%) were low birth weight. The rate of exclusive breastfeeding during hospital stay was more than 80%. The bilirubin nomogram was constructed using 40th, 75th and 95th percentile values of hour-specific bilirubin. Pre-discharge STB of ≥95th percentile was assigned to the high-risk zone, between the 75th and 94th centile to the upper-intermediate risk zone, between the 40th and 74th centile to the lower-intermediate risk zone and below the 40th

  3. Using AIRS retrievals in the WRF-LETKF system to improve regional numerical weather prediction

    Directory of Open Access Journals (Sweden)

    Takemasa Miyoshi

    2012-09-01

    Full Text Available In addition to conventional observations, atmospheric temperature and humidity profile data from the Atmospheric Infrared Sounder (AIRS) Version 5 retrieval products are assimilated into the Weather Research and Forecasting (WRF) model, using the local ensemble transform Kalman filter (LETKF). Although a naive assimilation of all available quality-controlled AIRS retrieval data yields an inferior analysis, the additional enhancements of adaptive inflation and horizontal data thinning result in a general improvement of numerical weather prediction skill due to AIRS data. In particular, the adaptive inflation method is enhanced so that it no longer assumes temporal homogeneity of the observing network and allows for a better treatment of the temporally inhomogeneous AIRS data. Results indicate that the improvements due to AIRS data are more significant in longer-lead forecasts. Forecasts of Typhoons Sinlaku and Jangmi in September 2008 show improvements due to AIRS data.

  4. Improved part-of-speech prediction in suffix analysis.

    Directory of Open Access Journals (Sweden)

    Mario Fruzangohar

    Full Text Available MOTIVATION: Predicting the part of speech (POS) tag of an unknown word in a sentence is a significant challenge. This is particularly difficult in biomedicine, where POS tags serve as an input to training sophisticated literature summarization techniques, such as those based on Hidden Markov Models (HMM). Different approaches have been taken to deal with the POS tagger challenge, but with one exception--the TnT POS tagger--previous publications on POS tagging have omitted details of the suffix analysis used for handling unknown words. The suffix of an English word is a strong predictor of a POS tag for that word. As a pre-requisite for an accurate HMM POS tagger for biomedical publications, we present an efficient suffix prediction method for integration into a POS tagger. RESULTS: We have implemented a fully functional HMM POS tagger using experimentally optimised suffix-based prediction. Our simple suffix analysis method significantly outperformed the probability-interpolation-based TnT method. We have also shown how important suffix analysis can be for probability estimation of a known word (in the training corpus) with an unseen POS tag; a common scenario with a small training corpus. We then integrated this simple method in our POS tagger and determined an optimised parameter set for both methods, which can help developers to optimise their current algorithm, based on our results. We also introduce the concept of counting methods in maximum likelihood estimation for the first time and show how counting methods can affect the prediction result. Finally, we describe how machine-learning techniques were applied to identify words for which prediction of POS tags was always incorrect, and propose a method to handle words of this type. AVAILABILITY AND IMPLEMENTATION: Java source code, binaries and setup instructions are freely available at http://genomes.sapac.edu.au/text_mining/pos_tagger.zip.
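The core idea of suffix-based POS prediction can be sketched in a few lines: count tag frequencies for every word suffix in the training data, then back off from the longest known suffix at prediction time. This is a generic illustration of suffix analysis, not the paper's exact (probability-interpolation-free) method:

```python
from collections import defaultdict

def train_suffix_model(tagged_words, max_len=4):
    """Count tag frequencies for every word suffix up to max_len chars."""
    counts = defaultdict(lambda: defaultdict(int))
    for word, tag in tagged_words:
        for k in range(1, min(max_len, len(word)) + 1):
            counts[word[-k:]][tag] += 1
    return counts

def predict_tag(word, counts, max_len=4, default="NN"):
    """Back off from the longest known suffix to shorter ones."""
    for k in range(min(max_len, len(word)), 0, -1):
        suffix = word[-k:]
        if suffix in counts:
            return max(counts[suffix], key=counts[suffix].get)
    return default
```

For example, a model trained on "-ing" verbs will tag the unseen word "walking" as a gerund via its "ing" suffix.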

  5. Bedtime Blood Pressure Chronotherapy Significantly Improves Hypertension Management.

    Science.gov (United States)

    Hermida, Ramón C; Ayala, Diana E; Fernández, José R; Mojón, Artemio; Crespo, Juan J; Ríos, María T; Smolensky, Michael H

    2017-10-01

    Consistent evidence from numerous studies substantiates that the asleep blood pressure (BP) mean derived from ambulatory BP monitoring (ABPM) is both an independent and a stronger predictor of cardiovascular disease (CVD) risk than are daytime clinic BP measurements or the ABPM-determined awake or 24-hour BP means. Hence, cost-effective adequate control of sleep-time BP is of marked clinical relevance. Timing the ingestion of hypertension medications of 6 different classes, and their combinations, according to circadian rhythms significantly improves BP control, particularly sleep-time BP, and reduces adverse effects. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Spatially resolved flux measurements of NOx from London suggest significantly higher emissions than predicted by inventories.

    Science.gov (United States)

    Vaughan, Adam R; Lee, James D; Misztal, Pawel K; Metzger, Stefan; Shaw, Marvin D; Lewis, Alastair C; Purvis, Ruth M; Carslaw, David C; Goldstein, Allen H; Hewitt, C Nicholas; Davison, Brian; Beevers, Sean D; Karl, Thomas G

    2016-07-18

    To date, direct validation of city-wide emissions inventories for air pollutants has been difficult or impossible. However, recent technological innovations now allow direct measurement of pollutant fluxes from cities, for comparison with emissions inventories, which are themselves commonly used for prediction of current and future air quality and to help guide abatement strategies. Fluxes of NOx were measured using the eddy-covariance technique from an aircraft flying at low altitude over London. The highest fluxes were observed over central London, with lower fluxes measured in suburban areas. A footprint model was used to estimate the spatial area from which the measured emissions occurred. This allowed comparison of the flux measurements to the UK's National Atmospheric Emissions Inventory (NAEI) for NOx, with scaling factors used to account for the actual time of day, day of week and month of year of the measurement. The comparison suggests significant underestimation of NOx emissions in London by the NAEI, mainly due to its under-representation of real world road traffic emissions. A comparison was also carried out with an enhanced version of the inventory using real world driving emission factors and road measurement data taken from the London Atmospheric Emissions Inventory (LAEI). The measurement to inventory agreement was substantially improved using the enhanced version, showing the importance of fully accounting for road traffic, which is the dominant NOx emission source in London. In central London there was still an underestimation by the inventory of 30-40% compared with flux measurements, suggesting significant improvements are still required in the NOx emissions inventory.
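The eddy-covariance technique mentioned above reduces, at its core, to the covariance of vertical wind speed and pollutant concentration over an averaging window. A minimal sketch (illustrative function name; real airborne flux processing adds coordinate rotation, detrending, and footprint weighting):

```python
from statistics import mean

def eddy_covariance_flux(w, c):
    """Eddy-covariance flux estimate: the covariance of vertical wind
    speed w and scalar concentration c over an averaging window.
    Flux = mean(w' * c'), where primes denote deviations from the
    window means (Reynolds decomposition)."""
    w_bar, c_bar = mean(w), mean(c)
    return mean((wi - w_bar) * (ci - c_bar) for wi, ci in zip(w, c))
```

A positive flux indicates net upward transport of the pollutant, i.e. a surface emission, which is what the London overflights measured.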

  7. National Emergency Preparedness and Response: Improving for Incidents of National Significance

    National Research Council Canada - National Science Library

    Clayton, Christopher M

    2006-01-01

    The national emergency management system has need of significant improvement in its contingency planning and early consolidation of effort and coordination between federal, state, and local agencies...

  8. Big Data, Predictive Analytics, and Quality Improvement in Kidney Transplantation: A Proof of Concept.

    Science.gov (United States)

    Srinivas, T R; Taber, D J; Su, Z; Zhang, J; Mour, G; Northrup, D; Tripathi, A; Marsden, J E; Moran, W P; Mauldin, P D

    2017-03-01

    We sought proof of concept of a Big Data Solution incorporating longitudinal structured and unstructured patient-level data from electronic health records (EHR) to predict graft loss (GL) and mortality. For a quality improvement initiative, GL and mortality prediction models were constructed using baseline and follow-up data (0-90 days posttransplant; structured and unstructured for 1-year models; data up to 1 year for 3-year models) on adult solitary kidney transplant recipients transplanted during 2007-2015 as follows: Model 1: United Network for Organ Sharing (UNOS) data; Model 2: UNOS & Transplant Database (Tx Database) data; Model 3: UNOS, Tx Database & EHR comorbidity data; and Model 4: UNOS, Tx Database, EHR data, posttransplant trajectory data, and unstructured data. A 10% 3-year GL rate was observed among 891 patients (2007-2015). Layering of data sources improved model performance: Model 1: area under the curve (AUC), 0.66 (95% confidence interval [CI]: 0.60-0.72); Model 2: AUC, 0.68 (95% CI: 0.61-0.74); Model 3: AUC, 0.72 (95% CI: 0.66-0.77); Model 4: AUC, 0.84 (95% CI: 0.79-0.89). One-year GL (AUC, 0.87; Model 4) and 3-year mortality (AUC, 0.84; Model 4) models performed similarly. A Big Data approach significantly adds efficacy to GL and mortality prediction models and is EHR deployable to optimize outcomes. © 2016 The American Society of Transplantation and the American Society of Transplant Surgeons.
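The AUC figures used to compare the layered models have a simple probabilistic reading that can be computed directly. A minimal sketch (generic rank-statistic AUC, not the study's statistical software):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen positive case is assigned a
    higher risk score than a randomly chosen negative case (ties
    count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Under this reading, Model 4's AUC of 0.84 means a graft-loss case outranks a non-case 84% of the time.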

  9. Improving Robustness of Hydrologic Ensemble Predictions Through Probabilistic Pre- and Post-Processing in Sequential Data Assimilation

    Science.gov (United States)

    Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.

    2018-03-01

    Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.
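The EnKF analysis step at the heart of the framework above can be sketched for the simplest possible case. This is a generic one-dimensional stochastic EnKF with a direct observation, assumed for illustration only, not the paper's hydrologic implementation:

```python
import random
from statistics import mean, pvariance

def enkf_update(ensemble, obs, obs_var, rng=None):
    """One stochastic ensemble Kalman filter analysis step for a
    scalar state that is observed directly (H = 1). Each member is
    nudged toward its own perturbed copy of the observation by the
    Kalman gain computed from the ensemble spread."""
    rng = rng or random.Random(0)
    p = pvariance(ensemble)            # forecast error variance
    k = p / (p + obs_var)              # Kalman gain
    return [x + k * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]
```

With an accurate observation (small obs_var) the gain approaches 1 and the analysis ensemble both shifts toward the observation and tightens, which is the behavior the pre-processing step tunes via inflation and other EnKF settings.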

  10. Accurate and dynamic predictive model for better prediction in medicine and healthcare.

    Science.gov (United States)

    Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S

    2018-05-01

    Information and communication technologies (ICTs) have brought new integrated operations and methods to all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care have likewise been influenced by new technologies to predict different disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. To improve predictive model performance, this paper proposes a predictive model that classifies disease predictions into different categories. To achieve this model performance, the paper uses traumatic brain injury (TBI) datasets. TBI is a serious condition worldwide and demands attention because of its severe impact on human life. The proposed predictive model improves the predictive performance for TBI. The TBI data set was developed and approved by neurologists to set its features. The experiment results show that the proposed model has achieved significant results in accuracy, sensitivity, and specificity.

  11. Improved prediction of breast cancer outcome by identifying heterogeneous biomarkers.

    Science.gov (United States)

    Choi, Jonghwan; Park, Sanghyun; Yoon, Youngmi; Ahn, Jaegyoon

    2017-11-15

    Identification of genes that can be used to predict prognosis in patients with cancer is important in that it can lead to improved therapy, and can also promote our understanding of tumor progression on the molecular level. One of the common but fundamental problems that render identification of prognostic genes and prediction of cancer outcomes difficult is the heterogeneity of patient samples. To reduce the effect of sample heterogeneity, we clustered data samples using K-means algorithm and applied modified PageRank to functional interaction (FI) networks weighted using gene expression values of samples in each cluster. Hub genes among resulting prioritized genes were selected as biomarkers to predict the prognosis of samples. This process outperformed traditional feature selection methods as well as several network-based prognostic gene selection methods when applied to Random Forest. We were able to find many cluster-specific prognostic genes for each dataset. Functional study showed that distinct biological processes were enriched in each cluster, which seems to reflect different aspect of tumor progression or oncogenesis among distinct patient groups. Taken together, these results provide support for the hypothesis that our approach can effectively identify heterogeneous prognostic genes, and these are complementary to each other, improving prediction accuracy. https://github.com/mathcom/CPR. jgahn@inu.ac.kr. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  12. Improving rational thermal comfort prediction by using subpopulation characteristics: A case study at Hermitage Amsterdam.

    Science.gov (United States)

    Kramer, Rick; Schellen, Lisje; Schellen, Henk; Kingma, Boris

    2017-01-01

    This study aims to improve the prediction accuracy of the rational standard thermal comfort model, known as the Predicted Mean Vote (PMV) model, by (1) calibrating one of its input variables "metabolic rate," and (2) extending it by explicitly incorporating the variable running mean outdoor temperature (RMOT) that relates to adaptive thermal comfort. The analysis was performed with survey data (n = 1121) and climate measurements of the indoor and outdoor environment from a year-long case study undertaken at Hermitage Amsterdam museum in the Netherlands. The PMVs were calculated for 35 survey days using (1) an a priori assumed metabolic rate, (2) a calibrated metabolic rate found by fitting the PMVs to the thermal sensation votes (TSVs) of each respondent using an optimization routine, and (3) extending the PMV model by including the RMOT. The results show that the calibrated metabolic rate is estimated to be 1.5 Met for this case study that was predominantly visited by elderly females. However, significant differences in metabolic rates have been revealed between adults and elderly showing the importance of differentiating between subpopulations. Hence, the standard tabular values, which only differentiate between various activities, may be oversimplified for many cases. Moreover, extending the PMV model with the RMOT substantially improves the thermal sensation prediction, but thermal sensation toward extreme cool and warm sensations remains partly underestimated.
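The RMOT variable used to extend the PMV model is conventionally an exponentially weighted mean of past daily outdoor temperatures. A minimal sketch, assuming the standard recursive form with decay constant alpha (the exact weighting used in the study is not stated in this record):

```python
def running_mean_outdoor_temp(daily_means, alpha=0.8):
    """Exponentially weighted running mean outdoor temperature (RMOT)
    as used in adaptive comfort standards such as EN 15251: the most
    recent day carries the most weight, and each earlier day is
    discounted by alpha. daily_means is ordered oldest -> newest."""
    t_rm = daily_means[0]
    for t in daily_means[1:]:
        t_rm = (1.0 - alpha) * t + alpha * t_rm
    return t_rm
```

A constant temperature history yields that same constant; after a sudden warm day, only a fraction (1 - alpha) of the jump enters the running mean, reflecting the slow adaptation the variable is meant to capture.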

  13. Partition dataset according to amino acid type improves the prediction of deleterious non-synonymous SNPs

    International Nuclear Information System (INIS)

    Yang, Jing; Li, Yuan-Yuan; Li, Yi-Xue; Ye, Zhi-Qiang

    2012-01-01

    Highlights: ► Proper dataset partition can improve the prediction of deleterious nsSNPs. ► Partition according to original residue type at nsSNP is a good criterion. ► Similar strategy is supposed promising in other machine learning problems. -- Abstract: Many non-synonymous SNPs (nsSNPs) are associated with diseases, and numerous machine learning methods have been applied to train classifiers for sorting disease-associated nsSNPs from neutral ones. The continuously accumulated nsSNP data allows us to further explore better prediction approaches. In this work, we partitioned the training data into 20 subsets according to either original or substituted amino acid type at the nsSNP site. Using support vector machine (SVM), training classification models on each subset resulted in an overall accuracy of 76.3% or 74.9% depending on the two different partition criteria, while training on the whole dataset obtained an accuracy of only 72.6%. Moreover, the dataset was also randomly divided into 20 subsets, but the corresponding accuracy was only 73.2%. Our results demonstrated that partitioning the whole training dataset into subsets properly, i.e., according to the residue type at the nsSNP site, will improve the performance of the trained classifiers significantly, which should be valuable in developing better tools for predicting the disease-association of nsSNPs.

  14. Mid-Treatment Sleep Duration Predicts Clinically Significant Knee Osteoarthritis Pain reduction at 6 months: Effects From a Behavioral Sleep Medicine Clinical Trial.

    Science.gov (United States)

    Salwen, Jessica K; Smith, Michael T; Finan, Patrick H

    2017-02-01

    To determine the relative influence of sleep continuity (sleep efficiency, sleep onset latency, total sleep time [TST], and wake after sleep onset) on clinical pain outcomes within a trial of cognitive behavioral therapy for insomnia (CBT-I) for patients with comorbid knee osteoarthritis and insomnia. Secondary analyses were performed on data from 74 patients with comorbid insomnia and knee osteoarthritis who completed a randomized clinical trial of 8-session multicomponent CBT-I versus an active behavioral desensitization control condition (BD), including a 6-month follow-up assessment. Data used herein include daily diaries of sleep parameters, actigraphy data, and self-report questionnaires administered at specific time points. Patients who reported at least 30% improvement in self-reported pain from baseline to 6-month follow-up were considered responders (N = 31). Pain responders and nonresponders did not differ significantly at baseline across any sleep continuity measures. At mid-treatment, only TST predicted pain response via t tests and logistic regression, whereas other measures of sleep continuity were nonsignificant. Recursive partitioning analyses identified a minimum cut-point of 382 min of TST achieved at mid-treatment in order to best predict pain improvements at 6 months posttreatment. Actigraphy results followed the same pattern as daily diary-based results. Clinically significant pain reductions in response to both CBT-I and BD were optimally predicted by achieving approximately 6.5 hr of sleep duration by mid-treatment. Thus, tailoring interventions to increase TST early in treatment may be an effective strategy to promote long-term pain reductions. More comprehensive research on components of behavioral sleep medicine treatments that contribute to pain response is warranted. © Sleep Research Society 2016. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.
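The cut-point search behind the recursive partitioning result reduces, for a single split, to scanning candidate TST thresholds and keeping the best separator. A minimal sketch (a one-level decision stump, assumed for illustration; the study's recursive partitioning is more elaborate):

```python
def best_cutpoint(tst_minutes, responder):
    """Single-split recursive partitioning: scan candidate total
    sleep time thresholds and keep the one that best separates pain
    responders (predict responder iff TST >= threshold)."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(tst_minutes)):
        correct = sum((x >= t) == y
                      for x, y in zip(tst_minutes, responder))
        acc = correct / len(responder)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc
```

On data where responders cluster above roughly 380 minutes, the stump recovers that boundary, analogous to the 382-minute cut-point reported above.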

  15. Improved Model for Predicting the Free Energy Contribution of Dinucleotide Bulges to RNA Duplex Stability.

    Science.gov (United States)

    Tomcho, Jeremy C; Tillman, Magdalena R; Znosko, Brent M

    2015-09-01

    Predicting the secondary structure of RNA is an intermediate in predicting RNA three-dimensional structure. Commonly, determining RNA secondary structure from sequence uses free energy minimization and nearest neighbor parameters. Current algorithms utilize a sequence-independent model to predict free energy contributions of dinucleotide bulges. To determine if a sequence-dependent model would be more accurate, short RNA duplexes containing dinucleotide bulges with different sequences and nearest neighbor combinations were optically melted to derive thermodynamic parameters. These data suggested energy contributions of dinucleotide bulges were sequence-dependent, and a sequence-dependent model was derived. This model assigns free energy penalties based on the identity of nucleotides in the bulge (3.06 kcal/mol for two purines, 2.93 kcal/mol for two pyrimidines, 2.71 kcal/mol for 5'-purine-pyrimidine-3', and 2.41 kcal/mol for 5'-pyrimidine-purine-3'). The predictive model also includes a 0.45 kcal/mol penalty for an A-U pair adjacent to the bulge and a -0.28 kcal/mol bonus for a G-U pair adjacent to the bulge. The new sequence-dependent model results in predicted values within, on average, 0.17 kcal/mol of experimental values, a significant improvement over the sequence-independent model. This model and new experimental values can be incorporated into algorithms that predict RNA stability and secondary structure from sequence.
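The sequence-dependent penalties quoted above amount to a direct lookup, which can be sketched as follows (function name and argument conventions are illustrative, not from the paper; the kcal/mol values are those reported in the abstract):

```python
PURINES = {"A", "G"}

def dinucleotide_bulge_dg(bulge, adjacent_pairs=()):
    """Free energy penalty (kcal/mol) of a dinucleotide bulge under
    the sequence-dependent model described above. bulge is the two
    bulged nucleotides written 5'->3'; adjacent_pairs lists the base
    pairs closing the bulge, e.g. ("AU", "GC")."""
    first, second = (b in PURINES for b in bulge)
    if first and second:
        dg = 3.06                    # two purines
    elif not first and not second:
        dg = 2.93                    # two pyrimidines
    elif first:
        dg = 2.71                    # 5'-purine-pyrimidine-3'
    else:
        dg = 2.41                    # 5'-pyrimidine-purine-3'
    for pair in adjacent_pairs:
        if set(pair) == {"A", "U"}:
            dg += 0.45               # penalty for an adjacent A-U pair
        elif set(pair) == {"G", "U"}:
            dg -= 0.28               # bonus for an adjacent G-U pair
    return dg
```

For example, a 5'-pyrimidine-purine-3' bulge closed by an A-U pair scores 2.41 + 0.45 = 2.86 kcal/mol.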

  16. Improved prediction of genetic predisposition to psychiatric disorders using genomic feature best linear unbiased prediction models

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Demontis, Ditte; Børglum, Anders

    is enriched for causal variants. Here we apply the GFBLUP model to a small schizophrenia case-control study to test the promise of this model on psychiatric disorders, and hypothesize that the performance will be increased when applying the model to a larger ADHD case-control study if the genomic feature...... contains the causal variants. Materials and Methods: The schizophrenia study consisted of 882 controls and 888 schizophrenia cases genotyped for 520,000 SNPs. The ADHD study contained 25,954 controls and 16,663 ADHD cases with 8.4 million imputed genotypes. Results: The predictive ability for schizophrenia.......6% for the null model). Conclusion: The improvement in predictive ability for schizophrenia was marginal; however, greater improvement is expected for the larger ADHD data....

  17. Incorporating Scale-Dependent Fracture Stiffness for Improved Reservoir Performance Prediction

    Science.gov (United States)

    Crawford, B. R.; Tsenn, M. C.; Homburg, J. M.; Stehle, R. C.; Freysteinson, J. A.; Reese, W. C.

    2017-12-01

    We present a novel technique for predicting dynamic fracture network response to production-driven changes in effective stress, with the potential for optimizing depletion planning and improving recovery prediction in stress-sensitive naturally fractured reservoirs. A key component of the method involves laboratory geomechanics testing of single fractures in order to develop a unique scaling relationship between fracture normal stiffness and initial mechanical aperture. The workflow proceeds as follows. Tensile, opening-mode fractures are created in a variety of low-matrix-permeability rocks with initial, unstressed apertures in the micrometer to millimeter range, as determined from image analyses of X-ray CT scans. Subsequent hydrostatic compression of these fractured samples, with synchronous radial strain and flow measurement, indicates that both mechanical and hydraulic aperture reduction varies linearly with the natural logarithm of effective normal stress. These stress-sensitive single-fracture laboratory observations are then upscaled to networks with fracture populations displaying the frequency-length and length-aperture scaling laws commonly exhibited by natural fracture arrays. Finally, functional relationships between reservoir pressure reduction and fracture network porosity, compressibility and directional permeabilities, as generated by such discrete fracture network modeling, are exported to the reservoir simulator for improved naturally fractured reservoir performance prediction.
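The reported closure law, aperture reduction varying linearly with the natural logarithm of effective normal stress, can be sketched in a few lines. The intercept and slope below are hypothetical fitting parameters chosen for illustration, not values from this study:

```python
import math

# Illustrative single-fracture closure law: mechanical aperture decreases
# linearly with ln(effective normal stress). a0 (initial aperture, um) and
# m (closure per log-cycle of stress) are hypothetical fit parameters.

def fracture_aperture(sigma_eff, a0=100.0, m=15.0, sigma_ref=1.0):
    """Mechanical aperture (um) at effective normal stress sigma_eff (MPa)."""
    return a0 - m * math.log(sigma_eff / sigma_ref)

# Aperture shrinks as depletion raises effective stress.
for sigma in (1, 10, 50):
    print(sigma, round(fracture_aperture(sigma), 1))
```

In the workflow above, a relation of this form fitted per fracture would feed the aperture (and hence stiffness and permeability) updates in the discrete fracture network model.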

  18. Postprocessing for Air Quality Predictions

    Science.gov (United States)

    Delle Monache, L.

    2017-12-01

    In recent years, air quality (AQ) forecasting has made significant progress towards better predictions, with the goal of protecting the public from harmful pollutants. This progress is the result of improvements in weather and chemical transport models, their coupling, and more accurate emission inventories (e.g., with the development of new algorithms to account in near real-time for fires). Nevertheless, AQ predictions are still affected at times by significant biases which stem from limitations in both weather and chemistry transport models. Those are the result of numerical approximations and the poor representation (and understanding) of important physical and chemical processes. Moreover, although the quality of emission inventories has been significantly improved, they are still one of the main sources of uncertainty in AQ predictions. For operational real-time AQ forecasting, a significant portion of these biases can be reduced with the implementation of postprocessing methods. We will review some of the techniques that have been proposed to reduce both systematic and random errors of AQ predictions, and to improve the correlation between predictions and observations of ground-level ozone and surface particulate matter less than 2.5 µm in diameter (PM2.5). These methods, which can be applied to both deterministic and probabilistic predictions, include simple bias-correction techniques, corrections inspired by the Kalman filter, regression methods, and the more recently developed analog-based algorithms. These approaches will be compared and contrasted, and the strengths and weaknesses of each will be discussed.
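One of the families mentioned, a Kalman-filter-inspired running bias correction, can be sketched minimally. The gain value and the data are illustrative only; operational implementations estimate the gain from forecast and observation error variances:

```python
# Kalman-filter-inspired running bias correction for an AQ forecast stream.
# The bias estimate is updated recursively as observations arrive, and each
# new forecast is corrected with the current estimate.

def kf_bias_correction(forecasts, observations, gain=0.3):
    """Return bias-corrected forecasts using a recursively updated bias."""
    bias = 0.0
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - bias)                  # correct with current bias estimate
        bias = (1 - gain) * bias + gain * (f - o)   # update once the obs verifies
    return corrected

fc = [60, 62, 61, 65, 63]   # raw ozone forecasts (ppb), synthetic
ob = [55, 56, 55, 58, 57]   # matching observations, synthetic
print([round(c, 2) for c in kf_bias_correction(fc, ob)])
```

With a persistent positive forecast bias, the corrected series converges toward the observations; the same recursion applies to PM2.5.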

  19. Efficiency Improvement of Kalman Filter for GNSS/INS through One-Step Prediction of P Matrix

    Directory of Open Access Journals (Sweden)

    Qingli Li

    2015-01-01

    Full Text Available To meet the real-time and low-power-consumption demands of the MEMS navigation and guidance field, this paper proposes an improved Kalman filter algorithm for GNSS/INS, termed one-step prediction of the P matrix. Quantitative analysis of field test datasets was performed to compare the navigation accuracy with the standard algorithm, which indicated that the degradation caused by the simplified algorithm is small enough compared to the navigation errors of the GNSS/INS system itself. Meanwhile, the improved algorithm reduced the computation load and time consumption by more than 50%. The work has special significance for navigation applications that require low power consumption and strict real-time response, such as cellphones, wearable devices, and deeply coupled GNSS/INS systems.
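For context, the covariance propagation that the proposed method simplifies is the standard Kalman filter time update P' = F P F^T + Q, which dominates the per-cycle computation. A generic sketch with an illustrative two-state (position, velocity) model, not the paper's actual filter:

```python
# Standard Kalman filter covariance time-update, P' = F P F^T + Q.
# Pure-Python matrix helpers keep the sketch self-contained; the
# 2-state constant-velocity model and numbers are illustrative.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def predict_P(P, F, Q):
    """One covariance propagation step: P' = F P F^T + Q."""
    return mat_add(mat_mul(mat_mul(F, P), transpose(F)), Q)

dt = 0.1
F = [[1.0, dt], [0.0, 1.0]]      # constant-velocity state transition
Q = [[1e-4, 0.0], [0.0, 1e-4]]   # process noise covariance
P = [[1.0, 0.0], [0.0, 1.0]]     # initial state covariance
print(predict_P(P, F, Q))
```

The paper's contribution is avoiding repeated full evaluations of this product in the GNSS/INS loop; the sketch shows the step being simplified, not the simplification itself.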

  20. Collaboration between a human group and artificial intelligence can improve prediction of multiple sclerosis course: a proof-of-principle study.

    Science.gov (United States)

    Tacchella, Andrea; Romano, Silvia; Ferraldeschi, Michela; Salvetti, Marco; Zaccaria, Andrea; Crisanti, Andrea; Grassi, Francesca

    2017-01-01

    Background: Multiple sclerosis has an extremely variable natural course. In most patients, disease starts with a relapsing-remitting (RR) phase, which proceeds to a secondary progressive (SP) form. The duration of the RR phase is hard to predict, and to date predictions of the rate of disease progression remain suboptimal. This limits the opportunity to tailor therapy to an individual patient's prognosis, in spite of the availability of several therapeutic options. Approaches to improve clinical decisions, such as collective intelligence of human groups and machine learning algorithms, are widely investigated. Methods: Medical students and a machine learning algorithm predicted the course of disease on the basis of randomly chosen clinical records of patients attending the Multiple Sclerosis service of Sant'Andrea hospital in Rome. Results: A significant improvement of predictive ability was obtained when predictions were combined with a weight that depends on the consistency of human (or algorithm) forecasts on a given clinical record. Conclusions: In this work we present proof-of-principle evidence that human-machine hybrid predictions yield better prognoses than machine learning algorithms or groups of humans alone. To strengthen this preliminary result, we propose a crowdsourcing initiative to collect prognoses by physicians on an expanded set of patients.
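The consistency-weighted combination described in the Results can be sketched generically: each forecaster's probability estimate is weighted by a consistency score on that record. The weighting rule and numbers below are illustrative, not the authors' exact formula:

```python
# Toy consistency-weighted combination of human and machine forecasts of
# disease progression probability. Weights and values are synthetic.

def combine_predictions(preds):
    """preds: list of (probability, consistency_weight) tuples."""
    total_w = sum(w for _, w in preds)
    return sum(p * w for p, w in preds) / total_w

# One clinical record: two student forecasts plus one algorithm forecast.
record_preds = [(0.8, 0.9), (0.6, 0.4), (0.7, 0.7)]
print(round(combine_predictions(record_preds), 3))
```

Forecasters who are more consistent on a record pull the hybrid estimate toward their prediction, which is the mechanism behind the reported improvement.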

  1. Improving the accuracy of protein secondary structure prediction using structural alignment

    Directory of Open Access Journals (Sweden)

    Gallin Warren J

    2006-06-01

    Full Text Available Abstract Background The accuracy of protein secondary structure prediction has steadily improved over the past 30 years. Now many secondary structure prediction methods routinely achieve an accuracy (Q3) of about 75%. We believe this accuracy could be further improved by including structure (as opposed to sequence) database comparisons as part of the prediction process. Indeed, given the large size of the Protein Data Bank (>35,000 sequences), the probability of a newly identified sequence having a structural homologue is actually quite high. Results We have developed a method that performs structure-based sequence alignments as part of the secondary structure prediction process. By mapping the structure of a known homologue (sequence ID >25%) onto the query protein's sequence, it is possible to predict at least a portion of that query protein's secondary structure. By integrating this structural alignment approach with conventional (sequence-based) secondary structure methods and then combining it with a "jury-of-experts" system to generate a consensus result, it is possible to attain very high prediction accuracy. Using a sequence-unique test set of 1644 proteins from EVA, this new method achieves an average Q3 score of 81.3%. Extensive testing indicates this is approximately 4–5% better than any other method currently available. Assessments using non-sequence-unique test sets (typical of those used in proteome annotation or structural genomics) indicate that this new method can achieve a Q3 score approaching 88%. Conclusion By using both sequence and structure databases and by exploiting the latest techniques in machine learning it is possible to routinely predict protein secondary structure with an accuracy well above 80%. A program and web server, called PROTEUS, that performs these secondary structure predictions is accessible at http://wishart.biology.ualberta.ca/proteus. For high throughput or batch sequence analyses, the PROTEUS programs
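A "jury-of-experts" consensus of the kind described can be sketched as a per-position majority vote over the experts' predictions. Real systems such as PROTEUS weight experts by reliability, so the equal weighting here is a simplification:

```python
from collections import Counter

# Toy jury-of-experts consensus: each expert emits a per-residue secondary
# structure string (H = helix, E = strand, C = coil) and the consensus takes
# the per-position majority. Equal expert weights are a simplification.

def consensus(predictions):
    """Per-position majority vote over equal-length prediction strings."""
    result = []
    for column in zip(*predictions):
        result.append(Counter(column).most_common(1)[0][0])
    return "".join(result)

experts = ["HHHCCEE",   # sequence-based predictor 1
           "HHHCCCE",   # sequence-based predictor 2
           "HHCCCEE"]   # structure-alignment-based predictor
print(consensus(experts))  # -> "HHHCCEE"
```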

  2. The prediction of resting energy expenditure in type 2 diabetes mellitus is improved by factoring for glycemia.

    Science.gov (United States)

    Gougeon, R; Lamarche, M; Yale, J-F; Venuta, T

    2002-12-01

    Predictive equations have been reported to overestimate resting energy expenditure (REE) in obese persons. The presence of hyperglycemia results in elevated REE in obese persons with type 2 diabetes, and its effect on the validity of these equations is unknown. We tested whether (1) indicators of diabetes control were independent correlates of REE in type 2 diabetes and (2) their inclusion would improve predictive equations. We conducted a cross-sectional study of 65 obese type 2 diabetic subjects (25 men, 40 women). Variables measured were: REE by ventilated-hood indirect calorimetry, body composition by bioimpedance analysis, body circumferences, fasting plasma glucose (FPG) and hemoglobin A(1c). Data were analyzed using stepwise multiple linear regression. REE, corrected for weight, fat-free mass, age and gender, was significantly greater with FPG>10 mmol/l (P=0.017) and correlated with FPG (P=0.013) and hemoglobin A(1c) as percentage of the upper limit of normal (P=0.02). Weight was the main determinant of REE; together with hip circumference and FPG, it explained 81% of the variation. FPG improved the predictability of the equation by >3%. With poor glycemic control, it can represent an increase in REE of up to 8%. Our data indicate that in a population of obese subjects with type 2 diabetes mellitus, REE is better predicted when fasting plasma glucose is included as a variable.
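The kind of multiple linear regression used here, predicting REE from anthropometric variables plus FPG, can be sketched on synthetic data. The coefficients below are hypothetical, chosen only to show that including FPG as a predictor recovers its contribution; they are not study values:

```python
import numpy as np

# Hypothetical illustration of a multiple linear regression of REE on
# weight and fasting plasma glucose (FPG). Synthetic, noise-free data.

rng = np.random.default_rng(0)
n = 65
weight = rng.uniform(70, 130, n)        # kg
fpg = rng.uniform(5, 15, n)             # mmol/L
ree = 500 + 12.0 * weight + 30.0 * fpg  # kcal/day, toy generating model

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones(n), weight, fpg])
coef, *_ = np.linalg.lstsq(X, ree, rcond=None)
print(np.round(coef, 2))  # recovers the generating coefficients [500, 12, 30]
```

The published analysis was stepwise (predictors entered by significance) and included further variables such as hip circumference; this sketch only shows the fitting machinery.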

  3. Clonal Evaluation of Prostate Cancer by ERG/SPINK1 Status to Improve Prognosis Prediction

    Science.gov (United States)

    2017-12-01

    Award Number: W81XWH-14-1-0466. Title: Clonal Evaluation of Prostate Cancer by ERG/SPINK1 Status to Improve Prognosis Prediction. Report date: Sept 2017. Related NIH project: Exploiting drivers of androgen receptor signaling-negative prostate cancer for precision medicine. Goal(s): Identify novel potential drivers.

  4. Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.

    Science.gov (United States)

    Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren

    2014-12-01

    Developed under the Intelligence Advanced Research Projects Activity (IARPA) Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events over the past 2 years.
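The abstract specifies JSON as the wire format over a ZeroMQ backbone but not a schema; a hypothetical example of what one forecast ("warning") message might look like, with field names that are assumptions for illustration:

```python
import json

# Hypothetical JSON wire message for a single EMBERS-style forecast.
# Field names are invented for illustration; only the JSON-over-messaging
# pattern is taken from the abstract.

warning = {
    "event_type": "civil_unrest",
    "country": "Brazil",
    "event_date": "2014-06-12",     # date the event is forecast to occur
    "forecast_date": "2014-06-01",  # date the warning was issued
    "confidence": 0.82,
}
wire = json.dumps(warning)   # serialized for the messaging backbone
received = json.loads(wire)  # consumer side deserializes it
print(received["country"], received["confidence"])
```

A shared-nothing pipeline like this passes such self-describing messages between loosely coupled stages, so any stage can be restarted or scaled independently.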

  5. Temperature prediction model of asphalt pavement in cold regions based on an improved BP neural network

    International Nuclear Information System (INIS)

    Xu, Bo; Dan, Han-Cheng; Li, Liang

    2017-01-01

    Highlights: • Pavement temperature prediction model is presented with improved BP neural network. • Dynamic and static methods are presented to predict pavement temperature. • Pavement temperature can be excellently predicted in next 3 h. - Abstract: Ice cover on pavement threatens traffic safety, and pavement temperature is the main factor used to determine whether the wet pavement is icy or not. In this paper, a temperature prediction model of the pavement in winter is established by introducing an improved Back Propagation (BP) neural network model. Before the application of the BP neural network model, many efforts were made to eliminate chaos and determine the regularity of temperature on the pavement surface (e.g., analyze the regularity of diurnal and monthly variations of pavement temperature). New dynamic and static prediction methods are presented by improving the algorithms to intelligently overcome the prediction inaccuracy at the change point of daily temperature. Furthermore, some scenarios have been compared for different dates and road sections to verify the reliability of the prediction model. According to the analysis results, the daily pavement temperatures can be accurately predicted for the next 3 h from the time of prediction by combining the dynamic and static prediction methods. The presented method in this paper can provide technical references for temperature prediction of the pavement and the development of an early-warning system for icy pavements in cold regions.

  6. Improving residue-residue contact prediction via low-rank and sparse decomposition of residue correlation matrix.

    Science.gov (United States)

    Zhang, Haicang; Gao, Yujuan; Deng, Minghua; Wang, Chao; Zhu, Jianwei; Li, Shuai Cheng; Zheng, Wei-Mou; Bu, Dongbo

    2016-03-25

    Strategies for correlation analysis in protein contact prediction often encounter two challenges, namely, the indirect coupling among residues, and the background correlations mainly caused by phylogenetic biases. While various studies have been conducted on how to disentangle indirect coupling, the removal of background correlations still remains unresolved. Here, we present an approach for removing background correlations via low-rank and sparse decomposition (LRS) of a residue correlation matrix. The correlation matrix can be constructed using either local inference strategies (e.g., mutual information, or MI) or global inference strategies (e.g., direct coupling analysis, or DCA). In our approach, a correlation matrix was decomposed into two components, i.e., a low-rank component representing background correlations, and a sparse component representing true correlations. Finally, the residue contacts were inferred from the sparse component of the correlation matrix. We trained our LRS-based method on the PSICOV dataset, and tested it on both GREMLIN and CASP11 datasets. Our experimental results suggested that LRS significantly improves the contact prediction precision. For example, when equipped with the LRS technique, the prediction precision of MI and mfDCA increased from 0.25 to 0.67 and from 0.58 to 0.70, respectively (Top L/10 predicted contacts, sequence separation: 5 AA, dataset: GREMLIN). In addition, our LRS technique consistently outperforms the popular denoising technique APC (average product correction) on both local (MI_LRS: 0.67 vs MI_APC: 0.34) and global measures (mfDCA_LRS: 0.70 vs mfDCA_APC: 0.67). Interestingly, we found that when equipped with our LRS technique, local inference strategies performed comparably to global inference strategies, implying that the LRS technique narrowed the performance gap between local and global inference strategies. Overall, our LRS technique greatly facilitates
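The low-rank plus sparse split can be sketched with a simple alternating scheme: truncate the SVD to capture the background component, then soft-threshold the residual to get the sparse component. This is a generic illustration of the decomposition, not the authors' exact optimization:

```python
import numpy as np

# Generic low-rank + sparse (LRS) split of a correlation matrix: alternate
# a rank-k SVD truncation (background, phylogeny-like correlations) with
# soft-thresholding of the residual (sparse, contact-like correlations).

def lrs_decompose(M, rank=1, sparse_thresh=0.1, n_iter=50):
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank update: truncated SVD of what the sparse part doesn't explain.
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        s[rank:] = 0.0
        L = (U * s) @ Vt
        # Sparse update: soft-threshold the residual.
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - sparse_thresh, 0.0)
    return L, S

# Synthetic "correlation matrix": rank-1 background plus one strong entry pair.
rng = np.random.default_rng(1)
u = rng.normal(size=20)
background = np.outer(u, u) / 10.0
true_sparse = np.zeros((20, 20))
true_sparse[3, 7] = true_sparse[7, 3] = 1.5
L, S = lrs_decompose(background + true_sparse)
print(abs(S[3, 7]) > abs(S).mean())  # the planted "contact" dominates the sparse part
```

On this toy input the background ends up in L and the planted contact pair in S, which is the separation the method exploits before ranking residue pairs.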

  7. Improved Trust Prediction in Business Environments by Adaptive Neuro Fuzzy Inference Systems

    Directory of Open Access Journals (Sweden)

    Ali Azadeh

    2015-06-01

    Full Text Available Trust prediction is an important challenge when investigating cooperation among intelligent agents that hold impressions of trust of one another. In other words, predicting trust values for future time slots helps partners identify the probability of continuing a relationship. Another important consideration is the context of trust, i.e. the services and business commitments for which a relationship is defined. Hence, intelligent agents should focus on improving trust to provide a stable and confident context. Modelling of trust between collaborating parties seems to be an important component of the business intelligence strategy. In this regard, a set of metrics has been considered by which the confidence level for predicted trust values has been estimated. These metrics are maturity, distance and density (MD2). Prediction of trust for future mutual relationships among agents is the problem addressed in this study. We introduce a simulation-based model which utilizes linguistic variables to create various scenarios. Then, future trust values among agents are predicted using an adaptive neuro-fuzzy inference system (ANFIS). Mean absolute percentage errors (MAPEs) resulting from ANFIS are compared with confidence levels determined by applying MD2. The results demonstrate the efficiency of MD2 for forecasting trust values. This is the first study that utilizes the concept of MD2 for the improvement of business trust prediction.
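The MAPE metric used above to score the ANFIS forecasts is standard and easily sketched; the trust values below are synthetic:

```python
# Mean absolute percentage error (MAPE), the accuracy measure used to
# compare ANFIS trust forecasts against actual trust values.

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

trust_actual = [0.8, 0.6, 0.9, 0.7]     # synthetic observed trust values
trust_predicted = [0.76, 0.63, 0.9, 0.63]  # synthetic ANFIS-style forecasts
print(round(mape(trust_actual, trust_predicted), 2))  # -> 5.0
```

Note that MAPE is undefined when an actual value is zero, which is why trust values are typically kept on a strictly positive scale before scoring.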

  8. Differential Role of CBT Skills, DBT Skills and Psychological Flexibility in Predicting Depressive versus Anxiety Symptom Improvement

    Science.gov (United States)

    Webb, Christian A.; Beard, Courtney; Kertz, Sarah J.; Hsu, Kean; Björgvinsson, Thröstur

    2016-01-01

    Objective Studies have reported associations between cognitive behavioral therapy (CBT) skill use and symptom improvement in depressed outpatient samples. However, little is known regarding the temporal relationship between different subsets of therapeutic skills and symptom change among relatively severely depressed patients receiving treatment in psychiatric hospital settings. Method Adult patients with major depression (N=173) receiving combined psychotherapeutic and pharmacological treatment at a psychiatric hospital completed repeated assessments of traditional CBT skills, DBT skills and psychological flexibility, as well as depressive and anxiety symptoms. Results Results indicated that only use of behavioral activation (BA) strategies significantly predicted depressive symptom improvement in this sample, whereas DBT skills and psychological flexibility predicted anxiety symptom change. In addition, a baseline symptom severity × BA strategies interaction emerged, indicating that those patients with higher pretreatment depression severity exhibited the strongest association between use of BA strategies and depressive symptom improvement. Conclusions Findings suggest the importance of emphasizing the acquisition and regular use of BA strategies with severely depressed patients in short-term psychiatric settings. In contrast, an emphasis on the development of DBT skills and the cultivation of psychological flexibility may prove beneficial for the amelioration of anxiety symptoms. PMID:27057997

  9. Surface tensions of multi-component mixed inorganic/organic aqueous systems of atmospheric significance: measurements, model predictions and importance for cloud activation predictions

    Directory of Open Access Journals (Sweden)

    D. O. Topping

    2007-01-01

    Full Text Available In order to predict the physical properties of aerosol particles, it is necessary to adequately capture the behaviour of the ubiquitous complex organic components. One of the key properties which may affect this behaviour is the contribution of the organic components to the surface tension of aqueous particles in the moist atmosphere. Whilst the qualitative effect of organic compounds on solution surface tensions has been widely reported, our quantitative understanding on mixed organic and mixed inorganic/organic systems is limited. Furthermore, it is unclear whether models that exist in the literature can reproduce the surface tension variability for binary and higher order multi-component organic and mixed inorganic/organic systems of atmospheric significance. The current study aims to resolve both issues to some extent. Surface tensions of single and multiple solute aqueous solutions were measured and compared with predictions from a number of model treatments. On comparison with binary organic systems, two predictive models found in the literature provided a range of values resulting from sensitivity to calculations of pure component surface tensions. Results indicate that a fitted model can capture the variability of the measured data very well, producing the lowest average percentage deviation for all compounds studied. The performance of the other models varies with compound and choice of model parameters. The behaviour of ternary mixed inorganic/organic systems was unreliably captured by using a predictive scheme and this was dependent on the composition of the solutes present. For more atmospherically representative higher order systems, entirely predictive schemes performed poorly. It was found that use of the binary data in a relatively simple mixing rule, or modification of an existing thermodynamic model with parameters derived from binary data, was able to accurately capture the surface tension variation with concentration. Thus

  10. Improving accuracy of protein-protein interaction prediction by considering the converse problem for sequence representation

    Directory of Open Access Journals (Sweden)

    Wang Yong

    2011-10-01

    Full Text Available Abstract Background With the development of genome-sequencing technologies, protein sequences are readily obtained by translating the measured mRNAs. Therefore, predicting protein-protein interactions from sequences is in great demand. The reason lies in the fact that identifying protein-protein interactions is becoming a bottleneck for eventually understanding the functions of proteins, especially for barely characterized organisms. Although a few methods have been proposed, the converse problem, namely whether the features used extract sufficient and unbiased information from protein sequences, is almost untouched. Results In this study, we interrogate this problem theoretically by an optimization scheme. Motivated by the theoretical investigation, we find novel encoding methods for both protein sequences and protein pairs. Our new methods exploit sufficiently the information of protein sequences and reduce artificial bias and computational cost. Thus, the approach significantly outperforms the available methods regarding sensitivity, specificity, precision, and recall with cross-validation evaluation, and reaches ~80% and ~90% accuracy in Escherichia coli and Saccharomyces cerevisiae respectively. Our findings here hold important implications for other sequence-based prediction tasks because representation of biological sequences is always the first step in computational biology. Conclusions By considering the converse problem, we propose new representation methods for both protein sequences and protein pairs. The results show that our method significantly improves the accuracy of protein-protein interaction predictions.

  11. [Improvement of Phi bodies stain and its clinical significance].

    Science.gov (United States)

    Gong, Xu-Bo; Lu, Xing-Guo; Yan, Li-Juan; Xiao, Xi-Bin; Wu, Dong; Xu, Gen-Bo; Zhang, Xiao-Hong; Zhao, Xiao-Ying

    2009-02-01

    The aim of this study was to improve the staining method for hydroperoxidase (HPO), to analyze the morphologic features of Phi bodies and to evaluate the clinical application of this method. 128 bone marrow or peripheral blood smears from patients with myeloid and lymphoid malignancies were stained by improved HPO staining, and the detection rates of Phi bodies in different leukemias were compared. 69 acute myeloid leukemia (AML) specimens were chosen randomly, and the positive rate and the number of Phi bodies were compared between the improved HPO stain and the POX stain based on the same substrate of 3,3'-diaminobenzidine. The results showed that the shape of bundle-like Phi bodies was variable, long or short, while the nubbly Phi bodies often appeared oval and smooth. Club-like Phi bodies were found in M(3). The detection rates of bundle-like Phi bodies in AML M(1)-M(5) were 42.9% (6/14), 83.3% (15/18), 92.0% (23/25), 52.3% (11/21), 33.3% (5/15) respectively, and those of nubbly Phi bodies were 28.6% (4/14), 66.7% (12/18), 11.1% (3/25), 33.3% (7/21), 20.0% (3/15) respectively. The detection rate of bundle-like Phi bodies in M(3) was significantly higher than that in the (M(1) + M(2)) or (M(4) + M(5)) groups. The detection rate of nubbly Phi bodies in the (M(1) + M(2)) group was higher than that in the M(3) group. In conclusion, with the improved staining method the HPO stain becomes simple, the detection rate of Phi bodies is higher than with the previous method, the positive granules are more obvious, and the results are stable. This improved method plays an important role in differentiating AML from ALL, subtyping AML, and evaluating therapeutic results.

  12. MEMS based shock pulse detection sensor for improved rotary Stirling cooler end of life prediction

    Science.gov (United States)

    Hübner, M.; Münzberg, M.

    2018-05-01

    The widespread use of rotary Stirling coolers in high-performance thermal imagers used for critical 24/7 surveillance tasks justifies any effort to significantly enhance the reliability and predictable uptime of those coolers. Typically the lifetime of the whole imaging device is limited by continuous wear and finally failure of the rotary compressor of the Stirling cooler, especially failure of the comprised bearings. MTTF-based lifetime predictions, even those based on refined MTTF models taking operational-scenario-dependent scaling factors into account, still lack the precision to accurately forecast the end of life (EOL) of individual coolers. Consequently, preventive maintenance of individual coolers to avoid failures of the main sensor in critical operational scenarios is very costly or even useless. We have developed an integrated test method based on 'Micro Electromechanical Systems', so-called MEMS sensors, which significantly improves the cooler EOL prediction. The recently commercially available MEMS acceleration sensors have mechanical resonance frequencies up to 50 kHz. They are able to detect solid-borne shock pulses in the cooler structure, originating from e.g. metal-on-metal impacts driven by periodic forces acting on moving inner parts of the rotary compressor within wear-dependent slack and play. The impact-driven transient shock pulse analysis uses only the high-frequency signal (<10 kHz) and therefore differs from the commonly used broadband low-frequency vibrational analysis of reciprocating machines. It offers a direct indicator of the individual state of wear. The predictive cooler lifetime model based on the shock pulse analysis is presented and results are discussed.

  13. Network information improves cancer outcome prediction.

    Science.gov (United States)

    Roy, Janine; Winter, Christof; Isik, Zerrin; Schroeder, Michael

    2014-07-01

    Disease progression in cancer can vary substantially between patients. Yet, patients often receive the same treatment. Recently, there has been much work on predicting disease progression and patient outcome variables from gene expression in order to personalize treatment options. Despite the first diagnostic kits on the market, there are open problems such as the choice of random gene signatures or noisy expression data. One approach to deal with these two problems employs protein-protein interaction networks and ranks genes using the random surfer model of Google's PageRank algorithm. In this work, we created a benchmark dataset collection comprising 25 cancer outcome prediction datasets from the literature and systematically evaluated the use of networks and a PageRank derivative, NetRank, for signature identification. We show that NetRank performs significantly better than classical methods such as fold change or the t-test. Despite an order of magnitude difference in network size, a regulatory network and a protein-protein interaction network perform equally well. Experimental evaluation on cancer outcome prediction in all of the 25 underlying datasets suggests that the network-based methodology identifies highly overlapping signatures over all cancer types, in contrast to classical methods that fail to identify highly common gene sets across the same cancer types. Integration of network information into gene expression analysis allows the identification of more reliable and accurate biomarkers and provides a deeper understanding of processes occurring in cancer development and progression. © The Author 2012. Published by Oxford University Press.
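The random-surfer ranking that NetRank builds on can be sketched as plain PageRank power iteration on a small interaction network; the 4-gene network is invented for illustration, and NetRank additionally biases the walk by each gene's expression-based score:

```python
# PageRank-style random-surfer ranking of genes on an interaction network.
# The toy undirected network below is invented for illustration.

def pagerank(adj, damping=0.85, n_iter=100):
    """adj[i] = neighbors of node i (undirected edges listed both ways)."""
    n = len(adj)
    rank = [1.0 / n] * n
    for _ in range(n_iter):
        new = [(1.0 - damping) / n] * n       # teleportation term
        for i, neighbors in enumerate(adj):
            share = damping * rank[i] / len(neighbors)
            for j in neighbors:
                new[j] += share               # pass rank along each edge
        rank = new
    return rank

# Gene 0 is a hub connected to all other genes.
adj = [[1, 2, 3], [0], [0], [0]]
ranks = pagerank(adj)
print(ranks.index(max(ranks)))  # the hub gene ranks first -> 0
```

Well-connected genes accumulate rank from their neighbors, which is how network topology stabilizes signatures against the noise of individual expression measurements.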

  14. Detecting determinism with improved sensitivity in time series: rank-based nonlinear predictability score.

    Science.gov (United States)

    Naro, Daniel; Rummel, Christian; Schindler, Kaspar; Andrzejak, Ralph G

    2014-09-01

    The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. We here adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again appears of higher sensitivity to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) versus signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).

  15. Improving Multi-Sensor Drought Monitoring, Prediction and Recovery Assessment Using Gravimetry Information

    Science.gov (United States)

    Aghakouchak, Amir; Tourian, Mohammad J.

    2015-04-01

    Development of reliable drought monitoring, prediction and recovery assessment tools is fundamental to water resources management. This presentation focuses on how gravimetry information can improve drought assessment. First, we provide an overview of the Global Integrated Drought Monitoring and Prediction System (GIDMaPS), which offers near real-time drought information using remote sensing observations and model simulations. Then, we present a framework for integrating satellite gravimetry information to improve drought prediction and recovery assessment. The input data include satellite-based and model-based precipitation, soil moisture estimates and equivalent water height. Previous studies show that drought assessment based on one single indicator may not be sufficient. For this reason, GIDMaPS provides drought information based on multiple drought indicators including the Standardized Precipitation Index (SPI), the Standardized Soil Moisture Index (SSI) and the Multivariate Standardized Drought Index (MSDI), which combines SPI and SSI probabilistically. MSDI incorporates the meteorological and agricultural drought conditions and provides composite multi-index drought information for overall characterization of droughts. GIDMaPS includes a seasonal prediction component based on a statistical persistence-based approach. The prediction component of GIDMaPS provides the empirical probability of drought for different severity levels. In this presentation we present a new component in which the drought prediction information based on SPI, SSI and MSDI is conditioned on equivalent water height obtained from the Gravity Recovery and Climate Experiment (GRACE). Using a Bayesian approach, GRACE information is used to evaluate the persistence of drought. Finally, the deficit equivalent water height based on GRACE is used for assessing drought recovery. Both the monitoring and prediction components of GIDMaPS will be discussed, and the results from 2014
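The standardized indices named above (SPI, SSI) express a current value in standard deviations relative to climatology. A deliberately simplified sketch using a plain z-score; operational SPI first fits a probability distribution (commonly a gamma) before standardizing, and the rainfall record below is synthetic:

```python
from statistics import mean, stdev

# Simplified standardized drought index in the spirit of SPI/SSI: the
# current value expressed in standard deviations from climatology.
# Real SPI fits a gamma distribution first; a z-score is a simplification.

def standardized_index(climatology, current):
    return (current - mean(climatology)) / stdev(climatology)

# 30-year synthetic June rainfall climatology (mm).
clim = [80, 95, 70, 110, 60, 85, 90, 100, 75, 65,
        88, 92, 78, 105, 82, 97, 73, 68, 112, 87,
        79, 91, 84, 99, 76, 71, 103, 89, 94, 81]
spi_like = standardized_index(clim, 45.0)  # a very dry month
print(round(spi_like, 2))                  # strongly negative -> severe drought
```

Values around -1 to -2 indicate moderate to severe drought in SPI conventions; conditioning such indices on GRACE equivalent water height is the new step the presentation describes.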

  16. Combination of blood tests for significant fibrosis and cirrhosis improves the assessment of liver-prognosis in chronic hepatitis C.

    Science.gov (United States)

    Boursier, J; Brochard, C; Bertrais, S; Michalak, S; Gallois, Y; Fouchard-Hubert, I; Oberti, F; Rousselet, M-C; Calès, P

    2014-07-01

    Recent longitudinal studies have emphasised the prognostic value of noninvasive tests of liver fibrosis, and cross-sectional studies have shown that their combination significantly improves diagnostic accuracy. To compare the prognostic accuracy of six blood fibrosis tests and liver biopsy, and to evaluate whether test combination improves the liver-prognosis assessment in chronic hepatitis C (CHC). A total of 373 patients with compensated CHC, liver biopsy (Metavir F) and blood tests targeting fibrosis (APRI, FIB4, Fibrotest, Hepascore, FibroMeter) or cirrhosis (CirrhoMeter) were included. Significant liver-related events (SLRE) and liver-related deaths were recorded during follow-up (started the day of biopsy). During the median follow-up of 9.5 years (3508 person-years), 47 patients had a SLRE and 23 patients died from liver-related causes. For the prediction of first SLRE, most blood tests provided better prognostication than Metavir F [Harrell C-index: 0.811 (95% CI: 0.751-0.868)], with a significant increase for FIB4: 0.879 [0.832-0.919] (P = 0.002), FibroMeter: 0.870 [0.812-0.922] (P = 0.005) and APRI: 0.861 [0.813-0.902] (P = 0.039). Multivariate analysis identified FibroMeter, CirrhoMeter and sustained viral response as independent predictors of first SLRE. CirrhoMeter was the only independent predictor of liver-related death. The combination of the FibroMeter and CirrhoMeter classifications into a new FM/CM classification improved the liver-prognosis assessment compared to Metavir F staging or single tests by identifying five subgroups of patients with significantly different prognoses. Some blood fibrosis tests are more accurate than liver biopsy for determining liver prognosis in CHC. A new combination of two complementary blood tests, one targeted for fibrosis and the other for cirrhosis, optimises the assessment of liver prognosis. © 2014 John Wiley & Sons Ltd.
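
    The Harrell C-index used above to compare tests can be made concrete with a short sketch: among "usable" pairs of patients (the one with the earlier time had an observed event), count how often the higher risk score belongs to the patient with the earlier event, crediting ties one half. The implementation and data below are illustrative, not from the study.

```python
# Hedged sketch of Harrell's C-index for right-censored survival data:
# among usable pairs (the patient with the earlier time has an observed
# event), count concordant risk orderings, crediting ties one half.
# Times, event flags and risk scores below are invented.
def harrell_c(times, events, risks):
    concordant = ties = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # usable pair: i's event observed and strictly earlier than j
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / usable

times  = [2.0, 5.0, 3.0, 8.0]   # years to event or censoring
events = [1,   1,   0,   0]     # 1 = liver-related event observed
risks  = [0.9, 0.1, 0.6, 0.4]   # e.g. a blood fibrosis score
c_index = harrell_c(times, events, risks)
```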

  17. Significant interarm blood pressure difference predicts cardiovascular risk in hypertensive patients

    Science.gov (United States)

    Kim, Su-A; Kim, Jang Young; Park, Jeong Bae

    2016-01-01

    There has been rising interest in the interarm blood pressure difference (IAD), due to its association with peripheral arterial disease and its possible relationship with cardiovascular disease. This study aimed to characterize hypertensive patients with a significant IAD in relation to cardiovascular risk. A total of 3699 patients (mean age, 61 ± 11 years) were prospectively enrolled in the study. Blood pressure (BP) was measured simultaneously in both arms 3 times using an automated cuff-oscillometric device. IAD was defined as the absolute difference in averaged BPs between the left and right arm, and an IAD ≥ 10 mm Hg was considered significant. The Framingham risk score was used to calculate the 10-year cardiovascular risk. The mean systolic IAD (sIAD) was 4.3 ± 4.1 mm Hg, and 285 (7.7%) patients showed a significant sIAD. Patients with significant sIAD had a larger body mass index (P < 0.001), greater systolic BP (P = 0.050), more coronary artery disease (relative risk = 1.356, P = 0.034), and more cerebrovascular disease (relative risk = 1.521, P = 0.072). The mean 10-year cardiovascular risk was 9.3 ± 7.7%. By multiple regression, sIAD was significantly but weakly correlated with the 10-year cardiovascular risk (β = 0.135, P = 0.008). Patients with significant sIAD showed a higher prevalence of coronary artery disease, as well as an increase in 10-year cardiovascular risk. Therefore, accurate measurement of sIAD may serve as a simple and cost-effective tool for predicting cardiovascular risk in clinical settings. PMID:27310982
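
    The IAD definition used in this study can be sketched directly: average three simultaneous readings per arm, take the absolute between-arm difference, and flag a systolic IAD of at least 10 mm Hg. The readings below are invented for illustration.

```python
# Hedged sketch of the IAD definition in the abstract: average three
# simultaneous readings per arm, take the absolute difference, and flag
# a systolic IAD >= 10 mm Hg as significant. Readings are invented.
def systolic_iad(left_readings, right_readings, threshold=10.0):
    left = sum(left_readings) / len(left_readings)
    right = sum(right_readings) / len(right_readings)
    iad = abs(left - right)
    return iad, iad >= threshold

iad, significant = systolic_iad([138, 142, 140], [128, 130, 126])
# averages are 140 and 128 mm Hg -> IAD of 12.0, flagged as significant
```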

  18. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...

  19. Can survival prediction be improved by merging gene expression data sets?

    Directory of Open Access Journals (Sweden)

    Haleh Yasrebi

    BACKGROUND: High-throughput gene expression profiling technologies, which generate a wealth of data, are increasingly used for the characterization of tumor biopsies in clinical trials. By applying machine learning algorithms to such clinically documented data sets, one hopes to improve tumor diagnosis and prognosis, as well as the prediction of treatment response. However, the limited number of patients enrolled in a single trial limits the power of machine learning approaches due to over-fitting. One could partially overcome this limitation by merging data from different studies. Nevertheless, such data sets differ from each other with regard to technical biases, patient selection criteria and follow-up treatment. It is therefore not clear at all whether the advantage of increased sample size outweighs the disadvantage of the higher heterogeneity of merged data sets. Here, we present a systematic study to answer this question specifically for breast cancer data sets. We use survival prediction based on Cox regression as an assay to measure the added value of merged data sets. RESULTS: Using time-dependent Receiver Operating Characteristic-Area Under the Curve (ROC-AUC) and hazard ratio as performance measures, we see overall no significant improvement or deterioration of survival prediction with merged data sets as compared to individual data sets. This was apparently due to the fact that a few genes with strong prognostic power were not available on all microarray platforms and thus were not retained in the merged data sets. Surprisingly, we found that the overall best performance was achieved with a single-gene predictor consisting of CYB5D1. CONCLUSIONS: Merging did not deteriorate performance on average despite (a) the diversity of microarray platforms used, (b) the heterogeneity of patient cohorts, (c) the heterogeneity of breast cancer disease, (d) substantial variation in time to death or relapse, and (e) the reduced number of genes in the merged data

  20. Rapid improvements in emotion regulation predict intensive treatment outcome for patients with bulimia nervosa and purging disorder.

    Science.gov (United States)

    MacDonald, Danielle E; Trottier, Kathryn; Olmsted, Marion P

    2017-10-01

    Rapid and substantial behavior change (RSBC) early in cognitive behavior therapy (CBT) for eating disorders is the strongest known predictor of treatment outcome. Rapid change in other clinically relevant variables may also be important. This study examined whether rapid change in emotion regulation predicted treatment outcomes, beyond the effects of RSBC. Participants were diagnosed with bulimia nervosa or purging disorder (N = 104) and completed ≥6 weeks of CBT-based intensive treatment. Hierarchical regression models were used to test whether rapid change in emotion regulation variables predicted posttreatment outcomes, defined in three ways: (a) binge/purge abstinence; (b) cognitive eating disorder psychopathology; and (c) depression symptoms. Baseline psychopathology and emotion regulation difficulties and RSBC were controlled for. After controlling for baseline variables and RSBC, rapid improvement in access to emotion regulation strategies made significant unique contributions to the prediction of posttreatment binge/purge abstinence, cognitive psychopathology of eating disorders, and depression symptoms. Individuals with eating disorders who rapidly improve their belief that they can effectively modulate negative emotions are more likely to achieve a variety of good treatment outcomes. This supports the formal inclusion of emotion regulation skills early in CBT, and encouraging patient beliefs that these strategies are helpful. © 2017 Wiley Periodicals, Inc.

  1. Advanced Materials Test Methods for Improved Life Prediction of Turbine Engine Components

    National Research Council Canada - National Science Library

    Stubbs, Jack

    2000-01-01

    Phase I final report developed under SBIR contract for Topic # AF00-149, "Durability of Turbine Engine Materials/Advanced Material Test Methods for Improved Life Prediction of Turbine Engine Components...

  2. Improved model predictive control for high voltage quality in microgrid applications

    DEFF Research Database (Denmark)

    Dragicevic, T.; Al hasheem, Mohamed; Lu, M.

    2017-01-01

    This paper proposes an improvement of the finite control set model predictive control (FCS-MPC) strategy for enhancing the voltage regulation performance of a voltage source converter (VSC) used for standalone microgrid and uninterrupted power supply (UPS) applications. The modification is based...

  3. Does increasing steps per day predict improvement in physical function and pain interference in adults with fibromyalgia?

    Science.gov (United States)

    Kaleth, Anthony S; Slaven, James E; Ang, Dennis C

    2014-12-01

    To examine the concurrent and predictive associations between the number of steps taken per day and clinical outcomes in patients with fibromyalgia (FM). A total of 199 adults with FM (mean age 46.1 years, 95% women) who were enrolled in a randomized clinical trial wore a hip-mounted accelerometer for 1 week and completed self-report measures of physical function (Fibromyalgia Impact Questionnaire-Physical Impairment [FIQ-PI], Short Form 36 [SF-36] health survey physical component score [PCS]), pain intensity and interference (Brief Pain Inventory [BPI]), and depressive symptoms (Patient Health Questionnaire-8 [PHQ-8]) as part of their baseline and follow-up assessments. Associations of steps per day with self-report clinical measures were evaluated from baseline to week 12 using multivariate regression models adjusted for demographic and baseline covariates. Study participants were primarily sedentary, averaging 4,019 ± 1,530 steps per day. Our findings demonstrate a linear relationship between the change in steps per day and improvement in health outcomes for FM. Incremental increases on the order of 1,000 steps per day were significantly associated with (and predictive of) improvements in FIQ-PI, SF-36 PCS, BPI pain interference, and PHQ-8 (all statistically significant), underscoring the benefit of increased physical activity. An exercise prescription that includes recommendations to gradually accumulate at least 5,000 additional steps per day may result in clinically significant improvements in outcomes relevant to patients with FM. Future studies are needed to elucidate the dose-response relationship between steps per day and patient outcomes in FM. Copyright © 2014 by the American College of Rheumatology.

  4. Data Prediction for Public Events in Professional Domains Based on Improved RNN- LSTM

    Science.gov (United States)

    Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan

    2018-02-01

    Traditional data services predicting emergency or non-periodic events usually cannot generate satisfying results or fulfill the correct prediction purpose. However, these events are influenced by external causes, which means that certain a priori information about them can generally be collected through the Internet. This paper studied the above problems and proposed an improved model: an LSTM (Long Short-Term Memory) dynamic prediction and a priori information sequence generation model, built by combining RNN-LSTM with a priori information about public events. In prediction tasks, the model is capable of determining trends, and its accuracy is also validated. The model achieves better performance and prediction results than the previous one. Using a priori information can increase the accuracy of prediction; LSTM can better adapt to changes in a time sequence; and LSTM can be widely applied to the same type of prediction task, as well as to other prediction tasks related to time sequences.

  5. Improving local clustering based top-L link prediction methods via asymmetric link clustering information

    Science.gov (United States)

    Wu, Zhihao; Lin, Youfang; Zhao, Yiji; Yan, Hongyan

    2018-02-01

    Networks can represent a wide range of complex systems, such as social, biological and technological systems. Link prediction is one of the most important problems in network analysis and has attracted much research interest recently. Many link prediction methods have been proposed to solve this problem with various techniques, and clustering information plays an important role in many of them. In the previous literature, the node clustering coefficient appears frequently in link prediction methods. However, the node clustering coefficient is limited in describing the role of a common neighbor in different local networks, because it cannot distinguish the different clustering abilities of a node with respect to different node pairs. In this paper, we shift our focus from nodes to links and propose the concept of the asymmetric link clustering (ALC) coefficient. Further, we improve three node-clustering-based link prediction methods via the concept of ALC. The experimental results demonstrate that ALC-based methods outperform node-clustering-based methods, achieving especially remarkable improvements on food web, hamster friendship and Internet networks. Besides, compared with other methods, the performance of ALC-based methods is very stable in both globalized and personalized top-L link prediction tasks.
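
    The node-clustering-based scoring that ALC improves upon can be sketched as follows: a CCLP-style score in which each common neighbor of a candidate pair contributes its node clustering coefficient. The toy graph is invented; the actual ALC coefficient, which is asymmetric and link-based, is not reproduced here.

```python
# Hedged sketch of node-clustering-based link scoring (CCLP-style): each
# common neighbor z of a candidate pair (x, y) contributes its node
# clustering coefficient. The toy undirected graph is invented.
def clustering_coefficient(adj, z):
    nbrs = adj[z]
    k = len(nbrs)
    if k < 2:
        return 0.0
    # count edges among the neighbors of z
    links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
    return 2.0 * links / (k * (k - 1))

def cclp_score(adj, x, y):
    return sum(clustering_coefficient(adj, z) for z in adj[x] & adj[y])

# triangle a-b-c with a pendant node d attached to a
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": {"a"},
}
score = cclp_score(adj, "b", "d")   # common neighbor "a", clustering 1/3
```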

  6. Methods to improve genomic prediction and GWAS using combined Holstein populations

    DEFF Research Database (Denmark)

    Li, Xiujin

    The thesis focuses on methods to improve GWAS and genomic prediction using combined Holstein populations, and investigates G by E interactions. The conclusions are: 1) Prediction reliabilities for Brazilian Holsteins can be increased by adding Nordic and French genotyped bulls, and a large G by E interaction exists between populations. 2) Combining data from Chinese and Danish Holstein populations increases the power of GWAS and detects new QTL regions for milk fatty acid traits. 3) The novel multi-trait Bayesian model efficiently estimates region-specific genomic variances, covariances...

  7. Improved prediction of residue flexibility by embedding optimized amino acid grouping into RSA-based linear models.

    Science.gov (United States)

    Zhang, Hua; Kurgan, Lukasz

    2014-12-01

    Knowledge of protein flexibility is vital for deciphering the corresponding functional mechanisms. This knowledge would help, for instance, in improving computational drug design and refinement in homology-based modeling. We propose a new predictor of residue flexibility, which is expressed by B-factors, from protein chains; it uses local (in-chain) predicted (or native) relative solvent accessibility (RSA) and custom-derived amino acid (AA) alphabets. Our predictor is implemented as a two-stage linear regression model that uses an RSA-based space in a local sequence window in the first stage and a reduced AA pair-based space in the second stage as the inputs. The method has an easy-to-comprehend explicit linear form in both stages. Particle swarm optimization was used to find an optimal reduced AA alphabet to simplify the input space and improve the prediction performance. The average correlation coefficients between the native and predicted B-factors measured on a large benchmark dataset are improved from 0.65 to 0.67 when using the native RSA values and from 0.55 to 0.57 when using the predicted RSA values. Blind tests performed on two independent datasets show consistent improvements in the average correlation coefficients by a modest value of 0.02 for both native and predicted RSA-based predictions.
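
    The first-stage idea, a linear model from windowed RSA to B-factors, can be sketched with a single aggregated feature; the paper's model uses many individual window positions plus a second AA pair-based stage, and the RSA and B-factor values below are invented.

```python
# Hedged sketch of the first-stage idea: a linear model from windowed RSA
# to B-factors. A single window-mean feature with ordinary least squares
# keeps the closed form visible; the paper's model is richer.
def window_mean(values, i, half=1):
    lo, hi = max(0, i - half), min(len(values), i + half + 1)
    seg = values[lo:hi]
    return sum(seg) / len(seg)

def fit_ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

rsa  = [0.10, 0.40, 0.80, 0.60, 0.20, 0.05]   # predicted or native RSA
bfac = [12.0, 20.0, 31.0, 26.0, 15.0, 11.0]   # invented B-factors
feats = [window_mean(rsa, i) for i in range(len(rsa))]
slope, intercept = fit_ols(feats, bfac)
pred = [slope * f + intercept for f in feats]  # exposed residues score higher
```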

  8. Improved predictability of droughts over southern Africa using the standardized precipitation evapotranspiration index and ENSO

    Science.gov (United States)

    Manatsa, Desmond; Mushore, Terrence; Lenouo, Andre

    2017-01-01

    The provision of timely and reliable climate information on which to base management decisions remains a critical component in drought planning for southern Africa. In this observational study, we have not only proposed a forecasting scheme that caters for timeliness and reliability, but also improved the relevance of the climate information by using a novel drought index, the standardised precipitation evapotranspiration index (SPEI), instead of the traditional precipitation-only index, the standardised precipitation index (SPI). The SPEI, which includes temperature and other climatic factors in its construction, has a more robust connection to ENSO than the SPI. Consequently, the developed ENSO-SPEI prediction scheme can provide quantitative information about the spatial extent and severity of predicted drought conditions in a way that reflects more closely the level of risk in the global-warming context of the subregion. However, it is established that the significant regional impact of ENSO is restricted to the period December-March, implying a revisit of the traditional ENSO-based forecast scheme, which essentially divides the rainfall season into the two periods October to December and January to March. Although the prediction of ENSO events has improved with the refinement of numerical models, this work has demonstrated that the prediction of drought impacts related to ENSO is also a reality based only on observations. A large temporal lag is observed between the development of ENSO phenomena (typically in May of the previous year) and the identification of regional SPEI-defined drought conditions. It has been shown that using the Southern Africa Regional Climate Outlook Forum's (SARCOF) traditional 3-month averaged Nino 3.4 SST index (June to August) as a predictor has no added advantage over using only the May SST index values. In this regard, the extended lead time and improved skill demonstrated in this study could immensely benefit

  9. Clinical Significance of Hemostatic Parameters in the Prediction for Type 2 Diabetes Mellitus and Diabetic Nephropathy

    Directory of Open Access Journals (Sweden)

    Lianlian Pan

    2018-01-01

    It would be important to predict type 2 diabetes mellitus (T2DM) and diabetic nephropathy (DN). This study was aimed at evaluating the predictive significance of hemostatic parameters for T2DM and DN. Plasma coagulation and hematologic parameters before treatment were measured in 297 T2DM patients. The risk factors and their predictive power were evaluated. T2DM patients without complications exhibited significantly different activated partial thromboplastin time (aPTT), platelet (PLT), and D-dimer (D-D) levels compared with controls (P<0.01). Fibrinogen (FIB), PLT, and D-D increased in DN patients compared with those without complications (P<0.001). Both aPTT and PLT were independent risk factors for T2DM (OR: 1.320 and 1.211, respectively; P<0.01), and FIB and PLT were independent risk factors for DN (OR: 1.611 and 1.194, respectively; P<0.01). The area under the ROC curve (AUC) of aPTT and PLT was 0.592 and 0.647, respectively, with low sensitivity in predicting T2DM. The AUC of FIB was 0.874, with high sensitivity (85%) and specificity (76%) for DN, and that of PLT was 0.564, with sensitivity of 60% and specificity of 89%, based on cutoff values of 3.15 g/L and 245 × 10⁹/L, respectively. This study suggests that hemostatic parameters have a low predictive value for T2DM, whereas fibrinogen is a powerful predictor for DN.
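
    The quoted sensitivity and specificity follow mechanically from a cutoff: classify DN-positive when fibrinogen meets the threshold and tally against the true labels. A hedged sketch with invented patient values follows (only the 3.15 g/L cutoff comes from the abstract).

```python
# Hedged sketch: sensitivity and specificity of a fibrinogen cutoff for DN.
# The 3.15 g/L cutoff comes from the abstract; the six patients are invented.
def sens_spec(values, labels, cutoff):
    tp = sum(1 for v, l in zip(values, labels) if v >= cutoff and l == 1)
    fn = sum(1 for v, l in zip(values, labels) if v < cutoff and l == 1)
    tn = sum(1 for v, l in zip(values, labels) if v < cutoff and l == 0)
    fp = sum(1 for v, l in zip(values, labels) if v >= cutoff and l == 0)
    return tp / (tp + fn), tn / (tn + fp)

fib = [4.1, 3.3, 2.9, 3.8, 2.5, 3.0]   # fibrinogen, g/L
dn  = [1,   1,   1,   1,   0,   0]     # 1 = diabetic nephropathy
sensitivity, specificity = sens_spec(fib, dn, cutoff=3.15)
# one true DN case falls below the cutoff -> sensitivity 0.75, specificity 1.0
```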

  10. Artificial neural networks to predict presence of significant pathology in patients presenting to routine colorectal clinics.

    Science.gov (United States)

    Maslekar, S; Gardiner, A B; Monson, J R T; Duthie, G S

    2010-12-01

    Artificial neural networks (ANNs) are computer programs used to identify complex relations within data. Routine predictions of the presence of colorectal pathology based on population statistics have little meaning for the individual patient, which results in a large number of unnecessary lower gastrointestinal endoscopies (LGEs - colonoscopies and flexible sigmoidoscopies). We aimed to develop a neural network algorithm that can accurately predict the presence of significant pathology in patients attending routine outpatient clinics for gastrointestinal symptoms. Ethics approval was obtained and the study was monitored according to International Committee on Harmonisation - Good Clinical Practice (ICH-GCP) standards. Three hundred patients undergoing LGE prospectively completed a specifically developed questionnaire, which included 40 variables based on clinical symptoms, signs, past history and family history. Complete data sets of 100 patients were used to train the ANN; the remaining data were used for internal validation. The primary output used was a positive finding on LGE, including polyps, cancer, diverticular disease or colitis. For external validation, the ANN was applied to data from 50 patients in primary care and also compared with the predictions of four clinicians. A clear correlation between actual data values and ANN predictions was found (r = 0.931; P = 0.0001). The predictive accuracy of the ANN was 95% in the training group and 90% (95% CI 84-96) in the internal validation set, significantly higher than the clinical accuracy (75%). The ANN also showed high accuracy in the external validation group (89%). Artificial neural networks offer the possibility of personal prediction of outcome for individual patients presenting in clinics with colorectal symptoms, making it possible to make more appropriate requests for lower gastrointestinal endoscopy. © 2010 The Authors. Colorectal Disease © 2010 The Association of Coloproctology of Great Britain and Ireland.

  11. Large-scale binding ligand prediction by improved patch-based method Patch-Surfer2.0.

    Science.gov (United States)

    Zhu, Xiaolei; Xiong, Yi; Kihara, Daisuke

    2015-03-01

    Ligand binding is a key aspect of the function of many proteins. Thus, binding ligand prediction provides important insight into understanding the biological function of proteins. Binding ligand prediction is also useful for drug design and for examining potential drug side effects. We present a computational method named Patch-Surfer2.0, which predicts binding ligands for a protein pocket. By representing and comparing pockets at the level of small local surface patches that characterize physicochemical properties of the local regions, the method can identify binding pockets of the same ligand even if they do not share globally similar shapes. Properties of local patches are represented by an efficient mathematical representation, the 3D Zernike Descriptor. Patch-Surfer2.0 has significant technical improvements over our previous prototype, including a new feature that captures approximate patch position with a geodesic distance histogram. Moreover, we constructed a large comprehensive database of ligand binding pockets to be searched against by a query. The benchmark shows better performance of Patch-Surfer2.0 over existing methods. Availability: http://kiharalab.org/patchsurfer2.0/. Contact: dkihara@purdue.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Respiratory sinus arrhythmia reactivity to a sad film predicts depression symptom improvement and symptomatic trajectory.

    Science.gov (United States)

    Panaite, Vanessa; Hindash, Alexandra Cowden; Bylsma, Lauren M; Small, Brent J; Salomon, Kristen; Rottenberg, Jonathan

    2016-01-01

    Respiratory sinus arrhythmia (RSA) reactivity, an index of cardiac vagal tone, has been linked to self-regulation and to the severity and course of depression (Rottenberg, 2007). Although initial data support the proposition that RSA withdrawal during a sad film is a specific predictor of depression course (Fraguas, 2007; Rottenberg, 2005), the robustness and specificity of this finding are unclear. To provide a stronger test, RSA reactivity to three emotion films (happy, sad, fear) and to a more robust stressor, a speech task, was examined in currently depressed individuals (n=37), who were assessed for their degree of symptomatic improvement over 30 weeks. Robust RSA reactivity to the sad film uniquely predicted overall symptom improvement over 30 weeks. RSA reactivity to both sad and stressful stimuli predicted the speed and maintenance of symptomatic improvement. The current analyses provide the most robust support to date that RSA withdrawal to sad stimuli (but not stressful stimuli) has specificity in predicting overall symptomatic improvement. In contrast, RSA reactivity to negative stimuli (both sad and stressful) predicted the trajectory of depression course. Patients' engagement with sad stimuli may be an important sign to attend to in therapeutic settings. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Adjusting the Stems Regional Forest Growth Model to Improve Local Predictions

    Science.gov (United States)

    W. Brad Smith

    1983-01-01

    A simple procedure using double sampling is described for adjusting growth in the STEMS regional forest growth model to compensate for subregional variations. The predictive accuracy of the STEMS model (a distance-independent, individual-tree growth model for Lake States forests) was improved by using this procedure.

  14. Mutations in gp41 are correlated with coreceptor tropism but do not improve prediction methods substantially.

    Science.gov (United States)

    Thielen, Alexander; Lengauer, Thomas; Swenson, Luke C; Dong, Winnie W Y; McGovern, Rachel A; Lewis, Marilyn; James, Ian; Heera, Jayvant; Valdez, Hernan; Harrigan, P Richard

    2011-01-01

    The main determinants of HIV-1 coreceptor usage are located in the V3-loop of gp120, although mutations in V2 and gp41 are also known. Incorporation of V2 is known to improve prediction algorithms; however, this has not been confirmed for gp41 mutations. Samples with V3 and gp41 genotypes and Trofile assay (Monogram Biosciences, South San Francisco, CA, USA) results were taken from the HOMER cohort (n=444) and from patients screened for the MOTIVATE studies (n=1,916; 859 with maraviroc outcome data). Correlations of mutations with tropism were assessed using Fisher's exact test and prediction models trained using support vector machines. Models were validated by cross-validation, by testing models from one dataset on the other, and by analysing virological outcome. Several mutations within gp41 were highly significant for CXCR4 usage; most strikingly an insertion occurring in 7.7% of HOMER-R5 and 46.3% of HOMER-X4 samples (MOTIVATE 5.7% and 25.2%, respectively). Models trained on gp41 sequence alone achieved relatively high areas under the receiver-operating characteristic curve (AUCs; HOMER 0.713 and MOTIVATE 0.736) that were almost as good as V3 models (0.773 and 0.884, respectively). However, combining the two regions improved predictions only marginally (0.813 and 0.902, respectively). Similar results were found when models were trained on HOMER and validated on MOTIVATE or vice versa. The difference in median log viral load decrease at week 24 between patients with R5 and X4 virus was 1.65 (HOMER 2.45 and MOTIVATE 0.79) for V3 models, 1.59 for gp41-models (2.42 and 0.83, respectively) and 1.58 for the combined predictor (2.44 and 0.86, respectively). Several mutations within gp41 showed strong correlation with tropism in two independent datasets. However, incorporating gp41 mutations into prediction models is not mandatory because they do not improve substantially on models trained on V3 sequences alone.
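
    The AUC values reported above can be computed from classifier scores with the rank-sum (Mann-Whitney) identity: AUC is the probability that a randomly chosen X4 sample outscores a randomly chosen R5 sample, counting ties as one half. The scores and labels below are invented, not MOTIVATE or HOMER data.

```python
# Hedged sketch: ROC AUC from classifier scores via the rank-sum identity,
# AUC = P(random positive outscores random negative), ties counted one half.
# Scores and tropism labels are invented.
def roc_auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.35, 0.6, 0.2, 0.1]  # predicted CXCR4 propensity
labels = [1,   1,   1,    0,   0,   0]    # 1 = X4-capable, 0 = R5
auc = roc_auc(scores, labels)             # 8 of 9 pairs ordered correctly
```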

  15. An improved model to predict nonuniform deformation of Zr-2.5 Nb pressure tubes

    International Nuclear Information System (INIS)

    Lei, Q.M.; Fan, H.Z.

    1997-01-01

    Present circular pressure-tube ballooning models in most fuel channel codes assume that the pressure tube remains circular during ballooning. This model provides adequate predictions of pressure-tube ballooning behaviour when the pressure tube (PT) and the calandria tube (CT) are concentric and when a small (<100 degrees C) top-to-bottom circumferential temperature gradient is present on the pressure tube. However, nonconcentric ballooning is expected to occur under certain postulated CANDU (CANada Deuterium Uranium) accident conditions. This circular geometry assumption prevents the model from accurately predicting nonuniform pressure-tube straining and local PT/CT contact when the pressure tube is subjected to a large circumferential temperature gradient and consequently deforms in a noncircular pattern. This paper describes an improved model that predicts noncircular pressure-tube deformation. Use of this model (once fully validated) will reduce uncertainties in the prediction of pressure-tube ballooning during a postulated loss-of-coolant accident (LOCA) in a CANDU reactor. The noncircular deformation model considers a ring or cross-section of a pressure tube with unit axial length to calculate deformation in the radial and circumferential directions. The model keeps track of the thinning of the pressure-tube wall as well as the shape deviation from a reference circle. Such deviation is expressed in a cosine Fourier series for the lateral symmetry case. The coefficients of the series for the first m terms are calculated by solving a set of algebraic equations at each time step. The model also takes into account the effects of pressure-tube sag or bow on ballooning, using an input value of the offset distance between the centre of the calandria tube and the initial centre of the pressure tube for determining the position radius of the pressure tube. One significant improvement realized in using the noncircular deformation model is a more accurate prediction in
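
    The cosine Fourier series mentioned above can be sketched by evaluating the radius deviation from a reference circle; the reference radius and coefficients below are illustrative, not model output.

```python
# Hedged sketch of the shape representation described above: pressure-tube
# radius as a cosine Fourier series around a reference circle,
# R(theta) = R0 * (1 + sum_m a_m * cos(m*theta)), for the lateral symmetry
# case. The reference radius and coefficients are invented.
import math

def radius(theta, r0, coeffs):
    return r0 * (1.0 + sum(a * math.cos(m * theta)
                           for m, a in enumerate(coeffs, start=1)))

r0 = 52.0                      # reference radius, mm (invented)
coeffs = [0.02, 0.005]         # first two cosine coefficients (invented)
top = radius(0.0, r0, coeffs)          # theta = 0
bottom = radius(math.pi, r0, coeffs)   # theta = pi
# a positive first coefficient bulges the theta = 0 side outward
```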

  16. LocARNA-P: Accurate boundary prediction and improved detection of structural RNAs

    DEFF Research Database (Denmark)

    Will, Sebastian; Joshi, Tejal; Hofacker, Ivo L.

    2012-01-01

    Current genomic screens for noncoding RNAs (ncRNAs) predict a large number of genomic regions containing potential structural ncRNAs. The analysis of these data requires highly accurate prediction of ncRNA boundaries and discrimination of promising candidate ncRNAs from weak predictions. Existing methods struggle with these goals because they rely on sequence-based multiple sequence alignments, which regularly misalign RNA structure and therefore do not support identification of structural similarities. To overcome this limitation, we compute columnwise and global reliabilities of alignments based on sequence and structure similarity; we refer to these structure-based alignment reliabilities as STARs. The columnwise STARs of alignments, or STAR profiles, provide a versatile tool for the manual and automatic analysis of ncRNAs. In particular, we improve the boundary prediction of the widely used nc...

  17. Predictive modelling of interventions to improve iodine intake in New Zealand.

    Science.gov (United States)

    Schiess, Sonja; Cressey, Peter J; Thomson, Barbara M

    2012-10-01

    The potential effects of four interventions to improve iodine intakes of six New Zealand population groups are assessed. A model was developed to estimate iodine intake when (i) bread is manufactured with or without iodized salt, (ii) recommended foods are consumed to augment iodine intake, (iii) iodine supplementation as recommended for pregnant women is taken and (iv) the level of iodization for use in bread manufacture is doubled from 25-65 mg to 100 mg iodine/kg salt. New Zealanders have low and decreasing iodine intakes and low iodine status. Predictive modelling is a useful tool to assess the likely impact, and potential risk, of nutrition interventions. Food consumption information was sourced from 24 h diet recall records for 4576 New Zealanders aged over 5 years. Most consumers (73-100 %) are predicted to achieve an adequate iodine intake when salt iodized at 25-65 mg iodine/kg salt is used in bread manufacture, except in pregnant females of whom 37 % are likely to meet the estimated average requirement. Current dietary advice to achieve estimated average requirements is challenging for some consumers. Pregnant women are predicted to achieve adequate but not excessive iodine intakes when 150 μg of supplemental iodine is taken daily, assuming iodized salt in bread. The manufacture of bread with iodized salt and supplemental iodine for pregnant women are predicted to be effective interventions to lift iodine intakes in New Zealand. Current estimations of iodine intake will be improved with information on discretionary salt and supplemental iodine usage.
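    The intervention model lends itself to a simple arithmetic sketch. The function below is a heavily simplified illustration of the idea (intake = baseline diet + iodized salt in bread + supplement); the baseline intake, bread consumption and salt-content figures are illustrative placeholders, not values from the study.

```python
def predicted_iodine_intake(bread_g_per_day, salt_g_per_kg_bread,
                            iodine_mg_per_kg_salt, baseline_ug=60.0,
                            supplement_ug=0.0):
    """Very simplified sketch of the intervention model: predicted daily
    iodine intake (ug/day) from baseline diet, iodized salt used in bread
    manufacture, and supplements. All default numbers are illustrative."""
    salt_g = bread_g_per_day * salt_g_per_kg_bread / 1000.0
    # mg iodine per kg salt is numerically ug iodine per g salt
    iodine_from_bread_ug = salt_g * iodine_mg_per_kg_salt
    return baseline_ug + iodine_from_bread_ug + supplement_ug

# Doubling iodization from 45 to 100 mg iodine/kg salt (mid-range values)
low = predicted_iodine_intake(100, 12, 45)
high = predicted_iodine_intake(100, 12, 100)
print(round(low, 1), round(high, 1))  # -> 114.0 180.0
```

    Comparing the predicted intake against an estimated average requirement for each population group gives the proportion of adequate consumers that the study reports.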

  18. Statistical significance of theoretical predictions: A new dimension in nuclear structure theories (I)

    International Nuclear Information System (INIS)

    Dudek, J.; Szpak, B.; Fornal, B.; Porquet, M.-G.

    2011-01-01

    In this and the follow-up article we briefly discuss what we believe is one of the most serious problems in contemporary nuclear structure: the question of the statistical significance of parametrizations of nuclear microscopic Hamiltonians and the implied predictive power of the underlying theories. In the present Part I, we introduce the main lines of reasoning of the so-called Inverse Problem Theory, an important sub-field of contemporary applied mathematics, illustrated here with the example of the Nuclear Mean-Field Approach.

  19. Skill of Predicting Heavy Rainfall Over India: Improvement in Recent Years Using UKMO Global Model

    Science.gov (United States)

    Sharma, Kuldeep; Ashrit, Raghavendra; Bhatla, R.; Mitra, A. K.; Iyengar, G. R.; Rajagopal, E. N.

    2017-11-01

    The quantitative precipitation forecast (QPF) performance for heavy rains is still a challenge, even for the most advanced, state-of-the-art high-resolution Numerical Weather Prediction (NWP) modeling systems. This study aims to evaluate the performance of the UK Met Office Unified Model (UKMO) over India for the prediction of high rainfall amounts (>2 and >5 cm/day) during the monsoon period (JJAS) from 2007 to 2015 in short-range forecasts up to Day 3. Among the various modeling upgrades and improvements in the parameterizations during this period, the model horizontal resolution improved from 40 km in 2007 to 17 km in 2015. The skill of short-range rainfall forecasts has improved in the UKMO model in recent years, mainly due to increased horizontal and vertical resolution along with improved physics schemes. Categorical verification carried out using four verification metrics, namely probability of detection (POD), false alarm ratio (FAR), frequency bias (Bias) and Critical Success Index (CSI), indicates that QPF has improved by >29% in POD and >24% in FAR. Additionally, verification scores such as EDS (Extreme Dependency Score), EDI (Extremal Dependence Index) and SEDI (Symmetric EDI) are used, with special emphasis on the verification of extreme and rare rainfall events. These scores also show an improvement of 60% (EDS) and >34% (EDI and SEDI) during the period of study, suggesting an improved skill in predicting heavy rains.
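    The four categorical scores named above are standard quantities computed from the 2x2 contingency table of a dichotomous forecast (e.g. rain above a threshold). A minimal sketch, with illustrative counts rather than the study's data:

```python
def categorical_scores(hits, false_alarms, misses):
    """Standard categorical verification scores for a dichotomous
    (e.g. rain > 2 cm/day) forecast, from the 2x2 contingency table."""
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    csi = hits / (hits + false_alarms + misses)     # critical success index
    return pod, far, bias, csi

# Illustrative counts only
pod, far, bias, csi = categorical_scores(hits=40, false_alarms=20, misses=10)
print(round(pod, 2), round(far, 2), round(bias, 2), round(csi, 2))
# -> 0.8 0.33 1.2 0.57
```

    A perfect forecast has POD = CSI = 1, FAR = 0 and Bias = 1; the extreme-event scores (EDS, EDI, SEDI) are derived from the same table but remain informative as the event base rate shrinks.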

  20. On the importance of paleoclimate modelling for improving predictions of future climate change

    Directory of Open Access Journals (Sweden)

    J. C. Hargreaves

    2009-12-01

    We use an ensemble of runs from the MIROC3.2 AGCM with slab ocean to explore the extent to which mid-Holocene simulations are relevant to predictions of future climate change. The results are compared with similar analyses for the Last Glacial Maximum (LGM) and the pre-industrial control climate. We suggest that the paleoclimate epochs can provide some independent validation of the models that is also relevant for future predictions. Considering the paleoclimate epochs, we find that the stronger global forcing and hence larger climate change at the LGM makes it likely to be the more powerful epoch for estimating the large-scale changes that are anticipated due to anthropogenic forcing. The phenomena in the mid-Holocene simulations which are most strongly correlated with future changes (i.e., the mid- to high-northern-latitude land temperature and monsoon precipitation) do, however, coincide with areas where the LGM results are not correlated with future changes, and these are also areas where the paleodata indicate that significant climate changes have occurred. Thus, these regions and phenomena for the mid-Holocene may be useful for model improvement and validation.

  1. Factors predicting visual improvement post pars plana vitrectomy for proliferative diabetic retinopathy

    Directory of Open Access Journals (Sweden)

    Evelyn Tai Li Min

    2017-08-01

    AIM: To identify factors predicting visual improvement post vitrectomy for sequelae of proliferative diabetic retinopathy (PDR). METHODS: This was a retrospective analysis of pars plana vitrectomy indicated for sequelae of PDR from Jan. to Dec. 2014 in Hospital Sultanah Bahiyah, Alor Star, Kedah, Malaysia. Data collected included patient demographics, baseline visual acuity (VA) and post-operative logMAR best corrected VA at 1y. Data analysis was performed with IBM SPSS Statistics Version 22.0. RESULTS: A total of 103 patients were included. The mean age was 51.2y. On multivariable analysis, each pre-operative positive deviation of 1 logMAR from a baseline VA of 0 logMAR was associated with a post-operative improvement of 0.859 logMAR (P<0.001). Likewise, an attached macula pre-operatively was associated with a 0.374 (P=0.003) logMAR improvement post vitrectomy. Absence of iris neovascularisation and absence of post-operative complications were associated with a post-vitrectomy improvement in logMAR of 1.126 (P=0.001) and 0.377 (P=0.005), respectively. Absence of long-acting intraocular tamponade was associated with a 0.302 (P=0.010) improvement of logMAR post vitrectomy. CONCLUSION: Factors associated with visual improvement after vitrectomy are poor pre-operative VA, an attached macula, absence of iris neovascularisation, absence of post-operative complications and abstaining from use of long-acting intraocular tamponade. A thorough understanding of the factors predicting visual improvement will facilitate decision-making in vitreoretinal surgery.

  2. An Improved Generalized Predictive Control in a Robust Dynamic Partial Least Square Framework

    Directory of Open Access Journals (Sweden)

    Jin Xin

    2015-01-01

    To tackle the sensitivity to outliers in system identification, a new robust dynamic partial least squares (PLS) model based on an outlier detection method is proposed in this paper. An improved radial basis function network (RBFN) is adopted to construct the predictive model from the input and output datasets, and a hidden Markov model (HMM) is applied to detect the outliers. After the outliers are removed, a more robust dynamic PLS model is obtained. In addition, an improved generalized predictive control (GPC) with tuned weights under the dynamic PLS framework is proposed to deal with the interaction caused by model mismatch. The results of two simulations demonstrate the effectiveness of the proposed method.

  3. Improvement of AEP Predictions Using Diurnal CFD Modelling with Site-Specific Stability Weightings Provided from Mesoscale Simulation

    International Nuclear Information System (INIS)

    Hristov, Y; Oxley, G; Žagar, M

    2014-01-01

    The Bolund measurement campaign, performed by the Danish Technical University (DTU) Wind Energy Department (also known as RISØ), provided significant insight into wind flow modeling over complex terrain. In the blind comparison study, several modelling solutions were submitted, with the vast majority being steady-state Computational Fluid Dynamics (CFD) approaches with two-equation k-ε turbulence closure. This approach yielded the most accurate results, and was identified as the state-of-the-art tool for wind turbine generator (WTG) micro-siting. Based on the findings from Bolund, further comparison between CFD and field measurement data has been deemed essential in order to improve simulation accuracy for turbine load and long-term Annual Energy Production (AEP) estimations. Vestas Wind Systems A/S is a major WTG original equipment manufacturer (OEM) with an installed base of over 60 GW in over 70 countries, accounting for 19% of the global installed base. The Vestas Performance and Diagnostic Centre (VPDC) provides online live data for more than 47 GW of these turbines, allowing a comprehensive comparison between modelled and real-world energy production data. In previous studies, multiple sites have been simulated with a steady neutral CFD formulation for the atmospheric surface layer (ASL), and wind resource (RSF) files have been generated as a base for long-term AEP predictions, showing significant improvement over predictions performed with the industry-standard linear WAsP tool. In this study, further improvements to wind resource file generation with CFD are examined using an unsteady diurnal cycle approach with a full atmospheric boundary layer (ABL) formulation, with the unique stratifications throughout the cycle weighted according to mesoscale simulated sectorwise stability frequencies.

  4. A study on improvement of analytical prediction model for spacer grid pressure loss coefficients

    International Nuclear Information System (INIS)

    Lim, Jonh Seon

    2002-02-01

    Nuclear fuel assemblies used in nuclear power plants consist of nuclear fuel rods, control rod guide tubes, an instrument guide tube, spacer grids, a bottom nozzle and a top nozzle. The spacer grid is the most important of these components for thermal-hydraulic and mechanical design and analyses. The spacer grids, fixed to the guide tubes, support the fuel rods and play a key role in promoting thermal energy transfer through the coolant mixing caused by turbulent flow and crossflow in the subchannels. In this paper, the analytical spacer grid pressure loss prediction model has been studied and improved by considering the test-section-wall-to-spacer-grid gap pressure loss independently and by applying an appropriate friction drag coefficient to predict pressure loss more accurately in the low Reynolds number region. The improved analytical model has been verified against hydraulic pressure drop test results for spacer grids of three types with 5x5, 16x16 and 17x17 arrays, respectively. The pressure loss coefficients predicted by the improved analytical model agree with the test results within ±12%. This result shows that the improved analytical model can be used for research and design changes of nuclear fuel assemblies.

  5. Improved predictions of nuclear reaction rates with the TALYS reaction code for astrophysical applications

    International Nuclear Information System (INIS)

    Goriely, S.; Hilaire, S.; Koning, A.J

    2008-01-01

    Context. Nuclear reaction rates for astrophysical applications are traditionally determined on the basis of Hauser-Feshbach reaction codes. These codes adopt a number of approximations that have never been tested, such as a simplified width fluctuation correction, the neglect of delayed or multiple-particle emission during the electromagnetic decay cascade, or the absence of the pre-equilibrium contribution at increasing incident energies. Aims. The reaction code TALYS has recently been updated to estimate the Maxwellian-averaged reaction rates that are of astrophysical relevance. These new developments enable the reaction rates to be calculated with increased accuracy and reliability and the approximations of previous codes to be investigated. Methods. The TALYS predictions for the thermonuclear rates of relevance to astrophysics are detailed and compared with those derived by widely used codes for the same nuclear ingredients. Results. It is shown that TALYS predictions may differ significantly from those of previous codes, in particular for nuclei for which little or no nuclear data is available. The pre-equilibrium process is shown to influence the astrophysical rates of exotic neutron-rich nuclei significantly. For the first time, the Maxwellian-averaged (n, 2n) reaction rate is calculated for all nuclei and its competition with the radiative capture rate is discussed. Conclusions. The TALYS code provides a new tool to estimate all nuclear reaction rates of relevance to astrophysics with improved accuracy and reliability. (authors)

  6. Predictive power of theoretical modelling of the nuclear mean field: examples of improving predictive capacities

    Science.gov (United States)

    Dedes, I.; Dudek, J.

    2018-03-01

    We examine the effects of parametric correlations on the predictive capacities of theoretical modelling, keeping in mind nuclear structure applications. The main purpose of this work is to illustrate the method of establishing the presence and determining the form of parametric correlations within a model, as well as an algorithm of elimination by substitution (see text) of parametric correlations. We further examine the effects of eliminating the parametric correlations on the stabilisation of the model predictions further and further away from the fitting zone. It follows that the choice of the physics case and the selection of the associated model are of secondary importance in this case. Under these circumstances we give priority to the relative simplicity of the underlying mathematical algorithm, provided the model is realistic. Following such criteria, we focus specifically on an important but relatively simple case of doubly magic spherical nuclei. To profit from the algorithmic simplicity we chose to work with the phenomenological spherically symmetric Woods–Saxon mean field. We employ two variants of the underlying Hamiltonian, the traditional one involving both the central and the spin–orbit potential in the Woods–Saxon form, and the more advanced version with the self-consistent density-dependent spin–orbit interaction. We compare the effects of eliminating various types of correlations and discuss the improvement of the quality of predictions (‘predictive power’) under realistic parameter adjustment conditions.

  7. Improving Wind Farm Dispatchability Using Model Predictive Control for Optimal Operation of Grid-Scale Energy Storage

    Directory of Open Access Journals (Sweden)

    Douglas Halamay

    2014-09-01

    This paper demonstrates the use of model-based predictive control of energy storage systems to improve the dispatchability of wind power plants. Large-scale wind penetration increases the variability of power flow on the grid, thus increasing reserve requirements. Large energy storage systems collocated with wind farms can improve the dispatchability of the wind plant by storing energy when generation is over schedule and sourcing energy when generation is under schedule, essentially providing on-site reserves. Model predictive control (MPC) provides a natural framework for this application. By utilizing an accurate energy storage system model, control actions can be planned in the context of system power and state-of-charge limitations. MPC also enables the inclusion of predicted wind farm performance over a near-term horizon, allowing control actions to be planned in anticipation of fast changes, such as wind ramps. This paper demonstrates that model-based predictive control can improve system performance compared with a standard non-predictive, non-model-based control approach. It is also demonstrated that secondary objectives, such as reducing the rate of change of the wind plant output (i.e., ramps), can be considered and successfully implemented within the MPC framework. Specifically, it is shown that scheduling error can be reduced by 81%, reserve requirements can be improved by up to 37%, and the number of ramp events can be reduced by 74%.
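    The receding-horizon idea behind MPC can be sketched with a toy brute-force optimizer: choose the next storage action by searching short action sequences against a wind forecast, subject to power and state-of-charge limits. This is not the controller from the paper; the horizon, limits and forecast numbers are invented, and a practical implementation would use a proper QP solver rather than enumerating discretized actions.

```python
from itertools import product

def mpc_dispatch(soc, schedule, wind_forecast, p_max=5.0, e_max=20.0,
                 horizon=3, levels=11):
    """Toy receding-horizon sketch: pick the storage power for the next
    step (positive = discharge, MW) by searching discretized action
    sequences over a short horizon, minimizing squared deviation of
    (wind + storage) from the scheduled output, subject to power and
    state-of-charge limits. Illustrative only."""
    actions = [p_max * (2 * i / (levels - 1) - 1) for i in range(levels)]
    best_cost, best_first = float('inf'), 0.0
    for seq in product(actions, repeat=horizon):
        e, cost, feasible = soc, 0.0, True
        for p, sched, wind in zip(seq, schedule, wind_forecast):
            e -= p  # 1-hour steps: discharging lowers stored energy (MWh)
            if not 0.0 <= e <= e_max:
                feasible = False
                break
            cost += (wind + p - sched) ** 2
        if feasible and cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first

# Wind forecast below schedule: the controller plans to discharge
p = mpc_dispatch(soc=10.0, schedule=[8, 8, 8], wind_forecast=[6, 5, 4])
print(round(p, 1))  # -> 2.0
```

    At the next time step the horizon slides forward and the optimization is repeated with an updated forecast, which is what lets MPC anticipate wind ramps rather than merely react to them.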

  8. Improving Predictive Modeling in Pediatric Drug Development: Pharmacokinetics, Pharmacodynamics, and Mechanistic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Slikker, William; Young, John F.; Corley, Rick A.; Dorman, David C.; Conolly, Rory B.; Knudsen, Thomas; Erstad, Brian L.; Luecke, Richard H.; Faustman, Elaine M.; Timchalk, Chuck; Mattison, Donald R.

    2005-07-26

    A workshop was conducted on November 18-19, 2004, to address the issue of improving predictive models for drug delivery to developing humans. Although considerable progress has been made for adult humans, large gaps remain for predicting pharmacokinetic/pharmacodynamic (PK/PD) outcome in children because most adult models have not been tested during development. The goals of the meeting included a description of when, during development, infants/children become adultlike in handling drugs. The issue of incorporating the most recent advances into the predictive models was also addressed: both the use of imaging approaches and genomic information were considered. Disease state, as exemplified by obesity, was addressed as a modifier of drug pharmacokinetics and pharmacodynamics during development. Issues addressed in this workshop should be considered in the development of new predictive and mechanistic models of drug kinetics and dynamics in the developing human.

  9. A prediction score for significant coronary artery disease in Chinese patients ≥50 years old referred for rheumatic valvular heart disease surgery.

    Science.gov (United States)

    Xu, Zhenjun; Pan, Jun; Chen, Tao; Zhou, Qing; Wang, Qiang; Cao, Hailong; Fan, Fudong; Luo, Xuan; Ge, Min; Wang, Dongjin

    2018-04-01

    Our goal was to establish a prediction score and protocol for the preoperative prediction of significant coronary artery disease (CAD) in patients with rheumatic valvular heart disease. Using multivariate logistic regression analysis, we validated the model based on 490 patients without a history of myocardial infarction and who underwent preoperative screening coronary angiography. Significant CAD was defined as ≥50% narrowing of the diameter of the lumen of the left main coronary artery or ≥70% narrowing of the diameter of the lumen of the left anterior descending coronary artery, left circumflex artery or right coronary artery. Significant CAD was present in 9.8% of patients. Age, smoking, diabetes mellitus, diastolic blood pressure, low-density lipoprotein cholesterol and ischaemia evident on an electrocardiogram were independently associated with significant CAD and were entered into the multivariate model. According to the logistic regression predictive risk score, preoperative coronary angiography is recommended in (i) postmenopausal women between 50 and 59 years of age with ≥9.1% logistic regression predictive risk score; (ii) postmenopausal women who are ≥60 years old with a logistic regression predictive risk score ≥6.6% and (iii) men ≥50 years old whose logistic regression predictive risk score was ≥2.8%. Based on this predictive model, 246 (50.2%) preoperative coronary angiograms could be safely avoided. The negative predictive value of the model was 98.8% (246 of 249). This model was accurate for the preoperative prediction of significant CAD in patients with rheumatic valvular heart disease. This model must be validated in larger cohorts and various populations.
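    The scoring-plus-threshold protocol can be sketched as follows. The decision thresholds (9.1%, 6.6%, 2.8%) are those reported in the abstract, but the regression coefficients and intercept below are hypothetical placeholders, not the fitted values from the study.

```python
import math

def cad_risk(age, smoker, diabetes, dbp, ldl, ecg_ischaemia,
             coef=None, intercept=-8.0):
    """Sketch of a logistic-regression risk score. The predictor set matches
    the abstract; the coefficients and intercept are made-up placeholders."""
    if coef is None:
        coef = {'age': 0.06, 'smoker': 0.8, 'diabetes': 0.9,
                'dbp': 0.01, 'ldl': 0.4, 'ecg': 1.1}
    z = (intercept + coef['age'] * age + coef['smoker'] * smoker
         + coef['diabetes'] * diabetes + coef['dbp'] * dbp
         + coef['ldl'] * ldl + coef['ecg'] * ecg_ischaemia)
    return 1.0 / (1.0 + math.exp(-z))  # predicted probability of CAD

def angiography_recommended(risk, sex, age):
    """Decision thresholds reported in the abstract (risk as a fraction);
    'F' groups refer to postmenopausal women."""
    if sex == 'F' and 50 <= age <= 59:
        return risk >= 0.091
    if sex == 'F' and age >= 60:
        return risk >= 0.066
    if sex == 'M' and age >= 50:
        return risk >= 0.028
    return False

risk = cad_risk(age=65, smoker=1, diabetes=1, dbp=85, ldl=3.5, ecg_ischaemia=1)
print(angiography_recommended(risk, 'M', 65))  # -> True
```

    Patients below their group's threshold would skip preoperative angiography, which is how the study arrives at the 50.2% of angiograms that could be safely avoided.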

  10. Parametric Bayesian priors and better choice of negative examples improve protein function prediction.

    Science.gov (United States)

    Youngs, Noah; Penfold-Brown, Duncan; Drew, Kevin; Shasha, Dennis; Bonneau, Richard

    2013-05-01

    Computational biologists have demonstrated the utility of using machine learning methods to predict protein function from an integration of multiple genome-wide data types. Yet, even the best performing function prediction algorithms rely on heuristics for important components of the algorithm, such as choosing negative examples (proteins without a given function) or determining key parameters. The improper choice of negative examples, in particular, can hamper the accuracy of protein function prediction. We present a novel approach for choosing negative examples, using a parameterizable Bayesian prior computed from all observed annotation data, which also generates priors used during function prediction. We incorporate this new method into the GeneMANIA function prediction algorithm and demonstrate improved accuracy of our algorithm over current top-performing function prediction methods on the yeast and mouse proteomes across all metrics tested. Code and Data are available at: http://bonneaulab.bio.nyu.edu/funcprop.html
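    The idea of prior-based negative selection can be illustrated with a toy co-annotation prior: proteins whose observed functions rarely co-occur with the target function receive a low smoothed prior and are chosen as negatives. This is a sketch of the general idea only, not the parametric Bayesian prior of the paper; all protein and function names are invented.

```python
from collections import Counter

def negative_examples(annotations, target, alpha=1.0, k=2):
    """Pick k negative examples for `target`: unannotated proteins whose
    functions have the lowest Laplace-smoothed co-occurrence rate with the
    target function. Toy illustration only."""
    co, freq = Counter(), Counter()
    for prot, funcs in annotations.items():
        for f in funcs:
            freq[f] += 1
            if target in funcs and f != target:
                co[f] += 1
    scores = {}
    for prot, funcs in annotations.items():
        if target in funcs:
            continue  # annotated positives cannot be negatives
        rates = [(co[f] + alpha) / (freq[f] + 2 * alpha) for f in funcs]
        scores[prot] = sum(rates) / len(rates)
    return sorted(scores, key=scores.get)[:k]

annotations = {
    'p1': {'kinase', 'membrane'},
    'p2': {'kinase', 'nucleus'},
    'p3': {'ribosome', 'cytoplasm'},
    'p4': {'ribosome', 'nucleus'},
    'p5': {'membrane'},
}
print(negative_examples(annotations, target='kinase'))  # -> ['p3', 'p4']
```

    The same smoothed statistics can double as class priors during prediction, which is the dual use the abstract describes.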

  11. Improved Prediction of Falls in Community-Dwelling Older Adults Through Phase-Dependent Entropy of Daily-Life Walking

    Directory of Open Access Journals (Sweden)

    Espen A. F. Ihlen

    2018-03-01

    Age and age-related diseases have been suggested to decrease the entropy of human gait kinematics, which is thought to make older adults more susceptible to falls. In this study we introduce a new entropy measure, called phase-dependent generalized multiscale entropy (PGME), and test whether this measure improves fall-risk prediction in community-dwelling older adults. PGME can assess phase-dependent changes in the stability of gait dynamics that result from kinematic changes in events such as heel strike and toe-off. PGME was assessed for trunk acceleration of 30 s walking epochs in a re-analysis of 1 week of daily-life activity data from the FARAO study, originally described by van Schooten et al. (2016). The re-analyzed data set contained inertial sensor data from 52 single- and 46 multiple-time prospective fallers in a 6-month follow-up period, and an equal number of non-falling controls matched by age, weight, height, gender, and the use of walking aids. The predictive ability of PGME for falls was assessed using a partial least squares regression. PGME had a superior predictive ability for falls among single-time prospective fallers when compared to the other gait features. The single-time fallers had a higher PGME (p < 0.0001) of their trunk acceleration at 60% of their step cycle when compared with non-fallers. No significant differences were found between the PGME of multiple-time fallers and non-fallers, but PGME was found to improve the prediction model for multiple-time fallers when combined with other gait features. These findings suggest that taking into account phase-dependent changes in the stability of gait dynamics has additional value for predicting falls in older people, especially for single-time prospective fallers.
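    PGME builds on entropy measures of gait regularity. As a point of reference, plain sample entropy can be computed as below; the study's measure additionally conditions on gait phase and scale, which this sketch does not attempt.

```python
import math

def sample_entropy(signal, m=2, r=0.2):
    """Plain sample entropy of a 1-D signal: negative log of the conditional
    probability that sequences matching for m points (within tolerance r,
    Chebyshev distance) also match for m+1 points. Regular signals score
    low; irregular signals score high."""
    n = len(signal)

    def count_matches(mm):
        templates = [signal[i:i + mm] for i in range(n - mm + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float('inf')

# A perfectly regular signal has lower sample entropy than an irregular one
regular = [0.0, 1.0] * 20
noisy = [((i * 7919) % 13) / 13.0 for i in range(40)]  # crude deterministic "noise"
print(sample_entropy(regular) < sample_entropy(noisy))  # -> True
```

    In the phase-dependent variant, the matching statistics are computed separately at each percentage of the step cycle, so that instability around events like heel strike is not averaged away.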

  12. Global proteomics profiling improves drug sensitivity prediction: results from a multi-omics, pan-cancer modeling approach.

    Science.gov (United States)

    Ali, Mehreen; Khan, Suleiman A; Wennerberg, Krister; Aittokallio, Tero

    2018-04-15

    Proteomics profiling is increasingly being used for molecular stratification of cancer patients and cell-line panels. However, systematic assessment of the predictive power of large-scale proteomic technologies across various drug classes and cancer types is currently lacking. To that end, we carried out the first pan-cancer, multi-omics comparative analysis of the relative performance of two proteomic technologies, targeted reverse phase protein array (RPPA) and global mass spectrometry (MS), in terms of their accuracy for predicting the sensitivity of cancer cells to both cytotoxic chemotherapeutics and molecularly targeted anticancer compounds. Our results in two cell-line panels demonstrate how MS profiling improves drug response predictions beyond that of the RPPA or the other omics profiles when used alone. However, frequent missing MS data values complicate its use in predictive modeling and required additional filtering, such as focusing on completely measured or known oncoproteins, to obtain maximal predictive performance. Rather strikingly, the two proteomics profiles provided complementary predictive signal both for the cytotoxic and targeted compounds. Further, information about the cellular-abundance of primary target proteins was found critical for predicting the response of targeted compounds, although the non-target features also contributed significantly to the predictive power. The clinical relevance of the selected protein markers was confirmed in cancer patient data. These results provide novel insights into the relative performance and optimal use of the widely applied proteomic technologies, MS and RPPA, which should prove useful in translational applications, such as defining the best combination of omics technologies and marker panels for understanding and predicting drug sensitivities in cancer patients. Processed datasets, R as well as Matlab implementations of the methods are available at https://github.com/mehr-een/bemkl-rbps. mehreen

  13. Machine-learning scoring functions to improve structure-based binding affinity prediction and virtual screening.

    Science.gov (United States)

    Ain, Qurrat Ul; Aleksandrova, Antoniya; Roessler, Florian D; Ballester, Pedro J

    2015-01-01

    Docking tools to predict whether and how a small molecule binds to a target can be applied if a structural model of such target is available. The reliability of docking depends, however, on the accuracy of the adopted scoring function (SF). Despite intense research over the years, improving the accuracy of SFs for structure-based binding affinity prediction or virtual screening has proven to be a challenging task for any class of method. New SFs based on modern machine-learning regression models, which do not impose a predetermined functional form and thus are able to exploit effectively much larger amounts of experimental data, have recently been introduced. These machine-learning SFs have been shown to outperform a wide range of classical SFs at both binding affinity prediction and virtual screening. The emerging picture from these studies is that the classical approach of using linear regression with a small number of expert-selected structural features can be strongly improved by a machine-learning approach based on nonlinear regression allied with comprehensive data-driven feature selection. Furthermore, the performance of classical SFs does not grow with larger training datasets and hence this performance gap is expected to widen as more training data becomes available in the future. Other topics covered in this review include predicting the reliability of a SF on a particular target class, generating synthetic data to improve predictive performance and modeling guidelines for SF development. WIREs Comput Mol Sci 2015, 5:405-424. doi: 10.1002/wcms.1225 For further resources related to this article, please visit the WIREs website.

  14. Clinical-Radiological Parameters Improve the Prediction of the Thrombolysis Time Window by Both MRI Signal Intensities and DWI-FLAIR Mismatch.

    Science.gov (United States)

    Madai, Vince Istvan; Wood, Carla N; Galinovic, Ivana; Grittner, Ulrike; Piper, Sophie K; Revankar, Gajanan S; Martin, Steve Z; Zaro-Weber, Olivier; Moeller-Hartmann, Walter; von Samson-Himmelstjerna, Federico C; Heiss, Wolf-Dieter; Ebinger, Martin; Fiebach, Jochen B; Sobesky, Jan

    2016-01-01

    With regard to acute stroke, patients with unknown time from stroke onset are not eligible for thrombolysis. Quantitative diffusion-weighted imaging (DWI) and fluid-attenuated inversion recovery (FLAIR) MRI relative signal intensity (rSI) biomarkers have been introduced to predict eligibility for thrombolysis, but have shown heterogeneous results in the past. In the present work, we investigated whether the inclusion of easily obtainable clinical-radiological parameters would improve the prediction of the thrombolysis time window by rSIs, and compared their performance to the visual DWI-FLAIR mismatch. In a retrospective study, patients from 2 centers with proven stroke were included, and rSIs were calculated (lesion signal-intensity value/mean value of the unaffected hemisphere). Additionally, the visual DWI-FLAIR mismatch was evaluated. Prediction of the thrombolysis time window was evaluated by the area under the curve (AUC) derived from receiver operating characteristic (ROC) curve analysis. The association of age, National Institutes of Health Stroke Scale, MRI field strength, lesion size, vessel occlusion and Wahlund score with rSI was investigated, and the models were adjusted and stratified accordingly. In 82 patients, the unadjusted rSI measures DWI-mean and -SD showed the highest AUCs (AUC 0.86-0.87). Adjustment for clinical-radiological covariates significantly improved the performance of FLAIR-mean (0.91) and DWI-SD (0.91). The best prediction results based on the AUC were found for the final stratified and adjusted models of DWI-SD (0.94) and FLAIR-mean (0.96) and a multivariable DWI-FLAIR model (0.95). The adjusted visual DWI-FLAIR mismatch did not perform in a significantly worse manner (0.89). ADC-rSIs showed fair performance in all models. Quantitative DWI and FLAIR MRI biomarkers as well as the visual DWI-FLAIR mismatch provide excellent prediction of eligibility for thrombolysis in acute stroke, when easily obtainable clinical-radiological parameters are included in the prediction

  15. Improved nucleic acid descriptors for siRNA efficacy prediction.

    Science.gov (United States)

    Sciabola, Simone; Cao, Qing; Orozco, Modesto; Faustino, Ignacio; Stanton, Robert V

    2013-02-01

    Although considerable progress has been made recently in understanding how gene silencing is mediated by the RNAi pathway, the rational design of effective sequences is still a challenging task. In this article, we demonstrate that including three-dimensional descriptors improved the discrimination between active and inactive small interfering RNAs (siRNAs) in a statistical model. Five descriptor types were used: (i) nucleotide position along the siRNA sequence, (ii) nucleotide composition in terms of presence/absence of specific combinations of di- and trinucleotides, (iii) nucleotide interactions by means of a modified auto- and cross-covariance function, (iv) nucleotide thermodynamic stability derived by the nearest neighbor model representation and (v) nucleic acid structure flexibility. The duplex flexibility descriptors are derived from extended molecular dynamics simulations, which are able to describe the sequence-dependent elastic properties of RNA duplexes, even for non-standard oligonucleotides. The matrix of descriptors was analysed using three statistical packages in R (partial least squares, random forest, and support vector machine), and the most predictive model was implemented in a modeling tool we have made publicly available through SourceForge. Our implementation of new RNA descriptors coupled with appropriate statistical algorithms resulted in improved model performance for the selection of siRNA candidates when compared with publicly available siRNA prediction tools and previously published test sets. Additional validation studies based on in-house RNA interference projects confirmed the robustness of the scoring procedure in prospective studies.
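    The auto-covariance component of descriptor type (iii) can be illustrated for a single property scale. The property values assigned to the bases below are made-up placeholders, not the physicochemical scales used in the paper.

```python
def auto_covariance(seq, props, lag):
    """Auto-covariance descriptor of a nucleotide sequence for one property
    scale (props maps base -> value) at a given lag, in the style of
    ACC sequence descriptors: mean-centred products of property values
    separated by `lag` positions, averaged over the sequence."""
    vals = [props[b] for b in seq]
    n = len(vals)
    mean = sum(vals) / n
    return sum((vals[i] - mean) * (vals[i + lag] - mean)
               for i in range(n - lag)) / (n - lag)

props = {'A': 0.1, 'C': 0.3, 'G': 0.7, 'U': 0.5}  # hypothetical property scale
siRNA = "GCAUGCUAGCUAGGCAUAGCU"  # illustrative 21-nt sequence
descriptors = [auto_covariance(siRNA, props, lag) for lag in (1, 2, 3)]
print(len(descriptors))  # -> 3
```

    Stacking such values over several lags and several property scales (and cross-covariances between scales) yields the fixed-length descriptor vector that the statistical models consume.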

  16. Utilizing multiple scale models to improve predictions of extra-axial hemorrhage in the immature piglet.

    Science.gov (United States)

    Scott, Gregory G; Margulies, Susan S; Coats, Brittany

    2016-10-01

    Traumatic brain injury (TBI) is a leading cause of death and disability in the USA. To help understand and better predict TBI, researchers have developed complex finite element (FE) models of the head which incorporate many biological structures such as scalp, skull, meninges, brain (with gray/white matter differentiation), and vasculature. However, most models drastically simplify the membranes and substructures between the pia and arachnoid membranes. We hypothesize that substructures in the pia-arachnoid complex (PAC) contribute substantially to brain deformation following head rotation, and that when included in FE models accuracy of extra-axial hemorrhage prediction improves. To test these hypotheses, microscale FE models of the PAC were developed to span the variability of PAC substructure anatomy and regional density. The constitutive response of these models were then integrated into an existing macroscale FE model of the immature piglet brain to identify changes in cortical stress distribution and predictions of extra-axial hemorrhage (EAH). Incorporating regional variability of PAC substructures substantially altered the distribution of principal stress on the cortical surface of the brain compared to a uniform representation of the PAC. Simulations of 24 non-impact rapid head rotations in an immature piglet animal model resulted in improved accuracy of EAH prediction (to 94 % sensitivity, 100 % specificity), as well as a high accuracy in regional hemorrhage prediction (to 82-100 % sensitivity, 100 % specificity). We conclude that including a biofidelic PAC substructure variability in FE models of the head is essential for improved predictions of hemorrhage at the brain/skull interface.

  17. Rehearsal significantly improves immediate and delayed recall on the Rey Auditory Verbal Learning Test.

    Science.gov (United States)

    Hessen, Erik

    2011-10-01

    A repeated observation during memory assessment with the Rey Auditory Verbal Learning Test (RAVLT) is that patients who spontaneously employ a memory rehearsal strategy by repeating the word list more than once achieve better scores than patients who only repeat the word list once. This observation led to concern about the ability of the standard test procedure of RAVLT and similar tests in eliciting the best possible recall scores. The purpose of the present study was to test the hypothesis that a rehearsal recall strategy of repeating the word list more than once would result in improved scores of recall on the RAVLT. We report on differences in outcome after standard administration and after experimental administration on Immediate and Delayed Recall measures from the RAVLT of 50 patients. The experimental administration resulted in significantly improved scores for all the variables employed. Additionally, it was found that patients who failed effort screening showed significantly poorer improvement on Delayed Recall compared with those who passed the effort screening. The general clear improvement both in raw scores and T-scores demonstrates that recall performance can be significantly influenced by the strategy of the patient or by small variations in instructions by the examiner.

  18. Improvement of NO and CO predictions for a homogeneous combustion SI engine using a novel emissions model

    International Nuclear Information System (INIS)

    Karvountzis-Kontakiotis, Apostolos; Ntziachristos, Leonidas

    2016-01-01

    Highlights: • Presentation of a novel emissions model to predict pollutants formation in engines. • Model based on detailed chemistry, requires no application-specific calibration. • Combined with 0D and 1D combustion models with low additional computational cost. • Demonstrates accurate prediction of cyclic variability of pollutants emissions. - Abstract: This study proposes a novel emissions model for the prediction of spark ignition (SI) engine emissions at homogeneous combustion conditions, using post combustion analysis and a detailed chemistry mechanism. The novel emissions model considers an unburned and a burned zone, where the latter is considered as a homogeneous reactor and is modeled using a detailed chemical kinetics mechanism. This allows detailed emission predictions at high speed practically based only on combustion pressure and temperature profiles, without the need for calibration of the model parameters. The predictability of the emissions model is compared against the extended Zeldovich mechanism for NO and a simplified two-step reaction kinetic model for CO, which both constitute the most widespread existing approaches in the literature. Under various engine load and speed conditions examined, the mean error in NO prediction was 28% for the existing models and less than 1.3% for the new model proposed. The novel emissions model was also used to predict emissions variation due to cyclic combustion variability and demonstrated mean prediction error of 6% and 3.6% for NO and CO respectively, compared to 36% (NO) and 67% (CO) for the simplified model. The results show that the emissions model proposed offers substantial improvements in the prediction of the results without significant increase in calculation time.

  19. Use of net reclassification improvement (NRI) method confirms the utility of combined genetic risk score to predict type 2 diabetes.

    Directory of Open Access Journals (Sweden)

    Claudia H T Tam

    Full Text Available BACKGROUND: Recent genome-wide association studies (GWAS) identified more than 70 novel loci for type 2 diabetes (T2D), some of which have been widely replicated in Asian populations. In this study, we investigated their individual and combined effects on T2D in a Chinese population. METHODOLOGY: We selected 14 single nucleotide polymorphisms (SNPs) in T2D genes relating to beta-cell function validated in Asian populations and genotyped them in 5882 Chinese T2D patients and 2569 healthy controls. A combined genetic score (CGS) was calculated by summing up the number of risk alleles, or the number weighted by the effect size for each SNP, under an additive genetic model. We tested for associations by either logistic or linear regression analysis for T2D and quantitative traits, respectively. The contribution of the CGS to predicting T2D risk was evaluated by receiver operating characteristic (ROC) analysis and net reclassification improvement (NRI). RESULTS: We observed consistent and significant associations of IGF2BP2, WFS1, CDKAL1, SLC30A8, CDKN2A/B, HHEX, TCF7L2 and KCNQ1 with T2D (P values as low as 8.5×10(-18)). The significant SNPs exhibited a joint effect on increasing T2D risk, fasting plasma glucose and use of insulin therapy, as well as reducing HOMA-β, BMI, waist circumference and age at diagnosis of T2D. The addition of CGS marginally increased the AUC (2%) but significantly improved the predictive ability for T2D risk by 11.2% and 11.3% for unweighted and weighted CGS, respectively, using the NRI approach (P<0.001). CONCLUSION: In a Chinese population, the use of a CGS of 8 SNPs modestly but significantly improved the discriminative ability to predict T2D above and beyond that attributed to clinical risk factors (sex, age and BMI).
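
    The combined genetic score described above (sum of risk alleles, optionally weighted by per-SNP effect size under an additive model) can be sketched as below. The genotype and effect-size values are illustrative, not those of the study.

```python
# Sketch of a combined genetic score (CGS): unweighted = count of risk
# alleles; weighted = count weighted by per-SNP effect size (e.g. log OR).

def combined_genetic_score(risk_allele_counts, effect_sizes=None):
    """risk_allele_counts: 0/1/2 risk alleles per SNP (additive model)."""
    if effect_sizes is None:                      # unweighted CGS
        return sum(risk_allele_counts)
    return sum(n * beta for n, beta in zip(risk_allele_counts, effect_sizes))

genotype = [2, 1, 0, 1, 2]                # risk alleles at 5 hypothetical SNPs
betas = [0.30, 0.12, 0.25, 0.18, 0.09]    # hypothetical log-odds ratios

unweighted = combined_genetic_score(genotype)
weighted = combined_genetic_score(genotype, betas)
```

    The study then enters this score into a logistic regression alongside clinical risk factors and compares discrimination via ROC AUC and reclassification via NRI.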

  20. Improved prediction and tracking of volcanic ash clouds

    Science.gov (United States)

    Mastin, Larry G.; Webley, Peter

    2009-01-01

    During the past 30 years, more than 100 airplanes have inadvertently flown through clouds of volcanic ash from erupting volcanoes. Such encounters have caused millions of dollars in damage to the aircraft and have endangered the lives of tens of thousands of passengers. In a few severe cases, total engine failure resulted when ash was ingested into turbines and coated turbine blades. These incidents have prompted the establishment of cooperative efforts by the International Civil Aviation Organization and the volcanological community to provide rapid notification of eruptive activity, and to monitor and forecast the trajectories of ash clouds so that they can be avoided by air traffic. Ash-cloud properties such as plume height, ash concentration, and three-dimensional ash distribution have been monitored through non-conventional remote sensing techniques that are under active development. Forecasting the trajectories of ash clouds has required the development of volcanic ash transport and dispersion models that can calculate the path of an ash cloud over the scale of a continent or a hemisphere. Volcanological inputs to these models, such as plume height, mass eruption rate, eruption duration, ash distribution with altitude, and grain-size distribution, must be assigned in real time during an event, often with limited observations. Databases and protocols are currently being developed that allow for rapid assignment of such source parameters. In this paper, we summarize how an interdisciplinary working group on eruption source parameters has been instigating research to improve upon the current understanding of volcanic ash cloud characterization and predictions. Improved predictions of ash cloud movement and ash fall will aid in making better hazard assessments for aviation and for public health and air quality. © 2008 Elsevier B.V.

  1. Do Electrochemiluminescence Assays Improve Prediction of Time to Type 1 Diabetes in Autoantibody-Positive TrialNet Subjects?

    Science.gov (United States)

    Fouts, Alexandra; Pyle, Laura; Yu, Liping; Miao, Dongmei; Michels, Aaron; Krischer, Jeffrey; Sosenko, Jay; Gottlieb, Peter; Steck, Andrea K

    2016-10-01

    To explore whether electrochemiluminescence (ECL) assays can help improve prediction of time to type 1 diabetes in the TrialNet autoantibody-positive population. TrialNet subjects who were positive for one or more autoantibodies (microinsulin autoantibody, GAD65 autoantibody [GADA], IA-2A, and ZnT8A) with available ECL-insulin autoantibody (IAA) and ECL-GADA data at their initial visit were analyzed; after a median follow-up of 24 months, 177 of these 1,287 subjects developed diabetes. Univariate analyses showed that autoantibodies by radioimmunoassays (RIAs), ECL-IAA, ECL-GADA, age, sex, number of positive autoantibodies, presence of HLA DR3/4-DQ8 genotype, HbA1c, and oral glucose tolerance test (OGTT) measurements were all significantly associated with progression to diabetes. Subjects who were ECL positive had a risk of progression to diabetes within 6 years of 58% compared with 5% for the ECL-negative subjects (P < 0.0001). Multivariate Cox proportional hazards models were compared, with the base model including age, sex, OGTT measurements, and number of positive autoantibodies by RIAs. The model with positivity for ECL-GADA and/or ECL-IAA was the best, and factors that remained significantly associated with time to diabetes were area under the curve (AUC) C-peptide, fasting C-peptide, AUC glucose, number of positive autoantibodies by RIAs, and ECL positivity. Adding ECL to the Diabetes Prevention Trial risk score (DPTRS) improved the receiver operating characteristic curves with AUC of 0.83 (P < 0.0001). ECL assays improved the ability to predict time to diabetes in these autoantibody-positive relatives at risk for developing diabetes. These findings might be helpful in the design and eligibility criteria for prevention trials in the future. © 2016 by the American Diabetes Association.

  2. Incorporating Single-nucleotide Polymorphisms Into the Lyman Model to Improve Prediction of Radiation Pneumonitis

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, Susan L., E-mail: sltucker@mdanderson.org [Department of Bioinformatics and Computational Biology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Li Minghuan [Department of Radiation Oncology, Shandong Cancer Hospital, Jinan, Shandong (China); Xu Ting; Gomez, Daniel [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Yuan Xianglin [Department of Oncology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan (China); Yu Jinming [Department of Radiation Oncology, Shandong Cancer Hospital, Jinan, Shandong (China); Liu Zhensheng; Yin Ming; Guan Xiaoxiang; Wang Lie; Wei Qingyi [Department of Epidemiology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Mohan, Radhe [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Vinogradskiy, Yevgeniy [University of Colorado School of Medicine, Aurora, Colorado (United States); Martel, Mary [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liao Zhongxing [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States)

    2013-01-01

    Purpose: To determine whether single-nucleotide polymorphisms (SNPs) in genes associated with DNA repair, cell cycle, transforming growth factor-β, tumor necrosis factor and receptor, folic acid metabolism, and angiogenesis can significantly improve the fit of the Lyman-Kutcher-Burman (LKB) normal-tissue complication probability (NTCP) model of radiation pneumonitis (RP) risk among patients with non-small cell lung cancer (NSCLC). Methods and Materials: Sixteen SNPs from 10 different genes (XRCC1, XRCC3, APEX1, MDM2, TGFβ, TNFα, TNFR, MTHFR, MTRR, and VEGF) were genotyped in 141 NSCLC patients treated with definitive radiation therapy, with or without chemotherapy. The LKB model was used to estimate the risk of severe (grade ≥3) RP as a function of mean lung dose (MLD), with SNPs and patient smoking status incorporated into the model as dose-modifying factors. Multivariate analyses were performed by adding significant factors to the MLD model in a forward stepwise procedure, with significance assessed using the likelihood-ratio test. Bootstrap analyses were used to assess the reproducibility of results under variations in the data. Results: Five SNPs were selected for inclusion in the multivariate NTCP model based on MLD alone. SNPs associated with an increased risk of severe RP were in genes for TGFβ, VEGF, TNFα, XRCC1 and APEX1. With smoking status included in the multivariate model, the SNPs significantly associated with increased risk of RP were in genes for TGFβ, VEGF, and XRCC3. Bootstrap analyses selected a median of 4 SNPs per model fit, with the 6 genes listed above selected most often. Conclusions: This study provides evidence that SNPs can significantly improve the predictive ability of the Lyman MLD model. With a small number of SNPs, it was possible to distinguish cohorts with >50% risk vs <10% risk of RP when they were exposed to high MLDs.
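
    The modeling approach above, risk factors entering the Lyman MLD model as dose-modifying factors, can be sketched as follows. This is a minimal illustration under assumed parameter values (TD50, m, and the DMF magnitude are hypothetical, not the study's fitted values).

```python
# Minimal sketch of the Lyman NTCP model with dose-modifying factors (DMFs):
# NTCP = Phi((D_eff - TD50) / (m * TD50)), with D_eff = MLD * prod(DMFs).
# Parameter values here are illustrative, not fitted values from the study.
from math import erf, sqrt

def lyman_ntcp(mld_gy, td50_gy=31.4, m=0.45, dmfs=()):
    """Probit dose-response on mean lung dose; each DMF scales the dose."""
    d_eff = mld_gy
    for f in dmfs:
        d_eff *= f
    t = (d_eff - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))      # standard normal CDF

baseline = lyman_ntcp(20.0)
# Carrying a hypothetical risk SNP acts as a DMF > 1, raising effective dose.
with_snp = lyman_ntcp(20.0, dmfs=(1.3,))
```

    In the study, the DMF for each significant SNP (and for smoking status) was estimated by maximum likelihood in a forward stepwise procedure.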

  3. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    International Nuclear Information System (INIS)

    Hong, Mei; Wang, Dong; Wang, Yuankun; Zeng, Xiankui; Ge, Shanshan; Yan, Hengqian; Singh, Vijay P.

    2016-01-01

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs to be improved. Using the genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, chaotic characteristics of the model are also analyzed, which shows that the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional ‘‘wet years and dry years predictability barrier,’’ to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking about the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated into the model.
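
    The phase-space reconstruction underlying this family of models is a delay embedding of the scalar runoff series. The sketch below shows only that embedding step, with hypothetical flows; the genetic-algorithm tuning of the embedding parameters and the extra inputs (temperature, rainfall) described above are omitted.

```python
# Delay (phase-space) reconstruction of a scalar series:
# x_t = (q_t, q_{t-tau}, ..., q_{t-(m-1)*tau}) for embedding dimension m
# and delay tau. Forecasting then works on these state vectors.

def delay_embed(series, m, tau):
    """Return the m-dimensional delay vectors of a scalar series."""
    start = (m - 1) * tau
    return [[series[t - k * tau] for k in range(m)]
            for t in range(start, len(series))]

runoff = [3.1, 2.8, 4.0, 5.2, 4.7, 3.9, 3.3, 4.4]  # hypothetical monthly flows
vectors = delay_embed(runoff, m=3, tau=2)
# Each vector pairs the current month with its lags of 2 and 4 months.
```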

  4. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Mei [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Wang, Dong, E-mail: wangdong@nju.edu.cn [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Wang, Yuankun; Zeng, Xiankui [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Ge, Shanshan; Yan, Hengqian [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Singh, Vijay P. [Department of Biological and Agricultural Engineering Zachry Department of Civil Engineering, Texas A & M University, College Station, TX 77843 (United States)

    2016-07-15

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs to be improved. Using the genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, chaotic characteristics of the model are also analyzed, which shows that the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional ‘‘wet years and dry years predictability barrier,’’ to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking about the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated into the model.

  5. Missing Value Imputation Improves Mortality Risk Prediction Following Cardiac Surgery: An Investigation of an Australian Patient Cohort.

    Science.gov (United States)

    Karim, Md Nazmul; Reid, Christopher M; Tran, Lavinia; Cochrane, Andrew; Billah, Baki

    2017-03-01

    The aim of this study was to evaluate the impact of missing values on the performance of a model predicting 30-day mortality following cardiac surgery. Information from 83,309 eligible patients, who underwent cardiac surgery, recorded in the Australia and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) database registry between 2001 and 2014, was used. An existing 30-day mortality risk prediction model developed from the ANZSCTS database was re-estimated using complete case (CC) analysis and using multiple imputation (MI) analysis. Agreement between the risks generated by the CC and MI approaches was assessed by the Bland-Altman method. The performances of the two models were compared. One or more missing predictor variables were present in 15.8% of the patients in the dataset. The Bland-Altman plot demonstrated significant disagreement between the mortality risks generated by the CC and MI approaches. Compared to CC analysis, MI analysis resulted in an average 8.5% decrease in standard error, a measure of uncertainty. The MI model provided better prediction of mortality risk (observed: 2.69%; MI: 2.63% versus CC: 2.37%). Imputation of missing values thus improved 30-day mortality risk prediction following cardiac surgery. Copyright © 2016 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
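
    The Bland-Altman agreement check used above compares paired risk estimates via the mean difference (bias) and 1.96 SD limits of agreement. A minimal sketch, with hypothetical CC and MI risk estimates rather than study data:

```python
# Sketch of Bland-Altman limits of agreement for paired risk estimates.
from statistics import mean, stdev

def bland_altman(a, b):
    """Return (bias, (lower limit, upper limit)) for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)                 # 95% limits of agreement
    return bias, (bias - spread, bias + spread)

cc_risk = [0.021, 0.034, 0.050, 0.012, 0.080]   # hypothetical CC estimates
mi_risk = [0.024, 0.031, 0.058, 0.015, 0.095]   # hypothetical MI estimates
bias, (lo, hi) = bland_altman(cc_risk, mi_risk)
```

    In the full plot, each pairwise difference is charted against the pairwise mean, so disagreement that grows with predicted risk is visible directly.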

  6. Improved pump turbine transient behaviour prediction using a Thoma number-dependent hillchart model

    International Nuclear Information System (INIS)

    Manderla, M; Koutnik, J; Kiniger, K

    2014-01-01

    Water hammer phenomena are important issues for high head hydro power plants. Especially, if several reversible pump-turbines are connected to the same waterways there may be strong interactions between the hydraulic machines. The prediction and coverage of all relevant load cases is challenging and difficult using classical simulation models. On the basis of a recent pump-storage project, dynamic measurements motivate an improved modeling approach making use of the Thoma number dependency of the actual turbine behaviour. The proposed approach is validated for several transient scenarios and turns out to increase correlation between measurement and simulation results significantly. By applying a fully automated simulation procedure broad operating ranges can be covered which provides a consistent insight into critical load case scenarios. This finally allows the optimization of the closing strategy and hence the overall power plant performance

  7. Improved pump turbine transient behaviour prediction using a Thoma number-dependent hillchart model

    Science.gov (United States)

    Manderla, M.; Kiniger, K.; Koutnik, J.

    2014-03-01

    Water hammer phenomena are important issues for high head hydro power plants. Especially, if several reversible pump-turbines are connected to the same waterways there may be strong interactions between the hydraulic machines. The prediction and coverage of all relevant load cases is challenging and difficult using classical simulation models. On the basis of a recent pump-storage project, dynamic measurements motivate an improved modeling approach making use of the Thoma number dependency of the actual turbine behaviour. The proposed approach is validated for several transient scenarios and turns out to increase correlation between measurement and simulation results significantly. By applying a fully automated simulation procedure broad operating ranges can be covered which provides a consistent insight into critical load case scenarios. This finally allows the optimization of the closing strategy and hence the overall power plant performance.

  8. Unified Health Gamification can significantly improve well-being in corporate environments.

    Science.gov (United States)

    Shahrestani, Arash; Van Gorp, Pieter; Le Blanc, Pascale; Greidanus, Fabrizio; de Groot, Kristel; Leermakers, Jelle

    2017-07-01

    There is a multitude of mHealth applications that aim to solve societal health problems by stimulating specific types of physical activity via gamification. However, physical health activities cover just one of the three World Health Organization (WHO) dimensions of health. This paper introduces the novel notion of Unified Health Gamification (UHG), which covers not only physical health but also social and cognitive health and well-being. Instead of rewarding activities in the three WHO dimensions through different mHealth competitions, UHG combines the scores for such activities on unified leaderboards and lets people interact in social circles beyond personal interests. This approach is promising in corporate environments, since UHG can connect employees with intrinsic motivation for physical health with those who have quite different interests. In order to evaluate this approach, we built an app prototype and evaluated it in two corporate pilot studies. In total, eighteen pilot users participated voluntarily for six weeks. Half of the participants were recruited from an occupational health setting and the other half from a treatment setting. Our results suggest that the UHG principles merit further investigation: various positive health effects were found based on a validated survey. Mean mental health improved significantly at one pilot location, and at the level of individual pilot participants multiple other effects were found to be significant: among others, significant mental health improvements were found for 28% of the participants. Most participants intended to use the app beyond the pilot, especially if it were further developed.

  9. Improving the Accuracy of Predicting Maximal Oxygen Consumption (VO2pk)

    Science.gov (United States)

    Downs, Meghan E.; Lee, Stuart M. C.; Ploutz-Snyder, Lori; Feiveson, Alan

    2016-01-01

    Maximal oxygen consumption (VO2pk) is the maximum amount of oxygen that the body can use during intense exercise and is used for benchmarking endurance exercise capacity. The most accurate method to determine VO2pk requires continuous measurements of ventilation and gas exchange during an exercise test to maximal effort, which necessitates expensive equipment, a trained staff, and time to set up the equipment. For astronauts, accurate VO2pk measures are important to assess mission-critical task performance capabilities and to prescribe exercise intensities to optimize performance. Currently, astronauts perform submaximal exercise tests during flight to predict VO2pk; however, while submaximal VO2pk prediction equations provide reliable estimates of mean VO2pk for populations, they can be unacceptably inaccurate for a given individual. The error in current predictions and the logistical limitations of measuring VO2pk, particularly during spaceflight, highlight the need for improved estimation methods.

  10. Predictive values of early rest/24 hour delay Tl-201 perfusion SPECT for wall motion improvement in patients with acute myocardial infarction after reperfusion

    International Nuclear Information System (INIS)

    Hyun, In Young; Kwan, June

    1998-01-01

    We studied early rest/24 hour delay Tl-201 perfusion SPECT for the prediction of wall motion improvement after reperfusion in patients with acute myocardial infarction. Among 17 patients (male/female = 11/6, age: 59±13) with acute myocardial infarction, 15 patients were treated with percutaneous transluminal coronary angioplasty (direct: 2, delayed: 11) or intravenous urokinase (2). Spontaneous resolution occurred in the infarct-related arteries of 2 patients. We confirmed TIMI 3 flow of the infarct-related artery after reperfusion in all patients with coronary angiography. We performed rest Tl-201 perfusion SPECT less than 6 hours after reperfusion and delay Tl-201 perfusion SPECT the next day. Tl-201 uptake was visually graded on a 4-point score from normal (0) to severe defect (3). Rest Tl-201 uptake ≤2, or the combination of rest Tl-201 uptake ≤2 and late reversibility, was considered to indicate viability. Myocardial wall motion was graded on a 5-point score from normal (1) to dyskinesia (5). Myocardial wall motion was considered to be improved when a segment showed an improvement ≥1 grade in follow-up echocardiography compared with the baseline values. Among 98 segments with wall motion abnormality, the severity of myocardial wall motion decrease was as follows: mild hypokinesia: 18/98 (18%), severe hypokinesia: 28/98 (29%), akinesia: 51/98 (52%), dyskinesia: 1/98 (1%). Wall motion improved in 85%. Redistribution (13%) and reverse redistribution (4%) were observed in 24 hour delay SPECT. The positive predictive value (PPV) and negative predictive value (NPV) of the combination of late reversibility and rest Tl-201 uptake were 99% and 54%, respectively. The PPV and NPV of rest Tl-201 uptake alone were 100% and 52%, respectively. Predictive values of the combination of rest Tl-201 uptake and late reversibility were not significantly different from those of rest Tl-201 uptake only. We conclude that early Tl-201 perfusion SPECT predicts myocardial wall motion improvement with excellent positive but relatively low negative predictive value.

  11. Improving default risk prediction using Bayesian model uncertainty techniques.

    Science.gov (United States)

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.

  12. Noninvasive work of breathing improves prediction of post-extubation outcome.

    Science.gov (United States)

    Banner, Michael J; Euliano, Neil R; Martin, A Daniel; Al-Rawas, Nawar; Layon, A Joseph; Gabrielli, Andrea

    2012-02-01

    We hypothesized that non-invasively determined work of breathing per minute (WOB(N)/min; esophageal balloon not required) may be useful for predicting extubation outcome, i.e., appropriate work of breathing values may be associated with extubation success, while inappropriately increased values may be associated with failure. Adult candidates for extubation were divided into a training set (n = 38), used to determine threshold values of indices for assessing extubation, and a prospective validation set (n = 59), used to determine the predictive power of those thresholds for patients successfully extubated and those who failed extubation. All were evaluated for extubation during a spontaneous breathing trial (5 cmH2O pressure support ventilation, 5 cmH2O positive end-expiratory pressure) using routine clinical practice standards. WOB(N)/min data were blinded to attending physicians. The area under the receiver operating characteristic curve (AUC), sensitivity, specificity, and positive and negative predictive values of all extubation indices were determined. The AUC for WOB(N)/min was 0.96, significantly greater than that of all other indices. Compared with all other indices, WOB(N)/min had a specificity of 0.83, the highest sensitivity (0.96), positive predictive value (0.84), and negative predictive value (0.96). For 95% of those successfully extubated, WOB(N)/min was ≤10 J/min. WOB(N)/min had the greatest overall predictive accuracy for extubation compared to traditional indices and warrants consideration for use in a complementary manner with spontaneous breathing pattern data for predicting extubation outcome.
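The predictive metrics quoted above all follow from a 2x2 confusion table. A minimal sketch with made-up counts (not the study's data), where "positive" means predicted extubation success:

```python
# Hypothetical illustration of the diagnostic metrics reported for an
# extubation index; the counts below are invented, not the study's data.
def diagnostic_metrics(tp, fp, tn, fn):
    """Return sensitivity, specificity, PPV and NPV for a binary predictor."""
    sensitivity = tp / (tp + fn)   # successes correctly predicted as success
    specificity = tn / (tn + fp)   # failures correctly predicted as failure
    ppv = tp / (tp + fp)           # precision of "will succeed" calls
    npv = tn / (tn + fn)           # precision of "will fail" calls
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = diagnostic_metrics(tp=45, fp=3, tn=9, fn=2)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```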

  13. Frameworks for improvement: clinical audit, the plan-do-study-act cycle and significant event audit.

    Science.gov (United States)

    Gillam, Steve; Siriwardena, A Niroshan

    2013-01-01

    This is the first in a series of articles about quality improvement tools and techniques. We explore common frameworks for improvement, including the model for improvement and its application to clinical audit, plan-do-study-act (PDSA) cycles and significant event analysis (SEA), examining the similarities and differences between these and providing examples of each.

  14. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    Science.gov (United States)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
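The unscented-transform step described above can be sketched as follows; the two-variable damage model and all numbers are illustrative stand-ins, not the paper's solenoid valve model:

```python
import numpy as np

# Sketch of the unscented transform for EOL prediction: generate 2n+1 sigma
# points from the state mean/covariance, run each through the (here: stand-in)
# EOL simulation, and recover the mean and variance of the EOL distribution.

def unscented_transform(mean, cov, f, kappa=1.0):
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)   # matrix square root, scaled
    sigma = [mean] + [mean + L[:, i] for i in range(n)] \
                   + [mean - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])         # one simulation per sigma point
    y_mean = np.dot(w, y)
    y_var = np.dot(w, (y - y_mean) ** 2)
    return y_mean, y_var

# Stand-in "EOL simulation": time until linear damage growth crosses 1.0.
def eol_sim(x):
    damage, rate = x
    return (1.0 - damage) / rate

mean = np.array([0.2, 0.01])        # current damage estimate, growth rate
cov = np.diag([0.001, 1e-6])        # state uncertainty
eol_mean, eol_var = unscented_transform(mean, cov, eol_sim)
```

Only 5 simulations are needed here, versus hundreds for a Monte Carlo estimate of comparable quality, which is the computational saving the abstract refers to.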

  15. RAPID COMMUNICATION: Improving prediction accuracy of GPS satellite clocks with periodic variation behaviour

    Science.gov (United States)

    Heo, Youn Jeong; Cho, Jeongho; Heo, Moon Beom

    2010-07-01

    The broadcast ephemeris and IGS ultra-rapid predicted (IGU-P) products are primarily available for use in real-time GPS applications. The IGU orbit precision has been remarkably improved since late 2007, but its clock products have not shown acceptably high-quality prediction performance. One reason for this fact is that satellite atomic clocks in space can be easily influenced by various factors such as temperature and environment and this leads to complicated aspects like periodic variations, which are not sufficiently described by conventional models. A more reliable prediction model is thus proposed in this paper in order to be utilized particularly in describing the periodic variation behaviour satisfactorily. The proposed prediction model for satellite clocks adds cyclic terms to overcome the periodic effects and adopts delay coordinate embedding, which offers the possibility of accessing linear or nonlinear coupling characteristics like satellite behaviour. The simulation results have shown that the proposed prediction model outperforms the IGU-P solutions at least on a daily basis.
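A hedged sketch of the general idea of adding cyclic terms to a clock model (not the paper's delay-embedding method): fit an offset, a drift and one harmonic at the 12 h GPS revolution period by least squares on synthetic bias data, then extrapolate one day ahead.

```python
import numpy as np

# Synthetic clock bias (ns) over 48 h: trend plus a 12 h periodic variation.
t = np.arange(0, 48, 0.25)                       # hours
period = 12.0
truth = 5.0 + 0.3 * t + 2.0 * np.sin(2 * np.pi * t / period)
obs = truth + np.random.default_rng(0).normal(0, 0.05, t.size)

# Design matrix: offset, drift, and one sine/cosine pair at the 12 h period.
A = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t / period),
                     np.cos(2 * np.pi * t / period)])
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)

# Predict the following day with the fitted coefficients.
t_pred = np.arange(48, 72, 0.25)
A_pred = np.column_stack([np.ones_like(t_pred), t_pred,
                          np.sin(2 * np.pi * t_pred / period),
                          np.cos(2 * np.pi * t_pred / period)])
pred = A_pred @ coef
```

A purely polynomial model fitted to the same data would alias the periodic term into its drift estimate, which is the failure mode the abstract attributes to conventional clock models.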

  16. Specific Components of Pediatricians' Medication-Related Care Predict Attention-Deficit/Hyperactivity Disorder Symptom Improvement.

    Science.gov (United States)

    Epstein, Jeffery N; Kelleher, Kelly J; Baum, Rebecca; Brinkman, William B; Peugh, James; Gardner, William; Lichtenstein, Phil; Langberg, Joshua M

    2017-06-01

    The development of attention-deficit/hyperactivity disorder (ADHD) care quality measurements is a prerequisite to improving the quality of community-based pediatric care of children with ADHD. Unfortunately, the evidence base for existing ADHD care quality metrics is poor. The objective of this study was to identify which components of ADHD care best predict patient outcomes. Parents of 372 medication-naïve children in grades 1 to 5 presenting to their community-based pediatrician (N = 195) for an ADHD-related concern and who were subsequently prescribed ADHD medication were identified. Parents completed the Vanderbilt ADHD Parent Rating Scale (VAPRS) at the time ADHD was raised as a concern and then approximately 12 months after starting ADHD medication. Each patient's chart was reviewed to measure 12 different components of ADHD care. Across all children, the mean decrease in VAPRS total symptom score during the first year of treatment was 11.6 (standard deviation 10.1). Of the 12 components of ADHD care, shorter times to first contact and more teacher ratings collected in the first year of treatment significantly predicted greater decreases in patient total symptom scores. Notably, it was timeliness of contacts, defined as office visits, phone calls, or email communication, that predicted greater ADHD symptom decreases. Office visits alone, in terms of number or timeliness, did not predict patient outcomes. The magnitude of ADHD symptom decrease that can be achieved with the use of ADHD medications was associated with specific components of ADHD care. Future development and modifications of ADHD quality care metrics should include these ADHD care components. Copyright © 2017 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  17. Prediction of FAD binding sites in electron transport proteins according to efficient radial basis function networks and significant amino acid pairs.

    Science.gov (United States)

    Le, Nguyen-Quoc-Khanh; Ou, Yu-Yen

    2016-07-30

    Cellular respiration is a catabolic pathway for producing adenosine triphosphate (ATP) and is the most efficient process through which cells harvest energy from consumed food. When cells undergo cellular respiration, they require a pathway to keep and transfer electrons (i.e., the electron transport chain). Through oxidation-reduction reactions, the electron transport chain produces a transmembrane proton electrochemical gradient. When protons flow back through this membrane, the mechanical energy is converted into chemical energy by ATP synthase. This conversion produces ATP, which supplies energy for many cellular processes. In the electron transport chain, flavin adenine dinucleotide (FAD) is one of the most vital molecules for carrying and transferring electrons. Therefore, predicting FAD binding sites in the electron transport chain is vital for helping biologists understand the electron transport chain process and energy production in cells. We used an independent data set to evaluate the performance of the proposed method, which had an accuracy of 69.84 %. We compared the performance of the proposed method in analyzing two newly discovered electron transport protein sequences with that of the general FAD binding predictor presented by Mishra and Raghava and determined that the accuracy of the proposed method improved by 9-45 % and its Matthew's correlation coefficient was 0.14-0.5. Furthermore, the proposed method enabled reducing the number of false positives significantly and can provide useful information for biologists. We developed a method that is based on PSSM profiles and SAAPs for identifying FAD binding sites in newly discovered electron transport protein sequences. This approach achieved a significant improvement after we added SAAPs to PSSM features to analyze FAD binding proteins in the electron transport chain. The proposed method can serve as an effective tool for predicting FAD binding sites in electron transport proteins.

  18. Determination of significance in Ecological Impact Assessment: Past change, current practice and future improvements

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, Sam; Hudson, Malcolm D., E-mail: mdh@soton.ac.uk

    2013-01-15

    Ecological Impact Assessment (EcIA) is an important tool for conservation and achieving sustainable development. 'Significant' impacts are those which disturb or alter the environment to a measurable degree. Significance is a crucial part of EcIA, and an understanding of how the concept works in practice is vital if EcIA is to be effective as a tool. This study employed three methods to assess how the determination of significance has changed through time, what current practice is, and what would lead to future improvements. Three data streams were collected: interviews with expert stakeholders, a review of 30 Environmental Statements and a broad-scale survey of United Kingdom Institute of Ecology and Environmental Management (IEEM) members. The approach taken in the determination of significance has become more standardised, and subjectivity has been constrained through a transparent framework. This has largely been driven by a set of guidelines produced by IEEM in 2006. The significance of impacts is now more clearly justified and the accuracy with which it is determined has improved. However, there are limits to the accuracy and effectiveness of the determination of significance: the quality of baseline survey data, our scientific understanding of ecological processes, and the lack of monitoring and feedback of results. These in turn are restricted by the limited resources available in consultancies. The most notable recommendations for future practice are the implementation of monitoring and the publication of feedback, the creation of a central database for baseline survey data and the streamlining of guidance. - Highlights: • The assessment of significance has changed markedly through time. • The IEEM guidelines have driven a standardisation of practice. • Currently limited by quality of baseline data and scientific understanding. • Monitoring

  19. Procalcitonin Improves the Glasgow Prognostic Score for Outcome Prediction in Emergency Patients with Cancer: A Cohort Study

    Directory of Open Access Journals (Sweden)

    Anna Christina Rast

    2015-01-01

    The Glasgow Prognostic Score (GPS) is useful for predicting long-term mortality in cancer patients. Our aim was to validate the GPS in ED patients with different cancer-related urgency and to investigate whether biomarkers would improve its accuracy. We followed consecutive medical patients presenting with a cancer-related medical urgency to a tertiary care hospital in Switzerland. Upon admission, we measured procalcitonin (PCT), white blood cell count, urea, 25-hydroxyvitamin D, corrected calcium, C-reactive protein, and albumin and calculated the GPS. Of 341 included patients (median age 68 years, 61% males), 81 (23.8%) died within 30 days after admission. The GPS showed moderate prognostic accuracy (AUC 0.67) for mortality. Among the different biomarkers, PCT provided the highest prognostic accuracy (odds ratio 1.6, 95% confidence interval 1.3 to 1.9, P<0.001; AUC 0.69) and significantly improved the GPS to a combined AUC of 0.74 (P=0.007). Considering all investigated biomarkers, the AUC increased to 0.76 (P<0.001). The GPS performance was significantly improved by the addition of PCT and other biomarkers for risk stratification in ED cancer patients. The benefit of early risk stratification by the GPS in combination with biomarkers from different pathways should be investigated in further interventional trials.

  20. An Improved Prediction Model for the Impact Sound Level of Lightweight Floors: Introducing Decoupled Floor-Ceiling and Beam-Plate Moment

    DEFF Research Database (Denmark)

    Mosharrof, Mohammad Sazzad; Brunskog, Jonas; Ljunggren, Fredrik

    2011-01-01

    …the impact sound pressure level in a receiving room for a coupled floor structure where floor and ceiling are rigidly connected by beams. A theoretical model for predicting the impact sound level for a decoupled floor structure, which has no rigid mechanical connections between the floor and the ceiling, is developed. An analytical method has been implemented, where a spatial Fourier transform method as well as Poisson's sum formula is applied to model transformed plate displacements. Radiated sound power was calculated from these displacements and normalized sound pressure levels were calculated in one… and is found to be dependent on frequency, showing significant improvement in predicting impact sound level in the high-frequency region.

  1. A critical review of predictive models for the onset of significant void in forced-convection subcooled boiling

    International Nuclear Information System (INIS)

    Dorra, H.; Lee, S.C.; Bankoff, S.G.

    1993-06-01

    Predictive models for the onset of significant void (OSV) in forced-convection subcooled boiling are reviewed and compared with extensive data. Three analytical models and seven empirical correlations are considered in this review. These models and correlations are put onto a common basis and compared with a variety of data. Their range of validity and applicability under various operating conditions is discussed. The results show that the Saha-Zuber correlation appears to be the best model for predicting OSV in vertical subcooled boiling flow.
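The Saha-Zuber correlation singled out above is commonly stated as a constant-Nusselt regime at low Péclet number and a constant-Stanton regime at high Péclet number. A sketch with illustrative water-channel inputs:

```python
# Sketch of the Saha-Zuber OSV correlation: below Pe = 70,000 the onset of
# significant void is thermally controlled (Nu = 455, i.e. 0.0022 factor),
# above it hydrodynamically controlled (St = 0.0065, i.e. 154 factor).
# Input values below are illustrative.

def saha_zuber_osv_subcooling(q, d_h, k_f, g, cp):
    """Liquid subcooling (K) at onset of significant void.

    q: heat flux [W/m^2], d_h: hydraulic diameter [m],
    k_f: liquid thermal conductivity [W/m/K],
    g: mass flux [kg/m^2/s], cp: liquid specific heat [J/kg/K].
    """
    peclet = g * d_h * cp / k_f
    if peclet <= 70_000:
        return 0.0022 * q * d_h / k_f     # thermally controlled (Nu = 455)
    return 154.0 * q / (g * cp)           # hydrodynamically controlled (St = 0.0065)

# Example: high-flow water channel, hydrodynamically controlled regime.
dT = saha_zuber_osv_subcooling(q=5e5, d_h=0.01, k_f=0.68, g=2000.0, cp=4200.0)
```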

  2. Improved time series prediction with a new method for selection of model parameters

    International Nuclear Information System (INIS)

    Jade, A M; Jayaraman, V K; Kulkarni, B D

    2006-01-01

    A new method for model selection in prediction of time series is proposed. Apart from the conventional criterion of minimizing RMS error, the method also minimizes the error on the distribution of singularities, evaluated through the local Hoelder estimates and its probability density spectrum. Predictions of two simulated and one real time series have been done using kernel principal component regression (KPCR) and model parameters of KPCR have been selected employing the proposed as well as the conventional method. Results obtained demonstrate that the proposed method takes into account the sharp changes in a time series and improves the generalization capability of the KPCR model for better prediction of the unseen test data. (letter to the editor)

  3. An improved method for predicting brittleness of rocks via well logs in tight oil reservoirs

    Science.gov (United States)

    Wang, Zhenlin; Sun, Ting; Feng, Cheng; Wang, Wei; Han, Chuang

    2018-06-01

    There can be no industrial oil production in tight oil reservoirs until fracturing is undertaken. Under such conditions, the brittleness of the rocks is a very important factor. However, it has so far been difficult to predict. In this paper, the selected study area is the tight oil reservoirs in Lucaogou formation, Permian, Jimusaer sag, Junggar basin. According to the transformation of dynamic and static rock mechanics parameters and the correction of confining pressure, an improved method is proposed for quantitatively predicting the brittleness of rocks via well logs in tight oil reservoirs. First, 19 typical tight oil core samples are selected in the study area. Their static Young’s modulus, static Poisson’s ratio and petrophysical parameters are measured. In addition, the static brittleness indices of four other tight oil cores are measured under different confining pressure conditions. Second, the dynamic Young’s modulus, Poisson’s ratio and brittleness index are calculated using the compressional and shear wave velocity. With combination of the measured and calculated results, the transformation model of dynamic and static brittleness index is built based on the influence of porosity and clay content. The comparison of the predicted brittleness indices and measured results shows that the model has high accuracy. Third, on the basis of the experimental data under different confining pressure conditions, the amplifying factor of brittleness index is proposed to correct for the influence of confining pressure on the brittleness index. Finally, the above improved models are applied to formation evaluation via well logs. Compared with the results before correction, the results of the improved models agree better with the experimental data, which indicates that the improved models have better application effects. The brittleness index prediction method of tight oil reservoirs is improved in this research. It is of great importance in the optimization of
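The dynamic starting point of such a workflow, computing Young's modulus and Poisson's ratio from sonic velocities and density and combining them into a normalized (Rickman-style) brittleness index, can be sketched as follows. The normalization bounds and inputs are illustrative, and the paper's dynamic-to-static transform and confining-pressure correction are not reproduced here.

```python
# Standard isotropic elasticity relations for dynamic moduli from logs.
def dynamic_moduli(vp, vs, rho):
    """vp, vs in m/s, rho in kg/m^3 -> (Young's modulus in GPa, Poisson's ratio)."""
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))
    e = rho * vs**2 * (3 * vp**2 - 4 * vs**2) / (vp**2 - vs**2)
    return e / 1e9, nu

# Rickman-style brittleness: high stiffness and low Poisson's ratio are
# brittle. Normalization bounds are illustrative, not calibrated values.
def brittleness_index(e_gpa, nu, e_min=10.0, e_max=80.0, nu_min=0.15, nu_max=0.40):
    e_term = (e_gpa - e_min) / (e_max - e_min)
    nu_term = (nu_max - nu) / (nu_max - nu_min)
    return 50.0 * (e_term + nu_term)      # average of the two terms, in percent

e, nu = dynamic_moduli(vp=4500.0, vs=2700.0, rho=2550.0)
bi = brittleness_index(e, nu)
```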

  4. Clinical and angiographic predictors of haemodynamically significant angiographic lesions: development and validation of a risk score to predict positive fractional flow reserve.

    Science.gov (United States)

    Sareen, Nishtha; Baber, Usman; Kezbor, Safwan; Sayseng, Sonny; Aquino, Melissa; Mehran, Roxana; Sweeny, Joseph; Barman, Nitin; Kini, Annapoorna; Sharma, Samin K

    2017-04-07

    Coronary revascularisation based upon physiological evaluation of lesions improves clinical outcomes. Angiographic or visual stenosis assessment alone is insufficient in predicting haemodynamic stenosis severity by fractional flow reserve (FFR) and therefore cannot be used to guide revascularisation, particularly in the intermediate lesion subset, for which a predictive scoring system was formulated. Of 1,023 consecutive lesions (883 patients), 314 (31%) were haemodynamically significant. Characteristics associated with FFR ≤0.8 include male gender, higher SYNTAX score, lesions ≥20 mm, stenosis >50%, bifurcation, calcification, absence of tortuosity and smaller reference diameter. A user-friendly integer score was developed with the five variables demonstrating the strongest association. On prospective validation (in 279 distinct lesions), increasing values of the score correlated well with increasing haemodynamic significance (C-statistic 0.85). We identified several clinical and angiographic characteristics and formulated a scoring system to guide the approach to intermediate lesions. This may translate into cost savings. Larger studies with prospective validation are required to confirm our results.
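One common way to turn logistic-regression coefficients into a user-friendly integer score, as described above, is to scale and round them to points and sum the points per lesion. The variables and coefficients below are invented for illustration, not the published score.

```python
# Hypothetical sketch: convert logistic coefficients to integer points
# (smallest coefficient maps to 1 point), then score a lesion by summing
# the points of the characteristics it exhibits.

def to_points(coefs, scale=None):
    """Round log-odds coefficients to integer points."""
    scale = scale or min(abs(c) for c in coefs.values())
    return {name: round(c / scale) for name, c in coefs.items()}

coefs = {                         # illustrative log-odds from a fitted model
    "lesion_len_ge_20mm": 0.9,
    "stenosis_gt_50pct": 1.4,
    "male": 0.5,
    "calcification": 0.6,
    "small_ref_diameter": 1.1,
}
points = to_points(coefs)

lesion = {"lesion_len_ge_20mm": 1, "stenosis_gt_50pct": 1, "male": 0,
          "calcification": 1, "small_ref_diameter": 0}
score = sum(points[k] * v for k, v in lesion.items())
```

Higher point totals then map to a higher probability of FFR ≤0.8, which is what the C-statistic of the validated score measures.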

  5. Using self-similarity compensation for improving inter-layer prediction in scalable 3D holoscopic video coding

    Science.gov (United States)

    Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.

    2013-09-01

    Holoscopic imaging, also known as integral imaging, has been recently attracting the attention of the research community, as a promising glassless 3D technology due to its ability to create a more realistic depth illusion than the current stereoscopic or multiview solutions. However, in order to gradually introduce this technology into the consumer market and to efficiently deliver 3D holoscopic content to end-users, backward compatibility with legacy displays is essential. Consequently, to enable 3D holoscopic content to be delivered and presented on legacy displays, a display scalable 3D holoscopic coding approach is required. Hence, this paper presents a display scalable architecture for 3D holoscopic video coding with a three-layer approach, where each layer represents a different level of display scalability: Layer 0 - a single 2D view; Layer 1 - 3D stereo or multiview; and Layer 2 - the full 3D holoscopic content. In this context, a prediction method is proposed, which combines inter-layer prediction, aiming to exploit the existing redundancy between the multiview and the 3D holoscopic layers, with self-similarity compensated prediction (previously proposed by the authors for non-scalable 3D holoscopic video coding), aiming to exploit the spatial redundancy inherent to the 3D holoscopic enhancement layer. Experimental results show that the proposed combined prediction can improve significantly the rate-distortion performance of scalable 3D holoscopic video coding with respect to the authors' previously proposed solutions, where only inter-layer or only self-similarity prediction is used.

  6. Significant change of predictions related to the future of nuclear power

    International Nuclear Information System (INIS)

    Dumitrache, Ion

    2002-01-01

    During the last two decades of the 20th century, the contribution of nuclear power increased slowly in the world. This trend was mainly determined by the commissioning of new nuclear power plants (NPPs) in non-developed countries, except for Japan and South Korea. Almost all forecasts offered the image of a stagnant nuclear power business. The governments of Sweden, Germany, the Netherlands and Belgium made clear their intention to stop the production of electricity based on fission. Recently, despite the negative effects on nuclear power of the terrorist attacks of September 11, 2001, predictions related to the future of nuclear power have become much more optimistic. The USA, Japan, South Korea and Canada made clear that new NPPs will make a significant contribution to their electricity supply for several decades, even beyond 2020-2030. Moreover, several old NPPs in the USA obtained licences for an additional 20-year period of operation. The analysis indicated that most of the existing NPPs in the USA may raise their maximum power output above the level defined by the initial design. In the European Union the situation is much more complicated. About 35% of electricity is now based on fission. Several countries, like Sweden and Germany, maintain the position of phasing out NPPs as soon as their licensed lifetime is over. Finland decided to build a new power plant. France is very favourable to nuclear power, but does not need more energy. In the UK several very old NPPs will be shut down, and companies like BNFL and British Energy intend to build new NPPs based on Westinghouse or AECL-Canada advanced reactors. Switzerland and Spain are favourable to the future use of nuclear power. In the eastern part of Europe, almost all countries intend to base their electricity production on coal, fission, hydro and gas, with a significant nuclear contribution. The most impressive increases in nuclear power output are expected in Asia; in China, from 2.2 GWe in 1999 to 18.7 GWe in 2020 in the reference case, or 10

  7. Improve accuracy and sensibility in glycan structure prediction by matching glycan isotope abundance

    International Nuclear Information System (INIS)

    Xu Guang; Liu Xin; Liu Qingyan; Zhou Yanhong; Li Jianjun

    2012-01-01

    Highlights: ► A glycan isotope pattern recognition strategy for glycomics. ► A new data preprocessing procedure to detect ion peaks in a given MS spectrum. ► A linear soft-margin SVM classification for isotope pattern recognition. - Abstract: Mass spectrometry (MS) is a powerful technique for the determination of glycan structures and is capable of providing qualitative and quantitative information. Recent developments in computational methods offer an opportunity to use glycan structure databases and de novo algorithms for extracting valuable information from MS or MS/MS data. However, detecting low-intensity peaks that are buried in noisy data sets is still a challenge, and an algorithm for accurate prediction and annotation of glycan structures from MS data is highly desirable. The present study describes a novel algorithm for glycan structure prediction by matching glycan isotope abundance (mGIA), which takes isotope masses, abundances, and spacing into account. We constructed a comprehensive database containing 808 glycan compositions and their corresponding isotope abundances. Unlike most previously reported methods, we took into account not only the m/z values of the peaks but also the logarithmic Euclidean distance between the calculated and detected isotope vectors. A linear classifier, obtained by training the mGIA algorithm with datasets of three different human tissue samples from the Consortium for Functional Glycomics (CFG) in association with a Support Vector Machine (SVM), was proposed to improve the accuracy of automatic glycan structure annotation. In addition, an effective data preprocessing procedure, including baseline subtraction, smoothing, peak centroiding and composition matching for extracting correct isotope profiles from MS data, was incorporated. The algorithm was validated by analyzing mouse kidney MS data from the CFG, resulting in the identification of 6 more glycan compositions than the previous annotation.
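The matching step described above, comparing a detected isotope-abundance vector against a theoretical one via the Euclidean distance of their logarithms, can be sketched as follows (the abundance values and the acceptance threshold are invented):

```python
import math

# Sketch of the core mGIA matching idea: score a candidate glycan composition
# by the log-Euclidean distance between detected and theoretical normalized
# isotope abundances. Values and threshold are illustrative.

def log_euclidean_distance(detected, theoretical):
    return math.sqrt(sum((math.log(d) - math.log(t)) ** 2
                         for d, t in zip(detected, theoretical)))

theoretical = [1.00, 0.55, 0.21, 0.06]   # normalized isotope abundances
detected = [1.00, 0.52, 0.23, 0.05]
dist = log_euclidean_distance(detected, theoretical)
match = dist < 0.3                        # illustrative acceptance threshold
```

In the paper's pipeline this distance is one feature among others fed to the linear SVM classifier rather than a hard cutoff.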

  8. Non-invasive prediction of hemodynamically significant coronary artery stenoses by contrast density difference in coronary CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Hell, Michaela M., E-mail: michaela.hell@uk-erlangen.de [Department of Cardiology, University of Erlangen (Germany); Dey, Damini [Department of Biomedical Sciences, Biomedical Imaging Research Institute, Cedars-Sinai Medical Center, Taper Building, Room A238, 8700 Beverly Boulevard, Los Angeles, CA 90048 (United States); Marwan, Mohamed; Achenbach, Stephan; Schmid, Jasmin; Schuhbaeck, Annika [Department of Cardiology, University of Erlangen (Germany)

    2015-08-15

    Highlights: • Overestimation of coronary lesions by coronary computed tomography angiography and subsequent unnecessary invasive coronary angiography and revascularization is a concern. • Differences in plaque characteristics and contrast density difference between hemodynamically significant and non-significant stenoses, as defined by invasive fractional flow reserve, were assessed. • At a threshold of ≥24%, contrast density difference predicted hemodynamically significant lesions with a specificity of 75%, sensitivity of 33%, PPV of 35% and NPV of 73%. • The determination of contrast density difference required less time than transluminal attenuation gradient measurement. - Abstract: Objectives: Coronary computed tomography angiography (CTA) allows the detection of obstructive coronary artery disease. However, its ability to predict the hemodynamic significance of stenoses is limited. We assessed differences in plaque characteristics and contrast density difference between hemodynamically significant and non-significant stenoses, as defined by invasive fractional flow reserve (FFR). Methods: Lesion characteristics of 59 consecutive patients (72 lesions) in whom invasive FFR was performed in at least one coronary artery with moderate to high-grade stenoses in coronary CTA were evaluated by two experienced readers. Coronary CTA data sets were acquired on a second-generation dual-source CT scanner using retrospectively ECG-gated spiral acquisition or prospectively ECG-triggered axial acquisition mode. Plaque volume and composition (non-calcified, calcified), remodeling index as well as contrast density difference (defined as the percentage decline in luminal CT attenuation/cross-sectional area over the lesion) were assessed using a semi-automatic software tool (Autoplaq). Additionally, the transluminal attenuation gradient (defined as the linear regression coefficient between intraluminal CT attenuation and length from the ostium) was determined
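Taking the abstract's definition literally, the contrast density difference is the percentage decline, across the lesion, of luminal CT attenuation divided by cross-sectional area. A sketch with made-up per-slice values (the Autoplaq implementation may differ in detail):

```python
# Illustrative sketch of the contrast density difference (CDD) as defined in
# the abstract. Per-slice attenuation (HU) and lumen area (mm^2) are invented.

def contrast_density_difference(attenuation_hu, area_mm2):
    density = [hu / a for hu, a in zip(attenuation_hu, area_mm2)]
    proximal = density[0]                 # first slice, proximal to the lesion
    return 100.0 * (proximal - min(density)) / proximal

cdd = contrast_density_difference(
    attenuation_hu=[420, 410, 380, 350, 340],   # per-slice luminal HU
    area_mm2=[10.0, 10.0, 9.5, 9.8, 10.0])      # per-slice lumen area
significant = cdd >= 24                          # abstract's threshold
```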

  9. The accuracy of survival time prediction for patients with glioma is improved by measuring mitotic spindle checkpoint gene expression.

    Directory of Open Access Journals (Sweden)

    Li Bie

    Identification of gene expression changes that improve prediction of survival time across all glioma grades would be clinically useful. Four Affymetrix GeneChip datasets from the literature, containing data from 771 glioma samples representing all WHO grades and eight normal brain samples, were used in an ANOVA model to screen for transcript changes that correlated with grade. Observations were confirmed and extended using qPCR assays on RNA derived from 38 additional glioma samples and eight normal samples for which survival data were available. RNA levels of eight major mitotic spindle assembly checkpoint (SAC) genes (BUB1, BUB1B, BUB3, CENPE, MAD1L1, MAD2L1, CDC20, TTK) significantly correlated with glioma grade, and six also significantly correlated with survival time. In particular, the level of BUB1B expression was highly correlated with survival time (p<0.0001) and significantly outperformed all other measured parameters, including two standards: WHO grade and MIB-1 (Ki-67) labeling index. Measurement of the expression levels of a small set of SAC genes may complement histological grade and other clinical parameters for predicting survival time.

  10. Improvement of injury severity prediction (ISP) of AACN during on-site triage using vehicle deformation pattern for car-to-car (C2C) side impacts.

    Science.gov (United States)

    Pal, Chinmoy; Hirayama, Shigeru; Narahari, Sangolla; Jeyabharath, Manoharan; Prakash, Gopinath; Kulothungan, Vimalathithan; Combest, John

    2018-02-28

    The Advanced Automatic Crash Notification (AACN) system needs to predict injury accurately, to provide appropriate treatment for seriously injured occupants involved in motor vehicle crashes. This study investigates the possibility of improving the accuracy of the AACN system, using vehicle deformation parameters in car-to-car (C2C) side impacts. This study was based on car-to-car (C2C) crash data from NASS-CDS, CY 2004-2014. Variables from Kononen's algorithm (published in 2011) were used to build a "base model" for this study. Two additional variables, intrusion magnitude and max deformation location, are added to Kononen's algorithm variables (age, belt usage, number of events, and delta-v) to build a "proposed model." This proposed model operates in two stages: In the first stage, the AACN system uses Kononen's variables and predicts injury severity, based on which emergency medical services (EMS) is dispatched; in the second stage, the EMS team conveys deformation-related information, for accurate prediction of serious injury. Logistic regression analysis reveals that the vehicle deformation location and intrusion magnitude are significant parameters in predicting the level of injury. The percentage of serious injury decreases as the deformation location shifts away from the driver sitting position. The proposed model can improve the sensitivity (serious injury correctly predicted as serious) from 50% to 63%, and overall prediction accuracy increased from 83.5% to 85.9%. The proposed method can improve the accuracy of injury prediction in side-impact collisions. Similar opportunities exist for other crash modes also.
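The two-stage use of a logistic model described above can be sketched as follows; every coefficient is invented for illustration, since the fitted values are not given in the abstract:

```python
import math

# Hypothetical sketch of the two-stage prediction: a base logistic model on
# AACN variables (age, belt use, number of events, delta-v) for the initial
# dispatch decision, then a refined model adding intrusion magnitude and
# deformation location once EMS reports them. All coefficients are invented.

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def p_serious_base(age, belted, n_events, delta_v_kph):
    z = -6.0 + 0.03 * age - 1.0 * belted + 0.5 * n_events + 0.08 * delta_v_kph
    return logistic(z)

def p_serious_refined(age, belted, n_events, delta_v_kph,
                      intrusion_cm, dist_from_driver_m):
    z = (-6.5 + 0.03 * age - 1.0 * belted + 0.5 * n_events
         + 0.06 * delta_v_kph + 0.05 * intrusion_cm
         - 0.8 * dist_from_driver_m)   # risk falls away from the seating position
    return logistic(z)

p1 = p_serious_base(age=45, belted=1, n_events=1, delta_v_kph=40)
p2 = p_serious_refined(age=45, belted=1, n_events=1, delta_v_kph=40,
                       intrusion_cm=25, dist_from_driver_m=0.2)
```

The negative sign on the deformation-location term encodes the abstract's finding that the percentage of serious injury decreases as the deformation shifts away from the driver's seating position.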

  11. Improved prediction of reservoir behavior through integration of quantitative geological and petrophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Auman, J. B.; Davies, D. K.; Vessell, R. K.

    1997-08-01

    Methodology that promises improved reservoir characterization and prediction of permeability, production and injection behavior during primary and enhanced recovery operations was demonstrated. The method is based on identifying intervals of unique pore geometry by a combination of image analysis techniques and traditional petrophysical measurements to calculate rock type and estimate permeability and saturation. Results from a complex carbonate and sandstone reservoir were presented as illustrative examples of the versatility and high level of accuracy of this method in predicting reservoir quality. 16 refs., 5 tabs., 14 figs.

  12. Carotid endarterectomy significantly improves postoperative laryngeal sensitivity.

    Science.gov (United States)

    Hammer, Georg Philipp; Tomazic, Peter Valentin; Vasicek, Sarah; Graupp, Matthias; Gugatschka, Markus; Baumann, Anneliese; Konstantiniuk, Peter; Koter, Stephan Herwig

    2016-11-01

    sensory threshold on the operated-on side (6.08 ± 2.02 mm Hg) decreased significantly at the 6-week follow-up, even in relation to the preoperative measure (P = .022). With the exception of one patient with permanent unilateral vocal fold immobility, no signs of nerve injury were detected. In accordance with previous reports, injuries to the recurrent laryngeal nerve during CEA seem to be rare. In most patients, postoperative symptoms (globus, dysphagia, dysphonia) and signs fade within a few weeks without any specific therapeutic intervention. This study shows an improved long-term postoperative superior laryngeal nerve function with regard to laryngopharyngeal sensitivity. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Improving protein fold recognition and structural class prediction accuracies using physicochemical properties of amino acids.

    Science.gov (United States)

    Raicar, Gaurav; Saini, Harsh; Dehzangi, Abdollah; Lal, Sunil; Sharma, Alok

    2016-08-07

    Predicting the three-dimensional (3-D) structure of a protein is an important task in the fields of bioinformatics and the biological sciences. However, directly predicting the 3-D structure from the primary structure is hard to achieve. Therefore, predicting the fold or structural class of a protein sequence is generally used as an intermediate step in determining the protein's 3-D structure. For protein fold recognition (PFR) and structural class prediction (SCP), two steps are required: a feature extraction step and a classification step. Feature extraction techniques generally utilize syntactical-based information, evolutionary-based information and physicochemical-based information to extract features. In this study, we explore the importance of utilizing the physicochemical properties of amino acids for improving PFR and SCP accuracies. For this, we propose a Forward Consecutive Search (FCS) scheme which aims to strategically select physicochemical attributes that will supplement the existing feature extraction techniques for PFR and SCP. An exhaustive search is conducted on all 544 existing physicochemical attributes using the proposed FCS scheme, and a subset of physicochemical attributes is identified. Features extracted from these selected attributes are then combined with existing syntactical-based and evolutionary-based features to show an improvement in recognition and prediction performance on benchmark datasets. Copyright © 2016 Elsevier Ltd. All rights reserved.
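
    The abstract does not spell out the FCS search procedure, so the following is a generic greedy forward selection over candidate attribute groups, offered only as a stand-in for the flavor of such a search; the data and attribute groups are synthetic.

```python
# Greedy forward selection: repeatedly add the attribute group that most
# improves cross-validated accuracy, stopping when no group helps.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
n = 300
y = rng.integers(0, 2, n)
# Three candidate "physicochemical attribute" feature groups; group 0 carries signal
groups = {
    0: y[:, None] + rng.normal(0, 0.5, (n, 2)),  # correlated with class
    1: rng.normal(0, 1, (n, 2)),                 # noise
    2: rng.normal(0, 1, (n, 2)),                 # noise
}

selected, best_score = [], 0.0
while True:
    candidates = [g for g in groups if g not in selected]
    if not candidates:
        break
    # Score each remaining group when added to the current selection
    scores = {}
    for g in candidates:
        X = np.hstack([groups[k] for k in selected + [g]])
        scores[g] = cross_val_score(GaussianNB(), X, y, cv=5).mean()
    g_best = max(scores, key=scores.get)
    if scores[g_best] <= best_score:  # stop when no group improves CV accuracy
        break
    selected.append(g_best)
    best_score = scores[g_best]

print("selected groups:", selected, "CV accuracy: %.3f" % best_score)
```

    With this synthetic setup the informative group is chosen first and the noise groups are rejected, which is the behavior a supplement-selection scheme like FCS is after.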

  14. Combining clinical variables to optimize prediction of antidepressant treatment outcomes.

    Science.gov (United States)

    Iniesta, Raquel; Malki, Karim; Maier, Wolfgang; Rietschel, Marcella; Mors, Ole; Hauser, Joanna; Henigsberg, Neven; Dernovsek, Mojca Zvezdana; Souery, Daniel; Stahl, Daniel; Dobson, Richard; Aitchison, Katherine J; Farmer, Anne; Lewis, Cathryn M; McGuffin, Peter; Uher, Rudolf

    2016-07-01

    The outcome of treatment with antidepressants varies markedly across people with the same diagnosis. A clinically significant prediction of outcomes could spare the frustration of a trial-and-error approach and improve the outcomes of major depressive disorder through individualized treatment selection. It is likely that a combination of multiple predictors is needed to achieve such prediction. We used elastic net regularized regression to optimize prediction of symptom improvement and remission during treatment with escitalopram or nortriptyline and to identify contributing predictors from a range of demographic and clinical variables in 793 adults with major depressive disorder. A combination of demographic and clinical variables, with strong contributions from symptoms of depressed mood, reduced interest, decreased activity, indecisiveness, pessimism and anxiety, significantly predicted treatment outcomes, explaining 5-10% of variance in symptom improvement with escitalopram. Similar combinations of variables predicted remission with an area under the curve of 0.72, explaining approximately 15% of variance (pseudo R²) in who achieves remission, with strong contributions from body mass index, appetite, interest-activity symptom dimension and anxious-somatizing depression subtype. Escitalopram-specific outcome prediction was more accurate than generic outcome prediction, and reached effect sizes that were near or above a previously established benchmark for clinical significance. Outcome prediction on the nortriptyline arm did not significantly differ from chance. These results suggest that easily obtained demographic and clinical variables can predict therapeutic response to escitalopram with clinically meaningful accuracy, suggesting a potential for individualized prescription of this antidepressant drug. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
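
    A minimal sketch of elastic net regularized regression of the kind used to combine many clinical predictors follows. The predictors, patients, and outcome score are synthetic, not the study's 793-patient cohort.

```python
# Elastic net with cross-validated penalty selection on synthetic clinical data.
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(2)
n, p = 400, 30                     # patients x candidate clinical variables
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:5] = [0.5, -0.4, 0.3, 0.3, -0.2]  # only a few variables carry signal
y = X @ true_beta + rng.normal(0, 1.0, n)    # symptom-improvement score

# ElasticNetCV tunes both the L1/L2 mixing ratio and the penalty strength by CV
model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(X, y)
r2 = model.score(X, y)
n_used = int(np.sum(model.coef_ != 0))
print(f"in-sample R^2: {r2:.2f}, predictors retained: {n_used}/{p}")
```

    The L1 component drives irrelevant coefficients to exactly zero, which is why the method both predicts and identifies the contributing variables, as in the abstract.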

  15. An integrated PRA module for fast determination of risk significance and improvement effectiveness

    International Nuclear Information System (INIS)

    Chao, Chun-Chang; Lin, Jyh-Der

    2004-01-01

    With the widespread use of PRA technology in risk-informed applications, predicting changes in CDF and LERF has become a standard process. This paper describes an integrated PRA module prepared for risk-informed applications. The module contains a super risk engine, a super fault tree engine, an advanced PRA model and a tool for database maintenance. Each element of the module also works well for purposes other than risk-informed applications. The module has been verified and validated through a series of rigorous benchmark tests against similar software. The results of the benchmark tests showed that the module has remarkable accuracy and speed, even for an extremely large top-logic fault tree and for cases in which a large number of minimal cut sets (MCSs) may be generated. The risk monitor for nuclear power plants in Taiwan is the first application to adopt the module. The results predicted by the risk monitor are now accepted by the regulatory agency. A tool to determine risk significance according to inspection findings will be the next application to adopt the module in the near future. This tool classifies risk significance into four color codes according to the level of increase in CDF. Application experience shows that the flexibility, accuracy and speed of the module make it useful in any risk-informed application in which risk indexes must be determined by solving a PRA model. (author)
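
    The four-color classification by increase in CDF can be sketched as a simple threshold mapping. The thresholds below are illustrative placeholders only; the abstract does not give the regulator's actual values.

```python
# Illustrative mapping of an increase in core damage frequency (delta-CDF,
# per reactor-year) to a four-color risk-significance code.
def risk_color(delta_cdf_per_year: float) -> str:
    """Classify a CDF increase into one of four significance colors
    using hypothetical decade thresholds."""
    if delta_cdf_per_year < 1e-6:
        return "green"
    elif delta_cdf_per_year < 1e-5:
        return "white"
    elif delta_cdf_per_year < 1e-4:
        return "yellow"
    return "red"

for d in (5e-7, 3e-6, 4e-5, 2e-4):
    print(f"dCDF = {d:.0e} -> {risk_color(d)}")
```

    In practice the delta-CDF input would come from re-solving the PRA model with the inspection finding reflected, which is where the module's speed matters.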

  16. Better estimation of protein-DNA interaction parameters improves prediction of functional sites

    Directory of Open Access Journals (Sweden)

    O'Flanagan Ruadhan A

    2008-12-01

    Full Text Available Abstract Background Characterizing transcription factor binding motifs is a common bioinformatics task. For transcription factors with variable binding sites, many suboptimal binding sites are needed in the training dataset to obtain accurate estimates of the free energy penalties for deviating from the consensus DNA sequence. One procedure to do that involves a modified SELEX (Systematic Evolution of Ligands by Exponential Enrichment) method designed to produce many such sequences. Results We analyzed low stringency SELEX data for E. coli Catabolic Activator Protein (CAP), and we show here that appropriate quantitative analysis improves our ability to predict in vitro affinity. To obtain the large number of sequences required for this analysis we used the SELEX SAGE protocol developed by Roulet et al. The resulting sequences were subjected to bioinformatic analysis. The resulting bioinformatic model characterizes the sequence specificity of the protein more accurately than models estimated in previous analyses from the few known binding sites available in the literature. The consequences of this increase in accuracy for the prediction of in vivo binding sites (and especially functional ones) in the E. coli genome are also discussed. We measured the dissociation constants of several putative CAP binding sites by EMSA (Electrophoretic Mobility Shift Assay) and compared the affinities to the bioinformatics scores provided by methods such as the weight matrix method and QPMEME (Quadratic Programming Method of Energy Matrix Estimation) trained on known binding sites as well as on the new sites from SELEX SAGE data. We also checked predicted genome sites for conservation in the related species S. typhimurium. We found that bioinformatics scores based on SELEX SAGE data do better in terms of predicting physical binding energies as well as in detecting functional sites. Conclusion We think that training binding site detection
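
    The weight matrix method mentioned above scores a candidate site by summing a per-position, per-base log-odds contribution over a window. The matrix below is a toy example, not the CAP energy matrix estimated in the study.

```python
# Position-weight-matrix scoring: sum log-odds contributions over a window,
# then scan a sequence for high-scoring candidate binding sites.
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}
# Toy 4-position log-odds matrix (rows: A, C, G, T); consensus is ACGA
pwm = np.array([
    [ 1.0, -1.0, -1.0,  1.0],
    [-1.0,  1.0, -1.0, -1.0],
    [-1.0, -1.0,  1.0, -1.0],
    [-1.0, -1.0, -1.0, -1.0],
])

def score_site(site: str) -> float:
    """Sum the log-odds contribution of each base at its position."""
    return sum(pwm[BASES[b], i] for i, b in enumerate(site))

def scan(seq: str, width: int = 4):
    """Score every window of the sequence; high scores flag candidate sites."""
    return [(i, score_site(seq[i:i + width]))
            for i in range(len(seq) - width + 1)]

hits = scan("TTACGATT")
best = max(hits, key=lambda t: t[1])
print("best window:", best)   # the ACGA window at offset 2 scores highest
```

    Training on many suboptimal SELEX sequences, as the abstract argues, is what turns such matrix entries into meaningful free-energy penalties rather than rough consensus counts.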

  17. Improved methods of online monitoring and prediction in condensate and feed water system of nuclear power plant

    International Nuclear Information System (INIS)

    Wang, Hang; Peng, Min-jun; Wu, Peng; Cheng, Shou-yu

    2016-01-01

    Highlights: • Different methods for online monitoring and diagnosis are summarized. • Numerical simulation modeling of the condensate and feed water system in a nuclear power plant is done in FORTRAN. • Integrated online monitoring and prediction methods have been developed and tested. • The online monitoring module, fault diagnosis module and trend prediction module can be verified against each other. - Abstract: Faults or accidents may occur in a nuclear power plant (NPP), but it is hard for operators to recognize the situation and take effective measures quickly, so online monitoring, diagnosis and prediction (OMDP) is used to provide sufficient information to operators and improve the safety of NPPs. In this paper, distributed conservation equations (DCE) and an artificial immunity system (AIS) are proposed for online monitoring and diagnosis. On this basis, quantitative simulation models and an interactive database are combined to predict the trends and severity of faults. The effectiveness of OMDP in improving the monitoring and prediction of the condensate and feed water system (CFWS) was verified through simulation tests.

  18. Improving Flood Predictions in Data-Scarce Basins

    Science.gov (United States)

    Vimal, Solomon; Zanardo, Stefano; Rafique, Farhat; Hilberts, Arno

    2017-04-01

    Flood modeling methodology at Risk Management Solutions Ltd. has evolved over several years with the development of continental scale flood risk models spanning most of Europe, the United States and Japan. Pluvial (rain fed) and fluvial (river fed) flood maps represent the basis for the assessment of regional flood risk. These maps are derived by solving the 1D energy balance equation for river routing and the 2D shallow water equation (SWE) for overland flow. The models are run with high performance computing and GPU based solvers because simulation times are large at continental scale. The results are validated with data from authorities and business partners, and have been used in the insurance industry for many years. While this methodology has proven extremely effective in regions where the quality and availability of data are high, its application is very challenging in other regions where data are scarce. This is generally the case for low and middle income countries, where simpler approaches are needed for flood risk modeling and assessment. In this study we explore new methods to make use of modeling results obtained in data-rich contexts to improve predictive ability in data-scarce contexts. As an example, based on our modeled flood maps in data-rich countries, we identify statistical relationships between flood characteristics and topographic and climatic indicators, and test their generalization across physical domains. Moreover, we apply the Height Above Nearest Drainage (HAND) approach to estimate "probable" saturated areas for different return period flood events as functions of basin characteristics. This work falls into the well-established research field of Predictions in Ungauged Basins.
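
    HAND normalizes a DEM by the elevation of the drainage cell each grid cell drains to. The sketch below is a deliberate simplification: it uses the Euclidean-nearest channel cell rather than tracing flow paths along the drainage network, and the grid and channel mask are synthetic.

```python
# Simplified HAND: height of each DEM cell above its (Euclidean-)nearest
# drainage cell. Real HAND follows flow directions to the draining channel.
import numpy as np

dem = np.array([
    [9.0, 8.0, 7.0],
    [8.0, 5.0, 4.0],
    [7.0, 4.0, 2.0],
])
drainage = np.array([          # 1 marks channel cells
    [0, 0, 0],
    [0, 0, 1],
    [0, 1, 1],
], dtype=bool)

chan_r, chan_c = np.where(drainage)
hand = np.zeros_like(dem)
for r in range(dem.shape[0]):
    for c in range(dem.shape[1]):
        # nearest channel cell by squared Euclidean distance (simplification)
        d2 = (chan_r - r) ** 2 + (chan_c - c) ** 2
        k = int(np.argmin(d2))
        hand[r, c] = dem[r, c] - dem[chan_r[k], chan_c[k]]

print(hand)   # channel cells have HAND = 0; low-HAND cells flood first
```

    Thresholding such a HAND surface gives the "probable" saturated area for a given flood stage, which is why it transfers well to data-scarce basins: it needs only a DEM and a channel network.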

  19. Low-Quality Structural and Interaction Data Improves Binding Affinity Prediction via Random Forest.

    Science.gov (United States)

    Li, Hongjian; Leung, Kwong-Sak; Wong, Man-Hon; Ballester, Pedro J

    2015-06-12

    Docking scoring functions can be used to predict the strength of protein-ligand binding. It is widely believed that training a scoring function with low-quality data is detrimental to its predictive performance. Nevertheless, there is a surprising lack of systematic validation experiments in support of this hypothesis. In this study, we investigated to what extent training a scoring function with low-quality structural and binding data is detrimental to predictive performance. We actually found that low-quality data is not only non-detrimental, but beneficial for the predictive performance of machine-learning scoring functions, though the improvement is smaller than that from high-quality data. Furthermore, we observed that classical scoring functions are not able to effectively exploit data beyond an early threshold, regardless of its quality. This demonstrates that exploiting a larger data volume is more important for the performance of machine-learning scoring functions than restricting training to a smaller set of higher-quality data.
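
    A machine-learning scoring function of the kind studied here is, in essence, a nonparametric regressor from complex descriptors to affinity. The sketch below uses a random forest in that spirit; the descriptor counts and affinities are synthetic stand-ins, not PDBbind-style data.

```python
# Random-forest scoring function: descriptor vector -> binding affinity.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n, d = 500, 36                  # complexes x intermolecular contact descriptors
X = rng.poisson(5, size=(n, d)).astype(float)
# Synthetic pKd-like affinity: a noisy function of a few descriptors
y = 0.3 * X[:, 0] + 0.2 * X[:, 1] - 0.1 * X[:, 2] + rng.normal(0, 0.5, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(f"training R^2: {model.score(X, y):.2f}")
```

    Because trees partition the descriptor space adaptively, adding more (even noisy) training complexes gives the forest more regions to fit, which is consistent with the abstract's finding that data volume matters more than uniform quality for such models.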

  20. Improved predictions of nuclear reaction rates for astrophysics applications with the TALYS reaction code

    International Nuclear Information System (INIS)

    Goriely, S.; Hilaire, S.; Koning, A.J.

    2008-01-01

    Nuclear reaction rates for astrophysics applications are traditionally determined on the basis of Hauser-Feshbach reaction codes, like MOST. These codes use simplified schemes to calculate the capture reaction cross section on a given target nucleus, not only in its ground state but also in the different thermally populated states of the stellar plasma at a given temperature. Such schemes include a number of approximations that have never been tested, such as an approximate width fluctuation correction, the neglect of delayed particle emission during the electromagnetic decay cascade or the absence of the pre-equilibrium contribution at increasing incident energies. New developments have been brought to the reaction code TALYS to estimate the Maxwellian-averaged reaction rates of astrophysics relevance. These new developments make it possible to calculate the reaction cross sections and the corresponding astrophysics rates with improved accuracy. The TALYS predictions for the thermonuclear rates of astrophysics relevance are presented and compared with those obtained with the MOST code on the basis of the same nuclear ingredients for nuclear structure properties, optical model potential, nuclear level densities and γ-ray strength. It is shown that, in particular, the pre-equilibrium process significantly influences the astrophysics rates of exotic neutron-rich nuclei. The reciprocity theorem traditionally used in astrophysics to determine photo-rates is also shown not to be valid for exotic nuclei. The predictions obtained with different nuclear inputs are also analyzed to provide an estimate of the theoretical uncertainties still affecting the reaction rate prediction far away from the experimentally known regions. (authors)

  1. Using isotopes to improve impact and hydrological predictions of land-surface schemes in global climate models

    International Nuclear Information System (INIS)

    McGuffie, K.; Henderson-Sellers, A.

    2002-01-01

    Global climate model (GCM) predictions of the impact of large-scale land-use change date back to 1984, as do the earliest isotopic studies of large-basin hydrology. Despite this coincidence in interest and geography, with both papers focussed on the Amazon, there have been few studies that have tried to exploit isotopic information with the goal of improving climate model simulations of the land-surface. In this paper we analyze isotopic results from the IAEA global data base specifically with the goal of identifying signatures of potential value for improving global and regional climate model simulations of the land-surface. Evaluation of climate model predictions of the impacts of deforestation of the Amazon has been shown to be of significance by recent results which indicate impacts occurring distant from the Amazon, i.e. tele-connections causing climate change elsewhere around the globe. It is suggested that these could be similar in magnitude and extent to the global impacts of ENSO events. Validation of GCM predictions associated with Amazonian deforestation is increasingly urgently required because of the additional effects of other aspects of climate change, particularly synergies occurring between forest removal and greenhouse gas increases, especially CO2. Here we examine three decades of deuterium excess distributions across the Amazon and use the results to evaluate the relative importance of the fractionating (partial evaporation) and non-fractionating (transpiration) processes. These results illuminate GCM scenarios of importance to the regional climate and hydrology: (i) the possible impact of increased stomatal resistance in the rainforest caused by higher levels of atmospheric CO2 [4]; and (ii) the consequences of the combined effects of deforestation and global warming on the region's climate and hydrology
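
    The deuterium excess used above is the standard quantity d = δ²H − 8·δ¹⁸O (in permil); evaporation raises it while transpiration returns water essentially unfractionated, which is what makes it a useful partitioning signature. The sample values below are illustrative, not IAEA data.

```python
# Deuterium excess from the two stable-isotope ratios of a water sample.
def d_excess(delta_2h: float, delta_18o: float) -> float:
    """d = delta2H - 8 * delta18O, both in permil (Dansgaard definition)."""
    return delta_2h - 8.0 * delta_18o

# Illustrative sample: delta2H = -40 permil, delta18O = -6 permil
print(d_excess(-40.0, -6.0))   # -> 8.0
```

    Values well above the global-meteoric ~10 permil in downwind precipitation point to recycled, partially evaporated moisture, the fractionating pathway the study separates from transpiration.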

  2. Integrative approach to pre-operative determination of clinically significant prostate cancer

    Directory of Open Access Journals (Sweden)

    Shatylko T.V.

    2015-09-01

    Full Text Available Aim: improvement of early diagnostics of prostate cancer by developing a technique which makes it possible to predict its clinical significance in an outpatient setting before initiation of invasive procedures. Material and Methods. Clinical data of 398 patients who underwent transrectal prostate biopsy in 2012-2014 in SSMU S. R. Mirotvortsev Clinical Hospital were used to build an artificial neural network whose output determines whether the tumour corresponds to the Epstein criteria and which D'Amico risk group it belongs to. Internal validation was performed on 80 patients who underwent prostate biopsy in September-December 2014. Sensitivity, specificity, positive and negative predictive value of the artificial neural network were calculated. Results. Accuracy of predicting adenocarcinoma presence in the biopsy specimen was 93.75%; accuracy of predicting whether the cancer meets active surveillance criteria was 90%. Accuracy of predicting T stage (T1c, T2a, T2b, T2c) was 57.1%. Prediction of D'Amico risk group was accurate in 70% of cases; for low-risk cancer, accuracy was 81.2%. Conclusion. Artificial neural networks may be used for prostate cancer risk stratification and determination of its clinical significance prior to biopsy.

  3. Neuro-Linguistic Programming: Improving Rapport between Track/Cross Country Coaches and Significant Others

    Science.gov (United States)

    Helm, David Jay

    2017-01-01

    This study examines the background information and the components of N.L.P., namely eye movements, use of predicates, and posturing, as they apply to improving rapport and empathy between track/cross country coaches and their significant others in the arena of competition, to help alleviate the inherent stressors.

  4. Improved prediction of drug-target interactions using regularized least squares integrating with kernel fusion technique

    Energy Technology Data Exchange (ETDEWEB)

    Hao, Ming; Wang, Yanli, E-mail: ywang@ncbi.nlm.nih.gov; Bryant, Stephen H., E-mail: bryant@ncbi.nlm.nih.gov

    2016-02-25

    Identification of drug-target interactions (DTI) is a central task in drug discovery processes. In this work, a simple but effective regularized least squares algorithm integrating nonlinear kernel fusion (RLS-KF) is proposed to perform DTI predictions. Using benchmark DTI datasets, our proposed algorithm achieves state-of-the-art results with area under the precision–recall curve (AUPR) of 0.915, 0.925, 0.853 and 0.909 for enzymes, ion channels (IC), G protein-coupled receptors (GPCR) and nuclear receptors (NR) based on 10-fold cross-validation. The performance can further be improved by using a recalculated kernel matrix, especially for the small set of nuclear receptors with AUPR of 0.945. Importantly, most of the top ranked interaction predictions can be validated by experimental data reported in the literature, bioassay results in the PubChem BioAssay database, as well as other previous studies. Our analysis suggests that the proposed RLS-KF is helpful for studying DTI, drug repositioning as well as polypharmacology, and may help to accelerate drug discovery by identifying novel drug targets. - Graphical abstract: Flowchart of the proposed RLS-KF algorithm for drug-target interaction predictions. - Highlights: • A nonlinear kernel fusion algorithm is proposed to perform drug-target interaction predictions. • Performance can further be improved by using the recalculated kernel. • Top predictions can be validated by experimental data.

  5. Improved prediction of drug-target interactions using regularized least squares integrating with kernel fusion technique

    International Nuclear Information System (INIS)

    Hao, Ming; Wang, Yanli; Bryant, Stephen H.

    2016-01-01

    Identification of drug-target interactions (DTI) is a central task in drug discovery processes. In this work, a simple but effective regularized least squares algorithm integrating nonlinear kernel fusion (RLS-KF) is proposed to perform DTI predictions. Using benchmark DTI datasets, our proposed algorithm achieves state-of-the-art results with area under the precision–recall curve (AUPR) of 0.915, 0.925, 0.853 and 0.909 for enzymes, ion channels (IC), G protein-coupled receptors (GPCR) and nuclear receptors (NR) based on 10-fold cross-validation. The performance can further be improved by using a recalculated kernel matrix, especially for the small set of nuclear receptors with AUPR of 0.945. Importantly, most of the top ranked interaction predictions can be validated by experimental data reported in the literature, bioassay results in the PubChem BioAssay database, as well as other previous studies. Our analysis suggests that the proposed RLS-KF is helpful for studying DTI, drug repositioning as well as polypharmacology, and may help to accelerate drug discovery by identifying novel drug targets. - Graphical abstract: Flowchart of the proposed RLS-KF algorithm for drug-target interaction predictions. - Highlights: • A nonlinear kernel fusion algorithm is proposed to perform drug-target interaction predictions. • Performance can further be improved by using the recalculated kernel. • Top predictions can be validated by experimental data.

  6. Improvement of PM10 prediction in East Asia using inverse modeling

    Science.gov (United States)

    Koo, Youn-Seo; Choi, Dae-Ryun; Kwon, Hi-Yong; Jang, Young-Kee; Han, Jin-Seok

    2015-04-01

    Aerosols from anthropogenic emissions in industrialized regions of China, as well as dust emissions from southern Mongolia and northern China that are transported along the prevailing northwesterly winds, have a large influence on air quality in Korea. The emission inventory in the East Asia region is an important factor in chemical transport modeling (CTM) for PM10 (particulate matter less than 10 μm in aerodynamic diameter) forecasts and air quality management in Korea. Most previous studies showed that predictions of PM10 mass concentration by the CTM were underestimated when compared with observational data. To reduce the discrepancies between observations and CTM predictions, an inverse Bayesian approach with the Comprehensive Air-quality Model with extensions (CAMx) as forward model was applied to obtain optimized a posteriori PM10 emissions in East Asia. The predicted PM10 concentrations with a priori emissions were first compared with observations at monitoring sites in China and Korea for January and August 2008. The comparison showed that PM10 concentrations with a priori PM10 emissions for anthropogenic and dust sources were generally under-predicted. The result from the inverse modeling indicated that anthropogenic PM10 emissions in the industrialized and urbanized areas of China were underestimated, while dust emissions from desert and barren soil in southern Mongolia and northern China were overestimated. A priori PM10 emissions from northeastern China regions including Shenyang, Changchun, and Harbin were underestimated by about 300% (i.e., the ratio of a posteriori to a priori PM10 emissions was a factor of about 3). The predictions of PM10 concentrations with a posteriori emissions showed better agreement with the observations, implying that the inverse modeling minimized the discrepancies in the model predictions by improving PM10 emissions in East Asia.
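
    A heavily simplified stand-in for the inverse step: estimate a single per-region emission scaling factor that best reconciles modeled and observed PM10, then shrink it toward the prior (factor 1) with a scalar Gaussian update. The forward-model linearity, the error variances, and all numbers are illustrative assumptions, not the study's setup.

```python
# Scalar Bayesian inversion sketch: obs ~= s * modeled + noise, prior s0 = 1.
import numpy as np

obs = np.array([120.0, 95.0, 150.0])    # observed PM10 (ug/m3) at 3 sites
modeled = np.array([40.0, 30.0, 55.0])  # CTM predictions with a priori emissions

# Least-squares scaling factor under obs ~= s * modeled
s_ls = np.sum(obs * modeled) / np.sum(modeled ** 2)

# Precision-weighted (Gaussian) update toward the prior scale s0 = 1
s0, var_prior, var_obs = 1.0, 1.0, 25.0   # assumed error variances
prec_prior = 1.0 / var_prior
prec_like = np.sum(modeled ** 2) / var_obs
s_post = (prec_prior * s0 + prec_like * s_ls) / (prec_prior + prec_like)

print(f"least-squares scale: {s_ls:.2f}, posterior scale: {s_post:.2f}")
```

    With these illustrative numbers the posterior scale comes out near 3, i.e. the kind of factor-of-3 a-posteriori-to-a-priori emission ratio the abstract reports for northeastern China.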

  7. Improving prediction of heterodimeric protein complexes using combination with pairwise kernel.

    Science.gov (United States)

    Ruan, Peiying; Hayashida, Morihiro; Akutsu, Tatsuya; Vert, Jean-Philippe

    2018-02-19

    Since many proteins become functional only after they interact with their partner proteins and form protein complexes, it is essential to identify the sets of proteins that form complexes. Therefore, several computational methods have been proposed to predict complexes from the topology and structure of experimental protein-protein interaction (PPI) networks. These methods work well to predict complexes involving at least three proteins, but generally fail at identifying complexes involving only two different proteins, called heterodimeric complexes or heterodimers. There is however an urgent need for efficient methods to predict heterodimers, since the majority of known protein complexes are precisely heterodimers. In this paper, we use three promising kernel functions: the Min kernel and two pairwise kernels, the Metric Learning Pairwise Kernel (MLPK) and the Tensor Product Pairwise Kernel (TPPK). We also consider normalized forms of the Min kernel. We then combine the Min kernel, or its normalized form, with one of the pairwise kernels. We applied kernels based on PPI, domain, phylogenetic profile, and subcellular localization properties to predicting heterodimers. We then evaluate our method by employing C-Support Vector Classification (C-SVC), carrying out 10-fold cross-validation, and calculating the average F-measures. The results suggest that the combination of the normalized Min kernel and MLPK leads to the best F-measure and improves on our previous work, which had been the best existing method so far. We propose new methods to predict heterodimers using a machine learning-based approach. We train a support vector machine (SVM) to discriminate interacting vs non-interacting protein pairs, based on information extracted from PPI, domain, phylogenetic profiles and subcellular localization. We evaluate in detail new kernel functions to encode these data, and report prediction performance that outperforms the state-of-the-art.
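
    The Tensor Product Pairwise Kernel named above lifts a kernel between individual proteins to a kernel between unordered protein pairs via K((a,b),(c,d)) = k(a,c)k(b,d) + k(a,d)k(b,c). The sketch below uses a plain linear base kernel and synthetic feature vectors; the study's Min kernel, its normalizations, and MLPK are not reproduced here.

```python
# TPPK: a pairwise kernel built by symmetrizing the tensor product of a base
# kernel, so the similarity is invariant to the order of proteins in a pair.
import numpy as np

def base_kernel(x, y):
    """Linear base kernel between two protein feature vectors."""
    return float(np.dot(x, y))

def tppk(pair1, pair2):
    """Tensor Product Pairwise Kernel between two unordered protein pairs."""
    a, b = pair1
    c, d = pair2
    return (base_kernel(a, c) * base_kernel(b, d)
            + base_kernel(a, d) * base_kernel(b, c))

rng = np.random.default_rng(4)
p1 = (rng.normal(size=5), rng.normal(size=5))
p2 = (rng.normal(size=5), rng.normal(size=5))

# Symmetric in its two arguments, and invariant to within-pair ordering
assert np.isclose(tppk(p1, p2), tppk(p2, p1))
assert np.isclose(tppk(p1, p2), tppk((p1[1], p1[0]), p2))
print("TPPK(p1, p2) =", round(tppk(p1, p2), 3))
```

    Because the construction preserves positive semidefiniteness of the base kernel, the resulting pair kernel can be plugged directly into an SVM such as C-SVC, which is how it is used in the paper's pipeline.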

  8. Patient-specific metrics of invasiveness reveal significant prognostic benefit of resection in a predictable subset of gliomas.

    Directory of Open Access Journals (Sweden)

    Anne L Baldock

    Full Text Available Malignant gliomas are incurable, primary brain neoplasms noted for their potential to extensively invade brain parenchyma. Current methods of clinical imaging do not elucidate the full extent of brain invasion, making it difficult to predict which, if any, patients are likely to benefit from gross total resection. Our goal was to apply a mathematical modeling approach to estimate the overall tumor invasiveness on a patient-by-patient basis and determine whether gross total resection would improve survival in patients with relatively less invasive gliomas. In 243 patients presenting with contrast-enhancing gliomas, estimates of the relative invasiveness of each patient's tumor, in terms of the ratio of net proliferation rate of the glioma cells to their net dispersal rate, were derived by applying a patient-specific mathematical model to routine pretreatment MR imaging. The effect of varying degrees of extent of resection on overall survival was assessed for cohorts of patients grouped by tumor invasiveness. We demonstrate that patients with more diffuse tumors showed no survival benefit (P = 0.532) from gross total resection over subtotal resection/biopsy, while those with nodular (less diffuse) tumors showed a significant benefit (P = 0.00142), with a striking median survival benefit of over eight months compared to sub-totally resected tumors in the same cohort (an 80% improvement in survival time for GTR, seen only for nodular tumors). These results suggest that our patient-specific, model-based estimates of tumor invasiveness have clinical utility in surgical decision making. Quantification of relative invasiveness assessed from routinely obtained pre-operative imaging provides a practical predictor of the benefit of gross total resection.

  9. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Directory of Open Access Journals (Sweden)

    Michael F Sloma

    2017-11-01

    Full Text Available Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  10. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Science.gov (United States)

    Sloma, Michael F; Mathews, David H

    2017-11-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.
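As a toy illustration of the partition-function idea above (not the RNAstructure algorithm itself), a base pair's probability is the Boltzmann-weighted fraction of structures that contain it. The structures and free energies below are invented for the sketch:

```python
import math

# Toy partition function over an enumerated set of candidate structures.
# Structures and energies (kcal/mol) are hypothetical placeholders.
RT = 0.616  # kcal/mol near 37 degrees C

structures = {
    # name -> (set of base pairs, free energy)
    "A": (frozenset({(1, 8), (2, 7)}), -3.0),
    "B": (frozenset({(1, 8), (3, 6)}), -2.5),
    "C": (frozenset(), 0.0),  # unfolded state
}

# Partition function Z: sum of Boltzmann weights of all structures.
Z = sum(math.exp(-e / RT) for _, e in structures.values())

def pair_probability(pair):
    """P(pair) = sum of weights of structures containing the pair, over Z."""
    w = sum(math.exp(-e / RT) for pairs, e in structures.values() if pair in pairs)
    return w / Z

p18 = pair_probability((1, 8))  # pair present in both low-energy structures
```

Pairs shared by many low-energy structures get probabilities near one, which is why high-probability pairs are more likely to appear in the true structure.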

  11. Improving the Accuracy of a Heliocentric Potential (HCP) Prediction Model for the Aviation Radiation Dose

    Directory of Open Access Journals (Sweden)

    Junga Hwang

    2016-12-01

    Full Text Available The space radiation dose over air routes, including polar routes, should be carefully considered, especially when space weather shows sudden disturbances such as coronal mass ejections (CMEs), flares, and accompanying solar energetic particle events. We recently established a heliocentric potential (HCP) prediction model for real-time operation of the CARI-6 and CARI-6M programs. Specifically, the HCP value is used as a critical input value in the CARI-6/6M programs, which estimate the aviation route dose based on the effective dose rate. The CARI-6/6M approach is the most widely used technique, and the programs can be obtained from the U.S. Federal Aviation Administration (FAA). However, HCP values are posted on the FAA official webpage at a one-month delay, which makes it difficult to obtain real-time information on the aviation route dose. In order to overcome this critical limitation for space weather customers, we developed an HCP prediction model based on sunspot number variations (Hwang et al. 2015). In this paper, we focus on improvements to our HCP prediction model and update it with neutron monitoring data. We found that the most accurate method to derive the HCP value involves (1) real-time daily sunspot assessments, (2) predictions of the daily HCP by our prediction algorithm, and (3) calculations of the resultant daily effective dose rate. We also derived an HCP prediction algorithm using ground neutron counts; with the compensation stemming from the use of ground neutron count data, the HCP prediction model was further improved.
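The three-step pipeline described in the abstract can be sketched as below. The abstract does not give the regression coefficients or the dose model used by CARI-6/6M, so both functions here are hypothetical placeholders that only show how the steps chain together:

```python
# Hypothetical sketch of the three-step HCP pipeline; the coefficients
# and the dose model are placeholders, not the paper's fitted values.

def predict_hcp(sunspot_number, a=400.0, b=5.0):
    """Step 2: map a daily sunspot number to a heliocentric potential (MV)
    via a placeholder linear model."""
    return a + b * sunspot_number

def effective_dose_rate(hcp_mv, altitude_km=11.0):
    """Step 3: placeholder effective dose rate (uSv/h), decreasing with HCP
    (higher solar modulation means fewer cosmic rays at aviation altitudes)."""
    return max(0.0, 8.0 - 0.005 * hcp_mv) * (altitude_km / 11.0)

daily_sunspots = 80                 # step 1: real-time sunspot assessment
hcp = predict_hcp(daily_sunspots)   # step 2
dose = effective_dose_rate(hcp)     # step 3
```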

  12. Managing uncertainty in metabolic network structure and improving predictions using EnsembleFBA.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    2017-03-01

    Full Text Available Genome-scale metabolic network reconstructions (GENREs) are repositories of knowledge about the metabolic processes that occur in an organism. GENREs have been used to discover and interpret metabolic functions, and to engineer novel network structures. A major barrier preventing more widespread use of GENREs, particularly to study non-model organisms, is the extensive time required to produce a high-quality GENRE. Many automated approaches have been developed which reduce this time requirement, but automatically-reconstructed draft GENREs still require curation before useful predictions can be made. We present a novel approach to the analysis of GENREs which improves the predictive capabilities of draft GENREs by representing many alternative network structures, all equally consistent with available data, and generating predictions from this ensemble. This ensemble approach is compatible with many reconstruction methods. We refer to this new approach as Ensemble Flux Balance Analysis (EnsembleFBA). We validate EnsembleFBA by predicting growth and gene essentiality in the model organism Pseudomonas aeruginosa UCBPP-PA14. We demonstrate how EnsembleFBA can be included in a systems biology workflow by predicting essential genes in six Streptococcus species and mapping the essential genes to small molecule ligands from DrugBank. We found that some metabolic subsystems contributed disproportionately to the set of predicted essential reactions in a way that was unique to each Streptococcus species, leading to species-specific outcomes from small molecule interactions. Through our analyses of P. aeruginosa and six Streptococci, we show that ensembles increase the quality of predictions without drastically increasing reconstruction time, thus making GENRE approaches more practical for applications which require predictions for many non-model organisms. All of our functions and accompanying example code are available in an open online repository.
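The core ensemble idea, stripped of the flux-balance machinery, is a vote across alternative reconstructions: each member network makes its own essentiality call, and the ensemble reports the consensus. A minimal sketch with invented gene names and calls:

```python
# Minimal sketch of the ensemble-voting idea (not the EnsembleFBA code):
# each of five alternative reconstructions yields its own essentiality
# call, and the ensemble calls a gene essential if most members agree.

ensemble_calls = {
    # hypothetical gene -> per-member essentiality calls
    "geneA": [True, True, True, False, True],
    "geneB": [False, False, True, False, False],
}

def ensemble_essentiality(calls, threshold=0.5):
    """Call a gene essential when the agreeing fraction exceeds threshold."""
    return {g: sum(v) / len(v) > threshold for g, v in calls.items()}

pred = ensemble_essentiality(ensemble_calls)
```

Averaging over structures that are all consistent with the data is what lets a draft-quality input still produce robust predictions.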

  13. Biomarkers for predicting type 2 diabetes development-Can metabolomics improve on existing biomarkers?

    Directory of Open Access Journals (Sweden)

    Otto Savolainen

    Full Text Available The aim was to determine if metabolomics could be used to build a predictive model for type 2 diabetes (T2D) risk that would improve prediction of T2D over current risk markers. Gas chromatography-tandem mass spectrometry metabolomics was used in a nested case-control study based on a screening sample of 64-year-old Caucasian women (n = 629). Candidate metabolic markers of T2D were identified in plasma obtained at baseline, and the power to predict diabetes was tested in 69 incident cases occurring during 5.5 years of follow-up. The metabolomics results were used as a standalone prediction model and in combination with established T2D predictive biomarkers for building eight T2D prediction models that were compared with each other based on their sensitivity and selectivity for predicting T2D. Established markers of T2D (impaired fasting glucose, impaired glucose tolerance, insulin resistance (HOMA), smoking, serum adiponectin), alone and in combination with metabolomics, had the largest areas under the curve (AUC) (0.794, 95% confidence interval [0.738-0.850], and 0.808 [0.749-0.867], respectively), with the standalone metabolomics model based on nine fasting plasma markers having a lower predictive power (0.657 [0.577-0.736]). Prediction based on non-blood-based measures was 0.638 [0.565-0.711]. Established measures of T2D risk remain the best predictors of T2D risk in this population. Additional markers detected using metabolomics are likely related to these measures, as they did not enhance the overall prediction in a combined model.
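The AUCs compared above have a simple rank interpretation: the AUC equals the probability that a randomly chosen case scores higher than a randomly chosen control (the Mann-Whitney statistic, with ties counted as half). A self-contained sketch with made-up scores:

```python
# AUC computed as the Mann-Whitney probability that a case outscores a
# control; the scores below are invented for illustration.

def auc(case_scores, control_scores):
    wins = 0.0
    for c in case_scores:
        for n in control_scores:
            if c > n:
                wins += 1.0
            elif c == n:
                wins += 0.5   # ties count as half a win
    return wins / (len(case_scores) * len(control_scores))

cases    = [0.9, 0.8, 0.6, 0.4]   # hypothetical risk scores for incident T2D
controls = [0.7, 0.5, 0.3, 0.2]   # hypothetical scores for non-cases
a = auc(cases, controls)
```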

  14. Predicting Intentions of a Familiar Significant Other Beyond the Mirror Neuron System

    Directory of Open Access Journals (Sweden)

    Stephanie Cacioppo

    2017-08-01

    Full Text Available Inferring the intentions of others is one of the most intriguing issues in interpersonal interaction. Theories of embodied cognition and simulation suggest that this mechanism takes place through a direct and automatic matching process between an observed action and past actions. This process occurs via the reactivation of past self-related sensorimotor experiences within the inferior frontoparietal network (including the mirror neuron system, MNS). The working model is that the anticipatory representations of others' behaviors require internal predictive models of actions formed from pre-established, shared representations between the observer and the actor. This model suggests that observers should be better at predicting intentions performed by a familiar actor rather than a stranger. However, little is known about the modulation of the intention brain network as a function of the familiarity between the observer and the actor. Here, we combined functional magnetic resonance imaging (fMRI) with a behavioral intention inference task, in which participants were asked to predict intentions from three types of actors: a familiar actor (their significant other), themselves (another familiar actor), and a non-familiar actor (a stranger). Our results showed that the participants were better at inferring intentions performed by familiar actors than by non-familiar actors, and that this better performance was associated with greater activation within and beyond the inferior frontoparietal network, i.e., in brain areas related to familiarity (e.g., precuneus). In addition, and in line with Hebbian principles of neural modulation, the more the participants reported being cognitively close to their partner, the less the brain areas associated with action self-other comparison (e.g., inferior parietal lobule), attention (e.g., superior parietal lobule), recollection (hippocampus), and pair bond (ventral tegmental area, VTA) were recruited, suggesting that the

  15. Plaque Structural Stress Estimations Improve Prediction of Future Major Adverse Cardiovascular Events After Intracoronary Imaging.

    Science.gov (United States)

    Brown, Adam J; Teng, Zhongzhao; Calvert, Patrick A; Rajani, Nikil K; Hennessy, Orla; Nerlekar, Nitesh; Obaid, Daniel R; Costopoulos, Charis; Huang, Yuan; Hoole, Stephen P; Goddard, Martin; West, Nick E J; Gillard, Jonathan H; Bennett, Martin R

    2016-06-01

    Although plaque rupture is responsible for most myocardial infarctions, few high-risk plaques identified by intracoronary imaging actually result in future major adverse cardiovascular events (MACE). Nonimaging markers of individual plaque behavior are therefore required. Rupture occurs when plaque structural stress (PSS) exceeds material strength. We therefore assessed whether PSS could predict future MACE in high-risk nonculprit lesions identified on virtual-histology intravascular ultrasound. Baseline nonculprit lesion features associated with MACE during long-term follow-up (median: 1115 days) were determined in 170 patients undergoing 3-vessel virtual-histology intravascular ultrasound. MACE was associated with plaque burden ≥70% (hazard ratio: 8.6; 95% confidence interval, 2.5-30.6; P<0.001) and minimal luminal area ≤4 mm² (hazard ratio: 6.6; 95% confidence interval, 2.1-20.1; P=0.036), although absolute event rates for high-risk lesions remained <10%. PSS derived from virtual-histology intravascular ultrasound was subsequently estimated in nonculprit lesions responsible for MACE (n=22) versus matched control lesions (n=22). PSS showed marked heterogeneity across and between similar lesions but was significantly increased in MACE lesions at high-risk regions, including plaque burden ≥70% (13.9±11.5 versus 10.2±4.7; P<0.001) and thin-cap fibroatheroma (14.0±8.9 versus 11.6±4.5; P=0.02). Furthermore, PSS improved the ability of virtual-histology intravascular ultrasound to predict MACE in plaques with plaque burden ≥70% (adjusted log-rank, P=0.003) and minimal luminal area ≤4 mm² (P=0.002). Plaques responsible for MACE had larger superficial calcium inclusions, which acted to increase PSS (P<0.05). Baseline PSS is increased in plaques responsible for MACE and improves the ability of intracoronary imaging to predict events. Biomechanical modeling may complement plaque imaging for risk stratification of coronary nonculprit lesions.

  16. On the Use of Backward Difference Formulae to Improve the Prediction of Direction in Market Related Data

    Directory of Open Access Journals (Sweden)

    E. Momoniat

    2013-01-01

    Full Text Available The use of a BDF method as a tool to correct the direction of predictions made using curve fitting techniques is investigated. Random data are generated in such a fashion that they have the same properties as the data we are modelling. The data are assumed to have “memory” such that certain information embedded in the data will remain within a certain range of points. Data within this period where “memory” exists—say at time steps t1,t2,…,tn—are curve-fitted to produce a prediction at the next discrete time step, tn+1. In this manner a vector of predictions is generated and converted into a discrete ordinary differential equation representing the gradient of the data. The BDF method implemented with this lower-order approximation is used as a means of improving the direction of the generated predictions. The use of the BDF method in this manner improves the prediction of the direction of the time series by approximately 30%.
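The gradient estimate behind such a correction can be illustrated with a second-order backward difference formula (BDF2), which uses the last three points of the series; the paper's full correction scheme is more involved, so this is only the building block:

```python
# BDF2 estimate of the derivative at the most recent point of a series:
# f'(t_n) ~ (3*f[n] - 4*f[n-1] + f[n-2]) / (2h).  The sign of the
# estimate gives the predicted direction of the series.

def bdf2_derivative(f, h=1.0):
    """Second-order backward difference at the last sample of f."""
    return (3 * f[-1] - 4 * f[-2] + f[-3]) / (2 * h)

series = [1.0, 1.2, 1.5]          # toy rising data
slope = bdf2_derivative(series)
direction_up = slope > 0          # predicted direction: up
```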

  17. Significance of MPEG-7 textural features for improved mass detection in mammography.

    Science.gov (United States)

    Eltonsy, Nevine H; Tourassi, Georgia D; Fadeev, Aleksey; Elmaghraby, Adel S

    2006-01-01

    The purpose of the study is to investigate the significance of MPEG-7 textural features for improving the detection of masses in screening mammograms. The detection scheme was originally based on morphological directional neighborhood features extracted from mammographic regions of interest (ROIs). Receiver Operating Characteristic (ROC) analysis was performed to evaluate the performance of each set of features independently and merged into a back-propagation artificial neural network (BPANN) using the leave-one-out sampling scheme (LOOSS). The study was based on a database of 668 mammographic ROIs (340 depicting cancer regions and 328 depicting normal parenchyma). Overall, the ROC area index of the BPANN using the directional morphological features was Az=0.85+/-0.01. The MPEG-7 edge histogram descriptor-based BPANN showed an ROC area index of Az=0.71+/-0.01, while homogeneous textural descriptors using 30 and 120 channels helped the BPANN achieve similar ROC area indexes of Az=0.882+/-0.02 and Az=0.877+/-0.01, respectively. After merging the MPEG-7 homogeneous textural features with the directional neighborhood features, the performance of the BPANN increased, providing an ROC area index of Az=0.91+/-0.01. MPEG-7 homogeneous textural descriptors significantly improved the morphology-based detection scheme.

  18. ESLpred2: improved method for predicting subcellular localization of eukaryotic proteins

    Directory of Open Access Journals (Sweden)

    Raghava Gajendra PS

    2008-11-01

    Full Text Available Abstract Background The expansion of raw protein sequence databases in the post-genomic era and the availability of fresh annotated sequences for major localizations particularly motivated us to introduce a new, improved version of our previously developed eukaryotic subcellular localization prediction method, "ESLpred". Since subcellular localization of a protein offers essential clues about its functioning, the availability of a localization predictor would aid and expedite protein deciphering studies. However, the robustness of a predictor is highly dependent on the quality of the dataset and the extracted protein attributes; hence, it becomes imperative to improve the performance of the presently available method using the latest dataset and crucial input features. Results Here, we describe the augmentation in prediction performance obtained for our most popular ESLpred method using new crucial features as input to a Support Vector Machine (SVM). In addition, a recently available, highly non-redundant dataset encompassing three kingdom-specific protein sequence sets (1198 fungal, 2597 animal and 491 plant sequences) was also included in the present study. First, using the evolutionary information in the form of profile composition along with whole and N-terminal sequence composition as an input feature vector of 440 dimensions, overall accuracies of 72.7, 75.8 and 74.5% were achieved, respectively, after five-fold cross-validation. Further enhancement in performance was observed when similarity-search-based results were coupled with whole and N-terminal sequence composition along with profile composition, yielding overall accuracies of 75.9, 80.8 and 76.6%, respectively; the best accuracies reported to date on the same datasets. Conclusion These results provide confidence about the reliable and accurate prediction of the SVM modules generated in the present study using sequence and profile compositions along with similarity search

  19. Breast calcifications. A standardized mammographic reporting and data system to improve positive predictive value

    International Nuclear Information System (INIS)

    Perugini, G.; Bonzanini, B.; Valentino, C.

    1999-01-01

    The purpose of this work is to investigate the usefulness of a standardized reporting and data system in improving the positive predictive value of mammography in breast calcifications. Using the Breast Imaging Reporting and Data System lexicon developed by the American College of Radiology, we defined 5 descriptive categories of breast calcifications and classified the diagnostic suspicion of malignancy on a 3-grade scale (low, intermediate and high). Two radiologists reviewed 117 mammographic studies selected from those of patients submitted to surgical biopsy for mammographically detected calcifications from January 1993 to December 1997, and classified them according to the above criteria. The positive predictive value was calculated for all examinations and for the stratified groups. Defining a standardized system for assessing and describing breast calcifications helps improve the diagnostic accuracy of mammography in clinical practice.

  20. Histone deacetylase inhibitor significantly improved the cloning efficiency of porcine somatic cell nuclear transfer embryos.

    Science.gov (United States)

    Huang, Yongye; Tang, Xiaochun; Xie, Wanhua; Zhou, Yan; Li, Dong; Yao, Chaogang; Zhou, Yang; Zhu, Jianguo; Lai, Liangxue; Ouyang, Hongsheng; Pang, Daxin

    2011-12-01

    Valproic acid (VPA), a histone deacetylase inhibitor, has been shown to generate induced pluripotent stem (iPS) cells from mouse and human fibroblasts with significantly higher efficiency. Because successful cloning by somatic cell nuclear transfer (SCNT) undergoes a full reprogramming process in which the epigenetic state of a differentiated donor nucleus is converted into an embryonic totipotent state, we speculated that VPA would be useful in promoting cloning efficiency. Therefore, in the present study, we examined whether VPA can promote the developmental competence of SCNT embryos by improving the reprogramming state of the donor nucleus. Here we report that 1 mM VPA for 14 to 16 h following activation significantly increased the rate of blastocyst formation of porcine SCNT embryos constructed from Landrace fetal fibroblast cells compared to the control (31.8 vs. 11.4%). However, we found that the acetylation levels of histone H3 lysine 14 and histone H4 lysine 5 and the expression levels of Oct4, Sox2, and Klf4 were not significantly changed between VPA-treated and untreated groups at the blastocyst stage. The SCNT embryos were transferred to 38 surrogates, and the cloning efficiency in the treated group was significantly improved compared with the control group. Taken together, we have demonstrated that VPA can improve both the in vitro and in vivo developmental competence of porcine SCNT embryos.

  1. Ceramic Composite Intermediate Temperature Stress-Rupture Properties Improved Significantly

    Science.gov (United States)

    Morscher, Gregory N.; Hurst, Janet B.

    2002-01-01

    Silicon carbide (SiC) composites are considered to be potential materials for future aircraft engine parts such as combustor liners. It is envisioned that on the hot side (inner surface) of the combustor liner, composites will have to withstand temperatures in excess of 1200 C for thousands of hours in oxidizing environments. This is a severe condition; however, an equally severe, if not more detrimental, condition exists on the cold side (outer surface) of the combustor liner. Here, the temperatures are expected to be on the order of 800 to 1000 C under high tensile stress because of thermal gradients and attachment of the combustor liner to the engine frame (the hot side will be under compressive stress, a less severe stress-state for ceramics). Since these composites are not oxides, they oxidize. The worst form of oxidation for strength reduction occurs at these intermediate temperatures, where the boron nitride (BN) interphase oxidizes first, which causes the formation of a glass layer that strongly bonds the fibers to the matrix. When the fibers strongly bond to the matrix or to one another, the composite loses toughness and strength and becomes brittle. To increase the intermediate temperature stress-rupture properties, researchers must modify the BN interphase. With the support of the Ultra-Efficient Engine Technology (UEET) Program, significant improvements were made as state-of-the-art SiC/SiC composites were developed during the Enabling Propulsion Materials (EPM) program. Three approaches were found to improve the intermediate-temperature stress-rupture properties: fiber-spreading, high-temperature silicon- (Si) doped boron nitride (BN), and outside-debonding BN.

  2. Factors Related to Significant Improvement of Estimated Glomerular Filtration Rates in Chronic Hepatitis B Patients Receiving Telbivudine Therapy

    Directory of Open Access Journals (Sweden)

    Te-Fu Lin

    2017-01-01

    Full Text Available Background and Aim. The improvement of estimated glomerular filtration rates (eGFRs) in chronic hepatitis B (CHB) patients receiving telbivudine therapy is well known. The aim of this study was to clarify the kinetics of eGFRs and to identify the significant factors related to the improvement of eGFRs in telbivudine-treated CHB patients in a real-world setting. Methods. Serial eGFRs were calculated every 3 months using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation. The patients were classified as CKD-1, -2, or -3 according to a baseline eGFR of ≥90, 60–89, or <60 mL/min/1.73 m2, respectively. A significant improvement of eGFR was defined as a more than 10% increase from baseline. Results. A total of 129 patients were enrolled, of whom 36% had significantly improved eGFRs. According to a multivariate analysis, diabetes mellitus (DM) (p=0.028) and CKD-3 (p=0.043) were both significantly related to such improvement. The rates of significant improvement of eGFR were about 73% and 77% in patients with DM and CKD-3, respectively. Conclusions. Telbivudine is an alternative drug of choice for the treatment of hepatitis B patients for whom renal safety is a concern, especially patients with DM and CKD-3.
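The CKD-EPI creatinine equation behind the serial eGFRs can be transcribed directly; this sketch follows the 2009 CKD-EPI formula (serum creatinine in mg/dL, age in years, result in mL/min/1.73 m2) and only illustrates how the CKD-1/2/3 classification follows from the computed value:

```python
# 2009 CKD-EPI creatinine equation:
# eGFR = 141 * min(Scr/k, 1)^a * max(Scr/k, 1)^-1.209 * 0.993^age
#        * 1.018 (if female) * 1.159 (if black)
# with k = 0.7 (female) / 0.9 (male), a = -0.329 (female) / -0.411 (male).

def ckd_epi_egfr(scr, age, female, black=False):
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr / kappa, 1.0) ** alpha
            * max(scr / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

egfr = ckd_epi_egfr(scr=1.0, age=50, female=False)  # roughly 87: CKD-2 range
```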

  3. Improving a two-equation eddy-viscosity turbulence model to predict the aerodynamic performance of thick wind turbine airfoils

    Science.gov (United States)

    Bangga, Galih; Kusumadewi, Tri; Hutomo, Go; Sabila, Ahmad; Syawitri, Taurista; Setiadi, Herlambang; Faisal, Muhamad; Wiranegara, Raditya; Hendranata, Yongki; Lastomo, Dwi; Putra, Louis; Kristiadi, Stefanus

    2018-03-01

    Numerical simulations for relatively thick airfoils are carried out in the present studies. An attempt to improve the accuracy of the numerical predictions is done by adjusting the turbulent viscosity of the eddy-viscosity Menter Shear-Stress-Transport (SST) model. The modification involves the addition of a damping factor on the wall-bounded flows incorporating the ratio of the turbulent kinetic energy to its specific dissipation rate for separation detection. The results are compared with available experimental data and CFD simulations using the original Menter SST model. The present model improves the lift polar prediction even though the stall angle is still overestimated. The improvement is caused by the better prediction of separated flow under a strong adverse pressure gradient. The results show that the Reynolds stresses are damped near the wall causing variation of the logarithmic velocity profiles.

  4. Predictive Factors for Subjective Improvement in Lumbar Spinal Stenosis Patients with Nonsurgical Treatment: A 3-Year Prospective Cohort Study.

    Directory of Open Access Journals (Sweden)

    Ko Matsudaira

    Full Text Available To assess the predictive factors for subjective improvement with nonsurgical treatment in consecutive patients with lumbar spinal stenosis (LSS). Patients with LSS were enrolled from 17 medical centres in Japan. We followed up 274 patients (151 men; mean age, 71 ± 7.4 years) for 3 years. A multivariable logistic regression model was used to assess the predictive factors for subjective symptom improvement with nonsurgical treatment. In 30% of patients, conservative treatment led to a subjective improvement in the symptoms; in 70% of patients, the symptoms remained unchanged, worsened, or required surgical treatment. The multivariable analysis of predictive factors for subjective improvement with nonsurgical treatment showed that the absence of cauda equina symptoms (only radicular symptoms) had an odds ratio (OR) of 3.31 (95% confidence interval [CI]: 1.50-7.31); absence of degenerative spondylolisthesis/scoliosis had an OR of 2.53 (95% CI: 1.13-5.65); <1-year duration of illness had an OR of 3.81 (95% CI: 1.46-9.98); and hypertension had an OR of 2.09 (95% CI: 0.92-4.78). The predictive factors for subjective symptom improvement with nonsurgical treatment in LSS patients were the presence of only radicular symptoms, absence of degenerative spondylolisthesis/scoliosis, and an illness duration of <1 year.
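Odds ratios like those reported above come from exponentiating the fitted logistic-regression coefficients, with the 95% CI obtained from the coefficient's standard error. The beta and SE below are hypothetical, chosen only to show the arithmetic:

```python
import math

# OR = exp(beta); 95% CI = exp(beta +/- 1.96 * SE).
# beta and se are illustrative values, not the study's fitted estimates.

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower 95% bound, upper 95% bound)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

or_, lo, hi = odds_ratio_ci(beta=1.197, se=0.40)
```

A CI whose lower bound stays above 1 (as for the first three factors above) is what makes a predictor statistically significant at the 5% level.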

  5. Wind Power Prediction Considering Nonlinear Atmospheric Disturbances

    Directory of Open Access Journals (Sweden)

    Yagang Zhang

    2015-01-01

    Full Text Available This paper considers the effect of nonlinear atmospheric disturbances on wind power prediction. A Lorenz system is introduced as an atmospheric disturbance model. Three new improved wind forecasting models combined with a Lorenz comprehensive disturbance are put forward in this study. First, we define the form of the Lorenz disturbance variable and the wind speed perturbation formula. Then, different artificial neural network models are used to verify the new idea and obtain better wind speed predictions. Finally, we separately use the original and the improved wind speed series to predict the related wind power, showing that the corrected wind speed provides higher-precision wind power predictions. This research presents a new direction in the wind prediction field and has both theoretical research value and practical guiding significance.
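The Lorenz system used as the atmospheric-disturbance model can be integrated numerically; below is a plain fixed-step RK4 sketch with the classic parameters (sigma=10, rho=28, beta=8/3). How the Lorenz states are mapped onto a wind-speed perturbation is the paper's contribution and is not reproduced here:

```python
# Lorenz system: dx/dt = sigma*(y - x), dy/dt = x*(rho - z) - y,
# dz/dt = x*y - beta*z, integrated with a fixed-step RK4 scheme.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt=0.01):
    def add(s, k, f):   # componentwise s + f*k
        return tuple(si + f * ki for si, ki in zip(s, k))
    k1 = lorenz(state)
    k2 = lorenz(add(state, k1, dt / 2))
    k3 = lorenz(add(state, k2, dt / 2))
    k4 = lorenz(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state = (1.0, 1.0, 1.0)
for _ in range(1000):       # 10 time units: trajectory settles on the attractor
    state = rk4_step(state)
```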

  6. Improving Allergen Prediction in Main Crops Using a Weighted Integrative Method.

    Science.gov (United States)

    Li, Jing; Wang, Jing; Li, Jing

    2017-12-01

    As a public health problem, food allergy is frequently caused by food allergen proteins, which trigger a type-I hypersensitivity reaction in the immune system of atopic individuals. The food allergens in our daily lives come mainly from crops including rice, wheat, soybean and maize. However, allergens in these main crops are far from fully uncovered. Although some bioinformatics tools or methods for predicting the potential allergenicity of proteins have been proposed, each method has its limitations. In this paper, we built a novel algorithm, PREALW, which integrates PREAL, the FAO/WHO criteria and a motif-based method by a weighted average score, to combine the advantages of the different methods. Our results illustrate that PREALW performs significantly better in crop allergen prediction. This integrative allergen prediction algorithm could be useful for critical food safety matters. PREALW can be accessed at http://lilab.life.sjtu.edu.cn:8080/prealw .
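The weighted-average integration can be sketched in a few lines: each component method produces its own allergenicity score, and the combined score is their weighted mean. The scores and weights below are illustrative only, not PREALW's trained values:

```python
# Weighted-average integration of several per-method scores, in the
# spirit of combining PREAL, the FAO/WHO criteria and a motif-based
# method; all numbers here are hypothetical.

def weighted_score(scores, weights):
    """Combine per-method scores into one weighted average."""
    assert len(scores) == len(weights)
    total_w = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total_w

scores  = [0.9, 0.6, 0.8]   # e.g. PREAL, FAO/WHO, motif-based
weights = [0.5, 0.2, 0.3]   # relative trust in each method
combined = weighted_score(scores, weights)
```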

  7. The Bi-Directional Prediction of Carbon Fiber Production Using a Combination of Improved Particle Swarm Optimization and Support Vector Machine.

    Science.gov (United States)

    Xiao, Chuncai; Hao, Kuangrong; Ding, Yongsheng

    2014-12-30

    This paper creates a bi-directional prediction model to predict the performance of carbon fiber and the productive parameters based on a support vector machine (SVM) and improved particle swarm optimization (IPSO) algorithm (SVM-IPSO). In the SVM, it is crucial to select the parameters that have an important impact on the performance of prediction. The IPSO is proposed to optimize them, and then the SVM-IPSO model is applied to the bi-directional prediction of carbon fiber production. The predictive accuracy of SVM is mainly dependent on its parameters, and IPSO is thus exploited to seek the optimal parameters for SVM in order to improve its prediction capability. Inspired by a cell communication mechanism, we propose IPSO by incorporating information of the global best solution into the search strategy to improve exploitation, and we employ IPSO to establish the bi-directional prediction model: in the direction of the forward prediction, we consider productive parameters as input and property indexes as output; in the direction of the backward prediction, we consider property indexes as input and productive parameters as output, and in this case, the model becomes a scheme design for novel style carbon fibers. The results from a set of the experimental data show that the proposed model can outperform the radial basis function neural network (RNN), the basic particle swarm optimization (PSO) method and the hybrid approach of genetic algorithm and improved particle swarm optimization (GA-IPSO) method in most of the experiments. In other words, simulation results demonstrate the effectiveness and advantages of the SVM-IPSO model in dealing with the problem of forecasting.
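The particle-swarm part of the pipeline can be sketched without the SVM: each particle is a candidate (C, gamma) pair, velocities are pulled toward personal and global bests, and the swarm minimizes a fitness function. Here a simple quadratic stands in for the cross-validated SVM error, and the IPSO information-sharing refinement is omitted:

```python
import random

# Bare-bones synchronous PSO minimizing a stand-in objective; in the
# paper's setting the objective would be the cross-validated error of an
# SVM with parameters p = (C, gamma).

random.seed(0)

def objective(p):                 # placeholder for the SVM's CV error
    c, g = p
    return (c - 3.0) ** 2 + (g - 0.5) ** 2

def pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(0, 10), random.uniform(0, 2)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal bests
    gbest = min(pbest, key=objective)[:]            # global best
    for _ in range(iters):
        for i, p in enumerate(pos):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - p[d])
                             + c2 * random.random() * (gbest[d] - p[d]))
                p[d] += vel[i][d]
            if objective(p) < objective(pbest[i]):
                pbest[i] = p[:]
        gbest = min(pbest + [gbest], key=objective)[:]
    return gbest

best = pso()   # converges near the minimizer (3.0, 0.5)
```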

  8. The Bi-Directional Prediction of Carbon Fiber Production Using a Combination of Improved Particle Swarm Optimization and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Chuncai Xiao

    2014-12-01

    Full Text Available This paper creates a bi-directional prediction model to predict the performance of carbon fiber and the productive parameters based on a support vector machine (SVM) and improved particle swarm optimization (IPSO) algorithm (SVM-IPSO). In the SVM, it is crucial to select the parameters that have an important impact on the performance of prediction. The IPSO is proposed to optimize them, and then the SVM-IPSO model is applied to the bi-directional prediction of carbon fiber production. The predictive accuracy of SVM is mainly dependent on its parameters, and IPSO is thus exploited to seek the optimal parameters for SVM in order to improve its prediction capability. Inspired by a cell communication mechanism, we propose IPSO by incorporating information of the global best solution into the search strategy to improve exploitation, and we employ IPSO to establish the bi-directional prediction model: in the direction of the forward prediction, we consider productive parameters as input and property indexes as output; in the direction of the backward prediction, we consider property indexes as input and productive parameters as output, and in this case, the model becomes a scheme design for novel style carbon fibers. The results from a set of the experimental data show that the proposed model can outperform the radial basis function neural network (RNN), the basic particle swarm optimization (PSO) method and the hybrid approach of genetic algorithm and improved particle swarm optimization (GA-IPSO) method in most of the experiments. In other words, simulation results demonstrate the effectiveness and advantages of the SVM-IPSO model in dealing with the problem of forecasting.

  9. Improving the Spatial Prediction of Soil Organic Carbon Stocks in a Complex Tropical Mountain Landscape by Methodological Specifications in Machine Learning Approaches.

    Science.gov (United States)

    Ließ, Mareike; Schmidt, Johannes; Glaser, Bruno

    2016-01-01

    Tropical forests are significant carbon sinks and their soils' carbon storage potential is immense. However, little is known about the soil organic carbon (SOC) stocks of tropical mountain areas, whose complex soil-landscapes and difficult accessibility pose a challenge to spatial analysis. The choice of methodology for spatial prediction is highly important for improving the expected poor model results when predictor-response correlations are low. Four aspects were considered to improve model performance in predicting SOC stocks of the organic layer of a tropical mountain forest landscape: different spatial predictor settings, predictor selection strategies, various machine learning algorithms and model tuning. Five machine learning algorithms (random forests, artificial neural networks, multivariate adaptive regression splines, boosted regression trees and support vector machines) were trained and tuned to predict SOC stocks from predictors derived from a digital elevation model and a satellite image. Topographical predictors were calculated with GIS search radii of 45 to 615 m. Finally, three predictor selection strategies were applied to the total set of 236 predictors. All machine learning algorithms, including the model tuning and predictor selection, were compared via five repetitions of a tenfold cross-validation. The boosted regression tree algorithm produced the overall best model. SOC stocks ranged from 0.2 to 17.7 kg m-2, displaying huge variability, with diffuse insolation and curvatures of different scales guiding the spatial pattern. Predictor selection and model tuning improved predictive performance for all five machine learning algorithms. The rather low number of selected predictors favours forward over backward selection procedures. Selecting predictors by their individual performance was outperformed by the two procedures that accounted for predictor interaction.
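The evaluation protocol described (five repetitions of a tenfold cross-validation) can be sketched generically. The two toy learners below merely stand in for the five algorithms compared in the paper, and the data are synthetic; only the resampling structure is the point.

```python
import numpy as np

def repeated_kfold_rmse(fit_predict, X, y, k=10, repeats=5, seed=0):
    """Mean RMSE over `repeats` repetitions of k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    n = len(y)
    scores = []
    for _ in range(repeats):
        idx = rng.permutation(n)            # fresh shuffle each repetition
        folds = np.array_split(idx, k)
        for i in range(k):
            test = folds[i]
            train = np.concatenate([folds[j] for j in range(k) if j != i])
            pred = fit_predict(X[train], y[train], X[test])
            scores.append(np.sqrt(np.mean((pred - y[test]) ** 2)))
    return float(np.mean(scores))

# Toy stand-ins for the compared learners: a mean baseline and 1-NN.
mean_model = lambda Xtr, ytr, Xte: np.full(len(Xte), ytr.mean())
nn_model = lambda Xtr, ytr, Xte: ytr[np.argmin(
    np.abs(Xte[:, None] - Xtr[None, :]), axis=1)]

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * X) + rng.normal(0, 0.1, 200)
rmse_mean = repeated_kfold_rmse(mean_model, X, y)
rmse_nn = repeated_kfold_rmse(nn_model, X, y)
```

Averaging over repeated shuffles, as in the study, reduces the variance of the estimated performance and makes comparisons between learners more stable.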

  10. Prediction of multi-wake problems using an improved Jensen wake model

    DEFF Research Database (Denmark)

    Tian, Linlin; Zhu, Wei Jun; Shen, Wen Zhong

    2017-01-01

    The improved analytical wake model known as the 2D_k Jensen model (proposed to overcome some shortcomings of the classical Jensen wake model) is applied and validated in this work for wind turbine multi-wake predictions. Unlike the original Jensen model, this newly developed 2D_k Jensen model uses a cosine shape instead of the top-hat shape for the velocity deficit in the wake, and treats the wake decay rate as a variable related to the ambient turbulence as well as the rotor-generated turbulence. Coupled with four different multi-wake combination models, the 2D_k Jensen model is assessed by (1) simulating the interaction of two wakes under full-wake and partial-wake conditions and (2) predicting the power production in the Horns Rev wind farm for different wake sectors around two different wind directions. Through comparisons with field measurements, results from Large Eddy...
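For context, the classical Jensen model assumes a linearly expanding wake of radius r_w = r_0 + kx and a top-hat velocity deficit that decays as (1 + kx/r_0)^-2. The sketch below implements that model plus a cosine-shaped radial profile and a root-sum-square multi-wake combination; the cosine profile here is a simplified stand-in for the 2D_k shape function, not the published formulation, and all parameter values are assumptions.

```python
import numpy as np

def jensen_deficit(x, r, u0=8.0, ct=0.8, r0=40.0, k=0.075, cosine=True):
    """Velocity deficit [m/s] at downstream distance x and radial offset r.

    cosine=False gives the classical top-hat Jensen profile; cosine=True
    modulates the same centreline deficit with a cosine radial shape
    (an illustrative simplification of the 2D_k idea)."""
    rw = r0 + k * x                                  # expanding wake radius
    peak = u0 * (1 - np.sqrt(1 - ct)) / (1 + k * x / r0) ** 2
    inside = np.abs(r) <= rw
    if not cosine:
        return np.where(inside, peak, 0.0)
    shape = 0.5 * (1 + np.cos(np.pi * np.abs(r) / rw))
    return np.where(inside, peak * shape, 0.0)

def combined_deficit(deficits):
    """Root-sum-square combination, one common multi-wake combination model."""
    return float(np.sqrt(np.sum(np.square(deficits))))
```

In the paper, the decay rate k is additionally tied to ambient and rotor-generated turbulence rather than held constant as here.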

  11. Standard deviation of carotid young's modulus and presence or absence of plaque improves prediction of coronary heart disease risk.

    Science.gov (United States)

    Niu, Lili; Zhang, Yanling; Qian, Ming; Xiao, Yang; Meng, Long; Zheng, Rongqin; Zheng, Hairong

    2017-11-01

    The stiffness of large arteries and the presence or absence of plaque are associated with coronary heart disease (CHD). Because arterial walls are biologically heterogeneous, the standard deviation of Young's modulus (YM-std) of the large arteries may better predict coronary atherosclerosis. However, the role of YM-std in the occurrence of coronary events has not been addressed so far. Therefore, this study investigated whether carotid YM-std and the presence or absence of plaque improve CHD risk prediction. One hundred and three patients with CHD (age 66 ± 11 years) and 107 patients at high risk of atherosclerosis (age 61 ± 7 years) were recruited. Carotid YM was measured by the vessel texture matching method, and YM-std was calculated. Carotid intima-media thickness was measured with the MyLab 90 ultrasound platform using dedicated RF-tracking software. In logistic regression analysis, YM-std (OR = 1·010; 95% CI = 1·003-1·016), carotid plaque (OR = 16·759; 95% CI = 3·719-75·533) and YM-std plus plaque (OR = 0·989; 95% CI = 0·981-0·997) were independent predictors of CHD. The traditional risk factors (TRF) plus YM-std plus plaque model showed a significant improvement in the area under the receiver-operating characteristic curve (AUC), which increased from 0·717 (TRF only) to 0·777 (95% CI for the difference in adjusted AUC: 0·010-0·110). Carotid YM-std is a powerful independent predictor of CHD. Adding plaque and YM-std to TRF improves CHD risk prediction. © 2016 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
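The statistical pattern in this kind of study (adding a marker to a logistic model of traditional risk factors and comparing AUCs) can be reproduced on synthetic data. The variable names, coefficients and sample size below are illustrative assumptions, not the study's data, and the from-scratch logistic fit is for demonstration only.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-ascent logistic regression (illustrative only)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def predict_proba(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1 / (1 + np.exp(-Xb @ w))

def auc(y, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n1 = y.sum()
    n0 = len(y) - n1
    return float((ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n0 * n1))

# Synthetic analogue: outcome driven by a traditional risk factor (trf)
# plus an extra marker (standing in for YM-std / plaque).
rng = np.random.default_rng(0)
n = 600
trf = rng.normal(size=n)
marker = rng.normal(size=n)
y = (rng.random(n) < 1 / (1 + np.exp(-(0.8 * trf + 1.2 * marker)))).astype(float)

auc_trf = auc(y, predict_proba(fit_logistic(trf[:, None], y), trf[:, None]))
X2 = np.column_stack([trf, marker])
auc_full = auc(y, predict_proba(fit_logistic(X2, y), X2))
```

As in the study, the model that includes the extra marker attains a higher AUC than the risk-factor-only model on data generated this way.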

  12. Improvement of a predictive model in ovarian cancer patients submitted to platinum-based chemotherapy: implications of a GST activity profile.

    Science.gov (United States)

    Pereira, Deolinda; Assis, Joana; Gomes, Mónica; Nogueira, Augusto; Medeiros, Rui

    2016-05-01

    The success of chemotherapy in ovarian cancer (OC) is directly associated with the broad variability in platinum response, with implications for patient survival. This heterogeneous response might result from inter-individual variations in the platinum-detoxification pathway due to the expression of glutathione-S-transferase (GST) enzymes. We hypothesized that GSTM1 and GSTT1 polymorphisms might have an impact as prognostic and predictive determinants for OC. We conducted a hospital-based study in a cohort of OC patients submitted to platinum-based chemotherapy. GSTM1 and GSTT1 genotypes were determined by multiplex PCR. Patients with the GSTM1-null genotype presented significantly longer 5-year survival and improved time to progression compared with GSTM1-wt genotype patients (log-rank test, P = 0.001 and P = 0.013, respectively). Multivariate Cox regression analysis indicated that including genetic information on the GSTM1 polymorphism increased the predictive ability for risk of death after OC platinum-based chemotherapy (c-index from 0.712 to 0.833). Namely, residual disease (HR, 4.90; P = 0.016) and GSTM1-wt genotype (HR, 2.29; P = 0.039; P = 0.036 after bootstrap) emerged as the more important predictors of risk of death. No similar effect on survival was observed for the GSTT1 polymorphism, and there were no statistically significant differences between GSTM1 and GSTT1 genotypes and the assessed clinical-pathological characteristics of the patients. The GSTM1 polymorphism seems to have an impact on OC prognosis, as it predicts a better response to platinum-based chemotherapy and hence improved survival. Characterization of the GSTM1 genetic profile might be a useful molecular tool and a putative genetic marker for OC clinical outcome.

  13. Prediction model of ammonium uranyl carbonate calcination by microwave heating using incremental improved Back-Propagation neural network

    Energy Technology Data Exchange (ETDEWEB)

    Li Yingwei [Faculty of Metallurgical and Energy Engineering, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Key Laboratory of Unconventional Metallurgy, Ministry of Education, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Peng Jinhui, E-mail: jhpeng@kmust.edu.c [Faculty of Metallurgical and Energy Engineering, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Key Laboratory of Unconventional Metallurgy, Ministry of Education, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Liu Bingguo [Faculty of Metallurgical and Energy Engineering, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Key Laboratory of Unconventional Metallurgy, Ministry of Education, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Li Wei [Key Laboratory of Unconventional Metallurgy, Ministry of Education, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Huang Daifu [No. 272 Nuclear Industry Factory, China National Nuclear Corporation, Hengyang, Hunan Province 421002 (China); Zhang Libo [Faculty of Metallurgical and Energy Engineering, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China); Key Laboratory of Unconventional Metallurgy, Ministry of Education, Kunming University of Science and Technology, Kunming, Yunnan Province 650093 (China)

    2011-05-15

    Research highlights: The incremental improved Back-Propagation neural network prediction model, using the Levenberg-Marquardt algorithm based on optimization theory, is put forward. A prediction model of the nonlinear system is built that can effectively predict the experiments on microwave calcination of ammonium uranyl carbonate (AUC). AUC absorbs microwave energy, and microwave heating quickly decomposes AUC. In the microwave calcination experiments, the contents of U and U⁴⁺ increased with increasing microwave power and irradiation time, and decreased with increasing average material depth. - Abstract: The incremental improved Back-Propagation (BP) neural network prediction model was put forward to overcome problems in the microwave calcination of ammonium uranyl carbonate (AUC), such as long testing cycles, large testing quantities, and the difficulty of optimizing process parameters; moreover, because training data arrive in incremental batches, limited system memory can make training on the full data set infeasible. The prediction model of the nonlinear system was built and could effectively predict the microwave calcination experiments. The predicted results indicated that the contents of U and U⁴⁺ increased with increasing microwave power and irradiation time, and decreased with increasing average material depth.
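The incremental idea, training batch by batch so that the full data set never has to fit in memory, can be sketched as follows. Note that plain SGD on a linear model replaces the paper's Levenberg-Marquardt-trained BP network for brevity, and the data, coefficients and batch sizes are synthetic assumptions.

```python
import numpy as np

def incremental_train(batches, dim, lr=0.1, epochs=200):
    """Train a linear model batch by batch: only one batch is ever
    held in memory at a time (the 'incremental' training idea)."""
    w = np.zeros(dim + 1)                       # bias + feature weights
    for _ in range(epochs):
        for X, y in batches:
            Xb = np.column_stack([np.ones(len(X)), X])
            grad = Xb.T @ (Xb @ w - y) / len(y)
            w -= lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([0.5, 2.0, -1.0])             # bias, power, depth (assumed)
batches = []
for _ in range(5):                              # data arriving in five batches
    X = rng.uniform(-1, 1, size=(40, 2))
    y = true_w[0] + X @ true_w[1:] + rng.normal(0, 0.01, 40)
    batches.append((X, y))

w = incremental_train(batches, dim=2)
```

In practice `batches` would be a generator reading each batch from disk, which is exactly what makes the approach viable when memory is the limiting factor.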

  14. Prediction model of ammonium uranyl carbonate calcination by microwave heating using incremental improved Back-Propagation neural network

    International Nuclear Information System (INIS)

    Li Yingwei; Peng Jinhui; Liu Bingguo; Li Wei; Huang Daifu; Zhang Libo

    2011-01-01

    Research highlights: → The incremental improved Back-Propagation neural network prediction model, using the Levenberg-Marquardt algorithm based on optimization theory, is put forward. → A prediction model of the nonlinear system is built that can effectively predict the experiments on microwave calcination of ammonium uranyl carbonate (AUC). → AUC absorbs microwave energy, and microwave heating quickly decomposes AUC. → In the microwave calcination experiments, the contents of U and U⁴⁺ increased with increasing microwave power and irradiation time, and decreased with increasing average material depth. - Abstract: The incremental improved Back-Propagation (BP) neural network prediction model was put forward to overcome problems in the microwave calcination of ammonium uranyl carbonate (AUC), such as long testing cycles, large testing quantities, and the difficulty of optimizing process parameters; moreover, because training data arrive in incremental batches, limited system memory can make training on the full data set infeasible. The prediction model of the nonlinear system was built and could effectively predict the microwave calcination experiments. The predicted results indicated that the contents of U and U⁴⁺ increased with increasing microwave power and irradiation time, and decreased with increasing average material depth.

  15. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    Science.gov (United States)

    Dash, Rajashree

    2017-11-01

    Forecasting the purchasing power of one currency with respect to another is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for a simpler and more efficient model with better prediction capability. In this paper, an evolutionary framework is proposed using an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets (USD/CAD, USD/CHF, and USD/JPY) accumulated over the same period of time. The model's performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and the particle swarm optimization algorithm. Practical analysis of the results suggests that the proposed model, developed using the ISFL algorithm with the CEFLANN network, is a promising predictor for currency exchange rate prediction compared to the other models included in the study.
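For readers unfamiliar with the optimizer, here is a minimal sketch of the basic shuffled frog leaping scheme on a toy objective. The ISFL variant's specific improvements and the CEFLANN network are not reproduced; this only illustrates the sort/partition-into-memeplexes/jump structure, with assumed settings throughout.

```python
import numpy as np

def sfla_minimize(f, bounds, frogs=30, memeplexes=3, iters=50, seed=0):
    """Basic shuffled frog leaping: sort frogs by fitness, deal them
    round-robin into memeplexes, and let each memeplex's worst frog
    jump toward its local best, then the global best, else restart."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(frogs, lo.size))
    for _ in range(iters):
        fit = np.array([f(p) for p in pop])
        order = np.argsort(fit)                     # best first
        pop, fit = pop[order], fit[order]
        g = pop[0].copy()                           # global best frog
        for m in range(memeplexes):
            idx = np.arange(m, frogs, memeplexes)   # round-robin partition
            worst, best = idx[-1], idx[0]
            step = rng.random() * (pop[best] - pop[worst])
            cand = np.clip(pop[worst] + step, lo, hi)
            if f(cand) < fit[worst]:
                pop[worst] = cand
            else:                                   # jump toward global best
                step = rng.random() * (g - pop[worst])
                cand = np.clip(pop[worst] + step, lo, hi)
                if f(cand) < fit[worst]:
                    pop[worst] = cand
                else:                               # censor: random restart
                    pop[worst] = rng.uniform(lo, hi)
    fit = np.array([f(p) for p in pop])
    return pop[fit.argmin()], float(fit.min())

best, val = sfla_minimize(lambda p: (p ** 2).sum(), [(-5, 5), (-5, 5)])
```

In the paper this search would tune the CEFLANN's weights or expansion parameters rather than minimize a test function.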

  16. Improving 3D structure prediction from chemical shift data

    Energy Technology Data Exchange (ETDEWEB)

    Schot, Gijs van der [Utrecht University, Computational Structural Biology, Bijvoet Center for Biomolecular Research, Faculty of Science-Chemistry (Netherlands); Zhang, Zaiyong [Technische Universitaet Muenchen, Biomolecular NMR and Munich Center for Integrated Protein Science, Department Chemie (Germany); Vernon, Robert [University of Washington, Department of Biochemistry (United States); Shen, Yang [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States); Vranken, Wim F. [VIB, Department of Structural Biology (Belgium); Baker, David [University of Washington, Department of Biochemistry (United States); Bonvin, Alexandre M. J. J., E-mail: a.m.j.j.bonvin@uu.nl [Utrecht University, Computational Structural Biology, Bijvoet Center for Biomolecular Research, Faculty of Science-Chemistry (Netherlands); Lange, Oliver F., E-mail: oliver.lange@tum.de [Technische Universitaet Muenchen, Biomolecular NMR and Munich Center for Integrated Protein Science, Department Chemie (Germany)

    2013-09-15

    We report advances in the calculation of protein structures from chemical shift nuclear magnetic resonance data alone. Our previously developed method, CS-Rosetta, assembles structures from a library of short protein fragments picked from a large library of protein structures using chemical shifts and sequence information. Here we demonstrate that the combination of a new and improved fragment picker and the iterative sampling algorithm RASREC yields significant improvements in convergence and accuracy. Moreover, we introduce improved criteria for assessing the accuracy of the models produced by the method. The method was tested on 39 proteins in the 50-100 residue size range and yields reliable structures in 70% of the cases. All structures that passed the reliability filter were accurate (<2 Å RMSD from the reference).

  17. Plasma proteomics classifiers improve risk prediction for renal disease in patients with hypertension or type 2 diabetes

    DEFF Research Database (Denmark)

    Pena, Michelle J; Jankowski, Joachim; Heinze, Georg

    2015-01-01

    OBJECTIVE: Micro- and macroalbuminuria are strong risk factors for progression of nephropathy in patients with hypertension or type 2 diabetes. Early detection of progression to micro- or macroalbuminuria may facilitate prevention and treatment of renal diseases. We aimed to develop plasma proteomics classifiers to predict the development of micro- or macroalbuminuria in hypertension or type 2 diabetes. METHODS: Patients with hypertension (n = 125) and type 2 diabetes (n = 82) were selected for this case-control study from the Prevention of REnal and Vascular ENd-stage Disease cohort. ... RESULTS: In hypertensive patients, the classifier improved risk prediction for transition in albuminuria stage on top of the reference model (C-index from 0.69 to 0.78; P ...). In type 2 diabetes, the classifier improved risk prediction for transition from micro- to macroalbuminuria (C-index from 0...

  18. Congestion Prediction Modeling for Quality of Service Improvement in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ga-Won Lee

    2014-04-01

    Information technology (IT) is driving drastic changes in modern life for the improvement of human welfare. Objects constitute “Information Networks” through smart, self-regulated information gathering that also recognizes and controls current information states in Wireless Sensor Networks (WSNs). Information observed from sensor networks in real time is used to increase quality of life (QoL) in various industries and daily life. One of the key challenges of WSNs is how to achieve lossless data transmission. Although today's sensor nodes have enhanced capabilities, it is hard to assure lossless and reliable end-to-end data transmission in WSNs due to unstable wireless links and the low hardware resources available to satisfy high quality of service (QoS) requirements. We propose a node and path traffic prediction model to predict and minimize congestion. The solution includes prediction of packet generation due to network congestion from both periodic and event-driven data generation. Simulations using NS-2 and Matlab demonstrate the effectiveness of the proposed solution.

  19. Deep Visual Attention Prediction

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

    In this work, we aim to predict human eye fixations in free-viewing of scenes using an end-to-end deep learning architecture. Although convolutional neural networks (CNNs) have brought substantial improvements to human attention prediction, CNN-based attention models can still be improved by efficiently leveraging multi-scale features. Our visual attention network is designed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency responses. The model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. The final saliency prediction is achieved through the cooperation of these global and local predictions. The model is trained with deep supervision, in which supervision is fed directly into multiple intermediate layers, instead of the previous approach of supervising only the output layer and propagating this supervision back to earlier layers. The model thus incorporates multi-level saliency predictions within a single network, significantly reducing the redundancy of earlier approaches that learn multiple network streams with different input scales. Extensive experiments on various challenging benchmark datasets demonstrate that our method yields state-of-the-art performance with competitive inference time.

  20. Predicting Improvement in Writer's Cramp Symptoms following Botulinum Neurotoxin Injection Therapy

    Directory of Open Access Journals (Sweden)

    Mallory Jackman

    2016-09-01

    Introduction: Writer's cramp is a specific focal hand dystonia causing abnormal posturing and tremor in the upper limb. The most popular medical intervention, botulinum neurotoxin type A (BoNT-A) therapy, is variably effective for 50–70% of patients. BoNT-A non-responders undergo ineffective treatment and may experience significant side effects. Various assessments have been used to predict response to BoNT-A, but not in a single population of patients. Methods: A comprehensive assessment was employed to measure various symptom aspects. Clinical scales, full upper-limb kinematic measures, self-report, and task performance measures were assessed for nine writer's cramp patients at baseline. Patients received two BoNT-A injections and were then classified as responders or non-responders based on a quantified self-report measure. Baseline scores were compared between groups, across all measures, to determine which scores predicted a positive BoNT-A response. Results: Five of nine patients were responders. No kinematic measures differed predictably between groups. Analyses revealed three features that predicted a favorable response and separated the two groups: higher than average cramp severity and cramp frequency, and below average cramp latency. Discussion: Non-kinematic measures appear to be superior for making such predictions. Specifically, measures of cramp severity, frequency, and latency during performance of a specific set of writing and drawing tasks were predictive factors. Since kinematics were not used to determine the injection pattern and the injections were visually guided, it may still be possible to use individual patient kinematics for better outcomes.

  1. Protein docking prediction using predicted protein-protein interface

    Directory of Open Access Journals (Sweden)

    Li Bin

    2012-01-01

    Background: Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within the top ranks among alternative conformations. Results: We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction to guide protein docking. Since the accuracy of protein binding site prediction varies from case to case, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction that may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with the Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves docking prediction accuracy compared with docking without binding site prediction or with the binding site prediction used only as post-filtering. Conclusion: We have developed PI-LZerD, a pairwise docking algorithm which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy than alternative methods in a series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.

  2. Protein docking prediction using predicted protein-protein interface.

    Science.gov (United States)

    Li, Bin; Kihara, Daisuke

    2012-01-10

    Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within the top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction to guide protein docking. Since the accuracy of protein binding site prediction varies from case to case, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction that may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with the Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves docking prediction accuracy compared with docking without binding site prediction or with the binding site prediction used only as post-filtering. We have developed PI-LZerD, a pairwise docking algorithm which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy than alternative methods in a series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.

  3. Improving student success using predictive models and data visualisations

    Directory of Open Access Journals (Sweden)

    Hanan Ayad

    2012-08-01

    The need to educate a competitive workforce is a global problem. In the US, for example, despite billions of dollars spent to improve the educational system, approximately 35% of students never finish high school. The dropout rate among some demographic groups is as high as 50–60%. At the college level in the US, only 30% of students graduate from 2-year colleges in 3 years or less, and approximately 50% graduate from 4-year colleges in 5 years or less. A basic challenge in delivering global education, therefore, is improving student success, by which we mean improving retention, completion and graduation rates. In this paper we describe a Student Success System (S3) that provides a holistic, analytical view of student academic progress. The core of S3 is a flexible predictive modelling engine that uses machine intelligence and statistical techniques to identify at-risk students pre-emptively. S3 also provides a set of advanced data visualisations for reaching diagnostic insights and a case management tool for managing interventions. S3's open modular architecture will also allow integration and plug-ins with both open and proprietary software. Powered by learning analytics, S3 is intended as an end-to-end solution for identifying at-risk students, understanding why they are at risk, designing interventions to mitigate that risk and finally closing the feedback loop by tracking the efficacy of the applied intervention.

  4. Improving malignancy prediction in breast lesions with the combination of apparent diffusion coefficient and dynamic contrast-enhanced kinetic descriptors

    International Nuclear Information System (INIS)

    Nogueira, Luisa; Brandão, Sofia; Matos, Eduarda; Gouveia Nunes, Rita; Ferreira, Hugo Alexandre; Loureiro, Joana; Ramos, Isabel

    2015-01-01

    Aim: To assess how the joint use of the apparent diffusion coefficient (ADC) and kinetic parameters (uptake phase and delayed enhancement characteristics) from dynamic contrast-enhanced (DCE) imaging can boost the ability to predict breast lesion malignancy. Materials and methods: Breast magnetic resonance examinations including DCE and diffusion-weighted imaging (DWI) were performed on 51 women. The associations between kinetic parameters and ADC were evaluated and compared between lesion types. Models with a binary outcome of malignancy were studied using generalized estimating equations (GEE), with kinetic parameters and ADC values as malignancy predictors. Model accuracy was assessed using the corrected quasi-likelihood under the independence model criterion (QICC). The predicted probability of malignancy was estimated for the best model. Results: ADC values were significantly associated with kinetic parameters: medium and rapid uptake phase (p<0.001) and plateau and washout curve types (p=0.004). Comparison between lesion types showed significant differences for ADC (p=0.001), early phase (p<0.001), and curve type (p<0.001). The predicted probabilities of malignancy for the first ADC quartile (≤1.17×10⁻³ mm²/s) and persistent, plateau and washout curves were 54.6%, 86.9%, and 97.8%, respectively, and for the third ADC quartile (≥1.51×10⁻³ mm²/s) were 3.2%, 15.5%, and 54.8%, respectively. The predicted probability of malignancy was less than 5% for 18.8% of the lesions and greater than 33% for 50.7% of the lesions (24/35 lesions, corresponding to a malignancy rate of 68.6%). Conclusion: The best malignancy predictors were low ADCs and washout curves. ADC and kinetic parameters provide differentiated information on the microenvironment of the lesion, with joint models displaying improved predictive performance. -- Highlights: •ADC and kinetic parameters provide diverse information regarding lesion environment. •The best predictors

  5. The challenges of ESRD care in developing economies: sub-Saharan African opportunities for significant improvement.

    Science.gov (United States)

    Bamgboye, Ebun Ladipo

    Chronic kidney disease (CKD) is a significant cause of morbidity and mortality in sub-Saharan Africa. Along with other noncommunicable diseases like hypertension, diabetes, and heart disease, it poses a double burden on a region still struggling to cope with the scourge of communicable diseases like malaria, tuberculosis, HIV, and more recently Ebola. Causes of CKD in the region are predominantly glomerulonephritis and hypertension, although type 2 diabetes is also becoming a significant cause, as is retroviral disease. Patients are generally younger than in the developed world, and there is a significant male preponderance. Most patients are managed by hemodialysis, with peritoneal dialysis and kidney transplantation available in only a few countries in the region. Government funding and support for dialysis is often unavailable, and when available, often comes with restrictions. There is a dearth of trained manpower to treat the disease, and many countries have a limited number of units, which are often ill-equipped to deal adequately with the number of patients who require end-stage renal disease (ESRD) care in the region. Although there has been significant improvement compared with the situation even as recently as 10 years ago, there is potential for further progress that would significantly improve the outcomes of patients with ESRD in the region. The information in this review was obtained from a combination of renal registry reports (published and unpublished), published articles, responses to a questionnaire sent to nephrologists prior to the World Congress of Nephrology (WCN) in Cape Town, and from nephrologists attending the WCN in Cape Town (March 13 - 17, 2015).

  6. Improving diagnosis, prognosis and prediction by using biomarkers in CRC patients (Review).

    Science.gov (United States)

    Nikolouzakis, Taxiarchis Konstantinos; Vassilopoulou, Loukia; Fragkiadaki, Persefoni; Mariolis Sapsakos, Theodoros; Papadakis, Georgios Z; Spandidos, Demetrios A; Tsatsakis, Aristides M; Tsiaoussis, John

    2018-06-01

    Colorectal cancer (CRC) is among the most common cancers. In fact, it ranks third among the most frequently diagnosed cancers in men, after lung and prostate cancer, and second in women, after breast cancer. Moreover, its high mortality rates place it among the leading causes of cancer-related death worldwide. Thus, in order to help clinicians optimize their practice, it is crucial to introduce more effective tools that will improve not only early diagnosis, but also prediction of the most likely progression of the disease and of the response to chemotherapy. In that way, clinicians will be able to decrease both the morbidity and the mortality of their patients. Accordingly, colon cancer research has described numerous biomarkers for diagnostic, prognostic and predictive purposes that, either alone or as part of a panel, would help improve patients' clinical management. This review aims to describe the most accepted of the biomarkers proposed for use in CRC, grouped by the clinical specimen examined (tissue, faeces or blood), along with their limitations. Lastly, new insights into CRC monitoring will be discussed, presenting promising emerging biomarkers (telomerase activity, telomere length and micronuclei frequency).

  7. Can assimilation of crowdsourced data in hydrological modelling improve flood prediction?

    Science.gov (United States)

    Mazzoleni, Maurizio; Verlaan, Martin; Alfonso, Leonardo; Monego, Martina; Norbiato, Daniele; Ferri, Miche; Solomatine, Dimitri P.

    2017-02-01

    Monitoring stations have been used for decades to measure hydrological variables and better predict floods, and methods to incorporate these observations into mathematical water models have been developed alongside them. In recent years, continued technological advances, combined with the growing inclusion of citizens in participatory water resources management, have encouraged the rise of citizen science projects around the globe. This, in turn, has stimulated the spread of low-cost sensors that allow citizens to collect hydrological data in a more distributed way than classic static physical sensors do. However, two main disadvantages of such crowdsourced data are their irregular availability and their variable accuracy from sensor to sensor, which make them challenging to use in hydrological modelling. This study aims to demonstrate that streamflow data derived from crowdsourced water level observations can improve flood prediction when integrated into hydrological models. Two different hydrological models, applied to four case studies, are considered. Realistic (albeit synthetic) time series are used to represent crowdsourced data in all case studies. We find that data accuracy has much more influence on model results than the irregular frequency at which the streamflow data are assimilated. This study demonstrates that data collected by citizens, although asynchronous and inaccurate, can complement traditional networks of few accurate, static sensors and improve the accuracy of flood forecasts.
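
    The core assimilation idea in the abstract (weighting an incoming observation against the model estimate according to its accuracy, whenever one happens to arrive) can be sketched as a scalar Kalman-style update; the streamflow values, variances, and the simple variance-growth model below are all hypothetical:

```python
def assimilate(model_state, observation, obs_var, model_var):
    """Scalar Kalman-style update: blend the model estimate with a noisy
    observation, weighting each by the inverse of its error variance."""
    gain = model_var / (model_var + obs_var)   # trust accurate observations more
    updated = model_state + gain * (observation - model_state)
    updated_var = (1.0 - gain) * model_var
    return updated, updated_var

# Irregular stream of citizen reports: (time step, streamflow, obs variance);
# None marks model steps with no report available.
obs_stream = [None, (1, 52.0, 4.0), None, (3, 49.0, 25.0)]

state, var = 45.0, 9.0   # prior model streamflow estimate and its variance
for entry in obs_stream:
    if entry is None:
        var += 1.0       # model uncertainty grows between observations
        continue
    _, y, r = entry
    state, var = assimilate(state, y, r, var)
```

    The accurate report (variance 4.0) pulls the state strongly toward its value, while the inaccurate one (variance 25.0) barely moves it, mirroring the finding that data accuracy matters more than the irregular timing of assimilation.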

  8. Improvement of Surface Temperature Prediction Using SVR with MOGREPS Data for Short and Medium range over South Korea

    Science.gov (United States)

    Lim, S. J.; Choi, R. K.; Ahn, K. D.; Ha, J. C.; Cho, C. H.

    2014-12-01

    Since the Korea Meteorological Administration (KMA) began operating the Met Office Global and Regional Ensemble Prediction System (MOGREPS) with the introduction of the Unified Model (UM), many attempts have been made in recent years to improve the predictability of temperature forecasts. In this study, a post-processing method for MOGREPS surface temperature prediction is developed with machine learning over 52 locations in South Korea. A 60-day lag window was used as the training phase of a Support Vector Regression (SVR) surface temperature forecast model. The selected inputs for the SVR are the following: date and surface temperatures from numerical weather prediction (NWP, e.g. GDAPS), the 24 individual ensemble members, and the mean and median of the ensemble members, every 3 hours for 12 days. To verify the reliability of the SVR-based ensemble prediction (SVR-EP), 93 days (March 1 to May 31, 2014) were used. SVR-EP improved RMSE by 16% over the entire prediction period relative to conventional ensemble prediction (EP). In particular, for short-range (1-3 day) forecasts, SVR-EP achieved an 18.7% better RMSE. The mean temperature bias of SVR-EP and EP across all test locations was around 0.36°C and 1.36°C, respectively. SVR-EP is currently being extended with more rigorous sensitivity tests, such as lengthening the training phase and further optimizing the machine learning model.
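
    The 16% figure quoted above is a relative RMSE reduction; a minimal sketch of that comparison, with made-up forecasts and observations:

```python
def rmse(pred, obs):
    """Root-mean-square error between forecasts and observations."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def pct_rmse_improvement(rmse_new, rmse_base):
    """Percentage RMSE reduction, the score used to compare SVR-EP with EP."""
    return 100.0 * (rmse_base - rmse_new) / rmse_base

obs = [10.0, 12.0, 11.0, 13.0]   # observed surface temperatures (hypothetical)
ep = [11.0, 13.5, 10.0, 14.0]    # raw ensemble forecast (hypothetical)
svr = [10.5, 12.5, 10.8, 13.4]   # post-processed forecast (hypothetical)
gain = pct_rmse_improvement(rmse(svr, obs), rmse(ep, obs))
```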

  9. An enhanced topologically significant directed random walk in cancer classification using gene expression datasets

    Directory of Open Access Journals (Sweden)

    Choon Sen Seah

    2017-12-01

    Full Text Available Microarray technology has become one of the elementary tools for researchers to study the genomes of organisms. As the complexity and heterogeneity of cancer are increasingly appreciated through genomic analysis, cancer classification has become an important emerging application. Significant directed random walk has been proposed as a cancer classification approach with higher sensitivity of risk-gene prediction and higher classification accuracy. In this paper, the methodology and materials used for the experiment are presented. A tuning-parameter selection method and weight as a parameter are applied in the proposed approach. Gene expression datasets are used as the input, while a pathway dataset is used to build a directed graph, as reference data, to complete the bias process in the random walk approach. In addition, we demonstrate that our approach can improve sensitive predictions with higher accuracy and biologically meaningful classification results. A comparison between significant directed random walk and directed random walk shows the improvement in terms of sensitivity of prediction and accuracy of cancer classification.
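
    A minimal sketch of the underlying idea, a random walk with restart biased along directed pathway edges (toy graph and seed scores; the actual weighting scheme in the paper is more elaborate):

```python
def directed_random_walk(graph, seed_scores, restart=0.7, iters=100):
    """Random walk with restart on a directed gene graph.
    graph: {node: [successor, ...]}; seed_scores: restart distribution,
    e.g. normalized expression scores (an assumption for this sketch)."""
    nodes = list(graph)
    p = dict(seed_scores)
    for _ in range(iters):
        # restart mass goes back to the seed genes each step
        nxt = {n: restart * seed_scores.get(n, 0.0) for n in nodes}
        for n in nodes:
            succ = graph[n]
            if not succ:
                continue   # sink node: its walk mass leaks (fine for a sketch)
            share = (1.0 - restart) * p.get(n, 0.0) / len(succ)
            for s in succ:
                nxt[s] += share
        p = nxt
    return p

# Toy pathway graph: A regulates B and C, B regulates C.
graph = {"A": ["B", "C"], "B": ["C"], "C": []}
scores = directed_random_walk(graph, {"A": 1.0})
```

    C ends up ranked above B because it receives flow along two paths, which is the topological signal the approach exploits.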

  10. Risk prediction is improved by adding markers of subclinical organ damage to SCORE

    DEFF Research Database (Denmark)

    Sehestedt, Thomas; Jeppesen, Jørgen; Hansen, Tine W

    2010-01-01

    cardiovascular, anti-diabetic, or lipid-lowering treatment, aged 41, 51, 61, or 71 years, we measured traditional cardiovascular risk factors, left ventricular (LV) mass index, atherosclerotic plaques in the carotid arteries, carotid/femoral pulse wave velocity (PWV), and urine albumin/creatinine ratio (UACR......) and followed them for a median of 12.8 years. Eighty-one subjects died because of cardiovascular causes. Risk of cardiovascular death was, independently of SCORE, associated with LV hypertrophy [hazard ratio (HR) 2.2 (95% CI 1.2-4.0)], plaques [HR 2.5 (1.6-4.0)], UACR ≥ 90th percentile [HR 3.3 (1.......07). CONCLUSION: Subclinical organ damage predicted cardiovascular death independently of SCORE and the combination may improve risk prediction....

  11. Improving ELM-Based Service Quality Prediction by Concise Feature Extraction

    Directory of Open Access Journals (Sweden)

    Yuhai Zhao

    2015-01-01

    Full Text Available Web services often run in highly dynamic and changing environments, which generate huge volumes of data. It is thus impractical to monitor the change of every QoS parameter in order to trigger timely precautions, owing to the high computational cost of the process. To address this problem, this paper proposes an active service quality prediction method based on the extreme learning machine (ELM). First, we extract web service trace logs and QoS information from the service log and convert them into feature vectors. Second, the proposed EC rules make it possible to trigger QoS precautions as early as possible with high confidence; an efficient prefix-tree-based mining algorithm together with effective pruning rules is developed to mine such rules. Finally, we study how to extract a set of diversified features as a representative of all mined results. This problem is proved to be NP-hard, and a greedy algorithm is presented to approximate the optimal solution. Experimental results show that an ELM trained on the selected feature subsets can efficiently improve the reliability and the earliness of service quality prediction.

  12. Improving the Spatial Prediction of Soil Organic Carbon Stocks in a Complex Tropical Mountain Landscape by Methodological Specifications in Machine Learning Approaches.

    Directory of Open Access Journals (Sweden)

    Mareike Ließ

    Full Text Available Tropical forests are significant carbon sinks and their soils' carbon storage potential is immense. However, little is known about the soil organic carbon (SOC) stocks of tropical mountain areas, whose complex soil-landscapes and difficult accessibility pose a challenge to spatial analysis. The choice of methodology for spatial prediction is highly important for improving the expectedly poor model results when predictor-response correlations are low. Four aspects were considered to improve model performance in predicting the SOC stocks of the organic layer of a tropical mountain forest landscape: different spatial predictor settings, predictor selection strategies, various machine learning algorithms, and model tuning. Five machine learning algorithms (random forests, artificial neural networks, multivariate adaptive regression splines, boosted regression trees and support vector machines) were trained and tuned to predict SOC stocks from predictors derived from a digital elevation model and a satellite image. Topographical predictors were calculated with GIS search radii of 45 to 615 m. Finally, three predictor selection strategies were applied to the total set of 236 predictors. All machine learning algorithms, including model tuning and predictor selection, were compared via five repetitions of a tenfold cross-validation. The boosted regression tree algorithm yielded the overall best model. SOC stocks ranged from 0.2 to 17.7 kg m-2, displaying huge variability, with diffuse insolation and curvatures of different scales guiding the spatial pattern. Predictor selection and model tuning improved predictive performance for all five machine learning algorithms. The rather low number of selected predictors favours forward over backward selection procedures. Choosing predictors by their individual performance was outperformed by the two procedures that accounted for predictor interaction.
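
    The comparison protocol above (five repetitions of a tenfold cross-validation) can be sketched as an index generator; the sample size below is arbitrary:

```python
import random

def repeated_kfold(n, k=10, repeats=5, seed=0):
    """Yield (train, test) index lists for repeated k-fold cross-validation,
    reshuffling the data before each repetition."""
    rng = random.Random(seed)
    for _ in range(repeats):
        idx = list(range(n))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]   # k disjoint folds
        for i in range(k):
            test = folds[i]
            train = [j for f in folds[:i] + folds[i + 1:] for j in f]
            yield train, test

splits = list(repeated_kfold(100))   # 5 repeats x 10 folds = 50 splits
```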

  13. Prediction of time trends in recovery of cognitive function after mild head injury

    DEFF Research Database (Denmark)

    Müller, Kay; Ingebrigtsen, Tor; Wilsgaard, Tom

    2009-01-01

    . There was significant improvement of performance after 6 months. APOE-epsilon4 genotype was the only independent factor significantly predicting less improvement. CONCLUSION: The presence of the APOE-epsilon4 allele predicts less recovery of cognitive function after mild head injury....... change. RESULTS: A Glasgow Coma Scale score of less than 15, traumatic brain injury demonstrated with computed tomography, magnetic resonance imaging, and serum S-100B greater than 0.14 microg/L predicted impaired cognitive performance both at baseline and after 6 months; APOE genotype did not...

  14. Improved Transient Response Estimations in Predicting 40 Hz Auditory Steady-State Response Using Deconvolution Methods

    Directory of Open Access Journals (Sweden)

    Xiaodan Tan

    2017-12-01

    with variable weights in three templates. The significantly improved prediction accuracy of ASSR achieved by MSAD strongly supports the linear superposition mechanism of ASSR if an accurate template of transient AEPs can be reconstructed. The capacity in obtaining both ASSR and its underlying transient components accurately and simultaneously has the potential to contribute significantly to diagnosis of patients with neuropsychiatric disorders.

  15. Improving behavioral performance under full attention by adjusting response criteria to changes in stimulus predictability.

    Science.gov (United States)

    Katzner, Steffen; Treue, Stefan; Busse, Laura

    2012-09-04

    One of the key features of active perception is the ability to predict critical sensory events. Humans and animals can implicitly learn statistical regularities in the timing of events and use them to improve behavioral performance. Here, we used a signal detection approach to investigate whether such improvements in performance result from changes of perceptual sensitivity or rather from adjustments of a response criterion. In a regular sequence of briefly presented stimuli, human observers performed a noise-limited motion detection task by monitoring the stimulus stream for the appearance of a designated target direction. We manipulated target predictability through the hazard rate, which specifies the likelihood that a target is about to occur, given it has not occurred so far. Analyses of response accuracy revealed that improvements in performance could be accounted for by adjustments of the response criterion; a growing hazard rate was paralleled by an increasing tendency to report the presence of a target. In contrast, the hazard rate did not affect perceptual sensitivity. Consistent with previous research, we also found that reaction time decreases as the hazard rate grows. A simple rise-to-threshold model could well describe this decrease and attribute predictability effects to threshold adjustments rather than changes in information supply. We conclude that, even under conditions of full attention and constant perceptual sensitivity, behavioral performance can be optimized by dynamically adjusting the response criterion to meet ongoing changes in the likelihood of a target.
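
    The hazard rate manipulated above has a simple discrete form: the probability that the target occurs now, given that it has not occurred yet. A sketch with a uniform prior over target times (numbers illustrative, not from the study):

```python
def hazard_rate(p):
    """Discrete hazard: probability the target occurs at step t given it
    has not occurred before t.  p[t] is the prior probability of step t."""
    rates, remaining = [], 1.0
    for pt in p:
        rates.append(pt / remaining if remaining > 0 else 0.0)
        remaining -= pt
    return rates

# Uniform prior over four possible target times -> a growing hazard rate:
# 0.25, 1/3, 0.5, 1.0 (certainty grows as earlier slots pass unused).
h = hazard_rate([0.25, 0.25, 0.25, 0.25])
```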

  16. Does early change predict long-term (6 months) improvements in subjects who receive manual therapy for low back pain?

    Science.gov (United States)

    Cook, Chad; Petersen, Shannon; Donaldson, Megan; Wilhelm, Mark; Learman, Ken

    2017-09-01

    Early change is commonly assessed for manual therapy interventions and has been used to determine treatment appropriateness. However, current studies have only explored the relationship of between or within-session changes and short-/medium-term outcomes. The goal of this study was to determine whether pain changes after two weeks of pragmatic manual therapy could predict those participants with chronic low back pain who demonstrate continued improvements at 6-month follow-up. This study was a retrospective observational design. Univariate logistic regression analyses were performed using a 33% and a 50% pain change to predict improvement. Those who experienced a ≥33% pain reduction by 2 weeks had 6.98 (95% CI = 1.29, 37.53) times higher odds of 50% improvement on the GRoC and 4.74 (95% CI = 1.31, 17.17) times higher odds of 50% improvement on the ODI (at 6 months). Subjects who reported a ≥50% pain reduction at 2 weeks had 5.98 (95% CI = 1.56, 22.88) times higher odds of a 50% improvement in the GRoC and 3.99 (95% CI = 1.23, 12.88) times higher odds of a 50% improvement in the ODI (at 6 months). Future studies may investigate whether a change in plan of care is beneficial for patients who are not showing early improvement predictive of a good long-term outcome.
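
    Odds ratios with confidence intervals like those reported above can be computed from a 2x2 table with a Wald interval; the counts below are purely illustrative, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = early responders who improved, b = early responders who did not,
    c = non-responders who improved, d = non-responders who did not."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(20, 10, 8, 24)   # made-up counts
```

    A lower CI bound above 1 is what makes an odds ratio such as 6.98 (95% CI 1.29, 37.53) statistically meaningful despite its width.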

  17. Evoked Emotions Predict Food Choice

    NARCIS (Netherlands)

    Dalenberg, Jelle R.; Gutjar, Swetlana; ter Horst, Gert J.; de Graaf, Kees; Renken, Remco J.; Jager, Gerry

    2014-01-01

    In the current study we show that non-verbal food-evoked emotion scores significantly improve food choice prediction over liking scores alone. Previous research has shown that liking measures correlate with choice; however, liking is not a strong predictor of food choice in real-life environments.

  18. Exploring the genetic architecture and improving genomic prediction accuracy for mastitis and milk production traits in dairy cattle by mapping variants to hepatic transcriptomic regions responsive to intra-mammary infection.

    Science.gov (United States)

    Fang, Lingzhao; Sahana, Goutam; Ma, Peipei; Su, Guosheng; Yu, Ying; Zhang, Shengli; Lund, Mogens Sandø; Sørensen, Peter

    2017-05-12

    A better understanding of the genetic architecture of complex traits can contribute to improve genomic prediction. We hypothesized that genomic variants associated with mastitis and milk production traits in dairy cattle are enriched in hepatic transcriptomic regions that are responsive to intra-mammary infection (IMI). Genomic markers [e.g. single nucleotide polymorphisms (SNPs)] from those regions, if included, may improve the predictive ability of a genomic model. We applied a genomic feature best linear unbiased prediction model (GFBLUP) to implement the above strategy by considering the hepatic transcriptomic regions responsive to IMI as genomic features. GFBLUP, an extension of GBLUP, includes a separate genomic effect of SNPs within a genomic feature, and allows differential weighting of the individual marker relationships in the prediction equation. Since GFBLUP is computationally intensive, we investigated whether a SNP set test could be a computationally fast way to preselect predictive genomic features. The SNP set test assesses the association between a genomic feature and a trait based on single-SNP genome-wide association studies. We applied these two approaches to mastitis and milk production traits (milk, fat and protein yield) in Holstein (HOL, n = 5056) and Jersey (JER, n = 1231) cattle. We observed that a majority of genomic features were enriched in genomic variants that were associated with mastitis and milk production traits. Compared to GBLUP, the accuracy of genomic prediction with GFBLUP was marginally improved (3.2 to 3.9%) in within-breed prediction. The highest increase (164.4%) in prediction accuracy was observed in across-breed prediction. The significance of genomic features based on the SNP set test were correlated with changes in prediction accuracy of GFBLUP (P layers of biological knowledge to provide novel insights into the biological basis of complex traits, and to improve the accuracy of genomic prediction. The SNP set

  19. Improving prediction of fall risk among nursing home residents using electronic medical records.

    Science.gov (United States)

    Marier, Allison; Olsho, Lauren E W; Rhodes, William; Spector, William D

    2016-03-01

    Falls are physically and financially costly, but may be preventable with targeted intervention. The Minimum Data Set (MDS) is one potential source of information on fall risk factors among nursing home residents, but its limited breadth and relatively infrequent updates may limit its practical utility. Richer, more frequently updated data from electronic medical records (EMRs) may improve ability to identify individuals at highest risk for falls. The authors applied a repeated events survival model to analyze MDS 3.0 and EMR data for 5129 residents in 13 nursing homes within a single large California chain that uses a centralized EMR system from a leading vendor. Estimated regression parameters were used to project resident fall probability. The authors examined the proportion of observed falls within each projected fall risk decile to assess improvements in predictive power from including EMR data. In a model incorporating fall risk factors from the MDS only, 28.6% of observed falls occurred among residents in the highest projected risk decile. In an alternative specification incorporating more frequently updated measures for the same risk factors from the EMR data, 32.3% of observed falls occurred among residents in the highest projected risk decile, a 13% increase over the base MDS-only specification. Incorporating EMR data improves ability to identify those at highest risk for falls relative to prediction using MDS data alone. These improvements stem chiefly from the greater frequency with which EMR data are updated, with minimal additional gains from availability of additional risk factor variables. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
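
    The evaluation used above, the share of observed falls captured in the highest projected risk decile, can be sketched as follows (toy scores and outcomes):

```python
def top_decile_capture(risk_scores, outcomes):
    """Fraction of observed events that fall in the highest projected
    risk decile (1 = event occurred, 0 = it did not)."""
    ranked = sorted(zip(risk_scores, outcomes), key=lambda x: -x[0])
    n_top = max(1, len(ranked) // 10)
    events_top = sum(y for _, y in ranked[:n_top])
    total = sum(outcomes)
    return events_top / total if total else 0.0

# 20 residents: higher score -> higher projected fall risk; 1 = fell.
scores = [i / 20 for i in range(20)]
falls = [0] * 14 + [0, 1, 0, 1, 1, 1]
share = top_decile_capture(scores, falls)   # 2 of 4 falls in the top decile
```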

  20. Genomic selection: genome-wide prediction in plant improvement.

    Science.gov (United States)

    Desta, Zeratsion Abera; Ortiz, Rodomiro

    2014-09-01

    Association analysis is used to measure relations between markers and quantitative trait loci (QTL). Such estimates ignore genes with small effects that underpin quantitative traits. By contrast, genome-wide selection estimates marker effects across the whole genome on the target population, based on a prediction model developed in the training population (TP). Whole-genome prediction models estimate the effects of all markers at all loci and so capture small QTL effects. Here, we review several genomic selection (GS) models with respect to both prediction accuracy and genetic gain from selection. Phenotypic selection or marker-assisted breeding protocols can be replaced by selection based on whole-genome predictions, in which phenotyping updates the model to build up prediction accuracy. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. The film boiling look-up table: an improvement in predicting post-chf temperatures

    International Nuclear Information System (INIS)

    Groeneveld, D.C.; Leung, L.K.H.; Vasic, A.Z.; Guo, Y.J.; El Nakla, M.; Cheng, S.C.

    2002-01-01

    During the past 50 years more than 60 film boiling prediction methods have been proposed (Groeneveld and Leung, 2000). These prediction methods generally are applicable over limited ranges of flow conditions and do not provide reasonable predictions when extrapolated well outside the range of their respective database. Leung et al. (1996, 1997) and Kirillov et al. (1996) have proposed the use of a film-boiling look-up table as an alternative to the many models, equations and correlations for the inverted annular film boiling (IAFB) and the dispersed flow film-boiling (DFFB) regime. The film-boiling look-up table is a logical follow-up to the development of the successful CHF look-up table (Groeneveld et al., 1996). It is basically a normalized data bank of heat-transfer coefficients for discrete values of pressure, mass flux, quality and heat flux or surface-temperature. The look-up table proposed by Leung et al. (1996, 1997), and referred to as PDO-LW-96, was based on 14,687 data and predicted the surface temperature with an average error of 1.2% and an rms error of 6.73%. The heat-transfer coefficient was predicted with an average error of -4.93% and an rms error of 16.87%. Leung et al. clearly showed that the look-up table approach, as a general predictive tool for film-boiling heat transfer, was superior to the correlation or model approach. Error statistics were not provided for the look-up table proposed by Kirillov et al. (1996). This paper reviews the look-up table approach and describes improvements to the derivation of the film-boiling look-up table. 
These improvements include: (i) a larger data base, (ii) a wider range of thermodynamic qualities, (iii) use of the wall temperature instead of the heat flux as an independent parameter, (iv) employment of fully-developed film-boiling data only for the derivation of the look-up table, (v) a finer subdivision and thus more table entries, (vi) smoother table, and (vii) use of the best of five prediction methods
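
    The look-up table approach amounts to multi-linear interpolation over a normalized data bank. A minimal two-axis sketch (pressure and quality only, with made-up heat-transfer coefficients; the real table also spans mass flux and wall temperature):

```python
from bisect import bisect_right

def lut_lookup(xs, ys, table, x, y):
    """Bilinear interpolation in a 2-D look-up table.  xs and ys are the
    sorted axis entries; table[i][j] holds the value at (xs[i], ys[j])."""
    def bracket(axis, v):
        i = min(max(bisect_right(axis, v) - 1, 0), len(axis) - 2)
        t = (v - axis[i]) / (axis[i + 1] - axis[i])
        return i, min(max(t, 0.0), 1.0)   # clamp instead of extrapolating
    i, tx = bracket(xs, x)
    j, ty = bracket(ys, y)
    f00, f01 = table[i][j], table[i][j + 1]
    f10, f11 = table[i + 1][j], table[i + 1][j + 1]
    return (f00 * (1 - tx) * (1 - ty) + f10 * tx * (1 - ty)
            + f01 * (1 - tx) * ty + f11 * tx * ty)

xs = [100.0, 200.0]          # pressure entries (illustrative)
ys = [0.2, 0.6]              # quality entries (illustrative)
table = [[1000.0, 1400.0],   # heat-transfer coefficients at grid points
         [1200.0, 1800.0]]   # (made up, not from the PDO-LW-96 table)
h = lut_lookup(xs, ys, table, 150.0, 0.4)
```

    Item (v) in the list above, a finer subdivision, simply adds more axis entries so each interpolation cell spans a narrower range of conditions.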

  2. An Improved Algorithm for Predicting Free Recalls

    Science.gov (United States)

    Laming, Donald

    2008-01-01

    Laming [Laming, D. (2006). "Predicting free recalls." "Journal of Experimental Psychology: Learning, Memory, and Cognition," 32, 1146-1163] has shown that, in a free-recall experiment in which the participants rehearsed out loud, entire sequences of recalls could be predicted, to a useful degree of precision, from the prior sequences of stimuli…

  3. Predictive control strategy of a gas turbine for improvement of combined cycle power plant dynamic performance and efficiency.

    Science.gov (United States)

    Mohamed, Omar; Wang, Jihong; Khalil, Ashraf; Limhabrash, Marwan

    2016-01-01

    This paper presents a novel strategy for applying model predictive control (MPC) to a large gas turbine power plant, as part of our research progress towards improving plant thermal efficiency and load-frequency control performance. A generalized state-space model of a large gas turbine covering the whole steady operational range is identified using the subspace identification method, with closed-loop data as input to the identification algorithm. The model is then used to develop an MPC that is integrated into the plant's existing control strategy. The principle of the strategy is to feed the reference signals of the pilot valve, the natural gas valve, and the compressor pressure-ratio controller with the optimized decisions given by the MPC, instead of applying the control signals directly. If the set points for the compressor controller and turbine valves are sent in a timely manner, the plant retains more kinetic energy that can be released for faster output response, and overall system efficiency is improved. Simulation results illustrate the feasibility of the proposed application, which achieves significant improvement in frequency variations and load-following capability, translating into an improvement in overall combined-cycle thermal efficiency of around 1.1% compared with the existing strategy.
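
    The receding-horizon principle behind MPC (optimize a short control sequence, apply only its first move, then repeat) can be illustrated on a toy scalar plant; the dynamics, gains and cost weights below are invented and unrelated to the actual turbine model, which the paper handles with subspace identification and a proper QP solver:

```python
from itertools import product

def mpc_step(x, setpoint, a=0.8, b=0.5, horizon=3, u_options=(-1.0, 0.0, 1.0)):
    """Brute-force receding-horizon MPC for the scalar plant x' = a*x + b*u:
    enumerate discretized control sequences over a short horizon and return
    the first move of the cheapest one."""
    best_u, best_cost = 0.0, float("inf")
    for seq in product(u_options, repeat=horizon):
        xi, cost = x, 0.0
        for u in seq:
            xi = a * xi + b * u
            cost += (xi - setpoint) ** 2 + 0.01 * u ** 2   # tracking + effort
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

x = 0.0
for _ in range(20):               # closed loop: apply only the first move
    u = mpc_step(x, setpoint=2.0)
    x = 0.8 * x + 0.5 * u
```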

  4. Predictive Maintenance (PdM) Centralization for Significant Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Dale

    2010-09-15

    Cost-effective predictive maintenance (PdM) technologies and basic energy calculations can mine energy savings from processes or maintenance activities. Centralizing and packaging this information correctly empowers facility maintenance and reliability professionals to build the financial justification and support for strategies and personnel needed to weather global economic downturns and competition. Attendees will learn how to: systematically build a 'pilot project' for applying PdM and tracking systems; break down a typical electrical bill to calculate energy savings; and use return on investment (ROI) calculations to identify the best and highest-value options, strategies and tips for substantiating energy-reduction maintenance strategies.
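
    The bill breakdown and ROI arithmetic described above reduces to a few lines; every figure below (load, hours, rate, savings fraction, project cost) is illustrative:

```python
def annual_energy_cost(kw, hours_per_year, rate_per_kwh):
    """Annual energy cost from average load, the basic bill breakdown."""
    return kw * hours_per_year * rate_per_kwh

def simple_payback_years(investment, annual_savings):
    """Years for the PdM investment to pay for itself."""
    return investment / annual_savings

# A 50 kW motor running 6000 h/yr at $0.10/kWh; a PdM fix (e.g. correcting
# misalignment found by vibration analysis) trims consumption by 5%.
baseline = annual_energy_cost(50, 6000, 0.10)   # ~ $30,000/yr
savings = 0.05 * baseline                       # ~ $1,500/yr
payback = simple_payback_years(3000, savings)   # ~ 2 years
```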

  5. An ensemble machine learning approach to predict survival in breast cancer.

    Science.gov (United States)

    Djebbari, Amira; Liu, Ziying; Phan, Sieu; Famili, Fazel

    2008-01-01

    Current breast cancer predictive signatures are not unique. Can we use this fact to our advantage to improve prediction? From the machine learning perspective, it is well known that combining multiple classifiers can improve classification performance. We propose an ensemble machine learning approach which consists of choosing feature subsets and learning predictive models from them. We then combine models based on certain model fusion criteria and we also introduce a tuning parameter to control sensitivity. Our method significantly improves classification performance with a particular emphasis on sensitivity which is critical to avoid misclassifying poor prognosis patients as good prognosis.
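
    The fusion step can be sketched as probability averaging, with the decision threshold acting as the sensitivity-controlling tuning parameter; the base models below are stand-in callables, not the actual learned signatures:

```python
def ensemble_predict(models, x, threshold=0.5):
    """Fuse base classifiers by averaging their poor-prognosis probabilities.
    Lowering the threshold trades specificity for sensitivity, flagging
    more patients as poor prognosis."""
    score = sum(m(x) for m in models) / len(models)
    return 1 if score >= threshold else 0   # 1 = poor prognosis

# Three toy models, imagined as trained on different feature subsets.
models = [lambda x: 0.6, lambda x: 0.4, lambda x: 0.45]
default = ensemble_predict(models, None)          # mean ~0.483 -> good prognosis
sensitive = ensemble_predict(models, None, 0.4)   # lower bar -> poor prognosis
```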

  6. Fourier transform wavefront control with adaptive prediction of the atmosphere.

    Science.gov (United States)

    Poyneer, Lisa A; Macintosh, Bruce A; Véran, Jean-Pierre

    2007-09-01

    Predictive Fourier control is a temporal power-spectral-density-based adaptive method for adaptive optics that predicts the atmosphere under the assumption of frozen flow. The predictive controller is based on Kalman filtering and a Fourier decomposition of atmospheric turbulence using the Fourier transform reconstructor. It provides a stable way to compensate for arbitrary numbers of atmospheric layers. For each Fourier mode, efficient and accurate algorithms estimate the necessary atmospheric parameters from closed-loop telemetry and determine the predictive filter, adjusting as conditions change. This prediction improves atmospheric rejection, leading to significant improvements in system performance. For a 48x48 actuator system operating at 2 kHz, five-layer prediction for all modes is achievable in under 2x10^9 floating-point operations/s.
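
    Per Fourier mode, the controller reduces to a small Kalman filter. A scalar sketch with an AR(1) stand-in for frozen-flow dynamics (the parameters phi, q, r are invented, and the real filter is multi-layer):

```python
def kalman_predict_mode(measurements, phi, q, r):
    """One-step-ahead Kalman prediction of a single Fourier mode of the
    phase, modelled as AR(1): m[t+1] = phi * m[t] + noise(q), observed
    with measurement noise variance r."""
    est, p, preds = 0.0, 1.0, []
    for z in measurements:
        # update with the new closed-loop measurement
        k = p / (p + r)
        est = est + k * (z - est)
        p = (1 - k) * p
        # predict the mode one frame ahead (frozen-flow translation)
        est, p = phi * est, phi * phi * p + q
        preds.append(est)
    return preds

preds = kalman_predict_mode([1.0, 1.0, 1.0], phi=0.99, q=0.01, r=0.1)
```

    As measurements accumulate, the predictions converge toward the persistent mode amplitude, which is what improves atmospheric rejection over a non-predictive integrator.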

  7. Prognostic durability of liver fibrosis tests and improvement in predictive performance for mortality by combining tests.

    Science.gov (United States)

    Bertrais, Sandrine; Boursier, Jérôme; Ducancelle, Alexandra; Oberti, Frédéric; Fouchard-Hubert, Isabelle; Moal, Valérie; Calès, Paul

    2017-06-01

    There is currently no recommended time interval between noninvasive fibrosis measurements for monitoring chronic liver diseases. We determined how long a single liver fibrosis evaluation may accurately predict mortality, and assessed whether combining tests improves prognostic performance. We included 1559 patients with chronic liver disease and available baseline liver stiffness measurement (LSM) by Fibroscan, aspartate aminotransferase to platelet ratio index (APRI), FIB-4, Hepascore, and FibroMeter V2G. Median follow-up was 2.8 years, during which 262 (16.8%) patients died, with 115 liver-related deaths. All fibrosis tests were able to predict mortality, although APRI (and FIB-4 for liver-related mortality) showed lower overall discriminative ability than the other tests (differences in Harrell's C-index: P fibrosis, 1 year in patients with significant fibrosis, and liver disease (MELD) score testing sets. In the training set, blood tests and LSM were independent predictors of all-cause mortality. The best-fit multivariate model included age, sex, LSM, and FibroMeter V2G with C-index = 0.834 (95% confidence interval, 0.803-0.862). The prognostic model for liver-related mortality included the same covariates with C-index = 0.868 (0.831-0.902). In the testing set, the multivariate models had higher prognostic accuracy than FibroMeter V2G or LSM alone for all-cause mortality and FibroMeter V2G alone for liver-related mortality. The prognostic durability of a single baseline fibrosis evaluation depends on the liver fibrosis level. Combining LSM with a blood fibrosis test improves mortality risk assessment. © 2016 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.

  8. An Improved Bacterial-Foraging Optimization-Based Machine Learning Framework for Predicting the Severity of Somatization Disorder

    Directory of Open Access Journals (Sweden)

    Xinen Lv

    2018-02-01

    Full Text Available It is of great clinical significance to establish an accurate intelligent model to diagnose somatization disorder in community correctional personnel. In this study, a novel machine learning framework is proposed to predict the severity of somatization disorder in community correction personnel. The core of this framework is to adopt improved bacterial foraging optimization (IBFO) to optimize two key parameters (the penalty coefficient and the kernel width) of a kernel extreme learning machine (KELM) and build an IBFO-based KELM (IBFO-KELM) for the diagnosis of somatization disorder patients. The main innovation of the IBFO-KELM model is the introduction of opposition-based learning strategies into traditional bacterial foraging optimization, which increases the diversity of bacterial species, keeps a uniform distribution of individuals in the initial population, and improves both the convergence rate of the BFO optimization process and the probability of escaping from local optimal solutions. To verify the effectiveness of the proposed method, 10-fold cross-validation based on data from a symptom self-assessment scale (SCL-90) is used to compare IBFO-KELM with BFO-KELM (based on the original bacterial foraging optimization), GA-KELM (based on a genetic algorithm), PSO-KELM (based on particle swarm optimization) and Grid-KELM (based on grid search). The experimental results show that the proposed IBFO-KELM prediction model outperforms the other methods in terms of classification accuracy, Matthews correlation coefficient (MCC), sensitivity and specificity. It can distinguish very well between severe and mild somatization disorder and can assist psychologists and clinicians in clinical diagnosis.

  9. An Improved User Selection Algorithm in Multiuser MIMO Broadcast with Channel Prediction

    Science.gov (United States)

    Min, Zhi; Ohtsuki, Tomoaki

    In multiuser MIMO-BC (Multiple-Input Multiple-Output Broadcasting) systems, user selection is important for achieving multiuser diversity. The optimal user selection algorithm tries all combinations of users to find the group that achieves the multiuser diversity, but its high calculation cost prevents its implementation. Instead, suboptimal user selection algorithms based on the semiorthogonality of user channel vectors have been proposed. The purpose of this paper is to achieve multiuser diversity with a small amount of calculation. To this end, we propose a user selection algorithm that improves the orthogonality of the selected user group. We also apply a channel prediction technique to the MIMO-BC system to obtain more accurate channel information at the transmitter. Simulation results show that channel prediction improves the accuracy of the channel information used for user selection, and that the proposed user selection algorithm achieves higher sum-rate capacity than the SUS (Semiorthogonal User Selection) algorithm. We also discuss the setting of the algorithm threshold. An analysis of calculation complexity, measured by the number of complex multiplications, shows that the proposed algorithm has a complexity almost equal to that of the SUS algorithm, both much lower than that of the optimal user selection algorithm.
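
    A greedy semiorthogonal selection in the spirit of the SUS algorithm can be sketched as follows (real-valued toy channels for brevity; the actual algorithm works on complex channel vectors with a tuned threshold):

```python
def sus_select(channels, n_select, alpha=0.4):
    """Greedy semiorthogonal user selection sketch: repeatedly pick the user
    whose channel has the largest component orthogonal to the span of the
    already-selected channels, then drop users too correlated with it."""
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    def norm(u): return dot(u, u) ** 0.5

    selected, basis = [], []
    candidates = dict(channels)
    while candidates and len(selected) < n_select:
        def residual(h):
            # Gram-Schmidt: remove the part lying in the selected subspace
            r = list(h)
            for b in basis:
                c = dot(r, b)
                r = [ri - c * bi for ri, bi in zip(r, b)]
            return r
        best = max(candidates, key=lambda u: norm(residual(candidates[u])))
        h = candidates.pop(best)
        r = residual(h)
        selected.append(best)
        basis.append([x / norm(r) for x in r])
        # keep only users sufficiently orthogonal to the new selection
        candidates = {u: v for u, v in candidates.items()
                      if abs(dot(v, h)) / (norm(v) * norm(h)) < alpha}
    return selected

users = {"u1": [1.0, 0.0], "u2": [0.95, 0.1], "u3": [0.0, 0.8]}
picked = sus_select(users, n_select=2)   # u2 is dropped as nearly parallel to u1
```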

  10. An improved distance-to-dose correlation for predicting bladder and rectum dose-volumes in knowledge-based VMAT planning for prostate cancer

    Science.gov (United States)

    Wall, Phillip D. H.; Carver, Robert L.; Fontenot, Jonas D.

    2018-01-01

    The overlap volume histogram (OVH) is an anatomical metric commonly used to quantify the geometric relationship between an organ at risk (OAR) and target volume when predicting expected dose-volumes in knowledge-based planning (KBP). This work investigated the influence of additional variables contributing to variations in the assumed linear DVH-OVH correlation for the bladder and rectum in VMAT plans of prostate patients, with the goal of increasing prediction accuracy and achievability of knowledge-based planning methods. VMAT plans were retrospectively generated for 124 prostate patients using multi-criteria optimization. DVHs quantified patient dosimetric data while OVHs quantified patient anatomical information. The DVH-OVH correlations were calculated for fractional bladder and rectum volumes of 30, 50, 65, and 80%. Correlations between potential influencing factors and dose were quantified using the Pearson product-moment correlation coefficient (R). Factors analyzed included the derivative of the OVH, prescribed dose, PTV volume, bladder volume, rectum volume, and in-field OAR volume. Out of the selected factors, only the in-field bladder volume (mean R = 0.86) showed a strong correlation with bladder doses. Similarly, only the in-field rectal volume (mean R = 0.76) showed a strong correlation with rectal doses. Therefore, an OVH formalism accounting for in-field OAR volumes was developed to determine the extent to which it improved the DVH-OVH correlation. Including the in-field factor improved the DVH-OVH correlation, with the mean R values over the fractional volumes studied improving from -0.79 to -0.85 and from -0.82 to -0.86 for the bladder and rectum, respectively. A re-planning study was performed on 31 randomly selected database patients to verify the increased accuracy of KBP dose predictions by accounting for bladder and rectum volume within treatment fields. The in-field OVH led to significantly more precise
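
    The OVH idea, and the in-field restriction, can be illustrated with a toy voxel model. The 2-D geometry and field boundary below are entirely hypothetical; a clinical OVH is computed on 3-D voxel grids.

```python
import math

def min_dist_to_target(p, target):
    """Distance from one OAR voxel to the nearest target voxel."""
    return min(math.dist(p, t) for t in target)

def ovh(oar_voxels, target_voxels, radii):
    """Overlap volume histogram: for each expansion radius r, the
    fraction of OAR voxels whose distance to the target is <= r."""
    d = [min_dist_to_target(p, target_voxels) for p in oar_voxels]
    return [sum(di <= r for di in d) / len(d) for r in radii]

# Hypothetical 2-D toy geometry: target near the origin, OAR voxels in
# a line, and a treatment-field mask restricting which voxels count.
target = [(0.0, 0.0), (1.0, 0.0)]
oar = [(2.0, 0.0), (3.0, 0.0), (6.0, 0.0), (9.0, 0.0)]

def in_field(p):
    return p[0] <= 5.0  # stand-in for the treatment-field boundary

oar_in_field = [p for p in oar if in_field(p)]

print(ovh(oar, target, [2.5, 5.5]))           # → [0.5, 0.75]
print(ovh(oar_in_field, target, [2.5, 5.5]))  # → [1.0, 1.0]
```

    Restricting the histogram to in-field voxels changes the curve, which is the mechanism by which the in-field formalism tightens the DVH-OVH correlation.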

  11. Prostate cancer volume adds significantly to prostate-specific antigen in the prediction of early biochemical failure after external beam radiation therapy

    International Nuclear Information System (INIS)

    D'Amico, Anthony V.; Propert, Kathleen J.

    1996-01-01

    Purpose: A new clinical pretreatment quantity that closely approximates the true prostate cancer volume is defined. Methods and Materials: The cancer-specific prostate-specific antigen (PSA), PSA density, prostate cancer volume (V_Ca), and the volume fraction of the gland involved with carcinoma (V_Ca fx) were calculated for 227 prostate cancer patients managed definitively with external beam radiation therapy:
    1. PSA density = PSA / ultrasound prostate gland volume
    2. Cancer-specific PSA = PSA - [PSA from benign epithelial tissue]
    3. V_Ca = cancer-specific PSA / [PSA in serum per cm³ of cancer]
    4. V_Ca fx = V_Ca / ultrasound prostate gland volume
    A Cox multiple regression analysis was used to test whether any of these clinical pretreatment parameters added significantly to PSA in predicting early postradiation PSA failure. Results: The prostate cancer volume (p = 0.039) and the volume fraction of the gland involved by carcinoma (p = 0.035) added significantly to PSA in predicting postradiation PSA failure. Conversely, the PSA density and the cancer-specific PSA did not add significantly (p > 0.05) to PSA in predicting postradiation PSA failure. The 20-month actuarial PSA failure-free rates for patients with calculated tumor volumes of ≤0.5 cm³, 0.5-4.0 cm³, and >4.0 cm³ were 92%, 80%, and 47%, respectively (p = 0.00004). Conclusion: The volume of prostate cancer (V_Ca) and the resulting volume fraction of cancer both added significantly to PSA in their ability to predict early postradiation PSA failure. These new parameters may be used to select patients in prospective randomized trials that examine the efficacy of combining radiation and androgen ablative therapy in patients with clinically localized disease who are at high risk for early postradiation PSA failure
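
    The four definitions above translate directly into code. The numerical inputs below are hypothetical, chosen only to exercise the formulas, not values from the study.

```python
def cancer_volume(psa, benign_psa, psa_per_cm3_cancer, gland_volume_cm3):
    """Pretreatment quantities from the record's four definitions."""
    psa_density = psa / gland_volume_cm3
    cancer_specific_psa = psa - benign_psa
    v_ca = cancer_specific_psa / psa_per_cm3_cancer  # cm^3 of cancer
    v_ca_fx = v_ca / gland_volume_cm3                # volume fraction
    return psa_density, cancer_specific_psa, v_ca, v_ca_fx

# Hypothetical example: PSA 10 ng/mL, 4 ng/mL attributable to benign
# epithelium, 3 ng/mL of serum PSA per cm^3 of cancer, 40 cm^3 gland.
d, cs, v, fx = cancer_volume(10.0, 4.0, 3.0, 40.0)
print(round(v, 2), round(fx, 3))  # → 2.0 0.05
```

    With these inputs the estimated tumor volume of 2.0 cm³ would fall in the intermediate (0.5-4.0 cm³) prognostic group reported above.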

  12. Attributing Predictable Signals at Subseasonal Timescales

    Science.gov (United States)

    Shelly, A.; Norton, W.; Rowlands, D.; Beech-Brandt, J.

    2016-12-01

    Subseasonal forecasts offer significant economic value for the management of energy infrastructure and the associated financial markets. Models are now accurate enough to provide, on some occasions, good forecasts in the subseasonal range. However, it is often not clear what drives these subseasonal signals, whether the forecasts could be more accurate with better representation of physical processes, and what the limits of predictability in the subseasonal range are. To address these questions, we ran the ECMWF monthly forecast system over the 2015/16 winter with a set of 6-week ensemble integrations initialized every week over the period. In these experiments, we relaxed the tropical band from 15N to 15S toward reanalysis fields. Hence, we have a set of forecasts in which the tropics are constrained to actual events, and we can analyse the changes in predictability in the middle latitudes - in particular in regions of high energy consumption like North America and Europe. Not surprisingly, the forecasts for some periods are significantly improved while others show no improvement. We discuss events/patterns that have extended-range predictability, as well as the tropical forecast errors that prevent the potential predictability in middle latitudes from being realised.

  13. Improved Fuzzy Modelling to Predict the Academic Performance of Distance Education Students

    Directory of Open Access Journals (Sweden)

    Osman Yildiz

    2013-12-01

    It is essential to predict distance education students' year-end academic performance early in the semester and to take precautions based on such predictions. This will, in particular, help enhance their academic performance and, therefore, improve overall educational quality. The present study developed a mathematical model intended to predict distance education students' year-end academic performance using the first eight weeks of data from the learning management system. First, two fuzzy models were constructed, namely a classical fuzzy model and an expert fuzzy model, the latter based on expert opinion. Afterwards, a gene-fuzzy model was developed by optimizing the membership functions through a genetic algorithm. The data on distance education were collected through Moodle, an open-source learning management system, for a total of 218 students who enrolled in Basic Computer Sciences in 2012. The input data consisted of the following variables: when a student last logged on to the system after the content of a lesson was uploaded, how often he/she logged on to the system, how long he/she stayed online in the last login, the score he/she got in the quiz taken in Week 4, and the score he/she got in the midterm exam taken in Week 8. A comparison was made among the predictions of the three models concerning the students' year-end academic performance.
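
    A minimal sketch of the triangular membership functions such fuzzy models are typically built from; a genetic algorithm would then tune the breakpoints. The fuzzy sets and breakpoints here are illustrative assumptions, not the study's calibrated functions.

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets for a quiz score in [0, 100].
def low(x):
    return tri(x, -1, 0, 50)

def medium(x):
    return tri(x, 25, 50, 75)

def high(x):
    return tri(x, 50, 100, 101)

x = 60.0
memberships = {"low": low(x), "medium": medium(x), "high": high(x)}
print({k: round(v, 2) for k, v in memberships.items()})
# → {'low': 0.0, 'medium': 0.6, 'high': 0.2}
```

    In a gene-fuzzy model, the (a, b, c) breakpoints of each set are the genes the genetic algorithm optimizes against prediction error.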

  14. Role of hyaluronic acid and laminin as serum markers for predicting significant fibrosis in patients with chronic hepatitis B

    Directory of Open Access Journals (Sweden)

    Feng Li

    OBJECTIVES: The aim of this study was to evaluate the diagnostic performance of serum hyaluronic acid (HA) and laminin (LN) as serum markers for predicting significant fibrosis in CHB patients. METHODS: Serum HA and LN levels of 87 patients with chronic hepatitis B and 19 blood donors were assayed by RIA. Liver fibrosis stages were determined according to the Metavir scoring system. The diagnostic performance of all indexes was evaluated by receiver operating characteristic (ROC) curves. RESULTS: Serum HA and LN concentrations increased significantly with the stage of hepatic fibrosis, showing positive correlation with the stages of liver fibrosis (HA: r = 0.875, p < 0.001; LN: r = 0.610, p < 0.001). There were significant differences in serum HA and LN levels between the F2-4 group and the F0-F1 group (p < 0.001) and controls (p < 0.001), respectively. From the ROC curves, 185.3 ng/mL was the optimal cut-off value of serum HA for the diagnosis of significant fibrosis, giving sensitivity, specificity, PPV, NPV, LR+, LR- and accuracy of 84.2%, 83.3%, 90.6%, 73.5%, 5.04, 0.19 and 83.9%, respectively. For serum LN, 132.7 ng/mL was the optimal cut-off value, with sensitivity, specificity, PPV, NPV, LR+, LR- and accuracy of 71.9%, 80.0%, 87.2%, 60.0%, 3.59, 0.35 and 74.7%, respectively. Combining HA and LN in serial tests gave a perfect specificity and PPV of 100%, while sensitivity declined to 63.2% and LR+ increased to 18.9; parallel tests gave a good sensitivity of 94.7%, raised NPV to 86.4%, and reduced LR- to 0.08. CONCLUSIONS: Serum HA and LN concentrations showed positive correlation with the stages of liver fibrosis. Detection of serum HA and LN for predicting significant fibrosis showed good diagnostic performance, which can be further optimized by combining the two indices. HA and LN would be clinically useful serum markers for predicting significant fibrosis in patients with chronic hepatitis B, when liver biopsy is
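
    The serial (both tests positive) and parallel (either test positive) combination rules used above can be sketched as follows. The dichotomized marker values are toy data chosen to show the expected trade-off, not the study's measurements.

```python
def confusion(y_true, y_pred):
    """Counts for a binary confusion matrix (1 = significant fibrosis)."""
    tp = sum(t and p for t, p in zip(y_true, y_pred))
    tn = sum(not t and not p for t, p in zip(y_true, y_pred))
    fp = sum(not t and p for t, p in zip(y_true, y_pred))
    fn = sum(t and not p for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def sens_spec(y_true, y_pred):
    tp, tn, fp, fn = confusion(y_true, y_pred)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical markers already dichotomized at their cut-offs.
y      = [1, 1, 1, 1, 0, 0, 0, 0]
ha_pos = [1, 1, 1, 0, 0, 0, 1, 0]
ln_pos = [1, 1, 0, 1, 0, 1, 0, 0]

serial   = [a and b for a, b in zip(ha_pos, ln_pos)]  # both positive
parallel = [a or b for a, b in zip(ha_pos, ln_pos)]   # either positive
print(sens_spec(y, serial))    # → (0.5, 1.0)
print(sens_spec(y, parallel))  # → (1.0, 0.5)
```

    As in the record, requiring both tests raises specificity at the cost of sensitivity, while accepting either test does the opposite.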

  15. Has growth mixture modeling improved our understanding of how early change predicts psychotherapy outcome?

    Science.gov (United States)

    Koffmann, Andrew

    2017-03-02

    Early change in psychotherapy predicts outcome. Seven studies have used growth mixture modeling [GMM; Muthén, B. (2001). Second-generation structural equation modeling with a combination of categorical and continuous latent variables: New opportunities for latent class-latent growth modeling. In L. M. Collins & A. G. Sawyers (Eds.), New methods for the analysis of change (pp. 291-322). Washington, DC: American Psychological Association] to identify patient classes based on early change, but they have yielded conflicting results. Here, we review the earlier studies and apply GMM to a new data set. In a university-based training clinic, 251 patients were administered the Outcome Questionnaire-45 [Lambert, M. J., Hansen, N. B., Umphress, V., Lunnen, K., Okiishi, J., Burlingame, G., … Reisinger, C. W. (1996). Administration and scoring manual for the Outcome Questionnaire (OQ 45.2). Wilmington, DE: American Professional Credentialing Services] at each psychotherapy session. We used GMM to identify class structure based on change in the first six sessions and examined trajectories as predictors of outcome. The sample was best described as a single class. There was no evidence of autoregressive trends in the data. We achieved a better fit to the data by permitting the latent variables some degree of kurtosis rather than assuming multivariate normality. Treatment outcome was predicted by the amount of early improvement, regardless of the initial level of distress. The presence of sudden early gains or losses did not further improve outcome prediction. Early improvement is an easily computed, powerful predictor of psychotherapy outcome. The use of GMM to investigate the relationship between change and outcome is technically complex and computationally intensive. To date, it has not been particularly informative.

  16. Variability in Cadence During Forced Cycling Predicts Motor Improvement in Individuals With Parkinson’s Disease

    Science.gov (United States)

    Ridgel, Angela L.; Abdar, Hassan Mohammadi; Alberts, Jay L.; Discenzo, Fred M.; Loparo, Kenneth A.

    2014-01-01

    Variability in the severity and progression of Parkinson's disease symptoms makes it challenging to design therapy interventions that provide maximal benefit. Previous studies showed that forced cycling, at greater pedaling rates, results in greater improvements in motor function than voluntary cycling. The precise mechanism for these differences in function following exercise is unknown. We examined the complexity of biomechanical and physiological features of forced and voluntary cycling and correlated these features with improvements in motor function as measured by the Unified Parkinson's Disease Rating Scale (UPDRS). Heart rate, cadence, and power were analyzed using entropy-based signal processing techniques. Pattern variability in heart rate and power was greater in the voluntary group than in the forced group. In contrast, variability in cadence was higher during forced cycling. UPDRS Motor III scores predicted from the pattern variability data were highly correlated with measured scores in the forced group. This study shows how time series analysis of biomechanical and physiological parameters of exercise can be used to predict improvements in motor function. This knowledge will be important in the development of optimal exercise-based rehabilitation programs for Parkinson's disease. PMID:23144045
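
    One common entropy measure for such pattern-variability analyses is sample entropy. The sketch below uses a fixed absolute tolerance rather than one scaled to the signal's standard deviation, and is a generic illustration rather than the authors' exact pipeline.

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B), where B counts pairs of length-m
    subsequences within Chebyshev tolerance r and A counts the same
    for length m + 1. Self-matches are excluded."""
    def count(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    b, a = count(m), count(m + 1)
    return float("inf") if a == 0 else -math.log(a / b)

# A perfectly regular signal scores lower than an irregular one.
regular = [0.0, 1.0] * 20
rng = random.Random(1)
irregular = [rng.random() for _ in range(40)]
print(sample_entropy(regular) < sample_entropy(irregular))  # → True
```

    Higher sample entropy corresponds to less predictable structure, which is the sense in which cadence during forced cycling was "more variable" than during voluntary cycling.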

  17. Linking precipitation, evapotranspiration and soil moisture content for the improvement of predictability over land

    Science.gov (United States)

    Catalano, Franco; Alessandri, Andrea; De Felice, Matteo

    2013-04-01

    Climate change scenarios are expected to show an intensification of the hydrological cycle together with modifications of evapotranspiration and soil moisture content. Evapotranspiration changes have already been evidenced for the end of the 20th century, and the variance of evapotranspiration has been shown to be strongly related to the variance of precipitation over land. Nevertheless, the feedbacks between evapotranspiration, soil moisture and precipitation are not yet completely understood. Furthermore, soil moisture reservoirs carry memory, so their proper initialization may have a strong influence on predictability. In particular, the linkage between precipitation and soil moisture is modulated by the effects on evapotranspiration. Therefore, investigating the coupling between these variables appears to be of primary importance for improving predictability over the continents. The coupled manifold (CM) technique (Navarra and Tribbia 2005) is a method designed to separate the effects of the variability of two connected variables. This method has proved successful for the analysis of different climate fields, such as precipitation, vegetation and sea surface temperature. In particular, the coupled variables reveal patterns that may be connected with specific phenomena, thus providing hints regarding potential predictability. In this study we applied the CM to recent observational datasets of precipitation (from CRU), evapotranspiration (from GIMMS and MODIS satellite-based estimates) and soil moisture content (from ESA) spanning a time period of 23 years (1984-2006) at monthly frequency. Different data stratifications (monthly, seasonal, summer JJA) were employed to analyze the persistence of the patterns and their characteristic time scales and seasonality. The three variables considered show a significant coupling among each other. Interestingly, most of the signal of the

  18. Regression trees for predicting mortality in patients with cardiovascular disease: What improvement is achieved by using ensemble-based methods?

    Science.gov (United States)

    Austin, Peter C; Lee, Douglas S; Steyerberg, Ewout W; Tu, Jack V

    2012-01-01

    In biomedical research, the logistic regression model is the most commonly used method for predicting the probability of a binary outcome. While many clinical researchers have expressed an enthusiasm for regression trees, this method may have limited accuracy for predicting health outcomes. We aimed to evaluate the improvement that is achieved by using ensemble-based methods, including bootstrap aggregation (bagging) of regression trees, random forests, and boosted regression trees. We analyzed 30-day mortality in two large cohorts of patients hospitalized with either acute myocardial infarction (N = 16,230) or congestive heart failure (N = 15,848) in two distinct eras (1999–2001 and 2004–2005). We found that both the in-sample and out-of-sample prediction of ensemble methods offered substantial improvement in predicting cardiovascular mortality compared to conventional regression trees. However, conventional logistic regression models that incorporated restricted cubic smoothing splines had even better performance. We conclude that ensemble methods from the data mining and machine learning literature increase the predictive performance of regression trees, but may not lead to clear advantages over conventional logistic regression models for predicting short-term mortality in population-based samples of subjects with cardiovascular disease. PMID:22777999
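
    Bootstrap aggregation, the simplest of the ensemble methods compared above, can be sketched with one-feature decision stumps. This is a deliberately tiny stand-in for regression trees, on synthetic data, not the study's models or cohorts.

```python
import random

def stump_train(data):
    """Fit a one-feature decision stump: pick the threshold and
    direction minimizing training error on (x, y) pairs, y in {0, 1}."""
    best = None
    for thr in sorted({x for x, _ in data}):
        for sign in (1, -1):
            preds = [1 if sign * (x - thr) >= 0 else 0 for x, _ in data]
            err = sum(p != y for p, (_, y) in zip(preds, data))
            if best is None or err < best[0]:
                best = (err, thr, sign)
    _, thr, sign = best
    return thr, sign

def stump_predict(model, x):
    thr, sign = model
    return 1 if sign * (x - thr) >= 0 else 0

def bagged(data, n_trees=25, seed=0):
    """Bootstrap aggregation: train each stump on a resampled copy of
    the data, then predict by majority vote."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_trees):
        boot = [rng.choice(data) for _ in data]
        models.append(stump_train(boot))

    def predict(x):
        votes = sum(stump_predict(m, x) for m in models)
        return 1 if votes * 2 >= len(models) else 0

    return predict

# Hypothetical 1-D "risk score" data: outcome 1 above roughly x = 5.
data = [(x / 2, 1 if x >= 10 else 0) for x in range(20)]
clf = bagged(data)
print([clf(1.0), clf(9.0)])  # → [0, 1]
```

    Random forests and boosting refine this same vote-over-resampled-trees idea, which is why the record's comparison against a single tree (and against splined logistic regression) is informative.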

  19. Improved Seasonal Prediction of European Summer Temperatures With New Five-Layer Soil-Hydrology Scheme

    Science.gov (United States)

    Bunzel, Felix; Müller, Wolfgang A.; Dobrynin, Mikhail; Fröhlich, Kristina; Hagemann, Stefan; Pohlmann, Holger; Stacke, Tobias; Baehr, Johanna

    2018-01-01

    We evaluate the impact of a new five-layer soil-hydrology scheme on seasonal hindcast skill of 2 m temperatures over Europe obtained with the Max Planck Institute Earth System Model (MPI-ESM). Assimilation experiments from 1981 to 2010 and 10-member seasonal hindcasts initialized on 1 May each year are performed with MPI-ESM in two soil configurations, one using a bucket scheme and one a new five-layer soil-hydrology scheme. We find the seasonal hindcast skill for European summer temperatures to improve with the five-layer scheme compared to the bucket scheme and investigate possible causes for these improvements. First, improved indirect soil moisture assimilation allows for enhanced soil moisture-temperature feedbacks in the hindcasts. Additionally, this leads to improved prediction of anomalies in the 500 hPa geopotential height surface, reflecting more realistic atmospheric circulation patterns over Europe.

  20. Modeling heterogeneous (co)variances from adjacent-SNP groups improves genomic prediction for milk protein composition traits

    DEFF Research Database (Denmark)

    Gebreyesus, Grum; Lund, Mogens Sandø; Buitenhuis, Albert Johannes

    2017-01-01

    Accurate genomic prediction requires a large reference population, which is problematic for traits that are expensive to measure. Traits related to milk protein composition are not routinely recorded due to costly procedures and are considered to be controlled by a few quantitative trait loci...... of large effect. The amount of variation explained may vary between regions leading to heterogeneous (co)variance patterns across the genome. Genomic prediction models that can efficiently take such heterogeneity of (co)variances into account can result in improved prediction reliability. In this study, we...... developed and implemented novel univariate and bivariate Bayesian prediction models, based on estimates of heterogeneous (co)variances for genome segments (BayesAS). Available data consisted of milk protein composition traits measured on cows and de-regressed proofs of total protein yield derived for bulls...

  1. Improved techniques for predicting spacecraft power

    International Nuclear Information System (INIS)

    Chmielewski, A.B.

    1987-01-01

    Radioisotope Thermoelectric Generators (RTGs) are going to supply power for the NASA Galileo and Ulysses spacecraft now scheduled to be launched in 1989 and 1990. The duration of the Galileo mission is expected to be over 8 years. This brings the total RTG lifetime to 13 years. In 13 years, the RTG power drops more than 20 percent leaving a very small power margin over what is consumed by the spacecraft. Thus it is very important to accurately predict the RTG performance and be able to assess the magnitude of errors involved. The paper lists all the error sources involved in the RTG power predictions and describes a statistical method for calculating the tolerance

  2. Restraint status improves the predictive value of motor vehicle crash criteria for pediatric trauma team activation.

    Science.gov (United States)

    Bozeman, Andrew P; Dassinger, Melvin S; Recicar, John F; Smith, Samuel D; Rettiganti, Mallikarjuna R; Nick, Todd G; Maxson, Robert T

    2012-12-01

    Most trauma centers incorporate mechanistic criteria (MC) into their algorithm for trauma team activation (TTA). We hypothesized that characteristics of the crash are less reliable than restraint status in predicting significant injury and the need for TTA. We identified 271 patients (age <15 y) admitted with a diagnosis of motor vehicle crash. The mechanistic criteria and restraint status of each patient were recorded. Both MC and MC plus restraint status were evaluated as separate measures for appropriately predicting TTA based on treatment outcomes and injury scores. Improper restraint alone predicted the need for TTA with an odds ratio of 2.69 (P = .002). MC plus improper restraint predicted the need for TTA with an odds ratio of 2.52 (P = .002). In contrast, the odds ratio when using MC alone was 1.65 (P = .16). When the 5 MC were evaluated individually as predictors of TTA, ejection, death of an occupant, and intrusion of more than 18 inches were statistically significant. Improper restraint is an independent predictor of the need for TTA in this single-institution study. Copyright © 2012 Elsevier Inc. All rights reserved.
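
    An odds ratio with a 95% confidence interval is computed from a 2x2 table as follows. The counts below are hypothetical, chosen only to yield an OR near the reported 2.69; they are not the study's data.

```python
import math

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table:
                   needed TTA   no TTA
      exposed           a          b
      unexposed         c          d
    Returns the OR and a 95% CI from the log-OR standard error."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 30/40 improperly restrained vs 20/72 properly.
or_, ci = odds_ratio(30, 40, 20, 72)
print(round(or_, 2), ci)
```

    A CI excluding 1.0, as here, corresponds to the significant P values quoted in the record.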

  3. At the Nexus of History, Ecology, and Hydrobiogeochemistry: Improved Predictions across Scales through Integration.

    Science.gov (United States)

    Stegen, James C

    2018-01-01

    To improve predictions of ecosystem function in future environments, we need to integrate the ecological and environmental histories experienced by microbial communities with hydrobiogeochemistry across scales. A key issue is whether we can derive generalizable scaling relationships that describe this multiscale integration. There is a strong foundation for addressing these challenges. We have the ability to infer ecological history with null models and reveal impacts of environmental history through laboratory and field experimentation. Recent developments also provide opportunities to inform ecosystem models with targeted omics data. A major next step is coupling knowledge derived from such studies with multiscale modeling frameworks that are predictive under non-steady-state conditions. This is particularly true for systems spanning dynamic interfaces, which are often hot spots of hydrobiogeochemical function. We can advance predictive capabilities through a holistic perspective focused on the nexus of history, ecology, and hydrobiogeochemistry.

  4. Significant improvement in one-dimensional cursor control using Laplacian electroencephalography over electroencephalography

    Science.gov (United States)

    Boudria, Yacine; Feltane, Amal; Besio, Walter

    2014-06-01

    Objective. Brain-computer interfaces (BCIs) based on electroencephalography (EEG) have been shown to accurately detect mental activities, but the acquisition of high levels of control requires extensive user training. Furthermore, EEG has a low signal-to-noise ratio and low spatial resolution. The objective of the present study was to compare the accuracy of two types of BCIs during the first recording session. EEG and tripolar concentric ring electrode (TCRE) EEG (tEEG) brain signals were recorded and used to control one-dimensional cursor movements. Approach. Eight human subjects were asked to imagine either ‘left’ or ‘right’ hand movement during one recording session to control the computer cursor using TCRE and disc electrodes. Main results. The obtained results show a significant improvement in accuracy using TCREs (44%-100%) compared to disc electrodes (30%-86%). Significance. This study developed the first tEEG-based BCI system for real-time one-dimensional cursor movements and showed high accuracies with little training.

  5. Predicting Community College Outcomes: Does High School CTE Participation Have a Significant Effect?

    Science.gov (United States)

    Dietrich, Cecile; Lichtenberger, Eric; Kamalludeen, Rosemaliza

    2016-01-01

    This study explored the relative importance of participation in high school career and technical education (CTE) programs in predicting community college outcomes. A hierarchical generalized linear model (HGLM) was used to predict community college outcome attainment among a random sample of direct community college entrants. Results show that…

  6. Collaboration between a human group and artificial intelligence can improve prediction of multiple sclerosis course: a proof-of-principle study [version 1; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Andrea Tacchella

    2017-12-01

    Background: Multiple sclerosis has an extremely variable natural course. In most patients, the disease starts with a relapsing-remitting (RR) phase, which proceeds to a secondary progressive (SP) form. The duration of the RR phase is hard to predict, and to date predictions of the rate of disease progression remain suboptimal. This limits the opportunity to tailor therapy to an individual patient's prognosis, in spite of the choice of several therapeutic options. Approaches to improve clinical decisions, such as the collective intelligence of human groups and machine learning algorithms, are widely investigated. Methods: Medical students and a machine learning algorithm predicted the course of disease on the basis of randomly chosen clinical records of patients who attended the multiple sclerosis service of Sant'Andrea hospital in Rome. Results: A significant improvement in predictive ability was obtained when predictions were combined with a weight that depends on the consistency of human (or algorithm) forecasts on a given clinical record. Conclusions: In this work we present proof-of-principle that human-machine hybrid predictions yield better prognoses than machine learning algorithms or groups of humans alone. To strengthen this preliminary result, we propose a crowdsourcing initiative to collect prognoses by physicians on an expanded set of patients.

  7. PredictSNP: robust and accurate consensus classifier for prediction of disease-related mutations.

    Directory of Open Access Journals (Sweden)

    Jaroslav Bendl

    2014-01-01

    Single nucleotide variants represent a prevalent form of genetic variation. Mutations in coding regions are frequently associated with the development of various genetic diseases. Computational tools for predicting the effects of mutations on protein function are therefore very important for the analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement are hindered by large overlaps between training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we constructed three independent datasets by removing all duplicates, inconsistencies and mutations previously used in the training of the evaluated tools. The benchmark dataset, containing over 43,000 mutations, was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best-performing tools were combined into a consensus classifier, PredictSNP, resulting in significantly improved prediction performance; at the same time, it returned results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp.
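
    At its simplest, a consensus classifier of this kind reduces to a (weighted) majority vote over the component tools' calls. The sketch below is a generic illustration, not PredictSNP's actual combination scheme, and the votes and weights are hypothetical.

```python
from collections import Counter

def consensus(predictions, weights=None):
    """Weighted majority vote over per-tool predictions.

    predictions: {tool: 'deleterious' | 'neutral'}
    weights: optional {tool: weight}; unlisted tools count as 1.0.
    """
    weights = weights or {}
    tally = Counter()
    for tool, label in predictions.items():
        tally[label] += weights.get(tool, 1.0)
    return tally.most_common(1)[0][0]

votes = {"MAPP": "deleterious", "PhD-SNP": "deleterious",
         "PolyPhen-2": "neutral", "SIFT": "deleterious",
         "SNAP": "neutral", "PANTHER": "deleterious"}
print(consensus(votes))  # → deleterious

# Hypothetical per-tool weights (e.g. from benchmark accuracy) can
# overturn a raw majority:
print(consensus(votes, {"PolyPhen-2": 3.0, "SNAP": 3.0}))  # → neutral
```

    Weighting tools by benchmark performance is one way a consensus can outperform every individual predictor while still returning a call for every mutation.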

  8. Exploring the significance of human mobility patterns in social link prediction

    KAUST Repository

    Alharbi, Basma Mohammed; Zhang, Xiangliang

    2014-01-01

    Link prediction is a fundamental task in social networks. Recently, emphasis has been placed on forecasting new social ties using user mobility patterns, e.g., investigating physical and semantic co-locations for new proximity measure. This paper

  9. NetMHCpan-3.0; improved prediction of binding to MHC class I molecules integrating information from multiple receptor and peptide length datasets

    DEFF Research Database (Denmark)

    Nielsen, Morten; Andreatta, Massimo

    2016-01-01

    Background: Binding of peptides to MHC class I molecules (MHC-I) is essential for antigen presentation to cytotoxic T-cells.Results: Here, we demonstrate how a simple alignment step allowing insertions and deletions in a pan-specific MHC-I binding machine-learning model enables combining informat...... specificities and ligand length scales, and demonstrated how this approach significantly improves the accuracy for prediction of peptide binding and identification of MHC ligands. The method is available at www.cbs.dtu.dk/services/NetMHCpan-3.0....

  10. Improving Flood Prediction By the Assimilation of Satellite Soil Moisture in Poorly Monitored Catchments.

    Science.gov (United States)

    Alvarez-Garreton, C. D.; Ryu, D.; Western, A. W.; Crow, W. T.; Su, C. H.; Robertson, D. E.

    2014-12-01

    Flood prediction in poorly monitored catchments is among the greatest challenges faced by hydrologists. To address this challenge, an increasing number of studies in the last decade have explored methods to integrate various existing observations from ground and satellites. One approach in particular, is the assimilation of satellite soil moisture (SM-DA) into rainfall-runoff models. The rationale is that satellite soil moisture (SSM) can be used to correct model soil water states, enabling more accurate prediction of catchment response to precipitation and thus better streamflow. However, there is still no consensus on the most effective SM-DA scheme and how this might depend on catchment scale, climate characteristics, runoff mechanisms, model and SSM products used, etc. In this work, an operational SM-DA scheme was set up in the poorly monitored, large (>40,000 km2), semi-arid Warrego catchment situated in eastern Australia. We assimilated passive and active SSM products into the probability distributed model (PDM) using an ensemble Kalman filter. We explored factors influencing the SM-DA framework, including relatively new techniques to remove model-observation bias, estimate observation errors and represent model errors. Furthermore, we explored the advantages of accounting for the spatial distribution of forcing and channel routing processes within the catchment by implementing and comparing lumped and semi-distributed model setups. Flood prediction is improved by SM-DA (Figure), with a 30% reduction of the average root-mean-squared difference of the ensemble prediction, a 20% reduction of the false alarm ratio and a 40% increase of the ensemble mean Nash-Sutcliffe efficiency. SM-DA skill does not significantly change with different observation error assumptions, but the skill strongly depends on the observational bias correction technique used, and more importantly, on the performance of the open-loop model before assimilation. 
Our findings imply that proper
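
    The core of such an SM-DA scheme is the ensemble Kalman filter update of the model soil-water states. Below is a minimal scalar sketch; the function name, numbers and one-state model are illustrative assumptions, not the study's PDM setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(states, obs, obs_err_std):
        """Update an ensemble of model soil-water states with one
        satellite soil-moisture observation (toy scalar example)."""
        n = states.size
        # Ensemble (prior) variance; the observation operator is the
        # identity in this one-state sketch.
        P = np.var(states, ddof=1)
        R = obs_err_std ** 2
        K = P / (P + R)                      # Kalman gain
        # Perturb the observation for each member (stochastic EnKF).
        perturbed = obs + rng.normal(0.0, obs_err_std, n)
        return states + K * (perturbed - states)

    # Open-loop ensemble of soil-water states (fraction of capacity)
    ensemble = rng.normal(0.30, 0.05, 50)
    analysis = enkf_update(ensemble, obs=0.42, obs_err_std=0.04)
    # The analysis mean is pulled toward the observation and the
    # ensemble spread shrinks.
    ```

    In practice the state is a vector and the observation operator maps model states to the satellite footprint, but the gain computation follows the same pattern.
    
    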

  11. PXD101 significantly improves nuclear reprogramming and the in vitro developmental competence of porcine SCNT embryos

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Jun-Xue; Kang, Jin-Dan; Li, Suo; Jin, Long; Zhu, Hai-Ying; Guo, Qing; Gao, Qing-Shan; Yan, Chang-Guo; Yin, Xi-Jun, E-mail: yinxj33@msn.com

    2015-01-02

    Highlights: • First study to explore the effects of PXD101 on the development of SCNT embryos in vitro. • Treatment with 0.5 μM PXD101 for 24 h improved the development of porcine SCNT embryos. • The level of AcH3K9 was significantly higher than in the control group at early stages. - Abstract: In this study, we investigated the effects of the histone deacetylase inhibitor PXD101 (belinostat) on the preimplantation development of porcine somatic cell nuclear transfer (SCNT) embryos and their expression of the epigenetic marker histone H3 acetylated at lysine 9 (AcH3K9). We compared the in vitro developmental competence of SCNT embryos treated with various concentrations of PXD101 for 24 h. Treatment with 0.5 μM PXD101 significantly increased the proportion of SCNT embryos that reached the blastocyst stage, in comparison to the control group (23.3% vs. 11.5%, P < 0.05). We tested the in vitro developmental competence of SCNT embryos treated with 0.5 μM PXD101 for various durations following activation. Treatment for 24 h significantly improved the development of porcine SCNT embryos, with a significantly higher proportion of embryos reaching the blastocyst stage in comparison to the control group (25.7% vs. 10.6%, P < 0.05). PXD101-treated SCNT embryos were transferred into two surrogate sows, one of which became pregnant, and four fetuses developed. PXD101 treatment significantly increased the fluorescence intensity of immunostaining for AcH3K9 in embryos at the pseudo-pronuclear and 2-cell stages. At these stages, the fluorescence intensities of immunostaining for AcH3K9 were significantly higher in PXD101-treated embryos than in control untreated embryos. In conclusion, this study demonstrates that PXD101 can significantly improve the in vitro and in vivo developmental competence of porcine SCNT embryos and can enhance their nuclear reprogramming.

  12. An On-Chip RBC Deformability Checker Significantly Improves Velocity-Deformation Correlation

    Directory of Open Access Journals (Sweden)

    Chia-Hung Dylan Tsai

    2016-10-01

    An on-chip deformability checker is proposed to improve the velocity–deformation correlation for red blood cell (RBC) evaluation. RBC deformability has been found to be related to human diseases, and can be evaluated from RBC velocity through a microfluidic constriction, as in conventional approaches. The correlation between transit velocity and amount of deformation provides statistical information on RBC deformability. However, such correlations are usually only moderate, or even weak, in practical evaluations due to the limited range of RBC deformation. To solve this issue, we implemented three constrictions of different widths in the proposed checker, so that three different deformation regions can be applied to RBCs. By considering cell responses from the three regions as a whole, we effectively extend the range of cell deformation in the evaluation, resolving the issue of the limited deformation range. RBCs from five volunteer subjects were tested using the proposed checker. The results show that the correlation between cell deformation and transit velocity is significantly improved by the proposed deformability checker. The absolute values of the correlation coefficients increased from an average of 0.54 to 0.92. The effects of cell size, shape and orientation on the evaluation are discussed according to the experimental results. The proposed checker is expected to be useful for RBC evaluation in medical practice.
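
    The statistical effect exploited here (restricting a predictor's range attenuates its correlation with the response, so widening the probed deformation range strengthens it) can be illustrated with synthetic data; the linear model and noise level below are hypothetical, not measurements from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy model: transit velocity depends linearly on cell deformation,
    # plus measurement noise (illustrative values only).
    def simulate_velocity(deformation):
        return 2.0 * deformation + rng.normal(0.0, 0.3, deformation.size)

    # One constriction probes a narrow deformation range ...
    narrow = rng.uniform(0.9, 1.1, 200)
    r_narrow = np.corrcoef(narrow, simulate_velocity(narrow))[0, 1]

    # ... while three constrictions together span a wider range.
    wide = rng.uniform(0.3, 1.7, 200)
    r_wide = np.corrcoef(wide, simulate_velocity(wide))[0, 1]

    # Pooling responses over the wider range yields a stronger
    # velocity-deformation correlation for the same noise level.
    ```
    
    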

  13. The contribution of educational class in improving accuracy of cardiovascular risk prediction across European regions

    DEFF Research Database (Denmark)

    Ferrario, Marco M; Veronesi, Giovanni; Chambless, Lloyd E

    2014-01-01

    OBJECTIVE: To assess whether educational class, an index of socioeconomic position, improves the accuracy of the SCORE cardiovascular disease (CVD) risk prediction equation. METHODS: In a pooled analysis of 68 455 40-64-year-old men and women, free from coronary heart disease at baseline, from 47...

  14. CT image biomarkers to improve patient-specific prediction of radiation-induced xerostomia and sticky saliva

    NARCIS (Netherlands)

    van Dijk, Lisanne V.; Brouwer, Charlotte L.; van der Schaaf, Arjen; Burgerhof, Johannes G. M.; Beukinga, Roelof J.; Langendijk, Johannes A.; Sijtsema, Nanna M.; Steenbakkers, Roel J. H. M.

    Background and purpose: Current models for the prediction of late patient-rated moderate-to-severe xerostomia (XER12m) and sticky saliva (STIC12m) after radiotherapy are based on dose-volume parameters and baseline xerostomia (XERbase) or sticky saliva (STICbase) scores. The purpose is to improve

  15. Climatic extremes improve predictions of spatial patterns of tree species

    Science.gov (United States)

    Zimmermann, N.E.; Yoccoz, N.G.; Edwards, T.C.; Meier, E.S.; Thuiller, W.; Guisan, Antoine; Schmatz, D.R.; Pearman, P.B.

    2009-01-01

    Understanding niche evolution, dynamics, and the response of species to climate change requires knowledge of the determinants of the environmental niche and species range limits. Mean values of climatic variables are often used in such analyses. In contrast, the increasing frequency of climate extremes suggests the importance of understanding their additional influence on range limits. Here, we assess how measures representing climate extremes (i.e., interannual variability in climate parameters) explain and predict spatial patterns of 11 tree species in Switzerland. We find clear, although comparatively small, improvement (+20% in adjusted D2, +8% and +3% in cross-validated True Skill Statistic and area under the receiver operating characteristics curve values) in models that use measures of extremes in addition to means. The primary effect of including information on climate extremes is a correction of local overprediction and underprediction. Our results demonstrate that measures of climate extremes are important for understanding the climatic limits of tree species and assessing species niche characteristics. The inclusion of climate variability will likely improve models of species range limits under future conditions, where changes in mean climate and increased variability are expected.

  16. Using a Simple Binomial Model to Assess Improvement in Predictive Capability: Sequential Bayesian Inference, Hypothesis Testing, and Power Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory

    2012-09-11

    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2 and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
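
    Under this binomial model with a uniform prior, the posterior for θ is a Beta distribution, so the Bayesian hypothesis test reduces to a tail probability. A minimal sketch (the counts below are hypothetical):

    ```python
    from scipy.stats import beta

    # With a uniform Beta(1, 1) prior, observing k "new code better"
    # outcomes in n paired comparisons gives posterior Beta(1+k, 1+n-k).
    def posterior(k, n, a=1.0, b=1.0):
        return beta(a + k, b + n - k)

    post = posterior(k=16, n=20)          # hypothetical data
    # Bayesian hypothesis test: probability that the new code is
    # genuinely better, i.e. P(theta > 1/2 | data).
    p_better = post.sf(0.5)
    # "Plan B metric": the posterior standard deviation of theta.
    sd = post.std()
    ```

    The sample-size analysis in the abstract amounts to computing, before taking data, how likely it is that `p_better` will exceed the chosen confidence threshold for a given n and true θ.
    
    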

  17. Significance of prostatic weight in prostatism

    DEFF Research Database (Denmark)

    Jensen, K M; Bruskewitz, R C; Iversen, P

    1983-01-01

    In addition to routine evaluation, 68 patients with prostatism underwent blinded urodynamic testing prior to transurethral prostatectomy and were reexamined symptomatologically and urodynamically at 3 and 12 months after surgery to determine if prostatic weight could predict postoperative outcome. Resected prostatic weight correlated with estimated weight at cystoscopy and with obstructive symptoms, but not with urodynamic variables of infravesical obstruction. Patients with small prostates improved symptomatologically to the same degree as patients with larger glands, although they did not improve to the same degree urodynamically. Prostatic weight, therefore, could not be used to predict the outcome of transurethral surgery.

  18. A Model Predictive Control Approach for Fuel Economy Improvement of a Series Hydraulic Hybrid Vehicle

    Directory of Open Access Journals (Sweden)

    Tri-Vien Vu

    2014-10-01

    This study applied a model predictive control (MPC) framework to solve the cruising control problem of a series hydraulic hybrid vehicle (SHHV). The controller regulates not only vehicle velocity, but also engine torque, engine speed, and accumulator pressure to their corresponding reference values. At each time step, a quadratic programming problem is solved within a predictive horizon to obtain the optimal control inputs. The objective is to minimize the output error. This approach ensures that the components operate at high efficiency, thereby improving the total efficiency of the system. The proposed SHHV control system was evaluated under urban and highway driving conditions. By handling constraints and input-output interactions, the MPC-based control system ensures that the system operates safely and efficiently. The fuel economy of the proposed control scheme shows a noticeable improvement in comparison with the PID-based system, in which three Proportional-Integral-Derivative (PID) controllers are used for cruising control.
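
    The receding-horizon idea can be sketched for a toy scalar cruise-control problem: at each step a small quadratic program (here unconstrained, so it reduces to least squares) is solved over the horizon and only the first input is applied. The model coefficients and weights below are illustrative assumptions, not the SHHV model:

    ```python
    import numpy as np

    # Toy discrete-time velocity model v[t+1] = a*v[t] + b*u[t].
    a, b, N = 0.95, 0.1, 10          # dynamics and horizon (illustrative)
    v_ref = 20.0                     # reference velocity (m/s)

    def mpc_step(v0, rho=0.01):
        """Solve the horizon-N tracking QP:
        minimise sum_k (v[k]-v_ref)^2 + rho*u[k]^2 over u[0..N-1]."""
        # Prediction: v[k] = a^k v0 + sum_{j<k} a^(k-1-j) b u[j]
        G = np.zeros((N, N))
        for k in range(1, N + 1):
            for j in range(k):
                G[k - 1, j] = a ** (k - 1 - j) * b
        f = np.array([a ** k * v0 for k in range(1, N + 1)])
        # Least-squares form of the QP: [G; sqrt(rho) I] u ~ [v_ref - f; 0]
        A = np.vstack([G, np.sqrt(rho) * np.eye(N)])
        y = np.concatenate([v_ref - f, np.zeros(N)])
        u = np.linalg.lstsq(A, y, rcond=None)[0]
        return u[0]                  # apply only the first input

    # Closed loop: re-solve the QP at every step (receding horizon).
    v = 15.0
    for _ in range(50):
        v = a * v + b * mpc_step(v)
    ```

    A real MPC formulation adds input and state constraints (actuator limits, accumulator pressure bounds), which is what requires a proper QP solver rather than plain least squares.
    
    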

  19. Assessment of subchannel code ASSERT-PV for flow-distribution predictions

    International Nuclear Information System (INIS)

    Nava-Dominguez, A.; Rao, Y.F.; Waddington, G.M.

    2014-01-01

    Highlights: • Assessment of the subchannel code ASSERT-PV 3.2 for the prediction of flow distribution. • Open literature and in-house experimental data to quantify ASSERT-PV predictions. • Model changes assessed against vertical and horizontal flow experiments. • Improvement of flow-distribution predictions under CANDU-relevant conditions. - Abstract: This paper reports an assessment of the recently released subchannel code ASSERT-PV 3.2 for the prediction of flow-distribution in fuel bundles, including subchannel void fraction, quality and mass fluxes. Experimental data from open literature and from in-house tests are used to assess the flow-distribution models in ASSERT-PV 3.2. The prediction statistics using the recommended model set of ASSERT-PV 3.2 are compared to those from previous code versions. Separate-effects sensitivity studies are performed to quantify the contribution of each flow-distribution model change or enhancement to the improvement in flow-distribution prediction. The assessment demonstrates significant improvement in the prediction of flow-distribution in horizontal fuel channels containing CANDU bundles.

  20. Assessment of subchannel code ASSERT-PV for flow-distribution predictions

    Energy Technology Data Exchange (ETDEWEB)

    Nava-Dominguez, A., E-mail: navadoma@aecl.ca; Rao, Y.F., E-mail: raoy@aecl.ca; Waddington, G.M., E-mail: waddingg@aecl.ca

    2014-08-15

    Highlights: • Assessment of the subchannel code ASSERT-PV 3.2 for the prediction of flow distribution. • Open literature and in-house experimental data to quantify ASSERT-PV predictions. • Model changes assessed against vertical and horizontal flow experiments. • Improvement of flow-distribution predictions under CANDU-relevant conditions. - Abstract: This paper reports an assessment of the recently released subchannel code ASSERT-PV 3.2 for the prediction of flow-distribution in fuel bundles, including subchannel void fraction, quality and mass fluxes. Experimental data from open literature and from in-house tests are used to assess the flow-distribution models in ASSERT-PV 3.2. The prediction statistics using the recommended model set of ASSERT-PV 3.2 are compared to those from previous code versions. Separate-effects sensitivity studies are performed to quantify the contribution of each flow-distribution model change or enhancement to the improvement in flow-distribution prediction. The assessment demonstrates significant improvement in the prediction of flow-distribution in horizontal fuel channels containing CANDU bundles.

  1. Significant pre-accession factors predicting success or failure during a Marine Corps officer’s initial service obligation

    OpenAIRE

    Johnson, Jacob A.

    2015-01-01

    Approved for public release; distribution is unlimited. Increasing diversity and equal opportunity in the military is a congressional and executive priority. At the same time, improving recruiting practices is a priority of the commandant of the Marine Corps. In an effort to provide information to the Marine Corps that may improve recruiting practice and enable retention of a higher-quality and more diverse officer corps, probit econometric models are estimated to identify significant facto...

  2. Multidimensional assessment of patient condition and mutational analysis in peripheral blood, as tools to improve outcome prediction in myelodysplastic syndromes: A prospective study of the Spanish MDS group.

    Science.gov (United States)

    Ramos, Fernando; Robledo, Cristina; Pereira, Arturo; Pedro, Carmen; Benito, Rocío; de Paz, Raquel; Del Rey, Mónica; Insunza, Andrés; Tormo, Mar; Díez-Campelo, María; Xicoy, Blanca; Salido, Eduardo; Sánchez-Del-Real, Javier; Arenillas, Leonor; Florensa, Lourdes; Luño, Elisa; Del Cañizo, Consuelo; Sanz, Guillermo F; María Hernández-Rivas, Jesús

    2017-09-01

    The International Prognostic Scoring System and its revised form (IPSS-R) are the most widely used indices for prognostic assessment of patients with myelodysplastic syndromes (MDS), but can only partially account for the observed variation in patient outcomes. This study aimed to evaluate the relative contribution of patient condition and mutational status in peripheral blood when added to the IPSS-R, for estimating overall survival and the risk of leukemic transformation in patients with MDS. A prospective cohort (2006-2015) of 200 consecutive patients with MDS were included in the study series and categorized according to the IPSS-R. Patients were further stratified according to patient condition (assessed using the multidimensional Lee index for older adults) and genetic mutations (peripheral blood samples screened using next-generation sequencing). The change in likelihood-ratio was tested in Cox models after adding individual covariates. The addition of the Lee index to the IPSS-R significantly improved prediction of overall survival [hazard ratio (HR) 3.02, 95% confidence interval (CI) 1.96-4.66, P < 0.001), and mutational analysis significantly improved prediction of leukemic evolution (HR 2.64, 1.56-4.46, P < 0.001). Non-leukemic death was strongly linked to patient condition (HR 2.71, 1.72-4.25, P < 0.001), but not to IPSS-R score (P = 0.35) or mutational status (P = 0.75). Adjustment for exposure to disease-modifying therapy, evaluated as a time-dependent covariate, had no effect on the proposed model's predictive ability. In conclusion, patient condition, assessed by the multidimensional Lee index and patient mutational status can improve the prediction of clinical outcomes of patients with MDS already stratified by IPSS-R. © 2017 Wiley Periodicals, Inc.
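
    The model-comparison step described here (testing the change in likelihood ratio after adding a covariate to a Cox model) is a standard nested-model test: twice the gain in log-likelihood is referred to a chi-squared distribution with one degree of freedom per added parameter. A sketch with hypothetical log-likelihood values:

    ```python
    from scipy.stats import chi2

    def lr_test(loglik_base, loglik_ext, extra_params):
        """Likelihood-ratio test for nested models (e.g. Cox):
        does adding `extra_params` covariates significantly improve fit?"""
        stat = 2.0 * (loglik_ext - loglik_base)
        p = chi2.sf(stat, df=extra_params)
        return stat, p

    # Hypothetical partial log-likelihoods: IPSS-R alone vs.
    # IPSS-R plus the Lee index (one extra covariate).
    stat, p = lr_test(-612.4, -604.1, extra_params=1)
    # stat is about 16.6; with 1 df this is significant at p < 0.001
    ```
    
    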

  3. Towards improved hydrologic predictions using data assimilation techniques for water resource management at the continental scale

    Science.gov (United States)

    Naz, Bibi; Kurtz, Wolfgang; Kollet, Stefan; Hendricks Franssen, Harrie-Jan; Sharples, Wendy; Görgen, Klaus; Keune, Jessica; Kulkarni, Ketan

    2017-04-01

    More accurate and reliable hydrologic simulations are important for many applications such as water resource management, future water availability projections and predictions of extreme events. However, simulation of spatial and temporal variations in critical water budget components such as precipitation, snow, evaporation and runoff is highly uncertain, due to errors in e.g. model structure and inputs (hydrologic parameters and forcings). In this study, we use data assimilation techniques, combining in-situ measurements with remotely sensed information, to improve hydrologic predictions of continental-scale water fluxes for water resource systems. The Community Land Model, version 3.5 (CLM), integrated with the Parallel Data Assimilation Framework (PDAF), was implemented at a spatial resolution of 1/36 degree (3 km) over the European CORDEX domain. The modeling system was forced with the high-resolution reanalysis COSMO-REA6 from the Hans-Ertel Centre for Weather Research (HErZ) and ERA-Interim datasets for the period 1994-2014. A series of data assimilation experiments were conducted to assess the efficiency of assimilating various observations, such as river discharge data, remotely sensed soil moisture, terrestrial water storage and snow measurements, into CLM-PDAF at regional to continental scales. This setup not only allows uncertainties to be quantified, but also improves streamflow predictions by simultaneously updating model states and parameters with observational information. The results from different regions, watershed sizes, spatial resolutions and timescales are compared and discussed in this study.

  4. Improving protein-protein interaction prediction using evolutionary information from low-quality MSAs.

    Science.gov (United States)

    Várnai, Csilla; Burkoff, Nikolas S; Wild, David L

    2017-01-01

    Evolutionary information stored in multiple sequence alignments (MSAs) has been used to identify the interaction interface of protein complexes, by measuring either co-conservation or co-mutation of amino acid residues across the interface. Recently, maximum entropy related correlated mutation measures (CMMs), such as direct information, which decouples direct from indirect interactions, have been developed to identify residue pairs interacting across the protein complex interface. These studies have focussed on carefully selected protein complexes with large, good-quality MSAs. In this work, we study protein complexes with a more typical MSA consisting of fewer than 400 sequences, using a set of 79 intramolecular protein complexes. Using a maximum entropy based CMM at the residue level, we develop an interface-level CMM score to be used in re-ranking docking decoys. We demonstrate that our interface-level CMM score compares favourably to the complementarity trace score, an evolutionary information-based score measuring co-conservation, when combined with the number of interface residues, a knowledge-based potential and the variability score of individual amino acid sites. We also demonstrate that, since co-mutation and co-conservation in the MSA contain orthogonal information, the best prediction performance using evolutionary information can be achieved by combining the co-mutation information of the CMM with the co-conservation information of a complementarity trace score, predicting a near-native structure as the top prediction for 41% of the dataset. The method presented is not restricted to small MSAs, and will likely improve interface prediction also for complexes with large and good-quality MSAs.

  5. Significant interarm blood pressure difference predicts cardiovascular risk in hypertensive patients: CoCoNet study.

    Science.gov (United States)

    Kim, Su-A; Kim, Jang Young; Park, Jeong Bae

    2016-06-01

    There has been a rising interest in interarm blood pressure difference (IAD), due to its relationship with peripheral arterial disease and its possible relationship with cardiovascular disease. This study aimed to characterize hypertensive patients with a significant IAD in relation to cardiovascular risk. A total of 3699 patients (mean age, 61 ± 11 years) were prospectively enrolled in the study. Blood pressure (BP) was measured simultaneously in both arms 3 times using an automated cuff-oscillometric device. IAD was defined as the absolute difference in averaged BPs between the left and right arm, and an IAD ≥ 10 mm Hg was considered to be significant. The Framingham risk score was used to calculate the 10-year cardiovascular risk. The mean systolic IAD (sIAD) was 4.3 ± 4.1 mm Hg, and 285 (7.7%) patients showed significant sIAD. Patients with significant sIAD showed larger body mass index (P < 0.001), greater systolic BP (P = 0.050), more coronary artery disease (relative risk = 1.356, P = 0.034), and more cerebrovascular disease (relative risk = 1.521, P = 0.072). The mean 10-year cardiovascular risk was 9.3 ± 7.7%. By multiple regression, sIAD was significantly but weakly correlated with the 10-year cardiovascular risk (β = 0.135, P = 0.008). Patients with significant sIAD showed a higher prevalence of coronary artery disease, as well as an increase in 10-year cardiovascular risk. Therefore, accurate measurements of sIAD may serve as a simple and cost-effective tool for predicting cardiovascular risk in clinical settings.
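
    The IAD definition used in the study is simple to state in code: average the simultaneous triplicate readings per arm, take the absolute difference, and flag values ≥ 10 mm Hg as significant. The readings below are hypothetical:

    ```python
    import numpy as np

    def interarm_difference(left_sbp, right_sbp):
        """Systolic IAD: absolute difference of the per-arm means of
        simultaneous triplicate readings (mm Hg)."""
        return abs(np.mean(left_sbp) - np.mean(right_sbp))

    # Hypothetical triplicate readings for one patient
    left = [142, 138, 140]
    right = [128, 126, 127]
    siad = interarm_difference(left, right)
    significant = siad >= 10   # study threshold for a significant sIAD
    ```
    
    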

  6. Interpreting Disruption Prediction Models to Improve Plasma Control

    Science.gov (United States)

    Parsons, Matthew

    2017-10-01

    In order for the tokamak to be a feasible design for a fusion reactor, it is necessary to minimize damage to the machine caused by plasma disruptions. Accurately predicting disruptions is a critical capability for triggering any mitigative actions, and a modest amount of attention has been given to efforts that employ machine learning techniques to make these predictions. By monitoring diagnostic signals during a discharge, such predictive models look for signs that the plasma is about to disrupt. Typically these predictive models are interpreted simply to give a `yes' or `no' response as to whether a disruption is approaching. However, it is possible to extract further information from these models to indicate which input signals are more strongly correlated with the plasma approaching a disruption. If highly accurate predictive models can be developed, this information could be used in plasma control schemes to make better decisions about disruption avoidance. This work was supported by a Grant from the 2016-2017 Fulbright U.S. Student Program, administered by the Franco-American Fulbright Commission in France.

  7. Sphinx: merging knowledge-based and ab initio approaches to improve protein loop prediction.

    Science.gov (United States)

    Marks, Claire; Nowak, Jaroslaw; Klostermann, Stefan; Georges, Guy; Dunbar, James; Shi, Jiye; Kelm, Sebastian; Deane, Charlotte M

    2017-05-01

    Loops are often vital for protein function; however, their irregular structures make them difficult to model accurately. Current loop modelling algorithms can mostly be divided into two categories: knowledge-based, where databases of fragments are searched to find suitable conformations, and ab initio, where conformations are generated computationally. Existing knowledge-based methods only use fragments that are the same length as the target, even though loops of slightly different lengths may adopt similar conformations. Here, we present a novel method, Sphinx, which combines ab initio techniques with the potential extra structural information contained within loops of a different length to improve structure prediction. We show that Sphinx is able to generate high-accuracy predictions and decoy sets enriched with near-native loop conformations, performing better than the ab initio algorithm on which it is based. In addition, it is able to provide predictions for every target, unlike some knowledge-based methods. Sphinx can be used successfully for the difficult problem of antibody H3 prediction, outperforming RosettaAntibody, one of the leading H3-specific ab initio methods, both in accuracy and speed. Sphinx is available at http://opig.stats.ox.ac.uk/webapps/sphinx. Contact: deane@stats.ox.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  8. Land-surface initialisation improves seasonal climate prediction skill for maize yield forecast.

    Science.gov (United States)

    Ceglar, Andrej; Toreti, Andrea; Prodhomme, Chloe; Zampieri, Matteo; Turco, Marco; Doblas-Reyes, Francisco J

    2018-01-22

    Seasonal crop yield forecasting represents an important source of information to maintain market stability, minimise socio-economic impacts of crop losses and guarantee humanitarian food assistance, while it fosters the use of climate information favouring adaptation strategies. As climate variability and extremes have significant influence on agricultural production, the early prediction of severe weather events and unfavourable conditions can contribute to the mitigation of adverse effects. Seasonal climate forecasts provide additional value for agricultural applications in several regions of the world. However, they currently play a very limited role in supporting agricultural decisions in Europe, mainly due to the poor skill of relevant surface variables. Here we show how a combined stress index (CSI), considering both drought and heat stress in summer, can predict maize yield in Europe and how land-surface initialised seasonal climate forecasts can be used to predict it. The CSI explains on average nearly 53% of the inter-annual maize yield variability under observed climate conditions and shows how concurrent heat stress and drought events have influenced recent yield anomalies. Seasonal climate forecast initialised with realistic land-surface achieves better (and marginally useful) skill in predicting the CSI than with climatological land-surface initialisation in south-eastern Europe, part of central Europe, France and Italy.

  9. Improvements in disruption prediction at ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Aledda, R., E-mail: raffaele.aledda@diee.unica.it; Cannas, B., E-mail: cannas@diee.unica.it; Fanni, A., E-mail: fanni@diee.unica.it; Pau, A., E-mail: alessandro.pau@diee.unica.it; Sias, G., E-mail: giuliana.sias@diee.unica.it

    2015-10-15

    Highlights: • A disruption prediction system for AUG, based on a logistic model, is designed. • The length of the disruptive phase is set for each disruption in the training set. • The model is tested on a dataset different from that used during the training phase. • The generalization capability and the aging of the model have been tested. • The predictor performance is compared with the locked mode detector. - Abstract: In large-scale tokamaks disruptions have the potential to cause serious damage to the facility. Hence disruptions must be avoided, but, when a disruption is unavoidable, minimizing its severity is mandatory. A reliable detection of a disruptive event is required to trigger proper mitigation actions. To this end, machine learning methods have been widely studied to design disruption prediction systems at ASDEX Upgrade. The training phase of the proposed approaches relies on the availability of disrupted and non-disrupted discharges. In the literature, the disruptive configuration has been assumed to appear within the last 45 ms of each disruption. Although the achieved results in terms of correct predictions were good, the choice of such a fixed temporal window may have limited the prediction performance, as it generates confusing information for disruptions whose disruptive phase differs from 45 ms. Assessing a specific disruptive phase for each disruptive discharge is therefore an important step in understanding disruptive events. In this paper, the Mahalanobis distance is applied to define a specific disruptive phase for each disruption, and a logistic regressor is trained as disruption predictor. The results show that disruption prediction performance can be enhanced by defining a specific disruptive phase for each disruption.
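
    The role of the Mahalanobis distance here is to measure how far a discharge's diagnostic signals have drifted from a reference non-disruptive operating distribution; a large distance can mark the onset of that shot's specific disruptive phase. A toy two-signal sketch (all data synthetic, not AUG diagnostics):

    ```python
    import numpy as np

    def mahalanobis(x, mean, cov_inv):
        """Mahalanobis distance of a diagnostic-signal sample x from a
        reference (non-disruptive) operating distribution."""
        d = x - mean
        return float(np.sqrt(d @ cov_inv @ d))

    rng = np.random.default_rng(2)
    # Reference statistics from non-disruptive samples (2 toy signals).
    ref = rng.normal(0.0, 1.0, size=(500, 2))
    mu = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

    # Samples drifting away from normal operation get larger distances,
    # which can be thresholded to set the start of the disruptive phase.
    d_normal = mahalanobis(np.array([0.1, -0.2]), mu, cov_inv)
    d_drifted = mahalanobis(np.array([4.0, 3.5]), mu, cov_inv)
    ```
    
    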

  10. Improvements in disruption prediction at ASDEX Upgrade

    International Nuclear Information System (INIS)

    Aledda, R.; Cannas, B.; Fanni, A.; Pau, A.; Sias, G.

    2015-01-01

    Highlights: • A disruption prediction system for AUG, based on a logistic model, is designed. • The length of the disruptive phase is set for each disruption in the training set. • The model is tested on a dataset different from that used during the training phase. • The generalization capability and the aging of the model have been tested. • The predictor performance is compared with the locked mode detector. - Abstract: In large-scale tokamaks disruptions have the potential to cause serious damage to the facility. Hence disruptions must be avoided, but, when a disruption is unavoidable, minimizing its severity is mandatory. A reliable detection of a disruptive event is required to trigger proper mitigation actions. To this end, machine learning methods have been widely studied to design disruption prediction systems at ASDEX Upgrade. The training phase of the proposed approaches relies on the availability of disrupted and non-disrupted discharges. In the literature, the disruptive configuration has been assumed to appear within the last 45 ms of each disruption. Although the achieved results in terms of correct predictions were good, the choice of such a fixed temporal window may have limited the prediction performance, as it generates confusing information for disruptions whose disruptive phase differs from 45 ms. Assessing a specific disruptive phase for each disruptive discharge is therefore an important step in understanding disruptive events. In this paper, the Mahalanobis distance is applied to define a specific disruptive phase for each disruption, and a logistic regressor is trained as disruption predictor. The results show that disruption prediction performance can be enhanced by defining a specific disruptive phase for each disruption.

  11. Significant improvement of optical traps by tuning standard water immersion objectives

    International Nuclear Information System (INIS)

    Reihani, S Nader S; Mir, Shahid A; Richardson, Andrew C; Oddershede, Lene B

    2011-01-01

    Focused infrared lasers are widely used for micromanipulation and visualization of biological specimens. An inherent practical problem is that off-the-shelf commercial microscope objectives are designed for use with visible, not infrared, wavelengths. Water immersion objectives introduce less aberration than oil immersion ones; however, even water immersion objectives induce significant aberration. We present a simple method to reduce the spherical aberration induced by water immersion objectives, namely tuning the correction collar of the objective to a value that is ∼10% lower than the physical thickness of the coverslip. This results in marked improvements in optical trapping strength, up to 100% laterally and 600% axially, from a standard microscope objective designed for use in the visible range. The results are generally valid for any water immersion objective with any numerical aperture.

  12. Thermosensitive Hydrogel Mask Significantly Improves Skin Moisture and Skin Tone; Bilateral Clinical Trial

    Directory of Open Access Journals (Sweden)

    Anna Quattrone

    2017-06-01

    Objective: A temperature-sensitive, state-changing hydrogel mask was used in this study. Once it comes into contact with the skin and reaches body temperature, it uniformly and quickly releases the active compounds, which possess moisturizing, anti-oxidant, anti-inflammatory and regenerative properties. Methods: An open-label clinical trial was conducted to evaluate the effects of the test product on skin hydration, skin tone and skin ageing. Subjects applied the product to one side of their face and underwent Corneometer® and Chromameter measurements, visual assessment of facial skin ageing and facial photography. All assessments and Self-Perception Questionnaires (SPQ) were performed at baseline, after the first application of the test product and after four applications. Results: After a single treatment we observed an increase in skin moisturisation, an improvement in skin tone/luminosity and a reduction in signs of ageing, all statistically significant. After four applications a further improvement in all measured parameters was recorded. These results were confirmed by the subjects' own perceptions, as reported in the SPQ after both one and four applications. Conclusion: The hydrogel mask tested in this study is very effective in improving skin hydration, skin radiance and luminosity, in encouraging an even skin tone and in reducing skin pigmentation.

  13. Research on the Wire Network Signal Prediction Based on the Improved NNARX Model

    Science.gov (United States)

    Zhang, Zipeng; Fan, Tao; Wang, Shuqing

    It is difficult to accurately obtain the wire network signal of a power system's high-voltage transmission lines during monitoring and repair. To work around this, signals measured at a remote substation or in the laboratory are used for multi-point prediction of the needed data; however, the power grid frequency signal obtained this way is delayed. To solve this problem, this paper describes an improved NNARX network that predicts the frequency signal from multi-point data collected by remote substation PMUs. Because the error surface of the NNARX network is complicated, the L-M (Levenberg-Marquardt) algorithm is used to train the network. Simulation results show that the NNARX network has good prediction performance, providing accurate real-time data for field testing and maintenance.
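    As an illustration of the NARX idea behind the paper's predictor, the sketch below trains a small neural network on lagged samples of a synthetic frequency deviation signal and predicts one step ahead. It is not the paper's implementation: scikit-learn offers no Levenberg-Marquardt trainer, so the quasi-Newton lbfgs solver stands in, and the signal is invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic deviation (Hz) of the grid frequency around its 50 Hz nominal.
t = np.arange(0, 40, 0.1)
s = 0.05 * np.sin(0.5 * t)

# NARX-style design matrix: predict s[k] from the previous `lags` samples.
lags = 5
X = np.column_stack([s[i:len(s) - lags + i] for i in range(lags)])
y = s[lags:]

# lbfgs stands in for Levenberg-Marquardt, which scikit-learn does not offer.
model = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                     max_iter=5000, random_state=0)
model.fit(X[:300], y[:300])
mae = np.mean(np.abs(model.predict(X[300:]) - y[300:]))
```

    A real NNARX would add exogenous PMU channels as extra lagged inputs alongside the frequency history.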

  14. Considering Organic Carbon for Improved Predictions of Clay Content from Water Vapor Sorption

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per

    2014-01-01

    Accurate determination of the soil clay fraction (CF) is of crucial importance for characterization of numerous environmental, agricultural, and engineering processes. Because traditional methods for measurement of the CF are laborious and susceptible to errors, regression models relating the CF...... to water vapor sorption isotherms that can be rapidly measured with a fully automated vapor sorption analyzer are a viable alternative. In this presentation we evaluate the performance of recently developed regression models based on comparison with standard CF measurements for soils with high organic...... carbon (OC) content and propose a modification to improve prediction accuracy. Evaluation of the CF prediction accuracy for 29 soils with clay contents ranging from 6 to 25% and with OC contents from 2.0 to 8.4% showed that the models worked reasonably well for all soils when the OC content was below 2...

  15. Activity Prediction of Schiff Base Compounds using Improved QSAR Models of Cinnamaldehyde Analogues and Derivatives

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2015-10-01

    In past work, quantitative structure-activity relationship (QSAR) models of cinnamaldehyde analogues and derivatives (CADs) have been used to predict the activities of new chemicals based on their mass concentrations, but these approaches are not without shortcomings. Therefore, molar concentrations were used instead of mass concentrations to determine antifungal activity. New QSAR models of CADs against Aspergillus niger and Penicillium citrinum were established, and the molecular design of new CADs was performed. The antifungal properties of the designed CADs were tested, and the experimental Log AR values were in agreement with the predicted Log AR values. The results indicate that the improved QSAR models are more reliable and can be effectively used for the molecular design of CADs and the prediction of their activity. These findings provide new insight into the development and utilization of cinnamaldehyde compounds.

  16. Improving acute kidney injury diagnostics using predictive analytics.

    Science.gov (United States)

    Basu, Rajit K; Gist, Katja; Wheeler, Derek S

    2015-12-01

    Acute kidney injury (AKI) is a multifactorial syndrome affecting an alarming proportion of hospitalized patients. Although early recognition may expedite management, the ability to identify at-risk patients and those suffering real-time injury is inconsistent. This review summarizes recent reports describing advancements in AKI epidemiology, specifically focusing on risk scoring and predictive analytics. In the critical care population, the primary factors limiting prediction models are an inability to properly account for patient heterogeneity and underperforming metrics used to assess kidney function. Severity of illness scores demonstrate limited AKI predictive performance. Recent evidence suggests traditional methods for detecting AKI may be leveraged and ultimately replaced by newer, more sophisticated analytical tools capable of prediction and identification: risk stratification, novel AKI biomarkers, and clinical information systems. Additionally, the utility of novel biomarkers may be optimized through targeting using patient context, and may provide more granular information about the injury phenotype. Finally, manipulation of the electronic health record allows for real-time recognition of injury. Integrating a high-functioning clinical information system with risk stratification methodology and novel biomarkers yields a predictive analytic model for AKI diagnostics.

  17. Functional status and mortality prediction in community-acquired pneumonia.

    Science.gov (United States)

    Jeon, Kyeongman; Yoo, Hongseok; Jeong, Byeong-Ho; Park, Hye Yun; Koh, Won-Jung; Suh, Gee Young; Guallar, Eliseo

    2017-10-01

    Poor functional status (FS) has been suggested as a poor prognostic factor in both pneumonia and severe pneumonia in elderly patients. However, it is still unclear whether FS is associated with outcomes and improves survival prediction in community-acquired pneumonia (CAP) in the general population. Data on hospitalized patients with CAP and FS, assessed by the Eastern Cooperative Oncology Group (ECOG) scale were prospectively collected between January 2008 and December 2012. The independent association of FS with 30-day mortality in CAP patients was evaluated using multivariable logistic regression. Improvement in mortality prediction when FS was added to the CRB-65 (confusion, respiratory rate, blood pressure and age 65) score was evaluated for discrimination, reclassification and calibration. The 30-day mortality of study participants (n = 1526) was 10%. Mortality significantly increased with higher ECOG score (P for trend <0.001). In multivariable analysis, ECOG ≥3 was strongly associated with 30-day mortality (adjusted OR: 5.70; 95% CI: 3.82-8.50). Adding ECOG ≥3 significantly improved the discriminatory power of CRB-65. Reclassification indices also confirmed the improvement in discrimination ability when FS was combined with the CRB-65, with a categorized net reclassification index (NRI) of 0.561 (0.437-0.686), a continuous NRI of 0.858 (0.696-1.019) and a relative integrated discrimination improvement in the discrimination slope of 139.8 % (110.8-154.6). FS predicted 30-day mortality and improved discrimination and reclassification in consecutive CAP patients. Assessment of premorbid FS should be considered in mortality prediction in patients with CAP. © 2017 Asian Pacific Society of Respirology.
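    The reclassification statistics cited above can be made concrete with a small sketch of the categorical net reclassification index (NRI). The risk categories and outcomes below are invented for illustration; this is the textbook NRI formula, not the study's analysis code.

```python
import numpy as np

def categorical_nri(risk_old, risk_new, event):
    """Categorical NRI: (up - down | events) + (down - up | non-events).

    risk_old / risk_new -- integer risk category under each model
    event               -- 1 if the outcome (e.g. 30-day death) occurred
    """
    old, new, ev = (np.asarray(v) for v in (risk_old, risk_new, event))
    up, down = new > old, new < old
    nri_events = up[ev == 1].mean() - down[ev == 1].mean()
    nri_nonevents = down[ev == 0].mean() - up[ev == 0].mean()
    return nri_events + nri_nonevents

# Six invented patients: categories under a base score vs the score plus a
# hypothetical functional-status predictor.
nri = categorical_nri([0, 1, 1, 2, 0, 1], [1, 2, 1, 2, 0, 0],
                      [1, 1, 0, 1, 0, 0])
```

    A positive NRI means events tend to move up in risk and non-events down, which is the direction of improvement reported in the study.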

  18. FDG-PET/CT in the prediction of pulmonary function improvement in nonspecific interstitial pneumonia. A Pilot Study

    Energy Technology Data Exchange (ETDEWEB)

    Jacquelin, V. [AP-HP, Hosp. Avicenne, Department of Nuclear Medicine, Bobigny (France); Mekinian, A. [AP-HP, Hosp. Saint-Antoine, Department of Internal Medicine and Inflammation-Immunopathology-Biotherapy Department (DHU i2B), Paris (France); Brillet, P.Y. [AP-HP, Hosp. Avicenne, Department of Radiology, Bobigny (France); Univ. Paris 13, Sorbonne Paris Cité, Bobigny (France); Nunes, H. [AP-HP, Hosp. Avicenne, Department of Pneumology, Bobigny (France); Univ. Paris 13, Sorbonne Paris Cité, Bobigny (France); Fain, O. [AP-HP, Hosp. Saint-Antoine, Department of Internal Medicine and Inflammation-Immunopathology-Biotherapy Department (DHU i2B), Paris (France); Valeyre, D. [AP-HP, Hosp. Avicenne, Department of Pneumology, Bobigny (France); Univ. Paris 13, Sorbonne Paris Cité, Bobigny (France); Soussan, M., E-mail: michael.soussan@aphp.fr [AP-HP, Hosp. Avicenne, Department of Nuclear Medicine, Bobigny (France); Univ. Paris 13, Sorbonne Paris Cité, Bobigny (France)

    2016-12-15

    Purpose: Our study aimed to analyse the characteristics of nonspecific interstitial pneumonia (NSIP) using FDG-PET/CT (PET) and to evaluate its ability to predict the therapeutic response. Procedures: Eighteen NSIP patients were included. Maximum standardized uptake value (SUV{sub max}), FDG uptake extent (in percentage of lung volume), high resolution CT scan (HRCT) elementary lesions, and HRCT fibrosis score were recorded. The predictive value of the parameters for lung function improvement was evaluated using logistic regression and Receiver Operating Characteristic (ROC) curve analysis (n = 13/18). Results: All patients had an increased pulmonary FDG uptake (median SUV{sub max} = 3.1 [2–7.6]), with a median extent of 19% [6–67]. Consolidations, ground-glass opacities, honeycombing and reticulations showed uptake in 90%, 89%, 85% and 76%, respectively. FDG uptake extent was associated with improvement of pulmonary function under treatment (increase in forced vital capacity > 10%, p = 0.03), whereas SUV{sub max} and HRCT fibrosis score were not (p > 0.5). For FDG uptake extent, ROC analysis showed an area under the curve at 0.85 ± 0.11 and sensitivity/specificity was 88%/80% for a threshold fixed at 21%. Conclusions: Increased FDG uptake was observed in all NSIP patients, both in inflammatory and fibrotic HRCT lesions. The quantification of FDG uptake extent might be useful to predict functional improvement under treatment.

  19. FDG-PET/CT in the prediction of pulmonary function improvement in nonspecific interstitial pneumonia. A Pilot Study

    International Nuclear Information System (INIS)

    Jacquelin, V.; Mekinian, A.; Brillet, P.Y.; Nunes, H.; Fain, O.; Valeyre, D.; Soussan, M.

    2016-01-01

    Purpose: Our study aimed to analyse the characteristics of nonspecific interstitial pneumonia (NSIP) using FDG-PET/CT (PET) and to evaluate its ability to predict the therapeutic response. Procedures: Eighteen NSIP patients were included. Maximum standardized uptake value (SUV max ), FDG uptake extent (in percentage of lung volume), high resolution CT scan (HRCT) elementary lesions, and HRCT fibrosis score were recorded. The predictive value of the parameters for lung function improvement was evaluated using logistic regression and Receiver Operating Characteristic (ROC) curve analysis (n = 13/18). Results: All patients had an increased pulmonary FDG uptake (median SUV max = 3.1 [2–7.6]), with a median extent of 19% [6–67]. Consolidations, ground-glass opacities, honeycombing and reticulations showed uptake in 90%, 89%, 85% and 76%, respectively. FDG uptake extent was associated with improvement of pulmonary function under treatment (increase in forced vital capacity > 10%, p = 0.03), whereas SUV max and HRCT fibrosis score were not (p > 0.5). For FDG uptake extent, ROC analysis showed an area under the curve at 0.85 ± 0.11 and sensitivity/specificity was 88%/80% for a threshold fixed at 21%. Conclusions: Increased FDG uptake was observed in all NSIP patients, both in inflammatory and fibrotic HRCT lesions. The quantification of FDG uptake extent might be useful to predict functional improvement under treatment.

  20. Improved apparatus for predictive diagnosis of rotator cuff disease

    Science.gov (United States)

    Pillai, Anup; Hall, Brittany N.; Thigpen, Charles A.; Kwartowitz, David M.

    2014-03-01

    Rotator cuff disease impacts over 50% of the population over 60, with reported incidence as high as 90% within this population, causing pain and possible loss of function. The rotator cuff is composed of muscles and tendons that work in tandem to support the shoulder. Heavy use of these muscles can lead to rotator cuff tears, with the most common causes being age-related degeneration and sports injuries, both a function of overuse. Tears range in severity from partial-thickness tears to total rupture. Diagnostic techniques are based on physical assessment, detailed patient history, and medical imaging; X-ray, MRI and ultrasonography are the primary modalities for assessment. The final treatment technique and imaging modality, however, are chosen at the clinician's discretion. Ultrasound has been shown to have good accuracy for identification and measurement of full-thickness and partial-thickness rotator cuff tears. In this study, we report on the progress and improvement of our method of transduction and analysis for in situ measurement of rotator cuff biomechanics. We have improved the ability of the clinician to apply a uniform force to the underlying musculotendinous tissues while simultaneously obtaining the ultrasound image. This measurement protocol, combined with region of interest (ROI) based image processing, will help in developing a predictive diagnostic model for treatment of rotator cuff disease and help clinicians choose the best treatment technique.

  1. A New Approach to Improve Accuracy of Grey Model GMC(1,n) in Time Series Prediction

    Directory of Open Access Journals (Sweden)

    Sompop Moonchai

    2015-01-01

    This paper presents a modified grey model GMC(1,n) for use in systems that involve one dependent system behavior and n-1 relative factors. The proposed model was developed from the conventional GMC(1,n) model in order to improve its prediction accuracy by modifying the formula for calculating the background value, the system of parameter estimation, and the model prediction equation. The modified GMC(1,n) model was verified with two cases: forecasting CO2 emissions in Thailand and forecasting electricity consumption in Thailand. The results demonstrated that the modified GMC(1,n) model achieved higher fitting and prediction accuracy than the conventional GMC(1,n) and D-GMC(1,n) models.
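    For readers unfamiliar with grey models, the sketch below implements the classical univariate GM(1,1), whose background value z(k) is the quantity the modified GMC(1,n) redefines. This is the textbook construction on an invented series, not the authors' modified model.

```python
import numpy as np

def gm11_forecast(x0, steps):
    """Fit a GM(1,1) grey model to series x0 and forecast `steps` values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time response
    return np.diff(x1_hat)[len(x0) - 1:]               # restored forecasts

series = [2.0 * 1.05 ** k for k in range(8)]           # invented growth series
pred = gm11_forecast(series, steps=2)
```

    The GMC(1,n) variant adds n-1 relative-factor series to the right-hand side of the grey differential equation; the background-value averaging above is one of the components the paper modifies.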

  2. An improved Corten-Dolan's model based on damage and stress state effects

    International Nuclear Information System (INIS)

    Gao, Huiying; Huang, Hong Zhong; Lv, Zhiqiang; Zuo, Fang Jun; Wang, Hai Kun

    2015-01-01

    The value of the exponent d in Corten-Dolan's model is generally considered to be a constant. Nonetheless, predictions made on this basis deviate significantly from real values. In consideration of the effects of damage and stress state on fatigue life prediction, Corten-Dolan's model is improved by redefining the exponent d used in the traditional model. The improved model performs better than the traditional one in demonstrating the fatigue failure mechanism. Fatigue life predictions based on investigations of three metallic specimens indicate that the errors caused by the improved model are significantly smaller than those induced by the traditional model. Meanwhile, predictions derived from the improved model fall into a narrower dispersion zone than those made as per Miner's rule and the traditional model, suggesting that the proposed model improves on the life prediction accuracy of both. The predictions obtained using the improved Corten-Dolan's model differ only slightly from those derived from a model proposed in previous literature, with a few predictions from the former being more accurate. The improved model is therefore rational and reliable given the proven validity of the existing model, and can be feasibly and credibly applied to damage accumulation and fatigue life prediction.

  3. An improved Corten-Dolan's model based on damage and stress state effects

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Huiying; Huang, Hong Zhong; Lv, Zhiqiang; Zuo, Fang Jun; Wang, Hai Kun [University of Electronic Science and Technology of China, Chengdu (China)

    2015-08-15

    The value of the exponent d in Corten-Dolan's model is generally considered to be a constant. Nonetheless, predictions made on this basis deviate significantly from real values. In consideration of the effects of damage and stress state on fatigue life prediction, Corten-Dolan's model is improved by redefining the exponent d used in the traditional model. The improved model performs better than the traditional one in demonstrating the fatigue failure mechanism. Fatigue life predictions based on investigations of three metallic specimens indicate that the errors caused by the improved model are significantly smaller than those induced by the traditional model. Meanwhile, predictions derived from the improved model fall into a narrower dispersion zone than those made as per Miner's rule and the traditional model, suggesting that the proposed model improves on the life prediction accuracy of both. The predictions obtained using the improved Corten-Dolan's model differ only slightly from those derived from a model proposed in previous literature, with a few predictions from the former being more accurate. The improved model is therefore rational and reliable given the proven validity of the existing model, and can be feasibly and credibly applied to damage accumulation and fatigue life prediction.
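    The classical rule with a constant exponent d, which the abstract above takes as its baseline, can be sketched as follows. The formula is the commonly cited Corten-Dolan life estimate; the load spectrum values are invented for illustration.

```python
def corten_dolan_life(N1, stresses, fractions, d):
    """Cycles to failure under a multi-level load spectrum (Corten-Dolan).

    N1        -- cycles to failure at the highest stress level sigma_1
    stresses  -- stress amplitude of each load level
    fractions -- fraction of cycles applied at each level (sums to 1)
    d         -- Corten-Dolan exponent (a constant in the classical rule)
    """
    s1 = max(stresses)
    return N1 / sum(f * (s / s1) ** d for f, s in zip(fractions, stresses))

# Invented three-level spectrum: a larger d discounts low-stress cycles more
# strongly, so the predicted life grows with d.
life_d4 = corten_dolan_life(1e5, [300, 200, 150], [0.2, 0.5, 0.3], d=4.0)
life_d6 = corten_dolan_life(1e5, [300, 200, 150], [0.2, 0.5, 0.3], d=6.0)
```

    The paper's improvement amounts to replacing the constant `d` argument with a function of accumulated damage and stress state.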

  4. Dynameomics: Data-driven methods and models for utilizing large-scale protein structure repositories for improving fragment-based loop prediction

    Science.gov (United States)

    Rysavy, Steven J; Beck, David AC; Daggett, Valerie

    2014-01-01

    Protein function is intimately linked to protein structure and dynamics, yet experimentally determined structures frequently omit regions within a protein due to indeterminate data, often a consequence of protein dynamics. We propose that atomistic molecular dynamics simulations provide a diverse sampling of biologically relevant structures for these missing segments (and beyond) to improve structural modeling and structure prediction. Here we make use of the Dynameomics data warehouse, which contains simulations of representatives of essentially all known protein folds. We developed novel computational methods to efficiently identify, rank and retrieve small peptide structures, or fragments, from this database. We also created a novel data model to analyze and compare large repositories of structural data, such as contained within the Protein Data Bank and the Dynameomics data warehouse. Our evaluation compares these structural repositories for improving loop predictions and analyzes the utility of our methods and models. Using a standard set of loop structures, containing 510 loops, 30 for each loop length from 4 to 20 residues, we find that the inclusion of Dynameomics structures in fragment-based methods improves the quality of the loop predictions without being dependent on sequence homology. Depending on loop length, ∼25–75% of the best predictions came from the Dynameomics set, resulting in lower main chain root-mean-square deviations for all fragment lengths using the combined fragment library. We also provide specific cases where Dynameomics fragments provide better predictions for NMR loop structures than fragments from crystal structures. Online access to these fragment libraries is available at http://www.dynameomics.org/fragments. PMID:25142412

  5. Dynameomics: data-driven methods and models for utilizing large-scale protein structure repositories for improving fragment-based loop prediction.

    Science.gov (United States)

    Rysavy, Steven J; Beck, David A C; Daggett, Valerie

    2014-11-01

    Protein function is intimately linked to protein structure and dynamics, yet experimentally determined structures frequently omit regions within a protein due to indeterminate data, often a consequence of protein dynamics. We propose that atomistic molecular dynamics simulations provide a diverse sampling of biologically relevant structures for these missing segments (and beyond) to improve structural modeling and structure prediction. Here we make use of the Dynameomics data warehouse, which contains simulations of representatives of essentially all known protein folds. We developed novel computational methods to efficiently identify, rank and retrieve small peptide structures, or fragments, from this database. We also created a novel data model to analyze and compare large repositories of structural data, such as contained within the Protein Data Bank and the Dynameomics data warehouse. Our evaluation compares these structural repositories for improving loop predictions and analyzes the utility of our methods and models. Using a standard set of loop structures, containing 510 loops, 30 for each loop length from 4 to 20 residues, we find that the inclusion of Dynameomics structures in fragment-based methods improves the quality of the loop predictions without being dependent on sequence homology. Depending on loop length, ∼25-75% of the best predictions came from the Dynameomics set, resulting in lower main chain root-mean-square deviations for all fragment lengths using the combined fragment library. We also provide specific cases where Dynameomics fragments provide better predictions for NMR loop structures than fragments from crystal structures. Online access to these fragment libraries is available at http://www.dynameomics.org/fragments. © 2014 The Protein Society.
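    One standard building block for identifying and ranking fragment candidates, as described above, is the root-mean-square deviation after optimal superposition (the Kabsch algorithm). The sketch below is a generic illustration on random stand-in coordinates, not the Dynameomics retrieval code.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between Nx3 coordinate sets after optimally rotating P onto Q."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    V, _, Wt = np.linalg.svd(P.T @ Q)
    sign = np.sign(np.linalg.det(V @ Wt))      # guard against improper rotations
    R = V @ np.diag([1.0, 1.0, sign]) @ Wt
    return float(np.sqrt(((P @ R - Q) ** 2).sum() / len(P)))

# Rank fragment candidates against a target loop (random stand-in coordinates
# playing the role of C-alpha positions).
rng = np.random.default_rng(2)
target = rng.normal(size=(8, 3))
candidates = [rng.normal(size=(8, 3)) for _ in range(5)] + [target.copy()]
ranked = sorted(candidates, key=lambda F: kabsch_rmsd(F, target))
```

    In a real fragment library the candidates would be pre-indexed so that only a small subset needs the full superposition.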

  6. The energetic significance of cooking.

    Science.gov (United States)

    Carmody, Rachel N; Wrangham, Richard W

    2009-10-01

    While cooking has long been argued to improve the diet, the nature of the improvement has not been well defined. As a result, the evolutionary significance of cooking has variously been proposed as being substantial or relatively trivial. In this paper, we evaluate the hypothesis that an important and consistent effect of cooking food is a rise in its net energy value. The pathways by which cooking influences net energy value differ for starch, protein, and lipid, and we therefore consider plant and animal foods separately. Evidence of compromised physiological performance among individuals on raw diets supports the hypothesis that cooked diets tend to provide more energy. Mechanisms contributing to energy being gained from cooking include increased digestibility of starch and protein, reduced costs of digestion for cooked versus raw meat, and reduced energetic costs of detoxification and defence against pathogens. If cooking consistently improves the energetic value of foods through such mechanisms, its evolutionary impact depends partly on the relative energetic benefits of non-thermal processing methods used prior to cooking. We suggest that if non-thermal processing methods such as pounding were used by Lower Palaeolithic Homo, they likely provided an important increase in energy gain over unprocessed raw diets. However, cooking has critical effects not easily achievable by non-thermal processing, including the relatively complete gelatinisation of starch, efficient denaturing of proteins, and killing of food-borne pathogens. This means that however sophisticated the non-thermal processing methods were, cooking would have conferred incremental energetic benefits. While much remains to be discovered, we conclude that the adoption of cooking would have led to an important rise in energy availability. For this reason, we predict that cooking had substantial evolutionary significance.

  7. Climate-based models for pulsed resources improve predictability of consumer population dynamics: outbreaks of house mice in forest ecosystems.

    Directory of Open Access Journals (Sweden)

    E Penelope Holland

    Accurate predictions of the timing and magnitude of consumer responses to episodic seeding events (masts) are important for understanding ecosystem dynamics and for managing outbreaks of invasive species generated by masts. While models relating consumer populations to resource fluctuations have been developed successfully for a range of natural and modified ecosystems, a critical gap that needs addressing is better prediction of resource pulses. A recent model used the change in summer temperature from one year to the next (ΔT) to predict masts for forest and grassland plants in New Zealand. We extend this climate-based method in the framework of a model for consumer-resource dynamics to predict invasive house mouse (Mus musculus) outbreaks in forest ecosystems. Compared with previous mast models based on absolute temperature, the ΔT method for predicting masts resulted in an improved model for mouse population dynamics. There was also a threshold effect of ΔT on the likelihood of an outbreak occurring. The improved climate-based method for predicting resource pulses and consumer responses provides a straightforward rule of thumb for determining, with one year's advance warning, whether management intervention might be required in invaded ecosystems. The approach could be applied to consumer-resource systems worldwide where climatic variables are used to model the size and duration of resource pulses, and may have particular relevance for ecosystems where global change scenarios predict increased variability in climatic events.
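    The ΔT rule of thumb described above can be reduced to a few lines: flag a possible mast (and hence a potential mouse outbreak) one year ahead whenever the year-on-year change in mean summer temperature exceeds a threshold. The threshold value and the temperatures below are invented for illustration only.

```python
def mast_warnings(summer_temps, threshold=0.8):
    """One warning per year from the second onward: True if the rise in mean
    summer temperature over the previous year exceeds the threshold."""
    return [b - a > threshold for a, b in zip(summer_temps, summer_temps[1:])]

# Invented mean summer temperatures (deg C) for five consecutive years.
flags = mast_warnings([14.2, 15.6, 15.4, 14.1, 15.3])
```

    In the paper's framework the ΔT signal feeds a full consumer-resource model rather than a bare threshold, but the threshold captures the management rule of thumb.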

  8. Using road topology to improve cyclist path prediction

    NARCIS (Netherlands)

    Pool, E.A.I.; Kooij, J.F.P.; Gavrila, D.; Ioannou, Petros; Zhang, Wei-Bin; Lu, Meng

    2017-01-01

    We learn motion models for cyclist path prediction on real-world tracks obtained from a moving vehicle, and propose to exploit the local road topology to obtain better predictive distributions. The tracks are extracted from the Tsinghua-Daimler Cyclist Benchmark for cyclist detection, and corrected

  9. Early signs that predict later haemodynamically significant patent ductus arteriosus.

    Science.gov (United States)

    Engür, Defne; Deveci, Murat; Türkmen, Münevver K

    2016-03-01

    Our aim was to determine the optimal cut-off values, sensitivity, specificity, and diagnostic power of 12 echocardiographic parameters on the second day of life to predict subsequent ductal patency. We evaluated preterm infants, born at ⩽32 weeks of gestation, starting on their second day of life, and they were evaluated every other day until ductal closure or until there were clinical signs of re-opening. We measured transductal diameter; pulmonary arterial diastolic flow; retrograde aortic diastolic flow; pulsatility index of the left pulmonary artery and descending aorta; left atrium and ventricle/aortic root ratio; left ventricular output; left ventricular flow velocity time integral; mitral early/late diastolic flow; and superior caval vein diameter and flow as well as performed receiver operating curve analysis. Transductal diameter (>1.5 mm); pulmonary arterial diastolic flow (>25.6 cm/second); presence of retrograde aortic diastolic flow; ductal diameter by body weight (>1.07 mm/kg); left pulmonary arterial pulsatility index (⩽0.71); and left ventricle to aortic root ratio (>2.2) displayed high sensitivity and specificity (p0.9). Parameters with moderate sensitivity and specificity were as follows: left atrial to aortic root ratio; left ventricular output; left ventricular flow velocity time integral; and mitral early/late diastolic flow ratio (p0.05) had low diagnostic value. Left pulmonary arterial pulsatility index, left ventricle/aortic root ratio, and ductal diameter by body weight are useful adjuncts offering a broader outlook for predicting ductal patency.

  10. The significance of collateral vessels, as seen on chest CT, in predicting SVC obstruction

    International Nuclear Information System (INIS)

    Yeouk, Young Soo; Kim, Sung Jin; Bae, Il Hun; Kim, Jae Youn; Hwang, Seung Min; Han, Gi Seok; Park, Kil Sun; Kim, Dae Young

    1998-01-01

    To evaluate the significance of collateral veins, as seen on chest CT, in the diagnosis of superior vena cava (SVC) obstruction, we retrospectively reviewed the records of 81 patients in whom collateral veins were seen on chest CT. On spiral CT (n=49), contrast material was infused via power injector, and on conventional CT (n=32), 50 ml bolus infusion was followed by 50 ml drip infusion. Obstruction of the SVC was evaluated on chest CT; in five cases in which evaluation of the SVC or its major tributaries was difficult, the patient underwent SVC phlebography. Collateral vessels were assigned to one of ten categories. On conventional CT, the jugular venous arch was the only collateral vessel predictive of SVC obstruction; on spiral CT, however, collateral vessels are a nonspecific finding and not helpful in the diagnosis of SVC obstruction. (author). 12 refs., 2 tab., 2 figs

  11. Natalizumab Significantly Improves Cognitive Impairment over Three Years in MS: Pattern of Disability Progression and Preliminary MRI Findings.

    Directory of Open Access Journals (Sweden)

    Flavia Mattioli

    Previous studies reported that Multiple Sclerosis (MS) patients treated with natalizumab for one or two years exhibit a significant reduction in relapse rate and in cognitive impairment, but the long-term effects on cognitive performance are unknown. This study aimed to evaluate the effects of natalizumab on cognitive impairment in a cohort of 24 consecutive patients with relapsing-remitting MS treated for 3 years. The neuropsychological tests, as well as relapse number and EDSS, were assessed at baseline and yearly for three years. The impact on cortical atrophy was also assessed in a subgroup of patients; these results are thus to be considered preliminary. Results showed a significant reduction in the number of impaired neuropsychological tests after three years, a significant decrease in annualized relapse rate at each time point compared to baseline, and a stable EDSS. In the neuropsychological assessment, a significant improvement in memory, attention and executive function test scores was detected. Preliminary MRI data show that, while GM volume did not change at 3 years, significantly greater parahippocampal and prefrontal gray matter density was noticed, the former correlating with neuropsychological improvement in a memory test. This study showed that therapy with natalizumab is helpful in improving cognitive performance, and is likely to have a protective role on grey matter, over a three-year follow-up.

  12. Improved predictive mapping of indoor radon concentrations using ensemble regression trees based on automatic clustering of geological units

    International Nuclear Information System (INIS)

    Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios, Martha; Baechler, Sébastien

    2015-01-01

    Purpose: According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. Method: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pair-wise Kolmogorov distances between IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). Results: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, IRC differences in metamorphic rocks such as gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, variable importance evaluated by random forests shows that building characteristics are less important predictors for IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. Conclusion: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables.
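
The clustering step described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: it assumes the two-sample Kolmogorov-Smirnov statistic as the pairwise distance between lithological units' IRC samples, and uses a naive PAM-style k-medoids with farthest-point initialization.

```python
import numpy as np
from scipy.stats import ks_2samp

def ks_distance_matrix(samples):
    """Pairwise two-sample Kolmogorov-Smirnov statistics between IRC samples."""
    n = len(samples)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d[i, j] = d[j, i] = ks_2samp(samples[i], samples[j]).statistic
    return d

def k_medoids(d, k, n_iter=100):
    """Naive PAM-style k-medoids on a precomputed distance matrix."""
    # Greedy farthest-point initialization for well-separated starting medoids.
    medoids = [int(np.argmax(d.sum(axis=1)))]
    while len(medoids) < k:
        medoids.append(int(np.argmax(d[:, medoids].min(axis=1))))
    medoids = np.array(medoids)
    for _ in range(n_iter):
        labels = np.argmin(d[:, medoids], axis=1)
        new = medoids.copy()
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size:
                # New medoid: member minimizing total distance within the cluster.
                new[c] = members[np.argmin(d[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(np.sort(new), np.sort(medoids)):
            break
        medoids = new
    return medoids, labels
```

With IRC distributions that differ strongly between rock types, units drawn from the same distribution end up in the same cluster.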

  13. A genetic risk score combining ten psoriasis risk loci improves disease prediction.

    Directory of Open Access Journals (Sweden)

    Haoyan Chen

    2011-04-01

    Full Text Available Psoriasis is a chronic, immune-mediated skin disease affecting 2-3% of Caucasians. Recent genetic association studies have identified multiple psoriasis risk loci; however, most of these loci contribute only modestly to disease risk. In this study, we investigated whether a genetic risk score (GRS) combining multiple loci could improve psoriasis prediction. Two approaches were used: a simple risk allele count (cGRS) and a weighted (wGRS) approach. Ten psoriasis risk SNPs were genotyped in 2815 case-control samples and 858 family samples. We found that the total number of risk alleles in the cases was significantly higher than in controls, mean 13.16 (SD 1.7) versus 12.09 (SD 1.8), p = 4.577×10⁻⁴⁰. The wGRS captured considerably more risk than any SNP considered alone, with a psoriasis OR for high-low wGRS quartiles of 10.55 (95% CI 7.63-14.57, p = 2.010×10⁻⁶⁵). To compare the discriminatory ability of the GRS models, receiver operating characteristic curves were used to calculate the area under the curve (AUC). The AUC for the wGRS was significantly greater than for the cGRS (72.0% versus 66.5%, p = 2.13×10⁻⁸). Additionally, the AUC for HLA-C alone (rs10484554) was equivalent to the AUC for all nine other risk loci combined (66.2% versus 63.8%, p = 0.18), highlighting the dominance of HLA-C as a risk locus. Logistic regression revealed that the wGRS was significantly associated with two subphenotypes of psoriasis, age of onset (p = 4.91×10⁻⁶) and family history (p = 0.020). Using a liability threshold model, we estimated that the 10 risk loci account for only 11.6% of the genetic variance in psoriasis. In summary, we found that a GRS combining 10 psoriasis risk loci captured significantly more risk than any individual SNP and was associated with early onset of disease and a positive family history. Notably, only a small fraction of psoriasis heritability is captured by the common risk variants identified to date.
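
The two scoring schemes compared above can be sketched in a few lines. The SNP count and odds ratios below are hypothetical placeholders; the study's actual weights came from its own association estimates.

```python
import numpy as np

def count_grs(genotypes):
    """cGRS: unweighted count of risk alleles (0, 1, or 2 per SNP)."""
    return genotypes.sum(axis=1)

def weighted_grs(genotypes, odds_ratios):
    """wGRS: risk-allele counts weighted by each SNP's log odds ratio."""
    return genotypes @ np.log(np.asarray(odds_ratios, float))
```

For example, two individuals genotyped at three SNPs with per-SNP odds ratios of 1.5, 2.0, and 4.0 get both an allele count and a log-OR-weighted score.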

  14. Improved methods for predicting peptide binding affinity to MHC class II molecules.

    Science.gov (United States)

    Jensen, Kamilla Kjaergaard; Andreatta, Massimo; Marcatili, Paolo; Buus, Søren; Greenbaum, Jason A; Yan, Zhen; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten

    2018-01-06

    Major histocompatibility complex class II (MHC-II) molecules are expressed on the surface of professional antigen-presenting cells where they display peptides to T helper cells, which orchestrate the onset and outcome of many host immune responses. Understanding which peptides will be presented by the MHC-II molecule is therefore important for understanding the activation of T helper cells and can be used to identify T-cell epitopes. We here present updated versions of two MHC-II-peptide binding affinity prediction methods, NetMHCII and NetMHCIIpan. These were constructed using an extended data set of quantitative MHC-peptide binding affinity data obtained from the Immune Epitope Database covering HLA-DR, HLA-DQ, HLA-DP and H-2 mouse molecules. We show that training with this extended data set improved the performance for peptide binding predictions for both methods. Both methods are publicly available at www.cbs.dtu.dk/services/NetMHCII-2.3 and www.cbs.dtu.dk/services/NetMHCIIpan-3.2. © 2018 John Wiley & Sons Ltd.

  15. [Prognostic prediction of the functional capacity and effectiveness of functional improvement program of the musculoskeletal system among users of preventive care service under long-term care insurance].

    Science.gov (United States)

    Sone, Toshimasa; Nakaya, Naoki; Tomata, Yasutake; Aida, Jun; Okubo, Ichiro; Ohara, Satoko; Obuchi, Shuichi; Sugiyama, Michiko; Yasumura, Seiji; Suzuki, Takao; Tsuji, Ichiro

    2013-01-01

    The purpose of this study was to examine the effectiveness of the Functional Improvement Program of the Musculoskeletal System among users of Preventive Care Service under Long-Term Care Insurance. A total of 3,073 subjects were analyzed. We used the prediction formula to estimate the predicted value of the Kihon Checklist after one year, and calculated the measured value minus the predicted value. The subjects were divided into two groups according to the measured value minus predicted value tertiles: the lowest and middle tertile (good-to-fair measured value) and the highest tertile (poor measured value). We used a multiple logistic regression model to calculate the odds ratio (OR) and 95% confidence interval (CI) of the good-to-fair measured values of the Kihon Checklist after one year, according to the Functional Improvement Program of the Musculoskeletal System. In potentially dependent elderly, the multivariate adjusted ORs (95% CI) of the good-to-fair measured values were 2.4 (1.3-4.4) for those who attended the program eight times or more in a month (vs those who attended it three times or less in a month), 1.3 (1.0-1.8) for those who engaged in strength training using machines (vs those who did not train), and 1.4 (1.0-1.9) for those who engaged in endurance training. In this study, among potentially dependent elderly, those who attended the program eight times or more in a month and those who engaged in strength training using machines or endurance training showed a significant improvement of their functional capacity.

  16. Representing leaf and root physiological traits in CLM improves global carbon and nitrogen cycling predictions

    Science.gov (United States)

    Ghimire, Bardan; Riley, William J.; Koven, Charles D.; Mu, Mingquan; Randerson, James T.

    2016-06-01

    In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However, current Earth System Models (ESMs) do not mechanistically represent functional nitrogen allocation for photosynthesis or the linkage between nitrogen uptake and root traits. The current version of CLM (4.5) links nitrogen availability and plant productivity via (1) an instantaneous downregulation of potential photosynthesis rates based on soil mineral nitrogen availability, and (2) apportionment of soil nitrogen between plants and competing nitrogen consumers assumed to be proportional to their relative N demands. However, plants do not photosynthesize at potential rates and then downregulate; instead photosynthesis rates are governed by nitrogen that has been allocated to the physiological processes underpinning photosynthesis. Furthermore, the role of plant roots in nutrient acquisition has also been largely ignored in ESMs. We therefore present a new plant nitrogen model for CLM4.5 with (1) improved representations of linkages between leaf nitrogen and plant productivity based on observed relationships in a global plant trait database and (2) plant nitrogen uptake based on root-scale Michaelis-Menten uptake kinetics. Our model improvements led to a global bias reduction in GPP, LAI, and biomass of 70%, 11%, and 49%, respectively. Furthermore, water use efficiency predictions were improved conceptually, qualitatively, and in magnitude. The new model's GPP responses to nitrogen deposition, CO2 fertilization, and climate also differed from the baseline model. The mechanistic representation of leaf-level nitrogen allocation and a theoretically consistent treatment of competition with belowground consumers led to overall improvements in global carbon cycling predictions.
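
The root-scale uptake formulation referred to above is standard Michaelis-Menten kinetics; a minimal sketch (parameter values are illustrative, not taken from CLM):

```python
def michaelis_menten_uptake(v_max, k_m, soil_n):
    """Nitrogen uptake rate that saturates with soil mineral N concentration.

    v_max: maximum uptake rate; k_m: half-saturation constant; soil_n: soil mineral N.
    """
    return v_max * soil_n / (k_m + soil_n)
```

At soil_n equal to k_m the rate is half of v_max, and at high soil concentrations it saturates at v_max, in contrast to the instantaneous downregulation scheme of the baseline model.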

  17. Improving a prediction system for oil spills in the Yellow Sea: effect of tides on subtidal flow.

    Science.gov (United States)

    Kim, Chang-Sin; Cho, Yang-Ki; Choi, Byoung-Ju; Jung, Kyung Tae; You, Sung Hyup

    2013-03-15

    A multi-nested prediction system for the Yellow Sea using drifter trajectory simulations was developed to predict the movements of an oil spill after the MV Hebei Spirit accident. The speeds of the oil spill trajectories predicted by the model without tidal forcing were substantially faster than the observations; however, predictions taking into account the tides, including both tidal cycle and subtidal periods, were satisfactorily improved. Subtidal flow in the simulation without tides was stronger than in that with tides because of reduced frictional effects. Friction induced by tidal stress decelerated the southward subtidal flows driven by northwesterly winter winds along the Korean coast of the Yellow Sea. These results strongly suggest that in order to produce accurate predictions of oil spill trajectories, simulations must include tidal effects, such as variations within a tidal cycle and advections over longer time scales in tide-dominated areas. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Assessment of Arctic and Antarctic sea ice predictability in CMIP5 decadal hindcasts

    Directory of Open Access Journals (Sweden)

    C.-Y. Yang

    2016-10-01

    Full Text Available This paper examines the ability of coupled global climate models to predict decadal variability of Arctic and Antarctic sea ice. We analyze decadal hindcasts/predictions of 11 Coupled Model Intercomparison Project Phase 5 (CMIP5) models. Decadal hindcasts exhibit a large multi-model spread in the simulated sea ice extent, with some models deviating significantly from the observations as the predicted ice extent quickly drifts away from the initial constraint. The anomaly correlation analysis between the decadal hindcast and observed sea ice suggests that in the Arctic, for most models, the areas showing significant predictive skill become broader with increasing lead time. This area expansion is largely because nearly all the models are capable of predicting the observed decreasing Arctic sea ice cover. Sea ice extent in the North Pacific has better predictive skill than that in the North Atlantic (particularly at a lead time of 3–7 years), but there is a re-emerging predictive skill in the North Atlantic at a lead time of 6–8 years. In contrast to the Arctic, Antarctic sea ice decadal hindcasts do not show broad predictive skill at any timescale, and there is no obvious growth of the areal extent of significant predictive skill with increasing lead time. This might be because nearly all the models predict a retreating Antarctic sea ice cover, opposite to the observations. For the Arctic, the predictive skill of the multi-model ensemble mean outperforms most models and the persistence prediction at longer timescales, which is not the case for the Antarctic. Overall, for the Arctic, initialized decadal hindcasts show improved predictive skill compared to uninitialized simulations, although this improvement is not present in the Antarctic.
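
The anomaly correlation analysis mentioned above computes the correlation of forecast and observed departures from a reference climatology; a minimal sketch:

```python
import numpy as np

def anomaly_correlation(forecast, observed, climatology):
    """Anomaly correlation coefficient (ACC) between forecast and observations."""
    f = np.asarray(forecast, float) - np.asarray(climatology, float)
    o = np.asarray(observed, float) - np.asarray(climatology, float)
    return float(np.sum(f * o) / np.sqrt(np.sum(f ** 2) * np.sum(o ** 2)))
```

A perfect forecast of the anomalies yields an ACC of 1, and anomalies of the opposite sign yield -1.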

  19. Energy optimization and prediction of complex petrochemical industries using an improved artificial neural network approach integrating data envelopment analysis

    International Nuclear Information System (INIS)

    Han, Yong-Ming; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-01-01

    Graphical abstract: This paper proposes an energy optimization and prediction method for complex petrochemical industries based on a DEA-integrated ANN approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA-ANN prediction model is effectively verified by executing a linear comparison between all DMUs and the effective DMUs through the standard data source from the UCI (University of California at Irvine) repository. Finally, the proposed model is validated through an application in a complex ethylene production system of the Chinese petrochemical industry. Meanwhile, the optimization result and the prediction value are obtained to reduce energy consumption of the ethylene production system, guide ethylene production and improve energy efficiency. - Highlights: • The DEA-integrated ANN approach is proposed. • The DEA-ANN prediction model is effectively verified through the standard data source from the UCI repository. • The energy optimization and prediction framework for complex petrochemical industries based on the proposed method is obtained. • The proposed method is valid and efficient in improving energy efficiency in complex petrochemical plants. - Abstract: Since complex petrochemical data have characteristics of multi-dimensionality, uncertainty and noise, it is difficult to accurately optimize and predict the energy usage of complex petrochemical systems. Therefore, this paper proposes a data envelopment analysis (DEA) integrated artificial neural network (ANN) approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA

  20. Theoretical study on new bias factor methods to effectively use critical experiments for improvement of prediction accuracy of neutronic characteristics

    International Nuclear Information System (INIS)

    Kugo, Teruhiko; Mori, Takamasa; Takeda, Toshikazu

    2007-01-01

    Extended bias factor methods are proposed with two new concepts, the LC method and the PE method, in order to effectively use critical experiments and to enhance the applicability of the bias factor method for the improvement of the prediction accuracy of neutronic characteristics of a target core. Both methods utilize a number of critical experimental results and produce a semifictitious experimental value with them. The LC and PE methods define the semifictitious experimental values by a linear combination of experimental values and the product of exponentiated experimental values, respectively, and the corresponding semifictitious calculation values by those of calculation values. A bias factor is defined by the ratio of the semifictitious experimental value to the semifictitious calculation value in both methods. We formulate how to determine weights for the LC method and exponents for the PE method in order to minimize the variance of the design prediction value obtained by multiplying the design calculation value by the bias factor. From a theoretical comparison of these new methods with the conventional method which utilizes a single experimental result and the generalized bias factor method which was previously proposed to utilize a number of experimental results, it is concluded that the PE method is the most useful method for improving the prediction accuracy. The main advantages of the PE method are summarized as follows. The prediction accuracy is necessarily improved compared with the design calculation value even when experimental results include large experimental errors. This is a special feature that the other methods do not have. The prediction accuracy is most effectively improved by utilizing all the experimental results. From these facts, it can be said that the PE method effectively utilizes all the experimental results and has a possibility to make a full-scale-mockup experiment unnecessary with the use of existing and future benchmark
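
The PE method's semifictitious values described above are products of exponentiated experimental (or calculated) values, and the bias factor is their ratio. A numerical sketch with made-up values and exponents follows; the paper derives the exponents by minimizing the variance of the design prediction, which is not reproduced here.

```python
import numpy as np

def pe_bias_factor(experiments, calculations, exponents):
    """Bias factor: the product of exponentiated experimental values divided by
    the same product formed from the corresponding calculated values."""
    e = np.prod(np.asarray(experiments, float) ** np.asarray(exponents, float))
    c = np.prod(np.asarray(calculations, float) ** np.asarray(exponents, float))
    return e / c

def corrected_prediction(design_calc, experiments, calculations, exponents):
    """Design prediction: design calculation value multiplied by the bias factor."""
    return design_calc * pe_bias_factor(experiments, calculations, exponents)
```

If every calculation overpredicts its experiment by the same factor, the bias factor recovers exactly that factor (for exponents summing to one) and removes the common bias from the design calculation.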

  1. Detection and characterization of 3D-signature phosphorylation site motifs and their contribution towards improved phosphorylation site prediction in proteins

    Directory of Open Access Journals (Sweden)

    Selbig Joachim

    2009-04-01

    Full Text Available Abstract Background Phosphorylation of proteins plays a crucial role in the regulation and activation of metabolic and signaling pathways and constitutes an important target for pharmaceutical intervention. Central to the phosphorylation process is the recognition of specific target sites by protein kinases followed by the covalent attachment of phosphate groups to the amino acids serine, threonine, or tyrosine. The experimental identification as well as computational prediction of phosphorylation sites (P-sites) has proved to be a challenging problem. Computational methods have focused primarily on extracting predictive features from the local, one-dimensional sequence information surrounding phosphorylation sites. Results We characterized the spatial context of phosphorylation sites and assessed its usability for improved phosphorylation site predictions. We identified 750 non-redundant, experimentally verified sites with three-dimensional (3D) structural information available in the Protein Data Bank (PDB) and grouped them according to their respective kinase family. We studied the spatial distribution of amino acids around phosphoserines, phosphothreonines, and phosphotyrosines to extract signature 3D-profiles. Characteristic spatial distributions of amino acid residue types around phosphorylation sites were indeed discernible, especially when kinase-family-specific target sites were analyzed. To test the added value of using spatial information for the computational prediction of phosphorylation sites, Support Vector Machines were applied using both sequence and structural information. When compared to sequence-only based prediction methods, a small but consistent performance improvement was obtained when the prediction was informed by 3D-context information. Conclusion While local one-dimensional amino acid sequence information was observed to harbor most of the discriminatory power, spatial context information was identified as

  2. Using Terrain Analysis and Remote Sensing to Improve Snow Mass Balance and Runoff Prediction

    Science.gov (United States)

    Venteris, E. R.; Coleman, A. M.; Wigmosta, M. S.

    2010-12-01

    Approximately 70-80% of the water in the international Columbia River basin is sourced from snowmelt. The demand for this water has competing needs, as it is used for agricultural irrigation, municipal, hydro and nuclear power generation, and environmental in-stream flow requirements. Accurate forecasting of water supply is essential for planning current needs and prediction of future demands due to growth and climate change. A significant limitation on current forecasting is spatial and temporal uncertainty in snowpack characteristics, particularly snow water equivalent. Currently, point measurements of snow mass balance are provided by the NRCS SNOTEL network. Each site consists of a snow mass sensor and meteorology station that monitors snow water equivalent, snow depth, precipitation, and temperature. There are currently 152 sites in the mountains of Oregon and Washington. An important step in improving forecasts is determining how representative each SNOTEL site is of the total mass balance of the watershed through a full accounting of the spatiotemporal variability in snowpack processes. This variation is driven by the interaction between meteorological processes, land cover, and landform. Statistical and geostatistical spatial models relate the state of the snowpack (characterized through SNOTEL, snow course measurements, and multispectral remote sensing) to terrain attributes derived from digital elevation models (elevation, aspect, slope, compound topographic index, topographic shading, etc.) and land cover. Time steps representing the progression of the snow season for several meteorologically distinct water years are investigated to identify and quantify dominant physical processes. The spatially distributed snow balance data can be used directly as model inputs to improve short- and long-range hydrologic forecasts.

  3. Composition-Based Prediction of Temperature-Dependent Thermophysical Food Properties: Reevaluating Component Groups and Prediction Models.

    Science.gov (United States)

    Phinney, David Martin; Frelka, John C; Heldman, Dennis Ray

    2017-01-01

    Prediction of temperature-dependent thermophysical properties (thermal conductivity, density, specific heat, and thermal diffusivity) is an important component of process design for food manufacturing. Current models for prediction of thermophysical properties of foods are based on the composition, specifically, fat, carbohydrate, protein, fiber, water, and ash contents, all of which change with temperature. The objectives of this investigation were to reevaluate and improve the prediction expressions for thermophysical properties. Previously published data were analyzed over the temperature range from 10 to 150 °C. These data were analyzed to create a series of relationships between the thermophysical properties and temperature for each food component, as well as to identify the dependence of the thermophysical properties on more specific structural properties of the fats, carbohydrates, and proteins. Results from this investigation revealed that the relationships between the thermophysical properties of the major constituents of foods and temperature can be statistically described by linear expressions, in contrast to the current polynomial models. Links between variability in thermophysical properties and structural properties were observed. Relationships for several thermophysical properties based on more specific constituents have been identified. Distinctions between simple sugars (fructose, glucose, and lactose) and complex carbohydrates (starch, pectin, and cellulose) have been proposed. The relationships between the thermophysical properties and proteins revealed a potential correlation with the molecular weight of the protein. The significance of relating variability in constituent thermophysical properties with structural properties--such as molecular mass--could significantly improve composition-based prediction models and, consequently, the effectiveness of process design. © 2016 Institute of Food Technologists®.
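
The linear constituent relationships reported above, combined with a mass-fraction-weighted mixing rule, can be sketched as follows. The coefficients below are hypothetical placeholders, not the fitted values from the study.

```python
def constituent_property(a, b, temp_c):
    """Linear model for one food constituent: property(T) = a + b * T."""
    return a + b * temp_c

def mixture_property(mass_fractions, coefficients, temp_c):
    """Mass-fraction-weighted sum of constituent properties at temperature T."""
    return sum(x * constituent_property(a, b, temp_c)
               for x, (a, b) in zip(mass_fractions, coefficients))
```

For example, a two-component food of 80% water-like material and 20% solids-like material evaluated at 50 °C combines the two linear constituent models into one composition-based prediction.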

  4. A systems biology approach to transcription factor binding site prediction.

    Directory of Open Access Journals (Sweden)

    Xiang Zhou

    2010-03-01

    Full Text Available The elucidation of mammalian transcriptional regulatory networks holds great promise for both basic and translational research and remains one of the greatest challenges to systems biology. Recent reverse engineering methods deduce regulatory interactions from large-scale mRNA expression profiles and cross-species conserved regulatory regions in DNA. Technical challenges faced by these methods include distinguishing between direct and indirect interactions, associating transcription regulators with predicted transcription factor binding sites (TFBSs), identifying non-linearly conserved binding sites across species, and providing realistic accuracy estimates. We address these challenges by closely integrating proven methods for regulatory network reverse engineering from mRNA expression data, linearly and non-linearly conserved regulatory region discovery, and TFBS evaluation and discovery. Using an extensive test set of high-likelihood interactions, which we collected in order to provide realistic prediction-accuracy estimates, we show that a careful integration of these methods leads to significant improvements in prediction accuracy. To verify our methods, we biochemically validated TFBS predictions made for both transcription factors (TFs) and co-factors; we validated binding site predictions made using a known E2F1 DNA-binding motif on E2F1 predicted promoter targets, known E2F1 and JUND motifs on JUND predicted promoter targets, and a de novo discovered motif for BCL6 on BCL6 predicted promoter targets. Finally, to demonstrate accuracy of prediction using an external dataset, we showed that sites matching predicted motifs for ZNF263 are significantly enriched in recent ZNF263 ChIP-seq data. Using an integrative framework, we were able to address technical challenges faced by state-of-the-art network reverse engineering methods, leading to significant improvement in direct-interaction detection and TFBS-discovery accuracy.
We estimated the accuracy

  5. Best of both worlds: combining pharma data and state of the art modeling technology to improve in Silico pKa prediction.

    Science.gov (United States)

    Fraczkiewicz, Robert; Lobell, Mario; Göller, Andreas H; Krenz, Ursula; Schoenneis, Rolf; Clark, Robert D; Hillisch, Alexander

    2015-02-23

    In a unique collaboration between a software company and a pharmaceutical company, we were able to develop a new in silico pKa prediction tool with outstanding prediction quality. An existing pKa prediction method from Simulations Plus based on artificial neural network ensembles (ANNE), microstates analysis, and literature data was retrained with a large homogeneous data set of drug-like molecules from Bayer. The new model was thus built with curated sets of ∼14,000 literature pKa values (∼11,000 compounds, representing literature chemical space) and ∼19,500 pKa values experimentally determined at Bayer Pharma (∼16,000 compounds, representing industry chemical space). Model validation was performed with several test sets consisting of a total of ∼31,000 new pKa values measured at Bayer. For the largest and most difficult test set, with >16,000 pKa values that were not used for training, the original model achieved a mean absolute error (MAE) of 0.72, root-mean-square error (RMSE) of 0.94, and squared correlation coefficient (R²) of 0.87. The new model achieves significantly improved prediction statistics, with MAE = 0.50, RMSE = 0.67, and R² = 0.93. It is commercially available as part of the Simulations Plus ADMET Predictor release 7.0. Good predictions are only of value when delivered effectively to those who can use them. The new pKa prediction model has been integrated into Pipeline Pilot and the PharmacophorInformatics (PIx) platform used by scientists at Bayer Pharma. Different output formats allow customized application by medicinal chemists, physical chemists, and computational chemists.
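
The validation statistics quoted above (MAE, RMSE, and squared Pearson correlation) are straightforward to compute; a minimal sketch:

```python
import numpy as np

def regression_stats(predicted, observed):
    """MAE, RMSE, and squared Pearson correlation coefficient (R^2)."""
    p = np.asarray(predicted, float)
    o = np.asarray(observed, float)
    err = p - o
    mae = float(np.abs(err).mean())
    rmse = float(np.sqrt((err ** 2).mean()))
    r2 = float(np.corrcoef(p, o)[0, 1] ** 2)
    return mae, rmse, r2
```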

  6. Modified ensemble Kalman filter for nuclear accident atmospheric dispersion: prediction improved and source estimated.

    Science.gov (United States)

    Zhang, X L; Su, G F; Yuan, H Y; Chen, J G; Huang, Q Y

    2014-09-15

    Atmospheric dispersion models play an important role in nuclear power plant accident management. A reliable estimation of the radioactive material distribution at short range (about 50 km) is urgently needed for population sheltering and evacuation planning. However, the meteorological data and the source term, which greatly influence the accuracy of atmospheric dispersion models, are usually poorly known in the early phase of an emergency. In this study, a modified ensemble Kalman filter data assimilation method in conjunction with a Lagrangian puff model is proposed to simultaneously improve the model prediction and reconstruct the source terms for short-range atmospheric dispersion using off-site environmental monitoring data. Four main uncertainty parameters are considered: source release rate, plume rise height, wind speed and wind direction. Twin experiments show that the method effectively improves the predicted concentration distribution, and the temporal profiles of source release rate and plume rise height are also successfully reconstructed. Moreover, the time lag in the response of the ensemble Kalman filter is shortened. The method proposed here can be a useful tool not only in nuclear power plant accident emergency management but also in other similar situations where hazardous material is released into the atmosphere. Copyright © 2014 Elsevier B.V. All rights reserved.
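
A minimal sketch of the stochastic ensemble Kalman filter analysis step that this kind of assimilation builds on. This is a generic textbook formulation, not the paper's modified filter, and the dimensions and parameters below are illustrative.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H, seed=0):
    """One stochastic EnKF analysis step.

    ensemble: (n_members, n_state) forecast ensemble
    obs:      (n_obs,) observation vector
    H:        (n_obs, n_state) linear observation operator
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(ensemble, float)
    y = np.asarray(obs, float)
    n = X.shape[0]
    Hx = X @ H.T                                   # ensemble in observation space
    A = X - X.mean(axis=0)                         # state anomalies
    HA = Hx - Hx.mean(axis=0)                      # observation-space anomalies
    P_hh = HA.T @ HA / (n - 1) + np.diag(np.full(y.size, obs_std ** 2))
    P_xh = A.T @ HA / (n - 1)
    K = P_xh @ np.linalg.inv(P_hh)                 # Kalman gain
    y_pert = y + rng.normal(0.0, obs_std, size=(n, y.size))  # perturbed observations
    return X + (y_pert - Hx) @ K.T
```

With a precise observation, the analysis ensemble mean is pulled toward the observed value and its spread shrinks, which is the mechanism used to correct both concentration fields and augmented source-term parameters.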

  7. Medium- and Long-term Prediction of LOD Change by the Leap-step Autoregressive Model

    Science.gov (United States)

    Wang, Qijie

    2015-08-01

    The accuracy of medium- and long-term prediction of length-of-day (LOD) change based on the combined least-squares and autoregressive (LS+AR) model deteriorates gradually with lead time. The leap-step autoregressive (LSAR) model can significantly reduce the edge effect of the observation sequence. In particular, the LSAR model greatly improves the resolution of the signal's low-frequency components and can therefore improve prediction performance. In this work, LSAR is used to forecast LOD change. The LOD series from EOP 08 C04 provided by the IERS is modeled by both the LSAR and AR models, and the results of the two models are analyzed and compared. When the prediction length is between 10 and 30 days, the accuracy improvement is less than 10%. When the prediction length exceeds 30 days, the accuracy improves markedly, with a maximum gain of around 19%. The results show that the LSAR model has higher prediction accuracy and stability in medium- and long-term prediction.
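
The autoregressive building block of such forecasts can be sketched as follows. This illustrates a plain least-squares AR fit and an iterated multi-step forecast, not the leap-step variant itself.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares estimate of AR(p) coefficients for a zero-mean series.

    coef[i] multiplies lag i+1: x[t] ~ sum_i coef[i] * x[t-1-i].
    """
    X = np.column_stack([x[p - 1 - i : len(x) - 1 - i] for i in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def ar_forecast(x, coef, steps):
    """Iterated multi-step forecast from the end of the series."""
    hist = list(x)
    for _ in range(steps):
        hist.append(float(np.dot(coef, hist[-1 : -len(coef) - 1 : -1])))
    return hist[len(x):]
```

In the LS+AR setting, the least-squares fit of deterministic components is removed first and the AR model is applied to the residual series.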

  8. Multitrait, Random Regression, or Simple Repeatability Model in High-Throughput Phenotyping Data Improve Genomic Prediction for Wheat Grain Yield.

    Science.gov (United States)

    Sun, Jin; Rutkoski, Jessica E; Poland, Jesse A; Crossa, José; Jannink, Jean-Luc; Sorrells, Mark E

    2017-07-01

    High-throughput phenotyping (HTP) platforms can be used to measure traits that are genetically correlated with wheat ( L.) grain yield across time. Incorporating such secondary traits in multivariate pedigree and genomic prediction models would be desirable to improve indirect selection for grain yield. In this study, we evaluated three statistical models, simple repeatability (SR), multitrait (MT), and random regression (RR), for the longitudinal data of secondary traits and compared the impact of the proposed models for secondary traits on their predictive abilities for grain yield. Grain yield and the secondary traits canopy temperature (CT) and normalized difference vegetation index (NDVI) were collected in five diverse environments for 557 wheat lines with available pedigree and genomic information. A two-stage analysis was applied for pedigree and genomic selection (GS). First, secondary traits were fitted by SR, MT, or RR models, separately, within each environment. Then, best linear unbiased predictions (BLUPs) of secondary traits from the above models were used in the multivariate prediction models to compare predictive abilities for grain yield. Predictive ability was substantially improved, by 70% on average, in multivariate pedigree and genomic models when secondary traits were included in both training and test populations. Additionally, (i) predictive abilities varied only slightly among the MT, RR, and SR models in this data set; (ii) results indicated that including BLUPs of secondary traits from the MT model was best under severe drought; and (iii) the RR model was slightly better than the SR and MT models under drought environments. Copyright © 2017 Crop Science Society of America.

  9. PubMed-supported clinical term weighting approach for improving inter-patient similarity measure in diagnosis prediction.

    Science.gov (United States)

    Chan, Lawrence Wc; Liu, Ying; Chan, Tao; Law, Helen Kw; Wong, S C Cesar; Yeung, Andy Ph; Lo, K F; Yeung, S W; Kwok, K Y; Chan, William Yl; Lau, Thomas Yh; Shyu, Chi-Ren

    2015-06-02

    Similarity-based retrieval of Electronic Health Records (EHRs) from large clinical information systems provides physicians with evidence support for making diagnoses or referring examinations for the suspected cases. Clinical terms in EHRs represent high-level conceptual information and the similarity measure established based on these terms reflects the chance of inter-patient disease co-occurrence. The assumption that clinical terms are equally relevant to a disease is unrealistic, reducing the prediction accuracy. Here we propose a term weighting approach supported by the PubMed search engine to address this issue. We collected and studied 112 abdominal computed tomography imaging examination reports from four hospitals in Hong Kong. Clinical terms, which are the image findings related to hepatocellular carcinoma (HCC), were extracted from the reports. Through two systematic PubMed search methods, the generic and specific term weightings were established by estimating the conditional probabilities of clinical terms given HCC. Each report was characterized by an ontological feature vector and there were 6216 vector pairs in total. We optimized the modified direction cosine (mDC) with respect to a regularization constant embedded into the feature vector. Equal, generic and specific term weighting approaches were applied to measure the similarity of each pair and their performances for predicting inter-patient co-occurrence of HCC diagnoses were compared by using Receiver Operating Characteristics (ROC) analysis. The Areas under the curves (AUROCs) of similarity scores based on equal, generic and specific term weighting approaches were 0.735, 0.728 and 0.743 respectively (p PubMed. Our findings suggest that the optimized similarity measure with specific term weighting to EHRs can significantly improve the accuracy for predicting the inter-patient co-occurrence of diagnosis when compared with equal and generic term weighting approaches.
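
    The effect of term weighting on an inter-patient similarity measure can be sketched with a plain weighted cosine; the term weights and report vectors below are toy values, and weighted cosine here stands in for the paper's modified direction cosine (mDC) with its regularization constant.

```python
import numpy as np

def weighted_cosine(u, v, w):
    """Cosine similarity after scaling each term by its weight."""
    uw, vw = u * w, v * w
    denom = np.linalg.norm(uw) * np.linalg.norm(vw)
    return float(uw @ vw / denom) if denom else 0.0

# Toy feature vectors over 5 clinical terms (1 = term present in the report)
w = np.array([0.9, 0.1, 0.8, 0.2, 0.5])    # assumed P(term | HCC) style weights
a = np.array([1, 0, 1, 0, 0], dtype=float)
b = np.array([1, 1, 1, 0, 0], dtype=float)
c = np.array([0, 1, 0, 1, 1], dtype=float)

# Reports a and b share the high-weight terms, so they score as more similar
print(weighted_cosine(a, b, w) > weighted_cosine(a, c, w))
```

With equal weights every shared term counts the same; disease-specific weights let the terms most associated with HCC dominate the similarity score.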

  10. Improved Predictions of the Geographic Distribution of Invasive Plants Using Climatic Niche Models

    Science.gov (United States)

    Ramírez-Albores, Jorge E.; Bustamante, Ramiro O.

    2016-01-01

    naturally established individuals because this improves the accuracy of predictions about their distribution ranges. PMID:27195983

  11. Spontaneous Resolution of Long-Standing Macular Detachment due to Optic Disc Pit with Significant Visual Improvement.

    Science.gov (United States)

    Parikakis, Efstratios A; Chatziralli, Irini P; Peponis, Vasileios G; Karagiannis, Dimitrios; Stratos, Aimilianos; Tsiotra, Vasileia A; Mitropoulos, Panagiotis G

    2014-01-01

    To report a case of spontaneous resolution of a long-standing serous macular detachment associated with an optic disc pit, leading to significant visual improvement. A 63-year-old female presented with a 6-month history of blurred vision and micropsia in her left eye. Her best-corrected visual acuity was 6/24 in the left eye, and fundoscopy revealed serous macular detachment associated with optic disc pit, which was confirmed by optical coherence tomography (OCT). The patient was offered vitrectomy as a treatment alternative, but she preferred to be reviewed conservatively. Three years after initial presentation, neither macular detachment nor subretinal fluid was evident in OCT, while the inner segment/outer segment (IS/OS) junction line was intact. Her visual acuity was improved from 6/24 to 6/12 in her left eye, remaining stable at the 6-month follow-up after resolution. We present a case of spontaneous resolution of a long-standing macular detachment associated with an optic disc pit with significant visual improvement, postulating that the integrity of the IS/OS junction line may be a prognostic factor for final visual acuity and suggesting OCT as an indicator of visual prognosis and the probable necessity of a surgical management.

  12. Spontaneous Resolution of Long-Standing Macular Detachment due to Optic Disc Pit with Significant Visual Improvement

    Directory of Open Access Journals (Sweden)

    Efstratios A. Parikakis

    2014-03-01

    Full Text Available Purpose: To report a case of spontaneous resolution of a long-standing serous macular detachment associated with an optic disc pit, leading to significant visual improvement. Case Presentation: A 63-year-old female presented with a 6-month history of blurred vision and micropsia in her left eye. Her best-corrected visual acuity was 6/24 in the left eye, and fundoscopy revealed serous macular detachment associated with optic disc pit, which was confirmed by optical coherence tomography (OCT). The patient was offered vitrectomy as a treatment alternative, but she preferred to be reviewed conservatively. Three years after initial presentation, neither macular detachment nor subretinal fluid was evident in OCT, while the inner segment/outer segment (IS/OS) junction line was intact. Her visual acuity was improved from 6/24 to 6/12 in her left eye, remaining stable at the 6-month follow-up after resolution. Conclusion: We present a case of spontaneous resolution of a long-standing macular detachment associated with an optic disc pit with significant visual improvement, postulating that the integrity of the IS/OS junction line may be a prognostic factor for final visual acuity and suggesting OCT as an indicator of visual prognosis and the probable necessity of a surgical management.

  13. Use of in situ volumetric water content at field capacity to improve prediction of soil water retention properties

    OpenAIRE

    Al Majou , Hassan; Bruand , Ary; Duval , Odile

    2008-01-01

    International audience; Use of in situ volumetric water content at field capacity to improve prediction of soil water retention properties. Most pedotransfer functions (PTFs) developed over the last three decades to generate water retention characteristics use soil texture, bulk density and organic carbon content as predictors. Despite the high number of PTFs published, most being class- or continuous-PTFs, the accuracy of prediction remains limited. In this study, we compared the performance ...

  14. Collaborative Proposal: Improving Decadal Prediction of Arctic Climate Variability and Change Using a Regional Arctic System Model (RASM)

    Energy Technology Data Exchange (ETDEWEB)

    Maslowski, Wieslaw [Naval Postgraduate School, Monterey, CA (United States)

    2016-10-17

    This project aims to develop, apply and evaluate a regional Arctic System model (RASM) for enhanced decadal predictions. Its overarching goal is to advance understanding of the past and present states of arctic climate and to facilitate improvements in seasonal to decadal predictions. In particular, it will focus on variability and long-term change of energy and freshwater flows through the arctic climate system. The project will also address modes of natural climate variability as well as extreme and rapid climate change in a region of the Earth that is: (i) a key indicator of the state of global climate through polar amplification and (ii) undergoing environmental transitions not seen in instrumental records. RASM will readily allow the addition of other earth system components, such as ecosystem or biochemistry models, thus allowing it to facilitate studies of climate impacts (e.g., droughts and fires) and of ecosystem adaptations to these impacts. As such, RASM is expected to become a foundation for more complete Arctic System models and part of a model hierarchy important for improving climate modeling and predictions.

  15. Assessing the clinical significance of tumor markers in common neoplasms.

    Science.gov (United States)

    Beketic-Oreskovic, Lidija; Maric, Petra; Ozretic, Petar; Oreskovic, Darko; Ajdukovic, Mia; Levanat, Sonja

    2012-06-01

    The term tumor marker covers a spectrum of molecules and substances with widely divergent characteristics whose presence in significant amounts can be related to malignant disease. An ideal tumor marker should have high specificity and sensitivity, which would allow its use in early diagnosis and prognosis of malignant disease, as well as in prediction of therapeutic response and follow-up of the patients. Numerous biochemical entities have emerged as potentially valuable tumor markers so far, but only a few markers have proved to be of considerable clinical reliability and have been accepted into standard clinical practice. Recent development of genomics and proteomics has enabled the examination of many new potential tumor markers. Scientific studies on discovery, development, and application of tumor markers have been proceeding quite rapidly, providing great opportunities for improving the management of cancer patients. This review focuses on the clinical usefulness of various tumor markers already in clinical practice as well as certain potential markers, giving a brief description of their prognostic and predictive significance in the most common malignancies.

  16. Improving predictive capabilities of environmental change with GLOBE data

    Science.gov (United States)

    Robin, Jessica Hill

    This dissertation addresses two applications of Normalized Difference Vegetation Index (NDVI) essential for predicting environmental changes. The first study focuses on whether NDVI can improve model simulations of evapotranspiration for temperate Northern (>35°) regions. The second study focuses on whether NDVI can detect phenological changes in start of season (SOS) for high Northern (>60°) environments. The overall objectives of this research were to (1) develop a methodology for utilizing GLOBE data in NDVI research; and (2) provide a critical analysis of NDVI as a long-term monitoring tool for environmental change. GLOBE is an international partnership network of K-12 students, teachers, and scientists working together to study and understand the global environment. The first study utilized data collected by one GLOBE school in Greenville, Pennsylvania and the second utilized phenology observations made by GLOBE students in Alaska. Results from the first study showed NDVI could predict transpiration periods for environments like Greenville, Pennsylvania. In phenological terms, these environments have three distinct periods (QI, QII, and QIII). QI reflects onset of the growing season (mid March--mid May) when vegetation is greening up (NDVI 0.60). Results from the second study showed that a climate threshold of 153 +/- 22 growing degree days was a better predictor of SOS for Fairbanks than a NDVI threshold applied to temporal AVHRR and MODIS datasets. Accumulated growing degree days captured the interannual variability of SOS better than the NDVI threshold and most closely resembled actual SOS observations made by GLOBE students. Overall, biweekly composites and effects of clouds, snow, and conifers limit the ability of NDVI to monitor phenological changes in Alaska. Both studies did show that GLOBE data provides an important source of input and validation information for NDVI research.
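
    The accumulated growing-degree-day (GDD) predictor of start of season (SOS) from the second study reduces to a running sum with a threshold. In this sketch the 153 GDD threshold comes from the text, while the base temperature and the synthetic spring warming curve are illustrative assumptions.

```python
# A minimal sketch of the accumulated growing-degree-day SOS predictor;
# the 153 GDD threshold is from the study, the base temperature and the
# daily temperature series are illustrative.
def sos_day(daily_mean_temps, threshold=153.0, base=0.0):
    """Return the 1-based day index when accumulated GDD reaches the threshold."""
    gdd = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        gdd += max(0.0, t - base)   # only days above the base temperature count
        if gdd >= threshold:
            return day
    return None                      # threshold never reached

# Toy spring warming curve: day d has mean temperature (d - 60) / 10 deg C
temps = [(d - 60) / 10.0 for d in range(1, 181)]
print(sos_day(temps))
```

The interannual variability the study reports would enter through the observed daily temperatures; the ±22 GDD uncertainty band could be handled by running the same sum with shifted thresholds.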

  17. The Urgent Need for Improved Climate Models and Predictions

    Science.gov (United States)

    Goddard, Lisa; Baethgen, Walter; Kirtman, Ben; Meehl, Gerald

    2009-09-01

    An investment over the next 10 years of the order of US$2 billion for developing improved climate models was recommended in a report (http://wcrp.wmo.int/documents/WCRP_WorldModellingSummit_Jan2009.pdf) from the May 2008 World Modelling Summit for Climate Prediction, held in Reading, United Kingdom, and presented by the World Climate Research Programme. The report indicated that “climate models will, as in the past, play an important, and perhaps central, role in guiding the trillion dollar decisions that the peoples, governments and industries of the world will be making to cope with the consequences of changing climate.” If trillions of dollars are going to be invested in making decisions related to climate impacts, an investment of $2 billion, which is less than 0.1% of that amount, to provide better climate information seems prudent. One example of investment in adaptation is the World Bank's Climate Investment Fund, which has drawn contributions of more than $6 billion for work on clean technologies and adaptation efforts in nine pilot countries and two pilot regions. This is just the beginning of expenditures on adaptation efforts by the World Bank and other mechanisms, focusing on only a small fraction of the nations of the world and primarily aimed at anticipated anthropogenic climate change. Moreover, decisions are being made now, all around the world—by individuals, companies, and governments—that affect people and their livelihoods today, not just 50 or more years in the future. Climate risk management, whether related to projects of the scope of the World Bank's or to the planning and decisions of municipalities, will be best guided by meaningful climate information derived from observations of the past and model predictions of the future.

  18. Improved Prediction of Preterm Delivery Using Empirical Mode Decomposition Analysis of Uterine Electromyography Signals.

    Directory of Open Access Journals (Sweden)

    Peng Ren

    Full Text Available Preterm delivery increases the risk of infant mortality and morbidity, and therefore developing reliable methods for predicting its likelihood is of great importance. Previous work using uterine electromyography (EMG) recordings has shown that they may provide a promising and objective way for predicting risk of preterm delivery. However, to date attempts at utilizing computational approaches to achieve sufficient predictive confidence, in terms of area under the curve (AUC) values, have not achieved the high discrimination accuracy that a clinical application requires. In our study, we propose a new analytical approach for assessing the risk of preterm delivery using EMG recordings which firstly employs Empirical Mode Decomposition (EMD) to obtain their Intrinsic Mode Functions (IMF). Next, the entropy values of both instantaneous amplitude and instantaneous frequency of the first ten IMF components are computed in order to derive ratios of these two distinct components as features. Discrimination accuracy of this approach compared to those proposed previously was then calculated using six different representative classifiers. Finally, three different electrode positions were analyzed for their prediction accuracy of preterm delivery in order to establish which uterine EMG recording location provided the optimal signal data. Overall, our results show a clear improvement in prediction accuracy of preterm delivery risk compared with previous approaches, achieving an impressive maximum AUC value of 0.986 when using signals from an electrode positioned below the navel. In sum, this provides a promising new method for analyzing uterine EMG signals to permit accurate clinical assessment of preterm delivery risk.
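
    The entropy-ratio features described above can be sketched for a single oscillatory component. Here a synthetic amplitude-modulated sine stands in for an EMD intrinsic mode function, and the Hilbert transform supplies instantaneous amplitude and frequency; the entropy estimator and bin count are illustrative choices.

```python
import numpy as np
from scipy.signal import hilbert

def shannon_entropy(x, bins=32):
    """Shannon entropy (nats) of a histogram of x."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(1)
fs = 200.0                                   # sampling rate in Hz (illustrative)
t = np.arange(0, 10, 1 / fs)
# Synthetic IMF-like component: slowly amplitude-modulated oscillation + noise
imf = np.sin(2 * np.pi * 0.8 * t) * (1 + 0.3 * np.sin(2 * np.pi * 0.05 * t))
imf = imf + 0.01 * rng.standard_normal(t.size)

analytic = hilbert(imf)                      # analytic signal
inst_amp = np.abs(analytic)                  # instantaneous amplitude
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

# Ratio of the two entropies: one candidate feature for the classifiers
feature = shannon_entropy(inst_amp) / shannon_entropy(inst_freq)
print(np.isfinite(feature) and feature > 0)
```

In the actual pipeline such ratios would be computed for the first ten IMFs from a real EMD of the uterine EMG signal and fed to the classifiers.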

  19. An initiative to improve the management of clinically significant test results in a large health care network.

    Science.gov (United States)

    Roy, Christopher L; Rothschild, Jeffrey M; Dighe, Anand S; Schiff, Gordon D; Graydon-Baker, Erin; Lenoci-Edwards, Jennifer; Dwyer, Cheryl; Khorasani, Ramin; Gandhi, Tejal K

    2013-11-01

    The failure of providers to communicate and follow up clinically significant test results (CSTR) is an important threat to patient safety. The Massachusetts Coalition for the Prevention of Medical Errors has endorsed the creation of systems to ensure that results can be received and acknowledged. In 2008 a task force was convened that represented clinicians, laboratories, radiology, patient safety, risk management, and information systems in a large health care network with the goals of providing recommendations and a road map for improvement in the management of CSTR and of implementing this improvement plan during the subsequent five years. In drafting its charter, the task force broadened the scope from "critical" results to "clinically significant" ones; clinically significant was defined as any result that requires further clinical action to avoid morbidity or mortality, regardless of the urgency of that action. The task force recommended four key areas for improvement--(1) standardization of policies and definitions, (2) robust identification of the patient's care team, (3) enhanced results management/tracking systems, and (4) centralized quality reporting and metrics. The task force faced many challenges in implementing these recommendations, including disagreements on definitions of CSTR and on who should have responsibility for CSTR, changes to established work flows, limitations of resources and of existing information systems, and definition of metrics. This large-scale effort to improve the communication and follow-up of CSTR in a health care network continues with ongoing work to address implementation challenges, refine policies, prepare for a new clinical information system platform, and identify new ways to measure the extent of this important safety problem.

  20. Combination of baseline metabolic tumour volume and early response on PET/CT improves progression-free survival prediction in DLBCL

    Energy Technology Data Exchange (ETDEWEB)

    Mikhaeel, N.G.; Smith, Daniel [Guy' s and St Thomas' NHS Foundation Trust, Department of Clinical Oncology, London (United Kingdom); Dunn, Joel T.; Phillips, Michael; Barrington, Sally F. [King' s College London, PET Imaging Centre at St Thomas' Hospital, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Moeller, Henrik [King' s College London, Department of Cancer Epidemiology and Population Health, London (United Kingdom); Fields, Paul A.; Wrench, David [Guy' s and St Thomas' NHS Foundation Trust, Department of Haematology, London (United Kingdom)

    2016-07-15

    The study objectives were to assess the prognostic value of quantitative PET and to test whether combining baseline metabolic tumour burden with early PET response could improve predictive power in DLBCL. A total of 147 patients with DLBCL underwent FDG-PET/CT scans before and after two cycles of RCHOP. Quantitative parameters including metabolic tumour volume (MTV) and total lesion glycolysis (TLG) were measured, as well as the percentage change in these parameters. Cox regression analysis was used to test the relationship between progression-free survival (PFS) and the study variables. Receiver operator characteristics (ROC) analysis determined the optimal cut-off for quantitative variables, and Kaplan-Meier survival analysis was performed. The median follow-up was 3.8 years. As MTV and TLG measures correlated strongly, only MTV measures were used for multivariate analysis (MVA). Baseline MTV (MTV-0) was the only statistically significant predictor of PFS on MVA. The optimal cut-off for MTV-0 was 396 cm³. A model combining MTV-0 and Deauville score (DS) separated the population into three distinct prognostic groups: good (MTV-0 < 400; 5-year PFS > 90 %), intermediate (MTV-0 ≥ 400 + DS1-3; 5-year PFS 58.5 %) and poor (MTV-0 ≥ 400 + DS4-5; 5-year PFS 29.7 %). MTV-0 is an important prognostic factor in DLBCL. Combining MTV-0 and early PET/CT response improves the predictive power of interim PET and defines a poor-prognosis group in whom most of the events occur. (orig.)
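
    Choosing an optimal cut-off for a quantitative marker such as baseline MTV from an ROC analysis is commonly done by maximizing Youden's J; a minimal sketch on synthetic marker values and event labels:

```python
import numpy as np

# Pick the cut-off maximizing Youden's J = sensitivity + specificity - 1.
# The marker values and event labels below are synthetic, not study data.
def youden_cutoff(values, events):
    """Return (best_cutoff, best_J) over cut-offs taken at the observed values."""
    values = np.asarray(values, dtype=float)
    events = np.asarray(events, dtype=bool)
    best = (None, -1.0)
    for c in np.unique(values):
        pred = values >= c                                 # "high marker" predicts the event
        sens = (pred & events).sum() / events.sum()
        spec = (~pred & ~events).sum() / (~events).sum()
        j = sens + spec - 1.0
        if j > best[1]:
            best = (float(c), float(j))
    return best

# Synthetic cohort: progression events cluster at high marker values
values = [100, 150, 200, 250, 380, 420, 500, 650, 700, 900]
events = [0,   0,   0,   0,   0,   1,   1,   0,   1,   1]
cutoff, j = youden_cutoff(values, events)
print(cutoff, round(j, 2))
```

Once a cut-off is fixed, the cohort can be split into groups (here, by MTV-0 and Deauville score) for Kaplan-Meier comparison.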

  1. Small angle X-ray scattering and cross-linking for data assisted protein structure prediction in CASP 12 with prospects for improved accuracy

    KAUST Repository

    Ogorzalek, Tadeusz L.

    2018-01-04

    Experimental data offers empowering constraints for structure prediction. These constraints can be used to filter equivalently scored models or more powerfully within optimization functions toward prediction. In CASP12, Small Angle X-ray Scattering (SAXS) and Cross-Linking Mass Spectrometry (CLMS) data, measured on an exemplary set of novel fold targets, were provided to the CASP community of protein structure predictors. As HT, solution-based techniques, SAXS and CLMS can efficiently measure states of the full-length sequence in its native solution conformation and assembly. However, this experimental data did not substantially improve prediction accuracy judged by fits to crystallographic models. One issue, beyond intrinsic limitations of the algorithms, was a disconnect between crystal structures and solution-based measurements. Our analyses show that many targets had substantial percentages of disordered regions (up to 40%) or were multimeric or both. Thus, solution measurements of flexibility and assembly support variations that may confound prediction algorithms trained on crystallographic data and expecting globular fully-folded monomeric proteins. Here, we consider the CLMS and SAXS data collected, the information in these solution measurements, and the challenges in incorporating them into computational prediction. As improvement opportunities were only partly realized in CASP12, we provide guidance on how data from the full-length biological unit and the solution state can better aid prediction of the folded monomer or subunit. We furthermore describe strategic integrations of solution measurements with computational prediction programs with the aim of substantially improving foundational knowledge and the accuracy of computational algorithms for biologically-relevant structure predictions for proteins in solution. This article is protected by copyright. All rights reserved.

  2. Small angle X-ray scattering and cross-linking for data assisted protein structure prediction in CASP 12 with prospects for improved accuracy

    KAUST Repository

    Ogorzalek, Tadeusz L.; Hura, Greg L.; Belsom, Adam; Burnett, Kathryn H.; Kryshtafovych, Andriy; Tainer, John A.; Rappsilber, Juri; Tsutakawa, Susan E.; Fidelis, Krzysztof

    2018-01-01

    Experimental data offers empowering constraints for structure prediction. These constraints can be used to filter equivalently scored models or more powerfully within optimization functions toward prediction. In CASP12, Small Angle X-ray Scattering (SAXS) and Cross-Linking Mass Spectrometry (CLMS) data, measured on an exemplary set of novel fold targets, were provided to the CASP community of protein structure predictors. As HT, solution-based techniques, SAXS and CLMS can efficiently measure states of the full-length sequence in its native solution conformation and assembly. However, this experimental data did not substantially improve prediction accuracy judged by fits to crystallographic models. One issue, beyond intrinsic limitations of the algorithms, was a disconnect between crystal structures and solution-based measurements. Our analyses show that many targets had substantial percentages of disordered regions (up to 40%) or were multimeric or both. Thus, solution measurements of flexibility and assembly support variations that may confound prediction algorithms trained on crystallographic data and expecting globular fully-folded monomeric proteins. Here, we consider the CLMS and SAXS data collected, the information in these solution measurements, and the challenges in incorporating them into computational prediction. As improvement opportunities were only partly realized in CASP12, we provide guidance on how data from the full-length biological unit and the solution state can better aid prediction of the folded monomer or subunit. We furthermore describe strategic integrations of solution measurements with computational prediction programs with the aim of substantially improving foundational knowledge and the accuracy of computational algorithms for biologically-relevant structure predictions for proteins in solution. This article is protected by copyright. All rights reserved.

  3. Communication: Proper treatment of classically forbidden electronic transitions significantly improves detailed balance in surface hopping

    Energy Technology Data Exchange (ETDEWEB)

    Sifain, Andrew E. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Wang, Linjun [Department of Chemistry, Zhejiang University, Hangzhou 310027 (China); Prezhdo, Oleg V. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Department of Chemistry, University of Southern California, Los Angeles, California 90089-1062 (United States)

    2016-06-07

    Surface hopping is the most popular method for nonadiabatic molecular dynamics. Many have reported that it does not rigorously attain detailed balance at thermal equilibrium, but does so approximately. We show that convergence to the Boltzmann populations is significantly improved when the nuclear velocity is reversed after a classically forbidden hop. The proposed prescription significantly reduces the total number of classically forbidden hops encountered along a trajectory, suggesting that some randomization in nuclear velocity is needed when classically forbidden hops constitute a large fraction of attempted hops. Our results are verified computationally using two- and three-level quantum subsystems, coupled to a classical bath undergoing Langevin dynamics.
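
    The velocity-reversal prescription for classically forbidden (frustrated) hops can be sketched for a one-dimensional nuclear degree of freedom; the masses, velocities, and energy gap below are illustrative, and the full surface-hopping machinery (electronic propagation, hop probabilities) is omitted.

```python
# A sketch of the frustrated-hop rule described above: attempt an upward hop,
# and if the kinetic energy cannot cover the potential-energy gap (classically
# forbidden), reject the hop and reverse the nuclear velocity.
def attempt_hop(v, mass, e_gap):
    """Return (new_velocity, hopped) for an attempted upward hop costing e_gap."""
    ke = 0.5 * mass * v * v
    if ke >= e_gap:
        # Allowed hop: rescale the velocity magnitude to conserve total energy
        sign = 1.0 if v >= 0 else -1.0
        return sign * ((2.0 * (ke - e_gap) / mass) ** 0.5), True
    # Forbidden hop: reverse the velocity (the prescription studied above)
    return -v, False

v1, ok1 = attempt_hop(v=2.0, mass=1.0, e_gap=1.0)   # KE = 2.0 >= gap: hop succeeds
v2, ok2 = attempt_hop(v=1.0, mass=1.0, e_gap=1.0)   # KE = 0.5 < gap: frustrated
print(ok1, ok2, v2)
```

In a multidimensional simulation the reversal would typically be applied along the nonadiabatic coupling vector rather than to the full velocity.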

  4. The improvement of MOSFET prediction in space environments using the conversion model

    International Nuclear Information System (INIS)

    Shvetzov-Shilovsky, I.N.; Cherepko, S.V.; Pershenkov, V.S.

    1994-01-01

    The modeling of MOS device response to a low dose rate irradiation has been performed. The existing conversion model based on the linear dependence between positive oxide charge annealing and interface trap buildup accurately predicts the long time response of MOSFETs with relatively thick oxides but overestimates the threshold voltage shift for radiation hardened MOSFETs with thin oxides. To explain this fact, the authors investigate the impulse response function for threshold voltage. A revised model, which incorporates the different energy levels of hole traps in the oxide, improves the fit between the model and data and explains the dependence of the fitting parameters on the oxide field.

  5. Optimization approach of background value and initial item for improving prediction precision of GM(1,1) model

    Institute of Scientific and Technical Information of China (English)

    Yuhong Wang; Qin Liu; Jianrong Tang; Wenbin Cao; Xiaozhong Li

    2014-01-01

    A combination method of optimization of the background value and optimization of the initial item is proposed. The sequences of the unbiased exponential distribution are simulated and predicted through the optimization of the background value in grey differential equations. The principle of the new information priority in the grey system theory and the rationality of the initial item in the original GM(1,1) model are fully expressed through the improvement of the initial item in the proposed time response function. A numerical example is employed to illustrate that the proposed method is able to simulate and predict sequences of raw data with the unbiased exponential distribution and has better simulation performance and prediction precision than the original GM(1,1) model.
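
    The baseline GM(1,1) model that the proposed background-value and initial-item optimizations build on can be sketched as follows; this uses the classical background value z1(k) = 0.5(x1(k) + x1(k-1)) and the original initial item, and the short data series is illustrative.

```python
import numpy as np

def gm11(x0, n_forecast=1):
    """Fit the classical GM(1,1) model to x0 and return fitted + forecast values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                               # accumulated series (1-AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                    # classical background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)   # grey development a, grey input b
    k = np.arange(len(x0) + n_forecast)
    # Time response with the original initial item x1_hat(0) = x0(0)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])  # restore by differencing

series = [2.874, 3.278, 3.337, 3.390, 3.679]         # illustrative raw data
fitted = gm11(series, n_forecast=1)
print(np.round(fitted, 3))
```

The paper's method replaces the trapezoidal background value and the fixed initial item with optimized counterparts; the surrounding fit-and-difference structure stays the same.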

  6. Costello Syndrome with Severe Nodulocystic Acne: Unexpected Significant Improvement of Acanthosis Nigricans after Oral Isotretinoin Treatment

    Directory of Open Access Journals (Sweden)

    Leelawadee Sriboonnark

    2015-01-01

    Full Text Available We report the case of a 17-year-old female diagnosed with Costello syndrome. Genetic testing had confirmed a G12S mutation in the HRAS gene at 3 years of age. She presented with severe nodulocystic acne on her face. After 2 months of oral isotretinoin treatment, improvement in her acne was observed. Interestingly, an unexpected significant improvement of acanthosis nigricans on her neck and the dorsum of her hands was found as well. We present this case as a successful treatment option using oral isotretinoin for the treatment of acanthosis nigricans in Costello syndrome patients.

  7. Improved Performance and Safety for High Energy Batteries Through Use of Hazard Anticipation and Capacity Prediction

    Science.gov (United States)

    Atwater, Terrill

    1993-01-01

    Predicting the capacity remaining in used high-rate, high-energy batteries provides important information to the user. Knowledge of the capacity remaining in used batteries results in better utilization. This translates into improved readiness and cost savings due to complete, efficient use. High-rate batteries, due to their chemical nature, are highly sensitive to misuse (i.e., over-discharge or very high rate discharge). Battery failure due to misuse or manufacturing defects could be disastrous. Since high-rate, high-energy batteries are expensive and energetic, a reliable method of predicting both failures and remaining energy has been actively sought. Due to concerns over safety, the behavior of lithium/sulphur dioxide cells at different temperatures and current drains was examined. The main thrust of this effort was to determine failure conditions for incorporation in hazard anticipation circuitry. In addition, capacity prediction formulas have been developed from test data. A process that performs continuous, real-time hazard anticipation and capacity prediction was developed. The introduction of this process into microchip technology will enable the production of reliable, safe, and efficient high-energy batteries.

  8. Google goes cancer: improving outcome prediction for cancer patients by network-based ranking of marker genes.

    Directory of Open Access Journals (Sweden)

    Christof Winter

    Full Text Available Predicting the clinical outcome of cancer patients based on the expression of marker genes in their tumors has received increasing interest in the past decade. Accurate predictors of outcome and response to therapy could be used to personalize and thereby improve therapy. However, state of the art methods used so far often found marker genes with limited prediction accuracy, limited reproducibility, and unclear biological relevance. To address this problem, we developed a novel computational approach to identify genes prognostic for outcome that couples gene expression measurements from primary tumor samples with a network of known relationships between the genes. Our approach ranks genes according to their prognostic relevance using both expression and network information in a manner similar to Google's PageRank. We applied this method to gene expression profiles which we obtained from 30 patients with pancreatic cancer, and identified seven candidate marker genes prognostic for outcome. Compared to genes found with state of the art methods, such as Pearson correlation of gene expression with survival time, we improve the prediction accuracy by up to 7%. Accuracies were assessed using support vector machine classifiers and Monte Carlo cross-validation. We then validated the prognostic value of our seven candidate markers using immunohistochemistry on an independent set of 412 pancreatic cancer samples. Notably, signatures derived from our candidate markers were independently predictive of outcome and superior to established clinical prognostic factors such as grade, tumor size, and nodal status. As the amount of genomic data of individual tumors grows rapidly, our algorithm meets the need for powerful computational approaches that are key to exploit these data for personalized cancer therapies in clinical practice.
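
    The PageRank-style ranking described above can be sketched as a personalized PageRank walk over a gene network, with the personalization vector playing the role of per-gene expression-survival association; the network, seed scores, and damping factor below are toy values.

```python
import numpy as np

def rank_genes(adj, seed, damping=0.85, tol=1e-10):
    """Personalized PageRank: r = (1-d)*seed + d * W^T r, with W row-normalized."""
    adj = np.asarray(adj, dtype=float)
    W = adj / adj.sum(axis=1, keepdims=True)        # row-stochastic transition matrix
    p = seed / seed.sum()                           # personalization vector
    r = p.copy()
    while True:                                     # power iteration to a fixed point
        r_new = (1 - damping) * p + damping * (W.T @ r)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Toy network of 4 genes; genes 0-2 form a connected core, gene 3 is peripheral
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
seed = np.array([0.1, 0.1, 0.1, 0.7])               # per-gene expression-survival scores
r = rank_genes(adj, seed)
print(np.argsort(-r))                                # genes ordered by prognostic rank
```

The ranking blends each gene's own association score with the scores flowing in from its network neighbours, which is what distinguishes the approach from ranking by univariate correlation alone.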

  9. A predictive maintenance approach for improved nuclear plant availability

    International Nuclear Information System (INIS)

    Verma, R.M.P.; Pandya, M.B.; Kini, M.P.

    1979-01-01

    In contrast to a preventive maintenance programme, a predictive maintenance programme aims at diagnosing, inspecting, monitoring, and objective condition-checking of equipment. It helps in forecasting failures and in scheduling optimal frequencies for overhauls, replacements, lubrication, etc. It also helps in establishing workload, manpower and resource planning, and inventory control. Various stages of a predictive maintenance programme for a nuclear power plant are outlined. A partial list of instruments for predictive maintenance is given. (M.G.B.)

  10. Improved Prediction of Blood-Brain Barrier Permeability Through Machine Learning with Combined Use of Molecular Property-Based Descriptors and Fingerprints.

    Science.gov (United States)

    Yuan, Yaxia; Zheng, Fang; Zhan, Chang-Guo

    2018-03-21

    Blood-brain barrier (BBB) permeability of a compound determines whether the compound can effectively enter the brain. It is an essential property that must be accounted for in drug discovery with a target in the brain. Several computational methods have been used to predict BBB permeability. In particular, the support vector machine (SVM), a kernel-based machine learning method, has been widely used in this field. For SVM training and prediction, the compounds are characterized by molecular descriptors. Some SVM models were based on the use of molecular property-based descriptors (including 1D, 2D, and 3D descriptors) or fragment-based descriptors (known as the fingerprints of a molecule). The selection of descriptors is critical for the performance of an SVM model. In this study, we aimed to develop a generally applicable new SVM model by combining all of the features of the molecular property-based descriptors and fingerprints to improve the accuracy of BBB permeability prediction. The results indicate that our SVM model has improved accuracy compared to the currently available models for BBB permeability prediction.
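
    The combined-feature idea can be sketched by concatenating continuous property-based descriptors with binary fingerprint bits into a single feature matrix for an SVM. The data below are random stand-ins, not the BBB permeability set, and the descriptor names in the comments are assumptions.

```python
# Concatenate property-based descriptors and fingerprint bits, then train an
# RBF-kernel SVM on the combined representation (synthetic toy data).
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
descriptors = rng.normal(size=(n, 5))       # e.g. MW, logP, TPSA, ... (continuous)
fingerprints = rng.integers(0, 2, (n, 64))  # e.g. a 64-bit structural fingerprint
X = np.hstack([descriptors, fingerprints])  # combined feature representation
y = (descriptors[:, 0] + 0.5 * fingerprints[:, 0] > 0).astype(int)  # toy label

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
```

    Scaling matters here because the continuous descriptors and the 0/1 fingerprint bits live on different ranges; the pipeline standardizes both before the kernel is applied.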

  11. Trading network predicts stock price.

    Science.gov (United States)

    Sun, Xiao-Qian; Shen, Hua-Wei; Cheng, Xue-Qi

    2014-01-16

    Stock price prediction is an important and challenging problem for studying financial markets. Existing studies are mainly based on the time series of stock price or the operating performance of the listed company. In this paper, we propose to predict stock price based on investors' trading behavior. For each stock, we characterize the daily trading relationships among its investors using a trading network. We then classify the nodes of the trading network into three roles according to their connectivity pattern. Strong Granger causality is found between stock price and trading relationship indices, i.e., the fraction of trading relationships among nodes with different roles. We further predict stock price by incorporating these trading relationship indices into a neural network based on the time series of stock price. Experimental results on 51 stocks in two Chinese stock exchanges demonstrate that the accuracy of stock price prediction is significantly improved by the inclusion of trading relationship indices.

  12. Combined use of serum MCP-1/IL-10 ratio and uterine artery Doppler index significantly improves the prediction of preeclampsia.

    Science.gov (United States)

    Cui, Shihong; Gao, Yanan; Zhang, Linlin; Wang, Yuan; Zhang, Lindong; Liu, Pingping; Liu, Ling; Chen, Juan

    2017-10-01

    Monocyte chemotactic protein-1 (MCP-1, or CCL2) is a member of the chemokine subfamily involved in the recruitment of monocytes into inflammatory tissues. IL-10 is a key regulator for maintaining the balance of the anti-inflammatory and pro-inflammatory milieu at the feto-maternal interface. Doppler examination is routinely performed for the monitoring and management of preeclampsia patients. This study evaluates the efficiency of these factors, alone or in combination, for the prediction of preeclampsia. The serum levels of MCP-1 and IL-10 in 78 preeclampsia patients and 143 age-matched normal controls were measured. Doppler ultrasonography was performed and the uterine artery Pulsatility Index (PI) and Resistance Index (RI) were calculated for the same subjects. It was found that while the second-trimester serum MCP-1, IL-10, MCP-1/IL-10 ratio, PI, and RI each showed some power in predicting preeclampsia, the combination of the MCP-1/IL-10 ratio with PI and RI achieved the highest efficiency, with an AUC of 0.973 (95% CI, 0.000-1.000) for predicting preeclampsia. Future studies using a larger sample can be conducted to construct an algorithm capable of quantitative assessment of the risk of preeclampsia. Copyright © 2016 Elsevier B.V. All rights reserved.
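
    The combination step can be sketched as a logistic model over the MCP-1/IL-10 ratio plus the Doppler indices, compared by AUC with a single-marker model. All values below are simulated and the effect sizes are illustrative only; they do not reproduce the study's data.

```python
# Compare a single-marker classifier with one combining the marker ratio and
# both Doppler indices, using AUC as the performance measure (simulated data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 220
y = rng.integers(0, 2, n)              # 1 = preeclampsia (toy outcome)
ratio = rng.normal(loc=1.0 + 1.5 * y)  # MCP-1/IL-10 ratio, shifted by outcome
pi = rng.normal(loc=1.0 + 1.0 * y)     # uterine artery pulsatility index
ri = rng.normal(loc=0.5 + 0.8 * y)     # resistance index

single = LogisticRegression().fit(ratio.reshape(-1, 1), y)
combined = LogisticRegression().fit(np.column_stack([ratio, pi, ri]), y)

auc_single = roc_auc_score(y, single.predict_proba(ratio.reshape(-1, 1))[:, 1])
auc_combined = roc_auc_score(
    y, combined.predict_proba(np.column_stack([ratio, pi, ri]))[:, 1])
print(f"AUC ratio only: {auc_single:.3f}  AUC combined: {auc_combined:.3f}")
```

    Because the three predictors carry partly independent information, the combined model's AUC exceeds the single-marker AUC, mirroring the pattern reported in the abstract.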

  13. Improved USLE-K factor prediction: A case study on water erosion areas in China

    Directory of Open Access Journals (Sweden)

    Bin Wang

    2016-09-01

    Full Text Available Soil erodibility (K-factor) is an essential factor in soil erosion prediction and conservation practices. The major obstacles to accurate, large-scale soil erodibility estimation are the lack of necessary data on soil characteristics and the misuse of variable K-factor calculators. In this study, we assessed the performance of the available erodibility estimators, the Universal Soil Loss Equation (USLE), the Revised Universal Soil Loss Equation (RUSLE), the Erosion Productivity Impact Calculator (EPIC) and the Geometric Mean Diameter based (Dg) model, for different geographic regions based on the Chinese soil erodibility database (CSED). Results showed that the previous estimators overestimated almost all K-values. Furthermore, only the USLE and Dg approaches could be directly and reliably applied to the black and loess soil regions. Based on nonlinear best-fitting techniques, we improved soil erodibility prediction by combining Dg and soil organic matter (SOM). The NSE, R2 and RE values were 0.94, 0.67 and 9.5% after calibrating the results independently; similar model performance was shown for the validation process. The results obtained via the proposed approach were more accurate than the former K-value predictions. Moreover, these improvements allowed us to establish a regional soil erodibility map (1:250,000 scale) of water erosion areas in China. The mean K-value of Chinese water erosion regions was 0.0321 t ha h (ha MJ mm)−1 with a standard deviation of 0.0107 t ha h (ha MJ mm)−1; K-values show a decreasing trend from north to south in water erosion areas in China. The resulting soil erodibility dataset also corresponded satisfactorily to former K-values from different scales (local, regional, and national).
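
    The calibration step, combining Dg and SOM in a nonlinear fit, can be sketched with nonlinear least squares. The functional form and the data below are assumptions for illustration; the study's calibrated equation is not reproduced here.

```python
# Fit an assumed nonlinear K(Dg, SOM) relationship to synthetic observations
# with scipy's nonlinear least squares (curve_fit).
import numpy as np
from scipy.optimize import curve_fit

def k_model(x, a, b, c):
    dg, som = x
    return a * np.exp(-b * dg) + c / (1.0 + som)  # assumed form, not the paper's

rng = np.random.default_rng(3)
dg = rng.uniform(0.01, 1.0, 80)    # geometric mean particle diameter (toy units)
som = rng.uniform(0.5, 5.0, 80)    # soil organic matter (toy units)
k_true = 0.04 * np.exp(-2.0 * dg) + 0.01 / (1.0 + som)
k_obs = k_true + rng.normal(scale=0.001, size=80)  # noisy "measured" K-values

params, _ = curve_fit(k_model, (dg, som), k_obs, p0=[0.05, 1.0, 0.02])
print("fitted a, b, c:", np.round(params, 3))
```

    Goodness-of-fit statistics such as NSE and R2 would then be computed on an independent validation set, as in the study.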

  14. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    Science.gov (United States)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity, and the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset date of significant earthquakes, the assumption being that each occurred earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes tend to occur close to planetary trigger positions (conjunct Sun, Moon opposite Sun, Moon conjunct or opposite the North or South Nodes). In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e. +-1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with the earthquake date, in which case the FDL method coincides with the MFDL. Based on the MFDL method we present a prediction method capable of predicting global events or localized earthquakes and we discuss the accuracy of the method as far as the prediction and location parts are concerned. We show example calendar-style predictions for global events as well as for the Greek region.
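
    The date-generation idea behind the FDL method can be sketched as projecting candidate dates from a seed earthquake date at offsets given by Fibonacci and Lucas numbers. Treating the offsets as days, and omitting the method's "Dual" sequence, are assumptions made here for illustration.

```python
# Generate candidate future dates from a seed date using Fibonacci and Lucas
# number offsets (offsets-as-days is an assumption for this sketch).
from datetime import date, timedelta

def fibonacci(n):
    seq = [1, 2]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def lucas(n):
    seq = [2, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def fdl_dates(seed, n=8):
    """Candidate dates: seed + each distinct Fibonacci/Lucas offset."""
    offsets = sorted(set(fibonacci(n) + lucas(n)))
    return [seed + timedelta(days=d) for d in offsets]

dates = fdl_dates(date(1900, 1, 1), n=6)
print(dates)
```

    A hit-rate evaluation as described above would then count how many of these candidate dates fall within ±1 day of a subsequent earthquake.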

  15. Aerodynamic noise prediction of a Horizontal Axis Wind Turbine using Improved Delayed Detached Eddy Simulation and acoustic analogy

    International Nuclear Information System (INIS)

    Ghasemian, Masoud; Nejat, Amir

    2015-01-01

    Highlights: • The noise predictions are performed by the Ffowcs Williams and Hawkings method. • There is a direct relation between the radiated noise and the wind speed. • The tonal peaks in the sound spectra match the blade passing frequency. • The quadrupole noises have a negligible effect on the low frequency noises. - Abstract: This paper presents the results of the aerodynamic and aero-acoustic prediction of the flow field around the National Renewable Energy Laboratory Phase VI wind turbine. The Improved Delayed Detached Eddy Simulation turbulence model is applied to obtain the instantaneous turbulent flow field. The noise prediction is carried out using the Ffowcs Williams and Hawkings acoustic analogy. Simulations are performed for three different inflow conditions, U = 7, 10, 15 m/s. The capability of the Improved Delayed Detached Eddy Simulation turbulence model in massively separated flow is verified against available experimental data for the pressure coefficient. The broadband noises of the turbulent boundary layers and the tonal noises due to the blade passing frequency are predicted via flow field noise simulation. The contributions of the thickness, loading and quadrupole noises are investigated separately. The results indicate that there is a direct relation between the strength of the radiated noise and the wind speed. Furthermore, the effect of the receiver location on the Overall Sound Pressure Level is investigated.

  16. The value of model averaging and dynamical climate model predictions for improving statistical seasonal streamflow forecasts over Australia

    Science.gov (United States)

    Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.

    2013-10-01

    Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improvement in seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of the different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors. This is to take advantage of the direct simulations of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
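
    The model-averaging strategy can be sketched by weighting candidate forecast models according to an exponentiated Gaussian log score on a calibration period and averaging their predictions. This is a simplification of full Bayesian model averaging, and the data are synthetic.

```python
# Weight three candidate forecast models by calibration-period skill and
# combine their predictions (synthetic streamflow data).
import numpy as np

rng = np.random.default_rng(4)
obs = rng.normal(size=50)                             # calibration-period flows
preds = np.stack([obs + rng.normal(scale=s, size=50)  # three candidate models
                  for s in (0.3, 0.6, 1.2)])

mse = ((preds - obs) ** 2).mean(axis=1)
loglik = -0.5 * len(obs) * np.log(mse)  # Gaussian log-likelihood up to constants
w = np.exp(loglik - loglik.max())
w /= w.sum()                            # normalized model weights
combined = (w[:, None] * preds).sum(axis=0)
print("weights:", np.round(w, 3))
```

    The weights concentrate on the model with the smallest calibration error while still letting weaker models contribute, which is how averaging can moderate the worst forecast errors.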

  17. Map-based prediction of organic carbon in headwater streams improved by downstream observations from the river outlet

    Science.gov (United States)

    Temnerud, J.; von Brömssen, C.; Fölster, J.; Buffam, I.; Andersson, J.-O.; Nyberg, L.; Bishop, K.

    2016-01-01

    In spite of the great abundance and ecological importance of headwater streams, managers are usually limited by a lack of information about water chemistry in these headwaters. In this study we test whether river outlet chemistry can be used as an additional source of information to improve the prediction of the chemistry of upstream headwaters. Partial least squares (PLS) models were used to predict the median and interquartile range (IQR) of headwater stream TOC for a given catchment, based on a large number of candidate variables including sub-catchment characteristics from GIS, and measured river chemistry at the catchment outlet. The best candidate variables from the PLS models were then used in hierarchical linear mixed models (MM) to model TOC in individual headwater streams. Three predictor variables were consistently selected for the MM calibration sets: (1) proportion of forested wetlands in the sub-catchment (positively correlated with headwater stream TOC), (2) proportion of lake surface cover in the sub-catchment (negatively correlated with headwater stream TOC), and (3) river outlet TOC (positively correlated with headwater stream TOC). Including river outlet TOC improved predictions, with 5-15 % lower prediction errors than when using map information alone. Thus, data on water chemistry measured at river outlets offer information which can complement GIS-based modelling of headwater stream chemistry.

  18. A New Hybrid Method for Improving the Performance of Myocardial Infarction Prediction

    Directory of Open Access Journals (Sweden)

    Hojatollah Hamidi

    2016-06-01

    Full Text Available Abstract Introduction: Myocardial infarction, also known as heart attack, normally occurs due to causes such as smoking, family history, diabetes, and so on. It is recognized as one of the leading causes of death in the world. Therefore, the present study aimed to evaluate the performance of classification models in predicting myocardial infarction, using a feature selection method that combines Forward Selection and a Genetic Algorithm. Materials & Methods: The myocardial infarction data set used in this study contains information on 519 visitors to Shahid Madani Specialized Hospital of Khorramabad, Iran, and includes 33 features. The proposed method is a hybrid feature selection method intended to enhance the performance of classification algorithms. The first step selects features using Forward Selection. In the second step, the selected features are given to a genetic algorithm, in order to select the best features. The classification algorithms AdaBoost, Naïve Bayes, J48 decision tree and simpleCART were applied to the data set with the selected features to predict myocardial infarction. Results: The best results were achieved after applying the proposed feature selection method, obtained with the simpleCART and J48 algorithms with accuracies of 96.53% and 96.34%, respectively. Conclusion: Based on the results, the performance of the classification algorithms is improved, so applying the proposed feature selection method along with classification algorithms can be considered a reliable approach for predicting myocardial infarction.
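
    The two-stage selection can be sketched as forward selection to shortlist features, followed by a small mutation-only genetic algorithm searching subsets of the shortlist. Toy data and a logistic classifier stand in here for the clinical data set and the CART/J48 classifiers used in the study.

```python
# Stage 1: forward selection shortlists features; stage 2: a tiny GA searches
# bit-masks over the shortlist by cross-validated accuracy (toy data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=33, n_informative=5,
                           random_state=0)
clf = LogisticRegression(max_iter=1000)

# Stage 1: forward selection down to 10 candidate features
sfs = SequentialFeatureSelector(clf, n_features_to_select=10,
                                direction="forward", cv=3)
shortlist = np.flatnonzero(sfs.fit(X, y).get_support())

# Stage 2: mutation-only GA over subsets of the shortlist (no crossover,
# kept deliberately small for the sketch)
rng = np.random.default_rng(0)

def fitness(mask):
    if not mask.any():
        return 0.0
    return cross_val_score(clf, X[:, shortlist[mask]], y, cv=3).mean()

pop = rng.integers(0, 2, (12, len(shortlist))).astype(bool)
for _ in range(5):                              # a few generations
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-6:]]      # keep the best half
    children = parents[rng.integers(0, 6, 6)].copy()
    children ^= rng.random(children.shape) < 0.1  # bit-flip mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected features:", shortlist[best])
```

    The GA refines the shortlist because forward selection is greedy: features that look good one at a time are not necessarily the best subset jointly.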

  19. Subject-specific knee joint geometry improves predictions of medial tibiofemoral contact forces

    Science.gov (United States)

    Gerus, Pauline; Sartori, Massimo; Besier, Thor F.; Fregly, Benjamin J.; Delp, Scott L.; Banks, Scott A.; Pandy, Marcus G.; D’Lima, Darryl D.; Lloyd, David G.

    2013-01-01

    Estimating tibiofemoral joint contact forces is important for understanding the initiation and progression of knee osteoarthritis. However, tibiofemoral contact force predictions are influenced by many factors including muscle forces and anatomical representations of the knee joint. This study aimed to investigate the influence of subject-specific geometry and knee joint kinematics on the prediction of tibiofemoral contact forces using a calibrated EMG-driven neuromusculoskeletal model of the knee. One participant fitted with an instrumented total knee replacement walked at a self-selected speed while medial and lateral tibiofemoral contact forces, ground reaction forces, whole-body kinematics, and lower-limb muscle activity were simultaneously measured. The combination of generic and subject-specific knee joint geometry and kinematics resulted in four different OpenSim models used to estimate muscle-tendon lengths and moment arms. The subject-specific geometric model was created from CT scans, and the subject-specific knee joint kinematics representing the translation of the tibia relative to the femur were obtained from fluoroscopy. The EMG-driven model was calibrated using one walking trial, but with three different cost functions that tracked the knee flexion/extension moments with and without constraints on the estimated joint contact forces. The calibrated models then predicted the medial and lateral tibiofemoral contact forces for five other walking trials. The use of subject-specific models with minimization of the peak tibiofemoral contact forces improved the accuracy of the medial contact forces by 47% and of the lateral contact forces by 7%, compared with the use of the generic musculoskeletal model. PMID:24074941

  20. Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) significantly improve prostate cancer detection at initial biopsy in a total PSA range of 2-10 ng/ml.

    Science.gov (United States)

    Ferro, Matteo; Bruzzese, Dario; Perdonà, Sisto; Marino, Ada; Mazzarella, Claudia; Perruolo, Giuseppe; D'Esposito, Vittoria; Cosimato, Vincenzo; Buonerba, Carlo; Di Lorenzo, Giuseppe; Musi, Gennaro; De Cobelli, Ottavio; Chun, Felix K; Terracciano, Daniela

    2013-01-01

    Many efforts to reduce prostate specific antigen (PSA) overdiagnosis and overtreatment have been made. To this aim, the Prostate Health Index (phi) and Prostate Cancer Antigen 3 (PCA3) have been proposed as new, more specific biomarkers. We evaluated the ability of phi and PCA3 to identify prostate cancer (PCa) at initial prostate biopsy in men with total PSA in the range of 2-10 ng/ml. The performance of phi and PCA3 was evaluated in 300 patients undergoing first prostate biopsy. ROC curve analyses tested the accuracy (AUC) of phi and PCA3 in predicting PCa. Decision curve analyses (DCA) were used to compare the clinical benefit of the two biomarkers. We found that the AUC value of phi (0.77) was comparable to those of %p2PSA (0.76) and PCA3 (0.73), with no significant differences in pairwise comparisons (%p2PSA vs phi p = 0.673, %p2PSA vs PCA3 p = 0.417 and phi vs PCA3 p = 0.247). These three biomarkers significantly outperformed fPSA (AUC = 0.60), %fPSA (AUC = 0.62) and p2PSA (AUC = 0.63). In the DCA, phi and PCA3 exhibited a very close net benefit profile up to a threshold probability of 25%, beyond which the phi index showed a higher net benefit than PCA3. Multivariable analysis showed that the addition of phi and PCA3 to the base multivariable model (age, PSA, %fPSA, DRE, prostate volume) increased predictive accuracy, whereas no model improved single-biomarker performance. Finally, we showed that subjects with active surveillance (AS)-compatible cancer had significantly lower phi and PCA3 values. In conclusion, phi and PCA3 comparably increase the accuracy in predicting the presence of PCa in the total PSA range of 2-10 ng/ml at initial biopsy, outperforming the currently used %fPSA.
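
    The decision-curve analysis mentioned above rests on the standard net-benefit formula, NB(pt) = TP/n - (FP/n) * pt/(1 - pt), evaluated across threshold probabilities pt. A sketch on simulated data (not the study's biomarker values):

```python
# Net benefit of a risk marker at several threshold probabilities, comparing
# an informative marker against an uninformative one (simulated data).
import numpy as np

def net_benefit(y, prob, pt):
    """Net benefit of treating patients whose predicted risk is >= pt."""
    pred = prob >= pt
    n = len(y)
    tp = np.sum(pred & (y == 1))
    fp = np.sum(pred & (y == 0))
    return tp / n - fp / n * pt / (1 - pt)

rng = np.random.default_rng(5)
y = rng.integers(0, 2, 500)
prob_marker = np.clip(0.5 * y + rng.uniform(0, 0.5, 500), 0, 1)  # informative
prob_coin = rng.uniform(0, 1, 500)                               # uninformative

for pt in (0.10, 0.25, 0.40):
    print(pt, round(net_benefit(y, prob_marker, pt), 3),
          round(net_benefit(y, prob_coin, pt), 3))
```

    Plotting net benefit against pt for each biomarker gives the decision curves whose crossover around a 25% threshold the abstract describes.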

  1. A tool for safety evaluations of road improvements.

    Science.gov (United States)

    Peltola, Harri; Rajamäki, Riikka; Luoma, Juha

    2013-11-01

    Road safety impact assessments are requested in general, and the directive on road infrastructure safety management makes them compulsory for Member States of the European Union. However, no widely used, science-based safety evaluation tool is available. We demonstrate a safety evaluation tool called TARVA. It uses empirical Bayes (EB) safety predictions as the basis for selecting locations for implementing road-safety improvements and provides estimates of the safety benefits of the selected improvements. Comparing different road accident prediction methods, we demonstrate that the most accurate estimates are produced by EB models, followed by simple accident prediction models, then by assuming the same average number of accidents for every entity, and finally by the accident record alone. Consequently, advanced model-based estimates should be used. Furthermore, we demonstrate regional comparisons that benefit substantially from such tools. Comparisons between districts have revealed significant differences. However, comparisons like these produce useful improvement ideas only after taking into account the differences in road characteristics between areas. Estimates of crash modification factors can be transferred from other countries, but their benefit is greatly limited if the number of target accidents is not properly predicted. Our experience suggests that making predictions and evaluations using the same principles and tools will remarkably improve the quality and comparability of safety estimations. Copyright © 2013 Elsevier Ltd. All rights reserved.
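
    The EB prediction at the heart of such tools blends a model estimate with the site's recorded accident count, weighted by the model's precision. A minimal sketch of the standard empirical Bayes form, with illustrative numbers and an assumed overdispersion parameter:

```python
# Empirical Bayes expected-accident estimate for one site: a weighted blend
# of the accident-prediction-model estimate and the observed count.
def eb_estimate(model_pred, observed, years, overdispersion):
    """EB expected annual accidents for one site (standard EB form)."""
    # weight -> 1 when the model is precise (small overdispersion)
    w = 1.0 / (1.0 + overdispersion * model_pred * years)
    return w * model_pred + (1.0 - w) * (observed / years)

# Site with 9 recorded accidents in 3 years, model prediction 2 per year
print(round(eb_estimate(model_pred=2.0, observed=9, years=3,
                        overdispersion=0.2), 3))
```

    The EB estimate shrinks the raw count toward the model prediction, which is why it outperforms both the accident record alone and the model alone in the comparison reported above.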

  2. pEPito: a significantly improved non-viral episomal expression vector for mammalian cells

    Directory of Open Access Journals (Sweden)

    Ogris Manfred

    2010-03-01

    Full Text Available Abstract Background The episomal replication of the prototype vector pEPI-1 depends on a transcription unit starting from the constitutively expressed Cytomegalovirus immediate early promoter (CMV-IEP) and directed into a 2000 bp long matrix attachment region sequence (MARS) derived from the human β-interferon gene. The original pEPI-1 vector contains two mammalian transcription units and a total of 305 CpG islands, which are located predominantly within the vector elements necessary for bacterial propagation and known to be counterproductive for persistent long-term transgene expression. Results Here, we report the development of a novel vector, pEPito, which is derived from the pEPI-1 plasmid replicon but has considerably improved efficacy both in vitro and in vivo. The pEPito vector is significantly reduced in size, contains only one transcription unit and 60% fewer CpG motifs in comparison to pEPI-1. It exhibits major advantages compared to the original pEPI-1 plasmid, including higher transgene expression levels and increased colony-forming efficiencies in vitro, as well as more persistent transgene expression profiles in vivo. The performance of pEPito-based vectors was further improved by replacing the CMV-IEP with the human CMV enhancer/human elongation factor 1 alpha promoter (hCMV/EF1P) element, which is known to be less affected by epigenetic silencing events. Conclusions The novel vector pEPito can be considered an improved vector for biotechnological applications in vitro and for non-viral gene delivery in vivo.
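
    The CpG content compared above can be quantified by counting CpG dinucleotides along a sequence. A quick sketch; the sequence is a made-up stand-in, not pEPito:

```python
# Count CpG dinucleotides in a DNA sequence (case-insensitive).
def count_cpg(seq):
    seq = seq.upper()
    return sum(1 for i in range(len(seq) - 1) if seq[i:i + 2] == "CG")

vector_fragment = "ATGCGCGTTAACGGGCGATCGA"  # illustrative fragment only
print(count_cpg(vector_fragment))
```

    Running such a count over each vector element shows where the CpG load sits, which is how one would verify a claim like "60% fewer CpG motifs" for a reduced vector.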

  3. Understanding the origin of the solar cyclic activity for an improved earth climate prediction

    Science.gov (United States)

    Turck-Chièze, Sylvaine; Lambert, Pascal

    This review is dedicated to the processes that could explain the origin of the grand extrema of solar activity, with the aim of reaching a more suitable estimate and prediction of the temporal solar variability and its real impact on Earth climate models. The development of this new field is stimulated by the SoHO helioseismic measurements and by recent solar modelling improvements that aim to describe the dynamical processes from the core to the surface. We first recall assumptions on the potential different solar variabilities. Then, we introduce stellar seismology and summarize the main SoHO results that are relevant for this field. Finally, we mention the dynamical processes that are presently introduced in new solar models. We believe that knowledge of two important elements, (1) the magnetic field interplay between the radiative zone and the convective zone and (2) the role of gravity waves, would allow us to understand the origin of the grand minima and maxima observed during the last millennium. Complementary observables such as acoustic and gravity modes, radius and spectral irradiance from far UV to visible, in parallel with the development of 1D-2D-3D simulations, will improve this field. PICARD, SDO and DynaMICCS are key projects for a prediction of the next century's variability. Some helioseismic indicators constitute the first necessary information to properly describe the Sun-Earth climatic connection.

  4. Four-phonon scattering significantly reduces intrinsic thermal conductivity of solids

    Science.gov (United States)

    Feng, Tianli; Lindsay, Lucas; Ruan, Xiulin

    2017-10-01

    For decades, the three-phonon scattering process has been considered to govern thermal transport in solids, while the role of higher-order four-phonon scattering has been persistently unclear and so ignored. However, recent quantitative calculations of three-phonon scattering have often shown a significant overestimation of thermal conductivity as compared to experimental values. In this Rapid Communication we show that four-phonon scattering is generally important in solids and can remedy such discrepancies. For silicon and diamond, the predicted thermal conductivity is reduced by 30% at 1000 K after including four-phonon scattering, bringing predictions into excellent agreement with measurements. For the projected ultrahigh-thermal-conductivity material zinc-blende BAs, a competitor of diamond as a heat sink material, four-phonon scattering is found to be strikingly strong, as three-phonon processes have an extremely limited phase space for scattering. Four-phonon scattering reduces the predicted thermal conductivity from 2200 to 1400 W/m K at room temperature; the reduction at 1000 K is 60%. We also find that optical phonon scattering rates are largely affected, which is important in applications such as phonon bottlenecks in equilibrating electronic excitations. Recognizing that four-phonon scattering is expensive to calculate, we end by providing some guidelines on how to quickly assess the significance of four-phonon scattering, based on energy surface anharmonicity and the scattering phase space. Our work resolves the decades-long fundamental question of the significance of higher-order scattering, and points out ways to improve thermoelectrics, thermal barrier coatings, nuclear materials, and radiative heat transfer.

  5. The effectiveness of research-based physics learning module with predict-observe-explain strategies to improve the student’s competence

    Science.gov (United States)

    Usmeldi

    2018-05-01

    The preliminary study shows that many students find it difficult to master the concepts of physics, and many have not achieved mastery in learning physics. Teachers and students still rely on textbooks, and students rarely do experiments in the laboratory. One model of learning that can improve students' competence is research-based learning with Predict-Observe-Explain (POE) strategies. To implement this learning, research-based physics learning modules with the POE strategy are used. This research aims to determine the effectiveness of implementing research-based physics learning modules with the POE strategy in improving students' competence. The research used a quasi-experimental pretest-posttest control group design. Data were collected using observation sheets, an achievement test, skill assessment sheets, and questionnaires on attitude and student responses to the learning implementation. The results showed that research-based physics learning modules with the POE strategy were effective in improving students' competence, in that (1) mastery learning of physics was achieved by the majority of students, (2) the improvement in competence of the experimental class was in the high category, (3) there was a significant difference between the average competence scores of the experimental class and the control class, (4) the average competence score of the experimental class was higher than that of the control class, and (5) the average score of the students' responses to the learning implementation was in the very good category, meaning that most students can implement research-based learning with POE strategies.

  6. The prognostic significance of UCA1 for predicting clinical outcome in patients with digestive system malignancies.

    Science.gov (United States)

    Liu, Fang-Teng; Dong, Qing; Gao, Hui; Zhu, Zheng-Ming

    2017-06-20

    Urothelial Carcinoma Associated 1 (UCA1) is a lncRNA originally identified in bladder cancer. Previous studies have reported that UCA1 plays a significant role in various types of cancer. This study aimed to clarify the prognostic value of UCA1 in digestive system cancers. The meta-analysis included 15 studies, comprising 1441 patients with digestive system cancers. The pooled results of 14 studies indicated that high expression of UCA1 was significantly associated with poorer OS in patients with digestive system cancers (HR: 1.89, 95% CI: 1.52-2.26). In addition, UCA1 could serve as an independent prognostic factor for predicting OS (HR: 1.85, 95% CI: 1.45-2.25). The pooled results of 3 studies indicated a significant association between UCA1 and DFS in patients with digestive system cancers (HR = 2.50; 95% CI = 1.30-3.69). Statistical significance was also observed in subgroup meta-analyses. Furthermore, the clinicopathological values of UCA1 were discussed in esophageal cancer, colorectal cancer and pancreatic cancer. A comprehensive retrieval was performed to search for studies evaluating the prognostic value of UCA1 in digestive system cancers. Several databases were involved, including PubMed, Web of Science, Embase, the Chinese National Knowledge Infrastructure and the Wanfang database. Quantitative meta-analysis was performed with standard statistical methods and the prognostic significance of UCA1 in digestive system cancers was quantified. An elevated level of UCA1 indicated a poor clinical outcome for patients with digestive system cancers and may serve as a new prognostic biomarker in these cancers.
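
    The pooling step of such a meta-analysis can be sketched with standard inverse-variance (fixed-effect) combination of log hazard ratios, recovering each study's standard error from its 95% CI. The HRs below are illustrative, not the 15 studies analyzed above:

```python
# Inverse-variance pooled hazard ratio from per-study (HR, lower95, upper95).
import math

def pool_hr(hrs_with_ci):
    """Fixed-effect pooled HR and 95% CI from (HR, lo, hi) tuples."""
    num = den = 0.0
    for hr, lo, hi in hrs_with_ci:
        log_hr = math.log(hr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the 95% CI
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * log_hr
        den += w
    mean = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(mean - 1.96 * se_pooled), math.exp(mean + 1.96 * se_pooled))
    return math.exp(mean), ci

studies = [(1.8, 1.2, 2.7), (2.1, 1.4, 3.2), (1.6, 1.0, 2.6)]
pooled, ci = pool_hr(studies)
print(f"pooled HR {pooled:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

    A pooled CI lying entirely above 1 is what supports a conclusion like "elevated UCA1 indicates poorer OS"; a random-effects model would additionally widen the CI for between-study heterogeneity.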

  7. Stabilizing intermediate-term medium-range earthquake predictions

    International Nuclear Information System (INIS)

    Kossobokov, V.G.; Romashkova, L.L.; Panza, G.F.; Peresan, A.

    2001-12-01

    A new scheme for the application of the intermediate-term medium-range earthquake prediction algorithm M8 is proposed. The scheme accounts for the natural distribution of seismic activity, eliminates the subjectivity in the positioning of the areas of investigation and provides additional stability of the predictions with respect to the original variant. According to the retroactive testing in Italy and adjacent regions, this improvement is achieved without any significant change of the alarm volume in comparison with the results published so far. (author)

  8. Predictive significance of standardized uptake value parameters of FDG-PET in patients with non-small cell lung carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Duan, X-Y.; Wang, W.; Li, M.; Li, Y.; Guo, Y-M. [PET-CT Center, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi (China)]

    2015-02-03

    {sup 18}F-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET)/computed tomography (CT) is widely used to diagnose and stage non-small cell lung cancer (NSCLC). The aim of this retrospective study was to evaluate the predictive ability of different FDG standardized uptake values (SUVs) in 74 patients with newly diagnosed NSCLC. {sup 18}F-FDG PET/CT scans were performed, different SUV parameters (SUV{sub max}, SUV{sub avg}, SUV{sub T/L}, and SUV{sub T/A}) were obtained, and their relationship with clinical characteristics was investigated. Meanwhile, correlation and multiple stepwise regression analyses were performed to determine the primary SUV predictor for NSCLC. Age, gender, and tumor size significantly affected SUV parameters. The mean SUVs of squamous cell carcinoma were higher than those of adenocarcinoma. Poorly differentiated tumors exhibited higher SUVs than well-differentiated ones. Further analyses based on the pathologic type revealed that the SUV{sub max}, SUV{sub avg}, and SUV{sub T/L} of poorly differentiated adenocarcinoma tumors were higher than those of moderately or well-differentiated tumors. Among these four SUV parameters, SUV{sub T/L} was the primary predictor for tumor differentiation. However, in adenocarcinoma, SUV{sub max} was the determining factor for tumor differentiation. Our results showed that these four SUV parameters had predictive significance for NSCLC tumor differentiation; SUV{sub T/L} appeared to be most useful overall, but SUV{sub max} was the best index for adenocarcinoma tumor differentiation.

  9. Sub-seasonal prediction of significant wave heights over the Western Pacific and Indian Oceans, part II: The impact of ENSO and MJO

    Science.gov (United States)

    Shukla, Ravi P.; Kinter, James L.; Shin, Chul-Su

    2018-03-01

    This study evaluates the effect of El Niño-Southern Oscillation (ENSO) and Madden-Julian Oscillation (MJO) events on 14-day mean significant wave height (SWH) at 3 weeks lead time (Wk34) over the Western Pacific and Indian Oceans using the National Centers for Environmental Prediction (NCEP) Climate Forecast System, version 2 (CFSv2). The WAVEWATCH-3 (WW3) model is forced with daily 10-m winds predicted by a modified version of CFSv2 that is initialized with multiple ocean analyses in both January and May for 1979-2008. A significant anomaly correlation of predicted and observed SWH anomalies (SWHA) at Wk34 lead time is found over portions of the domain, including the central western Pacific, South China Sea (SCS), Bay of Bengal (BOB) and southern Indian Ocean (IO) in January cases, and over the BOB, equatorial western Pacific, the Maritime Continent and southern IO in May cases. The model successfully predicts almost all the important features of the observed composite SWHA during El Niño events in January, including negative SWHA in the central IO, where westerly wind anomalies act on an easterly mean state, and positive SWHA over the Southern Ocean (SO), where westerly wind anomalies act on a westerly mean state. The model successfully predicts the sign and magnitude of SWHA at Wk34 lead time in May over the BOB and SCS in composites of combined phases 2-3 and phases 6-7 of the MJO. The observed leading mode of SWHA in May and the third mode of SWHA in January are influenced by the combined effects of ENSO and the MJO. Based on spatial and temporal correlations, the spatial patterns of SWHA in the model at Wk34 in both January and May are in good agreement with the observations over the equatorial western Pacific, equatorial and southern IO, and SO.
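    The skill metric referred to above, the anomaly correlation between predicted and observed fields, can be sketched in a few lines; the variable names are illustrative and the climatology is assumed to be supplied separately.

```python
import numpy as np

def anomaly_correlation(pred, obs, clim):
    """Centered anomaly correlation between a predicted and an observed
    field, given a common climatology over the same grid points."""
    pa = pred - clim           # predicted anomalies
    oa = obs - clim            # observed anomalies
    pa = pa - pa.mean()        # center (remove domain-mean anomaly)
    oa = oa - oa.mean()
    return float((pa * oa).sum() / np.sqrt((pa ** 2).sum() * (oa ** 2).sum()))
```

    A perfect forecast of the anomalies yields 1.0 and a sign-reversed one yields -1.0, which makes the metric convenient for comparing lead times such as Wk34 across regions.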

  10. Spatial Economics Model Predicting Transport Volume

    Directory of Open Access Journals (Sweden)

    Lu Bo

    2016-10-01

    Full Text Available It is extremely important to predict logistics requirements in a scientific and rational way. In recent years, however, prediction methods have improved little: the traditional statistical prediction method suffers from low precision and poor interpretability, so it can neither guarantee the generalization ability of the prediction model theoretically nor explain the model effectively. Therefore, combining theories from spatial economics, industrial economics, and neo-classical economics, and taking the city of Zhuanghe as the research object, this study identifies the leading industries that generate large cargo volumes and further predicts the static logistics generation of Zhuanghe and its hinterland. By integrating the various factors that affect regional logistics requirements, the study establishes a logistics requirements potential model based on spatial economic principles, extending logistics requirements prediction from purely statistical principles to the new area of spatial and regional economics.

  11. Improving Prediction of Large-scale Regime Transitions

    Science.gov (United States)

    Gyakum, J. R.; Roebber, P.; Bosart, L. F.; Honor, A.; Bunker, E.; Low, Y.; Hart, J.; Bliankinshtein, N.; Kolly, A.; Atallah, E.; Huang, Y.

    2017-12-01

    Cool-season atmospheric predictability over the CONUS on subseasonal time scales (1-4 weeks) is critically dependent upon the structure, configuration, and evolution of the North Pacific jet stream (NPJ). The NPJ can be perturbed on its tropical side on synoptic time scales by recurving and transitioning tropical cyclones (TCs) and on subseasonal time scales by longitudinally varying convection associated with the Madden-Julian Oscillation (MJO). Likewise, the NPJ can be perturbed on its poleward side on synoptic time scales by midlatitude and polar disturbances that originate over the Asian continent. These midlatitude and polar disturbances can often trigger downstream Rossby wave propagation across the North Pacific, North America, and the North Atlantic. The project team is investigating the following multiscale processes and features: the spatiotemporal distribution of cyclone clustering over the Northern Hemisphere; cyclone clustering as influenced by atmospheric blocking and the phases and amplitudes of the major teleconnection indices, ENSO and the MJO; composite and case study analyses of representative cyclone clustering events to establish the governing dynamics; regime change predictability horizons associated with cyclone clustering events; Arctic air mass generation and modification; life cycles of the MJO; and poleward heat and moisture transports of subtropical air masses. A critical component of the study is weather regime classification. These classifications are defined through: the spatiotemporal clustering of surface cyclogenesis; a general circulation metric combining data at 500 hPa and the dynamic tropopause; and Self-Organizing Maps (SOMs) constructed from dynamic tropopause and 850-hPa equivalent potential temperature data. The resultant lattice of nodes is used to categorize synoptic classes and their predictability, as well as to determine the robustness of the CFSv2 model climate relative to observations. Transition pathways between these

  12. Prostate health index (phi) and prostate cancer antigen 3 (PCA3) significantly improve diagnostic accuracy in patients undergoing prostate biopsy.

    Science.gov (United States)

    Perdonà, Sisto; Bruzzese, Dario; Ferro, Matteo; Autorino, Riccardo; Marino, Ada; Mazzarella, Claudia; Perruolo, Giuseppe; Longo, Michele; Spinelli, Rosa; Di Lorenzo, Giuseppe; Oliva, Andrea; De Sio, Marco; Damiano, Rocco; Altieri, Vincenzo; Terracciano, Daniela

    2013-02-15

    Prostate health index (phi) and prostate cancer antigen 3 (PCA3) have recently been proposed as novel biomarkers for prostate cancer (PCa). We assessed the diagnostic performance of these biomarkers, alone or in combination, in men undergoing first prostate biopsy for suspicion of PCa. One hundred sixty male subjects were enrolled in this prospective observational study. PSA molecular forms, phi index (Beckman Coulter immunoassay), PCA3 score (Progensa PCA3 assay), and other established biomarkers (tPSA, fPSA, and %fPSA) were assessed before patients underwent an 18-core first prostate biopsy. The ability of the Beckman Coulter phi, the PCA3 score and the other biomarkers to discriminate between PCa-negative and PCa-positive biopsies was determined. One hundred sixty patients met the inclusion criteria. %p2PSA (p2PSA/fPSA × 100), phi and PCA3 were significantly higher in patients with PCa than in the PCa-negative group (median values: 1.92 vs. 1.55, 49.97 vs. 36.84, and 50 vs. 32, respectively; P ≤ 0.001). ROC curve analysis showed that %p2PSA, phi, and PCA3 are good indicators of malignancy (AUCs = 0.68, 0.71, and 0.66, respectively). A multivariable logistic regression model including both the phi index and the PCA3 score reached an overall diagnostic accuracy of 0.77. Decision curve analysis revealed that this "combined" marker achieved the highest net benefit over the examined range of threshold probabilities. phi and PCA3 showed no significant difference in the ability to predict PCa diagnosis in men undergoing first prostate biopsy; however, diagnostic performance is significantly improved by combining them. Copyright © 2012 Wiley Periodicals, Inc.
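    The AUC figures quoted above summarize how well a marker (or a combined logistic score) separates positive from negative biopsies; an AUC can be computed directly from scores and outcomes via the Mann-Whitney statistic. A minimal sketch; the scores and labels below are invented for illustration and the study's actual regression coefficients are not reproduced.

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of positive/negative pairs ranked correctly
    (ties get half credit)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical combined-marker scores and biopsy outcomes (1 = PCa-positive).
combined = [0.82, 0.74, 0.55, 0.31, 0.28]
biopsy = [1, 1, 0, 1, 0]
```

    On these made-up data, five of the six positive/negative pairs are ranked correctly, giving an AUC of about 0.83.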

  13. Improved therapy-success prediction with GSS estimated from clinical HIV-1 sequences.

    Science.gov (United States)

    Pironti, Alejandro; Pfeifer, Nico; Kaiser, Rolf; Walter, Hauke; Lengauer, Thomas

    2014-01-01

    Rules-based HIV-1 drug-resistance interpretation (DRI) systems disregard many amino-acid positions of the drug's target protein. The aims of this study are (1) the development of a drug-resistance interpretation system that is based on HIV-1 sequences from clinical practice rather than hard-to-get phenotypes, and (2) the assessment of the benefit of taking all available amino-acid positions into account for DRI. A dataset containing 34,934 therapy-naïve and 30,520 drug-exposed HIV-1 pol sequences with treatment history was extracted from the EuResist database and the Los Alamos National Laboratory database. 2,550 therapy-change-episode baseline sequences (TCEB) were assigned to test set A. Test set B contains 1,084 TCEB from the HIVdb TCE repository. Sequences from patients absent in the test sets were used to train three linear support vector machines to produce scores that predict drug exposure pertaining to each of 20 antiretrovirals: the first one uses the full amino-acid sequences (DEfull), the second considers only IAS drug-resistance positions (DEonlyIAS), and the third disregards IAS drug-resistance positions (DEnoIAS). For performance comparison, test sets A and B were evaluated with DEfull, DEnoIAS, DEonlyIAS, geno2pheno[resistance], HIVdb, ANRS, HIV-GRADE, and REGA. Clinically validated cut-offs were used to convert the continuous output of the first four methods into susceptible-intermediate-resistant (SIR) predictions. With each method, a genetic susceptibility score (GSS) was calculated for each therapy episode in each test set by converting the SIR predictions for its compounds to integers: S=2, I=1, and R=0. The GSS were used to predict therapy success as defined by the EuResist standard datum definition. Statistical significance was assessed using a Wilcoxon signed-rank test. A comparison of the therapy-success prediction performances among the different interpretation systems for test set A can be found in Table 1, while those for test set
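    The GSS construction described in the abstract (summing integer-converted SIR calls over a regimen's compounds) is straightforward to sketch; the three-drug example regimen below is hypothetical.

```python
# Integer conversion of SIR predictions, as defined in the study.
SIR_POINTS = {"S": 2, "I": 1, "R": 0}

def genetic_susceptibility_score(sir_calls):
    """Sum the integer-converted SIR predictions over the compounds
    of one therapy episode to obtain its GSS."""
    return sum(SIR_POINTS[call] for call in sir_calls)

# A hypothetical three-drug regimen predicted S, S, I:
gss = genetic_susceptibility_score(["S", "S", "I"])  # → 5
```

    The resulting per-episode score is what the study then relates to therapy success under the EuResist standard datum definition.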

  14. Preoperative prediction of reversible myocardial asynergy by postexercise radionuclide ventriculography

    International Nuclear Information System (INIS)

    Rozanski, A.; Berman, D.; Gray, R.; Diamond, G.; Raymond, M.; Prause, J.; Maddahi, J.; Swan, H.J.; Matloff, J.

    1982-01-01

    Myocardial asynergy is sometimes reversed by coronary bypass, and a noninvasive method of predicting which asynergic segments are reversible would be desirable. To assess whether changes in myocardial wall motion observed immediately after exercise can differentiate reversible from nonreversible myocardial asynergy, we evaluated 53 patients by radionuclide ventriculography before and after exercise and again at rest after coronary bypass surgery. Preoperative improvement in wall motion immediately after exercise was highly predictive of the surgical outcome (average chance-corrected agreement, 91 per cent). At surgery the asynergic segments that had improved after exercise were free of grossly apparent epicardial scarring. The accuracy of these predictions for postoperative improvement was significantly greater (P < 0.01) than that of analysis of Q waves on resting electrocardiography (average chance-corrected agreement, 40 per cent). In contrast, preoperative changes in left ventricular ejection fraction after exercise were not predictive of postoperative resting ejection fraction. We conclude that postexercise radionuclide ventriculography can be used to identify reversible resting myocardial asynergy. This test should prove effective in predicting which patients with myocardial asynergy are most likely to benefit from aortocoronary revascularization
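    "Chance-corrected agreement" is commonly computed as Cohen's kappa. A minimal sketch for binary predicted-vs-observed improvement calls; the example data are illustrative, not the study's.

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary ratings
    (kappa = (observed - chance agreement) / (1 - chance agreement))."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa1 = sum(a) / n                             # positive rate, rater a
    pb1 = sum(b) / n                             # positive rate, rater b
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)       # agreement expected by chance
    return (po - pe) / (1 - pe)

# Illustrative predicted vs. observed improvement for five segments.
kappa = cohens_kappa([1, 0, 1, 0, 1], [1, 0, 1, 1, 1])  # ≈ 0.55
```

    Kappa is 1.0 for perfect agreement and 0.0 when agreement is no better than chance, which is why the abstract's 91 vs. 40 per cent figures are directly comparable despite different base rates.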

  15. A method for improving predictive modeling by taking into account lag time: Example of selenium bioaccumulation in a flowing system

    Energy Technology Data Exchange (ETDEWEB)

    Beckon, William N., E-mail: William_Beckon@fws.gov

    2016-07-15

    Highlights: • A method for estimating response time in cause-effect relationships is demonstrated. • Predictive modeling is appreciably improved by taking into account this lag time. • Bioaccumulation lag is greater for organisms at higher trophic levels. • This methodology may be widely applicable in disparate disciplines. - Abstract: For bioaccumulative substances, efforts to predict concentrations in organisms at upper trophic levels, based on measurements of environmental exposure, have been confounded by the appreciable but hitherto unknown amount of time it may take for bioaccumulation to occur through various pathways and across several trophic transfers. The study summarized here demonstrates an objective method of estimating this lag time by testing a large array of potential lag times for selenium bioaccumulation, selecting the lag that provides the best regression between environmental exposure (concentration in ambient water) and concentration in the tissue of the target organism. Bioaccumulation lag is generally greater for organisms at higher trophic levels, reaching times of more than a year in piscivorous fish. Predictive modeling of bioaccumulation is improved appreciably by taking into account this lag. More generally, the method demonstrated here may improve the accuracy of predictive modeling in a wide variety of other cause-effect relationships in which lag time is substantial but inadequately known, in disciplines as diverse as climatology (e.g., the effect of greenhouse gases on sea levels) and economics (e.g., the effects of fiscal stimulus on employment).
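    The lag-estimation procedure described above (testing an array of candidate lags and keeping the one that gives the best exposure-tissue regression) can be sketched with a simple correlation screen; the synthetic series below is an assumption for illustration, not the study's selenium data.

```python
import numpy as np

def best_lag(exposure, tissue, max_lag):
    """Return the lag (in sampling steps) that maximizes the correlation
    between ambient exposure and tissue concentration, plus that correlation."""
    best, best_r = 0, -np.inf
    for lag in range(max_lag + 1):
        x = exposure[: len(exposure) - lag]  # exposure leading by `lag` steps
        y = tissue[lag:]                     # tissue responding `lag` steps later
        r = np.corrcoef(x, y)[0, 1]
        if r > best_r:
            best, best_r = lag, r
    return best, best_r

# Synthetic example: tissue tracks exposure with a 3-step bioaccumulation lag.
exposure = np.sin(np.linspace(0.0, 6.0, 50))
tissue = np.concatenate([exposure[:1].repeat(3), exposure[:-3]])
lag, r = best_lag(exposure, tissue, max_lag=6)
```

    In practice one would screen a regression statistic (e.g. R² of the fitted exposure-tissue model) rather than a raw correlation, but the selection logic is the same.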

  16. A method for improving predictive modeling by taking into account lag time: Example of selenium bioaccumulation in a flowing system

    International Nuclear Information System (INIS)

    Beckon, William N.

    2016-01-01

    Highlights: • A method for estimating response time in cause-effect relationships is demonstrated. • Predictive modeling is appreciably improved by taking into account this lag time. • Bioaccumulation lag is greater for organisms at higher trophic levels. • This methodology may be widely applicable in disparate disciplines. - Abstract: For bioaccumulative substances, efforts to predict concentrations in organisms at upper trophic levels, based on measurements of environmental exposure, have been confounded by the appreciable but hitherto unknown amount of time it may take for bioaccumulation to occur through various pathways and across several trophic transfers. The study summarized here demonstrates an objective method of estimating this lag time by testing a large array of potential lag times for selenium bioaccumulation, selecting the lag that provides the best regression between environmental exposure (concentration in ambient water) and concentration in the tissue of the target organism. Bioaccumulation lag is generally greater for organisms at higher trophic levels, reaching times of more than a year in piscivorous fish. Predictive modeling of bioaccumulation is improved appreciably by taking into account this lag. More generally, the method demonstrated here may improve the accuracy of predictive modeling in a wide variety of other cause-effect relationships in which lag time is substantial but inadequately known, in disciplines as diverse as climatology (e.g., the effect of greenhouse gases on sea levels) and economics (e.g., the effects of fiscal stimulus on employment).

  17. Predictive analytics technology review: Similarity-based modeling and beyond

    Energy Technology Data Exchange (ETDEWEB)

    Herzog, James; Doan, Don; Gandhi, Devang; Nieman, Bill

    2010-09-15

    Over 11 years ago, SmartSignal introduced Predictive Analytics for eliminating equipment failures, using its patented SBM technology. SmartSignal continues to lead the market and, in 2010, went one step further and introduced Predictive Diagnostics. Now, SmartSignal is combining Predictive Diagnostics with RCM methodology and industry expertise. FMEA logic reengineers maintenance work management, eliminates unneeded inspections, and focuses efforts on the real issues. This integrated solution significantly lowers maintenance costs, protects against critical asset failures, improves commercial availability, and reduces work orders by 20-40%.

  18. NOAA's National Air Quality Predictions and Development of Aerosol and Atmospheric Composition Prediction Components for the Next Generation Global Prediction System

    Science.gov (United States)

    Stajner, I.; Hou, Y. T.; McQueen, J.; Lee, P.; Stein, A. F.; Tong, D.; Pan, L.; Huang, J.; Huang, H. C.; Upadhayay, S.

    2016-12-01

    NOAA provides operational air quality predictions using the National Air Quality Forecast Capability (NAQFC): ozone and wildfire smoke for the United States and airborne dust for the contiguous 48 states at http://airquality.weather.gov. NOAA's predictions of fine particulate matter (PM2.5) became publicly available in February 2016. Ozone and PM2.5 predictions are produced using a system that operationally links the Community Multiscale Air Quality (CMAQ) model with meteorological inputs from the North American Mesoscale forecast model (NAM). Smoke and dust predictions are provided using the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model. Current NAQFC focus is on updating CMAQ to version 5.0.2, improving PM2.5 predictions, and updating emissions estimates, especially for NOx using recently observed trends. Wildfire smoke emissions from a newer version of the USFS BlueSky system are being included in a new configuration of the NAQFC NAM-CMAQ system, which is re-run for the previous 24 hours in which wildfires were observed from satellites, to better represent wildfire emissions prior to initiating predictions for the next 48 hours. In addition, NOAA is developing the Next Generation Global Prediction System (NGGPS) to represent the earth system for extended weather prediction. NGGPS will include a representation of atmospheric dynamics, physics, aerosols and atmospheric composition as well as coupling with ocean, wave, ice and land components. NGGPS is being developed with broad community involvement, including community-developed components and academic research to develop and test improvements for potential inclusion in NGGPS. Several investigators at NOAA's research laboratories and in academia are working to improve the aerosol and gaseous chemistry representation for NGGPS, to develop and evaluate the representation of atmospheric composition, and to establish and improve the coupling with radiation and microphysics.

  19. Ensemble-based prediction of RNA secondary structures.

    Science.gov (United States)

    Aghaeepour, Nima; Hoos, Holger H

    2013-04-24

    Accurate structure prediction methods play an important role in the understanding of RNA function. Energy-based, pseudoknot-free secondary structure prediction is one of the most widely used and versatile approaches, and improved methods for this task have received much attention over the past five years. Despite the impressive progress that has been achieved in this area, existing evaluations of the prediction accuracy achieved by various algorithms do not provide a comprehensive, statistically sound assessment. Furthermore, while there is increasing evidence that no prediction algorithm consistently outperforms all others, no work has been done to exploit the complementary strengths of multiple approaches. In this work, we present two contributions to the area of RNA secondary structure prediction. Firstly, we use state-of-the-art, resampling-based statistical methods together with a previously published and increasingly widely used dataset of high-quality RNA structures to conduct a comprehensive evaluation of existing RNA secondary structure prediction procedures. The results from this evaluation clarify the performance relationship between ten well-known existing energy-based pseudoknot-free RNA secondary structure prediction methods and clearly demonstrate the progress that has been achieved in recent years. Secondly, we introduce AveRNA, a generic and powerful method for combining a set of existing secondary structure prediction procedures into an ensemble-based method that achieves significantly higher prediction accuracies than obtained from any of its component procedures. Our new, ensemble-based method, AveRNA, improves the state of the art for energy-based, pseudoknot-free RNA secondary structure prediction by exploiting the complementary strengths of multiple existing prediction procedures, as demonstrated using a state-of-the-art statistical resampling approach. In addition, AveRNA allows an intuitive and effective control of the trade-off between

  20. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    Science.gov (United States)

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. miRNAs are therefore involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology today. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts at biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely random forest combined with the SMOTE procedure, which copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and delivered two-fold, 20-fold, and 6-fold increases in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold through machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
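    The core idea of SMOTE, generating synthetic minority-class samples by interpolating between a sample and one of its k nearest minority neighbours, is compact enough to sketch. This is a simplified illustration of the technique, not the implementation used in Mirnacle (which relies on an established SMOTE library together with a random forest classifier).

```python
import numpy as np

def smote(minority, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating each
    chosen sample toward one of its k nearest minority neighbours."""
    rng = rng or np.random.default_rng(0)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        # Distances to all minority samples; index 0 of argsort is the point itself.
        d = np.linalg.norm(minority - minority[i], axis=1)
        neighbours = np.argsort(d)[1 : k + 1]
        j = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        out.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.array(out)

# Illustrative 2-D minority class with 12 samples, oversampled to 30 extras.
minority = np.random.default_rng(1).normal(size=(12, 2))
synth = smote(minority, 30, k=4)
```

    Because each synthetic point is a convex combination of two real minority samples, the oversampled set stays inside the minority class's feature-space envelope, which is what lets the subsequent classifier see a balanced problem without inventing outliers.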

  1. Review of the quality of total mesorectal excision does not improve the prediction of outcome.

    Science.gov (United States)

    Demetter, P; Jouret-Mourin, A; Silversmit, G; Vandendael, T; Sempoux, C; Hoorens, A; Nagy, N; Cuvelier, C; Van Damme, N; Penninckx, F

    2016-09-01

    A fair to moderate concordance in grading of the total mesorectal excision (TME) surgical specimen by local pathologists and a central review panel has been observed in the PROCARE (Project on Cancer of the Rectum) project. The aim of the present study was to evaluate the difference, if any, in the accuracy of predicting the oncological outcome through TME grading by local pathologists or by the review panel. The quality of the TME specimen was reviewed for 482 surgical specimens registered on a prospective database between 2006 and 2011. Patients with a Stage IV tumour, with unknown incidence date or without follow-up information were excluded, resulting in a study population of 383 patients. Quality assessment of the specimen was based on three grades including mesorectal resection (MRR), intramesorectal resection (IMR) and muscularis propria resection (MPR). Using univariable Cox regression models, local and review panel histopathological gradings of the quality of TME were assessed as predictors of local recurrence, distant metastasis and disease-free and overall survival. Differences in the predictions between local and review grading were determined. Resection planes were concordant in 215 (56.1%) specimens. Downgrading from MRR to MPR was noted in 23 (6.0%). There were no significant differences in the prediction error between the two models; local and central review TME grading predicted the outcome equally well. Any difference in grading of the TME specimen between local histopathologists and the review panel had no significant impact on the prediction of oncological outcome for this patient cohort. Grading of the quality of TME as reported by local histopathologists can therefore be used for outcome analysis. Quality control of TME grading is not warranted provided the histopathologist is adequately trained. Colorectal Disease © 2016 The Association of Coloproctology of Great Britain and Ireland.

  2. Simulated rat intestinal fluid improves oral exposure prediction for poorly soluble compounds over a wide dose range

    Directory of Open Access Journals (Sweden)

    Joerg Berghausen

    2016-03-01

    Full Text Available Solubility can be the absorption limiting factor for drug candidates and is therefore a very important input parameter for oral exposure prediction of compounds with limited solubility. Biorelevant media of the fasted and fed state have been published for humans, as well as for dogs in the fasted state. In a drug discovery environment, rodents are the most common animal model to assess the oral exposure of drug candidates. In this study a rat simulated intestinal fluid (rSIF is proposed as a more physiologically relevant media to describe drug solubility in rats. Equilibrium solubility in this medium was tested as input parameter for physiologically-based pharmacokinetics (PBPK simulations of oral pharmacokinetics in the rat. Simulations were compared to those obtained using other solubility values as input parameters, like buffer at pH 6.8, human simulated intestinal fluid and a comprehensive dissolution assay based on rSIF. Our study on nine different compounds demonstrates that the incorporation of rSIF equilibrium solubility values into PBPK models of oral drug exposure can significantly improve the reliability of simulations in rats for doses up to 300 mg/kg compared to other media. The comprehensive dissolution assay may help to improve further simulation outcome, but the greater experimental effort as compared to equilibrium solubility may limit its use in a drug discovery environment. Overall, PBPK simulations based on solubility in the proposed rSIF medium can improve prioritizing compounds in drug discovery as well as planning dose escalation studies, e.g. during toxicological investigations.

  3. Repeated assessments of symptom severity improve predictions for risk of death among patients with cancer.

    Science.gov (United States)

    Sutradhar, Rinku; Atzema, Clare; Seow, Hsien; Earle, Craig; Porter, Joan; Barbera, Lisa

    2014-12-01

    Although prior studies show the importance of self-reported symptom scores as predictors of cancer survival, most are based on scores recorded at a single point in time. To show that information on repeated assessments of symptom severity improves predictions for risk of death and to use updated symptom information for determining whether worsening of symptom scores is associated with a higher hazard of death. This was a province-based longitudinal study of adult outpatients who had a cancer diagnosis and had assessments of symptom severity. We implemented a time-to-death Cox model with a time-varying covariate for each symptom to account for changing symptom scores over time. This model was compared with that using only a time-fixed (baseline) covariate for each symptom. The regression coefficients of each model were derived based on a randomly selected 60% of patients, and then, the predictive performance of each model was assessed via concordance probabilities when applied to the remaining 40% of patients. This study had 66,112 patients diagnosed with cancer and more than 310,000 assessments of symptoms. The use of repeated assessments of symptom scores improved predictions for risk of death compared with using only baseline symptom scores. Increased pain and fatigue and reduced appetite were the strongest predictors for death. If available, researchers should consider including changing information on symptom scores, as opposed to only baseline information on symptom scores, when examining hazard of death among patients with cancer. Worsening of pain, fatigue, and appetite may be a flag for impending death. Copyright © 2014 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
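    The key data-preparation step for a Cox model with a time-varying symptom covariate, as used above, is expanding each patient's repeated assessments into (start, stop] intervals that each carry the most recent score. A minimal sketch; the data layout and day units are assumptions for illustration.

```python
def expand_time_varying(assessments, followup_end):
    """Expand repeated (time, score) assessments into (start, stop, score)
    rows for a counting-process Cox model with a time-varying covariate.

    assessments: list of (time, score) pairs sorted by time, first at time 0.
    followup_end: time of death or censoring.
    """
    rows = []
    # Each score applies from its assessment until the next assessment...
    for (t0, score), (t1, _) in zip(assessments, assessments[1:]):
        rows.append((t0, t1, score))
    # ...and the last score applies until the end of follow-up.
    last_t, last_score = assessments[-1]
    rows.append((last_t, followup_end, last_score))
    return rows

# A hypothetical patient assessed at days 0, 30 and 90, followed to day 120:
rows = expand_time_varying([(0, 2), (30, 5), (90, 7)], 120)
# → [(0, 30, 2), (30, 90, 5), (90, 120, 7)]
```

    A survival library (e.g. one supporting counting-process input) would then fit the Cox model on these rows, so each risk set sees the symptom score that was current at that time rather than only the baseline value.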

  4. A Noise Trimming and Positional Significance of Transposon Insertion System to Identify Essential Genes in Yersinia pestis

    Science.gov (United States)

    Yang, Zheng Rong; Bullifent, Helen L.; Moore, Karen; Paszkiewicz, Konrad; Saint, Richard J.; Southern, Stephanie J.; Champion, Olivia L.; Senior, Nicola J.; Sarkar-Tyson, Mitali; Oyston, Petra C. F.; Atkins, Timothy P.; Titball, Richard W.

    2017-02-01

    Massively parallel sequencing technology coupled with saturation mutagenesis has provided new and global insights into gene functions and roles. At a simplistic level, the frequency of mutations within genes can indicate the degree of essentiality. However, this approach neglects to take account of the positional significance of mutations - the function of a gene is less likely to be disrupted by a mutation close to the distal ends. Therefore, a systematic bioinformatics approach to improve the reliability of essential gene identification is desirable. We report here a parametric model which introduces a novel mutation feature together with a noise trimming approach to predict the biological significance of Tn5 mutations. We show improved performance of essential gene prediction in the bacterium Yersinia pestis, the causative agent of plague. This method would have broad applicability to other organisms and to the identification of genes which are essential for competitiveness or survival under a broad range of stresses.

  5. Prediction of line failure fault based on weighted fuzzy dynamic clustering and improved relational analysis

    Science.gov (United States)

    Meng, Xiaocheng; Che, Renfei; Gao, Shi; He, Juntao

    2018-04-01

    With the advent of the big data era, power system research has entered a new stage. At present, the main application of big data in power systems is early-warning analysis for power equipment: by collecting historical fault data, system security is improved by predicting the failure rates of different kinds of equipment under given relational factors. In this paper, a method for line failure rate warning is proposed. First, fuzzy dynamic clustering is carried out on the collected historical information; to account for the imbalance between attributes, weights derived from coefficients of variation are assigned, so that the weighted fuzzy clustering handles the data more effectively. Then, based on the basic ideas and properties of relational analysis theory, the grey relational model is improved by combining a slope-based measure with the Deng model, and the incremental trends of the two sequences are also incorporated to obtain the grey relational degree between samples. The failure rate is then predicted according to the principle of weighting. Finally, the concrete process is illustrated with an example, and the validity and superiority of the proposed method are verified.
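
    The classical Deng grey relational degree that the improved model builds on can be sketched as follows; the slope-based refinement described in the abstract is not reproduced here, and `rho` is the usual distinguishing coefficient, conventionally 0.5.

```python
def grey_relational_degree(reference, comparison, rho=0.5):
    """Deng's grey relational degree between a reference sequence and a
    comparison sequence (both assumed pre-normalized), averaged over the
    pointwise grey relational coefficients."""
    deltas = [abs(r - c) for r, c in zip(reference, comparison)]
    d_min, d_max = min(deltas), max(deltas)
    if d_max == 0:
        return 1.0  # identical sequences are perfectly related
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    return sum(coeffs) / len(coeffs)

print(grey_relational_degree([1.0, 2.0, 3.0], [1.0, 2.0, 4.0]))  # ≈ 0.778
```

    Values closer to 1 indicate a stronger relation between the historical fault sequence and the sample under analysis.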

  6. An Inventory Controlled Supply Chain Model Based on Improved BP Neural Network

    Directory of Open Access Journals (Sweden)

    Wei He

    2013-01-01

    Full Text Available Inventory control is a key factor in reducing supply chain cost and increasing customer satisfaction. However, predicting inventory levels is a challenging task for managers. As one of the widely used techniques for inventory control, the standard BP neural network suffers from a low convergence rate and poor prediction accuracy. To address these problems, a new fast-convergent BP neural network model for predicting inventory levels is developed in this paper. By adding an error offset, the paper derives a new chain propagation rule and a new weight-update formula. The improved BP neural network model is then applied to predict the inventory level of an automotive parts company. The results show that the improved algorithm not only converges significantly faster than the standard algorithm but also outperforms several other improved BP algorithms in both convergence rate and prediction accuracy.

  7. Dynamic Filtering Improves Attentional State Prediction with fNIRS

    Science.gov (United States)

    Harrivel, Angela R.; Weissman, Daniel H.; Noll, Douglas C.; Huppert, Theodore; Peltier, Scott J.

    2016-01-01

    Brain activity can predict a person's level of engagement in an attentional task. However, estimates of brain activity are often confounded by measurement artifacts and systemic physiological noise. The optimal method for filtering this noise, and thereby increasing state prediction accuracy, remains unclear. To investigate this, we asked study participants to perform an attentional task while we monitored their brain activity with functional near-infrared spectroscopy (fNIRS). We observed higher state prediction accuracy when noise in the fNIRS hemoglobin [Hb] signals was filtered with a non-stationary (adaptive) model as compared with static regression (84% ± 6% versus 72% ± 15%).
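
    The abstract does not specify which adaptive model was used; as a generic illustration of the difference between a static fit and a non-stationary one, here is a least-mean-squares (LMS) update, one of the simplest adaptive filters, which re-estimates its weights with every new sample. All names and constants below are illustrative, not taken from the study.

```python
def lms_step(weights, x, d, mu=0.1):
    """One LMS adaptation step: predict d from x with the current weights,
    then nudge each weight along the instantaneous error gradient."""
    y = sum(w * xi for w, xi in zip(weights, x))
    e = d - y  # prediction error on this sample
    return [w + mu * e * xi for w, xi in zip(weights, x)], e

# track a fixed system d = 0.5*x0 - 0.2*x1 starting from a zero guess
w = [0.0, 0.0]
inputs = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
for t in range(200):
    x = inputs[t % 3]
    d = 0.5 * x[0] - 0.2 * x[1]
    w, _ = lms_step(w, x, d)
print(w)  # converges toward [0.5, -0.2]
```

    Unlike a static regression fitted once, the weights keep adapting, so the filter can follow slow physiological drifts in the [Hb] signals.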

  8. Integrated genomic and immunophenotypic classification of pancreatic cancer reveals three distinct subtypes with prognostic/predictive significance.

    Science.gov (United States)

    Wartenberg, Martin; Cibin, Silvia; Zlobec, Inti; Vassella, Erik; Eppenberger-Castori, Serenella M M; Terracciano, Luigi; Eichmann, Micha; Worni, Mathias; Gloor, Beat; Perren, Aurel; Karamitopoulou, Eva

    2018-04-16

    Current clinical classification of pancreatic ductal adenocarcinoma (PDAC) is unable to predict prognosis or response to chemo- or immunotherapy and does not take into account the host reaction to PDAC cells. Our aim is to classify PDAC according to host- and tumor-related factors into clinically/biologically relevant subtypes by integrating molecular and microenvironmental findings. A well-characterized PDAC cohort (n=110) underwent next-generation sequencing with a hotspot cancer panel, while next-generation tissue microarrays were immunostained for CD3, CD4, CD8, CD20, PD-L1, p63, hyaluronan-mediated motility receptor (RHAMM) and DNA mismatch-repair proteins. Previous data on FOXP3 were integrated. Immune-cell counts and protein expression were correlated with tumor-derived driver mutations, clinicopathologic features (TNM 8th edition, 2017), survival and epithelial-mesenchymal transition (EMT)-like tumor budding. Results: Three PDAC subtypes were identified: the "immune-escape" (54%), poor in T and B cells and enriched in FOXP3+ Tregs, with high-grade budding, frequent CDKN2A, SMAD4 and PIK3CA mutations and poor outcome; the "immune-rich" (35%), rich in T and B cells and poorer in FOXP3+ Tregs, with infrequent budding, lower CDKN2A and PIK3CA mutation rates and better outcome, including a subpopulation with tertiary lymphoid tissue (TLT), mutations in DNA damage response genes (STK11, ATM) and the best outcome; and the "immune-exhausted" (11%), with an immunogenic microenvironment and two subpopulations: one with PD-L1 expression and a high PIK3CA mutation rate, and a microsatellite-unstable subpopulation with a high prevalence of JAK3 mutations. The combination of low budding, low stromal FOXP3 counts, presence of TLTs and absence of CDKN2A mutations confers a significant survival advantage in PDAC patients. Immune host responses correlate with tumor characteristics, leading to morphologically recognizable PDAC subtypes with prognostic/predictive significance. Copyright © 2018

  9. Improvement of bottom-quark associated Higgs-boson production predictions for LHC using HERA data

    Energy Technology Data Exchange (ETDEWEB)

    Gizhko, Andrii; Geiser, Achim [Deutsches Elektronen-Synchrotron, Hamburg (Germany)

    2016-07-01

    The dependence of predictions of the inclusive total cross section for bottom-quark associated Higgs-boson production at the LHC, pp → (b anti b)H+X, on the treatment of the beauty-quark mass is studied in the context of CMS measurements. For two different schemes, the four-flavour scheme (4FS) and the five-flavour scheme (5FS), the theoretical uncertainty due to the beauty-quark mass is estimated, and the potential improvement arising from a QCD analysis of HERA beauty data is demonstrated.

  10. Improvement of prediction ability for genomic selection of dairy cattle by including dominance effects.

    Directory of Open Access Journals (Sweden)

    Chuanyu Sun

    Full Text Available Dominance may be an important source of non-additive genetic variance for many traits of dairy cattle. However, nearly all prediction models for dairy cattle have included only additive effects because of the limited number of cows with both genotypes and phenotypes. The role of dominance in the Holstein and Jersey breeds was investigated for eight traits: milk, fat, and protein yields; productive life; daughter pregnancy rate; somatic cell score; fat percent; and protein percent. Additive and dominance variance components were estimated and then used to estimate additive and dominance effects of single nucleotide polymorphisms (SNPs). The predictive abilities of three models with both additive and dominance effects and a model with additive effects only were assessed using ten-fold cross-validation. One procedure estimated dominance values, and another estimated dominance deviations; calculation of the dominance relationship matrix differed between the two methods. The third approach enlarged the dataset by including cows with genotype probabilities derived using genotyped ancestors. For yield traits, dominance variance accounted for 5 and 7% of total variance for Holsteins and Jerseys, respectively; using dominance deviations resulted in smaller dominance and larger additive variance estimates. For non-yield traits, dominance variances were very small for both breeds. For yield traits, including additive and dominance effects fit the data better than including only additive effects; average correlations between estimated genetic effects and phenotypes showed that prediction accuracy increased when both effects rather than just additive effects were included. No corresponding gains in prediction ability were found for non-yield traits. Including cows with derived genotype probabilities from genotyped ancestors did not improve prediction accuracy. The largest additive effects were located on chromosome 14 near DGAT1 for yield traits for both breeds.

  11. Prediction of Negative Conversion Days of Childhood Nephrotic Syndrome Based on the Improved Backpropagation Neural Network with Momentum

    Directory of Open Access Journals (Sweden)

    Yi-jun Liu

    2015-12-01

    Full Text Available Childhood nephrotic syndrome is a chronic disease that harms the growth of children. Scientific and accurate prediction of negative conversion days for children with nephrotic syndrome offers potential benefits for treatment and helps achieve a better cure effect. In this study, an improved backpropagation neural network with momentum is used for prediction. Momentum speeds up convergence and maintains the generalization performance of the neural network, thereby overcoming weaknesses of the standard backpropagation algorithm. A three-tier network structure is constructed, and eight indicators, including age, IgG, IgA, and IgM, are selected as network inputs. The scientific computing software MATLAB and its neural network toolbox are used to build the model and make predictions. A training sample of twenty-eight cases is used to train the neural network, and a test sample of six typical cases, each belonging to a different age group, is used to test the predictive model. A low mean absolute error of 0.83 is achieved. The experimental results on this small sample suggest that the proposed approach is applicable to predicting negative conversion days of childhood nephrotic syndrome.
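
    The momentum modification the abstract relies on can be written generically: each weight update adds a fraction of the previous update, which damps oscillation and speeds convergence. A minimal sketch on a one-dimensional squared-error surface follows; the constants are illustrative and this is not the paper's network.

```python
def momentum_update(w, grad, velocity, lr=0.1, beta=0.9):
    """Gradient step with momentum: the new velocity blends the previous
    velocity (scaled by beta) with the current downhill step."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# minimize E(w) = (w - 3)^2, whose gradient is 2*(w - 3)
w, v = 0.0, 0.0
for _ in range(300):
    w, v = momentum_update(w, 2.0 * (w - 3.0), v)
print(round(w, 4))  # → 3.0
```

    In a full backpropagation network the same rule is applied to every weight, with `grad` being that weight's error derivative.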

  12. Urban Ecological Security Simulation and Prediction Using an Improved Cellular Automata (CA) Approach-A Case Study for the City of Wuhan in China.

    Science.gov (United States)

    Gao, Yuan; Zhang, Chuanrong; He, Qingsong; Liu, Yaolin

    2017-06-15

    Ecological security is an important research topic, especially urban ecological security. As highly populated ecosystems, cities tend to have fragile ecological environments. However, most research on urban ecological security has focused on evaluating the current or past status of the ecological environment; very little literature has carried out simulation or prediction of future ecological security, and even less has explored the urban ecological environment at a fine scale. To fill this gap, in this study we simulated and predicted urban ecological security at a fine (district-level) scale using an improved Cellular Automata (CA) approach. First, we used the pressure-state-response (PSR) method on grid-scale data to evaluate urban ecological security. Then, based on the evaluation results, we incorporated the geographically weighted regression (GWR) concept into the CA model to simulate and predict urban ecological security. We applied the improved CA approach in a case study, simulating and predicting urban ecological security for the city of Wuhan in Central China. Comparing the ecological security values simulated by the improved CA model for 2010 with the actual 2010 values yielded a relatively high kappa coefficient, which indicates that this CA model can simulate and predict the future development of ecological security in Wuhan well. Based on the prediction results for 2020, we made policy recommendations for each district in Wuhan.

  14. Bankruptcy prediction for credit risk using neural networks: a survey and new results.

    Science.gov (United States)

    Atiya, A F

    2001-01-01

    The prediction of corporate bankruptcies is an important and widely studied topic since it can have significant impact on bank lending decisions and profitability. This work presents two contributions. First we review the topic of bankruptcy prediction, with emphasis on neural-network (NN) models. Second, we develop an NN bankruptcy prediction model. Inspired by one of the traditional credit risk models developed by Merton (1974), we propose novel indicators for the NN system. We show that the use of these indicators in addition to traditional financial ratio indicators provides a significant improvement in the (out-of-sample) prediction accuracy (from 81.46% to 85.5% for a three-year-ahead forecast).

  15. Prediction of wall motion improvement after coronary revascularization in patients with postmyocardial infarction. Diagnostic value of dobutamine stress echocardiography and myocardial contrast echocardiography

    International Nuclear Information System (INIS)

    Waku, Sachiko; Ohkubo, Tomoyuki; Takada, Kiyoshi; Ishihara, Tadashi; Ohsawa, Nakaaki; Adachi, Itaru; Narabayashi, Isamu

    1997-01-01

    The diagnostic value of dobutamine stress echocardiography, myocardial contrast echocardiography and dipyridamole stress thallium-201 single photon emission computed tomography (SPECT) for predicting recovery of wall motion abnormality after revascularization was evaluated in 13 patients with postmyocardial infarction. Seventeen segments showed severe wall motion abnormalities before revascularization. Nine segments which had relatively good Tl uptake on delayed SPECT images despite severely abnormal wall motion were opacified during myocardial contrast echocardiography and showed improved wall motion after revascularization. In contrast, three segments which had poor Tl uptake and severely abnormal wall motion were not opacified during myocardial contrast echocardiography and showed no improvement in wall motion during dobutamine stress echocardiography or after revascularization. The following three findings were assumed to be signs of myocardial viability: good Tl uptake on delayed SPECT images, improved wall motion during dobutamine stress echocardiography, and positive opacification of the myocardium during myocardial contrast echocardiography. Myocardial contrast echocardiography had the highest sensitivity (100%) and negative predictive value (100%). Delayed SPECT images had the highest specificity (100%) and positive predictive value (100%). Dobutamine stress echocardiography had a sensitivity of 83.0%, specificity of 80.0%, positive predictive value of 90.9%, and negative predictive value of 66.7%. Myocardial contrast echocardiography showed the lowest specificity (60.0%). The techniques of dobutamine stress echocardiography and SPECT, though noninvasive, may underestimate wall motion improvement after revascularization. Further examination by myocardial contrast echocardiography, despite its invasiveness, is recommended to assess myocardial viability when determining the indications for coronary revascularization. (author)
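
    The four diagnostic measures quoted above follow directly from a 2×2 table of test results against the revascularization outcome. A small sketch follows, with counts reconstructed to be consistent with the reported dobutamine percentages (10 true positives, 1 false positive, 2 false negatives, 4 true negatives out of 17 segments); these counts are an illustrative reconstruction, not taken from the paper.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test measures from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # recovered segments correctly detected
        "specificity": tn / (tn + fp),  # non-recovered correctly excluded
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

m = diagnostic_metrics(tp=10, fp=1, fn=2, tn=4)
print({k: round(v, 3) for k, v in m.items()})
# → {'sensitivity': 0.833, 'specificity': 0.8, 'ppv': 0.909, 'npv': 0.667}
```

    The same function applied to the contrast-echocardiography counts would reproduce its reported 100% sensitivity and NPV.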

  16. Does early improvement in depressive symptoms predict subsequent remission in patients with depression who are treated with duloxetine?

    Directory of Open Access Journals (Sweden)

    Sueki A

    2016-05-01

    Full Text Available Akitsugu Sueki, Eriko Suzuki, Hitoshi Takahashi, Jun Ishigooka Department of Neuropsychiatry, Tokyo Women’s Medical University, Tokyo, Japan Purpose: In this prospective study, we examined whether early reduction in depressive symptoms predicts later remission to duloxetine in the treatment of depression, as monitored using the Montgomery–Asberg Depression Rating Scale (MADRS). Patients and methods: Among the 106 patients who were enrolled in this study, 67 were included in the statistical analysis. A clinical evaluation using the MADRS was performed at weeks 0, 4, 8, 12, and 16 after commencing treatment. For each time point, the MADRS total score was separated into three components: dysphoria, retardation, and vegetative scores. Results: Remission was defined as an MADRS total score of ≤10 at end point. From our univariate logistic regression analysis, we found that improvements in both the MADRS total score and the dysphoria score at week 4 had a significant interaction with subsequent remission. Furthermore, age and sex were significant predictors of remission. There was an increase of approximately 4% in the odds of remission for each unit increase in age, and female sex had an odds of remission of 0.318 times that of male sex (remission rate for men was 73.1% [19/26] and for women 46.3% [19/41]). However, in the multivariate model using the change from baseline in the total MADRS, dysphoria, retardation, and vegetative scores at week 4, in which age and sex were included as covariates, only sex retained significance, except for an improvement in the dysphoria score. Conclusion: No significant interaction was found between early response to duloxetine and eventual remission in this study. Sex difference was found to be a predictor of subsequent remission in patients with depression who were treated with duloxetine, with the male sex having greater odds of remission. Keywords: antidepressant, early response, sex difference, serotonin

  17. Improvement of cardiovascular risk prediction: time to review current knowledge, debates, and fundamentals on how to assess test characteristics.

    Science.gov (United States)

    Romanens, Michel; Ackermann, Franz; Spence, John David; Darioli, Roger; Rodondi, Nicolas; Corti, Roberto; Noll, Georg; Schwenkglenks, Matthias; Pencina, Michael

    2010-02-01

    Cardiovascular risk assessment might be improved with the addition of emerging new tests derived from atherosclerosis imaging, laboratory tests or functional tests. This article reviews relative risk, odds ratios, receiver operating characteristic curves, posttest risk calculations based on likelihood ratios, the net reclassification improvement and the integrated discrimination improvement. These serve to determine whether a new test has added clinical value on top of conventional risk testing and how this can be verified statistically. Two clinically meaningful examples illustrate the novel approaches. This work serves as a review and as groundwork for the development of new guidelines on cardiovascular risk prediction, taking emerging tests into account, to be proposed by members of the 'Taskforce on Vascular Risk Prediction' under the auspices of the Working Group 'Swiss Atherosclerosis' of the Swiss Society of Cardiology in the future.
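
    Of the measures listed, the net reclassification improvement has the most mechanical definition: among patients who had events, upward reclassifications by the new test count positively; among non-events, downward ones do. A minimal sketch of the standard categorical NRI follows; the worked clinical examples from the article are not reproduced, and the input data below are invented.

```python
def net_reclassification_improvement(event_moves, nonevent_moves):
    """Categorical NRI. Each entry is +1 (the new test moved the subject to
    a higher risk category), -1 (lower), or 0 (unchanged)."""
    up_e = sum(1 for m in event_moves if m > 0)
    down_e = sum(1 for m in event_moves if m < 0)
    up_n = sum(1 for m in nonevent_moves if m > 0)
    down_n = sum(1 for m in nonevent_moves if m < 0)
    return (up_e - down_e) / len(event_moves) + (down_n - up_n) / len(nonevent_moves)

# events should move up and non-events down for the new test to add value
print(net_reclassification_improvement([1, 1, 0, -1], [-1, 0, 0, 1]))  # → 0.25
```

    A positive NRI indicates that, on balance, the new test shifts risk categories in the clinically correct direction.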

  18. Base Oils Biodegradability Prediction with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Malika Trabelsi

    2010-02-01

    Full Text Available In this paper, we apply various data mining techniques, including continuous numeric and discrete classification prediction models of base oil biodegradability, with emphasis on improving prediction accuracy. The results show that highly biodegradable oils can be better predicted through numeric models. In contrast, classification models did not uncover a similar dichotomy. With the exception of Memory Based Reasoning and Decision Trees, the tested classification techniques achieved high classification accuracy. However, the Decision Trees technique helped uncover the most significant predictor. A simple classification rule derived from this predictor resulted in good classification accuracy. Applying this rule enables efficient classification of base oils into either low or high biodegradability classes with high accuracy. For the latter, a higher-precision biodegradability prediction can be obtained using continuous modeling techniques.

  19. Advances in Rosetta structure prediction for difficult molecular-replacement problems

    International Nuclear Information System (INIS)

    DiMaio, Frank

    2013-01-01

    Modeling advances using Rosetta structure prediction to aid in solving difficult molecular-replacement problems are discussed. Recent work has shown the effectiveness of structure-prediction methods in solving difficult molecular-replacement problems. The Rosetta protein structure modeling suite can aid in the solution of difficult molecular-replacement problems using templates with 15-25% sequence identity; Rosetta refinement guided by noisy density has consistently led to solved structures where other methods fail. In this paper, an overview of the use of Rosetta for these difficult molecular-replacement problems is provided and new modeling developments that further improve model quality are described. Several variations to the method are introduced that significantly reduce the time needed to generate a model and the sampling required to improve the starting template. The improvements are benchmarked on a set of nine difficult cases, and it is shown that this improved method obtains consistently better models in less running time. Finally, strategies for best using Rosetta to solve difficult molecular-replacement problems are presented and future directions for the role of structure-prediction methods in crystallography are discussed.

  20. Strategies to predict and improve eating quality of cooked beef using carcass and meat composition traits in Angus cattle.

    Science.gov (United States)

    Mateescu, R G; Oltenacu, P A; Garmyn, A J; Mafi, G G; VanOverbeke, D L

    2016-05-01

    Product quality is a high priority for the beef industry because of its importance as a major driver of consumer demand for beef and because of the industry's ability to improve it. A two-pronged approach is outlined, based on implementation of a genetic program to improve eating quality and a system to communicate eating quality and increase the probability that consumers' eating-quality expectations are met. The objectives of this study were 1) to identify the best carcass and meat composition traits to be used in a selection program to improve eating quality and 2) to develop a relatively small number of classes that reflect real and perceptible differences in eating quality that can be communicated to consumers, and to identify a subset of carcass and meat composition traits with the highest predictive accuracy across all eating-quality classes. Carcass traits and meat composition, including Warner-Bratzler shear force (WBSF), intramuscular fat content (IMFC), trained sensory panel scores, and mineral composition traits of 1,666 Angus cattle were used in this study. Three eating-quality indexes, EATQ1, EATQ2, and EATQ3, were generated by using different weights for the sensory traits (emphasizing tenderness, flavor, and juiciness, respectively). The best model for predicting eating quality explained 37%, 9%, and 19% of the variability of EATQ1, EATQ2, and EATQ3, and two traits, WBSF and IMFC, accounted for most of the variability explained by the best models. EATQ1, which combines tenderness, juiciness, and flavor assessed by trained panels with weights of 0.60, 0.15, and 0.25, best describes North American consumers and has a moderate heritability (0.18 ± 0.06). A selection index (I = -0.5[WBSF] + 0.3[IMFC]) based on phenotypic and genetic variances and covariances can be used to improve eating quality as a correlated trait. The three indexes (EATQ1, EATQ2, and EATQ3) were used to generate three equal (33.3%) low, medium, and high eating-quality classes, and linear combinations of traits that
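
    The proposed index is a plain linear combination of the two best predictors, so ranking candidates is direct. The animal records below are hypothetical, used only to show how the index orders selection candidates.

```python
def selection_index(wbsf, imfc):
    """Eating-quality index I = -0.5*WBSF + 0.3*IMFC: tender (low shear
    force) and well-marbled (high intramuscular fat) animals score higher."""
    return -0.5 * wbsf + 0.3 * imfc

# hypothetical candidates: (id, WBSF in kg, IMFC in %)
animals = [("A", 3.2, 5.0), ("B", 4.5, 7.0), ("C", 2.8, 3.5)]
ranked = sorted(animals, key=lambda a: selection_index(a[1], a[2]), reverse=True)
print([a[0] for a in ranked])  # → ['A', 'B', 'C']
```

    Note that a high IMFC can partially offset a tough (high-WBSF) carcass, which is exactly the trade-off the index weights encode.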

  1. Sweat loss prediction using a multi-model approach.

    Science.gov (United States)

    Xu, Xiaojiang; Santee, William R

    2011-07-01

    A new multi-model approach (MMA) for sweat loss prediction is proposed to improve prediction accuracy. MMA was computed as the average of the sweat losses predicted by two existing thermoregulation models: the rational model SCENARIO and the empirical model Heat Strain Decision Aid (HSDA). Three independent physiological datasets, totaling 44 trials, were used to compare predictions by MMA, SCENARIO, and HSDA. The observed sweat losses were collected under different combinations of uniform ensembles, environmental conditions (15-40°C, RH 25-75%), and exercise intensities (250-600 W). Root mean square deviation (RMSD), residual plots, and paired t tests were used to compare predictions with observations. Overall, MMA reduced RMSD by 30-39% in comparison with either SCENARIO or HSDA, and increased the prediction accuracy to 66% from 34% or 55%. Of the MMA predictions, 70% fell within the range of the mean observed value ± SD, while only 43% of SCENARIO and 50% of HSDA predictions fell within the same range. Paired t tests showed that differences between observations and MMA predictions were not significant, whereas differences between observations and SCENARIO or HSDA predictions were significant for two datasets. Thus, MMA predicted sweat loss more accurately than either single model for the three datasets used. Future work will evaluate MMA using additional physiological data to expand the scope of populations and conditions.
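
    The multi-model combination itself is just a pointwise average of the two models' outputs, compared by RMSD. A minimal sketch with toy numbers (not the study's data) shows why averaging helps when the two models err in opposite directions:

```python
import math

def rmsd(predicted, observed):
    """Root mean square deviation between predictions and observations."""
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

def multi_model(pred_a, pred_b):
    """MMA prediction: the pointwise mean of two models' estimates."""
    return [(a + b) / 2 for a, b in zip(pred_a, pred_b)]

# toy case where the two models err in opposite directions
observed = [300.0, 450.0]   # e.g. sweat loss in g/h (invented values)
scenario = [280.0, 470.0]   # one model under- then over-predicts
hsda = [320.0, 430.0]       # the other does the reverse
mma = multi_model(scenario, hsda)
print(rmsd(scenario, observed), rmsd(hsda, observed), rmsd(mma, observed))
# → 20.0 20.0 0.0
```

    In practice the models' errors are only partially anticorrelated, which is why the study observes a 30-39% RMSD reduction rather than full cancellation.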

  2. Carfilzomib significantly improves the progression-free survival of high-risk patients in multiple myeloma.

    Science.gov (United States)

    Avet-Loiseau, Hervé; Fonseca, Rafael; Siegel, David; Dimopoulos, Meletios A; Špička, Ivan; Masszi, Tamás; Hájek, Roman; Rosiñol, Laura; Goranova-Marinova, Vesselina; Mihaylov, Georgi; Maisnar, Vladimír; Mateos, Maria-Victoria; Wang, Michael; Niesvizky, Ruben; Oriol, Albert; Jakubowiak, Andrzej; Minarik, Jiri; Palumbo, Antonio; Bensinger, William; Kukreti, Vishal; Ben-Yehuda, Dina; Stewart, A Keith; Obreja, Mihaela; Moreau, Philippe

    2016-09-01

    The presence of certain high-risk cytogenetic abnormalities, such as translocations (4;14) and (14;16) and deletion (17p), are known to have a negative impact on survival in multiple myeloma (MM). The phase 3 study ASPIRE (N = 792) demonstrated that progression-free survival (PFS) was significantly improved with carfilzomib, lenalidomide, and dexamethasone (KRd), compared with lenalidomide and dexamethasone (Rd) in relapsed MM. This preplanned subgroup analysis of ASPIRE was conducted to evaluate KRd vs Rd by baseline cytogenetics according to fluorescence in situ hybridization. Of 417 patients with known cytogenetic risk status, 100 patients (24%) were categorized with high-risk cytogenetics (KRd, n = 48; Rd, n = 52) and 317 (76%) were categorized with standard-risk cytogenetics (KRd, n = 147; Rd, n = 170). For patients with high-risk cytogenetics, treatment with KRd resulted in a median PFS of 23.1 months, a 9-month improvement relative to treatment with Rd. For patients with standard-risk cytogenetics, treatment with KRd led to a 10-month improvement in median PFS vs Rd. The overall response rates for KRd vs Rd were 79.2% vs 59.6% (high-risk cytogenetics) and 91.2% vs 73.5% (standard-risk cytogenetics); approximately fivefold as many patients with high- or standard-risk cytogenetics achieved a complete response or better with KRd vs Rd (29.2% vs 5.8% and 38.1% vs 6.5%, respectively). KRd improved but did not abrogate the poor prognosis associated with high-risk cytogenetics. This regimen had a favorable benefit-risk profile in patients with relapsed MM, irrespective of cytogenetic risk status, and should be considered a standard of care in these patients. This trial was registered at www.clinicaltrials.gov as #NCT01080391. © 2016 by The American Society of Hematology.

  3. An Improved Metabolism Grey Model for Predicting Small Samples with a Singular Datum and Its Application to Sulfur Dioxide Emissions in China

    Directory of Open Access Journals (Sweden)

    Wei Zhou

    2016-01-01

    Full Text Available This study proposes an improved metabolism grey model [IMGM(1,1)] to predict small samples with a singular datum, a common phenomenon in daily economic data. The new model combines the fitting advantage of the conventional GM(1,1) on small samples with the additional advantage of the MGM(1,1) on new real-time data, while overcoming the limitation that the predictions of both the conventional GM(1,1) and the MGM(1,1) are vulnerable to any singular datum. Thus, this model can be classified as an improved grey prediction model. Its improvements are illustrated through a case study of sulfur dioxide emissions in China from 2007 to 2013 with a singular datum in 2011. Some features of this model are presented based on the error analysis in the case study. Results suggest that if action is not taken immediately, sulfur dioxide emissions in 2016 will surpass the standard level required by the Twelfth Five-Year Plan proposed by the China State Council.
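
    The conventional GM(1,1) core that both variants share can be sketched as follows. This is the minimal textbook version; the metabolism step of the MGM/IMGM (dropping the oldest observation as each new one arrives) and the paper's singular-datum handling are omitted, and the input series below is illustrative.

```python
import math

def gm11_forecast(xs, steps=1):
    """Fit a GM(1,1) grey model to a short positive series and forecast
    ahead. Assumes a nonzero development coefficient."""
    n = len(xs)
    # 1-AGO: first-order accumulated generating operation
    ago = []
    total = 0.0
    for x in xs:
        total += x
        ago.append(total)
    # background values: trapezoidal means of consecutive AGO terms
    z = [0.5 * (ago[k] + ago[k - 1]) for k in range(1, n)]
    y = xs[1:]
    m = len(z)
    # closed-form least squares for y = -a*z + b (slope p = -a)
    s_z, s_y = sum(z), sum(y)
    s_zz = sum(zi * zi for zi in z)
    s_zy = sum(zi * yi for zi, yi in zip(z, y))
    p = (m * s_zy - s_z * s_y) / (m * s_zz - s_z ** 2)
    b = (s_y - p * s_z) / m
    a = -p  # development coefficient
    # time-response function of the accumulated series, then inverse AGO
    def x1_hat(k):
        return (xs[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(steps)]

# a 10%-growth series; the true next value would be 146.41
print(round(gm11_forecast([100.0, 110.0, 121.0, 133.1])[0], 1))  # ≈ 146.3
```

    The model's sensitivity to a single outlying point in `xs` is precisely the weakness the proposed IMGM(1,1) is designed to mitigate.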

  4. Evoked Emotions Predict Food Choice

    OpenAIRE

    Dalenberg, Jelle R.; Gutjar, Swetlana; ter Horst, Gert J.; de Graaf, Kees; Renken, Remco J.; Jager, Gerry

    2014-01-01

    In the current study we show that non-verbal food-evoked emotion scores significantly improve food choice prediction over merely liking scores. Previous research has shown that liking measures correlate with choice. However, liking is no strong predictor for food choice in real life environments. Therefore, the focus within recent studies shifted towards using emotion-profiling methods that successfully can discriminate between products that are equally liked. However, it is unclear how well ...

  5. The Significance of the PD-L1 Expression in Non-Small-Cell Lung Cancer: Trenchant Double Swords as Predictive and Prognostic Markers.

    Science.gov (United States)

    Takada, Kazuki; Toyokawa, Gouji; Shoji, Fumihiro; Okamoto, Tatsuro; Maehara, Yoshihiko

    2018-03-01

    Lung cancer is the leading cause of death due to cancer worldwide. Surgery, chemotherapy, and radiotherapy have been the standard treatment for lung cancer, and targeted molecular therapy has greatly improved the clinical course of patients with non-small-cell lung cancer (NSCLC) harboring driver mutations, such as in epidermal growth factor receptor and anaplastic lymphoma kinase genes. Despite advances in such therapies, the prognosis of patients with NSCLC without driver oncogene mutations remains poor. Immunotherapy targeting programmed cell death-1 (PD-1) and programmed cell death-ligand 1 (PD-L1) has recently been shown to improve the survival in advanced NSCLC. The PD-L1 expression on the surface of tumor cells has emerged as a potential biomarker for predicting responses to immunotherapy and prognosis after surgery in NSCLC. However, the utility of PD-L1 expression as a predictive and prognostic biomarker remains controversial because of the existence of various PD-L1 antibodies, scoring systems, and positivity cutoffs. In this review, we summarize the data from representative clinical trials of PD-1/PD-L1 immune checkpoint inhibitors in NSCLC and previous reports on the association between PD-L1 expression and clinical outcomes in patients with NSCLC. Furthermore, we discuss the future perspectives of immunotherapy and immune checkpoint factors. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Individualized performance prediction during total sleep deprivation: accounting for trait vulnerability to sleep loss.

    Science.gov (United States)

    Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Thorsley, David; Wesensten, Nancy J; Balkin, Thomas J; Reifman, Jaques

    2012-01-01

    Individual differences in vulnerability to sleep loss can be considerable, and thus, recent efforts have focused on developing individualized models for predicting the effects of sleep loss on performance. Individualized models constructed using a Bayesian formulation, which combines an individual's available performance data with a priori performance predictions from a group-average model, typically need at least 40 h of individual data before showing significant improvement over the group-average model predictions. Here, we improve upon the basic Bayesian formulation for developing individualized models by observing that individuals may be classified into three sleep-loss phenotypes: resilient, average, and vulnerable. For each phenotype, we developed a phenotype-specific group-average model and used these models to identify each individual's phenotype. We then used the phenotype-specific models within the Bayesian formulation to make individualized predictions. Results on psychomotor vigilance test data from 48 individuals indicated that, on average, ∼85% of individual phenotypes were accurately identified within 30 h of wakefulness. The percentage improvement of the proposed approach in 10-h-ahead predictions was 16% for resilient subjects and 6% for vulnerable subjects. The trade-off for these improvements was a slight decrease in prediction accuracy for average subjects.
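
    The phenotype-informed Bayesian idea above, stripped to its core, is a precision-weighted blend of a group-average (prior) prediction with an individual's own data. A minimal stdlib sketch under Gaussian assumptions (function name, variances, and numbers are illustrative, not the authors' exact model):

```python
def bayes_combine(prior_mean, prior_var, obs, obs_var):
    """Posterior mean of an individual's performance: precision-weighted
    blend of the group-average (prior) prediction and the individual's
    observed data. Gaussian assumptions; not the authors' exact model."""
    if not obs:
        return prior_mean
    prior_prec = 1.0 / prior_var          # precision of the group prior
    like_prec = len(obs) / obs_var        # more individual data -> more weight
    obs_mean = sum(obs) / len(obs)
    return (prior_prec * prior_mean + like_prec * obs_mean) / (prior_prec + like_prec)

# With no individual data the prediction is the group average (300 ms);
# eight observations at 350 ms pull it most of the way toward the individual.
print(bayes_combine(300.0, 100.0, [], 400.0))                     # 300.0
print(round(bayes_combine(300.0, 100.0, [350.0] * 8, 400.0), 1))  # 333.3
```

    Using phenotype-specific priors (resilient, average, vulnerable) instead of one group prior simply means picking `prior_mean`/`prior_var` per identified phenotype.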

  7. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Shiraishi, Satomi; Moore, Kevin L., E-mail: kevinmoore@ucsd.edu [Department of Radiation Medicine and Applied Sciences, University of California, San Diego, La Jolla, California 92093 (United States); Tan, Jun [Department of Radiation Oncology, UT Southwestern Medical Center, Dallas, Texas 75490 (United States); Olsen, Lindsey A. [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri 63110 (United States)

    2015-02-15

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose–volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V10Gy (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution’s VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category with a stratification scheme based on target and OAR characteristics determined emergently through the initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM_clin − QM_pred, and a coefficient of determination, R². For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are

  8. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    International Nuclear Information System (INIS)

    Shiraishi, Satomi; Moore, Kevin L.; Tan, Jun; Olsen, Lindsey A.

    2015-01-01

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose–volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V10Gy (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution’s VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category with a stratification scheme based on target and OAR characteristics determined emergently through the initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM_clin − QM_pred, and a coefficient of determination, R². For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are stratified based on
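
    The DVH-based quality metrics named in the two records above can be computed directly from per-voxel doses. A minimal sketch (hypothetical helper; an RTOG-style conformity index is assumed, which may not be the exact definition used in the study):

```python
def dvh_metrics(brain_doses, target_doses, voxel_cc, rx_dose):
    """Two of the SRS quality metrics named above, from per-voxel doses (Gy):
    V10Gy - brain volume (cc) receiving 10 Gy or more
    CI    - RTOG-style conformity index: prescription isodose volume
            divided by target volume (an assumed definition)."""
    v10gy = sum(1 for d in brain_doses if d >= 10.0) * voxel_cc
    presc_vol = sum(1 for d in brain_doses if d >= rx_dose) * voxel_cc
    target_vol = len(target_doses) * voxel_cc  # target voxel count * voxel size
    return v10gy, presc_vol / target_vol

# Toy plan: six brain voxels of 0.5 cc, a two-voxel target, 20 Gy prescription
v10, ci = dvh_metrics([5, 12, 15, 22, 25, 8], [21, 24], 0.5, 20.0)
```

    Knowledge-based planning then regresses such QMs against OAR-to-PTV geometry to predict what value an optimal plan should achieve.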

  9. TCF7L2 variant genotypes and type 2 diabetes risk in Brazil: significant association, but not a significant tool for risk stratification in the general population

    Directory of Open Access Journals (Sweden)

    Mill JG

    2008-12-01

    Full Text Available Abstract Background Genetic polymorphisms of the TCF7L2 gene are strongly associated with large increments in type 2 diabetes risk in different populations worldwide. In this study, we aimed to confirm the effect of the TCF7L2 polymorphism rs7903146 on diabetes risk in a Brazilian population and to assess the use of this genetic marker in improving diabetes risk prediction in the general population. Methods We genotyped the single nucleotide polymorphism (SNP) rs7903146 of the TCF7L2 gene in 560 patients with known coronary disease enrolled in the MASS II (Medicine, Angioplasty, or Surgery Study) Trial and in 1,449 residents of Vitoria, in Southeast Brazil. The associations of this gene variant with diabetes risk and metabolic characteristics in these two different populations were analyzed. To assess the potential benefit of using this marker for diabetes risk prediction in the general population, we analyzed the impact of this genetic variant on a validated diabetes risk prediction tool based on clinical characteristics developed for the Brazilian general population. Results SNP rs7903146 of the TCF7L2 gene was significantly associated with type 2 diabetes in the MASS-II population (OR = 1.57 per T allele, p = 0.0032), confirming, in the Brazilian population, previous reports in the literature. Addition of this polymorphism to an established clinical risk prediction score did not increase model accuracy (both areas under the ROC curve equal to 0.776). Conclusion The TCF7L2 rs7903146 T allele is associated with a 1.57-fold increased risk for type 2 diabetes in a Brazilian cohort of patients with known coronary heart disease. However, the inclusion of this polymorphism in a risk prediction tool developed for the general population resulted in no improvement in performance. This is the first study, to our knowledge, to confirm this recent association in a South American population, and it adds to the great consistency of this finding in studies around the world.
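
    Comparisons like the "area under ROC curve equal to 0.776" above reduce to the Mann-Whitney identity: AUC is the probability that a randomly chosen case outranks a randomly chosen control, so an added marker helps only if it reorders cases relative to controls. A minimal stdlib sketch (illustrative data, not the study's):

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney identity:
    AUC = P(case score > control score), ties counted as 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy risk scores: two cases (label 1) and two controls (label 0)
print(auc([1, 1, 0, 0], [0.9, 0.4, 0.5, 0.1]))  # 0.75
```

    If adding the SNP shifts every score by the same amount within each genotype group without changing the case/control ordering, the AUC, and hence the discrimination of the risk tool, stays the same.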

  10. Improving Clinical Prediction of Bipolar Spectrum Disorders in Youth

    Directory of Open Access Journals (Sweden)

    Thomas W. Frazier

    2014-03-01

    Full Text Available This report evaluates whether classification tree algorithms (CTA) may improve the identification of individuals at risk for bipolar spectrum disorders (BPSD). Analyses used the Longitudinal Assessment of Manic Symptoms (LAMS) cohort (629 youth; 148 with BPSD and 481 without BPSD). Parent ratings of mania symptoms, stressful life events, parenting stress, and parental history of mania were included as risk factors. Comparable overall accuracy was observed for CTA (75.4%) relative to logistic regression (77.6%). However, CTA showed increased sensitivity (0.28 vs. 0.18) at the expense of slightly decreased specificity and positive predictive power. The advantage of CTA algorithms for clinical decision making is demonstrated by the combinations of predictors most useful for altering the probability of BPSD. The 24% sample probability of BPSD was substantially decreased (to 2%) in youth with low screening and baseline parent ratings of mania, negative parental history of mania, and low levels of stressful life events. High screening plus high baseline parent-rated mania nearly doubled the BPSD probability (46%). Future work will benefit from examining additional, powerful predictors, such as alternative data sources (e.g., clinician ratings, neurocognitive test data); these may increase the clinical utility of CTA models further.
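
    The sensitivity/specificity trade-off reported above comes straight from a 2x2 confusion table. A minimal sketch of those definitions (hypothetical labels, not the LAMS data):

```python
def confusion_stats(y_true, y_pred):
    """Sensitivity, specificity, positive predictive power, and accuracy,
    the quantities compared between CTA and logistic regression above."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == 1 and p == 1 for t, p in pairs)  # true positives
    fn = sum(t == 1 and p == 0 for t, p in pairs)  # missed cases
    tn = sum(t == 0 and p == 0 for t, p in pairs)  # true negatives
    fp = sum(t == 0 and p == 1 for t, p in pairs)  # false alarms
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if tp + fp else float("nan"),
        "accuracy": (tp + tn) / len(pairs),
    }

# Hypothetical screening outcomes (1 = BPSD):
stats = confusion_stats([1, 1, 1, 0, 0, 0, 0, 0],
                        [1, 0, 0, 0, 0, 0, 0, 1])
```

    With a low base rate like the 24% above, a tree that trades a little specificity for higher sensitivity can catch more true cases while overall accuracy barely moves, which is exactly the pattern the report describes.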

  11. Prefrontal Cortex Structure Predicts Training-Induced Improvements in Multitasking Performance.

    Science.gov (United States)

    Verghese, Ashika; Garner, K G; Mattingley, Jason B; Dux, Paul E

    2016-03-02

    The ability to perform multiple, concurrent tasks efficiently is a much-desired cognitive skill, but one that remains elusive due to the brain's inherent information-processing limitations. Multitasking performance can, however, be greatly improved through cognitive training (Van Selst et al., 1999, Dux et al., 2009). Previous studies have examined how patterns of brain activity change following training (for review, see Kelly and Garavan, 2005). Here, in a large-scale human behavioral and imaging study of 100 healthy adults, we tested whether multitasking training benefits, assessed using a standard dual-task paradigm, are associated with variability in brain structure. We found that the volume of the rostral part of the left dorsolateral prefrontal cortex (DLPFC) predicted an individual's response to training. Critically, this association was observed exclusively in a task-specific training group, and not in an active-training control group. Our findings reveal a link between DLPFC structure and an individual's propensity to gain from training on a task that taps the limits of cognitive control. Cognitive "brain" training is a rapidly growing, multibillion-dollar industry (Hayden, 2012) that has been touted as the panacea for a variety of disorders that result in cognitive decline. A key process targeted by such training is "cognitive control." Here, we combined an established cognitive control measure, multitasking ability, with structural brain imaging in a sample of 100 participants. Our goal was to determine whether individual differences in brain structure predict the extent to which people derive measurable benefits from a cognitive training regime. Ours is the first study to identify a structural brain marker (the volume of left-hemisphere dorsolateral prefrontal cortex) associated with the magnitude of multitasking performance benefits induced by training at an individual level. Copyright © 2016 the authors 0270-6474/16/362638-08$15.00/0.

  12. Pathological fracture prediction in patients with metastatic lesions can be improved with quantitative computed tomography based computer models

    NARCIS (Netherlands)

    Tanck, Esther; van Aken, Jantien B.; van der Linden, Yvette M.; Schreuder, H.W. Bart; Binkowski, Marcin; Huizenga, Henk; Verdonschot, Nico

    2009-01-01

    Purpose: In clinical practice, there is an urgent need to improve the prediction of fracture risk for cancer patients with bone metastases. The methods that are currently used to estimate fracture risk are unsatisfactory, which affects the quality of life of patients with a limited life expectancy.

  13. Compression stockings significantly improve hemodynamic performance in post-thrombotic syndrome irrespective of class or length.

    Science.gov (United States)

    Lattimer, Christopher R; Azzam, Mustapha; Kalodiki, Evi; Makris, Gregory C; Geroulakos, George

    2013-07-01

    Graduated elastic compression (GEC) stockings have been demonstrated to reduce the morbidity associated with post-thrombotic syndrome. The ideal length or compression strength required to achieve this is speculative and related to physician preference and patient compliance. The aim of this study was to evaluate the hemodynamic performance of four different stockings and determine the patients' preference. Thirty-four consecutive patients (40 legs; 34 male) with post-thrombotic syndrome were tested with four different stockings (Mediven plus open toe, Bayreuth, Germany) of their size in random order: class I (18-21 mm Hg) and class II (23-32 mm Hg), below-knee (BK) and above-knee thigh-length (AK). The median age, Venous Clinical Severity Score, Venous Segmental Disease Score, and Villalta scale were 62 years (range, 31-81 years), 8 (range, 1-21), 5 (range, 2-10), and 10 (range, 2-22), respectively. The distribution of the CEAP clinical class (C0-6, Es, As,d,p, Pr,o) was C0 = 2, C2 = 1, C3 = 3, C4a = 12, C4b = 7, C5 = 12, C6 = 3. Obstruction and reflux were observed on duplex in 47.5% of legs, with deep venous reflux alone in 45%. Air plethysmography was used to measure the venous filling index (VFI), venous volume, and time to fill 90% of the venous volume. Direct pressure measurements were obtained while lying and standing using the PicoPress device (Microlab Elettronica, Nicolò, Italy). The pressure sensor was placed underneath the test stocking 5 cm above and 2 cm posterior to the medial malleolus. At the end of the study session, patients stated their preferred stocking based on comfort. The VFI, venous volume, and time to fill 90% of the venous volume improved significantly with all types of stocking versus no compression. In class I, the VFI (mL/s) improved from a median of 4.9 (range, 1.7-16.3) without compression to 3.7 (range, 0-14) BK (24.5%) and 3.6 (range, 0.6-14.5) AK (26.5%). With class II, the corresponding improvement was to 4.0 (range, 0.3-16.2) BK (18.8%) and 3.7 (range, 0.5-14.2) AK (24
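
    The percentage improvements quoted for class I above are plain relative reductions of the median VFI from the no-compression baseline; a quick arithmetic check (values taken from the abstract):

```python
def pct_improvement(baseline, value):
    """Relative reduction (%) of the venous filling index versus the
    no-compression baseline, rounded to one decimal as in the abstract."""
    return round(100.0 * (baseline - value) / baseline, 1)

# Class I, below-knee: median VFI 4.9 -> 3.7 mL/s
print(pct_improvement(4.9, 3.7))  # 24.5, matching the reported class I BK figure
# Class I, above-knee: 4.9 -> 3.6 mL/s
print(pct_improvement(4.9, 3.6))  # 26.5
```
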

  14. An evaluation of the potential of Sentinel 1 for improving flash flood predictions via soil moisture–data assimilation

    Directory of Open Access Journals (Sweden)

    L. Cenci

    2017-11-01

    Full Text Available The assimilation of satellite-derived soil moisture estimates (soil moisture–data assimilation, SM–DA) into hydrological models has the potential to reduce the uncertainty of streamflow simulations. The improved capacity to monitor the closeness to saturation of small catchments, such as those characterizing the Mediterranean region, can be exploited to enhance flash flood predictions. When compared to other microwave sensors that have been exploited for SM–DA in recent years (e.g. the Advanced SCATterometer – ASCAT), characterized by low spatial/high temporal resolution, the Sentinel 1 (S1) mission provides an excellent opportunity to monitor systematically soil moisture (SM) at high spatial resolution and moderate temporal resolution. The aim of this research was thus to evaluate the impact of S1-based SM–DA for enhancing flash flood predictions of a hydrological model (Continuum) that is currently exploited for civil protection applications in Italy. The analysis was carried out in a representative Mediterranean catchment prone to flash floods, located in north-western Italy, during the time period October 2014–February 2015. It provided some important findings: (i) revealing the potential provided by S1-based SM–DA for improving discharge predictions, especially for higher flows; (ii) suggesting a more appropriate pre-processing technique to be applied to S1 data before the assimilation; and (iii) highlighting that even though high spatial resolution does provide an important contribution in a SM–DA system, the temporal resolution has the most crucial role. S1-derived SM maps are still a relatively new product and, to our knowledge, this is the first work published in an international journal dealing with their assimilation within a hydrological model to improve continuous streamflow simulations and flash flood predictions. Even though the reported results were obtained by analysing a relatively short time period, and thus should be

  15. An evaluation of the potential of Sentinel 1 for improving flash flood predictions via soil moisture-data assimilation

    Science.gov (United States)

    Cenci, Luca; Pulvirenti, Luca; Boni, Giorgio; Chini, Marco; Matgen, Patrick; Gabellani, Simone; Squicciarino, Giuseppe; Pierdicca, Nazzareno

    2017-11-01

    The assimilation of satellite-derived soil moisture estimates (soil moisture-data assimilation, SM-DA) into hydrological models has the potential to reduce the uncertainty of streamflow simulations. The improved capacity to monitor the closeness to saturation of small catchments, such as those characterizing the Mediterranean region, can be exploited to enhance flash flood predictions. When compared to other microwave sensors that have been exploited for SM-DA in recent years (e.g. the Advanced SCATterometer - ASCAT), characterized by low spatial/high temporal resolution, the Sentinel 1 (S1) mission provides an excellent opportunity to monitor systematically soil moisture (SM) at high spatial resolution and moderate temporal resolution. The aim of this research was thus to evaluate the impact of S1-based SM-DA for enhancing flash flood predictions of a hydrological model (Continuum) that is currently exploited for civil protection applications in Italy. The analysis was carried out in a representative Mediterranean catchment prone to flash floods, located in north-western Italy, during the time period October 2014-February 2015. It provided some important findings: (i) revealing the potential provided by S1-based SM-DA for improving discharge predictions, especially for higher flows; (ii) suggesting a more appropriate pre-processing technique to be applied to S1 data before the assimilation; and (iii) highlighting that even though high spatial resolution does provide an important contribution in a SM-DA system, the temporal resolution has the most crucial role. S1-derived SM maps are still a relatively new product and, to our knowledge, this is the first work published in an international journal dealing with their assimilation within a hydrological model to improve continuous streamflow simulations and flash flood predictions. Even though the reported results were obtained by analysing a relatively short time period, and thus should be supported by further
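
    The simplest form of the SM-DA idea in the two Sentinel 1 records above is a nudging update that pulls the model's soil-moisture state toward the satellite estimate. A toy sketch (the fixed gain is an assumption; operational schemes weight the observation by its error, e.g. with an ensemble Kalman filter):

```python
def nudge(model_sm, obs_sm, gain=0.3):
    """Nudging-style assimilation: move the modelled soil moisture toward
    the satellite retrieval. gain = 0 ignores the observation; gain = 1
    replaces the model state with it."""
    return model_sm + gain * (obs_sm - model_sm)

# Model says 0.20 m3/m3, the S1 retrieval says 0.30; an equal-weight update:
updated = nudge(0.20, 0.30, gain=0.5)
```

    A wetter assimilated state raises the simulated catchment saturation, which is what improves the high-flow (flash flood) predictions discussed above.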

  16. Improving prediction of Alzheimer’s disease using patterns of cortical thinning and homogenizing images according to disease stage

    DEFF Research Database (Denmark)

    Eskildsen, Simon Fristed; Coupé, Pierrick; García-Lorenzo, Daniel

    Predicting Alzheimer’s disease (AD) in individuals with some symptoms of cognitive decline may have great influence on treatment choice and guide subject selection in trials on disease modifying drugs. Structural MRI has the potential of revealing early signs of neurodegeneration in the human brain ... and may thus aid in predicting and diagnosing AD. Surface-based cortical thickness measurements from T1-weighted MRI have demonstrated high sensitivity to cortical gray matter changes. In this study, we investigated the possibility of using patterns of cortical thickness measurements for predicting AD ... of conversion from MCI to AD can be improved by learning the atrophy patterns that are specific to the different stages of disease progression. This has the potential to guide the further development of imaging biomarkers in AD ...

  17. Risk terrain modeling predicts child maltreatment.

    Science.gov (United States)

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
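
    At its core, Risk Terrain Modeling as described above overlays weighted environmental risk-factor layers on a common grid and sums them per cell, then ranks cells by composite risk. A toy sketch (layers and weights are illustrative, not the Fort Worth model):

```python
def risk_terrain(layers, weights):
    """Composite risk score per grid cell: weighted sum of binary
    risk-factor layers (each layer lists one 0/1 value per cell)."""
    n_cells = len(layers[0])
    return [sum(w * layer[i] for layer, w in zip(layers, weights))
            for i in range(n_cells)]

# Two hypothetical layers over three cells (e.g. proximity to a risk
# facility, prior incident density), with analyst-assigned weights:
scores = risk_terrain([[1, 0, 1], [0, 0, 1]], [2.0, 1.0])
# Cells are then ranked by score to prioritize prevention resources
top_cell = max(range(len(scores)), key=scores.__getitem__)
```

    Because the layers come from routinely available environmental data, the scores can be recomputed as conditions change, which is point 5) above.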

  18. Millisecond photo-thermal process on significant improvement of supercapacitor’s performance

    International Nuclear Information System (INIS)

    Wang, Kui; Wang, Jixiao; Wu, Ying; Zhao, Song; Wang, Zhi; Wang, Shichang

    2016-01-01

    Graphical abstract: A highway for charge transfer is created by a millisecond photo-thermal process, which decreases contact resistance among nanomaterials and improves electrochemical performance. - Highlights: • Conductivity among nanomaterials is improved with a millisecond photo-thermal process. • The specific capacitance increases by about 25% with the photo-thermal process. • Cycle stability and rate capability improve by more than 10% with the photo-thermal process. • A new way is provided to create electron paths that improve electrochemical performance. - Abstract: Supercapacitors fabricated with nanomaterials usually have high specific capacitance and excellent performance. However, the small size of nanomaterials renders a considerable limitation of the contact area among nanomaterials, which is harmful to charge carrier transfer. This fact may hinder the development and application of nanomaterials in electrochemical storage systems. Here, a millisecond photo-thermal process was introduced to create a charge carrier transfer path, decreasing the contact resistance among nanomaterials and enhancing the electrochemical performance of supercapacitors. Polyaniline (PANI) nanowire, as a model nanomaterial, was used to modify electrodes under different photo-thermal process conditions. The modified electrodes were characterized by scanning electron microscopy (SEM), cyclic voltammetry (CV), and electrochemical impedance spectroscopy (EIS), and the results were analysed by equivalent circuit simulation. These results demonstrate that the photo-thermal process can alter the morphology of PANI nanowires, lower the charge transfer resistance, and thus improve the performance of electrodes. The specific capacitance increase of the modified electrodes is about 25%. The improvements in cycle stability and rate capability are above 10%. To the best of our knowledge, this is the first attempt to research the effect of the photo-thermal process on the conductivity

  19. Clinical significance and predictive factors of early massive recurrence after radiofrequency ablation in patients with a single small hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Ju-Yeon Cho

    2016-12-01

    Full Text Available Background/Aims Radiofrequency ablation (RFA) is one of the most frequently applied curative treatments in patients with a single small hepatocellular carcinoma (HCC). However, the clinical significance of and risk factors for early massive recurrence after RFA (a dreadful event limiting further curative treatment) have not been fully evaluated. Methods In total, 438 patients with a single HCC of size ≤3 cm who underwent percutaneous RFA as an initial treatment between 2006 and 2009 were included. Baseline patient characteristics, overall survival, predictive factors, and recurrence after RFA were evaluated. In addition, the incidence, impact on survival, and predictive factors of early massive recurrence, and initial recurrence beyond the Milan criteria within 2 years, were also investigated. Results During the median follow-up of 68.4 months, recurrent HCC was confirmed in 302 (68.9%) patients, with early massive recurrence in 27 patients (6.2%). The 1-, 3-, and 5-year overall survival rates were 95.4%, 84.7%, and 81.8%, respectively, in patients with no recurrence; 99.6%, 86.4%, and 70.1% in patients with recurrence within the Milan criteria or late recurrence; and 92.6%, 46.5%, and 0.05% in patients with early massive recurrence. Multivariable analysis identified older age, Child-Pugh score B or C, and early massive recurrence as predictive of poor overall survival. A tumor size of ≥2 cm and tumor location adjacent to the colon were independent risk factors predictive of early massive recurrence. Conclusions Early massive recurrence is independently predictive of poor overall survival after RFA in patients with a single small HCC. Tumors sized ≥2 cm and located adjacent to the colon appear to be independent risk factors for early massive recurrence.

  20. Improving sub-pixel imperviousness change prediction by ensembling heterogeneous non-linear regression models

    Science.gov (United States)

    Drzewiecki, Wojciech

    2016-12-01

    In this work, nine non-linear regression models were compared for sub-pixel impervious surface area mapping from Landsat images. The comparison was done in three study areas, both for the accuracy of imperviousness coverage evaluation at individual points in time and for the accuracy of imperviousness change assessment. The performance of individual machine learning algorithms (Cubist, Random Forest, stochastic gradient boosting of regression trees, k-nearest neighbors regression, random k-nearest neighbors regression, Multivariate Adaptive Regression Splines, averaged neural networks, and support vector machines with polynomial and radial kernels) was also compared with the performance of heterogeneous model ensembles constructed from the best models trained using particular techniques. The results proved that in the case of sub-pixel evaluation, the most accurate prediction of change may not necessarily be based on the most accurate individual assessments. When single methods are considered, based on the obtained results the Cubist algorithm may be advised for Landsat-based mapping of imperviousness for single dates. However, Random Forest may be endorsed when the most reliable evaluation of imperviousness change is the primary goal. It gave lower accuracies for individual assessments, but better prediction of change due to more correlated errors of individual predictions. Heterogeneous model ensembles performed for individual time point assessments at least as well as the best individual models. In the case of imperviousness change assessment, the ensembles always outperformed single-model approaches. This means that it is possible to improve the accuracy of sub-pixel imperviousness change assessment using ensembles of heterogeneous non-linear regression models.
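
    A heterogeneous ensemble for change assessment, as described above, can be as simple as averaging per-pixel predictions across models at each date and differencing the ensemble means. A toy sketch (illustrative numbers, not the Landsat results):

```python
def ensemble_change(preds_t1, preds_t2):
    """preds_tX: one list of per-pixel imperviousness fractions per model.
    Ensemble estimate = across-model mean at each date;
    change = t2 ensemble mean minus t1 ensemble mean, per pixel."""
    def mean_map(preds):
        n = len(preds)
        return [sum(m[i] for m in preds) / n for i in range(len(preds[0]))]
    return [b - a for a, b in zip(mean_map(preds_t1), mean_map(preds_t2))]

# Two hypothetical models, two pixels, two dates:
change = ensemble_change([[0.2, 0.4], [0.4, 0.6]],
                         [[0.3, 0.5], [0.5, 0.7]])
```

    The abstract's point about correlated errors applies here: if a model's bias is the same at both dates, it cancels in the difference, so a model with worse single-date accuracy can still give better change estimates.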

  1. Hybrid robust model based on an improved functional link neural network integrating with partial least square (IFLNN-PLS) and its application to predicting key process variables.

    Science.gov (United States)

    He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-03-01

    In this paper, a hybrid robust model based on an improved functional link neural network integrated with partial least squares (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with a small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) was proposed for enhancing the generalization performance of FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are attached to some small-norm expanded weights. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficients are selected as the inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydro Dynamics (YHD) were selected. Then a hybrid model based on the improved FLNN integrated with partial least squares (IFLNN-PLS) was built. In the IFLNN-PLS model, the connection weights are calculated using the partial least squares method, not the error back propagation algorithm. Lastly, IFLNN-PLS was developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrated that IFLNN-PLS could significantly improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
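
    The correlation-driven feature selection described for SNEWHIOC-FLNN can be sketched as: expand each input through a functional-link basis, then keep the expanded variables most correlated with the output. A stdlib sketch (the trigonometric basis and the top-k rule are assumptions, not the paper's exact scheme):

```python
import math

def expand(x):
    """Trigonometric functional-link expansion of a scalar input."""
    return [x, math.sin(math.pi * x), math.cos(math.pi * x), x * x]

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma) ** 2 for u in a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

def select_features(xs, ys, k=2):
    """Rank expanded features by |correlation| with the output and keep the
    top k, mimicking the correlation-based selection described above."""
    feats = [expand(x) for x in xs]
    corrs = [(abs(pearson([f[j] for f in feats], ys)), j)
             for j in range(len(feats[0]))]
    return [j for _, j in sorted(corrs, reverse=True)[:k]]

# For a target that is linear in x, the raw input (index 0) ranks first:
kept = select_features([0.1, 0.2, 0.3, 0.4, 0.5],
                       [0.2, 0.4, 0.6, 0.8, 1.0], k=1)
```

    The selected columns would then feed the network, with the output-layer weights solved by partial least squares rather than backpropagation, as in the IFLNN-PLS step.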

  2. ECMWF seasonal forecast system 3 and its prediction of sea surface temperature

    Energy Technology Data Exchange (ETDEWEB)

    Stockdale, Timothy N.; Anderson, David L.T.; Balmaseda, Magdalena A.; Ferranti, Laura; Mogensen, Kristian; Palmer, Timothy N.; Molteni, Franco; Vitart, Frederic [ECMWF, Reading (United Kingdom); Doblas-Reyes, Francisco [ECMWF, Reading (United Kingdom); Institut Catala de Ciencies del Clima (IC3), Barcelona (Spain)

    2011-08-15

    The latest operational version of the ECMWF seasonal forecasting system is described. It shows noticeably improved skill for sea surface temperature (SST) prediction compared with previous versions, particularly with respect to El Nino related variability. Substantial skill is shown for lead times up to 1 year, although at this range the spread in the ensemble forecast implies a loss of predictability large enough to account for most of the forecast error variance, suggesting only moderate scope for improving long range El Nino forecasts. At shorter ranges, particularly 3-6 months, skill is still substantially below the model-estimated predictability limit. SST forecast skill is higher for more recent periods than earlier ones. Analysis shows that although various factors can affect scores in particular periods, the improvement from 1994 onwards seems to be robust, and is most plausibly due to improvements in the observing system made at that time. The improvement in forecast skill is most evident for 3-month forecasts starting in February, where predictions of NINO3.4 SST from 1994 to present have been almost without fault. It is argued that in situations where the impact of model error is small, the value of improved observational data can be seen most clearly. Significant skill is also shown in the equatorial Indian Ocean, although predictive skill in parts of the tropical Atlantic is relatively poor. SST forecast errors can be especially high in the Southern Ocean. (orig.)

  3. Numerical prediction of cavitating flow around a hydrofoil using PANS and an improved shear stress transport k-omega model

    Directory of Open Access Journals (Sweden)

    Zhang De-Sheng

    2015-01-01

    Full Text Available The prediction accuracies of partially-averaged Navier-Stokes model and improved shear stress transport k-ω turbulence model for simulating the unsteady cavitating flow around the hydrofoil were discussed in this paper. Numerical results show that the two turbulence models can effectively reproduce the cavitation evolution process. The numerical prediction for the cycle time of cavitation inception, development, detachment, and collapse agrees well with the experimental data. It is found that the vortex pair induced by the interaction between the re-entrant jet and mainstream is responsible for the instability of the cavitation shedding flow.

  4. Prediction of flood abnormalities for improved public safety using a modified adaptive neuro-fuzzy inference system.

    Science.gov (United States)

    Aqil, M; Kita, I; Yano, A; Nishiyama, S

    2006-01-01

    It is widely accepted that an efficient flood alarm system may significantly improve public safety and mitigate economic damage caused by inundations. In this paper, a modified adaptive neuro-fuzzy system is proposed as an improvement on the traditional neuro-fuzzy model. This new method employs a rule-correction based algorithm to replace the error back propagation algorithm that is employed by the traditional neuro-fuzzy method in the backward pass calculation. The final value obtained during the backward pass calculation using the rule-correction algorithm is then considered as a mapping function of the learning mechanism of the modified neuro-fuzzy system. Effectiveness of the proposed identification technique is demonstrated through a simulation study on the flood series of the Citarum River in Indonesia. The first four years of data (1987 to 1990) were used for model training/calibration, while the remaining data (1991 to 2002) were used for testing the model. The number of antecedent flows that should be included in the input variables was determined by two statistical methods, i.e. autocorrelation and partial autocorrelation between the variables. Performance accuracy of the model was evaluated in terms of two statistical indices, i.e. mean average percentage error and root mean square error. The algorithm was developed in a decision support system environment in order to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexibility in approach, and evolving graphical features, and can be adopted for any similar situation to predict the streamflow. The main data processing includes gauging station selection, input generation, lead-time selection/generation, and length of prediction. This program enables users to process the flood data, to train/test the model using various input options, and to visualize results. The program code consists of a set of files, which can be modified as well to match other
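
The two accuracy indices named above have standard closed forms (the "mean average percentage error" of the abstract is usually written as MAPE). A minimal sketch with hypothetical flow values, since the Citarum series is not reproduced here:

```python
import math

def mape(observed, predicted):
    # mean absolute percentage error, skipping zero observations
    terms = [abs((o - p) / o) for o, p in zip(observed, predicted) if o != 0]
    return 100.0 * sum(terms) / len(terms)

def rmse(observed, predicted):
    # root mean square error
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

# hypothetical daily flows (m^3/s) and model predictions
obs = [120.0, 150.0, 90.0, 200.0]
pred = [110.0, 160.0, 95.0, 190.0]
```

Both indices are computed on the held-out test period in a setup like the one described; lower is better for each.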

  5. Low-dose vaporized cannabis significantly improves neuropathic pain.

    Science.gov (United States)

    Wilsey, Barth; Marcotte, Thomas; Deutsch, Reena; Gouaux, Ben; Sakai, Staci; Donaghe, Haylee

    2013-02-01

    We conducted a double-blind, placebo-controlled, crossover study evaluating the analgesic efficacy of vaporized cannabis in subjects, the majority of whom were experiencing neuropathic pain despite traditional treatment. Thirty-nine patients with central and peripheral neuropathic pain underwent a standardized procedure for inhaling medium-dose (3.53%), low-dose (1.29%), or placebo cannabis with the primary outcome being visual analog scale pain intensity. Psychoactive side effects and neuropsychological performance were also evaluated. Mixed-effects regression models demonstrated an analgesic response to vaporized cannabis. There was no significant difference between the 2 active dose groups' results (P > .7). The number needed to treat (NNT) to achieve 30% pain reduction was 3.2 for placebo versus low-dose, 2.9 for placebo versus medium-dose, and 25 for medium- versus low-dose. As these NNTs are comparable to those of traditional neuropathic pain medications, cannabis has analgesic efficacy with the low dose being as effective a pain reliever as the medium dose. Psychoactive effects were minimal and well tolerated, and neuropsychological effects were of limited duration and readily reversible within 1 to 2 hours. Vaporized cannabis, even at low doses, may present an effective option for patients with treatment-resistant neuropathic pain. The analgesia obtained from a low dose of delta-9-tetrahydrocannabinol (1.29%) in patients, most of whom were experiencing neuropathic pain despite conventional treatments, is a clinically significant outcome. In general, the effect sizes on cognitive testing were consistent with this minimal dose. As a result, one might not anticipate a significant impact on daily functioning. Published by Elsevier Inc.
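
The number needed to treat (NNT) quoted above is the reciprocal of the absolute difference in responder proportions between two arms. A minimal sketch; the responder rates below are hypothetical, chosen only so the result lands near the reported placebo-versus-low-dose NNT of 3.2 — the study's actual arm-level rates are not given in this abstract.

```python
def nnt(p_treatment, p_control):
    """Number needed to treat: reciprocal of the absolute risk difference
    between the proportion of responders in two arms."""
    arr = p_treatment - p_control
    if arr == 0:
        raise ValueError("no risk difference; NNT is undefined")
    return 1.0 / arr

# hypothetical proportions achieving >=30% pain reduction
low_dose_vs_placebo = nnt(0.57, 0.26)
```

Small NNTs (2-4) are typical of effective analgesics, which is the comparison the abstract draws against traditional neuropathic pain medications.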

  6. An improved method for predicting the lightning performance of high and extra-high-voltage substation shielding

    Science.gov (United States)

    Vinh, T.

    1980-08-01

    There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. The need exists for convenient analytical lightning models adequate for engineering usage. In this study, analytical lightning models were developed along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection for high and extra-high-voltage substations from direct strokes.

  7. Application of Genome Wide Association and Genomic Prediction for Improvement of Cacao Productivity and Resistance to Black and Frosty Pod Diseases

    Directory of Open Access Journals (Sweden)

    J. Alberto Romero Navarro

    2017-11-01

    Full Text Available Chocolate is a highly valued and palatable confectionery product. Chocolate is primarily made from the processed seeds of the tree species Theobroma cacao. Cacao cultivation is highly relevant for small-holder farmers throughout the tropics, yet its productivity remains limited by low yields and widespread pathogens. A panel of 148 improved cacao clones was assembled based on productivity and disease resistance, and phenotypic single-tree replicated clonal evaluation was performed for 8 years. Using high-density markers, the diversity of clones was expressed relative to 10 known ancestral cacao populations, and significant effects of ancestry were observed in productivity and disease resistance. Genome-wide association (GWA) analysis was performed, and six markers were significantly associated with frosty pod disease resistance. In addition, genomic selection was performed, and consistent with the observed extensive linkage disequilibrium, high predictive ability was observed at low marker densities for all traits. Finally, quantitative trait locus mapping and differential expression analysis of two cultivars with contrasting disease phenotypes were performed to identify genes underlying frosty pod disease resistance, identifying a significant quantitative trait locus and 35 differentially expressed genes using two independent differential expression analyses. These results indicate that in breeding populations of heterozygous and recently admixed individuals, mapping approaches can be used for low-complexity traits such as pod color in cacao or single-gene disease resistance in other species; however, genomic selection for quantitative traits remains highly effective relative to mapping. Our results can help guide the breeding process for sustainable improved cacao productivity.

  8. Seasonal climate prediction for North Eurasia

    International Nuclear Information System (INIS)

    Kryjov, Vladimir N

    2012-01-01

    An overview of the current status of the operational seasonal climate prediction for North Eurasia is presented. It is shown that the performance of existing climate models is rather poor in seasonal prediction for North Eurasia. Multi-model ensemble forecasts are more reliable than single-model ones; however, for North Eurasia they tend to be close to climatological ones. Application of downscaling methods may improve predictions for some locations (or regions). However, general improvement of the reliability of seasonal forecasts for North Eurasia requires improvement of the climate prediction models. (letter)
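
The advantage of multi-model ensembles noted above rests on error cancellation: models with partly independent errors average out. A toy sketch with synthetic numbers (not real forecasts) in which two models have opposite biases, so their mean beats either member:

```python
def ensemble_mean(forecasts):
    # average the forecasts of several models, time step by time step
    return [sum(vals) / len(vals) for vals in zip(*forecasts)]

def rmse(obs, pred):
    return (sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)) ** 0.5

obs = [0.0, 1.0, 2.0, 1.0]          # verifying observations
model_a = [0.5, 1.5, 1.5, 0.5]      # one bias pattern
model_b = [-0.5, 0.5, 2.5, 1.5]     # the opposite pattern
combined = ensemble_mean([model_a, model_b])
```

In this contrived case the errors cancel exactly; with real seasonal forecasts the cancellation is partial, which is why the abstract reports ensembles drifting toward climatology rather than becoming perfect.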

  9. Validation of three noninvasive laboratory variables to predict significant fibrosis and cirrhosis in patients with chronic hepatitis C in Saudi Arabia

    International Nuclear Information System (INIS)

    Ado, Ayman A.; Al-Swat, Khalid; Azzam, N.; Al-Faleh, Faleh; Ahmed, S.

    2007-01-01

    We tested the clinical utility of the platelet count, aspartate aminotransferase/alanine aminotransferase (AST/ALT) ratio, and the AST to platelet ratio index (APRI) score in predicting the presence or absence of advanced fibrosis and cirrhosis in patients with chronic hepatitis C in Saudi Arabia. Liver biopsy procedures performed on chronic hepatitis C patients in our gastroenterology unit at King Khalid University Hospital were traced from records between 1998 and 2003. The hospital computer database was then accessed and detailed laboratory parameters obtained. By plotting receiver operating characteristic (ROC) curves, the three selected models (platelet count, AST/ALT ratio and the APRI score) were compared in terms of the best variable to predict significant fibrosis. Two hundred and forty-six patients with hepatitis C were included in this analysis. Overall, 26% of patients had advanced fibrosis. When comparing the three above-mentioned prediction models, the APRI score was the one associated with the highest area under the curve (AUC) = 0.812 (95% CI, 0.756-0.868) on the ROC curves, compared to the platelet count and AST/ALT ratio, which yielded an AUC of 0.783 (0.711-0.855) and 0.716 (0.642-0.789), respectively. The APRI score seemed to be the best predictive variable for the presence or absence of advanced fibrosis in Saudi hepatitis C patients. (author)
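
The APRI score itself has a simple closed form: AST expressed as a multiple of its upper limit of normal, divided by the platelet count and scaled by 100. The sketch below uses the commonly cited definition and illustrative lab values; the cutoff mentioned in the comment is the widely used one, not necessarily the threshold chosen in this study.

```python
def apri(ast, ast_upper_limit, platelets_10e9_per_l):
    """AST to Platelet Ratio Index:
    (AST / upper limit of normal) * 100 / platelet count (10^9/L)."""
    return (ast / ast_upper_limit) * 100.0 / platelets_10e9_per_l

# illustrative patient: AST 80 IU/L (ULN 40), platelets 120 x 10^9/L
score = apri(ast=80, ast_upper_limit=40, platelets_10e9_per_l=120)
# a commonly cited cutoff for significant fibrosis is around 1.5
```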

  10. Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) significantly improve prostate cancer detection at initial biopsy in a total PSA range of 2-10 ng/ml.

    Directory of Open Access Journals (Sweden)

    Matteo Ferro

    Full Text Available Many efforts to reduce prostate specific antigen (PSA) overdiagnosis and overtreatment have been made. To this aim, the Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) have been proposed as new, more specific biomarkers. We evaluated the ability of phi and PCA3 to identify prostate cancer (PCa) at initial prostate biopsy in men with a total PSA range of 2-10 ng/ml. The performance of phi and PCA3 was evaluated in 300 patients undergoing first prostate biopsy. ROC curve analyses tested the accuracy (AUC) of phi and PCA3 in predicting PCa. Decision curve analyses (DCA) were used to compare the clinical benefit of the two biomarkers. We found that the AUC value of phi (0.77) was comparable to those of %p2PSA (0.76) and PCA3 (0.73), with no significant differences in pairwise comparison (%p2PSA vs. phi p = 0.673, %p2PSA vs. PCA3 p = 0.417 and phi vs. PCA3 p = 0.247). These three biomarkers significantly outperformed fPSA (AUC = 0.60), %fPSA (AUC = 0.62) and p2PSA (AUC = 0.63). At DCA, phi and PCA3 exhibited a very close net benefit profile until the threshold probability of 25%, beyond which the phi index showed higher net benefit than PCA3. Multivariable analysis showed that the addition of phi and PCA3 to the base multivariable model (age, PSA, %fPSA, DRE, prostate volume) increased predictive accuracy, whereas no model improved single biomarker performance. Finally we showed that subjects with active surveillance (AS) compatible cancer had significantly lower phi and PCA3 values (p<0.001 and p = 0.01, respectively). In conclusion, both phi and PCA3 comparably increase the accuracy in predicting the presence of PCa in the total PSA range of 2-10 ng/ml at initial biopsy, outperforming the currently used %fPSA.

  11. Complete Au@ZnO core-shell nanoparticles with enhanced plasmonic absorption enabling significantly improved photocatalysis

    Science.gov (United States)

    Sun, Yiqiang; Sun, Yugang; Zhang, Tao; Chen, Guozhu; Zhang, Fengshou; Liu, Dilong; Cai, Weiping; Li, Yue; Yang, Xianfeng; Li, Cuncheng

    2016-05-01

    Nanostructured ZnO exhibits high chemical stability and unique optical properties, representing a promising candidate among photocatalysts in the field of environmental remediation and solar energy conversion. However, ZnO only absorbs the UV light, which accounts for less than 5% of total solar irradiation, significantly limiting its applications. In this article, we report a facile and efficient approach to overcome the poor wettability between ZnO and Au by carefully modulating the surface charge density on Au nanoparticles (NPs), enabling rapid synthesis of Au@ZnO core-shell NPs at room temperature. The resulting Au@ZnO core-shell NPs exhibit a significantly enhanced plasmonic absorption in the visible range due to the Au NP cores. They also show a significantly improved photocatalytic performance in comparison with their single-component counterparts, i.e., the Au NPs and ZnO NPs. Moreover, the high catalytic activity of the as-synthesized Au@ZnO core-shell NPs can be maintained even after many cycles of photocatalytic reaction. Our results shed light on the fact that the Au@ZnO core-shell NPs represent a promising class of candidates for applications in plasmonics, surface-enhanced spectroscopy, light harvest devices, solar energy conversion, and degradation of organic pollutants.

  12. Analysis and Prediction on Vehicle Ownership Based on an Improved Stochastic Gompertz Diffusion Process

    Directory of Open Access Journals (Sweden)

    Huapu Lu

    2017-01-01

    Full Text Available This paper aims at introducing a new improved stochastic differential equation related to the Gompertz curve for the projection of vehicle ownership growth. This diffusion model explains the relationship between vehicle ownership and GDP per capita, which has been studied as a Gompertz-like function before. The main innovations of the process lie in two parts: by modifying the deterministic part of the original Gompertz equation, the model can represent the remaining slow increase when the S-shaped curve has reached its saturation level; by introducing the stochastic differential equation, the model can better fit the real data when there are fluctuations. Such comparisons are carried out based on data from the US, the UK, Japan, and Korea with a time span of 1960–2008. It turns out that the new process behaves better in fitting curves and predicting short term growth. Finally, a prediction of Chinese vehicle ownership up to 2025 is presented with the new model, as China is in the initial stage of motorization with considerable fluctuations in growth.
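
A stochastic Gompertz diffusion of the general kind described above can be simulated with the Euler-Maruyama scheme. The drift and diffusion terms below are the textbook form dX = X(a − b ln X)dt + σX dW, not the paper's modified equation, and all parameter values are illustrative; with σ = 0 the path settles at the deterministic saturation level exp(a/b).

```python
import math
import random

def simulate_gompertz_sde(x0, a, b, sigma, dt, steps, seed=1):
    """Euler-Maruyama path of the stochastic Gompertz diffusion
    dX = X (a - b * ln X) dt + sigma * X dW."""
    random.seed(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        dw = random.gauss(0.0, math.sqrt(dt))   # Brownian increment
        x += x * (a - b * math.log(x)) * dt + sigma * x * dw
        x = max(x, 1e-9)                        # keep ownership positive
        path.append(x)
    return path

# noise-free check: the path should converge to exp(a / b) = exp(5)
path = simulate_gompertz_sde(x0=10.0, a=0.5, b=0.1, sigma=0.0,
                             dt=0.1, steps=2000)
```

Setting σ > 0 reproduces the fluctuating trajectories the paper fits to national ownership data; the deterministic skeleton is recovered as the σ → 0 limit.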

  13. Addition of 2-(ethylamino)acetonitrile group to nitroxoline results in significantly improved anti-tumor activity in vitro and in vivo.

    Science.gov (United States)

    Mitrović, Ana; Sosič, Izidor; Kos, Špela; Tratar, Urša Lampreht; Breznik, Barbara; Kranjc, Simona; Mirković, Bojana; Gobec, Stanislav; Lah, Tamara; Serša, Gregor; Kos, Janko

    2017-08-29

    Lysosomal cysteine peptidase cathepsin B, involved in multiple processes associated with tumor progression, is validated as a target for anti-cancer therapy. Nitroxoline, a known antimicrobial agent, is a potent and selective inhibitor of cathepsin B, hence reducing tumor progression in vitro and in vivo. In order to further improve its anti-cancer properties we developed a number of derivatives using structure-based chemical synthesis. Of these, the 7-aminomethylated derivative (compound 17) exhibited significantly improved kinetic properties over nitroxoline, inhibiting cathepsin B endopeptidase activity selectively. In the present study, we have evaluated its anti-cancer properties. It was more effective than nitroxoline in reducing tumor cell invasion and migration, as determined in vitro on two-dimensional cell models and tumor spheroids, under either endpoint or real time conditions. Moreover, it exhibited improved action over nitroxoline in impairing tumor growth in vivo in LPB mouse fibrosarcoma tumors in C57Bl/6 mice. Taken together, the addition of a 2-(ethylamino)acetonitrile group to nitroxoline at position 7 significantly improves its pharmacological characteristics and its potential for use as an anti-cancer drug.

  14. Transverse charge and magnetization densities: Improved chiral predictions down to b = 1 fm

    Energy Technology Data Exchange (ETDEWEB)

    Alarcon, Jose Manuel [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Hiller Blin, Astrid N. [Johannes Gutenberg Univ., Mainz (Germany); Vicente Vacas, Manuel J. [Spanish National Research Council (CSIC), Valencia (Spain). Univ. of Valencia (UV), Inst. de Fisica Corpuscular; Weiss, Christian [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2018-03-01

    The transverse charge and magnetization densities provide insight into the nucleon’s inner structure. In the periphery, the isovector components are clearly dominant, and can be computed in a model-independent way by means of a combination of chiral effective field theory (χEFT) and dispersion analysis. With a novel N/D method, we incorporate the pion electromagnetic form factor data into the χEFT calculation, thus taking into account the pion-rescattering effects and the ρ-meson pole. As a consequence, we are able to reliably compute the densities down to distances b ≈ 1 fm, therefore achieving a dramatic improvement of the results compared to traditional χEFT calculations, while remaining predictive and having controlled uncertainties.

  15. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

    Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in a Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of a subsampling bootstrap Markov chain in genomic prediction. This method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, losses on predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling is an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces the computational burden associated with model fitting, and it may slightly enhance prediction properties.
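
The core idea — fit the model on a fresh random subsample of the observations at each iteration, then aggregate across iterations — can be illustrated with a one-predictor ridge estimate standing in for the whole-genome regression. This is a deliberate simplification: `ridge_1d` is not the paper's Bayesian sampler, and the 33%-without-replacement setting simply echoes the optimum the abstract reports.

```python
import random

def ridge_1d(xs, ys, lam):
    # closed-form ridge estimate for a single centered predictor
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    return sxy / (sxx + lam)

def subsampled_estimate(xs, ys, lam, prop=0.33, rounds=200, seed=7):
    """Average ridge estimates over many random subsamples, mimicking the
    subsampling bootstrap applied inside each MCMC round."""
    random.seed(seed)
    n = len(xs)
    m = max(1, int(prop * n))
    total = 0.0
    for _ in range(rounds):
        idx = random.sample(range(n), m)     # subsample without replacement
        total += ridge_1d([xs[i] for i in idx], [ys[i] for i in idx], lam)
    return total / rounds

xs = [i - 50 for i in range(101)]            # centered predictor
ys = [2.0 * x for x in xs]                   # true effect = 2, no noise
est = subsampled_estimate(xs, ys, lam=1e-6)
```

Each round touches only a third of the data, which is where the reported halving of fitting time comes from; averaging over rounds recovers the full-data estimate.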

  16. Improved Outcome Prediction Using CT Angiography in Addition to Standard Ischemic Stroke Assessment: Results from the STOPStroke Study

    Science.gov (United States)

    González, R. Gilberto; Lev, Michael H.; Goldmacher, Gregory V.; Smith, Wade S.; Payabvash, Seyedmehdi; Harris, Gordon J.; Halpern, Elkan F.; Koroshetz, Walter J.; Camargo, Erica C. S.; Dillon, William P.; Furie, Karen L.

    2012-01-01

    Purpose To improve ischemic stroke outcome prediction using imaging information from a prospective cohort who received admission CT angiography (CTA). Methods In a prospectively designed study, 649 stroke patients diagnosed with acute ischemic stroke had admission NIH stroke scale scores, noncontrast CT (NCCT), CTA, and 6-month outcome assessed using the modified Rankin scale (mRS) scores. Poor outcome was defined as mRS>2. Strokes were classified as “major” by the (1) Alberta Stroke Program Early CT Score (ASPECTS+) if NCCT ASPECTS was≤7; (2) Boston Acute Stroke Imaging Scale (BASIS+) if they were ASPECTS+ or CTA showed occlusion of the distal internal carotid, proximal middle cerebral, or basilar arteries; and (3) NIHSS for scores>10. Results Of 649 patients, 253 (39.0%) had poor outcomes. NIHSS, BASIS, and age, but not ASPECTS, were independent predictors of outcome. BASIS and NIHSS had similar sensitivities, both superior to ASPECTS (p10/BASIS+ had poor outcomes, versus 21.5% (77/358) with NIHSS≤10/BASIS− (p10/BASIS+ compared to patients who are NIHSS≤10/BASIS−; the odds ratio is 5.4 (95% CI: 3.5 to 8.5) when compared to patients who are only NIHSS>10 or BASIS+. Conclusions BASIS and NIHSS are independent outcome predictors. Their combination is stronger than either instrument alone in predicting outcomes. The findings suggest that CTA is a significant clinical tool in routine acute stroke assessment. PMID:22276182
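
The classification logic in the Methods is rule-based and easy to state in code. A sketch of the three instruments as described, with thresholds taken from the abstract and function names of my own choosing:

```python
def aspects_major(aspects):
    # ASPECTS+: "major" stroke when the NCCT ASPECTS score is <= 7
    return aspects <= 7

def basis_major(aspects, proximal_occlusion):
    # BASIS+: ASPECTS-major, or CTA occlusion of the distal internal
    # carotid, proximal middle cerebral, or basilar artery
    return aspects_major(aspects) or proximal_occlusion

def nihss_major(nihss):
    # NIHSS criterion: admission score > 10
    return nihss > 10

def combined_high_risk(nihss, aspects, proximal_occlusion):
    """Combined predictor: the worst outcomes were seen in patients who
    are both NIHSS-major and BASIS-major."""
    return nihss_major(nihss) and basis_major(aspects, proximal_occlusion)

# example: high NIHSS plus a proximal occlusion visible only on CTA
flagged = combined_high_risk(nihss=14, aspects=9, proximal_occlusion=True)
```

Note how the example patient is BASIS+ despite a normal-range ASPECTS, which is exactly the added value of CTA that the study argues for.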

  17. Common and rare variants in the exons and regulatory regions of osteoporosis-related genes improve osteoporotic fracture risk prediction.

    Science.gov (United States)

    Lee, Seung Hun; Kang, Moo Il; Ahn, Seong Hee; Lim, Kyeong-Hye; Lee, Gun Eui; Shin, Eun-Soon; Lee, Jong-Eun; Kim, Beom-Jun; Cho, Eun-Hee; Kim, Sang-Wook; Kim, Tae-Ho; Kim, Hyun-Ju; Yoon, Kun-Ho; Lee, Won Chul; Kim, Ghi Su; Koh, Jung-Min; Kim, Shin-Yoon

    2014-11-01

    Osteoporotic fracture risk is highly heritable, but genome-wide association studies have explained only a small proportion of the heritability to date. Genetic data may improve prediction of fracture risk in osteopenic subjects and assist early intervention and management. To detect common and rare variants in coding and regulatory regions related to osteoporosis-related traits, and to investigate whether genetic profiling improves the prediction of fracture risk. This cross-sectional study was conducted in three clinical units in Korea. Postmenopausal women with extreme phenotypes (n = 982) were used for the discovery set, and 3895 participants were used for the replication set. We performed targeted resequencing of 198 genes. Genetic risk scores from common variants (GRS-C) and from common and rare variants (GRS-T) were calculated. Nineteen common variants in 17 genes (of the discovered 34 functional variants in 26 genes) and 31 rare variants in five genes (of the discovered 87 functional variants in 15 genes) were associated with one or more osteoporosis-related traits. Accuracy of fracture risk classification was improved in the osteopenic patients by adding GRS-C to fracture risk assessment models (6.8%; P risk in an osteopenic individual.

  18. Protein secondary structure prediction for a single-sequence using hidden semi-Markov models

    Directory of Open Access Journals (Sweden)

    Borodovsky Mark

    2006-03-01

    Full Text Available Abstract Background The accuracy of protein secondary structure prediction has been improving steadily towards the 88% estimated theoretical limit. There are two types of prediction algorithms: single-sequence prediction algorithms imply that information about other (homologous) proteins is not available, while algorithms of the second type imply that information about homologous proteins is available, and use it intensively. The single-sequence algorithms could make an important contribution to studies of proteins with no detected homologs; however, the accuracy of protein secondary structure prediction from a single sequence is not as high as when the additional evolutionary information is present. Results In this paper, we further refine and extend the hidden semi-Markov model (HSMM) initially considered in the BSPSS algorithm. We introduce an improved residue dependency model by considering the patterns of statistically significant amino acid correlation at structural segment borders. We also derive models that specialize on different sections of the dependency structure and incorporate them into the HSMM. In addition, we implement an iterative training method to refine estimates of HSMM parameters. The three-state-per-residue accuracy and other accuracy measures of the new method, IPSSP, are shown to be comparable or better than those of BSPSS as well as PSIPRED, tested under the single-sequence condition. Conclusions We have shown that new dependency models and training methods bring further improvements to single-sequence protein secondary structure prediction. The results are obtained under cross-validation conditions using a dataset with no pair of sequences having significant sequence similarity. As new sequences are added to the database it is possible to augment the dependency structure and obtain even higher accuracy. Current and future advances should contribute to the improvement of function prediction for orphan proteins.

  19. Significant improvements of electrical discharge machining performance by step-by-step updated adaptive control laws

    Science.gov (United States)

    Zhou, Ming; Wu, Jianyang; Xu, Xiaoyi; Mu, Xin; Dou, Yunping

    2018-02-01

    In order to obtain improved electrical discharge machining (EDM) performance, we have dedicated more than a decade to correcting one essential EDM defect, the weak stability of the machining, by developing adaptive control systems. The instabilities of machining are mainly caused by complicated disturbances in discharging. To counteract the effects of these disturbances on machining, we theoretically developed three control laws: from a minimum variance (MV) control law, to a minimum variance and pole placement coupled (MVPPC) control law, and then to a two-step-ahead prediction (TP) control law. Based on real-time estimation of EDM process model parameters and the measured ratio of arcing pulses, which is also called the gap state, the electrode discharging cycle was directly and adaptively tuned so that stable machining could be achieved. We not only provide three proven control laws for the developed EDM adaptive control system, but also show in practice that the TP control law handles machining instability and machining efficiency best, even though the MVPPC control law already provided much better EDM performance than the MV control law. It was also shown that the TP control law provided burn-free machining.
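
The feedback loop described — measure the ratio of arcing pulses (the gap state), then adaptively retune the discharging cycle — can be caricatured with a simple proportional update rule. This is an illustration only, not the MV, MVPPC, or TP laws derived in the paper, and every parameter name and value below is hypothetical.

```python
def update_off_time(off_time, arcing_ratio, target=0.1, gain=50.0,
                    lo=1.0, hi=100.0):
    """Proportional adaptive rule (illustrative): lengthen the discharge
    off-time when the measured arcing ratio exceeds the target gap state,
    shorten it when the ratio falls below, within saturation limits."""
    off_time += gain * (arcing_ratio - target)
    return min(hi, max(lo, off_time))

# a burst of arcing drives the controller to back off, then relax
t = 20.0
for ratio in (0.4, 0.3, 0.1, 0.05):
    t = update_off_time(t, ratio)
```

The paper's laws replace this fixed-gain rule with control actions computed from the recursively estimated process model, which is what makes them adaptive to disturbances rather than merely reactive.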

  20. Reestablishing Open Rotor as an Option for Significant Fuel Burn Improvements

    Science.gov (United States)

    Van Zante, Dale

    2011-01-01

    A low-noise open rotor system is being tested in collaboration with General Electric and CFM International, a 50/50 joint company between Snecma and GE. Candidate technologies for lower noise will be investigated as well as installation effects such as pylon integration. Current test status is presented as well as future scheduled testing which includes the FAA/CLEEN test entry. Pre-test predictions show that Open Rotors have the potential for revolutionary fuel burn savings.